Are We Out of Work? Film and TV Industry Worries About On-Set Body Scans | AI

It’s common for actors on film and TV sets to be asked to step into a booth lined with cameras that capture their likeness from multiple angles. But growing anxiety among cast and crew about the implications of AI in the industry is complicating the process.

“It occurs unexpectedly,” Olivia Williams notes. She recalls being scanned more times than she can count throughout her career, from *The Sixth Sense* to *Dune: Prophecy*.

“You’re on set, in costume, with a friendly assistant director who knows you well, bringing you tea or looking after your phone while you act. Then word comes: ‘The visual effects team is here today. Can you please head to the VFX bus as soon as the scene wraps?’ And off we go.

“Actors often strive to please. Being approached for a scan mid-scene can be detrimental to your creativity and instill a fear of never working again or losing your agent. So you comply.”

Lead and supporting actors, stunt performers, and dancers have shared similar experiences with the Guardian, where they’ve been ushered through scanners on set, often unclear about their rights regarding the biometric data collected.

Williams mentioned that the cast was informed that a scan was needed “if they wanted to be part of the scene or to create visually interesting moments, like aliens coming out of their brains.”

Olivia Williams stated that scans “happened unexpectedly” and that actors complied out of “the fear of never working again.” Photo: David Bintiner/Observer

While anxiety regarding this issue has lingered, recent discussions about “AI doubles” and the rise of “AI actors” have sparked a pressing need to clarify the fate of data captured on set.

This concern was highlighted by reports of an AI character named “Tilly Norwood.” Although it may seem improbable that a production company will unveil the first AI star, it underscores the ongoing struggle to establish performers’ rights.

Worries about the future for emerging actors, and the existential threat faced by background performers, known as supporting artists (SAs), prompted Williams to speak out.

Dave Watts, a seasoned SA with experience in numerous superhero films, has also encountered scanning several times and pointed out the wider implications for the industry.

“I can easily envision crew members saying, ‘We don’t need to cast anyone anymore. We can just have the AI create a crowd of 1,000 people based on our existing data,’” he remarked.

“If the usual 100, 200, or 500 SAs aren’t necessary for big productions, there’s no need for an assistant director to oversee them. We wouldn’t need hair and makeup artists, costumers, caterers, or drivers. AI threatens nearly every job out there.”

An AI-generated image of actor Tilly Norwood has raised significant concerns among cast members. Photo: Reuters

An anonymous dancer, fearing repercussions for voicing their opinion, echoed these sentiments regarding the pressure associated with scanning and data usage. “Filming is challenging. You’re awake at 3 a.m. and can’t leave until the day’s over at 8 p.m. Situations like this arise, leaving you with limited options.

“We all ponder whether we might as well quit our jobs, don’t we? It seems somewhat foolish when you frame it that way.”

Alex Lawrence Archer, a data rights attorney at AWO who is working with actors on this issue, said performers are hindered by a labyrinth of complex and overlapping regulations. He emphasized the need for clearer agreements at the point of production, rather than scrambling to address data issues after the fact.

“Contracts are often vaguely written, relying on standard industry language that is outdated,” he explained. “They weren’t made to address this technology. There exists a vacuum of ambiguity, wherein AI developers and studios can maneuver as they please.”


“Actors and their representatives need to focus on this upcoming training case. They must negotiate clearer contracts that accurately convey fair agreements between performers, studios, and AI developers.”

Signs of a rebellion are beginning to appear. On a recent shoot, the cast was informed in advance about the scan following concerns that were voiced.

One cast member, speaking anonymously, shared, “Performers are collectively resisting an environment that feels ambushing. We managed to add an addendum to our contract that essentially prevents the use of our digital scans for any purpose outside of the show without our written consent.”

Filming in Cardiff for *Mr. Burton*. In addition to actors, many jobs within the industry, including assistant directors, hair and make-up artists, costume designers, caterers, drivers, and location managers, are at risk due to AI, according to one supporting actor. Photo: Sara Lee/The Guardian

The struggle for rights may appear daunting in the face of the data-hungry AI industry, which can gather information from countless sources without involving professional performers. However, there’s a shared understanding of the need to regain some control.

Theo Morton, a professional stunt performer and member of the British Stunt Register, stated, “This technology could either reduce the need for human performers drastically or enhance creativity in a positive manner. But the uncertainty looms large, highlighting the necessity for contractual safeguards to prevent a loss of control.”

Yet, Williams expresses a deep concern about the potential loss of control.

A key unknown is the origin of the data used to train AI models. Lawrence Archer highlighted that this remains a closely guarded secret that must be revealed. He also warned against reducing the discourse to merely compensation issues for performers.

“The AI industry depends on vast amounts of data,” he explained. “Someone is gathering it. We recognize these are sensitive topics for AI developers and studios. We are assisting performers in making data access requests to learn more. I know several performers who have been compensated by AI companies to withdraw such requests.”

“We must foster an environment where human creativity, actor connectivity, and performance are valued. If we focus solely on legal and compensation matters, we risk relegating actors to the status of data gig workers instead of recognizing them as creative artists.”

Source: www.theguardian.com

Early Detection of Parkinson’s Disease Possible 30 Years Before Onset of Symptoms, Scientists Find

Researchers have discovered a way to detect Parkinson’s disease up to 30 years before symptoms appear using a biomarker and PET scans. The approach tracks neurodegeneration more sensitively than current methods and confirms that rapid eye movement sleep behavior disorder (RBD) is an important early indicator of Parkinson’s disease. The discovery could lead to diagnosis and treatment up to 10 years earlier than is currently possible.

Researchers at The Florey and Austin Health in Melbourne, Australia, have demonstrated the potential to identify early indicators of Parkinson’s disease 20 to 30 years before the onset of symptoms. This breakthrough paves the way for early screening programs and intervention, potentially allowing treatment before significant damage occurs.


Florey Professor Kevin Barnham said that although Parkinson’s disease, a debilitating neurodegenerative condition, is often thought of as a disease of the elderly, it actually begins in midlife and can go undetected for decades.

“Parkinson’s disease is very difficult to diagnose until symptoms become apparent, by which time up to 85 percent of the neurons in the brain that control motor coordination have been destroyed. At that point, many treatments are likely to be ineffective,” Professor Barnham said. “Our long-term goal is to find ways to detect the disease earlier and treat people before damage is done.”

Advanced diagnostic technology

In a study recently published in Neurology, lead researcher Professor Barnham and colleagues explain how a known biomarker called 18F-AV-133 can be used in positron emission tomography (PET) scans to diagnose Parkinson’s disease and accurately track neurodegeneration.

In the Melbourne study, Austin Health’s Professor Christopher Rowe and his team examined 26 patients with Parkinson’s disease, 12 controls, and 11 patients with rapid eye movement sleep behavior disorder (RBD), a strong early indicator of Parkinson’s disease.

Each person underwent two PET scans two years apart. Key findings include:

  • Currently available assessments of Parkinson’s disease showed no significant changes in clinical symptoms in any of the participants.
  • In contrast, PET scans showed “significant neuronal loss” in three key areas of the brains of people with the disease, suggesting that 18F-AV-133 is a more sensitive means of monitoring neurodegeneration than currently available methods.

Further mathematical modeling yielded the following estimates:

  • In Parkinson’s disease, slow neuronal loss occurs over a total of approximately 33 years.
  • This loss proceeds for about 10.5 years before the disease becomes detectable on a PET scan.
  • Once detectable on a PET scan, it takes a further 6.5 years for motor symptoms to appear.
  • After physical symptoms appear, it takes about 3 more years for a clinical diagnosis to be confirmed.
  • This corresponds to approximately 22.5 years of neuronal loss before clinical symptoms are sufficient for diagnosis.

Professor Barnham said the findings pave the way for the development of screening protocols to diagnose and treat Parkinson’s disease up to 10 years earlier than is currently possible. The approach may also help identify patients for clinical trials.

What is RBD?

  • RBD stands for rapid eye movement (REM) sleep behavior disorder.
  • Patients with RBD scream, thrash, and sometimes move violently during sleep, enacting vivid and disturbing dreams.
  • RBD occurs when the normal muscle paralysis (atonia) of REM sleep is absent.
  • 90% of RBD patients develop Parkinson’s disease.
  • Half of all Parkinson’s patients have RBD.
  • RBD is an important warning sign for early Parkinson’s disease.
  • If you have RBD, see a sleep specialist or neurologist.

Reference: “Use of 18F-AV-133 VMAT2 PET Imaging to Monitor Progressive Nigrostriatal Degeneration in Parkinson’s Disease”, Leah C. Beauchamp, Vincent Dore, Victor L. Villemagne, SanSan Xu, David Finkelstein, Kevin J. Barnham, Christopher Rowe, Neurology, 28 November 2023.
DOI: 10.1212/WNL.0000000000207748

Source: scitechdaily.com

Fuzzy Door’s Viewscreen: On-set Augmented Reality Brings Computer-Generated Characters and Locations to Your Viewfinder

Almost all TV shows and movies use computer graphics (CG) these days, but a show with fully digital characters takes it to another level. Seth MacFarlane’s “Ted” is one such show, and his production company Fuzzy Door has developed a new tool to enhance the filming process. This tool, called Viewscreen, turns the potentially messy process of working with CG characters and environments into an opportunity for collaboration and improvisation on set.

Viewscreen is an on-set augmented reality tool that allows real-time interaction with CG assets through the camera. This has dramatically improved the creative process, making it easier to get the necessary shots faster, according to MacFarlane. Typically, the work of integrating CG assets happens after the cameras stop rolling: stand-ins like tennis balls and motion capture performers are used on set, and the footage is then sent to VFX artists for adjustment, an iterative process that leaves little room for spontaneity.

Viewscreen Studio is a wireless system that can sync multiple cameras and integrate various data streams simultaneously. It creates a middle ground between pre- and post-production, allowing live compositing and positioning of CG assets in the viewfinder and on a monitor. It also allows live adjustments, such as changing waypoints and lighting, and creating different shots and scenarios naturally.

This new tool enables directors and camera operators to see and interact with invisible CG elements in real time, allowing for more creative freedom and spontaneity. It has already been successfully used in the production of “Ted” to enhance over 3,000 shots in the film.

Fuzzy Door has made Viewscreen available today and is already working with several studios and productions. The company offers four modules (a tracker, a compositor, an exporter, and motion) to assist in the filming process. The tool has the potential to change the way CG elements are integrated into live-action productions.

Source: techcrunch.com