Olivia Williams Advocates for ‘Nude Rider’ Style Regulations for AI Body Scanning in Acting

In light of rising apprehensions regarding the effects of artificial intelligence on performers, actress Olivia Williams emphasized that actors should handle data obtained from body scans similarly to how they approach nude scenes.

The star of Dune: Prophecy and The Crown stated that she and fellow actors often face mandatory body scans by on-set cameras, with scant assurances on the usage and destination of that data.

“It would be reasonable to adhere to the ‘Nude Rider’ standard,” she noted. “This footage should only be used within that specific scene; it must not be repurposed elsewhere. Furthermore, any edited scenes must be removed across all formats.”

Williams drew attention to a vague provision in contracts that seems to grant studios extensive rights to use images of performers “on every platform currently existing or created in the future worldwide, indefinitely.”

A renewed conversation about AI’s impact on actors has been ignited by widespread criticism of the development of an AI performer named Tilly Norwood. Actors fear their likenesses and poses will be utilized to train AI systems, potentially threatening their employment.

Actors, stunt performers, dancers, and supporting actors relayed to the Guardian that they felt “ambushed” and compelled to participate in body scans on set. Many reported there was little time to discuss how the generated data would be handled or whether it could be used for AI training purposes.

Ms. Williams recounted her unsuccessful attempts to eliminate the ambiguous clause from her contract. She explored options for obtaining a limited license to control her body scan data, but her lawyer advised her that the legal framework was too uncertain. The costs of trying to reclaim the data were prohibitively high.

“I’m not necessarily looking for financial compensation for the use of my likeness,” she remarked. “What concerns me is being depicted in places I’ve never been, engaging in activities I’ve never done, or expressing views I haven’t shared.”

“Laws are being enacted, and no one is intervening. They’re establishing a precedent and solidifying it. I sign these contracts because not doing so could cost me my career.”

Williams said she is advocating for younger actors who have little choice but to undergo scans without clear information about what will happen to their data. “I know a 17-year-old girl who was encouraged to undergo the scan and complied, similar to the scene from Chitty Chitty Bang Bang. Because she was a minor, a chaperone had to give consent, but her chaperone was a grandmother unfamiliar with the legal implications.”

The matter is currently under discussion between Equity, the UK performing arts union, and Pact, the trade body for the UK film industry. “We are pushing for AI safeguards to be integrated into major film and television contracts to prioritize consent and transparency around on-set scanning,” said Equity general secretary Paul W Fleming.

“It is achievable for the industry to implement essential minimum standards that could significantly transform conditions for performers and artists in British TV and film.”

Pact issued a statement saying: “Producers are fully aware of their responsibilities under data protection legislation, and these concerns are being addressed during collective negotiations with Equity. Due to the ongoing talks, we are unable to provide further details.”

Source: www.theguardian.com

NASA to Launch SPHEREx Space Telescope on Sky-Scanning Mission

Artist's impression of the SPHEREx space telescope

NASA/JPL-Caltech

The latest addition to NASA's space telescope fleet launches this weekend and will soon begin scanning the entire sky at near-infrared wavelengths, collecting a wealth of data on more than 450 million galaxies.

The Spectro-Photometer for the History of the Universe, Epoch of Reionization and Ices Explorer (SPHEREx) will launch on 2 March aboard a SpaceX Falcon 9 rocket from Vandenberg Space Force Base in California at 10:09 pm local time.

It carries a camera with filters that split incoming light like a prism, directing different parts of the spectrum onto 102 separate colour channels. As the telescope pans across the sky, it slowly builds up a full image, pixel by pixel. This strategy lets a relatively small, simple camera with no moving parts do work that would otherwise require a heavy, expensive suite of sensors.
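The scanning scheme can be illustrated with a toy simulation. This is a hypothetical sketch, not SPHEREx flight software: the one-dimensional sky strip, the band-to-pointing mapping, and the fake `measure` function are all assumptions made purely for illustration of how repeated stepped pointings eventually sample every sky pixel in every spectral band.

```python
# Toy sketch of spectral mapping with a linear-variable filter: each
# detector column sees a different wavelength band, so as the pointing
# steps across the sky, every sky pixel is eventually sampled in every
# band -- no moving parts inside the camera itself.

N_BANDS = 102   # SPHEREx resolves 102 spectral bands
N_SKY = 10      # toy 1-D strip of sky pixels

# spectrum[sky_pixel][band] accumulates one sample per band
spectrum = [[None] * N_BANDS for _ in range(N_SKY)]

def observe(pointing, measure):
    """One exposure: detector column `band` sees sky pixel pointing+band."""
    for band in range(N_BANDS):
        sky = pointing + band
        if 0 <= sky < N_SKY:
            spectrum[sky][band] = measure(sky, band)

# Step the pointing one pixel at a time across the strip.
for pointing in range(-N_BANDS + 1, N_SKY):
    observe(pointing, measure=lambda s, b: 1.0)  # placeholder flux reading

# After the full scan, every sky pixel has a value in every band,
# i.e. a coarse spectrum of every bit of the (toy) sky.
assert all(all(v is not None for v in row) for row in spectrum)
```

The point of the sketch is the geometry, not the photometry: a single filtered detector plus a slow scan substitutes for a full spectrograph at every pointing.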

“If you slowly scan the sky by moving the telescope, then after a sufficient amount of time every pixel of the sky is observed over a very wide wavelength range, giving you a coarse spectrum of every bit of the sky – something that has never been done before,” says Richard Ellis at University College London. “It's a very small space telescope, but it has some very unique features.”

Ellis says this rich dataset will also allow for serendipitous discoveries. “There's a high chance that you'll find something unexpected,” he says.

Although infrared light falls outside the range of human vision, it allows scientists to determine the distances of objects and learn how galaxies form. It can also be used to determine an object's chemical composition, potentially revealing the presence of water and other important compounds.

Anything interesting that SPHEREx turns up can then be investigated in a more focused way by NASA's existing space telescopes.

Christopher Conselice at the University of Manchester in the UK says SPHEREx won't match the resolution of JWST or produce similarly arresting images, but it will serve as a pathfinder for scientific discovery.

“JWST can point at one part of the sky and take some deep images [and reveal] something completely new. SPHEREx really can't do the same thing,” he says. “It's going to be an analysis that takes years, and it's going to cover the sky many times.”

SPHEREx will orbit Earth 14.5 times a day, completing some 11,000 orbits over its two-year lifespan. Three cone-shaped shields protect the instrument from Earth's radiant heat and interference from the sun.

The same rocket will also carry the Polarimeter to Unify the Corona and Heliosphere (PUNCH), another NASA mission, which will study the sun's solar wind.


Source: www.newscientist.com

New technology enables 3D scanning of faces from hundreds of metres away

The new imaging device can capture 3D scans of human faces from hundreds of metres away

Aongus McCarthy, Heriot-Watt University

From 325 metres away, your eyes can probably just distinguish a person's head from their body. A new laser-based device, however, can create a three-dimensional model of their face at that distance.

Aongus McCarthy at Heriot-Watt University in Scotland and his colleagues have built a device that can create detailed three-dimensional images, resolving ridges and indentations as small as 1 millimetre from a few hundred metres away. It uses an imaging technique called lidar, which emits pulses of laser light that bounce off an object and are reflected back to the device. Based on how long each pulse takes to return, lidar can determine the object's shape.
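The time-of-flight principle behind lidar can be sketched in a few lines. This is a generic illustration of the physics, not the team's actual processing pipeline; the function name and example distances are assumptions chosen to match figures quoted in the article.

```python
# Time-of-flight ranging: a laser pulse travels to the target and back,
# so the one-way distance is d = c * t / 2.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_from_return_time(t_seconds):
    """Round-trip time of a laser pulse -> one-way distance in metres."""
    return C * t_seconds / 2.0

# A target 325 m away (the longest face-scanning test) returns the
# pulse after roughly 2.17 microseconds.
t = 2 * 325.0 / C
print(round(distance_from_return_time(t), 6))  # 325.0

# Resolving 1 mm of depth means timing the return to within about
# 6.7 picoseconds -- a hint at why single-photon superconducting
# detectors are useful here.
dt_for_1mm = 2 * 0.001 / C
print(f"{dt_for_1mm * 1e12:.2f} ps")  # 6.67 ps
```

The picosecond-scale timing budget is the crux: millimetre depth resolution at hundreds of metres is limited by how precisely each returning photon can be timestamped, not by the laser's power.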

To reach this level of detail, the team had to carefully tune and align many different components, says McCarthy, including the small parts that steer the laser pulses out of and back into the device. To detect single particles of light, the researchers used photodetectors based on extremely thin superconducting wires, a component not commonly found in lidar systems. Excluding sunlight that could enter the detector and degrade the image was another challenge.

The researchers tested the lidar system from a roof near their lab, taking detailed three-dimensional images of team members' heads from 45 metres and 325 metres away. On a smaller scale, they captured LEGO figurines from a distance of 32 metres.

The imaging system can scan LEGO figurines from 32 metres away

Aongus McCarthy, Heriot-Watt University

Another test imaged a segment of a communications tower one kilometre away. “It was a very difficult test. I couldn't control the scene [that we were imaging] because of the bright background,” McCarthy says.

Feihu Xu at the University of Science and Technology of China, whose team has previously used lidar for imaging from 200 kilometres away, says McCarthy and his colleagues achieved “amazing results” in terms of the device's depth resolution. “It's the best so far,” he says.

Lidar technology is only becoming more relevant, says Vivek Goyal at Boston University in Massachusetts. Being able to create detailed 3D maps of the surroundings is important for self-driving cars and some robots, he says, but before new devices can be used for this purpose they will need to be made smaller and more compact.


Source: www.newscientist.com