This is another story in a year-long series that showcases how the rapid rise of artificial intelligence is impacting our lives, and how we should work to make that impact as beneficial as possible.
Clues to which images are deepfakes may be found in the eyes.
Deepfakes are fake images created by artificial intelligence (AI) that are becoming increasingly difficult to distinguish from real photographs. But new research suggests that eye reflections may be one way to spot a deepfake, thanks to a technique astronomers use to study galaxies.
The researchers presented their findings on July 15 at the Royal Astronomical Society's National Astronomy Meeting in Hull, England.
In real images, the light reflections in the eyes match up — for example, both eyes reflect the same number of window and ceiling lights — but that's not necessarily the case in fake images: In AI-created photos, the eye reflections often don't match up.
Put simply, “physics is actually wrong,” says Kevin Pimblett, an astronomer at the University of Hull who worked on the new research with Adejumoke Owolabi when she was a graduate student there.
Astronomers use the “Gini coefficient,” a measure of how light is spread across an image of a galaxy. If all the light is concentrated in a single pixel, the coefficient is 1. If the light is spread evenly across all pixels, the coefficient is 0. This measure helps astronomers classify galaxies by shape, such as spiral or elliptical.
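The two limiting cases above can be sketched in a few lines of Python. This follows the standard rank-weighted formulation of the Gini coefficient; the function name and details are illustrative, not taken from the paper.

```python
def gini(pixels):
    """Gini coefficient of a collection of pixel intensities.

    Returns 1.0 when all the light sits in a single pixel and
    0.0 when the light is spread perfectly evenly.
    """
    values = sorted(abs(p) for p in pixels)
    n = len(values)
    total = sum(values)
    if n < 2 or total == 0:
        return 0.0  # degenerate case: no light, or only one pixel
    # Rank-weighted sum over the sorted values.
    return sum((2 * i - n - 1) * v
               for i, v in enumerate(values, start=1)) / (total * (n - 1))

# All light in one pixel -> 1.0; light spread evenly -> 0.0
print(gini([0, 0, 0, 10]))  # 1.0
print(gini([5, 5, 5, 5]))   # 0.0
```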
The researchers applied this idea to photography. First, they used a computer program to find eye reflections in photos of people. Then they looked at the pixel values of those reflections, which record the intensity of light at each pixel. From these values, they could calculate a Gini coefficient for each eye's reflection.
The difference between the Gini coefficients of the left and right eyes turned out to be a clue to whether an image was genuine. In about 7 out of 10 of the fake images examined, that difference was much larger than in genuine images, where the reflections in the two eyes tended to have nearly identical Gini coefficients.
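The comparison step can be sketched as follows: compute a Gini coefficient for each eye's reflection and flag the image when the two disagree badly. The 0.15 threshold here is a placeholder for illustration; the article does not report the researchers' actual cutoff.

```python
def gini(pixels):
    # Gini coefficient over sorted pixel intensities (0 = even, 1 = concentrated).
    values = sorted(abs(p) for p in pixels)
    n, total = len(values), sum(values)
    if n < 2 or total == 0:
        return 0.0
    return sum((2 * i - n - 1) * v
               for i, v in enumerate(values, start=1)) / (total * (n - 1))

def reflections_mismatch(left_eye_pixels, right_eye_pixels, threshold=0.15):
    """Flag an image for closer human review when the two eyes' reflections
    have very different light distributions.

    The threshold is a made-up value, not one from the study.
    """
    return abs(gini(left_eye_pixels) - gini(right_eye_pixels)) > threshold

# Matched reflections vs. badly mismatched ones:
print(reflections_mismatch([1, 2, 3, 4], [1, 2, 3, 4]))   # False
print(reflections_mismatch([0, 0, 0, 10], [5, 5, 5, 5]))  # True
```

As the article stresses, a flag like this only suggests that a human should take a closer look; it is not proof of a fake.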
A particular [difference in Gini index] doesn't prove “it's a fake,” Pimblett said. But it can be [a red flag] that “there may be a problem,” he said, in which case “maybe a human should take a closer look.”
The technique can also be applied to video, but it's no silver bullet when it comes to spotting fakes: blinking can make a real image look fake, as can being so close to a light source that it reflects in only one eye.
Still, this method could be another useful tool in weeding out deepfakes — at least until AI gets better at recognizing reflections.
Source: www.snexplores.org