This is a crucial election year for the world, with misinformation swirling on social media as countries including the UK, US and France go to the polls.
There are major concerns about whether deepfakes – images and audio of key politicians created using artificial intelligence to mislead voters – could influence election outcomes.
While it has not been a major talking point in the UK elections so far, examples are steadily emerging around the world, including in the US, where a presidential election is looming.
Here are some of the telltale signs to look out for:
Distortion around the mouth and jaw
In deepfake videos, the area around the mouth can be the biggest clue: There may be fewer wrinkles on the skin, less detail around the mouth, and a blurry or smudged chin. Poor syncing between a person’s voice and mouth is another telltale sign.
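The blur cue described above can be quantified with a simple sharpness metric: the variance of the Laplacian, which drops sharply in smudged regions such as the mouth and chin. Below is a minimal numpy sketch under that assumption; the function name and synthetic patches are illustrative, not a tool mentioned in the article, and a real check would run on a cropped mouth region from video frames.

```python
import numpy as np

def laplacian_variance(patch: np.ndarray) -> float:
    """Variance of the Laplacian of a grayscale patch.

    Low values indicate blurring/smudging -- the telltale sign
    described above when measured around the mouth and chin.
    """
    # 3x3 Laplacian: -4*center plus the four direct neighbours
    lap = (
        -4 * patch[1:-1, 1:-1]
        + patch[:-2, 1:-1] + patch[2:, 1:-1]
        + patch[1:-1, :-2] + patch[1:-1, 2:]
    )
    return float(lap.var())

# Illustration with synthetic patches: a sharp checkerboard vs. a flat region.
rng = np.random.default_rng(0)
sharp = (np.indices((32, 32)).sum(axis=0) % 2) * 255.0            # high local detail
blurry = np.full((32, 32), 128.0) + rng.normal(0, 1, (32, 32))    # smoothed-out area

assert laplacian_variance(sharp) > laplacian_variance(blurry)
```

A low score on the mouth region relative to the rest of the face would be consistent with the smudging the researchers describe, though on its own it proves nothing.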
Posted on June 17, the deepfake video shows Nigel Farage demolishing Rishi Sunak’s house in Minecraft. It is part of a trend of satirical deepfakes depicting politicians playing online games.
A few days later, another simulated video showed Keir Starmer playing Minecraft and setting up traps in “Nigel’s Pub”.
Dr Mhairi Aitken, an ethics researcher at the Alan Turing Institute, the UK’s national institute for data science and AI, says the first sign these Minecraft videos are deepfakes is, of course, the “absurdity of the situation”, but another hallmark of AI-generated and manipulated media is the imperfect synchronization of voice and mouth.
“This is particularly clear in the section where Farage is speaking,” Aitken said.
Another way to tell, Aitken says, is to see if shadows fall in the right places, or if lines and creases in the face move in the way you expect them to.
Ardi Janjeva, a researcher at the institute, added that the low overall resolution of the video is another telltale sign people should look out for, because it “looks like something that was quickly stitched together.” He said people have become accustomed to spotting this amateurish quality thanks to the prevalence of “rudimentary, low-resolution scam email attempts.”
This lo-fi approach also shows up in prominent areas such as the mouth and jawline, he says: “There’s excessive blurring and smudging of the facial features that are the focus of the viewer’s attention, like the mouth.”
Strange elements in the speech
Another deepfake video used audio doctored from Keir Starmer’s 2023 new year message to make him appear to pitch an investment scheme.
If you listen closely, you’ll notice some odd sentence structure: Starmer repeatedly says “pound” before a figure, for example “pound 35,000 per month”.
Aitken said the voice and mouth were again out of sync and the lower half of the face was blurred, adding that the use of “pound” before the figures suggested a text-to-speech tool had probably been used to recreate Starmer’s voice.
“This doesn’t mirror typical spoken language patterns, and suggests a text-to-speech tool was likely used, though this has not been confirmed,” she says. “There are clues in the intonation too, which keeps a fairly monotonous rhythm and pattern throughout. A good way to check a video’s authenticity is to compare the voice, mannerisms and expressions with recordings of the real person to see whether they are consistent.”
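The “pound before the figure” artifact Aitken describes can even be flagged automatically in a transcript: a text-to-speech tool reading “£35,000” literally produces “pound 35,000”, whereas a speaker would say “35,000 pounds”. A hedged sketch of that check; the pattern and function name are illustrative assumptions, not a tool mentioned in the article.

```python
import re

# Heuristic: flag a currency word that precedes a figure, which is how a
# text-to-speech tool reads a written symbol like "£35,000" out loud.
TTS_CURRENCY = re.compile(r"\b(pound|dollar|euro)s?\s+\d[\d,]*", re.IGNORECASE)

def flag_tts_artifacts(transcript: str) -> list[str]:
    """Return suspicious 'currency word before figure' phrases."""
    return [m.group(0) for m in TTS_CURRENCY.finditer(transcript)]

print(flag_tts_artifacts("You could earn pound 35,000 per month with this scheme."))
# prints ['pound 35,000']
```

A natural phrasing such as “35,000 pounds per month” is not flagged, since the currency word follows the figure there.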
Face and body consistency
This deepfake video of Ukrainian President Volodymyr Zelensky calling on civilians to lay down their arms to Russian forces was circulated in March 2022. The head is disproportionately large compared to the rest of the body, and the skin on the neck and face is a different color.
Hany Farid, a professor at the University of California, Berkeley and an expert on deepfake detection, said this is “a classic deepfake.” The immobile body is the telltale sign, he said. “The defining feature of this so-called Puppet Master deepfake is that the body is immobile from the neck down.”
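The immobile-body cue Farid describes can be checked crudely by comparing frame-to-frame motion in different regions of the video: in a “puppet master” deepfake, only the head is animated, so motion energy below the neck stays near zero. A minimal numpy sketch with synthetic frames; the helper name and regions are illustrative assumptions, not a method attributed to Farid.

```python
import numpy as np

def motion_energy(frames: np.ndarray, region: tuple) -> float:
    """Mean absolute frame-to-frame change inside a region of a clip.

    Near-zero values in the body region, alongside normal values in the
    face region, match the immobile-body sign described above.
    """
    rows, cols = region
    return float(np.abs(np.diff(frames[:, rows, cols], axis=0)).mean())

# Synthetic 10-frame, 64x64 grayscale clip: the face area (top rows)
# changes between frames, while the body area (bottom rows) is frozen.
rng = np.random.default_rng(1)
frames = np.zeros((10, 64, 64))
frames[:, :32, :] = rng.uniform(0, 255, (10, 32, 64))  # animated face
face = (slice(0, 32), slice(None))
body = (slice(32, 64), slice(None))

assert motion_energy(frames, face) > motion_energy(frames, body)  # body is static
```

On real footage the regions would come from a face/body detector, and some body motion is normal, so only a stark face-versus-body gap would be suspicious.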
Discontinuities throughout the video clip
The video, which went viral in May 2024, falsely shows U.S. State Department spokesman Matthew Miller telling a reporter that “there are virtually no civilians left in Belgorod,” apparently justifying Ukrainian military strikes on the Russian city of Belgorod. The video was tweeted by the Russian embassy in South Africa and has since been deleted, according to a BBC journalist who monitors Russian media.
Source: www.theguardian.com