My brother put his phone to my ear. “You’re going to think this is creepy,” he warned. An Instagram reel began to play.
The footage, which showed teenage boys attending a rally, included a news broadcast-style narration. “The recent protests by British students have become a powerful symbol of the deepening crisis in Britain’s education system,” said a soft female voice with barely a hint of a Mancunian accent. My eyes widened and I sat up straight.
As a presenter on a YouTube news channel, I was used to hearing my own voice on screen. But this wasn’t me – even though the voice was unmistakably mine.
“They force us to learn about Islam and Muhammad in school,” the voice continued. “Listen, this is disgusting.” It was horrifying to hear my voice associated with far-right propaganda – but more horrifying still to discover how easily such fraud can be perpetrated. As I dug deeper, I learned just how far-reaching the effects of cloned voices can be.
AI voice cloning is an emerging form of audio “deepfake” and the third fastest-growing scam of 2024.
Unwitting victims find that their voices have been convincingly duplicated without their consent, or even their knowledge – a phenomenon that has already allowed bank security checks to be bypassed and deceived people into sending money to strangers they believed to be relatives. My brother was sent the clip by a friend who recognized my voice.
After some research, I traced the clip to a far-right YouTube channel with about 200,000 subscribers. Although the channel claimed to be American, many of the misspellings in its videos were typical of misinformation accounts run by non-native English speakers. I was shocked to find my voice featured in 8 of the channel’s 12 most recent videos; scrolling back, I found one video using my voice from five months earlier that had amassed 10m views.
The voice was almost identical to mine, except for a slightly odd pacing in my speech – the telltale sign of an AI-generated clone.
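That “slightly odd pacing” is one of the simpler cues a listener – or a forensic tool – can check for. As a toy illustration only (this is not any real detection product; the function names and timestamps below are entirely hypothetical), one could compare the rhythm of word onsets in a suspect clip against a known-genuine recording of the same speaker:

```python
# Toy sketch: flag a suspect clip whose word-to-word rhythm deviates
# from a known-genuine reference recording. Real forensic tools are far
# more sophisticated; this only demonstrates the pacing idea.
from statistics import mean, stdev

def pacing_profile(word_starts):
    """Mean and spread of gaps between consecutive word onsets (seconds)."""
    gaps = [b - a for a, b in zip(word_starts, word_starts[1:])]
    return mean(gaps), stdev(gaps)

def pacing_oddness(reference_starts, suspect_starts):
    """Relative deviation of the suspect's average gap from the reference's."""
    ref_mean, _ = pacing_profile(reference_starts)
    sus_mean, _ = pacing_profile(suspect_starts)
    return abs(sus_mean - ref_mean) / ref_mean

# Hypothetical word-onset timestamps: the genuine speaker varies naturally
# around ~0.3s between words; the suspect clip is oddly uniform and fast.
genuine = [0.0, 0.28, 0.61, 0.89, 1.25, 1.52]
suspect = [0.0, 0.20, 0.40, 0.60, 0.80, 1.00]

print(round(pacing_oddness(genuine, suspect), 2))  # prints 0.34
```

A score near zero would mean the suspect clip’s pacing matches the speaker’s natural rhythm; a large score is one (weak, easily fooled) hint of synthesis.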
The increasing sophistication of AI voice cloning software is cause for serious concern. In November 2023, an audio deepfake of London mayor Sadiq Khan allegedly making inflammatory remarks about Armistice Day circulated widely on social media. The clip almost caused “serious disorder”, Khan told the BBC.
“If you’re looking to sow disharmony and cause problems, there’s no better time.” At a moment when confidence in Britain’s political system is already at record lows – 58% of Britons say they have “almost no trust” in politicians to tell the truth – the ability to manipulate public discourse is more damaging than ever.
The legal right to own one’s voice falls into a murky gray area of poorly regulated AI issues. The TV naturalist David Attenborough became the centre of an AI voice-cloning scandal in November, saying he was “profoundly disturbed” to learn that his voice was being used to deliver partisan news broadcasts in the United States. In May, the actor Scarlett Johansson clashed with OpenAI after a text-to-speech model in its product ChatGPT used a voice Johansson described as “eerily similar” to her own.
In March 2024, OpenAI delayed the release of a new voice-cloning tool, deeming it “too risky” to release publicly in a year with a record number of global elections. Some AI startups that let users clone their own voices have since introduced preventive policies, detecting and blocking the creation of voice clones that imitate politicians actively involved in election campaigns, including in the US and UK.
These mitigation measures, however, are not enough. In the US, concerned senators have proposed legislation to crack down on those who clone audio without consent. In Europe, the European Identity Theft Observatory System (Eithos) is developing four tools to help police identify deepfakes, which it hopes to have ready by the end of this year. But tackling the audio crisis is no easy task. Dr Dominic Lees, an expert on AI in film and television who advises a UK parliamentary committee, told the Guardian: “Our privacy and copyright laws are not up to date with what this new technology presents.”
If declining trust in institutions is one problem, creeping distrust among communities is another. Trust is central to human cooperation – and as globalisation advances, our personal and professional lives are more intertwined than ever – yet we have never been so able to undermine it. Hany Farid, a professor of digital forensics at the University of California, Berkeley, and an expert in deepfake detection, told the Washington Post that the consequences of this audio crisis could be as extreme as mass violence or “stealing elections”.
Is there any benefit to this new ability to easily clone voices? Perhaps. AI voice clones could let us seek comfort by connecting with deceased loved ones, or help give a voice to people with medical conditions. The American actor Val Kilmer, who has been treated for throat cancer, returned for Top Gun: Maverick in 2022 with a voice restored by AI. Our capacity to innovate may serve those with evil intentions, but it equally serves those working for good.
When I became a presenter, I happily agreed to share my voice on screen, but I never signed up for anyone who wants it being able to take this essential and precious part of me. As broadcasters, we sometimes worry about how colds and winter viruses will affect our recordings. But my recent experience has given the concept of losing one’s voice a different, far more sinister meaning.
Source: www.theguardian.com