On his weekly HBO show, John Oliver discussed the alarming risks of AI, labeling it “worrisomely corrosive” to our society.
During “Last Week Tonight,” Oliver said that AI generation tools have made it effortlessly simple to clutter social media platforms with cheap, low-quality, and often bizarre content, and that the term “AI slop” has been coined to describe all of it.
He described it as “the latest version of spam,” with peculiar images and videos overwhelming users’ feeds and leaving many people unable to tell that what they are seeing isn’t real.
“It’s highly probable that this content will flood platforms in the near future,” Oliver warned.
The main goal of such content is to grab your attention, and because it is so easy to create, the barriers to entry have dropped significantly.
Meta has jumped into the fray with its own tools and has also refined its algorithm, so that more than a third of the content in your feed now originates from accounts you do not follow. “That’s how the slop infiltrates without your consent,” he noted.
Monetization programs have emerged for those who manage to make their content go viral, and numerous self-styled AI slop gurus now offer to teach people the tricks of the trade for a fee.
This has become “ultimately a spam-like volume game in all forms,” with AI generators appropriating the work of real artists without credit. However, despite the tales of wealth spread by these slop gurus, the amount of money actually involved can be relatively minimal.
It might only be a few hundred dollars, sometimes even less, even for a megaviral hit. Much of this content originates from countries where that money goes further, such as India, Thailand, Indonesia, and Pakistan.
One challenge is having to explain to your parents that the content isn’t genuine. “There’s this really adorable animal, but I can assure you it’s not Moo Deng; it’s AI,” he said.
There are also environmental repercussions from the resources required to produce this content, along with a concerning proliferation of misinformation.
Oliver highlighted numerous fake disasters depicted through images and videos, showcasing tornadoes, explosions, and plane crashes. “Air travel is stressful enough without the creation of new disasters,” he lamented.
AI-generated content was also deployed during the Israel-Iran conflict, and it complicated the work of first responders during last year’s floods in North Carolina, where Republicans likewise exploited it to suggest that Biden was mishandling the crisis.
“It’s a conundrum that the people who have spent the last decade yelling ‘fake news’ are now suddenly so vocal in denouncing actual fake news,” he remarked.
The spread of such fakes wasn’t as damaging as some had feared during last year’s U.S. elections, but AI is “already considerably more advanced than it was at that time.”
He concluded that not only can people be deceived by fakes, but the very existence of fakes may cause them to dismiss authentic videos and images as forgeries created by bad actors.
Oliver argued that all of this contributes to “corroding the very notion of objective reality,” and said it is becoming increasingly difficult to identify AI content on these platforms.
“I’m not suggesting that none of this content is entertaining, but some of it is potentially quite dangerous,” he warned.
Source: www.theguardian.com