Less than three hours after the stabbing that left three children dead on Monday, an AI-generated image was shared on X by the account “Europe Invasion”. The image shows bearded men in traditional Islamic dress standing outside the Houses of Parliament, one of them brandishing a knife, with a crying child in a Union Jack T-shirt behind them.
The tweet has since been viewed 900,000 times and was shared, with the caption “We must protect our children!”, by one of the most prolific accounts spreading misinformation about the Southport stabbing.
AI technology has been used for other purposes too – for example, an anti-immigration Facebook group generated images of large crowds gathering at the Cenotaph in Middlesbrough to encourage people to attend a rally there.
Platforms such as Suno, which uses AI to generate music complete with vocals and instrumentation, have been used to create songs combining references to Southport with xenophobic content, including one titled “Southport Saga”, in which an AI-generated female voice sings lyrics such as “we'll hunt them down somehow”.
Experts warn that with new tactics and new ways of organizing, Britain's fragmented far-right is seeking to unite in the wake of the Southport attack and reassert its presence on the streets.
The violence across the country has led to a surge in activism not seen in years, with more than 10 protests being promoted on social media platforms including X, TikTok and Facebook.
This week, a far-right group's Telegram channel has also carried death threats against the British Prime Minister, incitement to attack government facilities and extreme anti-Semitic comments.
Amid fears of widespread violence, a leading counter-extremism think tank has warned that the far-right risks mobilizing on a scale not seen since the English Defence League (EDL) took to the streets in the 2010s.
The emergence of easily accessible AI tools, which extremists have used to create a range of material from inflammatory images to songs and music, adds a new dimension.
Andrew Rogojski, director of the University of Surrey's Human-Centred AI Institute, said advances in AI, such as image-generation tools now widely available online, mean “anyone can make anything”.
He added: “The ability for anyone to create powerful images using generative AI is of great concern, and the onus then shifts to providers of such AI models to enforce the guardrails built into their models to make it harder to create such images.”
Joe Mulhall, research director at campaign group Hope Not Hate, said the use of AI-generated material was still in its early stages, but it reflected growing overlap and collaboration between different individuals and groups online.
While far-right organizations such as Britain First and Patriotic Alternative remain at the forefront of mobilization and agitation, the presence of a wide range of individuals unaffiliated with any particular group is equally important, he said.
“These are made up of thousands of individuals who, outside of traditional organizational structures, donate small amounts of time and sometimes money to work together toward a common political goal,” Mulhall said. “These movements do not have formal leaders, but rather figureheads who are often drawn from among far-right social media 'influencers.'”
Joe Ondrack, a senior analyst at British disinformation monitoring company Logical, said the hashtag #enoughisenough has been used by some right-wing influencers to promote the protests.
“What's important to note is how this phrase and hashtag has been used in previous anti-immigration protests,” he said.
The use of bots was also highlighted by analysts, with Tech Against Terrorism, an initiative launched by a branch of the United Nations, citing a TikTok account that first began posting content after Monday's Southport attack.
“All of the posts were Southport-related and most called for protests near the site of the attack on July 30th. Despite having no previous content, the Southport-related posts garnered a cumulative total of over 57,000 views on TikTok alone within a few hours,” a spokesperson for the initiative said. “This suggests that a bot network was actively promoting this content.”
At the heart of the network of individuals and groups surrounding far-right activist Tommy Robinson, who fled the country ahead of a court hearing earlier this week, are Laurence Fox, the actor turned right-wing activist who has been spreading misinformation in recent days, and conspiracy websites such as Unity News Network (UNN).
On a channel run by UNN on Telegram, a largely unmoderated messaging platform, some commenters rejoiced at the violence seen outside Downing Street on Wednesday. “I hope they burn it down,” one said. Another called for Prime Minister Keir Starmer to be hanged, saying “Starmer needs the Mussalini [sic] process.”
Among those on the scene during the Southport riots were activists from Patriotic Alternative, one of the fastest-growing far-right groups in recent times. Other groups, including those split over positions on conflicts such as the war in Ukraine and the war in Gaza, are also seeking to get involved.
Dr Tim Squirrell, director of communications at the counter-extremism think tank the Institute for Strategic Dialogue, said the far-right had been seeking ways to rally in the streets over the past year, including on Armistice Day and at screenings of Robinson's film.
“This is an extremely dangerous situation, exacerbated by one of the worst online information environments in recent memory,” he said.
“Robinson remains one of the UK far-right's most effective organizers, but we are also seeing a rise in accounts large and small that have no qualms about aggregating news articles and spreading unverified information that appeals to anti-immigrant and anti-Muslim sentiment.”
“There is a risk that this moment will be used to spark street protests similar to those in the 2010s.”
Source: www.theguardian.com