If a week is a long time in politics, five years between elections feels like an eternity in the UK. The political landscape has changed dramatically since the Conservative Party’s landslide victory in 2019, but so has the social media landscape.
In 2019, TikTok was “the video-sharing app that became phenomenally popular among teenagers,” according to a commentator at The Guardian.
Fast forward to 2023, and an Ofcom investigation found that 10% of people aged 16 and over say they get their news from TikTok, a higher share than BBC Radio 1 and on par with the Guardian, and a significant increase from the 1% recorded in 2020, shortly after the last election.
While some say the so-called battle over TikTok has been exaggerated, the platform’s creators are well aware that there is an audience of TikTok users, young and old, who enjoy political content.
To understand how the 2024 election unfolded on TikTok, we monitored the platform for one hour per day for a week using four separate accounts, searching for the widely used tag “#ukpolitics” as well as campaign-specific hashtags and terms.
Before we begin, a few disclaimers: No one outside TikTok knows how TikTok’s algorithm works, nor do we know whether and how the algorithm can be manipulated to promote certain content.
The platform is also notoriously difficult to measure: there’s no “most popular” section, so the sample is just a snapshot of what people saw on the site for one hour each day for one week over the duration of the campaign.
Straight TikTok: “Traditional” News for a New Audience
If you think of TikTok as all dance crazes, lip-sync challenges, and make-up tutorials, you wouldn’t be wrong – but you’ll also find some familiar faces, including BBC and ITV news anchors, LBC radio presenters, and broadcast journalists.
…
Conspiracy Theorists
We found very few accounts spreading conspiracy theories, at least in the sample we collected, but they do exist.
While we do not intend to help conspiracy theorists by spreading their videos more widely, the topics we saw included false claims that Labour would introduce Sharia law if it came to power.
Again, it is not known why such content was served, but AI Forensics warns that such content could be amplified by a “secret recipe” hidden in the platforms’ algorithms.
“Engagement can be both good and bad, so polarized discussions around extreme views and hate speech can drive up engagement metrics,” Romano said.
At least three accounts initially identified as spreading conspiracy theories were removed during the investigation, though it is unclear whether they were deleted by their owners or taken down by TikTok.
Source: www.theguardian.com