Four British parents who are suing TikTok over the alleged wrongful deaths of their children have expressed concern about the suspected deletion of their children’s data from the platform.
These parents have filed a lawsuit in the US claiming that four children died in 2022 after participating in the “Blackout Challenge,” a viral trend that emerged on social media in 2021.
A week after the lawsuit was filed, TikTok executives said that certain data had been deleted to comply with legal requirements: UK GDPR rules oblige platforms not to retain personal data for longer than necessary.
The parents were surprised by how quickly their children’s data was removed.
“My initial reaction was that it’s a complete lie,” said Lisa Kennevan, whose son Isaac died at 13.
Liam Walsh is skeptical of TikTok’s claim to have deleted data on his daughter Maia, who died at 14, given that the investigation into her death is ongoing. He has issued a statement.
Ellen Room is campaigning in Parliament for the introduction of “Jules’ Law” in memory of her 12-year-old son Julian.
“If you have a physical diary in [your children’s] bedroom, I’m sure you’d read it to understand. Nowadays, they’ve moved online, and social media serves as a diary for kids. So why not examine their online diaries for potential answers?” she remarked.
Because her son Archie Battersbee was 12, Hollie Dance should have automatic rights to his data, yet she has struggled to access it. “There are still three [of his] active accounts. I can see them myself,” she noted.
TikTok has stated that searches related to dangerous challenges have been blocked since 2020. The platform says it aims to remove harmful content preemptively and direct users to safety resources.
Dance mentioned that she has screenshots of dangerous challenges that were easily accessible.
The parents said they wished they had restricted their children’s access to social media, and that they had not realized how limited their rights to their children’s data were.
“Essentially, we’re handing the kids loaded guns,” Kennevan remarked. “A child’s brain isn’t fully developed until around 25. The amount of exposure to content isn’t healthy. They’ve witnessed harmful content, such as porn, at ages 10 and 11. They don’t need social media.”
This year, the Online Safety Act came into force, obliging platforms to take action against illegal or harmful content. Walsh expressed skepticism about Ofcom’s enforcement of it.
Dance suggested that the platform should screen all videos before they are published.
Walsh said that a video of his child was shown in a US court, which took a severe toll on his mental state. He intends to pursue a manslaughter case against the company in the UK courts.
Room explained that the family resorted to a US lawsuit after being unable to file a case in the UK due to legal constraints.
She emphasized making a difference for other families and parents. “It’s challenging and emotionally draining, but we’re going to make an impact here,” she said.
Source: www.theguardian.com