Meta CEO Mark Zuckerberg exits California Superior Court in Los Angeles (Photo: Kyle Grillot/Bloomberg via Getty Images)
As I sit down to write, I instinctively check the calendar on my phone. A notification from a friend catches my eye, leading me to an Instagram meme. Next, I find myself engulfed in an endless scroll of captivating short videos: one about the ravens of the Tower of London, another featuring street food in Indonesia. Before I know it, 45 minutes have slipped away, the content growing steadily more disturbing and more political.
While this time loss hasn’t ruined my day, I can’t shake off the lingering feelings of fatigue and sadness. Where did that lost time go? Why did Instagram lead me to consume countless videos when all I intended to do was check my calendar? Furthermore, why do I feel so drained?
These questions are crucial, and they sit at the heart of lawsuits now heading to trial. Two cases filed in California by thousands of plaintiffs, including school districts and concerned parents, target major social media platforms: Meta (the owner of Facebook and Instagram), Google (YouTube), Snap (Snapchat), ByteDance (TikTok) and Discord. These cases argue that social media poses serious risks to children, leading to psychological harm and potentially fatal consequences. Children often encounter content filled with violence and unrealistic beauty standards, which can lead them down perilous paths.
For over a decade, U.S. lawmakers have suggested that the answer lies in restricting children’s access to social media rather than holding these companies accountable. Some states have enacted laws requiring parental consent for minors creating accounts, while others aim to curb adolescent bullying by prohibiting the display of like counts on posts. These regulations focus primarily on the dangers of content, which in effect leaves the companies themselves off the hook, particularly under the Communications Decency Act’s notorious Section 230, which shields platforms from liability for user-generated content.
Section 230 may have seemed like a pragmatic solution when it was established in the 1990s, predating concerns over doomscrolling, algorithmic manipulation and harmful influencers. With platforms like YouTube seeing some 20 million videos uploaded daily, holding them accountable for every piece of posted content would be untenable.
The U.S. commitment to free speech complicates matters, as companies like Meta and Google easily challenge any regulations perceived as inhibiting access to online expression, even concerning harmful content. Many laws aimed at protecting minors from social media have faced setbacks in courts for conflicting with free speech principles, allowing companies to exploit these laws as a protective shield.
However, the ongoing California lawsuits intriguingly redirect focus away from content and free speech. Instead, they highlight the design of the platforms themselves, citing endless scrolling, incessant notifications, autoplay videos and algorithm-driven lures that ensnare users. The lawsuits argue that these “defects” transform social media apps into “addictive” products, akin to “slot machines”, exploiting young users through algorithmically curated feeds engineered to perpetuate scrolling. The overarching aim of these legal actions is to hold the companies accountable for the harms their products impose on vulnerable groups.
This argument echoes the government’s legal campaign against tobacco companies in the 1990s, which successfully demonstrated that the companies knew the harm their products caused while deliberately obscuring that fact. As a consequence, those companies faced substantial settlements, were mandated to carry warning labels and were pressured to adjust their marketing to be less appealing to children.
Leaked documents from Meta have revealed that the company recognized its products’ addictive nature. In a notable case involving a teenage girl’s suicide linked to social media addiction, a federal judge released internal Instagram communications in which a user experience researcher reportedly remarked: “[Instagram] is a drug…we’re basically pushers.” This speaks to the negligence and recklessness attributed to companies producing potentially harmful products.
Two pivotal trials are now under way, with the potential to dramatically reshape the landscape of social media. Perhaps American law will finally arrive at the conclusion that the core issue isn’t merely the content, but the conduct of the companies that distribute it.
If you or someone you know needs support, please reach out to Samaritans in the UK at 116 123 (samaritans.org) or the U.S. Suicide and Crisis Lifeline at 988 (988lifeline.org). For resources in additional countries, please visit bit.ly/SuicideHelplines.
Source: www.newscientist.com