In the corridors of power in the UK, a well-worn adage holds that scientific advisers should be “on tap, not on top.” This principle, often credited to Winston Churchill, asserts that in a democracy science should inform policymaking rather than dictate it.
This idea became particularly relevant during the Covid-19 pandemic, when British leaders claimed to be “following the science.” However, many critical decisions—like paying individuals to self-isolate or shutting down schools—couldn’t rely solely on scientific guidance. Numerous questions remained unanswered, placing policymakers in a challenging position.
In stark contrast, the Trump administration has been working to dismantle established guidelines from health agencies regarding various issues, from vaccination to cell phone radiation, in pursuit of the “Make America Healthy Again” initiative, all while curtailing scientific research.
“By mid-2027, we should have stronger evidence on the harms of social media.”
But what should policymakers do when scientific understanding is still developing and no immediate global crisis is present? The pressing question is how long they should wait for scientific clarity.
Currently, a significant debate is brewing in various nations regarding the potential ban on social media use for those under 16, as Australia implemented late last year. While public support for such a ban is high, the prevailing scientific evidence indicates that social media’s impact on teens’ mental health is minimal at a population level. Should political leaders disregard this evidence to cater to public opinion?
To do so would align with Churchill’s maxim. Yet, as we explore further, by mid-2027, more reliable evidence regarding social media’s negative influences should emerge from both a randomized trial in the UK and data stemming from Australia’s ban. Thus, the most prudent course of action is to allow scientists the time to gather concrete evidence before implementing significant policy changes. Progress in policy must stem from proactive science—not from its supremacy—and this requires adequate time.
Instagram alerts that accounts for users under 16 will be terminated. Photo: Stringer/AFP via Getty Images
Australia’s groundbreaking social media restrictions on users under 16 have officially started, unveiling some contentious issues from the inaugural day of the new law. Notably, some minors managed to sidestep age verification measures intended to prevent them from accessing their accounts.
This initiative has garnered backing from numerous parents who hope it will mitigate online harassment, promote outdoor activities, and lessen exposure to inappropriate material. However, critics argue that the ban may be ineffective or even counterproductive, as highlighted by a variety of satirical memes.
Andrew Hammond, associated with KJR, a consultancy in Canberra where he oversaw age verification initiatives for the Australian government, is keenly observing how the current situation evolves. He mentioned having spoken to several parents of children covered by the ban, none of whom had lost access to their accounts yet. “Some have reported they circumvented it or haven’t yet been prompted,” Hammond stated, though he anticipates more accounts will be disabled next week.
Meta, the parent company of Instagram and Facebook, began removing accounts about a week ago. A spokesperson affirmed, “As of today, we have disabled all accounts confirmed to be under 16,” adding, “As the social media ban in Australia takes effect, we will preclude access to Instagram, Threads, and Facebook for teenagers known to be under this age and will restrict newcomers under 16 from setting up accounts.”
While Meta did not disclose the specific number of accounts terminated, a representative referred to earlier data indicating that approximately 150,000 users aged 13 to 15 are active on Facebook, and around 350,000 on Instagram in Australia. This implies that at least half a million accounts belonging to young Australians have been deleted on these two platforms alone.
The company stated its dedication to fulfilling its legal responsibilities, yet many of the concerns voiced by community organizations and parents have already manifested on the first day of the ban, according to the spokesperson. These include the risk of isolating vulnerable youth from supportive online communities, nudging them towards less-regulated apps and corners of the web, inconsistent age verification practices, and minimal concern for compliance among many teenagers and their parents.
Mr. Hammond raised further questions, particularly regarding the status of minors under 16 who are vacationing or studying in Australia. The government has clarified that the regulation applies equally to visiting minors. While Australian accounts have been deleted, Mr. Hammond suspects that visitors’ accounts may simply be temporarily suspended. “It’s been merely a few hours since the ban was enacted, so there remains substantial uncertainty about its implementation,” he stated.
Australia and other nations are closely monitoring the repercussions as the law is fully enforced. “We will soon discover how attached minors under 16 are to social media and the actual situation that unfolds,” he said. He speculated that perhaps “they will venture outside to play sports.” Nonetheless, he warned, “if their lives are deeply intertwined with it, we may witness a plethora of attempts to evade these restrictions.”
As Australia readies itself to restrict access to 10 major social media platforms for users under 16, lesser-known companies are targeting the teen demographic, often engaging underage influencers for promotional content.
“With a social media ban on the horizon, I’ve discovered a cool new app we can switch to,” stated one teenage TikTok influencer during a sponsored “collaboration” video on the platform Coverstar.
New social media regulations in Australia take effect on December 10, prohibiting all users under 16 from holding accounts on TikTok, Instagram, Snapchat, YouTube, Reddit, Twitch, Kick, and X.
It remains uncertain how effective this ban will be, as numerous young users may attempt to bypass it. Some are actively seeking alternative social media platforms.
Alongside Coverstar, other lesser-known apps like Lemon8 and Yope have recently surged in popularity, currently sitting at the top two spots in Apple’s lifestyle category in Australia.
The government has stated that the list of banned apps is “dynamic,” meaning additional platforms may be added over time. Experts have voiced concerns that this initiative might lead to a game of “whack-a-mole,” pushing children and teens into less visible corners of the internet.
Dr. Catherine Page Jeffery, a specialist in digital media and technology at the University of Sydney, remarked, “This legislation may inadvertently create more dangers for young people. As they migrate to less regulated platforms, they might become more secretive about their social media activities, making them less likely to report troubling content or harmful experiences to their parents.”
Here’s what we know about some of the apps that kids are opting for.
Coverstar
Coverstar, a video-sharing app based in the U.S., identifies itself as “a new social app for Generation Alpha that emphasizes creativity, utilizes AI, and is deemed safer than TikTok.” Notably, it is not subject to the social media ban and currently holds the 45th position in Apple’s Australian download rankings.
Children as young as 4 can use this platform to livestream, post videos, and comment. For users under 13, the app requires them to record themselves stating, “My name is ____. I give you permission to use Coverstar,” which the app then verifies. Adults are also permitted to create accounts, post content, and engage in comments.
Similar to TikTok and Instagram, users can spend real money on virtual “gifts” for creators during live streams. Coverstar also offers a “premium” subscription featuring additional functionalities.
The app highlights its absence of direct messaging, adherence to an anti-bullying policy, and constant monitoring by AI and human moderators as key safety measures.
Dr. Jennifer Beckett, an authority on online governance and social media moderation at the University of Melbourne, raised concerns regarding Coverstar’s emphasis on AI: “While AI use is indeed promising, there are significant limitations. It’s not adept at understanding nuance or context, which is why human oversight is necessary. The critical question is: how many human moderators are there?”
Coverstar has been contacted for comment.
Lemon8
Lemon8, a photo and video sharing platform reminiscent of Instagram and owned by TikTok’s parent company, ByteDance, has experienced a notable rise in user engagement recently.
Users can connect their TikTok accounts to easily transfer content and follow their favorite TikTok creators with a single click.
However, on Tuesday, Australia’s eSafety Commissioner, Julie Inman Grant, revealed that her office had advised Lemon8 to conduct a self-assessment to ascertain whether it falls under the new regulations.
Yope
Yope, which has only around 1,400 reviews on the Apple App Store, is a “friends-only private photo messaging app” positioned as an alternative to Snapchat after the ban.
Bahram Ismailau, co-founder and CEO of Yope, described the company as “a small team dedicated to creating the ideal environment for teenagers to share images with friends.”
Similar to Lemon8, Australia’s eSafety Commissioner also reached out to Yope, advising a self-assessment. Ismailau informed the Guardian that he had not received any communication but is “prepared to publicly express our overall eSafety policy concerning age-restricted social media platforms.”
He claimed that after conducting a self-assessment, Yope determined it fully meets the law’s exemption for apps designed solely for messaging, email, video calls, and voice calls.
“Yope functions as a private photo messenger devoid of public content,” asserted Ismailau. “It’s comparable in security to iMessage or WhatsApp.”
According to Yope’s website, the app is designed for users aged 13 and above, with those between 13 and 18 required to engage a parent or guardian. However, the Guardian successfully created an account for a fictitious four-year-old named Child Babyface without needing parental consent.
A mobile number is mandatory for account creation.
Ismailau did not address inquiries about under-13 accounts directly but confirmed that plans are underway to update the privacy policy and terms of service to better reflect the app’s actual usage and intended audience.
Red Note
The Chinese app Red Note, also referred to as Xiaohongshu, attracted American users when TikTok faced a temporary ban in the U.S. earlier this year.
Beckett noted that the app might provide a safe space, considering that “Social media is heavily regulated in China, which is reflected in the content requiring moderation.”
“Given TikTok’s previous issues with pro-anorexia content, it’s clear that the platform has faced its own challenges,” she added.
Nonetheless, cybersecurity experts highlight that the app collects extensive personal information and could be legally obligated to share it with third parties, including the Chinese government.
Despite the increasing number of restricted social media services, specialists assert that governments are underestimating children’s eagerness to engage with social media and their resourcefulness in doing so.
“We often overlook the intelligence of young people,” Beckett remarked. “They are truly adept at finding ways to navigate restrictions.”
Anecdotal evidence suggests that some kids are even exploring website builders to create their own forums and chat rooms; alternatives include using shared Google Docs for communication.
“They will find ways to circumvent these restrictions,” Beckett asserted. “They will be clever about it.”
YouTube will fall under the federal government’s ban on social media for users under 16, but its parent company Google has stated that the law “fails to ensure teens’ safety online” and “misunderstands” the way young people engage with the internet.
The communications minister, Anika Wells, responded by emphasizing that YouTube must maintain a safe platform, describing Google’s concerns as “absolutely bizarre.”
In a related development, Guardian Australia has reported that Lemon8, a recently popular social media app not affected by the ban, will restrict its users to those over 16 starting next week. The eSafety Commissioner has previously indicated that the app will be closely scrutinized for a potential ban.
Before Wells’s address at the National Press Club on Wednesday, Google announced it would start signing out underage users from its platform on December 10. However, the company cautioned that this might result in children and their parents losing access to safety features.
Initially, Google opposed the inclusion of YouTube, which had been omitted from the framework, in the ban and hinted it might pursue legal action. Nevertheless, the statement released on Wednesday did not provide further details on that front, and Google officials did not offer any comments.
Rachel Lord, Google’s senior manager of Australian public policy, stated in a blog post that users under 16 could view YouTube videos while logged out, but they would lose access to features that require signed-in accounts, such as “subscriptions, playlists, likes,” and standard health settings like “breaks” and bedtime reminders.
Additionally, the company warned that parents “will no longer be able to manage their teens’ or children’s accounts on YouTube,” including blocking certain channels in content settings.
Lord commented, “This rushed regulation misunderstands our platform and how young Australians use it. Most importantly, this law does not fulfill its promise of making children safer online; rather, it will render Australian children less safe on YouTube.”
While Lord did not address potential legal actions, they expressed commitment to finding more effective methods to safeguard children online.
Wells mentioned at the National Press Club that parents could adjust controls and safety settings on YouTube Kids, which is not included in the ban.
“It seems odd that YouTube frequently reminds us how unsafe the platform is when logged out. If YouTube asserts that its content is unsuitable for age-restricted users, it must address that issue,” she remarked.
Anika Wells will address the National Press Club on Wednesday. Photo: Mick Tsikas/AAP
Wells also acknowledged that the implementation of the government’s under-16 social media ban could take “days or even weeks” to properly enforce.
“While we understand it won’t be perfect immediately, we are committed to refining our platform,” Wells stated.
Wells commended the advocacy of families affected by online bullying or mental health crises, asserting that the amendments would “shield Generation Alpha from the peril of predatory algorithms.” She suggested that social media platforms intentionally target teens to maximize engagement and profits.
“These companies hold significant power, and we are prepared to reclaim that authority for the welfare of young Australians beginning December 10,” Wells asserted.
Meta has informed users of Facebook, Instagram, and Threads about forthcoming changes, as has Snapchat. A Reddit spokesperson contacted by Guardian Australia said they had no new information, while X, TikTok, YouTube, and Kick have neither publicly clarified how they will comply with the law nor responded to inquiries.
Platforms that do not take appropriate measures to exclude users under 16 may incur fines of up to $50 million. Concerns have been raised about the timing and execution of the ban, including questions about the age verification process, and at least one legal challenge is in progress.
The government believes it is essential to signal to parents and children the importance of avoiding social media, even if some minors may manage to bypass the restrictions.
Wells explained that it would take time to impose $50 million fines on tech companies, noting that the eSafety Commissioner will request information from platforms about their efforts to exclude underage users starting December 11, and will scrutinize the data on a monthly basis.
At a press conference in Adelaide on Tuesday, Wells anticipated that additional platforms would be added to the under-16 ban if children were to migrate to sites not currently on the list.
She advised the media to “stay tuned” for updates regarding the Instagram-like app Lemon8, which is not subject to the ban. Guardian Australia understands that the eSafety Commission has communicated with Lemon8, owned by TikTok’s parent company, ByteDance, indicating that the platform will be monitored for potential future inclusion once the plan is enacted.
Guardian Australia can confirm that Lemon8 will restrict its user base to those over 16 starting December 10.
“If platforms like LinkedIn become hubs of online bullying, targeting 13- to 16-year-olds and affecting their mental and physical health, we will address that issue,” Wells stated on Tuesday.
“That’s why all platforms are paying attention. We need to be prompt and flexible.”
In Australia, the crisis support service Lifeline is available on 13 11 14. In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or by email at jo@samaritans.org or jo@samaritans.ie. In the US, call or text the 988 Suicide & Crisis Lifeline on 988, or chat at 988lifeline.org. For further international helplines, visit: befrienders.org
Two Swedish automotive brands, Volvo and Polestar, are spearheading an initiative to urge Brussels to adhere to the established timeline, as Germany escalates its calls on the European Commission to reconsider the 2035 ban on new petrol and diesel vehicles.
They contend that such a decision is merely a temporary fix for the fractures within Germany’s automotive sector, arguing it would both delay the transition to electric vehicles and inadvertently grant an edge to China.
“Delaying the 2035 target is simply a terrible idea. There’s no other way to put it,” stated Michael Lohscheller, the CEO of Polestar, which makes only fully electric vehicles.
“Make no mistake, if Europe fails to spearhead this shift, other nations will take the lead.”
German Chancellor Friedrich Merz has urged European Commission President Ursula von der Leyen to reconsider the 2035 deadline. He advocated for permitting the production of new hybrid and high-efficiency internal combustion engine vehicles beyond the cutoff, noting consumer reluctance towards EVs.
“We are sending the right message to the commission with this letter,” Merz asserted, claiming the German government aims to address climate issues in a “technology-neutral manner.”
From Polestar’s transparent office in Gothenburg, Sweden, Lohscheller is astounded by the current situation.
His attempts to engage in the EU’s year-long “strategic dialogue” concerning the future of the automotive industry were ignored. “I sent two letters and I’m not even sure if there was a response to the second one,” he shared.
Nearby, viewing the expansive Volvo assembly facility in Gothenburg, Håkan Samuelsson, the 74-year-old CEO of Volvo Cars, reflects on the industry landscape.
“I don’t perceive any reason to slow our progress,” he remarked.
Samuelsson compares the opposition faced by the lucrative automotive sector today to the backlash that greeted catalytic converters and seat belts half a century ago.
“If not mandated, probably 30% of our vehicles wouldn’t come equipped with seat belts, and without a requirement, we likely wouldn’t have seen the adoption of catalytic converters either,” he explained.
Volvo CEO Håkan Samuelsson indicated that reversing the 2035 petrol car ban lacks rationale.
Photo: Josefin Stenersen/Guardian
Volkswagen and BMW can pursue their own paths, Samuelsson noted, but easing up on electrification will only widen the gap with China.
“China will establish factories in Hungary, Slovakia, Romania… countries with low labor costs. I doubt we can isolate China from the EU through tariffs. We need to compete directly with them,” he added.
Samuelsson suggested that von der Leyen need not make an immediate decision and could defer it until closer to the deadline. “We have time. Another 10 years is at our disposal.”
Michael Bloss, a German Green member of the European Parliament, remarked that Merz’s requests would “significantly dilute” the contentious EU legislation and “essentially grant a free pass” to internal combustion engines.
The Greens and the two Swedish carmakers argue that extending the lifespan of hybrid vehicles would signal to consumers that electric cars aren’t necessary, thus validating the wider industry’s resistance to the transition.
Lohscheller shares similar thoughts. “China will not remain static. They will assert dominance. If Brussels opts to suspend this [target], saying, ‘We’ll grant you five extra years,’ they are genuinely jeopardizing hundreds of thousands of jobs.”
Polestar CEO Michael Lohscheller asserts that abolishing the 2035 deadline is misguided.
Photo: Josefin Stenersen/Guardian
The articulate, marathon-running executive finds it ludicrous to even contemplate abolishing the 2035 target established just three years prior.
Lohscheller was involved in the initial discussions that led to the EU’s 2022 resolution to phase out the sale of new internal combustion engine cars by 2035, a move celebrated by then-Vice-President Frans Timmermans as a crucial step toward achieving carbon neutrality by 2050.
“During my tenure at Opel, I participated in these meetings and visited Brussels biannually. We debated this extensively,” said the Polestar president.
“I’m a marathon runner; I’ve completed 126 marathons throughout my life. Would I train and decide to run a half marathon because it’s difficult? No.”
Lohscheller, the former CFO of VW and ex-CEO of Opel and the Vietnamese carmaker VinFast, says that Germany, amid its economic challenges, must learn to adapt quickly.
“It’s about mindset, it’s about attitude. Recently, I traveled to China and South Korea and have returned home to Germany.”
“In Germany, the sentiment is clear: everyone wants to safeguard the past, resisting change and striving to maintain the status quo. I’m German, so I can assert this with conviction. In China or the US, the focus is on, ‘What’s the next breakthrough? What’s the next initiative? What’s the next enterprise to launch?’ It’s a significant contrast. The mindsets are fundamentally different.”
Polestar, initially a racing car manufacturer founded in 1996, was acquired by Volvo in 2015, restructured in 2017, and relaunched as an independent EV manufacturer. Geely, Volvo’s Chinese owner, now holds a majority stake.
When questioned if Chinese ownership might create unease in Brussels regarding Volvo’s stance, Mr. Samuelsson reiterated that Volvo remains a Swedish entity. “We were part of Ford for 11 years, now we’re in our 14th or 15th year at Geely, and we’re experiencing significant growth. We’re listed on the Swedish stock exchange, adhering to European regulations. We’re Swedish. We are no more Chinese than we are American. We are as Swedish as Abba or Ikea.”
He emphasized that the EU must continue to expedite electrification, asserting its vital role in the future. Polestar has developed a vehicle capable of traveling 560 miles (900 km) on a single charge.
Samuelsson revealed that Volvo has five fully electric vehicles and is on the verge of introducing the EX60, an electric version of its top-selling XC60, already offering a range of 310 to 370 miles.
This approach addresses one of the three primary concerns consumers have when purchasing EVs, noted Samuelsson. The second concern pertains to charging time, which he believes should be reduced to 15 to 20 minutes—akin to the brief breaks drivers typically take for coffee, restroom, or stretching at a rest stop. “In the future, there will be no issue,” he asserted.
“The third obstacle hindering consumer adoption is price,” he continued.
“[If] we in the automotive sector can address these three necessities, the adoption rate for EVs will escalate. Therefore, I see no reason to question whether 2035 is too early. We’ve got time. Our goal should be to accelerate, not decelerate.”
Samuelsson also criticized the ongoing discourse surrounding net zero, arguing that it’s not reflective of real-world progress.
“As I follow the climate policy debates in Brazil, I can’t help but ponder whether all this discussion is genuinely advancing climate improvement?”
“I find myself increasingly inclined to believe that technological advancement and innovation are what we truly need to facilitate progress. Mere discussion won’t suffice.”
“Electrification is the effective solution. It benefits the environment, which is crucial. Moreover, it also appeals to customers. It’s one of the rare green innovations that consumers are enthusiastic about as well.”
Roblox maintains that Australia’s forthcoming social media restrictions for users under 16 should not extend to its platform, as it rolls out a new age verification feature designed to block minors from communicating with unknown adults.
The feature, which is being launched first in Australia, allows users to have their age estimated using Persona age estimation technology built into the Roblox app. This uses the device’s camera to analyze facial features and provide a live age assessment.
This feature will become compulsory in Australia, the Netherlands, and New Zealand starting the first week of December, with plans to expand to other markets in early January.
After completing the age verification, users will be categorized into one of six age groups: under 9, 9-12, 13-15, 16-17, 18-20, or 21 and older.
Roblox has stated that users within each age category will only be able to communicate with peers in their respective groups or similarly aged groups.
These changes were initially proposed in September and received positive feedback from Australia’s eSafety Commissioner, who has been in discussions with Roblox for several months regarding safety concerns on the platform, labeling this as a step forward in enhancing safety measures.
A recent Guardian Australia investigation revealed a week’s worth of virtual harassment and violence experienced by users who had set their profiles as eight years old while on Roblox.
Regulatory pressure is mounting for Roblox to be included in Australia’s under-16 social media ban, set to take effect on December 10. Although there are exemptions for gaming platforms, Julie Inman Grant stated earlier this month that eSafety is reviewing chat and messaging functions in games.
“If online gameplay is the primary or sole purpose, would kids still utilize the messaging feature for communication if it were removed? Probably not,” she asserted.
During a discussion with Australian reporters regarding these impending changes, Roblox’s chief safety officer, Matt Kaufman, characterized Roblox as an “immersive gaming platform.” He explained, “I view games as a framework for social interaction. The essence lies in bringing people together and spending time with one another.”
When asked if this suggests Roblox should be classified as a social media platform subject to the ban, Kaufman responded that Roblox considers social media as a space where individuals post content to a feed for others to view.
“People return to look at the feed, which fosters a fear of missing out,” he elaborated. “It feels like a popularity contest that encapsulates social media. In contrast, Roblox is akin to two friends playing a game after school together. That’s not social media.”
“Therefore, we don’t believe that Australia’s domestic social media regulations apply to Roblox.”
When questioned if the new features were introduced to avoid being encompassed in the ban, Kaufman stated that the company is engaged in “constructive dialogue” with regulators and that these updates showcase the largest instance of a platform utilizing age verification across its entire user base.
Persona, the age verification company partnering with Roblox, participated in the Australian government’s age assurance technology trial, which reported a false positive rate of 61.11% for 15-year-olds incorrectly identified as 16 or older, and 44.25% for 14-year-olds.
Kaufman explained that the technology is generally accurate to within a year or two, and that users who disagree with the assessment could correct it using a government ID, or have their age set via parental controls. He assured that there are “strict requirements” for data deletion after age verification. Roblox states that ID images will be retained for 30 days for purposes such as fraud detection and then erased.
Users who opt not to participate in the age verification will still have access to Roblox, but they will be unable to use features like chat.
More than 150 million people globally engage with Roblox every day across 180 countries, including Australia. According to Kaufman, two-thirds of users are aged 13 and above.
Increasing concerns have been raised regarding the federal government’s need to tackle the dangers that children face on the widely-used gaming platform Roblox, following a report by Guardian Australia that highlighted a week of incidents involving virtual sexual harassment and violence.
While role-playing as an 8-year-old girl, the reporter encountered a sexualized avatar and faced cyberbullying, acts of violence, sexual assault, and inappropriate language, despite having parental control settings in place.
From December 10, platforms including Instagram, Snapchat, YouTube, and Kick will be under Australia’s social media ban preventing Australians under 16 from holding social media accounts, yet Roblox will not be included.
Independent councillor Monique Ryan labeled this exclusion as “unexplainable.” She remarked, “Online gaming platforms like Roblox expose children to unlimited gambling, cloned social media apps, and explicit content.”
At a press conference on Wednesday, eSafety Commissioner Julie Inman Grant stated that platforms would be examined based on their “singular and essential purpose.”
“Kids engaging with Roblox currently utilize chat features and messaging for online gameplay,” she noted. “If online gameplay were to vanish, would kids still use the messaging feature? Likely not.”
“If these platforms start introducing features that align them more with social media companies rather than online gaming ones, we will attempt to intervene.”
According to government regulations, services primarily allowing users to play online games with others are not classified as age-restricted social media platforms.
Nonetheless, some critics believe that this approach is too narrow for a platform that integrates gameplay with social connectivity. Nyusha Shafiabadi, an associate professor of information technology at Australian Catholic University, asserts that Roblox should also fall under the ban.
She highlighted that the platform enables players to create content and communicate with one another. “It functions like a restricted social media platform,” she observed.
Independent MP Nicolette Boele urged the government to rethink its stance. “If the government’s restrictions bar certain apps while leaving platforms like Roblox, which has been called a ‘pedophile hellscape’, unshielded, we will fail to safeguard children and drive them into more perilous and less regulated environments,” she remarked.
A spokesperson for the communications minister, Anika Wells, said that excluding Roblox from the teen social media ban does not mean it is free from accountability under the Online Safety Act.
A representative from eSafety stated, “We can extract crucial safety measures from Roblox that shield children from various harms, including online grooming and sexual coercion.”
eSafety declared that by the year’s end, Roblox will enhance its Age Verification Technology, which restricts adults from contacting children without explicit parental consent and sets accounts to private by default for users under 16.
“Children under 16 who enable chat through age estimation will no longer be permitted to chat with adults. Alongside current protections for those under 13, we will also introduce parental controls allowing parents to disable chat for users between 13 and 15,” the spokesperson elaborated.
Should entities like Roblox not comply with child safety regulations, authorities have enforcement capabilities, including fines of up to $49.5 million.
eSafety stated it will “carefully oversee Roblox’s adherence to these commitments and assess regulatory measures in the case of future infractions.”
Joanna Orlando, an expert on digital wellbeing from Western Sydney University, pointed out that Roblox’s primary safety issues are grooming threats and the increasing monetization of children engaging with “the world’s largest game.”
She mentioned that it is misleading to view it solely as a video game. “It’s far more significant. There are extensive social layers, and a vast array of individuals on that platform,” she observed.
The Greens’ communications spokesperson, Sarah Hanson-Young, criticized the government for “playing whack-a-mole” with the social media ban.
“We want major technology companies to assume responsibility for the safety of children, irrespective of age,” she emphasized.
“We need to strike at these companies where it truly impacts them. That’s part of their business model, and governments hesitate to act.”
Shadow communications minister Melissa Mackintosh also expressed her concerns about the platform. She stated that while Roblox has introduced enhanced safety measures, “parents must remain vigilant to guard their children online.”
“The eSafety Commissioner and the government carry the responsibility to do everything within their power to protect children from the escalating menace posed by online predators,” she said.
A representative from Roblox stated that the platform is “dedicated to pioneering safety through stringent policies that surpass those of other platforms.”
“We utilize AI to scrutinize games for violating content prior to publication, we prohibit users from sharing images or videos in chats, and we implement sophisticated text filters designed to prevent children from disclosing personal information,” they elaborated.
The Duke and Duchess of Sussex have joined forces with AI innovators and Nobel laureates to advocate for a moratorium on the advancement of superintelligent AI systems.
Prince Harry and Duchess Meghan are signatories of a declaration urging a halt to the pursuit of superintelligence. Artificial superintelligence (ASI) refers to as-yet unrealized AI systems that would surpass human intelligence across any cognitive task.
The declaration requests that the ban remain until there is a “broad scientific consensus” and “strong public support” for the safe and controlled development of ASI.
Notable signatories include AI pioneer and Nobel laureate Geoffrey Hinton, along with fellow “godfather” of modern AI, Yoshua Bengio, Apple co-founder Steve Wozniak, British entrepreneur Richard Branson, Susan Rice, former national security adviser under Barack Obama, former Irish president Mary Robinson, and British author Stephen Fry. Other Nobel winners, including Beatrice Fihn, Frank Wilczek, John C. Mather, and Daron Acemoglu, also added their names.
The statement targets governments, tech firms, and legislators, and was sponsored by the Future of Life Institute (FLI), a US-based group focused on AI safety. It called for a moratorium on the development of powerful AI systems in 2023, coinciding with the global attention that ChatGPT brought to the matter.
In July, Mark Zuckerberg, CEO of Meta (parent company of Facebook and a key player in U.S. AI development), remarked that the advent of superintelligence is “on the horizon.” Nonetheless, some experts argue that the conversation around ASI is more about competition among tech companies, which are investing hundreds of billions into AI this year, rather than signaling a near-term technological breakthrough.
Still, FLI warns that achieving ASI “within the next 10 years” could bring significant threats, such as widespread job loss, erosion of civil liberties, national security vulnerabilities, and even existential risks to humanity. There is growing concern that AI systems may bypass human controls and safety measures, leading to actions that contradict human interests.
A national survey conducted by FLI revealed that nearly 75% of Americans support stringent regulations on advanced AI. Moreover, 60% believe that superhuman AI should not be developed until it can be demonstrated as safe or controllable. The survey of 2,000 U.S. adults also found that only 5% endorse the current trajectory of rapid, unregulated development.
Leading AI firms in the U.S., including ChatGPT creator OpenAI and Google, have set the pursuit of artificial general intelligence (AGI)—a hypothetical state where AI reaches human-level intelligence across various cognitive tasks—as a primary objective. Although AGI is a less advanced ambition than ASI, many experts caution that an AGI capable of improving itself could rapidly develop into superintelligence, with unpredictable consequences for the modern job market and beyond.
When the Australian Christian College, a secondary school situated in Melbourne’s Casey suburb, enforced a mobile phone ban, it was driven by numerous factors. There was an escalation in peer conflicts online, students had difficulty maintaining focus, and teachers noticed students engaging in “code-switching on notifications.”
Caleb Peterson, the school’s principal, stated, “When a phone is within arm’s reach, a student’s attention is only half in the room. We aimed to reclaim their full attention.”
Typically, phone bans in schools require that devices be stored in bags or lockers during class hours; devices found in use are confiscated and held in the school office until the end of the day. This month marks two years since the introduction of phone bans across many Australian states. Victoria pioneered the move, prohibiting mobile phone use in public primary and secondary schools back in 2020. By the close of the fourth term in 2023, Western Australia, Tasmania, New South Wales, and South Australia had implemented similar measures, with Queensland limiting mobile phone use from early 2024.
The announcement of the ban received endorsement from both parents and politicians, many of whom contended that restricting access to phones enhances focus and minimizes distractions, though some experts expressed doubts concerning its efficacy. Two years later, what has truly transpired within Australia’s phone-free schools?
At a high school in New South Wales, students’ mobile phones are being stored in a container after being “checked in.” Photo: Stephen Safoir/AAP
“The effects have been evident,” Peterson remarked. “Post-ban, we’ve enhanced class beginnings, diminished disruptions, and improved class dynamics. Conflicts related to devices have reduced, and recess and lunch have transformed. We now see games, conversations, and positive interactions among students and staff. That’s the atmosphere young people seek.”
Research from South Australia—released earlier this March—indicated that 70% of educators noticed increased focus and engagement during learning periods, while 64% noted “a reduction in the rate of serious incidents” attributable to device usage.
Lucaya, a graduate from a western Sydney high school in 2024, views the ban as an “overreaction.” Having experienced both unrestricted cell phone use and the ban during her final year, she reports that students still find covert ways to use their devices.
“Teenagers regard cell phones as vital,” she asserts. “It provides them with a sense of safety and security. Denying them something that holds such significance will only exacerbate stress and anxiety, complicating matters for teachers and administrators.”
Several students believe that the removal of cell phones from the classroom has curtailed their options to cheat. Photo: Mike Bowers/The Guardian
Nevertheless, anecdotal evidence from dialogues with students and staff across various public and private institutions suggests a general consensus that the ban has yielded positive outcomes. An anonymous high school teacher noted that simply having mobile phones present in classrooms can prove distracting, even if not actively used. “They simply offer opportunities,” she commented. “You can distinctly notice the difference in their absence.”
Many students believe the ban has created a more equitable learning environment. Amy, a Year 11 student at a public high school in Sydney’s west, remarked that eliminating mobile phones in classrooms has curtailed misbehavior while also fostering social connections for those who spend excess time online.
“Students [feel more at ease],” she stated. “It fosters a safe environment where we don’t have to stress about people sharing pictures of us.”
Mariam, a Year 11 student at a public high school in Sydney’s south, felt that the phone ban was “unjust” and claimed that teachers occasionally used it to exert authority, but admitted it positively influenced learning outcomes. Aisha, a Year 11 student from a private Islamic school in Sydney’s west, noted that the phone ban has helped her “maintain attention longer and perform better academically.”
Dr. Tony Mordini, principal of Melbourne High School, a public selective institution, has observed this heightened attention firsthand. His school adopted a no-phone policy in January 2020, following guidelines from the Victorian Department of Education.
“From a professional perspective, this ban has clearly had a beneficial impact,” he stated. “Students exhibit increased focus during lessons and are less sidetracked by online distractions. Furthermore, the absence of phones has significantly curtailed opportunities for cyberbullying and harassment in classrooms.”
However, Mordini acknowledges that the ban also curtails certain student opportunities.
“It’s crucial to recognize what we’ve surrendered,” he remarks. “Mobile phones can serve as powerful educational tools, capable of storing extensive content, assisting with research, capturing photographs, creating videos, and hosting valuable applications. Lacking a mobile phone necessitates reliance on the traditional resources and devices provided by the school.”
Professor Neil Selwyn from Monash University’s School of Education, Culture, and Society, stated, “We’ve been informed that banning phones will curb cyberbullying, enhance concentration in class, and reduce the need for teachers to discipline for phone misuse.” Some politicians promised to boost student learning and mental health, but a significant impetus behind these bans was their popularity.
He suggested that schools might be serving as a stand-in for wider concerns about children and their device usage, and questioned whether schools are the best place to address them.
“Young people spend a significant amount of time outside school, thus parents and families must engage in discussions on regulating their children’s device usage at home,” he emphasizes. “Regrettably, this isn’t a priority for most policymakers, so enacting phone bans in schools feels like an easy way to address the broader issue of excessive digital device use.”
Selwyn indicated that Australia’s phone bans were not implemented “with the intent of thoroughly investigating their effectiveness” and termed specific research into this field “not conclusive or particularly rigorous.”
He further asserted that recent government data from New South Wales and South Australia is “not particularly illuminating.”
“The critical concern remains how these bans will affect us over time,” he noted. “Claims suggesting these bans suddenly result in dramatic improvements may sound politically appealing, but the tangible impact of these bans necessitates more comprehensive and ongoing investigation.
“We must go beyond merely asking principals if they believe student learning has enhanced. We need to enter classrooms and engage students and teachers about their varied experiences with the ban, and the potential benefits they foresee moving forward.”
He referenced a recent UK study of 30 schools and over 1,200 students which concluded that “students in schools devoid of smartphones showed no notable differences in mental health, sleep, academic performance in English or mathematics, or even disruptive behavior in class.”
“Phone bans are not a silver bullet, but they serve as an important tool,” Peterson comments. Photo: Dan Peled/AAP
“While some studies imply a connection between phone bans and improved academic performance, they are not deemed to provide reliable evidence of direct causation,” he states. “It would be imprudent to assume a phone ban would singularly and significantly rectify these issues.”
Peterson takes care not to “exaggerate” the ban’s implications but asserts that it aims to “foster conditions conducive to successful learning and friendships.” While the school exempts applications for medical management, disability support, and assistive translation, he contends that academic flow is enhanced, conflict is reduced, and social cohesion is improved. His school’s “health metrics” indicate “lessened psychological distress.”
“Phone bans are not a panacea,” he notes. “However, they are a valuable resource, particularly when paired with digital citizenship, mental health advocacy, and positive playground initiatives.”
Peterson conveyed that numerous students suggested the ban offers them a “reprieve.”
“Phone bans have now simply become the norm, with real and modest benefits that are genuinely worthwhile.”
The concept of genetically modifying wild lions sparks debate
Is there a need to genetically modify wild lions? While it may seem unnecessary, consider a scenario where a devastating disease, introduced by humans, threatens their survival. What if genetic alterations could boost immunity against the disease, conferring a resistance that natural selection might eventually produce on its own, but only after many more lions perish?
This debate is fracturing the environmentalist community, with discussions set to intensify. Next week, at a meeting of the International Union for Conservation of Nature (IUCN)—the leading conservation organization—delegates will vote on a proposal to “suspend” genetic engineering in wildlife, including the introduction of modified microorganisms.
“I’m uncertain how the voting will unfold,” says Piero Genovesi from the Italian Institute of Environmental Protection, who backs an open letter opposing the proposal.
While the IUCN’s moratorium on synthetic biology carries no legal weight, it may still have significant repercussions. Various conservation organizations might halt projects involving genetic engineering, and some nations could incorporate such restrictions into their laws.
“Moratoriums would undoubtedly pose challenges on various fronts,” states Ben Novak, of the US-based nonprofit Revive & Restore, which aims to leverage biotechnology for the recovery of endangered and extinct species.
Why is this issue gaining attention now? The answer lies in CRISPR. In 2014, the potential for gene drives using CRISPR technology was demonstrated. Gene drives allow specific DNA segments to be passed down through generations, enabling them to spread even if detrimental. This technology could theoretically eliminate invasive species or spread beneficial traits like disease resistance.
Discussions emerged at a 2016 conference in Hawaii regarding employing gene drives to eradicate invasive mosquitoes that have decimated Hawaii’s native bird species, according to Genovesi. Reactions were mixed; some were enthusiastic, while others expressed deep concern.
This tension led to the proposed moratorium. “Gene drives are being promoted by some as a one-size-fits-all solution to environmental issues,” says Ricarda Steinbrecher of EcoNexus, an organization advocating for the moratorium.
However, the broad language of the proposed motion could affect much more than just gene drives. It might unintentionally restrict passive conservation efforts and the use of live vaccines.
Steinbrecher suggests the moratorium is a temporary halt, indicating another vote may take place later “when more data becomes available.” However, with many proponents of the ban being staunchly against genetic engineering, changing their perspectives may be challenging. “I’m concerned it could lead to an extended pause,” Genovesi states.
Imagine the prospect of using gene editing to make wild animals disease-resistant. While Steinbrecher raises concerns about unintended consequences, current evidence suggests the risks remain low. This is why some genetically edited foods are already being consumed, and the first CRISPR therapy received approval last year.
The same considerations regarding benefits and risks are applicable to conservation efforts. For instance, is it preferable to witness global warming decimating coral reefs rather than releasing genetically engineered symbiotic algae to enhance coral heat tolerance?
The scalability of such endeavors is crucial, asserts Novak. Manual transplanting of corals will not be enough to salvage the reefs. “Synthetic biology tools are essential for achieving the broad objective of restoring 30% of land and saving seed varieties,” he emphasizes.
Ultimately, this discourse revolves around conflicting visions of nature. Some regard it as a pristine entity, wary of genetic modification. Nonetheless, humans have already altered nature significantly. Our actions have unintentionally interfered with genetic selection through practices like hunting, pollution, pesticide use, and the introduction of invasive species and diseases.
These actions necessitate adaptations among many species for their survival; for instance, specific elephant populations are now nearly devoid of tusks.
However, this does not imply that further interference will yield positive outcomes. The release of gene drives carries significant risks, such as their potential spread beyond intended targets.
Researchers are cognizant of these hazards. Methods like self-limiting gene drives can be implemented to prevent unrestrained gene dispersion.
“We are confronted with a severe biodiversity crisis,” Genovesi argues. “We shouldn’t close ourselves off to innovative tools that could assist us in combatting substantial threats.”
The former Meta executive, who authored a provocative book highlighting social media companies’ interactions with China and their treatment of teenagers, is reportedly facing bankruptcy after its release.
Lawmakers have contended that Mark Zuckerberg’s company is trying to “silence and punish” Sarah Wynn-Williams, the former director of global public policy at Facebook, Meta’s predecessor.
The former Labour transport secretary Louise Haigh stated that Wynn-Williams may incur a fine of $50,000 (£37,000) for each breach of an order obtained by Meta.
In her book, Careless People, published this year, Wynn-Williams made several claims about the conduct and culture of the social media firm, including allegations of sexual harassment, which the company denied. Meta says her dismissal was due to “poor performance and toxic behavior.”
Nevertheless, the former diplomat has been barred from promoting the memoir after Meta secured a ruling against her. She later testified before the US Senate judiciary subcommittee, claiming Meta worked “hand in glove” with Beijing on censorship tools.
Pan Macmillan, which published the memoir, reported over 150,000 copies sold across all formats. The book was also recognized as a Sunday Times bestseller in Hardback for 2025, with a paperback edition due for release early next year.
Haigh raised Wynn-Williams’s situation during a House of Commons debate on employment rights on Monday, asserting that her decision to speak out has placed her in significant financial jeopardy.
“Despite previous official statements indicating that Meta had ceased using NDAs [non-disclosure agreements] in cases of sexual harassment,” she noted, “Sarah is being pushed towards financial ruin within the UK arbitration system.
“Meta has given Sarah a disturbing order and is gearing up to impose a $50,000 fine for any violations. She is on the brink of bankruptcy, and I am confident that the House and the government will back legislation to protect individuals with moral courage.”
It’s understood that the $50,000 figure pertains to damages Wynn-Williams must pay for violating a separation agreement she signed when leaving Meta in 2017, with Meta asserting that she voluntarily agreed to the terms.
Meta indicated that, as of now, Wynn-Williams has not been compelled to adhere to the contract.
The company declined to comment on Haigh’s intervention. Meta has previously said Wynn-Williams’s Senate testimony was “divorced from reality” and riddled with false claims.
Meta characterized the book as “an outdated, previously reported compilation of company claims and unfounded allegations against executives.” The company says she was dismissed for “poor performance and toxic behavior,” with investigations concluding that she made misleading harassment allegations.
The ruling that barred her memoir’s publication affirmed that “the false narrative should never have seen the light of day.”
The order dictated that Wynn-Williams must halt promotion of the book and minimize any further publications, though no actions were mandated against Pan Macmillan.
Since her Senate hearing in April, Wynn-Williams has remained publicly silent. In a statement this month, she expressed gratitude for the continued investigation into Meta’s actions by the US Senate.
“I wish I could elaborate,” she stated. “I urge other tech employees and potential whistleblowers to share their insights before more harm comes to children.”
Her attorney mentioned that Wynn-Williams “will remain silent regarding the matters currently under Congressional investigation.”
In a few months, Australian teenagers may face restrictions on social media access until they turn 16.
As the December implementation date approaches, parents and children are left uncertain about how this ban will be enforced and how online platforms will verify users’ ages.
Experts are anticipating troubling outcomes, particularly since the technology used by social media companies to determine the age of users tends to have significant inaccuracies.
From December 10th, social media giants like Instagram, Facebook, X, Reddit, YouTube, Snapchat, and TikTok are required to remove or deactivate any accounts for users under 16 in Australia. Failing to comply could result in fines reaching up to $49.5 million (around $32 million USD), while parents will not face penalties.
Prior to the announcement of the ban, the Australian government initiated a trial of age verification technology, which released preliminary findings in June, with a comprehensive report expected soon. The study tested age verification tools on over 1,100 students across the country, including Indigenous and ethnically diverse groups.
Andrew Hammond from KJR, the consulting firm based in Canberra that led the trial, shared an anecdote illustrating the challenge at hand. One 16-year-old boy’s age was inaccurately guessed to be between 19 and 37.
“He scrunched up his face and held his breath, turning red and puffy like an angry older man,” he said. “He didn’t do anything wrong; we wanted to see how our youth would navigate these systems.”
Other technologies have also been evaluated with Australian youth, such as hand gesture analysis. “You can estimate someone’s age broadly based on their hand appearance,” Hammond explains. “While some children felt uneasy using facial recognition, they were more comfortable with hand assessments.”
The interim report indicated that age verification could be safe and technically viable, noting that while challenges exist, 85% of subjects’ ages could be accurately estimated within an 18-month range. If a person initially verified as being over 16 is later suspected to be under that age, they must undergo more rigorous verification processes, including checks against government-issued IDs or parental verification.
Hammond noted that some underage users can still be detected through social media algorithms. “If you’re 16 but engage heavily with 11-year-old party content, it raises flags that the social media platform should consider, prompting further ID checks.”
Iain Corby from the London Association of Age Verification Providers, which supported the Australian trial, pointed out that no single solution exists for age verification.
The UK recently mandated age verification on sites hosting “harmful content,” including adult material. Since the regulations went into effect on July 25th, around 5 million users have been verifying their ages daily, according to Corby.
“In the UK, the requirement is for effective but not foolproof age verification,” Corby stated. “There’s a perception that technology will never be perfect, and achieving higher accuracy often requires more cumbersome processes for adults.”
Critics have raised concerns about a significant loophole: children in Australia could use virtual private networks (VPNs) to bypass the ban by simulating locations in other nations.
Corby emphasized that social media platforms should monitor traffic from VPNs and assess user behavior to identify potential Australian minors. “There are many indicators that someone who appears to be in Thailand is actually in Perth,” he remarked.
Apart from how age verification will function, is this ban on social media the right approach to safeguarding teenagers from online threats? The Australian government asserted that significant measures have been implemented to protect children under 16 from the dangers associated with social media, such as exposure to inappropriate content and excessive screen time. The government believes that delaying social media access provides children with the opportunity to learn about these risks.
Various organizations and advocates aren’t fully convinced. “Social media has beneficial aspects, including educational opportunities and staying connected with friends. It’s crucial to enhance platform safety rather than impose bans that may discourage youth voices,” stated UNICEF Australia on its website.
Susan McLean, a leading cyber safety expert in Australia, argues that the government should concentrate on harmful content and the algorithms that promote such material to children, expressing concern that AI and gaming platforms have been exempted from this ban.
“What troubles me is the emphasis on social media platforms, particularly those driven by algorithms,” she noted. “What about young people encountering harmful content on gaming platforms? Have they been overlooked in this policy?”
Lisa Given from RMIT University in Melbourne explained that the ban fails to tackle issues like online harassment and access to inappropriate content. “Parents may have a false sense of security thinking this ban fully protects their children,” she cautioned.
The rapid evolution of technology means that new platforms and tools can pose risks unless the underlying issues surrounding harmful content are addressed, she argued. “Are we caught in a cycle where new technologies arise and prompt another ban or legal adjustment?” Additionally, there are concerns that young users may be cut off from beneficial online communities and vital information.
The impact of the ban will be closely scrutinized post-implementation, with the government planning to evaluate its effects in two years. Results will be monitored by other nations interested in how these policies influence youth mental health.
“Australia is presenting the world with a unique opportunity for a controlled experiment,” stated Corby. “This is a genuine scientific inquiry that is rare to find.”
The White House launched its official TikTok account on Tuesday, even as Donald Trump continues to permit China-owned platforms to operate in the US, despite legislation necessitating their sale.
The inaugural post on the account was a 27-second clip captioned “America is back! What’s up, TikTok?”
Within an hour of the video’s release, the account gathered around 4,500 followers. Meanwhile, Trump’s personal TikTok account boasts 15.1 million followers, though his last post dates back to November 5, 2024, Election Day.
Trump has expressed a fondness for popular apps and believes their influence helped secure his support among younger voters during his victory over Democratic opponent Kamala Harris in the November 2024 presidential election.
Announcing the account’s launch, White House spokesperson Karoline Leavitt said: “The Trump administration is dedicated to showcasing the historic achievements that President Trump has delivered to Americans through as many platforms and viewers as possible.”
A federal law mandating TikTok’s sale on national security grounds, under threat of a ban, took effect on January 19, the day before Trump’s inauguration.
However, Trump, who leaned heavily on social media during the 2024 election campaign and now advocates for TikTok, has opted against enforcing the ban.
TikTok remains immensely popular, with 170 million users in the US. The platform’s parent company, ByteDance, revealed in April that discussions with the US government regarding potential solutions for the app were ongoing, noting that any deal would be “subject to approval under Chinese law.”
In mid-June, Trump extended TikTok’s deadline by a further 90 days, the third such extension, to allow a non-Chinese buyer to be identified or else face a ban in the US.
This extension is expected to expire in mid-September.
Trump initially proposed the idea of banning TikTok in 2020, voicing concerns that China-owned apps posed a national security threat. The issue quickly garnered bipartisan support, leading Congress to overwhelmingly pass a vote to ban the app last year. The original deadline for the TikTok ban was set for January 19.
After joining the platform during his presidential campaign last year, Trump reversed his position on TikTok, amassing nearly 15 million followers and even hosting TikTok CEO Shou Zi Chew at his Mar-a-Lago estate in Florida; Chew later attended Trump’s inauguration.
Trump had long maintained that a ban or sale was necessary, but after changing his viewpoint he pledged to support the platform, believing it helped him gain traction with young voters during the November election.
While Trump’s official account on X (formerly Twitter) boasts 108.5 million followers, his preferred social media platform is the one he owns, Truth Social, where he has 10.6 million followers.
The official White House accounts on X and Instagram have 2.4 million and 9.3 million followers, respectively.
Australians using social media platforms such as Facebook, Instagram, YouTube, Snapchat, and X will need to verify that they are over 16 years old ahead of the social media ban set to commence in early December.
Beginning December 10, new rules will apply to platforms the government defines as “age-restricted social media platforms”: services whose significant purpose is to enable online social interaction between two or more users, and which allow users to post content to the service.
The government has not specified which platforms are included in the ban, implying that any site fitting the above criteria may be affected unless it qualifies for the exemptions announced on Wednesday.
Prime Minister Anthony Albanese noted that platforms covered by these rules include, but aren’t limited to, Facebook, Instagram, X, Snapchat, and YouTube.
Communications Minister Anika Wells indicated that platforms are expected to deactivate accounts held by users under 16 and take reasonable steps to prevent under-16s from creating new accounts, misrepresenting their age, or bypassing the restrictions.
What is an Exemption?
According to the government, a platform will be exempt if its primary purpose is something other than social interaction, such as:
Messaging, email, voice, or video calling.
Playing online games.
Sharing information about products or services.
Professional networking or development.
Education.
Health.
Communication between educational institutions and students or their families.
Facilitating communication between healthcare providers and their service users.
Determinations regarding which platforms meet the exemption criteria will be made by the eSafety Commissioner.
In practice, this suggests that platforms such as LinkedIn, WhatsApp, Roblox, and Coursera may qualify for exemptions if assessed accordingly. LinkedIn has previously argued that it holds no interest for children.
Hypothetically, platforms like YouTube Kids could be exempt from the ban if they satisfy the exemption criteria, particularly as comments are disabled on those videos. Nonetheless, the government has yet to provide confirmation, and YouTube has not indicated if it intends to seek exemptions for child-focused services.
What About Other Platforms?
Platforms not named by the government that do not meet the exemption criteria will need to implement age verification mechanisms by December. This includes services like Bluesky, Donald Trump’s Truth Social, Discord, and Twitch.
How Will Tech Companies Verify Users Are Over 16?
A common misunderstanding about the social media ban is that it pertains only to children. To keep under-16s off social media, platforms must verify the age of every user account in Australia.
There are no specific requirements for how verification should be conducted, but updates from the Age Assurance Technology Trial will provide guidance.
The government has mandated that identity documents can be one form of age verification but cannot be the only method accepted.
Australia is likely to adopt an approach for age verification comparable to that of the UK, initiated in July. This could include options such as:
Allowing banks and mobile providers, which already know whether users are 18 or older, to confirm a user’s age to the platform.
Requesting users to upload a photo to match with their ID.
Employing facial age estimation techniques.
Moreover, platforms may estimate a user’s age from account behavior or from the age of the account itself. For instance, if an individual registered on Facebook in 2009, they are now over 16. YouTube has also indicated plans to use artificial intelligence for age estimation.
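The account-age heuristic described above can be sketched as a simple check: if an account was created at least 16 years ago, its holder must be over 16 today, regardless of any age they stated at signup. This is an illustrative sketch only, not any platform’s actual implementation; the function name and minimum-age constant are invented for the example.

```python
from datetime import date

MIN_AGE = 16  # threshold set by the Australian legislation

def provably_over_16(signup: date, today: date = None) -> bool:
    """True if the account itself is at least 16 years old, which
    guarantees the holder is over 16 without any identity check."""
    today = today or date.today()
    # Subtract one year if this year's signup anniversary hasn't passed yet.
    years = today.year - signup.year - (
        (today.month, today.day) < (signup.month, signup.day)
    )
    return years >= MIN_AGE

# An account registered on Facebook in 2009 is over 16 years old by December 2025.
print(provably_over_16(date(2009, 5, 1), today=date(2025, 12, 10)))  # True
print(provably_over_16(date(2020, 1, 1), today=date(2025, 12, 10)))  # False
```

Note the check only ever proves a user is over 16; a young account says nothing either way, which is why platforms would combine it with other signals.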
Will Kids Find Workarounds?
Albanese likened the social media ban to alcohol restrictions, acknowledging that while some children may circumvent the ban, he affirmed that it is still a worthwhile endeavor.
In the UK, where age verification requirements for accessing adult websites were implemented this week, there has been a spike in the use of virtual private networks (VPNs) that conceal users’ actual locations, granting access to blocked sites.
Four of the top five free apps in the UK Apple App Store on Thursday were VPN applications, with the most widely used one, Proton, reporting an 1,800% increase in downloads.
The Australian government expects platforms to implement “reasonable measures” to address how teenagers attempt to evade the ban.
What Happens If a Site Does Not Comply With the Ban?
Platforms failing to take what the eSafety Commissioner deems “reasonable measures” to prevent children from accessing their services may incur fines of up to $49.5m, imposed via the federal court.
What counts as a “reasonable measure” will be assessed case by case. When asked on Wednesday, Wells stated, “I believe a reasonable step is relative.”
“These rules are meant to work, and any mistakes should be rectified. They aren’t set-and-forget settings, but frameworks to guide a world-first process.”
The Australian government is rapidly identifying which social media platforms will face restrictions for users under 16.
Social Services Minister Tanya Plibersek stated on Monday that the government “will not be intimidated by the actions of social media giants.” Nevertheless, tech companies are vigorously advocating for exemptions from the law set to take effect in December.
Here’s what social media companies are doing to support their case:
Meta
Meta, the parent company of Facebook and Instagram, has introduced new Instagram teen account settings to signal its commitment to teenage safety on the platform.
Recently, Meta revealed new protections that aim to enhance direct message security, automatically blurring suspected nude images and making it easier to block unwanted contacts.
Additionally, Meta hosted a “Screen Smart” safety event in Sydney targeted at “Parent Creators,” led by Sarah Harris.
YouTube
YouTube’s approach is even more assertive. Last year, then communications minister Michelle Rowland indicated the platform would be exempt from the social media restrictions.
However, last month, the eSafety Commissioner advised the government to reconsider this exemption, citing research indicating that children often encounter harmful material on YouTube.
Since then, the company has escalated its lobbying efforts, including full-page advertisements claiming YouTube is for “everyone,” alongside a letter sent to Communications Minister Anika Wells warning of a potential high court challenge if YouTube is subjected to the ban.
A YouTube advertising campaign opposing the social media restrictions set to commence in December. Photo: Michael Karendiane/Guardian
As reported by Guardian Australia last month, Google is hosting its annual showcase at Parliament House on Wednesday, where content creators opposed to a YouTube ban, including children’s musicians, are likely to make their views known to politicians.
Last year’s event featured the Wiggles, who met with Rowland. That meeting was mentioned in a letter YouTube’s global CEO, Neal Mohan, sent to Rowland last year requesting the exemption; the exemption was promised within 48 hours of the letter.
Guardian Australia reported last week that YouTube met with Wells this month for an in-person discussion regarding the ban.
TikTok
Screenshots from TikTok’s advertisements highlighting its benefits for teenagers. Photo: TikTok
This month, TikTok is running ads on its platform as well as on Meta channels, promoting educational benefits for teens on vertical video platforms.
“There are 1.7m #fishtok videos encouraging outdoor activities in exchange for screen time,” one advertisement states, in a nod to the government’s assertion that the ban would promote time spent outdoors. “They are developing culinary skills through cooking videos that have garnered over 13m views,” it continues.
“A third of users visit the STEM feed weekly to foster learning,” another ad claims.
Snapchat
Screenshot of Snapchat’s educational video about signs of grooming featuring Lambros army. Photo: Snapchat
Snapchat emphasizes user safety. In May, Guardian Australia reported on an instance involving an 11-year-old girl who added random users as part of a competition with her friend for high scores on the app.
This month, Snapchat announced a partnership with the Australian Federal Police-led Australian Centre to Counter Child Exploitation on a series of educational videos shared by various Australian influencers, along with advertisements advising parents and teens on identifying grooming and sextortion.
“Ensuring safety within the Snapchat community has always been our top priority, and collaborating closely with law enforcement and safety experts is crucial to that effort,” stated Ryan Ferguson, Australia’s Managing Director at Snap.
The platform has also highlighted its account settings for users aged 13-17, including private accounts by default and warnings when chatting with people who lack mutual friends or are absent from contact lists.
“It is undeniable that young people’s mental health has been adversely affected due to social media engagement, prompting the government’s actions,” Prime Minister Anthony Albanese told ABC insiders on Sunday.
“I will meet again with individuals who have faced tragedy this week… one concern expressed by some social media companies is our leadership on this matter, and we take pride in effectively confronting these threats.”
Australia’s online safety regulators advise that YouTube should not be granted an exemption from a social media ban for individuals under 16, stating that video streaming platforms can expose children to dangerous content.
In contrast, YouTube contends that the government should adhere to the draft rules, which indicate the platform will be exempted.
What are the advantages and disadvantages of regulating YouTube? And what would a ban mean for a child who watches YouTube?
Why did the government consider exempting YouTube initially?
Last November, when parliament passed legislation banning access to social media for children under 16, then communications minister Michelle Rowland indicated that YouTube would be exempted.
This exemption was justified on the grounds that YouTube serves “an important purpose by providing youth with educational and health resources.”
Guardian Australia revealed in April that the exemption came just 48 hours after direct lobbying of the minister by YouTube’s global CEO.
This decision surprised YouTube’s competitors, such as Meta, TikTok, and Snapchat; TikTok described it as a “special deal.” YouTube has launched a vertical short-video product, Shorts, similar to Instagram reels and TikTok, leading its competitors to argue it should be included in the ban.
What led the eSafety Commissioner to recommend banning YouTube?
As new regulations regarding social media platforms were being formulated, the Minister consulted with eSafety Commissioner Julie Inman Grant.
In a recent report, Inman Grant highlighted findings from a youth survey indicating that 76% of individuals aged 10 to 15 use YouTube. The survey also showed that 37% of children who experienced potentially harmful content online encountered it on YouTube.
Additionally, it was observed that increased time spent on YouTube correlates with higher levels of depression, anxiety, and insomnia among youth, according to the Black Dog Institute.
“Currently, YouTube boasts persuasive design elements associated with health risks, including features that could encourage unnecessary or excessive usage (such as autoplay, social validations, and algorithm-driven content feeds),” noted Inman Grant.
“When combined, these elements can lead to excessive engagement without breaks and heighten exposure to harmful material.”
Inman Grant concluded that there is insufficient evidence to suggest that YouTube provides exclusively beneficial experiences for children under 16.
However, children would still be able to view content on YouTube while logged out; the ban only prohibits them from holding accounts.
What is YouTube’s position?
In a recent statement, Rachel Lord, YouTube’s senior public policy manager for Australia and New Zealand, pushed back against the eSafety Commissioner’s advice, describing it as “inconsistent with government commitments” and at odds with research into community opinion regarding the platform’s suitability for younger audiences.
YouTube has been developing age-appropriate offerings for over ten years, and in Q1 of 2025, the company removed 192,856 videos for breaching its hate speech and abusive content policies, a 22% increase from the previous year.
The platform asserts its role primarily as a video hosting service rather than a promoter of social interaction. A survey conducted among Australian teachers revealed that 84% use YouTube monthly as a resource for student learning.
YouTube also suggested that the eSafety Commissioner, and potentially the Communications Minister, may be reconsidering the exemption under pressure from YouTube’s competitors.
What about YouTube Kids?
YouTube asserts that it offers a platform tailored for younger users, restricting both the uploading of content and commenting features for children.
The company does not seek exemptions solely for its children’s products.
Asked about YouTube Kids at the National Press Club, Inman Grant indicated the platform was considered low-risk, designed specifically for children, and equipped with adequate safety measures, but said, “I cannot respond until I have seen the regulations.”
Can children access YouTube without an account?
Yes. Inman Grant confirmed that if teachers wish to show videos to their students, they can access YouTube without needing to log in.
She noted that YouTube has “opaque algorithms that create addictive ‘rabbit holes’,” but remarked that when she accessed the site while logged out, the experience was more benign, letting users watch without being subjected to addictive design features.
Responding to YouTube’s assertions on Thursday, Inman Grant acknowledged the company’s argument that exclusion from the ban would “allow young Australians to access YouTube’s diverse content,” but clarified that her advice does not mean children will lose access to YouTube’s educational resources.
“The new law strictly restricts children under 16 from holding their own accounts. It will not prevent them from accessing YouTube or other services while logged out,” she added.
“There is nothing preventing educators with their own accounts from continuing to share educational content on YouTube or other platforms approved for school use.”
What are the next steps?
The Minister will finalize the guidelines and identify the social media platforms covered by the ban in the upcoming months.
A trial on age verification technology is expected to be reported to the Minister by the end of July, which will establish the technology platforms must implement to prevent access for users under 16.
The government has announced that the ban is anticipated to come into force in early December.
YouTube has accused the nation’s online safety regulator of sidelining parents and educators after she advocated for the platform to be included in the proposed social media restriction for users under 16.
Julie Inman Grant, the eSafety Commissioner, has called on the government to reconsider its decision to exclude the video-sharing platform from the age restrictions that will apply to apps like TikTok, Snapchat, and Instagram.
In response, YouTube insists the government should adhere to the draft regulations and disregard Inman Grant’s recommendations.
“The eSafety Commissioner’s advice is inconsistent and contradictory, seeking to ban a platform whose value the government has previously acknowledged,” remarked Rachel Lord, YouTube’s public policy and government relations manager.
“eSafety’s advice overlooks the perspectives of Australian families, educators, the wider community, and the government’s own conclusions.”
In her National Press Club address on Tuesday, Inman Grant said the proposed age limits for social media, scheduled to take effect on December 10, should be thought of as a “delay” rather than an outright “ban.” Details on how age verification will be implemented remain unclear, though she said Australians should expect a “waterfall of tools and techniques.”
Guardian Australia has reported that various social media platforms have voiced concerns over a lack of clarity regarding their legal obligations, expressing skepticism about the feasibility of developing age verification systems in the six months before the deadline.
Inman Grant pointed out that age verification should occur on individual platforms rather than at the device or App Store level, noting that many social media platforms are already utilizing methods to assess or confirm user ages. She mentioned the need for platforms to update eSafety on their progress in utilizing these tools effectively to ensure the removal of underage users.
Nevertheless, Inman Grant acknowledged the system’s imperfections, conceding that companies may not always get it right: “These technologies won’t solve everything, but using them in conjunction can lead to a greater rate of success.”
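Inman Grant’s “waterfall of tools and techniques” describes a cascade: try low-friction signals first and fall back to stronger, more intrusive checks only when earlier ones are inconclusive. The sketch below is purely hypothetical; the individual check names, the model-score thresholds, and the user-record fields are invented for illustration and correspond to no platform’s real system.

```python
from typing import Callable, Optional

# Each check returns True (over 16), False (under 16), or None (inconclusive).
Check = Callable[[dict], Optional[bool]]

def account_age_check(user: dict) -> Optional[bool]:
    # An account at least 16 years old proves its holder is over 16.
    return True if user.get("account_age_years", 0) >= 16 else None

def behavioral_check(user: dict) -> Optional[bool]:
    # Hypothetical age-estimation model; only trust confident predictions.
    score = user.get("age_model_score")  # probability the user is over 16
    if score is None:
        return None
    if score >= 0.95:
        return True
    if score <= 0.05:
        return False
    return None  # middle ground: fall through to a stronger check

def id_document_check(user: dict) -> Optional[bool]:
    # Strongest but highest-friction check, used as a last resort.
    return user.get("id_verified_over_16")

def waterfall(user: dict, checks: list) -> bool:
    """Run checks in order of increasing friction; first verdict wins."""
    for check in checks:
        verdict = check(user)
        if verdict is not None:
            return verdict
    return False  # nothing could confirm the user's age: treat as under 16

checks = [account_age_check, behavioral_check, id_document_check]
print(waterfall({"account_age_years": 17}, checks))  # True
print(waterfall({"age_model_score": 0.5, "id_verified_over_16": True}, checks))  # True
```

The design choice worth noting is the default: when every check is inconclusive, the cascade fails closed and the user is treated as underage, which matches the law’s placement of the burden on platforms rather than on children.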
“The social media restrictions aren’t a panacea, but they introduce some friction into the system. This pioneering legislation aims to reduce harm for parents and caregivers and shifts the responsibility back to the companies themselves,” Inman Grant stated.
“We regard large tech firms as akin to an extraction industry. Australia is calling on these businesses to provide the safety measures and support we expect from nearly every other consumer industry.”
YouTube has urged the government to adhere to draft rules outlined by former communications minister Michelle Rowland, which included specific exemptions for services such as Kids Helpline and Google Classroom to preserve children’s access to educational and health support.
Communications Minister Anika Wells indicated that a decision regarding the commissioner’s recommendations on the draft rules will be made within weeks, according to a federal source.
YouTube emphasized that its service focuses on video viewing and streaming rather than social interaction.
It asserted its position as a leader in creating age-appropriate products and addressing potential threats, and denied making policy changes that would adversely affect younger users. YouTube reported removing over 192,000 videos for violating hate speech and abuse policies in the first quarter of 2025 alone, and it has developed a product specifically designed for young children.
Lord urged the government to maintain a consistent stance by following through on its commitment to exempt YouTube from the restrictions.
“The eSafety advice contradicts the government’s own commitments, its research into community sentiment, independent studies, and perspectives from key stakeholders involved in this matter.”
Shadow Communications Minister Melissa McIntosh emphasized the need for clarity regarding the government’s forthcoming reforms.
“The government must clarify the expectations placed on social media platforms and families to safeguard children from prevalent online negativity,” she asserted.
“There are more questions than answers regarding this matter, including the verification techniques platforms will need to adopt to implement the minimum social media age standard by December 10, 2025.”
“Nudification” apps that use artificial intelligence to generate sexually explicit images of children are raising alarm, England’s children’s commissioner, Dame Rachel de Souza, has warned, amid rising fears for potential victims.
Girls have reported refraining from sharing images of themselves on social media due to fears that generative AI tools could alter or sexualize their clothing. Although creating or disseminating sexually explicit images of children is illegal, the underlying technology remains legal, according to the report.
“Children express fear at the mere existence of this technology. They worry strangers, classmates, or even friends might exploit smartphones to manipulate them, using these specialized apps to create nude images,” a spokesperson stated.
“While the online landscape is innovative and continuously evolving, there’s no justifiable reason for these specific applications to exist. They have no rightful place in our society, and tools that enable the creation of naked images of children using deepfake technology should be illegal.”
De Souza has proposed an AI bill that would require developers of generative AI tools to address the risks their products pose, and has urged the government to implement an effective system for removing explicit deepfake images of children. This should be supported by policy measures recognizing deepfake sexual abuse as a form of violence against women and girls.
Meanwhile, the report calls on Ofcom to ensure nudification apps enforce diligent age verification, and on social media platforms to restrict children’s access to sexually explicit deepfake tools, in accordance with online safety laws.
The findings revealed that 26% of respondents aged 13 to 18 had encountered sexually explicit deepfake images of celebrities, friends, teachers, or themselves.
Many AI tools reportedly focus solely on female bodies, thereby contributing to an escalating culture of misogyny, the report cautions.
An 18-year-old girl conveyed to the commissioner:
The report highlighted cases like that of Mia Janin, who tragically died by suicide in March 2021, illustrating connections between deepfake abuse, suicidal thoughts, and PTSD.
In her report, De Souza stated that new technologies confront children with concepts they struggle to comprehend, evolving at a pace that overwhelms their ability to recognize the associated hazards.
Danielle Reece-Greenhalgh, a partner at the law firm Corker Binning, told the Guardian that this reflects a lack of understanding, among young people arrested for sexual offences, of the repercussions of experimenting with deepfakes. She noted that the existing legal framework poses significant challenges for law enforcement agencies in identifying and protecting victims of abuse.
She indicated that banning such apps might ignite debates over internet freedom and could disproportionately impact young men experimenting with AI software without comprehension of the consequences.
Reece-Greenhalgh remarked that while the criminal justice system strives to treat adolescent offenses with understanding, previous efforts to mitigate criminality among youth have faced challenges when offenses occur in private settings, leading to unintended consequences within schools and communities.
Matt Hardcastle, a partner at Kingsley Napley, emphasized the “online youth minefield” surrounding access to illegal sexual and violent content, noting that many parents are unaware of how easily their children can encounter situations that lead to harmful experiences.
“Children often see these situations through a child’s eyes, unaware that their actions can be both illegal and detrimental to themselves or others,” he stated. “Children’s brains are still developing, leading them to approach risk-taking very differently.”
Marcus Johnston, a criminal lawyer focusing on sex crimes, reported working with an increasingly youthful demographic involved in such crimes, often without parental awareness of the issues at play. “Typically, these offenders are young men, seldom young women, ensnared indoors, while parents mistakenly perceive their activities as mere games,” he explained. “These offenses have emerged largely due to the internet, with most sexual crimes now taking place online, spearheaded by forums designed to cultivate criminal behavior in children.”
A government spokesperson stated:
“It is appallingly illegal to create, possess, or distribute child sexual abuse material, including AI-generated images. Platforms of all sizes must remove this content or face significant fines as per online safety laws. The UK is pioneering the introduction of AI-specific child sexual abuse offenses, making it illegal to own, create, or distribute tools crafted for generating abhorrent child sexual abuse material.”
In the UK, the NSPCC offers support to children at 0800 1111, and adults concerned about a child can call 0808 800 5000. The National Association for People Abused in Childhood (Napac) supports adult survivors at 0808 801 0331. In Australia, children, young adults, parents, and educators can contact Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831. Adult survivors can reach the Blue Knot Foundation at 1300 657 380.
France has implemented stricter rules on the use of mobile phones in middle schools, with students aged 11 to 15 required to keep their devices in lockers or pouches during school hours and can only access them again at the end of the day.
The Education Minister informed the Senate that the goal was for children to be completely separated from their phones throughout the school day in all French middle schools starting in September.
Elisabeth Borne stated, “Given the widespread concerns about the negative impact of screen time, this measure is crucial for the well-being and academic success of children in school.”
In 2018, France banned mobile phone use for children in all middle schools (collèges). Phones must remain switched off in school bags and cannot be used anywhere on school premises, including during breaks.
Schools have reported positive outcomes such as increased social interactions, more physical activity, decreased bullying, and improved focus. However, some students still find ways to access their phones, such as sneaking into the restroom or watching videos during breaks.
The government is now requiring children to be completely separated from their devices for the entire school day, enforcing a “digital suspension.” Pilot schemes at around 100 middle schools over the past six months have shown that children have been willing to surrender their phones upon arrival.
Mobile devices are prohibited at elementary schools as well.
Borne informed the Senate, “Feedback from the trials has been overwhelmingly positive, with strong support from parents and teachers for enhancing the school environment.”
In response to concerns about costs and logistics from some unions, Borne stated that principals can choose the format for implementing the ban, such as lockers or pouches.
Referring to a recent study by the National Council of France, Borne mentioned, “Currently, young people spend an average of five hours a day on screens but only three hours a week reading books.”
Last year, a scientific report commissioned by French President Emmanuel Macron recommended that children should not use smartphones until age 13 and should not have access to social media platforms like TikTok, Instagram, and Snapchat until age 18.
According to the report, children should not own phones before age 11 and should only have phones without internet access until age 13.
Macron expressed his support for measures to limit children’s screen time.
The largest education union in England has called for a statutory ban on mobile phone use in schools; a survey found that 99.8% of primary schools and 90% of secondary schools have already implemented some form of ban.
The future of TikTok in the United States is once again on the line. Following years of debate over whether to ban the app, the April 5 deadline for its parent company to sell or transfer assets to non-Chinese owners is approaching. Donald Trump has stated that his administration is nearing a deal.
A few potential buyers have expressed interest in acquiring the immensely popular social media app, with reports of possible deals including investment from the Trump-friendly venture capital firm Andreessen Horowitz and a bid involving Amazon. In January, the president signed an executive order extending the ban-or-sale deadline to April. Despite his stated desire to see TikTok continue operating, the future for TikTok and its 170 million US users remains uncertain.
Having imposed sweeping tariffs on numerous countries, including China, Trump hinted aboard Air Force One that trade penalties could be eased if the Chinese company that owns TikTok agrees to a sale.
ByteDance has stated that it has no intention of selling the app, with court filings deeming a sale “simply impossible.” ByteDance and TikTok have not responded to requests for comment.
The notion of banning TikTok was first raised by Trump in 2020, citing national security risks posed by Chinese-owned apps. The issue garnered bipartisan support, leading Congress to vote overwhelmingly to ban the app last year. In January, the US Supreme Court sided with Congress, upholding the federal law requiring the sale or ban of TikTok. The original deadline was set for January 19.
On the eve of the deadline, TikTok ceased operations with a message stating, “I’m sorry, but TikTok is currently unavailable.” Apple and Google also removed the app from their stores to comply with federal law. The company later expressed gratitude that President Trump was willing to work toward a solution to bring TikTok back online.
On his first day in office, Trump extended the deadline for the ban or divestment of TikTok by 75 days. That deadline is now fast approaching.
After initially proposing a ban on TikTok, Trump later joined the app and amassed millions of followers while campaigning for the presidency. He has since vowed to support TikTok’s presence in the US and has endeavored to fulfill that promise.
Recent reports from CBS suggest that Trump is considering final proposals for TikTok, including bids from investors in private equity, venture capital, and the tech industry. Blackstone and Oracle are among those interested. Oracle, co-founded by Trump ally Larry Ellison, has been eyeing a stake in TikTok for years.
Analysts believe it is highly unlikely that TikTok will face another blackout; speculation points instead toward a sale or another extension. The key question is whether the algorithm will be included in any sale, as a TikTok without its algorithm would lose much of its power and appeal.
Donald Trump is getting ready to review a final proposal that will determine the fate of TikTok before the app either gets acquired by non-Chinese buyers or faces a ban in the US.
US Vice-President JD Vance, Commerce Secretary Howard Lutnick, National Security Adviser Mike Waltz, and Director of National Intelligence Tulsi Gabbard will convene in the Oval Office on Wednesday to discuss the matter, as reported by Reuters.
In the closely watched sale of TikTok, the White House is acting as an investment bank with Vance leading an auction.
Private equity firm Blackstone is in talks about joining ByteDance’s existing non-Chinese shareholders, led by Susquehanna International Group and General Atlantic, in the bid.
Trump stated that a deal with ByteDance to sell the video-sharing app used by 170 million Americans will be finalized before the deadline on Saturday.
Trump is gearing up to announce global tariffs on what he’s calling “liberation day” on Wednesday. He expressed willingness to reduce China’s tariffs to seal the TikTok deal last week.
Trump had set a deadline for TikTok to secure non-Chinese buyers by January or face a US ban on national security grounds, as per the law enacted in 2024.
US venture capital firm Andreessen Horowitz is reportedly discussing an investment in TikTok as part of an effort led by Trump to gain control of the app, according to the Financial Times.
Marc Andreessen, a Silicon Valley luminary and co-founder of Andreessen Horowitz, is in talks to bring in outside capital to buy out TikTok's Chinese investors, alongside Oracle and other American investors, in a bid to separate the app from its parent company, per the FT report.
Blackstone is said to be evaluating a small minority investment in TikTok's US business.
Discussions about TikTok's future involve plans for the app's existing non-Chinese investors to raise their stakes and buy out its major Chinese investors in order to secure the short-video app's US business, as reported by Reuters.
Last month, Trump mentioned that his administration is in talks with four different groups regarding potential deals with TikTok in the future.
TikTok and Andreessen Horowitz have yet to respond to Reuters’ request for comment.
On January 18th, I was one of millions of Americans scrolling through TikTok when the all-you-can-eat video buffet suddenly stopped, just before the federal ban came into effect.
It was a jarring moment, and I found myself in mourning. Where, I wondered, would I now go for my daily doses of Hollywood gossip, video game news and anime updates?
TikTok, which is owned by the Chinese company ByteDance, sprang back to life the next day, despite a legal deadline to find US owners or face a ban. President Trump then quickly signed an executive order extending the window for TikTok's sale to April 5th.
With that new deadline approaching, the fate of TikTok, which claims more than 170 million American users, remains uncertain. For now, at least, it seems unlikely that there will be a repeat of January's blackout.
Last month, Trump told reporters that he could extend the deadline again. And while ByteDance has not confirmed any sale plans, Oracle, the database giant, and others have emerged as potential suitors.
The latest deadline provides a convenient moment to reflect on the app's role in society. Here's what I found.
TikTok is still the best short-video app
TikTok started out 11 years ago as Musical.ly, an app where users posted lip-sync videos, but over time it evolved into a general-purpose video app that lets people scroll through short clips of news and entertainment. It now has more than 1 billion users worldwide.
As TikTok's popularity surged worldwide over the past five years, Meta, Google and others created clones that let users scroll endlessly through video clips. But young users still prefer TikTok for watching short videos, according to a survey by the research firm Emarketer.
TikTok's popularity may be linked in part to product quality. Videos made on TikTok generally look crisper, more tightly edited and catchier than videos made with similar apps like Instagram Reels. (Why drink a lukewarm cola when you can have the classic?) TikTok's tools, including its editing app CapCut, streamline the production of video for the app.
For me, switching to Reels while TikTok was temporarily down felt jarring. Many of the videos users posted there felt incomplete, including one about sourdough bread that asked me to read the caption to learn how to bake the perfect loaf. Why not explain it in the video instead of in a tiny text caption?
Meta, which owns Instagram, is catching up to TikTok's editing tools. An Instagram spokesperson pointed to a company announcement of an editor that will compete with CapCut for editing Reels videos, expected to debut in the coming weeks. The tool will let Instagram users upload videos at a higher resolution, improving image quality, among other perks.
TikTok's secret sauce, which others have also failed to replicate, is the algorithm that decides which video a person sees next. Many people say TikTok surfaces exactly the kinds of videos they want to watch, on everything from diet ideas to video games, and glues them to the screen for hours a day.
Mental health concerns are rising
TikTok's effectiveness at keeping people scrolling has been a source of widespread concern among parents and academic researchers, who wonder whether people can become hooked on the app in the way some become addicted to video games.
Research on the topic is ongoing and remains inconclusive. One study, released last year and led by Christian Montag, a professor of cognitive and brain science at the University of Macau in China, looked into overuse of TikTok. The study, which involved 378 participants of various ages, reported that few of them showed signs of compulsive TikTok use.
But broadly speaking, the consensus from multiple studies of TikTok and other social media apps is that young people are more likely to report being addicted, Dr. Montag said in an interview.
“I don't think kids should be on these platforms at all,” he said of apps like TikTok, adding that people's brains can take at least 20 years to mature and develop self-regulation.
A TikTok spokesperson said the app includes tools to manage screen time, including new settings that block children's use of the app during certain times.
Growth of a marketing platform for brands
Tiktok has become the main hub for companies to promote their products through posted videos and products sold at the in-app store, Tiktok Shop.
The company has worked hard to make Americans aware of its impact on the economy, running flashy advertising campaigns in newspapers and on billboards that portray it as a champion of small businesses.
A TikTok spokesperson cited a study claiming that TikTok boosted revenues for small businesses by $15 billion in 2023, a figure that should be taken with a grain of salt because TikTok commissioned the research. Even so, it is clear from scrolling through TikTok that many brands enjoy using it to spread videos showing off quirky products.
I confess that TikTok videos have inspired me to buy pricey gadgets, including a tool to remove dog fur from car seats and an automatic scrubber for cleaning the kitchen sink.
As for the so-called creators, the platform usually helps more with self-promotion than with making money, even though influencers' TikTok videos often go viral, said the actress Alyssa McKay, a TikTok creator in New Jersey.
A video that racks up 2 million views might earn her only a few dollars, she said, because TikTok pays only for views that come from people who do not already follow you.
TikTok is still a national security concern
TikTok was targeted for a ban in the first place because US government officials feared that the data it collects on American users could be shared with the Chinese government for espionage purposes.
Those concerns peaked at a Supreme Court hearing in January, where the Biden administration argued for banning the app, citing concerns that TikTok could create new pathways for Chinese intelligence gathering within American infrastructure. The authorities did not, however, provide evidence that TikTok had been tied to such a threat.
But TikTok has been linked to one small US data scandal: the company confirmed in 2022 that four employees were fired for using the app to spy on several journalists in an attempt to track down their sources.
A TikTok spokesperson pointed to a video explaining that the app protects American users' data on server systems secured by Oracle, the US database giant it partners with, and prevents unauthorized foreign access.
Matthew Green, a security researcher and associate professor of computer science at Johns Hopkins University, said the US government's security concerns about TikTok are somewhat exaggerated, since there has yet to be a major scandal, but rest on the potential for hypothetical harm.
Many apps made by American companies also collect information about us and sell the insights to data brokers and marketers, including some in China. But TikTok in particular could gather sensitive data on Americans, such as address books, that would be useful to a hostile government, Dr. Green added.
“We're leaking so much information already that we don't need TikTok to make things worse, but with millions of phones running this app, it does make things worse,” Dr. Green said.
Health and Human Services Secretary Robert F. Kennedy Jr. focused on cell phones in schools as part of his “Make America Healthy Again” agenda this week.
In an interview with “Fox & Friends” on Thursday, Kennedy praised restrictions on mobile phones in schools, citing health risks of phone use among children and teenagers that he said are supported by scientific research.
Kennedy pointed to links between social media use and depression and poor school performance, as well as potential neurological damage from the electromagnetic radiation emitted by cell phones, which he claimed could lead to cancer.
Because most studies have found no direct link between cell phone use and cancer or DNA damage, Kennedy's statements mixed misinformation with scientific fact. Still, limiting cell phone use in schools has bipartisan support: nine states have already implemented restrictions, and 15 states plus Washington, DC, are considering legislation to do the same.
While concerns about the health effects of cell phone radiation exist, there is currently insufficient scientific evidence to definitively link cell phone use to cancer. Kennedy’s claims about the physical harms of cell phones have been met with skepticism from many experts.
Despite the pros of mobile phones, such as being able to call 911 in emergencies, concerns about mental health risks and distractions in classrooms have led to debates over appropriate school policies regarding cell phone use.
Kennedy’s support for limiting school cell phone use aligns with efforts in some states to create a healthier learning environment by reducing phone distractions among students.
Before his role as HHS secretary, Kennedy emphasized the importance of parents and teachers making their own decisions regarding communication strategies without government interference.
President Donald Trump has signed an executive order suspending enforcement of the ban on the Chinese-owned social media platform TikTok that was mandated by a law passed in the United States last year.
Trump's order was part of a series of actions he took on his first day back in the White House. It instructed the attorney general to hold off for 75 days on enforcing the law that requires the sale or closure of the social media app in the U.S.
The moratorium allows for a careful consideration of the next steps in a way that protects national security and avoids an abrupt shutdown of platforms used by millions of Americans.
Additionally, the order directs the Department of Justice to inform other tech giants like Apple, Google, and Oracle, who have ties to TikTok, that they will not be penalized for any actions during this period.
When asked about the purpose of the TikTok executive order, President Trump stated that it gives the government the option to sell or shut down the platform, but a decision on the course of action has not been made yet.
Critics of the video-sharing platform argue that it poses a security threat because it is owned by ByteDance, a company with ties to the Chinese government. They fear that the personal information of U.S. users could be used for malicious purposes.
During his presidency, Trump had previously criticized TikTok for these reasons and attempted to ban it. However, he has since shifted his stance due to various factors, including his popularity on the platform and the views of TikTok investor Jeff Yass.
Despite Trump’s change in position, Congressional Republicans have remained firm, and under bipartisan legislation signed by President Biden, TikTok was required to sell its assets to a U.S.-based company by January 19, with a possible 90-day extension for the sale process.
Plans to sell TikTok have not been confirmed, but there is interest from figures like Frank McCourt and Kevin O’Leary. The U.S. Supreme Court has been involved in the matter, and despite objections from free speech advocates, the law remains in effect.
Trump’s court filing emphasizes his unique ability to negotiate a solution that addresses national security concerns while preserving the platform, but experts question the effectiveness of his approach.
Alan Rozenshtein, a law professor and former national security adviser at the Justice Department, dismissed the executive order as a merely symbolic gesture and said TikTok would likely remain banned despite Trump's intentions.
TikTok stated on Sunday that it would resume service in the United States following President Donald Trump’s inauguration. Earlier that day, the video app received a reprieve from its ban in the country.
President Trump has given TikTok additional time to find a buyer, handing the Chinese-owned video app a lifeline before it faces a total shutdown. He proposed that a US company acquire a 50% stake, signaling his intention to sign an executive order in support of the arrangement.
“By doing this, we will save TikTok, ensure it remains in good hands, and keep it afloat,” Trump declared. “Without approval from the US, TikTok would not exist. With our approval, its value could reach hundreds of billions, even trillions of dollars.”
Late Saturday, TikTok suspended its services for approximately 170 million users in the US.
In April, Congress passed a law requiring TikTok, now owned by ByteDance, to sell to a non-Chinese entity or face expulsion from the US. The Supreme Court upheld this provision, leading to the app’s decision to shut down temporarily. The law prohibits the distribution, maintenance, or updates of TikTok in the US if a sale is not secured.
A message appeared for US users of the app from Saturday night to Sunday afternoon, stating: “A US law has been enacted banning TikTok, hence its current unavailability.” Trump advocated a ban during his first presidency but shifted his stance during the 2024 election campaign, making a last-minute attempt to intervene on TikTok's behalf upon realizing its substantial user base.
TikTok’s CEO, Shou Zi Chew, expressed gratitude to President Trump for his efforts to maintain the app’s availability in the US. He anticipated attending Trump’s inauguration personally.
In response to Trump’s Sunday message, the company affirmed in a statement its “restoration of services” and assured service providers that there would be no repercussions for enabling TikTok. They thanked President Trump for this action, emphasizing their positive impact on millions of Americans and small businesses, supporting the First Amendment, and opposing arbitrary censorship. They expressed eagerness to work with Trump towards a long-term solution for TikTok in the US.
Several TikTok users reported that the app was fully functional again soon after the announcement.
Concerns about TikTok revolve around the potential access of US users' personal data by the Chinese government and manipulation of the app's algorithms to control what users see. Chew has denied any involvement by the Chinese government in the app, saying in 2023 that ByteDance is not acting as an agent of China or any other country.
Reports surfaced last week suggesting that Trump was considering extending the ban through an executive order. The bill allowing the ban on TikTok includes a provision that allows the president to extend the sale deadline by 90 days if sufficient progress is demonstrated, but evidence of substantial progress is required for such an extension to be granted.
Republican House Speaker Mike Johnson voiced his support for the TikTok law in an NBC appearance on Sunday. He interpreted Trump's call to “save TikTok” as a directive to facilitate a legitimate sale and change of ownership for the app.
Lawmakers are primarily concerned about the Chinese Communist Party rather than the app itself, emphasizing the need for ByteDance to complete the sale of TikTok within 270 days to avert potential national security risks.
Some Republican officials oppose the idea of extending the ban’s timeframe, noting that the law should be enforced as written. Senators Tom Cotton and Pete Ricketts stated that China must sever all ties with TikTok and agree to a qualified sale for the app to be considered safe for US users.
Several Democratic lawmakers urged President Biden to allow TikTok a grace period before any shutdown, emphasizing the app’s importance to content creators, privacy concerns, and national security.
Investor Kevin O’Leary reportedly offered TikTok’s owners a $20 billion buyout, while other reports suggest a potential merger with TikTok US or a sale to Elon Musk, which TikTok dismissed as untrue.
TikTok suspended its service in the United States late Saturday, just before a federal ban on the Chinese-owned short video app went into effect.
The app is no longer available on Apple's iOS App Store or Google's Play Store. In April, the U.S. Congress passed a law requiring parent company ByteDance to sell TikTok to a non-Chinese owner or face a complete shutdown. The company chose the latter.
TikTok said the sale was “commercially, technically and legally impossible.” The company stuck to that policy until the end.
It took five years for the app to disappear. Donald Trump first proposed banning TikTok by executive order in mid-2020, without success. Various lawmakers proposed similar measures, but only one passed: the Protecting Americans from Foreign Adversary Controlled Applications Act, which required TikTok to be sold or banned.
“A law banning TikTok has been enacted in the United States. Unfortunately, that means you can't use TikTok for now. Fortunately, President Trump has indicated that he will work with us on a solution to bring TikTok back once he takes office. Please stay tuned,” read a message shown to users who tried to open the app.
TikTok's lawyers told the Supreme Court that the app would cease operating on January 19th. Once TikTok disappears from the app stores, with no new downloads or updates possible, it will gradually become obsolete while the ban continues: without regular maintenance, the app may stop functioning smoothly and become vulnerable to cyber-attacks.
Users trying to access TikTok in the United States encountered the message late Saturday. Photo: Blake Montgomery/The Guardian
TikTok fought the law vigorously in court, arguing without success that blocking the much-loved app would violate its right to free speech. It once seemed the bill might die before becoming law, as happened in Montana, which in 2023 became the first US state to ban TikTok within its borders; that state law was blocked before it took effect.
Two days before ByteDance was due to sell the popular app used by 170 million Americans, the U.S. Supreme Court ruled that the law was constitutional and would stand. Biden said he would leave enforcement of the law to Trump. The White House said in a statement Friday that TikTok “should remain available to Americans, but simply under American ownership.”
In response to the ruling, TikTok's chief executive, Shou Zi Chew, called on the president-elect to save the app. “On behalf of everyone at TikTok and our users across the United States, I want to thank President Trump for his commitment to working with us to find a solution to keep TikTok available in the United States,” he said in a video posted to TikTok.
At the 11th hour, Trump tried to intervene on TikTok's behalf before the Supreme Court, even though Trump himself had first proposed a ban. He warmed to the app after gaining a large audience on it during the 2024 presidential campaign. He is scheduled to take office on Monday and could order the Justice Department not to enforce the law, but he has said the Supreme Court's ruling “should be respected”. It is unclear whether he will be able to completely avert the TikTok ban.
President Trump said Saturday that he would likely give TikTok a 90-day reprieve from a potential ban after he takes office on Monday.
“The 90-day extension is appropriate and will most likely be implemented,” he told NBC. “If we decide to do that, we’ll probably announce it on Monday.”
US TikTok users have been flocking to the Chinese video-sharing app Xiaohongshu (also known as RedNote) rather than to YouTube Shorts or Instagram Reels, both of which had been expected to gain traction after the ban.
One user said: “I would rather dropship my DNA to the doorstep of the Chinese Communist Party than look at Instagram Reels.”
On a recent Monday morning, Olivia Shalhoup opened her laptop and braced herself for the day's meetings. As the founder of the marketing and PR agency Amethyst, about 40% of her work focuses on helping musicians take advantage of TikTok. Her clients were nervous that day, with a Supreme Court decision looming and the fate of the app in the United States hanging in the balance. “The key thing we talked about on every call was, ‘What are we going to do?’” Shalhoup said. “It's no exaggeration to say that TikTok is critical to artists' campaigns at this point. No one is immune to this.”
Since its debut in 2017, TikTok has become a star-making machine, with short-form video overtaking traditional music promotion formats like TV and radio. The app has broken up-and-coming artists, propelled A-listers to the top of the charts and turned Magic FM classics like “Running Up That Hill” into Generation Alpha hits. With the help of TikTok, Lil Nas X rose to stardom. More recently, songs such as Djo's “End of Beginning” and Artemas's “I Like the Way You Kiss Me” have become global hits after going viral on the app. The ability to track a song's tenacity, engagement and reach is a label executive's dream, providing what the author John Seabrook has called “real-time global callout data” that helps major labels make smarter deals.
Lil Nas X performs in New Jersey in 2019. The rapper and singer’s career soared after his smash single “Old Town Road” went viral on TikTok. Photo: Scott Ross/Invision/AP
“Right now, most label strategies rely heavily on TikTok,” said Ray Uskata, managing director of the Americas at the music marketing agency Round. “It's not just an entertainment platform, it's a discovery platform. People go to Instagram to see what their friends are up to; they go to YouTube to see what their favorite creators are up to. They go to TikTok to see something new.”
The key to TikTok's success is its algorithmically recommended feed, which seems to know you better than you know yourself, serving a stream of carefully selected content that keeps you, sometimes unnervingly, in tune with the trends and music you're obsessed with.
It was enough to give lawmakers pause. In April, citing national security concerns over possible manipulation of TikTok by the Chinese government and its collection of sensitive user data, the U.S. Congress passed a law, signed by Joe Biden, ordering TikTok's parent company ByteDance to sell the app to a U.S.-based owner or face a complete shutdown. On January 10th, the Supreme Court convened to decide whether to force TikTok to go dark in the US on January 19th. Despite widespread protests from creators (and from the ACLU, which slammed the law as unconstitutional), on January 17 the court upheld the law that threatens to kill the app in the United States.
Music’s new kingmaker
Many marketers say they are at a loss. “I think a lot of people are in denial,” said Meredith Gardner, co-founder of agency Tenth Floor and former senior vice president of digital marketing at Capitol Records. She said that as recently as 10 days ago, potential clients at major labels were still talking about TikTok as a priority. “I think a lot of people are still hopeful that there will be some form of Hail Mary,” Gardner said.
Artists and record labels view TikTok as the closest thing to a kingmaker in today's fragmented mainstream music industry, making a future without it difficult to imagine. “If you look at the global top 50 [chart] and the viral charts on Spotify, most of those songs are currently charting or trending on TikTok,” Uskata says. “They aren't actually coming from other platforms.”
Its influence spreads worldwide. Patrick Clifton, a UK-based music and technology strategy consultant, says the power of TikTok's network effects in the vast US market is such that it shapes what people listen to on Spotify, helped by the fact that TikTok posts can link directly to Spotify so viewers can click through to a song.
“TikTok has been a huge catalyst for music trends in the U.S., and the size and distribution of its user population in the U.S. has made it a catalyst for algorithmic trends on platforms like Spotify around the world,” Clifton says. The US ban could therefore change what Spotify serves to listeners in regions where TikTok is still available, such as the UK.
Jeff Halliday, vice president of marketing for Downtown Artist & Label Services, said a potential ban “would immediately cause a lot of disruption.” “It's like every stage of grief. At first it was mostly denial. A lot of people thought, ‘That's never going to happen.’ And then the bargaining begins, where you say, ‘Well, there's another way.’”
In the face of uncertainty, marketers are advising artists not to put all their eggs in one basket. Gardner said she tells the artists she works with to take a cue from the pre-iTunes era and cultivate a digital Rolodex of fans. She was recently contacted by a singer-songwriter client seeking advice on how to share his rich archive of demos and home recordings with listeners. In another era, a collection like this would seem tailor-made for TikTok, but Gardner took a different view: instead of TikTok, she encouraged him to launch a Substack.
A demonstrator holds a pro-TikTok sign in front of the U.S. Supreme Court on January 10, 2025.
Alison Robert/Washington Post/Getty Images
The United States Supreme Court has upheld a ban on the popular video streaming app TikTok that is set to come into effect on January 19th.
Under the ban, unless ByteDance, the app's Chinese parent company, sells TikTok to a U.S. company by the January 19 deadline, U.S. companies will be required to block users from accessing and updating TikTok through app stores and internet browsers.
TikTok's challenge to the law, which the Supreme Court began hearing on January 10, argued that the ban violates the U.S. Constitution's free speech protections. On the same day, the court heard arguments in a related case, in which lawyers representing TikTok content creators argued that the ban also violates those individuals' constitutional rights.
However, U.S. Solicitor General Elizabeth Prelogar argued that the ban on TikTok was not meant to crack down on free speech but to prevent foreign espionage. The US government's case is that the Chinese government could use TikTok to collect sensitive personal data on hundreds of millions of people in the US, which could later be used against them.
The Supreme Court unanimously agreed with the government's arguments and ruled against TikTok and the individual creators in both cases. “There is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community,” the opinion states. “But Congress has determined that divestiture is necessary to address its well-supported national security concerns regarding TikTok's data collection practices and relationship with a foreign adversary.”
TikTok plans to shut down its app for U.S. users on January 19, the same day the ban goes into effect, according to Reuters. But this may not be the last twist in the courtroom drama.
US President Joe Biden is scheduled to leave office on January 20, the day after the ban goes into effect, and administration officials have said Biden will not enforce the law, according to the Associated Press. Instead, the strength of the ban will depend on the actions of President-elect Donald Trump's incoming administration.
President Trump initially supported banning TikTok during his first term, but later changed his stance and expressed support for allowing the platform to continue operating in the United States. After taking office on January 20, he could ask lawmakers to repeal or amend the law, or instruct the government not to enforce it.
If TikTok disappears from the United States, its 170 million American users won’t be the only ones who lose out.
British TikTokers and executives told the Guardian that they would lose a significant portion of their audience after the ban. The video app has become a key entry point into the U.S. for British online video creators who make a living by gaining views and securing sponsored content deals. The ban is scheduled to go into effect on Sunday, leaving a U.S.-sized hole in the global user base.
“In English-speaking markets, many creators have significant U.S. audiences following them,” said Thomas Walters, CEO of Billion Dollar Boy, a UK-based advertising agency that connects creators and influencers with blue-chip advertisers. He added that a ban would be “really sad” for creators who have “built an audience from nothing” on TikTok.
The Guardian spoke to several UK-based creators and one entrepreneur, all of whom said they would be affected by the ban.
Jay Beach, 30, London
Almost half of Beach's 1.7 million TikTok followers are in the United States. He said there were strong relationships between creators and users on both sides of the Atlantic, and that millions of Brits and Americans would miss this kind of digital cultural exchange.
“Seeing that gap in our feeds is going to make a big difference for all of us,” he says.
Beach, who describes his posts as “high energy fashion content”, said sponsored content from brands such as the US skincare brand Kiehl's and Sky TV makes up the bulk of his income. He also has a presence on YouTube Shorts and Instagram, but says he has noticed that TikTok users “don't necessarily follow you anywhere else”.
“[A ban] is going to throw people into this diaspora of rediscovering their favorite creators and finding a new home on their platform of choice,” he said.
Fats Timbo, 28, Kent
Fats Timbo is a comedian and podcaster. Photo: Fats Timbo
Timbo is a comedian and disability activist who posts comedy, beauty and lifestyle content to her 3 million followers on TikTok. She says the platform's reach in the U.S. (about a quarter of her followers) is essential to her work.
“TikTok is very important to my career because it allows me to connect with an audience in the United States, where people like me – Black women with dwarfism – often lack representation,” she says.
Timbo added that the United States offers creators like her the opportunity to “grow, collaborate, and get noticed on a global stage.”
“It's not just about the numbers. It's about the impact I can have and the representation I can give to people who rarely see people like them in the media. Losing that connection would feel like losing some of my purpose,” she says.
Timbo says the US audience is “key to securing deals, collaborations and global visibility with brands.” Losing TikTok in the US would be a “major setback,” but she is also creating content on Instagram to stay connected with her US followers.
Em Wallbank, 25, South Yorkshire
Approximately 40% of Em Wallbank's audience is in the United States. The South Yorkshire-based creator said her comedy skits became a hit across the Atlantic thanks in part to her accent. Wallbank is best known for her posts riffing on Harry Potter characters, and she has 1.7 million followers on TikTok.
“I think part of my popularity is because I'm from the north and my accent is a bit unusual [to US users],” she says.
Wallbank, who started posting skits on TikTok in 2022, said the U.S. social media market is a testament to how creators, like the Kardashians and Nicole Richie, can build broad careers.
“People who have careers outside of social media are getting more attention from American audiences,” she says.
Wallbank’s popularity in the US has led her to perform at US fan conventions and create sponsored content with multinational companies such as Disney+ in the UK. She’s concerned about aspiring creators who are using TikTok and its U.S. audience to access creative careers that might otherwise be out of reach.
“Being able to use my background to break into a creative industry is huge,” she says.
Sarah Yuma, 30, London
Yuma says her American TikTok audience is essential to the growth of her business, which sells home and hair accessories made from African fabrics.
“It can be difficult to build a business relying solely on a UK audience. It was the US audience that propelled my business during lockdown and took it to the next level,” she says.
Yuma, who has more than 3,000 followers on TikTok, says she saw an influx of US customers and followers in 2020 as the Black Lives Matter movement gained momentum.
Sarah Yuma sells handmade products made from African fabrics from her home in London on Thursday. Photo: Martin Godwin/The Guardian
If TikTok were to disappear from the U.S., “we would be losing a huge part of our community,” Yuma said. “They helped me design it. It’s a really beautiful community I’ve created.”
She added that if TikTok were to be suspended in the U.S., she would have to rethink how she connects with U.S. audiences.
“We need to rethink our strategies on how to keep them in the community and keep them in touch,” she says. “I don’t want to isolate them. They’re really important to my business.”
Sam Cornforth, 29, London
Cornforth posts fitness comedy sketches and has 460,000 followers, about a quarter of them in the United States. He says his income from sponsored content would be partly protected by the fact that he works with UK-based brands such as Argos.
But he said brands could react negatively to creators losing a significant portion of their audience.
“Brands are paying attention to your entire reach. If you potentially cut 20% to 30% of that, would that impact future opportunities with those brands?” he asked.
Cornforth added that TikTok’s U.S. audience is important in establishing trends that filter down to other platforms. Without that influence, creators may lose the impetus and inspiration for their work.
“This is where the trends come from, which later filter down to YouTube Shorts and Instagram,” he says.
President Donald Trump is reportedly weighing the possibility of lifting the TikTok ban in the United States through an executive order once he assumes office on January 20th.
The incoming president is contemplating an executive order to delay the ban, initially set to take effect on January 19th, as per The Washington Post. However, the legality of Trump’s decision to suspend the Congressional law is dubious.
Per the law, TikTok’s U.S. operations must be divested by its Chinese parent company by Sunday. Failure to do so will result in new users being unable to download TikTok from app stores.
In the absence of Supreme Court intervention to block the law, TikTok is gearing up to block access to the app for U.S. users on Sunday, tech news site The Information reports.
On Wednesday, The Washington Post reported that Trump and his team are mulling an executive order to temporarily halt enforcement of the law for 60 to 90 days, citing anonymous sources. The Supreme Court is expected to rule on whether the law proceeds, with recent indications suggesting it is unlikely to be halted.
“I have positive sentiments towards TikTok,” Trump stated last month, asking the Supreme Court to delay enforcement of the law so he could pursue a “political solution” after his inauguration. Congress voted to ban the app, owned by Beijing-based ByteDance, citing fears that the Chinese state could access the data of 170 million U.S. users.
“TikTok is a valuable platform,” affirmed Mike Waltz, President Trump’s incoming national security adviser, on Fox News. “We will ensure data protection while preserving the app.”
The New York Times disclosed that TikTok’s CEO, Shou Zi Chew, plans to attend President Trump’s inauguration, where he has reportedly been given a seat in a position of honor.
NBC reported that the Biden administration is exploring ways to keep the platform operating past Sunday, effectively deferring the decision to President Trump.
“The American public should not anticipate an abrupt TikTok ban on Sunday,” reassured an administration official to NBC.
The U.S. Supreme Court is set to hear arguments from TikTok and ByteDance, its China-based owner, on Friday. ByteDance is challenging a bill signed by President Joe Biden that bans the short-form video app starting January 19 unless it is divested from ByteDance. TikTok argues that a sale would be impossible and is seeking an injunction to suspend the ban pending legal proceedings.
Over 170 million Americans use TikTok, and the company’s lawyers claim that banning the app violates the First Amendment rights of many users. Despite this argument, the federal appeals court upheld the ban in December. The bill received bipartisan support from Congress in April over concerns that China could spread propaganda through the app.
Starting on January 19, new users will be unable to download TikTok, and existing users will not be able to update the app. Lawmakers have instructed major app stores, like Apple Inc. and Google, to be prepared to remove TikTok from their platforms on that date.
TikTok’s 7,000 U.S. employees are uncertain about their future. Some new roles are still being advertised by the company, but there is pessimism among employees following the court’s decision to uphold the anti-sale law in December. Advertisers are also considering their options, with some planning to continue advertising on TikTok even after January 19.
TikTok has insisted that it cannot be sold, but potential buyers, like Frank McCourt, have expressed interest in acquiring the app. McCourt has secured commitments from investors for a bid and hopes to negotiate a sale with ByteDance.
State-level bans, like the one planned in Montana, have faced legal challenges. In China, a forced sale of TikTok may require approval from Beijing authorities, which could prove to be a significant hurdle.
Days before popular US social media app TikTok is due to be banned, Chinese social media app RedNote is seeing a flood of new users, as the little-known company eases restrictions on English-language content while strategically capitalizing on the sudden influx.
More than 50,000 users from the United States and China participated in a live chat dubbed “TikTok Refugees” on RedNote on Monday. Veteran Chinese users welcomed the American users, with some trepidation, and exchanged notes on topics such as food and youth unemployment, although at times the conversation delved into more sensitive subjects.
Such impromptu cultural exchanges were happening across Red Note, also known as “Xiaohongshu” in China, as it rose to the top of the US download rankings this week. Its popularity was boosted by social media users in the U.S. who had been searching for alternatives to ByteDance Inc.’s TikTok in the days before its impending ban.
RedNote, a venture capital-backed startup valued at $17 billion, allows users to curate photos, videos, and text to document their lives. With more than 300 million users relying on it for travel tips, anti-aging creams, and restaurant recommendations, the company is considered an IPO candidate in China.
In just two days, over 700,000 new users joined Xiaohongshu, and RedNote downloads in the U.S. increased significantly, according to estimates from app data research firm Sensor Tower.
The surge in U.S. users comes ahead of a Jan. 19 deadline for ByteDance to sell TikTok or face a U.S. ban on national security grounds. TikTok is currently used by about 170 million Americans, about half of the U.S. population, and is overwhelmingly popular with young people and advertisers.
Stella Kittrell, a 29-year-old content creator based in Baltimore, Maryland, expressed her support for Americans using Red Note as a response to concerns over business and privacy issues with the U.S. government. She joined RedNote in hopes of collaborating with Chinese companies and finding an alternative to other social media platforms.
Brian Atavansi, a 29-year-old business analyst and content creator from San Diego, California, noted that apps like Instagram and Facebook are not able to recreate the sense of community found on TikTok due to its organic nature.
The U.S. Supreme Court is set to hear oral arguments on Friday regarding the future of TikTok. This marks the latest development in an ongoing debate over whether to ban the immensely popular social media platform in the U.S. The judges will consider the balance between national security concerns and the preservation of free speech.
TikTok and its Chinese parent company ByteDance have appealed to the Supreme Court after a lower court upheld a law banning the app in the U.S. The ban is scheduled to take effect on January 19th, unless ByteDance sells TikTok’s assets to a non-Chinese entity. ByteDance has argued that a sale is not feasible from commercial, technical, and legal standpoints.
The oral arguments are expected to last for two hours, with each side given the opportunity to present their case. The court has outlined that the discussion will focus on whether the ban infringes on the First Amendment.
TikTok boasts 170 million users, approximately half of the U.S. population, making the potential ban a contentious issue. While some believe the app could be exploited by the Chinese government, there is a coalition of influencers, civil rights groups, and even President Donald Trump advocating against the ban, citing concerns about free speech violations.
ByteDance has faced legal challenges from federal and state authorities, with legislation to ban TikTok passing in Congress last year. The company maintains that it operates independently from Chinese influence and handles U.S. user data through Oracle.
Federal law at the center of the case
The law in question, the Protecting Americans from Foreign Adversary Controlled Applications Act, was signed into law by President Joe Biden. It follows a previous ban on TikTok on federal devices and underscores concerns about national security risks associated with the app.
U.S. lawmakers have expressed apprehensions about China’s potential control over TikTok’s content and user data, citing security threats and propaganda dissemination. However, no concrete evidence has been presented to show that China or ByteDance have manipulated the app for espionage purposes.
Shortly after Biden signed the law, TikTok filed a lawsuit against the U.S. government, arguing that the ban violates the Constitution and impinges on free speech rights. The company emphasized the importance of preserving communication and expression for its vast user base.
Supreme Court review and President Trump’s opinion
Following a recent ruling by a federal appeals court, TikTok sought an emergency motion from the Supreme Court to halt the ban. The court agreed to expedite oral arguments and has received numerous briefs from both sides of the debate.
Notably, President-elect Trump submitted an amicus brief requesting the court to suspend the ban to allow time for negotiation. This stance contrasts with his previous efforts to ban TikTok over national security concerns.
President Trump’s involvement in the case underscores the complexity of the issue, with diverging viewpoints within the political landscape. The upcoming Supreme Court decision will have far-reaching implications for the future of TikTok in the U.S.
A US law banning popular video-sharing app TikTok is expected to take effect in early 2025, but the US Supreme Court has agreed to hear TikTok’s legal challenge to it. Meanwhile, President-elect Donald Trump has signaled he may take action against the law, raising new questions about whether it will survive.
What does a TikTok ban actually do?
From January 19, 2025, the Protecting Americans from Foreign Adversary Controlled Applications Act will prevent US companies such as Google and Apple from allowing users to access or update TikTok through their app stores unless TikTok's Chinese owner, ByteDance, sells the app to a non-Chinese buyer. It would also require internet service providers to block the platform on US internet browsers. The bill was approved by the House and Senate with bipartisan support and signed into law by President Joe Biden in April 2024.
If the ban were implemented, it would be virtually impossible for new users in the US to download the TikTok app, says Kate Ruane of the Center for Democracy & Technology, a nonprofit organization based in Washington, DC. For the 170 million existing TikTok users in the United States, the app may remain on their phones, but without access to updates its functionality will degrade over time.
People in the United States may still be able to access TikTok using virtual private network (VPN) services that disguise a user's location. But the experience of using the app could still deteriorate, Ruane says: TikTok content would no longer be stored on nearby US servers, so it would load more slowly.
These restrictions stem from privacy and security concerns. US lawmakers, who have called TikTok a “national security threat,” fear that the Chinese government could force ByteDance to hand over TikTok users' data, pressure the app to change its algorithm, and present content that could manipulate public opinion. However, no solid evidence has been provided to support these claims. TikTok says it has invested heavily to keep US data safe from outside influence and manipulation.
“It is deeply concerning that a country like the United States, which has always led the world stage in championing a free, open, and interoperable internet, is taking steps to ban access to an entire platform within its borders. This is unusual and alarming,” says Ruane.
Will the Supreme Court block TikTok's ban?
A lower court, the DC Circuit Court of Appeals, previously allowed the US law to stand, and the Supreme Court has agreed to hear TikTok's appeal. TikTok's position is that the ban amounts to censorship that violates Americans' right to free speech under the First Amendment.
“We hope the courts will seriously address how this law violates these rights and how governments should account for the rights of social media users when seeking to regulate these speech platforms,” Ruane says. She notes that some users have filed lawsuits claiming the law violates their First Amendment rights in ways distinct from TikTok's claims, but the court has not yet addressed those arguments in considering this particular law.
The most likely short-term outcome, Ruane says, is that the U.S. Supreme Court temporarily halts enforcement of the law while the justices consider the case. This could delay implementation for months, depending on how long it takes the Supreme Court to rule in 2025. TikTok specifically seeks such a suspension in its court filing.
If the Supreme Court finds that the ban violates First Amendment rights and that the U.S. government had less restrictive options at its disposal, Ruane says, it could issue an injunction making such an outright ban virtually impossible to justify. The Supreme Court could also ask the lower D.C. Circuit Court of Appeals to reconsider its analysis of the case. Such a decision could force the government to find more narrowly tailored options for regulating TikTok.
How could Trump stop the TikTok ban?
President-elect Trump supported plans to ban TikTok during his first term, but has since changed his mind. During the 2024 presidential campaign, he promised to “save TikTok” and urged American voters to support him in a post on his social media platform Truth Social. On December 16, Trump met with TikTok's CEO and later said at a press conference that his administration would “consider” the ban. Even if the Supreme Court ultimately keeps the ban in place, Trump could change the law's impact.
For example, the president could meet with U.S. lawmakers and push for changes, such as repealing or amending the law, Ruane says. She also described a possible scenario in which Trump instructs his attorney general not to enforce the law, but warned that this would be outside the norm for how the U.S. government normally operates.
Even if President Trump's attorney general announces that the US government will not enforce the ban, US companies such as Google and Apple may remain reluctant to let people access the app through their platforms. “If I were in charge of legal risk at one of these companies, I don't know if I would be able to say, ‘We believe in this [decision]; it's okay to allow access to this app, which is prohibited,’” Ruane said.
What does the US ban on TikTok mean for the rest of the world?
If it takes effect, the U.S. ban could have significant ramifications around the world. First, people in other countries would not be able to access new content from US-based TikTok creators and influencers. But more importantly, the U.S. government's actions could prompt other countries to consider similar restrictions.
The US is not the first country to take action against TikTok – the Indian government has blocked the app since 2020 – but Ruane expressed concern that the US ban could prompt authoritarian regimes to ban other apps using similar national security justifications.
“This will no doubt be used as a justification to ban TikTok elsewhere, and to ban access to other applications that have served as important speech platforms in countries where the internet is less open,” Ruane said.
Will banning TikTok protect privacy?
The ostensible purpose of the ban is to protect the privacy of U.S. TikTok users, prevent their data from falling into the hands of other countries, and address concerns that the Chinese government could manipulate the content presented to U.S. app users. But Ruane says there are many alternative steps U.S. lawmakers could take before blocking TikTok completely.
For example, the government could require TikTok to be more transparent about how it collects and shares individual user data and what steps it takes to protect privacy. Lawmakers could also require platforms to disclose how their algorithms filter and control the content users see, to alleviate concerns about manipulation, Ruane says.
The U.S. government could also consider enacting consumer privacy laws that would provide better legal protections for how social media platforms share personal data with other companies and the government. “These consumer privacy and transparency choices are not as extreme as banning the entire platform,” Ruane said.
Hello. Welcome to TechScape. Happy belated Thanksgiving to all my American readers. I hope you all enjoy a fun holiday party this weekend. I’m looking forward to baking Grittibänz for the Feast of St. Nicholas. This week in tech: Australia causes panic, Bluesky raises the issue of custom feeds, and we cover the online things that brought me joy over the holidays.
Australia on Thursday passed a law banning children under 16 from using social networks.
My colleague Helen Sullivan reports from Sydney: the Online Safety Amendment (Social Media Minimum Age) bill prohibits social media platforms from allowing users under the age of 16 to access their services, with fines of up to A$50m (about US$32m) for failure to comply. However, it does not contain any details about how this will work, only that companies are expected to take reasonable steps to ensure that users are over 16 years of age. Further details will be available by the time age assurance technology trials are completed in mid-2025. The law will not take effect for another 12 months.
The bill also does not specify which companies would be subject to the ban, but communications minister Michelle Rowland has said that Snapchat, TikTok, X, Instagram, Reddit, and Facebook are likely to be covered. YouTube is not included because it serves “important” educational purposes, she said.
The new law was drafted after Labor prime minister Anthony Albanese claimed there was “a clear causal link between the rise of social media and the harm it causes to Australian youth mental health.”
TikTok, Facebook, Snapchat, Instagram, and X are angry. Following the bill’s passage, Meta said the process was “fast-tracked” and failed to properly consider evidence from young people, the steps the tech industry has already taken to protect them, and existing research on the impact of their social media use.
Australian children are not a significant user base for these companies. According to Unicef, in 2023 there were 5.7 million people under the age of 18 living in Australia. Facebook reported 3 billion monthly users as of May 2023; there are approximately 370 million Facebook users in India alone. Even if all Australian children were to leave social media, which is unlikely, the number of users would not decline significantly.
If countries around the world turn their young people away from social media, social media companies will face an uncertain future.
Of concern to tech companies is the precedent set by the new law. Tech companies also fiercely opposed measures in both Australia and Canada that would require them to pay for news content. The issue was not the amount requested, but what happened next: if countries around the world required platforms to pay for news, the financial burden on Facebook and others would be enormous, as would the responsibility of determining what counts as news. If countries around the world turn their young people away from social media, social media companies will face an uncertain future. The pipeline of incoming users will dry up.
What tech companies want in Australia is a measure requiring parental consent, a vaguer standard that would divide responsibility between companies and users. Meta and others opposed a 2023 law passed in France requiring parents to approve accounts for children under 15 with far less vigor than Australia’s new law. In an ominous sign for Australia’s measures, however, local French media report that technical challenges mean the under-15 rule has not yet been implemented. And does parental consent work? Data from several European countries suggests it doesn’t: Nick Clegg of Meta said the company’s data shows that parents are not using parental control measures on social networks.
Australia’s law shows that such a measure is possible in any country, and we have seen one country’s laws tilt the global governance of social networks before: a US law governing children’s privacy that took effect in 2000 imposed a de facto minimum age of 13 for social media users through platforms’ privacy policies.
Click here for a comparison of Australia’s social media ban laws with those of other countries.
Ministers have said a social media ban for under-16s is not currently under consideration, after teenagers urged them to reconsider plans to restrict access to platforms like TikTok, Instagram, and Snapchat following Australia’s example.
Peter Kyle, the secretary of state for science and technology, warned social media platforms of potential fines and prison sentences for breaching online safety laws coming into effect next year, as part of efforts to strengthen the prevention of online harm.
During a meeting with teenagers at NSPCC headquarters, Mr. Kyle emphasized that there are no immediate plans to ban children from using smartphones, as it is not his preferred choice.
Teenagers expressed concerns about platform addiction and difficulties in seeking help for hacked accounts or offensive content, but did not call for a ban. They highlighted the importance of social connections, support, and safety.
Mr. Kyle’s initial comments about considering a ban caused worry among teenagers, but he clarified that a ban could be a possibility depending on evidence of its effectiveness, especially in light of similar legislation in Australia.
The main focus remains on preventing child fatalities linked to social media activity, with Mr. Kyle citing instances of tragic outcomes. Efforts are ongoing to enhance age verification software to protect children from inappropriate online content.
On Last Week Tonight, John Oliver investigated the impending ban of TikTok in the United States. TikTok, a popular social media app known for its cooking tutorials and trendy dances, has captured the attention of many users, especially those born after 1985.
With 170 million active users in the U.S., TikTok has a significant following, particularly among young adults. Despite its popularity, the app faces potential extinction as the Senate passed a bill in April giving its Chinese parent company ByteDance an ultimatum to sell TikTok or risk being banned in the U.S. due to national security concerns.
Lawmakers from both parties view TikTok as a threat, with one likening it to a “gun to Americans’ heads.” Despite this, Oliver humorously points out that Congress tends to act differently when faced with literal gun violence as opposed to figurative threats.
Oliver delves into TikTok’s history, highlighting the app’s rapid rise in popularity, especially during the pandemic. He humorously notes that TikTok thrived during lockdowns as people turned to it for entertainment and distraction.
Concerns about TikTok’s ties to China have been ongoing, with President Trump attempting to block the app through an executive order. Despite TikTok’s efforts to distance itself from China, questions remain about the security of user data and potential government influence on the app.
Oliver examines TikTok’s data collection practices and algorithm, pointing out the extensive information the app gathers about its users. He raises alarm about the potential vulnerabilities and privacy risks associated with TikTok’s operations.
While acknowledging concerns about propaganda and censorship on TikTok, Oliver questions the evidence supporting these claims. He suggests that underlying motives, including competition from other tech companies, may be at play in the push to ban TikTok.
Oliver concludes by emphasizing the need for stronger privacy protections in the U.S. and questioning the efficacy of banning TikTok as a solution. He highlights the complex nature of the debate and the lack of clear solutions in addressing the risks associated with data privacy and national security.
A parliamentary committee investigating the impact of social media on Australian society has recommended empowering users to change, reset, and disable algorithms, as well as enhancing privacy protections. However, the committee made no final recommendation on the proposed ban on social media use by those under 16.
The inquiry primarily focused on the influence of social media on young people. Both the opposition coalition and the federal government have announced plans to restrict social media for individuals under 16, with legislation to be introduced in parliament by the end of the year.
One of the 12 recommendations in the final report suggests enabling governments to enforce laws on digital platforms more effectively, creating a duty of care for platforms, and requiring platforms to provide data access to researchers and public interest groups. The report also suggests that users should have more control over their online experiences, understand algorithms, and receive better digital literacy education, and that the results of age assurance technology trials should be submitted to parliament.
Although there is bipartisan support for banning social media access for those under 16, the report suggests that keeping children safe does not necessarily require outright bans until they reach a certain age. It emphasizes the need for collaborative efforts with young people in designing regulatory frameworks impacting them.
The Commission highlights the importance of evidence-based decisions regarding age restrictions and the necessity of involving young people in the policymaking process.
The committee suggests that a blanket ban on social media for certain age groups may not be the optimal solution and underscores the need for comprehensive digital reforms to tackle harmful online practices.
Chairperson Labor MP Sharon Claydon emphasizes the complexity of the issue and the necessity for immediate action to safeguard Australian users.
The Greens propose bringing forward the review of online safety laws, banning the data-mining of young people’s information, providing more education, and considering a digital services tax on platforms.
The Environmental Protection Agency announced on Tuesday that it has issued an emergency order, the first of its kind in almost four decades, to halt the use of a pesticide that may harm unborn babies.
The herbicide in question, dimethyltetrachloroterephthalate (DCPA or Dacthal), is commonly used to control weeds in various crops like broccoli, onions, kale, Brussels sprouts, cabbage, and strawberries.
Exposure to this chemical during pregnancy can lead to changes in thyroid hormone levels in the fetus, which could result in long-term negative impacts such as low birth weight, impaired brain development, lower IQ, and diminished motor skills later in life, according to the EPA.
This risk prompted the EPA to take decisive action and suspend use of the pesticide. “DCPA is extremely dangerous and needs to be removed from the market immediately,” Michal Freedhoff, who leads the Office of Chemical Safety and Pollution Prevention, said in a statement.
The emergency order is now in effect.
Friedhoff further emphasized the EPA’s role in safeguarding the public from hazardous chemicals, saying, “In this case, a pregnant woman who unknowingly encounters DCPA could give birth to a child with irreversible health issues.”
DCPA has been banned in the European Union since 2009.
Mily Treviño-Sauceda, executive director of the National Farmworker Women’s Alliance, praised the EPA’s decision as “historic.”
The suspension follows years of dialogue between the EPA and AMVAC Chemical Corporation, the sole manufacturer of DCPA.
The company has not responded to requests for comment.
In 2013, the EPA requested data from AMVAC on the herbicide’s health effects, specifically requesting comprehensive studies on DCPA’s impact on thyroid development. Despite receiving multiple studies from AMVAC between 2013 and 2021, the EPA found the data inadequate and did not accept certain requests, including the thyroid study, until it was finally submitted in August 2022.
The EPA’s recent assessment of DCPA was part of a routine process to reassess registered pesticides. Inspections occur every 15 years to ensure there are no adverse health effects or environmental hazards.
A collective of schools in London has made the decision to prohibit the use of smartphones, reflecting a growing concern about the reliance on mobile devices among children.
The heads of 17 out of 20 state secondary schools in Southwark, south London, have united to discourage students from using smartphones outside of school premises in an effort to address the negative impacts of excessive smartphone use.
Additionally, three other public schools in the area are working towards implementing the same policy.
The schools aim to educate families and students about the various harmful consequences associated with smartphone and social media use in young individuals. These include mental health issues, addiction to screen time, disruptions to sleep and concentration, exposure to inappropriate content, as well as an increased risk of theft and robbery.
Mike Baxter, principal at City of London Academy, said, “We have witnessed firsthand the detrimental effects of smartphones and social media on the health and education of children. Much of the harmful behavior takes place outside school hours, but its consequences then surface within the school environment.”
The schools have collectively agreed to confiscate phones used during class. Basic phones without internet access will be returned promptly, while smartphones will be kept for a week or must be collected in person by a parent.
The new measures will affect more than 13,000 young people in one of London’s top-performing boroughs. The policy applies to students in years 7 to 9 across all the secondary schools, with some schools applying it more broadly.
Furthermore, a group of secondary school principals are collaborating with primary school leaders in Southwark to establish a borough-wide initiative.
Jessica West, principal at Ark Walworth Academy, emphasized that the inaction of phone companies compelled schools to take action to ensure the well-being of children. They aim to guide families and children in making healthy choices regarding smartphone usage.
Recent reports indicate a significant rise in screen time among young children and teenagers: screen time among children increased by 52% between 2020 and 2022, according to the UK House of Commons Education Committee.
Approximately 25% of children and adolescents are reported to use smartphones in a manner consistent with behavioral addiction, as per the findings of the report.
The collaborative effort has been praised by Daisy Greenwell, co-founder of Smartphone Free Childhood (SFC), who said, “This united action by headteachers in south London is groundbreaking and truly impactful. It is unprecedented for secondary schools to collectively address this issue. This could alter the lives of a generation of children in south London who are at risk of developing mental health challenges due to early smartphone usage.”
Concerns regarding smartphones and children are escalating rapidly, with SFC expanding its reach to other countries such as the US, UAE, South Africa, Australia, New Zealand, Switzerland, and Portugal.
In the UK, an increasing number of parents are committing to delaying the provision of smartphones to their children until they reach the age of 14. Bristol is a prime example, where 80 schools have established SFC groups and over 1,000 parents have pledged their support.
Greenwell expressed excitement about the organic growth of this movement among schools, principals, and parents, indicating that this long-awaited conversation is finally gaining traction.