Instagram’s Age Verification Quickly Clears a Bearded Adult as Over 16, But What About 13-Year-Olds?

Instagram’s method for confirming that a user is over 16 is fairly straightforward, at least when the person is evidently an adult. But what happens when a 13-year-old tries to change their birth date to appear older?

In November, Meta informed Instagram and Facebook users whose registered birth dates put them under 16 that their accounts would be disabled as part of Australia’s ban on social media use by under-16s. The ban takes effect on December 10, with Meta saying access for users younger than 16 will start being revoked from December 4.


Guardian Australia created dummy social media accounts on phones as part of an investigation into what content different age groups can access on the platform.

Instagram notification sent to a test account with an age set to 15. Photo: Instagram/Meta

One account was created on Instagram with the age set at 15 to observe the impact of the social media ban for users under 16. Instagram later stated: “Under Australian law, you will soon be unable to use social media until you turn 16.”

“You cannot use an Instagram account until you’re 16, which means your profile will not be visible to you or anyone else until that time.”

“We’ll inform you when you can access Instagram again.”

Notice informing that test account users will lose access due to the Australian social media ban. Photo: Instagram/Meta

The account was then presented with two choices: either download account data and deactivate until the user is 16, or verify their date of birth.

Instagram notification sent to test account set to age 15 regarding date of birth review options. Photo: Instagram/Meta

The second option enables users to submit a “video selfie” to validate that the account holder is older than 16. The app activated the front-facing camera and prompted the adult test user, distinguished by a thick beard, to shift their head side to side. This resembles the authentication method used for face unlock on smartphones.

Explanation of how the “Video Selfie” feature estimates the user’s age. Photo: Instagram/Meta

The notification indicated that the verification process usually takes 1-2 minutes, but may extend up to 48 hours.

Notification sent to the test account following the date of birth verification request. Photo: Instagram/Meta

The app promptly indicated that accounts created by adult test users were recognized as 16 years or older.

A notification confirming the user’s date of birth was updated by Instagram. Photo: Instagram/Meta

In another test, a new account was created on a 13-year-old boy’s phone, which had not previously had Instagram installed, using a birth date that made clear he was under 16. No immediate alert about the upcoming social media ban appeared.

When the child attempted to change his date of birth to an adult age, the same video-selfie facial age estimation process was triggered.


Within a minute, it replied, “We couldn’t verify your age,” and requested a government-issued ID for date of birth verification.

Facial age estimation testing during the government’s Age Assurance Technology Trial found that people over 21 were generally much less likely to be misidentified as under 16, while those closer to 16 and people from minority groups experienced higher rates of false positives and negatives.
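Taken together, the two tests and the trial findings suggest a tiered age-assurance gate: accept a fast facial estimate only when it lands confidently far from the cutoff, and fall back to a document check otherwise. Below is a minimal Python sketch of that decision logic; the threshold, buffer, confidence values and function names are all invented for illustration and do not reflect Meta’s or Yoti’s actual systems.

```python
from dataclasses import dataclass

@dataclass
class AgeEstimate:
    years: float       # model's estimated age from the video selfie
    confidence: float  # 0..1, how sure the model is

# Hypothetical policy values -- not Meta's or Yoti's real numbers.
AGE_THRESHOLD = 16   # Australian ban applies to under-16s
SAFETY_BUFFER = 5    # estimates near the cutoff are least reliable
MIN_CONFIDENCE = 0.9

def gate_account(estimate: AgeEstimate) -> str:
    """Tiered age assurance: fast selfie estimate first, ID fallback."""
    if estimate.confidence < MIN_CONFIDENCE:
        # Mirrors the "We couldn't verify your age" outcome.
        return "request_government_id"
    if estimate.years >= AGE_THRESHOLD + SAFETY_BUFFER:
        return "allow"      # clearly an adult, like the bearded tester
    if estimate.years < AGE_THRESHOLD:
        return "restrict"   # clearly under the ban threshold
    # In the uncertain band just above 16, where the trial found the
    # most errors, escalate to a document check rather than guess.
    return "request_government_id"

print(gate_account(AgeEstimate(years=35.0, confidence=0.98)))  # allow
print(gate_account(AgeEstimate(years=13.5, confidence=0.95)))  # restrict
```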


Meta may have already assessed users who haven’t been notified as being 18 or older, using signals such as birth date, account age, and other user activity.

A Meta representative mentioned that the experiment demonstrated that the process functions as expected, with “adult users being capable of verifying their age and proceeding, while users under 16 undergo an age check when attempting to alter their birth date.”

“That said, we must also recognize the findings of the Age Assurance Technology Trial, which highlights the specific difficulties of age verification at the 16-year threshold and anticipates that the method may occasionally be imperfect,” the spokesperson added.

Last month, communications minister Anika Wells acknowledged the potential challenges confronting the implementation of the ban.

“We recognize that this law isn’t flawless, but it is essential to ensure that there are no gaps,” she stated.

Meta works with Yoti for age verification services. Yoti states on its website that facial images are deleted once the verification process is complete.

The ban impacts Meta’s Facebook, Instagram, and Threads platforms, as well as others such as Kick, Reddit, Snapchat, TikTok, Twitch, X, and YouTube.

Source: www.theguardian.com

Instagram’s Ongoing Commitment to Youth Safety: Will the New “PG-13” Guidelines Make a Difference?

For months, Instagram has faced challenges in persuading parents, advocates, and officials that it is a safe environment for children, despite increasing evidence indicating otherwise. Now, the platform is rolling out another safety feature intended to protect teens. Yet, given its track record, parents remain skeptical.

Beginning this week, all users under 18 will automatically be placed in a 13+ content setting, with their feeds restricted to material consistent with the U.S. PG-13 movie rating.

However, Instagram’s previous unfulfilled commitments make this latest content restriction feel like mere window dressing: an illusion of action without genuine effectiveness.

The company has accrued substantial profits, reporting $100 billion annually, while advocacy groups have long cautioned against exposing minors to inappropriate content and individuals. Meta’s own estimates suggest that about 100,000 children using Facebook and Instagram face online sexual harassment daily. This is especially concerning given that internal communications from July 2020 described the company’s measures to prevent child grooming on the platform as, at best, “between zero and negligible.” A lawsuit in New Mexico claims that Meta’s social networks, including Instagram, have essentially become a haven for child predators. (Meta denies the core allegations and has called the lawsuit “unfair.”)

Last year, the company finally made teen accounts mandatory on Instagram. However, a recent study led by a whistleblower found that 64% of the new safety features designed for teens were ineffective.

Research indicates that 47% of young teen users on Instagram encounter unsafe content, and 37% of users aged 13 to 15 receive at least one unsafe piece of content or unwanted message weekly. This includes “approximately 1 in 7 users viewing self-harm content, unwanted sexual content, discriminatory content, or substance-related content every week.”

“These failures showcase a corporate culture at Meta that prioritizes engagement and profit over safety,” said Andy Burrows, CEO of the UK’s Molly Rose Foundation, which advocates for stronger online safety legislation and was part of the research team, the BBC reported. A spokesperson for Meta countered that the study “misrepresents our commitment to empowering parents and protecting youth, and mischaracterizes the functionality of our safety tools and their use by millions of parents and youth.”

The measures introduced last year followed a significant moment for Meta’s public image. In January 2024, the leaders of the world’s major social media firms were summoned before the U.S. Senate to discuss their safety policies, and Meta CEO Mark Zuckerberg apologized to parents whose children had allegedly been harmed by social media.

Despite Instagram’s long struggle to address these concerns, it appears to repeatedly put children at risk and apologize only afterward. On Monday, Reuters reported that Meta’s own internal research found that teens who frequently felt bad about their bodies encountered three times more “eating disorder-related content” on Instagram than their peers. Alarmingly, technology companies and social media platforms have become so entrenched in everyday life that it is nearly impossible to participate in society without them, particularly for children.

So what is the solution? First, we must treat online spaces as extensions of the real world rather than mere digital counterparts. Social media platforms replicate real-world violence and can cause other tangible harms, putting children at heightened risk.

It’s essential for lawmakers to require these companies to incorporate safety measures into their design processes rather than treating them as an afterthought. Equally vital is for parents to educate their children on online safety, just as they would about physical safety in public.

The technology developed by these profit-driven companies is pervasive. If we cannot rely on them to safeguard our most vulnerable users, it falls upon us to ensure our own protection.

Source: www.theguardian.com

A Mother’s Quest to Find Her Trafficked Daughter Despite Instagram’s Inaction

Robin Corey’s daughter Kristen was 15 years old when she was allowed to open her own Instagram account. “We considered ourselves responsible and did everything we could to ensure her safety,” Corey says. Months later, Kristen disappeared from her parents’ home after being groomed by a criminal gang through Instagram’s direct message service and sold for sex on the streets of Houston.

Corey said her daughter never recovered from the ordeal. Kristen returned home, but after she was trafficked again, she went missing. Her mother does not know if she is still alive.

Corey blames the gang that trafficked her daughter for destroying her life. She also blames Instagram, which she believes played a key role in her daughter’s sex trafficking. “If Instagram didn’t exist, none of this would have happened to my daughter,” she says. “Instagram made it so easy [for these people] to do this.”

This week, Mark Zuckerberg, the founder of Meta, the company that owns Instagram, Facebook and WhatsApp, and whose wealth is estimated at $139bn (£109bn), appeared before a U.S. Senate committee hearing on Big Tech and the online child sexual exploitation crisis.

Meta CEO Mark Zuckerberg addresses the audience at this week’s U.S. Senate committee hearing on online child sexual exploitation. Photo: Evelyn Hockstein/Reuters

During the hearing, Zuckerberg and other social media executives were questioned about the extent of harm to children on their platforms. Zuckerberg denied there was any evidence of a “causal” link between social media and poor mental health among young people.

He then turned to the parents gathered at the hearing and apologized to those whose children had suffered sexual exploitation and harassment.

Corey has no time for Zuckerberg’s apology. “If he wanted to, he could stop all the harm that’s happening on his platform,” she says. “Words are not enough. He needs to act.”

She said her daughter was a “normal, happy child” until 2019, when she became the target of a criminal organization operating a prostitution ring in downtown Houston.

Boys working as gang scouts at Kristen’s school identified her as a potential target, and she was flooded with messages from gang members through the direct message feature on her Instagram account.

“They promised her a Louis Vuitton bag in an Instagram message, promised her a yacht trip, things like that,” Corey says. “They’d say, ‘Come party with me.’”

A few months after she started receiving messages, the gang lured Kristen out of her parents’ home. Two weeks later, police found her in Houston’s red-light district.

“[When she came home] she started talking more about what they had done to her,” Corey says. “Out of a baby’s mouth came horrifying stories about what she had been forced to do.”

Corey said Kristen returned to her family traumatized, frightened, angry, and a different person.

“I searched for my daughter in some really dark places,” Robin Corey says. “We are conducting DNA testing on remains that match her description.” Photo: Tola Olawale/The Guardian

Source: www.theguardian.com

Instagram head confirms that a Threads API is in development

Instagram head Adam Mosseri said today that a Threads API is in development, which would give developers the opportunity to build different apps and experiences around Threads.

Mosseri was responding to journalist Casey Newton, who had been talking with users about a TweetDeck-like experience for Threads. The Instagram head expressed concern that publishers could post so much content that it would overshadow content from creators.

“We’re working on it. My concern is that it means more publisher content rather than creator content, but it still seems like something that needs to be resolved,” Mosseri said in a post.

Threads has taken the position that, while it is not “anti-news,” it “does not actively amplify news.” Until now, news publications have relied on third-party tools and integrations with various social networks to post automatically to platforms like Twitter, LinkedIn, and Facebook. Newer platforms like Threads don’t have APIs available, so publishers have to post content manually, which isn’t practical for news organizations that publish a large number of articles per day.
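To make that pain point concrete, here is a rough Python sketch of the kind of publish-hook automation newsrooms use on networks that do offer an API. Everything in it is hypothetical: Threads had no public API at the time, and the endpoint, payload, and token below are placeholders rather than a real service.

```python
import requests

# Placeholder endpoint and token -- Threads exposed no public API when
# this was written; this only illustrates the general pattern used on
# networks that do (e.g. X, LinkedIn, Facebook).
API_BASE = "https://social.example/v1"
ACCESS_TOKEN = "YOUR_TOKEN_HERE"

def auto_post(headline: str, url: str) -> str:
    """Publish one article link and return the new post's ID."""
    resp = requests.post(
        f"{API_BASE}/posts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"text": f"{headline} {url}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["id"]

# A CMS would call this from its publish hook for every new article,
# instead of an editor pasting links into the app by hand.
if __name__ == "__main__":
    print(auto_post("Threads API is in the works, Mosseri says",
                    "https://techcrunch.com/example-story"))
```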

Mosseri is concerned that publishers could push an overwhelming amount of content through API integrations, but creators also need tools to post content in different formats, and API integrations make it easy for developers to build that kind of platform-appropriate functionality.
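As one example of format-aware creator tooling, long-form writing has to be chopped into a numbered chain of posts before it fits a microblogging network. The short Python sketch below assumes Threads’ 500-character post limit; the numbering format and splitting strategy are illustrative choices, not anyone’s actual tool.

```python
def split_into_thread(text: str, limit: int = 500) -> list[str]:
    """Split long-form text into a numbered chain of posts.

    500 characters is Threads' per-post limit; other networks differ,
    which is exactly why creators want format-aware tooling.
    """
    reserve = len(" (99/99)")  # leave room for the numbering suffix
    chunks, current = [], ""
    for word in text.split():
        if len(current) + len(word) + 1 > limit - reserve:
            chunks.append(current)
            current = word
        else:
            current = f"{current} {word}".strip()
    if current:
        chunks.append(current)
    total = len(chunks)
    return [f"{c} ({i}/{total})" for i, c in enumerate(chunks, start=1)]

thread = split_into_thread("word " * 400)
for post in thread:
    print(len(post), post[-10:])  # each post stays under the limit
```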

Social networks like Twitter (now X) and Reddit have made it difficult for third-party developers to write clients, so by exposing an API, Threads could foster a healthier app ecosystem. Developers have written several clients for competing networks such as Bluesky and Mastodon, although both networks have far smaller user bases than Threads.
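Mastodon illustrates what that openness looks like in practice: its REST API is documented and, for public data, requires no authentication, which is precisely what makes third-party clients possible. A minimal sketch of fetching a public timeline; the instance choice and output formatting are our own.

```python
import requests

# Mastodon's documented, unauthenticated endpoint for public posts.
INSTANCE = "https://mastodon.social"

resp = requests.get(
    f"{INSTANCE}/api/v1/timelines/public",
    params={"limit": 5, "local": "true"},
    timeout=10,
)
resp.raise_for_status()

for status in resp.json():
    author = status["account"]["acct"]
    # "content" is HTML; a real client would render or strip the tags.
    print(f"@{author}: {status['content'][:80]}")
```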

Earlier this week, Meta announced that Threads has just under 100 million monthly active users. An API and a third-party app ecosystem won’t necessarily drive that number higher, but they would give people different ways to explore the network. The Threads team has released a number of features in the months since launch; with an ecosystem of third-party apps in place, developers could also ship the variety of features users are looking for.

Additionally, Meta and Mosseri have talked about integrating Threads with the fediverse, so an open ecosystem with well-managed APIs would be a good step toward that goal.
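For a sense of what that integration involves: ActivityPub servers in the fediverse discover one another’s accounts via WebFinger (RFC 7033), a lookup Threads would also need to support. A minimal Python sketch against Mastodon’s live implementation; the example handle is simply a well-known public account.

```python
import requests

def resolve_actor(handle: str) -> str:
    """Resolve a fediverse handle to its ActivityPub actor URL.

    WebFinger (RFC 7033) is the discovery step every ActivityPub
    server implements; Threads would need the same to interoperate.
    """
    user, domain = handle.lstrip("@").split("@")
    resp = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{domain}"},
        timeout=10,
    )
    resp.raise_for_status()
    for link in resp.json()["links"]:
        if link.get("rel") == "self" and "activity" in link.get("type", ""):
            return link["href"]
    raise ValueError(f"No ActivityPub actor found for {handle}")

print(resolve_actor("@Gargron@mastodon.social"))
# -> https://mastodon.social/users/Gargron
```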

Source: techcrunch.com