Interestingly, recent studies indicate that individuals with higher intelligence often experience earlier puberty but tend to have children later and fewer overall.
This appears contradictory from a biological standpoint, as earlier puberty typically signifies readiness for reproduction.
However, an analysis of data from thousands of people in the UK and US revealed that more intelligent individuals tend to progress more slowly through key reproductive milestones.
They often begin sexual activity later, have fewer sexual partners, delay marriage, and have their first child at an older age.
Research suggests that this may stem from the fact that highly intelligent individuals enjoy greater opportunities, pursue extended education, embark on ambitious career paths, and prioritize personal goals before contemplating family life.
In some instances, they might even choose not to have children.
Long-term Mental Health Consequences During Adolescence
The timing of puberty can significantly affect how adolescents perceive themselves.
For instance, research has shown that girls entering puberty early are more susceptible to body image issues, anxiety, and low self-esteem due to feeling different from their peers and lacking readiness for the transformation.
These feelings can have profound implications for mental health, often extending into adulthood.
Late puberty can present significant challenges for boys as well. A study reveals that boys who bloom later are often dissatisfied with their bodies, particularly because their muscle development may lag behind.
This dissatisfaction can contribute to low self-esteem and a sense of inadequacy.
Additionally, some boys encounter teasing, social pressure, and bullying, potentially leading to depression and other mental health issues.
While these feelings may diminish over time, they can leave lasting effects and elevate the risk of mental health concerns later on.
Risks Associated with Early or Late Puberty
A major UK study indicates that individuals entering puberty earlier than average are at a higher risk for developing type 2 diabetes and heart disease in adulthood.
Conversely, those with delayed development are more prone to asthma.
Researchers postulate that the timing of hormonal shifts can impact factors such as weight, stress levels, and lifestyle habits.
While early or late onset of puberty might lead to issues, it is not necessarily alarming. Everyone matures at their own pace.
If you have concerns regarding your child’s development or health, it’s advisable to consult your doctor for guidance.
This article addresses the inquiry (from Natalie Montagu in Stockport): “What impact does the timing of puberty have on a person’s long-term health?”
If you have questions, feel free to email us at questions@sciencefocus.com or send us a message on Facebook, Twitter, or Instagram (please include your name and location).
A report from the campaign group Global Witness reveals that TikTok is guiding children's accounts towards pornographic content within just a few clicks.
Global Witness researchers created fake accounts registered with a date of birth indicating the user was 13, and activated the app's restricted mode, which is designed to limit exposure to "sexually suggestive" material.
Researchers discovered that TikTok suggested sexual and explicit search phrases for seven test accounts established on new mobile devices with no prior search history.
The suggested terms, shown under the app's "you may like" feature, included "very rude and revealing attire" and "very rude babe", escalating to phrases like "hardcore porn clip". Sexual search suggestions appeared immediately for three of the accounts.
After just "a few clicks," researchers encountered pornographic material ranging from nudity to explicit sexual acts. Global Witness said some content evaded moderation by being embedded in seemingly innocuous photos or videos. For one account, reaching explicit content took only two clicks: one on the search bar and one on a suggested search term.
Global Witness, an organization that investigates climate issues and Big Tech's impact on human rights, conducted two rounds of testing: one before and one after the child protection duties of the UK's Online Safety Act (OSA) came into force on July 25.
Two videos featuring individuals who appeared to be under 16 were reported to the Internet Watch Foundation, the body responsible for monitoring online child sexual abuse material.
Global Witness accused TikTok of breaching the OSA, which mandates tech companies to shield children from harmful content, including pornography.
A spokesperson for Ofcom, the UK communications regulator, said it would review the study's findings and assess the results.
Ofcom's codes of practice stipulate that tech companies whose services carry a medium or high risk of showing harmful content must configure their algorithms to filter harmful material out of children's feeds. TikTok's own content guidelines expressly prohibit pornographic material.
In response to Global Witness’s concerns, TikTok confirmed the removal of troubling content and modifications to its search recommendations.
"Upon becoming aware of these issues, we promptly launched an investigation, removed content that breached our policies, and began improving our search suggestion features," a spokesperson said.
Since the implementation of stringent age verification measures last month, visits to popular adult websites in the UK have seen a significant decline, according to recent data.
Daily traffic to PornHub, the most frequented porn site in the UK, dropped by 47%, from 3.6 million on July 24 to 1.9 million on August 8.
Data from the digital market intelligence firm Similarweb indicates that the next most popular platforms, XVideos and XHamster, saw declines of 47% and 39% respectively over the same period.
As reported initially by the Financial Times, this downturn seems to reflect the enforcement of strict age verification rules commencing on July 25 under the Online Safety Act. However, social media platforms implementing similar age checks for age-restricted materials, like X and Reddit, did not experience similar traffic declines.
A representative from Pornhub remarked, “As we have observed in various regions globally, compliant sites often see a decrease in traffic, while non-compliant ones may see an increase.”
The Online Safety Act aims to shield children from harmful online content, mandating that any site or app providing pornographic material must prevent access by minors.
Ofcom, the body overseeing the law in the UK, endorses age verification methods such as: verifying age via credit card providers, banks, or mobile network operators; matching photo ID with a live selfie; or using a "digital identity wallet" containing proof of age.
Additionally, the law requires platforms to block access to content that could be harmful to children, including materials that incite self-harm or promote dangerous behaviors, which has sparked tension over concerns of excessive regulation.
Ofcom contends that the law does not infringe upon freedom of expression, highlighting clauses intended to protect free speech. Non-compliance can lead to penalties ranging from formal warnings to fines amounting to 10% of global revenue, with serious violations potentially resulting in websites being blocked in the UK.
Nigel Farage's Reform UK party has vowed to repeal the act since the age verification requirements took effect, sparking a heated exchange in which Farage accused the technology secretary, Peter Kyle, of making inappropriate comments.
The introduction of age checks has also led to a surge in virtual private network (VPN) downloads, as users seek to circumvent national restrictions on certain websites. VPN applications have frequently occupied the top five spots in Apple's App Store.
In the last two weeks, numerous "adults only" and "not safe for work" games have been removed from Steam and itch.io, two leading platforms for PC game distribution, as they scramble to comply with stricter rules set by payment processors such as Mastercard, Visa, and PayPal.
These rules followed a campaign by a group called Collective Shout, which pressured payment processors to stop facilitating transactions for platforms hosting "games related to rape, incest, and child sexual abuse." However, the scope of the new rules extends far beyond those categories, affecting even award-winning titles.
How did this begin?
On July 16th, Valve, the company behind Steam, revised its rules and guidelines for game distribution. Existing prohibitions against "nude or sexually explicit images of real people" and "adult content that is not labeled or age-gated" were expanded to cover "certain kinds of adult-only content" that may violate the rules and standards set by Steam's payment processors and related card networks and banks.
In a statement to PC Gamer on July 18th, Valve confirmed that several games had been "retired" from the Steam store under the new rules. It did not specify which games were removed or define what types of "adult content" are deemed unacceptable.
A week later, itch.io said it too was being "scrutinized" by payment processors. All games tagged "NSFW" were consequently "deindexed" (removed from search results) pending "a comprehensive audit of content to comply with payment processor requirements." Unlike Valve, itch.io explicitly named Collective Shout's campaign as the trigger for the increased scrutiny.
What is Collective Shout?
Collective Shout is an Australian group describing itself as “a grassroots campaign against the objectification of women and the sexualization of girls.”
In April, it achieved a major victory when the game No Mercy, featuring themes of "incest," "coerced sexual acts," and "unavoidable non-consensual sex," was removed from both Steam and itch.io. The group then published an open letter stating, "We have identified hundreds of additional games on Steam and itch.io that involve themes of rape, incest, and child sexual abuse," and demanded that companies "demonstrate corporate social responsibility" by ceasing payment processing for those platforms.
After Steam updated its rules, Collective Shout claimed credit for the change, saying that over 1,000 supporters had contacted their payment processors to "stop financing these games."
What is the role of payment processors?
Like many online retailers, both Steam and itch.io depend on payment processors to facilitate transactions. As such, these processors wield significant influence, impacting what products can be sold, published, or purchased, regardless of their legality.
In recent years, payment processors have tightened rules around transactions linked to adult content. In 2021, the subscription platform OnlyFans announced a ban on sexually explicit material under pressure from payment processors, only to reverse the decision shortly afterwards. Later that year, Mastercard introduced a new policy governing adult content retailers. The American Civil Liberties Union has warned that such measures can restrict free speech and harm sex workers, amounting to a form of financial censorship.
The award-winning game Consume Me has been affected by these policies.
Which games have been impacted?
It's unclear how many games have been "retired" from Steam under the new rules, but several titles featuring incest themes have been removed. As noted above, itch.io has also deindexed most games labeled NSFW. According to the games industry newsletter Game File, more than 20,000 games have disappeared from itch.io's NSFW category since July 16th.
Among those affected are games exploring themes of identity and sexuality, such as Radiator 2 by Robert Yang, a former faculty member at New York University's Game Center. The rules have also caught games with no sexual content at all, including Last Call, an award-winning narrative game by Nina Freeman about surviving domestic abuse, and Consume Me, which has received numerous industry accolades for its handling of difficult subject matter.
How has the gaming industry reacted?
Many developers have criticized the power payment processors hold over the market, particularly as amplified by campaign groups like Collective Shout. In a post on Bluesky, Yang described the actions of Collective Shout and the payment processors as "a cultural war against sexual expression, particularly affecting LGBTQ individuals," and said it is entirely unacceptable for payment processors to engage in selective censorship that systematically marginalizes adult content creators.
On July 17, a petition was launched on change.org, urging payment processors and activist groups to “refrain from controlling what we can watch, read, or play.” The petition argues that “MasterCard and Visa hinder legal entertainment” and calls for “the right to select our own narratives without moral surveillance.” Over 150,000 signatures have been gathered so far, as gamers and developers share contact information for major payment companies to encourage complaints.
What comes next?
It's difficult to predict what happens next. Sustained consumer pressure could prompt payment processors to reconsider their stance, but anti-censorship advocates may struggle to win political backing as legislative attitudes toward online adult content shift. The UK recently introduced stricter age verification requirements for internet users seeking adult material, and the EU has published draft guidelines for similar measures.
Thus, new rules may be here to stay on Steam and itch.io for now. However, the resulting fallout has illuminated the influence of payment processors and the ambiguity surrounding the regulations. Such uncertainty may compel companies to tread more carefully in responding to pressure from future advocacy groups.
Online safety for children in the UK is reaching a pivotal moment. Starting this Friday, social media and other internet platforms must take steps to safeguard children or face substantial fines for non-compliance.
This marks a critical test of the Online Safety Act, a landmark regulation covering platforms such as Facebook, Instagram, TikTok, YouTube, Google, and others. Here's an overview of the new rules.
What will happen on July 25th?
Companies subject to the law are required to implement safety measures that shield children from harmful content. Specifically, all pornography sites must establish stringent age verification protocols. According to Ofcom, the UK communications regulator, 8% of children aged 8 to 14 accessed online pornographic sites or apps within a month.
Furthermore, social media platforms and major search engines must block access for children to pornography and content that promotes or encourages suicide, self-harm, and eating disorders. This may involve completely removing certain feeds for younger users. Hundreds of businesses will be impacted by these regulations.
Platforms must also minimize the distribution of other potentially harmful content, such as promoting dangerous challenges, substance abuse, or instances of bullying.
What are the suggested safety measures?
Recommended measures include: recommendation algorithms must exclude harmful material from what they suggest to children; all sites and apps must have procedures to take down dangerous content quickly; and children must have a straightforward way to report concerns. Businesses retain some flexibility if they believe alternative measures will meet their child safety duties equally well. A sketch of the first requirement follows below.
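As a rough illustration of that first requirement, here is a minimal Python sketch. The labels, field names, and data structures are entirely hypothetical (the act does not prescribe any implementation, and real platforms use ML classifiers with far richer taxonomies); it only shows what "excluding harmful material from a child's feed" amounts to:

```python
# Illustrative harm categories drawn from the law's examples; a real
# system would assign these via content classifiers, not hand labels.
HARMFUL_LABELS = {"pornography", "suicide_or_self_harm", "eating_disorder"}

def filter_feed(candidates: list[dict], user_is_minor: bool) -> list[dict]:
    """Drop any candidate item whose labels intersect the harmful set
    before it reaches a child's recommendation feed."""
    if not user_is_minor:
        return candidates
    return [item for item in candidates
            if not (item["labels"] & HARMFUL_LABELS)]

feed = [
    {"id": 1, "labels": {"cooking"}},
    {"id": 2, "labels": {"suicide_or_self_harm"}},
]
print(filter_feed(feed, user_is_minor=True))  # only item 1 survives
```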
Services deemed "high risk", such as major social media platforms, must use "highly effective" age verification methods to identify users under 18. A platform that hosts harmful content but does not run age checks must assume children are among its users and ensure they have a safe, age-appropriate experience.
X states that if it cannot determine a user’s age as 18 or older, it defaults to sensitive content settings, thereby restricting adult material. They are also integrating age estimation technology and ID verification to ensure users are not underage. Meta, the parent company of Instagram and Facebook, claims to have a comprehensive approach to age verification that includes a teen account feature set by default for users under 18.
Mark Jones, a partner at the law firm Payne Hicks Beach, said firms like his are helping companies work out what the Online Safety Act requires of them and clarifying the rules where they are unclear.
The Molly Rose Foundation, set up by the family of British teenager Molly Russell, who took her own life in 2017 after viewing harmful online content, is campaigning for further changes, including a ban on dangerous online challenges and a requirement that platforms proactively reduce depression- and body image-related content.
How will age verification be implemented?
Age verification methods that Ofcom supports for pornography providers include: estimating a person's age from a live photo or video (facial age estimation); verifying age via a credit card provider, bank, or mobile network operator; matching photo ID against a live selfie; and using a "digital identity wallet" containing proof of age.
Ria Moody, a lawyer at Linklaters, said age verification measures must be "highly effective": Ofcom has made clear that weaker approaches, such as simply asking users to declare they are over 18, do not meet that bar, so platforms cannot rely on them.
What does this mean in practice?
Pornhub, the UK's most visited online porn site, has said it will introduce a "regulator approved age assurance method" by Friday, though it has not disclosed which. Another adult site, OnlyFans, already uses facial age estimation software that judges a user's age from a facial scan without storing the image; the underlying model is trained on millions of other faces. The software is provided by a company called Yoti, which has also supplied it to Instagram.
Last week, Reddit began verifying the age of users accessing forums and threads containing adult content. The platform uses technology from a company called Persona, which verifies age from an uploaded selfie or a photo of government-issued ID. Reddit says it does not retain the photos, storing only each user's verification status so the check need not be repeated.
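To make that data handling concrete, here is a hedged sketch of the pattern described: persist only a verification outcome and timestamp, never the photo itself. All names and the schema are hypothetical; neither Reddit nor Persona has published its implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AgeVerification:
    """What the platform retains after a check: a status and a
    timestamp, not the selfie or ID image used to prove age."""
    user_id: str
    verified_adult: bool
    verified_at: datetime

def record_verification(user_id: str, provider_says_adult: bool) -> AgeVerification:
    # The uploaded image is handled by the verification provider and
    # discarded; only the boolean outcome is stored with the account.
    return AgeVerification(user_id, provider_says_adult,
                           datetime.now(timezone.utc))

record = record_verification("user_123", provider_says_adult=True)
print(record.verified_adult)  # True, and no photo was ever retained
```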
How accurate is facial age verification?
The software allows a website or app to set a "challenge" age (e.g., 20 or 25) higher than the legal threshold, reducing the chance that underage users slip through on borderline estimates. When Yoti set a challenge age of 20, fewer than 1% of 13- to 17-year-olds were wrongly passed as adults.
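In code, the challenge-age idea is just a stricter threshold with a fallback path. A minimal sketch, assuming hypothetical names and an age estimate supplied by a face-estimation model elsewhere (vendors' actual systems are proprietary):

```python
CHALLENGE_AGE = 20   # deliberately stricter than the legal threshold of 18

def gate_user(estimated_age: float) -> str:
    """Grant access only when the face-estimation result clears the
    challenge age; everyone else falls back to a stronger check
    (e.g. photo ID) rather than being refused outright."""
    if estimated_age >= CHALLENGE_AGE:
        return "access granted"
    return "fallback: request photo ID or other verification"

print(gate_user(24.0))  # access granted
print(gate_user(19.2))  # a 19-year-old is asked for ID, not waved through
```

The buffer between 18 and the challenge age is what absorbs the model's estimation error: adults near the threshold face a minor extra step, while few minors are mistakenly passed.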
What other methods are available?
Another direct approach entails requiring users to present formal identification, like a passport or driver’s license. Importantly, the ID details need not be stored and can be used solely to verify access.
Will all pornographic sites conduct age checks?
They are expected to, but many smaller sites may try to dodge the rules for fear of driving away users. Industry figures suggest that sites inclined to ignore the rules may wait to see how Ofcom responds to violations before deciding their course of action.
How will child protection measures be enforced?
Ofcom has a broad range of penalties at its disposal under the law. Companies can be fined up to £18 million or 10% of global revenue for violations; for Meta, 10% would amount to roughly $16 billion. Sites and apps can also receive formal warnings, and for severe violations Ofcom may seek a court order restricting a site or app's availability in the UK.
Moreover, senior managers at technology firms could face up to two years in prison if they are found criminally liable for repeated breaches of their obligations to protect children and for ignoring enforcement notices from Ofcom.
European officials have initiated an investigation into four adult websites suspected of inadequately preventing minors from viewing adult content.
Following a review of the companies’ policies, the European Commission criticized PornHub, StripChat, XNXX, and XVideos for not implementing adequate age verification procedures to block minors from accessing their sites.
This inquiry has been launched in accordance with the EU’s Digital Services Act (DSA), a comprehensive set of regulations aimed at curbing online harm such as disinformation, cyber threats, hate speech, and counterfeit merchandise. The DSA also enforces stringent measures to safeguard children online, including preventing mental health repercussions from exposure to adult materials.
The commission noted that all four platforms relied on a simple one-click self-declaration for age verification.
"Today marks a significant step toward child protection online in the EU, as the enforcement action we are initiating... clearly indicates our commitment to hold four major adult content platforms accountable for effectively safeguarding minors under the DSA," the commission said.
While no specific deadline has been set for concluding the investigation, officials stressed that they aim to act swiftly on potential next steps based on the platforms’ responses.
The platforms can resolve the investigation by implementing an age verification system recognized as effective by EU regulators. Failure to comply could result in fines of up to 6% of their global annual revenue.
The DSA regulates platforms with over 45 million users, including Google, Meta, and X, while national authorities in each of the 27 member states are responsible for those that fall beneath this threshold.
On Tuesday, the commission announced that StripChat no longer qualifies as a "very large online platform." Following an appeal by its parent company, Techinius Ltd, oversight will pass from Brussels to the national authorities in Cyprus, where the company is based.
However, this new designation will not take effect until September, meaning that the investigation into age verification remains active.
The child protection responsibilities of StripChat will continue unchanged.
Aylo FreeSites, the parent company of Pornhub, is aware of the ongoing investigation and has stated its “full commitment” to ensuring the online safety of minors.
“We are in full compliance with the law,” the company remarked. “We believe the effective way to protect both minors and adults is to verify user age at the point of access through their device, ensuring that websites provide or restrict access to age-sensitive content based on that verification.”
Techinius has been approached for comment, as has a Brussels-based lawyer who has recently represented the parent companies of XVideos (Web Group Czech Republic) and XNXX (NKL Associates) in EU legal matters.