UK Criminalizes Pornographic Content Involving Strangulation

The act of strangulation in pornography, often referred to as ‘choking’, will be criminalized, with legal obligations imposed on technology platforms to prevent users in the UK from accessing such content.

Proposed amendments to the Crime and Policing Bill introduced in Parliament on Monday will make it illegal to possess or distribute pornography that features choking.

An additional amendment aims to extend the timeframe for victims of intimate image abuse to come forward, increasing the prosecution limit from six months to three years.

The government stated that these changes would eliminate unnecessary obstacles for victims reporting crimes, thus “enhancing access to justice for those in need.”

The choking ban follows a government review which suggested pornography was helping to establish strangulation as a “sexual norm.”

The Independent Pornography Review, initiated by former chancellor Rishi Sunak and conducted by Baroness Gabby Bertin, published its findings in February, recommending a prohibition on pornography featuring strangulation.

Despite the common belief that strangulation can be performed safely, studies indicate that it poses significant risks. While there may be no visible injuries, oxygen deprivation—even for brief moments—can cause detrimental changes to the brain’s delicate structures.

Research has revealed specific alterations in the brains of women who have been subjected to choking during sexual activity, including indicators of brain damage and disruption between the hemispheres associated with depression and anxiety.

Given these dangers, non-fatal strangulation and non-fatal asphyxiation were categorized as criminal offenses in the Domestic Abuse Act 2021, amid concerns that offenders often escape punishment due to the absence of visible injuries.

The new amendments will mandate platforms to take proactive measures to block users’ access to illegal content involving strangulation and suffocation.

Choking-related offenses will be prioritized, imposing a legal responsibility on pornographic sites and tech platforms to ensure UK users cannot view such material.

The Ministry of Justice indicated that this might involve the use of automated systems for the detection and removal of images, moderation tools, or stricter content policies to hinder the spread of abusive material.

Failure to comply could result in fines of up to £18 million imposed by Ofcom.

Barney Ryan, CEO of the Strangulation Research Institute, expressed support for the ban, stating, “While consenting adults should have the freedom to safely explore their sexuality, we must recognize the severe risks posed by unregulated online content, particularly to children and young people.

“Strangulation represents a severe form of violence, often employed in domestic violence for control, silence, or to induce fear. Its portrayal in pornography, especially without context, can impart confusing and harmful messages to youth regarding what is normal and acceptable in intimate relationships. Our research confirms that there is no safe way to strangle.”

Alex Davis-Jones, Minister for Victims and Violence Against Women and Girls, said online misogyny “has devastating real-world impacts on all of us. Daily, women and girls have their lives disrupted by cowards who abuse and exploit them from behind screens.”

“This government will not remain passive while women face online violations and become victims of normalized and violent pornography.

“We are delivering a strong message that dangerous and sexist behavior will not be tolerated.”

This initiative comes on the heels of a government-commissioned inquiry in 2020 that found “significant evidence” of a link between pornography use and harmful sexual attitudes and behaviors toward women.

Additionally, a study conducted that year by the British Board of Film Classification found that many children had encountered violent or offensive pornography, which left them feeling upset or anxious; some even mimicked the behaviors they had seen online. Children who engaged with pornography were three to six times more likely to participate in “potentially risky behavior” concerning consent.

Source: www.theguardian.com

FBI and NSPCC Alarmed by “Shocking” Increase in Online Sextortion Cases Involving Children

Tech firms like Snapchat and Facebook disclosed over 9,600 instances of adults grooming children online within a mere six months last year, averaging around 400 cases weekly.

Law enforcement agencies, such as the FBI and the UK’s National Crime Agency (NCA), are increasingly alarmed by the rising threats posed by various crimes targeting minors.

In 2023, the U.S.-based National Center for Missing and Exploited Children (NCMEC) documented 546,000 reports concerning children, submitted by technology companies worldwide.

Of these, approximately 9,600 reports originated from the UK during the first half of 2024. Records indicate that Snapchat reported significantly more distressing content to NCMEC than any other platform during this timeframe.

The NSPCC, a child welfare charity, termed the statistic “shocking,” suggesting that it is likely an underrepresentation.

The NCA is launching an “unprecedented” campaign in the UK aimed at informing teachers, parents, and children about the perils of sexual exploitation.

The NCA emphasized: “Sextortion is a cruel crime that can lead to devastating outcomes for victims. Tragically, teenagers in the UK and worldwide have taken their lives as a result.”

NCMEC’s data is crucial as it is derived from reports submitted by online platforms and internet providers—such as Snapchat, Instagram, and TikTok—rather than from victims, who may feel hesitant to disclose their abuse.

High-tech companies are mandated by U.S. law to report suspicious content to NCMEC. The data indicates that Snapchat reported around 20,000 pieces of concerning material in the first half of 2023, including instances of sextortion and child sexual abuse material.

This number surpasses the combined total of reports submitted by Facebook, Instagram, TikTok, X (formerly Twitter), Google, and Discord. Snapchat revised its policy on reporting such content last year, which is believed to have resulted in lower subsequent figures.

Rani Govender from NSPCC remarked that sextortion and other profit-driven sexual offenses have a profoundly “devastating” impact on young individuals, hindering their ability to seek help and, in some cases, leading to suicide.

NCMEC revealed that they are aware of “more than three dozen” teenage boys globally who have taken their lives after falling victim to sextortion since 2021.

Govender noted that some tech companies misjudge the scale of abuse occurring online, and that protections such as end-to-end encryption can limit their ability to detect it.

In contrast to certain other platforms, Snapchat does not employ end-to-end encryption for text-based messaging.

Authorities are increasingly worried that predators are utilizing more sophisticated methods to target children online.

The Guardian has uncovered a 101-page manual that provides detailed instructions on how to exploit young internet users, including recommendations on which mobile phones, encryption tools, and apps to use, as well as manipulative tactics.

This document instructs users on how to ensnare victims as “modern slaves” by obtaining explicit images, followed by coercive demands.

The guide is purportedly authored by a 20-year-old individual named Baron Martin from Arizona, USA. Arrested by the FBI in December, he refers to himself as the “king of terror.” According to the U.S. Department of Justice, Martin was a “catalyst for widespread control.”

Researchers report that the sextortion manual has been circulated among numerous “com networks”: online communities that promote sadistic and misogynistic material while encouraging criminal behavior.

The FBI has pinpointed numerous online gangs collaborating to identify vulnerable victims and approach them with feigned romantic interest in order to obtain compromising material.

That material is then used to blackmail victims, often coercing them into producing further explicit imagery, self-harm, and other acts of violence and animal cruelty.

Milo Comerford, a researcher at the Institute for Strategic Dialogue (ISD) think tank, emphasized that “robust multi-agency” measures are urgently needed to raise awareness of the risks of sextortion among young people, parents, guardians, teachers, and others.

He added, “These transnational networks operate within a constantly shifting landscape of victims, groomers, and abusive entities utilizing social media platforms, sometimes leading to mass violence.”

Both Snapchat and Facebook have been approached for comment on this matter.

Source: www.theguardian.com

Big Tech Figures, Including Elon Musk, Dispute President Trump’s $500 Billion AI Investment Announcement

Major technology giants criticized their competitors following Donald Trump’s announcement of significant investments in AI the day before.

President Trump revealed Stargate, a $500 billion initiative funded by OpenAI, Oracle, and SoftBank. The announcement featured the leaders of all three companies: Sam Altman, Larry Ellison, and Masayoshi Son, with Son serving as the project’s chairman. A representative from Abu Dhabi’s state-run AI fund MGX, another major investor, was notably absent.

The partnership aims to establish data centers and computing infrastructure crucial for AI development. While the initial investment is substantial, estimates suggest that developing advanced AI will require at least that much funding.

Notably missing from the event was Elon Musk, CEO of Tesla, SpaceX, and xAI, who is also the wealthiest person globally. Despite Musk’s close ties to Trump and rumored office in the White House, he dismissed Stargate as a financial sham the following night.

When OpenAI announced on X (the social network Musk owns) that it would immediately deploy $100 billion, Musk countered that the backers did not have the money, claiming SoftBank had secured less than $10 billion in funding. Musk, with a net worth of about $430 billion, posts prolifically on a wide variety of subjects.

President Trump has yet to respond to Musk’s comments, posting instead about his anniversary with Melania on his own social network, Truth Social.

Musk continued his criticism on X, sharing a leaked image of a research tool supposedly used to calculate Stargate’s $500 billion cost, and spent much of Wednesday afternoon attacking the project.

Sam Altman initially praised Musk’s work but later questioned his motives for criticizing SoftBank. Satya Nadella, CEO of Microsoft, responded diplomatically when asked about the situation, emphasizing Microsoft’s plans to invest in Azure.

The tension between Musk and Altman dates back to their history at OpenAI, where Musk eventually parted ways with Altman. The heads of Oracle and SoftBank involved in Stargate have not yet spoken on the matter.

Source: www.theguardian.com

Leak Indicates Israel Attempted to Prevent US Lawsuit Involving Pegasus Spyware

Leaked documents suggest the Israeli government intervened in a high-stakes U.S. lawsuit that could reveal closely guarded secrets about a hacking tool called Pegasus, with the Israeli authorities seizing Pegasus documentation from its maker, NSO Group, to prevent the disclosure of sensitive information.

Pegasus is used to infect smartphones with hidden software that can extract data and spy on users. NSO Group’s customers include both authoritarian regimes and democracies, raising concerns about human rights abuses.

NSO has been fighting a lawsuit brought by WhatsApp alleging that the company exploited vulnerabilities in the messaging app to compromise users in multiple countries. Israel’s close ties with NSO, and the impact of the document seizures on the legal battle, have raised questions about the country’s involvement.

Media organizations are trying to uncover the details of the seizures and Israel’s interference in the case, shedding light on the complex relationship between NSO, Israel, and the legal system.

The documents obtained reveal the extent of Israel’s efforts to protect NSO from disclosing sensitive information demanded by the U.S. court, impacting the ongoing legal proceedings.

Israel’s covert actions have complicated WhatsApp’s attempts to obtain crucial information from NSO, highlighting the challenges faced in the legal battle.

The leaked files and emails provide insight into the behind-the-scenes activities and the attempts to prevent the exposure of sensitive information related to the Pegasus spyware.

“Strange Procedure”

Israel’s intervention in the lawsuit has raised concerns about the transparency of the legal process and the protection of national interests.

The legal battle between WhatsApp and NSO has revealed the broader complexities of the case, with NSO’s actions and Israel’s behind-the-scenes involvement significantly influencing the course of the lawsuit.

Additional reporting by Phineas Rueckert and Karine Pfenniger of Forbidden Stories.

Source: www.theguardian.com

Tesla Emerges Victorious in Jury Trial Over Fatal Autopilot Crash

Tesla scored another victory Tuesday after a jury sided with the company over charges that its advanced driver assistance system, Autopilot, caused a fatal crash.

The lawsuit, being heard in California’s Riverside County Superior Court, was brought by two surviving passengers in a 2019 crash, alleging that Tesla knew its products were defective. The two survivors sought $400 million in damages for the driver’s loss of life, physical injuries, and emotional distress.

Tesla maintains that the crash that killed driver Mika Lee was the result of human error, and has taken a similar position in other Autopilot lawsuits.

Tesla has won other lawsuits, including a jury trial in California earlier this year that determined the automaker’s Autopilot system was not responsible for a 2019 crash. In that case, a jury awarded no damages to Los Angeles resident Justin Hsu, who sued Tesla in 2020 alleging negligence, fraud and breach of contract. The case that concluded Tuesday was the first Autopilot suit involving a fatality to be decided by a jury.

Tesla still faces a number of other lawsuits in California. That includes a wrongful death suit filed in California Superior Court in Santa Clara County by the family of Apple engineer Walter Huang, who was killed when his Tesla Model X, equipped with Autopilot, crashed into a highway median; the California Department of Transportation is also named in the suit. The lawsuit alleges that the crash that killed Huang on March 23, 2018 was caused by an error in Tesla’s Autopilot driver assistance system. Huang, 38, was driving a 2017 Tesla Model X when it crashed into a freeway barrier on Highway 101 in Mountain View, California. A jury trial in the case is scheduled to begin next year.

Tesla also faces scrutiny from federal and state regulators, all related to Autopilot and its upgraded version known as Full Self-Driving.

Tesla cars come standard with a driver assistance system called Autopilot. Owners can purchase Enhanced Autopilot as a $6,000 upgrade, which adds several other features, such as an active guidance system that navigates the car from highway on-ramp to off-ramp, handling interchanges and lane changes along the way.

For an additional $12,000, owners can purchase “Full Self-Driving” (FSD), a feature that CEO Elon Musk has promised for years will one day deliver fully autonomous driving.

Tesla cars are not self-driving, however. Instead, FSD bundles a number of automated driving features that require the driver to remain in control at all times. It includes everything in Enhanced Autopilot and is also supposed to handle steering on city streets and recognize and react to traffic lights and stop signs.

Source: techcrunch.com