EU Fines Elon Musk’s X €120 Million in First Enforcement of New Digital Law

Elon Musk’s social media platform X has received a €120m (£105m) fine from the European Commission after being found in violation of the bloc’s new digital rules, in a high-profile ruling expected to cause friction with US president Donald Trump.

The violations include what the EU described as “misleading” blue checkmark verification badges given to users, as well as insufficient transparency in advertising practices, which have been under investigation for two years.

The EU’s regulations require tech companies to maintain public registers of advertisers so that fraud, false advertising, and coordinated manipulation of political campaigns can be detected.

Additionally, the EU found that X had not granted sufficient access to public data typically available to researchers observing critical topics like political content.

This significant ruling marks the conclusion of an inquiry that started two years ago.

On Friday, the commission announced that X had failed to meet its transparency responsibilities under the Digital Services Act (DSA), marking the first judgment against the platform since the enforcement of regulations on social media and major tech platforms began in 2023.

In December 2023, the Commission began formal proceedings to determine if X violated the DSA regarding illegal content distribution and the effectiveness of measures to address information manipulation, with the investigation ongoing.

Under the DSA, X could face fines of up to 6% of its global revenue, which is projected to be between $2.5bn and $2.7bn (£1.9bn and £2bn) in 2024.

There are still three additional investigations underway, two of which examine alterations to content and algorithms implemented after Musk’s acquisition of Twitter in October 2022, when it was rebranded to “X.”

The commission is also exploring whether laws against inciting violence or terrorism have been violated.

The commission is also examining the platform’s system for allowing users to report and flag content they suspect to be illegal.

The fine, divided into three components, includes a €45m penalty for the introduction of a purchasable blue “verification” checkmark which, according to senior officials, made it impossible for users to judge the trustworthiness of account holders.

Prior to Musk’s takeover, blue checkmarks were awarded only to verified account holders, including politicians, celebrities, public bodies and established journalists, as well as figures from newer media such as bloggers and YouTubers. Since the acquisition, any user subscribing to X Premium can obtain blue check status.

“With the DSA’s first compliance decision, we aim to hold X accountable for infringing on users’ rights and evading responsibility,” stated Henna Virkkunen, executive vice-president of the European Commission overseeing technology regulation.

“Deceiving users with blue checkmarks, obscuring information in advertisements, or restricting access for researchers is unacceptable online within the EU.”

X was also fined €35 million for advertising violations and €40 million for failures related to data access for researchers.


This ruling could provoke backlash from the Trump administration. Recently, US commerce secretary Howard Lutnick suggested the EU might have to put its tech regulations on the table if it wanted relief from 50% steel tariffs.

His statement was labeled “blackmail” by Spanish commissioner Teresa Ribera.

EU officials asserted that the ruling stands independent of allegations brought forth by a US delegation meeting with trade ministers in Brussels last week. The EU emphasized its right to regulate US tech firms, noting that 25 companies, including non-US entities like TikTok, must adhere to the DSA.

X has 90 days to draft an “action plan” addressing the violations, though Musk, who is on track to become the world’s first trillionaire, remains free to contest the EU’s decision at the European Court of Justice, as other corporations such as Apple have done.

In contrast, the EU announced it had secured a commitment from TikTok to establish an advertising repository, addressing transparency concerns raised with the European Commission earlier this year.

The DSA mandates that platforms maintain accessible and searchable ad repositories to enable researchers and civil society representatives to detect fraudulent, illegal, or age-inappropriate advertisements.

Government officials indicated that the emerging issue of fraudulent political ads and ads featuring impersonated celebrities cannot be adequately analyzed without compliance from social media companies.

X has been contacted for comment. The EU confirmed that the company has been made aware of the decision.

Source: www.theguardian.com

Enforcement of Australia’s Social Media Ban for Users Under 16: Which Platforms Are Exempt?

Australians who use social media platforms such as Facebook, Instagram, YouTube, Snapchat and X will need to prove they are over 16 before the social media ban comes into force in early December.


Beginning 10 December, new rules will come into effect for services the government defines as “age-restricted social media platforms”: those whose significant purpose is to enable social interaction between two or more users and that allow users to post content on the service.

The government has not specified which platforms are included in the ban, implying that any site fitting the above criteria may be affected unless it qualifies for the exemptions announced on Wednesday.

Prime Minister Anthony Albanese noted that platforms covered by these rules include, but aren’t limited to, Facebook, Instagram, X, Snapchat, and YouTube.

Communications Minister Anika Wells indicated that platforms are expected to deactivate accounts held by users under 16 and take reasonable steps to prevent younger users from creating new accounts, misrepresenting their age, or otherwise circumventing the restrictions.


What is an Exemption?

According to the government, a platform will be exempt if it serves a primary purpose other than social interaction.

  • Messaging, email, voice, or video calling.

  • Playing online games.

  • Sharing information about products or services.

  • Professional networking or development.

  • Education.

  • Health.

  • Communication between educational institutions and students or their families.

  • Facilitating communication between healthcare providers and their service users.

Determinations regarding which platforms meet the exemption criteria will be made by the eSafety Commissioner.

In practice, this suggests that platforms such as LinkedIn, WhatsApp, Roblox, and Coursera may qualify for exemptions if assessed accordingly. LinkedIn has previously argued that its service is of no interest to children.


Hypothetically, platforms like YouTube Kids could be exempt from the ban if they satisfy the exemption criteria, particularly as comments are disabled on those videos. Nonetheless, the government has yet to provide confirmation, and YouTube has not indicated if it intends to seek exemptions for child-focused services.


What About Other Platforms?

Platforms not named by the government and that do not meet the exemption criteria should consider implementing age verification mechanisms by December. This includes services like Bluesky, Donald Trump’s Truth Social, Discord, and Twitch.


How Will Tech Companies Verify Users Are Over 16?

A common misunderstanding about the social media ban is that it only affects children’s accounts. In fact, to keep under-16s off social media, platforms must check the age of every account holder in Australia.

There are no specific requirements for how verification should be conducted, but updates from the Age Assurance Technology Trial will provide guidance.

The government has indicated that identity checks can be one accepted form of age verification, but platforms cannot make them the only method offered.

Australia is likely to adopt an approach for age verification comparable to that of the UK, initiated in July. This could include options such as:

  • Allowing banks or mobile phone providers to confirm that a user is over 18.

  • Requesting users to upload a photo to match with their ID.

  • Employing facial age estimation techniques.

Platforms may also estimate a user’s age from account behavior or from the age of the account itself. For instance, anyone who registered on Facebook in 2009 must now be over 16. YouTube has also indicated plans to use artificial intelligence for age estimation.
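The account-age inference is simple arithmetic: the account holder was at least the platform’s minimum signup age when the account was created, and the account has existed ever since. A minimal sketch (the function name and the 13-year signup floor are illustrative assumptions, not anything specified by the Australian rules):

```python
from datetime import date

def minimum_current_age(signup_year, signup_min_age=13, today_year=None):
    """Lower bound on an account holder's age: they were at least
    `signup_min_age` when the account was created (13 is a common
    platform minimum), and the account has existed ever since."""
    if today_year is None:
        today_year = date.today().year
    return signup_min_age + (today_year - signup_year)

# An account created in 2009 implies a holder aged at least 29 in 2025,
# comfortably over the 16-year threshold.
print(minimum_current_age(2009, today_year=2025))  # → 29
```

A recent account, by contrast, proves nothing: an account opened in 2024 only bounds the holder’s age at the signup minimum plus one year, so behavioral or document-based checks would still be needed.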


Will Kids Find Workarounds?

Albanese likened the social media ban to alcohol restrictions, acknowledging that while some children may circumvent the ban, he affirmed that it is still a worthwhile endeavor.

In the UK, where age verification requirements for accessing adult websites were implemented this week, there has been a spike in the use of virtual private networks (VPNs) that conceal users’ actual locations, granting access to blocked sites.

Four of the top five free apps in the UK Apple App Store on Thursday were VPN applications, with the most widely used one, Proton, reporting an 1,800% increase in downloads.


The Australian government expects platforms to implement “reasonable measures” to address how teenagers attempt to evade the ban.


What Happens If a Site Does Not Comply With the Ban?

Platforms that fail to take what the eSafety commissioner deems “reasonable steps” to prevent children from accessing their services may face fines of up to $49.5 million, as determined by the federal court.

What counts as “reasonable steps” will be assessed by the commissioner. Asked on Wednesday, Wells said: “I believe a reasonable step is relative.”

“These guidelines are meant to work, and any mistakes should be rectified. They aren’t absolute settings or rules, but frameworks to guide the process globally.”


Source: www.theguardian.com

Facial Recognition: A Useful Tool or a Source of Concern? The Quiet Spread of Facial Recognition in Law Enforcement

The future is arriving ahead of schedule in Croydon. It may not look like the cutting edge of Britain: North End is a pedestrianised high street filled with the usual pawnbrokers, fast-food restaurants and a blend of chain clothing stores. Yet it is expected to host some of the UK’s first permanent fixed facial recognition cameras.

Digital images of passersby will be captured discreetly and processed to derive biometric data, which includes facial measurements. This data will be rapidly compared against a watchlist via artificial intelligence, and a match will trigger an alert that might lead to an arrest.
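At its core, the matching step described above compares a numeric “embedding” derived from the captured face against embeddings of watchlisted faces, and raises an alert when similarity clears a threshold. A minimal sketch with NumPy (the function name, embedding layout, and 0.6 threshold are illustrative assumptions; operational systems use proprietary models and tuned thresholds):

```python
import numpy as np

def best_watchlist_match(probe, watchlist, threshold=0.6):
    """Return (index, similarity) of the watchlist embedding closest to
    the probe embedding by cosine similarity, or None if nothing clears
    the alert threshold."""
    probe = probe / np.linalg.norm(probe)
    wl = watchlist / np.linalg.norm(watchlist, axis=1, keepdims=True)
    sims = wl @ probe  # cosine similarity against each watchlist row
    best = int(np.argmax(sims))
    return (best, float(sims[best])) if sims[best] >= threshold else None
```

Raising the threshold reduces false alerts at the cost of missed matches; where that threshold sits is an operational choice, which is part of why campaigners worry about forces setting their own rules.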

According to the south London borough’s latest violence reduction strategy, North End and its adjacent streets are identified as “major crime hotspots.” However, they are not among the most dangerous areas in the capital.

Croydon’s crime rate is the 20th worst of the 32 London boroughs, excluding the City of London, and the permanent cameras, due to launch for a trial phase later this summer, are not an emergency measure. Even so, North End and nearby London Road may soon be under heavier surveillance.

When approached about the surveillance initiative, most shopkeepers and visitors in the North End were unaware of the police’s plans or the underlying technology.

For many, the cameras appear to be just another piece of street furniture alongside signs promoting safe cycling. While some express concern, others point to surveys suggesting a public wearied by rising crime is broadly supportive.

The police began experimenting with facial recognition cameras in England and Wales in 2016. Recent documents released under the Freedom of Information Act (FOI) and police statistics shared with the Guardian reveal substantial growth in usage over the last year, as the technology evolves from a niche tool into a regular component of police strategy.

Last year, police scanned almost 4.7 million faces using live facial recognition cameras, with deployments more than quadrupling: live facial recognition vans were used at least 256 times in 2024, up from 63 the previous year.

There are also plans for a fleet of 10 live facial recognition vans that could be deployed anywhere in the country.

Meanwhile, civil servants are working with law enforcement to develop a new national facial recognition system, known as a strategic facial matcher. The platform will enable searches across various databases, including custody images and immigration records.

“The implementation of this technology could become a common sight in city centres and transit hubs across England and Wales,” states one funding document submitted by South Wales police to the Home Office and released by the Metropolitan police under FOI.

Activists warn that this technology may disrupt everyday public life by subjecting individuals to impromptu identity checks facilitated by extensive facial recognition systems. Advocates of the technology acknowledge its risks but emphasize its importance for safety.

Recently, David Cheneler, a 73-year-old registered sex offender from Lewisham who had previously served nine years for 21 offences, was sentenced to two years in prison for breaching the conditions of his release.

Officers were alerted by live facial recognition cameras to Cheneler walking alone with a six-year-old girl.

“He was on the watchlist due to the conditions he was required to comply with,” said Lindsey Chiswick, the Met’s director of intelligence and the National Police Chiefs’ Council lead on facial recognition.

“He had formed a relationship with the girl’s mother over time and had begun picking up her daughter from school. Officers knew something was wrong that day because of his conditions. This exemplifies how police could track him. Without facial recognition, recognising him would have posed a real challenge.”

Many see this as a compelling argument, but critics raise concerns about the unanticipated ramifications as law enforcement adopts technology without legislative guidance.

Madeleine Stone from the NGO Big Brother Watch, who has observed mobile camera deployments, reported witnessing misidentifications of schoolchildren in uniform who underwent “long, humiliating, and unnecessary police stops,” during which they were compelled to verify their identities and provide fingerprints.

In these instances, the affected individuals were young Black boys, leaving them frightened and distressed, she noted.


“The effectiveness diminishes as the threshold rises,” Stone added. “The police may prefer not to deploy it in certain environments, but there is no legal mandate requiring restraint. The notion that police can unilaterally create their own guidelines for its use is truly alarming.”

A judicial review was initiated by Londoner Shaun Thompson, with backing from Big Brother Watch, after the technology wrongly identified him as a person of interest and he was detained for 30 minutes while returning from a volunteer shift with the anti-knife initiative Street Fathers.

Additionally, Dr Daragh Murray, who carried out an independent evaluation of the Met’s trials in 2019, highlights the potential “chilling” effect this technology might have on society, suggesting that considerations must go beyond the technology’s immediate implementation.

“It’s akin to police tailing you, recording your interactions, where you go, how often, and for how long,” he remarked. “I believe most would be uncomfortable with such reality. Democracy thrives on dissent and discourse; if surveillance stifles that, it risks entrenching the status quo and limiting future opportunities.”

Live facial recognition is being utilized to apprehend individuals for traffic violations, growing cannabis, and neglecting community orders. Is this truly justified?

Fraser Sampson, England and Wales’s biometrics and surveillance camera commissioner until the post was abolished in October 2023, currently serves as a non-executive director of Facewatch, the leading UK provider of retail facial recognition systems designed to deter shoplifting.

While he acknowledges the technology’s potential, he expresses concern that independent regulations concerning surveillance haven’t kept pace with its deployment by the state.

Sampson commented: “There’s an abundance of information about the technology’s functionalities, yet in practical terms—its application, the reason for its use, and the avenues for challenges or complaints—those clarity elements seem lacking.”

Chiswick noted her understanding of the concerns while recognizing the potential advantages of regulatory measures. The Met is cautiously making “small strides” that are continually reviewed, she stated. With limited resources, law enforcement needs to adapt and capitalize on the possibilities brought by AI. They are cognizant of potential “chilling effects” on society and have made it clear that cameras will not be deployed in protest areas.

“Will this become common? I cannot say,” Chiswick remarked. “We need to approach that assumption with caution. There are numerous possible scenarios; areas like the West End? It’s conceivable, instead of the static trials we’re conducting in Croydon, we could utilize it there. However, that’s not our current plan.”

She added: “I believe the integration of technology, data, and AI will continue to rise in the coming years, as personally, that’s how we can improve our operations.”

Source: www.theguardian.com

Justice Department Disbands Cryptocurrency Enforcement Unit

The Trump administration has disbanded the Justice Department team responsible for investigating cryptocurrency crimes, accusing the Biden administration of being too aggressive toward the fast-growing industry.

In a memo issued late Monday, Deputy Attorney General Todd Blanche denounced his predecessors for pursuing cryptocurrency operators in a manner he called “ill conceived and poorly executed.” He directed the department instead to narrow its cryptocurrency investigations to crimes such as fraud, drug trafficking and terrorism.

The directive aligns with President Trump’s broad embrace of the crypto industry, which backed his campaign, and with his moves to ease enforcement.

The Trump family has expanded its business interests in the industry, including establishing a crypto venture, World Liberty Financial. Just before taking office, Trump issued his own memecoin. Trump Media & Technology Group, the social media company in which he is the majority shareholder, has also said it plans to introduce several digital asset investment products this year.

The Justice Department directive follows a similar retreat at the Securities and Exchange Commission, which has dismissed lawsuits and dropped pending investigations, including cases alleging that crypto companies had failed to register as exchanges. Many SEC attorneys who worked on those cases have left the agency.

The SEC has also significantly reduced staffing in its crypto enforcement unit. On policy, the SEC has said it will not attempt to regulate memecoins because the novelty digital assets are not securities.

In the memo, Blanche accused the Biden administration of pursuing “a reckless strategy of regulation by prosecution” against the digital currency world.

Going forward, Blanche wrote, prosecutors should pursue only cryptocurrency cases involving conduct that victimizes investors, or the use of digital assets in fund fraud, hacking, and crimes such as fentanyl and human trafficking. Such prosecutions, he said, are important to restore stolen funds to customers, build investors’ trust in the security of the digital asset market, and support the growth of the digital asset industry.

He ordered the group of prosecutors investigating market integrity and major fraud to halt cryptocurrency enforcement and focus instead on immigration issues and procurement fraud.

He also disbanded the National Cryptocurrency Enforcement Team, a group within the Justice Department’s headquarters created in recent years to handle such cases. Blanche wrote that individual US attorneys’ offices may still pursue cases involving cryptocurrency.

The new approach appears aimed at preventing cases like the one brought in 2023 against Binance founder Changpeng Zhao for violating the Bank Secrecy Act. The company agreed to pay a $4.3 billion fine as part of its guilty plea.

During the administration’s first days, Trump officials signaled their dissatisfaction with such cases when they effectively demoted Eun Young Choi, the prosecutor who founded the cryptocurrency enforcement team.

The team was created in 2022 to help prosecutors navigate the often opaque world of cryptocurrency, as cross-border criminals increasingly used digital money to further their crimes.

Matthew Goldstein contributed reporting from New York.

Source: www.nytimes.com

Apple to require court order before disclosing customer push notification data to law enforcement

WASHINGTON, Dec. 12 (Reuters) – Apple (AAPL.O) says it now requires a judge’s order before turning over information about its customers’ push notifications to law enforcement, bringing the iPhone maker’s policy into line with rival Google’s and raising the hurdle authorities must clear to obtain app data about users.

Apple did not officially announce the new policy, but it appeared in the law enforcement guidelines the company published in the past few days. The change follows revelations by Oregon Sen. Ron Wyden that officials had requested such data not only from Apple but also from Alphabet Inc.’s Google (GOOGL.O), which makes the Android phone operating system.

Apps of all kinds rely on push notifications to alert smartphone users to incoming messages, breaking news, and other updates. These are the audible chimes or visual indicators users receive when an email arrives or a sports team wins a game. What users often do not realize is that almost all such notifications pass through Google’s and Apple’s servers.
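The routing works because app developers do not push to phones directly: an app’s server submits each notification to Apple’s Push Notification service (or Google’s equivalent), which then delivers it to the device. A minimal sketch of the request an app server would build for Apple’s documented APNs HTTP/2 API (the device token and message are made up for illustration; real requests also require provider authentication):

```python
import json

APNS_HOST = "api.push.apple.com"  # Apple-operated gateway every notification transits

def build_apns_request(device_token, title, body):
    """Build the path and JSON payload for an APNs push request.
    Because the request targets Apple's server rather than the handset,
    Apple necessarily sees this traffic in transit."""
    path = f"/3/device/{device_token}"
    payload = json.dumps({"aps": {"alert": {"title": title, "body": body}}})
    return path, payload

path, payload = build_apns_request("abc123", "Score update", "Your team won")
```

This server-in-the-middle design is what gives Apple and Google the visibility, and the records, that law enforcement has been requesting.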

In the letter, first revealed by Reuters last week, Wyden said the practice gives the companies unique insight into the traffic flowing from these apps to users, putting the two firms “in a unique position to facilitate government surveillance of how users are using particular apps.”



Apple and Google both acknowledged receiving such requests. Apple’s guidelines had previously stated that such data could be obtained “with a subpoena or greater legal process”; the text has now been updated to refer to the more stringent requirement of a court order.

Apple has not released an official statement. Google did not immediately respond to a request for comment.

Wyden said in a statement that Apple is “doing the right thing by aligning with Google in seeking a court order to turn over data related to push notifications.”

Source: nypost.com