As tech companies spread ever more lies, why is the BBC stepping back from local news?

On August 4, 2024, the riots and unrest that followed the murder of three children in Southport, Merseyside, escalated further. That day, violence hit Rotherham, Middlesbrough and Bolton, and people tried to set fire to hotels housing asylum seekers, amid chaos fuelled by far-right misinformation and rumors. Elon Musk showed a renewed interest in British affairs, posting a photo of the violence in Liverpool on X with the characteristically cautious caption: “Civil war is inevitable.” And 24 hours later, the wave of unrest reached the city of Plymouth.

It struck the city center throughout the evening of August 5th. To quote the Guardian, “150 police officers in riot gear and with dogs tried to separate the far-right mob and anti-racism demonstrators”, some of whom were defending a local mosque. Bricks, bottles and fireworks were thrown. Six people were arrested, several police officers were injured, and two civilians were taken to hospital. A local official said the events were “unprecedented”.

Where should the city's 260,000 residents have turned for reliable information? As ever, people's social media feeds were full of falsehoods and provocation, which made more traditional media the obvious choice. But if you had been listening to your local BBC radio station while the riots were going on, you might not have known anything about them. BBC Radio Devon ran reports of the violence on its 6 o'clock news, but Plymouth was not mentioned at all in the 7pm and 9pm bulletins. Other bulletins mentioned what was happening, but it never became a major story. The violence was horrifying and hugely important, but the attention of the city's supposedly most reliable news source was clearly elsewhere.

We now know all this thanks to the BBC's response to a complaint by David Lloyd, a radio veteran who has worked at both the corporation and commercial stations. The official document produced by the BBC's complaints unit makes for revealing reading. It includes an admission that “there was little evidence that the BBC was present at the scene”, and that coverage was affected by “some logistical issues” on the day, including “securing journalists with the necessary riot training” and “technical problems with broadcasting kits”.

There were, the report says, “elements of system failure”. Even online, where the corporation says it now needs to focus most of its effort, there was no dedicated live coverage of the Plymouth riots, and, as the report acknowledges, not enough updates were posted to the major social media platforms. On the latter point, it concedes that, had it not been for staff holidays, more could have been done.

A BBC spokesperson said the corporation accepted the findings of its complaints unit and had “already made adjustments to its working practices” before the Plymouth complaint was investigated. But the mix of excuses and admitted shortcomings remains mind-boggling, and the larger story is left untouched: the corporation's degradation of local broadcasting, and how it fits with similar changes in commercial radio and the dire state of Britain's local press. As Mark Zuckerberg abandons fact-checking at Meta and Musk is endlessly radicalized by his own platform, the result is a growing vacuum in local news, and a growing susceptibility to online lies that may soon outstrip people's ability to understand what is going on in their own immediate lives.

The story of Plymouth is a case study in the effects of changes that still seem to be chronically overlooked. These include the cuts forced on BBC local radio in 2023, which mean that many local stations now broadcast locally specific programs only until the afternoon, after which they share output regionally or nationally until breakfast time the next day. This drastic cut has diminished an already fragile part of the national media landscape, reduced listener numbers further and hastened the decline of local radio, and it leaves an obvious question hanging over our public broadcaster: if it cannot guarantee the survival of grassroots news, who will provide it?

It's certainly not commercial radio. Eight years ago, the broadcasting regulator Ofcom relaxed its rules, allowing commercial station owners to cut the minimum amount of daytime local programming from seven hours a day to three. In 2019, the radio giant Global consolidated more than 40 separate breakfast shows, and the local news they carried, into three nationally broadcast programs, and has since put its newsrooms through repeated efficiency drives: a single reporting team now covers an area stretching from Cornwall to Gloucester.

And then there is the dismal fate of local newspapers, which might have successfully made the transition to the online world but have instead been repeatedly mismanaged, cut back and hollowed out, not least by the online giants' capture of advertising. Between 2009 and 2019, more than 320 such titles closed in the UK. Just over a year ago, Reach, the owner of the Mirror, the Express and a large number of local titles grouped online under the “Live” banner, announced its third round of job cuts in a year, adding to the total number of roles already lost. The company's local and regional news websites draw a healthy audience of about 35 million people per month, but their reliance on digital advertising revenue, much of it siphoned off by the online platforms, puts their long-term survival at risk. As one anonymous Reach insider put it, the consequences are clear: “Manchester, Birmingham, Bristol, Newcastle, Liverpool, Cardiff and many other major cities will soon no longer have a local newspaper, and it is increasingly likely that they will no longer have a well-known news website holding local authorities and others accountable.”

In some areas, nimble local news outlets are beginning to fill the gap. In Hull, a start-up called the Hull Story was founded in 2020 as an online operation by two former Hull Daily Mail journalists, and expanded into print last year. One recent front page, reflecting the city's experience of the 2024 riots under the headline “Shame, Resilience, Justice”, won an award this year. The Bristol Cable has long pioneered a new kind of investigative and political reporting, driven by the fact that the title is owned by its readers. Manchester has the Substack-based newsletter The Mill, which is currently setting up offshoots in Liverpool, Birmingham, Sheffield and London. The former Guardian journalist Jim Waterson has also started London Centric to fill the void left by the retrenchment of the Evening Standard. All of these projects highlight one stark point: a place not only needs its own journalism, it can also provide an audience willing to support it.

The problem is that such outlets are still outnumbered by the parts of the country, let alone the world, where the worst kind of news vacuum is sadly a reality. Something happens, but what do people read or hear about it? Either nothing at all, or a version plucked from the corners of the internet by some foreign billionaire's platform and amplified by an algorithm, true or false, so degraded that the question of which hardly matters while the deceptive narrative creates its own shockwaves. If that is the future we all need to avoid, local reporting should be our first antidote.

Source: www.theguardian.com

Apple accused by UK child safety charity of underreporting child sexual abuse images

Child safety experts say Apple is failing to effectively monitor its platforms or scan for images and videos of child sexual abuse, raising concerns about how the company will handle the growing volume of such material linked to artificial intelligence.

The National Society for the Prevention of Cruelty to Children (NSPCC) in the UK has criticized Apple for undercounting how often child sexual abuse material (CSAM) appears on its products. Data obtained by the NSPCC from police forces shows that perpetrators in England and Wales used Apple's iCloud, iMessage and FaceTime to store and share CSAM in more cases than the company reported across all other countries combined.

Based on information gathered through freedom of information requests and shared exclusively with the Guardian, the charity found that Apple was implicated in 337 recorded offenses of child abuse imagery in England and Wales between April 2022 and March 2023. Yet in 2023, Apple made only 267 reports of suspected child abuse imagery worldwide to the National Center for Missing & Exploited Children (NCMEC), in stark contrast to other leading tech companies: Google submitted more than 1.47 million reports and Meta more than 30.6 million, according to NCMEC's annual report.

All US-based technology companies are required to report any CSAM they detect on their platforms to NCMEC. Apple's iMessage service is end-to-end encrypted, meaning Apple cannot see the contents of users' messages; the same is true of Meta's WhatsApp, which nonetheless made about 1.4 million reports of suspected CSAM to NCMEC in 2023.

Richard Collard, head of child safety online policy at the NSPCC, said there was a concerning discrepancy between the number of child abuse image crimes involving Apple's services in the UK and the small number of reports the company makes globally, and urged Apple to prioritize safety and comply with the UK's online safety legislation.

Apple declined to comment, but pointed to a statement from August in which it said it had decided not to proceed with a program to scan iCloud photos for CSAM, citing user privacy and security as top priorities.

In late 2022, Apple abandoned plans for an iCloud photo-scanning tool, known as neuralMatch, which would have compared uploaded images against a database of known child abuse imagery. The plan had been opposed by digital rights groups, who warned it could undermine users' privacy, and its abandonment was criticized by child safety advocates.

Experts are also worried about Apple Intelligence, the AI system the company announced in June, particularly because AI-generated child abuse content puts children at risk and hampers the ability of law enforcement to protect them.

Child safety advocates are concerned about the increase in AI-generated CSAM reports and the potential harm caused by such images to survivors and victims of child abuse.

Sarah Gardner, CEO of Heat Initiative, criticized Apple’s insufficient efforts in detecting CSAM and urged the company to enhance its safety measures.

Child safety experts worry about the implications of Apple’s AI technology on the safety of children and the prevalence of CSAM online.

Source: www.theguardian.com

Federal police union calls for dedicated portal for victims to report AI deepfakes

The federal police union is calling for the establishment of a dedicated portal where victims of AI deepfakes can report incidents to the police. They expressed concern over the pressure on police to quickly prosecute the first person charged last year for distributing deepfake images of women.

Attorney General Mark Dreyfus introduced legislation in June to criminalize the sharing of sexually explicit images created using artificial intelligence without consent. The Australian Federal Police Association (Afpa) supports this bill, citing challenges in enforcing current laws.

Afpa highlighted a specific case where a man was arrested for distributing deepfake images to schools and sports associations in Brisbane. They emphasized the complexities of investigating deepfakes, as identifying perpetrators and victims can be challenging.

Afpa raised concerns about the limitations of pursuing civil action against deepfake creators, citing the high costs and challenges in identifying the individuals responsible for distributing the images.

They also noted the difficulty in determining the origins of deepfake images and emphasized the need for law enforcement to have better resources and legislation to address this issue.


Alongside better resources and legislation, the union urged an overhaul of reporting mechanisms and an educational campaign to raise public awareness of the issue.

The committee is set to convene its first hearing on the proposed legislation in the coming week.

Source: www.theguardian.com