Can Floating Data Centers Meet the Growing Energy Demands of AI?

Panthalassa Floating Data Center Prototype

Panthalassa

Data centers powering the AI revolution are consuming more electricity than many small countries. According to an International Energy Agency report, their energy demand could soar to 945 terawatt-hours per year by 2030, surpassing Japan’s entire electricity consumption. As AI technology continues to evolve, the hunger for power has led companies to consider not just earthly solutions but also outer-space options to harness constant solar energy. Intriguingly, startup Panthalassa is working on an autonomous floating data center that provides computing power in the open sea.

Recently, the Oregon-based company announced $140 million in funding, stating that its innovative platform could bypass overloaded power grids while offering carbon-free computing at sea. However, it remains to be seen if relocating computing power offshore can truly ease the challenges faced by current data centers; it might simply replace one expensive issue with another.

“Wave energy is a well-established technology, but the ocean presents a brutal environment,” warns Jonathan Koomey, a data center energy utilization expert at the Lawrence Berkeley National Laboratory. “Saltwater and wave action can lead to mechanical malfunctions.”

Shaped like a golf ball atop a tee, Panthalassa’s floating data center stands 85 meters high (almost the height of Big Ben) and is constructed from sheet steel.
These structures are towed into position and can autonomously generate power to run AI workloads with zero emissions, without relying on grid electricity or conventional engines.

The “tee” section of the platform features a long tube with an open base that captures seawater as waves cause it to rise and fall. This movement forces water through the tube into the hollow “ball” section, which floats primarily due to trapped air.
The dynamic movement of water drives turbines that generate electricity, powering the platform’s GPUs, computing hardware, and satellite communication systems.

Conventional data centers consume significant quantities of water for cooling AI hardware.
In contrast, Panthalassa’s servers are housed in enclosed modules below the water surface, allowing the container walls to act as heat exchangers that dissipate heat into the cooler seawater.
While ocean mixing helps disperse the waste heat, the implications for local marine ecosystems remain uncertain.

Panthalassa aims to achieve what few data center operators have dared: to maintain crucial computing infrastructure beyond the oversight of human engineers.
“Our data consistently identifies power and network issues as the leading causes of data center outages,” states Jacqueline Davis of the Uptime Institute, a global authority on data center performance.
“These complications can be particularly challenging to manage in remote locales with limited or no staffing.”
Panthalassa did not respond to inquiries from New Scientist before this article was published.

Automation within data centers mostly focuses on monitoring and analysis, with physical human intervention often necessary during unusual events, such as manual compressor restarts.

This presents a significant challenge for Panthalassa. Latency is another hurdle.
Data processed on the floating platform is sent to users on land via Starlink satellites, which have limited bandwidth and higher latency compared to fiber optic cables.
Consequently, while the platform may work well for long-duration AI workloads, such as training AI models, applications that require swift responses, like chatbots and search assistants, could face challenges.

“Power constraints are hitting large AI training data centers hardest,” notes Davis. She suggests that Panthalassa’s model becomes more plausible once the power demands of advanced AI grow large enough to justify taking training workloads offshore.
Until then, floating data centers may struggle to remain competitive with land-based options.

Although Panthalassa’s approach is innovative, the concept of offshore data centers isn’t entirely new.
Aikido Technologies is developing floating data centers integrated with offshore wind platforms, while Mitsui O.S.K. Lines is exploring ship-based computing systems designed to exploit ocean energy sources.
Past initiatives, like Microsoft’s underwater Natick project, have examined whether housing servers in or near water enhances cooling and efficiency.

Nevertheless, offshore computing remains predominantly experimental. Besides engineering challenges, firms must demonstrate that ocean-based systems can economically compete with traditional data centers linked to power grids and fiber networks.
“Data centers can achieve economies of scale by building at larger sizes, which is why they are typically so extensive today,” remarks Koomey.
“Scaling up to accommodate the fixed costs of computing is considerably more challenging and riskier on water.”


Source: www.newscientist.com

Create an Extensive Cancer Data Library: A Comprehensive Guide – Sciworthy

Computational cancer researchers utilizing machine learning technology face a critical challenge. Large datasets are available for training machine learning models, but the process is demanding due to inconsistencies in data formats, names, structures, and other attributes. Consequently, when scientists analyze different cancer types or apply varying data cleaning methods, the performance of the resulting models can diverge significantly.

This discrepancy has created a gap between available datasets and their practical usability, posing a significant barrier for researchers lacking specialized bioinformatics training. Variations in data processing methodologies further complicate the comparison of different machine learning approaches, making it challenging to identify the optimal method for tasks such as classifying patient samples as benign or malignant.

In response, collaborative researchers from Japan and the United States have developed a robust database tailored for machine learning applications, comprising genetic and molecular data from over 8,000 cancer patients. They named this groundbreaking database MLOmics. Similar to a well-organized library, MLOmics provides cancer data ready for immediate use by computer models, eliminating the need for extensive data preprocessing.

To create MLOmics, researchers retrieved patient samples from 32 cancer types from publicly accessible databases, including the Cancer Genome Atlas. They collected four distinct types of molecular data per patient, two of them derived from DNA. The dataset includes transcriptomics data, data on DNA regions termed copy number variation, and details regarding chemical DNA markers known as methylation. For transcriptomics data, the team labeled experimental factors influencing data quality, eliminated contamination from non-human samples, and addressed unlabeled values.

For copy number variation data, researchers focused on cancer-specific repeated sequences, identifying and labeling recurrent aberrant repeats along with their corresponding genes. They adjusted methylation data to eliminate biases caused by various experimental platforms. In addition, a uniform identifier was assigned to all molecular data to standardize naming conventions.

Subsequently, the team developed a coding pipeline to assess data quality and integrate each patient’s molecular data types into a single, cohesive dataset using the multi-omics approach, which amalgamates diverse molecular measurements. They matched each patient sample with its associated cancer type, thereby creating an organized dataset prime for analysis.

The researchers designed 20 task-aware datasets across three categories of machine learning problems, establishing appropriate metrics for model evaluation in each category. They aimed to showcase how MLOmics can be employed for a variety of common research tasks.

The first category is classification, comprising six datasets that facilitate training models to categorize samples into known classes, such as malignant or benign tumors. The second category, clustering, includes nine datasets that allow scientists to explore how samples group naturally based on molecular characteristics when predefined labels are absent. The final category, data completion, consists of five datasets aimed at addressing incomplete molecular data caused by technical or experimental errors, detailing how models can estimate or fill in missing values, a common challenge in real-world scenarios.

The researchers also organized the MLOmics database into three distinct sections, each with comprehensive usage guidelines. The first section primarily offers task-aware cancer multi-omics datasets formatted as comma-separated values (CSV files). CSV files were selected for their efficiency with large genomic datasets, as they are easily processed by programming languages like Python and R. The second section provides code files designed to assist scientists in model development and evaluation. Finally, the last section includes links to additional resources that complement the primary datasets, ensuring accessibility for all interested researchers, regardless of their background.

In conclusion, the researchers affirmed that MLOmics represents a significant asset for the cancer research community, allowing scientists to concentrate on enhancing algorithms instead of expending time on data preparation. They highlighted MLOmics’ suitability for non-specialists, encouraging interdisciplinary research and broader biological studies. The team is committed to continuously updating MLOmics with new resources and tasks in alignment with advancements in the field.


Source: sciworthy.com


How AI Data Centers Can Increase Surrounding Temperatures by Up to 9.1°C

Growing Number of Data Centers

Jim Roe Scalzo/EPA/Shutterstock

Data centers designed for AI operations generate substantial heat, leading to elevated temperatures in surrounding areas, creating significant data center heat islands. This phenomenon currently impacts around 340 million individuals worldwide.

The construction of data centers is projected to rise dramatically. According to JLL, data center capacity is expected to double between 2025 and 2030, with AI contributing to half of this demand.

Andrea Marinoni at the University of Cambridge had observed that data centers’ energy consumption is rising steadily and is predicted to surge in the near future, so he set out to evaluate the resulting impact.

The researchers analyzed land surface temperatures from satellite data covering the last two decades, correlating them with the geographical coordinates of more than 8,400 AI data centers. To isolate the data centers’ contribution, they focused solely on facilities situated in less populated regions.

The findings revealed an average temperature rise of 2°C (3.6°F) in the vicinity of operational AI data centers, with peaks reaching as high as 9.1°C (16.4°F).

This temperature spike is not confined to the immediate area around the data center; the impact was observed up to 10 kilometers away, with a reduction in intensity of merely 30% at a distance of 7 kilometers.

“The results we obtained were quite surprising,” Marinoni states. “This could pose a significant problem.”

Using demographic data, estimates reveal that over 340 million individuals reside within a 10-kilometer radius of a data center, experiencing higher temperatures due to their proximity. Marinoni noted that regions such as Mexico’s Bajío and Spain’s Aragón recorded a temperature increase of 2°C (3.6°F) during the 20-year span from 2004 to 2024, attributable to this phenomenon.

Researcher Chris Priest of the University of Bristol highlighted the need for further investigation into whether the heat produced by structures themselves contributes to the overall thermal effects, suggesting that buildings exposed to sunlight could play a role.

Regardless, Marinoni emphasizes that data centers are still contributing to rising surface temperatures. “My key message is to proceed cautiously in the design and development of data centers.”


Source: www.newscientist.com

Startup Innovates with First Data Center Powered by Human Brain Cells


Exploring Biological Computers

Floriana/Getty Images

As energy demands soar in data centers and the need for chips intensifies, could biological cells offer a solution? Australian startup Cortical Labs is pioneering this concept by establishing two biological data centers in Melbourne and Singapore. These facilities will use chips populated with lab-grown human neurons for data processing.

Cortical Labs stands out as a leader in the emerging field of biological computing, using nerve cells linked to microelectrode arrays to both stimulate and record cellular responses during data input. Recently, the company showcased its flagship computer, the CL1, demonstrating its ability to learn to play games like Doom within a week.

The Melbourne data center is set to feature approximately 120 CL1 units, while a collaboration with the National University of Singapore will launch with 20 units, aiming for a total of 1,000 CL1s, pending regulatory approval. This ambitious expansion is designed to enhance their cloud-based brain computing services.

Michael Barros from the University of Essex remarks, “Biological computers like CL1 have been developed by multiple research teams globally but pose construction challenges for widespread adoption.” He continues, “Cortical Labs is making biocomputers more accessible, set to be the first company to do this at scale.”

These biological systems can be trained for tasks like playing Doom, although understanding the optimal training methods for neurons remains a complex issue. Reinhold Scherer, also from the University of Essex, notes, “Having access can facilitate explorations in learning and programming, yet neurons cannot be programmed as traditional computers.”

Moreover, Cortical Labs asserts that its biological data centers are significantly more energy-efficient than conventional computing systems, with each CL1 unit consuming just 30 watts compared to thousands of watts used by state-of-the-art AI chips.

Paul Roach from Loughborough University highlights that scaling up these systems to function like traditional data servers could lead to remarkable energy savings, even if they require nutrients to sustain the neuron chips. However, the cooling requirements are expected to be much lower than in traditional setups, indicating considerable power conservation according to Cortical Labs’ estimates.

Yet, the technology is still nascent. Tjeerd Olde Scheper, who has collaborated with a competitor, FinalSpark, poses questions about efficacy, stating, “We’re still in early development stages.” He emphasizes that transitioning from a small network managing simple tasks to a larger-scale language model is a substantial leap.

A primary challenge remains: the capacity to save training outcomes and utilize these neurons for computational algorithms beyond specific tasks like gaming. Retraining these neurons after their life cycle is another hurdle, as Scherer points out, “If retraining is needed every month, longevity of use becomes an issue.”


Source: www.newscientist.com


How Data Centers Use Glass Technology to Store Information for Thousands of Years

Close-up of glass with Microsoft Flight Simulator Map Data

Microsoft Research

Innovative automated systems for storing vast amounts of data on glass could revolutionize the future of data centers.

In our data-driven world, everything relies on information—from the internet and industrial sensors to scientific data from particle colliders, all of which require secure and efficient storage solutions.

Back in 2014, Professor Peter Kazansky and his team at the University of Southampton demonstrated that lasers could be utilized for encoding hundreds of terabytes of data into nanostructures within glass, resulting in a data storage method anticipated to outlast the universe itself.

While their technique was impractical for industrial applications, Richard Black and his colleagues at Microsoft’s Project Silica have successfully demonstrated a similar glass-based technology. This innovation could pave the way for long-lasting glass data libraries in the near future.

“Glass can endure extreme temperatures, humidity, particulates, and electromagnetic fields,” explains Black. “Moreover, glass boasts a long lifespan and doesn’t need frequent replacement, making it a more sustainable medium. It requires significantly less energy to produce and is easy to recycle once it has served its purpose.”

The research team’s process starts with a femtosecond laser, which emits light pulses lasting just quadrillionths of a second. This technology etches tiny structures into a thin layer of glass to encode data. To minimize read and write errors, the researchers also incorporate additional error-correction bits into the data.

The data is read using a combination of microscope and camera systems, with images processed by a neural network algorithm that converts them back into bits. This entire process is easily reproducible and automated, making it a perfect example of a robotic data facility.

Remarkably, the researchers stored 4.8 terabytes of data on a square glass plate 120 millimeters wide and 2 millimeters thick, roughly one-third the volume of an iPhone yet holding the equivalent of about 37 iPhones’ worth of storage.

Project Silica Glass Writing Instruments

Microsoft Research

Accelerated aging experiments, in which the glass was heated in a furnace to 290°C, suggest that the data may remain stable and readable for over 10,000 years, and even longer at room temperature. The researchers also tested borosilicate glass, which, while cheaper, proved suitable only for storing data at lower densities.

Kazansky highlighted Project Silica’s main breakthrough: delivering an end-to-end system scalable to data center size. Although the principles of glass-based data storage have existed for over a decade, this study confirms its feasibility as a technology.

Microsoft isn’t alone in exploring this technology. Kazansky also co-founded SPhotonix, which is focused on preserving the human genome in glass, while the startup Cerabyte proposes similar storage techniques using ultrathin layers of ceramic and glass.

Nonetheless, challenges persist, such as the cost of integrating the glass library into existing data centers and whether the Project Silica team can enhance glass capacity, potentially up to 360 terabytes as per Kazansky’s findings.

For now, Black identifies the primary potential applications for Project Silica’s technology in national libraries, scientific repositories, cultural records, and anywhere data needs to survive for centuries. Collaborations with companies like Warner Bros. and Global Music Vault are underway to safeguard data currently stored in the cloud for the long term.

Kazansky adds that the technology has even inspired cinematic portrayals: in Mission: Impossible – The Final Reckoning, glass storage offers the capacity and security needed to trap an advanced artificial intelligence. “It’s a rare moment when Hollywood science fiction aligns with peer-reviewed reality,” he remarks.


Source: www.newscientist.com

New Juno Data Reveals Jupiter is Smaller and More ‘Squeezed’ Than Previously Thought


Leveraging high-precision radio occultation measurements from NASA’s Juno mission, planetary scientists have significantly refined the shape of Jupiter. Their findings reveal that the planet’s polar, equatorial, and mean radii are smaller than earlier estimates from NASA’s Pioneer and Voyager missions, with substantially reduced uncertainty.

This vibrant visible-light image of Jupiter was captured using the Hubble Wide-Field Camera 3 on January 11, 2017. Featured prominently are the Great Red Spot and a long brown feature known as the “Brown Barge,” stretching approximately 72,000 km (around 45,000 miles) from east to west, with Red Spot Junior (Oval BA) on the lower right. Image credits: NASA / ESA / NOIRLab / NSF / AURA / Wong et al. / De Peyter et al. / M. Zamani.

“Jupiter, recognized as the largest planet in our solar system, is an almost oblate spheroid due to its rapid rotation of 9 hours, 55 minutes, and 29 seconds, causing a slight flattening at the poles and a bulge at the equator,” stated Dr. Eli Galanti of the Weizmann Institute of Science.

“This unique shape results from the gravitational forces pulling inward and centrifugal forces pushing outward from its rotation axis. Consequently, Jupiter’s equatorial radius is approximately 7% larger than its polar radius.”

“For celestial bodies with a uniform density, the shape is ideally ellipsoidal. However, Jupiter’s internal density varies significantly from the cloud layer of about 1 bar, where density is less than 1 kg/m³, to deeper layers reaching densities of several thousand kg/m³.”

“This density variation causes the planet’s shape to deviate from a simple ellipsoid by tens of kilometers, as reflected in fluctuations of the gravitational field across latitudes.”

“Additional alterations in Jupiter’s shape are induced by strong zonal winds detected at cloud level,” Dr. Galanti continued.

“These winds modify the centrifugal force, leading to variations of about 10 km mainly at lower latitudes.”

Historically, Jupiter’s dimensions were based on data from six radio occultation experiments conducted by NASA’s Pioneer and Voyager missions in the 1970s.

In this groundbreaking study, researchers reviewed radio occultation data collected during 13 close encounters between Juno and Jupiter, integrating the effects of zonal winds into their analysis.

“Radio occultation enables us to peer through Jupiter’s dense, opaque atmosphere to understand its internal structure,” the researchers elucidated.

“During the occultation experiment, Juno transmits radio signals to NASA’s Deep Space Network on Earth.”

“As these signals traverse Jupiter’s electrically charged ionosphere, they experience bending and delay.”

“By measuring the frequency changes caused by this bending, we can derive the temperature, pressure, and electron density at various atmospheric depths.”

The research concluded that Jupiter is approximately 8 km narrower at its equator and 24 km flatter at its poles.

“Including the effects of zonal winds significantly diminishes uncertainty in our understanding of Jupiter’s shape,” the researchers noted.

“At a pressure level of 1 atmosphere, we’ve determined a polar radius of 66,842 km, an equatorial radius of 71,488 km, and a mean radius of 69,886 km, which are smaller by 12 km, 4 km, and 8 km than previously estimated, respectively.”

For more details, view the findings published in this week’s Nature Astronomy.

_____

E. Galanti et al. Jupiter’s Size and Shape. Nat Astron published online on February 2, 2026. doi: 10.1038/s41550-026-02777-x


Source: www.sci.news

Data Reveals 2025 as Earth’s Third Hottest Year on Record

According to Copernicus, the European Union’s climate monitoring service, last year ranked as the third warmest on record in modern history.

This finding aligns with existing trends; Copernicus data reveals that the last 11 years have consistently been the warmest in history.

In 2025, the average global temperature soared to approximately 1.47 degrees Celsius (2.65 degrees Fahrenheit) above the baseline period from 1850 to 1900. This reference period is significant as it predates the industrial era, marking a time before extensive carbon emissions entered our atmosphere.

“Annual surface temperatures exceeded average levels across 91 percent of the globe,” stated Samantha Burgess, head of climate strategy at the European Centre for Medium-Range Weather Forecasts, which operates Copernicus. “The primary contributor to these record temperatures is the accumulation of greenhouse gases, largely from fossil fuel combustion.”

Under the 2015 Paris Agreement, global leaders committed to limiting warming to 1.5 degrees Celsius above pre-industrial levels. However, this goal appears increasingly unachievable as temperatures have neared or surpassed this threshold for three consecutive years.

Mauro Facchini, director of Earth Observation at the European Commission’s Directorate-General for Defence Industry and Space, noted at a press conference: “A three-year average temperature exceeding 1.5 degrees Celsius compared to pre-industrial levels is a milestone we never anticipated.” He emphasized the urgent need to address climate change.

A woman shields herself from the scorching sun near the Colosseum in Rome during July.
Tiziana Fabi/AFP via Getty Images File

The U.S. government is anticipated to unveil its 2025 climate metrics on Wednesday. NASA provides its reports separately from the National Oceanic and Atmospheric Administration, owing to differing methodologies in calculating average annual temperatures, which often leads to variations in findings.

Nevertheless, the overarching trend is unmistakable: the planet is warming at an alarming rate, possibly faster than scientists had predicted.

The bleak European climate data comes as the U.S. administration moves aggressively to roll back climate regulations and retreat from international efforts to mitigate warming.

Last week, the Trump administration announced its withdrawal from the United Nations Framework Convention on Climate Change, diminishing the U.S. role in global climate change discussions. Additionally, plans to withdraw support from the Intergovernmental Panel on Climate Change, which produces crucial reports on climate change impacts, were made public.

The United States is set to officially leave the Paris Agreement later this month, following a one-year waiting window.

A child enjoys a refreshing mist under a fog system in Milan during July.
Luca Bruno / AP File

President Donald Trump has labeled climate change “the work of con artists,” and his administration has actively sought to downplay critical climate reports such as the National Climate Assessment. Efforts are underway to reduce the Environmental Protection Agency’s ability to regulate greenhouse gas emissions, a primary cause of global warming.

Simultaneously, steps are being taken to promote the coal industry, including ordering coal-fired power plants to continue operations (coal is notorious for generating significant greenhouse gas emissions). The administration is also attempting to reverse many of the Biden administration’s climate initiatives, including subsidies for electric vehicles.

According to preliminary findings from Rhodium Group, an independent research firm monitoring U.S. emissions, climate pollution in the United States is projected to rise by approximately 2.4% in 2025. This increase may not stem directly from President Trump’s policies, as many regulations are yet to be implemented. The rise is likely due to high natural gas prices, growth in energy-intensive data centers, and particularly cold winters.

Rhodium Group anticipates that U.S. emissions will eventually decrease as renewable energy sources become more economically feasible compared to fossil fuels. However, the expectation of emission reductions is now less optimistic than prior to Trump’s administration.

The greenhouse gases that trap heat are intensifying weather patterns, resulting in more extreme conditions and increasing the likelihood of heavy rainfall, heatwaves, and flooding.

Last year emerged as the third-costliest year for weather-related disasters, an analysis by the nonprofit organization Climate Central revealed. In 2025, it was reported that 23 meteorological events inflicted damages surpassing $1 billion, resulting in 276 fatalities and $115 billion in total damages.

In Fleurance, France, a pharmacy thermometer indicates a scorching 45 degrees Celsius, equivalent to 113 degrees Fahrenheit.
Isabel Souliment / Hans Lukas, from Reuters file

While greenhouse gas emissions remain the principal driver of rising global temperatures, natural fluctuations also contribute. La Niña patterns, characterized by colder-than-average water in the central Pacific, generally lead to lower global temperatures, while El Niño events can raise them.

Though the La Niña pattern emerged in late 2025, NOAA scientists expect a return to neutral conditions early this year.

Source: www.nbcnews.com

Asteroid Breaks Records: Discovery via Pre-Survey Data from Vera Rubin Observatory

Astronomers have identified a fascinating asteroid, named 2025 MN45, using early data from the Legacy Survey of Space and Time (LSST) Camera, the largest digital camera in the world, at the NSF-DOE Vera C. Rubin Observatory.



Artist’s impression of asteroid 2025 MN45. Image credit: NSF-DOE Vera C. Rubin Observatory / NOIRLab / SLAC / AURA / P. Marenfeld.

Asteroids orbiting the sun rotate at varying speeds, providing critical insights into their formation conditions billions of years ago, as well as their internal structure and evolutionary history.

Fast-spinning asteroids may have been propelled by prior collisions with other space rocks, suggesting they could be remnants of larger parent bodies.

To withstand such rapid spinning, these asteroids must possess enough internal strength to prevent fragmentation, a process where an object breaks apart due to its rotation speed.

Most asteroids are loose aggregates of debris, and their density limits how fast they can spin without disintegrating.

In the main asteroid belt, the threshold for stable fast rotation is approximately 2.2 hours. Asteroids that spin faster than this, completing a rotation in less time, must be exceptionally strong to remain intact.

The faster an asteroid spins and the larger it is, the more durable its material must be.

A recent study published in the Astrophysical Journal Letters reveals important insights into asteroid composition and evolution, showcasing how the NSF-DOE Vera C. Rubin Observatory is redefining our understanding of solar system discoveries.

This research presents data on 76 asteroids with verified rotation rates.

It includes 16 ultra-fast rotators with periods ranging from approximately 13 minutes to 2.2 hours, along with three extreme rotators completing a full rotation in under 5 minutes.

All 19 newly identified high-rotation objects exceed the length of an American football field (around 90 meters).

Notably, the fastest-spinning known main-belt asteroid, 2025 MN45, has a diameter of 710 meters and completes a rotation every 1.88 minutes, making it the fastest-rotating asteroid yet discovered among those larger than 500 meters across.

“Clearly, this asteroid must be composed of exceptionally strong material to maintain its structure at such high rotation speeds,” commented Dr. Sarah Greenstreet, an astronomer at NSF’s NOIRLab and the University of Washington.

“Our calculations suggest it requires cohesive forces comparable to solid rock.”

“This is intriguing because most asteroids are believed to be ‘rubble pile’ structures, composed of numerous small rocks and debris that coalesced through gravitational forces during solar system formation and collisions.”

“Discoveries like this incredibly fast-rotating asteroid result from the observatory’s unmatched ability to deliver high-resolution time-domain astronomical data, thus expanding the limits of what we can observe,” stated Regina Lameika, DOE associate director for high-energy physics.

In addition to 2025 MN45, other significant asteroids researched by the team include 2025 MJ71 (rotation period of 1.9 minutes), 2025 MK41 (rotation period of 3.8 minutes), 2025 MV71 (rotation period of 13 minutes), and 2025 MG56 (rotation period of 16 minutes).

All five of these ultra-fast rotators are several hundred meters in diameter, categorizing them as the fastest-rotating subkilometer asteroids known to date, including several near-Earth objects.

“As this study illustrates, even during its initial commissioning stages, Rubin allows us to investigate populations of relatively small, very fast-rotating main-belt asteroids that were previously unattainable,” Dr. Greenstreet concluded.

_____

Sarah Greenstreet et al. 2026. Light curve, rotation period, and color of the first asteroid discovered by the Vera C. Rubin Observatory. APJL 996, L33; doi: 10.3847/2041-8213/ae2a30

Source: www.sci.news

Space-Based Data Centers Are Still a Long Way Off.

Starcloud aims to establish a 4km x 4km data center satellite

Starcloud

Can AI’s overwhelming demand for massive data centers be met in space? Tech firms are considering low-Earth orbit as a viable option, although experts warn that substantial engineering challenges remain unresolved.

The explosive demand and investment in generative AI platforms like ChatGPT have sparked an unparalleled need for computing resources, requiring vast land areas as well as electricity levels comparable to those consumed by millions of households. Consequently, many data centers are increasingly relying on unsustainable energy sources such as natural gas, with tech companies expressing concerns that renewable energy sources cannot meet their skyrocketing power needs or stability requirements for reliable operations.

In response, executives like Elon Musk and Jeff Bezos are advocating for launching data centers into orbit, where they could benefit from more continuous sunlight than terrestrial solar panels receive. Bezos, founder of Amazon and owner of Blue Origin, said earlier this year that gigawatt-class data centers could be operational in space within 10 to 20 years.

Google is moving forward with its vision for a space data center through its pilot initiative, Project Suncatcher, which plans to launch two prototype satellites equipped with TPU AI chips by 2027 to experiment with their functionality in orbit. However, one of the most notable advancements in space data processing occurred this year with the launch of a solitary H100 graphics processing unit by StarCloud, an Nvidia-backed company. Nevertheless, this is significantly less computing power than what modern AI systems require; OpenAI is estimated to utilize around a million of such chips.

For data centers to function effectively in orbit, many unresolved issues must be tackled. “From an academic research standpoint, [space data centers] are still far from being production-ready,” remarks Benjamin Lee from the University of Pennsylvania, USA.

According to Lee, one of the major hurdles is the extensive scale required to meet AI’s computational needs. This involves not only the power demands, which call for solar panels with substantial surface area, but also the challenge of radiating away the heat produced by the chips, the only feasible way to shed heat in a vacuum. “We can’t use cold air and evaporative cooling like we do on Earth,” Lee explained.

“Square kilometers will be needed just for energy generation and for cooling,” he added. “These structures grow rapidly. When you are discussing capacity in the range of 1,000 megawatts, that translates into a considerable area in orbit.” Indeed, StarCloud plans to construct a 5,000-megawatt data center covering 16 square kilometers, roughly 400 times the area of the solar panels on the International Space Station.

Lee believes that several promising technologies could help mitigate these requirements. Krishna Muralidharan from the University of Arizona is investigating thermoelectric devices that can convert heat into electricity, enhancing the efficiency of chips functioning in space. “It’s not a matter of feasibility; it’s a challenge,” Muralidharan stated. “For now, we can temporarily rely on large thermal panels, but ultimately we will require more sophisticated solutions.”

Additionally, space presents unique challenges unlike those found on Earth. For instance, there is a significant presence of high-energy radiation that can impact computer chips, leading to errors and disrupted calculations. “Everything will slow down,” Lee cautioned. “A chip positioned in space might perform worse compared to one on Earth due to the need for recalibration and error correction.”

To function at this scale, Muralidharan noted that thousands of satellites need to operate in tandem, necessitating highly precise laser systems for communication both between data centers and with Earth, where atmospheric interference can distort signals. Despite this, Muralidharan remains optimistic, believing these challenges are surmountable. “The real question is not if, but when,” he asserts.

Another point of uncertainty is whether AI will still need such extensive computational resources by the time the data centers are in place, particularly if AI’s appetite for computing power stops growing at the pace we are beginning to observe. “It’s evident that training requirements may peak or stabilize, which would likely cause the demand for large-scale data centers to follow suit,” Lee explained.

Yet, even in such a scenario, Muralidharan suggests potential applications for space-based data centers, such as facilitating space exploration beyond Earth and monitoring terrestrial phenomena.


Source: www.newscientist.com

The Impact of Large Data Centers on Australia’s Freshwater Resources

Australia is capitalizing on the AI boom, with numerous new investments in data centers located in Sydney and Melbourne. However, experts caution about the strain these large-scale projects may impose on already limited water resources.

The projected water demand for servicing Sydney’s data centers is anticipated to surpass the total drinking water supply in Canberra within the next decade.

In Melbourne, the Victorian government has pledged a $5.5 million investment to transform the city into Australia’s data center hub. Currently, hyperscale data center applications already exceed the collective water demands of nearly all of the top 30 business customers in the state.

Tech giants like OpenAI and Atlassian are advocating for Australia to evolve into a data processing and storage hub. With 260 data centers currently operational and numerous others planned, experts are concerned about the repercussions for drinking water resources.

Sydney Water projects that it will require as much as 250 megalitres a day to support the industry by 2035, more than Canberra’s entire drinking water supply.

Cooling Requires Significant Water

Professor Priya Rajagopalan, director of RMIT’s Center for Post Carbon Research, points out that a data center’s water and energy requirements are largely dictated by the cooling technology implemented.

“Using evaporative cooling leads to significant water loss through evaporation, while a sealed system conserves water but requires substantially more energy for cooling,” she explains.

Older data centers typically depend on air cooling. However, the increased demand for computational power means greater server rack densities, resulting in higher temperatures. Hence, these centers rely more heavily on water for cooling solutions.

Water consumption in data centers varies significantly. For instance, NextDC has transitioned to liquid-to-chip cooling, which cools processors and GPUs directly, as opposed to cooling entire rooms with air or water.

NextDC reports that while initial trials of this cooling technology have concluded, liquid cooling is far more efficient and can scale to ultra-dense environments, improving processing power without a proportional increase in energy consumption. Its modeling suggests that power usage effectiveness (PUE) could fall to as low as 1.15.


The data center sector measures its sustainability using two key metrics: water usage effectiveness (WUE) and power usage effectiveness (PUE), which gauge how much water or power is consumed per unit of computing work.

WUE is calculated by dividing annual water usage in litres by annual IT energy usage in kilowatt-hours (kWh). For instance, a 100MW data center that uses 3ML of water daily would have a WUE of 1.25. A lower number indicates greater efficiency. Certain countries enforce minimum standards; Malaysia, for example, recommends a WUE of 1.8.

Even facilities that are efficient can still consume substantial amounts of water and energy at scale.

NextDC’s last fiscal year’s PUE stood at 1.44, up from 1.42 the previous year. The company indicates that this reflects the changing nature of customer activity across its facilities and the onboarding of new centers.

Calls to Ban Drinking Water Usage

Sydney Water states that estimates regarding data center water usage are continually reassessed. To prepare for future demands, the organization is investigating alternative, climate-resilient water sources like recycled water and rainwater harvesting.

“Every proposed connection for data centers will undergo case-by-case evaluations to guarantee adequate local network capacity. If additional services are necessary, operators might need to fund upgrades,” a Sydney Water representative said.

In its submission to the 2026-2031 rate review in Victoria, Melbourne Water observed that hyperscale data center operators seeking connectivity “expect instantaneous and annual demand to surpass nearly all of Melbourne’s leading 30 non-residential customers.”

Melbourne Water mentioned, “This has not been factored into our demand forecasting or expenditure plans.”

The agency is requesting upfront capital contributions from companies to mitigate the financial burden of necessary infrastructure improvements, ensuring those costs do not fall solely on the broader customer base.

Documents provided to the Guardian show that Greater Western Water in Victoria has received 19 data center applications.


The Concerned Waterways Alliance, composed of various Victorian community and environmental organizations, has expressed concerns regarding the potential diversion of drinking water for cooling servers when the state’s water supplies are already under stress.

Alliance spokesperson Cameron Steele emphasized that expanding data centers would create a greater reliance on desalinated water, thereby diminishing availability for ecological streams and possibly imposing costs on local communities. The group is advocating for a ban on potable water usage for cooling and demanding that all centers transparently report their water consumption.

“We strongly promote the use of recycled water over potable water within our data centers.”

Closed Loop Cooling

In hotter regions, like much of Australia during summer, data centers require additional energy or water to remain cool.

Daniel Francis, customer and policy manager at the Australian Water Works Association, highlights that there is no universal solution for the energy and water consumption of data centers, as local factors such as land availability, noise restrictions, and water resources play significant roles.

“We constantly balance the needs of residential and non-residential customers, as well as environmental considerations,” says Francis.

“Indeed, there is a considerable number of data center applications, and it’s the cumulative effect we need to strategize for… It’s paramount to consider the implications for the community.”

“Often, they prefer to cluster together in specific locations.”

One of the data centers currently under construction in Sydney’s Marsden Park is a 504MW facility spanning 20 hectares with six four-story buildings. The company claims this CDC center will be the largest data campus in the southern hemisphere.

Last year, CDC operated its data centers with 95.8% renewable electricity, achieving a PUE of 1.38 and a WUE of 0.01. A company representative stated that this level of efficiency was made possible through a closed-loop cooling system that does not require continuous water extraction, in contrast to traditional evaporative cooling systems.

“CDC’s closed-loop system is filled only once at its inception and functions without ongoing water extraction, evaporation, or waste generation, thereby conserving water while ensuring optimal thermal performance,” the spokesperson noted.

“This model is specifically designed for Australia, a nation characterized by drought and water shortages, focusing on long-term sustainability and establishing industry benchmarks.”

Despite CDC’s initiatives, community concerns regarding the project persist.

Peter Rofile, acting chief executive of the Western NSW Health District, expressed in a letter last June that the development’s proximity to vulnerable communities and its unprecedented scale posed untested risks to residents in western Sydney.

“This proposal does not guarantee that this operation can adequately mitigate environmental exposure during extreme heat events, potentially posing an unreasonable health risk to the public,” Rofile stated.

Source: www.theguardian.com

Irish Authorities Request Microsoft to Investigate Alleged Illegal Data Processing by IDF

Irish officials have received a formal request to look into Microsoft regarding claims of unlawful data processing by the Israel Defense Forces.

The Irish Council for Civil Liberties (ICCL), a human rights organization, filed the complaint with the Data Protection Commissioner, who is legally charged with overseeing all data processing activities within the European Union.

This comes after reports in August from the Guardian, along with Israeli-Palestinian publication +972 Magazine and Hebrew media Local Call, highlighted that substantial amounts of Palestinian phone communications were stored on Microsoft’s Azure cloud platform as part of an extensive surveillance initiative by the Israeli military.

The ICCL asserts that the handling of personal data “aided in the commission of war crimes, crimes against humanity, and genocide by Israeli forces.” Microsoft’s European headquarters are located in Ireland.

“Microsoft’s technologies are endangering millions of Palestinians. These are not just theoretical data protection issues,” said Joe O’Brien, executive director of ICCL.

He remarked that cloud services “enable tangible violence” and emphasized the need for the “DPC to respond swiftly and decisively” given the “risk to life involved in the matter at hand.”

He further stated, “When European infrastructure is used to facilitate surveillance and targeting, the Irish Data Protection Commissioner must step in and utilize its full authority to hold Microsoft accountable.”

A collection of leaked documents reviewed by the Guardian has indicated that as early as 2021, the Israeli military’s intelligence unit, Unit 8200, started discussions to transfer large amounts of classified intelligence data to a cloud service operated by a US company.

The documents revealed that Microsoft’s storage facilities were employed by Unit 8200 to archive extensive records of Palestinian daily communications, which facilitated specific airstrikes and other military actions.

Following this revelation, Microsoft initiated an urgent external inquiry into its connections with Unit 8200. Preliminary findings led the company to suspend this unit’s access to certain cloud storage and AI services.

ICCL contends that Microsoft played a crucial role in enabling Israel’s military surveillance system known as “Al-Minasek.”

The organization claims that records of intercepted conversations were moved off EU servers to Israel and reportedly “deleted”, obstructing evidence of unlawful processing before an EU inquiry could begin, in violation of the EU’s General Data Protection Regulation (GDPR), which governs the use of personal data.

With Azure’s vast storage and computational capabilities, Unit 8200 was establishing an indiscriminate system allowing agents to collect, replay, and analyze cell phone calls from entire populations.

A spokesperson for the DPC stated, “We can confirm that the DPC has received the complaint and is currently evaluating it.”

Microsoft has been approached for a response.

Source: www.theguardian.com

Do Data Centers’ High Energy Demands Threaten Australia’s Net Zero Goals?

The demand for electricity by data centers in Australia could triple over the next five years, with projections indicating it may surpass the energy consumed by electric vehicles by 2030.

Data centers currently account for approximately 2% of electricity demand on the national grid, equating to around 4 terawatt-hours (TWh) a year. The Australian Energy Market Operator (Aemo) expects this share to grow significantly, projecting growth of about 25% annually to reach 12TWh, or 6% of grid demand, by 2030, and 12% by 2050.
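The 2030 figure is consistent with simple compound growth from today’s base. A quick check of the arithmetic, assuming the 25% annual growth rate applies uniformly over five years:

```python
# Rough check of Aemo's projection: ~4 TWh today growing at ~25% per year.
base_twh = 4.0      # current annual data center demand
growth_rate = 1.25  # 25% annual growth
years = 5           # roughly 2025 to 2030

projected_twh = base_twh * growth_rate ** years
print(f"Projected demand after {years} years: {projected_twh:.1f} TWh")  # about 12.2 TWh
```

which lands close to the 12TWh (6% of grid demand) figure Aemo cites for 2030.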

Aemo anticipates that the rapid expansion of this industry will drive “substantial increases in electricity usage, especially in Sydney and Melbourne.”


In New South Wales and Victoria, where the majority of data centers are situated, they are projected to account for 11% and 8% of state electricity demand, respectively, by 2030, with overall demand in each state expected to grow accordingly.

Companies such as OpenAI and SunCable are pushing for Australia to become a central hub for data processing and storage. The Victorian government recently announced a $5.5 million investment aimed at establishing the state as Australia’s data center capital.

However, with 260 data centers currently operating across the nation and numerous others in the pipeline, experts express concerns about the implications of unchecked industry growth on energy transition and climate objectives.

Energy Usage Equivalent to 100,000 Households

The continual operation of numerous servers generates substantial heat and requires extensive electricity for both operation and cooling.


Globally, the demand for data centers is growing at a rate four times faster than other sectors, according to the International Energy Agency. The number and size of centers are escalating, with large facilities becoming increasingly common.

As highlighted by the IEA, “AI-centric hyperscale data centers possess a capacity exceeding 100MW and consume energy equivalent to what 100,000 homes use annually.”
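The IEA’s comparison is easy to sanity-check: a 100MW facility running continuously consumes roughly 876 gigawatt-hours a year, which spread across 100,000 homes implies about 8.8MWh per household, close to typical annual consumption in many developed economies. A minimal sketch, assuming the facility runs flat out:

```python
# Sanity check of the IEA comparison: a 100 MW hyperscale facility versus 100,000 homes.
capacity_mw = 100
hours_per_year = 8760
annual_gwh = capacity_mw * hours_per_year / 1000      # ~876 GWh if run continuously

homes = 100_000
mwh_per_home = annual_gwh * 1000 / homes              # ~8.8 MWh per household
print(f"Annual consumption: {annual_gwh:.0f} GWh")
print(f"Implied per-household use: {mwh_per_home:.1f} MWh/yr")
```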

Michael Blair, a professor of mechanical engineering at the University of Melbourne and director of the Net Zero Australia project, said data centers’ electricity and water use are closely linked because of cooling requirements, since servers convert the electrical energy they draw into heat.

“In confined spaces with many computers, air conditioning is required to maintain an optimal operating temperature,” he explains.

Typically, digital infrastructure is cooled through air conditioning or water systems.

Ketan Joshi, an Oslo-based climate analyst who works with the Australia Institute, says many tech companies are reporting surges in electricity consumption compared with last year. Their energy intensity has also been rising on several metrics, including energy per active user and energy per unit of revenue, compared with five years ago.

“They aren’t consuming more energy to serve additional users or increase revenue,” he asserts. “The pertinent question is: why is our energy consumption escalating?”

In the absence of concrete data, Joshi suggests that the undeniable growth in demand is likely attributed to the rise of energy-intensive generative AI systems.

“Running Harder to Stay in the Same Place”

Joshi is tracking the issue because data centers worldwide have been shown to place substantial, inflexible demands on power grids, with two significant consequences: increased dependence on coal and gas generation, and the diversion of resources away from the energy transition.

While data center companies often assert that they run on clean energy through investments in solar and wind, Joshi notes there is often a mismatch between the companies’ round-the-clock reliance on the grid and the output profiles of their renewable energy projects.

“What’s the ultimate impact on the power grid?” he questions. “Sometimes, we have surplus energy, and other times, there isn’t enough.”


“So, even if everything appears favorable on paper, your data center might inadvertently be propping up fossil fuel generation.”

Moreover, instead of renewable energy sources displacing coal and gas, these sources are accommodating the growing demands of data centers, Joshi notes. “It’s like sprinting on a treadmill—no matter how hard you run, it feels like the speed is continually increasing.”


Electricity demand has surged to the point that mothballed nuclear power plants in the US are being revived, demand for gas turbines is climbing, and some Australian developers are proposing to install new gas generators to meet their energy needs.

Aemo predicts that by 2035 data centers could consume 21.4TWh a year, comparable to the annual electricity consumption of four aluminum smelters.

Blair pointed out that AI adoption is still in its infancy and the outlook remains uncertain: Aemo’s scenarios for 2035 consumption range between 12TWh and 24TWh, so the expansion may not be as large as anticipated.

In the National AI Plan released on Tuesday, the federal government recognized the need for advances in new energy and cooling technologies for AI systems. Industry Minister Tim Ayres said principles for data center investment would be established in early 2026, including requirements for supplementary investment in renewable energy generation and water sustainability.

“Undeniable Impact” on Electricity Prices

Dr. Dylan McConnell, an energy systems researcher at the University of New South Wales, noted that while renewable energy is on the rise in Australia, it is not yet progressing rapidly enough to meet required renewable energy and emissions targets. The expansion of data centers will complicate these challenges.

“If demand escalates beyond projections and renewables can’t keep pace, we’ll end up meeting that new demand instead of displacing coal,” he explains.

Unlike electric vehicles, which add demand to the grid while reducing gasoline and diesel use, data centers do not reduce fossil fuel consumption elsewhere in the economy, according to McConnell.

“If this demand materializes, it will severely hamper our emissions targets and complicate our ability to phase out coal in alignment with those targets,” he advises.

In its advice on climate targets, the Climate Change Authority stated: “Data centers will continue to scale up, exerting deeper pressure on local power sources and further hampering renewable energy expansions.”

McConnell said the expansion would also have a significant effect on overall system costs, and therefore on electricity prices.

“To support this load, we will need a larger system that utilizes more costly resources.”

Source: www.theguardian.com

Virginia Democrat Makes Data Centers a Winning Issue in State House Race

John McAuliffe, a 33-year-old entrepreneur and former public servant, was an unexpected Democratic contender in this month’s Virginia House of Delegates election, especially given a campaign approach that occasionally resembled that of his Republican opponents.

Mr. McAuliffe was one of 13 Democrats who flipped seats in Virginia’s House of Delegates, part of a significant electoral win that gave the party robust control over state government. Together with victories in states like New Jersey and California, the result gives Democrats renewed momentum nationwide, following a disheartening loss to Donald Trump and the Republican Party the previous year.

The northern Virginia district he aimed to represent, characterized by residential areas, agricultural land, and charming small towns, hadn’t seen a Democratic representative in decades. Thus, McAuliffe campaigned door-to-door on his electric scooter, reaching out to constituents with a pledge to “protect their way of life.” He dismissed the label “woke” and attributed the “chaos” to Washington, D.C., located over an hour away.


One of his primary talking points was a widespread concern resonating with many Democrats today, but with a distinct angle: the adverse impacts of data centers on electricity costs.

“I spent a majority of the year visiting households I never imagined were Democratic,” McAuliffe recounted. “Independents, Republicans, and an occasional Democrat, yet many began shutting their doors on me.”

“However, once they voiced a desire to discuss data centers, it opened a dialogue. That allowed me to draw a contrast, which is rare.”

Loudoun County, which makes up about half of Virginia’s 30th House District and is known for its high per capita income, hosts data centers that handle more internet traffic than any other region in the world. While the facilities underpin much of how the internet functions, McAuliffe argued, and many voters agreed, that their presence can be burdensome.

As large as warehouses, the data centers loom over nearby neighborhoods, buzzing with the sound of servers and cooling machinery. Developers are seeking to build facilities in Fauquier County, the district’s other, Republican-leaning area, but McAuliffe said residents are apprehensive about construction on rural farmland renowned for its scenic vistas. He noted receiving complaints about the impact of data centers on electricity bills across the board.

According to a 2024 report from the Virginia General Assembly’s Joint Legislative Audit and Review Committee, the state’s energy demands are projected to double over the next decade, chiefly due to data centers and the substantial infrastructure required to cater to this demand.

The report also indicated that while Virginia’s electricity pricing structures are “appropriately” aligned with facility usage, “energy costs for all consumers are likely to rise” to cover new infrastructure expenses and necessary electricity imports. Earlier this month, Virginia’s public utility regulators approved a rise in electricity rates, though not to the extent Dominion Energy, the state’s primary provider, initially requested.

“The costs tied to infrastructure—the extensive transmission lines and substations—are being passed down to consumers,” McAuliffe explained from a co-working space in Middleburg, Virginia, where his campaign operates.

“These essentially represent taxes that we’ve wrongfully placed on ordinary Virginians to benefit corporations like Amazon and Google. While there may be some advantages for these communities, these companies are capable of affording them, and we must strive to better negotiate those benefits.”

McAuliffe’s opponent was Republican Geary Higgins, who had been elected in 2023. The battle between the two parties proved costly, with Democrats investing nearly $3 million and their adversaries spending just over $850,000, according to records from the Virginia Public Access Project.

The campaign encompassed more than just data centers; McAuliffe also spotlighted reproductive rights and increases in teacher pay. Democrats have committed to codifying access to abortion if they gain full control of Virginia’s state government, and the party criticized Higgins for failing to return contributions from controversial politicians.

Yet, McAuliffe chose to concentrate on data centers, believing their impacts presented “the most pressing issue we can address.” This focus surprised some of his consultants, and although he acknowledged it was a “somewhat niche topic,” data centers frequently emerged as a primary concern during his door-to-door visits.

To counter Higgins, his campaign even launched a website called Data Center Geary, attempting to tie the Republican, a former Loudoun County supervisor, to the spread of these facilities. Higgins and his allies condemned the effort as misleading.

Mr. McAuliffe ultimately won with 50.9% of the votes, while Mr. Higgins gathered 49%. In response to a request for an interview, Higgins stated that McAuliffe’s “entire campaign was based on falsehoods regarding me and my history.”

“Thanks to an influx of external funding and high Democratic turnout, he was able to fabricate a misleading caricature of me and narrowly triumph,” Higgins remarked.

As Mr. Trump faced the polls nationwide last year, voters in conservative rural and suburban areas turned away from Democrats, resulting in the party’s loss of the presidency and Congressional control. McAuliffe’s victory leaves some party leaders pondering the lessons Democrats can glean from his campaign.

“In typically red regions, he identified common issues that resonated with both Republicans and Democrats while making a convincing case for solutions,” said Democratic congressman Suhas Subramanyam, whose district includes the area McAuliffe will represent.

Democratic National Committee Chairman Ken Martin, who campaigned alongside McAuliffe, characterized him as “an extraordinary candidate who triumphed by focusing squarely on the relevant issues of his district.”

“Democrats are capable of winning in any setting, especially in suburbs and rural environments, when they have candidates who commit themselves to addressing the genuine needs of their community. Presently, what Americans require is the capability to manage their expenses,” stated Martin.

Chaz Natticombe, founder and executive director of State Navigate, Virginia’s nonpartisan election monitoring organization, said that while McAuliffe did not outperform Democrat Abigail Spanberger’s standout gubernatorial result, his vote tally shows he won over some Republican-leaning voters who would not back Higgins.

“He outperformed everyone else, primarily because he gained the support of Republican-leaning voters,” Natticombe concluded.

Source: www.theguardian.com

Mumbai Families Struggle as Data Centers Increase City’s Coal Dependence

Every day, Kiran Kasbe drives her rickshaw taxi through the bustling streets of Mahul, near her home on Mumbai’s eastern seaboard, where stalls brimming with tomatoes, gourds and eggplants are often enveloped in thick smog.

Earlier this year, doctors found three tumors in her 54-year-old mother’s brain. The specific cause of the cancer remains unclear, but studies indicate that people living near coal-fired power plants face a significantly higher risk of developing such illnesses, and Mahul’s residents live mere hundreds of meters from them.

The air quality in Mahul is notoriously poor; even with closed car windows, the pungent odor of oil and smoke seeps in.

“We are not the only ones suffering health issues here. Everything is covered in grime,” noted Kasbe, 36.

Plans to shut down two coal-fired power plants operated by the Indian firms Tata Group and Adani were announced as part of the government’s initiative to reduce emissions, but the decisions were later overturned after Tata argued that escalating electricity demand in Mumbai made coal necessary.

Neither firm responded to inquiries for comment.

Buildings blanketed in smog in Mumbai, India, January. Photo: Bloomberg/Getty Images

India’s electricity demand has surged in recent years, driven by economic growth and increased air conditioning use amid severe heat exacerbated by climate change. But an investigation by Source Material and the Guardian found that a major factor keeping the city reliant on fossil fuels is the insatiable energy demand of data centers.

Leaked documents also reveal the scale of the Mumbai presence of Amazon, the world’s largest data center operator.

Amazon lists three “availability zones” in the Mumbai metropolitan area, each comprising one or more data centers. Leaked data from last year indicated that the company operates 16 facilities in the city.

Bhaskar Chakravorti, an academic at Tufts University who studies technology’s societal impacts, said the surge in data centers is creating tension between energy needs and climate goals as India works to turn its economy into an artificial intelligence hub.

“I’m not surprised by the slow progression towards a greener transition, particularly as demands grow rapidly,” he said regarding the Indian government’s stance.

Amazon spokesperson Kylie Jonas asserted that Mumbai’s “emissions issue” cannot be attributed to Amazon.

“On the contrary, Amazon is among the largest corporate contributors to renewable energy in India, backing 53 solar and wind initiatives capable of generating over 4 million megawatt-hours of clean energy each year,” she stated. “Once operational, these investments will power more than 1.3 million Indian households annually.”

Amazon is establishing numerous data centers globally, vying with Microsoft, Google, and other entities for dominance in the burgeoning AI sector.

Tata Consultancy Services Ltd. office in Mumbai, India. Photo: Bloomberg/Getty Images

Eliza Pan, a representative of Amazon Employees for Climate Justice, criticized the company for not acknowledging its role in perpetuating reliance on one of the most polluting energy sources.

“Amazon is leveraging this shiny concept called AI to distract from the reality of building a dirty energy empire,” she said.

Jonas refuted this assertion, stating, “Not only are we recognized as the most efficient data center operator, but we’ve also been the top corporate purchaser of renewable energy for five successive years, with over 600 projects globally.”

Amazon’s claims regarding green energy are contentious. The organization has been scrutinized for engaging in “creative accounting” by acquiring renewable energy certificates alongside direct green energy purchases, as noted by a member of Amazon Employees for Climate Justice.

“Everything is contaminated”

Kasbe operates her rickshaw in Mahul, a former fishing settlement that has transformed into a residence for tens of thousands who were displaced from slums across the city.

Kiran Kasbe’s mother. Photo: Provided by Sushmita

Kasbe and her mother relocated here in 2018 after their home on the outskirts of Vidyavihar was demolished. Her mother was in good health before the move, but her condition deteriorated significantly, culminating in the brain tumor diagnosis.

Gajanan Tandol, a local resident, shared that pollution-related diseases are prevalent. “There are numerous instances of skin and eye inflammation, cancer, asthma, and tuberculosis, yet we receive no government assistance,” he lamented.

Another community member, Santosh Jadhav, implored the government to relocate residents from Mahul.

“Everything is tainted. We’re exhausted from fighting for a decent existence,” he stated. “This is hell for us.”


Hidden Data Centers

Amazon, whose e-commerce platform handles 13 million customer transactions daily, is investing billions in expanding its profitable cloud computing arm and its AI-assisted services, such as automated coding and translation, according to research from Capital One.

Many of the centers in Mumbai remain under the radar because they are leased rather than owned. Unlike in the U.S., where Amazon predominantly owns its facilities, it frequently rents entire data farms or server racks in centers shared with other companies elsewhere.

Shaolei Ren, a computing scholar at the University of California, Riverside, noted that shared “colocation” facilities account for a significantly larger share of the sector’s energy consumption than wholly owned or fully leased operations.

“The majority of energy used in the data center sector is concentrated in colocation facilities,” he noted. “They are ubiquitous.”

Employees near the Amazon Prime brand in Mumbai, India, September. Photo: NurPhoto/Getty Images

Based on leaked information, Amazon’s colocation data center in Mumbai consumed 624,518 megawatt-hours of electricity in 2023, sufficient to power over 400,000 homes in India for an entire year.
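That comparison implies an average of roughly 1.6MWh per household per year, broadly in line with typical Indian residential consumption. A quick check of the arithmetic:

```python
# Rough check of the leaked consumption figure against the 400,000-home comparison.
colocation_mwh = 624_518   # reported 2023 consumption of Amazon's Mumbai colocation footprint
homes = 400_000
mwh_per_home = colocation_mwh / homes
print(f"Implied consumption per home: {mwh_per_home:.2f} MWh/yr")  # about 1.56 MWh
```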

India is on the verge of surpassing Japan and Australia to become the second-largest consumer of data center power in the Asia-Pacific region, according to S&P, and Techno & Electric Engineering CEO Ankit Saraiya predicts that by 2030 data centers will account for one-third of Mumbai’s energy consumption.

“Poison hell”

In a bid to keep up with power demand, the Maharashtra government has extended the operational duration of the Tata coal-fired power plant in Mahul by at least five years. Additionally, the closure of a 500-megawatt plant operated by Tata competitor Adani Group in the city’s north has been postponed.

When Tata requested an extension in its proposal to the State Energy Commission, it cited the rising energy demand from data centers as the primary justification. Adani projected that the anticipated surge in demand during the five years following the plant’s scheduled closure would come predominantly from data centers.

These power plants represent merely two of the numerous polluting sources within Mumbai’s Mahul district. The area also houses three oil refineries and 16 chemical facilities, as stated in a 2019 report by the Indian Center for Policy Research, which branded the locality a “toxic hell.”

The Tata power plant has been operational since 1984, and like many old power stations, it is subject to lenient emissions regulations, as noted by Raj Lal, chief air quality scientist at the World Emissions Network, who labeled it “one of the major contributors to air pollution in Mumbai.”

The Centre for Research on Energy and Clean Air found that PM2.5 particles account for nearly a third of the area’s pollution. PM2.5 refers to airborne particles less than 2.5 micrometers in diameter, which can cause severe health problems when inhaled.

Smoke emanates from the chimney of Tata Power Company’s Trombay thermal facility in Mumbai, India, August 2017. Photo: Bloomberg/Getty Images

Shripad Dharmadhikari, founder of the environmental organization Manthan Adhyayan Kendra, stated that the toxic heavy metals in ash generated by the factories are likely to trigger “respiratory diseases, kidney ailments, skin issues, and heart problems.”

While Tata’s facilities continue operations, Mumbai’s power grid is buckling under the increasing demand. To mitigate potential power shortages, Amazon’s colocation data center in the city has invested in 41 backup diesel generators and is seeking permission for additional installations, according to the leaked documents.

A report released in August by the Center for Study of Science, Technology and Policy (CSTEP) identified diesel generators as a primary source of pollution in the locality.

Air quality expert Swagata Dey at CSTEP argued that the presence of data centers requiring continuous electricity, coupled with the backup diesel generators, “will inevitably exacerbate emissions,” advocating for legal requirements for data center operators to utilize pollution-free solar energy.

The Amazon facility across Thane Creek from Mahul already has 14 generators installed, and one of its partners was granted permission to set up another 12 on the site earlier this year.

“Public health considerations must be central to decisions regarding data center locations and energy source selections,” said Ren, co-author of a recent paper evaluating the public health consequences of diesel generators at US data centers.

Sushmita, one of the reporters on this story, goes by a single name; in India, surnames are often avoided because they signify caste, reflecting a hierarchical and discriminatory social structure.

Source: www.theguardian.com

Civil Liberties Organization Demands Inquiry into UK Data Protection Authority

Numerous civil liberties advocates and legal professionals are demanding an inquiry into the UK’s data protection regulator, citing what they describe as a “collapse in enforcement activity” following a significant scandal, the Afghanistan data breach.

A group of 73 signatories, including academics, leading lawyers, data protection specialists, and organizations such as Statewatch and the Good Law Project, have sent a letter to Chi Onwurah, chair of the cross-party Commons Science, Innovation and Technology Committee. The effort, coordinated by the Open Rights Group, calls for an investigation into the conduct of Information Commissioner John Edwards’ office.

“We are alarmed by the collapse in enforcement action by the Information Commissioner’s Office (ICO), which has resulted in no formal investigation of the Ministry of Defence (MoD) after the Afghanistan data breach,” the signatories stated. They caution that there are “more serious structural flaws” beyond data breaches alone.

The Afghanistan data breach represented a grave leak involving information about Afghan individuals who collaborated with British forces prior to the Taliban’s takeover in August 2021. Those whose names were disclosed indicated that this exposure endangered their lives.

“Data breaches can pose serious risks to individuals and disrupt the continuity of government and business,” the letter emphasized. “However, during a recent hearing conducted by your committee, Commissioner John Edwards suggested he has no intention of reassessing his approach to data protection enforcement, even in light of the most significant data breach ever in the UK.”

The signatories also referenced other notable data breaches, including those affecting the victims of the Windrush scandal.

They argue that the ICO has adopted a “public sector approach” to such incidents, relying on unenforceable written warnings and substantially reduced fines.

“The ICO’s choice not to initiate any formal action against the MoD, despite ongoing failures, is as remarkable as its lack of documentation regarding its decisions. This paints a picture in which the ICO’s public sector approach provides minimal deterrence and fails to encourage effective data management across government and public entities.”

“The response to the Afghanistan data breach signifies a broader issue. Many have been left disillusioned by the ICO’s lack of use of its remedial powers and its continual shortcomings.”

The letter warns that the trend of declining enforcement is not confined to the public sector: according to the latest ICO report, enforcement action against private sector organizations is also becoming increasingly rare, as the ICO fails to pursue matters and organizations redirect resources away from compliance and responsible data practices.

“Instead of simply hoping for a positive outcome, Parliament has endowed the ICO with ample authority to ensure compliance through legally binding orders. During the hearing you conducted, it was clear that the ICO opted not to exercise these powers regarding the Afghan data breach.”

“Regrettably, the Afghanistan data breach is not an isolated case but rather an indication of deeper structural issues in the ICO’s operations.”

The letter concludes with the assertion that “change seems improbable unless the Science, Innovation and Technology Committee steps in with its oversight capabilities.”

An ICO spokesperson commented: “We possess a comprehensive array of regulatory powers and tools to tackle systemic concerns within specific sectors or industries.”

“We appreciate the essential role civil society plays in scrutinizing our decisions and look forward to discussing our strategies at our next regular meeting. We also welcome the opportunity to clarify our work when engaging with or appearing before the Science, Innovation and Technology Select Committee.”

Source: www.theguardian.com

Data Breach Exposes Personal Information of Tate Gallery Job Seekers

The Guardian has revealed that personal information from job applicants to the Tate has been exposed online, including applicants’ addresses and salaries and the phone numbers of their referees.

These extensive records, running hundreds of pages, were shared on a site not affiliated with the government-supported organization managing London’s Tate Modern, Tate Britain, Tate St Ives in Cornwall, and Tate Liverpool.

The leaked data includes details such as the current employers and educational backgrounds of the 111 people who applied for a website developer role at the Tate in October 2023. While applicants’ names are not disclosed, referees’ phone numbers and personal email addresses appear to be included. It remains unclear how long the information has been available online.

Max Kohler, a 29-year-old software developer, learned his data had been compromised after one of his application reviewers received an email from an unfamiliar source who accessed the online data dump.

Kohler found that the breach contained his last paycheck, current employer’s name, other reviewers’ names, email addresses, home addresses, and extensive responses to job interview questions.

“I feel extremely disappointed and disheartened,” he stated. “You dedicate time filling out sensitive information like your previous salary and home address, yet they fail to secure it properly and allow it to be publicly accessible.”

“They should publicly address this issue, provide an apology, and clarify how this happened, along with actions to prevent future occurrences. It likely stems from inadequate staff training or procedural oversights.”

Reported incidents of data security breaches to the UK’s Information Commissioner’s Office (ICO) continue to rise. Over 2,000 incidents were reported quarterly in 2022, increasing to over 3,200 between April and June of this year.

Kate Brimsted, a partner at the law firm Shoosmiths and an authority on data privacy, information law, and cybersecurity, commented: “Breaches do not always have to be intentional. While ransomware attacks attract significant attention, the scale of current breaches is substantial.” Errors often contribute to these incidents, she said, highlighting the need for robust checks and procedures in daily operations. “Managing our data can be tedious, but it remains crucial,” she added.

The ICO emphasized that organizations must report a personal data breach to them within 72 hours of being aware, unless there is no risk to individuals’ rights and freedoms. If an organization decides not to report, they should maintain a record of the breach and justify their decision if needed.


A spokesperson for Tate stated: “We are meticulously reviewing all reports and investigating this issue. Thus far, we haven’t identified any breaches in our systems and will refrain from further comment while this issue is under investigation.”


Source: www.theguardian.com

7 Expert Tips for Safeguarding Your Personal Data

VPN providers are experiencing significant growth, offering virtual private networks that create encrypted paths for Internet data, effectively masking a user’s location.

Previously, VPNs were of interest mainly to a niche audience. Nowadays, they are increasingly utilized by individuals frustrated with the age verification requirements imposed by the Online Safety Act.

Since the law became effective on July 25th, VPNs have surged to prominence in UK app stores, as users seek to safeguard their identities.

It’s understandable that users overwhelmed by the demand for personal information turn to VPNs, though there are other ways to maintain safety online.

Want to go incognito?

Many users instinctively turn to the “Incognito” or “Private Browsing” mode available in their browsers. However, be cautious of misleading terminology.

“Private browsing isn’t as private as it seems,” warns Jake Moore, a cybersecurity expert at ESET. “It merely prevents your browser from saving your search history, cookies, and autofill information on your device.”

This feature is handy for avoiding traces on shared computers (e.g., when purchasing gifts online) but does little to conceal your identity from external parties.

“Your identity remains visible to websites, ISPs, and advertisers,” Moore emphasizes. “They can still see your IP address and track you if desired.”


Steer clear of major players

Search engines are the primary doorway to countless websites, yet many users are uneasy about the extensive data tech companies gather about them for advertising purposes.

“Google collects vast amounts of user data for profiling and targeted advertising, which is their main revenue source,” says Moore.

“This level of data analysis can be very invasive, and many users are unaware of it. [Tech companies] track their online behavior extensively.”

As an alternative, consider privacy-focused search engines like DuckDuckGo. “I always recommend DuckDuckGo,” says Alan Woodward, a Professor of Cybersecurity at the University of Surrey.

Beyond its search engine, DuckDuckGo also offers a web browser as an alternative to Google Chrome, Microsoft Edge, or Apple’s Safari; both the search engine and the browser are endorsed by Moore and Woodward.

Email also plays a crucial role in our online lives.

While “Big Tech” isn’t scrutinizing every detail of your holiday emails, it does analyze your data for advertising purposes—something they’ve been doing for years, Moore notes.

This data analysis allows Gmail, Outlook, Yahoo, and others to provide free services.

For many, myself included, the balance between cost and convenience seems acceptable. However, alternatives exist.

ProtonMail, a well-known option available for over ten years, features end-to-end encryption and built-in anonymity.

With 1 GB of free storage and an additional 5 GB from their Proton Drive cloud service, users get ample space, even if it’s less than what Google offers.

Increased social media usage compromises your identity security – Image courtesy of Alamy

Many web users have concerns about online payments, yet they are becoming increasingly unavoidable. PayPal is a reliable option.

“PayPal offers great convenience, and I’ve never encountered issues with them selling my data,” says Woodward.

Moore adds that PayPal can be a safer choice than directly entering your credit card information on websites.

For those apprehensive about credit card details being stolen, many banks provide virtual cards that can be utilized for one-time or occasional use through their apps.

Apple vs. Android

The debate between Apple and Android enthusiasts is as intense as sports rivalries, leading to divided opinions among experts.

Apps in the Apple App Store undergo more rigorous vetting, making them a potentially safer choice.

Nonetheless, “Both Apple and Android are vying to gather information on their users,” Moore points out. “Both seek user data, which translates to power.”

Avoid social media

The same applies to social media. Although it has become integral to modern life, there’s no perfect middle ground between engaging in online discussions and protecting your identity or controlling how platforms manage your data.

“For data-heavy social media, the best privacy strategy is simply to avoid it altogether,” Moore advises. “Keep in mind: If the service is free, you’re probably the product.”


Source: www.sciencefocus.com

Anthropic Unveils $50 Billion Initiative to Construct Data Centers Across the U.S.

On Wednesday, artificial intelligence firm Anthropic unveiled plans for a substantial $50 billion investment in computing infrastructure, which will include new data centers in Texas and New York.

Anthropic’s CEO, Dario Amodei, stated in a press release, “We are getting closer to developing AI that can enhance scientific discovery and tackle complex challenges in unprecedented ways.”

In the US, a large data center typically takes around two years to build and requires significant energy to operate. “This level of investment is essential to keep our research at the forefront and to cater to the escalating demand for Claude from numerous companies,” said the firm, known for Claude, an AI chatbot embraced by many organizations implementing AI. Anthropic anticipates that the initiative will generate approximately 800 permanent roles and 2,400 construction jobs.

The company is collaborating with London-based Fluidstack to develop new computing facilities to support its AI frameworks. However, specific details regarding the location and energy source for these facilities remain undisclosed.

Recent deals show that the tech sector continues to invest heavily in energy-intensive AI infrastructure, despite ongoing concerns about a potential market bubble, environmental impacts, and political backlash linked to soaring electricity prices in construction areas. TeraWulf, a developer of cryptocurrency mining data centers, recently announced a partnership with Fluidstack on a Google-backed data center project in Texas and on the shores of Lake Ontario in New York.

In a similar vein, Microsoft announced on Wednesday its establishment of a new data center in Atlanta, Georgia, which will link to another facility in Wisconsin, forming a “massive supercomputer” powered by numerous Nvidia chips for its AI technologies.

According to a report from TD Cowen last month, leading cloud computing providers leased more than 7.4GW of US data center capacity in the third quarter of this year, more than they leased in all of last year.

As spending escalates on computing infrastructure for AI startups that have yet to achieve profitability, concerns regarding a potential AI investment bubble are increasing.


Investors are closely monitoring a series of recent transactions between leading AI developers like OpenAI and Anthropic, as well as companies that manufacture the costly computer chips and data centers essential for their AI solutions. Anthropic reaffirmed its commitment to adopting “cost-effective and capital-efficient strategies” to expand its business.

Source: www.theguardian.com

Google to Establish Space-Based Data Centers to Support AI Needs

Google is set to establish an artificial intelligence data center in space, with initial test equipment scheduled for launch into orbit in early 2027.

The company’s scientists and engineers are confident about deploying a densely clustered array of around 80 solar-powered satellites at approximately 400 miles above the Earth, each outfitted with robust processors to cater to the escalating AI demands.

Google highlights the rapid decline in space launch costs, suggesting that by the mid-2030s, operating space-based data centers may become as affordable as their terrestrial counterparts. The study was made public on Tuesday.

Utilizing satellites could significantly lessen the impact on land and water resources that are currently required for cooling ground-based data centers.

Once in orbit, the data centers would run on solar power, with panels up to eight times more productive than ground-based installations. However, launching a single rocket into orbit emits hundreds of tons of CO2.
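Most of that “up to eight times” figure comes from orbital conditions rather than better panels: a satellite in a dawn-dusk sun-synchronous orbit sees near-continuous, unattenuated sunlight, while a ground-mounted panel is limited by night, weather, and the atmosphere. A back-of-the-envelope sketch, in which the ground capacity factor and orbital duty cycle are assumed typical values rather than figures from the article:

```python
# Back-of-the-envelope comparison of annual solar energy available per square meter of panel.
# Assumptions (not from the article): near-continuous sunlight in a dawn-dusk orbit,
# and a ~15% average capacity factor for a fixed ground-mounted panel at mid-latitudes.
solar_constant_w_m2 = 1361     # irradiance above the atmosphere
ground_peak_w_m2 = 1000        # standard rating irradiance at the surface
ground_capacity_factor = 0.15  # assumed average output versus peak for a ground panel
orbit_duty_cycle = 0.99        # dawn-dusk orbit: almost no time in Earth's shadow
hours_per_year = 8760

orbit_kwh = solar_constant_w_m2 * orbit_duty_cycle * hours_per_year / 1000
ground_kwh = ground_peak_w_m2 * ground_capacity_factor * hours_per_year / 1000
print(f"Orbit:  {orbit_kwh:,.0f} kWh per m^2 per year")
print(f"Ground: {ground_kwh:,.0f} kWh per m^2 per year")
print(f"Ratio:  {orbit_kwh / ground_kwh:.1f}x")
```

The ratio comes out at roughly eight to nine, in the same range as Google’s figure; the exact value depends on the orbit chosen and the ground site used for comparison.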

Astronomers have expressed concerns about the burgeoning number of satellites in low-Earth orbit, describing it as “akin to a bug on a windshield” when observing the cosmos.

The envisioned data centers under Project Suncatcher would use optical links for data transmission, primarily leveraging light or laser beams.

Major technology firms aiming for swift advancements in AI are projected to invest $3 trillion (£2.3 trillion) in data centers worldwide, ranging from India to Texas and Lincolnshire to Brazil. This surge in spending raises alarms regarding the carbon footprint if sustainable energy solutions are not sourced for these facilities.

“In the future, space might be the ideal environment for advancing AI computing,” stated Google.

“In light of this, our new research initiative, Project Suncatcher, envisions a compact array of solar-powered satellites utilizing Google TPUs and linked through free-space optical connections. This strategy has significant scaling potential and minimal impact on terrestrial resources.”

TPUs (tensor processing units) are specialized processors designed for training and running AI models. Free-space optical links transmit data wirelessly using light or laser beams.

Elon Musk, who oversees satellite internet provider Starlink and the SpaceX rocket program, announced last week that his company would begin expanding efforts to develop data centers in space.


Nvidia AI chips are anticipated to be launched into space later this month in collaboration with startup Starcloud.

“Space provides virtually limitless low-cost renewable energy,” commented Philip Johnston, co-founder of the startup. “The environmental cost occurs only at launch, and over the lifespan of the data center, there’s a tenfold reduction in carbon dioxide compared to ground-based power.”

Google aims to deploy two prototype satellites by early 2027, referring to the research findings as “the first milestone toward scalable space-based AI.”

However, the company cautions that “substantial engineering challenges persist, including thermal management, high-bandwidth ground communications, and the reliability of systems in orbit.”

Source: www.theguardian.com

Balancing Faith and Fear: Speculation Surrounds the $3 Trillion Global Data Center Surge

Global investments in artificial intelligence are yielding remarkable figures, with approximately $3 trillion (£2.3 trillion) allocated to data centers.

These immense facilities serve as the backbone for AI applications like OpenAI’s ChatGPT and Google’s Veo 3, driving the training and functioning of technologies that have attracted billions from investors.

Although there are worries that the AI boom might be a bubble poised to burst, indicators of such a downturn are currently absent. Recently Nvidia, the Silicon Valley AI chip maker, became the first company to reach a $5 trillion valuation, while Microsoft and Apple each passed $4 trillion for the first time. OpenAI’s restructuring values it at $500 billion, with Microsoft’s stake worth more than $100 billion, and some projections point to a further surge, potentially to $1 trillion, as early as next year.

Moreover, Google’s parent company Alphabet announced $100 billion in revenue for a single quarter, driven by an increasing demand for AI infrastructure. Apple and Amazon also recently reported robust results.

Trust in AI extends beyond the financial sector; local communities housing the AI infrastructure are equally invested.

In the 19th century, the demand for coal and steel determined Newport’s trajectory. Today, Welsh towns are looking forward to a fresh era of growth generated by the latest global economic transformation.

At the site of a former radiator factory on the outskirts of Newport, Microsoft is constructing a data center to cater to the tech industry’s increasing demand for AI.

Microsoft is constructing a data center at Imperial Park near Newport, Wales. Photo: Dimitris Regakis/Athena Pictures

While standing on the concrete floor where thousands of buzzing servers will soon be installed, Dimitri Batrouni, the Labour leader of Newport City Council, remarked that the Imperial Park data center represents an opportunity to delve into the economy of the future.

“In a city like mine, what should we do? Should we dwell on the past in hopes of reviving the steel industry and bringing back 10,000 jobs? That’s not feasible. Or should we embrace the future?” he stated.

Yet, despite the current optimistic outlook regarding AI, uncertainties linger concerning the sustainability of spending in the tech sector.

The top four players in the AI industry (Amazon, Meta, Google, and Microsoft) are ramping up their AI spending. Over the upcoming two years, they are expected to invest more than $750 billion in AI-related capital expenditures, covering not just data centers and staff, but also the chips and servers they contain.

The American investment firm Manning & Napier has described this level of expenditure as “nothing too remarkable.” The Newport facility alone could cost hundreds of millions of dollars, and Equinix, based in California, recently announced plans to invest £4 billion in a hub in Hertfordshire.

Joe Tsai, chairman of the Chinese e-commerce giant Alibaba, cautioned in March that the data center market was beginning to exhibit signs of oversupply. “We’re starting to observe the early stages of a potential bubble,” he commented, referencing projects that finance constructions without securing commitments from prospective clients.

There are already 11,000 data centers globally, representing a 500% increase over the past two decades, and more are on the horizon. The means of funding this expansion raises concerns.

Analysts from Morgan Stanley predict that worldwide spending on data centers will approach $3 trillion by 2028, with $1.4 trillion of that anticipated from cash flow generated by large US tech firms known as “hyperscalers.”

Consequently, roughly $1.5 trillion will need to come from other sources, such as private credit, which has drawn increasing scrutiny from institutions including the Bank of England. Morgan Stanley estimates that private credit could cover more than half of the funding shortfall; Meta, for example, used the private credit markets to raise $29 billion for the expansion of a data center in Louisiana.

Gil Luria, the head of technology research at DA Davidson, described investments in hyperscalers as a “healthy” aspect of the current boom, while labeling the remainder as “speculative assets devoid of customers.”

He noted that the debt being utilized could lead to repercussions extending beyond the tech sector if the situation deteriorates.

“Providers of this debt are so eager to invest in AI that they may not have adequately assessed the risks associated with a new and unproven category reliant on assets that depreciate quickly,” he indicated.

“We are in the initial phase of this influx of debt capital, but if it escalates to hundreds of billions of dollars, it could ultimately present structural risks to the global economy.”

Hedge fund founder Harris Kupperman noted in an August blog post that data centers depreciate at roughly twice the rate at which they generate revenue.

The $500 billion Stargate project in Abilene, Texas, involves a collaboration between OpenAI, SoftBank, and Oracle. Photo: Daniel Cole/Reuters

Supporting this expenditure are heightened revenue forecasts from Morgan Stanley, which estimates that income generated from AI innovations such as chatbots, AI agents, and image generators could grow to $1 trillion by 2028 from $45 billion last year. To substantiate these revenue projections, tech firms are counting on enterprises, the public sector, and individual users to generate sufficient demand for AI and fund it.

OpenAI’s ChatGPT, a landmark product of the AI wave, currently boasts 800 million weekly active users. This statistic is a boon for optimists. However, concerns have arisen regarding user acquisition. For instance, investor confidence in the AI surge took a hit in August when the Massachusetts Institute of Technology released a study indicating that 95% of organizations reported zero return on investment from generative AI projects.

According to the Uptime Institute, which inspects and evaluates data centers, many projects go unconstructed, suggesting that some are part of a hype cycle and fail to materialize.

“It is crucial to understand that much of this is speculative,” stated Andy Lawrence, the Uptime Institute’s executive director of research. “Frequently, many data centers announced with great excitement are either never built or are only partially constructed and developed progressively over a ten-year span.”

He further added that numerous data centers unveiled as part of this multitrillion-dollar initiative “will be specifically designed for or primarily intended to support AI workloads.”

Microsoft has pointed out that its Newport data center will not solely serve AI. Data centers form the core for AI systems like ChatGPT and Microsoft’s Copilot but also cater to everyday IT tasks many take for granted (like managing email traffic, storing company files, and supporting Zoom calls) as providers of “cloud” services, where companies lease servers rather than purchasing them outright.

“The infrastructure has multiple applications, making it highly versatile,” explained Alistair Speirs, general manager of Microsoft’s cloud operations.

However, various large-scale projects are committed entirely to AI. The US Stargate initiative is a $500 billion partnership among OpenAI, Oracle, and SoftBank, with plans to establish a network of AI data centers throughout the US; a British counterpart will be set up in North Tyneside, in the northeast of England. Microsoft is building its Fairwater facility in Wisconsin, which it describes as the world’s most powerful AI data center, and is backing a dedicated AI site in Loughton, Essex, while Elon Musk’s xAI is developing a colossal project in Memphis, Tennessee.


Construction of an estimated 10GW of new data center capacity worldwide, equivalent to around a third of the UK’s electricity demand, is expected to commence this year, according to the property group JLL. That figure represents maximum capacity, however; data centers generally operate at around 60% of their capacity.
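The comparison treats the 10GW as nameplate capacity measured against average UK demand. A rough sketch, assuming UK average electricity demand of about 30GW (an assumed round figure, not from the article):

```python
# Rough comparison of announced data center capacity with average UK electricity demand.
new_capacity_gw = 10
utilisation = 0.60         # typical operating level cited in the article
uk_avg_demand_gw = 30      # assumed round figure for average UK demand

print(f"Share of UK demand at nameplate:  {new_capacity_gw / uk_avg_demand_gw:.0%}")
print(f"Share at typical 60% utilisation: {new_capacity_gw * utilisation / uk_avg_demand_gw:.0%}")
```

On that basis the announced capacity corresponds to somewhere between a fifth and a third of UK demand, depending on how hard the facilities are actually run.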

JLL projects another 7GW will be completed this year.

The growth is swift: global data center capacity currently stands at 59GW, and Goldman Sachs forecasts it will double by the end of 2030. That expansion will also raise infrastructure costs, requiring an estimated $720 billion of grid investment to meet the additional energy demand, according to Goldman.

Mike O’Connell, a construction safety specialist from Newport, has returned as a consultant at the Newport facility. With a career spanning oil rigs, offshore wind farms, and data centers globally, he returned to his hometown, now a tech hub filled with data centers and semiconductor firms.

“My aim is to remain within my local community,” he stated. Mr. O’Connell’s teenage grandson is embarking on his career at the Newport site as an electrical apprentice. There is optimism that such a data center will offer generational employment opportunities for the area.

Investors and tech giants are committing trillions of dollars in investments with hopes for long-term returns.

Source: www.theguardian.com

Data Reveals That the First Half of 2025 Saw the Most Billion-Dollar Weather Disasters on Record

The initial months of this year marked the highest incidence of weather and climate disasters on record in the United States, as revealed by a recent analysis from the nonprofit Climate Central.

This information might never have reached the public. Earlier this spring, the Trump administration shut down the National Oceanic and Atmospheric Administration program that tracked weather and climate disasters causing more than $1 billion in damage. Adam Smith, the researcher who led the work, left NOAA in response to the decision.

Following his departure, Climate Central, a research organization dedicated to studying climate change impacts, employed Smith to revamp a database with records dating back to 1980.

Their latest analysis indicates that 14 individual weather events caused damages exceeding $1 billion in the first half of 2025. The wildfires in Los Angeles during January represented the most expensive natural disaster thus far this year, incurring costs over $61 billion, making it the most destructive wildfire recorded.

These findings illustrate that the financial toll from weather and climate disasters continues to escalate as extreme weather events become more frequent and severe, while populations migrate to areas increasingly vulnerable to wildfires and floods.

The report reflects a broader shift in which nonprofit organizations are taking over federal initiatives that traditionally monitored and measured the effects of climate change, as the Trump administration moves to scale back climate science funding. President Trump has called climate change a “con job,” and the administration has reduced funding for clean energy initiatives while stripping the Environmental Protection Agency of its ability to regulate the greenhouse gas emissions that drive global warming.

Jennifer Brady, a senior data analyst and research manager at Climate Central involved in the project, noted that the staff was profoundly affected by the discontinuation of NOAA’s extensive disaster database, prompting them to take action.

“This has always been one of our most valued datasets. It tells many stories: the story of climate change, and of where and how people live at risk,” Brady stated. “I am ready to take it home.”

Kim Doster, a spokesperson for NOAA, said she was pleased the billion-dollar disasters product had secured funding from sources other than taxpayers.

“NOAA remains committed to upholding ethical, unbiased research and reallocating resources to products that comply with executive directives aimed at restoring high standards in science,” Doster conveyed via email.

This database has been a source of political contention. House Republicans raised concerns with NOAA officials in 2024 regarding allegations of “deceptive data.” Recently, Senate Democrats proposed legislation to obligate NOAA to publish and update this dataset biannually, claiming it helps lawmakers in disaster funding decisions. However, this bill is currently stalled in committee and faces bleak prospects in the Republican-majority Senate.

Last month, a Trump administration official told NBC News that NOAA had terminated the database project because of uncertainty in accurately estimating disaster costs. The official said the project cost around $300,000 a year, required considerable staff effort, and yielded “pure information at best, with no clear objective.”

“This data is frequently utilized to bolster the claim that climate change enhances the frequency, severity, and expense of disasters, neglecting other factors like increased development in flood-prone and weather-sensitive areas as well as the cyclical variations in climate across different regions,” the official remarked at the time.

Despite this, Brady contends that the database has always acknowledged the significance of population shifts and climate change in exacerbating disaster costs.

She noted that Climate Central’s study employs the same methodologies and data sources as the NOAA database, including claims from the National Flood Insurance Program, NOAA storm event data, private insurance claims, and more.

This analysis captures the “direct costs” of disasters, such as damage to infrastructure, buildings, and crops, while omitting other considerations like loss of life, health-related disaster expenses, and economic losses to “natural capital” such as forests and wetlands. All data has been adjusted for inflation.

The evaluation of the first half of 2025 suggests this year is on track to be among the costliest on record, despite no hurricanes making landfall in the continental United States.

Last year, NOAA reported 27 separate billion-dollar disasters totaling around $182.7 billion in damage, the second-highest count in the report’s history, behind only 2023.

Climate Central is not alone in its efforts to reproduce the work previously undertaken by the federal government as the Trump administration cut back on climate science.

A collective of dismissed NOAA employees established climate.us, a nonprofit successor to climate.gov, the former federal site that offered data and analyses to help the general public grasp climate issues. The site went offline this summer.

Rebecca Lindsey edited climate.gov before her termination in February. She co-founded the nonprofit with other former NOAA colleagues and said they have raised about $160,000, with plans to host the climate.gov archives and begin publishing new articles on climate change in the coming weeks.

“We are preserving this information to ensure that when people seek answers about climate status, they can find them,” Lindsey asserted.

Both the American Geophysical Union and the American Meteorological Society have announced intentions to publish a special collection of studies focused on climate change, particularly after the Trump administration informed volunteer scientists working on the National Climate Assessment that their services were no longer required.

The administration dismissed employees from the U.S. Global Change Research Program, responsible for organizing the National Climate Assessment and coordinating climate research initiatives across various federal offices.

Walter Robinson, of the American Meteorological Society’s publications committee, said the National Climate Assessment was “effectively stopped” by the government’s decision, which he described as an “abandonment” of federal duty.

Though the new collection cannot replace comprehensive assessments, it aims to consolidate the latest scientific understanding on climate change impacts within the United States, he added. The research will be featured in numerous scientific journals on an ongoing basis.

“Individuals are stepping up,” Robinson remarked regarding his group’s endeavors. “As scientists, we do our utmost.”

Source: www.nbcnews.com

Leaked Age Verification IDs from Discord Data Breaches | Gaming News

Discord, the popular video game chat platform, has informed users about a data breach that has potentially compromised the personal information required for age verification.

Last week, the company reported that unauthorized individuals accessed one of Discord’s third-party customer service providers, impacting “a limited number of users” who interacted with customer service or the trust and safety teams.

Compromised data could encompass usernames, email addresses, billing details, the last four digits of credit card numbers, IP addresses, and messages exchanged with customer support.

According to Discord, the alleged attackers “gained access to a small number of government ID images (e.g., driver’s licenses, passports, etc.) from users who submitted appeals regarding their age verification.”

The affected users were informed as of last week.

“If any ID is accessed, it will be explicitly mentioned in the email you receive,” Discord stated.

The support system was reportedly exploited to retrieve user data in an attempt to extort a financial ransom from Discord, the company clarified.

Discord mentioned that the third-party provider has since revoked access to the ticketing system and has initiated an internal investigation in collaboration with law enforcement.

Users who received the notification indicated that the attack likely occurred on September 20th.

With over 200 million active users each month, Discord continues to grow.

Earlier this year, Discord began verifying user ages in the UK and Australia using facial age verification tools. The company stated that age verification face and ID images are “deleted immediately afterwards,” but according to their website, users can reach out to the trust and safety team for a manual review if verification fails.

Under Australia’s forthcoming social media ban for users under 16, which takes effect on December 10, the government has said platforms such as Discord will need to offer several ways of verifying users’ ages and to resolve incorrect age decisions quickly.

As part of the age verification scheme, a platform can request an ID document, though that is not the only method of age verification permitted under the policy.

Australia’s privacy regulator has confirmed that it was notified of the Discord breach.

Discord has been contacted for further comments.

Source: www.theguardian.com

UK Government to Renew Dispute with Apple Over Access to User Data | Data Protection

The UK government has renewed its dispute with Apple over access to customer data, issuing a fresh demand for a backdoor into the company’s cloud storage service.

Previously, the Home Office sought access to data tied to Apple’s Advanced Data Protection (ADP) services uploaded by users globally, leading to tensions with the White House.

On Wednesday, the Financial Times reported that the government has issued a new order, known as a technical capability notice (TCN), seeking access to encrypted cloud backups belonging to UK citizens.

A Home Office spokesperson said the department does not comment on operational matters, including “confirming or denying the presence of such notices.” The spokesperson added: “We will always take all necessary actions at the national level to ensure the safety of our British citizens.”


In February, Apple withdrew ADP for new UK users, advising that existing users would need to deactivate security features in the future. Messaging services such as iMessage and FaceTime continue to be end-to-end encrypted by default.

In August, Tulsi Gabbard, the US director of national intelligence, said the UK had backed down on its insistence on access to US customer data. Donald Trump had likened the demand to the sort of thing “you hear about with China.”

While Apple did not directly address the FT report, it reiterated its disappointment at being unable to offer ADP, an optional extra layer of protection, to UK customers, and said it would “never” build backdoors into its products.

“Apple remains committed to offering our users the highest level of security for their personal data, and we hope to be able to do so in the UK in the future. As we have said many times before, we have never built a backdoor or master key to any of our products or services, and we never will.”

Apple has challenged the original TCN at the Investigatory Powers Tribunal, arguing that the Home Office’s demand is unlawful. The Home Office sought to keep details of the case confidential, but a ruling in April allowed some information about Apple’s appeal to be made public for the first time.

However, the specifics of the TCN remain undisclosed, and recipients of such notices are prohibited from revealing their existence under the Investigatory Powers Act. The FT reports that the notice is “not limited to” data stored under ADP, suggesting the UK government is seeking access to standard iCloud services more broadly.

The ADP service uses end-to-end encryption, meaning that only account holders can decrypt their files, such as documents and photos; no one else, including Apple, holds the keys.
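As a rough illustration of the principle described above, the sketch below shows a generic client-side encryption flow in Python, in which a key derived from a secret known only to the account holder encrypts data before it is uploaded. This is a minimal sketch of the general technique, not Apple’s ADP implementation, and it assumes the third-party `cryptography` package is installed.

```python
# Generic client-side (end-to-end) encryption sketch; NOT Apple's ADP implementation.
# The key is derived on the device from a secret only the account holder knows,
# so the storage provider only ever sees ciphertext.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt


def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    """Derive a 32-byte symmetric key from the account holder's passphrase."""
    kdf = Scrypt(salt=salt, length=32, n=2**14, r=8, p=1)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))


salt = os.urandom(16)
key = derive_key(b"account-holder-secret", salt)              # never leaves the device
ciphertext = Fernet(key).encrypt(b"photo or document bytes")  # only this is uploaded

# The provider (and anyone who compels it) holds only `ciphertext` and `salt`;
# without the passphrase-derived key, the plaintext cannot be recovered.
assert Fernet(key).decrypt(ciphertext) == b"photo or document bytes"
```

In a scheme like this, a legally compelled provider can hand over only ciphertext, which is why campaigners argue that any mandated access would require weakening the encryption itself.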

Privacy International, the organization that initiated a legal challenge against the first TCN, remarked that this new order “may pose as significant a threat as the previous ones.” It noted that if Apple is compelled to compromise end-to-end encryption in the UK, it would create vulnerabilities affecting all users by undermining the entire system.

“Such vulnerabilities could be exploited by hostile states, criminals, and other malevolent entities across the globe,” the organization stated.

Source: www.theguardian.com

Kido Nursery Hackers Claim to Have Removed Stolen Data | Cybercrime

Cybercriminals who compromised the personal information and photos of thousands of nursery children have since removed the data following a public outcry.

The group responsible for the breach of the UK-based Kido nursery chain has removed the children’s details from its leak site.

Screenshots reviewed by the Guardian show that the children’s profiles posted after the breach are no longer visible. The Kido logo now appears with a “More” link beneath it, but cybersecurity sources say the link no longer works, suggesting the data has been taken down.

A spokesperson for Kido confirmed that the attacker had indeed deleted the previously exposed information.

The spokesman stated: “We are adhering to guidance from authorities regarding ransom payments to prevent incentivizing further criminal activities. We are collaborating closely with families, regulatory bodies, law enforcement, and cybersecurity experts to ensure our data is permanently removed.”

The BBC first reported on the data deletion and mentioned a hacker who expressed remorse, stating, “I’m sorry for hurting the child.”

Targeting children has drawn widespread condemnation, with cybersecurity experts labeling the breach as “crossing a line” and “testing ethical boundaries.” A parent of a child at Kido in London remarked that the hackers were “sinking to new lows.”

The Guardian has also found evidence in underground cybercrime forums of members of notorious gangs being advised by their peers not to target children.

On Wednesday, members of Nova, a faction that offers hacking services to other criminals, cautioned a persona named Radiant on an anonymous Russian forum, saying, “reputation matters, so do not target children.” Radiant responded, “We have not been allowed to cease any operations concerning them,” adding, “data of those under 19 who attended has been deleted.”

The leak site and forum posts were documented by analysts at the cybersecurity firm Sophos.

Hacking teams are acutely aware of the impact of negative publicity, which can lead to increased scrutiny from law enforcement and disrupt internal relationships within the hacking community.

Sophos researcher Rebecca Taylor noted: “Even criminals understand that there are lines they shouldn’t cross. We have discovered that stealing data from minors not only draws attention but also damages credibility.”

Taylor emphasized, “credibility is crucial” for groups that demand ransoms for stolen information. The BBC reported that Radiant had sought £600,000 in Bitcoin from Kido for the return of the data, but Kido refused to comply.

“The deletion of data was not an act of benevolence, but rather a move for damage control. This was an unusual instance where morality and self-interest briefly aligned,” Taylor remarked.

However, the revamped Radiant Leak site, a portal for such data, appears to be more user-friendly, featuring a search bar to locate companies targeted by the group and contact information through TOX, an encrypted messaging platform.

Radiant demonstrates proficient English in communication, but analysts suspect this group may not be Western-based. Most ransomware groups originate from former Soviet states. Analysts believe that Radiant may represent a new entity in the cybercrime landscape.

Before the data was deleted, one woman told the BBC she had received a threatening call from a hacker who said information about her child would be published online unless she pressured the nursery to meet the ransom demand. Kido operates 18 locations in London, along with nurseries in the US, India, and China.

Radiant boasted about having sensitive information on over 8,000 children and their families, including incident reports, protection records, and billing information. All Kido nursery locations in the UK reported being affected by the breach.

One cybercriminal told the BBC: “All child data has been removed. There is nothing left, and this should reassure parents.”

Source: www.theguardian.com

South Korea Raises Cyber Threat Level After Data Center Fire Stokes Hacking Fears

South Korea’s intelligence agency has elevated the national cyber threat level due to fears that hackers may exploit the chaos caused by recent fires in government data centers, which have disrupted crucial digital infrastructure nationwide.

The National Cyber Security Centre, run by the National Intelligence Service, raised its alert level by one step on Monday, citing fears that hackers could take advantage of vulnerabilities during recovery efforts.

The fire broke out on Friday evening at the National Information Resources Service in Daejeon, approximately 140 kilometers (87 miles) south of Seoul. The facility is one of three government data centers that handle critical digital infrastructure across the nation.

Workers had been moving a lithium-ion battery from a fifth-floor server room to the basement when the fire started. It spread to nearby batteries and servers, one worker sustained first-degree burns, and firefighters took 22 hours to extinguish the blaze.


By Saturday morning, officials had shut down 647 government systems to prevent further damage. Government email and intranet systems were offline, along with mobile identification services, postal banks, complaint portals, and major government websites.

Schools lost access to student records, and tax filing deadlines passed with returns unprocessed. Real estate transactions were delayed because digital documents could not be verified. The national crematorium reservation system was affected, and citizens without physical identification cards were initially turned away at hospitals and transport terminals.

As of 1 PM on Tuesday, 89 out of the 647 affected systems had been restored, including significant government portals, postal services, and identity verification systems.

Officials estimate that 96 of the affected systems have suffered complete failure, necessitating a recovery period of about four weeks as they are moved to a large backup facility. This disruption is expected to persist through Chuseok, the major public holiday in early October.

President Lee Jae Myung issued an apology on Sunday. During a crisis meeting, he expressed dismay at the lack of a backup operating system, stating, “It was a foreseeable incident, but there were no countermeasures. It’s not that the measures didn’t work; they simply didn’t exist.”

When questioned about the backup procedures, an official remarked that they were “driving without a map.”

The incident has also raised security concerns ahead of the Asia-Pacific Economic Cooperation (APEC) summit, which South Korea hosts in the southeastern city of Gyeongju at the end of October, with officials from the US, China, and elsewhere due to attend.

In October 2022, a fire involving a lithium-ion battery at a data center used by Kakao, the company behind the popular messaging app KakaoTalk, left millions without access to messaging, taxis, and digital payments, causing national chaos.

Following the Kakao incident, parliament passed legislation mandating redundant systems and intervals between batteries and other equipment for internet service providers and data center operators.

The left-leaning Hankyoreh newspaper questioned what last week’s failures indicated about “a nation that prides itself on being an information technology powerhouse.”

In a similar vein, the conservative Dong-a Ilbo remarked that referring to South Korea as a digital leader has become “embarrassing.”

Lawmakers from both the ruling party and the opposition have traded blame regarding the responsibility for the crisis. President’s Chief of Staff Kang Hoon-Sik directed authorities on Monday to focus on resolving the issue rather than criticizing the previous administration.

Source: www.theguardian.com

US Border Patrol Collects DNA from Thousands of American Citizens, Data Reveals

In March 2021, a 25-year-old American citizen arriving at Chicago’s Midway Airport was detained by US Border Patrol agents. According to a recent report, the individual underwent a cheek swab for DNA collection, was later referred to state authorities, and had their DNA entered into the FBI’s genetic database, all without any criminal charges being filed.

This 25-year-old is among roughly 2,000 US citizens whose DNA was gathered and forwarded to the FBI by the Department of Homeland Security between 2020 and 2024, according to the Center on Privacy & Technology at Georgetown Law. The report notes that US citizens as young as 14 had their DNA collected by Customs and Border Protection (CBP) officials.

“We have witnessed a significant breach of privacy,” said Stevie Glaberson, director of research and advocacy at the Georgetown center. “We contend that the absence of oversight of DHS’s collection powers renders this program unconstitutional and a violation of the Fourth Amendment.”

When immigration officials collect DNA and share it with the FBI, it is stored in the Combined DNA Index System (Codis), which law enforcement agencies across the country use to identify crime suspects. A 2024 report from the Georgetown center revealed that CBP had also collected and shared DNA from immigrant children, with initial estimates suggesting that approximately 133,000 teens and children have had their sensitive genetic information uploaded to this federal criminal database for permanent retention.

The recent CBP records specify the number of US citizens whose genetic samples were collected at various ports of entry, including major airports, along with their ages and any charges associated with them. Like the 25-year-old, around 40 US citizens, including six minors, had their DNA collected and forwarded to the FBI.

Under current regulations, CBP is authorized to gather DNA from all individuals, regardless of citizenship status or criminal background.

However, the law does not permit Border Patrol agents to collect DNA samples from US citizens merely for being detained. Yet, recent disclosures indicate that CBP lacks a system to verify whether there is a legal basis for collecting personal DNA.

In some atypical instances, US citizens had DNA collected for minor infractions like “failure to declare” items. In at least two documented cases, citizens were subjected to DNA swabbing, with CBP agents merely noting the accusation as “immigration officer testing.”

“This is CBP’s own data,” Glaberson pointed out. “What the records show is alarming: CBP agents are pulling US citizens aside and swabbing their mouths without justification.”

No formal federal charges were filed in approximately 865 of the roughly 2,000 cases of US citizens whose DNA was collected by CBP, indicating, according to Glaberson, that no legal case was ever presented to an independent authority, such as a judge.


“Many of these individuals do not go before a judge to assess the legality of their detention and arrest,” she remarked.

DNA records can disclose highly sensitive information, such as genetic relationships and ancestry, regardless of an individual’s citizenship status. Inclusion in the database, which is used for criminal investigations, could subject individuals to scrutiny they would not otherwise face, Glaberson warned.

“If you believe your citizenship guards you against authoritarian measures, this situation is clear evidence that it does not,” she concluded.

Source: www.theguardian.com

Clues to Exotic Dark Matter Particles Could Be Found in LHC Data


ATLAS Detector of the Large Hadron Collider

Xenotar/Getty Images

The theoretical particles known as axions have attracted the attention of physicists for decades, as they are leading candidates for dark matter. Recent research suggests we might not need new experiments to discover these exotic particles: evidence could already be buried in data from existing particle collider experiments.

Particle colliders like the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland, hunt for new particles by smashing protons and ions together and analyzing the resulting debris. Now, Gustavo Gil da Silveira and his colleagues at CERN are exploring another avenue: can we detect a new particle being emitted by a proton or ion as it is accelerated? Their findings suggest this may indeed be possible.

The axion was first theorized in the 1970s as part of a proposed solution to a major outstanding problem in physics. The search for experimental evidence of axions has so far come up empty, but it has raised the possibility that other, axion-like particles might exist. Because of their extremely low mass and the way they couple to photons, such particles could be produced in interactions involving light at the LHC.

Such interactions can occur when protons or ions are accelerated to enormous energies. As the particles pass close to one another, they emit radiation in the form of photons, which can then collide. The researchers modeled this scenario with axion-like particles in place of some of the photons. Their results indicate that accelerated protons are more likely than accelerated ions to produce an axion-like particle together with a photon, so the team identified collisions between protons and lead ions as the best place to look for signals of axions influencing photons. Such proton-lead collisions were carried out at the LHC in 2016, and the researchers suggest that data from those runs, largely overlooked until now, could contain vital hints of new axion-like particles.

Lucian Harland-Lang at University College London says this approach offers an intriguing new route to potential undiscovered particles, though he cautions that there are challenges. “Such collision events are rare, and we must be careful to distinguish our findings from background processes that can mimic the signals we seek,” he notes.

Access to older LHC data poses challenges due to updates in software, according to Da Silveira. However, he expresses optimism regarding future experiments at the LHC. “We will be able to adjust the detector to capture this specific signal,” he states.

Identifying a particle signal analogous to an axion does not equate to discovering an actual axion, thus leaving one of the major unresolved questions in physics unanswered. Nonetheless, it expands our understanding of particle physics, prompting inquiries into how new particles might interact with known counterparts and whether they might help explain the enigmatic dark matter that permeates the universe.

Journal reference: Physical Review Letters, in press

Topics:

  • Large Hadron Collider
  • Particle Physics

Source: www.newscientist.com

Google’s Massive New Essex Data Centre Releases 570,000 Tonnes of CO2 Annually

The new Google data centre in Essex is projected to emit more than 570,000 tonnes of carbon dioxide equivalent annually.

Spanning 52 hectares (128 acres), the “hyperscale” data centre in Thurrock would house large-scale computing and AI infrastructure, pending planning approval.

The proposal was submitted by a subsidiary of Google’s parent company, Alphabet. The emissions figures emerged ahead of a coordinated push by Donald Trump’s White House and Downing Street to boost the UK’s AI capacity, with a multibillion-dollar investment deal involving major Silicon Valley tech firms expected to be unveiled during the US president’s state visit, which begins on Tuesday.


Keir Starmer’s government forecasts that AI will require 13 times the current processing power by 2035, prompting a rush of data centre construction to meet demand, in the expectation that the technology will boost the UK’s economic productivity. A deal involving Nvidia, the largest AI chip manufacturer, and OpenAI, the creator of the ChatGPT assistant, is anticipated.

However, campaigners argue that the influx of new large computing facilities will raise UK greenhouse gas emissions and strain limited electricity and water supplies.

If approved, the Thurrock facility will comprise up to four data centres on “grey belt” land, some of which has been used for speedway and stock car racing. Once operational, it would add a net 568,727 tonnes of greenhouse gas emissions (carbon dioxide equivalent) a year, according to planning documents seen by the Guardian.

That is roughly equivalent to the emissions of 500 flights a week from Heathrow to Malaga, according to the carbon calculator of the United Nations’ International Civil Aviation Organization. Google’s planning application contends this will not significantly affect the UK carbon budget, a view challenged by campaigners.
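As a back-of-envelope check (not part of the planning documents), the comparison implies a per-flight figure that can be derived from the two numbers quoted above:

```python
# Implied per-flight emissions from the article's own figures; a rough check only.
annual_emissions_t = 568_727        # tonnes CO2e per year, from the planning documents
flights_per_year = 500 * 52         # 500 Heathrow-Malaga flights a week

per_flight_t = annual_emissions_t / flights_per_year
print(f"Implied emissions per flight: {per_flight_t:.1f} tonnes CO2e")  # roughly 21.9
```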

A spokesperson for Foxglove, a campaign group that pushes for fairer technology, said: “The facility planned by Google in Essex will generate emissions significantly higher than those produced by an international airport.” The group said the plan reflects a broader trend of “hyperscale” data centres being imposed across the UK, prioritizing profit over environmental health.

“The Starmer government must resist the influence of big tech and advocate for the UK populace,” they continued. “Otherwise, we will all bear the consequences of expensive energy bills, dwindling water supplies, and the effects of a warming planet.”

Currently, data centers account for approximately 2.5% of the UK’s electricity consumption, with demand predicted to quadruple by 2030, as noted by the Commons Library.


The UK government asserts that data centers will not significantly affect the UK carbon budget due to an ambitious objective to decarbonize the electricity grid. However, there are concerns that without significant investment in new data centers, the UK risks falling behind international competitors like France, jeopardizing its ambitions in national security, economic growth, and AI.

Other noteworthy data centre initiatives include a £10 billion project at a former coal-fired power plant site in Blyth, Northumberland, which received planning approval in March and sits at the core of a major contract involving Nvidia and OpenAI. Over the weekend, there were also reports that Google was in discussions about a large data centre in Teesside.

Global consultancy Bain & Company reported on Monday that AI and data centers could contribute to 2% of global emissions and 17% of industrial emissions by 2035, with the most significant impact occurring in nations where fossil fuels dominate energy generation.

Google declined to comment on the planning application for the Thurrock site and, regarding the reported Teesside discussions, said it does “not comment on rumours or speculation.”

Source: www.theguardian.com

Unexpectedly Valuable Mathematical Patterns in Real-World Data

“When you search for stock market prices, you may see patterns…”

Muhla1/Getty Images

Flip through the front page of a newspaper and you are greeted by a myriad of numbers: figures about populations, lengths, areas and more. If you were to extract these figures and compile them into a list, it might seem like a random assortment.

However, these figures are not as arbitrary as they appear. The leading digit of many real-world numbers, such as total revenues or building heights, is most often a 1. If the digits were truly random, each would have an equal chance of coming first, but in practice the first digit is a 1 about 30 per cent of the time. The digit 9, by contrast, leads only about 5 per cent of the time, with the digits in between falling smoothly between those extremes.

This phenomenon is known as Benford’s law, which describes the expected distribution of first digits in datasets of a certain type, especially those spanning a wide and unbounded range of values. Quantities like human height (confined to a narrow spectrum) or calendar dates (which also have fixed limits) don’t follow the law, but many others do.

Bank balances, house numbers and stock prices (as pictured) are good examples: such numbers span a wide range of lengths. A street with only a handful of houses has single-digit numbers, while a long road in a larger town can run into the hundreds.

Picture a street with nine houses: the leading digits are split evenly among the nine options. On a street with 19 houses, by contrast, more than half of the numbers begin with 1. As the count grows, the pattern keeps recurring. With 100 houses the distribution across first digits is again fairly even, yet with 200 houses more than half once more start with 1.

Because real-world datasets mix numbers drawn from ranges of all these different lengths, the average chance of a leading 1 sits somewhere between these extremes. The same reasoning applies to the other digits, producing the overall frequency distribution seen in large datasets.
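To make the house-number reasoning concrete, here is a minimal Python sketch (not from the article) that counts leading digits for streets of different lengths and compares the share of leading 1s with the standard logarithmic form of Benford’s law, P(d) = log10(1 + 1/d).

```python
# Leading-digit frequencies for house numbers 1..N, compared with Benford's law.
import math
from collections import Counter

def leading_digit_freq(numbers):
    """Return the fraction of values whose first digit is each of 1-9."""
    values = list(numbers)
    counts = Counter(str(n)[0] for n in values)
    return {d: counts.get(str(d), 0) / len(values) for d in range(1, 10)}

# Streets of 9, 19, 100 and 200 houses, as in the examples above.
for length in (9, 19, 100, 200):
    share_of_ones = leading_digit_freq(range(1, length + 1))[1]
    print(f"{length:>3} houses: {share_of_ones:.0%} of numbers start with 1")

# Benford's law predicts P(d) = log10(1 + 1/d) for a leading digit d.
benford_one = math.log10(1 + 1 / 1)
print(f"Benford's prediction for a leading 1: {benford_one:.0%}")  # about 30%
```

Averaged over many such ranges of different lengths, the share of leading 1s settles near the 30 per cent that the law predicts.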

This property is particularly useful for spotting fabricated data. A company’s genuine sales figures are expected to show a Benford-like distribution of leading digits, but when someone invents numbers, the leading digits typically fail to follow the curve. This is one of many tools forensic accountants use to root out dubious activity.

The next time you examine your bank statement or compare river lengths, take note of how often those numbers start with 1.

Katie Steckles is a mathematician, lecturer, YouTuber and author based in Manchester, UK. She also contributes to BrainTwister, New Scientist’s puzzle column. Follow her @stecks

For additional projects, please visit newscientist.com/maker


Source: www.newscientist.com

Government Under Scrutiny Following Examination of 11 Significant UK Data Breaches | Data Protection

The government is under pressure to explain why it has not yet acted on all the recommendations of a 2023 review of data breaches, including incidents affecting Afghans who worked alongside the British military, victims of child sexual abuse, and 6,000 disability benefit claimants.

Ministers finally published the information security review on Thursday. It was commissioned after a 2023 leak of personal data belonging to roughly 10,000 officers and staff of the Police Service of Northern Ireland.

The Cabinet Office’s review of 11 public sector data breaches, affecting bodies including HMRC, the Metropolitan police, the benefits system and the Ministry of Defence, identified three overarching themes:

  • Insufficient control over incidental downloads and the aggregation of sensitive data.

  • Disclosure of sensitive information through “wrong recipient” emails and improper use of BCC.

  • Hidden personal data left in spreadsheets prepared for release.

The review was published 22 months after it was completed, and a month after the existence of a breached database containing details of 18,700 Afghans became public. Its release was welcomed by Chi Onwurah, chair of the Commons science, innovation and technology committee, though she said questions remained.

The Afghan data breach instilled fear among those worried about their safety under the Taliban, and led the UK government to promise relocation to thousands of Afghans under a confidential scheme.

The government says it has acted on 12 of the 14 recommendations aimed at improving data security. Onwurah said: “There are still questions that the government must address regarding the review. Why have only 12 out of the 14 recommendations been implemented?”

“For governments to leverage technology to boost the economy and fulfill their aspirations of public sector transformation, they must earn their citizens’ trust in safeguarding their data.”

The information commissioner, John Edwards, urged the government to “encourage the broader public sector to expedite the organization of its practices to secure Whitehall.”

He emphasized to Cabinet Secretary Pat McFadden on Thursday, “It is imperative that the government fully actualizes the recommendations from the Information Security Review.”

It remains unclear which of the 14 recommendations are still to be implemented. The full list includes working with the National Cyber Security Centre to disseminate existing guidance on the technical management of products and services handling “official” information, clearer marking of “official” information, a “behavioural impact communication campaign” to tackle persistent failings in information handling, and a “review of sanctions related to negligence.”

McFadden and Peter Kyle, the science, innovation and technology secretary, wrote to Onwurah on Thursday.

A spokesperson for the government stated: “This review concluded in 2023 under the previous administration.

“Safeguarding national security, particularly government data security, remains one of our top priorities. Since taking office, we have introduced plans to enhance inter-sector security guidance, update enforcement training for civil servants, and improve the digital infrastructure throughout the public sector, aligning with the shift towards modern digital governance.”

Source: www.theguardian.com

UK Relents on Demand for Access to Apple User Data, Says US Spy Chief

The UK government has abandoned its demand that Apple provide “backdoor” access to the data of US customers, according to Donald Trump’s intelligence chief, Tulsi Gabbard.

Gabbard made the assertion on X, following months of tension between Apple, the UK government, and the US presidency. Trump had accused the UK of acting like China and warned the prime minister, Keir Starmer: “You can’t do this.”

Neither the Home Office nor Apple has commented on the supposed agreement. Gabbard said it means the UK will not require Apple to provide access to the protected, encrypted data of American citizens, avoiding a backdoor that would have encroached on civil liberties.

The international dispute intensified when the Home Office issued a “technical capability notice” to Apple under its statutory powers. Apple responded with a legal challenge; the Home Office sought to keep the case confidential, but a judge later ruled that basic details could be made public.

U.S. Vice President JD Vance remarked, “American citizens don’t want to be spied on.” He added that “we’re creating backdoors in our own tech networks that our adversaries are already exploiting,” labeling the situation as “crazy.”

Civil liberties advocates cautioned that backdoors could pose risks to politicians, activists, and minority groups.

In February, Apple withdrew the option for new UK customers to enable its advanced data protection feature, expressing its “deep disappointment” and declaring it would never create a backdoor for its products. As a result, many UK users lack end-to-end encryption for services such as iCloud drives, photos, notes, and reminders, leaving their data more exposed in the event of a breach.

Gabbard noted, “In recent months, we have collaborated closely with our UK partners and President Trump to safeguard private data belonging to Americans and uphold constitutional rights and civil liberties.”

It is unclear whether the notice requiring data access will be withdrawn entirely or merely modified. In theory it could be restricted so that access applies only to UK citizens’ data, but experts caution this may be technically unfeasible, and there remains a risk that foreign governments could exploit any backdoor that is built.

It remains unclear whether Apple will restore its highest level of data protection for new UK customers.

The Home Office declined to confirm Gabbard’s statements, saying it “does not comment on operational matters, including whether such notices exist.” A spokesperson pointed to the UK’s long-standing joint security and intelligence relationship with the United States, aimed at tackling the most serious threats such as terrorism and child sexual abuse, which they said advanced technologies can exacerbate.

“These agreements have consistently included safeguards to uphold privacy and sovereignty. For example, Data Access Agreements incorporate crucial protections to prevent the UK and the US from targeting each other’s citizens’ data. We are committed to enhancing these frameworks while maintaining a robust security structure that can effectively combat terrorism and ensure safety in the UK,” they added.

The UK Data Access Agreement permits UK agencies to directly request telecommunications content from service providers, including U.S. social media platforms and messaging services, but solely for the investigation, prevention, detection, and prosecution of serious crimes.

Apple was contacted for a statement.

Source: www.theguardian.com

UK Expected to Back Down on Demand for Backdoor Access to Encrypted Data of Apple Users

Reports suggest that pressure from Washington is pushing the UK government to retreat from its insistence that Apple give UK law enforcement backdoor access to encrypted customer data.

In January, the UK’s Home Office formally demanded that Apple grant law enforcement access to the heavily encrypted data it stores on behalf of its customers. The US company resisted and subsequently withdrew its advanced data protection service from the UK, asserting that privacy is one of its “core values.”

According to the Financial Times, sources within the UK government believe that pressure from Washington, including from US Vice President JD Vance, is creating significant challenges for the Home Office.

Vance has previously criticized the concept of “creating a backdoor in our own technology network,” labeling it “crazy” because such vulnerabilities could be exploited by adversaries, even if intended for domestic security.

The FT, citing Whitehall sources, reported that “the Home Office will essentially have to back down.”




JD Vance criticizes the creation of backdoors to access encrypted data. Photo: Saul Loeb/AFP/Getty Images

The Home Office did not immediately comment.

The Home Office issued a “technical capability notice” to Apple under the Investigatory Powers Act. In February, Apple responded by withdrawing its advanced data protection (ADP) service from the UK, stating: “We’ve never built a backdoor or a master key to either our products or services, and we never will.”

ADP is available globally, providing end-to-end encryption for iCloud drives, backups, notes, wallet passes, reminders, and other services.

Apple has launched a legal challenge at the Investigatory Powers Tribunal over the Home Office’s authority to demand backdoor access. Although the Home Office requested confidentiality, the judge ordered that details of the case be disclosed.


The government aims to position the UK as an attractive destination for investment from US tech companies.

Some ministers contend that encryption technology hinders law enforcement’s ability to address crimes, such as child exploitation. However, there are concerns that demanding backdoors could jeopardize a technological agreement with the US, which is a critical aspect of the trade strategy.

Source: www.theguardian.com

Google Inc. Secures $3 Billion US Hydroelectric Contract to Power Energy-Intensive Data Centers

Google has committed to securing up to 3GW of hydropower in what is being termed the largest clean power agreement by a corporation, as the tech giant seeks to expand its energy-intensive data centers, the company announced on Tuesday.

The agreement with Brookfield Asset Management includes a 20-year power purchase deal worth $3 billion for electricity generated from two hydroelectric plants located in Pennsylvania.

Additionally, the tech giant will invest $25 billion into data centers across Pennsylvania and neighboring states over the next two years, according to Semafor’s report on Tuesday.

The technology sector is increasingly seeking vast amounts of clean energy to support the power demands of data centers essential for artificial intelligence and cloud computing.

Ruth Porat, president and chief investment officer of Google’s parent company Alphabet, spoke about the initiative at the AI Summit in Pittsburgh, where Donald Trump announced a $70 billion investment in AI and energy.

Amanda Peterson Corio, head of Datacenter Energy at Google, commented on the collaboration with Brookfield, stating, “This partnership is a crucial step towards ensuring a clean energy supply in the PJM region where we operate.”

Almost a year ago, Google signed several first-of-their-kind power purchase agreements involving carbon-free geothermal energy and advanced nuclear power. The company is also working with PJM Interconnection, the largest power grid operator in the US, to use AI to speed the connection of new power sources to the grid.


Google has entered into an initial framework agreement with Brookfield, whose Brookfield Renewable Partners arm develops and operates renewable energy facilities. The two hydroelectric plants in Pennsylvania will be upgraded and refurbished as part of the deal, and Google intends to extend the arrangement beyond these facilities to other parts of the mid-Atlantic and midwest.

Source: www.theguardian.com

Louis Vuitton Reports Cyberattack Compromising UK Customer Data | Cybercrime

Louis Vuitton has announced that data from some of its UK customers has been compromised, making it the latest retailer to fall victim to cyber hackers.

The prestigious brand, part of the French luxury conglomerate LVMH, reported that an unauthorized third party gained access to the UK operations system, retrieving personal information such as names, contact information, and purchase histories.

Last week, Louis Vuitton informed customers that its South Korean business was experiencing similar cyber incidents and reassured them that financial data, including bank information, remained secure.

“Currently, there is no evidence of misuse of your data; however, you may encounter phishing attempts, fraud attempts, or unauthorized use of your information,” the email stated.

The company said it has reported the breach to the relevant authorities, including the Information Commissioner’s Office.

As reported by Bloomberg, the hack occurred on July 2nd and marked the third breach of the LVMH system within the past three months.

In addition to the incidents involving Louis Vuitton, LVMH’s second-largest fashion brand, Christian Dior Couture, disclosed in May that hackers also had access to customer data.

On Thursday, four individuals were arrested in connection with a cyberattack involving Marks & Spencer, The Co-op, and Harrods.

Those arrested included a 17-year-old British male from the West Midlands, a 19-year-old Latvian male also from the West Midlands, a 19-year-old British male from London, and a 20-year-old British female from Staffordshire.

M&S was the initial target of this wave of attacks back in April, which led to the online store’s closure for nearly seven weeks. The Co-op was similarly attacked that month, forcing a shutdown of several IT systems.

Harrods said it was targeted on May 1 and restricted internet access across its sites after attempts to gain unauthorized entry to its systems.

Days after the arrests, the M&S chairman, Archie Norman, said that two other large UK companies had suffered unreported cyberattacks in recent months.

Louis Vuitton has been contacted for further comments.

Source: www.theguardian.com

Palantir Claims UK Physicians Prioritize “Ideology Over Patients’ Interests” in NHS Data Legislation

Palantir, a U.S. data firm collaborating with the Israeli Defense Department, criticized British doctors for prioritizing “ideology over patient interests” following backlash against its contract to manage NHS data.

Louis Mosley, Palantir’s executive vice-president, was responding to the British Medical Association, which has described the £330 million contract to build a unified platform for NHS data, covering everything from patient records to bed availability, as a potential threat to public trust in how the NHS handles data.

In a formal resolution, the association raised concerns over the lack of clarity about how Palantir, a company co-founded by the Trump donor Peter Thiel, would process sensitive data. It highlighted the firm’s “study on discriminatory policing software in the U.S.” and its “close ties with the U.S. government, which often overlooks international law.”

However, Mosley dismissed these critiques during his testimony to lawmakers on the Commons Science and Technology Committee on Tuesday. Palantir has also secured contracts for processing large-scale data for the Ministry of Defense, police, and local governments.


Thiel, a libertarian who named the company after the “seeing stones” in The Lord of the Rings, has previously remarked that British people’s affection for the NHS reflects “Stockholm syndrome.” Mosley said Thiel had not been speaking on behalf of Palantir.

Palantir also develops AI-driven military targeting systems and software that consolidates and analyzes data across multiple systems, including healthcare.

“It’s incorrect to accuse us of lacking transparency or that we operate in secrecy,” claimed Mosley. “I believe the BMA has chosen ideology over the interests of patients. Our software aims to enhance patient care by streamlining treatment, making it more effective, and ultimately improving the efficiency of the healthcare system.”

In 2023, the government awarded Palantir the contract to build a new NHS “federated data platform,” though some local NHS trusts have reportedly raised concerns that the system is inferior to existing technologies and could reduce functionality. Palantir is also among the tech companies that, as the Guardian reported last week, recently met Shabana Mahmood, then the justice secretary, to discuss responses to the prison and probation crisis, including the use of robotics and tracking devices.

During the session, the committee chair, Chi Onwurah, questioned whether it was appropriate for a company that works with the Israel Defense Forces on military applications in Gaza to also be involved in the NHS.

Mosley did not disclose operational specifics of Palantir’s work with the Israeli authorities. The company’s offerings include a system it describes as “supporting soldiers with AI-driven kill chains and responsibly integrating target identification.”

Onwurah remarked on the necessity for cultural change within the NHS to foster acceptance of new data systems, posing the question to Mosley: “What about a unified patient record in the future?”

“Trust should depend more on our capabilities than anything else,” Mosley responded. “Are we delivering on our promises? Are we improving patient experiences by making them quicker and more efficient? If so, we should be trusted.”

Liberal Democrat Martin Wrigley expressed serious concerns about the interoperability of the data systems provided by Palantir for both health and defense, while Conservative MP Kit Malthouse inquired about the military’s potential use of Palantir’s capacity to process large datasets to target individuals based on specific characteristics. Mosley reassured: “Our software enables that type of functionality and provides extensive governance and control to organizations managing those risks.”

Malthouse remarked, “It sounds like a Savior.”

The hearing also revealed that Palantir continues to engage Global Counsel, a lobbying firm co-founded by Peter Mandelson, who served as the UK’s ambassador to the US. Mosley denied that the prime minister Keir Starmer’s visit to Palantir’s Washington DC office had been arranged other than “through appropriate channels,” and noted that Mandelson had resigned as a global adviser to the firm “in early 2025,” according to the consultancy’s website.

Source: www.theguardian.com

Significant Reductions in Hurricane Data May Leave Forecasters in the Dark

Forecasters are about to lose a vital source of satellite data just months before the peak of the Atlantic hurricane season, as the Department of Defense prepares to shut down a critical data stream, citing cybersecurity concerns.

The data comes from microwave sensors on three aging polar-orbiting satellites that serve both military and civilian users. The sensors are crucial for hurricane forecasting because they can peer through cloud layers to a storm’s core, providing insights even at night, when visible-light imagery is unavailable.

Experts are concerned that this loss of data will hinder forecasters during a period when the National Weather Service is deploying fewer weather balloons due to budget cuts and insufficient meteorological staff. The absence of this data affects meteorologists’ ability to assess storm threats effectively and prepare emergency managers accordingly. Microwave data offers some of the earliest signs that wind speeds are intensifying in storms.

“It’s a tool that enables deeper insight. Losing it will significantly impair hurricane forecasts. It can detect the formation of eye walls in tropical storms, indicating whether these storms are intensifying,” an expert commented.

Researchers suggest that as ocean temperatures rise due to human-induced climate change, rapid intensification in tropical storms may become more common.

The three satellites are operated under the Defense Meteorological Satellite Program, a joint effort of NOAA and the Department of Defense.

While hurricane experts expressed concern about the loss of this tool, NOAA’s communications director, Kim Doster, played down the decision’s potential impact on National Weather Service hurricane forecasting.

In a message, Doster described the military’s microwave data as “one dataset in a robust suite of hurricane prediction and modeling tools” within the NWS.

According to Doster, the forecasting models also draw on data from geostationary satellites about 22,300 miles above Earth, which remain fixed over the same point as the planet rotates.

They also incorporate measurements from Hurricane Hunter planes, buoys, weather balloons, land radars, and additional polar orbit satellites, including NOAA’s joint polar satellite system.

A US Space Force representative confirmed that the satellites and their instruments remain operational and that data will continue to be sent directly to satellite readout terminals across the DOD. However, the Navy’s Fleet Numerical Meteorology and Oceanography Center has opted to stop processing and sharing the data publicly, officials said.

Visible and infrared satellite images show Hurricane Erick, which intensified after becoming a Category 2 storm on June 18. CIMSS

The Navy did not respond promptly to requests for comments.

Earlier this week, a Navy division informed researchers that it would halt data processing and sharing by June 30. Some researchers received notifications from the Navy’s Fleet Numerical Meteorology and Oceanography Center explaining that the processing relied on outdated and insecure operating systems.

“We cannot upgrade our systems; it raises cybersecurity risks and jeopardizes our DOD network,” stated an email reviewed by NBC News.

This decision could cost forecasters up to half of the available microwave data, according to Brian McNoldy, a senior researcher at the University of Miami who studies hurricanes.

Additionally, this microwave data is crucial for snow and ice researchers tracking polar sea ice levels, which helps understand long-term climate patterns. Sea ice, formed from frozen seawater, expands in winter and melts in summer. Tracking sea ice is essential as it reflects sunlight back into space, cooling the planet. This metric is vital to monitor over time, especially since summer Arctic sea ice levels are showing declining trends due to global warming.

Walt Meier, a senior research scientist at the National Snow and Ice Data Center, mentioned that his program learned about the Navy’s decision earlier this week.

Meier noted the satellites and sensors have been operational for approximately 16 years. While researchers anticipated their eventual failure, they did not expect the military to abruptly discontinue data sharing with little notice.

Meier said the National Snow and Ice Data Center has depended on military satellites for sea ice coverage data since 1987 but will adapt by using similar microwave data from AMSR2, an instrument on a Japanese satellite.

“Integrating that data into our system could take several weeks,” said Meier. “While it may not undermine the integrity of sea ice climate records, it will pose additional challenges.”

The polar-orbiting satellites of the Defense Meteorological Satellite Program offer only intermittent coverage of regions prone to hurricanes.

These satellites generally circle the Earth in a north-south path every 90 to 100 minutes at relatively low altitudes, according to Meier. The microwave sensors scan narrow bands of the Earth, estimated to be around 1,500 miles wide.

As the Earth rotates, these polar orbit satellites capture images that can help researchers analyze storm structure and potential strength when they are within range.

“Often, great passes provide extensive data beyond just the hurricane,” said McNoldy, who added that the loss will reduce how often any given storm and its surroundings get scanned by microwave sensors.

Hurricane modeler Andy Hazelton, an associate scientist at the University of Miami Ocean and Atmospheric Research Institute, mentioned that microwave data is still utilized in some hurricane models and by forecasters with access to real-time visualizations.

Hazelton highlighted that forecasters always look for visual cues from microwave data, which typically provides early indications of rapidly strengthening storms.

The National Hurricane Center defines rapid intensification as a 35 mph or greater increase in sustained winds in tropical storms within a 24-hour period. The loss of microwave data is particularly concerning as scientists have observed a rise in rapid intensification linked to climate change due to warmer seawater.
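As a simple illustration of that criterion (a hypothetical sketch, not an operational forecasting tool), the check below scans a series of wind observations for any 24-hour increase of 35 mph or more:

```python
# Hypothetical check of the rapid-intensification criterion quoted above:
# a rise in maximum sustained winds of at least 35 mph within 24 hours.
def rapid_intensification(winds_mph, hours_per_obs=6,
                          threshold_mph=35.0, window_hours=24):
    """Return True if any window_hours span shows a wind increase >= threshold_mph."""
    steps = window_hours // hours_per_obs
    return any(winds_mph[i + steps] - winds_mph[i] >= threshold_mph
               for i in range(len(winds_mph) - steps))

# Made-up 6-hourly observations: winds climb from 60 to 105 mph within a day.
print(rapid_intensification([50, 60, 70, 85, 100, 105]))  # True
```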

A 2023 scientific report found that tropical cyclones in the Atlantic were about 29% more likely to undergo rapid intensification from 2001 to 2020 than during the period from 1971 to 1990. Hurricane Milton, for instance, strengthened into a Category 5 hurricane just 36 hours after being classified as a tropical storm, with part of that intensification occurring overnight, when other satellite instruments provide less information.

Hurricane Milton, then a Category 5 storm, photographed from the International Space Station on October 8 in the Gulf of Mexico off the Yucatán Peninsula. (Image: NASA/Getty Images)

This trend poses significant risks, particularly when storms such as Hurricane Idalia intensify just before reaching the coast.

“We’ve definitely observed numerous instances of rapid intensification right before landfall recently, something we cannot afford to overlook,” McNoldy remarked.

Brian LaMarre, a forecaster at the National Weather Service office in Tampa Bay, noted that this data is also crucial for predicting flood impacts when hurricanes make landfall.

“These scans are key for predicting the areas of heaviest rainfall and the rainfall rates,” said LaMarre. “This data is vital for public safety.”

The Atlantic hurricane season runs from June 1 to November 30, peaking in late summer and early fall. NOAA forecasters anticipate a busier-than-average season in 2025, with six to 10 hurricanes expected.

Source: www.nbcnews.com

23andMe Fined £2.3 Million by UK Regulator Over 2023 Data Breach

The genetic testing firm 23andMe has been fined more than £2.3 million following a significant cyberattack in 2023 that compromised the personal information of more than 150,000 UK residents.

Sensitive data, including family tree details, health reports, names, and postal codes, was among the information stolen from the California-based company. The UK Information Commissioner’s Office confirmed the breach after employees discovered that the stolen data was being offered for sale on Reddit.

Information Commissioner John Edwards described the incidents during the summer of 2023 as “a deeply damaging violation.” The breach affecting UK residents was just a fraction of a larger security incident that compromised data from about 7 million people in total.

23andMe offers DNA testing for £89 via a saliva-based kit, allowing users to trace their ancestry by ethnicity and geographical origin. The company sought bankruptcy protection in the US in March, and many customers have since asked for their DNA data to be removed from its records following the hack.

The penalty coincided with a $355 million acquisition bid for the company led by former chief executive Anne Wojcicki.

Edwards noted that the data breaches included sensitive personal information, family histories, and even health conditions of numerous individuals in the UK.

“As one affected individual remarked, once this information is out there, it cannot be altered or replaced like a password or credit card number,” he added.

The UK data protection regulator found that 23andMe failed to take basic steps to safeguard user information, revealing inadequacies in its security systems, including a failure to require additional user authentication measures.

Hackers exploited a widespread weakness: the reuse of passwords that had been compromised in unrelated data breaches. Using automated tools, they tried those credentials against 23andMe accounts in a technique known as “credential stuffing.”

Edwards remarked, “The warning signs were evident, and the company’s response was sluggish. This has made individuals’ most sensitive data vulnerable to exploitation and harm.”


A company spokesperson said 23andMe has taken a range of measures to strengthen the security of individual accounts and data, and that it has made a firm commitment to improving the protection of customer data and privacy in connection with the proposed acquisition of 23andMe by the TTAM Research Institute, a nonprofit associated with Wojcicki.

The fine is one of several substantial penalties the ICO has imposed on organizations in recent years for failing to secure data against hacking and ransomware. In 2022, it fined a construction firm more than £4.4 million after staff data was compromised, including contact information, bank details, sexual orientation, and health data.

In March of this year, NHS IT supplier Advanced Computer Software Group faced a fine of nearly £3.1 million for endangering the personal information of approximately 80,000 individuals.

Source: www.theguardian.com

Public Health Agencies Urged to Develop Period Tracking Apps for Data Protection

Experts are urging public health bodies to develop alternatives to for-profit period-tracking apps, warning that women’s intimate personal information is vulnerable to exploitation by private companies.

A study from the University of Cambridge reveals that smartphone apps used for menstrual cycle tracking serve as a “gold mine” for consumer profiling, collecting data on exercise, diet, medication, hormone levels, and birth control methods.

The economic worth of this information is often “greatly underestimated” by users who share intimate details in unregulated markets with profit-driven businesses, according to the report.

If mishandled, data from cycle tracking apps (CTAs) could lead to issues like employment bias, workplace monitoring, discrimination in health insurance, risks of cyberstalking, and restricted access to abortion services, research indicates.

The authors call for better regulation of the expanding femtech sector to protect users whose data is sold in bulk, suggesting that apps offer clearer consent options around data collection and that public health agencies develop alternatives to commercial CTAs.

“Menstrual cycle tracking apps are marketed as empowering women and bridging gender health disparities,” said Dr. Stefanie Felsberger of the Minderoo Centre for Technology and Democracy at Cambridge, the report’s lead author. “Nevertheless, their underlying business model depends on commercial use, with user data and insights sold to third parties for profit.

“As a consequence of the monetization of data collected by cycle tracking app companies, women face significant and alarming privacy and safety threats.”

The report indicates that most cycle tracking apps cater to women attempting to conceive, making the stored data highly valuable commercially: few other life events, aside from buying a home, trigger such marked shifts in consumer behavior.

Data relating to pregnancy is estimated to be worth more than 200 times as much as data on age, gender, or location for targeted advertising. Tracking cycle length also makes it possible to target women at different phases of their cycles.

The three most popular apps were projected to reach a combined 500 million downloads in 2024. The digital health sector focused on women’s wellness is expected to surpass $60 billion (£44 billion) by 2027, the report notes.

In light of the considerable demand for period tracking, the authors are calling on public health entities, including the UK’s NHS, to create transparent and reliable apps as alternatives to commercial offerings.

“The UK is ideally positioned to address researchers’ difficulties in accessing menstrual data, as well as the privacy and data concerns, by developing an NHS app dedicated to tracking menstrual cycles,” said Felsberger, noting that Planned Parenthood in the US already offers its own tracking app.

“Apps situated within public health frameworks, which are not primarily profit-driven, can significantly reduce privacy violations, gather essential data on reproductive health, and empower users regarding the utilization of their menstrual information.”

“Utilizing cycle tracking apps is beneficial. But women deserve better than having their menstrual tracking data treated merely as consumer data,” remarked Professor Gina Neff, executive director of the Minderoo Centre for Technology and Democracy.

In the UK and the EU, period tracking data falls into “special categories” of data and enjoys greater legal protection, similar to genetic and ethnicity data. In the United States, the report warns, authorities have collected menstrual cycle data in ways that could hinder access to abortion services.

Source: www.theguardian.com