Starcloud aims to establish a 4km x 4km data center satellite
Can AI's voracious demand for massive data centers be met off-planet? Tech firms are eyeing low-Earth orbit as a viable option, though experts warn that substantial engineering and unresolved challenges stand in the way.
The explosive demand and investment in generative AI platforms like ChatGPT have sparked an unparalleled need for computing resources, requiring vast land areas as well as electricity levels comparable to those consumed by millions of households. Consequently, many data centers are increasingly relying on unsustainable energy sources such as natural gas, with tech companies expressing concerns that renewable energy sources cannot meet their skyrocketing power needs or stability requirements for reliable operations.
Google is moving forward with its vision for a space data center through its pilot initiative, Project Suncatcher, which plans to launch two prototype satellites equipped with TPU AI chips by 2027 to test how they function in orbit. One of the most notable advances in space data processing, however, came this year with the launch of a single H100 graphics processing unit by Starcloud, an Nvidia-backed company. That is still vastly less computing power than modern AI systems require; OpenAI alone is estimated to use around a million such chips.
For data centers to function effectively in orbit, many unresolved issues must be tackled. “From an academic research standpoint, [space data centers] are still far from being production-ready,” remarks Benjamin Lee from the University of Pennsylvania, USA.
According to Lee, one of the major hurdles is the sheer scale required to meet AI's computational needs. This involves not only the power demands, met by solar panels requiring substantial surface area, but also the challenge of dissipating the heat produced by the chips through radiators, the only feasible cooling method in a vacuum. "We can't use cold air and evaporative cooling like we do on Earth," Lee explained.
"You would need square kilometers of area just for power generation and cooling alone," he added. "These structures get big quickly. Once you're talking about capacity in the range of 1,000 megawatts, that translates into a considerable area in orbit." Indeed, Starcloud plans to construct a 5,000-megawatt data center spanning 16 square kilometers, roughly 400 times the area of the solar panels on the International Space Station.
Lee believes several promising technologies could help shrink these requirements. Krishna Muralidharan from the University of Arizona is investigating thermoelectric devices that convert heat into electricity, improving the efficiency of chips operating in space. "It's not a question of feasibility; it's an engineering challenge," Muralidharan stated. "For now, we can rely on large thermal radiator panels, but ultimately we will need more sophisticated solutions."
Additionally, space presents unique challenges unlike those found on Earth. For instance, there is a significant presence of high-energy radiation that can impact computer chips, leading to errors and disrupted calculations. “Everything will slow down,” Lee cautioned. “A chip positioned in space might perform worse compared to one on Earth due to the need for recalibration and error correction.”
To function at this scale, Muralidharan noted that thousands of satellites need to operate in tandem, necessitating highly precise laser systems for communication both between data centers and with Earth, where atmospheric interference can distort signals. Despite this, Muralidharan remains optimistic, believing these challenges are surmountable. “The real question is not if, but when,” he asserts.
Another point of uncertainty is whether AI will still need such extensive computational resources by the time the data centers are in place, particularly if anticipated advances in AI do not keep pace with the growth in computing power now being built. "It's plausible that training requirements may peak or stabilize, which would likely cause demand for large-scale data centers to follow suit," Lee explained.
Yet, even in such a scenario, Muralidharan suggests potential applications for space-based data centers, such as facilitating space exploration beyond Earth and monitoring terrestrial phenomena.
The Trump administration has announced plans to dismantle Colorado's National Center for Atmospheric Research, one of the country's foremost climate research institutions.
Russ Vought, the White House Director of Management and Budget, revealed the proposal on Tuesday in a statement on X.
"The National Science Foundation intends to dissolve the National Center for Atmospheric Research (NCAR) in Boulder, Colorado," Vought stated, according to a USA Today report. "This facility is a core source of concern regarding climate change in our country. A thorough review is in progress, and vital activities related to climate research will be reassigned to another organization or location."
This action could deal a significant blow to U.S. climate research at a moment when the United Nations and other global leaders warn that time is running out to avert the dire consequences of global warming.
The University Corporation for Atmospheric Research, NCAR's parent organization, issued a statement on Tuesday acknowledging that it was aware of the closure plans but had no further details.
“We are eager to collaborate with the administration to ensure the security and prosperity of our nation remains a top priority,” UCAR President Antonio Busalacci stated.
In response to an NBC News inquiry about NCAR’s fate, a senior White House official criticized Colorado Governor Jared Polis, a Democrat.
“Perhaps if Colorado had a governor willing to engage with President Trump, it would be more beneficial for voters,” said the official.
The official characterized NCAR as “a prominent research center perpetuating left-leaning climate change ideologies” and asserted that dismantling NCAR would “put an end to the research activities linked to the Green New Scam.”
Polis responded on Tuesday that Colorado had not received any communication about plans to dismantle NCAR, emphasizing that such a move, if confirmed, would amount to an assault on science.
“Climate change is a real issue, but NCAR’s contributions extend well beyond climate research,” Polis stated. “NCAR supplies crucial data on severe weather incidents like fires and floods, aiding our nation in safeguarding lives and property. If these cuts proceed, we risk losing our competitive edge against foreign adversaries in scientific exploration.”
Many within the climate and weather field expressed shock at this announcement.
Daniel Swain, a climatologist at UCLA, commented on X that this would represent a "significant setback for American science."
“This will disrupt not only climate research but also studies on weather, wildfires, and disasters that have supported decades of advancements in forecasting, early warnings, and resilience improvement,” Swain said, noting that the repercussions would cascade throughout the global weather and climate communities.
“NCAR has likely played an unparalleled role in enhancing weather forecasting and atmospheric modeling compared to any other organization worldwide,” he added.
Katherine Hayhoe, an atmospheric scientist and chief scientist at the Nature Conservancy, asserted that dismantling NCAR would be “akin to using a sledgehammer on the foundation of our scientific understanding of the planet.”
“Almost everyone studying climate and weather, not just in the U.S. but globally, has benefited from NCAR’s invaluable resources,” she mentioned on X.
Andy Hazelton, an associate scientist at the University of Miami’s Oceanic and Atmospheric Institute, described the decision to move resources as “incredibly shortsighted.”
Some Democratic representatives have pledged to fight against the closure of NCAR.
"This represents a dangerously blatant act of retaliation from the Trump administration," Rep. Joe Neguse, D-Colorado, whose district includes the climate research hub, posted on X. "NCAR is a leading scientific facility globally, with our scientists engaging in pioneering research every day. We will use every legal avenue to combat this reckless directive."
Australia is capitalizing on the AI boom, with numerous new investments in data centers located in Sydney and Melbourne. However, experts caution about the strain these large-scale projects may impose on already limited water resources.
The projected water demand for servicing Sydney’s data centers is anticipated to surpass the total drinking water supply in Canberra within the next decade.
In Melbourne, the Victorian government has pledged a $5.5 million investment to transform the city into Australia’s data center hub. Currently, hyperscale data center applications already exceed the collective water demands of nearly all of the top 30 business customers in the state.
Tech giants like OpenAI and Atlassian are advocating for Australia to evolve into a data processing and storage hub. With 260 data centers currently operational and numerous others planned, experts express concern regarding the repercussions for drinking water resources.
Sydney Water projects that it will require as much as 250 megalitres daily to support the industry by 2035, more than Canberra's total drinking water supply.
Cooling Requires Significant Water
Professor Priya Rajagopalan, director of RMIT’s Center for Post Carbon Research, points out that a data center’s water and energy requirements are largely dictated by the cooling technology implemented.
"Using evaporative cooling leads to significant water loss through evaporation, while a sealed system conserves water but requires substantial energy for cooling," she explains.
Older data centers typically depend on air cooling. However, the increased demand for computational power means greater server rack densities, resulting in higher temperatures. Hence, these centers rely more heavily on water for cooling solutions.
Water consumption in data centers varies significantly. For instance, NextDC has transitioned to liquid-to-chip cooling, which cools processors and GPUs directly, as opposed to cooling entire rooms with air or water.
NextDC reports that initial trials of this cooling technology have concluded and that liquid cooling is far more efficient, scaling to ultra-dense environments and increasing processing power without a proportional rise in energy consumption. Its modeling suggests that power usage effectiveness (PUE) could fall to as low as 1.15.
The data center sector measures its sustainability using two key metrics: water usage effectiveness (WUE) and power usage effectiveness (PUE). These gauge how much water or power is consumed per unit of computing work.
WUE is calculated by dividing annual water usage in litres by annual IT energy usage in kilowatt-hours. For instance, a 100MW data center that uses 3ML daily would yield a WUE of 1.25. A lower number indicates greater efficiency. Certain countries enforce standards; Malaysia, for example, recommends a WUE of 1.8 or below.
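Both metrics reduce to simple arithmetic. Here is a minimal sketch of the two examples in the text, assuming the 100MW figure refers to IT load running around the clock (the article does not state this explicitly):

```python
# Checking the WUE example above: a 100 MW IT load running all year,
# with 3 megalitres (ML) of water used per day.
it_load_kw = 100_000                                  # 100 MW in kW (assumed IT load)
annual_it_energy_kwh = it_load_kw * 24 * 365          # 876,000,000 kWh

annual_water_litres = 3_000_000 * 365                 # 3 ML/day -> 1,095,000,000 L/yr

wue = annual_water_litres / annual_it_energy_kwh
print(f"WUE = {wue:.2f} L/kWh")                       # -> 1.25, matching the text

# PUE is the analogous power metric: total facility energy over IT energy.
# A facility drawing 115 MW in total for that 100 MW IT load has a PUE of 1.15,
# the figure NextDC's modeling targets.
print(f"PUE = {115_000 / it_load_kw:.2f}")            # -> 1.15
```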
Even facilities that are efficient can still consume substantial amounts of water and energy at scale.
NextDC’s last fiscal year’s PUE stood at 1.44, up from 1.42 the previous year. The company indicates that this reflects the changing nature of customer activity across its facilities and the onboarding of new centers.
Calls to Ban Drinking Water Usage
Sydney Water states that estimates regarding data center water usage are continually reassessed. To prepare for future demands, the organization is investigating alternative, climate-resilient water sources like recycled water and rainwater harvesting.
“Every proposed connection for data centers will undergo case-by-case evaluations to guarantee adequate local network capacity. If additional services are necessary, operators might need to fund upgrades,” a Sydney Water representative said.
In its submission to the 2026-2031 rate review in Victoria, Melbourne Water observed that hyperscale data center operators seeking connectivity “expect instantaneous and annual demand to surpass nearly all of Melbourne’s leading 30 non-residential customers.”
Melbourne Water mentioned, “This has not been factored into our demand forecasting or expenditure plans.”
The agency is requesting upfront capital contributions from companies to mitigate the financial burden of necessary infrastructure improvements, ensuring those costs do not fall solely on the broader customer base.
Documents provided to the Guardian show that Greater Western Water in Victoria has received 19 data center applications.
The Concerned Waterways Alliance, composed of various Victorian community and environmental organizations, has expressed concerns regarding the potential diversion of drinking water for cooling servers when the state’s water supplies are already under stress.
Alliance spokesperson Cameron Steele emphasized that expanding data centers would create a greater reliance on desalinated water, thereby diminishing availability for ecological streams and possibly imposing costs on local communities. The group is advocating for a ban on potable water usage for cooling and demanding that all centers transparently report their water consumption.
“We strongly promote the use of recycled water over potable water within our data centers.”
Closed Loop Cooling
In hotter regions, like much of Australia during summer, data centers require additional energy or water to remain cool.
Daniel Francis, customer and policy manager at the Australian Water Works Association, highlights that there is no universal solution for the energy and water consumption of data centers, as local factors such as land availability, noise restrictions, and water resources play significant roles.
“We constantly balance the needs of residential and non-residential customers, as well as environmental considerations,” says Francis.
“Indeed, there is a considerable number of data center applications, and it’s the cumulative effect we need to strategize for… It’s paramount to consider the implications for the community.”
“Often, they prefer to cluster together in specific locations.”
One of the data centers currently under construction in Sydney’s Marsden Park is a 504MW facility spanning 20 hectares with six four-story buildings. The company claims this CDC center will be the largest data campus in the southern hemisphere.
Last year, CDC operated its data centers with 95.8% renewable electricity, achieving a PUE of 1.38 and a WUE of 0.01. A company representative stated that this level of efficiency was made possible through a closed-loop cooling system that does not require continuous water extraction, in contrast to traditional evaporative cooling systems.
“CDC’s closed-loop system is filled only once at its inception and functions without ongoing water extraction, evaporation, or waste generation, thereby conserving water while ensuring optimal thermal performance,” the spokesperson noted.
“This model is specifically designed for Australia, a nation characterized by drought and water shortages, focusing on long-term sustainability and establishing industry benchmarks.”
Despite CDC’s initiatives, community concerns regarding the project persist.
Peter Rofile, acting chief executive of the Western Sydney Local Health District, wrote in a letter last June that the development's proximity to vulnerable communities and its unprecedented scale posed untested risks to residents in western Sydney.
“This proposal does not guarantee that this operation can adequately mitigate environmental exposure during extreme heat events, potentially posing an unreasonable health risk to the public,” Rofile stated.
The demand for electricity by data centers in Australia could triple over the next five years, with projections indicating it may surpass the energy consumed by electric vehicles by 2030.
Currently, data centers account for approximately 2% of demand on the national grid, around 4 terawatt-hours (TWh). The Australian Energy Market Operator (Aemo) expects this share to increase significantly, projecting growth of about 25% annually to reach 12TWh, or 6% of grid demand, by 2030, and 12% by 2050.
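Those numbers are mutually consistent; a minimal sketch, assuming the 25% growth compounds annually from roughly 2025:

```python
# 4 TWh growing at ~25% per year reaches ~12 TWh within five years.
demand_twh = 4.0
for year in range(2025, 2031):
    print(year, f"{demand_twh:.1f} TWh")
    demand_twh *= 1.25    # Aemo's projected ~25% annual growth
# The 2030 line prints ~12.2 TWh, in line with the projected 12 TWh
# (6% of grid demand).
```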
Aemo anticipates that the rapid expansion of this industry will drive “substantial increases in electricity usage, especially in Sydney and Melbourne.”
In New South Wales and Victoria, where the majority of data centers are situated, they are projected to account for 11% and 8% of electricity demand respectively by 2030.
Tech companies like OpenAI and SunCable are pushing Australia towards becoming a central hub for data processing and storage. Recently, the Victorian Government announced a $5.5 million investment aimed at establishing the region as Australia’s data center capital.
However, with 260 data centers currently operating across the nation and numerous others in the pipeline, experts express concerns about the implications of unchecked industry growth on energy transition and climate objectives.
Energy Usage Equivalent to 100,000 Households
The continual operation of numerous servers generates substantial heat and requires extensive electricity for both operation and cooling.
Globally, the demand for data centers is growing at a rate four times faster than other sectors, according to the International Energy Agency. The number and size of centers are escalating, with large facilities becoming increasingly common.
As highlighted by the IEA, “AI-centric hyperscale data centers possess a capacity exceeding 100MW and consume energy equivalent to what 100,000 homes use annually.”
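That equivalence is easy to sanity-check. A sketch with an assumed household figure of 8.76 MWh a year (in the vicinity of U.S. averages; actual figures vary widely by country):

```python
# A 100 MW facility running 24/7 versus annual household electricity use.
annual_facility_mwh = 100 * 24 * 365        # 876,000 MWh per year
household_mwh_per_year = 8.76               # assumed average annual household use
print(f"{annual_facility_mwh / household_mwh_per_year:,.0f} homes")  # -> 100,000
```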
Michael Blair, a professor of mechanical engineering at the University of Melbourne and director of the Net Zero Australia project, said there is a strong link between electricity and water usage because of cooling requirements: servers convert electrical energy into heat.
“In confined spaces with many computers, air conditioning is required to maintain an optimal operating temperature,” he explains.
Typically, digital infrastructure is cooled through air conditioning or water systems.
Ketan Joshi, an Oslo-based climate analyst at the Australia Institute, notes that many tech companies are reporting a surge in electricity consumption compared to last year. Their energy intensity has also been rising on several metrics, including energy per active user and energy per unit of revenue, compared with five years ago.
“They aren’t consuming more energy to serve additional users or increase revenue,” he asserts. “The pertinent question is: why is our energy consumption escalating?”
In the absence of concrete data, Joshi suggests that the undeniable growth in demand is likely attributed to the rise of energy-intensive generative AI systems.
“Running Harder to Stay in the Same Place”
Joshi is monitoring this issue because data centers worldwide are placing substantial, inflexible demands on power grids, with two significant repercussions: increased dependence on coal and gas generation, and resources diverted away from the energy transition.
While data center companies often assert they run on clean energy through investments in solar and wind, Joshi remarks that there is often a mismatch between their round-the-clock reliance on the grid and the variable output profiles of their renewable energy projects.
“What’s the ultimate impact on the power grid?” he questions. “Sometimes, we have surplus energy, and other times, there isn’t enough.”
"So, even if everything appears favorable on paper, your data center might be inadvertently supporting fossil fuel generation."
Moreover, instead of renewable energy sources displacing coal and gas, these sources are accommodating the growing demands of data centers, Joshi notes. “It’s like sprinting on a treadmill—no matter how hard you run, it feels like the speed is continually increasing.”
The demand for electricity has surged to the extent that once-mothballed nuclear power plants in the U.S. are being revived and demand for gas turbines is climbing. Some Australian developers are even proposing to install new gas generators to fulfill their energy needs.
Aemo predicts that by 2035, data centers could consume 21.4TWh a year, comparable to the electricity demand of four aluminum smelters.
Blair pointed out that AI adoption is in its infancy and the outlook remains uncertain; Aemo's 2035 scenarios range between 12TWh and 24TWh, so the industry's growth may prove less expansive than anticipated.
In the National AI Plan released Tuesday, the federal government recognized the need for advances in new energy and cooling technologies for AI systems. Industry Minister Tim Ayres said that principles for data center investments will be established in early 2026, emphasizing requirements for supplementary investment in renewable energy generation and water sustainability.
“Undeniable Impact” on Electricity Prices
Dr. Dylan McConnell, an energy systems researcher at the University of New South Wales, noted that while renewable energy is on the rise in Australia, it is not yet progressing rapidly enough to meet required renewable energy and emissions targets. The expansion of data centers will complicate these challenges.
“If demand escalates beyond projections and renewables can’t keep pace, we’ll end up meeting that new demand instead of displacing coal,” he explains.
Unlike electric vehicles, which add demand to the grid while reducing gasoline and diesel usage, data centers do not reduce fossil fuel consumption elsewhere in the economy, according to McConnell.
“If this demand materializes, it will severely hamper our emissions targets and complicate our ability to phase out coal in alignment with those targets,” he advises.
In its climate targets recommendations, the Climate Change Authority stated: "Data centers will continue to scale up, exerting deeper pressure on local power sources and further hampering renewable energy expansions."
McConnell asserted there will be a significant effect on overall energy costs, influencing electricity prices.
“To support this load, we will need a larger system that utilizes more costly resources.”
John McAuliffe, a 33-year-old entrepreneur and former public servant, was an unexpected Democratic contender in this month's Virginia House of Delegates elections, especially given a campaign approach that occasionally resembled that of his Republican opponents.
Mr. McAuliffe was among 13 Democrats who flipped seats in Virginia's House of Delegates in a significant electoral win that handed the party robust control of state government. Together with victories in states like New Jersey and California, the outcome gives Democrats renewed momentum nationwide after a disheartening defeat to Donald Trump and the Republican Party the previous year.
The northern Virginia district he aimed to represent, characterized by residential areas, agricultural land, and charming small towns, hadn’t seen a Democratic representative in decades. Thus, McAuliffe campaigned door-to-door on his electric scooter, reaching out to constituents with a pledge to “protect their way of life.” He dismissed the label “woke” and attributed the “chaos” to Washington, D.C., located over an hour away.
One of his primary talking points was a widespread concern resonating with many Democrats today, but with a distinct angle: the adverse impacts of data centers on electricity costs.
“I spent a majority of the year visiting households I never imagined were Democratic,” McAuliffe recounted. “Independents, Republicans, and an occasional Democrat, yet many began shutting their doors on me.”
“However, once they voiced a desire to discuss data centers, it opened a dialogue. That allowed me to draw a contrast, which is rare.”
Loudoun County, which makes up about half of Virginia's 30th House District and is known for its high per capita income, hosts data centers that handle more internet traffic than any other place in the world. While essential to much of the internet's functioning, McAuliffe argued, and many voters concurred, that their presence can be burdensome.
As big as warehouses, these data centers loom over nearby neighborhoods, humming with the sound of servers and cooling machinery. Developers are seeking to build facilities in Fauquier County, the district's other Republican-leaning area, but McAuliffe said residents are apprehensive about construction on rural farmland renowned for its scenic vistas. He noted receiving complaints about the impact of data centers on electricity bills across the board.
According to a 2024 report from the Virginia General Assembly’s Joint Legislative Audit and Review Committee, the state’s energy demands are projected to double over the next decade, chiefly due to data centers and the substantial infrastructure required to cater to this demand.
The report also indicated that while Virginia’s electricity pricing structures are “appropriately” aligned with facility usage, “energy costs for all consumers are likely to rise” to cover new infrastructure expenses and necessary electricity imports. Earlier this month, Virginia’s public utility regulators approved a rise in electricity rates, though not to the extent Dominion Energy, the state’s primary provider, initially requested.
“The costs tied to infrastructure—the extensive transmission lines and substations—are being passed down to consumers,” McAuliffe explained from a co-working space in Middleburg, Virginia, where his campaign operates.
“These essentially represent taxes that we’ve wrongfully placed on ordinary Virginians to benefit corporations like Amazon and Google. While there may be some advantages for these communities, these companies are capable of affording them, and we must strive to better negotiate those benefits.”
McAuliffe’s opponent was Republican Geary Higgins, who had been elected in 2023. The battle between the two parties proved costly, with Democrats investing nearly $3 million and their adversaries spending just over $850,000, according to records from the Virginia Public Access Project.
The campaign encompassed more than just data centers; McAuliffe also spotlighted reproductive rights and teacher pay increases. Democrats have committed to codifying access to abortion if they gain full power in Virginia's state government, and Democratic critics attacked Higgins for failing to return contributions from controversial politicians.
Yet, McAuliffe chose to concentrate on data centers, believing their impacts presented “the most pressing issue we can address.” This focus surprised some of his consultants, and although he acknowledged it was a “somewhat niche topic,” data centers frequently emerged as a primary concern during his door-to-door visits.
To counter Higgins, his campaign even launched a website called "Data Center Geary," attempting to tie the Republican, a former Loudoun County supervisor, to the spread of these facilities. Higgins and his allies condemned the effort as misleading.
Mr. McAuliffe ultimately won with 50.9% of the votes, while Mr. Higgins gathered 49%. In response to a request for an interview, Higgins stated that McAuliffe’s “entire campaign was based on falsehoods regarding me and my history.”
“Thanks to an influx of external funding and high Democratic turnout, he was able to fabricate a misleading caricature of me and narrowly triumph,” Higgins remarked.
When Mr. Trump swept the polls nationwide last year, voters in conservative rural and suburban areas turned away from Democrats, costing the party the presidency and control of Congress. McAuliffe's victory has some party leaders pondering what lessons Democrats can draw from his campaign.
“In typically red regions, he identified common issues that resonated with both Republicans and Democrats while making a convincing case for solutions,” noted Democratic Rep. Suhas Subrahmanyam, who represents McAuliffe’s district.
Democratic National Committee Chairman Ken Martin, who campaigned alongside McAuliffe, characterized him as “an extraordinary candidate who triumphed by focusing squarely on the relevant issues of his district.”
“Democrats are capable of winning in any setting, especially in suburbs and rural environments, when they have candidates who commit themselves to addressing the genuine needs of their community. Presently, what Americans require is the capability to manage their expenses,” stated Martin.
Chaz Natticombe, founder and executive director of Virginia's nonpartisan election monitoring organization State Navigate, remarked that while McAuliffe did not match Democrat Abigail Spanberger's standout gubernatorial margin, his vote tally shows he won over some Republican-leaning voters at Higgins's expense.
"He outperformed everyone else, primarily because he gained the support of Republican-leaning voters," Natticombe concluded.
Every day, Kiran Kasbe drives her rickshaw taxi through the bustling streets of Mahul, near her home on Mumbai's eastern coast, where stalls brim with tomatoes, gourds, and eggplants, often enveloped in thick smog.
Earlier this year, doctors identified three tumors in her 54-year-old mother's brain. The specific cause of her cancer remains unclear, but a study indicates that people living near coal-fired power plants face a significantly higher risk of such illnesses, and Mahul's residents live mere hundreds of meters from these plants.
The air quality in Mahul is notoriously poor; even with closed car windows, the pungent odor of oil and smoke seeps in.
“We are not the only ones suffering health issues here. Everything is covered in grime,” noted Kasbe, 36.
Last year, plans to shut down two coal-fired power plants operated by Indian firms Tata Group and Adani were announced as part of the government’s initiative to reduce emissions. However, by late 2023, these decisions were overturned after Tata claimed escalating electricity demand in Mumbai necessitated coal.
Neither firm responded to inquiries for comment.
Buildings blanketed in smog in Mumbai, India, January. Photo: Bloomberg/Getty Images
India's electricity demand has surged in recent years, driven by economic growth and increased air conditioning needs amid severe heat exacerbated by climate change. But an investigation by Source Material and The Guardian found that a major factor keeping the city dependent on fossil fuels is the insatiable energy demand of data centers.
Leaked documents also reveal the significant Mumbai presence of Amazon, the world's largest data center operator.
Amazon publicly lists three "availability zones" in the Mumbai metropolitan area, each comprising one or more data centers. Leaked data from a year ago indicated that the company uses 16 facilities in the city.
Bhaskar Chakravorty, an academic at Tufts University analyzing technology’s societal impacts, remarked that the surge in data centers is creating a tension between energy needs and climate goals as India evolves its economy into an artificial intelligence hub.
“I’m not surprised by the slow progression towards a greener transition, particularly as demands grow rapidly,” he said regarding the Indian government’s stance.
Amazon spokesperson Kylie Jonas asserted that Mumbai’s “emissions issue” cannot be attributed to Amazon.
“On the contrary, Amazon is among the largest corporate contributors to renewable energy in India, backing 53 solar and wind initiatives capable of generating over 4 million megawatt-hours of clean energy each year,” she stated. “Once operational, these investments will power more than 1.3 million Indian households annually.”
Amazon is establishing numerous data centers globally, vying with Microsoft, Google, and other entities for dominance in the burgeoning AI sector.
Tata Consultancy Services Ltd. office in Mumbai, India. Photo: Bloomberg/Getty Images
Eliza Pan, a representative of Amazon Employees for Climate Justice, criticized the company for not acknowledging its role in perpetuating reliance on one of the most polluting energy sources.
“Amazon is leveraging this shiny concept called AI to distract from the reality of building a dirty energy empire,” she said.
Jonas refuted this assertion, stating, “Not only are we recognized as the most efficient data center operator, but we’ve also been the top corporate purchaser of renewable energy for five successive years, with over 600 projects globally.”
Amazon’s claims regarding green energy are contentious. The organization has been scrutinized for engaging in “creative accounting” by acquiring renewable energy certificates alongside direct green energy purchases, as noted by a member of Amazon Employees for Climate Justice.
“Everything is contaminated”
Kasbe operates her rickshaw in Mahul, a former fishing settlement that has transformed into a residence for tens of thousands who were displaced from slums across the city.
Kiran Kasbe’s mother. Photo: Provided by Sushmita
Kasbe and her mother relocated here in 2018 after their home on the outskirts of Vidyavihar faced demolition. Her mother was in good health before the move, but her condition deteriorated sharply afterward, culminating in the brain tumor diagnosis.
Gajanan Tandol, a local resident, shared that pollution-related diseases are prevalent. “There are numerous instances of skin and eye inflammation, cancer, asthma, and tuberculosis, yet we receive no government assistance,” he lamented.
Another community member, Santosh Jadhav, implored the government to relocate residents from Mahul.
“Everything is tainted. We’re exhausted from fighting for a decent existence,” he stated. “This is hell for us.”
Amazon, an e-commerce platform facilitating 13 million customer transactions daily, is investing billions into expanding its profitable cloud computing sector and enhancing its AI-assisted services, such as automated coding and translation, as per research from CapitalOne.
Many of the centers in Mumbai remain under the radar because they are leased rather than owned. Unlike in the U.S., where Amazon predominantly owns its facilities, it frequently rents entire data farms or server racks in centers shared with other companies elsewhere.
Shaolei Ren, a computing scholar at the University of California, Riverside, remarked that shared "colocation" facilities account for significantly more of the sector's energy consumption than wholly owned or fully leased operations.
“The majority of energy used in the data center sector is concentrated in colocation facilities,” he noted. “They are ubiquitous.”
Workers near an Amazon Prime logo in Mumbai, India, September. Photo: NurPhoto/Getty Images
Based on leaked information, Amazon’s colocation data center in Mumbai consumed 624,518 megawatt-hours of electricity in 2023, sufficient to power over 400,000 homes in India for an entire year.
India is poised to surpass Japan and Australia to become the second-largest consumer of data center power in the Asia-Pacific region, according to S&P. By 2030, data centers could account for one-third of Mumbai's energy consumption, says Techno & Electric Engineering CEO Ankit Saraiya.
“Poison hell”
In a bid to keep up with power demand, the Maharashtra government has extended the operational duration of the Tata coal-fired power plant in Mahul by at least five years. Additionally, the closure of a 500-megawatt plant operated by Tata competitor Adani Group in the city’s north has been postponed.
When Tata requested an extension in its proposal to the State Energy Commission, it cited the rising energy demand from data centers as the primary justification. Adani projected that the anticipated surge in demand during the five years following the plant’s scheduled closure would come predominantly from data centers.
These power plants represent merely two of the numerous polluting sources within Mumbai’s Mahul district. The area also houses three oil refineries and 16 chemical facilities, as stated in a 2019 report by the Indian Center for Policy Research, which branded the locality a “toxic hell.”
The Tata power plant has been operational since 1984, and like many old power stations, it is subject to lenient emissions regulations, as noted by Raj Lal, chief air quality scientist at the World Emissions Network, who labeled it “one of the major contributors to air pollution in Mumbai.”
The Centre for Research on Energy and Clean Air has found that PM2.5 particles account for nearly a third of the area's pollution. PM2.5 particles are airborne particles less than 2.5 micrometers in diameter that can cause severe health problems when inhaled.
Smoke emanates from the chimney of Tata Power Company’s Trombay thermal facility in Mumbai, India, August 2017. Photo: Bloomberg/Getty Images
Shripad Dharmadhikari, founder of the environmental organization Manthan Adhyayan Kendra, stated that the toxic heavy metals in ash generated by the factories are likely to trigger “respiratory diseases, kidney ailments, skin issues, and heart problems.”
While Tata’s facilities continue operations, Mumbai’s power grid is buckling under the increasing demand. To mitigate potential power shortages, Amazon’s colocation data center in the city has invested in 41 backup diesel generators and is seeking permission for additional installations, according to the leaked documents.
A report released in August by the Center for Study of Science, Technology and Policy (CSTEP) identified diesel generators as a primary source of pollution in the locality.
Air quality expert Swagata Dey at CSTEP argued that the presence of data centers requiring continuous electricity, coupled with the backup diesel generators, “will inevitably exacerbate emissions,” advocating for legal requirements for data center operators to utilize pollution-free solar energy.
Particularly, the Amazon facility across Thane Creek from Mahul has 14 generators already installed, and one partner was granted permission to set up another 12 generators on-site earlier this year.
"Public health considerations must be central to decisions regarding data center locations and energy source selections," said Ren of the University of California, Riverside, co-author of a recent paper evaluating the public health consequences of diesel generators at U.S. data centers.
Sushmita goes by a single name: in India, surnames are often avoided because they signify caste, reflecting a hierarchical and discriminatory social structure.
On Wednesday, artificial intelligence firm Anthropic unveiled plans for a substantial $50 billion investment in computing infrastructure, which will include new data centers in Texas and New York.
Anthropic’s CEO, Dario Amodei, stated in a press release, “We are getting closer to developing AI that can enhance scientific discovery and tackle complex challenges in unprecedented ways.”
In the U.S., a large data center typically takes around two years to build and requires significant energy to operate. "This level of investment is essential to keep our research at the forefront and to cater to the escalating demand for Claude from numerous companies," the firm, known for its widely adopted AI chatbot Claude, said in a statement. Anthropic anticipates that the initiative will generate approximately 800 permanent roles and 2,400 construction jobs.
The company is collaborating with London-based Fluidstack to develop new computing facilities to support its AI frameworks. However, specific details regarding the location and energy source for these facilities remain undisclosed.
Recent transactions highlight that the tech sector continues to invest heavily in energy-intensive AI infrastructure, despite ongoing financial concerns like market bubbles, environmental impacts, and political repercussions linked to soaring electricity prices in construction areas. Another entity, TeraWulf, a developer of cryptocurrency mining data centers, recently stated its partnership with Fluidstack on a Google-supported data center project in Texas and along the shores of Lake Ontario in New York.
In a similar vein, Microsoft announced on Wednesday its establishment of a new data center in Atlanta, Georgia, which will link to another facility in Wisconsin, forming a “massive supercomputer” powered by numerous Nvidia chips for its AI technologies.
According to a report from TD Cowen last month, leading cloud computing providers leased more than 7.4GW of U.S. data center capacity in the third quarter of this year, more than the total leased in all of last year.
As spending escalates on computing infrastructure for AI startups that have yet to achieve profitability, concerns regarding a potential AI investment bubble are increasing.
Investors are closely monitoring a series of recent transactions between leading AI developers like OpenAI and Anthropic, as well as companies that manufacture the costly computer chips and data centers essential for their AI solutions. Anthropic reaffirmed its commitment to adopting “cost-effective and capital-efficient strategies” to expand its business.
Google is set to establish an artificial intelligence data center in space, with initial test equipment scheduled for launch into orbit in early 2027.
The company’s scientists and engineers are confident about deploying a densely clustered array of around 80 solar-powered satellites at approximately 400 miles above the Earth, each outfitted with robust processors to cater to the escalating AI demands.
Google highlights the rapid decline in space launch costs, suggesting that by the mid-2030s, operating space-based data centers may become as affordable as their terrestrial counterparts. The study was made public on Tuesday.
Utilizing satellites could significantly lessen the impact on land and water resources that are currently required for cooling ground-based data centers.
Once operational in orbit, the data centers would run on solar energy, with panels up to eight times more productive than equivalent installations on the ground. However, launching a single rocket into orbit emits hundreds of tons of CO2.
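The eight-fold productivity claim is roughly consistent with first principles. The sketch below uses illustrative values, not Google's own figures: the solar constant above the atmosphere, near-continuous sunlight in a dawn-dusk sun-synchronous orbit, and a typical capacity factor for ground-based solar:

```python
# Ballpark comparison of annual energy yield per square metre of solar panel.
# All inputs are illustrative assumptions, not figures from Project Suncatcher.
orbit_irradiance = 1361          # W/m^2, solar constant above the atmosphere
orbit_sunlit_fraction = 0.99     # dawn-dusk orbits spend almost no time in eclipse
ground_peak_irradiance = 1000    # W/m^2, standard surface rating condition
ground_capacity_factor = 0.20    # typical utility-scale solar, averaged over a year

ratio = (orbit_irradiance * orbit_sunlit_fraction) / (
    ground_peak_irradiance * ground_capacity_factor
)
print(f"orbital panel yield ~ {ratio:.1f}x ground yield")   # ~6.7x
```

That lands in the same ballpark as the up-to-eight-times figure; sunnier or cloudier terrestrial sites move the ratio in either direction.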
Astronomers have expressed concerns about the burgeoning number of satellites in low-Earth orbit, describing it as “akin to a bug on a windshield” when observing the cosmos.
The envisioned data centers under Project Suncatcher would use optical links for data transmission, primarily leveraging light or laser beams.
Major technology firms aiming for swift advancements in AI are projected to invest $3 trillion (£2.3 trillion) in data centers worldwide, ranging from India to Texas and Lincolnshire to Brazil. This surge in spending raises alarms regarding the carbon footprint if sustainable energy solutions are not sourced for these facilities.
“In the future, space might be the ideal environment for advancing AI computing,” stated Google.
“In light of this, our new research initiative, Project Suncatcher, envisions a compact array of solar-powered satellites utilizing Google TPUs and linked through free-space optical connections. This strategy has significant scaling potential and minimal impact on terrestrial resources.”
TPUs (tensor processing units) are specialized processors designed for training and running AI models. Free-space optical links enable wireless communication using beams of light.
Elon Musk, who oversees satellite internet provider Starlink and the SpaceX rocket program, announced last week that his company would begin expanding efforts to develop data centers in space.
Nvidia AI chips are anticipated to be launched into space later this month in collaboration with startup Starcloud.
“Space provides virtually limitless low-cost renewable energy,” commented Philip Johnston, co-founder of the startup. “The environmental cost occurs only at launch, and over the lifespan of the data center, there’s a tenfold reduction in carbon dioxide compared to ground-based power.”
Google aims to deploy two prototype satellites by early 2027, referring to the research findings as “the first milestone toward scalable space-based AI.”
However, the company cautions that “substantial engineering challenges persist, including thermal management, high-bandwidth ground communications, and the reliability of systems in orbit.”
The galactic center excess refers to an unexpected intensity of gamma rays emerging from the core of the Milky Way galaxy.
This view displays the entire sky at energies exceeding 1 GeV, derived from five years of data from the LAT instrument on NASA’s Fermi Gamma-ray Space Telescope. The most striking aspect is a luminous band of diffuse light along the center of the map, indicating the central plane of the Milky Way galaxy. Image credit: NASA/DOE/Fermi LAT collaboration.
Gamma rays are a form of electromagnetic radiation characterized by the shortest wavelengths and the highest energy.
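To make that concrete: a photon's energy and wavelength are inversely related, so the GeV photons Fermi detects (see the image caption above) have wavelengths on subatomic scales. In standard notation:

```latex
E = \frac{hc}{\lambda} \approx \frac{1240\ \text{eV}\cdot\text{nm}}{\lambda}
\quad\Longrightarrow\quad
E = 1\ \text{GeV} \;\leftrightarrow\; \lambda \approx 1.24\times10^{-15}\ \text{m}
```

That wavelength is roughly the diameter of a proton, compared with hundreds of nanometres for visible light.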
The intriguing gamma-ray signal from the Milky Way’s center was initially observed in 2009 by the Large Area Telescope, the primary instrument of NASA’s Fermi Gamma-ray Space Telescope.
The source of this signal remains under discussion, with main hypotheses involving self-annihilating dark matter and undetected populations of millisecond pulsars.
“When Fermi directed its gaze toward the galaxy’s center, the outcome was unexpected,” remarked Dr. Noam Libeskind, an astrophysicist at the Leibniz Institute for Astrophysics in Potsdam.
“The telescope detected an excessive number of gamma rays, the most energetic form of light in the universe.”
“Astronomers worldwide were baffled, and numerous competing theories emerged to clarify the so-called gamma-ray excess.”
“After extensive discussion, two primary theories surfaced: either these gamma rays stem from millisecond pulsars (highly dense neutron stars rotating thousands of times per second) or from dark matter particles colliding and annihilating. Both theories, however, have their limitations.”
“Nonetheless, our findings strongly support the notion that the gamma-ray excess arises from dark matter annihilation.”
In their study, Dr. Libeskind and his team simulated the formation of the Milky Way under conditions matching the galaxy's observed local cosmic environment.
They found that the dark matter is not distributed spherically around the galaxy's core but is flattened, organized more like the stars, meaning its annihilation could reproduce the observed excess of gamma rays.
"The Milky Way has long been recognized as sitting within a spherical region filled with dark matter, often referred to as a dark matter halo," explained Dr. Moorits Mihkel Muru, an astrophysicist at the Leibniz Institute for Astrophysics in Potsdam and the University of Tartu.
“However, the degree to which this halo is aspheric or ellipsoidal remains unclear.”
“We analyzed simulations of the Milky Way and its dark matter halo and found that the flattening of this region sufficiently accounts for the gamma-ray excess due to self-annihilation of dark matter particles.”
“These calculations indicate that the search for dark matter particles capable of self-annihilation should be emphasized, bringing us closer to uncovering the enigmatic properties of these particles.”
A paper describing the findings was published in this month's edition of Physical Review Letters.
_____
Moorits Mihkel Muru et al. 2025. Dark matter explanation of the Fermi LAT galactic center excess in constrained Milky Way simulations. Phys. Rev. Lett. 135, 161005; doi: 10.1103/g9qz-h8wd
Google has committed to securing up to 3GW of hydropower in what is being termed the largest clean power agreement by a corporation, as the tech giant seeks to expand its energy-intensive data centers, the company announced on Tuesday.
The agreement with Brookfield Asset Management includes a 20-year power purchase deal worth $3 billion for electricity generated from two hydroelectric plants located in Pennsylvania.
Additionally, the tech giant will invest $25 billion into data centers across Pennsylvania and neighboring states over the next two years, according to Semafor’s report on Tuesday.
The technology sector is increasingly seeking vast amounts of clean energy to support the power demands of data centers essential for artificial intelligence and cloud computing.
Ruth Porat, president and chief investment officer of Google’s parent company Alphabet, spoke about the initiative at the AI Summit in Pittsburgh, where Donald Trump announced a $70 billion investment in AI and energy.
Amanda Peterson Corio, head of Datacenter Energy at Google, commented on the collaboration with Brookfield, stating, “This partnership is a crucial step towards ensuring a clean energy supply in the PJM region where we operate.”
Almost a year ago, Google initiated several first-of-their-kind power purchase agreements involving carbon-free geothermal energy and advanced nuclear. The company is also collaborating with PJM Interconnection, the largest U.S. power grid operator, to use AI to speed the connection of new power sources to the grid.
Google has entered into an initial framework agreement with Brookfield, the owner of Brookfield Renewable Partners, stating its intent to develop and operate a renewable energy facility. The two hydroelectric plants in Pennsylvania will undergo upgrades and refurbishment as part of this agreement. Furthermore, Google intends to expand its commitment beyond these facilities to other regions within the Mid-Atlantic and Midwest.
It is a warehouse the size of 12 football pitches, poised to provide much-needed employment and development in the city of Caucaia, northeastern Brazil. Yet the shelves remain empty. The extensive facility is set to become a data center earmarked for TikTok, part of a 55 billion reais (£7.3 billion) project to expand global data center infrastructure.
With the increasing demand for supercomputer facilities, Brazil is attracting an array of high-tech companies, buoyed by the AI boom. The selection of Caucaia is strategic. Submarine cables carry data from Fortaleza, the nearby capital of Ceará, to various continents. Proximity to these cables enhances traffic capacity and reduces latency—the response time across the Internet network.
Additionally, Caucaia is home to the Pecém EPZ, where businesses can produce goods and services for export, benefiting from various tax incentives and streamlined bureaucratic processes.
However, data from Brazil’s disaster digital atlas and integrated disaster information system indicate that Caucaia is also prone to extreme weather events, including drought and heavy rainfall.
Between 2003 and 2024, the city declared drought-related emergencies at least once. In 2019, around 10,000 residents were affected by water shortages. The digital atlas of disasters shows that as reservoirs depleted, water quality diminished, leading to crop failures and difficulty accessing basic food supplies.
Data centers consume vast amounts of energy and water to keep supercomputers cool. Even so, public agencies are greenlighting their construction in drought-affected areas, and Caucaia is part of a broader trend.
According to the Digital Disaster Atlas, five of the 22 planned data centers are situated in cities that have faced repeated drought and water scarcity since 2003.
So far, the Chinese-owned social network has not been named in Caucaia's permit application. However, in February, the state government's chief of staff, Chagas Vieira, confirmed in an interview with local radio stations that discussions with Chinese firms were ongoing, and representatives of TikTok and its parent company ByteDance have met with senior officials, including the Vice President and Minister of Development, Industry, Trade, and Services, Geraldo Alckmin.
ByteDance was approached for comment.
A truck delivers water to Caucaia, a city facing repeated problems with its drinking water supply. Photo: Marília Camelo/The Guardian
The project is officially led by Casa dos Ventos, a Brazilian wind energy firm that has invested in the data center sector. Mario Araripe, the company’s founder and president, announced last year that he aims to attract major global technology companies like Apple, Amazon, Google, Meta, and Microsoft to fill the facility with computers.
Casa dos Ventos has already secured one of the three required licenses from the state of Ceará. According to the state’s Environmental Supervision (SEMACE), the project received a license for “30m³/day water consumption in closed circuits supplied by Artesian Well.” Specific details have been withheld for commercial confidentiality.
Casa dos Ventos claims it is “committed to transforming Porto do Pecém into a complex of technological innovation and energy transition.”
Projects requiring significant energy, such as data centers, must obtain special permission from the Brazilian government. As of 2024, at least seven of the 21 approvals granted by the Ministry of Mines and Energy were linked to data centers.
Casa dos Ventos is also responsible for another data center project currently under state review in Campo Redondo, Rio Grande do Norte, a region that has experienced drought for 14 out of the last 21 years. During the water crisis in 2022, local governments sought federal aid, and water trucks were dispatched to address the demand.
A similar situation is unfolding in Igaporanga, Bahia, where a Brazilian renewable energy company plans to establish two data centers. The city has been in a state of emergency due to drought conditions for 12 of the years between 2003 and 2022. In 2021, about 5,500 people faced rural water shortages.
Transparency regarding water usage by data centers under construction in these areas is lacking. Companies have not disclosed this information voluntarily, and the government has withheld technical documents for licensing, citing commercial confidentiality.
In early April, the National Electric Power System Operator (ONS) denied requests for access to the grid for Casa dos Ventos, citing concerns for grid stability. Consequently, the Ministry of Mines and Energy requested a recalculation to assess potential grid adjustments.
Big tech firms acknowledge that their water consumption in water-scarce areas has been heightened by AI requirements. In its 2024 sustainability report, Microsoft said that 42% of its water usage originated in regions experiencing water stress. Similarly, Google stated that in the same year, 15% of its water consumption occurred in areas of "high water scarcity."
Data centers utilize a large volume of water to prevent overheating in computers and machines. However, some water may evaporate, potentially exacerbating the local climate crisis where they are located. As AI technologies evolve, the demand for processing power increases, leading to heightened energy and cooling requirements. Consequently, water and energy consumption are projected to rise.
Workers at a Data Center in Porto Alegre, Rio Grande do Sul, Brazil. Such facilities utilize considerable amounts of water for cooling machinery. Photo: Jeff Botega
The International Energy Agency projects that data center energy consumption will double by 2030, reaching 945,000 GWh, roughly Japan's annual energy consumption. Countries like Brazil will account for approximately 5% of this growth within that timeframe.
Water consumption is expected to surge. Researchers from the University of California, Riverside, and the University of Texas at Arlington estimate that global AI demand will require between 4.2 billion and 6.6 billion cubic meters of water by 2027, surpassing half of the UK’s annual water usage.
However, Shaolei Ren, a researcher at UC Riverside and co-author of the study, highlights a crucial distinction between withdrawal (water extracted from the system) and consumption (water evaporated and not returned).
"Residential users return most of the water they withdraw, but data centers often consume between 60% and 80% of theirs," notes Ren, meaning that much of the water is evaporated and lost to the local system.
Data centers can be cooled in two basic ways: with air conditioning, a method widely used across many kinds of facilities, or with water.
The outskirts of Caucaia, where water became unsuitable for urban consumption after reservoirs were depleted in 2019. Photo: Marília Camelo/The Guardian
Water cooling itself takes three forms. Closed-loop systems recycle or reuse the same water, using fans and radiators much like a car engine. Cooling towers expel heat by evaporating part of the heated water and returning the rest, cooled, to the system. A third method mists water into the air, raising humidity and lowering the temperature.
These methods are not without losses, however. “Both evaporation and misting lead to water loss,” says Emilio Franceschini, an associate professor at the Federal University of ABC (UFABC).
A small data center with a capacity of 1MW consumes around 25.5 million liters of water annually, with an estimated 1% (255,000 liters) lost to evaporation.
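Taken at face value, those figures imply a simple rule of thumb, sketched below. The linear scaling to larger facilities is an assumption for illustration, not a measurement; real usage varies with climate, cooling design, and utilization.

```python
# Illustrative arithmetic based on the figures quoted above.
ANNUAL_WATER_L_PER_MW = 25_500_000   # ~25.5 million liters/year for a 1 MW site
EVAPORATION_FRACTION = 0.01          # ~1% lost to evaporation

def water_use(capacity_mw: float) -> tuple[float, float]:
    """Return (total liters cycled per year, liters lost to evaporation)."""
    total = capacity_mw * ANNUAL_WATER_L_PER_MW
    return total, total * EVAPORATION_FRACTION

for mw in (1, 100):
    total, lost = water_use(mw)
    print(f"{mw:>3} MW: {total:>13,.0f} L/year cycled, {lost:>11,.0f} L evaporated")
```

Under that assumption, a hypothetical 100 MW facility would cycle around 2.55 billion liters a year, with some 25.5 million liters evaporating.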
In Pecém, alternatives to extracting water include purchasing desalinated seawater or recycled water from Fortaleza.
It falls upon the state government to grant water concessions to data centers as part of the environmental licensing process.
Ronildo Mastroianni, technical director at Esplar, an NGO with a 50-year presence in Ceará, argues that projects demanding high water consumption in semi-arid areas are misguided. “It’s simply pushing for increased dryness,” he asserts.
Mastroianni cautions that such projects could alter the local hydrological basin, weakening fragile ecosystems such as the Caatinga and heightening food insecurity as rural water grows scarcer. He says representatives of local NGOs and of Quilombola and Indigenous communities were not included in discussions about the projects.
Due to water stress, many communities have constructed reservoirs to secure water supply during drought periods. Photo: Marília Camelo/The Guardian
Other Latin American nations are also seeing the data center industry surge. Chile has launched 22 data centers in the Santiago region alone, and in December the government announced a national plan for 30 additional projects, which is projected to push the country into medium-to-high water stress, meaning decreased water availability, by 2040.
In Chile, both government and corporate projects face escalating opposition. In 2019, Google disclosed plans for a second data center in Santiago, and the activist organization MOSACAT estimated the project would extract 700 million liters of water annually.
Following a wave of protests, a Santiago court reviewed the project, and in early 2024 it halted Google’s environmental assessments pending further evaluation.
Among those campaigning against the project was Tania Rodriguez of MOSACAT. “This has turned into extractivism,” she has said in interviews with other outlets. “We will become everyone’s backyard.”
Globally, the rapid adoption of AI is projected to consume as much energy by the end of the decade as Japan does today, with only about half of that demand expected to be met by renewable sources.
The International Energy Agency (IEA) report finds that by 2030 the United States will consume more electricity for data processing than for manufacturing all energy-intensive goods combined, including steel, cement, and chemicals. Worldwide, electricity demand from data centers is anticipated to more than double by 2030, with AI the key driver of the surge.
A typical data center today consumes as much energy as 100,000 households, but the largest facilities under construction may require up to 20 times more. Despite these demands, the report deems fears that AI adoption will derail efforts to combat climate change “exaggerated,” pointing to AI’s potential to improve energy efficiency and reduce greenhouse gas emissions.
The IEA’s executive director, Fatih Birol, describes AI as one of the most significant technological shifts in the energy sector and underscores the importance of using it responsibly. AI could help optimize energy grids for renewable sources and improve efficiency across energy systems and industrial processes.
Furthermore, AI can facilitate advancements in various sectors like transportation, urban planning, and resource exploration. Despite the energy challenges posed by AI, strategic government intervention is crucial to ensure a sustainable balance between technological growth and environmental preservation.
However, concerns persist regarding the potential negative impacts of AI, such as increased water consumption in arid regions and potential reliance on non-renewable energy sources. To address these challenges, transparent governance and proactive measures are essential to harness the benefits of AI while mitigating its adverse effects.
Numerous companies, including a national leisure center chain, are reassessing or discontinuing the use of facial recognition technology and fingerprint scanning for monitoring employee attendance in response to actions taken by Britain’s data authority.
The Information Commissioner’s Office (ICO) ordered a Serco subsidiary to stop using facial recognition and fingerprint scanning to monitor employee attendance at its leisure centers, and issued stricter guidance on the use of biometric data.
Following an investigation, the ICO found that more than 2,000 employees’ biometric data was unlawfully processed at 38 Serco-managed centers using facial recognition and, in two instances, fingerprint scanning to monitor attendance.
The ICO has given Serco three months to bring its practices into line with the regulations, and the company has committed to full compliance within that timeframe.
Other leisure center operators and businesses are also reevaluating or discontinuing the use of similar biometric technology for employee attendance monitoring in light of the ICO’s actions.
Virgin Active, a leisure club operator, announced the removal of biometric scanners from 32 properties and is exploring alternatives for staff monitoring.
Ian Hogg, CEO of Shopworks, a provider of biometric technology to Serco and other companies, highlighted the ICO’s role in assisting businesses in various industries to meet new standards for biometric authentication.
The ICO’s new standards require organizations to consider less intrusive alternatives before turning to biometrics, prompting companies to reconsider their use of the technology.
1Life, owned by Parkwood Leisure, is in the process of removing the Shopworks system from all sites, clarifying that it was not used for biometric purposes.
Continuing discussions with stakeholders, the ICO aims to guide appropriate use of facial recognition and biometric technology in compliance with regulations and best practices.
The concerns underlying the ICO’s actions underscore the need for stronger regulations to protect employees from invasive workplace surveillance technologies.
The case of an Uber Eats driver facing issues with facial recognition checks highlights ongoing debates about the use of artificial intelligence in employment relationships and the need for transparent consultation processes.
Emphasizing the importance of respecting workers’ rights, the use of artificial intelligence in employment must be carefully regulated to prevent discriminatory practices and ensure fair treatment of employees.
It has been close to two years since the world was first introduced to Sagittarius A* (Sgr A*), the supermassive black hole residing at the center of the Milky Way.
A true behemoth, Sgr A* has a mass equivalent to 4 million Suns and is encircled by hot pockets of swirling gas. Yet it sits about 27,000 light-years from Earth, so in our sky it appears only about as large as a doughnut on the Moon would.
In a recent study published in The Astrophysical Journal Letters, the Event Horizon Telescope (EHT) collaboration captured Sgr A* in polarized light for the first time.
Much as polarized sunglasses filter glare from the Sun, astronomers use polarized light to reveal hidden magnetic fields.
The lines within the image indicate the direction of polarization, which correlates with the structure of the magnetic field surrounding the black hole.
“The spiral pattern swirling around the black hole tells us the magnetic field must also be swirling, indicating a very strong and ordered field,” said Dr Sara Issaoun, an Einstein Fellow in the NASA Hubble Fellowship Program and co-leader of the project, as quoted in BBC Science Focus.
A comparison between the supermassive black holes M87* and Sagittarius A*, depicted in polarized light, reveals similar magnetic field structures, suggesting a universal feature among supermassive black holes. – Image credit: EHT Collaboration
The first-ever image of a black hole, unveiled by the EHT in 2019, showed a far larger black hole at the core of the Messier 87 galaxy (M87*).
M87* is approximately 1,000 times more massive than Sgr A*, so the gas orbiting it moves on far longer timescales, keeping its appearance stable and making it easier to image.
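A back-of-the-envelope scaling (not drawn from the study itself) shows why mass matters here: the dynamical timescale on which orbiting gas changes grows linearly with black hole mass.

```latex
% Dynamical timescale of gas near a black hole of mass M:
%   for Sgr A*  (M ~ 4e6 solar masses),  t_dyn ~ 20 seconds;
%   for M87*    (~1,000x more massive),  t_dyn ~ hours.
\[
  t_{\mathrm{dyn}} \sim \frac{GM}{c^{3}}
  \approx 20\,\mathrm{s} \times \frac{M}{4\times 10^{6}\,M_{\odot}},
  \qquad
  \frac{t_{\mathrm{dyn}}(\mathrm{M87}^{*})}{t_{\mathrm{dyn}}(\mathrm{Sgr\,A}^{*})}
  \approx 10^{3}
\]
```

Because the gas around Sgr A* changes in minutes, faster than a single night of EHT observing, while M87*’s appearance stays essentially static for days, the larger black hole could be imaged first.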
In 2021, astronomers released images of the magnetic field around M87*. Capturing our own supermassive black hole in polarized light took another three years.
Surprisingly, despite the vast difference in size between the two black holes, the new images reveal strikingly similar magnetic field structures, suggesting that strong, ordered magnetic fields are a fundamental feature of supermassive black holes.
Issaoun emphasized: “Sgr A* has a polarization structure remarkably similar to that of the larger, more powerful M87* black hole, underscoring how important a strong, well-ordered magnetic field is to these objects.”
A comparison of the sizes of two black holes imaged by the Event Horizon Telescope (EHT) collaboration: M87* at the core of the galaxy Messier 87 and Sagittarius A* (Sgr A*) at the center of the Milky Way. – Image credit: EHT Collaboration (Acknowledgment: Lia Medeiros, xkcd)
Previous studies of M87* showed that its surrounding magnetic field launches powerful jets of energy and matter extending far beyond the galaxy. Astronomers have imaged M87*’s jet, but a jet around Sgr A* has so far remained elusive; the striking similarities in the new images, however, suggest jets may exist in both.
Issaoun highlighted the stakes: “Jets in the host galaxy can trigger or suppress star formation, so there is a fascinating interplay between the dynamics of the jets launched by these black holes and the evolution of the host galaxy. There is a connection.”
“I believe we can extract valuable insights into our galaxy’s history from this connection.”
When the EHT collaboration released this image in 2022, it was the first visual evidence of the supermassive black hole at the heart of our galaxy, Sagittarius A*. – Image credit: EHT Collaboration
The upgraded EHT is set to observe Sgr A* once more next month, with astronomers hopeful of uncovering concealed jets and other facets of the galaxy’s central region.
Expect further groundbreaking results from the EHT in the years to come, potentially including more striking images and even real-time video.
About our experts
Sara Issaoun is an observational astronomer and member of the Event Horizon Telescope (EHT) collaboration. Her research focuses on aggregating, calibrating, and visualizing millimeter-wave radio observations of supermassive black holes. She co-led the project to produce the new images of Sagittarius A* in polarized light.