AI-Driven Electricity Usage Forecasting Shows Industry is Far from Achieving Net-Zero Goals

Data Center in Ashburn, Virginia

Jim Lo Scalzo/EPA/Shutterstock

As the artificial intelligence sector grows swiftly, concerns about the ecological effects of data centers are increasingly being discussed. New projections indicate that the industry may fall short of achieving net-zero emissions by 2030.

Fengqi You and his colleagues at Cornell University in New York have evaluated the potential energy, water, and carbon consumption of today's leading AI servers by 2030, under various growth scenarios and specific U.S. data center locations. Their analysis integrates anticipated chip production, server energy demands, and cooling efficiency, coupled with state power grid data. While not all AI enterprises have declared net-zero objectives, major tech firms involved in AI, like Google, Microsoft, and Meta, have set targets for 2030.

“The rapid expansion of AI computing is fundamentally altering everything,” says You. “We’re striving to understand the implications of this growth.”

The researchers estimate that establishing AI servers in the U.S. may require between 731 million and 1.125 billion cubic meters of additional water by 2030, along with greenhouse gas emissions ranging from 24 million to 44 million tons of carbon dioxide each year. These estimates hinge on the pace of AI demand growth, the actual number of advanced servers that can be produced, and the sites of new U.S. data centers.

To address these issues, the researchers modeled five scenarios based on varying growth rates and outlined potential measures to minimize the impact. “The top priority is location,” You explains. Situating data centers in Midwestern states with abundant water resources and a significant share of renewable energy in the power grid would mitigate the environmental fallout. The team also emphasizes that transitioning to decarbonized energy sources and improving the efficiency of computing and cooling are essential strategies. Collectively, these three measures could lower industry emissions by 73% and reduce water usage by 86%.
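
The combined effect is easy to make concrete with a little arithmetic. Below is a minimal Python sketch that applies the reported 73% and 86% reductions uniformly across the projected ranges above; this single-multiplier treatment is an illustrative simplification of the paper's scenario analysis.

```python
# A minimal sketch applying the reported reduction percentages (73% for
# emissions, 86% for water) to the projected 2030 ranges quoted above.
# Combining the three measures into one multiplier is a simplification.

emissions_mt = (24, 44)    # million tonnes of CO2 per year
water_mcm = (731, 1125)    # million cubic metres of water per year

for label, (low, high), cut in [
    ("Emissions (Mt CO2/yr)", emissions_mt, 0.73),
    ("Water (million m^3/yr)", water_mcm, 0.86),
]:
    print(f"{label}: {low}-{high} -> "
          f"{low * (1 - cut):.0f}-{high * (1 - cut):.0f} after a {cut:.0%} cut")
```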

However, public resistance may disrupt these predictions, particularly over the environmental ramifications of new data centers. In Virginia, home to an eighth of the world's data centers, residents have voiced opposition to upcoming construction plans, citing concerns over water resources and broader environmental impacts. Similar petitions against data centers have arisen in Pennsylvania, Texas, Arizona, California, and Oregon. According to Data Center Watch, a firm that monitors data center developments, local opposition is stalling approximately $64 billion worth of projects. And even when individual projects are blocked, questions remain about the power and water consumption of those that proceed.

This new research is viewed cautiously by those analyzing and quantifying AI’s environmental effects. “The AI field evolves so quickly that making accurate future predictions is incredibly challenging,” says Sasha Luccioni from the AI company Hugging Face. “As mentioned by the authors, breakthroughs in the industry can radically alter computing and energy needs, reminiscent of DeepSeek’s innovative techniques that reduced reliance on brute-force calculations.”

Chris Preist at the University of Bristol in the UK concurs, highlighting the need for greater investment in renewable energy infrastructure and the importance of data center placement. “I believe their projections for water usage in direct cooling of AI data centers are rather pessimistic,” he remarks, suggesting that the model's “best case” scenario aligns more closely with “business as usual” for contemporary data centers.

Luccioni believes the paper underscores a vital missing element in the AI ecosystem: “greater transparency.” She notes that this issue can be addressed by “mandating model developers to track and disclose their computing and energy consumption, share this information with users and policymakers, and commit to reducing overall environmental impacts, including emissions.”

Source: www.newscientist.com

British AI Startup Outperforms Humans in Global Forecasting Competition

The artificial intelligence system outperformed numerous forecasting enthusiasts, including a number of experts, in a competition focused on predicting events ranging from the falling-out between Donald Trump and Elon Musk to whether Kemi Badenoch would be ousted as Conservative leader.

The UK-based AI startup, founded by former Google DeepMind researchers, placed in the top 10 of an international forecasting competition in which participants predicted the probabilities of 60 events occurring over the summer.

Mantic secured eighth place in the Metaculus Cup, run by a San Francisco-based forecasting company whose predictions are used by investment funds and corporations.

While AI performance still lags behind the top human predictors, some contend that it could surpass human capabilities sooner than anticipated.

“It feels odd to be outperformed by a few bots at this stage,” remarked Ben Sindel, one of the professional forecasters who finished behind Mantic in the competition. “We’ve made significant progress compared to a year ago, when the best bots were ranked around 300th.”

The Metaculus Cup included questions such as which party would win the most seats in the Samoan general election, and how many acres of the US would be affected by fires from January to August. Contestants were graded on their predictions as of September 1st.
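
The article doesn't specify the scoring rule used, and Metaculus's actual scoring may differ, but probabilistic forecasts are commonly graded with metrics like the Brier score. A minimal sketch, with entirely hypothetical forecasts and outcomes:

```python
# Illustrative only: the Brier score, a standard accuracy metric for
# probabilistic forecasts (lower is better). Metaculus's exact scoring
# rule isn't described in the article and may differ.

def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and binary
    outcomes (1 = the event happened, 0 = it didn't)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecasts for three yes/no questions:
probs = [0.8, 0.3, 0.6]
actual = [1, 0, 0]
print(brier_score(probs, actual))  # (0.04 + 0.09 + 0.36) / 3 ≈ 0.163
```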

“What Mantic achieved is remarkable,” stated Deger Turan, CEO of Metaculus.

Turan estimated that AI could match or even surpass the top human predictors by 2029, while acknowledging that “human predictors currently outshine AI predictors.”

In complex predictions that hinge on interrelated events, AI systems tend to struggle with logical validation checks when turning their knowledge into final forecasts.

Mantic dissects each prediction challenge into distinct tasks and assigns them, according to their strengths, to machine learning models from the likes of OpenAI, Google, and DeepSeek.
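
Mantic's architecture isn't public beyond this description, but the decompose-and-route pattern it describes can be sketched. In the sketch below, every name, routing rule, and prompt template is invented for illustration:

```python
# Hypothetical sketch of the decompose-and-route pattern the article
# describes: split a forecasting question into subtasks and send each to
# whichever model handles it best. Mantic's actual system is not public;
# all identifiers below are invented.
from dataclasses import dataclass

@dataclass
class Subtask:
    kind: str     # e.g. "research", "simulate", "aggregate"
    prompt: str

# Invented routing table: task kind -> model identifier.
ROUTING = {
    "research": "model-a",    # stand-ins for models from different vendors
    "simulate": "model-b",
    "aggregate": "model-c",
}

def decompose(question: str) -> list[Subtask]:
    """Split a question into subtasks (here, a fixed template)."""
    return [
        Subtask("research", f"Gather recent news relevant to: {question}"),
        Subtask("simulate", f"Enumerate scenarios and base rates for: {question}"),
        Subtask("aggregate", f"Combine the findings into one probability for: {question}"),
    ]

for task in decompose("Will event X occur before September 1st?"):
    print(f"{task.kind} -> {ROUTING[task.kind]}")
```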

Co-founder Toby Shevlane said the achievement marks a significant milestone for the AI community's use of large language models for predictive analytics.

“Some argue that LLMs merely replicate their training data, but these futures cannot be found in training data,” he noted. “They require genuine inference. We can assert that our system's forecasts are more original than those of most human contenders, as individuals often anchor on the average community prediction, while AI systems frequently diverge from those averages.”

Mantic’s systems deploy a range of AI agents to evaluate current events, conduct historical analyses, simulate scenarios, and make future predictions. The strength of AI prediction lies in its capacity for hard work and endurance, vital for effective forecasting.

AI can simultaneously tackle numerous complex challenges, revisiting each daily to adapt based on evolving information. Human predictors also leverage intuition, but Sindel suggests this may emerge in AI as well.

“Intuition is crucial, but I don’t think it’s inherently human,” he commented.

Top-tier human superforecasters still assert their superiority. Philip Tetlock, co-author of the bestseller Superforecasting, recently published research indicating that, on average, the best human experts continue to outperform the best bots.

Turan reiterated that AI systems face challenges in complex predictions involving interdependent events, struggling to identify logical inconsistencies in output during validation checks.

“We’ve witnessed substantial effort and investment,” remarked Warren Hatch, CEO of Good Judgment, a forecasting firm co-founded by Tetlock. “We anticipate AI excelling in specific question categories, such as monthly inflation.”

Or, as Lubos Saloky, the human forecaster who placed third in the Metaculus Cup, put it: “I’m not retiring. If you can’t beat them, I’ll collaborate with them.”

Source: www.theguardian.com

White House Funding Cuts Endanger AI Weather Forecasting Institute

Funding for a $20 million artificial intelligence lab aimed at enhancing weather forecasts has been halted by the Trump administration. This decision threatens both the pipeline of scientists and the nation’s capability to evaluate the effects of hurricanes and other weather-related disasters.

According to Amy McGovern, director of AI2ES (the AI Institute for Research on Trustworthy AI in Weather, Climate, and Coastal Oceanography), the National Science Foundation (NSF) informed the institute last month that it would not renew its five-year grant.

McGovern, who serves as a professor of meteorology and computer science at the University of Oklahoma, said that without private funding, the institute may have to close its doors next year.

AI2ES collaborates with various universities to integrate AI into weather forecasting while evaluating its reliability.

The move to shut down AI2ES comes as the Trump administration invests heavily in AI and accelerates the establishment of data centers. The administration's own AI plan advocates developing AI systems, fostering AI vocational training programs, and building specialized AI labs across various scientific fields.

In July, the administration unveiled an ambitious plan to achieve “global dominance” in artificial intelligence, emphasizing both innovation and its implementation—key areas of focus for AI2ES.

Alan Gerard, a former director at the National Severe Storms Laboratory, part of the National Oceanic and Atmospheric Administration, described the cut as dissonant with this push toward advancing the technology.

The White House has not responded to requests for comments regarding this matter.

The institute was established in 2020, during the first Trump administration, as one of the NSF's AI research institutes, and has received around $20 million in funding over the past five years. An NSF spokesperson, Michael England, stated that the agency holds the AI institute's groundbreaking work in high regard.

“The National Science Foundation is fully committed to advancing artificial intelligence research through the National AI Research Institutes Program, a pivotal aspect of the administration's strategy to reinforce the US's leadership in transformative AI,” England said.

NSF and its collaborating partners have funded a network of 29 AI institutes. AI2ES was one of five institutes up for renewal through the NSF this year; three have received renewals, while the status of the fourth remains pending, according to McGovern.

The Trump administration has proposed a 55% budget cut for the NSF, but Congress has not yet passed a budget. Senate and House appropriations bills have diverged from the administration's proposal, suggesting smaller cuts to scientific institutions like the NSF.

“We were an AI lab, so we believed we were secure, given our alignment with the president’s priorities,” McGovern noted.

The Trump administration’s AI plan aims for NSF and other organizations to expose K-12 students to AI careers, develop industry-driven training programs to generate AI jobs, and bolster workforce initiatives to enhance the nation’s AI talent pool.

“They want a more robust AI-trained workforce, and we were doing a significant amount of that work,” McGovern emphasized.

She expressed concern that private AI firms are “poaching talent constantly,” as the institute funds around 70 positions each year at various universities, creating a talent pipeline. Among the institute’s achievements are over 130 academic publications and the development of AI tools used by the government today.

The center helped create AI tools that predict the weather events that endanger sea turtles near Corpus Christi, Texas, by stunning the animals and leaving them vulnerable to hazards such as boat strikes.

Additionally, the institute developed an application that lets forecasters “see” inside hurricanes even without a polar-orbiting satellite equipped with a microwave sensor capable of penetrating storm clouds. The application takes data from satellites whose sensors cannot penetrate clouds and simulates the internal structure of a hurricane.

The center is also investigating how forecasters evaluate the reliability of AI tools developed by private companies, including Google.

“We have social scientists who engage with end-users to comprehend their trust in AI, their reservations, and what improvements are necessary,” remarked McGovern.

According to Gerard, if the center were to shut down, it wouldn't immediately degrade current weather forecasting, but it could limit innovation and place the nation at a disadvantage.

“Many other countries, like China, are heavily investing in AI-related weather research. The US risks falling behind many nations committed to enhancing weather forecasting,” Gerard concluded.

Source: www.nbcnews.com

Reported advancements in AI-driven weather forecasting

A new AI approach to weather forecasting lets a single researcher working on a desktop computer deliver precise forecasts significantly faster, and with much less computing power, than traditional systems.

Traditional weather forecasting methods involve multiple time-consuming stages that rely on supercomputers and teams of experts. Aardvark Weather offers a more efficient solution by training AI on raw data collected from various sources worldwide.

This innovative approach, detailed in a publication by researchers from the University of Cambridge, Alan Turing Institute, Microsoft Research, and ECMWF, holds the potential to enhance forecast speed, accuracy, and cost-effectiveness.

Richard Turner, a machine learning professor at Cambridge University, envisions the use of this technology for creating tailored forecasts for specific industries and regions, such as predicting agricultural conditions in Africa or wind speeds for European renewable energy companies.

Members of the New South Wales State Emergency Service view a satellite image of Tropical Cyclone Alfred in Sydney, Australia, on 5 March 2025. Photograph: Bianca de Marchi/Reuters

Unlike traditional forecasting methods that rely on extensive manual work and lengthy processing times, this new approach streamlines the prediction process, offering potentially more accurate and extended forecasts.

According to Dr. Scott Hosking from the Alan Turing Institute, this breakthrough can democratize weather forecasting by making advanced technologies accessible to developing countries and aiding decision-makers, emergency planners, and industries that rely on precise weather information.

Dr. Anna Allen, the lead author of the Cambridge University research, believes that these findings could revolutionize predictions for various climate-related events like hurricanes, wildfires, and air quality.

Drawing on recent advancements by tech giants like Huawei, Google, and Microsoft, Aardvark aims to revolutionize weather forecasting by leveraging AI to accelerate predictions. The system has already shown promising results, outperforming existing forecast models in certain aspects.

Source: www.theguardian.com

How NOAA's staffing cuts could undermine the reliability of weather forecasting

A devastating tornado near Minden, Iowa in April 2024

Jonah Lange/Getty Images

A wide range of firings and staffing changes at the National Oceanic and Atmospheric Administration (NOAA) could reduce the reliability of the country's weather forecasts, according to several researchers and the American Meteorological Society.

“The consequences for Americans will be vastly broad, including increasing vulnerability to dangerous weather,” the organization said in a statement.

More than 880 NOAA employees have been fired under President Donald Trump's administration, according to a statement from US Senator Maria Cantwell. They include researchers working to improve hurricane predictions and build next-generation weather models, as well as more than 200 people within the National Weather Service, which is part of NOAA. According to two former NOAA employees, another 500 people accepted the administration's earlier “Fork in the Road” resignation offer, and further cuts at the agency are expected.

A NOAA spokesperson declined to discuss the firings and staffing changes, saying the agency will “continue to provide weather information, forecasts and warnings based on our public safety mission.” However, external researchers and former NOAA employees say the cuts could reduce the quality of the agency's weather forecasts.

The changes have “a clear cascade effect that affects predictions, even what people are watching on the phone via third parties,” says Kari Bowen at the University of Colorado Boulder.

The cuts could quickly affect alerts about extreme weather like tornadoes and hurricanes, and in the long run could make general weather reports less accurate, since even commercial weather apps rely on modeling from NOAA. Below are four ways experts say the firings and resignations could affect weather forecasts.

Delayed Tornado Warning

The National Weather Service operates a network of 122 weather forecasting offices nationwide. At least 16 offices in the tornado-prone central part of the country are currently understaffed, says William Gallus at Iowa State University. A former NOAA employee said that more than a dozen offices in the central region have lost their head meteorologists to resignation, just as the region's severe weather season begins.

Nearby offices may be able to help understaffed sites track tornadoes and issue alerts, but the confusion can lead to delays. “There's a good chance there will be a lot of mistakes,” Gallus says.

Such delays were evident last year when a tornado forced the evacuation of a local forecast office in Iowa, Gallus says. An adjacent office stepped in to track the storm, but amid the chaos, some residents received only five minutes of warning that the tornado was heading their way, rather than the minimum 15 minutes that forecasters aim to provide. In an emergency, that lost time can make the difference in whether people reach safety.

Not knowing when a hurricane will suddenly strengthen

Some of the employees fired from NOAA were working to improve hurricane forecasts, in particular estimating when a storm will rapidly intensify. Rapid strengthening makes hurricanes even more dangerous by reducing the time people have to prepare, yet these events are notoriously difficult to predict.

Hurricane modelers at NOAA and other agencies have made great strides in predicting rapid strengthening in recent years, says Brian Tang at the University at Albany in New York, thanks to improved modeling, data collection, and data integration efforts by NOAA researchers. Now, the personnel cuts “destabilize the entire process of improving hurricane track and intensity prediction,” he says.

“The improvements to hurricane forecasts we have come to expect over the past 30 years will arrive more slowly,” says Andy Hazelton, who worked on improving NOAA's hurricane forecasts before being fired from his position at the agency's Environmental Modeling Center last week. He says several people have been fired from the “Hurricane Hunters,” who fly planes into storms to collect data, including two flight supervisors.

Unreliable weather data

Accurate weather forecasts rely on a continuous stream of information about real-time conditions around the world, collected from marine buoys, satellites, radars, and other sensors. These data are then fed into the global weather models that underlie both public and private forecasts. Much of the world's data and modeling is provided by NOAA.

Staff reductions could impair these critical data collection efforts and reduce the quality of forecasts. Indeed, some local weather forecast offices have already suspended regular balloon launches due to a lack of staff.

“All of these observation networks are maintained and run by people,” says Emily Becker at the University of Miami in Florida. “And we've already lost a lot of people from those teams. That's going to be a cumulative effect.”

Improvements to future weather forecasts have stopped

At least eight people, a quarter of its staff, were fired from the Environmental Modeling Center, which is responsible for verifying weather data and integrating it into the models that underlie more or less all predictions, says Hazelton, from “What is the temperature this weekend?” to “Are there any tornadoes coming?”

The staffing cuts at the Environmental Modeling Center will slow research to improve current global weather models, he says. Additionally, 10 people have been fired from the Geophysical Fluid Dynamics Laboratory, where researchers were building next-generation global weather and climate models.

Such reductions are “very harmful” to efforts to make forecasts more reliable, Gallus says. Almost every improvement in forecasts over the past decades has depended on improvements in modeling, he says. “If we're losing a ton of researchers working on them, you're basically saying my predictions will never get better.”

Source: www.newscientist.com

Quantum-inspired algorithm improves weather forecasting.

It is essential for weather forecasts to accurately simulate the turbulent air flow.

EUMETSAT/ESA

A quantum-inspired algorithm can simulate turbulent fluid flow on a classical computer much faster than existing tools, reducing calculations that would take days on a large supercomputer to a job for an ordinary laptop. Researchers say it could improve weather forecasting and industrial processes.

Turbulence in liquids or air involves so many interactions, and becomes so complicated so quickly, that even the most powerful computers cannot simulate it accurately. Quantum computers promise to ease the problem, but today even the most advanced machines can manage little beyond rudimentary demonstrations.

Turbulence simulations can be simplified by replacing exact calculations with probability distributions. Even with this approximation, however, the calculations remain dauntingly demanding to solve.

Nikita Gourianov at the University of Oxford and his colleagues have now developed a new approach that represents the flow's probability distribution using quantum-inspired algorithms called tensor networks.

Tensor networks originated in physics and came into common use in the early 2000s. They now offer a promising way to wring far more performance out of existing classical computers before truly practical quantum machines become available.

“The algorithms and ideas come from the world of quantum simulation; these algorithms are very close to those of a quantum computer,” says Gourianov. “Both in theory and in practice, we see a very dramatic speed-up.”

In just a few hours on a laptop, the team ran a simulation that previously took several days on a supercomputer. The new algorithm cut processor demands roughly 1,000-fold and memory demands roughly a million-fold. The simulation was only a test, but the same type of problem lies behind weather forecasting, aircraft analysis, and industrial chemistry.

Turbulence problems involving five-dimensional data are very difficult to handle without tensor methods, says Gunnar Möller at the University of Kent. “It's a computational nightmare,” he says. “If you have a supercomputer and are happy to run it for one to two months, you can do it in limited cases.”

Tensor networks work by reducing the amount of data required for a simulation, greatly cutting the computing capacity needed to run it. The amount and nature of the discarded data can be carefully controlled by dialing the accuracy level up or down.
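
One way to see the underlying idea is with a truncated singular value decomposition, which likewise compresses a field by discarding its smallest components, with the retained rank acting as the accuracy dial. This is a toy analogue of controlled lossy compression, not the algorithm from the paper:

```python
# Toy analogue of tensor-network compression: a truncated SVD keeps only
# the largest singular values of a 2D field, with `rank` acting as the
# accuracy dial. The paper's actual algorithm is far more elaborate.
import numpy as np

rng = np.random.default_rng(0)
# A smooth synthetic "flow field": a few spatial modes plus a little noise.
x = np.linspace(0, 2 * np.pi, 200)
field = np.outer(np.sin(x), np.cos(x)) + 0.5 * np.outer(np.sin(2 * x), np.sin(3 * x))
field += 0.01 * rng.standard_normal(field.shape)

U, s, Vt = np.linalg.svd(field, full_matrices=False)
for rank in (1, 2, 5, 20):
    approx = U[:, :rank] * s[:rank] @ Vt[:rank]        # low-rank reconstruction
    err = np.linalg.norm(field - approx) / np.linalg.norm(field)
    stored = rank * (field.shape[0] + field.shape[1])  # numbers actually kept
    print(f"rank {rank:2d}: relative error {err:.3f}, "
          f"storing {stored} of {field.size} values")
```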

These mathematical tools already feature in the cat-and-mouse game between quantum computer developers and classical computer scientists. In 2019, Google announced that its quantum processor, called Sycamore, had achieved “quantum supremacy”: the point at which a quantum computer completes a task that is, for all intents and purposes, impossible for ordinary computers.

However, a tensor network simulating the same problem on a large cluster of conventional graphics processing units later completed the task in about 14 seconds, undercutting that claim. Since then, Google has pulled ahead again with its new Willow quantum machine.

Once large-scale, fault-tolerant quantum computers are built, they will be able to run tensor algorithms at a far larger scale than classical computers can, but Möller is excited about what might be achieved in the meantime.

“The authors of this paper can do on a laptop what you would otherwise need a supercomputer for,” he says. “There are big gains to be had right away, well before we have a perfect quantum computer.”

Source: www.newscientist.com

New Google AI technology significantly decreases computing power required for weather forecasting

AI could help us predict the weather more accurately

LaniMiro Lotufo Neto/Alamy

Google researchers have developed an artificial intelligence that they say can predict weather and climate patterns as accurately as current physical models, but with less computing power.

Existing forecasts are based on mathematical models run by extremely powerful supercomputers that deterministically predict what will happen in the future. Since they were first used in the 1950s, these models have become increasingly detailed and require more and more computer power.

Several projects aim to replace these computationally intensive tasks with much less demanding AI, including a DeepMind tool that forecasts localized rainfall over short periods of time. But like most AI models, the problem is that they are “black boxes” whose inner workings are mysterious and whose methods can’t be explained or replicated. And meteorologists say that if these models are trained on historical data, they will have a hard time predicting unprecedented events now being caused by climate change.

Now, Dmitry Kochkov and his colleagues at Google Research in California have created a model called NeuralGCM that balances the two approaches.

Typical climate models divide the Earth's surface into a grid of cells up to 100 kilometers in size. Due to limitations in computing power, simulating at high resolution is impractical. Phenomena such as clouds, turbulence, and convection within these cells are only approximated by computer codes that are continually adjusted to more closely match observed data. This approach, called parameterization, aims to at least partially capture small-scale phenomena that are not captured by broader physical models.
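
NeuralGCM's actual code is far more elaborate, but the hybrid idea can be sketched in a few lines: advance a coarse grid with cheap resolved physics, then add a correction term that, in the real system, is a trained neural network. The correction below is a hand-written stand-in, purely to show where the learned component plugs in:

```python
# Conceptual sketch of hybrid modelling in NeuralGCM's spirit: a coarse
# physics step plus a learned correction for sub-grid processes. The
# "learned" correction here is a hand-written stand-in, not a trained net.
import numpy as np

def physics_step(temp: np.ndarray, dt: float = 1.0) -> np.ndarray:
    """Crude diffusion on a 1D ring of grid cells (the resolved physics)."""
    return temp + dt * 0.1 * (np.roll(temp, 1) - 2 * temp + np.roll(temp, -1))

def learned_correction(temp: np.ndarray) -> np.ndarray:
    """Stand-in for the neural network that captures sub-grid effects
    (clouds, turbulence, convection) the coarse grid can't resolve."""
    return -0.01 * (temp - temp.mean())  # toy relaxation toward the mean

temp = np.array([15.0, 18.0, 22.0, 19.0, 14.0, 12.0])  # one value per cell
for _ in range(10):
    temp = physics_step(temp) + learned_correction(temp)
print(np.round(temp, 2))
```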

NeuralGCM has been trained to take over this small-scale approximation, making it less computationally intensive and more accurate. In the paper, the researchers say their model can process 70,000 days of simulation in 24 hours using a single chip called a tensor processing unit (TPU). By comparison, a competing model called X-SHiELD requires a supercomputer with thousands of processing units to get through just 19 days of simulation.

The paper also claims that NeuralGCM's predictions are comparable to or better than those of best-in-class models. Google did not respond to New Scientist's request for an interview.

Tim Palmer at the University of Oxford says the work is an interesting attempt to find a third way between pure physics and opaque AI approximations: “I'm uncomfortable with the idea of completely abandoning the equations of motion and moving to AI systems that even experts say they don't fully understand,” he says.

This hybrid approach is likely to spur further discussion and research in the modeling community, but time will tell whether it will be adopted by modeling engineers around the world, he says. “It's a good step in the right direction and the type of research we should be doing. It's great to see different alternatives being explored.”

Source: www.newscientist.com

The Newest Developments in Science and Technology: Forecasting 2023

Ageotype

In 2020, Michael Snyder, a geneticist at Stanford University in California, discovered that we tend to age along four different pathways. The biological characteristics associated with aging cluster mainly in four parts of the body (kidneys, liver, immune system, and general metabolism), he found, and in most people one or two of these systems age faster than the rest.

Snyder believes that understanding your “ageotype” can guide you to optimal strategies that target your key aging pathways, helping you live a longer, healthier life. People with a liver ageotype may consider quitting drinking, he said, while people with a metabolic ageotype should focus on exercise.

In any case, one might expect the term to become popular, at least in circles obsessed with longevity, because it pioneers efforts to personalize anti-aging interventions.

Agrivoltaics

Next time you’re walking through the countryside, you might come across a field that looks a bit unusual. Some areas may grow crops that coexist with large areas of solar panels, while others may have livestock sheltering or grazing under solar canopies. What you are looking at is “agrivoltaics”. This is the term used to describe solar energy facilities designed to work with crops and livestock.

Inevitably, some argue that solar power degrades the landscape and changes the nature of rural areas. But in North America, agrivoltaic proponents are working to convince people that solar power can help restore disappearing grasslands. In any case, the term is sure to stick around because it captures something new…

Source: www.newscientist.com