If you want evidence of Microsoft’s progress towards its environmental “moonshot” goals, look no further than a construction site on an industrial estate in west London. The company’s Park Royal data center is part of its effort to drive the expansion of artificial intelligence (AI), but that ambition sits uneasily with its goal of becoming carbon negative by 2030. Microsoft says the center will be run entirely on renewable energy, but the construction of the data center and of the servers it will house will contribute to the company’s Scope 3 emissions – the indirect CO2 generated by activities such as producing building materials or by customers using products like the Xbox. Those emissions are roughly 30% higher than in 2020, with the result that the company is exceeding its overall emissions target by about the same margin.
This week, Microsoft co-founder Bill Gates argued that AI can help fight climate change because big tech companies are “seriously willing” to pay extra for clean electricity so they can “say they’re using green energy.” In the short term, however, AI poses a problem for Microsoft’s environmental goals. The company’s outspoken president, Brad Smith, once called its carbon-reduction ambitions a “moonshot.” In May, he stretched that metaphor to its limits, saying the company’s AI strategy had “moved the moon.” Microsoft plans to spend £2.5bn over the next three years to expand its AI data center infrastructure in the UK, and has announced new data center projects around the world this year, including in the US, Japan, Spain, and Germany.
Training and running the AI models behind products like OpenAI’s ChatGPT and Google’s Gemini consumes significant amounts of electricity to power and cool the associated hardware, while manufacturing and transporting that equipment generates further carbon. “This is a technology that will increase energy consumption,” said Alex de Vries, founder of Digiconomist, a website that tracks the environmental impact of new technologies. The International Energy Agency estimates that data centers’ total electricity consumption could double from 2022 levels to 1,000 TWh (terawatt hours) by 2026 – equivalent to Japan’s entire energy demand. According to calculations by research firm SemiAnalysis, AI could drive data centers to use 4.5% of global energy generation by 2030.
AI’s environmental impact has also come into the spotlight, alongside concerns about its effect on jobs and even the long-term future of humanity. Last week, the International Monetary Fund said governments should consider imposing carbon taxes to capture the environmental costs of AI, either through a general carbon tax covering emissions from servers or a specific levy on the CO2 generated by that equipment. The big tech companies at the center of AI – Meta, Google, Amazon, and Microsoft – are seeking out renewable energy sources to meet their climate targets. Amazon, the largest corporate buyer of renewable energy, has purchased more than half the power output of an offshore wind farm in Scotland, while Microsoft announced in May that it would invest $10 billion (£7.9 billion) in renewable energy projects.
Google aims to run its data centers entirely on carbon-free energy by 2030. “We remain steadfast in our commitment to achieving our climate change goals,” a Microsoft spokesperson said. Microsoft co-founder Bill Gates, who left the company in 2020 but retains a stake through his foundation, has argued that AI can directly help combat climate change. He said on Thursday that any increase in electricity demand would be matched by new investment in green generation that would more than offset AI’s usage. A recent UK government-backed report agreed that “the carbon intensity of energy sources” is a key variable in calculating AI-related emissions, but added that “a significant portion of AI training worldwide still relies on high-carbon sources such as coal and natural gas.” The water needed to cool servers is also a concern: one study estimates that AI could account for up to 6.6 billion cubic meters of water use by 2027 – nearly two-thirds of England’s annual consumption.
De Vries argues that the pursuit of computing power will strain demand for renewable energy, leaving fossil fuels to make up the shortfall elsewhere in the global economy. “Increasing energy consumption means there isn’t enough renewable energy to cover that increase,” he says.
Data center server rooms consume large amounts of energy. Photograph: i3D_VR/Getty Images/iStockphoto
NexGen Cloud, a UK provider of sustainable cloud computing – an industry that relies on data centers to deliver IT services such as data storage and computing power over the internet – says AI-related computing could run on renewable energy if data centers were sited away from urban areas and close to hydroelectric or geothermal generation sources, according to co-founder Youlian Tzanev:
“Until now, the industry standard has been to build around economic centers, not renewable energy sources.” That makes it all the harder for AI-focused tech companies to meet their carbon emissions targets. Amazon, the world’s largest cloud computing provider, aims to be net zero – removing as much carbon as it emits – by 2040, and to source 100% of its global electricity from renewable energy by 2025. Google and Meta are targeting net zero by 2030. OpenAI, the developer of ChatGPT, uses Microsoft data centers to train and run its products.
There are two main ways that large language models – the technology behind chatbots like ChatGPT and Gemini – consume energy. The first is the training phase, in which the model is fed huge amounts of data, often scraped from the internet, to build up a statistical understanding of language that ultimately lets it generate plausible answers to queries. The upfront energy cost of training an AI is astronomical, shutting out smaller businesses (and even smaller governments) that can’t afford to spend $100 million on a training run. But that cost pales in comparison with the cost of actually running the resulting models, a process called “inference.” According to Brent Thill, an analyst at investment firm Jefferies, 90% of AI’s energy cost is incurred at the inference stage: the power consumed when you ask an AI system to answer a factual question, summarize a chunk of text, or write an academic paper.
The power used for training and inference is delivered through a vast and growing digital infrastructure. Data centers contain thousands of servers built specifically for AI workloads. A single training server contains a central processing unit (CPU) comparable to those in personal computers, alongside dozens of specialized graphics processing units (GPUs) or tensor processing units (TPUs) – microchips designed to race through the vast volumes of simple calculations that make up AI models. When you use a chatbot and watch it spit out an answer word by word, that output is powered by GPUs consuming about a quarter of the power it takes to boil a kettle. All of this is hosted in a data center, whether owned by the AI provider itself or by a third party. In the latter case, it’s sometimes called “the cloud” – a fancy name for someone else’s computer.
SemiAnalysis estimates that if generative AI were integrated into every Google search, it could consume 29.2 TWh of energy per year – roughly the annual consumption of Ireland – at a cost that would be prohibitive for the company, prompting speculation that Google may start charging for some of its AI tools. But some argue that focusing on AI’s energy overhead is the wrong way to think about it; consider instead the energy the new tools could save. A provocative paper published earlier this year in Scientific Reports, a peer-reviewed Nature journal, argued that AI has a smaller carbon footprint than humans when writing text or creating illustrations. Researchers at the University of California, Irvine estimate that AI systems emit “130 to 1,500 times” less carbon dioxide per page of text than a human writer, and up to 2,900 times less per image. The paper doesn’t say what human authors and illustrators would do instead, of course; redirecting and retraining that workforce for other fields, such as green jobs, could be another moonshot.
Source: www.theguardian.com