How Gigafactories Will Revolutionize Energy: The Century’s Best Idea


Batteries and solar energy technologies have been evolving for well over a century, but they reached a pivotal moment in 2016. That year marked the launch of the first Gigafactory in Nevada, which produces cutting-edge battery technologies, electric motors, and solar cells at scale. The term ‘Gigafactory’ signals the vastness of its production capacity.

The renewable energy potential—including solar, wind, and hydropower—is staggering. In merely a few days, the sun provides more energy to Earth than we can harvest from all fossil fuel reserves combined.

Efficiently harnessing this power remains a challenge. The photovoltaic effect, discovered by Edmond Becquerel in 1839, allows light to generate an electric current. Although the first practical solar panels emerged in the 1950s, only in the 2010s did solar technology advance enough to rival fossil fuels. Meanwhile, lithium-ion batteries, invented in the 1980s, have provided reliable energy storage.

The Gigafactory has been instrumental in advancing these solar and battery technologies, not through new inventions but by integrating all components of electric vehicle production under one roof. The approach echoes Henry Ford’s assembly-line legacy, aiming to populate the world with Teslas instead of fossil fuel-burning vehicles. “Batteries have made it possible to utilize solar power efficiently, and electric vehicles are now a reality,” says Dave Jones from Ember, a British energy think tank.

The economies of scale introduced by gigafactories have extended their impact beyond electric vehicles. “These batteries will enable a host of innovations: smartphones, laptops, and the capacity to transport energy efficiently at lower costs,” remarks Sarah Hastings-Simon from the University of Calgary, Canada.

Due to recent advancements, the costs of these technologies have plummeted, and many experts believe the electrification of energy systems is now inevitable. In states like California and countries such as Australia, solar energy is at times so abundant that grid operators offer electricity at no cost. Battery technology is also improving rapidly, enabling the development of electric planes, ships, and long-haul trucks and loosening our reliance on the fossil fuels that have dominated energy systems for centuries.

Topics:

  • Electric Cars
  • Renewable Energy

Source: www.newscientist.com

Revolutionary Solution for Cosmic Acceleration: Overcoming Dark Energy Challenges

Researchers from the Center for Applied Space Technology and Microgravity at the University of Bremen and Transilvania University of Brașov have unveiled a theoretical framework that challenges our understanding of the universe’s accelerating expansion, potentially rendering dark energy obsolete. They suggest that this acceleration may be an intrinsic characteristic of space-time geometry, rather than the result of an unknown cosmic force.

This artist’s impression traces the evolution of the universe from the Big Bang, through the formation of the Cosmic Microwave Background, to the emergence of galaxies. Image credit: M. Weiss / Harvard-Smithsonian Center for Astrophysics.

For over 25 years, scientists have been puzzled by the unexpected observation that the expansion of the universe is accelerating, counter to the expectation that gravity should slow it down.

In the 1990s, astronomers identified this acceleration through observations of distant Type Ia supernovae, leading to the prevalent theory of dark energy, an invisible force believed to drive this expansion.

Nevertheless, the actual nature of dark energy remains elusive within the Standard Model of cosmology.

Dr. Christian Pfeiffer and his team propose that we may better understand this cosmic acceleration by re-evaluating the geometric framework used to describe gravity.

Central to modern cosmology is Einstein’s theory of general relativity, which details how matter and energy shape space-time.

The universe’s evolution is modeled using the Friedmann equation, which derives from Einstein’s field equations.
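For context, the standard Friedmann equation of general relativity (the unmodified form, not the Finsler-generalized equation the paper derives) relates the expansion rate to the universe’s contents:

```latex
% Standard Friedmann equation: H is the Hubble parameter, a the scale
% factor, \rho the energy density, k the spatial curvature, and \Lambda
% the cosmological constant that plays the role of dark energy.
H^2 \equiv \left(\frac{\dot{a}}{a}\right)^2
  = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}
```

In the Standard Model of cosmology, the observed acceleration is carried by the Λ term; the Finsler-gravity result reportedly recovers acceleration even with Λ set to zero.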

The researchers introduce an innovative solution based on Finsler gravity, an extension of Einstein’s theory.

This approach generalizes the description of space-time geometry and allows for a more nuanced exploration of how matter, especially gases, interacts with gravity.

Unlike general relativity, which depends on rigid geometric forms, Finsler gravity presents a more versatile space-time geometry.

With this methodology, the authors recalibrated the equations governing cosmic expansion.

Informed by the Finsler framework, the modified Friedmann equation predicts the universe’s accelerated expansion without necessitating the introduction of dark energy.

In essence, the accelerating expansion emerges directly from the geometry of space-time itself.

“This is a promising hint that we may explain the universe’s accelerating expansion partly without dark energy, drawing from generalized space-time geometry,” Pfeiffer remarked.

This concept does not entirely dismiss dark energy or invalidate the Standard Model.

Instead, it implies that some effects attributed to dark energy might have their roots in a deeper understanding of gravity.

“This fresh geometric outlook on the dark energy dilemma provides avenues for a richer comprehension of the universe’s foundational laws,” stated Dr. Pfeiffer.

The research team’s paper is published in the Journal of Cosmology and Astroparticle Physics.

_____

Christian Pfeiffer et al. 2025. From a moving gas to an exponentially expanding universe, the Finsler-Friedman equation. JCAP 10:050; DOI: 10.1088/1475-7516/2025/10/050

Source: www.sci.news

New Study Unveils Breakthrough Approach for Alzheimer’s Disease Recovery by Targeting Cellular Energy Deficits

Alzheimer’s Disease (AD) has long been deemed irreversible. However, a groundbreaking study by scientists from Case Western Reserve University, University Hospitals, and the Louis Stokes Cleveland VA Medical Center suggests that advanced Alzheimer’s disease may be reversible with treatment. Through extensive research on both preclinical mouse models and human brain samples, the team discovered that the brain’s failure to maintain normal levels of nicotinamide adenine dinucleotide (NAD+), the crucial energy molecule of cells, significantly contributes to the onset of Alzheimer’s. Furthermore, sustaining an appropriate NAD+ balance may not only prevent but also reverse the progression of the disease.



Alzheimer’s disease severity correlates with NAD+ homeostatic dysregulation. Image credit: Chaubey et al., doi: 10.1016/j.xcrm.2025.102535.

Historically, Alzheimer’s disease, the primary cause of dementia, has been regarded as irreversible since its identification over a century ago, and it is expected to impact more than 150 million individuals globally by 2050.

Current therapies focused on amyloid beta (Aβ) and clinical symptoms offer limited benefits, underscoring the urgent need for complementary and alternative treatment options.

Intriguingly, individuals with autosomal dominant AD mutations can remain symptom-free for decades, while others without Alzheimer’s neuropathology maintain cognitive function despite having numerous amyloid plaques.

These insights point to intrinsic brain resilience mechanisms that may slow or halt disease progression, suggesting that enhancing such processes could promote recovery from Alzheimer’s disease.

NAD+ homeostasis plays a pivotal role in cellular resilience against oxidative stress, DNA damage, neuroinflammation, blood-brain barrier degradation, impaired hippocampal neurogenesis, deficits in synaptic plasticity, and overall neurodegeneration.

In a recent study, Professor Andrew Pieper and his team from Case Western Reserve University discovered that NAD+ levels decrease significantly in the brains of Alzheimer’s patients, a trend also observed in mouse models.

While Alzheimer’s disease is unique to humans, it can be effectively modeled using genetically engineered mice that carry mutations linked to human Alzheimer’s disease.

The researchers utilized two distinct mouse models: one with multiple human mutations affecting amyloid processing and another with a human mutation in the tau protein.

Both models exhibited Alzheimer’s-like brain pathology, including blood-brain barrier degradation, axonal degeneration, neuroinflammation, impaired hippocampal neurogenesis, diminished synaptic transmission, and excessive oxidative damage.

They also developed cognitive impairments typical of Alzheimer’s patients.

Upon discovering the sharp decline in NAD+ levels in both humans and mice with Alzheimer’s, the scientists investigated whether preserving NAD+ levels before disease onset and restoring them after significant disease progression could prevent or reverse Alzheimer’s.

This research builds upon prior work showing potential recovery by restoring NAD+ balance following severe brain injuries.

The team achieved NAD+ balance restoration using a well-known pharmacological agent, P7C3-A20.

Remarkably, maintaining NAD+ balance not only shielded mice from developing Alzheimer’s but also enabled brain recovery from key pathological changes even when treatment was delayed in advanced disease stages.

Subsequently, both mouse strains fully regained cognitive function, accompanied by normalized levels of phosphorylated tau-217, a recently recognized clinical biomarker for Alzheimer’s disease in humans and a potential readout for future Alzheimer’s reversal trials.

“We are excited and hopeful about these results,” said Professor Pieper.

“Restoring brain energy balance led to both pathological and functional recovery in mice with advanced Alzheimer’s disease.”

“Observing this effect across two different animal models, driven by distinct genetic causes, reinforces the notion that recovery from progressive Alzheimer’s disease may be achievable through the restoration of brain NAD+ balance.”

These findings encourage a shift in how researchers, clinicians, and patients perceive treatment options for Alzheimer’s disease moving forward.

“The key takeaway is one of hope. Alzheimer’s disease effects may not necessarily be permanent,” noted Professor Pieper.

“Under certain conditions, the damaged brain can self-repair and regain functionality.”

“Through our research, we not only demonstrated a drug-based method for promoting recovery in animal models but also identified candidate proteins in human AD brains that may aid in reversing the disease,” remarked Dr. Kalyani Chaubey, a researcher at Case Western Reserve University and University Hospitals.

While current commercially available NAD+ precursors have been shown to elevate cellular NAD+ to unsafe levels—potentially promoting cancer—the pharmacological approach of this study employs P7C3-A20, which allows cells to maintain optimal NAD+ levels under stress without elevating them excessively.

“This is a crucial consideration for patient care, and clinicians should explore therapeutic strategies aimed at restoring the brain’s energy balance as a viable path toward disease recovery,” Professor Pieper concluded.

For more detailed information, see the study findings published in Cell Reports Medicine.

_____

Kalyani Chaubey et al. Pharmacological reversal of advanced Alzheimer’s disease in mice and identification of potential therapeutic nodes in the human brain. Cell Reports Medicine, published online on December 22, 2025. doi: 10.1016/j.xcrm.2025.102535

Source: www.sci.news

Do Data Centers’ High Energy Demands Threaten Australia’s Net Zero Goals?

The demand for electricity by data centers in Australia could triple over the next five years, with projections indicating it may surpass the energy consumed by electric vehicles by 2030.

Currently, data centers account for approximately 2% of electricity demand on the national grid, equating to around 4 terawatt-hours (TWh). The Australian Energy Market Operator (Aemo) expects this share to rise significantly, projecting growth of 25% annually to reach 12TWh, or 6% of grid demand, by 2030, and 12% by 2050.
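Those growth figures are internally consistent: compounding today’s 4TWh at 25% a year for the five years to 2030 lands almost exactly on Aemo’s 12TWh projection. A minimal sketch (the function name is illustrative, not from Aemo’s modeling):

```python
def project_demand(start_twh: float, annual_growth: float, years: int) -> float:
    """Compound a starting demand figure at a fixed annual growth rate."""
    return start_twh * (1 + annual_growth) ** years

# 4 TWh today, growing at 25% per year over the five years to 2030
demand_2030 = project_demand(4.0, 0.25, 5)
print(f"{demand_2030:.1f} TWh")  # 12.2 TWh, in line with Aemo's 12 TWh figure
```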

Aemo anticipates that the rapid expansion of this industry will drive “substantial increases in electricity usage, especially in Sydney and Melbourne.”


In New South Wales and Victoria, where the majority of data centers are situated, the industry is projected to account for 11% and 8% of each state’s electricity demand, respectively, by 2030.

Tech companies such as OpenAI and SunCable are pushing Australia toward becoming a central hub for data processing and storage. The Victorian government recently announced a $5.5 million investment aimed at establishing the state as Australia’s data center capital.

However, with 260 data centers currently operating across the nation and numerous others in the pipeline, experts express concerns about the implications of unchecked industry growth on energy transition and climate objectives.

Energy Usage Equivalent to 100,000 Households

The continual operation of numerous servers generates substantial heat and requires extensive electricity for both operation and cooling.


Globally, electricity demand from data centers is growing four times faster than overall demand, according to the International Energy Agency. Centers are also multiplying and growing in size, with very large facilities becoming increasingly common.

As highlighted by the IEA, “AI-centric hyperscale data centers possess a capacity exceeding 100MW and consume energy equivalent to what 100,000 homes use annually.”
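The IEA’s comparison is easy to sanity-check with back-of-envelope arithmetic, assuming the facility runs continuously at full capacity (an assumption of this sketch, not a claim from the IEA):

```python
capacity_mw = 100          # hyperscale data center capacity from the IEA comparison
hours_per_year = 24 * 365  # assume continuous operation at full capacity
homes = 100_000

annual_mwh = capacity_mw * hours_per_year  # 876,000 MWh per year
per_home_mwh = annual_mwh / homes          # energy per household

print(f"{annual_mwh / 1000:.0f} GWh/year, {per_home_mwh:.2f} MWh per home")
# 876 GWh/year works out to 8.76 MWh per home, a plausible annual household figure
```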

Michael Blair, a professor of mechanical engineering at the University of Melbourne and director of the Net Zero Australia project, said that electricity and water usage are tightly linked because of cooling requirements, since servers convert nearly all of the electricity they draw into heat.

“In confined spaces with many computers, air conditioning is required to maintain an optimal operating temperature,” he explains.

Typically, digital infrastructure is cooled through air conditioning or water systems.

Ketan Joshi, an Oslo-based climate analyst who works with the Australia Institute, says many tech companies are reporting surges in electricity consumption year on year. Their energy intensity has also been rising on several metrics, including energy per active user and energy per unit of revenue, compared with five years ago.

“They aren’t consuming more energy to serve additional users or increase revenue,” he asserts. “The pertinent question is: why is our energy consumption escalating?”

In the absence of concrete data, Joshi suggests that the undeniable growth in demand is likely attributed to the rise of energy-intensive generative AI systems.

“Running Harder to Stay in the Same Place”

Joshi is monitoring the issue because data centers around the world have been shown to place substantial and inflexible demands on power grids, with two significant repercussions: increased dependence on coal and gas generation, and the diversion of resources away from the energy transition.

While data center companies often assert that they run on clean energy through investments in solar and wind, Joshi notes there can be a mismatch between a company’s round-the-clock reliance on the grid and the variable output profile of its renewable investments.

“What’s the ultimate impact on the power grid?” he questions. “Sometimes, we have surplus energy, and other times, there isn’t enough.”


“So, even if everything appears favorable on paper, your data center might inadvertently be supporting fossil fuel generation.”

Moreover, instead of renewable energy sources displacing coal and gas, these sources are accommodating the growing demands of data centers, Joshi notes. “It’s like sprinting on a treadmill—no matter how hard you run, it feels like the speed is continually increasing.”


The demand for electricity has surged to the extent that once-mothballed nuclear power plants in the U.S. are being revived and demand for gas turbines is soaring. Some Australian developers are even proposing the installation of new gas generators to fulfill their energy needs.

Aemo predicts that by 2035, data centers could consume 21.4TWh a year, comparable to the annual demand of four aluminum smelters.

Blair pointed out that AI adoption is in its infancy and the outlook remains uncertain: Aemo’s 2035 consumption scenarios range between 12TWh and 24TWh, so growth may fall short of the headline projections.

In the National AI Plan released on Tuesday, the federal government recognized the need for advances in new energy and cooling technologies for AI systems. Industry minister Tim Ayres said principles for data center investments would be established in early 2026, with requirements for supplementary investment in renewable energy generation and water sustainability.

“Undeniable Impact” on Electricity Prices

Dr. Dylan McConnell, an energy systems researcher at the University of New South Wales, noted that while renewable energy is on the rise in Australia, it is not yet progressing rapidly enough to meet required renewable energy and emissions targets. The expansion of data centers will complicate these challenges.

“If demand escalates beyond projections and renewables can’t keep pace, we’ll end up meeting that new demand instead of displacing coal,” he explains.

Unlike electric vehicles, which enhance demand on the grid while lowering gasoline and diesel usage, data centers do not reduce fossil fuel consumption elsewhere in the economy, according to McConnell.

“If this demand materializes, it will severely hamper our emissions targets and complicate our ability to phase out coal in alignment with those targets,” he advises.

In its climate targets recommendations, the Climate Change Authority stated: “Data centers will continue to scale up, exerting deeper pressure on local power sources and further hampering renewable energy expansions.”

McConnell asserted there will be a significant effect on overall energy costs, influencing electricity prices.

“To support this load, we will need a larger system that utilizes more costly resources.”

Source: www.theguardian.com

Save on Energy Bills: Harness Smart Technology to Reduce Heating Costs and Keep Your Boiler Serviced

Utilize Smart Technology

“Minor adjustments can lead to significant improvements in energy conservation and warmth,” said Sarah Pennells, a consumer finance expert at Royal London.

Firstly, if your boiler or thermostat is equipped with a timer, make use of it.

For enhanced control, consider upgrading to a smart thermostat that connects to the internet. This option lets you manage your thermostat remotely, typically through a mobile app, enabling you to turn the heating on or off when plans change unexpectedly. A smart thermostat acts like a timer for your boiler, allowing you to use the app for scheduling heating and hot water.

Smart thermostats come in various models and offer features like multi-room control, hot water management, and “geofencing” that tracks your presence in and out of the home. Their prices usually range from £60 to £250 depending on the brand.




Upgrading to a smart thermostat allows remote control, generally via a mobile app. Photo: Stefan Nikolic/Getty Images

The Bosch Room Thermostat II (£69.99) and the Hive Thermostat V4 (£155 at B&Q) require professional installation, which can typically be arranged through the retailer, though additional fees may apply.

Some energy suppliers offer discounts on smart thermostats from their partnered brands. The Octopus Energy and tado° partnership gives customers up to 50% off on tado° products. The Wireless Smart Thermostat X Starter Kit has been marked down from £159.99 to £112.

<h2 id="reduce-temperatures" class="dcr-n4qeq9"><strong>Reduce the Temperature</strong></h2>
<p class="dcr-130mj7b">Research by the <a href="https://energysavingtrust.org.uk/take-control-your-heating-home/?_gl=1*boqspv*_up*MQ..*_ga*MTQ2OTcwMDExNy4xNzYyMjcwMDYy*_ga_GPYNXFLD7G*czE3NjIyNzAwNjAkbzEkZzEkdDE3NjIyNzA0NzY KajYwJGwwJGgw#jumpto-1" data-link-name="in body link">Energy Saving Trust</a> indicates that decreasing the thermostat setting from 22°C to 21°C may save the typical UK household £90 annually. For most, a comfortable indoor temperature lies between 18°C and 21°C.</p>
<p class="dcr-130mj7b">According to <a href="https://www.youtube.com/watch?v=DDZNODZ5qyY" data-link-name="in body link">Citizens Advice</a>, lowering your thermostat can save about 10% on energy bills. However, those who are elderly or have health concerns are advised not to set the temperature below 21°C.</p>
<figure id="02c5f80c-ea54-4dcd-bbfb-3af8d5a81874" data-spacefinder-role="supporting" data-spacefinder-type="model.dotcomrendering.pageElements.ImageBlockElement" class="dcr-a2pvoh">
    <figcaption data-spacefinder-role="inline" class="dcr-9ktzqp">
        <span class="dcr-1qvd3m6">Most people find a comfortable indoor temperature between 18°C and 21°C.</span> Photo: Rid Franz/Getty Images
    </figcaption>
</figure>
<p class="dcr-130mj7b">Moreover, experts suggest that maintaining a continuous lower temperature consumes more energy than heating intermittently at a slightly higher setting.</p>
<p class="dcr-130mj7b">Setting your heating to switch off 30 minutes before leaving the house or turning in for the night can further decrease your heating costs.</p>

<h2 id="lower-the-flow" class="dcr-n4qeq9"><strong>Reduce Flow Rate</strong></h2>
<p class="dcr-130mj7b">If using a combi boiler, you can lower the temperature of the flow, which is the water temperature entering the radiator.</p>
<p class="dcr-130mj7b">For those using a system boiler or hot water cylinder, <a href="https://www.edfenergy.com/energywise/lower-flow-temperature-on-combi-boiler" data-link-name="in body link">EDF Energy advises</a> seeking assistance from an engineer for guidance.</p>
<p class="dcr-130mj7b">Typically, boilers have a high flow temperature around 75-80°C. Reducing this to about 60°C might cut your gas bills without noticeably affecting comfort levels.</p>
<p class="dcr-130mj7b">“This approach is particularly beneficial in homes with well-sized radiators and adequate insulation, showing no significant change in comfort,” notes Pennells.</p>
<p class="dcr-130mj7b">The charity Nesta provides an online and interactive <a href="https://www.moneysavingboilerchallenge.com/" data-link-name="in body link">tool</a> to help users adjust their boiler settings. They recommend documenting the boiler's original controls and settings with photos before making changes.</p>

<h2 id="turn-down-radiators" class="dcr-n4qeq9"><strong>Adjust Radiators</strong></h2>
<p class="dcr-130mj7b">If your radiators have a dial controlled by a thermostatic radiator valve (TRV), you can set the temperature individually for each room. TRVs generally have a scale from 0 to 6, with 0 being off and 6 being fully open.</p>
<aside data-spacefinder-role="supporting" data-gu-name="pullquote" class="dcr-19m4xhf">
    <svg viewbox="0 0 22 14" style="fill:var(--pullquote-icon)" class="dcr-scql1j">
        <path d="M5.255 0h4.75c-.572 4.53-1.077 8.972-1.297 13.941H0C.792 9.104 2.44 4.53 5.255 0Zm11.061 0H21c-.506 4.53-1.077 8.972-1.297 13.941h-8.686c.902-4.837 2.485-9.411 5.3-13.941Z"/>
    </svg>
    <blockquote class="dcr-zzndwp">Research shows that people have begun to heat individuals rather than entire spaces.</blockquote>
    <footer><cite>Sophie Barr of National Energy Action</cite></footer>
</aside>
<p class="dcr-130mj7b">The Energy Saving Trust recommends setting each room to the lowest temperature that maintains comfort: 3 or 4 in frequently used rooms, and 2 or 3 in less-used spaces. It also says that adding TRVs to an existing system with a programmer and thermostat could save households around £35 each year.</p>
<p class="dcr-130mj7b">While turning off heating altogether may seem like a good way to save money, experts warn that this could result in mold and dampness, which could incur greater costs and health risks over time.</p>
<p class="dcr-130mj7b">“During the energy crisis, we observed changes in behavior where people started to prioritize heating individuals rather than entire homes,” says Sophie Barr, project development coordinator at <a href="https://www.nea.org.uk/get-help/resources/" data-link-name="in body link">National Energy Action</a>. “Our findings indicate that it’s more cost-effective to provide heat to the entire area by adjusting radiators in unused rooms to setting 2, thus providing sufficient warmth to deter mold spores that can lead to serious respiratory health issues.”</p>

<h2 id="get-reflectors" class="dcr-n4qeq9"><strong>Install Reflectors</strong></h2>
<p class="dcr-130mj7b">The <a href="https://britishgasenergytrust.org.uk/" data-link-name="in body link">British Gas Energy Trust</a> suggests placing foil behind radiators to reflect heat back into the room. Since approximately 35% of indoor heat escapes through the walls, these reflectors ensure that heat is redirected into the room rather than absorbed by exterior walls, making them particularly effective on uninsulated external walls.</p>
<p class="dcr-130mj7b">Though there may be a small initial expense, they are reasonably priced, simple to install, and durable. They can be purchased in rolls and cut to fit your radiators. They are easy to apply with included adhesive or double-sided tape—first ensuring the radiator is turned off and cool. Screwfix offers rolls of 1.88 square meters for <a href="https://www.screwfix.com/p/essentials-470mm-x-4m-radiator-heat-reflector-foil/88629?tc=JS7" data-link-name="in body link">£7.51</a>, while B&Q has a 5 square meter roll for <a href="https://www.diy.com/departments/diall-radiator-reflector-5m-/1906873_BQ.prd?storeId=1037" data-link-name="in body link">£14.97</a>, and Amazon sells a 15 square meter roll for <a href="https://www.amazon.co.uk/dp/B0CYM442P1?tag=track-ect-uk-2181897-21&amp;linkCode=osi&amp;th=1&amp;ascsubtag=ecSEPr67xojmhks6sn7" data-link-name="in body link">£27.99</a>.</p>
<p class="dcr-130mj7b">To enhance efficiency, bleed your radiators every few months. Ensure the radiator is switched off and cool before inserting the key (<a href="https://www.diy.com/departments/rothenberger-radiator-key-pack-of-2/191173_BQ.prd" data-link-name="in body link">£3.50</a> for a B&Q 2-pack) or a flat-head screwdriver into the bleed valve (often located in the top corner) and turn it counterclockwise. Listen for a hissing sound as air escapes; wait for it to stop, showing a steady flow of water (you can catch it with a cloth), then turn the valve clockwise to close it again.</p>
<figure id="ecc5fd24-5ed1-4f48-91f5-eabfbfb8530e" data-spacefinder-role="supporting" data-spacefinder-type="model.dotcomrendering.pageElements.ImageBlockElement" class="dcr-a2pvoh">
    <figcaption data-spacefinder-role="inline" class="dcr-9ktzqp">
        <span class="dcr-1qvd3m6">Regular boiler servicing enhances efficiency.</span> Photo: Joe Giddens/PA
    </figcaption>
</figure>
<p class="dcr-130mj7b">Avoid obstructing radiators with furniture or curtains, especially beneath windows, to distribute heat more evenly throughout the space.</p>

<h2 id="keep-your-boiler-serviced" class="dcr-n4qeq9"><strong>Regular Boiler Maintenance</strong></h2>
<p class="dcr-130mj7b">Routine boiler service enhances efficiency and extends lifespan by addressing minor issues. According to Octopus Energy, neglecting boiler maintenance can lead to up to 10% more energy usage compared to those serviced annually. “Failure to regularly maintain your boiler can significantly affect fuel efficiency and health,” warns Barr.</p>
<p class="dcr-130mj7b">As per Which?, the average cost for a boiler service ranges from £70 to £110.</p>
<p class="dcr-130mj7b">Some energy providers include this service in their annual coverage plans, such as British Gas, which features it in their <a href="https://www.britishgas.co.uk/cover/boiler-and-heating.html" data-link-name="in body link">home care</a> options starting at £19 per month. However, a boiler care plan might not be suitable for every consumer. Which? recommends considering if your monthly contributions may exceed the costs of the annual service or repairs. Ensure you have savings to cover the full service fee as needed.</p>
<p class="dcr-130mj7b">For renters, it is the landlord’s obligation to arrange annual boiler inspections and certification. “Annual maintenance is mandatory for all rental properties,” says Barr. “For homes with gas boilers, only a Gas Safe registered engineer should perform this work, and an OFTEC-registered engineer should handle oil boilers. Annual boiler maintenance ensures that your system operates efficiently and guards against carbon monoxide leaks in your home.”</p>

Source: www.theguardian.com

AI’s Energy Drain from Poor Content: Can We Redefine AI for Climate Action?

Artificial intelligence is frequently linked to massive electricity consumption, producing planet-warming emissions in the service of outputs that are often unproductive or misleading and contribute little to human advancement.

However, some AI proponents at a significant UN climate summit are presenting an alternative perspective. Could AI actually assist in addressing the climate crisis rather than exacerbating it?

The discussion of “AI for good” resonated at the Cop30 conference in Belém, Brazil, where advocates claim AI has the potential to lower emissions through efficiencies across many aspects of daily life, including food, transportation, and energy, all major contributors to environmental pollution.


Recently, a coalition of organizations, UN agencies, and the Brazilian government announced the establishment of the AI Climate Institute, a new global initiative aimed at leveraging AI as a tool for empowerment to assist developing nations in addressing environmental issues.

Proponents assert that, over time, this initiative will educate countries on utilizing AI in various ways to curb emissions, including enhancing public transportation, streamlining agricultural systems, and adjusting energy grids to facilitate the timely integration of renewable energy.

Forecasting weather patterns, including the mapping of impending climate crises like floods and wildfires, could also be refined through this approach, remarked Maria João Souza, executive director of Climate Change AI, one of the organizations involved in the initiative.

“Numerical weather prediction models demand significant computational power, which limits their implementation in many regions,” she noted. “I believe AI will act as a beneficial force that accelerates many of these advancements.”

Lorenzo Sarr, chief sustainability officer at Clarity AI and also present at Cop30, emphasized that AI could aid in tracking emissions and biodiversity, providing insights into current conditions.

“One can truly begin to identify the problem areas,” he said. “Then predictions can be made. These forecasts can address both short-term and long-term scenarios. We can predict next week’s flooding, and also analyze phenomena like rising sea levels.”

Sarr acknowledged valid concerns regarding AI’s societal and governance impacts, but he expressed optimism that the overall environmental outcomes could be beneficial. A report released in June by the London School of Economics delivered unexpectedly positive projections, suggesting that AI could slash global greenhouse gas emissions by 3.2 billion to 5.4 billion tons over the next decade, even factoring in significant energy usage.

“People already make poor energy choices, such as overusing their air conditioners,” Sarr commented. “How much of what we do on our phones is detrimental? It’s a recurring thought for me. How many hours do we spend scrolling through Instagram?”

“I believe society will gravitate toward this direction. We must consider how to prevent harming the planet through heating while ensuring a net positive impact.”

Yet, some experts and environmental advocates remain skeptical. The immense computational demands of AI, particularly in the case of generative models, are driving a surge in data centers in countries like the U.S., which consume vast quantities of electricity and water—even in drought-prone areas—leading to surging electricity costs in certain regions.

The climate ramifications of this AI surge, propelled by companies like Google, Meta, and OpenAI, are considerable and likely to increase, as indicated by a recent study from Cornell University. This impact is comparable to adding 10 million gasoline cars to the roads or matching the annual emissions of all of Norway.
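As a rough sanity check on that comparison (using my own assumed figures, not numbers from the Cornell study): a typical US passenger car emits about 4.6 tonnes of CO2 per year, and Norway's annual emissions are on the order of 45 million tonnes.

```python
# Rough consistency check of the "10 million cars vs Norway" comparison.
# Both inputs are assumed round figures, not values from the Cornell study.
cars = 10_000_000
tonnes_per_car = 4.6                       # assumed average t CO2 per car per year
car_fleet_mt = cars * tonnes_per_car / 1e6  # convert tonnes to megatonnes

norway_mt = 45                             # assumed Mt CO2e per year

print(round(car_fleet_mt))  # ~46 Mt, the same order as Norway's annual total
```

The two yardsticks in the article are therefore mutually consistent to within round-number precision.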

“There exists a techno-utopian belief that AI will rescue us from the climate crisis,” stated Jean Hsu, a climate activist at the Center for Biological Diversity. “However, we know what truly will save us from the climate crisis: the gradual elimination of fossil fuels, not AI.”

While AI may indeed enhance efficiency and lower emissions, these same technologies can be leveraged to optimize fossil fuel extraction as well. A recent report by Wood Mackenzie estimated that AI could potentially unlock an additional trillion barrels of oil. Such a scenario, if accepted by energy markets, would obliterate any chances of preventing severe climate change.

Natasha Hospedares, lead attorney for AI at ClientEarth, remarked that while the “AI for good” argument holds some validity, it represents “a very small niche” within a far larger industry focused primarily on maximizing profits.

“There is some evidence that AI could assist developing nations, but much of this is either in the early stages or remains hypothetical, and actual implementation is still lacking,” she stated. “Overall, we are significantly distant from achieving a state where AI consistently mitigates its detrimental environmental impacts.”

“The environmental consequences of AI are already alarming, and I don’t foresee a slowdown in data center expansion anytime soon. A minor fraction of AI is being applied for beneficial purposes, while the vast majority is being exploited by companies like Google and Meta, primarily for profit at the expense of the environment and human rights.”

Source: www.theguardian.com

Static Electricity Makes Window Defrosting More Energy Efficient

Airplanes are typically defrosted using antifreeze spray.

Jaromir Chalabala / Alamy

Static electricity has the potential to eliminate up to 75% of frost from surfaces, which could lead to significant energy savings and a reduction in the millions of tons of antifreeze currently utilized for vehicle defrosting.

In 2021, Jonathan Boreyko and his team at Virginia Tech serendipitously discovered that frost becomes electrically charged during its formation. They successfully employed this natural electric field to charge an adjacent water film, which could effectively dislodge ice crystals from the frost as a natural deicing agent. However, the impact was minimal and did not significantly affect total frost levels.

Now, Boreyko’s research group has engineered a more advanced defrosting system that utilizes ultra-high voltage copper electrodes positioned above frosted surfaces like glass or copper. This innovative system can eliminate half of the frost in approximately 10 to 15 minutes, and up to 75% if the surface is highly water-repellent. “Instead of tapping into the voltage created by the frost, we’re enhancing the effect by applying our own voltages,” Boreyko explains.

To achieve a 50% reduction in frost, their method requires electrodes charged to 550 volts, which is more than double the voltage generally supplied by utility power in many regions. Nonetheless, the current from these electrodes is minimal, making them relatively safe. Boreyko noted that accidental contact with the electrodes would result in an electric shock similar to that from electric fences used on farms.

Boreyko states that this low current draws less energy—less than half of what would be needed to directly heat the frost.
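For scale, here is a back-of-envelope estimate of the energy needed to melt a light frost layer by direct heating; the frost thickness, density, and temperatures are my own illustrative assumptions, not figures from the study:

```python
# Energy to defrost one square metre of surface by direct heating.
# All inputs are illustrative assumptions, not values from the study.
thickness_m = 0.0005          # 0.5 mm frost layer, assumed
density = 200.0               # kg/m^3, typical low-density frost, assumed
mass = thickness_m * density  # kg of frost per square metre

c_ice = 2100.0                # J/(kg*K), specific heat of ice
latent = 334_000.0            # J/kg, latent heat of fusion
warm_J = mass * c_ice * 10    # warm the frost from -10 C to 0 C, assumed start temp
melt_J = mass * latent        # then melt it
total_J = warm_J + melt_J

print(round(total_J))  # ~35,500 J per square metre for thermal defrosting
```

An electrostatic system drawing less than half of this, per Boreyko's claim, would need under roughly 18 kJ per square metre.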

An effective and swift defrosting technique could be applicable not just to car windows and roadways but also in the aerospace sector, where significant quantities of antifreeze are employed to prevent ice accumulation on aircraft wings, which can impact flight performance.

“Instead of applying hundreds of liters of antifreeze to the aircraft wings during taxi to eliminate ice, we could employ this machine, which would move around the airport runway, utilizing a high-voltage wand to clear away all the ice and snow,” Boreyko remarks.


Source: www.newscientist.com

Analog Computers May Train AI 1,000 Times Faster While Consuming Less Energy

Analog computers use less energy compared to digital computers

Metamol Works/Getty Images

Analog computers that can swiftly resolve the primary types of equations essential for training artificial intelligence models may offer a viable solution to the growing energy demands of data centers spurred by the AI revolution.

Devices like laptops and smartphones are known as digital computers because they handle data in binary form (0s and 1s) and can be programmed for various tasks. Conversely, analog computers are generally crafted to tackle specific problems, using continuously variable quantities like electrical resistance rather than discrete binary values.

While analog computers excel in terms of speed and energy efficiency, they have historically lagged in accuracy compared to their digital counterparts. Recently, Zhong Sun and his team at Peking University in China developed two analog chips that work collaboratively to solve matrix equations accurately—crucial for data transmission, large-scale scientific simulations, and AI model training.

The first chip generates low-precision outputs for matrix computations at high speed, while the second chip refines these outputs through an iterative improvement algorithm to assess and minimize the error rate of the initial results. Sun noted that the first chip produced results with a 1% error rate, but after three iterations with the second chip, this rate dropped to 0.0000001%, comparable to the accuracy found in conventional digital calculations.
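The two-chip loop Sun describes is, in essence, classical iterative refinement. A minimal pure-Python emulation (my own sketch; the 1% random perturbation merely stands in for the analog chip's error) shows how a few correction passes drive the error down:

```python
import random

random.seed(0)

def solve2x2(A, r):
    # Exact 2x2 solve via Cramer's rule (stand-in for a digital reference).
    (a, b2), (c, d) = A
    det = a * d - b2 * c
    return [(r[0] * d - b2 * r[1]) / det,
            (a * r[1] - c * r[0]) / det]

def lowprec_solve(A, r):
    # Emulate the fast analog chip: exact solve plus ~1% random error.
    return [v * (1 + 0.01 * random.uniform(-1, 1)) for v in solve2x2(A, r)]

A = [[4.0, 1.0], [2.0, 5.0]]
b = [1.0, 3.0]

x = lowprec_solve(A, b)              # chip 1: quick, ~1%-accurate answer
for _ in range(3):                   # chip 2: three refinement passes
    r = [b[i] - sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
    d = lowprec_solve(A, r)          # solve for the correction, again at low precision
    x = [x[i] + d[i] for i in range(2)]

exact = solve2x2(A, b)
rel_err = max(abs(x[i] - exact[i]) / abs(exact[i]) for i in range(2))
```

Each pass solves only for the remaining error, so a 1%-accurate solver shrinks the error by roughly a factor of 100 per iteration, mirroring the 1% → 0.0000001% improvement Sun reports.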

Currently, the researchers have successfully designed a chip capable of solving 16 × 16 matrices, which equates to handling 256 variables, sufficient for addressing smaller problems. However, Sun acknowledges that addressing the complexities of today’s large-scale AI models will necessitate substantially larger circuits, potentially scaling up to 1 million by 1 million.

A unique advantage of analog chips is their ability to handle larger matrices without increased solving time, unlike digital chips, whose solving time climbs steeply with matrix size (roughly with its cube for direct methods). This translates to a 32 × 32 analog chip outperforming the Nvidia H100 GPU, a leading chip for AI training.

Theoretically, further scaling could yield throughput up to 1,000 times greater than digital alternatives like GPUs while consuming 100 times less energy, according to Sun. However, he cautions that practical applications may exceed the circuit’s limited capabilities, limiting the perceived benefits.

“This is merely a speed comparison; your specific challenges may differ in real-world scenarios,” Sun explains. “Our chip is designed exclusively for matrix computations. If these computations dominate your tasks, the acceleration will be substantial; otherwise, the benefits may be constrained.”

Sun suggests that the most realistic outcome may be the creation of hybrid chips that incorporate some analog circuitry alongside GPUs to tackle specific problem areas, although this development might still be years away.

James Millen, a professor at King’s College London, emphasizes that matrix calculations are pivotal in AI model training, indicating that analog computing has the potential to make a significant impact.

“The contemporary landscape is dominated by digital computers. These remarkable machines are universal, capable of tackling any computation, yet not necessarily with optimal efficiency or speed,” Millen states. “Analog computers excel in performing specific tasks, making them exceptionally fast and efficient. In this research, we leverage analog computing chips to enhance matrix inversion processes—essential for training certain AI models. Improving this efficiency could help mitigate the substantial energy demands accompanying our expanding reliance on AI.”


Source: www.newscientist.com

Solar Energy Will Transform the World Sooner Than You Imagine

The Future of Solar Power is Promising

Liu Fuyu/Shutterstock

Can solar energy dominate the global power landscape? Recently, the rate of solar power installation has increased dramatically, with capacity doubling between 2022 and 2024, now providing 7% of global electricity. What are the future projections?

In the first half of 2025, solar and wind energy reached historic milestones by surpassing coal in electricity generation for the first time, making renewables the leading electricity source worldwide. According to the UK-based think tank Ember, solar power has been the primary contributor to this pivotal shift in the energy landscape, accounting for 83% of the surge in global electricity demand this year. Ember’s analysis shows that solar has been the largest new power source for three consecutive years.

What’s the advantage of solar? Its affordability! Installation costs for solar systems have plummeted by 90% over the past 15 years, making solar energy the most economical electricity source globally. “Currently, silicon panels are on par with the cost of plywood,” remarks Sam Stranks, from Cambridge University.

This translates to abundant, cost-effective energy solutions that can be implemented almost anywhere. Is it unrealistic to envision a future where solar energy powers everything?

On a fundamental level, Earth receives almost limitless solar energy. Even with current panel efficiencies, roughly 450,000 square kilometers would be needed to meet the entire world’s energy demands using solar power, as estimated by a 2021 report from the British think tank Carbon Tracker. This represents just 0.3% of global land area.
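The Carbon Tracker figure can be roughly reproduced from first principles; the average insolation and global demand below are my own assumed round numbers, not the report's method:

```python
# Rough reproduction of the ~450,000 km^2 estimate.
# Inputs are assumed round numbers, not Carbon Tracker's exact figures.
world_demand_W = 18e12      # ~18 TW average global energy demand, assumed
insolation_W_m2 = 200.0     # average flux on a panel over day/night/weather, assumed
efficiency = 0.20           # current commercial panel efficiency

area_m2 = world_demand_W / (insolation_W_m2 * efficiency)
area_km2 = area_m2 / 1e6

land_km2 = 149e6            # Earth's land area, ~149 million km^2
share = area_km2 / land_km2

print(round(area_km2), round(share * 100, 1))  # 450000 km^2, ~0.3% of land
```

That the crude estimate lands on the report's number suggests the 0.3% claim is not sensitive to the exact assumptions.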


Kingsmill Bond, one of the report’s authors now working with Ember, noted that while land usage trade-offs exist—like competition with agriculture—”there’s ample space for most nations to adopt these technologies.”

Next-Gen Solar Panels

The question is, what hinders solar energy from fully dominating the global electricity market? Efficiency is the foremost challenge. Photovoltaic panels primarily made of silicon convert about 20% of solar energy into electricity. In contrast, hydroelectric power plants convert 90%, wind turbines around 50%, and fossil fuel plants 30-40%.

This disparity necessitates more solar panels to equate to the output of other energy sources. Therefore, companies and researchers are eager to enhance solar panel efficiency, hoping the improvements will concurrently reduce costs and land requirements.

However, crystalline silicon panels are approaching efficiency limits, with top-tier cells currently achieving around 25% efficiency. “The practical ceiling for crystalline silicon is likely around 28%,” explains Jenny Nelson from Imperial College London.


Further efficiency improvements may require a transition to tandem solar cells, which utilize an additional semiconductor to better harness the solar spectrum. Tandem silicon perovskite cells are considered the most promising, with a theoretical efficiency limit near 50%. Although real-world tandem panels haven’t reached that potential, Stranks anticipates efficiencies between 35% and 37%.

The first tandem silicon perovskite solar panels have commenced commercial production. They are now undergoing industry tests to assess their real-world operational longevity. Stranks is optimistic, projecting they will become the market’s leading technology in a decade. “On the surface, they appear similar to current panels, but they generate 50% more power,” he states. “That’s a significant advancement.”

Efficiency enhancements could not only cut costs further but also foster new application opportunities, such as solar roofs on electric vehicles that can charge batteries during the day. This stored energy could then be utilized for transportation or domestic use after sunset, he adds.

Solving Storage Issues

Innovations like these could mitigate one of solar power’s primary challenges: variability. The sun isn’t always shining, which poses less of an issue in sunbelt regions like India, Mexico, and parts of Africa, where sunshine is plentiful almost year-round, enabling surplus energy generated during the day to be stored for nighttime use. This solar-and-storage model is becoming more affordable, with lithium-ion battery costs declining 40% in the past two years, according to BloombergNEF.

“Ultimately, fossil fuels’ only edge over solar is their storage capabilities,” Bond points out. “However, this issue is mostly resolved through advancements in battery technology.”
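As a toy illustration of the solar-plus-storage model, here is a household-scale sizing sketch; every figure is an illustrative assumption, not data from Ember or BloombergNEF:

```python
# Toy solar-plus-storage sizing for a single sunbelt household.
# All figures are illustrative assumptions.
daily_use_kwh = 10.0     # household consumption per day
night_fraction = 0.6     # share of consumption after sunset
panel_kw = 3.0           # installed solar capacity
sun_hours = 5.0          # equivalent full-sun hours per day

generated = panel_kw * sun_hours                 # 15 kWh produced per day
day_use = daily_use_kwh * (1 - night_fraction)   # 4 kWh used while the sun shines
surplus = generated - day_use                    # energy available to store
battery_needed = daily_use_kwh * night_fraction  # 6 kWh to cover the night

print(surplus, battery_needed)  # 11 kWh of surplus comfortably covers 6 kWh of night load
```

Under these assumptions a modest rooftop array plus a 6 kWh battery makes the household self-sufficient on a sunny day, which is the arithmetic behind the "solar and storage" pairing described above.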

In northern regions, where winter days are short and overcast, the scenario is different. “[Solar] serves as an incredibly effective energy source, producing zero pollution with a rapid energy investment return. It ticks all the right boxes,” comments Andrew Blakers from the Australian National University. “Unless you reside in northern Europe, northeast Asia, or the northeastern United States—where you have abundant summer sun but limited winter light—[solar] is distinctly superior.”

For areas experiencing long winter nights, wind energy can bridge the gap, but we must also develop energy storage solutions capable of holding power for extended periods. Such “seasonal storage” technology is still emerging, with only a few solutions at commercial scales. However, methods like pumped water, hydrogen, and compressed air storage show promise. “In the short term, batteries will suffice for now, while pumped hydro storage will take over in the long run,” predicts Blakers.

Political Challenges

If anything, enhancing efficiency and storage represents low-hanging fruit. “The bottlenecks are likely political, with inconsistencies in policy, regulatory challenges, and vested interests from other industries,” says Nelson.

The Trump administration in the U.S., known for its climate change skepticism, epitomizes this issue. Recently, federal authorities halted a massive solar project in Nevada that was set to be the world’s largest, continuing a trend of reducing solar funding and obstructing initiatives.

Yet, Bond is confident the shift to renewable energy is nearly inevitable, given its economic advantages over conventional power sources. “While certain companies may slow the solar tide in specific countries or projects, the current U.S. government is inadvertently jeopardizing the nation’s position in the global race for advanced technology deployment,” he asserts.

Blakers concurs, emphasizing that solar energy might be the solution to the rapidly increasing energy demands from AI data centers. “Even in the U.S., with a determined federal approach, it’s hard to envision solar moving backward since many states favor solar, and it’s by far the most expedient method to procure substantial energy,” he notes.


Another significant obstacle for clean energy is logistics. Existing power grids will need rewiring to accommodate large and varying energy supplies from new regions. A more adaptable grid that manages generation spikes and fine-tunes electricity demand will optimize green electricity usage. However, achieving these advanced power grids will incur substantial costs. In the UK alone, energy firms plan to invest £77 billion over the next five years to upgrade the electricity grid for wind and solar integration.

In low-income countries with less developed electricity grids, there’s an opportunity to expediently establish renewable-friendly infrastructures from scratch, facilitating deeper penetration of renewables into their grid supply. Currently, the BRICS nations (Brazil, China, Egypt, Ethiopia, India, Indonesia, Iran, Russia, South Africa, and the United Arab Emirates) together produce more than half of the world’s solar power generation, according to Ember.

The broader challenge for many nations is to electrify a larger share of their energy needs—covering heating, transportation, and more—which is crucial for decreasing fossil fuel reliance throughout the global economy. “To decarbonize our planet, electrification is a priority,” Nelson emphasizes. Developing countries are currently outpacing wealthier ones: China’s share of electricity in final energy consumption reached 32% in 2023, far eclipsing the roughly 24% electrification rate of the United States and affluent European nations, as noted by Ember.

What Lies Ahead for Solar?

Despite the year’s achievements, the technical, logistical, and political hurdles mentioned could hinder solar PV adoption in some regions in the short term. The International Energy Agency announced that renewable electricity is set to more than double by the end of 2030, yet it might not meet the target of tripling capacity by that time. The agency identified changes in U.S. policy and challenges related to grid integration as resisting factors against expanding renewable capacity.

However, energy market analysts remain optimistic that solar power will lead the global energy supply by mid-century. “By century’s end, it’s clear that all our electricity will derive from renewable sources, primarily solar,” asserts Bond, who forecasts that solar energy could account for up to 80% of the world’s electricity supply by 2100. Additionally, he expects that a minimum of 80% of the total global energy needs will be electrified.

All political, storage, and infrastructure barriers will eventually fade, paving the way for a green power revolution. “Human ingenuity drives us to convert energy into resources,” Bond concludes. “Now that we have discovered this affordable and universal energy source, it’s only a matter of time before we harness it.”


Source: www.newscientist.com

High-Definition Video with Low Energy Consumption on Color E-Paper Screens

Gustav Klimt’s The Kiss presented on an iPhone (left) alongside a smaller e-paper display (right) showing the same artwork

Kingston Frameworks; Kunli Xiong et al. (2025)

A groundbreaking type of color e-paper is capable of showcasing vivid, high-resolution, full-color images and videos with minimal power consumption, heralding a potential new era for display technologies.

Unlike conventional LED screens, which produce colors through the emission of red, green, and blue light, e-paper screens utilize small molecules to generate images. Historically, these screens were restricted to black and white, but advancements have now allowed for color displays. However, they still face challenges in updating quickly enough for video playback.

To address this issue, Kunli Xiong and their team at Uppsala University in Sweden have engineered electronic paper featuring pixels constructed from tungsten oxide nanodiscs. Each pixel measures roughly 560 nanometers, resulting in an impressive resolution of 25,000 pixels per inch (PPI), whereas typical smartphone displays generally have resolutions in the hundreds.

The tungsten oxide nanodiscs are designed in various sizes and spacings to reflect distinct bands of light. By arranging them together, a range of colors can be created, with brightness adjustable through short electrical pulses that position ions within each disc. Once set, these ions maintain their placement, allowing the color to persist without a continuous power source.

The researchers constructed an e-paper display that measures just 1.9mm by 1.4mm, roughly 1/4000 the size of a conventional smartphone display, and utilized it to showcase a 4300×700 pixel segment of Gustav Klimt’s The Kiss – achieving remarkable resolution for such a compact device. It is also capable of refreshing approximately every 40 milliseconds, making it suitable for video display.

Another significant advantage of this novel e-paper technology is its remarkably low energy consumption, as noted by Xiong. The display utilizes about 1.7 milliwatts per square centimeter for video and around 0.5 milliwatts per square centimeter for still images.
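Scaling those per-area figures up to a phone-sized panel gives a feel for the power budget; the screen area is my assumption, while the per-square-centimeter numbers are the ones reported above:

```python
# Scale the reported per-area power draw up to a phone-sized screen.
# Screen area is an assumption; the per-cm^2 figures are from the article.
area_cm2 = 100.0          # roughly a 6-inch phone display, assumed
video_mw_per_cm2 = 1.7    # reported draw while playing video
still_mw_per_cm2 = 0.5    # reported draw while showing a still image

video_W = area_cm2 * video_mw_per_cm2 / 1000  # milliwatts to watts
still_W = area_cm2 * still_mw_per_cm2 / 1000

print(video_W, still_W)  # ~0.17 W for video, ~0.05 W for stills
```

For comparison, a conventional phone screen typically draws on the order of a watt, so even full-screen video on this e-paper would use a small fraction of that.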

“What I find impressive about this development is its capacity to support video at a rapid pace while consuming minimal energy, as each element remains switched after being activated,” comments Jeremy Baumberg from Cambridge University.


Source: www.newscientist.com

A Simple Method to Dramatically Cut Your AI’s Energy Consumption

AI relies on data centers that consume a significant amount of energy

Jason Alden/Bloomberg/Getty

Optimizing the choice of AI models for various tasks could lead to an energy saving of 31.9 terawatt-hours this year alone, equivalent to the output of five nuclear reactors.

Thiago da Silva Barros from France’s Cote d’Azur University examined 14 distinct tasks where generative AI tools are utilized, including text generation, speech recognition, and image classification.

The researchers examined public leaderboards, such as those provided by the machine learning platform Hugging Face, to analyze the performance of various models. The energy efficiency during inference—when an AI model generates a response—was assessed using a tool named CarbonTracker, and total energy consumption was estimated by tracking user downloads.

“We estimated the energy consumption based on the model size, which allows us to make better predictions,” states da Silva Barros.

The findings indicate that by switching from the highest performing model to the most energy-efficient option for each of the 14 tasks, energy usage could be decreased by 65.8%, with only a 3.9% reduction in output quality. The researchers believe this tradeoff may be acceptable to most users.

Because some people already use the most energy-efficient models, the realistic savings are smaller: the researchers estimate that if the remaining users switched from high-performance models to the more economical alternatives, overall energy consumption would drop by approximately 27.8%. “We were taken aback by the extent of savings we uncovered,” remarks team member Frédéric Giroir from the French National Center for Scientific Research.
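The fleet-wide numbers come from a consumption-weighted average across tasks. A sketch of that calculation, with invented per-task figures rather than the study's data:

```python
# Consumption-weighted savings from switching models per task.
# Per-task energies and query counts are invented for illustration,
# not figures from the study.
tasks = {
    # task: (Wh_per_query_top_model, Wh_per_query_efficient_model, queries)
    "text-generation":      (3.0, 0.9, 1_000_000),
    "speech-recognition":   (1.2, 0.5, 400_000),
    "image-classification": (0.8, 0.2, 600_000),
}

before = sum(top * q for top, _, q in tasks.values())  # Wh if everyone uses the top model
after = sum(eff * q for _, eff, q in tasks.values())   # Wh if everyone switches
saving = 1 - after / before

print(round(saving * 100, 1))  # percentage saved by picking the efficient model per task
```

The study's 65.8% figure is the same kind of weighted aggregate, computed over 14 tasks with measured rather than invented energies.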

However, da Silva Barros emphasizes that changes are necessary from both users and AI companies. “It’s essential to consider implementing smaller models, even if some performance is sacrificed,” he asserts. “As companies develop new models, it is crucial that they provide information regarding their energy consumption patterns to help users assess their impact.”

Some AI firms are mitigating energy usage through a method known as model distillation, in which a larger model trains a smaller, more efficient one, and this approach is already showing significant benefits. Chris Priest from the University of Bristol, UK, notes that Google recently claimed its Gemini model became 33 times more energy efficient over the past year.

However, allowing users the option to select the most efficient models “is unlikely to significantly curb the energy consumption of data centers, as the authors suggest, particularly within the current AI landscape,” contends Priest. “By reducing energy per request, we can support a larger customer base more rapidly with enhanced inference capabilities,” he adds.

“Utilizing smaller models will undoubtedly decrease energy consumption in the short term, but various additional factors need consideration for any significant long-term predictions,” cautions Sasha Luccioni from Hugging Face. She highlights the importance of considering rebound effects, such as increased usage, alongside broader social and economic ramifications.

Luccioni points out that due to limited transparency from individual companies, research in this field often relies on external estimates and analyses. “What we need for more in-depth evaluations is greater transparency from AI firms, data center operators, and even governmental bodies,” she insists. “This will enable researchers and policymakers to make well-informed predictions and decisions.”


Source: www.newscientist.com

What Factors Contributed to Solar Power Being the World’s Most Coveted Energy Source?

“Today’s solar panels will inevitably reach the end of their lives and will require recycling or disposal.”

Jacques Hugo/Getty Images

By the mid-2020s, solar energy had become a major player. It emerged as the most affordable form of power generation and was also one of the fastest-growing sources of energy. The lifespan of solar panels had extended significantly, lasting around 30 to 40 years. However, eventually, these panels would need to be recycled or disposed of. By 2050, predictions indicated that there could be as much as 160 million tonnes of solar module waste. While this amount was considerably less than that produced by fossil fuel sources, it still posed a challenge.

Researchers began exploring how to create self-healing and even self-organizing solar panels.

By the mid-2030s, advancements had led to the creation of live solar panels based on biophotovoltaics (BPV), which were deployed globally. The aesthetically pleasing, natural look of this technology made it popular, leading to the mantra of “yes, in my backyard,” and rapid adoption of living sunlight technology.

One of the first benefits was easily observed in off-grid rural areas, particularly in sub-Saharan Africa, where BPVs provided energy for mobile phones and computers without the need for batteries. As the technology progressed, older buildings were revamped into BPVs resembling green walls and roofs, while new structures incorporated living solar panels right from the design phase, allowing more people to become less dependent on traditional grid energy. This also helped boost local biodiversity and enhance overall happiness.

BPV operates like a fuel cell: electrons flow from the anode, through an external circuit, to the cathode, generating electricity. In biological versions, the electrons are produced by photosynthetic organisms and transferred to the anode.

Back in 2011, scientists became intrigued by the phenomenon of electrical leakage from cyanobacteria in sunlight. They discovered that by placing cyanobacteria on electrodes, they could harvest current to power small electronic devices.

However, the electrical output was weak due to insufficient electron leakage from the bacteria. Scientists like Chris Howe from Cambridge University worked on genetically modifying cyanobacteria to enhance electron leakage, allowing them to be connected to electronic devices.

In 2022, Howe’s team found that they could power computers solely using photosynthesis. Soon after, scientists made significant strides in their ability to scale up current harvesting and develop devices powered by biological energy sources worldwide.



One immediate benefit was a significant decrease in demand for small batteries that power many devices. By 2025, these batteries accounted for 3% of the global battery market, resulting in 10,000 tonnes of waste each year.

With the improvement in BPV technology, larger devices like mobile phones and refrigerators began operating on batteries charged by living solar cells. Electric vehicles could be charged using arrays of biological solar panels installed in garages and depots, leading to a reduced need for metals like lithium and manganese.

Remarkably, the devices continued to function in low light. At night, the cells metabolized compounds created during the day, producing a comparable amount of electrons to maintain power.

The rise of living solar technology had numerous implications. As buildings adopted a green aesthetic, urban planners started integrating more nature into streets and public areas. Even densely populated cities began to exhibit a vibrant green atmosphere, teeming with trees, plants, flowers, and wildlife.

The success of BPVs inspired a movement focused on integrating the organelles of plant cells responsible for photosynthesis. This enthusiastic group, identifying as members of Homo Photosyntheticus, drew inspiration from solar-powered sea slugs and incorporated chloroplasts sourced from plant leaves into their own biology.

Sea slugs have evolved methods to sustain and manage chloroplast functionality; however, they sometimes require additional chloroplasts. They possess a leaf-like structure that maximizes surface area, yet the energy obtained through photosynthesis only meets a small fraction of their energy requirements. For humans, without the cellular infrastructure to support chloroplast function or leaf-like shapes, this method could only yield negligible energy.

Nevertheless, for self-identified members of H. Photosyntheticus, the incorporation of chloroplasts held significant symbolic meaning. They engaged in what they referred to as “greening,” committing to utilize only electricity generated directly through photosynthesis—eschewing fossil fuels altogether! Additionally, they commonly tattooed chloroplasts on their skin as a visible testament to their dedication.


Source: www.newscientist.com

Unlocking Net Zero: UK Battery Companies Driving Change in the Energy Sector

It may conjure images of battery production lines and the extensive “gigafactory” projects of Elon Musk and Tesla across the globe, or thoughts of batteries powering everything from electric toothbrushes to smartphones and vehicles. However, at Invinity Energy Systems’ modest factory in Bathgate, near Edinburgh, employees are nurturing the hope that Britain will also contribute to the battery revolution.

These batteries, utilizing vanadium ions, can be housed within a 6-meter (20-foot), 25-ton shipping container. While they may not be used in vehicles, manufacturers aspire for this technology to find its place in the global storage rush, propelling a transition to net-zero carbon grids.

Renewable electricity represents the future of a cleaner and more economical energy system compared to fossil fuels. Its primary challenge lies in the fact that renewable energy generation is contingent on weather conditions—sunshine and wind may not be available when energy demand peaks. Battery storage allows for the shift of energy production, enabling it to be saved for later use, which is essential for a well-functioning electric grid.

“What has suddenly become apparent is that people have recognized the necessity of energy storage to integrate more renewable energy into the grid,” stated Jonathan Mullen, CEO of Invinity, at the factory where a series of batteries are stacked and shipped.

For a long time, experts have explored various methods for storing renewable electricity, but the issue of grid reliability gained political attention in April when Spain and Portugal experienced the largest blackouts in Europe in two decades. While some rushed to criticize renewable energy, a Spanish government report clarified that it was not the cause. Nonetheless, battery storage assists grids worldwide in avoiding similar complications as those seen in the Iberian Peninsula.


Power blackouts in Spain and Portugal in April highlighted the issues of energy security. Photo: Fermín Rodríguez/Nurphoto/Rex/Shutterstock

Much of the attention in battery research has focused on maximizing energy storage in the smallest and lightest containers suitable for electric vehicles. This development was crucial for the transition away from carbon-intensive gasoline and diesel, which are significant contributors to global warming. It also led to substantial reductions in the costs associated with lithium-ion batteries.

As with many aspects of the shift from fossil fuels to electric technologies, China is driving demand at an incredible scale. According to data from Benchmark Mineral Intelligence, China has installed batteries with a capacity of 215 gigawatt hours (GWh).

China’s battery installations are expected to nearly quadruple by the end of 2027 as new projects are completed. For instance, the state-owned China Energy Engineering Corporation recently bid on a 25GWh battery project utilizing lithium iron phosphate technology, typically used in more affordable vehicles.

Global battery storage capacity by country

Iola Hughes, research director at a Benchmark subsidiary, Rho Motion, stated that declining prices and increased adoption of renewable energy are propelling the rise in demand. By 2027, total global battery storage installations could increase fivefold, Hughes noted, adding, “This figure could rise even further as technological advancements and reduced costs enable developers to construct battery energy storage systems at an unprecedented pace.”

The majority of this growth (95% of current figures) will involve projects utilizing lithium-ion batteries, including a site in Aberdeenshire managed by UK-based Zenobē Energy, which claims to have “the largest battery in Europe.”

Energy storage companies harnessing alternative technologies must navigate a challenging landscape, securing early-stage funding while proving that their technologies are economically viable. Invinity’s flow batteries use vanadium, while US-based rival Eos Energy employs zinc. Flow batteries tend to excel in applications requiring storage durations of six to eight hours or more, where lithium-ion batteries typically fall short.


Cara King, an R&D scientist at Invinity Energy Systems, holds a vial of vanadium electrolyte in various states of charge. Photo: Murdo Macleod/The Guardian

Flow batteries exploit the property of certain metals that can exist stably with varying electron counts. Each shipping container holds two tanks of vanadium ions with different electron counts — one solution royal purple, the other Irn-Bru red. The system pumps the vanadium solutions past a membrane stack that lets protons through, while electrons travel around the external circuit to provide power. If solar panels or wind turbines drive electrons in the opposite direction, the process reverses and the battery charges; each unit can deliver up to 300 kilowatts.
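The reversible charge-discharge cycle described above can be sketched as a simple state-of-charge model. The 300 kW rating comes from the article; the 1,000 kWh capacity and 80% round-trip efficiency are illustrative assumptions only.

```python
# Minimal sketch of a flow battery's state of charge. The 1,000 kWh
# capacity and 80% efficiency are hypothetical, not vendor figures.

def step_soc(soc_kwh: float, power_kw: float, hours: float,
             capacity_kwh: float = 1000.0, efficiency: float = 0.8) -> float:
    """Advance state of charge; positive power charges, negative discharges."""
    if power_kw >= 0:
        soc_kwh += power_kw * hours * efficiency   # losses while charging
    else:
        soc_kwh += power_kw * hours / efficiency   # draw more than delivered
    return min(max(soc_kwh, 0.0), capacity_kwh)

soc = 0.0
soc = step_soc(soc, +300, 2.0)   # two hours of charging from solar
soc = step_soc(soc, -150, 1.0)   # one hour discharging into the grid
print(round(soc, 1))             # remaining usable energy in kWh
```

The same electron flow runs in both directions, which is why a single function with a sign flip captures both charging and discharging.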

A significant benefit of flow batteries is their relative ease of manufacture compared with lithium-ion counterparts. Invinity assembles its battery stacks with just 90 employees, largely using parts sourced in Scotland.

Over a project’s lifespan, Mullen maintains, “on a cost-per-cycle basis, it offers more value than lithium.” While the upfront costs are higher than for lithium batteries — Invinity estimates around £100,000 per container — a longer lifespan without capacity loss, and the absence of flammability, which removes the need for costly fire-safety equipment, offset the difference. The shipping containers are already deployed next to Vibrant Motivation in Bristol, Oxford Auto Chargers, casinos in California, and solar parks in South Australia.
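Mullen’s “cost-per-cycle” claim can be illustrated with a back-of-envelope comparison. The £100,000 container price is from the article; the capacities, cycle counts, and lithium figures below are hypothetical placeholders, not vendor data.

```python
# Hypothetical lifetime cost per kWh cycled. A battery that is pricier
# upfront but survives far more cycles can still win on this metric.

def cost_per_kwh_cycled(upfront_gbp: float, capacity_kwh: float,
                        cycles: int) -> float:
    return upfront_gbp / (capacity_kwh * cycles)

flow = cost_per_kwh_cycled(100_000, 250, 15_000)   # long life, no fade assumed
lithium = cost_per_kwh_cycled(60_000, 250, 5_000)  # cheaper upfront, fewer cycles
print(f"flow: £{flow:.3f}/kWh, lithium: £{lithium:.3f}/kWh")
```

With these placeholder numbers the flow battery cycles energy at roughly half the lithium cost, which is the shape of the argument Mullen is making.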

“We can commission the entire site within a few days,” Mullen remarked.

Invinity is valued at just over £90 million on London’s AIM junior stock market and aspires for the UK to spearhead the flow battery niche.

UK manufacturing could be favorably considered in government contests for support under a “cap and floor” scheme that ensures electricity prices remain within a specified range. Should they succeed, the company anticipates a substantial increase in production from its current rate of five containers per week. Mullen envisions the possibility of employing up to 1,000 workers if the company flourishes.

“The potential for growth is immense,” Mullen stated. “We have moved past the question of whether the technology can scale effectively.”

Source: www.theguardian.com

Why Solar Power is the Most Sustainable Energy Source for the Future

Covering just 0.3% of Earth’s land area with solar panels could fulfill all of the world’s energy requirements

VCG via Getty Images

Solar energy has been gaining traction for years, and it’s easy to see why. It represents one of the most economical ways to produce energy almost anywhere and stands as a vital measure against climate change.

However, there are skeptics. U.S. Energy Secretary Chris Wright asserts that solar energy cannot meet global energy demands. Many experts highlight that this claim is fundamentally misguided. Over time, sunlight—along with wind energy—offers the only reliable power source capable of satisfying escalating energy demands without harming the planet.

On September 2nd, Wright posted on the social media platform X, stating, “Even if we covered the entire planet with solar panels, it would only generate 20% of the world’s energy. One of the greatest mistakes politicians make is equating electricity with energy!”

First and foremost, electricity is a form of energy and is quantified by the energy it delivers, so treating electricity as energy is entirely reasonable.

Climate scientist Gavin Schmidt, of NASA’s Goddard Institute for Space Studies, remarked on Bluesky that the total energy content of all fuels used globally in 2024 was approximately 186,000 terawatt-hours. The Earth, he emphasized, receives 6,000 times that amount of energy from the sun each year.

Moreover, Schmidt noted that because a large share of fossil fuel energy — roughly two-thirds — is typically wasted in conversion to useful work, the sun actually delivers around 18,000 times the energy needed to satisfy current levels of useful consumption.
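A quick check of the arithmetic behind these ratios, using the figures quoted above (the 18,000× figure implies that roughly two-thirds of fuel energy is lost in conversion):

```python
# Checking the quoted ratios against each other.
fuel_energy_twh = 186_000   # total energy content of all fuels burned, 2024
solar_ratio = 6_000         # sunlight reaching Earth vs fuel energy, per year
waste_fraction = 2 / 3      # share of fuel energy lost in conversion (implied)

useful_demand_twh = fuel_energy_twh * (1 - waste_fraction)
sunlight_twh = fuel_energy_twh * solar_ratio
print(round(sunlight_twh / useful_demand_twh))  # → 18000
```

The 6,000× and 18,000× figures are therefore consistent: discounting the wasted share of fuel energy simply scales the first ratio up by a factor of three.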

While existing solar panels capture only around 20% of available solar energy and cannot be installed everywhere, a 2021 report by Carbon Tracker estimated that covering merely 0.3% of the world’s land area with panels would meet current energy needs through solar alone. That footprint is smaller than the land already occupied by fossil fuel infrastructure. Overall, the report found that solar and wind could provide over 100 times the current global energy demand.

That is fortunate, because our current reliance on fossil fuels is already driving hazardous climate change. But what about nuclear fusion? If it becomes a feasible option, would it surpass solar energy?

The answer is no. Eric Chaisson at Harvard University has calculated that, even with modest growth in global energy demand, the waste heat generated by such sources could raise global temperatures by 3°C within three centuries. This is the heat released by everyday activity — boiling a kettle, running a computer — whenever the energy produced is consumed.

Solar energy—along with wind, tides, and waves—draws on energy flows already present at Earth’s surface: that energy ends up as heat whether or not we harness it first, so its waste heat adds nothing. Sources such as nuclear fission and fusion, by contrast, release heat that would not otherwise be there.

“[Carl] Sagan preached it to me, and I now relay that message to students: any planet must ultimately live on the energy it receives,” Chaisson told New Scientist in 2012.

Though three centuries is a long time, the implications of waste heat are already measurable. Studies indicate that waste heat has raised maximum summer temperatures in Europe by 0.4°C. By 2100, average annual temperatures in some heavily industrialized regions may rise by nearly 1°C due to waste heat—effects not currently included in climate models.

Ultimately, solar and wind are the only technologies that can sustainably meet global energy demand for centuries to come without triggering catastrophic warming. Wright’s claim couldn’t be more misguided.


Source: www.newscientist.com

Is Africa on the Verge of a Solar Energy Revolution?

Explosive growth of solar energy and panels in Niamey, Niger

Boureima Hama/AFP via Getty Images

A remarkable increase in solar panel shipments from China to African nations over the past year suggests a significant boost in the continent’s renewable energy infrastructure. This growth facilitates broader access to affordable and clean electricity while decreasing the dependency on imported fossil fuels.

“We’re not witnessing a huge explosion yet,” says Dave Jones from Ember, a UK energy think tank. “This marks the beginning of momentum.”

Jones and his team examined export data for Chinese solar panels from 2017 onwards. Africa has little solar panel manufacturing capacity of its own and relies on Chinese imports for nearly all its needs, making these figures a good proxy for the continent’s solar growth.

In the year to June 2025, exports to Africa soared by 60%, with more than 15 gigawatts of panel capacity shipped during this period.

This recent surge differs from earlier increases in 2022 and 2023, which were mainly concentrated in South Africa; now the growth is evident across the continent. Twenty nations set import records, and 25 imported at least 100 megawatts of solar panels each. “It’s not driven by one or two countries,” notes Jones, “which I find incredibly encouraging.”

While South Africa continues to lead, accounting for about a quarter of total imports, several other nations significantly increased their acquisitions. Nigeria ranks second with 1,721 megawatts, followed by Algeria, which imported 1,199 megawatts in total. In the last two years, imports of solar panels from China to African countries (excluding South Africa) have more than tripled.

If all the panels imported in the past year were installed, an estimated 16 countries could meet at least 5% of their current electricity needs; Sierra Leone could potentially generate over 60% of its existing power from solar energy. The shift towards solar could also reduce reliance on costly fossil fuel imports.
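Estimates like these reduce to a simple calculation: panel capacity times a capacity factor times the hours in a year gives annual generation, which is then compared with national demand. All the numbers below are hypothetical illustrations, not figures for any specific country.

```python
# Rough sketch: what share of a country's electricity could imported
# panels supply? All inputs here are illustrative assumptions.

def annual_share(panel_mw: float, capacity_factor: float,
                 demand_gwh: float) -> float:
    """Fraction of annual electricity demand covered by the panels."""
    generation_gwh = panel_mw * capacity_factor * 8760 / 1000  # MW -> GWh/yr
    return generation_gwh / demand_gwh

# e.g. a hypothetical country importing 500 MW of panels, with a 20%
# capacity factor and 15,000 GWh of annual demand:
print(f"{annual_share(500, 0.20, 15_000):.1%}")  # → 5.8%
```

The capacity factor — the fraction of nameplate output actually achieved over a year — is the key assumption; for solar in sunny regions it typically sits around 15–25%.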

“The transition towards a just-energy Africa is no longer a distant goal; it is happening right now,” asserts Amos Wemanya, of Power Shift Africa, a Kenyan energy think tank. “This transition holds the promise to significantly enhance our resilience against climate disruptions and foster development.”

This surge can be attributed partly to substantial solar power projects in development; however, that isn’t the full story. Jones emphasizes that many imports are destined for small, distributed installations, such as rooftops and farms, as users seek more affordable and reliable alternatives to national grid power. A similar pattern has emerged in Pakistan, where rooftop solar has seen explosive growth in recent years, driven by falling panel prices.

The trend is promising, but around 600 million people in Africa—almost half the continent’s population—still lack dependable access to electricity, and solar development on the continent continues to lag behind other regions. African countries have attracted only around 2% of global renewable energy investment in recent decades. Over the past year, Pakistan alone imported more solar panels than all of Africa combined, despite having roughly one-sixth of Africa’s population.

“Our key challenge is to transform this momentum into sustainable benefits by amending funding, policies, and local industries to ensure that clean energy is not only accessible but also reliable, affordable, and inclusive for all Africans,” concludes Wemanya.



Source: www.newscientist.com

Maximize Metal Resources for Clean Energy Without New Mining Operations


Open-pit mining at the Kennecott Copper Mine, also referred to as the Bingham Canyon Mine in Utah

Witold Skrypczak/Alamy

The leftover ore discarded by US mines is rich in vital minerals—sufficient to supply all the components needed for clean energy technologies. By reclaiming a portion of these minerals, the country could satisfy its rising demand for green energy without relying on imports or opening new mines, although extracting them poses its own challenges.

“We must enhance our utilization of mining resources,” states Elizabeth Holley of the Colorado School of Mines.

Traditionally, most individual mines concentrate on extracting a limited range of minerals, such as copper and gold. This involves excavating and grinding the ore, then separating the primary product through various metallurgical processes; the residue is discarded as tailings. “It’s pointless to mine if we’re not utilizing all the resources,” says Holley.

These byproducts often contain additional valuable materials, including many crucial minerals identified by the US government as essential for military and energy technologies like solar panels, wind turbines, and batteries. However, certain supply chains for these minerals are controlled by China, raising urgent concerns for the US and its allies, prompting a search for alternative mineral sources, including mining byproducts and tailings.

Yet many mining operations lack a clear understanding of what they are discarding. “Numerous minerals that are now deemed critical were seldom employed in the past, so they weren’t analyzed for recovery,” remarks Holley.


Holley and her colleagues examined thousands of ore samples and production data from mines across the US. They used this information to project the quantity of additional minerals that could be retrieved from 54 active hard rock metal mines if new purification steps were implemented.

In some cases, recovering just 1% of the minerals contained in mining byproducts would be enough to replace imports. Other minerals would require recovery rates in the 10-90% range. And for certain metals, such as gold, platinum, and palladium, even 100% recovery from byproducts would not eliminate the need for imports.
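The comparison underlying these findings reduces to a simple ratio: the recovery rate needed is import demand divided by the amount of the mineral passing through byproduct streams. The tonnages below are hypothetical illustrations, not figures from the study.

```python
# Sketch of the study's core comparison, with made-up tonnages.

def required_recovery(import_demand_t: float, in_byproducts_t: float) -> float:
    """Fraction of a mineral in mine byproducts needed to cover imports."""
    return import_demand_t / in_byproducts_t

# A mineral with 400 t of annual import demand but 50,000 t passing
# through mine waste streams needs under 1% recovery:
print(f"{required_recovery(400, 50_000):.1%}")   # → 0.8%
```

When this ratio comes out above 1.0, even perfect recovery from byproducts cannot replace imports — the situation the study found for metals such as gold, platinum, and palladium.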

These findings imply that the US could fulfill much of its growing demand for critical minerals without the need for new mines, according to Holley. This strategy would help secure a stable supply chain and mitigate the environmental impacts of mining. “It makes more sense to optimize what we’re already mining,” she asserts.

According to Brian McNulty from the University of British Columbia in Canada, this presents “a significant opportunity,” although further research is required to transform estimates of mineral amounts into actual recoverable quantities. “We hope to not only engage government but also industry, encouraging a more thorough assessment of our mining practices,” he comments.

Identifying where these minerals are isn’t the only challenge. Current purification technologies are poorly suited to these small, complex waste streams, and deploying them is prohibitively expensive for many US mines, says Megan O’Connor of Nth Cycle, a company that specializes in extracting critical minerals from unconventional sources.

Mines may also hesitate to invest in new mineral extraction methods when future demand remains uncertain. Whether concerning electric vehicle batteries or solar panels, “technological advancements occur significantly faster than changes in mining practices,” notes McNulty.

Despite its skepticism toward renewable energy, the Trump administration has made US mineral production a key part of its agenda. The Department of Energy (DOE) recently announced nearly $1 billion in funding for unconventional mining initiatives, including $250 million aimed at recovering minerals from mining byproducts.

A spokesperson from the DOE asserts that the tailings at these mines represent “a significant opportunity within the nation” and could assist the United States in diversifying its sources of critical minerals and materials.

Nonetheless, this does not diminish support for new mines, as stated by the agency’s executive director, P. Wells Griffith III, during a DOE strategy workshop on August 20th. “We should never apologize for modern lifestyles and our abundant natural resources,” he affirmed.


Source: www.newscientist.com

Dwarf Planet Ceres Might Have Hosted a Lasting Source of Chemical Energy to Support Habitability

While there is no conclusive evidence of microorganisms on Ceres, recent research bolsters the theory that this dwarf planet may have once harbored conditions conducive to single-cell life.



An illustration of Ceres’ interior, highlighting the movement of water and gas from the rocky core to the saltwater reservoir. Carbon dioxide and methane are chemical energy carriers beneath Ceres’ surface. Image credit: NASA/JPL-Caltech.

Previous scientific data from NASA’s Dawn Mission indicated that bright reflective areas on Ceres’ surface were formed from salt left behind by liquid that seeped from below ground.

A subsequent 2020 analysis identified that this liquid originated from a vast reservoir of subsurface brine.

Additional studies found organic materials in the form of carbon molecules on Ceres. While this alone doesn’t confirm the existence of microbial life, it is a crucial component.

Water and carbon molecules are two fundamental aspects of the habitability puzzle for this distant world.

The latest findings suggest that ancient chemical energy on Ceres could have supported the survival of microorganisms.

This does not imply that Ceres currently hosts life, but if it did, “food” sources are likely to have been available.

In a new study led by Dr. Sam Courville from Arizona State University and NASA’s Jet Propulsion Laboratory, a thermal and chemical model was developed to simulate the temperature and composition within Ceres over time.

They discovered that approximately 2.5 billion years ago, Ceres’ underground oceans possibly maintained a stable supply of warm water with dissolved gases emanating from metamorphic rocks in the rocky core.

The heat originated from the decay of radioactive elements within the planet’s rocky interior, a process typical in our solar system.

“On Earth, when hot water from deep underground meets ocean water, it frequently creates a hotspot for microbial life, releasing a wealth of chemical energy,” stated Dr. Courville.

“Therefore, if Ceres’ oceans experienced hydrothermal activity in the past, it would align well with our findings.”

As it stands, Ceres is not likely to be habitable today, being cooler and having less ice and water than it once did.

At present, the heat from radioactive decay in Ceres is inadequate to prevent water from freezing, resulting in highly concentrated saltwater.

The window during which Ceres was likely habitable spans roughly 0.5 to 2 billion years after the dwarf planet formed, coinciding with when its rocky core peaked in temperature.

This is when warm liquid water would have been introduced into Ceres’ groundwater.

Dwarf planets generally lack the benefit of ongoing internal heating due to tidal interactions with larger planets, unlike Enceladus and Europa, moons of Saturn and Jupiter, respectively.

Thus, the highest potential for a habitable Ceres existed in its past.

“Since then, Ceres’ oceans are likely to be cold, concentrated saltwater with minimal energy sources, making current habitability unlikely,” the authors concluded.

A paper detailing these findings was published today in the journal Science Advances.

____

Samuel W. Courville et al. 2025. Core metamorphism controls the dynamic habitability of mid-sized ocean worlds – the case of Ceres. Science Advances 11 (34); doi: 10.1126/sciadv.adt3283

Source: www.sci.news

Bill McKibben Delivers an Inspiring Case for Solar Energy in His Latest Book

Solar panels in a Sichuan pepper field in Bijie, China

STR/AFP via Getty Images

Here Comes the Sun
Bill McKibben (W. W. Norton; UK September 16, US August 19)

These are bright days for solar energy. According to energy think tank Ember, solar has been the fastest-growing power source globally for the past two decades.

In 2022, global solar generation capacity surpassed 1 terawatt for the first time; just two years later it had doubled, and solar now contributes 7% of the world’s electricity supply. Including wind turbines, which harvest the sun’s energy by other means, solar accounted for 15% of global electricity last year.

This surge in solar energy is not simply due to an increased commitment to climate goals. Indeed, as noted in another Ember Report, many renewable energy targets have barely made progress towards achieving net-zero emissions over the past decade.

The true driver behind the rise of solar is its position as the most cost-effective method of electricity generation almost everywhere.

In his book Here Comes the Sun: The Last Chance for Climate and a New Chance for Civilization, long-time climate advocate Bill McKibben asserts that we are on the brink of a critical historical transition—from reliance on fossil fuels to embracing solar energy. “We are looking to the heavens for energy instead of to hell,” McKibben writes.

In it, he provides a thoughtful exploration of how solar energy could not only address climate change in time but also transform the relationship between the economy and the natural world.

This is not the first call to action for a swift transition to renewable sources. However, it offers a visionary glimpse of what a solar-powered society could look like, going beyond just technological and economic considerations during the energy shift.

Solar-led energy transitions may be inevitable, but they may not happen quickly enough.

“This critical transformation is now presented as the most significant bargain ever, yet it remains cloaked in mysteries we have yet to fully unravel,” he notes.

This optimism comes from McKibben, a renowned voice in environmentalism ever since his first book, The End of Nature, alerted the world to the climate crisis.

Rather than detailing the ongoing damage from climate change, he emphasizes the numerous advantages of increased solar power, including more stable energy prices and reduced reliance on fossil fuel-rich states.

On a spiritual note, he suggests that this shift may rekindle our deep respect for the sun and its immense power.

McKibben also engages with skeptics of renewable energy, providing a balanced perspective on the trade-offs of the energy transition, such as rising mineral demand, land use, and potential job losses in fossil fuel industries. His argument is reinforced by anecdotes from energy transitions around the world, including an approving mention of the Kentucky Coal Mining Museum’s switch to solar power to cut costs.

Nevertheless, doubts linger about the feasibility of McKibben’s optimistic outlook. A significant portion of the rapid growth in solar energy is currently occurring in China, which has unique advantages such as central planning and a distinct political structure that may not be replicable elsewhere. This rapid pace may not even be sustainable in China itself.

In the US, despite remarkable growth in recent years, the solar industry now contends with the administration’s hostility toward renewables. The loss of tax credits that once leveled the playing field with subsidized fossil fuels, along with local opposition to solar projects, further complicates future growth.

As McKibben acknowledges, both can be true: solar-driven energy transitions may be on the horizon, but reductions in emissions might not happen swiftly enough to avert further drastic impacts of global warming. “It won’t be easy, but it’s necessary,” he asserts. “We must cease burning, or we will face dire consequences.”

Personally, I resonate with this perspective—I’d much prefer to bask in the sunlight.


Source: www.newscientist.com

Small Discs Can Ascend to the Upper Atmosphere Solely Using Solar Energy


Illustration of a solar-powered levitating disc

Schafer et al. Nature

A tiny disc, roughly the size of a fingernail, could ascend to high altitudes in sunlight while carrying sensors through some of the coldest and thinnest parts of the atmosphere. Swarms of them, flying higher than commercial aircraft and balloons, could reveal new insights into Earth’s changing weather and climate.

These floating devices harness a phenomenon known as photophoresis, first observed over 150 years ago when chemist William Crookes invented the radiometer, a device whose vanes—black on one side, white on the other—spin when exposed to light. The vanes absorb light and release heat, boosting the momentum of nearby gas molecules. Because the black side runs hotter than the white side, it imparts more momentum to the gas, pushing air in one direction with enough force to turn the vanes.

“We’ve taken this lesser-known piece of physics and developed applications that could benefit many people, enhancing our understanding of how weather and climate change unfold over time,” says Ben Schafer at Harvard University.

To create the levitating disc, Schafer and his team built a device 1 centimeter across, composed of two sheets of aluminum oxide perforated with microscale holes. When illuminated, the lower sheet, which contains alternating layers of chromium and aluminum oxide, heats up more than the top sheet, much like the black sides of the radiometer vanes. This drives air through the holes from top to bottom, generating upward lift rather than sideways motion.

Under white LED and laser illumination — set to an intensity that mimics about 50% of natural sunlight — this upward force successfully lifted the device. This represents progress over previous solar-powered flyers, which required light intensity significantly brighter than sunlight. However, the tests were conducted under laboratory conditions with air pressure much lower than Earth’s surface pressure.

Fortunately, such low pressures are common at high altitudes, especially in the mesosphere, which spans 50-85 km above the Earth. The researchers calculate that scaling the disc up to 3 centimeters could enable it to carry a 10-milligram payload to hard-to-reach research regions at altitudes of 75 km. Schafer has co-founded a startup, Rarefied Technologies, aiming to commercialize fleets of these high-flying devices for environmental monitoring and communications.
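Why a modest size increase buys a useful payload can be seen from scaling: if photophoretic lift and the membrane’s own mass both grow with disc area, the spare lift grows roughly ninefold from a 1 cm disc to a 3 cm one. The per-area figures below are illustrative assumptions, not values from the paper.

```python
# Back-of-envelope scaling sketch: lift and membrane mass both scale
# with area, so the payload margin scales with area too. The per-cm^2
# values are made-up placeholders.
import math

def payload_margin_mg(diameter_cm: float,
                      lift_per_cm2_mg: float = 2.0,
                      areal_mass_mg_per_cm2: float = 1.0) -> float:
    """Spare lift (mg) left over after supporting the membrane itself."""
    area = math.pi * (diameter_cm / 2) ** 2
    return area * (lift_per_cm2_mg - areal_mass_mg_per_cm2)

print(round(payload_margin_mg(1.0), 2))  # margin for the 1 cm disc
print(round(payload_margin_mg(3.0), 2))  # grows 9x with a 3 cm disc
```

Tripling the diameter multiplies the area, and hence the spare lift, by nine — consistent with the jump from a disc that barely lofts itself to one carrying a 10-milligram payload.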

Computer modeling indicates that after sunset the discs could stay aloft on the heat radiating from Earth’s surface. “If they can stay afloat during the night rather than simply descending and landing, that represents a significant advancement,” says Igor Bargatin at the University of Pennsylvania, who conducts similar research.


Source: www.newscientist.com

OpenAI Withholds GPT-5 Energy Consumption Details, Potentially Exceeding Previous Models

Ask OpenAI’s ChatGPT for an artichoke pasta recipe, or for guidance on rituals honoring Moloch, the ancient Canaanite deity, and in mid-2023 the answer would have cost around 2 watt-hours—roughly the energy an incandescent bulb consumes in two minutes.

On Thursday, OpenAI unveiled GPT-5, the new model powering its widely used chatbot. Experts suggest that generating similar recipe text could now consume several times more energy—potentially up to 20 times as much.

GPT-5’s release brought the ability to answer PhD-level scientific questions and to explain its reasoning on complex problems.

Nevertheless, specialists who have assessed energy and resource consumption of AI models over recent years indicate that these newer variants come with a cost. Responses from GPT-5 may require substantially more energy than those from earlier ChatGPT models.

Like many of its rivals, OpenAI has not provided official data on the power consumption of its models since announcing GPT-3 in 2020. In June, Sam Altman, OpenAI’s chief executive, discussed ChatGPT’s resource usage on his blog. However, the figures he presented—0.34 watt-hours and 0.000085 gallons of water per query—named no specific model and came with no supporting documentation.

“More complex models like GPT-5 require greater power during both training and inference, leading to a significant increase in energy consumption compared with GPT-4,” says Rakesh Kumar, a professor at the University of Illinois.

On the day GPT-5 launched, researchers at the University of Rhode Island’s AI lab found that the model could consume up to 40 watt-hours to generate a medium-length response of approximately 1,000 tokens.

A dashboard the group released on Friday indicates that GPT-5’s average energy use for a medium-length response is just over 18 watt-hours—higher than any other model they track except OpenAI’s o3 reasoning model, launched in April, and R1, developed by Chinese AI firm Deepseek.

According to Nidhal Jegham, a researcher in the group, this is “significantly more energy than OpenAI’s prior model, GPT-4o.”

To put that in perspective, 18 watt-hours equates to running that incandescent light bulb for 18 minutes. Recent reports indicate that ChatGPT handles 2.5 billion requests daily; if all were answered by GPT-5, its total energy consumption could match the daily electricity use of 1.5 million American homes.
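That comparison checks out arithmetically; the roughly 30 kWh daily consumption per US household used below is an assumption here, not a figure from the article.

```python
# Sanity-checking the household comparison from the figures above.
queries_per_day = 2.5e9
wh_per_query = 18                # average GPT-5 energy per medium response
us_household_kwh_per_day = 30    # rough US average; an assumption here

total_kwh = queries_per_day * wh_per_query / 1000
households = total_kwh / us_household_kwh_per_day
print(f"{households / 1e6:.1f} million households")  # → 1.5 million households
```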

Despite these figures, experts in the field say they align with expectations for GPT-5’s energy consumption, given its significantly larger scale than OpenAI’s earlier models. Since GPT-3, OpenAI has not disclosed the parameter count of any of its models; that earlier model contained 175 billion parameters.

This summer, insights from French AI company Mistral highlighted a “strong correlation” between model size and energy use, based on their internal systems research.

“The resource consumption implied by the model size [of GPT-5] is noteworthy,” observed Shaolei Ren, a professor at the University of California, Riverside. “We are facing a significant AI resource footprint.”

AI Power Usage Benchmark

GPT-4 was widely regarded as roughly 10 times larger than GPT-3. Jegham, Kumar, and Ren believe GPT-5 is likely larger still.

Major AI companies like OpenAI assert that significantly larger models may be essential for achieving artificial general intelligence (AGI), an AI system able to match humans at most tasks. Altman has emphasized this perspective, stating in February: “It seems you can invest any amount and receive continuous, predictable returns,” though he has said GPT-5 does not surpass human intelligence.


According to benchmarks from a study performed in July, Mistral’s Le Chat chatbot exhibited a direct correlation between model size and its resource usage in terms of power, water, and carbon emissions.

Jegham, Kumar, and Ren indicated that while the scale of GPT-5 is crucial, other factors will likely influence resource consumption. GPT-5 runs on more efficient hardware than previous iterations. It also employs a “mixture-of-experts” architecture, in which only a subset of parameters is active for any given response, which could help reduce energy use.

Moreover, since GPT-5 operates as a reasoning model and processes images and video in addition to text, it is expected to have a larger energy footprint than text-only processing, according to Ren and Kumar.

“In reasoning mode, the resources spent to achieve identical outcomes can escalate by five to ten times,” remarked Ren.

Hidden Information

To assess the resource consumption of AI models, a team from the University of Rhode Island calculated the average time taken by the model to answer queries—such as pasta recipes or offerings to Moloch—multiplied by the average power draw of the model during operation.

Estimating the models’ power draw involved significant effort, said Abdeltawab Hendawi, a professor of data science at the University of Rhode Island. The team struggled to find information about how various models are deployed within data centers. Their final paper includes estimates of chip usage for specific models and of how queries are distributed among different chips in the data centers.
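The estimation approach described above, multiplying average response time by average power draw, can be sketched in a few lines. All the numbers here (latency, node power, queries served concurrently) are hypothetical placeholders, not figures from the study:

```python
# Minimal sketch of a time-times-power energy estimate per query.
def energy_per_query_wh(latency_s: float, node_power_w: float,
                        concurrent_queries: int) -> float:
    """Estimate watt-hours per query for a model served on one node."""
    # Each query effectively occupies 1/concurrent_queries of the node
    # for latency_s seconds.
    node_energy_wh = node_power_w * latency_s / 3600
    return node_energy_wh / concurrent_queries

# Hypothetical example: 24 s median response, a 10 kW multi-GPU node,
# 4 queries served concurrently per node.
print(f"{energy_per_query_wh(24, 10_000, 4):.1f} Wh/query")
```

The hard part, as the researchers note, is not this arithmetic but obtaining realistic values for node power and query concurrency, which AI firms rarely disclose.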

Altman’s June blog post lent weight to their results: the energy consumption he cited for a ChatGPT query, 0.34 watt-hours, closely matches the team’s findings for GPT-4o.

Hendawi, Jegham, and other team members emphasized the need for greater transparency from AI firms when releasing new models.

“Addressing the true environmental costs of AI is more critical now than ever,” stated Marwan Abdelatti, another professor on the team. “We urge OpenAI and other developers to commit to full transparency in disclosing the environmental impact of GPT-5.”

Source: www.theguardian.com

When Redshift Occurs, What Happens to Light’s Energy? It’s Complex.

“It can be hard to comprehend the vastness involved…”

Science Photo Library/Alamy

Many of us can relate to concerns about inflation. The rising cost of living weighs heavily on our minds, and we often scrutinize what political leaders are doing in response. Yet it’s essential to recognize the terminology issues present in physics, especially since inflation carries a vastly different meaning in this context.

In cosmology, inflation refers to a model that explains why our universe appears so expansive. The theory posits that space-time underwent extremely rapid expansion for a tiny fraction of a second, leaving regions of the universe that can no longer communicate with each other but were once in contact.

Understanding such immense scales can be a challenge. How do we truly grasp these vast distances that exceed our everyday experiences? Last month’s column tackled this concept by addressing distance measurement techniques. Yet, this inquiry itself unfolds layers of complexity.

In that discussion, I highlighted how redshift serves as a crucial tool for gauging distances in space. Imagine a wave drawn on the surface of a balloon: as the balloon inflates, the wave’s peaks and troughs stretch apart. This mirrors how light behaves as it travels across the expanding fabric of space-time. The light stretches, increasing its wavelength.

This shift in light wavelengths enables distance calculation. By measuring the wavelength of light from a distant object and comparing it with the wavelength at which that light was emitted, we can discern how much space-time has expanded between us and the source. Such redshift measurements are consistently corroborated by both astronomical observations and laboratory experiments.

However, deeper questions linger. From a quantum standpoint, light’s wavelength is tied to its energy content. The stretching of light reduces its energy, resulting in a redshift effect. This phenomenon isn’t merely a nuisance; rather, it presents intriguing insights about quantum mechanics within cosmological discussions.
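The quantum relation at work is E = hc/λ: stretching a photon’s wavelength by a factor of (1 + z) reduces its energy by that same factor. A short sketch, using an illustrative 500 nm photon at redshift z = 1:

```python
# Photon energy falls as wavelength stretches: E = h*c / wavelength.
# A photon redshifted by factor (1 + z) arrives with 1/(1 + z) of its energy.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def photon_energy_j(wavelength_m: float) -> float:
    return H * C / wavelength_m

emitted = 500e-9               # green light emitted at 500 nm (example)
z = 1.0                        # example redshift
observed = emitted * (1 + z)   # wavelength stretched to 1000 nm

ratio = photon_energy_j(observed) / photon_energy_j(emitted)
print(f"Energy retained after redshift: {ratio:.2f}")
```

At z = 1 the photon arrives with exactly half the energy it left with, which is precisely the “lost” energy the rest of this column is about.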

Energy conservation is a fundamental concept in everyday physics, but even cosmic principles can be bent or broken

What’s the dilemma? We prefer consistent principles across physics domains. A core tenet of everyday physics suggests that energy cannot be created or destroyed, only transformed. Thus, if we apply energy conservation to redshifted light, we face the question: where does the lost energy of light go? A curious reader posed this very question.

The response may be surprising. While energy conservation remains a guiding principle, it seems the cosmic realm can, at times, operate differently. Albert Einstein’s theory of general relativity plays a pivotal role here. Though widely recognized for its insights into the fabric of cosmic time and curvature, it also reveals how space-time itself may expand.

A unique aspect of general relativity is that energy conservation isn’t universally applicable. In an expanding space-time, the energy that light loses through redshift doesn’t have to ‘go’ anywhere; it is simply not conserved.

That’s one way to frame it. Alternatively, we could account for the energy associated with the gravitational field itself. How to reconcile these two perspectives has historically sparked considerable debate; some argue they are two facets of the same reality.

Personally, I contend that the essence of energy remains ambiguous. It’s challenging to delineate, yet it’s palpable in connection to physical entities like particles and stars. But when we turn to the energy entwined with space-time curvature, clarity dissolves. Where exactly is this energy located within the continuum of space and time? How concentrated is it at any given point? These questions resist tidy answers.

Thus, I find myself aligning with those who suggest that strict energy conservation may not be the most useful concept. What stands clear is the interdependence of space-time curvature and energy related to matter. Space-time’s dynamics guide matter’s trajectory, while matter’s mass (akin to energy) influences how space-time will behave.

Chanda’s Week

What I’m reading

Riley Black’s When the Earth Was Green: Plants, Animals, and Evolution’s Greatest Romance. It’s beautiful.

What I’m watching

I’m re-watching Star Trek: Strange New Worlds from the start.

What I’m working on

We are pondering how the NewAthena X-ray observatory could deepen our understanding of neutron star interiors.

Chanda Prescod-Weinstein is an associate professor of physics and astronomy at the University of New Hampshire. She is the author of The Disordered Cosmos and the forthcoming book, “Edges of Space-Time: Particles, Poetry, and the Universe’s Dreamscape.”

Topics:

  • Quantum Physics/
  • Space-Time

Source: www.newscientist.com

Transforming Retired Coal Plants into Green Energy Sources

Decommissioned coal power plant at the former Indiana Army Ammunition Plant

American Explorer/Shutterstock

Numerous decommissioned coal-fired power plants could become reliable backup or emergency energy sources for the grid, not by burning fossil fuels, but by tapping thermal energy stored in soil.

The idea involves accumulating a large mound of soil near the coal facility and embedding industrial heaters within it. During periods of low electricity demand, these devices transform inexpensive electricity into heat, storing it in the soil at around 600°C. When electricity demand peaks, the heat can be transferred from the soil through heated liquid pipes.

The stored heat turns water into steam, which spins the coal plant’s existing turbine blades; a generator linked to the turbine converts the motion into electricity. “Rather than burning coal to heat water for steam, we harness heat from the energy stored within the soil,” explains Ken Caldeira from Stanford University in California.
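For a rough sense of scale, the stored energy is just sensible heat, E = m·c·ΔT, discounted by the steam cycle’s conversion efficiency. All figures below (mound size, soil properties, efficiency) are illustrative assumptions, not numbers from the article:

```python
# Rough sensible-heat capacity of a soil mound (all figures hypothetical).
SOIL_DENSITY = 1600          # kg/m^3, typical dry soil
SOIL_SPECIFIC_HEAT = 800     # J/(kg*K), typical for silica-rich soil
CHARGED_T, DISCHARGED_T = 600, 200   # degC storage temperature swing
TURBINE_EFFICIENCY = 0.35    # assumed steam-cycle conversion efficiency

volume_m3 = 100_000          # hypothetical mound, roughly a 58 m cube
mass_kg = SOIL_DENSITY * volume_m3
thermal_j = mass_kg * SOIL_SPECIFIC_HEAT * (CHARGED_T - DISCHARGED_T)
electric_mwh = thermal_j * TURBINE_EFFICIENCY / 3.6e9   # J -> MWh

print(f"Thermal storage: {thermal_j / 3.6e9:.0f} MWh thermal")
print(f"Deliverable electricity: ~{electric_mwh:.0f} MWh")
```

Even with these conservative placeholder numbers, a single large mound stores thousands of megawatt-hours, which is why proponents see soil as a cheap medium for long-duration storage.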

This type of energy storage is crucial in supporting renewable energy sources like wind and solar, which often generate power intermittently. Soil offers a more affordable, abundant, and accessible resource for long-term energy storage compared to alternatives like lithium batteries and hydrogen fuels.

“The most exciting aspect is the low cost of energy capacity, especially since it is significantly cheaper than other energy technologies,” states Alicia Wongel at Stanford University.

Nonetheless, this approach has its challenges. “In such systems, minimizing plumbing and electrical costs is crucial, yet can be difficult,” notes Andrew Maxson from the Electric Power Research Institute, a non-profit research organization based in California.

Most soil consists of naturally heat-resistant materials like silicon dioxide and aluminum oxide, which makes it “very resilient to heat,” says Austin Vernon of Standard Thermal in Oklahoma. His startup aims to commercialize this thermal-storage technology, especially for repurposing retired coal power plants alongside nearby solar and wind farms.

There are many retired coal facilities across the United States. Close to 300 coal-fired power plants were shut down between 2010 and 2019, and an additional 50 gigawatts of coal capacity is expected to reach retirement age by 2030. In the late 2000s, cheaper natural gas and renewable energy began to outcompete coal.

Christian Phong from the Rocky Mountain Institute, a research organization in Colorado, views the idea of repurposing defunct coal plants positively. “This provides an opportunity for local communities to engage in the clean energy transition, generating jobs and additional tax revenue while navigating the shift away from coal,” he remarks.


Source: www.newscientist.com

Deep Microorganisms Capable of Harnessing Energy from Earthquakes



Microorganisms may derive energy from surprisingly confined environments


Fractured rocks from earthquakes could reveal a variety of chemical energy sources for the microorganisms thriving deep beneath the surface, and similar mechanisms may feed microorganisms on other planets.

“This opens up an entirely new metabolic possibility,” says Kurt Konhauser, from the University of Alberta, Canada.

All life forms on Earth rely on flowing electrons to sustain themselves. On the planet’s surface, plants harness sunlight to create carbon-based sugars that are consumed by animals, including humans. This initiates a flow of electrons from the carbon to the oxygen we breathe. The chemical gradient formed by these carbon electron donors and oxygen electron acceptors, known as redox pairs, generates energy.

Underground, microbes also depend on redox pairs, but these deep ecosystems lack access to various solar energy forms. Hence, traditional carbon-oxygen pairings are inadequate. “Challenges remain in identifying these underground [chemical gradients]. Where do they originate?” Konhauser questions.

Hydrogen gas, generated by the interaction of water and rock, serves as a primary electron source for these microbes, much like carbon sugars do on the surface. This hydrogen arises from the breakdown of water molecules, which can occur when radioactive rocks react with water or iron-rich formations. During earthquakes, when silicate rocks are fragmented, they expose reactive surfaces that can split water, producing considerable amounts of hydrogen.

However, to utilize that hydrogen, microorganisms require electron acceptors to complete the redox pair. Hydrogen alone is not enough. “Having the food is great, but without a fork, you can’t eat it,” remarks Barbara Sherwood Lollar from the University of Toronto, Canada.

Konhauser, Sherwood Lollar, and their research team employed rock-crushing machines to simulate the reactions that yield hydrogen gas in geological settings and could subsequently form a complete redox pair. They crushed quartz crystals to mimic the stresses in various types of faults, then mixed the fragments with the water present in most rocks and with different forms of iron.

The crushed quartz reacted with water to generate significant quantities of hydrogen, both in stable molecular forms and more reactive species. The team’s findings revealed many of these hydrogen radicals react with iron-rich liquids, creating numerous compounds capable of either donating or accepting enough electrons to establish different redox pairs.

“Numerous rocks can be harnessed for energy,” Konhauser pointed out. “These reactions mediate diverse chemical processes, suggesting various microorganisms can thrive.” Secondary reactions involving nitrogen or sulfur could yield even broader energy sources.

“I was astonished by the quantities,” said Magdalena Osburn from Northwestern University, Illinois. “It produces immense quantities of hydrogen, and it also initiates fascinating auxiliary chemistry.”

Researchers estimate that earthquakes generate far less hydrogen than other water-rock interactions within the Earth’s crust. However, their insights imply that active faults may serve as local hotspots for microbial diversity and activity, Sherwood Lollar explained.

Importantly, a complete earthquake isn’t a prerequisite. Similar reactions can take place as rocks fracture in seismically stable areas, like continents or geologically dead planets such as Mars. “Even within these massive rocks, you can observe pressure redistributions and shifts,” she noted.

“It’s truly exciting to explore energy sources I didn’t know about until recently,” stated Karen Lloyd from the University of Southern California. The variety of usable chemicals produced at real fault lines is likely even greater. “This likely occurs under varying pressures, temperatures, and across vast spatial scales, involving a broader range of minerals,” she said.

Energy from infrequent events like earthquakes may also illuminate the lifestyles of what Lloyd refers to as aeonophiles—deep subterranean microorganisms thought to have existed for extensive time periods. “If we can endure 10,000 years, we may experience a magnitude 9 earthquake that yields a tremendous energy surge,” Lloyd added.

This research is part of a growing trend over the last two decades that broadens our understanding of where and how organisms can endure underground, states Sherwood Lollar. “The deep rocks of continents have revealed much about the habitability of our planet,” she concluded.



Source: www.newscientist.com

Are Contact Lens Batteries the Future of Energy Storage?


Faraday 2 battery developed by Superdielectrics

Superdielectrics

The innovative battery storage solution, based on supercapacitor technology, may “leapfrog” traditional lithium-ion batteries and transform how renewable energy is stored and used, according to its creator.

On July 8th, British firm SuperDielectrics unveiled its new prototype storage system, dubbed the Faraday 2, at an event in central London. Incorporating a polymer designed for contact lenses, this system boasts a lower energy density than lithium-ion batteries but claims numerous advantages, such as quicker charging, enhanced safety, reduced costs, and a recyclable framework.

“The current energy storage market at home is reminiscent of the computer market around 1980,” said SuperDielectrics’ Marcus Scott while addressing journalists and investors. “Access to clean, reliable, and affordable electricity isn’t a future goal; it’s now a practical reality, and we believe we are creating the technology to support it.”

Energy storage is pivotal for the global transition to green energy, crucial for providing stable electricity despite the intermittent nature of wind and solar power. While lithium-ion batteries dominate the storage technology market, they present challenges, including high costs, limited resources, complex recycling processes, and safety risks like overheating explosions.

With its aqueous battery design grounded in supercapacitor technology, SuperDielectrics aims to address these challenges. Supercapacitors store energy on material surfaces, facilitating extremely rapid charge and discharge cycles, albeit with lower energy density.

The company’s design employs a zinc electrolyte, separated from the carbon electrode by a polymer membrane. SuperDielectrics asserts that this membrane technology is cost-effective, utilizing abundant raw materials, thus unlocking a new generation of supercapacitors with significant energy storage capabilities.

During the event, the company’s CEO Jim Heathcote mentioned that the technology could outperform lithium-ion systems in renewable energy storage.

The Faraday 2 builds on the earlier Faraday 1 prototype launched last year, claiming to double the energy density. The Faraday 2 operates at 1-40 Wh/kg, allowing for faster charging times, which will harness fleeting spikes in renewable energy production, as noted by Heathcote.

However, Gareth Hinds from the UK National Physical Laboratory points out that the technology still lags behind lithium-ion batteries, which can achieve around 300 Wh/kg at the cell level. Andrew Abbott of the University of Leicester adds that the energy density now offered by SuperDielectrics is akin to that of lead-acid batteries commonly used in automobiles and backup power systems. “There are no immediate plans among leading manufacturers to transition,” he states.
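To see what the energy-density gap means in practice, consider the mass needed to hold a Powerwall-class 13.5 kWh of storage at each density; the pack size and the lead-acid figure are illustrative assumptions, while the other two densities are those quoted above:

```python
# Mass required to store a fixed pack capacity at each energy density.
TARGET_KWH = 13.5  # assumed Powerwall-class home battery pack

densities_wh_per_kg = {
    "lithium-ion (cell level)": 300,  # figure quoted in the article
    "Faraday 2 (upper figure)": 40,   # top of the quoted range
    "lead-acid (typical)": 35,        # illustrative assumption
}

for name, density in densities_wh_per_kg.items():
    mass_kg = TARGET_KWH * 1000 / density
    print(f"{name:26s} ~{mass_kg:6.0f} kg")
```

The supercapacitor system would weigh several times more than lithium-ion for the same capacity, which is why its backers pitch it for stationary storage, where mass matters far less than cost and lifespan.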

Marcus Newborough, scientific advisor at SuperDielectrics, acknowledges that they are still “on a journey” to enhance the system’s energy density. “We are aware of our high theoretical energy density,” he mentioned, noting the company’s commitment to realizing this potential in the coming years, aiming for a commercial energy storage solution ready for launch by the end of 2027.

Despite the optimism, Hinds remains skeptical about the technology competing with lithium-ion batteries regarding energy density. “Clearly, it’s an early-stage development, and while they continue to push for higher energy density, achieving lithium-ion levels is a significant challenge due to strict limitations,” he comments.

Nonetheless, he suggests that there could be a market for larger storage solutions that provide lower energy density but at a much more affordable price than lithium-ion batteries and with a longer lifespan.

Sam Cooper from Imperial College London concurs: “If we can develop a system offering the same energy storage capacity as the Tesla Powerwall, regardless of size or weight, at 95% lower cost, that would represent a groundbreaking achievement.”

Source: www.newscientist.com

We May Have Finally Cracked the Mystery of Ultra-High Energy Cosmic Rays

Artistic rendering inspired by actual images of the IceCube neutrino detectors in Antarctica.

icecube/nsf

Our focus lies in understanding the true nature of the rarest and most energetic cosmic rays, which aids in deciphering their elusive origins.

The universe continuously showers us with bursts of particles. Brian Clark, from the University of Maryland, explains that the most energetic of these are termed ultra-high-energy cosmic rays, carrying more energy than particles accelerated in any laboratory. They are also quite rare: researchers are still investigating their sources, and their constituent particles remain largely unidentified. Clark and his team are now analyzing their composition using data from the IceCube Neutrino Observatory in Antarctica.

Previous detections of ultra-high-energy cosmic rays by the Pierre Auger Observatory in Argentina and the Telescope Array in Utah have led to disagreements. Clark says it remains uncertain whether these rays are mainly composed of protons or a mix of other particles. The IceCube data sheds light on this, indicating that protons account for about 70% of these rays, with the remainder composed of heavier nuclei such as iron.

Team member Maximilian Meyer from Chiba University in Japan notes that while IceCube data complements other measurements, it primarily detects neutrinos—by-products resulting from collisions between ultra-high-energy cosmic rays and residual photons from the Big Bang. Detecting and simulating neutrinos is inherently challenging.

The characteristics of cosmic ray particles influence how the magnetic fields generated in space affect their trajectories. Thus, comprehending their structure is crucial for the challenging endeavor of tracing their origins, according to Toshihiro Fujii from Osaka Metropolitan University in Japan.

These mysterious origins have given rise to numerous astonishing enigmas, such as the Amaterasu particle, one of the most energetic cosmic rays ever detected. Intriguingly, it appears to have come from a region of space near the Milky Way that lacks clear astronomical candidates for its source.

Clark expresses optimism about solving many of these mysteries within the next decade, as new observational tools, including an upgrade to IceCube, will soon be operational. “This domain has a clear roadmap for how we can address some of these questions,” he states.


Source: www.newscientist.com

Energy Drinks: Simple Additions to Minimize Tooth Damage


Energy drinks can enhance your mood, but excessive intake may harm your dental health.

Shutterstock/Francesco de Marc

Calcium-fortified energy drinks may mitigate tooth damage, though the impact on flavor remains uncertain.

Research reveals that dental enamel starts to erode when exposed to liquids with a pH below 5.5.

Investigating solutions, Eric Jacom from the University of Rio Grande in Brazil, along with his team, experimented with adding calcium and other minerals to standard Red Bull to assess the impact on pH.

Notable combinations included calcium, phosphorus, and potassium, which raised the pH from 3.96 (for standard Red Bull) to 5.27, while dicalcium malate and calcium citrate both increased acidity.
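Because pH is logarithmic (the negative base-10 log of the hydrogen-ion concentration), the reported shift from 3.96 to 5.27 corresponds to roughly a twenty-fold drop in acidity, though the drink still sits just below the 5.5 erosion threshold. A quick check:

```python
# pH is -log10 of hydrogen-ion concentration, so small pH shifts
# are large changes in acidity.
def h_ion_molar(ph: float) -> float:
    return 10 ** (-ph)

plain, fortified = 3.96, 5.27   # pH values reported in the study
fold_change = h_ion_molar(plain) / h_ion_molar(fortified)
print(f"Acidity reduced ~{fold_change:.0f}-fold")

EROSION_PH = 5.5                # enamel erodes below this pH
print(f"Still below erosion threshold: {fortified < EROSION_PH}")
```

This is why the enamel results are striking: even the best formulation remains nominally erosive by pH alone, so the protection appears to come from the added calcium rather than acidity reduction.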

The researchers exposed enamel samples from donated human teeth to these enhanced energy drinks for two minutes, evaluating changes in texture, hardness, and other indicators of enamel erosion.

All calcium-enriched variants showed less effect on roughness compared to the unmodified Red Bull, despite having a lower pH. Experts believe this might be due to calcium’s regenerative properties, which aid in mineral deposition on enamel to repair natural wear.

Both the formulation containing 2.15 grams of the calcium-phosphorus-potassium mix and the one containing 2.5 grams of dicalcium malate sustained enamel hardness, but the former offers the most protective benefits, demonstrating a dual action of lowering calcium depletion while reducing acidity.

Future investigations should focus on identifying the optimal calcium formulation and the minimum concentration necessary to minimize enamel erosion, the researchers noted in their paper. Additionally, it is crucial to determine if calcium enhancement influences drink preferences and consumption patterns.

David Bartlett from King’s College London advises that it’s premature to adjust energy drink formulations. “We recommend avoiding acidic foods or drinks between meals.” Consuming acidic foods and beverages with meals is seen as less harmful, since the increased saliva helps neutralize some of the acid.

A representative from the UK Soft Drinks Association stated: “It’s important to reiterate that all soft beverages can be safely enjoyed within a balanced diet and healthy lifestyle.”

Red Bull has not responded to requests for comment.


Source: www.newscientist.com

Cloud of Energetic Particles Spanning 20 Million Light Years Found Surrounding Distant Galaxy Cluster

Astronomers have identified the largest known cloud of energetic particles surrounding a galaxy cluster: a halo spanning around 20 million light years around the galaxy cluster PLCK G287.0+32.9.



This new composite image, created using X-rays from NASA’s Chandra X-Ray Observatory (blue and purple), radio data from Meerkat Radio Telescope (orange and yellow), and optical images from Panstarrs (red, green, and blue), illustrates the giant galaxy cluster PLCK G287.0+32.9. Image credit: NASA/CXC/CFA/Rajpurohit et al. / panstarrs / sarao / meerkat / sao / n. wolk.

Located 5 billion light years from Earth in the Hydra constellation, PLCK G287.0+32.9 has garnered astronomers’ attention since its initial detection in 2011.

Prior research uncovered two bright relics, revealing a massive shock wave illuminating the cluster’s edges. However, the extensive, faint radio emissions filling the space between them went unnoticed.

Recent radio images have shown that the entire cluster is enveloped in a faint radio glow that is nearly 20 times the diameter of the Milky Way, suggesting an extraordinary and powerful phenomenon at play.

“We anticipated finding a bright pair of relics at the cluster’s edges, but instead the entire cluster is glowing in radio emission,” noted Dr. Kamlesh Rajpurohit, an astronomer at the Harvard & Smithsonian Center for Astrophysics.

“No energy particle clouds of this magnitude have been spotted in such galaxy clusters or anything comparable.”

The previous record holder, a halo around the galaxy cluster Abell 2255, spanned about 16.3 million light years.

In the central region of the cluster, Dr. Rajpurohit and his team identified a radio halo at a frequency where emission on this scale is typically undetectable — the first detection of a halo this large at 2.4 GHz.

The findings provide compelling evidence that cosmic-ray electrons and magnetic fields extend throughout the cluster, while raising new questions for the team.

However, it remains uncertain how these electrons can accelerate over such vast distances.

“Very extended radio halos are seldom visible across most frequencies, as the electrons responsible for them tend to lose energy. They are aged and have cooled over time,” Dr. Rajpurohit stated.

“The discovery of this colossal halo reveals significant radio emission between the merger shock fronts and the rest of the cluster.”

“This suggests something is actively accelerating or re-accelerating the electrons, yet none of the usual explanations apply.”

“We suspect that extensive shock waves and turbulence may be contributing factors, but additional theoretical models are needed to arrive at a definitive conclusion.”

This discovery offers researchers a new pathway to investigate cosmic magnetic fields—one of the primary unanswered questions in astrophysics—helping to elucidate how magnetic fields shape the universe on the largest scales.

“We’re beginning to perceive space in ways we have never imagined,” Dr. Rajpurohit emphasized.

“This necessitates a reevaluation of how energy and matter traverse through its grandest structures.”

“Observations from NASA’s Chandra X-ray Observatory, managed by the Smithsonian Astrophysical Observatory, reveal boxy structures, comet-like tails, and several other distinct features of the cluster’s hot gas, indicating that the cluster is highly disturbed.”

“Some of these X-ray features correspond with radio-detected structures, pointing to substantial shocks and turbulence driven by merging events, facilitating electron acceleration or re-acceleration.”

“In the core of a cluster, some of these features may arise from the merger of two smaller galaxy clusters, or an explosion triggered by an exceptionally large black hole, or a combination of both.”

Source: www.sci.news

A Massive Untapped Energy Resource Lies Beneath the United States


Below the western United States lies a significant, untapped source of clean energy, according to the US Geological Survey (USGS).

This research is part of a long-term initiative to chart the nation’s geothermal capabilities, particularly focusing on the expansive basin regions that encompass Nevada, Utah, California, Idaho, Oregon, and Wyoming.

USGS projects that these geologically active states hold the potential to generate reliable and consistent geothermal energy of up to 135 gigawatts, provided new technologies can harness this underground resource. To put this in perspective, the typical U.S. household consumes about 1 kilowatt of electricity continuously, meaning that 135 gigawatts can fulfill the stable energy demands of nearly 135 million homes.
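The household conversion above is simple division: 135 gigawatts of continuous output at roughly 1 kilowatt per home. A sketch, where the annual-output line assumes (purely for illustration) the resource running flat out:

```python
# The USGS comparison in round numbers.
POTENTIAL_GW = 135     # estimated continuous geothermal potential
HOUSEHOLD_KW = 1.0     # average continuous US household demand (from the text)

homes_millions = POTENTIAL_GW * 1e6 / HOUSEHOLD_KW / 1e6
annual_twh = POTENTIAL_GW * 8760 / 1000   # GW * hours/year -> TWh/year

print(f"Homes served continuously: ~{homes_millions:.0f} million")
print(f"Annual output if run flat out: ~{annual_twh:.0f} TWh")
```

Run flat out, that potential would exceed a quarter of current US annual electricity generation, which is why the estimate has drawn so much attention.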

“The evaluation of USGS energy resources is geared towards the future,” stated Dr. Sarah Ryker, the acting director of USGS. “We emphasize undiscovered resources that have yet to be fully explored and developed, starting our work in the Great Basin due to its geothermal activity history.”

Currently, geothermal energy comprises less than 1% of the electricity in the U.S., predominantly sourced from conventional hydrothermal systems, where naturally heated water rises through permeable rocks.

Nonetheless, USGS findings suggest a much richer energy reservoir exists. This indicates that heat is trapped in dense, impermeable rock formations buried deep underground.

Geothermal systems generate electricity by circulating and heating liquids – USGS

To access these “enhanced geothermal systems” (EGS), engineers must drill deeper, sometimes reaching depths over 6 km (3.7 miles), fracturing the rock to allow water to circulate and capture heat.

This heated water can then be raised back to the surface to produce electricity, offering a constant, weather-independent energy source.

To estimate the potential energy available, USGS researchers have combined underground temperature maps, heat flow data, and sophisticated techniques for measuring extraction efficiency and energy conversion. They collaborated with the US Department of Energy (DOE), state geological surveys, and academic institutions nationwide.

Dr. Ryker stressed that this research offers a multitude of benefits beyond just energy generation. “Natural resources play a vital role in sustaining the national economy, and historically, we have advanced the technology for mapping and characterizing these resources.”

The large basins of Nevada and surrounding states showcase potential geothermal energy, indicated by colors ranging from green to red – USGS

However, advancing EGS technology presents substantial challenges. Although pilot projects have shown promise within the Great Basin, commercial-scale enhanced geothermal plants are not yet operational in the U.S.

One of the primary hurdles is cost, which the U.S. Department of Energy aims to address through the Enhanced Geothermal Shot™, a program targeting a 90% reduction in technological costs by 2035.

The USGS’s efforts are not limited to the Great Basin. The agency plans to shift its focus to the Williston Basin in North Dakota, another region that may hold geothermal potential.

Should these efforts succeed, geothermal energy could emerge as a crucial component of America’s low-carbon future.


Source: www.sciencefocus.com

Trump Signs Executive Order to Loosen and Expand Nuclear Energy Regulations

On Friday, President Donald Trump signed four executive orders designed to ease regulation of, and expand, nuclear energy production.

The orders focus on overhauling the Department of Energy’s nuclear energy research, facilitating the construction of reactors on federally owned land, reforming the Nuclear Regulatory Commission, and accelerating U.S. uranium mining and enrichment efforts.

Joining Trump at the signing were CEOs from several nuclear energy firms, including Joseph Dominguez of Constellation Energy, Jacob DeWitt of Oklo, and Scott Nolan of General Matter, along with Defense Secretary Pete Hegseth and Secretary of the Interior Doug Burgum.

President Donald Trump displays an executive order he signed on May 23, 2025, in the Oval Office at the White House.
Win McNamee/Getty Images

Before the signing, Burgum remarked that this initiative “reverses over 50 years of excessive regulation on the industry,” adding that “each of these will address another challenge that has hindered progress.”

Trump referred to the nuclear energy sector as “dynamic,” asserting to reporters, “It’s a dynamic industry. It’s a tremendous industry. It needs to be handled correctly.”

A senior administration official briefing reporters prior to the signing indicated that the executive order permitting nuclear reactors on federal land is designed to meet rising electricity demand linked to AI technology, emphasizing that “safe and reliable nuclear energy will provide power to vital defense installations and AI data centers.”

The executive orders also seek to expedite the review and regulatory processes for nuclear reactor construction and operation. The fourth order stipulates that the Nuclear Regulatory Commission must make licensing decisions for new reactors within an 18-month timeframe, according to officials.

This new timeline aims to “reduce regulatory obstacles and shorten licensing periods” for nuclear reactors.

Dominguez commended the president’s initiative to streamline the nuclear regulation framework, noting, “Historically, regulatory delays have plagued our industry.”

“We often spend too long seeking approval and addressing irrelevant questions instead of the crucial ones,” he added.

Nuclear energy is viewed as a means to transition away from fossil fuels and lower greenhouse gas emissions since it generates electricity without the combustion of coal, oil, or natural gas.

Despite the tripling of solar and wind energy production in the U.S. over the last decade, there remain concerns that current energy sources will struggle to meet soaring energy demands.

Just before the president signed the executive orders in the Oval Office, Hegseth told reporters, “We are integrating artificial intelligence across the board. If not, we cannot keep pace. We cannot afford to fall behind. Nuclear energy is essential to powering this.”

Recent reports have projected a 25% increase in U.S. electricity demand by 2030 (compared to 2023), with a staggering 78% rise by 2050, largely due to the surge in AI technology.
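As a sanity check, those cumulative projections translate into fairly modest compound annual growth rates; the following is simple arithmetic on the figures quoted above, not part of the cited reports:

```python
# Convert projected cumulative increases in US electricity demand
# (relative to 2023) into the implied compound annual growth rate (CAGR).

def cagr(total_growth_factor: float, years: int) -> float:
    """Annual growth rate implied by a cumulative factor over `years` years."""
    return total_growth_factor ** (1 / years) - 1

growth_2030 = cagr(1.25, 2030 - 2023)   # +25% by 2030
growth_2050 = cagr(1.78, 2050 - 2023)   # +78% by 2050

print(f"Implied growth, 2023-2030: {growth_2030:.1%} per year")
print(f"Implied growth, 2023-2050: {growth_2050:.1%} per year")
```

The near-term rate (about 3% per year) is the steeper one, consistent with the AI build-out being front-loaded this decade.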

Even with the regulatory framework advancing, building and upgrading nuclear infrastructure may take years. Nuclear energy also carries risks that other green energy alternatives do not: it requires long-term plans for managing and disposing of hazardous waste, and a core meltdown or terrorist attack could release radioactive materials into the environment.

Additionally, Trump signed a fifth executive order on Friday aimed at restoring “gold standard” scientific rigor as the cornerstone of federal research, according to officials.

Michael Kratsios, director of the White House Office of Science and Technology Policy, told reporters that this executive order “ensures continued American strength and global leadership in the fields of science and technology.”

Source: www.nbcnews.com

Republican Budget Proposal Seeks to Halt the IRA Clean Energy Surge

At least 24 factories in the United States manufacture electric vehicles that qualify for the tax credit, according to research by Atlas Public Policy.

Hyundai has invested $7.5 billion in a factory near Savannah, Georgia, to produce some of its most sought-after electric vehicle models. Local officials, who lobbied for years to bring Hyundai to the area, are worried about potential changes to the law.

“For a company, it’s challenging to commit to an area and then face changing conditions,” noted Bert Brantley, CEO of the Savannah Regional Chamber of Commerce. “Our perspective is that stability is beneficial, especially when companies are making significant investments.”

Nevertheless, Brantley expressed hope that Georgia can maintain its position as a frontrunner in electric vehicle production, regardless of any alterations to the tax incentives. “This is a long-term strategy. We hope to be engaged in this for an extended period,” he remarked.

Over the last three years, the federal government has backed a variety of emerging energy technologies that are still in development, including low-carbon hydrogen fuels suitable for trucks, innovative methods of manufacturing steel and cement without emissions, and carbon dioxide removal technologies.

Many of these initiatives could benefit from tax reductions under the Inflation Reduction Act. Additionally, several are funded by billions in grants and loans from the Department of Energy.

In western Minnesota, DG Fuels aims to construct a $5 billion facility to generate aviation fuel from agricultural waste. Meanwhile, in Indiana, cement producer Heidelberg Materials is working on capturing the carbon dioxide it generates and storing it underground. In Louisiana, a company is set to produce low-carbon ammonia for use in fertilizers.

New Orleans, a key center for natural gas exports, has seen a surge in new industries like carbon capture and hydrogen that may help curb future emissions. “We are very diverse,” said Michael Hecht, head of Greater New Orleans, Inc., the economic development organization for southeast Louisiana.

Source: www.nytimes.com

Energy Department to Eliminate Appliance Efficiency Regulations

On Monday, the Energy Department announced it would revoke 47 regulations setting energy and water conservation standards for a range of appliances and gas devices, saying the rules “were raising costs for Americans and diminishing quality of life.”

The initiative follows an executive order in which President Trump directed the Energy Department to remove constraints on water pressure and efficiency regulations that, in his words, make household products more costly and less effective.

However, energy efficiency specialists and climate advocates argue that this move will increase operating costs for household appliances like dehumidifiers and portable air conditioners, as well as industrial machines like air compressors.

“If this assault on consumers succeeds, President Trump will significantly raise expenses for families as manufacturers flood the market with energy- and water-wasting products,” stated Andrew deLaski, executive director of the Appliance Standards Awareness Project, a coalition of environmental and consumer groups, utilities, and government agencies.

DeLaski further asserted that this initiative breaches anti-backsliding provisions established decades ago.

“It’s evidently illegal, so please exercise caution,” he remarked in a statement.

Similar to many nations, the US has been implementing standards for years that regulate the energy and water usage of appliances, including light bulbs, dishwashers, water heaters, and washing machines.

According to government scientists’ reports, the efficiency standards saved the average American household roughly $576 on energy and water bills in 2024, reduced national energy consumption by 6.5%, and cut public water use by 12%. These measures have kept total energy and water use by American households from rising faster than population growth.

Nonetheless, the Trump administration has characterized these standards as an example of government overreach. Trump has frequently criticized weak water pressure from shower heads and toilets that do not flush effectively, denouncing the efficiency standards associated with these devices. Conservative groups, too, argue that efficiency standards compromise appliance performance, especially for dishwashers.

The Energy Department’s list of targeted appliance regulations covers various devices, including air cleaners, battery chargers, compressors, cooking tops, dehumidifiers, external power supplies, microwaves, dishwashers, and faucets.

The department indicated that the rescinded standards would “eliminate over 125,000 words from federal regulations.” However, rolling back the standards necessitates a new rule-making process that may take several months. Additionally, these rollbacks could encounter legal opposition.

The department has not yet responded to requests for comments.

Simultaneously, the Environmental Protection Agency is planning to eliminate the Energy Star program, a universal energy efficiency certification for appliances like dishwashers, refrigerators, and dryers.

Historically, manufacturers have backed government efficiency standards, but they are now attempting to leverage Trump’s inclination to deregulate.

The Association of Home Appliance Manufacturers, representing 150 manufacturers responsible for 95% of household appliances sold in the US, is still assessing Monday’s announcement.

However, Jill A. Notini, a spokeswoman for the association, highlighted in a statement that the standards “have facilitated decades of successful advancements in appliance efficiency.” The association further noted, “With most appliances operating at near peak efficiency, substantial savings in some products are unlikely.”

In addition to rolling back efficiency standards, the Energy Department intends to abolish several clean energy and climate change initiatives. This includes rescinding reporting requirements for voluntary programs that allow businesses to report greenhouse gas emissions and terminating programs that provide compensation for electricity generated from renewable sources.

The department is also discarding what it terms “unscientific” diversity, equity, and inclusion prerequisites for grant recipients, proposing to eliminate regulations that prevent recipients of its subsidies from discriminating on the basis of sex, race, or age.

Certain proposals appear unrelated to the department’s core mission. One suggested repeal concerns requirements related to members of one sex competing on sports teams designated for the opposite sex.

Source: www.nytimes.com

New Lawsuit Claims There’s No Such Thing as an “Energy Emergency”

Fifteen states have taken legal action against the Trump administration regarding the declaration of an “energy emergency,” contending that there is no legitimate emergency and that the directive instructs regulators to unlawfully circumvent reviews of fossil fuel projects, which could harm the environment.

The president’s executive order of January 20th, “Declaring a National Energy Emergency,” directed federal agencies to fast-track energy initiatives such as oil and natural gas drilling and coal mining, while omitting wind and solar energy. Trump argued that despite record-high production levels in the U.S., energy output still does not meet the nation’s demands.

The lawsuit, filed on Friday in federal court in the Western District of Washington, claims that under President Trump’s declaration, reviews mandated by environmental laws like the Clean Water Act, the Endangered Species Act, and the National Historic Preservation Act have been either expedited or skipped.

The lawsuit notes that emergency procedures have traditionally been reserved for major disasters. “Now, however, several federal agencies, pressured by dubious executive orders, are attempting to widely implement these emergency protocols in situations that do not qualify as emergencies,” the complaint asserts.

The plaintiffs are asking the court to declare the order unlawful and to bar the agencies from issuing expedited permits under it. The case was filed by the attorneys general of Washington, California, Arizona, Connecticut, Illinois, Massachusetts, Maine, Maryland, Michigan, Minnesota, New Jersey, Rhode Island, Vermont, and Wisconsin.

“The President’s efforts to circumvent essential environmental safeguards are illegal and will be detrimental to the residents of Washington,” remarked Washington Attorney General Nick Brown. “This will not lower prices, enhance our energy supply, or bolster our national safety.”

Trump spokeswoman Taylor Rogers stated that the president possesses “the exclusive authority to determine a national emergency, not state attorneys general or the courts.” She emphasized that Trump “understands that unleashing American energy is vital for our economic and national security.”

In addition to Trump, the lawsuit names as defendants Secretary of the Army Daniel Driscoll, the head of the Army Corps of Engineers, and the Advisory Council on Historic Preservation.

An Army spokesperson declined to comment. A representative from the Advisory Council on Historic Preservation did not immediately respond to inquiries for comment.

The lawsuit contends that emergency procedures are reserved for genuine emergencies, not shifts in presidential policy, and that the order would harm state interests including clean drinking water, wildlife habitat, and historical and cultural resources.

Source: www.nytimes.com

Underground Hydrogen: Potential Clean Energy Sources Hidden Beneath the Mountain Range

Could there be hydrogen beneath the Grisons region of Switzerland?

Thomas Stoyber/Alamy

Mountain ranges may serve as a significant source of clean energy in the form of unexplored hydrogen. Previous investigations hinted at the presence of “geological” hydrogen underground, but researchers have now pointed to mountains as potential reservoirs.

“Some minerals can react with water to produce hydrogen, serving as a source of sustainable green energy,” explains Frank Zwaan at the Helmholtz Centre for Geosciences (GFZ) in Germany.

While such minerals are plentiful on Earth, most lie at great depth in the mantle. During the formation and uplift of mountain ranges, however, mantle material can be brought nearer to the surface, where it can react with circulating water through a process called serpentinization.
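For reference, the textbook hydrogen-producing reaction here is the serpentinization of iron-bearing olivine; for the iron end-member fayalite it reads:

```latex
% Serpentinization of fayalite (the iron-rich end-member of olivine):
% iron is oxidized to magnetite, reducing water to hydrogen gas.
3\,\mathrm{Fe_2SiO_4} + 2\,\mathrm{H_2O} \longrightarrow
    2\,\mathrm{Fe_3O_4} + 3\,\mathrm{SiO_2} + 2\,\mathrm{H_2}
```

Natural olivines also contain magnesium, which forms serpentine minerals in the full reaction; the iron fraction is what drives the hydrogen yield.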

To understand the potential for hydrogen generation, Zwaan and his team modeled the uplift process and assessed how much mantle material reaches zones with optimal temperatures and enough circulating water for the reaction to occur. Their findings support the notion that large quantities of hydrogen could form beneath these mountains.

Serpentine minerals also occur at mid-ocean ridges, where some speculate they may have played a role in the origin of life. However, Zwaan notes that hydrogen created there is unlikely to remain trapped: at temperatures below 122°C (252°F), microbes can consume it. Beneath mountains, by contrast, hydrogen can be drilled from deeper, hotter zones.

“I wouldn’t want to inhabit that area, but it’s ideal for preserving hydrogen,” Zwaan said at the European Geosciences Union conference in Vienna last week. “There may be an additional opportunity to drill into what is known as a hydrogen kitchen, the zone where hydrogen is generated.”

The model’s outcomes are corroborated by preliminary findings from studies of various mountain ranges. For instance, Gianreto Manatschal at the University of Strasbourg in France has found evidence of hydrogen production beneath the Grisons region of the Swiss Alps. However, he emphasized that there remains much to learn. “Our research is merely the beginning,” he remarked.

Notably, some hydrogen has been reported to be seeping from beneath the Northern Pyrenees, according to Alexandra Robert at the University of Toulouse, France. This research is still in its formative stages.


Source: www.newscientist.com

States and Energy Experts Unite to Oppose Federal Budget Cuts

Scientists, lawmakers, and energy executives have warned that President Trump’s “energy dominance” agenda will be compromised by abrupt cuts at federal agencies reportedly planned by the administration. Pleas urging cabinet secretaries to preserve various parts of their agencies have inundated their inboxes. Federal officials face a deadline today to present plans for significant budget cuts, with energy- and environment-related agencies expected to bear the brunt.

Experts have cautioned that cuts to the Environmental Protection Agency, the Department of the Interior, and the Department of Energy would severely impact efforts to combat climate change. There seems to be little hope, however, that these concerns will be heeded by administration officials who deny or disregard the threat of global warming. Opponents have therefore framed their appeals around the administration’s own priorities, arguing that the cuts jeopardize the expansion of nuclear energy, mineral production, and energy accessibility.

The Department of Energy is expected to face significant losses, particularly in programs like the Office of Clean Energy Demonstrations, which oversees major projects such as plans to establish seven hydrogen hubs nationwide. Another target is the Loan Programs Office, which provides federal financing for clean energy initiatives.

A coalition of energy producers and trade groups representing various sectors like nuclear, data centers, wind and solar energy, and carbon dioxide removal technology expressed concerns that the proposed cuts jeopardize America’s energy and industrial strategies. They highlighted critical projects such as the loan office’s funding for a new nuclear power plant, major lithium mining projects in Nevada, and grid upgrades in Arizona and the Midwest to meet increasing electricity demand from manufacturing.

Additionally, 20 former commissioners and directors of state environmental agencies raised alarm over reports that the EPA intends to eliminate its scientific research arm, the Office of Research and Development.

EPA administrator Lee Zeldin has announced plans to slash the agency’s budget and workforce by approximately 65%. State officials criticized these cuts, stating that they would hinder the agency’s ability to conduct essential research and uphold its regulatory responsibilities.

They emphasized the pivotal role of the EPA’s science office in addressing issues like removing PFAS from drinking water and developing technologies for cleaning toxins from contaminated sites.

Democrats on the House Energy and Commerce Committee expressed concern over the impact of what they described as “mass cuts” at the EPA. They warned that targeting professional civil servants would endanger public health and impede the agency’s mission to protect human health and the environment.

Reports indicate that thousands of government employees have already resigned, including personnel from the National Park Service and the Bureau of Land Management. The officials who disclosed details of the resignations, which the administration has not made public, requested anonymity.

Source: www.nytimes.com

Report: AI Data Centers Expected to Quadruple Energy Demand by 2030

The rapid global adoption of AI is projected to drive data centers’ energy consumption to roughly the equivalent of Japan’s current electricity consumption by the end of the decade, with only about half of that demand expected to be met by renewable sources.

The International Energy Agency (IEA) report projects that electricity consumed by AI data processing in the United States alone will be significant by 2030, and that overall electricity demand from data centers worldwide will more than double by 2030, with AI the key driver of the surge.

A single data center today can consume as much energy as 100,000 households, and some now under construction may require up to 20 times more. Despite these demands, fears that AI adoption will hinder efforts to combat climate change are deemed “exaggerated” in the report, which highlights the potential for AI to improve energy efficiency and reduce greenhouse gas emissions.

The Executive Director of IEA, Fatih Birol, emphasizes that AI presents a significant technological shift in the energy sector and underscores the importance of responsible use. AI has the potential to optimize energy grids for renewable sources and enhance efficiencies in energy systems and industrial processes.

Furthermore, AI can facilitate advancements in various sectors like transportation, urban planning, and resource exploration. Despite the energy challenges posed by AI, strategic government intervention is crucial to ensure a sustainable balance between technological growth and environmental preservation.


However, concerns persist regarding the potential negative impacts of AI, such as increased water consumption in arid regions and potential reliance on non-renewable energy sources. To address these challenges, transparent governance and proactive measures are essential to harness the benefits of AI while mitigating its adverse effects.

Source: www.theguardian.com

China’s renewable energy boom at risk of disruption from extreme weather

The Three Gorges Dam is China’s largest source of hydroelectric power generation

costfoto/nurphoto/shutterstock

China’s vast electricity grids carry more renewable energy than those of any other country, but the system is also vulnerable to shortages caused by adverse weather conditions. The need to ensure a reliable power supply could push the Chinese government to rely on more coal-fired power plants.

China’s energy system is rapidly becoming cleaner, setting new records for wind and solar generation almost every month. The country’s overall greenhouse gas emissions, the highest in the world, are expected to peak soon and begin to decline. Wind, solar and hydroelectric power currently account for about half of China’s generation capacity, a share expected to rise to almost 90% by 2060, the year the country has pledged to reach “carbon neutrality.”

This increasing reliance on renewables means the country’s electricity system is becoming ever more vulnerable to changes in the weather. Intermittent wind and sun can be backed up by more stable hydropower from the huge dams concentrated in southern China. But what happens when a slump in wind and sun coincides with a drought?

Jinjiang Shen at Dalian University of Technology in China and his colleagues modelled how power generation on increasingly renewable grids would fare in such “extreme weather” years. They estimated how future mixes of wind, solar and hydropower would behave under the least favourable weather conditions seen in the past.

They found that future grids are much more sensitive to weather changes than today’s. In a very unfavourable year, the 2060 grid could lose 12% of its generation compared with today’s grid, leading to power shortages. For the 2030 grid, the most extreme cases produced more than 400 hours of blackouts and a power shortfall of nearly 4% of total energy demand. “That’s not a number that anyone can ignore,” says Li Shuo at the Asia Society Policy Institute in Washington, DC.

Beyond the overall shortfall in generation, drought could specifically limit the hydroelectric power available to smooth out irregular wind and solar generation, which could also lead to electricity shortages. “It is essential to equip a suitable proportion of stable power sources that are less susceptible to weather factors” to avoid large-scale power shortages, the researchers wrote in their paper.

One way to help is to move surplus electricity more efficiently between regions. The researchers found that expanding transmission infrastructure could eliminate the risk of power shortages on today’s grid and halve the risk by 2060. Adding tens of millions of kilowatts of new energy storage, whether batteries or other methods, would also buffer against hydropower droughts.

According to Li Shuo, the amount of storage China needs to add to achieve carbon neutrality “becomes an astronomical number.”

Such changes are difficult, but much of that storage is feasible given the enormous number of batteries already produced in China, says Lauri Myllyvirta at the Centre for Research on Energy and Clean Air in Finland. The country is also building 190 gigawatts of pumped hydropower storage, he says, which can provide long-duration energy storage by using surplus electricity to pump water above a dam and releasing it when more electricity is needed.

But so far, electricity shortages have primarily spurred the Chinese government to build more coal-fired power plants. In 2021 and 2022, for example, hydropower droughts and heatwaves raised electricity demand enough to cause serious power outages, prompting a continued expansion of coal. In 2023, emissions hit a record as hydropower generation fell.

Chinese President Xi Jinping has said coal use will peak this year, but the power source retains entrenched political support. “If China is struggling with another round of these episodes, more coal-fired power plants shouldn’t be the answer,” says Li Shuo. “It’s difficult to abolish coal. China loves coal.”


Source: www.newscientist.com

Reconsidering Dark Energy: A Potential Universe-Altering Discovery

Star trails above the Mayall Telescope in Arizona, which houses the Dark Energy Spectroscopic Instrument

Luke Tyas/Berkeley Lab

Dark energy is one of the most mysterious features of our universe. We don’t know what it is, but it controls how the universe is expanding and its ultimate fate. Now, a study of millions of celestial objects suggests we may have been thinking about it all wrong, which could have dramatic consequences for the universe.

“This is the biggest hint we have about the nature of dark energy in the roughly 25 years since we discovered it,” says Adam Riess at Johns Hopkins University in Maryland.

The results come from three years of data collected by the Dark Energy Spectroscopic Instrument (DESI) in Arizona. By combining this data with other measurements, such as the cosmic microwave background and maps of supernovae, the DESI team concluded that dark energy may have changed over time.

“This is the cutting edge of human knowledge,” says DESI team member Will Percival at the University of Waterloo, Canada. “We are seeing amazing things throughout the universe.”

DESI is mounted on a telescope and works by measuring the “redshift” of light emitted from distant galaxies, that is, how much the light’s wavelength stretches as it travels through space. From this, researchers can determine how much the universe has expanded during the light’s journey and calculate how that expansion is changing. So far, the team has analysed light from nearly 15 million galaxies and other bright objects in the sky.
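The redshift bookkeeping involved can be sketched in a few lines; the wavelengths below are illustrative, not DESI measurements:

```python
# Cosmological redshift: how much a light wave stretches en route tells us
# how much the universe expanded during the light's journey.
# The scale factor obeys a_then / a_now = 1 / (1 + z).

def redshift(lambda_observed: float, lambda_emitted: float) -> float:
    """Fractional stretching of the wavelength, z."""
    return (lambda_observed - lambda_emitted) / lambda_emitted

def scale_factor_ratio(z: float) -> float:
    """Size of the universe at emission, relative to today."""
    return 1.0 / (1.0 + z)

# Example: the hydrogen-alpha line (656.3 nm in the lab) observed at 984.5 nm
z = redshift(984.5, 656.3)
print(f"z = {z:.2f}")
print(f"Universe was {scale_factor_ratio(z):.0%} of its current size")
```

Repeating this for millions of objects at different distances is what lets the team reconstruct the expansion history rather than a single snapshot.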

For decades, physicists have agreed that the universe is expanding at an accelerating rate, driven by dark energy in the form of a cosmological constant known as lambda. However, in April 2024, DESI’s measurements provided the first hint that dark energy may actually be weakening over time, meaning the cosmological constant is not so constant.

Riess, who is not part of the DESI team, says that at the time it was not clear whether the finding would survive as more data arrived. In fact, it has only grown stronger. “It’s very exciting for me to see that [the team], after another year and after adding more data, found no issues in the analysis. If anything, the result is more significant,” he says.

That said, the finding still does not meet the “5-sigma” statistical level physicists traditionally require to declare a discovery genuine rather than a statistical fluke. The current analysis reaches at most 4.2 sigma, but team member Mustafa Ishak-Boushaki at the University of Texas at Dallas says the team believes the result will reach 5 sigma within two years as DESI continues to acquire data. “This outcome with dark energy is something we never thought would happen in our lifetime,” he says.

One reassurance, according to Ishak-Boushaki, is that the finding relies not only on DESI’s data but also on several other surveys of the universe.

Source: www.newscientist.com

Dark Energy May Evolve in Unexpected Ways Over Time

New results from the DESI (Dark Energy Spectroscopic Instrument) collaboration reveal signs of time-varying dark energy.

Two “fans” corresponding to the two main areas observed by DESI above and below the plane of the Milky Way galaxy. Image credit: DESI Collaboration/DOE/KPNO/NOIRLab/NSF/AURA/R. Proctor.

“The universe never ceases to surprise us,” said Dr. Arjun Dey, DESI project scientist at NSF NOIRLab and associate director of its Mid-Scale Observatories for strategic initiatives.

“By revealing the evolving texture of the fabric of our universe with unprecedented detail, DESI and the Mayall Telescope are changing our understanding of the future of our universe and of nature itself.”

The DESI data, taken alone, are consistent with the standard model of the universe, Lambda-CDM, in which CDM is cold dark matter and lambda represents the simplest form of dark energy, acting as a cosmological constant.

However, when combined with other measurements, the data suggest the effect of dark energy may be weakening over time, strengthening indications that other models may be more appropriate.

These other measurements include light left over from the dawn of the universe (the cosmic microwave background, or CMB), distance measurements from supernovae, and observations of how light from distant galaxies is distorted by the gravity of dark matter (weak lensing).

So far, the preference for evolving dark energy has not risen to 5 sigma, the gold standard in physics representing the commonly accepted threshold for a discovery.

However, various combinations of DESI data with the CMB, weak lensing, and supernova datasets range from 2.8 to 4.2 sigma.

The analysis used a technique that hides the results from the scientists until the very end, reducing any unconscious bias about the data.

This approach sets a new standard for how data from large-scale spectroscopic surveys are analyzed.

DESI is a cutting-edge instrument mounted on the Nicholas U. Mayall 4-meter Telescope at Kitt Peak National Observatory, a program of NSF NOIRLab.

It can capture light from 5,000 galaxies simultaneously, allowing it to carry out one of the most extensive surveys to date.

The experiment is currently in the fourth of its five years surveying the sky, with plans to measure around 50 million galaxies and quasars (very distant but bright objects with black holes at their cores) and more than 10 million stars by the time the project finishes.

The new analysis uses data from the first three years of observations and includes nearly 15 million of the best-measured galaxies and quasars.

This is a major leap: the dataset is more than twice the size of the one used in DESI’s first analysis, improving the precision of the experiment and strengthening the suggestion of evolving dark energy.

DESI tracks the effects of dark energy by studying how matter is distributed throughout the universe.

Events in the very early universe left subtle patterns in the distribution of matter, a feature known as baryon acoustic oscillations (BAO).

The BAO pattern acts as a standard ruler, and its apparent size is directly affected by how the universe has expanded at different times.

By measuring this ruler at different distances, researchers can chart the strength of dark energy throughout cosmic history.
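The standard-ruler idea can be sketched in a few lines (illustrative numbers only; the sound-horizon value and function name are my own assumptions, not DESI's pipeline): if the ruler's true size is known, the angle it subtends on the sky gives its distance.

```python
import math

# Assumed comoving BAO scale, roughly 500 million light-years (~150 Mpc).
R_D_MPC = 150.0

def ruler_distance_mpc(theta_deg: float) -> float:
    """Distance at which a ruler of size R_D_MPC subtends an angle theta_deg.

    Uses the small-angle approximation d = r / theta.
    """
    theta_rad = math.radians(theta_deg)
    return R_D_MPC / theta_rad

# The same physical ruler subtends a smaller angle the farther away it is,
# so the measured angle in each redshift slice encodes the expansion history.
print(ruler_distance_mpc(2.0))   # nearby slice of galaxies
print(ruler_distance_mpc(0.5))   # more distant slice: larger inferred distance
```

Comparing these inferred distances across cosmic epochs against model predictions is what lets the survey test whether dark energy's strength is constant.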

The DESI collaboration has begun additional analyses to extract more information from the current dataset, and DESI continues to collect data.

Other experiments coming online over the next few years will also provide complementary datasets for future analyses.

“Our results are fertile ground for our theory colleagues as they look at new and existing models, and we look forward to seeing what they come up with,” says Dr. Michael Levi, DESI director.

“Whatever the nature of dark energy is, it will shape the future of our universe. It is remarkable that we can look up at the sky with a telescope and try to answer one of the biggest questions humanity has ever asked.”

“These are striking results from a very successful project,” said Dr. Chris Davis, NSF program director for NSF NOIRLab.

“The powerful combination of the NSF Mayall Telescope and DOE's Dark Energy Spectroscopic Instrument shows the benefit of federal agencies collaborating on fundamental science to improve our understanding of the universe.”

The physicists shared their findings in a series of papers to be posted on arXiv.org.

Source: www.sci.news

Research on Dark Energy Supports the Evolving Theory

The Lambda-CDM (ΛCDM) model has long been the foundation of modern cosmology, successfully explaining the large-scale structure of the universe. It proposes that 95% of the cosmos consists of dark matter (25%) and dark energy (70%). Dark energy, represented by the cosmological constant (Λ), is thought to drive the accelerated expansion of the universe while maintaining a constant energy density over time. However, new results from the Dark Energy Survey suggest a departure from this assumption: dark energy may evolve over time.

This artist's impression shows the evolution of the universe, beginning with the Big Bang on the left, followed by the appearance of the cosmic microwave background. The formation of the first stars ends the cosmic dark ages, followed by the formation of galaxies. Image credit: M. Weiss / Harvard-Smithsonian Center for Astrophysics.

The Dark Energy Survey (DES) was carried out using the 570-megapixel Dark Energy Camera (DECam) mounted on the NSF Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory, a program of NSF NOIRLab.

Gathering data over 758 nights spanning six years, DES scientists mapped nearly one-eighth of the sky.

The project employed multiple observational techniques, including supernova measurements, galaxy clustering analysis, and weak gravitational lensing, to study dark energy.

Two key DES measurements, baryon acoustic oscillations (BAO) and distance measurements of exploding stars (Type Ia supernovae), track the expansion history of the universe.

BAO is a standard cosmic ruler formed by sound waves in the early universe, with peaks spaced roughly 500 million light-years apart.

Astronomers can measure these peaks at several epochs of cosmic history to see how dark energy has stretched that scale over time.

“By analyzing 16 million galaxies, DES found that the measured BAO scale is actually 4% smaller than ΛCDM predicts,” says Dr. Santiago Avila, an astronomer at the Center for Energy, Environmental and Technological Research (CIEMAT).

Type Ia supernovae act as standard candles: their intrinsic brightness is known.

Their apparent brightness, combined with information about the host galaxy, therefore lets scientists calculate distances precisely.
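The standard-candle calculation rests on the distance modulus, m - M = 5 log10(d / 10 pc), relating apparent magnitude m, intrinsic (absolute) magnitude M, and distance d. A minimal sketch (illustrative values; the peak magnitude is a typical textbook figure, not a DES measurement):

```python
def candle_distance_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# A Type Ia supernova peaks near absolute magnitude M ~ -19.3.
# If one is observed at apparent magnitude m = 20.7, then m - M = 40,
# so d = 10**(40/5 + 1) pc = 10**9 pc, about 3.3 billion light-years.
print(candle_distance_pc(20.7, -19.3))
```

Because the intrinsic brightness is (nearly) the same for every such supernova, how faint one appears translates directly into how far away it is.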

In 2024, the DES team released the largest and most detailed supernova dataset to date, providing highly accurate measurements of cosmic distances.

The new BAO findings independently confirm the anomalies seen in the 2024 supernova data.

By combining the DES measurements with cosmic microwave background data, researchers can infer the properties of dark energy, and the results suggest that it evolves over time.

If verified, this would mean that dark energy, long treated as a cosmological constant, is not constant after all but a dynamic phenomenon, one that requires a new theoretical framework.

“The results are interesting because they suggest physics beyond the standard model of cosmology,” says Dr. Juan Mena Fernandez, a researcher at the Institute of Subatomic Physics and Cosmology.

“If more data supports these findings, we may be on the brink of a scientific revolution.”

Although the current results are inconclusive, future analyses incorporating additional DES probes, such as galaxy clustering and weak lensing, could strengthen the evidence.

Similar trends have emerged from other major cosmological projects, such as the Dark Energy Spectroscopic Instrument (DESI).

“We've seen similar trends in our own research,” said Jesse Muir, a researcher at the University of Cincinnati.

“There's still a lot to learn and it's exciting to see how understanding evolves as new measurements become available.”

The team's paper will be published in the journal Physical Review D.

____

T.M.C. Abbott et al. (DES Collaboration). 2025. Dark Energy Survey: Implications for cosmological expansion models from the final DES Baryon Acoustic Oscillation and Supernova data. Physical Review D, in press; arXiv: 2503.06712

Source: www.sci.news

Physicists suggest that ultra-high energy cosmic rays originate from neutron star mergers

Ultra-high-energy cosmic rays are the most energetic particles in the universe, carrying more than a million times the energy that human-built accelerators can achieve.

Professor Farrar proposes that the merger of binary neutron stars is the source of all, or most, ultra-high-energy cosmic rays. This scenario can explain the mysteriously narrow energy range of ultra-high-energy cosmic rays, because the jets of binary neutron star mergers are powered by gravitationally driven dynamos and are therefore roughly identical, owing to the narrow range of binary neutron star masses. Image credit: Osaka Metropolitan University / L-INSIGHT, Kyoto University / Ryunosuke Takeshige.

The existence of ultra-high-energy cosmic rays has been known for nearly 60 years, but astrophysicists have been unable to formulate a satisfactory explanation of their origin that accounts for all the observations to date.

A new theory introduced by Glennys Farrar of New York University provides a viable, testable explanation of how ultra-high-energy cosmic rays are created.

“After 60 years of effort, the origin of the universe's mysterious highest-energy particles may finally have been identified,” Professor Farrar said.

“This insight provides a new tool for understanding the most violent events in the universe, in which two neutron stars merge to form a black hole. That process is responsible for creating many precious or exotic elements, including gold, platinum, uranium, iodine, and xenon.”

Professor Farrar proposes that ultra-high-energy cosmic rays are accelerated in the turbulent magnetized outflow ejected from the remnant of a binary neutron star merger, before the final black hole forms.

This process simultaneously generates powerful gravitational waves, some of which have already been detected by scientists in the LIGO-Virgo collaboration.

“For the first time, this work explains two of the most puzzling features of ultra-high-energy cosmic rays: the tight correlation between energy and charge, and the extraordinary energies of just a handful of the very highest-energy events,” Professor Farrar said.

“This study makes two predictions that could provide experimental validation in future work:

(i) Ultra-high-energy cosmic rays should include rare ‘r-process’ elements such as xenon and tellurium, motivating a search for such components in ultra-high-energy cosmic-ray data.

(ii) Very-high-energy neutrinos produced in ultra-high-energy cosmic-ray collisions should be accompanied by the gravitational waves generated by the neutron star mergers.”

The study appears in the journal Physical Review Letters.

____

Glennys R. Farrar. 2025. Binary Neutron Star Mergers as the Source of the Highest Energy Cosmic Rays. Physical Review Letters 134, 081003; doi: 10.1103/PhysRevLett.134.081003

Source: www.sci.news