How Reducing Air Pollution May Impact Key AMOC Currents

Smog particles reflecting sunlight

Smog Contains Particles That Reflect Sunlight and Cool Earth’s Surface

Credit: Dennis McDonald/Alamy

Addressing air pollution in Europe and North America could inadvertently weaken the Atlantic Meridional Overturning Circulation (AMOC), a crucial ocean current system that shapes Europe’s climate.

Air pollution, including smog and soot, claims approximately 7 million lives annually and contributes to widespread health issues. Yet aerosols — the tiny particles formed from pollutants such as sulfur dioxide — also reflect sunlight and brighten clouds, reducing the heat absorbed at Earth’s surface.

Recent research indicates that reducing air pollution from maritime sources and other sectors could accelerate global temperature increases. “If we cut back on aerosols, we will start to see the extent of warming,” says Michael Diamond from Florida State University.

Historically, scientists’ insights into aerosols’ climatic impact have relied on global simulations akin to those used for examining the greenhouse effect. These models suggest that “higher aerosol levels cool the North Atlantic surface and strengthen the AMOC,” according to Robert Allen from the University of California, Riverside. Conversely, if global aerosol emissions are reduced, the Earth’s surface may warm, weakening the AMOC.

Nonetheless, these simulations often overlook the regional characteristics of air pollution. Unlike greenhouse gases, which linger in the atmosphere for years, most aerosols dissipate within a week, meaning their climatic effects typically manifest close to their source, revealing the complex consequences of pollution reductions.

To gain deeper insights into the impacts of clean air initiatives, Allen and his team employed eight distinct climate models to assess how changes in regional aerosol emissions impact both local and global climates. The models evaluated AMOC strength under high-emission scenarios established by the Intergovernmental Panel on Climate Change and reformulated these scenarios with enhanced air quality regulations.

The findings indicate that if greenhouse gas emissions continue to rise but aerosol pollutants decrease, the AMOC could weaken by a third by mid-century compared to scenarios where aerosol levels remain elevated.

While Allen’s research does not delve into the regional weather implications of AMOC weakening, previous studies suggest that such a decline could lead to adverse outcomes, including increased droughts across Europe, exacerbated sea level rise in northeastern North America, disruption of global monsoons, and falling temperatures in northern Europe.

Allen’s analysis revealed that the most significant impact on AMOC would stem from reduced aerosol levels in Europe and North America. However, he noted that air quality improvement initiatives in East Asia are also proving impactful. Cleaner air in East Asia is affecting global temperatures—despite their short lifespan, aerosols can travel long distances and mask warming effects wherever they reach, potentially leading to further weakening of the AMOC.

“To improve air quality, we must acknowledge that there will be associated climate changes,” Allen states. “To achieve clean air while minimizing our climate impact, we must simultaneously reduce other greenhouse gases like CO2 and methane.”

Diamond echoes this sentiment, stating, “When considering clean air policies, it’s vital to concurrently address decarbonization strategies.”

Topics:

  • Climate Change
  • Air Pollution


Source: www.newscientist.com

Enhancing CAR T Cell Therapy: The Impact of First Eradicating Cancer Cells

Diagram illustrating CAR T cell therapy for melanoma treatment

Illustration of CAR T cell therapy for melanoma, a form of skin cancer

Nemeth Laszlo/Shutterstock

Innovative therapies are transforming the treatment landscape for blood and skin cancers, with recent studies highlighting enhanced effectiveness. In murine models with advanced skin cancer, researchers have discovered that manipulating the physical properties of cancer cells amplifies the efficacy of immunotherapy—specifically, CAR T-cell therapy. This promising breakthrough could significantly improve survival rates for patients undergoing immunotherapy.

“This groundbreaking concept addresses a critical medical issue from a physical perspective,” notes Lee Sui from Queen Mary University of London, who is not associated with this research. “The outlook is very hopeful.”

Cancer cells are often softer than healthy cells, and that softness poses a challenge: T cells, the immune cells responsible for targeting cancer, sense the stiffness of their surroundings.

“We examined whether the softness of cancer cells allows them to evade the immune response and how T cell mechanosensing affects the cellular response to cancer,” explains Lee Tan, who presented the findings on May 11 at the Biophysical Immunoengineering: From Insights to Clinical Applications conference, hosted by the Swiss Federal Institute of Technology in Lausanne, Switzerland.

The researchers set out to uncover why cancer cells exhibit softness by contrasting their membranes with those of healthy cells. They discovered that both murine and human cancer cells tend to be softer due to high cholesterol content in their membranes.

The team subsequently injected 24 mice with cells of melanoma, the deadliest form of skin cancer. Nine days post-injection, the mice received genetically modified T cells designed to target the tumor, emulating CAR T-cell therapy, which is approved for conditions like acute lymphoblastic leukemia and B-cell lymphoma.

Additionally, the mice underwent three injections over five days of IL-15, a protein that heightens the cancer-killing capacity of tumor-specific T cells.

Crucially, only half of the mice received a third treatment: methyl beta-cyclodextrin (meβCD), a compound that reduces cholesterol in cell membranes, injected directly into the tumors daily from day 9 to day 18 after the cancer cells were injected. The other mice received saline as a control.

After roughly one month, all 12 mice that did not receive meβCD had succumbed to rapidly growing tumors. In stark contrast, only seven mice in the meβCD group died, while the other five experienced complete tumor resolution. “The results are compelling. Very encouraging,” says Lance Kam from Columbia University, New York.

Further analysis indicated that meβCD enhanced the adherence of tumor-specific T cells to tumor cells by stiffening them. Consequently, T cells were more effective in delivering toxic agents such as perforin, which perforates and obliterates cancer cells.

The research team aims to extend this approach to a broader array of tumors in mice, according to Tan. “The significant challenge lies in ensuring this understanding translates to human applications,” Kam emphasizes. Few immune-targeting drugs that succeed in mice yield equivalent results in humans, primarily because of disparities between the two immune systems. However, since cancer cells tend to be soft in both species, therapies that modify cancer cell stiffness may translate better.

Moreover, researchers are actively working on developing therapeutics with effects akin to meβCD that can be delivered with a single injection.


Source: www.newscientist.com

Impact of Los Angeles Area Fires: How Pollution is Driving Residents Away from Their Homes

ALTADENA, Calif. — In response to alarming lead levels, an Altadena mother has initiated chelation therapy for her son. Geochemists are now required to don respirators and full-body suits before entering homes affected by contamination. A filmmaker has invested thousands in testing and remediation on his property, which once served as his home, to address heavy metal contamination—efforts not included in any government cleanup initiative.

Sixteen months post-Eaton Fire, the residents of Altadena are resorting to drastic measures to tackle severe contamination from toxic compounds like arsenic and asbestos affecting their homes and health. This pollution stems from an unprecedented urban firestorm which devastated numerous homes, releasing heavy metals into the air.

Despite efforts to clear charred debris and repair homes, tests indicated dangerously high lead levels—enough to threaten children’s health.

The Eaton fire not only destroyed structures but also left behind significant metal contamination.
Evan Bush/NBC News

Jennifer Rochlin, a potter and single mother, shared, “I purchased a lead test from Amazon for $75 and after several tests, I found lead everywhere,” including in her HVAC system. Her insurance provider initially refused to authorize a lead inspection at her home.

Rochlin has relocated twice, incurring costs to replace absorbent household items like mattresses.

Many residents of Altadena, a community in northeastern Los Angeles County, have still not returned home after the Eaton Fire. Nearly two-thirds of residents were affected, and prolonged stays in temporary housing are creating financial burdens for individuals and insurance companies as policies expire.

Uncertainty surrounding the timeline for rebuilding has compelled academics, independent scientists, and community advocacy groups to undertake their own assessments of the contamination. Below is an account of these findings and the conflicts they incite, based on interviews with numerous affected residents, scientists addressing the pollution crisis, debris removal workers, local officials, and insurance representatives.

The shared experiences highlight the inadequacies of existing systems for responding to fire-related disasters, including insurance frameworks, restoration services, local governance, and environmental regulations.

“This was an urban fire, and the contamination we encountered was unlike anything seen in previous events,” stated Dawn Fanning, managing director of the nonprofit Eaton Fire Residents United.

Dawn Fanning, managing director of Eaton Fire Residents United, noted that approximately 70% of residents in smoke-damaged homes have yet to return.
Evan Bush/NBC News

California currently lacks safety standards for indoor contamination from various hazardous substances prevalent in Altadena, aside from lead and asbestos. This absence complicates the decision-making process for homeowners and insurers regarding when it is safe to return to their properties. Moreover, testing companies don’t adhere to consistent methodologies. Areas affected by fire were not subjected to soil testing by FEMA and the U.S. Army Corps of Engineers, leaving residents with significant information gaps regarding potential hazards.

Whistleblowers from the Corps involved in the cleanup have expressed concerns that communities may face lingering soil contamination issues.

Both individuals, who requested anonymity for fear of reprisal, indicated that the cleanup process was rushed and plagued by inconsistencies. One whistleblower noted an alarming amount of debris left behind compared to past wildfire responses.

“This cleanup is subpar. We typically remove everything, going fence to fence, but this time contaminants are still present,” one individual remarked.

A spokesperson for the Corps stated that the cleanup’s scope, including the criteria for debris removal, was established by FEMA in coordination with California state officials and Los Angeles County.

“The assigned mission encompassed the removal of structural ash and debris, along with soil in the top six inches of the structural foundation,” the spokesperson clarified. “Soil testing was not included in the USACE mission directive from FEMA.”

The Hidden Soil Threat

Altadena embodies the intersection of nature and urban life.

Nestled against the San Gabriel Mountains, this area radiates warm terracotta hues at dusk, with the silhouettes of downtown Los Angeles visible in the distance.

The January 2025 Eaton Fire destroyed 9,400 homes and structures, releasing smoke laden with lithium from electric-car batteries, arsenic from older treated wood, and asbestos from insulation. Winds during the fire reached up to 90 mph, propelling the flames.

Alireza Namayande, a National Science Foundation postdoctoral researcher at Stanford, collected smoke samples during the fire within the plume at Pasadena Park. His findings indicated that most particulates were nanoparticles, measuring one-thousandth the width of a human hair—capable of penetrating lungs, bloodstream, and brain.

Source: www.nbcnews.com

Arctic Fires Release Ancient Carbon: The Impact of Climate Change on Long-Stored Carbon Emissions

In 2025, wildfires severely damaged the boreal forest of Manitoba, Canada.

Anadolu (via Getty Images)

The increasing frequency of wildfires across the Arctic is having a more substantial impact on global warming than previously understood. While initial assumptions suggested that primarily recent vegetation was burning, soil core studies reveal that these fires are igniting ancient carbon deposits accumulated for over 5,000 years.

“Soil combustion has the potential to release long-term stored carbon from soil, which was previously considered a carbon sink,” explains Meri Ruppel from the Finnish Meteorological Institute in Helsinki. Current climate models neglect the release of this ancient carbon.

In the cold conditions of the Arctic, plant growth is slow, leading to the accumulation of organic matter in the soil as peat and other forms over centuries or even millennia. This factor positions Arctic and adjacent boreal soils as significant carbon sinks, which effectively remove carbon dioxide from the atmosphere.

However, increasing wildfire incidents are changing this dynamic. Ruppel’s research team has collected soil cores from recently burned areas to study the impact of these fires.

Their findings indicate that while surface vegetation may burn quickly, the underlying organic material smolders for a longer duration, releasing considerable amounts of soot and carbon dioxide into the atmosphere.

Black carbon, a byproduct of these fires, absorbs sunlight, contributing directly to atmospheric warming. Moreover, in colder regions, black carbon can accumulate on ice and snow, accelerating melting processes that would otherwise not occur.

“We discovered that the age of the carbon released during fires varied significantly depending on soil depth and burn intensity,” Ruppel said at the European Geosciences Union conference in Vienna.

The risk of releasing ancient carbon increases toward the North Pole, where organic matter accumulates close to the surface. In Canada’s Northwest Territories, for instance, fires burning just a few centimeters into the soil are unleashing carbon stored for up to 400 years.

In Greenland, fires can consume up to 10 centimeters of soil, releasing carbon that is over 560 years old, with some areas experiencing burns of up to 15 centimeters, releasing carbon that has been stored for 1,000 years.

Remarkably, a boreal forest site in Quebec, Canada, has been identified where fires released carbon dating back 5,000 years. “However, this occurrence is not widespread,” said Ruppel at the conference.

The critical question remains: how much ancient carbon is currently being released by wildfires? Ruppel emphasizes that this study is merely a starting point, and that further research is needed to quantify the released carbon.

“Ruppel’s work is vital and underscores the urgency,” noted Sandy Harrison from the University of Reading. “It’s evident that substantial old carbon exists in high-latitude soils and peat. As new fire regimes evolve, destroying topsoil layers and peatlands, this ancient carbon will be released into the atmosphere.”


Source: www.newscientist.com

Exploring Eurovision: Scientists Analyze 1,763 Songs for Nostalgia and Emotional Impact

Feedback from New Scientist

Welcome to New Scientist, your trusted source for the latest in science and technology news. If you have feedback or items that may interest our readers, please reach out via email at feedback@newscientist.com.

Eurovision 2026: Are You Ready?

The highly anticipated 2026 Eurovision Song Contest is fast approaching, with the grand finale set for Saturday, May 16th. Whether you’re a fan or not, get ready for an entertaining spectacle!

Coinciding with this buzz, a comprehensive study published in Royal Society Open Science delves into the rich history of Eurovision. Researchers analyzed data from every contest between 1956 and 2024, totalling 1,763 songs, and categorized the entries by various musical attributes, including language, themes, lyrics, and genre, using AI tools for the analysis.

The analysis unearthed intriguing insights. Past research has identified 12 major themes prevalent in popular songs, such as desire, heartbreak, and pain, but only 11 of them are reflected in the Eurovision entries: the researchers excluded the theme ‘Jaded’ for being underrepresented.

The data also shows a significant decline in songs expressing nostalgia, while themes of pain, rebellion, despair, confusion, and escapism have become more prominent over the years. The 1970s marked a notable rise in songs depicting disorder and escapism, reflecting the societal crises of that era. The increase in ‘pain’ themes, however, did not begin until the 2000s, around the time of the Great Recession, suggesting a correlation.
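The decade-by-decade theme counting behind trends like these is straightforward to sketch. The songs and tags below are invented stand-ins for illustration, not the study’s actual data:

```python
# Illustrative sketch: tally how often each lyrical theme appears per
# decade, given songs tagged with a year and a list of themes.
from collections import Counter, defaultdict

songs = [
    {"year": 1974, "themes": ["nostalgia", "desire"]},
    {"year": 1979, "themes": ["escapism"]},
    {"year": 2009, "themes": ["pain", "despair"]},
    {"year": 2012, "themes": ["pain"]},
]

by_decade = defaultdict(Counter)
for song in songs:
    decade = song["year"] // 10 * 10   # e.g. 1974 -> 1970
    by_decade[decade].update(song["themes"])

for decade in sorted(by_decade):
    print(decade, dict(by_decade[decade]))
```

With real data, rising or falling counts per decade would reveal exactly the kind of shift the study reports, such as the decline of nostalgia.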

Interestingly, songs have shifted from acoustic to electronic styles, with a growing prevalence of English lyrics over national languages. This trend indicates that Eurovision participants are deliberately aligning their entries with the winning formula established by past champions.

There are notable exceptions, as countries like France, Italy, Portugal, and Spain continue to champion their native languages, suggesting a deeper cultural rationale beyond mere competition.

The researchers conclude by emphasizing the notion of “organizational learning” among Eurovision participants, reflecting an ongoing adaptation to the competition landscape. Feedback sees this as a testament to the enduring allure of the contest.

Moss Appeal: A Niche Attraction

In a previous article, we discussed a park filled with intricate foraminiferal carvings and pondered the existence of niche science-themed attractions. This inspired reader John Wilson to share information about the Serenity Moss Garden in North Carolina.

Spanning about 900 square meters, this moss-covered mountainside offers visitors a unique experience, though John humorously described it as “more like a climate-controlled box” rather than a traditional museum.

Feedback realizes that our quest for niche appeal may have been too limited. Are there any other unique attractions out there, such as a museum dedicated to Plecoptera (stoneflies) or specialized exhibits of beach pebbles?

New Math? A Logical Dilemma

Regardless of our professional backgrounds, math can sometimes overwhelm us. Navigating concepts like converting square kilometers to square meters can be perplexing.

Recently, U.S. Secretary of Health Robert F. Kennedy Jr. faced scrutiny for claiming a 600% decrease in drug prices, an assertion deemed mathematically implausible by rival politicians.

Feedback believes RFK Jr. has been misled. A 100% drop means prices have fallen to zero, which is the mathematical limit; anything beyond that would imply negative prices, with pharmacies paying customers to take the drugs away. The finer points of rate changes, though, are perhaps best left to mathematicians.

In a curious twist, RFK Jr. stated, “If that drug goes from $100 to $600, that’s a 600% price increase.” This form of reasoning feels like a new, perplexing brand of logic—while the premises hold, the conclusion is unmistakably flawed.
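For readers who like to check, the arithmetic is simple enough to script (a throwaway illustration, nothing from the column itself):

```python
def percent_change(old, new):
    """Percentage change from an old price to a new one."""
    return (new - old) / old * 100

# A rise from $100 to $600 is a 500% increase, not 600%:
print(percent_change(100, 600))  # 500.0

# And a decrease bottoms out at -100%: the price falling to zero.
print(percent_change(100, 0))    # -100.0
```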

Contribute Your Story

If you have a story or feedback, share it with us! Email your article to Feedback and include your home address. You can also find this week’s and past feedback on our website.

Source: www.newscientist.com

Unveiling the Complex Legacy of Genomics Pioneer Craig Venter: A Deep Dive into His Impact on Genetics

Craig Venter, 2010

Reuters/Jessica Rinaldi

Renowned biologist Craig Venter, instrumental in decoding the human genome and advancing synthetic biology, has passed away.

According to the J. Craig Venter Institute, Venter died “after a brief hospitalization due to unexpected side effects from treatment for a recently diagnosed cancer.” He was 79 years old.

Venter’s legacy is vast and impactful, marked by significant advancements in genomics and biodiversity. His career also highlighted the commercialization of biological research and the competitive nature of modern science.

Venter’s journey into research was unconventional: in school he was an indifferent student, drawn more to sailing and surfing. His service in the US Navy during the Vietnam War inspired him to turn his life around. On returning home, he pursued higher education, eventually becoming a biomedical researcher at the National Institutes of Health (NIH) in the 1980s.

Venter’s fascination with the human genome led him to utilize automated sequencing machines, significantly accelerating research. He began with sequencing short DNA fragments called expressed sequence tags, igniting controversy when he claimed NIH would patent these sequences, leading to heated debates within the scientific community.

The official Human Genome Project (HGP) launched in 1990, but Venter deemed its methods too slow. In 1998, he founded the for-profit company Celera Genomics to sequence the genome faster, competing against the publicly funded HGP.

Both efforts relied on Sanger sequencing, but while the HGP first mapped the genome and then pieced it together clone by clone, Venter championed whole-genome shotgun sequencing: breaking the genome into random fragments, sequencing them, and reassembling the result by computer. In 1995, he had used this approach to sequence the first complete genome of a free-living organism, the bacterium Haemophilus influenzae, laying the groundwork for the far more complex human genome.
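The principle behind shotgun assembly — stitching random fragments back together by their overlaps — can be shown with a toy greedy assembler. This is a deliberately naive sketch with invented reads; real assemblers such as Celera’s had to cope with sequencing errors, repeats, and paired-end reads:

```python
# Toy shotgun assembly: repeatedly merge the pair of fragments with the
# longest suffix/prefix overlap until one sequence remains.

def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a[-n:] == b[:n]:
            return n
    return 0

def assemble(fragments):
    frags = list(fragments)
    while len(frags) > 1:
        best = (0, 0, 1)  # (overlap length, index i, index j)
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    n = overlap(a, b)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        merged = frags[i] + frags[j][n:]  # join, dropping the shared part
        frags = [f for k, f in enumerate(frags) if k not in (i, j)]
        frags.append(merged)
    return frags[0]

reads = ["GGTACGT", "ACGTTAC", "TTACGGA"]
print(assemble(reads))  # reconstructs "GGTACGTTACGGA"
```

The greedy strategy works here because the invented reads overlap unambiguously; repetitive stretches of a real genome are what made assembly at Celera’s scale a serious computational problem.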

The race ended in a draw: the two teams jointly announced draft sequences in 2000 and published them the following year. The HGP released all of its data publicly, while Venter’s Celera initially withheld some for commercial benefit.

Despite backlash from the genetics community, Venter pressed on with his research. From 2004 to 2006, he sailed his yacht, the Sorcerer II, collecting seawater samples and sequencing vast amounts of DNA, identifying more than 1000 new protein families.

Venter’s ambition extended to creating synthetic life forms, asserting that manipulating organisms could yield significant advantages in fields ranging from medicine to agriculture. In 2010, his team synthesized a novel cell.

Starting with the bacterium Mycoplasma mycoides, they synthesized an artificial copy of its genome by stitching together lab-made DNA strands, then swapped it in for a cell’s original genome; rather than dying, the cell went on to thrive and multiply.

Venter clarified that he did not create life from scratch but engaged in generating a new form of life whose genome was entirely computer-generated, lacking biological ancestry. His team humorously inscribed their names onto the genome, symbolizing the successful transfer of genetic data.

Venter faced skepticism from fellow synthetic biologists who questioned the purpose of his flashy experiments, suggesting that alternative approaches may yield more practical outcomes. However, he persisted in refining his work, stripping away non-essential genes to develop organisms with “minimal genomes,” revealing many unknown essential gene functions and underscoring the complexity of life.

It will take extensive analysis for historians to evaluate Venter’s full impact on science. Nevertheless, his contributions are undeniably profound and transformative.


Source: www.newscientist.com

Coral Reefs in Remote Islands Endure Extreme Heat Wave: Impact and Insights


Houtman Abrolhos Islands: Corals Exhibit Extreme Heat Tolerance

Bill Bachman/Alamy

The coral reefs of the Houtman Abrolhos Islands, located off the coast of Western Australia, have shown remarkable resilience against the severe heatwave that impacted coral ecosystems globally in early 2025. Researchers are eager to unveil the secrets behind the extraordinary heat tolerance of these corals, hoping to aid in the preservation of coral reefs worldwide, which face extinction due to climate change.

Under the guidance of Dr. Kate Quigley, a team from the University of Western Australia ventured to 11 dive sites in the Houtman Abrolhos Islands during July 2025.

In contrast, up to 60% of the corals on Ningaloo Reef succumbed to the same heatwave. This trend reflects a pattern observed in coral reefs globally, as the marine heatwave of 2025 resulted in disastrous coral mortality rates.

However, at Houtman Abrolhos, aside from a few small patches, the corals showed no signs of distress — not even the fluorescent coloring that typically signals stress. “We anticipated a massive bleaching event following the prolonged marine heatwave. Surprisingly, the corals thrived,” says Quigley.

Coral bleaching typically occurs due to prolonged thermal stress, where corals expel the symbiotic algae living within them, which are crucial for their sustenance.

Researchers evaluate the heat stress experienced by corals using the Degree Heating Week (DHW) metric, which combines the intensity and duration of a heatwave into a cumulative figure measured in °C-weeks.

Significant bleaching is generally observed after 4 °C-weeks, with catastrophic conditions arising after 8 °C-weeks. “Around 8 °C-weeks is deemed disastrous and is often linked to widespread bleaching and coral mortality,” explained Quigley.

The waters around the Houtman Abrolhos Islands reached 4 °C-weeks in early February 2025 and 8 °C-weeks by early March. By mid-April, the corals had been subjected to heat stress equivalent to 22 °C-weeks.
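For the curious, the accumulation behind the DHW metric can be sketched in a few lines of code. This is an illustrative simplification — NOAA’s operational Coral Reef Watch product uses twice-weekly satellite data accumulated over a 12-week window — and the temperatures and climatology below are invented for the example:

```python
# Sketch of a Degree Heating Week (DHW) calculation: accumulate sea
# surface temperature anomalies of at least 1 degree C above the site's
# maximum monthly mean (MMM) over a rolling window, in units of C-weeks.

def degree_heating_weeks(weekly_sst, mmm, window=12):
    """Rolling accumulated heat stress (C-weeks) for weekly SST readings."""
    dhw = []
    for i in range(len(weekly_sst)):
        start = max(0, i - window + 1)
        # Only anomalies of >= 1 C above MMM count toward stress.
        stress = sum(t - mmm for t in weekly_sst[start:i + 1] if t - mmm >= 1.0)
        dhw.append(round(stress, 1))
    return dhw

# Example: a site with an MMM of 24 C during a 10-week marine heatwave.
ssts = [24.0, 24.5, 26.0, 26.5, 27.0, 27.0, 26.5, 26.0, 25.5, 24.5]
print(degree_heating_weeks(ssts, mmm=24.0))
```

In this made-up series the running total climbs past the 4 and 8 °C-week bleaching thresholds within a few weeks, which is why sustained heatwaves are so much more dangerous than brief spikes.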

Quigley and her team were particularly astonished to observe that corals of various species at the reef remained unharmed despite the devastating conditions affecting other regions.

To further investigate the heat resistance of Houtman Abrolhos corals, scientists collected several coral colonies and subjected them to elevated temperatures in laboratory settings.

At 8 °C-weeks, survival rates of the Houtman Abrolhos corals were double, and bleaching resistance nearly quadruple, what the established thresholds would predict. Nearly 100% survival was recorded even at approximately 16 °C-weeks.

The maximum tolerance level of these corals remains to be fully determined, but Quigley asserts it is “remarkably substantial and exceeds the thresholds recorded at other coral reef locations studied globally.”

The next phase for researchers is to discern how these corals manage to thrive in such extreme conditions.

Quigley posits that the presence of symbiotic algae could be key to the heat resilience seen in Houtman Abrolhos corals. “There are likely unique environmental conditions in this area that promote heat tolerance evolution among local species,” she stated. For this reason, protecting these reefs should be a top priority, along with identifying other resilient reefs.

Petra Lundgren from the Great Barrier Reef Foundation mentions that such reefs serve as “natural laboratories for understanding heat tolerance.”

“They also promise insights into enhancing selective breeding and interventions aimed at bolstering thermal resilience in coral restoration and conservation aquaculture,” Lundgren noted.

While curbing global carbon emissions is crucial for safeguarding these vital ecosystems, “providing adaptive support, such as seeding reefs with heat-tolerant corals, will significantly improve their chances of surviving future heat stress events,” she concluded.



Source: www.newscientist.com

Exploring the Impact of Climate Change on Wildfires in Georgia and Florida: Hotter, Drier Conditions and Hurricane Aftermath


Wildfires are currently raging across southern Georgia and northern Florida, exacerbated by intense heat, strong winds, severe drought, and dry vegetation left from previous hurricanes. These elements have created a perfect storm for wildfires in the region.

This situation is exactly what climate scientists have been warning about for decades as our planet continues to warm.

“This is certainly abnormal, but aligns with our concerns regarding climate change,” explained Caitlin Trudeau, a climate scientist at Climate Central, a nonprofit scientific research organization. “These events highlight the dramatic changes occurring in our climate.”

The wildfires are consuming thousands of acres across both states. Notably, a wildfire in Atkinson, Georgia, has already destroyed approximately 90 homes since its ignition on Monday.

In response to the fires, multiple counties have implemented burn bans, and Georgia Gov. Brian Kemp declared a state of emergency across 91 counties on Wednesday.

The wildfires are primarily attributed to widespread drought conditions in the Southeast, exacerbated by remnants of previous hurricanes—circumstances tied to climate change.

Specifically, Hurricane Helene, which made landfall in Florida’s Big Bend region as a Category 4 storm in 2024, left behind downed trees, branches, and other dry vegetation.

“It’s as if the hurricane stripped a significant number of trees and laid everything bare in that area,” Trudeau noted. “The remains were exposed to the sun, and wood with high oil content becomes extremely flammable when dry.”

This dry vegetation significantly amplifies wildfire risks, fostering their growth and increasing their destructiveness.

Researchers warn that catastrophic wildfires will become increasingly prevalent in a warming world. Studies indicate wildfires will not only occur more frequently but will also be more devastating due to climate change—a situation with serious environmental, economic, and health repercussions for communities nationwide and globally.

Trudeau emphasized that even in humid areas like the Southeast—traditionally not considered as wildfire-prone—the risks are evolving under climate change.

“This is the reality we’ve been anticipating with climate change,” she said. “Certain parts of the Southeast are extremely dry now. Although these regions have high humidity, climate change has intensified atmospheric thirst. As temperatures rise, the amount of water drawn from the landscape and extracted from plants and soils increases as well.”

For a wildfire to ignite, two key elements must be present: fire-prone weather, which includes dry conditions, lightning, and wind, and “fuel,” such as dead wood, dry leaves, and other flammable vegetation.

As temperatures rise due to climate change, the atmosphere can efficiently extract moisture from trees and soils. In the event of prolonged droughts, insufficient rainfall exacerbates the potential for destructive wildfires.

Currently, all of Florida is experiencing some level of drought, with much of the Panhandle region categorized as facing “extreme” or “exceptional” drought, according to the US Drought Monitor. Likewise, 71% of Georgia is experiencing “extreme” or “exceptional” drought, particularly in southern regions.

For Trudeau, the wildfires witnessed this week serve as a stark indication of climate change’s catastrophic effects on natural ecosystems, including increased fire activity in areas historically deemed humid.

“This is why we are facing such an extraordinary situation right now,” Trudeau concluded. “It’s truly a perfect storm.”


Source: www.nbcnews.com

The Impact of Rain Sounds on Seed Germination: How Nature Influences Plant Growth

New research on rice reveals that the acoustic vibrations from falling droplets have the ability to stimulate dormant seeds, marking the first direct evidence that plants can detect natural sounds.



Rice and related seeds can detect the sound of rain striking soil or water, germinating faster when the sound is intense enough to displace their statoliths, the dense granules that rest against cell-membrane receptors and underpin plants’ gravity-sensing machinery.

Plants are remarkably sensitive organisms. To thrive, they have developed mechanisms to perceive and react to various environmental stimuli.

For instance, certain plants snap shut upon contact, while others retract when exposed to harmful odors.

Moreover, most plants exhibit phototropism, reaching for sunlight to optimize growth.

Plants also respond to gravity, with roots growing downward and shoots rising upward against the gravitational pull.

One important mechanism of gravity perception involves dense granules, called statoliths, inside plant cells.

These statoliths are denser than the cell’s cytoplasm, so they sink through it, like sand settling in water.

When the stones settle at the bottom, they rest against the cell membrane, signaling the direction of gravity and guiding root and shoot growth.

Research has also shown that displacing the statoliths can stimulate seed growth.

“Our findings indicate that seeds can perceive sound as a vital survival mechanism,” stated Professor Nicholas Makris from the Massachusetts Institute of Technology (MIT).

“The energy generated by rain sounds is potent enough to trigger seed growth.”

Professor Makris and fellow MIT researcher Cadine Navarro conducted experiments involving rice seeds, which naturally thrive in shallow rice fields.

During multiple trials, they submerged approximately 8,000 rice seeds in a shallow bath, exposing a subset to dripping water.

By varying the droplet size and height, they simulated light, medium, and heavy rainfall.

The team deployed hydrophones to capture the acoustic vibrations generated by the water droplets underwater.

These laboratory measurements were validated against records taken in natural environments, such as puddles, ponds, wetlands, and storm-influenced soils.

The comparison confirmed that laboratory conditions replicate rain-induced acoustic vibrations seen in nature.

Moreover, they observed that rice seeds subjected to water sounds germinated 30 to 40 percent faster than those without sound exposure but in identical conditions.

Those seeds positioned nearer to the water surface demonstrated heightened sensitivity to droplet sounds and exhibited faster growth than their deeper counterparts.

This research indicates a clear link between acoustic vibrations from rain and enhanced seed growth.

Scientists speculate that seeds capable of sensing rain may gain evolutionary advantages. Seeds that are close enough to the surface to detect raindrop sounds are likely positioned optimally to absorb moisture and safely push through to the surface.

The research team conducted calculations to verify whether the physical vibrations from the droplets could perturb the tiny statoliths within the seeds.

Such findings would provide insights into how sound directly influences plant growth.

The calculations considered factors like droplet size and terminal velocity to evaluate the amplitude of acoustic vibrations generated by falling droplets.

Based on this data, the team assessed how vibrations affect submerged seeds and the impact on their biological dynamics.

The experiments on rice seeds aligned with their theoretical predictions, confirming that the sound of rain can indeed displace the seeds’ statoliths, knocking them against the cell membrane.

This phenomenon may underlie plants’ capacity to “hear” rain sounds and respond with growth.

“Extensive research worldwide continues to delve into the mechanisms facilitating plants’ gravity sensitivity,” noted Professor Makris.

“Our study revealed that these same mechanisms empower seeds to discern their submerged depth in soil or water, enhancing survival through sound detection of rain.”

“It also gives fresh meaning to Japan’s fourth microseason, traditionally titled ‘Falling Rain Awakens the Soil.’”

A study detailing this research is featured in this week’s edition of Scientific Reports.

_____

N.C. Makris and C. Navarro. 2026. Seeds detect the sound of rain to promote germination at the appropriate planting depth. Scientific Reports 16, 11248; doi: 10.1038/s41598-026-44444-1

Source: www.sci.news

How Urban Living Affects Estrogen Levels: Understanding the Impact of City Life

How the Gut Microbiome Influences Hormonal Levels

Nopparit/Getty Images

Recent studies reveal that bacteria in our gut can recycle discarded sex hormones back into the bloodstream. Researchers found that individuals in industrialized societies host significantly more bacteria that perform this recycling than those in hunter-gatherer populations or non-industrialized farmers. This phenomenon may lead to elevated blood levels of certain sex hormones, presenting potential health risks.

“We don’t yet know how the body reacts to this increased input,” explains Rebecca Britten from Jagiellonian University School of Medicine in Poland. “However, the implications could be substantial.”

Sex hormones, including estrogen, travel in the bloodstream. When hormone levels are elevated, a chemical signal in the liver tags the hormone for excretion via the intestines. Gut bacteria feed on the sugar tag attached to the hormone, using an enzyme called β-glucuronidase to remove it.

Once the tag is cleaved, hormones can be reabsorbed by the body and re-enter the bloodstream. Research indicates that a notable portion of excreted sex hormones undergoes this recycling process due to gut bacteria.

The term “oestrobolome,” introduced in 2011, refers to the collection of intestinal bacteria that influence estrogen levels. More recently, the term “testobolome” has been proposed for gut bacteria that alter testosterone levels in the same way.

The latest research, conducted by a British team, analyzed gut microbiome data from various populations, including hunter-gatherers in Botswana, rural farmers in Venezuela, and urban residents in Philadelphia and Colorado. The findings show that the estrogen recycling ability of gut microbes in industrialized populations is up to seven times greater and twice as diverse compared to hunter-gatherers or rural communities.

Interestingly, the study also highlights that formula-fed infants exhibit up to three times more recycling capacity and eleven times more diversity than breastfed infants. However, factors such as age, gender, and BMI did not significantly affect the oestrobolome composition.

Researchers are now investigating if the enhanced recycling capabilities linked to gene sequences translate to actual increases in estrogen levels in the bloodstream. It remains to be seen whether the body compensates for heightened recycling by adjusting hormone levels.

If certain individuals maintain high estrogen levels due to their microbiome, it could significantly impact fertility and overall health, potentially raising the risk for conditions like certain cancers. Conversely, increased recycling might be beneficial for those with low estrogen levels. “We shouldn’t automatically assume that higher estrogen recycling is detrimental,” Britten notes. “In some cases, it can be advantageous.”

Katherine Cook, a professor at Wake Forest University School of Medicine studying the microbiome’s connection to breast cancer risk, emphasizes the growing evidence of gut microbiome’s role in human health. However, she cautions that the current study’s cohort is primarily based in the United States, suggesting that including a European group could strengthen the findings.

Britten expresses her intention to explore the lifestyle factors contributing to these observed differences. “We want to gather more precise data for further research,” she remarks.



Source: www.newscientist.com

Is a Super El Niño Coming? Impact on Weather and Climate Explained

Super El Niño Results in 1998 China Floods

Photo by Robin Beck/AFP via Getty Images

Recent models predict the emergence of an exceptionally strong El Niño climate phase later this year, potentially the strongest recorded.

This event is being referred to as “Super El Niño” or “Godzilla El Niño,” which could lead to severe droughts in certain regions and catastrophic flooding in others, contributing to the hottest year on record globally.

“Projections indicate that the tropical Pacific Ocean will warm at an unprecedented rate this century,” says Adam Scaife from the UK’s Met Office. “Something unusual is clearly happening.”

What is Super El Niño?

El Niño is a recurring climate phenomenon that significantly raises temperatures and disrupts global weather patterns. This occurs when the trade winds over the tropical Pacific Ocean weaken, disrupting upwelling of cold water and causing warm surface water to accumulate in the central and eastern Pacific. As a result, atmospheric circulation is altered.

El Niño is characterized by sea surface temperatures in the central Pacific exceeding 0.5 degrees Celsius above the long-term average. A “super” El Niño occurs when this temperature rise exceeds 2 degrees Celsius.
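The thresholds above amount to a simple classification rule on the sea surface temperature anomaly. A minimal sketch, with the caveat that the -0.5°C La Niña cutoff is the standard mirror of the El Niño threshold and is an assumption not stated in the article:

```python
def classify_enso(anomaly_c: float) -> str:
    """Classify an ENSO phase from a central-Pacific sea surface
    temperature anomaly (degrees Celsius above the long-term average).

    Thresholds: >= 2.0 C is a "super" El Nino, >= 0.5 C an El Nino;
    <= -0.5 C (assumed symmetric cutoff) a La Nina; otherwise neutral.
    """
    if anomaly_c >= 2.0:
        return "super El Nino"
    if anomaly_c >= 0.5:
        return "El Nino"
    if anomaly_c <= -0.5:
        return "La Nina"
    return "neutral"


# The 1997-98 event peaked well above 2 C, so it classifies as "super":
print(classify_enso(2.3))
```

In practice agencies also require the anomaly to persist for several months before declaring an event, a detail this one-shot check omits.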

The name El Niño, meaning “the Christ child,” originates from observations by Peruvian fishermen who noted that warming typically peaks in December.

While El Niño events occur every few years, “super” events have been recorded in 1982-1983, 1997-1998, and 2015-2016.

What are the Chances of a Super El Niño Occurring?

Westerly winds during March and early April have carried warm water toward the central and eastern Pacific, paving the way for a significant El Niño event. The Japan Meteorological Agency anticipates that the temperature anomaly could reach nearly 2 degrees Celsius by September. Additionally, models from the European Centre for Medium-Range Weather Forecasts (ECMWF) indicate a 50% chance of reaching 2.5 degrees Celsius by October.

The National Weather Service estimates a 25% chance of a super El Niño by year’s end. If temperature anomalies in the central Pacific exceed 3 degrees Celsius by September, as some predictions suggest, it would be the strongest El Niño ever recorded.

Currently, signs of El Niño’s development remain weak, and models struggle to provide accurate forecasts, a challenge known as the “spring predictability barrier.” Meteorologists expect to have clearer insights into El Niño’s strength by May or June.

What are the Weather Impacts?

Changes in atmospheric circulation due to El Niño can have far-reaching consequences, including substantial economic damage, crop failures, coral bleaching, and the spread of diseases. “Conditions are chaotic and well outside normal ranges,” states Tim Stockdale from ECMWF. “It’s not solely about increased rainfall; these changes are occurring in areas typically shielded from such storms.”

Typically, El Niño brings intensified storms and wet weather to southern coastlines of the Americas, the Horn of Africa, and China, elevating flooding risks. Conversely, regions like Australia, Southeast Asia, south-central Africa, India, and the Amazon rainforest are likely to face hotter, drier conditions, heightening the potential for droughts, heat waves, and wildfires.

In the UK and northwestern Europe, the effects are less predictable, with El Niño potentially leading to warmer summers and colder winters, although other climatic factors may also contribute to milder, wetter winters.

Even after reaching its peak, El Niño’s damaging effects can persist. Following the Super El Niño of 1997-1998, heavy rains resulted in devastating floods in China’s densely populated Yangtze River basin, claiming over 3,000 lives, destroying 15 million homes, and causing $20 billion in economic losses.

A silver lining is that fewer hurricanes typically form in the Caribbean and off the U.S. east coast during El Niño, as enhanced atmospheric circulation increases wind shear, causing storms to dissipate quickly rather than evolving into major hurricanes.

How Will El Niño Affect Climate?

If climate change is likened to a slowly rising tide, El Niño acts as a powerful wave that temporarily elevates temperatures even further. A strong El Niño could lead to a global temperature increase of 0.2°C.

The last significant El Niño event in 2024 contributed to record-high global temperatures, briefly surpassing the Paris Agreement limit of 1.5°C for the first time. Many anticipate that a Super El Niño in 2027 could also set a new record.

“As we approach 1.4°C, it is very plausible that we will exceed the 1.5°C threshold in 2027,” Scaife noted. “Global warming is inching closer to the Paris Agreement limits.”

Will More Super El Niño Events Occur?

Because climate change is warming the central Pacific overall, El Niño temperatures measured against a fixed long-term average will keep rising even if the frequency and intensity of events do not. To account for this, the National Weather Service has begun classifying El Niño by the central Pacific’s temperature relative to the rest of the tropics, although this new classification has not yet gained widespread acceptance.

Both El Niño and its counterpart, La Niña, have been observed with greater frequency and intensity over the past 50 to 60 years. One study indicated that climate change has intensified the temperature variation in the central Pacific by about 10%. However, with only 150 years of reliable data available, early measurements are often unreliable, leading many scientists to be cautious about asserting that climate change has intensified El Niño.

“Will climate change influence El Niño events? That remains a complex question,” Stockdale stated. “The answer is likely yes.”

It is evident that global warming exacerbates the consequences of El Niño. As global temperatures rise, evaporation increases and the atmosphere holds more moisture, intensifying extreme weather events such as droughts and floods.

“We refer to this as the intensification of the water cycle,” Stockdale explained. “El Niño can cause dramatic shifts in typical precipitation patterns, likely compounded by climate change.”


Source: www.newscientist.com

Exploring the Impact of Birth Order on Autism, Migraines, and More

Impact of Sibling Birth Order on Health

Exploring the Impact of Birth Order on Health Outcomes

iStockPhoto

A recent study involving over 10 million siblings reveals that birth order may significantly influence the risk of developing more than 150 health conditions, ranging from autism and anxiety to hay fever.

Birth order has intrigued researchers for over a century, igniting debates about its correlation with personality traits and IQ. However, many prior studies faced criticism for lacking robustness in data collection and analysis.

A landmark study by Julia Rohrer in 2015 examined data from 20,000 people, finding that birth order had virtually no effect on personality and only a small effect on IQ, with later-born siblings scoring about 1 to 2.5 points lower.

The recent analysis took a comprehensive approach, evaluating the likelihood of various health outcomes. Researchers like Benjamin Kramer at the University of Chicago meticulously compared 1.6 million sibling pairs, accounting for gender, birth year, parental age, and age difference, thereby mitigating potential confounding factors that may arise from parental treatment differences.

Out of 418 medical conditions studied, 150 were associated with birth order, with 79 more prevalent among firstborns and 71 among second-borns.


Notably, firstborns displayed heightened risks for several neurodevelopmental disorders, including autism and Tourette syndrome, along with an increased tendency for anxiety, allergies, and acne. Conversely, second-borns exhibited greater susceptibility to conditions such as drug abuse, shingles, and migraines.

“This study provides a rigorous examination of the topic,” states Lawler, who nonetheless urges caution because the observed associations are modest: firstborns, for instance, have only a 3.6% elevated risk of depression, and individual life trajectories vary widely within any birth position.

The research team explored several potential explanations for these findings. For example, the increased incidence of allergies among firstborns may align with the “hygiene hypothesis”: younger siblings encounter more microorganisms from their older counterparts, fostering immune tolerance. Indeed, wider age gaps were linked to fewer allergy occurrences in firstborns.

A parallel trend was noted for substance abuse, with risk diminishing for second-borns as age differences increased. The authors connected this to enhanced risk-taking tendencies often observed in later-born children. However, Lawler emphasizes that much of this evidence remains contentious and may imply that later-borns often pursue environments that heighten exposure to substance-use opportunities.

Furthermore, the higher prevalence of autism among firstborns might stem from both biological and environmental factors. The mother’s immune response in the first trimester is hypothesized to affect the developing brain. Research also indicates that families with one autistic child may choose not to have additional children, so families who do have a second child after an autism diagnosis in the first may not be representative, introducing possible bias.

Another perspective from Lawler pertains to “diagnostic substitution.” Diagnoses of ADHD and autism often rely on cognitive assessments, where slight IQ variations may lead to different labels. Firstborns, possessing marginally higher IQs, might be diagnosed with autism, while their younger siblings may receive an ADHD diagnosis despite sharing similar symptoms.

As noted by Ray Blanchard from the University of Toronto, results may vary when considering sibling gender and birth order dynamics. His research suggests older brothers might increase the likelihood of later-born boys identifying as homosexual, potentially due to maternal antibodies affecting subsequent pregnancies. “These distinctions are pivotal in understanding birth order effects on sexual orientation,” concludes Blanchard, advocating for further studies that incorporate sibling gender hierarchies.

Source: www.newscientist.com

Impact of Ocean Current Disruptions on Carbon Feedback Loops

Iceberg in turbulent seas at sunset, Antarctica

Potential Carbon Release from Southern Ocean

Nigel Killeen/Getty Images

Human-induced global warming is disrupting the Atlantic Meridional Overturning Circulation (AMOC), a critical ocean current system that includes the Gulf Stream, responsible for warming Europe. A total shutdown of the AMOC could trigger a massive release of carbon from deep Antarctic waters into the atmosphere, exacerbating global warming.

Research indicates that an AMOC collapse can lead to severe climatic consequences, including colder winters in Europe and disrupted monsoons in Africa and Asia, while also increasing global temperatures. Recent computer models predict that this scenario could release 640 billion tonnes of carbon dioxide near the South Pole, raising global temperatures by an additional 0.2°C.

“The collapse of the AMOC may trigger large-scale mixing in the Southern Ocean, releasing carbon stored in deep waters,” states Danian, a researcher at the Potsdam Institute for Climate Impact Research. “This outcome is unprecedented.”

The co-authors emphasize that potential catastrophic events can have even more severe implications than previously understood. As Johan Rockström, also from the Potsdam Institute, notes, “We must remain vigilant, as one failure can trigger a domino effect.”

The AMOC operates by transporting warm, salty water from the Gulf of Mexico to the North Atlantic, where it cools, sinks, and circulates back southward along the ocean floor. Scientists believe that increased fresh meltwater from the Greenland ice sheet is diluting the AMOC, thereby slowing its sinking process.

Recent buoy measurements reveal a weakening return flow, suggesting a 15% decline in the AMOC, with models predicting a potential collapse within decades to centuries.

A new study exploring AMOC collapse under varying climate scenarios shows that if atmospheric CO2 levels exceed 350 ppm, the AMOC fails to recover after shutdown. Given the current CO2 level of 430 ppm, this indicates that AMOC decay may be irreversible.

The study also identified that if the AMOC, a key component of the global ocean current conveyor belt connecting the Southern Ocean and Pacific Ocean, collapses, it could lead to deep water convection near the South Pole. This deep water rests under a layer of fresher surface water, where carbon accumulates from both atmospheric CO2 and decaying plankton. The model suggests a significant portion of this carbon would be released into the atmosphere.

Previous research indicates that past AMOC collapses similarly triggered convection near the South Pole, aligning with evidence that the Southern Ocean is becoming less salty. This reduction in salinity disrupts the layered structure above the saltier deep water, facilitating surface access for deep water.

“It’s striking to observe these changes in such a warm climate amid rising CO2 levels,” says Jonathan Baker from the Met Office. “This study is intriguing, yet its findings depend on whether convection in the Southern Ocean intensifies; different models exhibit varied responses, leading to ongoing uncertainties.”

The study also forecasts that AMOC collapse could cool the Arctic by 7 degrees Celsius, freezing regions in Canada, Scandinavia, and Russia while concurrently warming Antarctica by 6 degrees Celsius. The West Antarctic Ice Sheet remains at risk of surpassing its tipping point, which could trigger a larger collapse of the East Antarctic Ice Sheet, resulting in significant sea level rises.

The repercussions of the CO2 release could persist for more than a millennium after an AMOC collapse. However, Rockström cautions that continued greenhouse gas emissions could lock in such a collapse within just a few decades.

“The window for change could be as short as the next 25 to 50 years,” he warns. “It’s vital to recognize the urgency; it’s not just about the timing of impacts, but about our commitment to preventing an increasingly inhospitable planet for future generations.”


Source: www.newscientist.com

The Unsettling Reality of Medical Cannabis and Its Impact on Mental Health

In 2018, the legalization of medical cannabis in the UK marked a pivotal change, driven by campaigns advocating for children with treatment-resistant epilepsy.

The legal reforms permit specialist medical consultants to prescribe cannabis-based medical products (CBPMs) for a variety of conditions, always prioritizing the patient’s well-being.

Despite this legalization, the possession and use of cannabis (classified as a class B drug) without a valid prescription continues to be illegal in the UK.

Most cannabis products available are unlicensed, lacking endorsement from the Medicines and Healthcare products Regulatory Agency (MHRA), resulting in limited prescriptions through the National Health Service (NHS). This gap has inadvertently triggered a burgeoning private market.

Currently, more than 30 specialist cannabis clinics are registered with the Care Quality Commission, prescribing cannabis products to an estimated 80,000 patients for conditions ranging from chronic pain and anxiety to ADHD.

Data reveals that 42% of patients were prescribed medical cannabis for mental health issues such as anxiety, depression, PTSD, and OCD, aligning with trends observed in Australia and the US.

The UK stands as a major producer of medical cannabis. Photo courtesy of Getty.

However, a recent review published in Lancet Psychiatry assessed over 50 randomized controlled trials (RCTs) and found “no evidence” supporting the efficacy of cannabinoids for treating conditions like anxiety, PTSD, substance use disorders, ADHD, bipolar disorder, psychotic disorders, or anorexia.

While some efficacy was noted for cannabis use disorder, insomnia, Tourette syndrome, and autism spectrum disorder, these findings were categorized as “low quality.”

The Advisory Council on the Misuse of Drugs (ACMD) is conducting a review of medical cannabis prescribing in the UK, focusing on any “unintended consequences” of the recent legal changes.

Professor Owen Bowden-Jones, a former ACMD chair, said the results suggest that the benefits of medical cannabis may have been “overestimated” for numerous conditions and that these products “should not be administered for psychiatric conditions lacking supportive evidence.”

“We must focus on reducing barriers to facilitate superior research that further explores cannabis product effects,” he added.

The review asserts that routine cannabinoid use for mental health conditions is “seldom justified,” raising critical questions, notably, why is cannabis prescribed despite limited evidence of its effectiveness?

Treatment Options

As the saying goes, “absence of evidence is not evidence of absence.” Dr. Niraj Singh, a consultant psychiatrist in the UK, has prescribed medical cannabis for over six years.

“Numerous patients have reported that this treatment effectively addresses a range of conditions, and most use it responsibly. In my experience, it has yielded positive results, enabling patients to lead happy, fulfilling lives,” Singh remarked.

Many patients seeking treatment at cannabis clinics have reportedly exhausted all traditional options or lack access to adequate mental health support. As of January 2026, 1.5 million adults were in contact with NHS mental health services, and 8.7 million people in the UK were prescribed antidepressants in 2023-24.

In a survey by the United Patient Alliance, a patient dealing with anxiety, depression, and PTSD expressed feeling “seen and supported” after receiving effective treatment without harmful side effects associated with previous prescriptions.

“In instances where individuals have plateaued in treatment options, medical cannabis is making a significant difference,” Singh expressed.

Evidence from peer-reviewed observational studies links cannabis to improved symptoms and quality of life for conditions such as PTSD, OCD, and insomnia. However, observational studies were excluded from the aforementioned review because of biases that prevent them from establishing causality.

Despite the need for more robust clinical trials, Professor David Nutt, former chair of ACMD and founder of the independent charity Drug Science, argues that RCTs alone do not offer sufficient data on a drug’s effectiveness.

This sentiment is echoed by Sir Michael Rawlins, former chair of the MHRA and of the National Institute for Health and Care Excellence (NICE). In a speech at the Royal College of Physicians, he emphasized that real-world evidence could yield “better clinical data and statistical power.”

According to Nutt, “Placebo-controlled trials are costly and involve highly selective patient populations, limiting their generalizability.” He also highlighted that cannabis’s numerous active compounds, which vary vastly in dosage and formulation, pose significant challenges when conducting double-blind, placebo-controlled studies. Professor Mike Burns, President of the Association of Medical Cannabis Clinicians, emphasized the need for a more nuanced approach in understanding mental health prescribing.

Clinical Supervision

Medical cannabis can induce side effects, including heightened anxiety and paranoia, making it unsuitable for individuals with a history of psychosis.

According to a survey published in BMJ Mental Health, those using cannabis for self-medication tend to use it more frequently and consume higher levels of tetrahydrocannabinol (THC), resulting in increased paranoia.

“Cannabis is not devoid of side effects,” stated Marta Di Forti, a professor of drug use, genetics, and psychosis at King’s College London, who runs a clinic in London for people with cannabis-related mental health problems.

She recounted cases where patients developed complications after being prescribed products containing high THC levels, leading to hospitalizations for psychotic symptoms. Yet, much of our understanding in this area remains anecdotal.

“There is valid reasoning for prescribing cannabis as medication,” she noted. “However, there must be comprehensive evidence and proper oversight, which is currently lacking.”

The Association of Medical Cannabis Clinicians recommends review by a peer panel for prescriptions exceeding 60 grams per month or containing over 25% THC. Like other controlled substances, CBPMs require diligent clinical oversight, thorough evaluation, and ongoing monitoring, especially in complex cases with significant mental health histories.

While Singh noted that side effects are relatively rare, he expressed concern about the rising availability of high-THC products. “Checks and balances are imperative,” he insisted, “as adjustments to THC concentrations must be carefully monitored.”

Prescribers maintain that a strong clinical oversight process is in place, stating they’ve never felt pressure to prescribe. Eligibility for medical cannabis entails having undergone at least two previous treatments, receiving an evaluation from a psychiatrist, and being reviewed by a multidisciplinary team.

Nonetheless, some critics argue that clinics should enhance support and training for prescribers and have a responsibility to foster research that substantiates their claims. “The industry has not adequately collected and analyzed patient outcomes,” Burns stated. “Clinics have a moral obligation to gather and share data whenever possible.”

In 2018, cannabis became legal for medical use in the UK with a prescription. Use without a prescription remains illegal. Photo credit: Getty.

Evidence Gap

There is a shared consensus on the urgent need to develop a robust evidence base. However, finding common ground proves challenging. Some advocate for cannabis’s efficacy, while others dispute it, with a lack of substantial research to confirm either stance.

Nutt emphasized that the current clinical research system is inadequate for medical cannabis. “In 2018, the Health Ministry pledged to conduct efficacy trials for children with epilepsy, but no progress has been made. This reflects a disinterest from pharmaceutical companies due to the impossibility of patenting plant medicines.”

This challenge cannot be solved solely by a call for further research, he noted, but requires prioritizing real-world data and practical experience to support cannabis in clinical settings.

Meanwhile, patients express fears of being pushed back into the illegal market, where they would have no access to medical oversight or regulated products, a market widely viewed as more dangerous.

Denying access to medical cannabis based on “incomplete evidence” not only misrepresents scientific data but also inflicts harm on patients who rely on it, according to the United Patients Alliance.

“Real-world evidence studies, patient-reported outcomes, and research focusing on treatment-resistant populations are critically needed,” they added. “We do not ask for science to be ignored; we urge it to catch up with patient experiences.”


Source: www.sciencefocus.com

Food Supply Shocks from Iran War: Inevitable Impact and Potential Escalation

Food Prices Expected to Surge Later This Year

dpa picture alliance/alamy

World food prices are nearing unprecedented levels, comparable to the food crisis of the 1970s. The ongoing conflict in the Middle East is exacerbating inflation, with rising costs for fuel, fertilizers, and pesticides. Are we on the brink of the worst food shock in history?

Many farmers are likely to decrease planting due to soaring costs, possibly leading to food shortages and increased prices later this year. How severe the situation becomes will depend on various factors, including the duration of the conflict and the impact of extreme weather events linked to climate change on crop yields.

“This could escalate into a major crisis for the impoverished and food-insecure,” warns Matin Qaim, a researcher at the University of Bonn, Germany.

“We’re facing a perfect storm. The resolution isn’t straightforward,” states Tim Benton of the University of Leeds, UK. “Even a resolution tomorrow may not yield immediate results, as seen with the post-COVID-19 recovery.”

After decades of decline since the 1970s, global food prices have climbed in real terms since the 2000s, nearing their historic peaks. Climate change intensifies this issue with increasing heatwaves, floods, and storms negatively affecting crop yields, resulting in global food shocks like those observed in 2010. The COVID-19 pandemic and Russia’s invasion of Ukraine have also led to significant price spikes.

Rising biofuel production is contributing to increased food prices, with over 5% of food calories now converted into fuel. Some governments have acknowledged the need to reduce food-based biofuels; however, a report suggests that by 2030, 92% of biofuels will still rely on food sources.

Currently, due to US and Israeli actions against Iran, supplies of essential raw materials for food production and distribution are being disrupted. Fuel, particularly diesel, is crucial as it powers agricultural equipment and transports food. Consequently, higher oil prices directly influence supermarket prices.

Fertilizers, crucial for global food supply, are also facing shortages. “If we halted the use of mineral fertilizers globally, it could lead to widespread hunger,” notes Qaim.

Nitrogen fertilizers are produced using hydrogen and atmospheric nitrogen to create ammonia, relying heavily on natural gas for hydrogen and electricity. Qatar, with its abundant natural gas, is a significant fertilizer producer, supplying about 15% of the global urea market. However, due to the conflict, this urea cannot traverse the Strait of Hormuz, thus complicating supply chains.

Countries such as India, Bangladesh, and Pakistan, which produce substantial amounts of their fertilizers from Persian Gulf gas, are facing factory shutdowns due to war-related damages. Additionally, Australia’s main fertilizer facilities are currently non-operational due to an incident.

Consequently, nitrogen fertilizer prices have already surged by over 33% and could escalate further. “If fertilizer costs double, food prices could easily rise by 20 to 30%,” warns Qaim.

Beyond urea, Gulf states like Qatar and the UAE are also major sulfur fertilizer producers, essential for various regions and for converting mined phosphates into usable forms for plants.

Urea Fertilizer Readied for Export at Yantai Port, China

CN-STR/AFP (via Getty Images)

Pesticides, essential for safeguarding global food production, are also becoming more expensive as the price of naphtha rises, a fossil fuel derivative used as a feedstock for pesticides and food packaging.

“In March alone, three of the world’s key naphtha export terminals were targeted in drone attacks,” notes Jide Tijani of Argus Media, UK. These include Russia’s Ustiluga port and facilities in Qatar and the UAE.

These developments will likely drive up prices for food and a range of other commodities in the coming months and years. “The number of affected markets is staggering,” remarks Jason Hill at the University of Minnesota.

Farmers face increasing costs for fuel, fertilizers, and pesticides, all of which affect their planting decisions. Uncertainties regarding profitability may lead farmers to switch crops or abstain from planting altogether. Speculation and profiteering could further compound price rises, according to Jennifer Clapp at the University of Waterloo, Canada.

How severe could the situation become? The dramatic increases in food prices in the 1970s were partly due to dwindling global food reserves, warns Clapp. While reserves are currently sufficient, prolonged conflict could drastically alter this, especially if abnormal weather caused by climate change negatively affects crop yields.

“There is a substantial chance this could escalate into a crisis of equal or greater magnitude,” Clapp asserts. “Significant climate change could worsen the situation further.”

“Food prices are causing distress across the globe, disproportionately affecting lower-income populations who spend a significant portion of their income on food,” notes Qaim.

Additionally, international aid is already diminishing and will likely be further curtailed. “Rising food prices often coincide with increased demand for aid, yet the available funding diminishes as costs escalate,” shares Benton.

This rising tide of food prices may lead to social unrest in the most severely impacted regions, as explained by Paul Behrens at Oxford University. “We’ve observed instability in times of rising food costs throughout history.”

Strategies Nations Can Implement to Mitigate Food Shocks

There are strategies to alleviate the situation. “In Europe, the grain equivalent of around 15 million loaves of bread is turned into biofuel every day,” points out Behrens, who calls it an illogical way to generate energy.

As biofuel production primarily hinges on state incentives, governments can curtail its production to divert more food to markets. “This would make a significant difference,” remarks Qaim.

He advocates for an international consensus that limits biofuel production from food sources when prices surge. Unfortunately, such actions have not materialized in past crises.

Instead, nations are likely to ramp up biofuel production to counteract rising fuel prices, which could significantly affect food pricing, according to Qaim.

Initiatives are already underway; the United States recently announced an increase in the bioethanol proportion in fuels to mitigate price hikes. Australia is also contemplating similar measures.

However, ramping up food-based biofuels won’t substantially impact fuel prices but will dramatically influence food prices. For instance, a third of corn produced in the U.S. is converted into bioethanol, contributing minimally to gasoline supplies but having a disproportionate effect on food availability, asserts Hill.

“Enhancing ethanol in gasoline harkens back to the 1990s—a policy that fails to address air pollution or climate change,” critiques Simon Donner at the University of British Columbia. “Higher oil prices should instead be seen as an opportunity to transition towards cleaner, more advanced technologies like electric vehicles.”

The global community is unlikely to want a repeat of this supply shock. “This situation poses a significant challenge, raising questions on how to build a more resilient system going forward,” Hill emphasizes.

Accelerating the transition to renewable energy and electric vehicles could make economies less vulnerable to oil price shocks. Furthermore, there’s a need to transform the chemical industry to reduce fossil fuel dependence.

In terms of nitrogen fertilizers, this means generating them from electricity rather than natural gas. “It’s feasible to produce ammonia with zero greenhouse gas emissions,” states Ryan. “The technology exists; the challenge is harnessing enough renewable energy.”

Demand for electricity is surging, especially for data centers supporting AI technology. This scenario is unlikely to improve unless there’s a significant decline in AI development.

In the meantime, there are several ways to optimize fertilizer use. Excessive fertilizer application in many regions leads to runoff into water systems or the release of nitrous oxide, a potent greenhouse gas. Techniques to mitigate overuse include precision agriculture, crop rotation with legumes, and the development of crops that utilize fertilizers more effectively.

“We need to promote a more sustainable farming system,” Qaim concludes, highlighting that sustainability does not automatically mean organic practices. A shift to organic farming could dramatically elevate food prices and contribute to deforestation, given the need for additional farmland.

“A fundamental change in our food system is imperative,” asserts Behrens. This includes modifying our dietary habits, favoring protein sources such as beans and legumes over grain-fed meat, which requires significant fertilizer input. “This transition could yield substantial benefits,” he emphasizes.

Topics:

  • Eating and Drinking/
  • Agriculture


Source: www.newscientist.com

NASA Satellite Plummets to Earth: Minimal Risk of Debris Impact

A decommissioned NASA satellite, Van Allen Probe A, launched 14 years ago to study Earth’s radiation belts, is set to crash into Earth on Tuesday.

Weighing in at 1,323 pounds, the spacecraft is predicted to enter the atmosphere around 7:45 p.m. EDT, according to U.S. Space Force forecasts. This will be an uncontrolled re-entry, which means NASA cannot steer the spacecraft; however, they anticipate that most of the satellite will incinerate during its fiery descent through the atmosphere.

As NASA stated, “some components are expected to survive reentry.”

“The risk of harm to anyone on Earth is low, approximately 1 in 4,200,” according to NASA. “NASA and the Space Force will continue to monitor the reentry.”

Deactivated satellites, spent rocket stages, and space debris re-enter Earth’s atmosphere regularly. In fact, such objects make uncontrolled descents nearly every day, as reported by the European Space Agency.

Typically, hardware burns up harmlessly upon re-entry, but some parts may survive. Fortunately, with oceans covering approximately 71% of the Earth’s surface, the chances of space debris landing on populated areas are minimal.

Accurately predicting the time and location of an uncontrolled spacecraft’s re-entry is challenging due to various factors, including atmospheric dynamics, space weather, and the spacecraft’s descent trajectory. The Space Force projects a re-entry window for Van Allen Probe A with a margin of error of plus or minus 24 hours.

Van Allen Probe A was launched on August 30, 2012, alongside its twin, Van Allen Probe B. Both probes were designed to investigate the rings of high-energy radiation particles trapped in Earth’s magnetic field, known as the Van Allen radiation belts.

Three donut-shaped radiation belts around Earth.
NASA Goddard Space Flight Center/Johns Hopkins University Applied Physics Laboratory

The Van Allen belts help shield Earth from solar storms, cosmic radiation, and charged particles from the solar wind. Without them, satellites could be damaged, human health could be jeopardized, and power grids on Earth could face disruptions. However, spacecraft must pass through the Van Allen belts to travel beyond low-Earth orbit, exposing astronauts to potentially harmful radiation.

NASA’s Van Allen Probes A and B were instrumental in advancing our understanding of these radiation belts. The mission led to numerous discoveries about the radiation belts, including the identification of a temporary third radiation belt formed during intense solar activity.

These twin spacecraft continued their mission until 2019 when they exhausted their fuel. NASA subsequently concluded the mission, leaving the probes in orbit.

Initially, NASA projected that the spacecraft would re-enter Earth’s atmosphere in 2034. However, increased solar activity has intensified atmospheric drag on both probes, accelerating their descent: when solar activity rises, the upper atmosphere expands and becomes denser, slowing satellites and hastening their orbital decay.

Van Allen Probe B is anticipated to re-enter Earth’s atmosphere by 2030.

These re-entries shed light on the growing issue of space debris, especially as the frequency of launches rises. Tens of thousands of pieces of space junk, along with millions of tiny orbital debris, clutter low-Earth orbit, the zone where many telecommunications and GPS satellites operate.

Debris fragments can travel at speeds of up to 18,000 miles per hour, posing safety risks to functioning spacecraft and astronauts aboard the International Space Station.

Source: www.nbcnews.com

How Farming Transformed Human Evolution: The Impact of Agriculture on Our Development


The Advent of Agriculture and Evolutionary Pressures on Humans

Christian Jegou/Science Photo Library

The comprehensive analysis of ancient genomes has revealed significant insights into human evolution over the last 10,000 years. This research indicates that various populations worldwide have experienced similar evolutionary changes, particularly following the introduction of agriculture.

“Similar traits and genes are being selected in diverse populations,” says Laura Colbran from the University of Pennsylvania.

Evolution happens when genetic variation becomes prevalent in a population—often because it confers an advantage. By comparing genomes, we can identify recent signs of human evolution.

Colbran notes that ancient DNA is exceptionally valuable for this research, stating, “Using ancient genomes allows us to witness genetic history directly, as opposed to relying solely on inferential methods.”

Much of the recent research has primarily focused on European genomes, but Colbran’s team leveraged an increasing collection of genomes from outside Europe, analyzing over 7,000 ancient and contemporary genomes. Ancient genomes mainly originate from the last 10,000 years, while modern genomes are derived from living populations.

The research team utilized ancient genomes to predict possible modern genetic profiles without evolutionary influence, highlighting differences known as selection signals. They identified 31 selection signals, many of which were shared among varied populations, likely due to the independent rise of agriculture around the same era globally.

For instance, fewer than 25% of ancient individuals carried the advantageous variant of FADS1, a gene encoding an enzyme that helps convert short-chain fatty acids (common in plants) into long-chain fatty acids (predominant in meat). Increased production of this enzyme is thought to benefit people on a plant-heavy diet. Today, over 75% of people in Europe, Japan, and northern China carry advantageous FADS1 variants. The strength of selection for this gene has remained stable over the last 300 generations in Europe while intensifying in East Asia over the last century.

Variants of the ADH1B gene, which encodes the alcohol dehydrogenase 1B enzyme, have also been closely analyzed. Variants prevalent in East Asia are associated with rapid alcohol metabolism, producing symptoms such as facial flushing. “This showcases the strongest selection signal we’ve observed in East Asia,” Colbran stated, suggesting that the variant was favored because it curbs excessive alcohol consumption.

Even though this variant was absent in ancient Europeans, strong selection signals related to the ADH1B enzyme were identified. Colbran emphasized the need for further investigation to discern the involved variants and their specific impacts, indicating a likely adaptation to evolving alcohol consumption patterns.

The research team also explored traits influenced by multiple genetic variations, such as waist-to-hip ratios, often correlated with fertility. Surprisingly, they found a robust selection process stabilizing women’s waist-to-hip ratios within certain limits. “This is intriguing as it suggests a stabilization of selection,” Colbran remarked, emphasizing that while waist-to-hip ratios can differ across various populations, the ideal measurement likely exists in a balanced range.

As noted by Alexander Gusev at Harvard University, this study is remarkable for its analysis of ancient DNA that has yet to be thoroughly examined. Gusev explained, “The authors found enriched variants being selected within one population compared to others, indicating parallel selection across populations, previously hypothesized but not empirically demonstrated.”

Yassine Souilmi, from the University of Adelaide, emphasized that their novel approach reveals regions of the genome newly identified as subject to selection, complementing previously known areas. “Their innovative method optimally utilizes the vast amounts of available ancient DNA,” Souilmi stated.

Colbran concluded that these findings are merely the initial discoveries. As more non-European genomes are sequenced, we will uncover even more evidence of recent human evolution.



Source: www.newscientist.com

Understanding How Changing Atmospheres Impact Cosmology

NASA, ESA, CFHT, CXO, MJ Jee (University of California, Davis), A. Mahdavi (San Francisco State University)

Recently, there has been a significant shift in the realm of cosmology, reminiscent of the changing trends in fashion. Gone are the days of skinny jeans; in come the baggy styles. Likewise, the foundations of our cosmic understanding are being challenged.

For years, physicists relied on the Standard Model of cosmology, a robust framework that adeptly illustrated the universe’s inception and evolution. Central to this model is dark energy, an enigmatic force driving the universe’s expansion.

Last year, groundbreaking results from extensive telescopic surveys suggested an astonishing possibility: dark energy may be weakening over time. Should this prove true, the Standard Model of cosmology may necessitate a profound rewrite.

A collection of three enlightening features seeks to unravel the intricacies of the Standard Model, examining its current precarious status and what might come next.


It does not help if attachment to old models is fueled by fear or nostalgia.

Despite these revelations, many physicists remain hesitant to abandon their trusted models. This skepticism is understandable, as many findings in modern physics may require reevaluation over time. However, clinging to outdated concepts out of fear of the unknown won’t advance our understanding.

In scientific discourse, paradigm shifts signify transformative moments when our comprehension fundamentally shifts. While challenging, history shows that such shifts enhance our ability to perceive reality. Whether the issues surrounding dark energy will spark a paradigm shift akin to the quantum or Copernican revolutions remains uncertain. If it does, we may reflect on this era of cosmology as an exhilarating chapter in our quest for knowledge.

Source: www.newscientist.com

Unveiling the Unexpected Impact of Targeted Cognitive Training on Dementia Risk


Cognitive Training May Protect Against Dementia

Gary Burchell/Getty Images

Cognitive ‘speed training’ can reduce the risk of a dementia diagnosis by 25%, according to a groundbreaking randomized controlled trial. The study is the first of its kind to demonstrate that such an intervention can lower the risk of dementia.

“Skepticism surrounded brain training interventions for years, but this study provides clear evidence of their benefits,” says Marilyn Albert from the Johns Hopkins University School of Medicine.

The brain training sector has faced controversy, especially after companies overstated claims about cognitive decline prevention. In 2014, around 70 scientists signed an open letter stating no conclusive evidence existed that brain training leads to significant real-world changes or enhances brain health, echoing sentiments later supported by another letter signed by over 100 scientists.

Now, a comprehensive 20-year study with 2,832 participants aged 65 and older indicates that specific cognitive exercises may yield tangible benefits.

Participants were divided into three intervention groups and a control group. One group underwent speed training with a computer task called “Double Decision,” where cars and road signs briefly appeared, challenging participants to recall details after they disappeared. This adaptive task increases in complexity as users improve.

The other two groups focused on memory and reasoning training aimed at enhancing cognitive skills.

Each group completed two sessions per week for five weeks, with about half receiving booster sessions and additional training at one-year and three-year intervals.

After twenty years, evaluations of U.S. Medicare claims revealed that participants who completed speed training with booster sessions had a 25% lower risk of an Alzheimer’s diagnosis or related dementias than those in the control group. Other groups without boosters showed negligible changes in risk, which Albert describes as “truly amazing.”

“The study’s rigorous methodology is commendable,” notes Torkel Klingberg from Karolinska Institutet, Stockholm. “The impressive 20-year follow-up and the significant reduction in dementia risk are crucial findings.”

However, Walter Boot from Weill Cornell Medical College cautions that measuring numerous outcomes over two decades can lead to coincidental findings. “While the results may suggest significance, they should be interpreted cautiously,” he adds.

Double Decision: A Cognitive Training Program

BrainHQ

The mechanism behind the effectiveness of speed training is still being explored. One theory suggests it relies on implicit learning, which can entail long-lasting changes without conscious effort, according to Albert.

Etienne de Villers-Sidani from McGill University explains that brief, intense experiences can lead to significant, enduring changes in the brain—much like how a traumatic event can instill lasting fears.

This training may enhance the brain’s cognitive reserve, a potential buffer against cognitive decline. Albert notes that enhanced brain connectivity could improve attention division, facilitating daily activities and fostering physical activity and social engagement—key factors for sustained brain health.

The authors propose that results from the booster sessions suggest a dose-dependent effect of speed training. Bobby Stoyanowski from the Ontario Institute of Technology emphasizes the need for future research into optimal training levels: “What is the right amount of training to maximize benefits?”

In summary, Andrew Budson from Boston University advises against isolating oneself to play speed training games endlessly. Instead, engaging in activities that promote implicit learning—like learning new skills or sports—may provide long-term cognitive benefits while being enjoyable.


Source: www.newscientist.com

Exploring the Environmental Impact of Space Debris: Air Pollution Concerns on the Rise

Falcon 9 Upper Stage Re-entry

An incredible 30-second exposure captures the Falcon 9 upper stage re-entering the atmosphere over Berlin, Germany, on February 19, 2025.

Photo by Gerd Baumgarten

A SpaceX Falcon 9 rocket left a plume of vaporized metal as it burned up in the atmosphere, raising concerns about atmospheric pollution across Europe. This type of contamination is expected to surge as the number of spacecraft and satellites continues to grow.

The Falcon 9’s upper stage, intended for a controlled disposal in the Pacific Ocean, suffered an engine failure that led to its uncontrolled descent over the North Atlantic on February 19, 2025.

Witnesses throughout Europe observed fiery debris streaking across the sky, with some fragments landing behind warehouses in Poland. Researchers from Germany’s Leibniz Institute for Atmospheric Physics employed lidar to monitor the atmosphere and noted a tenfold increase in the concentration of lithium, a significant component of the rocket’s structural alloy, twenty hours after the re-entry event.

Using atmospheric models, researchers concluded that the lithium plume drifted approximately 1,600 kilometers from the re-entry site. This investigation represents the first instance of tracking high-altitude contamination resulting from a specific spacecraft re-entry.

According to Robin Wing, one of the Leibniz Institute researchers, small metal particles could catalyze ozone depletion, create clouds in the stratosphere and mesosphere, and interfere with sunlight’s passage through the atmosphere. “However, this field remains largely underexplored.”

As commercial space launches surge and companies expand their satellite constellations, such as SpaceX’s Starlink and Amazon’s Kuiper, concerns regarding contamination are becoming more pronounced. Currently, around 14,500 satellites orbit Earth, and SpaceX recently applied to deploy an additional 1 million satellites to support Elon Musk’s vision of creating orbiting data centers for artificial intelligence.

To avert a runaway cycle of collisions that would generate ever more space debris, satellites are typically deorbited to burn up at the end of their operational lives. Experts warn that the mass of re-entering debris could increase fiftyfold over the next decade, potentially amounting to more than 40% of the mass that currently enters the atmosphere from meteorites.

There is a common misconception that space debris simply burns up and disappears in the atmosphere. According to Daniel Cziczo of Purdue University, who was not involved in the study, “We need to be cautious and thoroughly analyze the potential impacts of this material.”

The Falcon 9’s plume is estimated to contain around 30 kilograms of lithium. However, given the alloy composition of the rocket’s hull, it likely contained significantly more aluminum.

When evaporated aluminum interacts with atmospheric oxygen, it forms aluminum oxide particles, which serve as surfaces for chlorine compounds to decompose more easily. The chlorine radicals generated through this process react with and deplete ozone molecules in the stratosphere.

Researchers estimate that the burn-up of spacecraft releases approximately 1,000 tons of aluminum oxide into the atmosphere annually, a figure that continues to rise. This could slow or reverse the recovery of the ozone hole over the Southern Hemisphere, which has been shrinking as nations phase out ozone-depleting gases. The loss of ozone allows more harmful ultraviolet rays to penetrate, increasing the risk of skin cancer.

“In terms of metals, we are entering a new paradigm where anthropogenic pollution increasingly influences the upper atmosphere, overshadowing natural sources,” says Eloise Marais from University College London. “Space debris risks reversing the progress made in healing the ozone hole.”

Metal oxide particles also function as nuclei for water vapor to coalesce into droplets, potentially leading to the formation of cirrus clouds that trap heat in the upper troposphere.

Scientists have detected particles from burned-out spacecraft within cirrus clouds. While the effect on global warming is currently considered minor compared to greenhouse gases like carbon dioxide, it could grow as re-entries become more frequent.

“Substantial evidence indicates that this substance may adversely affect the atmosphere. It is now our responsibility as scientists to assess whether these effects are occurring and how harmful they are,” Cziczo stated.

Potential solutions include constructing satellites from wood-like materials—though these may emit black carbon soot upon re-entry—or relocating satellites to high-altitude “graveyard orbits.”

“You must take a moment to consider your intentions before proceeding,” Wing advises. “This rapid growth in satellite launches poses questions that remain unanswered.”


Source: www.newscientist.com

How Endurance Brain Cells Impact Your Running Stamina


Your Limits When Exercising Can Be Mental

Cavan Images/Alamy

Recent research has unveiled specific neurons in mice that enhance endurance following exercise, suggesting that similar cells may exist in humans. These findings could pave the way for targeted drugs and treatments to amplify exercise effects.

Traditionally, the understanding has been that brain changes from physical activity differ from those occurring in muscles. However, Nicholas Betley from the University of Pennsylvania contends that these brain changes regulate all physical responses.

To investigate further, Betley and his team observed neuronal activity in mice before, during, and after treadmill sessions, concentrating on neurons located in the ventromedial hypothalamus. Previous research revealed that developmental issues in this area hinder fitness improvements, a finding likely applicable to humans due to the structural consistency across mammals.

Post-exercise, the researchers noted that a specific group of neurons expressing SF1 (steroidogenic factor 1) exhibited increased activity. These neurons, important in brain development and metabolism, activated more strongly with each subsequent run: by day 8, approximately 53% of them were activated, compared with under 32% on day 1. As Betley emphasizes, “Just as your muscles get stronger through exercise, your brain’s activity adapts as well.”

Utilizing optogenetics, a technique that uses light to manipulate neuron activity, the researchers switched off these neurons in another cohort of mice trained on the treadmill five days a week for three weeks. After each session, the neurons were inhibited for an hour; the mice then underwent endurance tests.

The findings showed that these inhibited mice improved their running distances by around 400 meters, compared to control mice whose neuron activity remained unaffected.

While the exact function of these neurons remains ambiguous, team member Morgan Kindel, also at the University of Pennsylvania, indicates their likely role in fuel utilization. During endurance exercises, carbohydrates are depleted faster, necessitating a shift to fat for fuel. However, when these neurons were inhibited, mice utilized carbohydrates earlier, leading to performance limitations. They also discovered that inhibiting these neurons hindered the release of a muscle protein, PGC-1 alpha, which optimizes fuel use, while also facilitating energy replenishment and muscle recovery.

Although optogenetics isn’t applicable to humans due to its invasive nature, Betley suggests potential alternative interventions could be developed to target these neurons. “If we can identify methods, like supplements, to activate these neurons, we could significantly boost endurance,” he states.

In experiments boosting neuron activity instead of suppressing it, the mice exhibited extraordinary endurance, able to run over twice the distance of control subjects.

Such advancements may particularly benefit individuals struggling with exercise, including the elderly or stroke survivors, as noted by Betley.

Nevertheless, several challenges remain. The applicability of these findings to humans is not confirmed, and there are concerns about potential side effects, as Thomas Barris at the University of Florida highlights. These neurons appear to regulate cellular energy uptake, and overstimulating them might pose risks such as dangerously low blood sugar.

Even if safely activatable in humans, Betley believes it won’t serve as a stand-alone solution for health. “Exercise fosters a wide array of benefits: reducing depression and anxiety, enhancing cognitive function, improving cardiovascular health, and strengthening muscles,” he notes. However, stimulating these neurons alone won’t unlock all the positive outcomes associated with exercise.

Source: www.newscientist.com

How Ultramarathons May Negatively Impact Your Blood Health

There Can Be Too Much of a Good Thing When It Comes to Exercise

Reuters/Lucy Nicholson

Exercise is crucial for a long and healthy life; however, recent studies reveal that ultramarathons can significantly accelerate cellular aging in blood. Athletes completing 170 kilometers in mountainous terrain exhibit more age-related damage in their red blood cells compared to those who run shorter distances.

Long-distance running has been associated with health issues, including temporary immune system suppression and anemia. However, the impact on red blood cells, particularly in mountainous environments, is only beginning to be understood.

Angelo D’Alessandro and his team at the University of Colorado examined blood samples from 11 adults, with an average age of about 36, taken within hours before and after a 40-kilometer trail race. They ran the same analyses on another group of 12 individuals of comparable age who completed a 170-kilometer ultramarathon over the same terrain.

The researchers discovered that participation in either race can lead to increased accumulation of damage in runners’ red blood cells due to reactive oxygen species. These highly reactive molecules are generated when red blood cells need to transport more oxygen throughout the body.

This cellular damage, which occurs naturally with aging, was markedly heightened in ultramarathon runners. D’Alessandro notes, “Anecdotally, the blood after an ultramarathon resembles that of someone who has just suffered a serious injury. Red blood cells accumulate damage, hastening cellular aging.”

Ultramarathon participation appeared to change red blood cells’ shape from disc-like to more spherical at an accelerated rate, a common occurrence as we age. The disc shape is crucial for navigating small blood vessels in the spleen, where aging red blood cells are typically destroyed. “This spherical morphology leads to entrapment in the spleen, resulting in immune cell clearance,” says Travis Nemkov, also from the University of Colorado Anschutz.

This damage likely stems from the inflammatory response triggered by intense exercise, which increases the circulation of red blood cells.

Additionally, ultramarathon runners had approximately 10% fewer red blood cells after the race. This modest reduction doesn’t amount to anemia or any other health problem, and the body probably recovers from it swiftly, according to Nemkov.

Current research efforts focus on analyzing ultramarathon runners’ red blood cells the day following a race to further comprehend the duration of these effects. Future studies aim to explore if these changes impact runners’ performance. Nemkov emphasizes, “This could reveal insights into the signals of damage that might enhance the body’s resilience to endurance running or indicate potential detriments.”

The Growing Global Threat of Pesticides: Understanding Their Harmful Impact

Farmers Spraying Pesticides on Cotton Fields

Tao Weimin/VCG via Getty Images

Over 60 years have passed since Rachel Carson’s influential book, Silent Spring, highlighted the dangers of pesticides. The negative impact on wildlife has escalated, potentially more than ever before.

“Across nearly every nation, there is a trend of increased pesticide toxicity,” explains Ralf Schulz from RPTU University Kaiserslautern-Landau in Germany.

The risks associated with pesticides depend on both the volume used and their toxicity levels, which can vary significantly among species. To quantify the overall pesticide burden, Schulz and his team formulated a metric called “applied toxicity.”

The team investigated the use of 625 pesticides across 201 countries from 2013 to 2019, incorporating both organic and conventional pesticide data.

They averaged toxicity data from regulatory bodies in various nations, assessing the toxicity levels to eight major organism groups: aquatic plants, aquatic invertebrates, fish, terrestrial arthropods, pollinators, soil organisms, terrestrial vertebrates, and terrestrial plants. This enabled them to calculate the total toxicity per country or organism group.

Globally, applied toxicity rose from 2013 to 2019 in six out of eight organism groups. Notably, pollinators saw a 13% increase, fish a 27% rise, and terrestrial arthropods—including insects, crustaceans, and spiders—experienced a 43% increase.

“This increase does not automatically translate to direct toxic effects on these organisms,” Schulz clarifies. “However, it serves as an important indicator of the toxicity levels of the pesticides currently in use.”

Numerous studies indicate that pesticide concentrations in various ecosystems, such as rivers, often exceed regulators’ assessments during approval processes.

“While this particular index does not account for it, significant evidence exists,” Schulz remarks, emphasizing that risk evaluations tend to underestimate real-world exposures.

The rise in the combined applied toxicity stems from two key factors: the increased use of pesticides and the replacement of older varieties with more toxic alternatives, spurred primarily by the emergence of pest resistance. Schulz notes, “In my view, resistance will only exacerbate with more chemical pesticide use.”

Pesticides like pyrethroids pose notable risks to fish and aquatic invertebrates, even when applied in minimal amounts. Neonicotinoids also significantly threaten pollinators.

Calls to eliminate glyphosate, sold under the brand name Roundup, are growing. Although glyphosate’s toxicity is relatively low, its sheer volume of use contributes substantially to cumulative toxicity, according to Schulz. A ban could backfire if more toxic herbicides are adopted in its place.

Reducing pesticide usage could lead to unintended consequences; declining farm productivity may necessitate more land clearance, resulting in biodiversity loss.

During the 2022 UN Biodiversity Summit, nations pledged to halve the “overall risk from pesticides”. That phrase has yet to be precisely defined, but Schulz believes the aggregate of applied toxicities could serve as the metric.

While this method has its limitations, he insists that no perfect measure of overall pesticide use exists. Roel Vermeulen of Utrecht University in the Netherlands adds, “Despite the uncertainties, the alarming trends it reveals are undeniable.” He warns, “The world is drifting away from UN objectives, which spells bad news for ecosystems and ultimately for human health.”

“Crucially, this study illustrates that a small number of highly toxic pesticides are responsible for the majority of overall risk, highlighting clear and actionable targets for significant benefits,” Vermeulen asserts.

Transforming agricultural practices will require broader societal shifts. “Consumers must adopt dietary modifications, minimize food waste, and pay fair prices that truly reflect the environmental costs of production,” he concludes.


Exploring the Impact of Illness on Our Lives: A Captivating Yet Imperfect Read

Healthcare professionals stand near a COVID-19 triage tent in Lisbon, Portugal, April 2020.

Patricia de Melo Moreira/AFP via Getty Images

The Great Shadow
by Susan Wise Bauer, St. Martin’s Press

A book on the history of disease could hardly be more timely given the ongoing challenges to public health. As yet another tough winter begins in the Northern Hemisphere, memories of the harshest winters of the COVID-19 pandemic remain fresh, and our vulnerability to illness has rarely been more evident.

Introducing The Great Shadow: A History of How Disease Shapes Our Actions, Thoughts, Beliefs, and Purchases, authored by Susan Wise Bauer. This work chronicles how disease influences individual lives and collective societal behaviors over centuries. From our guilty pleasures to our shopping habits, microbial influences are always at play.

However, the subject isn’t entirely new; similar works have emerged since the pandemic, such as Jonathan Kennedy’s comprehensive treatment of the topic, along with updated editions of Sean Martin’s A Short History of Disease and Frederick F. Cartwright and Michael Biddiss’s Disease and History. So, what sets this work apart?

The distinction lies in Bauer’s emphasis. She explores the shift from the “Hippocratic universe” to our contemporary understanding steeped in “germ theory.” The former relies on antiquated beliefs regarding bodily humors and inner balance, while the latter focuses on scientific evidence.

A key revelation of this book is the prolonged timeline for this transition; the acknowledgment that microbes cause disease took centuries to establish and only gained traction in the late Victorian era—resulting in millions of unnecessary deaths.

Yet, have we completely moved past Hippocratic medicine? The Great Shadow doubles as a discussion piece. Each chapter unveils a timeline, navigating through urbanization, the Black Death, and the trenches of World War I, before linking historical events to current beliefs surrounding disease.

At its best, this research provokes contemplation. Are we surprised that 19th-century anti-vaccination advocates resemble today’s skeptics? At its worst, some discussions may prove perplexing. For instance, Bauer confesses that following COVID-19, she avoided checkups out of fear of being criticized for her weight gain—a worrying reflection on the pressures surrounding health discussions.

Nevertheless, glimmers of insight are present in The Great Shadow. Despite moments of verbose writing, Bauer efficiently crafts narratives from historical archives. Her account of early germ theory proponents like Alexander Gordon and Ignaz Semmelweis, often marginalized for their views, merits cinematic adaptation.

The final, memorable section of this book discusses our shift from superstition to science and the current crisis, termed the Third Epidemiological Transition. Bauer notes that we face not only the mounting failure of antibiotics but also the rapid emergence of novel diseases, spread by modern global travel, for which vaccines and treatments are not yet available.

Peter Hoskin is the Books and Culture Editor at Prospect magazine.


Why Only Some People Get Seriously Ill from Epstein-Barr Virus: Understanding the Infection’s Impact

Epstein-Barr Virus

Epstein-Barr Virus: A Common Infection with Serious Implications

Science History Images/Alamy

Approximately 10% of individuals carry genetic mutations that heighten their susceptibility to the Epstein-Barr virus (EBV), a common pathogen linked to diseases like multiple sclerosis and lupus. Insights from a study involving over 700,000 participants may clarify why EBV results in severe illness for some, yet remains relatively harmless for the majority.

“Nearly everyone has encountered EBV,” explains Chris Wincup from King’s College London, who was not involved in the research. “How is it that, despite widespread exposure, only a fraction of the population develops autoimmune conditions?” This research offers plausible answers.

The Epstein-Barr virus was initially identified in 1964 when scientists detected its particles in Burkitt’s lymphoma, a type of cancer. Today, over 90% of the population has been infected with EBV, evidenced by the presence of antibodies against the virus.

Initially, EBV is responsible for infectious mononucleosis, often referred to as mono or glandular fever, which typically resolves in a few weeks. However, it is also linked to chronic autoimmune disorders: a 2022 study demonstrated its role in triggering multiple sclerosis, a disease in which nerves are damaged.

“Why do individuals exhibit such varied responses to the same viral infection?” questions Caleb Lareau at Memorial Sloan Kettering Cancer Center.

To investigate, Lareau and his research team analyzed health data from more than 735,000 participants in the UK Biobank and a U.S. cohort called All of Us, whose genomes had been sequenced from blood samples. “When EBV infects certain cells, it leaves behind copies in the blood,” says Lareau, meaning the sequencing data also captures copies of the EBV genome.

The research highlights substantial variability in EBV DNA levels among subjects. Of the participants, 47,452 (9.7%) exhibited over 1.2 complete EBV genomes per 10,000 cells, indicating that while many cleared the virus post-infection, this subset did not.

To comprehend the heightened vulnerability of these individuals, the research team sought specific genomic differences that correlated with high EBV levels. As noted by Ryan Dhindsa from Baylor College of Medicine, they identified 22 genomic regions linked to elevated EBV levels, many of which are previously associated with immune-mediated diseases.

The strongest correlation was found in genes related to the major histocompatibility complex, essential immune proteins in distinguishing between self and foreign cells. “Certain individuals possess mutations in their major histocompatibility complex,” Dhindsa explains. Further studies indicated that these variants may impede the immune system’s capacity to detect EBV infections.

“This virus profoundly impacts our immune system, having lasting effects on certain individuals,” comments Ruth Dobson at Queen Mary University of London. Persistent EBV DNA can subtly stimulate the immune system, potentially leading to autoimmune attacks on the body.

Moreover, the genetic variants linked to high EBV levels were associated with various traits and symptoms, notably an elevated risk for autoimmune diseases such as rheumatoid arthritis and lupus, reinforcing the hypothesis of the virus’s involvement in these conditions.

The research team also identified a connection between these mutations and chronic fatigue, intriguing given that some studies have posited EBV as a contributing factor to myalgic encephalomyelitis, commonly known as chronic fatigue syndrome (ME/CFS). Due to the large sample size, “we can assert that this signal exists,” Dhindsa remarked, although the precise relationship remains unclear.

For Wincup, the primary takeaway is the identification of immune system components damaged by continuous EBV presence. Targeting these components could lead to more effective treatments for EBV-related conditions.

Additionally, vaccination against EBV is a potential avenue. Currently, only experimental vaccines exist. Wincup emphasizes that developing a vaccine would be a significant advancement, arguing that despite its common perception as benign, EBV causes considerable suffering for many. “How benign is it really?”


How Termination Shocks Could Intensify the Economic Impact of Climate Change

Solar geoengineering: A solution to save ice sheets with potential risks

Credit: Martin Zwick/REDA/Universal Images Group (via Getty Images)

Research indicates that an abrupt halt to solar geoengineering could trigger a “termination shock”: a temperature rebound so rapid that the intervention would end up costlier than never having deployed it at all.

With greenhouse gas emissions on the rise, there’s increasing attention on solar radiation management (SRM), which cools the planet by dispersing sulfur dioxide aerosols into the stratosphere to reflect sunlight.

However, solar geoengineering would have to be sustained for centuries; otherwise, the masked warming would quickly reemerge. This rebound, referred to as termination shock, would leave little time for adaptation and could trigger critical climate events such as ice sheet collapse.

Francisco Estrada and his colleagues at the National Autonomous University of Mexico compared the risks of inaction on climate change with those of solar geoengineering approaches.

Projections suggest that if emissions aren’t curtailed, temperatures may soar to an average of 4.5°C above pre-industrial levels by 2100, leading to approximately $868 billion in economic damages. In contrast, a hypothetical stratospheric aerosol injection program initiated in 2020 could limit warming to around 2.8°C, potentially reducing these costs by half.

Nevertheless, if the aerosol program ends abruptly in 2030, resulting in a temperature rebound of 0.6°C over eight years, economic damages could surpass $1 trillion by century’s end. While estimations vary, Estrada states, “The principle remains consistent: the termination shock will be significantly worse than inaction.”

Estrada’s research innovatively gauges damage not only by global warming levels but also by the speed at which temperatures rise, according to Gernot Wagner from Columbia University.

Wagner warns that solar geoengineering may be riskier than it appears. “This highlights a critical concern,” he notes.

Make Sunsets, a Silicon Valley startup, has already launched more than 200 sulfur dioxide-filled balloons into the stratosphere and sells “cooling credits” based on them. A launch in Mexico prompted the government there to threaten a ban on geoengineering activities.

The Israeli startup Stardust has secured $75 million in funding and is lobbying the U.S. government to explore solar geoengineering options. A recent survey revealed that two-thirds of scientists anticipate large-scale SRM could occur this century, as reported by New Scientist.

According to studies, cooling Earth by 1°C through aerosol injection would require at least 100 aircraft releasing millions of tons of sulfur dioxide annually, a fleet that would have to keep flying uninterrupted by geopolitical conflicts or unforeseen events.

Presently, major nations like the United States are undermining global climate cooperation, but researchers highlight that such collaboration is essential to prevent termination shock and potentially realize the benefits of SRM.

Analysis of varying parameters suggests that aerosol injections could mitigate climate damage only if the annual probability of cessation is extremely low. In scenarios allowing for a gradual stop over 15 years, SRM might be viable.

If countries successfully reduce emissions, only minimal geoengineering cooling would be needed, making aerosol injection worthwhile even with an annual cessation probability as high as 10%. Compounded over a century, that implies a near-certain (above 99.9%) chance of the program stopping at some point, but in a low-emissions scenario the resulting temperature rebound would remain small enough to manage.
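The compounding at work here is simple arithmetic: with an independent chance p of the program stopping in any given year, the probability of at least one stoppage over n years is 1 − (1 − p)^n. A minimal sketch, using the 10% annual figure from the scenario above and an illustrative century horizon:

```python
# Chance that an aerosol program stops at least once over a horizon,
# assuming an independent annual probability of cessation.
def cumulative_stoppage(annual_p: float, years: int) -> float:
    return 1 - (1 - annual_p) ** years

# A 10% annual chance compounds to near-certainty over a century.
print(round(cumulative_stoppage(0.10, 100), 5))   # 0.99997
# A 0.1% annual chance stays below 10% over the same period.
print(round(cumulative_stoppage(0.001, 100), 5))  # 0.09521
```

This is why the governance question dominates: only very small annual failure rates keep the century-scale risk of a termination shock low.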

This need for international cooperation reveals what Estrada describes as the “governance paradox” of solar geoengineering: “We must ensure extremely low failure rates and possess effective governance to mitigate adverse outcomes.” However, he adds, “If we effectively reduce greenhouse gases, the need for SRM diminishes.”

These findings challenge the notion that solar geoengineering might lead to irresponsible development, as some have suggested, according to Chad Baum from Aarhus University. Funding for this new research was provided by the Degrees Initiative, aimed at supporting geoengineering studies in vulnerable low-income nations.

Baum stated, “We intend to complete all stages of this study, incorporating feedback from impacted communities.”

Despite this, Wagner emphasizes the imperative for further exploration into geoengineering’s trade-offs given the rise in emissions and their consequences: “We are approaching a critical juncture.”


Impact of Abnormal Oral Microbiome on Obesity: Key Characteristics and Insights

Bacteria in the oral cavity

Oral Bacteria (Blue) on Human Cheek Cells (Yellow) in Scanning Electron Micrograph

Steve Gschmeisner/Science Photo Library

Recent research has revealed that individuals with obesity exhibit unique oral microbiome characteristics. This finding could pave the way for early detection and prevention strategies for obesity.

The diverse community of microorganisms in our gut significantly influences weight gain and is commonly linked to obesity and various metabolic conditions. The mouth, notably, hosts up to 700 bacterial species of its own, with implications for obesity and overall health.

“Given that the oral microbiome is the second largest microbial ecosystem in the human body, we aimed to investigate its association with systemic diseases,” says Ashish Jha from New York University Abu Dhabi.

Jha and his team analyzed saliva samples from 628 adults in the United Arab Emirates, 97 of whom were classified as obese. They compared these samples with a control group of 95 individuals of healthy weight, similar in age, gender, lifestyle, oral health, and tooth brushing habits.

The analysis showed that the oral microbiome of obese individuals carries a higher abundance of inflammation-associated bacteria, such as Streptococcus parasanguinis and Actinomyces oris, as well as Oribacterium sinus, which produces lactic acid, a marker linked to poor metabolic health.

Jha and his colleagues identified 94 distinct differences in metabolic pathways between the two groups. Obese participants demonstrated enhanced mechanisms for carbohydrate metabolism and the breakdown of histidine, while their capability to produce B vitamins and heme—crucial for oxygen transport—was reduced.

Metabolites notably generated in obese individuals include lactate, histidine derivatives, choline, uridine, and uracil, which are associated with metabolic dysfunction indicators such as elevated triglycerides, liver enzymes, and blood glucose levels.

“When we analyze these findings collectively, a metabolic pattern surfaces. Our data indicates that the oral environment in obesity is characterized by low pH, high carbohydrate levels, and pro-inflammatory conditions,” notes Lindsey Edwards from King’s College London. “This study offers compelling evidence that the oral microbiome may reflect and contribute to the metabolic changes associated with obesity.”

Currently, these findings suggest a correlation rather than causation. “While some associations are surprising, we cannot determine cause and effect as of now, which remains our next focus,” Jha states.

To explore whether the oral microbiome contributes to obesity or is modified by it, Jha and his team plan further experiments analyzing both saliva and gut microbiomes to investigate potential microbial and metabolic transfers.

Jha believes this is plausible: the mouth’s extensive network of blood vessels, which supports nutrient absorption and taste sensing, could give metabolites direct access to the bloodstream and thereby influence other bodily systems.

Establishing a causal connection will also necessitate randomized controlled trials and detailed metabolic pathway analyses, according to Edwards.

As dietary patterns evolve, specific food components may become more readily metabolized by certain bacteria, leading to increased microbial activity that can influence cravings and potentially lead to obesity, Jha explains. For instance, uridine has been shown to promote higher calorie intake.

If oral bacteria are demonstrated to influence obesity, Edwards suggests it could lead to innovative interventions, such as introducing beneficial oral microbes through gels, using prebiotics to foster specific bacterial growth, or employing targeted antimicrobials. “Behavioral strategies, like reducing sugar intake, can also significantly contribute to obesity prevention,” she adds.

Even if the oral microbiome acts as a consequence rather than a cause of obesity, its assessment can still provide valuable insights. Saliva tests can easily detect distinct microbial changes, which Jha believes could be useful for early obesity detection and prevention strategies.


Exploring ‘Dark Oxygen’: Scientists Research Its Impact in Deep Sea Mining Zones

Experiment on Oxygen Production by Deep-Sea Nodule

Experiment on Oxygen Production with Deep-Sea Nodule

Nippon Foundation

Scientists are set to deploy instruments to the ocean floor to explore the intriguing process of metal nodules producing oxygen in the Pacific Ocean. This unexpected phenomenon has ignited significant debate regarding the ethics of deep-sea mining.

In a surprising revelation from 2024, researchers found that potato-sized metallic nodules in the depths of the Pacific and Indian Oceans, including the Clarion-Clipperton Zone, act as an unexpected source of oxygen. This discovery challenges the conventional belief that large-scale oxygen production derives solely from sunlight and photosynthesis.

Dubbed “dark oxygen,” this phenomenon sustains life within the abyss, including microorganisms, sea cucumbers, and predatory sea anemones thriving thousands of meters beneath the surface. This finding casts doubt on proposals from deep-sea mining companies aiming to extract cobalt, nickel, and manganese by removing nodules from the ocean floor. A controversial deep-sea mining company was involved in this discovery, prompting a call for further scientific investigation.

Now, the team responsible for discovering dark oxygen is returning to the Clarion-Clipperton Zone, the prime location for potential deep-sea mining, to verify its existence and comprehend the mechanisms behind its production.

“Where does the oxygen come from for these diverse animal communities to thrive?” asks Andrew Sweetman of the Scottish Association for Marine Science. “This could be an essential process, and we’re focused on uncovering it.”

The researchers propose that a metallic layer in the nodule generates an electrical current which splits seawater into hydrogen and oxygen. They’ve recorded up to 0.95 volts of electricity on the surface of the nodules—just below the standard 1.23 volts necessary for electrolysis. However, the team suggests that individual nodules or clusters could produce higher voltages.
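For context, the 1.23-volt benchmark mentioned here is the standard cell potential for splitting water at 25°C, which follows from the textbook half-reactions (this is general electrochemistry, not a figure from the study):

```latex
\begin{aligned}
\text{Anode:}\quad & 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4e^- \\
\text{Cathode:}\quad & 4\,\mathrm{H^+} + 4e^- \rightarrow 2\,\mathrm{H_2} \\
\text{Overall:}\quad & 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2},
\qquad E^{\circ}_{\mathrm{cell}} = 1.23\ \mathrm{V}
\end{aligned}
```

In practice, overpotentials push the required voltage even higher, which is why the possibility that clusters of nodules exceed the 0.95 volts measured on single surfaces matters so much to the hypothesis.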

Plans are underway to deploy a lander, essentially a metal frame housing various instruments, to a depth of 10,000 meters to measure oxygen flow and pH changes, as the electrolysis process releases protons, increasing water acidity.

Research Lander Deployed Into the Ocean

Scottish Association for Marine Science

Given the potential role of microorganisms in this process, the lander will also collect sediment cores and nodules for laboratory analysis. Each nodule is home to approximately 100 million microorganisms, which researchers aim to identify through DNA sequencing and fluorescence microscopy.

“The immense diversity of microorganisms is constantly evolving; we are continually discovering new species,” remarked Jeff Marlow from Boston University. “Are they active? Are they influencing their environment in crucial ways?”

Furthermore, since electrolysis is generally not observed under the intense pressures found on the ocean floor, the team intends to utilize a high-pressure reactor to replicate deep-sea conditions and conduct electrolysis experiments there.

“The pressure of 400 atmospheres is comparable to that at which the Titan submersible tragically imploded,” noted Franz Geiger from Northwestern University. “We seek to understand the efficiency of water splitting under such high pressure.”

The ultimate aim is to observe these electrochemical reactions in the presence of the resident microorganisms under an electron microscope, without harming them.

The United Nations’ International Seabed Authority has yet to decide on the legality of deep-sea mining in international waters, with U.S. President Donald Trump advocating for its implementation. The Canadian company, The Metals Company, has applied for authorization from the U.S. government to commence deep-sea mining operations.

A recent paper authored by Metals Company scientists contends that the nodules Sweetman and his colleagues studied in 2024 could not have generated enough energy to drive seawater electrolysis, suggesting the observed oxygen was likely carried down from the surface by the landers themselves.

Sweetman countered that the lander would shed any air bubbles on its descent, and that no comparable oxygen signal appeared when the same equipment was deployed in other regions, such as the 4,000-meter-deep Arctic ocean floor. Of 65 experiments conducted at the Clarion-Clipperton Zone, he noted, 10% showed oxygen consumption while the remainder indicated oxygen production.

Sweetman and his colleagues have also found that the oxidation phase of the electrolysis process can occur at lower voltages than those recorded on the nodules’ surfaces. A rebuttal presenting this data has been submitted to Nature Geoscience and is currently under review.

“From a commercial perspective, there are definitely interests attempting to suppress research in this field,” stated Sweetman in response to the Metals Company’s opposition to his findings.

“It is imperative to address all comments, regardless of their origin,” added Marlow. “That is our current predicament in this process.”


Love Machine Review: Exploring the Impact of Chatbots on Human Relationships

A woman with hearts in her eyes, representing the rise of AI relationships.

Imagine forming a deep bond with a chatbot that suddenly starts suggesting products.

Maria Kornieva/Getty Images

Love Machine
by James Muldoon, Faber & Faber

Artificial intelligence is becoming an inescapable reality, seamlessly integrating into our lives. Forget searching for chatbots; new icons will soon appear in your favorite applications, easily accessible with a single click, from WhatsApp to Google Drive, and even in basic programs like Microsoft Notepad.

The tech industry is making substantial investments in AI, pushing users to leverage these advancements. While many embrace AI for writing, management, and planning, some take it a step further, cultivating intimate relationships with their AI companions.

In James Muldoon’s Love Machine: How Artificial Intelligence Will Change Our Relationships, we delve into the intricate connections humans form with chatbots, whether they’re designed for romantic encounters or simply companionship. These AI systems also serve as friends or therapists, a broad range of interactions that has been widely reported.

In one interview, a 46-year-old woman in a passionless marriage shares her experience of using AI to explore her intricate sexual fantasies set in an 18th-century French villa. This opens up broader conversations about utilizing AI in more practical life scenarios, such as during a doctor’s visit.

Another participant, Madison, recounts uploading her late best friend’s text messages to a “deathbot” service, which generates a way for her to maintain communication.

Muldoon’s anecdotes often carry an element of voyeuristic intrigue. They reveal the diverse ways individuals navigate their lives, some paths being healthier than others. What works for one person might prove detrimental for another.

However, a critical question remains. Are we naïve to think that AI services won’t evolve like social media, cluttered with advertisements for profit? Envision a long-term relationship with a chatbot that frequently pushes products your way. What happens if the company collapses? Can you secure backups of your artificial companions, or migrate them elsewhere? Do you hold rights to the generated data and networks? Moreover, there are psychological risks associated with forming attachments to these indifferent “yes-men,” which may further alienate individuals lacking real social connections.

Nonetheless, there are positive applications for this technology. In Ukraine, for instance, AI is being harnessed to help individuals suffering from PTSD, far exceeding the current availability of human therapists. The potential to revolutionize customer service, basic legal operations, and administrative tasks is immense. Yet, Muldoon’s narrative suggests that AI often functions as an unhealthy emotional crutch. One man, heartbroken over his girlfriend’s betrayal, envisions creating an AI partner and starting a family with her.

This book appears less about examining the social impacts of innovative technology and more like a warning signal regarding pervasive loneliness and the critical lack of mental health resources. A flourishing economy, robust healthcare system, and more supportive society could reduce our reliance on emotional bonds with software.

Humans are naturally inclined to anthropomorphize inanimate objects, even naming cars and guitars. Our brain’s tendency to perceive faces in random patterns—pareidolia—has been a survival mechanism since prehistoric times. So, is it surprising that we could be deceived by machines that mimic conversation?

If this provokes skepticism, guilty as charged. While there’s potential for machines to gain sentience and form genuine relationships in the future, such advancements are not yet realized. Today’s AI struggles with basic arithmetic and lacks genuine concern for users, despite producing seemingly thoughtful responses.


Source: www.newscientist.com

How El Niño Triggered Famine in Early Modern Europe: Uncovering the Climate Crisis’ Impact

Impact of El Niño on Crop Failures

El Niño’s Impact on European Agriculture: Crop Failures and Price Hikes

Public Domain

El Niño, a climate phenomenon centered on the Pacific Ocean, significantly influenced Europe’s climate and economy, contributing to widespread famines between 1500 and 1800.

During El Niño, warming of ocean waters in the central and eastern Pacific disrupts the trade winds, altering rainfall patterns worldwide. The cooling phase is known as La Niña, and the oscillation between the two phases is referred to as the El Niño Southern Oscillation (ENSO).

This climatic variation poses severe risks in tropical and subtropical areas, notably in Australia, where it can lead to droughts and wildfires, and in the Americas, where it causes increased rainfall.

However, until recently, the focus on El Niño’s effects on Europe was minimal. Emil Esmaili from Columbia University and his research team studied records from 160 famines in early modern Europe, correlating them with El Niño and La Niña data derived from tree rings.

The findings revealed that over 40% of famines in Central Europe during this era were directly linked to El Niño events.

El Niño typically increases rainfall in the region, which can lead to excess soil moisture, resulting in crop failures. Though it did not directly trigger famine in other European areas, it raised the likelihood of famine occurrences by 24% across all nine regions studied.

To better understand this correlation, Esmaili’s team assessed grain and fish prices, discovering that El Niño significantly drove up food prices throughout Europe for several years.

Researchers, including David Ubilava from the University of Sydney, indicate that ENSO events can still lead to food insecurity and malnutrition in low-income households in regions such as South Asia, Southeast Asia, Oceania, and parts of Africa.

While El Niño continues to influence the climate in Europe, its impact on food security is expected to be less severe today. “Modern agricultural practices are now more resilient, weather forecasting has greatly improved, and markets have become more consolidated,” says Ubilava.


Source: www.newscientist.com

Enhancing Chess Fairness: The Impact of Rearranging Game Pieces

Innovative Chess Rules: Enhancing Complexity

Image Credit: Richard Levine/Alamy

Chess can be significantly enhanced by rearranging the starting pieces, creating a more challenging or equitable game, as discovered by physicists.

In traditional chess, the pieces initiate the game symmetrically, with rooks, knights, and bishops positioned on the board’s edges, while kings and queens are centrally located. This fixed setup enables elite players to memorize optimal opening moves, potentially leading to predictable and uninspiring matches.

In the 1990s, the renowned chess grandmaster Bobby Fischer proposed a variation to reduce this reliance on memory. It randomizes the starting positions of the eight pieces on each player’s back rank, with white and black receiving mirrored arrangements, under the rules that the bishops must sit on opposite-colored squares and the king must stand between the two rooks. Known as Chess960 because of its 960 possible starting positions, the format has recently surged in popularity, drawing players like former world champion Magnus Carlsen to competitive events.
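The figure of 960 follows directly from those placement rules, and can be verified with a short enumeration (a sketch for illustration, not code from the article):

```python
from itertools import permutations

# Enumerate every distinct arrangement of the eight back-rank pieces
# and keep those satisfying the Chess960 constraints:
# bishops on opposite-colored squares, king between the two rooks.
def chess960_positions():
    valid = set()
    for perm in set(permutations("RNBQKBNR")):
        b1, b2 = (i for i, p in enumerate(perm) if p == "B")
        r1, r2 = (i for i, p in enumerate(perm) if p == "R")
        k = perm.index("K")
        if (b1 + b2) % 2 == 1 and r1 < k < r2:
            valid.add("".join(perm))
    return valid

positions = chess960_positions()
print(len(positions))            # 960
print("RNBQKBNR" in positions)   # True: the classical setup is one of them
```

The count decomposes as 4 light-square bishop spots × 4 dark-square spots × 6 queen spots × 10 knight pairs, with the remaining three squares forced into rook-king-rook order: 4 × 4 × 6 × 10 = 960.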

Although Chess960 appears equitable because of its randomness, Marc Barthelemy from the University of Paris-Saclay has found, after analyzing all possible configurations, that this perceived fairness is deceptive.

Typically, the white pieces, who commence the game, hold a slight edge in standard chess. Barthelemy’s analysis indicates that while certain Chess960 setups may greatly favor white, others could advantage black. “Not all positions are equal,” he explains.

To arrive at these findings, Barthelemy utilized Stockfish, an open-source chess engine, to evaluate each starting position’s complexity based on how challenging it was for both players to determine their next moves. By comparing the ease with which the best move could be identified, he assessed the complexity of each configuration. If finding the best move was straightforward, the player encountered minimal decision-making challenges. However, if both players faced comparable difficulties, the decision-making process became increasingly complex.
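Barthelemy’s measure is not published as code here; as a rough illustration of the idea, one could quantify decision difficulty by how close the engine’s top two candidate moves score (a hypothetical proxy, not his actual method):

```python
# Hypothetical proxy for decision difficulty: a choice is easy when
# one move's engine evaluation clearly dominates, and hard when the
# top candidate moves score nearly the same.
def move_difficulty(move_scores):
    """move_scores: engine evaluations of the legal moves, in
    centipawns, from the side to move's perspective. Returns a value
    in [0, 1]: near 0 when one move dominates, near 1 when the best
    moves are almost indistinguishable."""
    ranked = sorted(move_scores, reverse=True)
    if len(ranked) < 2:
        return 0.0  # only one legal move: no decision to make
    gap = ranked[0] - ranked[1]          # lead of best over second best
    return 1.0 / (1.0 + gap / 100.0)     # 100 centipawns = one pawn

print(move_difficulty([250, 20, -40]))   # one clear best move: low difficulty
print(move_difficulty([50, 45, 10]))     # two near-equal options: high difficulty
```

Averaging such a score over both players’ decisions through a game would give one way to rank starting positions by complexity, in the spirit of the analysis described above.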

His research identified the starting position BNRQKBNR as the most complex, while QNBRKBNR offered a balanced challenge for both players. Such insights could assist tournament organizers in ensuring fairer matchups, Barthelemy notes.

Conversely, Vito Servedio from Austria’s Complexity Science Hub argues that randomness inherently provides fairness, and favoring specific Chess960 arrangements over others may lead players to prepare excessively. “It’s more equitable as players start on an equal footing,” Servedio asserts. “Grandmasters have deep knowledge of standard chess openings, but cannot prepare for every potential Chess960 setup.”

Barthelemy also discovered that the standard chess setup is relatively unremarkable in terms of fairness and complexity compared with many of the other possible positions. “Surprisingly, the standard chess arrangement is not particularly striking,” Barthelemy observes. “It is neither the most balanced nor the most asymmetric, sitting rather centrally in the spectrum of positions. The reasoning behind this historical choice remains unclear.”

“In a vast array of positions, it stands in the middle,” Servedio remarks. “Is it purely coincidental? I cannot say.”

Barthelemy notes that measuring complexity is not the only way to evaluate the difficulty of a chess game. Giordano De Marzo from the University of Konstanz comments that the true challenge of a position often lies in there being only a single good move to find, rather than a choice among several strong options.

De Marzo expresses uncertainty over whether Barthelemy’s higher complexity scores correspond to players perceiving games as more difficult, but suspects they do. “If increased positional complexity leads to longer deliberation times, it strengthens the case for this measurement,” he concludes.


Source: www.newscientist.com

How Long He Abstains: The Impact of Male Ejaculation Timing on IVF Success

How Simple Interventions Boost IVF Success Rates

Christoph Burgstedt/Science Photo Library

Men are encouraged to ejaculate within 48 hours prior to IVF egg retrieval to enhance their chances of achieving a viable pregnancy. This recommendation comes from the first clinical trial exploring the effects of varying ejaculation intervals on fertility treatment outcomes.

During the final stages of an IVF cycle, a woman receives a “trigger” injection that matures the developing egg. This crucial injection occurs 36 hours before the eggs are harvested and fertilized.

For optimal sperm health during fertilization, men are often advised to ejaculate between two to seven days before providing their sperm sample for IVF. According to Dr. David Miller from the University of Leeds, who was not involved in the study, “There is an ideal timeframe for ejaculation when sperm quality peaks.”

This two-to-seven-day range is quite broad. Prolonged storage of sperm in the testes exposes them to various environmental toxins, particularly free oxygen radicals from metabolic processes and pollution. This exposure can lead to DNA damage and deterioration of sperm quality, warns Dr. Richard Paulson, also not involved in the trial. Conversely, too short a period between ejaculations may decrease sperm count.

Until now, solid clinical evidence that shorter intervals between ejaculations improve pregnancy outcomes has been lacking, though some studies hint at it. For instance, a 2024 meta-analysis observed that ejaculating less than four days apart correlated with improved semen quality in infertile men, and another study indicated that intervals under four hours resulted in sperm with less DNA damage and enhanced motility.

To investigate this concept further, Professor Yang Yu from the First Hospital of Jilin University in Changchun, China, conducted a study with 453 men undergoing conventional IVF. One group ejaculated roughly 36 hours prior to the final sperm sample, while another group ejaculated between 48 hours and seven days before.

Results revealed that the ongoing pregnancy rate was significantly higher in the short abstinence group: 46% versus 36% in the longer abstinence group. “While these findings are encouraging, it’s essential to note that they don’t completely represent ultimate treatment outcomes such as live birth rates,” Miller states. Nonetheless, the shorter abstinence group exhibited lower miscarriage rates, suggesting a potential for more live births.

Professor Paulson said the study provided intriguing insights but noted its weaknesses, including the mixing of fresh and frozen embryo transfers, between which IVF success rates can differ significantly. He also pointed out that the short abstinence group showed a lower fertilization rate alongside its higher ongoing pregnancy rate: fewer couples conceived, but those who did were more likely to continue past 12 weeks, a pattern that warrants more detailed analysis. “Extraordinary claims necessitate extraordinary evidence that meticulously accounts for all potential variables,” he commented.

Future studies may also disclose whether more frequent ejaculation improves pregnancy outcomes for couples not undergoing IVF. “This trial offers strong evidence that shorter periods of abstinence contribute to better sperm quality,” asserts Dr. Jackson Kirkman-Brown from the University of Birmingham, UK.


Source: www.newscientist.com

Likelihood of Catastrophic Asteroid Impact Rises Temporarily in 2025

Illustration of an asteroid passing near the moon

Mark Garlick/Science Photo Library

In 2025, the threat of a disastrous asteroid impact momentarily heightened when astronomers detected a building-sized asteroid on a collision course with Earth.

Known as 2024 YR4, the asteroid was first identified by astronomers in late December 2024, with estimates placing its size between 40 and 90 meters. Its range of possible trajectories through the solar system swept across a narrow zone that included Earth, leading astronomers to initially assess a 1 in 83 probability of a collision in 2032.

As they monitored the asteroid’s orbit more closely in early 2025, the likelihood of an impact was updated to a concerning 1 in 32 by February.

If it had struck close to an urban area, the consequences would have been devastating, with an impact energy equivalent to several megatons of TNT. The asteroid was temporarily classified as a 3 on the Torino scale, where 0 means no threat and 10 signifies a global catastrophe. This raised alarms among several United Nations agencies, prompting a coordinated global telescope campaign and discussions on the necessity of an asteroid deflection mission.

During this period, global space agencies convened regularly to share observations and enhance understanding of the asteroid. “2024 YR4 proved to be a significant learning experience for us,” stated Richard Moissl from the European Space Agency (ESA). “This served as crucial training to enhance our capabilities related to asteroid detection and understanding the overarching challenges.”

By February 20, astronomers had refined the trajectory of 2024 YR4, effectively removing Earth from the asteroid’s predicted path, and ESA subsequently reduced the collision risk to 1 in 625, or 0.16 percent. Weeks later, both NASA and ESA confirmed that there was no longer any risk of collision. “It is not considered a threat to our planet,” affirmed Moissl.

Nonetheless, astronomers still acknowledge a minor risk of a lunar impact, estimated at about 4% for 2032. “Should it hit the moon, it would provide a unique opportunity to observe the impact process from a safe distance,” commented Gareth Collins from Imperial College London.

Researchers are now assessing the potential ramifications of an asteroid impact on the moon, including the risk of debris cascading toward Earth. They are also exploring the feasibility of a deflection mission, including how a small spacecraft could be dispatched to the asteroid in an attempt to disrupt it with a nuclear device. “We must tread carefully to ensure that a moon impact does not unintentionally lead to an Earth impact,” says Moissl.

The present 4 percent chance of a lunar collision is not alarming enough to compel global space agencies to initiate a formal mission. That probability is unlikely to shift soon, as 2024 YR4 is currently obscured by the sun and won’t be visible again until 2028. However, thanks to a rare alignment, there will be an opportunity to observe it with the James Webb Space Telescope in February 2026. Moissl indicated that since planning an asteroid mission can take years, data from those observations will represent the last realistic chance to determine whether a mission to visit or deflect the asteroid is warranted.


Source: www.newscientist.com

Tattoos May Impact Local Immune System Function

Some researchers are concerned that tattoos might be hazardous to health

Olga Korbakova / Alamy

Research indicates that tattoo ink can accumulate in lymph nodes, potentially disrupting the immune system and leading to permanent alterations in the body’s disease defense mechanisms.

This conclusion arises from a study involving mice, which revealed chronic inflammation in the lymph nodes of tattooed animals—nodes that were stained with ink—and modified antibody responses to vaccinations. Similarly, studies have shown inflammation and discoloration in the lymph nodes of individuals with tattoos, persisting for years after the tattoo was applied.

The findings suggest that tattoos could increase disease risk and highlight the necessity for further investigation. Santiago González from the University of Lugano, Switzerland, asserts, “When you get a tattoo, you are essentially injecting ink into your body. This affects not only the skin’s appearance but also the immune system. Chronic inflammation, over time, can deplete the immune system, increasing susceptibility to infections and certain cancers—many questions remain that require additional research.”

Tattoos are becoming increasingly popular worldwide, with approximately 30 to 40 percent of individuals in Europe and the United States sporting at least one tattoo. Though Gonzalez does not have a tattoo, he admires them as an art form, stating, “I think they’re visually appealing.” Nonetheless, the long-term health implications of tattooing—particularly concerning the immune system—are still not well understood.

Gonzalez noted that he and his team were conducting an unrelated investigation into inflammation in mice when they observed a “crazy inflammatory response” after applying small identifying tattoos. Curious, they decided to delve deeper.

The team utilized standard commercial inks in black, red, and green to mark 25 square millimeter patches on the hind legs of several mice. With specialized imaging technologies, they tracked the ink traveling through lymph vessels towards nearby lymph nodes almost immediately, often within minutes.

In these nodes, the researchers found that macrophages (immune cells that clear debris, pathogens, and dead cells) absorbed the ink, discoloring the nodes and triggering acute inflammation. Within approximately 24 hours, these macrophages would die and release the ink, which would then be taken up by other macrophages, creating a continuous cycle of chronic inflammation that outlasted the healing of the tattoo site.

After two months, during which the tattoos remained, the mice still exhibited inflammatory markers in their lymph nodes that were up to five times higher than typical, Gonzalez reported.

To comprehend how this inflammation affected immune functioning, the researchers administered a vaccine directly into the tattooed skin. Notably, the tattooed mice demonstrated a markedly weaker antibody response to the COVID-19 mRNA vaccine compared to control mice, while showing a stronger response to the influenza vaccine.

Further analysis revealed that the lymph node macrophages from tattooed mice were filled with ink and struggled to capture the COVID-19 vaccine. For mRNA vaccines to be effective, they must be processed by macrophages. Conversely, the protein-based influenza vaccines triggered an enhanced antibody response, likely due to an increase in immune cells drawn to the tattoo site. “The response may vary based on the type of vaccine,” Gonzalez explained.

Lastly, the researchers investigated a limited number of lymph node biopsies from individuals who had tattoos near their lymph nodes. Two years post-tattoo application, the lymph nodes retained visible pigment, housed within the same type of macrophages observed in the mouse research. “Their lymph nodes were entirely filled with ink,” noted Gonzalez.

Crucially, he emphasized that even if individuals undergo tattoo removal, the ink is likely to persist in the lymph nodes for a lifetime. “You can eliminate ink from your skin, but the ink in your lymph nodes remains,” he stated.

The research findings illuminate the long-suspected link between tattoos and immune response. Christel Nielsen at Lund University, Sweden, said her team recently published findings suggesting that individuals with tattoos have a heightened melanoma risk, an association she believes may be explained by increased inflammation in the lymph nodes. “This study provides compelling proof that this is indeed the case,” she remarked, calling it a significant advancement in our understanding of the relationship between tattoos and disease.

For Michael Jurbdazian at the German Federal Institute for Risk Assessment in Berlin, the study paints a clearer picture of how tattoo pigments interact with the immune system. However, he notes that results from mouse studies might not precisely mirror human outcomes, especially given the differences between human and mouse skin. “The correlation with human health, particularly once healing is complete, necessitates more investigation,” he stated.


Source: www.newscientist.com

AI’s Impact on Voter Sentiment: Implications for Democracy

AI chatbots may have the potential to sway voter opinions

Enrique Shore / Alamy

Could the persuasive abilities of AI chatbots signal the decline of democracy? A substantial study investigating the impact of these tools on voter sentiments revealed that AI chatbots surpass traditional political campaign methods, such as advertisements and pamphlets, in persuasiveness, rivaling seasoned campaigners as well. However, researchers see reasons for cautious optimism regarding how AI influences public opinion.

Evidence shows that AI chatbots like ChatGPT can shift the beliefs of conspiracy theorists, winning converts to more reasonable positions and swaying audiences in debates with humans. This capability raises valid worries about AI skewing the digital scales that determine election results, or being misused by malicious entities to steer users toward certain political figures.

The concerning part is that these fears have merit. In a study involving thousands of voters who participated in recent elections in the US, Canada, and Poland, David Rand and fellow researchers at MIT found that AI chatbots effectively swayed individuals to back specific candidates or alter their stance on certain issues.

“Conversations with these models can influence attitudes towards presidential candidates, attitudes often deemed deeply entrenched, more than previous studies would suggest,” Rand remarks.

In their American election analysis, Rand’s team surveyed 2,400 voters, asking them about the most significant policy issues or characteristics of a potential president. Subsequently, voters rated their preferences for the leading candidates, Donald Trump and Kamala Harris, on a 100-point scale and answered additional questions to clarify their choices.

These answers were fed into a chatbot, such as ChatGPT, which was given the objective of persuading the voter either to support an already favored candidate or to switch support to a less favored one. The interaction took about six minutes, consisting of three question-and-answer exchanges.

In measurements taken after the AI interaction and at a one-month follow-up, Rand’s team found that voters had adjusted their candidate preferences by an average of 2.9 points.

Furthermore, the researchers examined AI’s capacity to influence views on specific policies and noticed a substantial change in opinions regarding the legalization of psychedelics, shifting voter support by approximately 10 points. In comparison, video ads impacted views by only about 4.5 points, and text ads swayed opinions by merely 2.25 points.

The magnitude of these findings is remarkable. Sacha Altay of the University of Zurich stated, “These effects are considerably larger than those typically observed with traditional political campaigning and are comparable to the influence stemming from expert discussions.”

Nevertheless, the study reveals a more hopeful insight: these persuasive interactions predominantly stemmed from fact-based arguments rather than personalized content, which tends to exploit users’ personal information available to political operatives.

Another study, of approximately 77,000 individuals in the UK, assessed 19 large language models across 707 distinct political issues, concluding that AI performed best when employing fact-based arguments rather than tailoring its discussions to the individual.

“Essentially, it’s about creating a compelling argument that prompts a mindset shift,” Rand explains.

“This bodes well for democracy,” notes Altay. “It indicates that individuals are often more influenced by factual evidence than by personalized or manipulative strategies.”

Further research is needed to confirm these findings, says Claes de Vreese at the University of Amsterdam. Even if they are replicated, he adds, the controlled environment of these studies, in which participants engaged extensively with chatbots, differs significantly from how people typically encounter politics among friends and colleagues.

“The structured setting of interaction about politics with a chatbot is quite different from how people usually engage with political matters,” he mentions.

Despite this, de Vreese notes growing evidence that individuals are indeed turning to AI chatbots for political advice. A recent survey of over 1,000 voters in the Netherlands ahead of the 2025 national elections found that about 10% sought AI guidance on candidates, political parties, and election matters. “This trend is particularly noteworthy as the elections approach,” de Vreese points out.

Even if people’s engagements with chatbots are brief, de Vreese asserts that the integration of AI into political processes seems unavoidable, whether politicians use these tools for policy recommendations or AI generates political advertisements. “As researchers and as a society, we must recognize that generative AI is now a vital aspect of the electoral process,” he states.


Source: www.newscientist.com

Water Crisis: The Impact of Large Data Centers on Australia’s Freshwater Resources

Australia is capitalizing on the AI boom, with numerous new investments in data centers located in Sydney and Melbourne. However, experts caution about the strain these large-scale projects may impose on already limited water resources.

The projected water demand for servicing Sydney’s data centers is anticipated to surpass the total drinking water supply in Canberra within the next decade.

In Melbourne, the Victorian government has pledged a $5.5 million investment to transform the city into Australia’s data center hub. Currently, hyperscale data center applications already exceed the collective water demands of nearly all of the top 30 business customers in the state.

Tech giants like OpenAI and Atlassian are advocating for Australia to become a data processing and storage hub. With 260 data centers currently operational and numerous others planned, experts are concerned about the repercussions for drinking water resources.

Sydney Water projects that it will require as much as 250 megalitres daily to support the industry by 2035, more than Canberra’s total drinking water supply.

Cooling Requires Significant Water

Professor Priya Rajagopalan, director of RMIT’s Center for Post Carbon Research, points out that a data center’s water and energy requirements are largely dictated by the cooling technology implemented.

“Using evaporative cooling leads to significant water loss through evaporation, while a sealed system conserves water but requires substantially more energy for cooling,” she explains.

Older data centers typically depend on air cooling. However, the increased demand for computational power means greater server rack densities, resulting in higher temperatures. Hence, these centers rely more heavily on water for cooling solutions.

Water consumption in data centers varies significantly. For instance, NextDC has transitioned to liquid-to-chip cooling, which cools processors and GPUs directly, as opposed to cooling entire rooms with air or water.

NextDC reports that initial trials of this cooling technology are complete, and that liquid cooling is far more efficient and can scale to ultra-dense environments, increasing processing power without a proportional increase in energy consumption. Its modeling suggests the power usage effectiveness (PUE) could fall to as low as 1.15.


The data center sector measures its sustainability using two key metrics: water usage effectiveness (WUE) and power usage effectiveness (PUE), which gauge how much water or power is consumed per unit of computing work.

WUE is calculated by dividing annual water usage (in litres) by annual IT energy usage (in kilowatt-hours). For instance, a 100MW data center that uses 3ML of water daily would yield a WUE of 1.25 litres per kWh. A figure closer to 1 indicates greater efficiency. Certain countries enforce minimum standards; Malaysia, for example, recommends a WUE of 1.8.
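The worked example above can be checked in a few lines (a sketch with hypothetical helper names, assuming the 100 MW figure refers to IT load running around the clock):

```python
# Sketch of the two sustainability metrics (hypothetical helper names):
# WUE = litres of water per kWh of IT energy;
# PUE = total facility energy per unit of IT energy (1.0 would be ideal).
def wue(water_litres_per_day, it_power_mw):
    kwh_per_day = it_power_mw * 1000 * 24  # MW -> kW, over 24 hours
    return water_litres_per_day / kwh_per_day

def pue(total_facility_kwh, it_kwh):
    return total_facility_kwh / it_kwh

# The article's example: a 100 MW data center using 3 ML of water a day.
print(wue(3_000_000, 100))  # 1.25 litres per kWh
```

At 100 MW the IT load consumes 2.4 million kWh a day, so 3 million litres of water works out to exactly 1.25 litres per kWh, matching the figure quoted above.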

Even facilities that are efficient can still consume substantial amounts of water and energy at scale.

NextDC’s last fiscal year’s PUE stood at 1.44, up from 1.42 the previous year. The company indicates that this reflects the changing nature of customer activity across its facilities and the onboarding of new centers.

Calls to Ban Drinking Water Usage

Sydney Water states that estimates regarding data center water usage are continually reassessed. To prepare for future demands, the organization is investigating alternative, climate-resilient water sources like recycled water and rainwater harvesting.

“Every proposed connection for data centers will undergo case-by-case evaluations to guarantee adequate local network capacity. If additional services are necessary, operators might need to fund upgrades,” a Sydney Water representative said.

In its submission to the 2026-2031 rate review in Victoria, Melbourne Water observed that hyperscale data center operators seeking connectivity “expect instantaneous and annual demand to surpass nearly all of Melbourne’s leading 30 non-residential customers.”

Melbourne Water mentioned, “This has not been factored into our demand forecasting or expenditure plans.”

The agency is requesting upfront capital contributions from companies to mitigate the financial burden of necessary infrastructure improvements, ensuring those costs do not fall solely on the broader customer base.

Documents provided to the Guardian show that Greater Western Water in Victoria has received 19 data center applications.


The Concerned Waterways Alliance, composed of various Victorian community and environmental organizations, has expressed concerns regarding the potential diversion of drinking water for cooling servers when the state’s water supplies are already under stress.

Alliance spokesperson Cameron Steele emphasized that expanding data centers would create a greater reliance on desalinated water, thereby diminishing availability for ecological streams and possibly imposing costs on local communities. The group is advocating for a ban on potable water usage for cooling and demanding that all centers transparently report their water consumption.

“We strongly promote the use of recycled water over potable water within our data centers.”

Closed Loop Cooling

In hotter regions, like much of Australia during summer, data centers require additional energy or water to remain cool.

Daniel Francis, customer and policy manager at the Australian Water Works Association, highlights that there is no universal solution for the energy and water consumption of data centers, as local factors such as land availability, noise restrictions, and water resources play significant roles.

“We constantly balance the needs of residential and non-residential customers, as well as environmental considerations,” says Francis.

“Indeed, there is a considerable number of data center applications, and it’s the cumulative effect we need to strategize for… It’s paramount to consider the implications for the community.”

“Often, they prefer to cluster together in specific locations.”

One of the data centers currently under construction in Sydney’s Marsden Park is a 504MW facility spanning 20 hectares with six four-story buildings. The company claims this CDC center will be the largest data campus in the southern hemisphere.

Last year, CDC operated its data centers with 95.8% renewable electricity, achieving a PUE of 1.38 and a WUE of 0.01. A company representative stated that this level of efficiency was made possible through a closed-loop cooling system that does not require continuous water extraction, in contrast to traditional evaporative cooling systems.
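The efficiency figures quoted here follow the standard definitions: power usage effectiveness (PUE) is total facility energy divided by the energy delivered to IT equipment, and water usage effectiveness (WUE) is liters of water consumed per kilowatt-hour of IT energy. A minimal sketch of the arithmetic (the annual figures below are illustrative round numbers, not CDC’s actual meter readings):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: 1.0 is the theoretical ideal."""
    return total_facility_kwh / it_equipment_kwh

def wue(water_liters: float, it_equipment_kwh: float) -> float:
    """Water usage effectiveness, in liters per kWh of IT energy."""
    return water_liters / it_equipment_kwh

# Illustrative annual figures for a hypothetical facility:
it_kwh = 100_000_000          # energy delivered to servers
facility_kwh = 138_000_000    # servers plus cooling, lighting, losses
water_l = 1_000_000           # water consumed for cooling over the year

print(round(pue(facility_kwh, it_kwh), 2))  # → 1.38
print(round(wue(water_l, it_kwh), 2))       # → 0.01
```

The near-zero WUE of a closed-loop system falls out of the definition: once the loop is filled, almost no water is drawn per kWh of IT load, whereas evaporative cooling consumes water continuously.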

“CDC’s closed-loop system is filled only once at its inception and functions without ongoing water extraction, evaporation, or waste generation, thereby conserving water while ensuring optimal thermal performance,” the spokesperson noted.

“This model is specifically designed for Australia, a nation characterized by drought and water shortages, focusing on long-term sustainability and establishing industry benchmarks.”

Despite CDC’s initiatives, community concerns regarding the project persist.

Peter Rofile, acting chief executive of the Western Sydney Local Health District, wrote in a letter last June that the development’s proximity to vulnerable communities and its unprecedented scale posed untested risks to residents of western Sydney.

“This proposal does not guarantee that this operation can adequately mitigate environmental exposure during extreme heat events, potentially posing an unreasonable health risk to the public,” Rofile stated.

Source: www.theguardian.com

Tesla Cautions UK: Easing EV Regulations Could Impact Sales Negatively

Tesla has notified the UK government that loosening electric vehicle regulations could negatively impact battery car sales and hinder the achievement of carbon targets, as highlighted in recently disclosed documents.

Elon Musk’s electric vehicle manufacturer also requested “support for the used car market” in a government consultation submission obtained earlier this year by Fast Charge, a newsletter focused on electric vehicles.

In April, the Labour government alarmed some electric car manufacturers by relaxing rules known as the zero-emission vehicle (ZEV) mandate. The mandate had been designed to ratchet up EV sales each year, but the new flexibilities allowed manufacturers to sell more petrol and diesel vehicles.


Critics argue that a new tax on electric vehicles introduced in last week’s budget may further dampen demand.

Automakers such as BMW, Jaguar Land Rover, Nissan, and Toyota, all operating factories in the UK, expressed in their submissions during the spring consultation that the mandate was discouraging investment, as they were selling electric vehicles at a loss. In contrast, environmentalists and brands focusing primarily on electric vehicles assert that the rules are serving their intended purpose, with no manufacturers expected to be penalized for 2024 sales.

Tesla emphasized that avoiding new loopholes referred to as “flexibilities” was “essential” for the success of electric vehicle sales.

According to Tesla, these changes could “diminish the availability of battery electric vehicles (BEVs), significantly impact emissions, and jeopardize the UK’s carbon budget.”

Chancellor Rachel Reeves has committed to imposing a “pay-per-mile” charge on electric vehicles from 2028 and has warned manufacturers that future budgets could be tighter still. The charge could make electric vehicles less appealing compared with more polluting petrol and diesel options. Simultaneously, she announced an extension of subsidies for new electric vehicles, which was positively received by the industry.

Tom Reilly, author of Fast Charge, remarked: “Just as the shift to EVs seemed stable, the Budget has pulled it in two different directions, effectively robbing Peter to pay Paul. If car manufacturers succeed in watering down their obligations again, Labour will only have itself to blame when climate targets are missed.”

Tesla, Mercedes-Benz, and Ford objected to their responses being made public, and the submissions were released only after requests under the Freedom of Information Act. Several documents were extensively redacted, yet the headings still revealed Tesla’s call for “support for the used car market.” Tesla declined to comment on whether this assistance would involve subsidies.

Conversely, U.S. manufacturer Ford and Germany’s Mercedes-Benz are advocating against stricter regulations after 2030, which would require them to further lower average carbon dioxide emissions, allowing them to continue selling polluting vehicles longer.


Ford has strongly criticized European governments for retracting support for electric vehicle sales, stating, “Policymakers in various European regions are not adhering to the agreement.” Ford had previously backed stronger goals but has since changed its position.

U.S. automakers also highlighted the risk of being overshadowed by Chinese manufacturers, which “lack a foothold in the UK and benefit from lower costs.”

Mercedes-Benz contends that the UK should cut the value-added tax on public charging from 20% to 5%, bringing it into line with household electricity, and suggests that a price cap on public charging fees should be considered.

Additionally, Tesla advocated for banning the sale of plug-in hybrid electric vehicles with a battery-only range of less than 160 miles starting in 2030, a rule that would exclude many of the best-selling models in this category.

Ford, Mercedes-Benz, and Tesla chose not to provide further comments.

Source: www.theguardian.com

Over 1,000 Amazon Employees Raise Concerns About AI’s Impact on Jobs and the Environment

An open letter signed by over 1,000 Amazon employees has raised “serious concerns” regarding AI development, criticizing the company’s “all costs justified and warp speed” approach. It warns that the implications of such powerful technologies will negatively affect “democracies, our jobs, and our planet.”

Released on Wednesday, this letter was signed anonymously by Amazon employees and comes a month after the company’s announcement about mass layoffs intended to ramp up AI integration within its operations.

The signatories represent a diverse range of roles, including engineers, product managers, and warehouse staff.

Echoing widespread concerns across the tech industry, the letter also gained support from over 2,400 employees at other companies such as Meta, Google, Apple, and Microsoft.

The letter outlines demands aimed at Amazon on both workplace and environmental issues. Employees are urging the company to provide clean energy for all data centers, ensure that AI-driven products and services do not facilitate “violence, surveillance, and mass deportation,” and establish a working group of non-management employees with meaningful responsibility for the company’s overarching AI objectives, the use of AI in layoffs, and AI’s collateral impacts, such as environmental effects.

The letter is the product of an advocacy group of Amazon employees pushing for climate justice. One worker involved in drafting it said employees felt compelled to speak out because of adverse experiences with AI tools at work and broader environmental concerns stemming from the AI boom, and stressed the desire for more responsible methods in the development, deployment, and use of the technology.

“I signed this letter because executives are increasingly fixated on arbitrary productivity metrics and quotas, using AI to justify pushing themselves and their colleagues to work longer hours or handle more projects with tighter deadlines,” stated a senior software engineer who preferred to remain anonymous.

Climate Change Goals

The letter claims that Amazon is “abandoning climate goals for AI development.”

Like its competitors in the generative AI space, Amazon is heavily investing in new data centers to support its AI tools, which are more resource-intensive and demand significant power. The company plans to allocate $150 billion over the next 15 years for data centers, and has recently disclosed an investment of $15 billion for a data center in northern Indiana and $3 billion for centers in Mississippi.

The letter reports that Amazon’s annual emissions have seen an “approximately 35% increase since 2019,” despite the company’s promises. The report cautions that many of Amazon’s AI infrastructure investments will be in areas where energy demands compel utilities to maintain coal plants or establish new gas facilities.

“‘AI’ is being used as a buzzword to mask a reckless investment in energy-hungry computer chips, which threaten worker power, accumulate resources, and supposedly save us from climate issues,” noted an Amazon customer researcher who requested to remain anonymous. “It would be fantastic to build AI that combats climate change! However, that’s not where Amazon’s billions are directed. They are investing in data centers that squander fossil fuel energy for AI aimed at monitoring, exploiting, and extracting profit from their customers, communities, and government entities.”

In a statement to the Guardian, Amazon spokesperson Brad Glasser refuted the employees’ claims and highlighted the company’s climate initiatives. “Alongside being a leading data center operator in efficiency, we have been the largest corporate buyer of renewable energy globally for five consecutive years, with over 600 projects globally,” Glasser stated. “We have also made substantial investments in nuclear energy through our current facilities and emerging SMR technology. These efforts are tangible actions demonstrating our commitment to achieving net-zero carbon across our global operations by 2040.”

AI for Enhanced Productivity

The letter also includes stringent demands regarding AI’s role within Amazon, arising from challenges employees are facing.

Three Amazon employees who spoke with the Guardian claimed that the company was pressuring them to leverage AI tools to boost productivity. “I received a message from my direct boss,” shared a software engineer with over two years at Amazon, who spoke on condition of anonymity for fear of retaliation, “about using AI in coding, writing, and general daily tasks to enhance efficiency, stressing that if I don’t actively use AI, I risk falling behind.”

The employee added that not long ago, their manager indicated they were “expected to double their work output due to AI tools,” expressing concern that the anticipated production levels would require fewer personnel and that “the tools simply aren’t bridging the gap.”

Customer researchers shared similar feelings. “I personally feel pressure to incorporate AI into my role, and I’ve heard from numerous colleagues who feel the same pressure…”

“Meanwhile, there is no dialogue about the direct repercussions for us as workers, from unprecedented layoffs to unrealistic output expectations.”

A senior software engineer said that the introduction of AI has led to suboptimal outcomes, most commonly when employees are pushed to use agentic code-generation tools. “My work on a recent project consisted mostly of cleaning up after an experienced engineer who had tried to use AI to generate code for a complex assignment,” the employee revealed. “Unfortunately, none of it functioned as intended, and he had no idea why. We would have been better off starting from scratch.”

Amazon did not respond to questions regarding employee critiques of its AI workplace policies.

Employees stressed that they are not inherently opposed to AI but wish to see it developed sustainably and with input from those who are directly involved in its creation and application. “I believe Amazon is using AI to justify its control over local resources like water and energy, and it also legitimizes its power over its employees, who face increasing surveillance, accelerated workloads, and implicit termination threats,” a senior software engineer asserted. “There exists a workplace culture that discourages open discussions about the flaws of AI, and one of the objectives of this letter is to show colleagues that many of us share these sentiments and that an alternative route is achievable.”

Source: www.theguardian.com

Mathematicians Announce Significant Impact of Google’s AI Tools on Research Advancement

AI aids mathematicians in solving diverse problems

Andresle/Getty Images

The AI tools created by Google DeepMind are proving to be remarkably effective in aiding mathematical research, and experts believe this could initiate a wave of AI-driven mathematical breakthroughs on an unprecedented scale.

In May, Google unveiled an AI system named AlphaEvolve, designed to discover new algorithms and formulas. The system generates numerous candidate solutions with Google’s AI chatbot Gemini, then feeds them into a separate automated evaluator, which filters out the nonsensical outputs that chatbots are prone to produce. In initial tests, Google researchers ran AlphaEvolve on more than 50 open mathematical problems and found that it rediscovered the best known human solutions in roughly three-quarters of the cases.

Recently, Terence Tao and his team at UCLA assessed the system on 67 harder and more extensive mathematical research problems. They found that AlphaEvolve did more than merely revisit old solutions; in certain instances it generated improved solutions that could be fed into other AI systems, such as a more resource-intensive version of Gemini or AlphaProof, the AI that secured a gold medal in this year’s International Mathematical Olympiad, to craft new mathematical proofs.

Tao noted that it’s challenging to gauge overall effectiveness, as the problems differ in their complexities. However, the system consistently operated much faster than any individual mathematician.

“Addressing these 67 problems through traditional methods would require us to design a specific optimization algorithm for each task. That would take years and we might never have initiated this project at all. This initiative offers a chance to engage in mathematics on a previously unseen scale,” Tao states.

AlphaEvolve is particularly adept at solving what are known as optimization problems. These encompass tasks like determining the optimal figures, formulas, or objects that best resolve specific challenges. For instance, calculating the maximum number of hexagons that can occupy a defined area.
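The generate-then-evaluate loop described above can be caricatured in a few lines: a proposer emits many candidate solutions, a programmatic evaluator scores each one and discards invalid proposals, and the best survivor seeds the next round. Below is a toy sketch of that pattern, with random Gaussian mutation standing in for Gemini and a trivial one-dimensional objective standing in for a real optimization problem; none of this is DeepMind’s actual code.

```python
import random

def evaluate(x: float) -> float:
    """Programmatic scorer: higher is better. Peak of 9.0 at x = 3."""
    return 9.0 - (x - 3.0) ** 2

def evolve(generations: int = 200, pool_size: int = 20) -> float:
    """Generate-then-evaluate loop: mutate the current best candidate,
    score every proposal, and keep the top survivor each round."""
    rng = random.Random(0)  # fixed seed for reproducibility
    best = rng.uniform(-10, 10)
    for _ in range(generations):
        # "Generator": propose variations around the current best.
        candidates = [best + rng.gauss(0, 1.0) for _ in range(pool_size)]
        # "Evaluator": discard out-of-range proposals, keep the best scorer.
        candidates = [c for c in candidates if -10 <= c <= 10]
        best = max(candidates + [best], key=evaluate)
    return best

print(evolve())  # converges near 3.0, the true optimum
```

The key design point, and the reason the evaluator must be a program rather than another language model, is that the score is checked mechanically: a hallucinated “solution” simply evaluates poorly and is discarded.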

While the system is capable of addressing optimization problems across various branches of mathematics, such as number theory and geometry, these still represent “only a small fraction of all the problems that mathematicians are interested in,” according to Tao. Nonetheless, the power of AlphaEvolve is such that mathematicians might attempt to reformulate non-optimization problems into solvable forms for AI. “These tools offer a fresh perspective for tackling these issues,” he adds.

A potential drawback, however, as Tao explains, is that the system sometimes tends to “cheat” by producing answers that seem correct but utilize loopholes or methods that don’t genuinely solve the problems. “It’s akin to administering a test to a group of exceptionally bright yet morally ambiguous students who will do whatever it takes to score highly,” he remarks.

Even with its flaws, AlphaEvolve’s achievements are garnering interest from a broader segment of the mathematical community that might have previously leaned towards more general AI solutions such as ChatGPT, according to team member Javier Gomez Serrano from Brown University. Although AlphaEvolve isn’t publicly accessible yet, numerous mathematicians have expressed interest in testing it.

“There’s definitely a growing curiosity and openness to employing these tools,” asserts Gomez Serrano. “Everyone is eager to discover their potential. Interest in the mathematical community has surged compared to a year or two ago.”

Tao believes that such AI systems alleviate some of the burdens of mathematical work, allowing researchers to focus on other areas. “Mathematicians are few in number globally, making it infeasible to consider every problem. However, there exists a multitude of mid-level difficulties where tools like AlphaEvolve are particularly effective,” he notes.

Jeremy Avigad, a researcher at Carnegie Mellon University in Pennsylvania, observes that machine learning methods are increasingly beneficial to mathematicians. “The next step is enhancing collaboration between computer scientists skilled in machine learning tools and mathematicians with domain-specific knowledge,” he emphasizes.

“We aspire to witness more outcomes like this in the future and identify methods to extend this approach into more abstract mathematical fields.”


Source: www.newscientist.com

Intact Impact Crater Unearthed in China

Scientists have identified an impact crater formed in a granite mountain, which is covered by a dense weathered crust in southern China. The Jinlin Crater, situated in Zhaoqing, Guangdong Province, is among approximately 200 craters recognized worldwide and is estimated to be less than 11,700 years old.



Panoramic aerial drone image of Jinlin Crater taken on May 12, 2025. Image credit: Chen et al., doi: 10.1063/5.0301625.

Throughout Earth’s geological history, a variety of impact craters have emerged.

Nevertheless, due to tectonic movements and significant surface weathering, many ancient craters have been eroded, distorted, or covered.

Currently, around 200 impact craters have been documented globally.

Only four of these impact craters have been reported in China, all of which are in the northeastern region.

In contrast, southern China experiences a tropical to subtropical monsoon climate, with high rainfall, humidity, and temperatures that promote substantial chemical weathering.

The newly found impact structure, referred to as Jinlin Crater, is located in the low mountains and hills of northwestern Guangdong province, adjacent to Jinlin Waterside Village in Deqing County, Zhaoqing City.

With a diameter of 900 m, it stands as the largest known impact crater formed during the Holocene, significantly surpassing the 300-m Macha crater, which previously held that distinction.

“This discovery indicates that the scale of small extraterrestrial object impacts on Earth during the Holocene is much greater than previously known,” remarked Dr. Ming Chen, a researcher at the Center for High Pressure Science and Technology Advanced Research.

In this instance, the “small” impactor is believed to have been a meteorite rather than a comet; a comet impact would have produced a crater no less than 10 km wide.

However, Chen and his team have not yet established if the meteorite was composed of iron or stone.

One of the most intriguing aspects of this crater is its remarkable preservation, especially given the monsoons, heavy rainfall, and high humidity conditions of the region, which are typically conducive to erosion.

Within the weathered granite crust that shields and preserves the impact structure, researchers uncovered numerous quartz fragments exhibiting distinctive microscopic features known as planar deformation features, which geologists use as diagnostic indicators of an impact.

“On Earth, quartz planar deformation features can only be formed by intense shock waves generated from celestial body collisions, with formation pressures between 10 to 35 gigapascals. This shock effect cannot be replicated by geological processes on Earth,” explained Dr. Chen.

“It is widely accepted that over Earth’s history, every point on the Earth’s surface has experienced impacts from extraterrestrial objects with roughly equal probability.”

“However, geological variations have led to different erosion rates of these historical impact markers, with some vanishing completely.”

“This underscores the significance of the Jinlin Crater discovery.”

“Impact craters serve as genuine records of Earth’s impact history.”

Uncovering impact craters on Earth provides a more objective basis for understanding the distribution, geological evolution, and impact history of small extraterrestrial objects.

For more details, see the team’s paper, published on October 15, 2025, in the journal Matter and Radiation at Extremes.

_____

Ming Chen et al. 2026. Jinlin Crater, Guangdong, China: Impact origin confirmed. Matter and Radiation at Extremes 11: 013001; doi: 10.1063/5.0301625

Source: www.sci.news