Urban Subsidence: A Greater Climate Crisis than Sea Level Rise

For decades, discussions surrounding coastal risk have focused primarily on climate change and sea level rise. However, a significant new global study reveals an even more urgent threat: land subsidence, affecting hundreds of millions of people living in delta regions, including urban hubs like New Orleans and Bangkok.

In various locations around the world, land is sinking at rates that often surpass the rising sea levels.

Utilizing satellite radar technology to monitor minute changes in the Earth’s surface, researchers have discovered that over half of the world’s deltas—low-lying areas where major rivers converge with the ocean—are currently sinking. This gradual subsidence, in conjunction with sea level rise, poses the most significant flood risk in many densely populated delta regions on Earth.
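The satellite-radar technique referred to here, interferometric SAR, detects millimetre-scale elevation change by comparing the radar signal's phase between repeat passes. As a minimal sketch of the phase-to-displacement conversion, assuming a C-band wavelength like Sentinel-1's (the study's actual instruments and processing pipeline are not described here):

```python
import math

# Convert an interferometric phase difference (radians) between two
# satellite passes into line-of-sight ground displacement (mm).
# The C-band wavelength (~5.55 cm) is an illustrative assumption,
# not a value taken from the study.
WAVELENGTH_MM = 55.5  # C-band radar wavelength in millimetres

def phase_to_displacement_mm(delta_phase_rad: float) -> float:
    """Two-way travel: one full phase cycle (2*pi) = half a wavelength of motion."""
    return delta_phase_rad * WAVELENGTH_MM / (4 * math.pi)

# A phase shift of ~0.8 rad corresponds to a few millimetres of motion:
print(round(phase_to_displacement_mm(0.8), 2))  # ≈ 3.53 mm
```

Repeating this comparison over years of acquisitions is what allows rates of a few millimetres per year to be resolved.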

“This is really a wake-up call,” stated Professor Robert Nicholls, co-author of the study and a coastal scientist at the University of Southampton, as reported in BBC Science Focus. “Until now, no one had taken a global perspective on delta subsidence. This study highlights the breadth of the issue and underscores the urgency of addressing it.”

The survey results can be found in the journal Nature.

Subsidence rates in river deltas, displayed as colored circles. The size of each circle reflects the area of the delta sinking faster than sea level rise, represented as a color gradient across the delta’s basin. Photo credit: Ohenhen et al. (2026)

Global Problems Hidden in Plain Sight

Delta regions comprise only 1% of the Earth’s land area but are home to approximately 350 to 500 million people, including some of the world’s most significant cities and productive agricultural zones. These areas serve as economic powerhouses, environmental hotspots, and essential food sources, yet they are inherently fragile.

Deltas are formed by loose, water-saturated sediments deposited over millennia. In their natural state, these sediments compact under their own weight and gradually sink.

Historically, natural subsidence was balanced by periodic flooding that replenished the land with fresh sediment, but modern interventions have disrupted this equilibrium.

The recent study analyzed satellite measurements across 40 major delta regions from 2014 to 2023, creating the first high-resolution global image detailing land elevation changes.

The findings were alarming: land is currently subsiding across at least 35% of total delta area, and in most deltas more than half of the land surface is sinking.

In 18 of the 40 river deltas examined, land is sinking faster than local sea level rise, revealing hotspots where subsidence dominates over regional and global sea level increases.

A similar pattern is evident across continents—Asia, Africa, Europe, and the Americas—where relative sea levels rise due to both ocean expansion and land subsidence.

“From a risk perspective, it doesn’t matter if sea levels rise or land sinks,” Nicholls explained. “The ultimate effect is the same, but the responses to those threats may differ.”

The Ciliwung Delta in Indonesia, which includes Jakarta and is home to over 40 million people, is sinking at an average rate of 5.6 mm per year. Photo credit: Getty

What is Causing the Sinking?

The study identified three primary causes of anthropogenic land subsidence: groundwater extraction, reduced sediment supply, and urban expansion. Among these, groundwater pumping is the most significant predictor.

When groundwater is extracted, the soft surrounding sediments collapse and compact, a process that is nearly irreversible. Once the sediment is compacted, it will not return, even if water levels recover.

In 10 of the 40 delta regions studied, groundwater depletion was the main driver of land subsidence. Reduced river sediment supply caused by damming and flood defenses, together with the weight of growing cities built on soft soils, further contributes to the crisis.

As a result, what was once a slow geological phenomenon has transformed into an urgent environmental crisis.

US Case: Mississippi Delta

The Mississippi River Delta, in Louisiana around New Orleans, exemplifies this issue in the United States.

The analysis confirms widespread subsidence across the delta, with over 90% of the region experiencing subsidence at an average rate of 3.3 mm per year. Some localized areas even sink much faster.

While this rate may seem minimal, it accumulates significantly over decades, especially alongside the threats posed by rising sea levels and hurricanes.
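The accumulation described above is simple arithmetic: cumulative relative elevation loss is rate times time. In this sketch, the 3.3 mm/yr subsidence rate is the delta-wide average quoted in the article, while the ocean-rise rate is an assumed illustrative value:

```python
# Even a modest rate compounds over decades.
SUBSIDENCE_MM_PER_YR = 3.3   # delta-wide average, from the article
OCEAN_RISE_MM_PER_YR = 3.0   # assumed for illustration

def cumulative_loss_cm(years: int) -> float:
    """Total relative sea-level rise (cm) accumulated over `years`."""
    return years * (SUBSIDENCE_MM_PER_YR + OCEAN_RISE_MM_PER_YR) / 10

for years in (10, 50, 100):
    print(f"after {years} years: {cumulative_loss_cm(years):.1f} cm")
```

Over a century, these two modest rates combine to well over half a metre of relative sea-level rise.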

The Mississippi Delta has lost thousands of square kilometers of coastal wetlands over the last century, a catastrophic loss. An area the size of a soccer field is lost to open water every 100 minutes.
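The "soccer field every 100 minutes" figure can be scaled to an annual rate. The ~7,000 m² field area is an assumed typical value, not a figure from the article:

```python
# Rough annual scale of losing one soccer field every 100 minutes.
FIELD_AREA_M2 = 7_000             # assumed typical soccer-field area
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def annual_loss_km2(minutes_per_field: float = 100) -> float:
    """Implied wetland loss per year, in square kilometres."""
    fields_per_year = MINUTES_PER_YEAR / minutes_per_field
    return fields_per_year * FIELD_AREA_M2 / 1e6

print(f"≈ {annual_loss_km2():.0f} km² lost per year")
```

That is on the order of tens of square kilometres per year, consistent with the thousands of square kilometres lost over the past century.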

The Mississippi Delta experiences an average subsidence of 3.3 mm per year, with some hotspots sinking over 10 times faster. Photo credit: NASA Earth Observatory

The lack of fresh sediment is a critical issue. Levees and dams prevent flooding and the natural deposition of new sediments that help rebuild the land. Additionally, drainage systems, oil and gas extraction, and decades of groundwater pumping exert further stress on fragile soils.

While some delta areas display resilience, one proposed response is relocating populations away from the most vulnerable regions. New Orleans, for instance, has seen a steady population decline since the 1960s.

“In the United States, people tend to accept the idea of relocation,” Nicholls noted, emphasizing that societal mobility and favorable land-use policies make this transition more politically feasible than in parts of Europe and Asia, where long-term protective measures are generally favored.

Warning to Major Cities

While North America grapples with these challenges, the most extreme subsidence rates can be found in parts of South and Southeast Asia, where population density is high and dependence on groundwater for agriculture, industry, and drinking water prevails.

Regions such as the deltas of the Mekong (Vietnam), the Ganges and Brahmaputra (Bangladesh and India), the Chao Phraya (Thailand), and the Yellow River (China) are sinking in places by more than a centimeter per year, faster than the current rate of global sea level rise.

Mega-cities like Bangkok, Dhaka, Shanghai, and parts of Jakarta are built on these subsiding foundations.

The good news is that, unlike global sea level rise—which unfolds over centuries—human-induced land subsidence can respond swiftly to policy changes. A notable success story is Tokyo.

Due to strict groundwater extraction regulations, Tokyo has significantly reduced subsidence rates. Photo credit: Getty

In the mid-20th century, unchecked groundwater extraction caused parts of Tokyo to sink more than 4 meters. However, rigorous regulations on groundwater use and investments in alternative water sources resulted in a swift decrease in subsidence rates.

“Authorities have enacted legislation to ensure sufficient alternative water supplies and eliminate groundwater extraction,” Nicholls remarked. “And almost overnight, this led to stabilization.”

Additional solutions include managed flooding in agricultural areas to replenish soil sediments. “Sediment is often deemed a pollutant,” Nicholls points out. However, when rivers overflow they deposit the very material that built the delta, a resource sometimes referred to as “brown gold.”

Urban areas can be fortified with effective engineering solutions such as sea walls, levees, and storm surge barriers. “Addressing subsidence complements efforts to adapt to sea level rise and reduces vulnerabilities,” Nicholls added.

Shifting Attitudes Towards Coastal Risk

The study’s authors emphasize that land subsidence has been dangerously overlooked in global climate risk strategies, largely viewed as a local rather than a global issue.

However, local does not equate to minor. Even under severe climate scenarios, land subsidence is expected to remain the primary driver of relative sea level rise in numerous delta regions for decades to come.

Financial and institutional barriers often hinder large-scale interventions in many areas, but deferring action only exacerbates the costs and challenges of future adaptations.

Once land has subsided, the loss is effectively permanent, leaving communities to face tough decisions about protection or relocation.

As Nicholls succinctly states, “The first crucial step is to acknowledge that a problem exists.”

Source: www.sciencefocus.com

How Objective Reality Emerges from Quantum Ambiguity

We Can Usually Agree on How Objects Appear, But Why?

Reflection in a canal. Image credit: Martin Bond / Alamy

Although our world seems inherently ambiguous at the quantum level, this is not the experience we face in daily life. Researchers have now established a methodology to measure the speed at which objective reality emerges from this quantum ambiguity, lending credibility to the notion that an evolutionary framework can elucidate this emergence.

In the quantum domain, each entity, such as a single atom, exists within a spectrum of potential states and only assumes a definitive, “classical” state upon measurement or observation. Yet, we perceive strictly classical objects devoid of existential ambiguities, and the processes enabling this have challenged physicists for years.

Prominent physicist Wojciech Zurek of Los Alamos National Laboratory in New Mexico introduced the concept of “quantum Darwinism,” suggesting that a process akin to natural selection lets the “fittest” of an entity’s many possible states survive: that state is copied repeatedly into the surrounding environment, and those copies are what reach observers. When observers with access to only fragments of the environment converge on the same objective observation, it is because each is witnessing one of these identical copies.

Researchers at University College Dublin, led by Steve Campbell, have shown that differing observers can still arrive at a consensus on objective reality, even if their observational methods lack sophistication or precision.

“Observers can capture a fragment and make any measurements they desire. If I capture a different fragment, I too can make arbitrary measurements. The question becomes: how does classical objectivity arise?” he explains.

The research team recast the emergence of objectivity as a quantum sensing problem. If the objective fact is, say, the frequency of light emitted by an object, an observer must acquire accurate information about that frequency, much as a computer’s light sensor does. In optimal conditions, such a measurement is ultra-precise and quickly yields a definitive conclusion about the light’s frequency. The benchmark for this is the quantum Fisher information (QFI), a quantity that sets the best precision any measurement strategy can achieve and against which less accurate observational techniques can be compared, explains co-author Gabriel Landi at the University of Rochester.
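The precision benchmark mentioned here can be stated compactly. This is the standard quantum-metrology formulation of the quantum Cramér-Rao bound, given for context rather than taken from the paper itself:

```latex
% No unbiased estimator of a parameter \theta (e.g. the light's
% frequency) from \nu measurement repetitions can beat the precision
% set by the quantum Fisher information F_Q(\theta):
\mathrm{Var}(\hat{\theta}) \;\ge\; \frac{1}{\nu\, F_Q(\theta)},
\qquad
F_Q(\theta) \;=\; 4\left( \langle \partial_\theta \psi \,|\, \partial_\theta \psi \rangle
  \;-\; \bigl| \langle \psi \,|\, \partial_\theta \psi \rangle \bigr|^2 \right)
\quad \text{for a pure state } |\psi_\theta\rangle .
```

Comparing the information a given crude measurement extracts against this bound is what lets the team quantify how quickly imperfect observers converge on the same answer.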

Remarkably, their calculations indicate that for significantly large fragments of reality, even observers employing imperfect measurements can ultimately gather enough data to reach the same conclusions about objectivity as those derived from the ideal QFI standard.

“Surprisingly, simplistic measurements can be just as effective as more advanced ones,” Landi states. “This illustrates how classicality emerges: as fragments grow larger, observers tend to agree even when using basic measurements.” Thus, this research contributes further to our understanding of why, when observing the macroscopic world, we concur about its physical attributes, such as the color of a coffee cup.

“This study underscores that we do not require flawless, ideal measurements,” adds Diego Wisniacki from the University of Buenos Aires, Argentina. He notes that while QFI is foundational in quantum information theory, its application to quantum Darwinism has been sparse, presenting pathways to bridge theoretical frameworks with established experimental methodologies, like quantum devices utilizing light-based or superconducting qubits.

“This research serves as a foundational ‘brick’ in our comprehension of quantum Darwinism,” states G. Massimo Palma from the University of Palermo, Italy. “It more closely aligns with the experimental descriptions of laboratory observations.”

Palma elaborates that the simplicity of the model used in this study could facilitate new experimental pursuits; however, complex system calculations will be essential to solidify quantum Darwinism’s foundation. “Advancing beyond rudimentary models would mark a significant progression,” Palma asserts.

Landi said the researchers are eager to turn these theoretical findings into experimental tests. For instance, qubits formed from trapped ions could be used to compare the timescale on which objectivity emerges against the time over which the qubits retain their quantum character.

Source: www.newscientist.com

Unraveling Free Will: A Deep Dive into the Mystery – New Book Release


Understanding Palantir’s Impact

Palantir, a leading American data analytics firm, wields technology capable of both saving and taking lives. As its influence expands globally, concerns about this enigmatic corporation’s role in world affairs and its ultimate beneficiaries continue to rise.

The Hidden Female Psychopath

Recent studies indicate that female psychopaths may be more prevalent than once believed. If so, why do they remain unnoticed? Perhaps you suspect someone around you. Here’s how to identify the potential traits.

Artificial Intelligence Ethics

There is an urgent need to educate AI on moral principles. However, a paradox emerges: to elicit positive responses from AI, one must examine its behavior when exposed to malicious tasks.

Data Storage in Space

The rapid progression of AI technology is driving an unprecedented demand for electricity globally. Additionally, cooling these data centers requires significant amounts of water. Could the cosmos offer a viable solution for data storage? Many startups believe it is the ideal destination.

Plus Highlights

  • Boost Your IQ: Ditch the brain training games. Physical activity could truly unlock your brain’s full potential.
  • Impact of Social Media Bans: Experts are split on how effective Australia’s social media ban is for children.
  • Q&A Insights: Our experts tackle questions such as “Why do we kiss?” “How contagious is laughter?” “Can tigers get along with their prey?” “What are the similarities between identical twins?” “Is déjà vu unhealthy?” “Should you trim your eyelashes?” “What happens if you fall ill on the ISS?” “How do we best measure earthquakes?” “Can you maintain a happy marriage with a psychopath?” “How fast am I moving now?” and much more…

Issue No. 429 – Released on January 27, 2026

Subscribe to BBC Science Focus Magazine

Don’t forget, BBC Science Focus is also available on all major digital platforms. You can access it on Android, Kindle Fire and Kindle e-Readers, as well as on your iOS app for iPad and iPhone.

Source: www.sciencefocus.com

Solution to BBC Science Focus Crossword #429

Meet Holly, a dedicated staff writer at BBC Science Focus, where she expertly manages the engaging Q&A section. With an MSc (Special Award) in Earth Sciences (Space and Climate Physics) from UCL, Holly specializes in Astronomy and Earth Sciences. Before her journey with Our Media, she gained valuable experience as a geo-environmental consultant and engineer, passionately exploring exoplanets in her free time while advising on ground risk and remediation projects in Northern England.

With nearly a decade of experience as a regional editor for a popular theater website, Holly excels in curating and developing digital content. She is also a talented artist and illustrator, regularly contributing to the craft website Gathered. Her impressive portfolio includes collaborations with notable organizations such as RSPB, English Heritage, Disney, Pilot, and Brother, in addition to her work with BBC Good Food Magazine, Home Style Magazine, and Papercraft Inspiration Magazine.

Holly’s interests extend to photography and a fascination with antiques, showcasing her diverse artistic talents and love for culture.

Source: www.sciencefocus.com

Webb Telescope Explores a Lensing Galaxy Cluster in the Constellation Leo

Webb astronomers have unveiled a breathtaking image captured by the NASA/ESA/CSA James Webb Space Telescope, showcasing MACS J1149.5+2223 (MACS J1149), a cosmic collection of hundreds of galaxies situated about 5 billion light-years from Earth in the constellation Leo. The latest images not only highlight the cluster’s brilliant galaxies but also illustrate how their immense gravitational forces uniquely affect the fabric of space-time.



The stunning image of the galaxy cluster MACS J1149.5+2223. Image credits: NASA / ESA / CSA / Webb / C. Willott, National Research Council Canada / R. Tripodi, INAF-Astronomical Observatory of Rome.

The latest Webb image of MACS J1149 dramatically showcases light from background galaxies, bent and magnified in a remarkable phenomenon known as gravitational lensing. This creates elongated arcs and distorted shapes, revealing the cluster’s immense mass.

“The immense gravity of this galaxy cluster does more than hold its galaxies together,” the Webb astronomers explained in a statement.

“As light from galaxies beyond the cluster travels toward our telescope over billions of years, its trajectory through space-time is warped by the gravitational forces of the intervening galaxies.”

This gravitational lensing effect is evident throughout the image of MACS J1149, with galaxies appearing stretched into narrow streaks and others morphing into unusual shapes. A prime example of gravitational lensing can be seen near the image’s center, just below the prominent white galaxy.

In this area, a galaxy with spiral arms has been transformed into a shape resembling a pink jellyfish. This peculiar galaxy once harbored the farthest single star ever identified and a supernova that appeared four times simultaneously.

This remarkable image of MACS J1149 is part of the Canadian NIRISS Unbiased Cluster Survey (CANUCS) program.

“This program employs Webb’s advanced instruments to explore the evolution of low-mass galaxies in the early Universe, shedding light on their star formation, dust content, and chemical makeup,” the astronomers stated.

The data collected will also assist researchers in studying the epoch of reionization, when the first stars and galaxies illuminated the universe, mapping mass distributions in galaxy clusters, and understanding how star formation diminishes within cluster environments.

Source: www.sci.news

Discover How Genes Connect Intestinal Motility to Vitamin B1: An Unexpected Nutrient Link

In a groundbreaking study analyzing data from over 268,000 individuals, researchers have identified that genes associated with thiamine (vitamin B1) metabolism significantly influence intestinal motility. This discovery paves the way for personalized treatments targeting conditions like constipation and irritable bowel syndrome (IBS).

Diaz Muñoz et al. identified key mechanisms involved in intestinal motility, including an overlooked role for vitamin B1. Image credit: Hillman et al., doi: 10.1264/jsme2.ME17017 / CC BY 4.0.

Gastrointestinal motility is crucial for food digestion, nutrient absorption, and waste elimination, all critical components of human health and well-being.

The regulation of motility depends on a multifaceted communication network, which encompasses the gut-brain axis, the immune system, gut microbiota, and is affected by external influences such as diet, physical activity, and medications.

Disruptions in motility control and peristalsis can lead to significant health issues, including IBS and chronic idiopathic intestinal pseudoobstruction, highlighting the importance of understanding these conditions.

In this recent study, Professor Mauro D’Amato from LUM University, CIC bioGUNE-BRTA, and Ikerbasque, along with his colleagues, employed a large-scale genetic approach to identify common DNA variations linked to intestinal motility.

The research utilized questionnaires and genetic data from 268,606 individuals of European and East Asian ancestry, applying computational analysis to pinpoint relevant genes and mechanisms.

The team discovered 21 genomic regions that affect defecation frequency, including 10 previously unknown regions, pointing to biologically plausible pathways involved in the regulation of intestinal motility.

For instance, they found significant correlations with bile acid regulation, which aids fat digestion and serves as signaling molecules in the intestines, along with neural signaling pathways crucial for intestinal muscle contractions (especially acetylcholine-related signaling).

However, the most striking outcome arose when the researchers pinpointed two high-priority genes tied to vitamin B1 biology, specifically the transport and activation of thiamine: SLC35F3 and XPR1.

To validate the relevance of the vitamin B1 signal, they further examined dietary data from the UK Biobank.

In a subset of 98,449 participants, increased dietary thiamine intake correlated with more frequent bowel movements.

Crucially, the relationship between thiamine consumption and bowel frequency exhibited variations based on genetic factors, specifically the combined genetic score of SLC35F3 and XPR1.

This suggests that genetic variations in thiamine metabolism may impact how vitamin B1 intake affects bowel habits in the general population.
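A combined genetic score of the kind described above is typically a weighted sum of risk-allele counts across variants. The sketch below illustrates the idea only: the variant names, effect-size weights, and genotypes are hypothetical placeholders, not values from the study.

```python
# Hypothetical two-gene score in the spirit of the SLC35F3 + XPR1
# combined score; weights are assumed effect sizes, not study values.
weights = {"SLC35F3_variant": 0.12, "XPR1_variant": 0.08}

def genetic_score(genotype: dict) -> float:
    """genotype maps variant name -> risk-allele count (0, 1, or 2)."""
    return sum(weights[v] * genotype.get(v, 0) for v in weights)

# Someone carrying two copies of one variant and one of the other:
person = {"SLC35F3_variant": 2, "XPR1_variant": 1}
print(round(genetic_score(person), 2))  # 0.32
```

Stratifying participants by such a score is how a diet-by-genotype interaction, like the one reported for thiamine intake, can be tested.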

“By utilizing genetic data, we’ve created a roadmap for the biological pathways influencing intestinal pace,” said Dr. Cristian Díaz Muñoz from CIC bioGUNE-BRTA.

“The data strongly highlights vitamin B1 metabolism alongside established mechanisms like bile acids and neural signaling.”

This research also confirms a significant biological link between bowel frequency and IBS, a widespread condition affecting millions globally.

“Issues with intestinal motility are at the core of irritable bowel syndrome, constipation, and other common motility disorders, yet the underlying biology remains challenging to decipher,” noted Professor D’Amato.

“These genetic findings point to specific pathways, particularly those involving vitamin B1, as vital areas for further research, including laboratory experiments and meticulously designed clinical trials.”

For more details, refer to the study published in the journal Gut on January 20, 2026.

_____

C. Diaz Muñoz et al. Genetic analysis of defecation frequency suggests a link to vitamin B1 metabolism and other pathways regulating intestinal motility. Gut, published online January 20, 2026. doi: 10.1136/gutjnl-2025-337059

Source: www.sci.news

Discovering the Shining Nebula: A Stellar Cradle of New Stars

By studying the Vela Junior supernova remnant, also referred to as RX J0852.0-4622 or G266.2-1.2, scientists have unraveled the mysteries surrounding its explosive past. This ancient nebula, the remains of a once-brilliant supernova, has long perplexed researchers regarding its distance and the magnitude of its explosion. Recently, however, groundbreaking observations linked a newly formed star, Ve 7-27, with the remnant of Vela Junior. Using the Multi-Unit Spectroscopic Explorer (MUSE) on ESO’s Very Large Telescope, astronomers have captured unprecedentedly detailed images of Ve 7-27.



VLT/MUSE image of Ve 7-27. Image credit: ESO / Suherli et al.

“This is the first evidence ever connecting a newborn star to the remnants of a supernova,” stated Dr. Samar Safi-Harb, an astrophysicist at the University of Manitoba.

“This discovery resolves a decades-long debate, enabling us to calculate the distance of Vela Junior, its size, and the true power of the explosion.”

By examining the gas emissions from Ve 7-27, Dr. Safi-Harb and her team confirmed that it shares the same chemical signature as material from the Vela Junior supernova.

This correlation established a physical connection between the two celestial bodies, allowing astronomers to accurately determine Vela Junior’s distance.

Both Ve 7-27 and Vela Junior are approximately 4,500 light-years away.

“The gas present in this young star mirrors the chemical composition of stars that exploded in the past,” remarked Dr. Safi-Harb.

“Isn’t it poetic? Those same elements eventually contributed to Earth and now play a role in forming new stars.”

Recent findings indicate that Vela Junior is larger, more energetic, and expanding more quickly than previously thought, marking it as one of the most potent supernova remnants in our galaxy.

“Stars are constructed in layers, much like onions,” Dr. Safi-Harb explained. “When they explode, these layers are propelled into space.”

“Our research indicates that these layers are now becoming visible in the jets of nearby young stars.”

“This study not only solves an enduring astronomical enigma but also sheds light on stellar evolution, the enrichment of galaxies with elements, and how extreme cosmic events continue to shape our universe.”

This research was published today in The Astrophysical Journal Letters.

Source: www.sci.news

New Analysis of Lunar Regolith: Challenging Meteorite and Water Formation Theories

Planetary scientists examining oxygen isotopes in lunar soil from the Apollo missions have determined that 4 billion years of meteorite impacts may have contributed only a minimal amount of Earth’s water. This insight prompts a reevaluation of established theories regarding water’s origins on our planet.



Close-up of a relatively new crater to the southeast, captured during Apollo 15’s third lunar walk. Image credit: NASA.

Previous research suggested that meteorites significantly contributed to Earth’s water supply due to their impact during the solar system’s infancy.

In a groundbreaking study, Dr. Tony Gargano from NASA’s Johnson Space Center and the Lunar and Planetary Institute, along with colleagues, employed a novel technique to analyze the lunar surface debris known as regolith.

Findings indicated that even under optimistic conditions, meteorite collisions from approximately 4 billion years ago may have delivered only a small percentage of Earth’s water.

The Moon acts as a historical archive, documenting the tumultuous events that the Earth-Moon system has endured over eons.

While Earth’s dynamic geology and atmosphere erase these records, lunar samples have retained valuable information.

However, this preservation is not without its challenges.

Traditional regolith studies have focused on siderophile (“metal-loving”) elements, whose signals can be obscured by continuous impacts on the Moon, complicating efforts to reconstruct the original compositions of the impacting meteorites.

Oxygen triple isotopes offer highly precise “fingerprints”: oxygen is the most abundant element in rocky material, and its isotopic ratios survive the melting and reworking that scramble other tracers.

These isotopes facilitate a deeper understanding of the meteorite compositions that impacted the Earth-Moon system.

Oxygen isotope analyses revealed that approximately 1% of the regolith’s mass consists of carbon-rich material from meteorites that partially vaporized upon impact.

With this knowledge, researchers calculated the potential water content carried by these meteorites.
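The mass-balance step described above scales linearly: if a known fraction of the regolith is meteoritic debris with a known water content, the implied water delivery follows directly. In this sketch, the ~1% meteoritic fraction is from the article, while the impactor water content and regolith mass are illustrative assumptions:

```python
# Back-of-the-envelope water delivery from the meteoritic component
# of lunar regolith. Only the ~1% fraction comes from the article.
METEORITIC_FRACTION = 0.01   # ~1% of regolith mass, from the article
WATER_MASS_FRACTION = 0.10   # assumed water content of carbon-rich impactors

def water_delivered_kg(regolith_mass_kg: float) -> float:
    """Water mass (kg) implied by the meteoritic component of the regolith."""
    return regolith_mass_kg * METEORITIC_FRACTION * WATER_MASS_FRACTION

print(water_delivered_kg(1.0e6))  # kg of water per million kg of regolith
```

Scaling such per-mass figures up to the whole Earth-Moon impact flux is what lets the researchers compare the delivered total against the volume of Earth's oceans.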

“The lunar regolith uniquely allows us to interpret a time-integrated record of impacts in Earth’s vicinity over billions of years,” explained Dr. Gargano.

“By applying oxygen isotope fingerprints, we can extract impactor signals from materials that have undergone melting, evaporation, and reprocessing.”

This significant finding alters our understanding of water sources on both Earth and the Moon.

When adjusted to account for global impacts, the cumulative water indicated in the model equates to only a minor fraction of the Earth’s oceanic water volume.

This discrepancy challenges the theory that water-rich meteorites delivered the bulk of Earth’s water.

“Our results don’t rule out meteorites as a water source,” noted Dr. Justin Simon, a planetary scientist in NASA Johnson’s Astromaterials Research and Exploration Science Division.

“However, the Moon’s long-term record indicates that the slow influx of meteorites cannot significantly account for Earth’s oceans.”

While the implied water contribution from around 4 billion years ago is minimal in the context of Earth’s oceans, it remains notable for the Moon.

The Moon’s available water is concentrated in small, permanently shadowed areas at the poles.

These regions, among the coldest in the solar system, present unique opportunities for scientific research and exploration resources as NASA prepares for crewed missions to the Moon with Artemis III and subsequent missions.

The samples analyzed in this study were collected from near the lunar equator, where all six Apollo missions landed.

Rocks and dust gathered over half a century ago continue to yield valuable insights, albeit from a limited lunar area.

Future samples collected through Artemis are expected to unlock a new wave of discoveries in the years ahead.

“I consider myself part of the next generation of Apollo scientists, trained in the questions and insights enabled by the Apollo missions,” said Dr. Gargano.

“The Moon provides tangible evidence that we can examine in the lab, serving as a benchmark for what we learn from orbital data and telescopes.”

“I eagerly anticipate the information that upcoming Artemis samples will reveal about our place in the solar system.”

The findings of this study will be published in Proceedings of the National Academy of Sciences.

_____

Anthony M. Gargano et al. 2026. Constraints on impactor flux to the Earth-Moon system from lunar regolith oxygen isotopes. PNAS 123 (4): e2531796123; doi: 10.1073/pnas.2531796123

Source: www.sci.news

New Titanosaur Species Discovered by Paleontologists in Argentina

A newly identified genus and species of titanosaur, a colossal sauropod dinosaur of the Cretaceous period, has been described from fossils unearthed in northern Patagonia, Argentina.



Life reconstruction of Yenen Hassai. Image credit: Gabriel Rio.

Named Yenen Hassai, this new species roamed Earth approximately 83 million years ago during the Late Cretaceous period.

This ancient creature belongs to the Titanosauridae, a fascinating group of large, long-necked herbivorous dinosaurs that thrived on the Gondwana supercontinent.

“The head of Yenen Hassai was proportionately smaller compared to its massive body,” explained Dr. Leonardo Filippi, a paleontologist from CONICET and the Urquiza Municipal Museum in Argentina.

“This titanosaur measured between 10 to 12 meters (33 to 39 feet) in length and weighed approximately 8 to 10 tons.”

The fossil remains of Yenen Hassai were excavated from the Bajo de la Carpa Formation at a site known as Cerro Overo-La Invernada in Neuquén, Patagonia, Argentina.

This material showcases one of the most complete titanosaur skeletons found in the region, preserving six cervical vertebrae, ten dorsal vertebrae with associated ribs, the sacrum, and the first caudal vertebra.

Alongside the holotype, researchers identified remains of at least two additional sauropods at the site, including a juvenile specimen and another adult titanosaur, which may belong to an unclassified species.

“Through phylogenetic analysis, Yenen Hassai is found to be closely related to Narambuenatitan and Overosaurus, as a basal member of an unnamed clade of derived non-lithostrotian saltasauroids,” they noted.

“Evidence from the titanosaur fauna at Cerro Overo-La Invernada indicates that species diversity was relatively high during the Santonian, with at least two lineages, colossosaurs and saltasauroids, coexisting.”

“This discovery positions the Cerro Overo-La Invernada region as the area with the highest diversity of titanosaurs during the Santonian of the Neuquén Basin, offering crucial insights into the evolution of dinosaur faunas in this era.”

The discovery is detailed in a research paper published in the journal Historical Biology on January 12, 2026.

_____

L.S. Filippi et al. Yenen Hassai: An Overview of Sauropod Titanosaur Diversity from the Cerro Overo-La Invernada Region (Bajo de la Carpa Formation, Santonian), Northern Patagonia, Argentina. Historical Biology, published online January 12, 2026; doi: 10.1080/08912963.2025.2584707

Source: www.sci.news

How Trace Gases Fuel Diverse Microbial Life in Caves – Sciworthy


Caves are often dark, damp, and remote. While they lack the nutrients and energy sources that sustain life in other ecosystems, they still host a diverse array of bacteria and archaea. But how do these microorganisms acquire enough energy to thrive? A team of researchers from Australia and Europe investigated this intriguing question by examining Australian caves.

Previous studies showed that microorganisms in nutrient-poor soils can harness energy from the atmosphere by consuming trace gases, including hydrogen, carbon monoxide, and methane, which are present only in minute quantities. Microbes possess specific proteins, such as hydrogenases, dehydrogenases, and monooxygenases, that accept electrons from these gas molecules, enabling the microbes to use the gases to fuel their metabolic processes.

The Australian research team hypothesized that cave-dwelling microbes may be using trace gases for survival. To test this, they studied four ventilated caves in southeastern Australia. The researchers collected sediment samples at four points along a horizontal line that extended from the cave entrance to 25 meters (approximately 80 feet) deep inside the cave, resulting in a total of 94 sediment samples.

The team treated the sediment samples with specific chemicals to extract microbial DNA, using it to identify both the abundance and diversity of microorganisms present. They found multiple groups of microorganisms throughout the cave, including Actinobacteria, Proteobacteria, Acidobacteria, Chloroflexota, and Thermoproteota. Notably, the density and diversity of microbes were significantly higher near the cave entrance, with three times more microorganisms in those regions compared to further inside.

The team used gene sequencing to search the microbial DNA for genes linked to trace gas consumption. The results revealed that 54% of cave microorganisms carried genes coding for gas-consuming proteins such as hydrogenases, dehydrogenases, and monooxygenases.

To assess the generality of their findings, the researchers searched existing data on microbial populations from 12 other ventilated caves worldwide. They discovered that genes for trace gas consumption were similarly prevalent among other cave microorganisms, concluding that trace gases might significantly support microbial life and activity in caves.

Next, the researchers measured gas concentrations within the caves. They deployed static flux chambers to collect atmospheric gas samples at four points along the sampling line, capturing 25 milliliters (about 0.8 fluid ounces) of gas each time. Analyzing the samples with a gas chromatograph, they found that concentrations of hydrogen, carbon monoxide, and methane were approximately four times higher near the cave entrance than in deeper areas, consistent with microorganisms consuming these trace gases as air moves into the cave.

To validate these findings, they set up a static flux chamber in the lab and incubated cave sediment with hydrogen, carbon monoxide, and methane at natural concentrations, confirming that the microbes consumed the trace gases under controlled conditions as well.

Finally, the researchers explored how these cave microbes obtain organic carbon. They conducted carbon isotope analysis, comparing carbon-12 and carbon-13 ratios, which vary with microbial metabolic processes. Using an isotope ratio mass spectrometer, they found that the cave bacteria contained proportionally less carbon-13, consistent with carbon fixation fueled by trace-gas metabolism.
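Carbon isotope comparisons like this are conventionally reported in delta notation relative to the VPDB standard. The sketch below shows the arithmetic; the VPDB ratio is the accepted standard value, but the sample ratios are invented for illustration and are not the study's measurements:

```python
# Delta notation for carbon isotope ratios: delta-13C (in per mil) expresses
# how far a sample's 13C/12C ratio deviates from a reference standard.
VPDB_RATIO = 0.0112372  # 13C/12C of the Vienna Pee Dee Belemnite (VPDB) standard

def delta13c(ratio_13c_12c: float) -> float:
    """Return delta-13C in per mil relative to VPDB."""
    return (ratio_13c_12c / VPDB_RATIO - 1.0) * 1000.0

# Illustrative (not measured) ratios: carbon-fixing enzymes discriminate
# against the heavier 13C, so biomass built from trace-gas metabolism ends up
# with a more negative delta-13C than atmospheric CO2.
print(round(delta13c(0.0111473), 1))  # near atmospheric CO2, about -8 per mil
print(round(delta13c(0.0107877), 1))  # strongly 13C-depleted biomass
```

A lower (more negative) delta-13C in the cave bacteria is what pointed the team toward trace-gas-driven carbon fixation rather than uptake of photosynthetically derived organic matter.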

The researchers concluded that atmospheric trace gases serve as a crucial energy source for microbial communities in caves, fostering a diverse array of microorganisms. They recommended that future studies examine how climatic changes, such as fluctuations in temperature and precipitation, might influence the use of atmospheric trace gases by cave-dwelling microorganisms.


Source: sciworthy.com

CRISPR: Revolutionizing Genetic Code Editing – The Most Innovative Idea of the Century


“The pain was like being struck by lightning and being hit by a freight train at the same time,” Victoria Gray told New Scientist. “Everything has changed for me now.”

Gray once endured debilitating symptoms of sickle cell disease, but in 2019, she found hope through CRISPR gene editing, a pioneering technology enabling precise modifications of DNA. By 2023, this groundbreaking treatment was officially recognized as the first approved CRISPR therapy.

Currently, hundreds of clinical trials are exploring CRISPR-based therapies. Discover the ongoing trials that signify just the beginning of CRISPR’s potential. This revolutionary tool is poised to treat a wide range of diseases beyond just genetic disorders. For example, a single CRISPR dose may drastically lower cholesterol levels, significantly reducing heart attack and stroke risk.

While still in its infancy regarding safety, there’s optimism that CRISPR could eventually be routinely employed to modify children’s genomes, potentially reducing their risk of common diseases.

Additionally, CRISPR is set to revolutionize agriculture, facilitating the creation of crops and livestock that resist diseases, thrive in warmer climates, and are optimized for human consumption.

Given its transformative capabilities, CRISPR is arguably one of the most groundbreaking innovations of the 21st century. Its strength lies in correcting genetic “misspellings.” This involves precisely positioning the gene-editing tool within the genome, akin to placing a cursor in a lengthy document, before making modifications.

Bacteria use this genetic editing mechanism as a defense against viruses. Before 2012, researchers had identified various natural gene-editing proteins, each limited to targeting a single location in the genome. Altering the target sequence required redesigning the protein’s DNA-binding section, a time-consuming process.

However, scientists discovered that bacteria have developed a diverse range of gene-editing proteins that bind to RNA—a close relative of DNA—allowing faster sequence matching. Producing RNA takes mere days instead of years.

In 2012, Jennifer Doudna and her team at the University of California, Berkeley, along with Emmanuelle Charpentier from the Max Planck Institute for Infection Biology, revealed the mechanics of one such gene-editing protein, CRISPR-Cas9. By simply supplying a “guide RNA” in the right format, they could target any desired sequence.

Today, thousands of variants of CRISPR are in use for diverse applications, all relying on guide RNA targeting. This paradigm-shifting technology earned Doudna and Charpentier the Nobel Prize in 2020.


Source: www.newscientist.com

The Vital Role of Our Microbiome: The Century’s Best Idea for Health


“The gut microbiome has transformed our understanding of human health,” says Tim Spector of King’s College London, co-founder of the Zoe nutrition app. “We now recognize that microbes play a crucial role in metabolism, immunity, and mental health.”

Although significant advancements in microbiome research have surged in the past 25 years, humans have a long history of utilizing microorganisms to enhance health. The Romans, for instance, employed bacterial-based treatments to “guard the stomach” without comprehending their biological mechanisms.

In the 17th century, microbiologist Antony van Leeuwenhoek made the groundbreaking observation of the parasite Giardia in his own stool. It took scientists another two centuries to confirm his discoveries, and not until the 21st century did the profound impact of gut and skin microbes on health become fully evident.

By the 1970s, researchers had determined that gut bacteria can influence the breakdown of medications, potentially modifying their efficacy. Fecal transplant studies hinted at how microbial communities could restore health. However, it was the rapid advancement of gene sequencing and computing in the 2000s that truly revolutionized the field. Early sequencing surveys revealed that every individual possesses a distinct microbial “fingerprint” of bacteria, viruses, fungi, and archaea.

In the early 2000s, groundbreaking studies illustrated that the microbiome and immune system engage in direct communication. This collaboration reshapes the microbiome’s role as a dynamic participant in our health, impacting a wide range of systems, from the pancreas to the brain.

Exciting findings continue to emerge; fecal transplants are proving effective against Clostridium difficile infections, while microorganisms from obese mice can induce weight gain in lean mice. Some bacterial communities have shown potential to reverse autism-like symptoms in mice. Recently, researchers have even suggested that microbial imbalances could trigger diabetes and Parkinson’s disease. “Recent insights into the human microbiome indicate its influence extends far beyond the gut,” states Lindsay Hall from the University of Birmingham, UK.

Researchers are gaining a clearer understanding of how microbial diversity is essential for health and how fostering it may aid in treating conditions like irritable bowel syndrome, depression, and even certain cancers. Studies are also investigating strategies to cultivate a healthy microbiome from early life, which Hall believes can have “profound and lasting effects on health.”

In just a few decades, the microbiome has evolved from an obscure concept to a pivotal consideration in every medical field. We are now entering an era that demands rigorous testing to differentiate effective interventions from overhyped products, all while shaping our approach to diagnosing, preventing, and treating diseases.


Source: www.newscientist.com

Ancient Wooden Tool: The Oldest Known Stick Shaped by Early Humans

Reconstruction of a Paleolithic woman crafting wooden tools

Credit: G. Prieto; K. Harvati

Remarkably, some of the oldest known wooden tools have been unearthed in an open-pit mine in Greece, dating back 430,000 years. These artifacts were likely crafted by an ancient human ancestor, potentially related to Neanderthals.

Archaeologists note that prehistoric wooden artefacts are “extremely rare.” According to Dirk Leder from the Lower Saxony Cultural Heritage Office in Hannover, Germany, any new findings in this area are highly valued.

Evidence suggests our extinct relatives may have utilized wooden tools for millions of years. “This could be the oldest type of tool ever used,” states Katerina Harvati from the University of Tübingen, Germany. Unfortunately, the preservation of wooden artifacts is often poor, hindering our understanding of their use.

Harvati and her team discovered the tool at a site called Marathusa 1, first identified in 2013 in the Megalopolis Basin of southern Greece. The open-pit lignite mine there exposes sediment layers nearly a million years old, giving researchers rare access for dating and study.

From 2013 to 2019, excavations yielded not only tools but also the skeleton of a straight-tusked elephant (Paleoloxodon antiquus), more than 2,000 stone tools, and remains of varied flora and fauna, together depicting an ancient lakeshore ecosystem.


To date Marathusa 1, researchers relied on various methods, including analyzing fossil footprints and historical changes in the Earth’s magnetic field. By 2024, they had confirmed that the artefacts are around 430,000 years old, a time marked by challenging climatic conditions: the most severe ice age of the Pleistocene in Europe. The Megalopolis Basin likely provided refuge thanks to its relatively temperate climate.

The archaeological team identified two significant wooden tools among the 144 artifacts. The first, an 81 cm long pole made from alder, exhibits marks indicative of intentional shaping. One end appears rounded, possibly serving as a handle, while the other is flattened, hinting at potential use for digging underground tubers or perhaps for butchering elephant carcasses. Harvati admits uncertainty about its exact application.

Mysterious second wooden tool from Marathusa 1

Credit: N. Thompson; K. Harvati

The second tool remains enigmatic, measuring just 5.7 cm in length and made from willow or poplar. It also shows signs of intentional shaping after the bark was removed. According to Harvati, this represents a completely new type of wooden tool. While it might have served to modify stone tools, the specific purpose remains a mystery.

Leder points out that while the first tool is a clear example of wooden craftsmanship, questions remain about the functionality of the second. “Is this a complete item or part of something larger?” he muses.

No hominid remains have been found at Marathusa 1. Given its age, the site predates our species and is likely too early even for Neanderthals. “The prevailing hypothesis suggests this site might be associated with pre-Neanderthal humans or Homo heidelbergensis.” However, Harvati cautions against definitive conclusions, noting that Greece was frequented by various hominin groups.

Other ancient wooden tools, like the Clacton spear discovered in Britain, are estimated to be about 400,000 years old, while a wooden spear from Schöningen, Germany, has been dated using multiple methods to around 300,000 years. The only tools that predate those found at Marathusa 1 are from Kalambo Falls in Zambia, which date back 476,000 years and resemble remains of larger structures or buildings.



Source: www.newscientist.com

Why Crowdsourcing Wikipedia is the Most Revolutionary Idea of the Century


In today’s digital landscape, hostility often overshadows collaboration. Remarkably, Wikipedia—a publicly editable encyclopedia—has emerged as a leading knowledge resource worldwide. “While it may seem improbable in theory, it remarkably works in practice,” states Anusha Alikhan from the Wikimedia Foundation, the nonprofit behind Wikipedia.

Founded by Jimmy Wales in 2001, Wikipedia continues to thrive, although co-founder Larry Sanger left the project the following year and has since expressed ongoing criticism, claiming it is “overrun by ideologues.”

Nonetheless, Sanger’s opinions are not widely echoed. Wikipedia boasts over 64 million articles in more than 300 languages, drawing some 15 billion visits monthly, and ranks as the 9th most visited website globally. “No one could have anticipated it would become such a trusted online resource, yet here we are,” Alikhan commented.

Building trust on a massive scale is no small achievement. Although the Internet has democratized access to human knowledge, it often presents fragmented and unreliable information. Wikipedia disrupts this trend by allowing anyone to contribute, supported by approximately 260,000 volunteers worldwide, making an impressive 342 edits per minute. A sophisticated system grants broader editing rights to responsible contributors, fostering trust that encourages collaboration even among strangers.

Wikipedia also actively invites special interest groups to create and edit content. For instance, the Women in Red project tackles gender disparities, while other initiatives focus on climate change and the history of Africa. All articles uphold strict accuracy standards, despite critics like Sanger alleging bias.

As an anomaly in the technology sector, Wikipedia operates without advertising, shareholders, or profit motives. It has maintained this unique position for over two decades with great success.

However, the rise of artificial intelligence poses new challenges: AI can generate misleading content, strain Wikipedia’s infrastructure through large-scale scraping for training data, and reduce traffic and donations as AI-driven search summaries answer readers’ questions directly.


Source: www.newscientist.com

Unveiling the Ultimate Dark Matter Map: Discovering Unprecedented Cosmic Structures

dark matter distribution

Dark Matter Distribution: Hubble vs. James Webb

Credit: Dr. Gavin Leroy/Professor Richard Massey/COSMOS-Webb Collaboration

In a groundbreaking study, scientists leveraged subtle distortions in the shapes of over 250,000 galaxies to construct the most detailed dark matter map to date, paving the way for insights into some of the universe’s greatest enigmas.

Dark matter, elusive by nature, does not emit any detectable light. Its existence can only be inferred through its gravitational interactions with normal matter. Researchers, including Jacqueline McCreary from Northeastern University, utilized the James Webb Space Telescope (JWST) to map a region of the sky larger than the full moon.

“This high-resolution image depicts the scaffold of a small segment of the universe,” noted McCreary. The new map boasts double the resolution of previous ones created by the Hubble Space Telescope, encompassing structures much farther away.

The researchers studied approximately 250,000 galaxies, but the galaxies’ own shapes are not the point. “These galaxies merely act as the universe’s wallpaper,” explained Liliya Williams from the University of Minnesota. The critical component is the way dark matter’s gravitational pull warps the light from these distant galaxies, a phenomenon known as gravitational lensing: the more a galaxy’s apparent shape is distorted from a perfect circle, the more dark matter lies between us and it.
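The averaging idea behind weak lensing can be illustrated with a toy calculation; the numbers and the simple one-component averaging below are illustrative assumptions, not the actual COSMOS-Web pipeline:

```python
import random

def ellipticity(a: float, b: float) -> float:
    """Ellipticity e = (a - b) / (a + b) for semi-major axis a and minor axis b."""
    return (a - b) / (a + b)

# A galaxy with axes 1.2 and 1.0 is about 9% elliptical.
print(round(ellipticity(1.2, 1.0), 3))

# Intrinsic galaxy shapes are randomly oriented, so one ellipticity component
# averages to zero over many galaxies; a coherent lensing shear imprinted by
# intervening dark matter survives the averaging (toy numbers).
random.seed(42)
TRUE_SHEAR = 0.02
observed = [random.gauss(0.0, 0.2) + TRUE_SHEAR for _ in range(250_000)]
recovered = sum(observed) / len(observed)
print(f"recovered shear ~ {recovered:.3f}")
```

With 250,000 galaxies, the random intrinsic scatter averages down far below the percent-level shear signal, which is why such large samples are needed.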

By analyzing these optical distortions, the team was able to derive a map illustrating massive galaxy clusters and the cosmic web filaments linking them. Many of these newly identified structures deviate from prior observations of luminous matter, suggesting they are predominantly composed of dark matter. “Gravitational lensing is one of the few and most effective techniques for detecting these structures across vast regions,” Williams stated.

This research is significant, considering that dark matter constitutes about 85% of the universe’s total matter, crucial for the formation and evolution of galaxies and clusters. Understanding its distribution could shed light on its behavior and composition, according to Williams.

“This achievement is not just observational but also paves the way for various analyses, including constraints on cosmological parameters, the relationship between galaxies and their dark matter halos, and their growth and evolution over time,” McCreary highlighted. These parameters include the strength of dark energy, the enigmatic force driving the universe’s accelerating expansion.

While initial findings from the JWST map align with the Lambda CDM model of the universe, McCreary emphasizes that a thorough analysis of the data is still required to unearth new insights. “At first glance, it appears consistent with Lambda CDM, but I remain cautious. A final assessment will depend on complete results.”


Source: www.newscientist.com

How Menstrual Pads Can Provide Women with Insights into Fertility Changes

Menstrual Pads: A Revolutionary Tool for Tracking Women’s Fertility

Shutterstock/Connect World

Innovative home tests integrated into menstrual pads are empowering women to monitor their fertility through menstrual blood. This non-invasive method eliminates the need for frequent blood tests or clinic visits.

For many women, understanding their fertility journey often remains elusive until they attempt to conceive. In case of any complications, clinical tests can offer vital information.

These tests are instrumental in assessing levels of anti-Mullerian hormone (AMH), a key indicator of “ovarian reserve”: the quantity of eggs remaining in a woman’s ovaries. In adults, AMH levels naturally decline with age; higher levels signify a robust supply of eggs, whereas lower levels may signal a reduced reserve or the early onset of menopause.

Traditionally, AMH measurement has involved either clinic-based blood tests or at-home finger-prick tests, both requiring lab analysis before results are available.

Recently, Lucas Dosnon from ETH Zurich and his team in Switzerland have created a user-friendly test utilizing menstrual blood for immediate results.

The test is a lateral flow assay, similar to a COVID-19 test, using tiny gold particles carrying antibodies that selectively bind to AMH. When the test strip is exposed to menstrual blood, the captured hormone produces a visible line whose darkness correlates with the AMH level.

While results can be estimated by eye, the researchers also developed a smartphone app that analyzes images of the test strip. When tested against menstrual blood samples with known AMH concentrations, the app’s readings aligned closely with clinical evaluations.
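Turning a line's darkness into a concentration requires a calibration step. A minimal sketch of such a mapping is below; the calibration points and the piecewise-linear interpolation are invented for illustration and are not taken from the ETH Zurich test:

```python
# Hypothetical calibration: map test-line intensity (arbitrary 0-255 scale,
# darker line = more AMH captured) to an AMH concentration in ng/mL.
# These points are invented; a real assay would be calibrated against
# lab-measured reference samples.
CALIBRATION = [  # (line_intensity, AMH ng/mL), sorted by intensity
    (20, 0.1),
    (80, 1.0),
    (150, 3.0),
    (220, 6.0),
]

def amh_from_intensity(intensity: float) -> float:
    """Piecewise-linear interpolation over the calibration points,
    clamped at both ends of the calibrated range."""
    pts = CALIBRATION
    if intensity <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if intensity <= x1:
            return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)
    return pts[-1][1]

print(amh_from_intensity(115))  # midway between the 80 and 150 points
```

An app would feed a measured line intensity (extracted from the strip photo) through a curve like this; clamping at the ends reflects that readings outside the calibrated range cannot be interpolated reliably.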

Moreover, the research team has seamlessly integrated this test into menstrual pads, enabling passive AMH level monitoring throughout menstruation. Over time, this approach may reveal trends in ovarian reserves that single tests could miss.

“We believe this research could be a game-changer for women’s health,” stated Dosnon, highlighting the potential for regular ovarian health screenings useful for various purposes, including during IVF and for diagnosing conditions outside of reduced ovarian reserve. Elevated AMH levels, for instance, can indicate polycystic ovarian syndrome and, in rare cases, granulosa cell tumors affecting the ovaries. “Menstrual blood is an underutilized resource with great potential in monitoring overall health,” Dosnon added.

Richard Anderson from the University of Edinburgh emphasizes the interpretation challenges that all home fertility tests face, noting that results can be complex to understand and that no AMH test assesses egg quality. He questions whether women will prefer this test over traditional methods: “Is obtaining a reliable blood test that much of a burden?”

In response, Dosnon clarified that the test isn’t designed to replace clinical evaluations but rather offers an alternative that addresses the challenges in women’s health monitoring and research, praised for its non-invasive nature, user-friendliness, and affordability.


Source: www.newscientist.com

Discover the Top 21 Innovative Ideas of the 21st Century: How We Selected Them and Why They Matter

What distinguishes a groundbreaking idea from a mediocre one? This is often a challenging distinction to make. Take the example of vaccination: collecting pus from a cowpox-infected individual and injecting it into an eight-year-old boy may seem utterly reckless. Yet, 18th-century physician Edward Jenner’s daring action ultimately led to the eradication of smallpox, a disease that plagued humanity.

With the benefit of hindsight, we recognize that Jenner’s innovation was monumental. This principle of vaccination continues to save millions of lives today. As we progress through the 21st century, we feel it’s essential to reflect on and celebrate transformative ideas from the past 25 years that are reshaping our perspectives, actions, and understanding of the world around us.

Compiling our list of the 21 most impactful ideas of the 21st century involved rigorous discussions among our editorial team. One of our initial challenges was determining if the first quarter of this century would conclude at the beginning or end of 2025. For clarity, we opted for the latter. We navigated debates on various ideas, dedicating particular attention to concepts like the microbiome—establishing it as a legitimate 21st-century notion—and scrutinizing the role of social media, which after much discussion, we deemed largely negative. Ultimately, we recognize that the quality of ideas is subjective.

We developed a robust set of criteria for our selection. To qualify for this list, a concept must already demonstrate a significant impact on our self-understanding, health, or broader universe. Additionally, it should be grounded in scientific discovery, with a strong idea underpinning it. Lastly, the development must have occurred within the last 25 years.


Rather than trying to predict the future, it’s important to take the time to reflect on the past.

While the last criterion may appear straightforward, we encountered numerous proposals that remain unrealized. The discovery of gravitational waves in the 21st century opened new cosmic vistas, but their prediction dates back a century to Albert Einstein. Similarly, ideas like weight loss medications, personalized medicine, and mRNA vaccines show promise, but their full potential has yet to be achieved—perhaps these will make the list in 2050.

During our selection process, we couldn’t disregard ideas that initially seemed appealing but faltered. Therefore, we also crafted a list of the five most disappointing ideas of the century thus far. The line between success and failure can sometimes blur, leading to controversial choices in our best ideas list. For instance, while many would advocate for the removal of smartphones, we ultimately view them as largely beneficial. Likewise, the ambitious global warming target of 1.5°C can be seen as a failure, especially as new reports indicate that average global temperatures have surpassed this benchmark for the first time. Nonetheless, we argue that striving to reduce the threshold from 2°C remains one of the century’s monumental ideas, setting a standard for global climate ambition.

Advancing away from fossil fuels is undoubtedly crucial, and prominently featured in this effort is Elon Musk. In 2016, before Musk ventured into social media and politics, his company Tesla launched its first Gigafactory in Nevada, marking a pivotal moment in the transition to renewable energy by utilizing economies of scale to transform transportation and energy systems. Conversely, other approaches to fighting climate change, such as alternative fuels and carbon offsets, appear more harmful than beneficial.

One significant takeaway from our selection process is that revolutionary ideas often arise by chance. For many of us, a working outlet means nothing more than a few minutes of smartphone scrolling on a lengthy commute; for two physicists in 2005, however, a chance discovery altered the global decarbonization strategy. Serendipity also unveiled the foundations of our complex thought processes, showing that brain regions don’t operate in isolation but are interwoven into a robust network, an understanding that has revolutionized our approach to diagnosing and treating neurological conditions.

Looking back over the past quarter-century, it’s evident that the world has transformed considerably. We successfully dodged the Millennium Bug, the human genome’s first draft was completed, and the International Space Station welcomed its first crew. Concepts like “Denisovans” and “microbiomes” were unknown to us. In our pages, we celebrated innovations like wireless communication and marveled at miniaturized computer chips driving these technologies. “At its core is a device known as a Bluetooth chip,” we stated, positing it as the next big thing—a prediction that, in hindsight, was flawed, since truly transformative technologies extend beyond mere convenience.

This experience highlights the folly of predictions, as they can often be overlooked in the rush for the next trending innovation. Thus, rather than striving to foresee the future, we ought to invest time in contemplating the past. The advancements we’ve witnessed in health, technology, and environmental conservation suggest that this century has made the world a better place. Let’s hope, without necessarily predicting, that this momentum continues into the future.

Source: www.newscientist.com

How Termination Shocks Could Intensify the Economic Impact of Climate Change

Solar geoengineering: A solution to save ice sheets with potential risks

Credit: Martin Zwick/REDA/Universal Images Group (via Getty Images)

Research indicates that an abrupt halt to solar geoengineering could trigger a “termination shock”: a rapid temperature rebound that could leave the world worse off than if the intervention had never been deployed.

With greenhouse gas emissions on the rise, there’s increasing attention on solar radiation management (SRM), which cools the planet by dispersing sulfur dioxide aerosols into the stratosphere to reflect sunlight.

However, solar geoengineering would need to be sustained for centuries; otherwise, the warming it masks would rapidly reemerge. This rebound, referred to as termination shock, would leave little time for adaptation and could trigger critical climate events such as ice sheet collapse.

Francisco Estrada and colleagues at the National Autonomous University of Mexico assessed the risk of inaction on climate change compared with solar geoengineering approaches.

Projections suggest that if emissions aren’t curtailed, temperatures may soar by an average of 4.5 degrees Celsius above pre-industrial levels by 2100, leading to approximately $868 billion in economic damages. In contrast, a hypothetical stratospheric aerosol injection program initiated in 2020 could limit warming to around 2.8°C, potentially reducing these costs by half.

Nevertheless, if the aerosol program ends abruptly in 2030, resulting in a temperature rebound of 0.6 degrees Celsius over eight years, economic damages could surpass $1 trillion by century’s end. While estimations vary, Estrada states, “The principle remains consistent: the termination shock will be significantly worse than inaction.”

Estrada’s research innovatively gauges damage not only by global warming levels but also by the speed at which temperatures rise, according to Gernot Wagner from Columbia University.

Wagner warns that solar geoengineering may be riskier than it appears. “This highlights a critical concern,” he notes.

Make Sunsets, a Silicon Valley startup, has already launched over 200 sulfur dioxide-filled balloons into the stratosphere and sells the resulting cooling as offsets. A launch in Mexico prompted the government there to threaten a ban on geoengineering activities.

The Israeli startup Stardust has secured $75 million in funding and is lobbying the US government to explore solar geoengineering options. A recent survey revealed that two-thirds of scientists anticipate large-scale SRM could occur this century, as reported by New Scientist.

According to studies, at least 100 aircraft would be needed to cool the Earth by 1°C through aerosol injection, releasing millions of tons of sulfur dioxide annually, and the program would have to run uninterrupted by geopolitical conflicts or unforeseen events.

Presently, major nations like the United States are undermining global climate cooperation, but researchers highlight that such collaboration is essential to prevent termination shock and potentially realize the benefits of SRM.

Analysis of varying parameters suggests that aerosol injections could mitigate climate damage only if the annual probability of cessation is extremely low. In scenarios allowing for a gradual stop over 15 years, SRM might be viable.

If countries successfully reduce emissions, only minimal geoengineering cooling would be needed, and aerosol injection could remain beneficial even with an annual cessation probability as high as 10%. Over a century that implies a near-certain shutdown at some point, but in low-emissions scenarios the resulting temperature rebound would remain small enough to manage.
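The sensitivity to the annual cessation probability is easy to see with a little arithmetic. Below is a minimal sketch (our own illustrative calculation, not code or figures from the study) of how a modest yearly risk compounds over a century:

```python
# Cumulative probability that an aerosol program is interrupted at least
# once, assuming an independent cessation probability each year.
# Illustrative arithmetic only; these are not figures from Estrada's study.

def cumulative_failure(p_annual: float, years: int) -> float:
    """Probability of at least one cessation event within `years` years."""
    return 1.0 - (1.0 - p_annual) ** years

# A 10% annual risk makes a shutdown over a century all but certain:
print(f"{cumulative_failure(0.10, 100):.5f}")   # ~0.99997
# Keeping century-scale risk near 10% requires a tiny annual probability:
print(f"{cumulative_failure(0.001, 100):.3f}")  # ~0.095
```

This is why aerosol injection only pays off when the yearly chance of interruption is kept extremely low, or when the cooling being masked is small enough that a rebound is manageable.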

This need for international cooperation reveals what Estrada describes as the “governance paradox” of solar geoengineering: “We must ensure extremely low failure rates and possess effective governance to mitigate adverse outcomes.” However, he adds, “If we effectively reduce greenhouse gases, the need for SRM diminishes.”

These findings challenge the suggestion that research of this kind encourages reckless deployment of solar geoengineering, says Chad Baum of Aarhus University. Funding for the new research was provided by the Degrees Initiative, which supports geoengineering studies in climate-vulnerable low-income nations.

Baum stated, “We intend to complete all stages of this study, incorporating feedback from impacted communities.”

Despite this, Wagner emphasizes the imperative for further exploration into geoengineering’s trade-offs given the rise in emissions and their consequences: “We are approaching a critical juncture.”

Source: www.newscientist.com

How Embracing Sauna Culture Enhances Brain Health and Reduces Dementia Risk

Unlocking the Potential: Does Heat Therapy Enhance Brain Function?

gpointstudio/Getty Images

As an enthusiast of cold water swimming, I previously explored its brain benefits. However, the emerging evidence on heat therapy fascinated me—particularly regarding its neurological advantages. This prompted a deeper investigation into the subject.

During my last trip to Finland and Sweden, I immersed myself in their sauna culture, learning that ‘sauna’ is pronounced ‘sow-na’ (with ‘ow’ rhyming with ‘how’), contrasting my South East London pronunciation.

Finnish saunas, reaching temperatures of 70°C to 110°C (158°F to 230°F) with low humidity, are extensively studied. Regular sauna use correlates with numerous physical benefits, such as reduced risks of high blood pressure, muscle disorders, and respiratory diseases. Recent research also identifies significant cognitive benefits, including fewer headaches, improved mental health, better sleep quality, and a decreased risk of dementia.

A large-scale study involving nearly 14,000 participants aged 30 to 69 tracked sauna habits over 39 years. The findings revealed that those who frequented saunas nine to twelve times a month exhibited a 19 percent reduction in dementia risk compared to those who visited less than four times a month.

Moreover, sauna bathing appears linked to various cognitive enhancements. For instance, a small trial involving 37 adults with chronic headaches compared those receiving headache management advice to participants who regularly attended saunas. The sauna group reported significantly reduced headache intensity.

Regular sauna use is also associated with lower risks of psychosis and increased vitality and social functioning in elderly individuals, reinforcing its potential cognitive benefits.

However, it’s crucial to recognize that not all heat treatments yield the same results. Various forms of heat therapy exist, each offering distinct benefits. For example, a trial with 26 individuals diagnosed with major depressive disorder showed that those receiving infrared heating sessions reported significant symptom reductions over six weeks compared to a sham treatment.

How Does Heat Therapy Benefit Brain Health?

Heat therapy’s efficacy appears closely linked to its anti-inflammatory effects. In a study following 2,269 middle-aged Finnish men, researchers found that individuals engaging in frequent sauna use exhibited reduced levels of inflammation, a factor significantly associated with depression and cognitive decline.

Another mechanism involves heat shock proteins, which are produced when body temperature rises during sauna use or exercise. These proteins help prevent misfolding of other proteins—a common feature in many neurological disorders, including Alzheimer’s disease.

Enhanced blood circulation also plays a role; heat exposure dilates blood vessels, thereby improving cardiovascular health. This indirect benefit to brain health can decrease risks associated with vascular dementia and Alzheimer’s disease.

Additionally, saunas may elevate brain-derived neurotrophic factor (BDNF) levels, vital for neuron growth. In an experiment with 34 men, participants receiving 12 to 24 sessions of infrared therapy displayed significantly higher BDNF levels and improved mental well-being compared to those doing low-intensity workouts.

Can Saunas Enhance Cognitive Skills?

Beyond long-term neurological advantages, the immediate effects of sauna sessions are promising. A study involving 16 men revealed that brain activity post-sauna sessions resembled a relaxed state, indicating potential improvements in task efficiency. Researchers suggest that heat therapy may help extend mental work capacity over prolonged periods.

However, excessive heat exposure can lead to fatigue and reduced cognitive function. Studies indicate that high-temperature environments may impair memory consolidation, making saunas less suitable for study sessions.

If you’re exploring heat therapy, check guidelines from the British Sauna Association to ensure safety, including limiting duration and staying hydrated.

Do Hot Baths Offer Similar Benefits?

If you lack access to a sauna, could hot baths serve as an alternative? While they may partially replicate sauna benefits, the evidence is still inconclusive. According to Ali Qadiri from West Virginia University, warm baths do elevate core body temperature and can improve mood and relaxation. Still, he cautions that the evidence linking saunas to dementia prevention is far more robust than that for baths.

My local lake offers both cold water swimming and sauna experiences, prompting me to consider their combined effects. A Japanese study on the practice known as totonou, or alternating between hot saunas and cold baths, revealed enhancements in relaxation and reduced alertness after several rounds.

While more research is needed to determine if this combination is more effective than using heat or cold therapy alone, the overall evidence supports potential cognitive boosts from regular sauna visits, reinforcing my commitment to explore more heat and cold therapy options.

Source: www.newscientist.com

Achieving the 1.5°C Climate Goal: The Century’s Best Vision for a Sustainable Future

During the first decade of the 21st century, scientists and policymakers emphasized a 2°C cap as the highest “safe” limit for global warming above pre-industrial levels. Recent research suggests that this threshold might still be too high. Rising sea levels pose a significant risk to low-lying islands, prompting scientists to explore the advantages of capping temperature rise at approximately 1.5°C for safeguarding vulnerable regions.

In light of this evidence, the United Nations negotiating bloc, the Alliance of Small Island States (AOSIS), advocated for a global commitment to restrict warming to 1.5°C, emphasizing that allowing a 2°C increase would have devastating effects on many small island developing nations.

James Fletcher, the former UN negotiator for the AOSIS bloc at the 2015 UN COP climate change summit in Paris, remarked on the challenges faced in convincing other nations to adopt this stricter global objective. At one summit, he recounted a low-income country’s representative confronting him, expressing their vehement opposition to the idea of even a 1.5°C increase.

After intense discussions, bolstered by support from the European Union, the tacit backing of the United States, and an intervention from Pope Francis, the 1.5°C target was included in the landmark 2015 Paris Agreement. Notably, it was adopted before climate scientists had formally assessed the implications of that warming level.

In 2018, the Intergovernmental Panel on Climate Change report confirmed that limiting warming to 1.5°C would provide substantial benefits. The report also advocated for achieving net-zero emissions by 2050 along a 1.5°C pathway.

These dual objectives quickly became rallying points for nations and businesses worldwide, persuading countries like the UK to strengthen their national climate commitments to meet these stringent goals.

Researchers including Piers Forster at the University of Leeds credit the 1.5°C target with driving nations to adopt significantly tougher climate goals than previously envisioned. “It fostered a sense of urgency,” he remarks.

Despite this momentum, global temperatures continue to rise, and current efforts to curb emissions are insufficient to fulfill the 1.5°C commitment. Scientific assessments predict the world may exceed this warming threshold within a mere few years.

Nevertheless, 1.5°C remains a crucial benchmark for tracking progress in global emissions reductions. Public and policymakers are more alert than ever to the implications of rising temperatures. An overshoot beyond 1.5°C is widely regarded as a perilous scenario, rendering the prior notion of 2°C as a “safe” threshold increasingly outdated.

Source: www.newscientist.com

Exploring the Physics Behind Stranger Things: Beyond the Ending

Send us your feedback at New Scientist! If you enjoy following the latest advancements in science and technology, let us know your thoughts by emailing feedback@newscientist.com.

Even the Strangest Theories

This vacation, many fans spent their time reflecting on the final episode of Stranger Things. We experienced laughter, tears, and heated discussions about the storyline—especially its conclusion. Can we really say it was a fitting ending like Return of the King? (In our opinion, it was.)

In today’s online culture, vocal fan backlash is common. Some theorized that the finale was merely a ruse, leading to wild claims like “Conformity Gate” (not our term!). They argue that despite its two-hour runtime and cinematic release, the concluding episode was just a setup for a secret final episode, set to air this January. Critics point to a continuity error that suggests the entire narrative was an illusion crafted by Vecna, the mind-controlling antagonist.

Initially, we found these theories unconvincing, especially since the criticisms revolved around minor details. After all, the show itself defies physics—should we really be worried about the color of a graduation gown?

For newcomers, the storyline of Stranger Things unfolds in a small Indiana town beset by a secretive government lab conducting dangerous experiments. Spoiler alert: these experiments inadvertently open a portal to the “Upside Down,” a horrifying alternate dimension that mirrors the town, albeit in a more sinister light. Ultimately, it’s revealed that this Upside Down functions as a wormhole to yet another realm known as the Abyss.

If the Upside Down is indeed a wormhole, what then is the swirling red object levitating above? Some describe it as containing “exotic matter,” a theoretical substance crucial for stabilizing a genuine wormhole (although its existence remains unproven). This complicates matters further since the entrance to the Abyss exists in the Upside Down’s skies.

We’ve contemplated this for weeks, yet the whirling object’s purpose remains a mystery. Why does shooting it with a gun liquefy its surroundings, while an explosion obliterates the entire Upside Down? Wouldn’t such destruction release enough energy to obliterate a significant part of the East Coast?

Perhaps physicists focused on adaptive gate theory should tackle the bizarre phenomena within the Upside Down. There could be a Nobel Prize—or at least an Ig Nobel Prize—waiting for someone who can crack these mysteries.

Sparkling Sports Benefits

What could be more exhilarating than attending a live sports event? The thrill comes from being part of the crowd, cheering on your favorite players. But what if drinking soda while cheering made it even more enjoyable?

Alice Klein, a reporter, highlighted a study that demonstrated that spectators at a women’s college basketball game experienced greater enjoyment and a stronger sense of belonging when they consumed sparkling water instead of plain water. The researchers noted, “Drinking sparkling water together serves as a low-impact, non-alcoholic ritual, fostering social connection during and after live sports events.”

While Alice found this perspective amusing, editor Jacob Aron defended the research: “They studied 40 individuals; what more could they need?” Readers may form their own opinions on the validity of this evidence. Nonetheless, we want to draw attention to the “competing interests” statement in the research paper, which we won’t comment on further. Here it is:

“This study received funding from Asahi Soft Drinks Co., Ltd. WK and SM are employees of Asahi Soft Drinks Co., Ltd. The authors declare that this funding had no influence on the study design, methodology, analysis, or interpretation of the results. The sponsor has no control over the interpretation, writing, or publication of this study.”

AI Mistakes and Missteps

Reader Peter Brooker reached out to suggest a new section titled “AI Bloopers.” After using a well-known search engine, he was astounded to discover that the AI confidently asserted the first six prime numbers were 2, 3, 5, 7, 9, and 11.

We believe this section has long existed, albeit without a formal name. In fact, we often discuss how frequently to highlight these AI blunders. A weekly column could easily be filled with AI failures, but we worry it may become monotonous.

In line with Peter’s suggestion, Ghent University’s new rector, Petra de Sutter, found herself in hot water after using AI to generate her opening speech. It included fabricated quotes purportedly from Albert Einstein.

As reported by Brussels Times: “Impressively, De Sutter warned about the dangers of AI in her speech, advising that AI-generated content should not be ‘blindly trusted’ and that such text is ‘not always easy to distinguish from the original work.’”

Have a story for Feedback?

Share your insights with us at feedback@newscientist.com. Please include your home address. Previous feedback and this week’s comments can be accessed on our website.

Source: www.newscientist.com

How to See the Lunar X and V: A Comprehensive Guide

Discover the Moon’s X: Captured from Tokyo in February 2025

Credit: Yomiuri Shimbun/AP Images/Alamy

Nearly a decade ago, my excitement surged as I captured my first telescope photo of the Moon. With a makeshift setup, I clumsily held my phone camera up to the eyepiece. After a few shaky attempts, I got a clear snapshot of the lunar surface, and shared it online with pride.

Unbeknownst to me, I had taken the picture during the brief 4-6 hour window each month when the fascinating features known as the lunar X and V can be visible.

These lunar marks are optical illusions, revealing themselves only when sunlight strikes the rims of specific craters during the Moon’s waxing phase, perfectly aligned along the terminator.

The Moon’s X forms a bright X shape where sunlight catches the rims of three craters: La Caille, Blanchinus, and Purbach. Similarly, the V comes to life as sunlight hits the Ukert crater and nearby smaller craters.

To witness the Moon’s X and V, a telescope is essential. However, timing is crucial. The visibility of these features varies globally and is influenced by your local time zone.

The next first quarter falls at 5 AM GMT on January 26th. Observers in the UK will miss it, however, as the Moon will be below the horizon at that time. New York is better placed: the first quarter occurs around midnight local time, making the X and V visible on the evening of January 25th from about 10 PM to 2 AM. In places like Sydney, daylight blocks the view, as the first quarter falls mid-afternoon local time.
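Working out whether the first quarter lands at a convenient local hour is just a time-zone conversion. A minimal sketch using Python’s standard `zoneinfo` module, assuming the article’s 5 AM GMT, January 26 figure refers to 2026:

```python
# Convert a first-quarter time given in GMT to local civil time, to judge
# whether the lunar X and V will be above the horizon after dark.
# The 05:00 GMT, 26 January figure comes from the article; the year 2026
# is our assumption.
from datetime import datetime
from zoneinfo import ZoneInfo

first_quarter = datetime(2026, 1, 26, 5, 0, tzinfo=ZoneInfo("UTC"))

for tz in ("Europe/London", "America/New_York", "Australia/Sydney"):
    local = first_quarter.astimezone(ZoneInfo(tz))
    print(f"{tz:20s} {local:%d %b %H:%M}")
# London stays at 05:00 GMT; New York sees midnight (UTC-5);
# Sydney lands in mid-afternoon daylight on AEDT (UTC+11).
```

The exact instant of first quarter shifts from one lunation to the next, so published local times can differ by an hour or so; the point is simply to check whether the event falls in darkness for your longitude.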

For the best chance to view the Moon’s captivating X’s and V’s, ensure you’re gazing at a waxing moon during optimal hours, preferably when it’s high in the night sky. Tools like Stellarium can help you track the Moon’s visibility on specific dates.

Mark your calendars for upcoming first quarter events on February 24th, March 25th, and April 24th-25th. If you’re in the UK, you might want to target March 25th as it aligns well with evening visibility around 7 PM local time.

Understanding the intricacies that must align for the Moon’s X and V to appear, I feel fortunate to have captured my first lunar photo during such a special moment.

Stay tuned for weekly articles at:
newscientist.com/maker

Abigail Beall is a features editor at New Scientist and author of The Art of Urban Astronomy. Follow her on Twitter @abbybeall

Source: www.newscientist.com

How Mars’ Gravity May Influence Earth’s Ice Age Cycles

Composite photo of Mars

Mars’ Significant Impact on Earth’s Climate

Credit: NASA/JPL/Malin Space Science Systems

Despite Mars being smaller than Earth, it profoundly affects Earth’s climate cycle. Understanding how smaller planets influence the climates of exoplanets is crucial for assessing their potential for habitability.

Stephen Kane and his colleagues at the University of California, Riverside, discovered this by running simulations that varied Mars’ mass, from 100 times its current value down to removing the planet entirely, and analyzing the effect on Earth’s orbit. “Initially, I was skeptical that Mars, only one-tenth the mass of Earth, could so significantly affect Earth’s cycles. This motivated our study to manipulate Mars’ mass and observe the effects,” says Kane.

Earth’s climate is influenced by long-term cycles tied to its orbital eccentricity and axial tilt. These cycles are dictated by the gravitational forces of the Sun and other planets, determining significant climate events such as ice ages and seasonal shifts.

One crucial cycle, referred to as the Grand Cycle, spans 2.4 million years, involving the elongation and shortening of Earth’s orbital ellipse. This directly influences the amount of sunlight reaching Earth’s surface, thus controlling long-term climate changes.

The research indicates that eliminating Mars would not only remove the Grand Cycle but also another essential eccentricity cycle lasting 100,000 years. “While removing Mars wouldn’t completely halt ice ages, it would alter the frequency and climate impacts associated with them,” Kane explains.

As Mars’ simulated mass increases, the resulting climate cycles become shorter and more intense. However, a third eccentricity cycle, enduring approximately 405,000 years, remains predominantly influenced by Venus and Jupiter’s gravitational pulls, illustrating that while Mars is notably influential, it is not the only player.

Mars also affects Earth’s axial tilt, which oscillates over about 41,000 years. Kane and colleagues observed that Mars seems to stabilize these cycles: a more massive Mars leads to less frequent oscillations, while a smaller Mars results in more frequent ones.

The precise impact of Mars’ absence or increased mass on Earth remains speculative, but it would undoubtedly lead to changes. The pursuit of Earth-like exoplanets with climates suitable for life continues, underscoring the need to evaluate the influence of smaller planets more thoroughly. “A comprehensive understanding of exoplanet system architectures is essential for predicting possible climate changes on these worlds,” warns Sean Raymond from the University of Bordeaux, France.

However, deciphering these structures can be challenging. “This serves as a cautionary note: small planets like Mars may wield a greater influence than we realize, making it imperative not to overlook these difficult-to-detect celestial bodies,” concludes Kane.

Source: www.newscientist.com

Can You Get Infected with Another Virus Alongside COVID-19? A Doctor’s Insights

As a healthcare professional, I often encounter concerns from patients about COVID-19, particularly those suffering from long-term effects. A common inquiry I receive is, “Can I get reinfected with COVID-19 while experiencing long-term symptoms from a previous infection?”

Many individuals believe that enduring the virus for an extended period grants them some level of immunity against future infections. Unfortunately, this assumption is not accurate.

Long-lasting COVID-19 symptoms, including fatigue, breathing difficulties, and cognitive issues, can persist for months after initial infection. Regrettably, even prolonged exposure to COVID-19 does not shield you from reinfection.

The protective effects from previous infections and vaccinations fade over time. New variants of the virus, such as Omicron KP.3 and XEC in 2025, can evade the immune response.

This means that even if you’re grappling with persistent COVID-19 symptoms, it’s possible to contract the virus again, which may exacerbate symptoms or prolong recovery.

A positive COVID-19 test may indicate a reinfection with the same variant or a new one, but either way, it remains a manifestation of the coronavirus. Vaccines, particularly the 2025 booster shot, can significantly reduce the risk of severe illness. If you’re experiencing long-term COVID-19 and test positive, ensure you rest, stay hydrated, and consult your physician if symptoms worsen.

The coronavirus is still prevalent and continues to mutate, necessitating the practice of protective measures. It’s essential to get tested if you feel unwell, wear masks in crowded indoor settings, and keep up with vaccinations.

These proactive steps help mitigate exposure and safeguard those around you, especially as we navigate the lingering effects of this virus.


This article addresses the question from Yorkshire’s Terence Caldwell: “Can I be infected with COVID-19 along with the new variants?”

If you have any questions, reach out to us at: questions@sciencefocus.com or connect with us on Facebook, Twitter, or Instagram (don’t forget to include your name and location).


Source: www.sciencefocus.com

Pleistocene Fossils Uncover Evidence That Hopping Was Common Among Large Species, Not Just Small Kangaroos

A groundbreaking study by paleontologists from the University of Bristol, the University of Manchester, and the University of Melbourne has revealed that the giant ancestors of modern kangaroos possessed robust hindlimb bone and tendon structures capable of withstanding the stresses of hopping. This challenges the previous assumption that body size strictly limited this iconic form of locomotion.

Simosthenurus occidentalis. Image credit: Nellie Pease / ARC CoE CABAH / CC BY-SA 4.0 license.

Currently, red kangaroos represent the largest living jumping animals, averaging a weight of approximately 90 kg.

However, during the Ice Age, some kangaroo species reached weights exceeding 250 kg—more than double the size of today’s largest kangaroos.

Historically, researchers speculated that these giant kangaroos must have ceased hopping, as early studies indicated that jumping became mechanically impractical beyond 150 kg.

“Earlier estimates relied on simplistic models of modern kangaroos, overlooking critical anatomical variations,” explained Dr. Megan Jones, a postgraduate researcher at the University of Manchester and the University of Melbourne.

“Our research indicates that these ancient animals weren’t simply larger versions of today’s kangaroos; their anatomy was specifically adapted to support their massive size.”

In this new study, Dr. Jones and her team examined the hind limbs of 94 modern and 40 fossil specimens from 63 species, including members of the extinct giant kangaroo group, Protemnodon, which thrived during the Pleistocene epoch, approximately 2.6 million to 11,700 years ago.

The researchers assessed body weight estimates and analyzed the fourth metatarsal length and diameter (a crucial elongated foot bone for jumping in modern kangaroos) to evaluate its capacity to endure jumping stresses.
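The scaling concern behind these measurements can be sketched with a toy calculation (our own illustrative numbers, not the study’s data): if skeletons simply scaled up isometrically, the force a bone must bear grows with body mass M while its cross-sectional area grows only as M to the 2/3 power, so peak stress rises as the cube root of mass.

```python
# Toy allometric sketch of why hopping was long thought impossible for
# giant kangaroos. Under isometric scaling, bone stress ~ force / area
# ~ M / M**(2/3) = M**(1/3). Illustrative only; not the paper's analysis.

def relative_stress(mass_ratio: float) -> float:
    """Peak bone stress relative to a reference animal, isometric scaling."""
    return mass_ratio ** (1.0 / 3.0)

# A 250 kg Pleistocene giant vs a 90 kg red kangaroo:
print(round(relative_stress(250 / 90), 2))  # ~1.41x the bone stress

# The study's finding is that fossil metatarsals were disproportionately
# robust, i.e. thicker than isometry predicts, offsetting this penalty.
```

The measured bone and tendon dimensions matter precisely because real animals can escape this cube-root penalty by growing relatively thicker limb elements.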

Comparisons were drawn between the heel bone structures of giant kangaroos and their modern counterparts.

The team estimated the strength of tendons necessary for the jumping force of a giant kangaroo and determined whether the heel bones could accommodate such tendons.

The findings suggest that the metatarsals of all giant kangaroos were adequate to withstand jumping pressures, and the heel bones were sufficiently large to support the width of the required jump tendons.

These results imply that all giant kangaroo species had the physical capability to jump.

Nevertheless, the researchers caution that giant kangaroos likely did not rely solely on hopping for locomotion, given their large body sizes, which would hinder long-distance movement.

They highlight that sporadic hopping is observed in many smaller species today, such as hopping rodents and smaller marsupials.

Some giant kangaroo species may have used short, quick hops to evade predators such as the marsupial lion Thylacoleo.

“Thicker tendons offer increased safety but store less elastic energy,” said Dr. Katrina Jones, a researcher at the University of Bristol.

“This trait may have rendered giant kangaroo hoppers slower and less efficient, making them more suited for short distances rather than extensive travel.”

“Even so, hopping doesn’t need to be maximally energy-efficient to be advantageous. These animals likely leveraged their hopping ability to rapidly navigate uneven terrain or evade threats.”

University of Manchester researcher Dr. Robert Nudds remarks: “Our findings enhance the understanding that prehistoric Australian kangaroos exhibited greater ecological diversity than seen today, with some large species functioning as herbivores, akin to modern kangaroos, while others filled ecological niches as browsers, a category absent among today’s large kangaroos.”

For more details, refer to the study results published in the journal Scientific Reports.

_____

M.E. Jones et al. 2026. Biomechanical Limits of Hindlimb Hopping in Extinct Giant Kangaroos. Scientific Reports 16: 1309; doi: 10.1038/s41598-025-29939-7

Source: www.sci.news

Revolutionary Cosmological Simulations Illuminate Black Hole Growth in the Early Universe

Revolutionary simulations by Maynooth University astronomers reveal that, in the dense and turbulent conditions of the early universe, “light seed” black holes could swiftly consume matter, growing into the supermassive black holes found at the centers of early galaxies.

Computer visualization of a baby black hole growing in an early universe galaxy. Image credit: Maynooth University.

Daksar Mehta, a PhD candidate at Maynooth University, stated: “Our findings indicate that the chaotic environment of the early universe spawned smaller black holes that underwent a feeding frenzy, consuming surrounding matter and eventually evolving into the supermassive black holes observed today.”

“Through advanced computer simulations, we illustrate that the first-generation black holes, created mere hundreds of millions of years after the Big Bang, expanded at astonishing rates, reaching sizes up to tens of thousands of times that of the Sun.”

Dr. Louis Prowl, a postdoctoral researcher at Maynooth University, added: “This groundbreaking revelation addresses one of astronomy’s most perplexing mysteries.”

“It explains how black holes formed in the early universe could quickly attain supermassive sizes, as confirmed by observations from NASA/ESA/CSA’s James Webb Space Telescope.”

The dense, gas-rich environments of early galaxies facilitated brief episodes of “super-Eddington accretion,” in which black holes swallow matter faster than the conventional Eddington limit would allow.

Despite the intense radiation generated by such rapid feeding, the simulated black holes continued to accrete material efficiently.

The results uncover a pivotal “missing link” between the first stars and the immense black holes that emerged later on.

Mehta elaborated: “These smaller black holes were previously considered too insignificant to develop into the gigantic black holes at the centers of early galaxies.”

“What we have demonstrated is that, although these nascent black holes are small, they can grow surprisingly quickly under the right environmental conditions.”

There are two classifications of black holes: “heavy seed” and “light seed.”

Light seed black holes start with a mass of only a few hundred solar masses and must grow significantly to transform into supermassive entities, millions of times the mass of the Sun.

Conversely, heavy seed black holes begin life with masses reaching up to 100,000 times that of the Sun.

Previously, many astronomers believed that only heavy seed types could account for the existence of supermassive black holes seen at the hearts of large galaxies.
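To see why light seeds were long considered implausible, consider the standard e-folding argument: accreting at the Eddington limit with roughly 10 per cent radiative efficiency, a black hole’s mass e-folds about every 45 million years (the Salpeter time). A minimal sketch with illustrative numbers of our own (not taken from the Maynooth simulations):

```python
# Back-of-envelope black hole growth at a multiple of the Eddington rate.
# Assumes a Salpeter e-folding time of ~45 Myr (radiative efficiency ~0.1).
# Illustrative numbers only; not taken from the Maynooth simulations.
import math

T_SALPETER_MYR = 45.0  # e-folding time at the Eddington limit

def growth_time_myr(m_seed: float, m_final: float,
                    eddington_ratio: float = 1.0) -> float:
    """Time in Myr to grow from m_seed to m_final (solar masses) while
    accreting at eddington_ratio times the Eddington rate."""
    return T_SALPETER_MYR * math.log(m_final / m_seed) / eddington_ratio

# A 100-solar-mass light seed reaching a million solar masses:
print(round(growth_time_myr(100, 1e6)))       # ~414 Myr at the Eddington limit
# Brief super-Eddington episodes (here 5x) compress the timescale:
print(round(growth_time_myr(100, 1e6, 5.0)))  # ~83 Myr
```

Since luminous quasars are observed well under a billion years after the Big Bang, steady Eddington-limited growth from light seeds is uncomfortably slow, which is why episodes of super-Eddington accretion matter so much.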

Dr. John Regan, an astronomer at Maynooth University, remarked: “The situation is now more uncertain.”

“Heavy seeds may be rare and depend on unique conditions for formation.”

“Our simulations indicate that ‘garden-variety’ stellar-mass black holes have the potential to grow at extreme rates during the early universe.”

This research not only reshapes our understanding of black hole origins but also underscores the significance of high-resolution simulations in uncovering the universe’s fundamental secrets.

“The early universe was far more chaotic and turbulent than previously anticipated, and the population of supermassive black holes is also more extensive than we thought,” Dr. Regan commented.

The findings hold relevance for the ESA/NASA Laser Interferometer Space Antenna (LISA) mission, set to launch in 2035.

Dr. Regan added, “Future gravitational wave observations from this mission may detect mergers of these small, rapidly growing baby black holes.”

For further insights, refer to this paper, published in this week’s edition of Nature Astronomy.

_____

D.H. Mehta et al. 2026. Growth of Light Seed Black Holes in the Early Universe. Nature Astronomy, published online January 21, 2026; doi: 10.1038/s41550-025-02767-5

Source: www.sci.news

Unlocking the Best Idea of the Century: Why Smartphones Are Here to Stay

“Every once in a while, a revolutionary product comes along that changes everything,” Steve Jobs declared at Apple’s 2007 iPhone launch. Tech executives often hype their innovations, but this proclamation was borne out. The iPhone not only popularized apps but also put compact, powerful computers into our pockets and daily lives.

However, the transformation has drawbacks. Much like a snail retreating into its shell, we can withdraw into our devices at any moment, a habit that feeds social anxiety. Citing such concerns alongside safety issues, numerous countries have restricted mobile phone use in schools, and Australia imposed a total ban on social media for users under 16 as of December 2025. Reliance on a constantly connected device also erodes our sense of privacy, argues technology historian Mar Hicks of the University of Virginia: “This technology is acclimating users to significantly less privacy, not only in public spaces but also within the privacy of their own homes.”

Smartphones transcend their basic function, emphasizing their role in our lives, as anthropologist Daniel Miller from University College London notes. “They’ve expanded our personal space,” he articulates. These handheld digital environments allow for seamless access to the virtual worlds of our friends and family, resulting in a continuous navigation between our physical and digital existence.

The global influence of smartphones is undeniable. According to GSMA, the mobile operators’ industry association, over 70% of the global population now owns a smartphone. In many low-income countries, people increasingly bypass traditional desktop computers altogether. Smartphone-driven fintech platforms facilitate transactions for 70 million users across 170 countries, removing the necessity for conventional banks. Furthermore, farmers utilize smartphone applications for crop monitoring, and doctors employ them in hospitals to reduce reliance on costly machinery.

Moreover, the ramifications of smartphones extend far beyond their immediate use. The rapid miniaturization of electrical components like cameras, transistors, and motion sensors has enhanced processing power and introduced new potentials. This technological evolution has spurred numerous 21st-century innovations, including versatile drones, smart wearables, virtual reality headsets, and miniature medical implants.

Topics:

Source: www.newscientist.com

Review of ‘A Hole in the Sky’: Peter F. Hamilton’s Sci-Fi Epic with a Notable Flaw

Dark silhouette of a girl in a dress against the backdrop of mysterious deep space

A Hole in the Sky is narrated through the eyes of 16-year-old Hazel

Adam Selva/Alamy

A Hole in the Sky
Peter F. Hamilton – Angry Robot

As an avid admirer of Peter F. Hamilton, I eagerly anticipated his latest release, A Hole in the Sky, particularly because I have always been fascinated by ark-ship stories.

Centuries have elapsed since the ship set out, and its crew has regressed into a medieval-style society living beneath the remnants of their ancestors’ advanced technology. We gradually learn what befell them: problems with the planet they were meant to settle, and an onboard rebellion that left them stranded in perilous circumstances. At the age of 65, inhabitants must be recycled for the good of the ship. The premise captivated me completely.

All of this is framed through the first-person viewpoint of Hazel, a 16-year-old girl. A significant breach has opened in the ship’s hull (hence the title), she battles intense headaches, and she is soon swept up in a whirlwind of dramatic events. Yet she still finds time to fret about boys and clothes, which I found hard to credit. Why would a girl dwell on fashion when the survival of everyone aboard is at stake and she is constantly plagued by headaches?

As fans will know, Hamilton is a master storyteller renowned for big-canvas science fiction. My personal favorites include the Void trilogy and the Night’s Dawn trilogy, as well as his intricate and thrilling Commonwealth Saga duology. His narratives are propulsive, wildly inventive, and filled with complexities that often leave me thrilled, even if I don’t fully grasp every detail.

I had reservations about some of Hamilton’s more recent works, like Exodus: Archimedes Engine, which ties into the upcoming video game Exodus. Certain plotlines felt included solely to promote the game, detracting from the reading experience. However, I appreciate that these works may not target my demographic; it’s evident the seasoned author is seeking new challenges. (For those who enjoy game tie-ins, the second novel in that series is due later this year, and the game itself is slated for 2027.)


If I were a movie or TV scout, I could envision A Hole in the Sky adapting beautifully for the screen.

All this brings me back to A Hole in the Sky. Midway through, I realized it felt somewhat juvenile, for want of a better term. A little research revealed that the novel was first released as an audio-only book in 2021 and was marketed as “young adult” fiction, aimed at teenagers.

In a 2020 interview, Hamilton expressed, “Though young adults as protagonists define a particular publishing category, I hope this work will resonate with audiences of all ages.” Personally, I don’t believe that a youthful protagonist excludes the potential for an adult-oriented book. (I mention this as a writer of novels featuring teenage lead characters.) So, can readers of all ages enjoy this book?

The plot setup and twists are stellar, as expected from Hamilton. However, I wish he had toned down the “teenage” elements. I don’t need an interlude in which the heroine holds her boyfriend’s hand while fleeing danger, and I think forcing her to confront the reality of being recycled at 65 would have added significant weight.

Perhaps Hamilton will capture a fresh audience with this release; were I a movie or TV scout, I could envision how well A Hole in the Sky would play on screen. It is the first in a trilogy, with sequels slated for release in June and December. As I highlighted in my preview of new science fiction releases for 2026, this rapid schedule is unusual, and I’m curious to see how it unfolds.

I also recommend Emily…

Pandora’s Star
Peter F. Hamilton – Pan Macmillan

If you’re yet to experience Hamilton’s classic works, there are various entry points into the remarkable worlds he has created. I recommend Pandora’s Star and its sequel, Judas Unchained, as excellent beginnings. If “epic space opera” resonates with you, these novels are likely a perfect match.

Emily H. Wilson is a former editor of New Scientist and author of the Sumerians trilogy, set in ancient Mesopotamia. The final book in the series, Ninshubar, is out now. You can find her at emilywilson.com, or follow her on X @emilyhwilson and Instagram @emilyhwilson1.

Topics:

Source: www.newscientist.com

Exploring the Uniqueness of Our Solar System: The Century’s Most Fascinating Concept

Since the early 1990s, astronomers have made groundbreaking discoveries in exoplanet research. The real surge began in the early 2000s with comprehensive surveys, which revealed that our solar system, with its four rocky planets and four giant planets, may be unlike most others.

For decades, the Chilean High Precision Radial Velocity Planet Probe and the California Legacy Survey have meticulously tracked the stellar wobbles caused by exoplanets. While these surveys have not notched up as many exoplanet discoveries as pioneering space telescopes like Kepler and TESS, they have shed light on just how distinctive our solar system is.

For instance, our Sun is larger than over 90% of other stars, and it sits alone, unlike the many stars with companion stars. Earth’s size is also unusual, and only about 1 in 10 stars hosts a planet like Jupiter; when such planets are found, their orbits often differ dramatically from Jupiter’s stable, nearly circular path. Notably absent from our system are super-Earths and sub-Neptunes, which are common in other star systems. And despite thousands of exoplanet discoveries, Earth-like planets orbiting Sun-like stars, let alone potential extraterrestrial life, remain elusive.

“Our solar system is strange due to what we have and what we lack,” states Sean Raymond from the University of Bordeaux, France. “It’s still uncertain whether we are simply rare at the 1% level or genuinely unique at the 1 in a million level.”

These revelations prompt intriguing questions about the formation of our solar system, such as why Jupiter sits so far from the Sun rather than close in, as giant planets do in many other systems. The unusual orbits of exoplanets have made astronomers reconsider our system’s history. The Nice model, proposed in 2005, suggests a major reconfiguration after formation, with the giant planets migrating and redirecting asteroids and moons onto new trajectories.

“The understanding that such a shift could occur stemmed directly from exoplanet research,” Raymond notes. “Approximately 90% of large exoplanetary systems exhibit instability. This insight prompts speculation about possible historical fluctuations within our solar system.”

Topic:

Source: www.newscientist.com

How Gigafactories Will Revolutionize Energy: The Century’s Best Idea


Batteries and solar energy technologies have been evolving for the best part of two centuries, but they reached a pivotal moment in 2016. That year marked the opening of Tesla’s first Gigafactory in Nevada, which produces batteries, electric motors, and solar cells at enormous scale; the prefix ‘giga’ signals the vastness of its output.

The renewable energy potential—including solar, wind, and hydropower—is staggering. In merely a few days, the sun provides more energy to Earth than we can harvest from all fossil fuel reserves combined.

Efficiently harnessing this power remains a challenge. The photovoltaic effect, discovered by Edmond Becquerel in 1839, allows light to generate an electric current, yet the first practical solar panels emerged only in the 1950s, and only in the 2010s did solar technology become cheap enough to rival fossil fuels. Meanwhile, lithium-ion batteries, developed in the 1980s, have provided reliable energy storage.

The Gigafactory has been instrumental in advancing these solar and battery technologies, not through new inventions but by integrating every component of electric vehicle production at scale. The approach channels Henry Ford’s assembly line, this time populating the world with Teslas instead of fossil fuel-burning vehicles. “Batteries have made it possible to utilize solar power efficiently, and electric vehicles are now a reality,” says Dave Jones of Ember, a British energy think tank.

The economies of scale introduced by gigafactories have extended their impact beyond electric vehicles. “These batteries will enable a host of innovations: smartphones, laptops, and the capacity to transport energy efficiently at lower costs,” remarks Sarah Hastings-Simon from the University of Calgary, Canada.

Due to recent advancements, the costs associated with these technologies have plummeted. Many experts believe that the electrification of energy systems is now inevitable. In states like California and countries such as Australia, the abundance of solar energy has led grid operators to offer electricity at no cost. Battery technology is rapidly improving, enabling the development of solar-powered planes, ships, and long-haul trucks, effectively breaking our reliance on fossil fuels that have dominated energy systems for centuries.

Topics:

  • Electric Cars
  • Renewable Energy

Source: www.newscientist.com

How Fear Influences Ecosystems: The Groundbreaking Insight of the Century

Explore the Science Behind Eco-Systems

After wolves were reintroduced to Yellowstone National Park in 1995, significant ecological changes were observed, particularly a substantial decline in elk numbers. The decline was attributed not only to predation itself but to the impact of wolves on elk behavior: where wolves were likely to be present, elk dedicated more time to vigilance and less to foraging. Biologist John Laundré called this phenomenon a “landscape of fear” in a pivotal 2001 study.

This concept builds on earlier research that suggested predator fear could influence prey behavior. Until then, it was widely assumed that predators primarily affected prey populations through physical predation alone. Laundre’s observations challenged this notion, indicating a potentially complex relationship between fear and wildlife dynamics.

Recent studies led by Liana Zanette at Western University in Ontario, Canada, have mapped this landscape of fear in detail. Over the past two decades, Zanette and her colleagues have run experiments in British Columbia, playing recorded predator calls near wild songbirds. Birds exposed to predator sounds laid fewer eggs, hatched fewer of them, and raised fewer than half as many surviving offspring as birds exposed to non-predator sounds, indicating that fear alone can rival or even outweigh the effects of direct predation on wildlife populations.

According to Zanette, prey animals often prioritize safety over foraging, avoiding prime feeding areas when they perceive threats, and this fear-based behavior has profound ecological implications. On Canada’s west coast, the absence of natural predators such as bears, cougars, and wolves has allowed raccoons to flourish, scavenging freely along the shoreline.

When Zanette’s team played recordings of barking dogs in coastal areas, raccoons largely avoided the beach, spending their time watching for potential threats instead of foraging. This avoidance has contributed to a dramatic rebound of coastal animal populations where fear of predators is heightened. No similar effects were observed when non-threatening seal sounds were played.

Understanding landscapes of fear is crucial for grasping the profound impact humans have on wildlife. In one study, Zanette’s team used camera traps to observe how wild animals in Kruger National Park, South Africa, responded to various sounds. Strikingly, they found that the fear generated by human presence surpassed that generated by lions, highlighting the extensive influence of human activity on wildlife behavior and ecosystems.

Topics:

Source: www.newscientist.com

Exploring the Epic Saga of Ancient Humanity: The Century’s Best Idea Revealed

In the last 25 years, the field of human evolution has witnessed remarkable growth, marked by a surge of discoveries. Archaeologists have unearthed more fossils, species, and artifacts from diverse locations, from the diminutive “hobbits” of the Indonesian island of Flores to the enigmatic Homo naledi, known solely from a single deep cave system in South Africa. Simultaneously, advanced analytical techniques have enhanced our understanding of these findings, revealing a treasure trove of information about our origins and extinct relatives.

This whirlwind of discoveries has yielded two major lessons. First, since 2000 the human fossil record has been pushed much further back in time. Previously, the oldest known hominin fossil was 4.4-million-year-old Ardipithecus ramidus, but discoveries in 2000 and 2001 unearthed even older species: Ardipithecus kadabba, Orrorin tugenensis from about 6 million years ago, and Sahelanthropus tchadensis from about 7 million years ago. A further fossil tentatively assigned to the Orrorin lineage in 2022 appears to be slightly younger than O. tugenensis.

According to Clement Zanoli from the University of Bordeaux, the discovery of these early human fossils represents “one of the great revolutions” in our understanding of evolution.

The second major lesson has enriched the narrative of how our species emerged from earlier hominins. By 2000, genetic evidence established that all non-Africans descend from ancestors who lived in Africa around 60,000 years ago. This revelation indicated that modern humans evolved in Africa and subsequently migrated, replacing other hominid species.

However, by 2010, the sequencing of the first Neanderthal genome opened a new chapter, along with the DNA analysis of several other ancient humans. These studies revealed that our species interbred with Neanderthals, Denisovans, and possibly other groups, creating a complex tapestry of human ancestry.

Skeletal research has long hinted at interbreeding, since many fossils exhibit traits that defy clear species categorization, as noted by Sheela Athreya at Texas A&M University. In 2003, Eric Trinkaus and colleagues described a jawbone excavated from Peștera cu Oase, Romania, as a human-Neanderthal hybrid based on its morphology. Genetic testing in 2015 confirmed that the Oase individual had a Neanderthal ancestor just four to six generations back.

This evidence highlights that our species did not merely expand from Africa; rather, our population absorbed genetic contributions from Neanderthals and Denisovans along the way. Genetically, we are a mosaic, a fusion of countless years of diverse human lineages.

Topics:

Source: www.newscientist.com

End-to-End Encryption: The Ultimate Security Solution of the Century

Everyone has secrets to protect. In today’s digital age, whether safeguarding personal messages, business communications, or confidential state information, end-to-end encryption (E2EE) offers essential security and peace of mind.

E2EE ensures that your communications remain private from internet service providers and the operators of messaging or video conferencing applications. Messages are encrypted on the sender’s device and only decrypted by the recipient, making them unreadable to unauthorized parties while in transit. This prevents access by any entity, including law enforcement or corporate insiders.

Digital encryption is rooted in robust mathematics rather than mere assurances. The RSA algorithm, introduced in 1977, pioneered modern encryption by relying on the complexity of factoring large numbers into their prime components. Since then, various algorithms have emerged, utilizing intricate mathematics to enhance cryptographic security.
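The factoring idea behind RSA can be sketched in a few lines of Python. This is a toy illustration, not anything from the article: the primes here are tiny textbook values, so the modulus could be factored instantly, whereas real RSA uses primes hundreds of digits long, which is precisely what makes recovering the private key infeasible.

```python
# Toy RSA sketch: security rests on the difficulty of factoring n = p * q.
# The numbers are deliberately tiny and illustrative -- never use them for real.

def egcd(a, b):
    # Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y = g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keys(p, q, e=17):
    n = p * q                        # public modulus
    phi = (p - 1) * (q - 1)          # Euler's totient -- secret, needs p and q
    _, d, _ = egcd(e, phi)
    d %= phi                         # private exponent: d*e = 1 (mod phi)
    return (n, e), d

public, private = make_keys(61, 53)  # toy primes (n = 3233)
n, e = public

message = 42
ciphertext = pow(message, e, n)          # encrypt with the public key
recovered = pow(ciphertext, private, n)  # decrypt with the private key
print(recovered)  # 42
```

Anyone can encrypt with the public pair (n, e), but decrypting requires d, which can only be computed from the totient, and that in turn requires factoring n back into p and q.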

The true strength of E2EE lies not just in its technical implementation, but in how it upholds democracy and human rights across the globe. As Matthew Feeney from the UK privacy group Big Brother Watch states, “There are individuals in perilous regions depending on encryption to preserve their lives.” Additionally, even in recognized democracies, freedom is vulnerable. Feeney warns that those who claim “I have nothing to hide” should take heed of history’s lessons.

Many governments view E2EE unfavorably because it blocks surveillance, much as sealed envelopes once shielded letters. UK governments have repeatedly sought to undermine E2EE; most recently, the government of Prime Minister Keir Starmer backed down from a controversial demand for a backdoor into Apple’s encrypted services following a public outcry.

Feeney acknowledges the uncertainty surrounding the potential for E2EE to be compromised, as intelligence agencies typically do not disclose their capabilities. Concerns loom regarding the advent of quantum computing, which may soon breach current encryption algorithms. However, cryptography continues to evolve, with emerging mathematical solutions challenging outdated algorithms. “Governments may wield power, but they can’t override the laws of mathematics,” Feeney asserts.

Topics:


Source: www.newscientist.com

Unlocking Epigenetics: The Century’s Most Revolutionary Concept

As we entered the new millennium, the number of genes in the human genome was hotly debated. Initial counts came in significantly lower than anticipated, spurring a re-evaluation of evolutionary processes.

The Human Genome Project revealed in 2001 that we possess fewer than 40,000 protein-coding genes — a number that has since been adjusted to around 20,000. This finding necessitated the exploration of alternative mechanisms to account for the complexity of our biology and evolution; epigenetics now stands at the forefront.

Epigenetics encompasses the various ways that molecules can interact with DNA or RNA, ultimately influencing gene activity without altering the genetic code itself. For instance, two identical cells can exhibit vastly different characteristics based purely on their epigenetic markers.

Through epigenetics, we can extract even greater complexity from our genome, factoring in influences from the environment. Some biologists are convinced that epigenetics can play a significant role in evolutionary processes.

A notable study in 2019 demonstrated how yeast exposed to toxic substances survived by silencing specific genes through epigenetic mechanisms. Over generations, certain yeast cultures developed genetic mutations that amplified gene silencing, indicating that evolutionary changes began with epigenetic modifications.

Epigenetics is crucial for expanding our understanding of evolutionary theory. Nevertheless, skepticism persists regarding its broader implications, particularly in relation to plants and other organisms.

For instance, Adrian Bird, a geneticist at the University of Edinburgh, is skeptical, arguing in a recent paper that there is no clear evidence that environmental stresses such as drought leave heritable marks on mammalian genomes. Even where epigenetic markers can be inherited, many are erased early in mammalian development.

Some researchers dispute these concerns. “Epigenetic inheritance is observed in both plants and animals,” asserts Kevin Lala, an evolutionary biologist at the University of St Andrews. In a comprehensive study published recently, Lala and colleagues compiled a wealth of research indicating that epigenetics could play a role across the entire tree of life.

So why is the scientific community divided? Timing may be a factor. “Epigenetic inheritance is an evolving area of study,” observes Lala. While epigenetics has been recognized for decades, its relevance to evolutionary research has only gained traction in the past 25 years, making the field difficult to assess.

Topic:

Source: www.newscientist.com

Transformer Architecture: The Revolutionary AI Innovation Redefining the 21st Century

Discover Today’s Most Powerful AI Tools

Explore the incredible capabilities of modern AI tools that can summarize documents, generate artwork, write poetry, and even predict protein folding. At the heart of these advancements is the groundbreaking transformer architecture, which revolutionized the field of artificial intelligence.

Unveiled in 2017 at a modest conference center in California, the transformer architecture enables machines to process information in a way that closely resembles human thinking patterns. Historically, AI models relied on recurrent neural networks, which read text sequentially from left to right while retaining only the most recent context. This method sufficed for short phrases, but when dealing with longer and more complex sentences, critical details often slipped through the cracks, leading to confusion and ambiguity.

The introduction of transformers to the AI landscape marked a significant shift, embracing the concept of self-attention. This approach mirrors the way humans naturally read and interpret text. Instead of strictly scanning word by word, we skim, revisit, and draw connections based on context. This cognitive flexibility has long been the goal in natural language processing, aiming to teach machines not just to process language, but to understand it.

Transformers emulate this mental leap effectively: their self-attention mechanism evaluates every word in a sentence in relation to every other word simultaneously, identifying patterns and constructing meaningful connections. As AI researcher Sasha Luccioni notes, “You can take all the data you get from the Internet and Wikipedia and use it for your own tasks. And it was very powerful.”
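The all-pairs comparison at the heart of self-attention can be sketched in a few lines of NumPy. This is a minimal illustration of the general mechanism with made-up toy dimensions, not code from any particular model: each token’s query is scored against every token’s key at once, the scores are normalized into weights, and the weights mix the value vectors into a context-aware representation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project each token embedding into query, key, and value vectors.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Every query scored against every key simultaneously (scaled dot product).
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Each row becomes a probability distribution over all tokens.
    weights = softmax(scores, axis=-1)
    # Mix the value vectors according to those attention weights.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                    # 5 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8): one context-mixed vector per token
```

Because the score matrix covers all token pairs at once, no context is lost to distance, which is exactly the limitation of the left-to-right recurrent models described above.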

Moreover, this transformative flexibility extends beyond text. Today’s transformers drive tools that can generate music, render images, and even model molecules. A prime example is AlphaFold, which treats proteins—long chains of amino acids—analogously to sentences. The function of a protein hinges on its folding pattern and the spatial relationships among its constituent parts. The attention mechanism allows this model to assess these distant associations with remarkable precision.

In retrospect, the insight behind transformers seems almost intuitive. Both human and artificial intelligence rely on discerning when and what to focus on. Transformers haven’t merely enhanced machines’ language comprehension; they have established a framework for navigating any structured data in the same manner that humans navigate the complexities of their environments.

Source: www.newscientist.com

Understanding Neurodiversity: Why ‘Normal’ Brains Don’t Exist – A Revolutionary Perspective for the Century

Historically, science operated under the notion of a “normal brain,” one that fits standard societal expectations. Those who diverge from this model have often been labeled with a disorder or mental health condition, treated as if they were somehow flawed. For years, researchers have refined the notion that neurodevelopmental conditions, including autism, ADHD, dyslexia, and movement disorders, should be recognized as distinctive variations representing different neurocognitive frameworks.

In the late 1990s, a paradigm shift occurred. What if these “disorders” were simply natural variations in brain wiring? What if human traits existed on a spectrum rather than a stark boundary between normal and abnormal? Those at either end of the spectrum may face challenges, yet their exceptional brains also offer valuable strengths. Viewed through this lens, diverse brains represent assets, contributing positively to society when properly supported.

The concept of neurodiversity gained momentum, sparking lively debates in online autism advocacy groups. By 2013, the Diagnostic and Statistical Manual of Mental Disorders recognized autism as a spectrum condition, abolishing the Asperger’s syndrome diagnosis and classifying it on a scale from Level 1 to Level 3 based on support needs. This shift solidified the understanding of neurodivergent states within medical literature.

Since the early 2000s, research has shown that individuals with autism often excel in mathematical reasoning and attention to detail. Those with ADHD frequently outperform others in creativity, while individuals with dyslexia are adept at pattern recognition and big-picture thinking. Even those with movement disorders have been noted to develop innovative coping strategies.

These discoveries have led many scientists to argue that neurodivergent states are not mere evolutionary happenstance. Instead, our ancestors likely thrived thanks to pioneers, creative thinkers, and detail-oriented individuals in their midst. A group possessing diverse cognitive strengths could more effectively explore, adapt, and survive. Some researchers now propose that the autism spectrum comprises distinct subtypes with varying clusters of abilities and challenges.

While many researchers advocate for framing neurodivergent characteristics as “superpowers,” some caution against overly positive portrayals. “Excessive optimism, especially without supporting evidence, can undermine the seriousness of these conditions,” says Dr. Jessica Eccles, a psychiatrist and neurodiversity researcher at Brighton and Sussex Medical School. Nevertheless, she emphasizes that “with this vocabulary, we can better understand both the strengths and challenges of neurodiversity, enabling individuals to navigate the world more effectively.”

Topics:

Source: www.newscientist.com

Unlocking Molecule Creation: Why Click Chemistry is the Century’s Most Innovative Concept


Chemistry can often be a complex and slow process, typically involving intricate mixtures in round-bottomed flasks that require meticulous separation afterward. However, in 2001, K. Barry Sharpless and his team introduced a transformative concept known as click chemistry. This innovative approach revolutionizes the field, with a name coined by Sharpless’s wife, Janet Dueser, perfectly encapsulating its essence: a new set of rapid, clean, and reliable reactions.

Though the idea appears straightforward, its elegance lies in its simplicity. Sharpless, along with colleagues Hartmuth C. Kolb and M. G. Finn, described the reactions as “spring-loaded”: applied to a wide range of starting materials, they snap molecules together like Lego blocks, enabling the swift construction of a vast array of novel and useful molecules. Sharpless’s primary target? Pharmaceuticals.

The overarching principle guiding these reactions was to steer clear of forming carbon-carbon bonds, which was the norm among chemists at the time, and instead to create bonds between carbon and what are known as “heteroatoms,” primarily oxygen and nitrogen. The most recognized click reaction involves the fusion of two reactants to create a triazole, a cyclic structure of carbon and nitrogen atoms. This motif proves to be highly effective at binding to large biomolecules such as proteins, making it invaluable in drug development. Sharpless independently published this specific reaction concurrently with chemist Morten Meldal, who researched it at the University of Copenhagen. This reaction has since been instrumental, notably in the production of the anticonvulsant drug Rufinamide.

Chemists like Tom Brown from the University of Oxford describe this reaction as simple, highly specific, and versatile enough to work in almost any solvent. “I would say this was just a great idea,” he asserts.

Years later, chemist Carolyn Bertozzi and her team at Stanford University developed a click-type reaction that operates without toxic catalysts, enabling its application within living cells without risking cellular damage.

For chemist Alison Hulme at the University of Edinburgh, this research was pivotal in elevating click chemistry from a promising idea to a revolutionary advancement. It granted biologists the ability to assemble proteins and other biological components while labeling them with fluorescent tags for investigation. “It’s very straightforward and user-friendly,” Hulme explains. “We bridged small molecule chemistry to biologists without necessitating a chemistry degree.”

For their groundbreaking contributions, Bertozzi, Meldal, and Sharpless were awarded the 2022 Nobel Prize in Chemistry—an outcome that surprised no one.

Topics:

Source: www.newscientist.com

Transforming Transient Astronomy: The Universe’s Biggest Drama Becomes a Cinematic Masterpiece



Imagine looking up at the night sky 1,000 years ago; you would likely see an additional point of light compared to today. Back then, Chinese astronomers referred to these phenomena as “guest stars,” believing they foretold significant changes.

Today, we understand these were likely supernovae—spectacular explosions from dying stars—one of many serendipitous discoveries made by astronomers observing at opportune moments.

In the modern era, the quest for these “transient” events has evolved into a strategic approach, revolutionizing the field of astronomy. We have since identified numerous fleeting events that span from mere nanoseconds to durations longer than a human lifetime.

“Astronomy considers both spatial and temporal scales, yet the latter remains largely unexplored,” states Jason Hessels from the University of Amsterdam.

To capture these ephemeral events, astronomers have begun synchronizing telescopes to work in concert, as in the Palomar Transient Factory project, which ran from 2009 to 2012. When its telescope near San Diego spotted a significant flash, other instruments immediately followed up. “It was orchestrated like a conveyor belt,” Hessels remarked.

More specialized telescopes are emerging that focus on time as well as space. The Zwicky Transient Facility has since taken over from the Palomar project, and the Pan-STARRS survey in Hawaii has amassed 1.6 petabytes of astronomical data, among the largest datasets ever recorded in the field.

These advanced telescopes have generated extensive data that unveil the twinkling and fluctuating events of the cosmos, including gamma-ray bursts, fast radio bursts, gravitational waves, and stars that either explode spontaneously or are ripped apart by black holes.

Transient astronomy is reshaping our perception of the universe. “We’ve progressed from painting to photography, and now to a kind of stop-motion film,” Hessels says. “We’re approaching a complete narrative. Each time my picture of the sky is updated, the cinematic experience expands further.”



Source: www.newscientist.com

Connecting Extreme Weather to Climate Change: The Most Important Insight of Our Time

January 2003: physicist Myles Allen watched the River Thames flood toward his home in Oxford, England, and wondered why meteorologists refused to link the event to climate change.

Later that year, climatologist Peter Stott of the British Met Office found himself in Italy during one of Europe’s most severe heatwaves. Instead of a relaxing holiday, he endured temperatures exceeding 40 degrees Celsius.

Both Allen and Stott were intent on understanding climate change’s role in extreme weather events. Stott utilized existing climate models to simulate two scenarios of the 2003 heatwave: one reflecting the climate of that year and another devoid of human-induced warming.

After running extensive ensembles of simulations, they concluded in their landmark 2004 paper in Nature that human activities had more than doubled the likelihood of a heatwave like that of 2003.
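The two-scenario comparison at the heart of attribution science can be sketched numerically. The toy example below (all numbers hypothetical, not the study’s data) draws many simulated summer temperatures from a “natural” climate and from one with added warming, then compares how often each exceeds a heatwave threshold; the ratio of those probabilities is the risk ratio that attribution studies report.

```python
import random

random.seed(42)

def simulate_summers(mean_temp, n=100_000, spread=1.2):
    """Draw peak summer temperatures (deg C) from a toy Gaussian climate."""
    return [random.gauss(mean_temp, spread) for _ in range(n)]

THRESHOLD = 23.3  # hypothetical heatwave threshold (deg C)

# Two toy ensembles: a "natural" climate, and one warmed by 1 deg C.
natural = simulate_summers(mean_temp=21.0)
actual = simulate_summers(mean_temp=22.0)

# Probability of exceeding the threshold in each ensemble.
p_natural = sum(t > THRESHOLD for t in natural) / len(natural)
p_actual = sum(t > THRESHOLD for t in actual) / len(actual)

risk_ratio = p_actual / p_natural        # how much more likely the event became
far = 1 - p_natural / p_actual           # fraction of attributable risk

print(f"P(event | natural) = {p_natural:.4f}")
print(f"P(event | actual)  = {p_actual:.4f}")
print(f"risk ratio = {risk_ratio:.1f}, FAR = {far:.2f}")
```

A risk ratio above 2 in this framing corresponds to the paper’s “more than doubled the likelihood”; real studies use full climate model ensembles rather than a one-line Gaussian, but the probability comparison is the same shape.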

This groundbreaking work marked the inception of a new climate science field, which began to identify human influences on extreme weather events. Soon analyses emerged for diverse phenomena, from heatwaves to severe droughts and storms.

However, a significant challenge remained—post-event analyses often took months or years to determine the influence of climate change.

To address this, researchers, including Friederike Otto from Imperial College London, founded World Weather Attribution in 2014. This initiative facilitates swift analysis of extreme weather events, quantifying the probable impacts of climate change, with results frequently released within days.

This has dramatically altered reporting on such events globally, enabling news outlets to directly attribute deadly weather phenomena to climate change and emphasizing the real-world consequences of rising emissions.

As Otto stated, “When we began this work a decade ago, scientists and journalists maintained that individual weather events could not be blamed on climate change. That perspective has shifted immensely.”

This advancement also supports climate change litigation, with attribution studies providing evidence in numerous lawsuits against polluters worldwide. In 2022, the United Nations agreed to establish a new international loss and damage fund, paving the way for climate change compensation.

Back in 2003, Allen had asked whether litigation over climate change could be feasible. Thanks to developments in attribution science, the answer is now a definitive “yes.”

Source: www.newscientist.com

New Bone Cancer Treatment Shows Unexpected Reduction in Tumor Pain

Artist’s impression of nanomedicine in action. Credit: Alfred Pasieka/Science Photo Library

Cancer that metastasizes to the bones is both deadly and painful. A new drug shows promise on both fronts by disrupting the interaction between tumors and nerves, an approach that could make cancer treatment far more bearable.

According to William Fan from Harvard University, who was not part of the study, “This highlights a new and exciting paradigm in which a single cancer treatment can simultaneously improve mortality and quality of life.”

Research indicates that 65-80% of people with breast or prostate cancer whose disease spreads ultimately develop bone metastases. As these tumors grow, they irritate nearby pain-sensing nerves.

Standard treatments such as radiation therapy and chemotherapy are commonly used to shrink bone tumors. However, pain may persist because residual cancer cells continue to interact with nerves, and these methods can harm healthy tissue and often require long-term use of painkillers such as opioids, which carry a risk of addiction, notes Xiang at Zhejiang University in China.

In response, Xiang and colleagues have developed a “nanotherapy”: tiny fat capsules loaded with DNA encoding gasdermin B, a protein that kills cells. The therapy targets cancer cells while sparing healthy ones by exploiting the higher levels of reactive oxygen species characteristic of tumor cells. The nanocapsules also contain OPSA, which enhances the body’s inherent anti-cancer immune response.

To evaluate the drug, the researchers injected breast cancer cells into the legs of mice. Once bone tumors formed, the mice received either the full nanotherapy, a simpler version containing OPSA but lacking the gasdermin B gene, or a saline control, administered into the tail every other day over five days.

After two weeks, tumors in the full nanotherapy group were on average 94% smaller than those in the control group, while the simpler form resulted in a 50% reduction. Furthermore, all mice treated with the complete nanotherapy survived, in contrast to merely 60% of those receiving the simpler therapy and 20% in the control group. This treatment effectively killed tumor cells and induced an anti-tumor immune response, Xiang reported.

Interestingly, both forms of the nanotherapy improved mobility in the affected limbs significantly more than the control, particularly in the full nanotherapy group, indicating potential pain relief from bone tumors. Tumor samples revealed a noticeable decrease in the density of nerve cells within the cancerous growths.

The mechanism appears to involve enhancing the cancer cells’ ability to absorb calcium ions, essential for nerve growth and pain signal transmission. “The concept is that cancer cells act like sponges for local calcium, reducing the availability of calcium for sensory neurons,” explains Professor Huang. Further studies are necessary to establish how nanotherapy adjusts calcium uptake in cancer cells, which may expose new avenues for targeting this critical pathway.

Earlier findings have shown that nerves surrounding tumors can promote their growth, suggesting that disrupting tumor-nerve interactions might not only alleviate pain but also inhibit tumor proliferation, although the specific effects remain uncertain, according to Xiang.

These findings bolster the emerging perspective that targeting the nervous system may transform cancer treatment paradigms, states Huang. However, translating these treatments from mice to humans remains challenging due to differences in immune responses. Xiang aspires to initiate human clinical trials within five to ten years.

Source: www.newscientist.com