On January 22, 2026, the NASA/ESA Hubble Space Telescope captured stunning images of interstellar comet 3I/ATLAS, showing it in near-perfect alignment with the Sun-Earth axis. This observation unveiled an unprecedented jet structure and an elongated tail.
This image of interstellar comet 3I/ATLAS was captured by Hubble’s WFC3 instrument on January 22, 2026, at 13:10 UTC. Image credit: NASA / ESA / Hubble / Man-To Hui, Shanghai Observatory.
According to researchers Professor Abraham Loeb from Harvard University and Dr. Mauro Barbieri from the INAF Padua Observatory, interstellar objects like 3I/ATLAS provide a rare opportunity to study materials from distant star systems. They detailed their findings in recent papers published in Research Notes of the American Astronomical Society.
Previous interstellar visitors offered no such chance: 1I/’Oumuamua showed no clear evidence of gas or dust, while 2I/Borisov was only ever observed at angles greater than 16 degrees from the Sun-Earth alignment, precluding this kind of detailed study.
On January 22, 2026, comet 3I/ATLAS was positioned at an astonishingly small angle of just 0.69 degrees relative to the Earth-Sun axis, allowing for an optimal view as our planet passed nearly directly between the Sun and the comet.
This rare alignment caused a significant spike in the comet’s brightness, shaped by the composition and structure of the particles emitted from the 3I/ATLAS jet, Professor Loeb noted in a statement.
This image of interstellar comet 3I/ATLAS was taken by Hubble’s WFC3 instrument on January 22, 2026, at 13:40 UTC. Image credit: NASA / ESA / Hubble / Man-To Hui, Shanghai Observatory.
Astrophysicist Man-To Hui from the Shanghai Observatory utilized Hubble’s capabilities to observe 3I/ATLAS during conditions that may not occur again for decades.
The interstellar object images were gathered using Hubble’s Wide Field Camera 3 (WFC3) instrument, providing valuable data for ongoing research.
Professor Loeb elaborated on the findings, stating, “When the Hubble images from the January 22, 2026, alignment were processed by my collaborator Toni Scarmato, they revealed a system of four jets, including a prominent tail directed toward the Sun and Earth, along with three smaller ‘minijets.’”
“These minijets are spaced evenly apart at 120-degree angles, with one possibly hidden from view due to its unfavorable orientation relative to Earth, rendering it dark.”
Wild blueberry (Vaccinium angustifolium) is a perennial plant native to North America. This berry is rich in polyphenols, particularly flavonoids, which offer significant health benefits. A recent study published in Critical Reviews in Food Science and Nutrition provides a comprehensive review of the evidence and insights shared at an expert symposium regarding wild blueberries and their link to cardiometabolic health.
Vaccinium angustifolium. Image credit: Σ64 / CC BY 3.0.
Known for their high nutrient content, wild blueberries, or lowbush blueberries, are celebrated for their abundance of anthocyanins and other beneficial compounds.
These polyphenols, contributing to the berries’ vibrant blue hue, have been thoroughly researched for their powerful antioxidant properties.
“Wild blueberries have been valued for centuries,” noted University of Maine professor Dorothy Klimis-Zacas, co-lead author of the recent review.
“Traditional wisdom recognizes their significance, and modern research continues to investigate how the unique constituents of wild blueberries contribute to health when part of a balanced diet.”
This review analyzed 12 human clinical trials conducted across four countries and numerous additional studies on the health effects of compounds found in wild blueberries.
The most consistent result from these studies was an improvement in vascular function, indicating better blood vessel responsiveness.
Some trials noted enhanced endothelial function just hours after consuming wild blueberries, while others observed benefits from regular intake over longer periods.
Recent studies have also highlighted the impact of wild blueberries on the gut microbiome.
Thanks to their high fiber and polyphenol content, these berries resist early digestion and are processed by gut bacteria into metabolites that enter the bloodstream.
These metabolites can constitute a significant proportion of bioactive compounds in circulation post-consumption; one study demonstrated that daily intake of freeze-dried wild blueberry powder increased the abundance of beneficial gut bacteria such as Bifidobacterium.
Emerging evidence suggests that consistent blueberry consumption may enhance cognitive abilities, especially thinking speed and memory in older adults, possibly linked to improved circulation and other systemic effects.
For adults at higher cardiometabolic risk, several studies referenced in the review identified meaningful improvements in blood pressure, glycemic control, and lipid profiles, including reductions in total cholesterol, LDL (bad) cholesterol, and triglycerides, following weeks of regular blueberry intake.
“What’s remarkable about wild blueberries is their wealth of polyphenols and nutrients. Their health benefits appear to stem from multiple mechanisms,” explained Sarah A. Johnson, Ph.D., from Florida State University, co-lead author of the review.
“Evidence indicates that these berries may influence various biological pathways related to cardiometabolic health, including vascular function and inflammation, but individual responses may vary.”
“The recent focus on the gut microbiome’s role in health benefits is intriguing and might help researchers understand how to optimize gut health for enhanced wellness.”
_____
Sarah A. Johnson et al. Wild blueberries and cardiometabolic health: A current review of the evidence. Critical Reviews in Food Science and Nutrition, published online January 24, 2026. doi: 10.1080/10408398.2025.2610406
Image Credit: Christopher Michel/Contour RA by Getty Images
Civilizations often define their eras by significant materials. We speak of the Stone Age and the Bronze Age, and currently, we reside in the Silicon Age—marked by the prevalence of computers and mobile devices. What might the next defining era be? Omar Yaghi from the University of California, Berkeley, posits that the innovative material he pioneered in the 1990s has promising potential: Metal-Organic Frameworks (MOFs). His groundbreaking work in this area made him a co-recipient of the 2025 Nobel Prize in Chemistry.
MOFs, along with their counterparts, covalent organic frameworks (COFs), are crystalline in structure and notable for their exceptional porosity. In 1999, Yaghi and his team achieved a milestone by synthesizing a zinc-based structure known as MOF-5. This material is riddled with pores, giving just a few grams of it an internal surface area equivalent to that of a football field (refer to the image below). The structure offers vastly more space inside than outside.
Over the years, Yaghi has been a pioneer in the development of new MOFs and COFs, a field called reticular chemistry. Understanding how these materials can be utilized is a focal point of his research. Their porous nature allows them to take up other molecules, making them invaluable for applications such as moisture extraction from arid desert air and atmospheric carbon dioxide capture. In an interview with New Scientist, Yaghi expressed optimism about this research, discussing the past, present, and future of reticular chemistry and the impending era of these materials.
Karmela Padavic-Callaghan: What inspired your interest in reticular chemistry?
Omar Yaghi: Initially, when we began our work with MOFs, we had no notion that we were addressing societal issues; it was purely an intellectual pursuit. We aimed to construct materials molecule by molecule, akin to building a structure out of Lego. It was a formidable challenge in chemistry. Many doubted its feasibility and considered our efforts futile.
What made the design of materials seem unfeasible?
The primary hurdle in rationalizing material construction lies in the nature of component mixing, which typically results in disordered, complex arrangements. This aligns with physical laws, as nature tends to favor high entropy or disorder. Therefore, our goal was to engineer a crystal—an ordered entity with a recurring pattern.
It’s akin to instructing your children to form a perfect circle in their room—it demands significant effort. Even upon achieving that circle, if you release your hold, it may take too long to re-establish it. We were essentially attempting to crystallize materials in a day—what nature takes billions of years to accomplish. Nonetheless, I believed that with the right knowledge, anything could be crystallized.
In 1999, your intuition was validated with the publication of the MOF-5 synthesis. Did you foresee its potential utility?
We identified a solvent that allowed us to synthesize stable MOFs and to understand the mechanism of their formation. This critical insight lets us minimize disorder and effectively tune the outcome. Thousands of researchers have since adopted this method.
Initially, I was just elated to create beautiful crystals. Observing their remarkable properties prompted thoughts of potential applications, particularly in trapping gases. Given their internal compartments, these substances can accommodate water, carbon dioxide, or other molecules.
What’s your perspective on creating these materials today?
I usually avoid elaborate cooking and prefer simple, healthy ingredients. This mindset parallels my approach to chemistry: striving for simplicity while utilizing only necessary chemicals. The first step involves selecting the backbone of the material; the second, defining pore sizes; the third, performing chemistry on the backbone to incorporate groups that trap molecules. This process, while appearing simple, is intricately complex.
What pioneering technologies does this process enable?
By mastering molecular-level design, we foresee far-reaching technological transformations. My vision, along with that of my company Atoco, founded in 2020, is to progress from molecules to practical societal applications: supplying materials where none exist for a given task, or replacing poorly performing ones with rationally designed alternatives. Our advances in material synthesis will raise societal standards.
Recently, we unveiled COF-999, the most efficient material for capturing carbon dioxide. In extensive capture tests here in Berkeley, we demonstrated its efficacy in collecting CO2 from the atmosphere for over 100 cycles. Atoco aims to implement reticular materials like COF-999 in carbon capture modules suitable for both industrial settings and residential buildings.
Additionally, we’ve devised a novel material capable of extracting thousands of liters of water daily from the atmosphere. This technology relies on our device which can pull moisture even in humidities below 20%, such as in desert locations like Nevada. I foresee that within the next decade, water harvesting will emerge as an everyday technology.
MOFs exhibit a crystalline structure filled with numerous small internal pores.
Image Credit: Eyes of Science/Science Photo Library
How do MOFs and COFs compare with other water and CO2 capture technologies?
We maintain a significant degree of control over the chemistry involved, allowing for sustainable device manufacturing. These devices are long-lasting, and when the MOF component eventually degrades, it can dissolve in water, thus preventing environmental contamination. Consequently, as MOFs scale to multi-ton applications, we should not anticipate a “MOF waste issue.”
For instance, we’ve developed a method to harness ambient sunlight for water release from harvesting devices, thereby enhancing energy efficiency. Similarly, carbon capture technologies can utilize waste heat from industrial processes, rendering them more economical and sustainable compared to competing systems.
However, challenges in scalability and precise molecular release control persist. While producing MOFs in large quantities is feasible, COFs production has not reached such scales yet. I am optimistic that improvements will come swiftly. Optimizing water retention is essential; we must strike the right balance between excessive and insufficient retention.
We are now leveraging artificial intelligence to streamline MOF and COF optimization, making the design process more efficient. Generally, while generating a basic MOF or COF is straightforward, achieving one with finely tuned properties can be time-consuming, often taking a year. The integration of AI could significantly accelerate this timeline; our lab has already doubled the speed of MOF creation by employing large language models.
What promising applications of reticular chemistry should capture public interest?
Reticular chemistry is a thriving field, with millions of new MOFs yet to be synthesized. One intriguing concept involves utilizing MOFs to replicate the catalytic functions of enzymes, enhancing the efficiency of chemical reactions important in drug development and other fields. Some MOFs have demonstrated capabilities comparable to enzymes but with improved longevity and performance, making them ripe for medical and therapeutic applications over the next decade.
An exciting future application lies in “multivariate materials.” This research, largely conducted in my lab, aspires to create MOFs with varied internal environments. By employing different modules paired with varying compounds, we can develop materials that selectively and efficiently absorb gases. This approach encourages chemists to expand their thinking beyond creating uniform structures toward designing heterogeneous frameworks that incorporate diverse elements.
What gives you confidence in the future of MOF and COF innovations?
We’ve merely scratched the surface, with no shortage of concepts for exploration. Since the 1990s, this field has flourished, and while interest in many areas declines over time, that hasn’t occurred here. An exponential rise in patents related to MOFs and COFs reflects ongoing curiosity and the pursuit of novel applications. I appreciate how this research links organic and inorganic chemistry, as well as engineering and AI, evolving beyond traditional chemistry into true scientific frontiers.
I genuinely believe we are at the cusp of a revolution. While it may not always feel that way, something extraordinary is transpiring. We can now design materials in unprecedented ways, connecting them to innovative applications that were once unimaginable.
In 1998, a study falsely claimed a connection between the measles-mumps-rubella (MMR) vaccine and autism. I was astounded by the study’s poor quality, its acceptance by a prestigious journal, and the lack of critical reporting by journalists. At the time, I was unaware that the research was fraudulent.
Nearly three decades later, the repercussions of those misleading claims still echo globally. The World Health Organization (WHO) reports that six countries, including the UK (for the second time), Spain, and Austria, have lost their measles-free status. Declining vaccination rates are behind this, fueled in large part by an anti-vaccination movement that grew out of that erroneous paper. Meanwhile, the United States faces its worst outbreak in decades and would also have lost its measles-free status had it not withdrawn from the WHO.
Measles is one of the most contagious viruses on the planet, causing severe complications in around 1 in 5 children. Complications can include lasting brain damage, respiratory problems, hearing loss, blindness, and brain swelling. The WHO estimates that approximately 95,000 people died of measles in 2024.
The actual impact extends further, as measles also destroys immune cells that help protect against other infections, diminishing immunity for around five years. It is a risk not worth taking.
Fortunately, measles has specific vulnerabilities. The virus first targets immune cells, travels to lymph nodes, and then disseminates throughout the body. This complex pathway enhances the immune system’s ability to combat the virus before it fully establishes an infection, unlike respiratory viruses that primarily attack cells in the nose and throat.
This is why the measles component of the MMR vaccine is highly effective. Countless studies confirm that vaccinated children fare significantly better, with no established link to autism. One compelling observation is that when the MMR vaccine was withdrawn in Japan, autism rates did not fall.
To maintain herd immunity, at least 95% of children must be vaccinated so that, on average, each infected individual transmits the virus to fewer than one other person. This means that even a small percentage of unvaccinated children can precipitate another measles outbreak.
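As a back-of-the-envelope illustration of where a figure like 95 per cent comes from, the standard herd-immunity threshold follows from the basic reproduction number R0; the range of roughly 12 to 18 for measles is a commonly cited textbook value, not a number taken from this article.

```latex
% Herd-immunity threshold derived from the basic reproduction number R_0.
% With vaccination coverage p (assuming a fully effective vaccine), the
% effective reproduction number is R_eff = R_0 (1 - p); an outbreak dies
% out when R_eff < 1, i.e. when coverage exceeds the threshold p_c.
\[
  R_{\mathrm{eff}} = R_0\,(1 - p) < 1
  \quad\Longrightarrow\quad
  p > p_c = 1 - \frac{1}{R_0}
\]
% For measles, commonly cited values of R_0 \approx 12\text{--}18 give
\[
  p_c \approx 1 - \tfrac{1}{12} \approx 92\%
  \qquad\text{to}\qquad
  p_c \approx 1 - \tfrac{1}{18} \approx 94\%
\]
% consistent with the roughly 95 per cent coverage target quoted above,
% once imperfect vaccine effectiveness is taken into account.
```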
Globally, vaccination rates are improving, but there is still room for growth. The percentage of children receiving the first dose of the measles vaccine increased from 71% in 2000 to 84% in 2010. Despite a slight decline during the COVID-19 pandemic, the rates have rebounded. The WHO estimates that between 2000 and 2024, measles vaccination has prevented an impressive 60 million deaths worldwide, marking a significant victory.
However, in high-income nations, progress is stalling. After the erroneous claims of 1998, MMR vaccination levels fell to only 80% in England and Wales. By 2013, uptake exceeded 90%, but it has been gradually decreasing since then. A recent report indicated that this decline in the UK is partly because access to vaccinations is becoming increasingly difficult for parents, a concern that warrants urgent attention.
Additionally, the resurgence of anti-vaccine sentiments is contributing to these challenges, closely linked to right-wing extremism as propagated on specific social media platforms. A quick search for “MMR measles” on Bluesky yielded no anti-vaccine posts in the top results, while the search on X surfaced a plethora of misleading anti-vaccine rhetoric.
Combatting this misinformation is a considerable challenge, especially when high-profile individuals on social media platforms align with disinformation, such as a certain billionaire collaborating with a known liar leading the world’s wealthiest nation and appointing an anti-vaxxer as health secretary.
What’s evident is that this crisis extends beyond vaccines; the same applies in areas like climate science, where misinformation clouds the truth. Governments throughout Europe and beyond must take decisive action to regulate the infosphere, promote scientific integrity, and silence charlatans. The future of humanity is at stake.
Following a heart attack, the brain processes signals sent directly from sensory neurons in the heart, revealing a feedback loop that involves not only the brain but also the immune system and that shapes how well the heart recovers.
According to Vineet Augustine from the University of California, San Diego, “The body and brain are interconnected; there is significant communication among organ systems, the nervous system, and the immune system.”
Building on previous research demonstrating that the heart and brain communicate through blood pressure and cardiac sensory neurons, Augustine and his team sought to explore the role of nerves in the heart attack response. They utilized a groundbreaking technique to make mouse hearts transparent, enabling them to observe nerve activity during induced heart attacks by cutting off blood flow.
The study revealed novel clusters of sensory neurons that extend from the vagus nerve and tightly encompass the ventricles, particularly in areas damaged by lack of blood flow. Interestingly, while few nerve fibers existed prior to the heart attack, their numbers surged significantly post-incident, suggesting that the heart stimulates the growth of these neurons during recovery.
In a key experiment, Augustine’s team selectively turned off these nerves, which halted signaling to the brain, resulting in significantly smaller damaged areas in the heart. “The recovery is truly remarkable,” Augustine noted.
Patients recovering from a heart attack often require surgical interventions to restore vital blood flow and minimize further tissue damage. However, the discovery of these new neurons could pave the way for future medications, particularly in scenarios where immediate surgery is impractical.
Furthermore, the signals from these neurons activated brain regions associated with the stress response, triggering the immune system to direct its cells to the heart. While these immune cells help form scar tissue necessary for repairing damaged muscle, excessive scarring can compromise heart function and lead to heart failure. Augustine and colleagues identified alternative methods to facilitate healing in mice post-heart attack by effectively blocking this immune response early on.
Research over recent decades has indicated that communication occurs between the heart, brain, and immune system during a heart attack. The difference now is that researchers possess advanced tools to analyze changes at the level of individual neurons. Matthew Kay from George Washington University noted, “This presents an intriguing opportunity for developing new treatments for heart attack patients, potentially including gene therapy.”
Current medical practices frequently include beta-blockers to assist in the healing process following heart attack-induced tissue damage. These findings clarify the mechanism by which beta-blockers influence the feedback loops within nervous and immune systems activated during heart attacks.
As Robin Choudhury from the University of Oxford remarked, “We might have already intervened with the newly discovered routes.” Nevertheless, he cautioned that this pathway likely interacts with various other immune signals and cells that are not yet fully understood.
Moreover, factors like genetics, sex differences, and conditions such as diabetes or hypertension could affect how this newly identified response unfolds. Hence, determining whether and when the pathway is active in a wider population remains essential before crafting targeted drugs, Choudhury added.
Vast areas of the Amazon rainforest are cleared for cattle ranching
Michael Dantas/AFP via Getty Images
The alarming rate of deforestation is significantly reducing rainfall across the Amazon, indicating that this vital rainforest could hit a catastrophic tipping point sooner than previously anticipated.
Research from 1980 to 2019 indicates that rainfall in the southern Amazon basin has diminished by 8 to 11 percent, based on satellite data and rain gauge readings. During this same time frame, tree cover in the region has shrunk by 16 percent, primarily due to deforestation linked to beef cattle ranching.
By contrast, deforestation has been less pronounced in the northern Amazon basin, where precipitation has shown only minor increases that lack statistical significance.
Previous research showed that deforestation dries out conditions within a radius of around 300 kilometers. The new analysis reveals that the effect spans a basin more than 3,000 kilometers wide, meaning deforestation harms not just forests but also the productivity of neighboring ranches and soybean farms, according to Dominick Spracklen from the University of Leeds.
“Some in agribusiness may perceive sections of the forest as underutilized land. Yet, these forests play a crucial role in maintaining regional rainfall, which in turn benefits our agricultural practices,” Spracklen explains.
Global warming is exacerbating the drying of the Amazon, culminating in extreme droughts and unprecedented wildfires in 2024. However, atmospheric studies led by Spracklen and colleagues indicate that deforestation is responsible for 52 to 75 percent of the decline in rainfall.
Moisture from the Atlantic Ocean is transported by prevailing winds into the Amazon, where it precipitates as rain. Plants contribute to this cycle as evaporation and transpiration return about three-quarters of that water to the atmosphere. Further downwind, it falls again as rain through multiple cycles, creating “flying rivers” that distribute moisture across the rainforest.
When forested areas are destroyed, over half of the rainwater is redirected to rivers and subsequently returns to the ocean, depleting the moisture available for the flying rivers and leading to reduced rainfall. Additionally, this diminishes atmospheric instability necessary for storm cloud formation, Spracklen and his team discovered.
With fewer trees to slow it down, the wind also speeds up, carrying more moisture out of the region.
Unlike previous research, this study employs a combination of data and modeling to effectively illustrate how deforestation impacts rainfall patterns, asserts Yadvinder Malhi from Oxford University.
“The atmosphere becomes smoother and, in a sense, slipperier. There’s reduced friction with the ground, enabling moisture to travel further out of forested regions,” Malhi notes, emphasizing the significance of secondary atmospheric processes often overlooked in prior studies.
Scientists voice concerns that the cumulative impact of heightened temperatures, drought, and deforestation could push the Amazon rainforest to a tipping point where it transitions into a savannah ecosystem, although the timeline for this transition remains uncertain. Spracklen and his colleagues found that climate models may underestimate the influence of deforestation on rainfall by as much as 50 percent, implying that the rainforest could face significant threats earlier than anticipated.
According to a 2022 study, there is a 37% probability that parts of the Amazon could be lost by 2100 if global warming, currently around 1.4°C above pre-industrial levels, reaches 1.5°C. This does not necessarily mean that rainforest will convert into savannah; it may instead give way to species-poor scrub forest that stores less carbon.
“The Amazon’s sensitivity is greater than we previously imagined, which is troubling,” he states. “We may be closer to the deforestation threshold than we realize, although there remains significant uncertainty surrounding this issue.”
A recent discovery in Greece has unveiled some of the oldest known hand-held wooden tools, dating back approximately 430,000 years and used by early human ancestors.
One tool, crafted from an alder trunk, likely served a digging purpose, while the other, made from either willow or poplar, may have been employed for shaping stone, according to a study published in the Proceedings of the National Academy of Sciences.
“The rarity of preserving wood over such a long period makes this discovery particularly fascinating,” stated Annemieke Milks, the lead author of the study, in a phone interview with NBC News.
Milks, affiliated with the University of Reading in the UK and an authority on early wooden tools, emphasizes that while stone tools survive readily in the archaeological record, wooden artifacts rarely do, so finding them enhances our understanding of human evolution.
The evidence suggests that early human ancestors utilized wood for tool-making, marking a significant development in our knowledge of their capabilities.
These ancient tools were unearthed at the Marathousa site in the Megalopolis Basin in Greece, located about 160 miles southwest of Athens.
Researchers have identified that this site—once a lakeshore—was pivotal for early human activities, including the fabrication and use of stone and bone tools, as well as hunting large animals like elephants.
Milks described one of the smaller tools as “unprecedented,” noting that its precise function remains unclear. “We were fortunate to uncover such a unique artifact,” she remarked.
Distinct markings on the wood signify that these artifacts were intentionally crafted by humans, rather than being natural sticks, according to Milks.
Innovative methods for analyzing ancient wooden tools have surged over the last decade, yielding new insights into our past, Milks added.
Since direct dating of organic materials like wood can only trace back 50,000 years, researchers relied on dating surrounding sediments and rocks to affirm the tools’ age of 430,000 years.
Milks explained that the preservation of these wooden tools was likely facilitated by their rapid burial in moist sediments, protecting them from microorganisms that would typically lead to decay.
Co-author Katerina Harvati noted that the extraordinary conditions at the excavation site facilitated the preservation of not just wood, but also delicate organic materials like seeds and leaves.
Harvati, a paleoanthropologist at the University of Tübingen in Germany, emphasized the discovery’s significance, showcasing Greece’s essential role in human evolutionary studies.
“This finding expands our understanding of early human technology and highlights previously unknown types of tools, enriching our knowledge in this domain,” Harvati stated.
Maeve McHugh, an associate professor of classical archaeology at the University of Birmingham, called the discovery an essential “snapshot” of early human activity and a glimpse into cognitive development during that era.
“The survival of this wooden artifact, particularly from such an early period in human history, is remarkable and of great significance,” McHugh concluded.
For decades, discussions surrounding coastal risk have focused primarily on climate change and sea level rise. However, a significant new global study reveals an even more urgent threat: land subsidence, affecting hundreds of millions of people living in delta regions, including urban hubs like New Orleans and Bangkok.
In various locations around the world, land is sinking at rates that often surpass the rising sea levels.
Utilizing satellite radar technology to monitor minute changes in the Earth’s surface, researchers have discovered that over half of the world’s deltas—low-lying areas where major rivers converge with the ocean—are currently sinking. This gradual subsidence, in conjunction with sea level rise, poses the most significant flood risk in many densely populated delta regions on Earth.
“This is truly a declaration of war,” Professor Robert Nicholls, co-author of the study and a coastal scientist at the University of Southampton, told BBC Science Focus. “Until now, no one had taken a global perspective on delta subsidence. This study highlights the breadth of the issue and underscores the urgency of addressing it.”
The survey results can be found in the journal Nature.
Subsidence rates in river deltas, displayed as colored circles. The size of each circle reflects the area of the delta sinking faster than sea level rise, represented as a color gradient across the delta’s basin. Photo credit: Ohenhen et al. (2026)
Global Problems Hidden in Plain Sight
Delta regions comprise only 1% of the Earth’s land area but are home to approximately 350 to 500 million people, including some of the world’s most significant cities and productive agricultural zones. These areas serve as economic powerhouses, environmental hotspots, and essential food sources, yet they are inherently fragile.
Deltas are formed by loose, water-saturated sediments deposited over millennia. In their natural state, these sediments compact under their own weight and gradually sink.
Historically, natural subsidence was balanced by periodic flooding that replenished the land with fresh sediment, but modern interventions have disrupted this equilibrium.
The recent study analyzed satellite measurements across 40 major delta regions from 2014 to 2023, creating the first high-resolution global image detailing land elevation changes.
The findings were alarming: at least 35% of delta land has subsided, and in most deltas more than half of the land surface is sinking.
In 18 of the 40 river deltas examined, land is sinking faster than local sea level rise, revealing hotspots where subsidence dominates over regional and global sea level increases.
A similar pattern is evident across continents—Asia, Africa, Europe, and the Americas—where relative sea levels rise due to both ocean expansion and land subsidence.
“From a risk perspective, it doesn’t matter if sea levels rise or land sinks,” Nicholls explained. “The ultimate effect is the same, but the responses to those threats may differ.”
The Ciliwung Delta in Indonesia is home to Jakarta, inhabited by over 40 million people, and is sinking at an average rate of 5.6 mm annually. Photo credit: Getty
What is Causing the Sinking?
The study identified three primary causes of anthropogenic land subsidence: groundwater extraction, reduced sediment supply, and urban expansion. Among these, groundwater pumping is the most significant predictor.
When groundwater is extracted, the soft surrounding sediments collapse and compact, a process that is nearly irreversible. Once the sediment is compacted, it will not return, even if water levels recover.
In 10 out of the 40 delta regions studied, groundwater depletion was the main factor driving land subsidence. Additionally, reduced river sediment caused by damming and flood defenses, combined with the weight of growing cities built on soft soils, contribute to this crisis.
As a result, what was once a slow geological phenomenon has transformed into an urgent environmental crisis.
US Case: Mississippi Delta
The Mississippi River Delta in Louisiana, home to New Orleans, exemplifies this issue in the United States.
The analysis confirms widespread subsidence across the delta, with over 90% of the region experiencing subsidence at an average rate of 3.3 mm per year. Some localized areas even sink much faster.
While this rate may seem minimal, it accumulates significantly over decades, especially alongside the threats posed by rising sea levels and hurricanes.
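To make that accumulation concrete, a rough conversion of the reported average rate is sketched below; it simply assumes the rate holds steady and leaves out sea level rise and the faster local hotspots.

```latex
% Cumulative subsidence at a constant average rate of 3.3 mm per year.
\[
  3.3\ \mathrm{mm/yr} \times 30\ \mathrm{yr} \approx 10\ \mathrm{cm}
  \qquad\qquad
  3.3\ \mathrm{mm/yr} \times 100\ \mathrm{yr} \approx 33\ \mathrm{cm}
\]
% Relative sea level rise adds on top of this, and localized hotspots
% sinking more than ten times faster lose correspondingly more elevation.
```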
The Mississippi Delta has lost thousands of square kilometers of coastal wetlands over the last century, resulting in catastrophic damage. An area the size of a soccer field is lost to open water every 100 minutes.
The Mississippi Delta experiences an average subsidence of 3.3 mm per year, with some hotspots sinking over 10 times faster. Photo credit: NASA Earth Observatory
The lack of fresh sediment is a critical issue. Levees and dams prevent flooding and the natural deposition of new sediments that help rebuild the land. Additionally, drainage systems, oil and gas extraction, and decades of groundwater pumping exert further stress on fragile soils.
While some delta areas display resilience, one proposed solution is relocating populations away from these vulnerable regions. For instance, New Orleans has seen a steady population decline since the 1960s.
“In the United States, people tend to accept the idea of relocation,” Nicholls noted, emphasizing that societal mobility and favorable land-use policies make this transition more politically feasible than in parts of Europe and Asia, where long-term protective measures are generally favored.
Warning to Major Cities
While North America grapples with these challenges, the most extreme subsidence rates can be found in parts of South and Southeast Asia, where population density is high and dependence on groundwater for agriculture, industry, and drinking water prevails.
The deltas of the Mekong (Vietnam), the Ganges and Brahmaputra (Bangladesh and India), the Chao Phraya (Thailand), and the Yellow River (China) are sinking faster than the current rate of global sea level rise, in some areas by over a centimeter per year.
Mega-cities like Bangkok, Dhaka, Shanghai, and parts of Jakarta are built on these subsiding foundations.
The good news is that, unlike global sea level rise—which unfolds over centuries—human-induced land subsidence can respond swiftly to policy changes. A notable success story is Tokyo.
Due to strict groundwater extraction regulations, Tokyo has significantly reduced subsidence rates. Photo credit: Getty
In the mid-20th century, unchecked groundwater extraction caused parts of Tokyo to sink more than 4 meters. However, rigorous regulations on groundwater use and investments in alternative water sources resulted in a swift decrease in subsidence rates.
“Authorities have enacted legislation to ensure sufficient alternative water supplies and eliminate groundwater extraction,” Nicholls remarked. “And almost overnight, this led to stabilization.”
Additional solutions include managed flooding in agricultural areas to replenish soil sediments. “Sediment is often deemed a pollutant,” Nicholls points out. However, when rivers overflow, they deposit the valuable material that built the delta in the first place, a resource sometimes referred to as “brown gold.”
Urban areas can be fortified with effective engineering solutions such as sea walls, levees, and storm surge barriers. “Addressing subsidence complements efforts to adapt to sea level rise and reduces vulnerabilities,” Nicholls added.
Shifting Attitudes Towards Coastal Risk
The study’s authors emphasize that land subsidence has been dangerously overlooked in global climate risk strategies, largely viewed as a local rather than a global issue.
However, local does not equate to minor. Even under severe climate scenarios, land subsidence is expected to remain the primary driver of relative sea level rise in numerous delta regions for decades to come.
Financial and institutional barriers often hinder large-scale interventions in many areas, but deferring action only exacerbates the costs and challenges of future adaptations.
Once land has subsided, the loss is effectively permanent, leaving communities to face tough decisions about relocation.
As Nicholls succinctly states, “The first crucial step is to acknowledge that a problem exists.”
We Can Usually Agree on Objects’ Appearance, But Why?
Martin Bond / Alamy
Although our world seems inherently ambiguous at the quantum level, this is not the experience we face in daily life. Researchers have now established a methodology to measure the speed at which objective reality emerges from this quantum ambiguity, lending credibility to the notion that an evolutionary framework can elucidate this emergence.
In the quantum domain, each entity, such as a single atom, exists within a spectrum of potential states and only assumes a definitive, “classical” state upon measurement or observation. Yet, we perceive strictly classical objects devoid of existential ambiguities, and the processes enabling this have challenged physicists for years.
Prominent physicist Wojciech Zurek of Los Alamos National Laboratory in New Mexico introduced the concept of “quantum Darwinism,” suggesting that a process akin to natural selection determines which of an object’s many possible states becomes visible: the “fittest” state is the one that most successfully copies itself into the surrounding environment, and those copies are what reach observers. When observers with access to only portions of the environment converge on the same observation, it is because each is seeing one of these identical copies.
Researchers at University College Dublin, led by Steve Campbell, have shown that differing observers can still arrive at a consensus on objective reality, even if their observational methods lack sophistication or precision.
“Observers can capture a fragment and make any measurements they desire. If I capture a different fragment, I too can make arbitrary measurements. The question becomes: how does classical objectivity arise?” he explains.
The research team reframed the emergence of objectivity as a quantum sensing problem. If the objective fact in question is, say, the frequency of light emitted by an object, then an observer must acquire accurate information about that frequency, much as a light sensor would. Under ideal conditions, such a measurement is ultra-precise and quickly yields a definitive value for the frequency. This ideal case is quantified by the quantum Fisher information (QFI), a mathematical quantity that sets the ultimate precision limit and against which less accurate measurement strategies can be benchmarked, explains study co-author Gabriel Landi at the University of Rochester.
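For readers who want the quantity behind that benchmark, the sketch below gives the textbook definition of the quantum Fisher information for a pure state and the Cramér-Rao bound it enters; the notation is generic, not taken from the paper itself.

```latex
% Quantum Fisher information (QFI) for a family of pure states
% |psi_theta> that depend on a parameter theta (e.g. an emission
% frequency):
\[
  F_Q(\theta) \;=\; 4\left(
    \langle \partial_\theta \psi_\theta \,|\, \partial_\theta \psi_\theta \rangle
    \;-\; \bigl|\langle \psi_\theta \,|\, \partial_\theta \psi_\theta \rangle\bigr|^2
  \right)
\]
% The quantum Cramer-Rao bound limits the variance of any unbiased
% estimator of theta built from nu repetitions of the best possible
% measurement:
\[
  \operatorname{Var}(\hat{\theta}) \;\geq\; \frac{1}{\nu\,F_Q(\theta)}
\]
% Comparing the (classical) Fisher information of a given, possibly
% crude, measurement against F_Q shows how close that measurement comes
% to the optimal precision: the kind of benchmarking described above.
```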
Remarkably, their calculations indicate that for sufficiently large fragments of the environment, even observers employing imperfect measurements can ultimately gather enough data to reach the same conclusions about objectivity as those derived from the ideal QFI standard.
“Surprisingly, simplistic measurements can be just as effective as more advanced ones,” Landi states. “This illustrates how classicality emerges: as fragments grow larger, observers tend to agree even when using basic measurements.” Thus, this research contributes further to our understanding of why, when observing the macroscopic world, we concur about its physical attributes, such as the color of a coffee cup.
“This study underscores that we do not require flawless, ideal measurements,” adds Diego Wisniacki from the University of Buenos Aires, Argentina. He notes that while QFI is foundational in quantum information theory, its application to quantum Darwinism has been sparse, presenting pathways to bridge theoretical frameworks with established experimental methodologies, like quantum devices utilizing light-based or superconducting qubits.
“This research serves as a foundational ‘brick’ in our comprehension of quantum Darwinism,” states G. Massimo Palma from the University of Palermo, Italy. “It more closely aligns with the experimental descriptions of laboratory observations.”
Palma elaborates that the simplicity of the model used in this study could facilitate new experimental pursuits; however, complex system calculations will be essential to solidify quantum Darwinism’s foundation. “Advancing beyond rudimentary models would mark a significant progression,” Palma asserts.
Landi said the researchers are eager to turn these theoretical findings into experimental tests. For instance, qubits formed from trapped ions could be used to evaluate how the timescale on which objectivity emerges compares with how long the qubits retain their quantum character.
Palantir, a leading American data analytics firm, wields technology capable of both saving and taking lives. As its influence expands globally, concerns about this enigmatic corporation’s role in world affairs and its ultimate beneficiaries continue to rise.
The Hidden Female Psychopath
Recent studies indicate that the presence of female psychopaths may be more prevalent than once believed. If this is the case, why do they remain unnoticed? Perhaps you suspect someone around you? Here’s how to identify potential traits:
Artificial Intelligence Ethics
There is an urgent need to educate AI on moral principles. However, a paradox emerges: to elicit positive responses from AI, one must examine its behavior when exposed to malicious tasks.
Data Storage in Space
The rapid progression of AI technology is driving an unprecedented demand for electricity globally. Additionally, cooling these data centers requires significant amounts of water. Could the cosmos offer a viable solution for data storage? Many startups believe it is the ideal destination.
Plus Highlights
Boost Your IQ: Ditch the brain training games. Physical activity could truly unlock your brain’s full potential.
Impact of Social Media Bans: Experts are split on how effective Australia’s social media ban is for children.
Q&A Insights: Our experts tackle questions such as “Why do we kiss?” “How contagious is laughter?” “Can tigers get along with their prey?” “What are the similarities between identical twins?” “Is déjà vu unhealthy?” “Should you trim your eyelashes?” “What happens if you fall ill on the ISS?” “How do we best measure earthquakes?” “Can you maintain a happy marriage with a psychopath?” “How fast am I moving now?” and much more…
Don’t forget, BBC Science Focus is also available on all major digital platforms. You can access it on Android, Kindle Fire and Kindle e-readers, as well as via the iOS app for iPad and iPhone.
Meet Holly, a dedicated staff writer at BBC Science Focus, where she expertly manages the engaging Q&A section. With an MSc (Special Award) in Earth Sciences (Space and Climate Physics) from UCL, Holly specializes in Astronomy and Earth Sciences. Before her journey with Our Media, she gained valuable experience as a geo-environmental consultant and engineer, passionately exploring exoplanets in her free time while advising on ground risk and remediation projects in Northern England.
With nearly a decade of experience as a regional editor for a popular theater website, Holly excels in curating and developing digital content. She is also a talented artist and illustrator, regularly contributing to the craft website Gathered. Her impressive portfolio includes collaborations with notable organizations such as RSPB, English Heritage, Disney, Pilot, and Brother, in addition to her work with BBC Good Food Magazine, Home Style Magazine, and Papercraft Inspiration Magazine.
Holly’s interests extend to photography and a fascination with antiques, showcasing her diverse artistic talents and love for culture.
Webb astronomers have unveiled a breathtaking image captured by the NASA/ESA/CSA James Webb Space Telescope, showcasing MACS J1149.5+2223 (MACS J1149), a cosmic collection of hundreds of galaxies situated about 5 billion light-years from Earth in the constellation Leo. The latest images not only highlight the cluster’s brilliant galaxies but also illustrate how their immense gravitational forces uniquely affect the fabric of space-time.
The stunning image of the galaxy cluster MACS J1149.5+2223. Image credits: NASA / ESA / CSA / Webb / C. Willott, National Research Council Canada / R. Tripodi, INAF-Astronomical Observatory of Rome.
The latest Webb image of MACS J1149 dramatically showcases light from background galaxies, which is bent and magnified in a remarkable phenomenon known as gravitational lensing. This creates elongated arcs and distorted shapes that trace the cluster’s immense mass.
“The immense gravity of this galaxy cluster does more than hold together the galaxies drifting within it,” the Webb astronomers explained in a statement.
“As light from galaxies beyond the cluster travels toward our telescope over billions of years, its trajectory through space-time is warped by the gravitational forces of the intervening galaxies.”
This gravitational lensing effect is evident throughout the image of MACS J1149, with galaxies appearing stretched into narrow streaks and others morphing into unusual shapes. A prime example of gravitational lensing can be seen near the image’s center, just below the prominent white galaxy.
In this area, a galaxy with spiral arms has been transformed into a shape resembling a pink jellyfish. This peculiar galaxy once harbored the farthest single star ever identified and a supernova that appeared four times simultaneously.
“This program employs Webb’s advanced instruments to explore the evolution of low-mass galaxies in the early Universe, shedding light on their star formation, dust content, and chemical makeup,” the astronomers stated.
The data collected will also assist researchers in studying the epoch of reionization, when the first stars and galaxies illuminated the universe, mapping mass distributions in galaxy clusters, and understanding how star formation diminishes within cluster environments.
In a groundbreaking study analyzing data from over 268,000 individuals, researchers have identified that genes associated with thiamine (vitamin B1) metabolism significantly influence intestinal motility. This discovery paves the way for personalized treatments targeting conditions like constipation and irritable bowel syndrome (IBS).
Diaz Muñoz et al. identified key mechanisms involved in intestinal motility, including an overlooked role for vitamin B1. Image credit: Hillman et al., doi: 10.1264/jsme2.ME17017 / CC BY 4.0.
Gastrointestinal motility is crucial for food digestion, nutrient absorption, and waste elimination, all critical components of human health and well-being.
The regulation of motility depends on a multifaceted communication network, which encompasses the gut-brain axis, the immune system, gut microbiota, and is affected by external influences such as diet, physical activity, and medications.
Disruptions in motility control and peristalsis can lead to significant health issues, including IBS and chronic idiopathic intestinal pseudoobstruction, highlighting the importance of understanding these conditions.
In this recent study, Professor Mauro D’Amato from LUM University, CIC bioGUNE-BRTA, and Ikerbasque, along with his colleagues, employed a large-scale genetic approach to identify common DNA variations linked to intestinal motility.
The research utilized questionnaires and genetic data from 268,606 individuals of European and East Asian ancestry, applying computational analysis to pinpoint relevant genes and mechanisms.
The team discovered 21 genomic regions that affect defecation frequency, including 10 previously unknown regions, pointing to biologically plausible pathways involved in intestinal motility regulation.
For instance, they found significant correlations with bile acid regulation, which aids fat digestion and serves as signaling molecules in the intestines, along with neural signaling pathways crucial for intestinal muscle contractions (especially acetylcholine-related signaling).
However, the most striking outcome arose when the researchers pinpointed two high-priority genes focused on vitamin B1 biology, specifically those involved in the transport and activation of thiamine: SLC35F3 and XPR1.
To validate the relevance of the vitamin B1 signal, they further examined dietary data from the UK Biobank.
Among 98,449 participants with dietary data, increased thiamine intake correlated with more frequent bowel movements.
Crucially, the relationship between thiamine consumption and bowel frequency exhibited variations based on genetic factors, specifically the combined genetic score of SLC35F3 and XPR1.
This suggests that genetic variations in thiamine metabolism may impact how vitamin B1 intake affects bowel habits in the general population.
“By utilizing genetic data, we’ve created a roadmap for the biological pathways influencing intestinal pace,” said Dr. Cristian Díaz Muñoz from CIC bioGUNE-BRTA.
“The data strongly highlights vitamin B1 metabolism alongside established mechanisms like bile acids and neural signaling.”
This research also confirms a significant biological link between bowel frequency and IBS, a widespread condition affecting millions globally.
“Issues with intestinal motility are at the core of irritable bowel syndrome, constipation, and other common motility disorders, yet the underlying biology remains challenging to decipher,” noted Professor D’Amato.
“These genetic findings point to specific pathways, particularly those involving vitamin B1, as vital areas for further research, including laboratory experiments and meticulously designed clinical trials.”
For more details, refer to the study published in the journal Gut on January 20, 2026.
_____
C. Diaz Muñoz et al. Genetic analysis of defecation frequency suggests a link to vitamin B1 metabolism and other pathways regulating intestinal motility. Gut, published online January 20, 2026. doi: 10.1136/gutjnl-2025-337059
Exploring the Vela Junior supernova remnant, also referred to as RX J0852.0-4622 or G266.2-1.2, scientists have revealed the mysteries surrounding its explosive past. This ancient nebula, once a brilliant supernova, has long perplexed researchers regarding its distance and the magnitude of its explosion. Recently, however, groundbreaking observations linked a newly formed star, Ve 7-27, with the Vela Junior remnant. Using the Multi-Unit Spectroscopic Explorer (MUSE) on ESO’s Very Large Telescope, astronomers have captured unprecedentedly detailed images of Ve 7-27.
VLT/MUSE image of Ve 7-27. Image credit: ESO / Suherli et al.
“This is the first evidence ever connecting a newborn star to the remnants of a supernova,” stated Dr. Samar Safi-Harb, an astrophysicist from the University of Manitoba.
“This discovery resolves a decades-long debate, enabling us to calculate the distance of Vela Junior, its size, and the true power of the explosion.”
By examining the gas emissions from Ve 7-27, Dr. Safi-Harb and her team confirmed that it shares the same chemical signature as material from the Vela Junior supernova.
This correlation established a physical connection between the two celestial bodies, allowing astronomers to accurately determine Vela Junior’s distance.
Both Ve 7-27 and Vela Junior are approximately 4,500 light-years away.
“The gas present in this young star mirrors the chemical composition of stars that exploded in the past,” remarked Dr. Safi-Harb.
“Isn’t it poetic? Those same elements eventually contributed to Earth and now play a role in forming new stars.”
Recent findings indicate that Vela Junior is larger, more energetic, and expanding more quickly than previously thought, marking it as one of the most potent supernova remnants in our galaxy.
“Stars are constructed in layers, much like onions,” Dr. Safi-Harb explained. “When they explode, these layers are propelled into space.”
“Our research indicates that these layers are now becoming visible in the jets of nearby young stars.”
“This study not only solves an enduring astronomical enigma but also sheds light on stellar evolution, the enrichment of galaxies with elements, and how extreme cosmic events continue to shape our universe.”
The research is described in a study published today in The Astrophysical Journal Letters.
Planetary scientists examining oxygen isotopes in lunar soil from the Apollo missions have determined that 4 billion years of meteorite impacts may have contributed only a minimal amount of Earth’s water. This insight prompts a reevaluation of established theories regarding water’s origins on our planet.
Close-up of a relatively young crater to the southeast, captured during Apollo 15’s third moonwalk. Image credit: NASA.
Previous research suggested that meteorites significantly contributed to Earth’s water supply due to their impact during the solar system’s infancy.
In a groundbreaking study, Dr. Tony Gargano from NASA’s Johnson Space Center and the Lunar and Planetary Institute, along with colleagues, employed a novel technique to analyze the lunar surface debris known as regolith.
Findings indicated that even under optimistic conditions, meteorite collisions from approximately 4 billion years ago may have delivered only a small percentage of Earth’s water.
The Moon acts as a historical archive, documenting the tumultuous events that the Earth-Moon system has endured over eons.
While Earth’s dynamic geology and atmosphere erase these records, lunar samples have retained valuable information.
However, this preservation is not without its challenges.
Traditional regolith studies have focused on siderophile (“metal-loving”) elements, whose signatures can be obscured by continuous impacts on the Moon, complicating efforts to reconstruct original meteorite compositions.
Oxygen triple isotopes offer highly precise “fingerprints”: oxygen is the most abundant element in rocks, and its isotopic signature is not easily overprinted by later processing.
These isotopes facilitate a deeper understanding of the meteorite compositions that impacted the Earth-Moon system.
Oxygen isotope analyses revealed that approximately 1% of the regolith’s mass consists of carbon-rich material from meteorites that partially vaporized upon impact.
With this knowledge, researchers calculated the potential water content carried by these meteorites.
“The lunar regolith uniquely allows us to interpret a time-integrated record of impacts in Earth’s vicinity over billions of years,” explained Dr. Gargano.
“By applying oxygen isotope fingerprints, we can extract impactor signals from materials that have undergone melting, evaporation, and reprocessing.”
This significant finding alters our understanding of water sources on both Earth and the Moon.
When adjusted to account for global impacts, the cumulative water indicated in the model equates to only a minor fraction of the Earth’s oceanic water volume.
This discrepancy challenges the theory that water-rich meteorites delivered the bulk of Earth’s water.
“Our results don’t rule out meteorites as a water source,” noted Dr. Justin Simon, a planetary scientist in NASA Johnson’s Astromaterials Research and Exploration Science Division.
“However, the Moon’s long-term record indicates that the slow influx of meteorites cannot significantly account for Earth’s oceans.”
While the implied water contribution from around 4 billion years ago is minimal in the context of Earth’s oceans, it remains notable for the Moon.
The Moon’s available water is concentrated in small, permanently shadowed areas at the poles.
These regions, among the coldest in the solar system, present unique opportunities for scientific research and exploration resources as NASA prepares for crewed missions to the Moon with Artemis III and subsequent missions.
The samples analyzed in this study were collected from near the lunar equator, where all six Apollo missions landed.
Rocks and dust gathered over half a century ago continue to yield valuable insights, albeit from a limited lunar area.
Future samples collected through Artemis are expected to unlock a new wave of discoveries in the years ahead.
“I consider myself part of the next generation of Apollo scientists, trained in the questions and insights enabled by the Apollo missions,” said Dr. Gargano.
“The Moon provides tangible evidence that we can examine in the lab, serving as a benchmark for what we learn from orbital data and telescopes.”
“I eagerly anticipate the information that upcoming Artemis samples will reveal about our place in the solar system.”
The findings of this study will be published in Proceedings of the National Academy of Sciences.
_____
Anthony M. Gargano et al. 2026. Constraints on impactor flux from lunar regolith oxygen isotopes to the Earth-Moon system. PNAS 123 (4): e2531796123; doi: 10.1073/pnas.2531796123
A newly identified genus and species of titanosaur, a colossal sauropod dinosaur from the Cretaceous period, has been described from fossils unearthed in northern Patagonia, Argentina.
Life reconstruction of Yenen Hassai. Image credit: Gabriel Rio.
Named Yenen Hassai, this new species roamed Earth approximately 83 million years ago during the Late Cretaceous period.
This ancient creature belongs to the Titanosauridae, a fascinating group of large, long-necked herbivorous dinosaurs that thrived on the Gondwana supercontinent.
“The head of Yenen Hassai was proportionately smaller compared to its massive body,” explained Dr. Leonardo Filippi, a paleontologist from CONICET and the Urquiza Municipal Museum in Argentina.
“This titanosaur measured between 10 to 12 meters (33 to 39 feet) in length and weighed approximately 8 to 10 tons.”
The fossil remains of Yenen Hassai were excavated from the Bajo de la Carpa Formation at a site known as Cerro Overo-La Invernada in Neuquén, Patagonia, Argentina.
This material showcases one of the most complete titanosaur skeletons found in the region, preserving six cervical vertebrae, ten dorsal vertebrae with associated ribs, the sacrum, and the first caudal vertebra.
Alongside the holotype, researchers identified remains of at least two additional sauropods at the site, including a juvenile specimen and another adult titanosaur, which may belong to an unclassified species.
“Through phylogenetic analysis, Yenen Hassai is found to be closely related to Narambuenatitan and Overosaurus, as a basal member of an unnamed clade of derived non-lithostrotian saltasauroids,” they noted.
“Evidence from the titanosaur fauna at Cerro Obero la Invernada indicates that species diversity was relatively high during the Santonian period, suggesting that at least two lineages, colossosaurs and saltasauroids, coexisted.”
“This discovery positions the Cerro Obero-La Invernada region as the area with the highest diversity of titanosaurs during the Santonian of the Neuquén Basin, offering crucial insights into the evolution of dinosaur fauna in this era.”
This significant finding is detailed in a paper published in the journal Historical Biology on January 12, 2026.
_____
L.S. Filippi et al. Yenen Hassai: An Overview of Sauropod Titanosaurs Diversity from the Cerro Overo-La Invernada Region (Bajo de la Carpa Formation of the Santonian), Northern Patagonia, Argentina. Historical Biology, published online January 12, 2026; doi: 10.1080/08912963.2025.2584707
Caves are often dark, damp, and remote. While they lack the nutrients and energy sources that sustain life in other ecosystems, they still host a diverse array of bacteria and archaea. But how do these microorganisms acquire enough energy to thrive? A team of researchers from Australia and Europe investigated this intriguing question by examining Australian caves.
Previous studies have shown that microorganisms in nutrient-poor soils can harness energy from the atmosphere through trace gases such as hydrogen, carbon monoxide, and methane, which are present only in minute quantities. Microbes possess specific enzymes, including hydrogenases, dehydrogenases, and monooxygenases, that accept electrons from these gas molecules, enabling the microbes to use them as energy sources to fuel their metabolic processes.
The Australian research team hypothesized that cave-dwelling microbes may be using trace gases for survival. To test this, they studied four ventilated caves in southeastern Australia. The researchers collected sediment samples at four points along a horizontal line that extended from the cave entrance to 25 meters (approximately 80 feet) deep inside the cave, resulting in a total of 94 sediment samples.
The team treated the sediment samples with specific chemicals to extract microbial DNA, using it to identify both the abundance and diversity of microorganisms present. They found multiple groups of microorganisms throughout the cave, including Actinobacteria, Proteobacteria, Acidobacteria, Chloroflexota, and Thermoproteota. Notably, the density and diversity of microbes were significantly higher near the cave entrance, with three times more microorganisms in those regions compared to further inside.
The team used gene sequencing to search the microbial DNA for genes linked to trace gas consumption. The results revealed that 54% of cave microorganisms carried genes coding for proteins, such as hydrogenases, dehydrogenases, and monooxygenases, involved in utilizing trace gases.
To assess the generality of their findings, the researchers searched existing data on microbial populations from 12 other ventilated caves worldwide. They discovered that genes for trace gas consumption were similarly prevalent among other cave microorganisms, concluding that trace gases might significantly support microbial life and activity in caves.
Next, the researchers measured gas concentrations within the caves. They deployed static flux chambers to collect atmospheric gas samples at four points along the sampling line, capturing 25 milliliters (about 1 fluid ounce) of gas each time. Using a gas chromatograph, they found that the concentrations of hydrogen, carbon monoxide, and methane were approximately four times higher near the cave entrance than in deeper areas, suggesting that microorganisms deeper inside are metabolizing these trace gases for energy.
To validate their findings further, they set up a static flux chamber in the lab, incubating cave sediment with hydrogen, carbon monoxide, and methane at natural concentrations, and confirmed that the microbes consumed the trace gases under controlled conditions.
Finally, the researchers explored how these cave microbes obtain organic carbon. They conducted carbon isotope analysis, focusing on carbon-12 and carbon-13 ratios, which vary depending on microbial metabolic processes. Using an isotope ratio mass spectrometer, they determined that the cave bacteria contained a lower proportion of carbon-13, consistent with a reliance on trace gases to generate carbon within the cave ecosystem.
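For readers unfamiliar with isotope notation, such carbon-13 depletion is usually reported as a delta-13C value relative to the VPDB reference standard. Below is a minimal sketch of that standard calculation; the sample ratio is a made-up placeholder, not a value from the study.

```python
# Standard definition of delta-13C; the sample ratio below is hypothetical.

R_VPDB = 0.011180  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample: float, r_standard: float = R_VPDB) -> float:
    """Return the delta-13C value in per mil (parts per thousand)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# More negative values indicate a smaller share of carbon-13, as reported
# for the cave sediments.
print(delta13C(0.01075))  # roughly -38 per mil for this invented ratio
```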
The researchers concluded that atmospheric trace gases serve as a crucial energy source for microbial communities in caves, fostering a diverse array of microorganisms. They recommended that future studies examine how climatic changes, such as fluctuations in temperature and precipitation, might influence the use of atmospheric trace gases by cave-dwelling microorganisms.
“The pain was like being struck by lightning and being hit by a freight train at the same time,” shared Victoria Gray. New Scientist reflects on her journey: “Everything has changed for me now.”
Gray once endured debilitating symptoms of sickle cell disease, but in 2019, she found hope through CRISPR gene editing, a pioneering technology enabling precise modifications of DNA. By 2023, this groundbreaking treatment was officially recognized as the first approved CRISPR therapy.
Currently, hundreds of clinical trials are exploring CRISPR-based therapies. Discover the ongoing trials that signify just the beginning of CRISPR’s potential. This revolutionary tool is poised to treat a wide range of diseases beyond just genetic disorders. For example, a single CRISPR dose may drastically lower cholesterol levels, significantly reducing heart attack and stroke risk.
While still in its infancy regarding safety, there’s optimism that CRISPR could eventually be routinely employed to modify children’s genomes, potentially reducing their risk of common diseases.
Additionally, CRISPR is set to revolutionize agriculture, facilitating the creation of crops and livestock that resist diseases, thrive in warmer climates, and are optimized for human consumption.
Given its transformative capabilities, CRISPR is arguably one of the most groundbreaking innovations of the 21st century. Its strength lies in correcting genetic “misspellings.” This involves precisely positioning the gene-editing tool within the genome, akin to placing a cursor in a lengthy document, before making modifications.
Microbes use this genetic editing mechanism as part of their defense against viruses. Before 2012, researchers had identified various natural gene-editing proteins, each limited to targeting a single location in the genome; altering the target sequence required redesigning the protein’s DNA-binding section, a time-consuming process.
However, scientists discovered that bacteria have evolved a diverse range of gene-cutting proteins that are guided to their targets by RNA—a close relative of DNA—which makes retargeting far faster: producing a new RNA takes mere days instead of years.
In 2012, Jennifer Doudna and her team at the University of California, Berkeley, along with Emmanuelle Charpentier from the Max Planck Institute for Infection Biology, revealed the mechanics of one such gene-editing protein, CRISPR-Cas9. By simply adding a “guide RNA” in a specific format, they could target any desired sequence.
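To make the “cursor in a document” analogy concrete, here is a toy sketch (not code from any CRISPR toolkit) of how a 20-letter guide sequence, together with the NGG PAM motif that Cas9 requires next to its target, picks out a site in a DNA string. The genome and guide below are invented.

```python
import re

# Toy illustration: Cas9 is directed to a site where the 20-letter guide
# sequence matches the DNA and is immediately followed by an "NGG" PAM motif.
genome = "TTACGGATCCGATTACAGGCTTGACCTGAACGGCTATTAGCAGGATCCGTA"  # made-up DNA
guide = "GATTACAGGCTTGACCTGAA"                                   # hypothetical 20-nt guide

for match in re.finditer(guide, genome):
    pam = genome[match.end():match.end() + 3]
    if len(pam) == 3 and pam[1:] == "GG":   # NGG PAM check
        print(f"Target found at position {match.start()}, PAM = {pam}")
```

Running this prints the single matching position; in a real genome the guide is chosen so that the 20-letter match plus PAM occurs (ideally) only at the intended site.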
Today, thousands of variants of CRISPR are in use for diverse applications, all relying on guide RNA targeting. This paradigm-shifting technology earned Doudna and Charpentier the Nobel Prize in 2020.
“The gut microbiome has transformed our understanding of human health,” says Tim Spector, PhD, co-founder of the Zoe Nutrition App from King’s College London. “We now recognize that microbes play a crucial role in metabolism, immunity, and mental health.”
Although significant advancements in microbiome research have surged in the past 25 years, humans have a long history of utilizing microorganisms to enhance health. The Romans, for instance, employed bacterial-based treatments to “guard the stomach” without comprehending their biological mechanisms.
In the 17th century, microbiologist Antony van Leeuwenhoek made the groundbreaking observation of the parasite Giardia in his own stool. It took scientists another two centuries to confirm his discoveries, and only in the 21st century did the profound impact of gut and skin microbes on health become evident.
By the 1970s, researchers had determined that gut bacteria could influence the breakdown of medications, potentially modifying their efficacy, and fecal transplant studies hinted at how microbial communities could restore health. However, it was the rapid advancements in gene sequencing and computing in the 2000s that truly revolutionized the field. Early sequencing surveys revealed that every individual possesses a distinct microbial “fingerprint” of bacteria, viruses, fungi, and archaea.
In the early 2000s, groundbreaking studies illustrated that the microbiome and immune system engage in direct communication. This recast the microbiome as a dynamic participant in our health, influencing a wide range of systems, from the pancreas to the brain.
Exciting findings continue to emerge; fecal transplants are proving effective against Clostridium difficile infections, while microorganisms from obese mice can induce weight gain in lean mice. Some bacterial communities have shown potential to reverse autism-like symptoms in mice. Recently, researchers have even suggested that microbial imbalances could trigger diabetes and Parkinson’s disease. “Recent insights into the human microbiome indicate its influence extends far beyond the gut,” states Lindsay Hall from the University of Birmingham, UK.
Researchers are gaining a clearer understanding of how microbial diversity is essential for health and how fostering it may aid in treating conditions like irritable bowel syndrome, depression, and even certain cancers. Studies are also investigating strategies to cultivate a healthy microbiome from early life, which Hall believes can have “profound and lasting effects on health.”
In just a few decades, the microbiome has evolved from an obscure concept to a pivotal consideration in every medical field. We are now entering an era that demands rigorous testing to differentiate effective interventions from overhyped products, all while shaping our approach to diagnosing, preventing, and treating diseases.
Reconstruction of a Paleolithic woman crafting wooden tools
Credit: G. Prieto; K. Harvati
Remarkably, some of the oldest known wooden tools have been unearthed in an open-pit mine in Greece, dating back 430,000 years. These artifacts were likely crafted by an ancient human ancestor, potentially related to Neanderthals.
Archaeologists note that prehistoric wooden artefacts are “extremely rare.” According to Dirk Leder from the Lower Saxony Cultural Heritage Office in Hannover, Germany, any new findings in this area are highly valued.
Evidence suggests our extinct relatives may have utilized wooden tools for millions of years. “This could be the oldest type of tool ever used,” states Katerina Harvati from the University of Tübingen, Germany. Unfortunately, the preservation of wooden artifacts is often poor, hindering our understanding of their use.
Harvati and her team discovered the tools at a site called Marathusa 1, first identified in 2013 in the Megalopolis Basin of southern Greece. The open-pit lignite mine exposed sediment layers nearly a million years old, giving researchers rare access to deposits they could date and study, according to Harvati.
From 2013 to 2019, excavations yielded not only tools but also the skeleton of a straight-tusked elephant (Palaeoloxodon antiquus), along with more than 2,000 stone tools and remains of varied flora and fauna, depicting a rich ancient lakeshore ecosystem with ample evidence of hominin activity.
To date Marathusa 1, the researchers relied on various methods, including analyzing fossil footprints and historical changes in Earth’s magnetic field. By 2024, they had confirmed that the artefacts are around 430,000 years old, a time of challenging climatic conditions: the most severe ice age of the Pleistocene in Europe. The Megalopolis Basin likely provided refuge thanks to its relatively temperate climate.
The archaeological team identified two significant wooden tools among the 144 artifacts. The first, an 81 cm long pole made from alder, exhibits marks indicative of intentional shaping. One end appears rounded, possibly serving as a handle, while the other is flattened, hinting at potential use for digging underground tubers or perhaps for butchering elephant carcasses. Harvati admits uncertainty about its exact application.
Mysterious second wooden tool from Marathusa 1
Credit: N. Thompson; K. Harvati
The second tool remains enigmatic, measuring just 5.7 cm in length and made from willow or poplar. It also shows signs of intentional shaping after the bark was removed. According to Harvati, this represents a completely new type of wooden tool. While it might have served to modify stone tools, the specific purpose remains a mystery.
Leder points out that while the first tool is a clear example of wooden craftsmanship, questions remain about the function of the second. “Is this a complete item or part of something larger?” he muses.
No hominin remains have been found at Marathusa 1. Given its age, the site predates our species and is likely too early even for Neanderthals. The prevailing hypothesis is that it is associated with pre-Neanderthal humans such as Homo heidelbergensis, but Harvati cautions against definitive conclusions, noting that Greece was frequented by various hominin groups.
Other ancient wooden tools, like the Clacton spear discovered in Britain, are estimated to be about 400,000 years old, while a wooden spear from Schöningen, Germany, has been dated using multiple methods to around 300,000 years. The only wooden artefacts that predate those from Marathusa 1 come from Kalambo Falls in Zambia; dating back about 476,000 years, they resemble the remains of a larger structure rather than hand tools.
In today’s digital landscape, hostility often overshadows collaboration. Remarkably, Wikipedia—a publicly editable encyclopedia—has emerged as a leading knowledge resource worldwide. “While it may seem improbable in theory, it remarkably works in practice,” states Anusha Alikan from the Wikimedia Foundation, the nonprofit behind Wikipedia.
Founded by Jimmy Wales in 2001, Wikipedia continues to thrive, although co-founder Larry Sanger left the project the following year and has since expressed ongoing criticism, claiming it is “overrun by ideologues.”
Nonetheless, Sanger’s opinions are not widely echoed. Wikipedia boasts over 64 million articles in more than 300 languages, generates an astonishing 15 billion hits monthly, and currently ranks as the ninth most visited website globally. “No one could have anticipated it would become such a trusted online resource, yet here we are,” Alikan commented.
Building trust on a massive scale is no small achievement. Although the Internet has democratized access to human knowledge, it often presents fragmented and unreliable information. Wikipedia disrupts this trend by allowing anyone to contribute, supported by approximately 260,000 volunteers worldwide, making an impressive 342 edits per minute. A sophisticated system grants broader editing rights to responsible contributors, fostering trust that encourages collaboration even among strangers.
Wikipedia also actively invites special interest groups to create and edit content. For instance, the Women in Red project tackles gender disparities, while other initiatives focus on climate change and the history of Africa. All articles uphold strict accuracy standards, despite critics like Sanger alleging bias.
As an anomaly in the technology sector, Wikipedia operates without advertising, shareholders, or profit motives. It has maintained this unique position for over two decades with great success.
However, the rise of artificial intelligence poses new challenges. AI can generate misleading content, deplete resources in training efforts, and lead to diminished website traffic and decreased donations due to AI-driven search summaries.
Credit: Dr. Gavin Leroy/Professor Richard Massey/COSMOS-Webb Collaboration
In a groundbreaking study, scientists leveraged subtle distortions in the shapes of over 250,000 galaxies to construct the most detailed dark matter map to date, paving the way for insights into some of the universe’s greatest enigmas.
Dark matter, elusive by nature, does not emit any detectable light. Its existence can only be inferred through its gravitational interactions with normal matter. Researchers, including Jacqueline McCreary from Northeastern University, utilized the James Webb Space Telescope (JWST) to map a region of the sky larger than the full moon.
“This high-resolution image depicts the scaffold of a small segment of the universe,” noted McCreary. The new map boasts double the resolution of previous ones created by the Hubble Space Telescope, encompassing structures much farther away.
The researchers studied approximately 250,000 galaxies, noting that their shapes, while interesting, serve primarily as a backdrop for understanding gravitational distortions. As Liliya Williams from the University of Minnesota explained, “These galaxies merely act as the universe’s wallpaper.” The critical component is the way dark matter’s gravitational pull warps the light from these distant galaxies—a phenomenon known as gravitational lensing. The more distorted the shape of these galaxies is from a perfect circle, the greater the amount of dark matter situated between us and them.
By analyzing these optical distortions, the team was able to derive a map illustrating massive galaxy clusters and the cosmic web filaments linking them. Many of these newly identified structures deviate from prior observations of luminous matter, suggesting they are predominantly composed of dark matter. “Gravitational lensing is one of the few and most effective techniques for detecting these structures across vast regions,” Williams stated.
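The principle behind such a map can be illustrated with a toy calculation (this is not the COSMOS-Web pipeline): galaxies have essentially random intrinsic shapes, lensing adds a small coherent distortion, and averaging many shapes in a patch of sky recovers that distortion. All numbers below are invented for illustration.

```python
import numpy as np

# Minimal weak-lensing sketch: random intrinsic ellipticities plus a small
# coherent shear; averaging many galaxies in a patch recovers the shear.
rng = np.random.default_rng(42)
n_galaxies = 250_000                               # order of the sample size quoted above
true_shear = 0.02 + 0.01j                          # assumed coherent shear in this patch
intrinsic = 0.3 * (rng.standard_normal(n_galaxies) +
                   1j * rng.standard_normal(n_galaxies))  # random intrinsic shapes
observed = intrinsic + true_shear                  # simplified weak-shear approximation

estimate = observed.mean()
print(f"Recovered shear: {estimate.real:.4f} + {estimate.imag:.4f}i")
# The random shapes average away, leaving the lensing signal imprinted by dark matter.
```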
This research is significant, considering that dark matter constitutes about 85% of the universe’s total matter, crucial for the formation and evolution of galaxies and clusters. Understanding its distribution could shed light on its behavior and composition, according to Williams.
“This achievement is not just observational but also paves the way for various analyses, including constraints on cosmological parameters, the relationship between galaxies and their dark matter halos, and their growth and evolution over time,” McCreary highlighted. These parameters include the strength of dark energy, the enigmatic force driving the universe’s accelerating expansion.
While initial findings from the JWST map align with the Lambda CDM model of the universe, McCreary emphasizes that a thorough analysis of the data is still required to unearth new insights. “At first glance, it appears consistent with Lambda CDM, but I remain cautious. A final assessment will depend on complete results.”
Menstrual Pads: A Revolutionary Tool for Tracking Women’s Fertility
Shutterstock/Connect World
Innovative home tests integrated into menstrual pads are empowering women to monitor their fertility through menstrual blood. This non-invasive method eliminates the need for frequent blood tests or clinic visits.
For many women, understanding their fertility journey often remains elusive until they attempt to conceive. In case of any complications, clinical tests can offer vital information.
These tests are instrumental in assessing levels of anti-Mullerian hormone (AMH), a key indicator of “ovarian reserve,” which reflects the quantity of eggs remaining in a woman’s ovaries. In adults, AMH levels naturally decrease with age; higher levels signify a robust supply of eggs, whereas lower levels may signal reduced reserves or the early onset of menopause.
Traditionally, AMH measurement has involved either clinic-based blood tests or at-home finger-prick tests, both requiring lab analysis before results are available.
Recently, Lucas Dosnon from ETH Zurich and his team in Switzerland have created a user-friendly test utilizing menstrual blood for immediate results.
The test functions as a lateral flow assay—similar to a COVID-19 test—utilizing small gold-coated particles with antibodies that selectively bind to AMH. When the test strip is exposed to menstrual blood, the hormonal interactions create a visible line, where the darkness of this line correlates with AMH levels.
While visual assessments can estimate results, researchers have developed a smartphone app that accurately analyzes test strip images. When tested against menstrual blood samples with known AMH concentrations, results aligned closely with clinical evaluations.
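As a hypothetical sketch of what such an app might do behind the scenes (the calibration points and the sample reading below are invented, not values from the ETH Zurich study), the darkness of the test line can be mapped to a hormone concentration through a calibration curve:

```python
import numpy as np

# Illustrative only: map a measured test-line darkness to an AMH estimate
# via interpolation on an assumed calibration curve.
calib_intensity = np.array([0.05, 0.15, 0.30, 0.55, 0.80])   # assumed line darkness
calib_amh_ng_ml = np.array([0.1, 0.5, 1.5, 3.5, 6.0])        # assumed AMH, ng/mL

def amh_from_intensity(darkness: float) -> float:
    """Interpolate an AMH estimate from a measured test-line darkness."""
    return float(np.interp(darkness, calib_intensity, calib_amh_ng_ml))

print(f"Estimated AMH: {amh_from_intensity(0.42):.2f} ng/mL")
```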
Moreover, the research team has seamlessly integrated this test into menstrual pads, enabling passive AMH level monitoring throughout menstruation. Over time, this approach may reveal trends in ovarian reserves that single tests could miss.
“We believe this research could be a game-changer for women’s health,” stated Dosnon, highlighting the potential for regular ovarian health screenings useful for various purposes, including during IVF and for diagnosing conditions outside of reduced ovarian reserve. Elevated AMH levels, for instance, can indicate polycystic ovarian syndrome and, in rare cases, granulosa cell tumors affecting the ovaries. “Menstrual blood is an underutilized resource with great potential in monitoring overall health,” Dosnon added.
Richard Anderson from the University of Edinburgh emphasizes the interpretation challenges that tests like this face, noting that results can be hard to make sense of, since no AMH test assesses egg quality. He also questions whether women will prefer this test over traditional methods: “Is obtaining a reliable blood test that much of a burden?”
In response, Dosnon clarified that the test isn’t designed to replace clinical evaluations but rather offers an alternative that addresses the challenges in women’s health monitoring and research, praised for its non-invasive nature, user-friendliness, and affordability.
What distinguishes a groundbreaking idea from a mediocre one? This is often a challenging distinction to make. Take the example of vaccination: collecting pus from a cowpox-infected individual and injecting it into an eight-year-old boy may seem utterly reckless. Yet, 18th-century physician Edward Jenner’s daring action ultimately led to the eradication of smallpox, a disease that plagued humanity.
With the benefit of hindsight, we recognize that Jenner’s innovation was monumental. This principle of vaccination continues to save millions of lives today. As we progress through the 21st century, we feel it’s essential to reflect on and celebrate transformative ideas from the past 25 years that are reshaping our perspectives, actions, and understanding of the world around us.
Compiling our list of the 21 most impactful ideas of the 21st century involved rigorous discussions among our editorial team. One of our initial challenges was determining if the first quarter of this century would conclude at the beginning or end of 2025. For clarity, we opted for the latter. We navigated debates on various ideas, dedicating particular attention to concepts like the microbiome—establishing it as a legitimate 21st-century notion—and scrutinizing the role of social media, which after much discussion, we deemed largely negative. Ultimately, we recognize that the quality of ideas is subjective.
We developed a robust set of criteria for our selection. To qualify for this list, a concept must already demonstrate a significant impact on our self-understanding, health, or broader universe. Additionally, it should be grounded in scientific discovery, with a strong idea underpinning it. Lastly, the development must have occurred within the last 25 years.
“Rather than trying to predict the future, it’s important to take the time to reflect on the past.”
While the last criterion may appear straightforward, we encountered numerous proposals that remain unrealized. The discovery of gravitational waves in the 21st century opened new cosmic vistas, but their prediction dates back a century to Albert Einstein. Similarly, ideas like weight loss medications, personalized medicine, and mRNA vaccines show promise, but their full potential has yet to be achieved—perhaps these will make the list in 2050.
During our selection process, we couldn’t disregard ideas that initially seemed appealing but faltered. Therefore, we also crafted a list of the five most disappointing ideas of the century thus far. The line between success and failure can sometimes blur, leading to controversial choices in our best ideas list. For instance, while many would advocate for the removal of smartphones, we ultimately view them as largely beneficial. Likewise, the ambitious global warming target of 1.5°C can be seen as a failure, especially as new reports indicate that average global temperatures have surpassed this benchmark for the first time. Nonetheless, we argue that striving to reduce the threshold from 2°C remains one of the century’s monumental ideas, setting a standard for global climate ambition.
Advancing away from fossil fuels is undoubtedly crucial, and prominently featured in this effort is Elon Musk. In 2016, before Musk ventured into social media and politics, his company Tesla launched its first Gigafactory in Nevada, marking a pivotal moment in the transition to renewable energy by utilizing economies of scale to transform transportation and energy systems. Conversely, other approaches to fighting climate change, such as alternative fuels and carbon offsets, appear more harmful than beneficial.
One significant takeaway from our selection process is that revolutionary ideas often arise by chance. For many, a working outlet can be the catalyst for a few minutes of smartphone scrolling during a lengthy commute. However, for two physicists in 2005, their discovery altered the global decarbonization strategy. This breakthrough also unveiled the foundations of our complex thought processes, illustrating that brain regions don’t operate in isolation but are interwoven into a robust network. This understanding has revolutionized our approach to diagnosing and treating neurological issues.
Looking back over the past quarter-century, it’s evident that the world has transformed considerably. We successfully dodged the Millennium Bug, the human genome’s first draft was completed, and the International Space Station welcomed its first crew. Concepts like “Denisovans” and “microbiomes” were unknown to us. In our pages, we celebrated innovations like wireless communication and marveled at miniaturized computer chips driving these technologies. “At its core is a device known as a Bluetooth chip,” we stated, positing it as the next big thing—a prediction that, in hindsight, was flawed, since truly transformative technologies extend beyond mere convenience.
This experience highlights the folly of predictions, as they can often be overlooked in the rush for the next trending innovation. Thus, rather than striving to foresee the future, we ought to invest time in contemplating the past. The advancements we’ve witnessed in health, technology, and environmental conservation suggest that this century has made the world a better place. Let’s hope, without necessarily predicting, that this momentum continues into the future.
Solar geoengineering: A solution to save ice sheets with potential risks
Credit: Martin Zwick/REDA/Universal Images Group (via Getty Images)
Research indicates that an abrupt halt to solar geoengineering may lead to a “termination shock,” causing a rapid temperature rise that could make the initiative more expensive than continuing without intervention.
With greenhouse gas emissions on the rise, there’s increasing attention on solar radiation management (SRM), which cools the planet by dispersing sulfur dioxide aerosols into the stratosphere to reflect sunlight.
However, sustained solar geoengineering is crucial for centuries; otherwise, the hidden warming could quickly reemerge. This rebound, referred to as termination shock, leaves little time for adaptation and could catalyze critical climate events such as ice sheet collapses.
Francisco Estrada and fellow researchers at the National Autonomous University of Mexico assessed the risk of inaction on climate change compared with that of solar geoengineering approaches.
Projections suggest that if emissions aren’t curtailed, temperatures may soar by an average of 4.5 degrees Celsius above pre-industrial levels by 2100, leading to approximately $868 billion in economic damages. In contrast, a hypothetical stratospheric aerosol injection program initiated in 2020 could limit warming to around 2.8°C, potentially reducing these costs by half.
Nevertheless, if the aerosol program ends abruptly in 2030, resulting in a temperature rebound of 0.6 degrees Celsius over eight years, economic damages could surpass $1 trillion by century’s end. While estimations vary, Estrada states, “The principle remains consistent: the termination shock will be significantly worse than inaction.”
Estrada’s research innovatively gauges damage not only by global warming levels but also by the speed at which temperatures rise, according to Gernot Wagner from Columbia University.
Wagner warns that solar geoengineering may be riskier than it appears. “This highlights a critical concern,” he notes.
Make Sunsets, a Silicon Valley startup, has already launched over 200 sulfur dioxide-filled balloons into the stratosphere and offers emission offsets for sale. A recent launch in Mexico prompted governmental threats to ban geoengineering activities.
Israel’s Stardust Co., Ltd. has secured $75 million in funding and is lobbying the U.S. government to explore solar geoengineering options. A recent survey revealed that two-thirds of scientists anticipate large-scale SRM could occur this century, as reported by New Scientist.
Studies suggest that cooling Earth by 1°C through aerosol injection would require a fleet of at least 100 aircraft releasing millions of tons of sulfur dioxide annually, operating continuously and unimpeded by geopolitical conflicts or unforeseen events.
Presently, major nations like the United States are undermining global climate cooperation, but researchers highlight that such collaboration is essential to prevent termination shock and potentially realize the benefits of SRM.
Analysis of varying parameters suggests that aerosol injections could mitigate climate damage only if the annual probability of cessation is extremely low. In scenarios allowing for a gradual stop over 15 years, SRM might be viable.
If countries successfully reduce emissions, only minimal geoengineering cooling would be needed, and aerosol injection could remain beneficial even with an annual probability of cessation as high as 10%. Compounded over a century that makes an interruption all but certain, but in low-emissions scenarios the resulting temperature rebound would remain manageable.
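The arithmetic behind these probabilities is simple compounding: a fixed annual chance of the programme being halted makes at least one interruption over a century-long deployment very likely. A minimal sketch:

```python
# Compounding a fixed annual halt probability over a century-long deployment.

def chance_of_interruption(annual_probability: float, years: int = 100) -> float:
    """Probability of at least one halt over the given horizon."""
    return 1.0 - (1.0 - annual_probability) ** years

for p in (0.001, 0.01, 0.10):
    print(f"annual halt probability {p:>5.1%} -> "
          f"{chance_of_interruption(p):.3%} chance of a halt within 100 years")
```

With a 10% annual probability of cessation, the chance of at least one interruption over 100 years exceeds 99.9%, which is why governance becomes the central issue.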
This need for international cooperation reveals what Estrada describes as the “governance paradox” of solar geoengineering: “We must ensure extremely low failure rates and possess effective governance to mitigate adverse outcomes.” However, he adds, “If we effectively reduce greenhouse gases, the need for SRM diminishes.”
These findings challenge the notion that solar geoengineering might lead to irresponsible development, as some have suggested, according to Chad Baum from Aarhus University. Funding for this new research was provided by the Degrees Initiative, aimed at supporting geoengineering studies in vulnerable low-income nations.
Baum stated, “We intend to complete all stages of this study, incorporating feedback from impacted communities.”
Despite this, Wagner emphasizes the imperative for further exploration into geoengineering’s trade-offs given the rise in emissions and their consequences: “We are approaching a critical juncture.”
Unlocking the Potential: Does Heat Therapy Enhance Brain Function?
gpointstudio/Getty Images
As an enthusiast of cold water swimming, I previously explored its brain benefits. However, the emerging evidence on heat therapy fascinated me—particularly regarding its neurological advantages. This prompted a deeper investigation into the subject.
During my last trip to Finland and Sweden, I immersed myself in their sauna culture, learning that ‘sauna’ is pronounced ‘sow-na’ (with ‘ow’ rhyming with ‘how’), contrasting my South East London pronunciation.
Finnish saunas, reaching temperatures of 70°C to 110°C (158°F to 230°F) with low humidity, are extensively studied. Regular sauna use correlates with numerous physical benefits, such as reduced risks of high blood pressure, muscle disorders, and respiratory diseases. Recent research also identifies significant cognitive benefits, including fewer headaches, improved mental health, better sleep quality, and a decreased risk of dementia.
A large-scale study involving nearly 14,000 participants aged 30 to 69 tracked sauna habits over 39 years. The findings revealed that those who frequented saunas nine to twelve times a month exhibited a 19 percent reduction in dementia risk compared to those who visited less than four times a month.
Moreover, sauna bathing appears linked to various cognitive enhancements. For instance, a small trial involving 37 adults with chronic headaches compared those receiving headache management advice to participants who regularly attended saunas. The sauna group reported significantly reduced headache intensity.
Regular sauna use is also associated with lower risks of psychosis and increased vitality and social functioning in elderly individuals, reinforcing its potential cognitive benefits.
However, it’s crucial to recognize that not all heat treatments yield the same results. Various forms of heat therapy exist, each offering distinct benefits. For example, a trial with 26 individuals diagnosed with major depressive disorder showed that those receiving infrared heating sessions reported significant symptom reductions over six weeks compared to a sham treatment.
How Does Heat Therapy Benefit Brain Health?
Heat therapy’s efficacy appears closely linked to its anti-inflammatory effects. In a study following 2,269 middle-aged Finnish men, researchers found that individuals engaging in frequent sauna use exhibited reduced levels of inflammation, a factor significantly associated with depression and cognitive decline.
Another mechanism involves heat shock proteins, which are produced when body temperature rises during sauna use or exercise. These proteins help prevent misfolding of other proteins—a common feature in many neurological disorders, including Alzheimer’s disease.
Enhanced blood circulation also plays a role; heat exposure dilates blood vessels, thereby improving cardiovascular health. This indirect benefit to brain health can decrease risks associated with vascular dementia and Alzheimer’s disease.
Additionally, saunas may elevate brain-derived neurotrophic factor (BDNF) levels, vital for neuron growth. In an experiment with 34 men, participants receiving 12 to 24 sessions of infrared therapy displayed significantly higher BDNF levels and improved mental well-being compared to those doing low-intensity workouts.
Can Saunas Enhance Cognitive Skills?
Beyond long-term neurological advantages, the immediate effects of sauna sessions are promising. A study involving 16 men revealed that brain activity post-sauna sessions resembled a relaxed state, indicating potential improvements in task efficiency. Researchers suggest that heat therapy may help extend mental work capacity over prolonged periods.
However, excessive heat exposure can lead to fatigue and reduced cognitive function. Studies indicate that high-temperature environments may impair memory consolidation, making saunas less suitable for study sessions.
If you’re exploring heat therapy, check guidelines from the British Sauna Association to ensure safety, including limiting duration and staying hydrated.
Do Hot Baths Offer Similar Benefits?
If you lack access to saunas, could hot baths serve as an alternative? While they may partially replicate sauna benefits, the evidence is still inconclusive. According to Ali Qadiri from West Virginia University, warm baths do elevate core body temperature and can improve mood and relaxation. Still, he cautions that robust data on saunas and dementia prevention far outweighs that for baths.
My local lake offers both cold water swimming and sauna experiences, prompting me to consider their combined effects. A Japanese study on the practice known as totonou, or alternating between hot saunas and cold baths, revealed enhancements in relaxation and reduced alertness after several rounds.
While more research is needed to determine if this combination is more effective than using heat or cold therapy alone, the overall evidence supports potential cognitive boosts from regular sauna visits, reinforcing my commitment to explore more heat and cold therapy options.
During the first decade of the 21st century, scientists and policymakers emphasized a 2°C cap as the highest “safe” limit for global warming above pre-industrial levels. Recent research suggests that this threshold might still be too high. Rising sea levels pose a significant risk to low-lying islands, prompting scientists to explore the advantages of capping temperature rise at approximately 1.5°C for safeguarding vulnerable regions.
In light of this evidence, the United Nations negotiating bloc, the Alliance of Small Island States (AOSIS), advocated for a global commitment to restrict warming to 1.5°C, emphasizing that allowing a 2°C increase would have devastating effects on many small island developing nations.
James Fletcher, the former UN negotiator for the AOSIS bloc at the 2015 UN COP climate change summit in Paris, remarked on the challenges faced in convincing other nations to adopt this stricter global objective. At one summit, he recounted a low-income country’s representative confronting him, expressing their vehement opposition to the idea of even a 1.5°C increase.
After intense discussions, bolstered by support from the European Union and the tacit backing of the United States, as well as intervention from Pope Francis, the 1.5°C target was included in the impactful 2015 Paris Agreement. However, climate scientists commenced their work without a formal evaluation of the implications of this warming level.
In 2018, the Intergovernmental Panel on Climate Change report confirmed that limiting warming to 1.5°C would provide substantial benefits. The report also advocated for achieving net-zero emissions by 2050 along a 1.5°C pathway.
These dual objectives quickly became rallying points for nations and businesses worldwide, persuading countries like the UK to enhance their national climate commitments to meet these stringently set goals.
Researchers such as Piers Forster at the University of Leeds credit the 1.5°C target with driving nations to adopt significantly tougher climate goals than previously envisioned. “It fostered a sense of urgency,” he remarks.
Despite this momentum, global temperatures continue to rise, and current efforts to curb emissions are insufficient to fulfill the 1.5°C commitment. Scientific assessments predict the world may exceed this warming threshold within a mere few years.
Nevertheless, 1.5°C remains a crucial benchmark for tracking progress in global emissions reductions. Public and policymakers are more alert than ever to the implications of rising temperatures. An overshoot beyond 1.5°C is widely regarded as a perilous scenario, rendering the prior notion of 2°C as a “safe” threshold increasingly outdated.
Send us your feedback at New Scientist! If you enjoy following the latest advancements in science and technology, let us know your thoughts by emailing feedback@newscientist.com.
Even the Strangest Theories
This vacation, many fans spent their time reflecting on the final episode of Stranger Things. We experienced laughter, tears, and heated discussions about the storyline—especially its conclusion. Can we really say it was a fitting ending like Return of the King? (In our opinion, it was.)
In today’s online culture, vocal fan backlash is common. Some theorized that the finale was merely a ruse, leading to wild claims like “Conformity Gate” (not our term!). They argue that despite its two-hour runtime and cinematic release, the concluding episode was just a setup for a secret final episode, set to air this January. Critics point to a continuity error that suggests the entire narrative was an illusion crafted by Vecna, the mind-controlling antagonist.
Initially, we found these theories unconvincing, especially since the criticisms revolved around minor details. After all, the show itself defies physics—should we really be worried about the color of a graduation gown?
For newcomers, the storyline of Stranger Things unfolds in a small Indiana town beset by a secretive government lab conducting dangerous experiments. Spoiler alert: these experiments inadvertently open a portal to the “Upside Down,” a horrifying alternate dimension that mirrors the town, albeit in a more sinister light. Ultimately, it’s revealed that this Upside Down functions as a wormhole to yet another realm known as the Abyss.
If the Upside Down is indeed a wormhole, what then is the swirling red object levitating above? Some describe it as containing “exotic matter,” a theoretical substance crucial for stabilizing a genuine wormhole (although its existence remains unproven). This complicates matters further since the entrance to the Abyss exists in the Upside Down’s skies.
We’ve contemplated this for weeks, yet the whirling object’s purpose remains a mystery. Why does shooting it with a gun liquefy its surroundings, while an explosion obliterates the entire Upside Down? Wouldn’t such destruction release enough energy to obliterate a significant part of the East Coast?
Perhaps physicists focused on adaptive gate theory should tackle the bizarre phenomena within the Upside Down. There could be a Nobel Prize—or at least an Ig Nobel Prize—waiting for someone who can crack these mysteries.
Sparkling Sports Benefits
What could be more exhilarating than attending a live sports event? The thrill comes from being part of the crowd, cheering on your favorite players. But what if drinking soda while cheering made it even more enjoyable?
Alice Klein, a reporter, highlighted a study that demonstrated that spectators at a women’s college basketball game experienced greater enjoyment and a stronger sense of belonging when they consumed sparkling water instead of plain water. The researchers noted, “Drinking sparkling water together serves as a low-impact, non-alcoholic ritual, fostering social connection during and after live sports events.”
While Alice found this perspective amusing, editor Jacob Aaron defended the research: “They studied 40 individuals; what more could they need?” Readers may form their own opinions on the validity of this evidence. Nonetheless, we want to draw attention to the “competing interests” stated in the research paper, which we won’t comment on further. Here’s the statement:
“This study received funding from Asahi Soft Drinks Co., Ltd. WK and SM are employees of Asahi Soft Drinks Co., Ltd. The authors declare that this funding had no influence on the study design, methodology, analysis, or interpretation of the results. The sponsor has no control over the interpretation, writing, or publication of this study.”
AI Mistakes and Missteps
Reader Peter Brooker reached out to suggest a new section titled “AI Bloopers.” After using a well-known search engine, he was astounded to discover that the AI confidently asserted the first six prime numbers were 2, 3, 5, 7, 9, and 11.
We believe this section has long existed, albeit without a formal name. In fact, we often discuss how frequently to highlight these AI blunders. A weekly column could easily be filled with AI failures, but we worry it may become monotonous.
In line with Peter’s suggestion, Ghent University’s new rector, Petra de Sutter, found herself in hot water after using AI to generate her opening speech. It included fabricated quotes purportedly from Albert Einstein.
As reported by Brussels Times: “Impressively, De Sutter warned about the dangers of AI in her speech, advising that AI-generated content should not be ‘blindly trusted’ and that such text is ‘not always easy to distinguish from the original work.’”
Have a story for Feedback?
Share your insights with us at feedback@newscientist.com. Please include your home address. Previous feedback and this week’s comments can be accessed on our website.
Discover the Moon’s X: Captured from Tokyo in February 2025
Credit: Yomiuri Shimbun/AP Images/Alamy
Nearly a decade ago, my excitement surged as I captured my first telescope photo of the Moon. With a makeshift setup, I clumsily held my phone camera up to the eyepiece. After a few shaky attempts, I got a clear snapshot of the lunar surface, and shared it online with pride.
Unbeknownst to me, I had clicked the picture during a brief 4-6 hour window each month when fascinating features known as Moon’s X and V could be visible.
These lunar marks are optical illusions, revealing themselves only when sunlight strikes the rims of specific craters during the Moon’s waxing phase, perfectly aligned along the terminator.
The Moon’s X forms a bright X shape, illuminated by sunlight on the rims of three craters: La Caille, Blanchinus, and Purbach. Similarly, the V shape comes to life as sunlight hits the Ukert crater and nearby smaller craters.
To witness the Moon’s X and V, a telescope is essential. However, timing is crucial. The visibility of these features varies globally and is influenced by your local time zone.
The next first quarter occurs at 5 AM GMT on January 26th. However, observers in the UK may miss it, as the Moon will be below the horizon then. The best viewing opportunity falls on the evening of January 25th in New York, where the first quarter arrives around midnight, making the X and V visible from about 10 PM to 2 AM. In places like Sydney, daylight blocks visibility, as the first quarter falls around 3 PM local time.
For the best chance to view the Moon’s captivating X’s and V’s, ensure you’re gazing at a waxing moon during optimal hours, preferably when it’s high in the night sky. Tools like Stellarium can help you track the Moon’s visibility on specific dates.
Mark your calendars for upcoming first quarter events on February 24th, March 25th, and April 24th-25th. If you’re in the UK, you might want to target March 25th as it aligns well with evening visibility around 7 PM local time.
Understanding the intricacies that must align for the Moon’s X and V to appear, I feel fortunate to have captured my first lunar photo during such a special moment.
Despite Mars being smaller than Earth, it profoundly affects Earth’s climate cycle. Understanding how smaller planets influence the climates of exoplanets is crucial for assessing their potential for habitability.
Stephen Cain and fellow researchers at the University of California, Riverside, discovered this by running simulations that varied Mars’ mass, from 100 times its current value down to its complete removal, and analyzing the effect on Earth’s orbit. “Initially, I was skeptical that Mars, only one-tenth the mass of Earth, could so significantly affect Earth’s cycles. This motivated our study to manipulate Mars’ mass and observe the effects,” says Cain.
Earth’s climate is influenced by long-term cycles tied to its orbital eccentricity and axial tilt. These cycles are dictated by the gravitational forces of the Sun and other planets, determining significant climate events such as ice ages and seasonal shifts.
One crucial cycle, referred to as the Grand Cycle, spans 2.4 million years, involving the elongation and shortening of Earth’s orbital ellipse. This directly influences the amount of sunlight reaching Earth’s surface, thus controlling long-term climate changes.
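The link between eccentricity and sunlight can be made explicit: for a fixed semi-major axis, the annually averaged solar energy Earth receives scales as 1/sqrt(1 - e²), so a more elongated orbit delivers slightly more energy over the year. A short illustration follows; apart from Earth's present eccentricity of about 0.0167, the values are simply illustrative.

```python
import math

# Annual-mean insolation relative to a perfectly circular orbit, at fixed
# semi-major axis: proportional to 1 / sqrt(1 - e^2).

def relative_annual_insolation(eccentricity: float) -> float:
    return 1.0 / math.sqrt(1.0 - eccentricity ** 2)

for e in (0.0, 0.0167, 0.06):   # circular, Earth's present value, a high excursion
    print(f"e = {e:.4f}: {relative_annual_insolation(e):.5f} x circular-orbit value")
```

The changes are fractions of a percent, but sustained over tens of thousands of years they are enough to pace glacial cycles.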
The research indicates that eliminating Mars would not only remove the Grand Cycle but also another essential eccentricity cycle lasting 100,000 years. “While removing Mars wouldn’t completely halt ice ages, it would alter the frequency and climate impacts associated with them,” Cain explains.
As Mars’ simulated mass increases, the resulting climate cycles become shorter and more intense. However, a third eccentricity cycle, enduring approximately 405,000 years, remains predominantly influenced by Venus and Jupiter’s gravitational pulls, illustrating that while Mars is notably influential, it is not the only player.
Mars also affects Earth’s axial tilt, which oscillates over about 41,000 years. Cain and colleagues observed that Mars seems to stabilize these cycles—more mass leads to less frequent cycles, while a smaller Mars results in more frequent ones.
The precise impact of Mars’ absence or increased mass on Earth remains speculative, but it would undoubtedly lead to changes. The pursuit of Earth-like exoplanets with climates suitable for life continues, underscoring the need to evaluate the influence of smaller planets more thoroughly. “A comprehensive understanding of exoplanet system architectures is essential for predicting possible climate changes on these worlds,” warns Sean Raymond from the University of Bordeaux, France.
However, deciphering these structures can be challenging. “This serves as a cautionary note: small planets like Mars may wield a greater influence than we realize, making it imperative not to overlook these difficult-to-detect celestial bodies,” concludes Cain.
As a healthcare professional, I often encounter concerns from patients about COVID-19, particularly those suffering from long-term effects. A common inquiry I receive is, “Can I get reinfected with COVID-19 while experiencing long-term symptoms from a previous infection?“
Many individuals believe that enduring the virus for an extended period grants them some level of immunity against future infections. Unfortunately, this assumption is not accurate.
Long-lasting COVID-19 symptoms, including fatigue, breathing difficulties, and cognitive issues, can persist for months after initial infection. Regrettably, even prolonged exposure to COVID-19 does not shield you from reinfection.
The protective effects from previous infections and vaccinations fade over time. New variants of the virus, such as Omicron KP.3 and XEC in 2025, can evade the immune response.
This means that even if you’re grappling with persistent COVID-19 symptoms, it’s possible to contract the virus again, which may exacerbate symptoms or prolong recovery.
A positive COVID-19 test may indicate a reinfection with the same variant or a new one, but either way, it remains a manifestation of the coronavirus. Vaccines, particularly the 2025 booster shot, can significantly reduce the risk of severe illness. If you’re experiencing long-term COVID-19 and test positive, ensure you rest, stay hydrated, and consult your physician if symptoms worsen.
The coronavirus is still prevalent and continues to mutate, necessitating the practice of protective measures. It’s essential to get tested if you feel unwell, wear masks in crowded indoor settings, and keep up with vaccinations.
These proactive steps help mitigate exposure and safeguard those around you, especially as we navigate the lingering effects of this virus.
This article addresses the question from Yorkshire’s Terence Caldwell: “Can I be infected with COVID-19 along with the new variants?“
If you have any questions, reach out to us at questions@sciencefocus.com or connect with us on Facebook, Twitter, or Instagram (don’t forget to include your name and location).
Explore our ultimate fun facts and discover more amazing science resources.
A groundbreaking study by paleontologists from the University of Bristol, the University of Manchester, and the University of Melbourne has found that the giant ancestors of modern kangaroos possessed hindlimb bones and tendons robust enough to endure the stresses of jumping, challenging the previous assumption that body size strictly limited this iconic form of locomotion.
Currently, red kangaroos represent the largest living jumping animals, averaging a weight of approximately 90 kg.
However, during the Ice Age, some kangaroo species reached weights exceeding 250 kg—more than double the size of today’s largest kangaroos.
Historically, researchers speculated that these giant kangaroos must have ceased hopping, as early studies indicated that jumping became mechanically impractical beyond 150 kg.
“Earlier estimates relied on simplistic models of modern kangaroos, overlooking critical anatomical variations,” explained Dr. Megan Jones, a postgraduate researcher at the University of Manchester and the University of Melbourne.
“Our research indicates that these ancient animals weren’t simply larger versions of today’s kangaroos; their anatomy was specifically adapted to support their massive size.”
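To see why size was once thought to rule out hopping, here is a minimal back-of-envelope sketch of the “simply larger version” assumption (my own illustration of isometric scaling, not the authors’ biomechanical model):

```python
# Minimal sketch of isometric (shape-preserving) scaling, illustrative only.
# Under such scaling, limb forces grow roughly with body mass M, while a bone's
# cross-sectional area grows only as M**(2/3), so stress (force / area) rises as M**(1/3).
def relative_bone_stress(mass_kg, reference_mass_kg=90.0):
    """Bone stress relative to a ~90 kg red kangaroo, assuming isometric scaling."""
    return (mass_kg / reference_mass_kg) ** (1.0 / 3.0)

print(round(relative_bone_stress(250), 2))  # ~1.4: roughly 40% more stress at 250 kg
# The study's point is that the fossil giants were NOT isometric copies: their
# metatarsals and heel bones were proportionally robust enough to keep stresses in check.
```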
In this new study, Dr. Jones and her team examined the hind limbs of 94 modern and 40 fossil specimens from 63 species, including members of the extinct giant kangaroo group, Protemnodon, which thrived during the Pleistocene epoch, approximately 2.6 million to 11,700 years ago.
The researchers assessed body weight estimates and analyzed the length and diameter of the fourth metatarsal (an elongated foot bone crucial for jumping in modern kangaroos) to evaluate its capacity to endure jumping stresses.
Comparisons were drawn between the heel bone structures of giant kangaroos and their modern counterparts.
The team estimated the strength of tendons necessary for the jumping force of a giant kangaroo and determined whether the heel bones could accommodate such tendons.
The findings suggest that the metatarsals of all giant kangaroos were strong enough to withstand jumping stresses, and that the heel bones were large enough to accommodate tendons of the width required for jumping.
These results imply that all giant kangaroo species had the physical capability to jump.
Nevertheless, the researchers caution that giant kangaroos likely did not rely solely on hopping for locomotion, given their large body sizes, which would hinder long-distance movement.
They highlight that sporadic hopping is observed in many smaller species today, such as hopping rodents and smaller marsupials.
Some giant kangaroo species may have used short, quick jumps to evade predators such as the marsupial lion Thylacoleo.
“Thicker tendons offer increased safety but store less elastic energy,” said Dr. Katrina Jones, a researcher at the University of Bristol.
“This trait may have rendered giant kangaroo hoppers slower and less efficient, making them more suited for short distances rather than extensive travel.”
“Even so, hopping doesn’t need to be maximally energy-efficient to be advantageous. These animals likely leveraged their hopping ability to rapidly navigate uneven terrain or evade threats.”
University of Manchester researcher Dr. Robert Nudds remarks: “Our findings enhance the understanding that prehistoric Australian kangaroos exhibited greater ecological diversity than seen today, with some large species functioning as herbivores, akin to modern kangaroos, while others filled ecological niches as browsers, a category absent among today’s large kangaroos.”
For more details, refer to the study results published in the journal Scientific Reports.
_____
M.E. Jones et al. 2026. Biomechanical Limits of Hindlimb Hopping in Extinct Giant Kangaroos. Scientific Reports 16: 1309; doi: 10.1038/s41598-025-29939-7
Revolutionary simulations from Maynooth University astronomers reveal that, in the dense and turbulent conditions of the early universe, “light seed” black holes could swiftly consume matter, growing to rival the supermassive black holes found at the centers of early galaxies.
Computer visualization of a baby black hole growing in an early universe galaxy. Image credit: Maynooth University.
Dr. Daksar Mehta, a researcher at Maynooth University, stated: “Our findings indicate that the chaotic environment of the early universe spawned smaller black holes that underwent a feeding frenzy, consuming surrounding matter and eventually evolving into the supermassive black holes observed today.”
“Through advanced computer simulations, we illustrate that the first-generation black holes, created mere hundreds of millions of years after the Big Bang, expanded at astonishing rates, reaching sizes up to tens of thousands of times that of the Sun.”
Dr. Louis Prowl, a postdoctoral researcher at Maynooth University, added: “This groundbreaking revelation addresses one of astronomy’s most perplexing mysteries.”
“It explains how black holes formed in the early universe could quickly attain supermassive sizes, as confirmed by observations from NASA/ESA/CSA’s James Webb Space Telescope.”
The dense, gas-rich environments of early galaxies facilitated brief episodes of “super-Eddington accretion,” a phenomenon in which black holes swallow matter faster than the classical Eddington limit would normally allow.
Despite the intense radiation generated by such rapid feeding, the simulated black holes were able to keep accreting material efficiently.
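For context, the Eddington limit is the accretion rate at which the outward push of radiation from infalling gas balances gravity. A rough order-of-magnitude sketch using standard textbook relations (my own illustration, not figures from the Maynooth simulations):

```python
# Rough Eddington-limit estimate using standard textbook relations (illustrative only).
M_SUN_G = 1.989e33            # solar mass in grams
C_CM_S = 3.0e10               # speed of light in cm/s
SECONDS_PER_YEAR = 3.15e7

def eddington_rate_msun_per_yr(mass_msun, radiative_efficiency=0.1):
    """Accretion rate at which radiation pressure balances gravity."""
    l_edd = 1.26e38 * mass_msun                           # Eddington luminosity, erg/s
    mdot_g_per_s = l_edd / (radiative_efficiency * C_CM_S**2)
    return mdot_g_per_s * SECONDS_PER_YEAR / M_SUN_G      # solar masses per year

print(eddington_rate_msun_per_yr(1000))  # ~2e-5 solar masses per year for a 1,000 M_sun black hole
# "Super-Eddington" episodes exceed this rate, letting a small seed bulk up far faster.
```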
The results uncover a pivotal “missing link” between the first stars and the immense black holes that emerged later on.
Mehta elaborated: “These smaller black holes were previously considered too insignificant to develop into the gigantic black holes at the centers of early galaxies.”
“What we have demonstrated is that, although these nascent black holes are small, they can grow surprisingly quickly under the right environmental conditions.”
There are two classifications of black holes: “heavy seed” and “light seed.”
Light seed black holes start with a mass of only a few hundred solar masses and must grow significantly to transform into supermassive entities, millions of times the mass of the Sun.
Conversely, heavy seed black holes begin life with masses reaching up to 100,000 times that of the Sun.
Previously, many astronomers believed that only heavy seed types could account for the existence of supermassive black holes seen at the hearts of large galaxies.
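A simple back-of-envelope calculation shows why light seeds looked problematic: growing at the Eddington limit with a typical radiative efficiency of about 10 percent, a black hole’s mass can only double every few tens of millions of years, e-folding roughly every 45 million years (the Salpeter time). The sketch below uses these standard textbook values, not numbers from the new paper:

```python
import math

# Time to grow from a seed mass to a target mass at the Eddington limit,
# assuming a ~45 Myr e-folding (Salpeter) time for ~10% radiative efficiency.
# Illustrative textbook estimate only, not a result from the Maynooth simulations.
def eddington_growth_time_myr(seed_msun, target_msun, efold_myr=45.0):
    return efold_myr * math.log(target_msun / seed_msun)

print(round(eddington_growth_time_myr(100, 1e6)))   # ~414 Myr for a 100 M_sun light seed
print(round(eddington_growth_time_myr(1e5, 1e6)))   # ~104 Myr for a 100,000 M_sun heavy seed
# With JWST finding supermassive black holes only a few hundred Myr after the Big Bang,
# a light seed has almost no margin at the Eddington limit -- which is why brief
# super-Eddington episodes change the picture.
```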
Dr. John Regan, an astronomer at Maynooth University, remarked: “The situation is now more uncertain.”
“Heavy seeds may be rare and depend on unique conditions for formation.”
“Our simulations indicate that ‘garden-variety’ stellar-mass black holes have the potential to grow at extreme rates in the early universe.”
This research not only reshapes our understanding of black hole origins but also underscores the significance of high-resolution simulations in uncovering the universe’s fundamental secrets.
“The early universe was far more chaotic and turbulent than previously anticipated, and the population of supermassive black holes is also more extensive than we thought,” Dr. Regan commented.
The findings hold relevance for the ESA/NASA Laser Interferometer Space Antenna (LISA) mission, set to launch in 2035.
Dr. Regan added, “Future gravitational wave observations from this mission may detect mergers of these small, rapidly growing baby black holes.”
For further insights, refer to this paper, published in this week’s edition of Nature Astronomy.
_____
D.H. Mehta et al. 2026. Growth of light seed black holes in the early universe. Nature Astronomy, published online January 21, 2026; doi: 10.1038/s41550-025-02767-5
“Every so often, a groundbreaking product emerges that reshapes our reality,” said Steve Jobs during the 2007 Apple presentation. Tech executives often hype their innovations, but this proclamation was substantiated: the iPhone not only popularized apps but also put compact, powerful computers into our daily lives.
However, this transformation comes with drawbacks. Much like a snail retreating into its shell, we can retreat into our devices at any moment, breeding social anxiety. Partly because of such concerns, along with safety issues, numerous countries have restricted mobile phone use in educational settings, and Australia implemented a total ban on social media for users under 16 as of December 2025. Reliance on a constantly connected device can also diminish our sense of privacy, according to data scientists like Mar Hicks of the University of Virginia: “This technology is acclimating users to significantly less privacy, not only in public spaces but also within the privacy of their own homes.”
Smartphones have become far more than their basic functions suggest, as anthropologist Daniel Miller from University College London notes. “They’ve expanded our personal space,” he says. These handheld digital environments give seamless access to the virtual worlds of our friends and family, leaving us in continuous navigation between our physical and digital existence.
The global influence of smartphones is undeniable. According to GSMA, the mobile operators’ industry association, over 70% of the global population now owns a smartphone. In many low-income countries, people increasingly bypass traditional desktop computers altogether. Smartphone-driven fintech platforms facilitate transactions for 70 million users across 170 countries, removing the necessity for conventional banks. Furthermore, farmers utilize smartphone applications for crop monitoring, and doctors employ them in hospitals to reduce reliance on costly machinery.
Moreover, the ramifications of smartphones extend far beyond their immediate use. The rapid miniaturization of electrical components like cameras, transistors, and motion sensors has enhanced processing power and introduced new potentials. This technological evolution has spurred numerous 21st-century innovations, including versatile drones, smart wearables, virtual reality headsets, and miniature medical implants.
As an avid admirer of Peter F. Hamilton, I eagerly anticipated his latest release, Empty Hole, particularly because I’ve always been fascinated by the Ark story.
Centuries have elapsed since the ship set out, and its crew has devolved into a medieval-like society, residing beneath the remnants of their ancestors’ advanced technology. We uncover the challenges they encountered, including problems with the planet they were meant to land on and a rebellious uprising on board that stranded them in perilous circumstances. At the age of 65, inhabitants must be recycled to sustain the ship. This unique premise captivates me completely.
All of this is framed from the first-person viewpoint of Hazel, a 16-year-old girl. A significant breach exists in the ship’s hull (hence the title), she battles intense headaches, and she soon finds herself ensnared in a whirlwind of dramatic events. Yet she finds time to fret about boys and clothes, which I found hard to credit. Why would a girl focus on fashion when the survival of everyone on a spaceship is at stake, and she is constantly plagued by headaches?
As fans may know, Hamilton is a master storyteller renowned for his big-canvas science fiction. My personal favorites include Empty Space and the Dawn trilogy, as well as his intricate and thrilling Commonwealth Saga duology. His narratives are dynamic, wildly innovative, and filled with complexities that often leave me thrilled, even if I don’t fully grasp every detail.
I had reservations about Hamilton’s more recent works, like Exodus: Archimedes Engine, which ties into the upcoming video game Exodus. I felt certain plotlines were included solely to promote the game, detracting from the reader’s enjoyment. However, I appreciate that these works may not target my demographic, and it’s evident the seasoned author is seeking new challenges. (For those following the tie-in series, the second installment is due later this year, and the game itself is set to debut in 2027.)
All of which brings me back to Empty Hole. Midway through, I realized it seemed somewhat juvenile, for want of a better word. A little research revealed that the novel was initially released as an audio-only book in 2021, primarily categorized as “young adult” and targeted towards teenagers.
In a 2020 interview, Hamilton expressed, “Though young adults as protagonists define a particular publishing category, I hope this work will resonate with audiences of all ages.” Personally, I don’t believe that a youthful protagonist excludes the potential for an adult-oriented book. (I mention this as a writer of novels featuring teenage lead characters.) So, can readers of all ages enjoy this book?
The plot setup and twists are stellar, as expected from Hamilton. However, I wish he had toned down the “teenage” aspects; I don’t need an interlude of hand-holding with the boyfriend while the heroine is supposed to be fleeing danger. And I believe that making the protagonist truly confront the reality of being recycled at 65 would have added significant weight.
Perhaps Hamilton will capture a fresh audience with this release. For instance, as a movie or TV scout, I could envision how Empty Hole would look great on screen. This title is the first in a trilogy, with sequels slated for release in June and December. As I highlighted in my preview of new science fiction releases for 2026, this rapid schedule is unusual, and I’m excited to see how it unfolds.
If you’re yet to experience Hamilton’s classic works, there are various entry points into the remarkable worlds he has created. I recommend Pandora’s Star and its sequel, Judas Unchained, as excellent beginnings. If “epic space opera” resonates with you, these novels are likely a perfect match.
Emily H. Wilson is a former editor of New Scientist and author of Sumerian, a trilogy set in ancient Mesopotamia. The final book in the series, Ninchevar, is currently available. You can find her at emilywilson.com, or follow her on X @emilyhwilson and Instagram @emilyhwilson1.
Since the early 1990s, astronomers have made groundbreaking discoveries in exoplanet research. The real surge began in the early 2000s with comprehensive surveys, revealing that our unique solar system, featuring four rocky planets and four gas giants, might be unlike most others.
For decades, the Chilean High Precision Radial Velocity Planet Probe and the California Legacy Survey have meticulously tracked the stellar wobbles caused by exoplanets. While these surveys have not yielded as many exoplanet discoveries as pioneering telescopes like Kepler and TESS, they shed light on the distinctiveness of our solar system.
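To give a sense of what these surveys actually measure, here is a rough sketch of the standard radial-velocity relation (a textbook approximation assuming a circular, edge-on orbit; illustrative values, not data from either survey): the size of a star’s wobble depends on the planet’s mass, its orbital period, and the star’s mass.

```python
# Approximate radial-velocity semi-amplitude for a circular, edge-on orbit
# (standard textbook relation; illustrative only, not survey data).
def rv_semi_amplitude_m_s(period_yr, planet_mass_mjup, star_mass_msun=1.0):
    return 28.4 * period_yr ** (-1.0 / 3.0) * planet_mass_mjup * star_mass_msun ** (-2.0 / 3.0)

print(round(rv_semi_amplitude_m_s(11.86, 1.0), 1))        # Jupiter tugs the Sun by ~12.5 m/s
print(round(rv_semi_amplitude_m_s(1.0, 1.0 / 317.8), 3))  # Earth induces only ~0.09 m/s
```

Wobbles of a few meters per second are within reach of these spectrographs, but the centimeter-scale signal of a true Earth twin sits at the very edge of what they can detect. It is against this backdrop that the surveys highlight how distinctive our solar system appears.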
For instance, our Sun is larger than over 90% of other stars and exists alone, unlike the many stars that have companion stars. Earth’s size is also unusual, and only about 1 in 10 stars hosts a planet like Jupiter. When such planets are found, their orbits often differ dramatically from Jupiter’s stable, circular path. Notably absent from our system are super-Earths and sub-Neptunes, which are common in other star systems. And despite thousands of exoplanet discoveries, Earth-like planets orbiting Sun-like stars, and any potential extraterrestrial life, remain elusive.
“Our solar system is strange due to what we have and what we lack,” states Sean Raymond from the University of Bordeaux, France. “It’s still uncertain whether we are simply rare at the 1% level or genuinely unique at the 1 in a million level.”
These revelations prompt intriguing inquiries about the formation of our solar system. Questions remain, such as why Jupiter orbits so far from the Sun rather than close in, as giant planets do in many other planetary systems. The unusual orbits of exoplanets have made astronomers reconsider our system’s history. The Nice model, proposed in 2005, suggests a major reconfiguration after formation, with the giant planets migrating to their current positions and scattering asteroids and other small bodies onto new trajectories.
“The understanding that such a shift could occur stemmed directly from exoplanet research,” Raymond notes. “Approximately 90% of large exoplanetary systems exhibit instability. This insight prompts speculation about possible historical fluctuations within our solar system.”
Batteries and solar energy technologies have been evolving for centuries, but they reached a pivotal moment in 2016. That year marked the launch of the first Gigafactory in Nevada, which produces cutting-edge batteries, electric motors, and solar cells at enormous scale. The ‘Giga’ in the name refers to the gigawatt-hours of battery capacity the factory can turn out each year.
The renewable energy potential—including solar, wind, and hydropower—is staggering. In merely a few days, the sun provides more energy to Earth than we can harvest from all fossil fuel reserves combined.
Efficiently harnessing this power remains a challenge. The photovoltaic effect, discovered by Edmond Becquerel in 1839, allows light to generate an electric current. Although the first practical solar panels emerged in the 1950s, only in the 2010s did solar technology become cheap enough to rival fossil fuels. Meanwhile, lithium-ion batteries, first developed in the 1980s, have provided reliable energy storage.
The Gigafactory has been instrumental in advancing these solar and battery technologies, not through new inventions but by integrating every component of electric vehicle production under one roof. The approach echoes Henry Ford’s assembly lines, only this time populating the world with Teslas instead of fossil fuel-burning vehicles. “Batteries have made it possible to utilize solar power efficiently, and electric vehicles are now a reality,” says Dave Jones from Ember, a British energy think tank.
The economies of scale introduced by gigafactories have extended their impact beyond electric vehicles. “These batteries will enable a host of innovations: smartphones, laptops, and the capacity to transport energy efficiently at lower costs,” remarks Sarah Hastings-Simon from the University of Calgary, Canada.
Thanks to these advancements, the costs of both technologies have plummeted, and many experts believe the electrification of energy systems is now inevitable. In places such as California and Australia, solar power is at times so abundant that grid operators offer electricity at no cost. Battery technology is improving rapidly too, enabling the development of electric planes, ships, and long-haul trucks, and beginning to break our reliance on the fossil fuels that have dominated energy systems for centuries.
After the reintroduction of wolves to Yellowstone National Park in 1995, significant ecological changes were observed, particularly a substantial decrease in elk populations. This decline was largely attributed to the impact of wolves on elk behavior: where wolves were likely present, elk dedicated more time to vigilance and less to foraging. Biologist John Laundre referred to this phenomenon as a “landscape of fear” in a pivotal 2001 study.
This concept builds on earlier research that suggested predator fear could influence prey behavior. Until then, it was widely assumed that predators primarily affected prey populations through physical predation alone. Laundre’s observations challenged this notion, indicating a potentially complex relationship between fear and wildlife dynamics.
Recent studies led by Liana Zanet at Western University in Ontario, Canada, further explore this landscape of fear. Over the past two decades, Zanet and her colleagues have conducted experiments in British Columbia, playing recordings of predator calls near wild songbirds. Birds exposed to predator sounds laid fewer eggs, hatched fewer of them, and fewer than half as many of their hatchlings survived compared with birds that heard non-predator sounds. This suggests that fear alone can rival, or even outweigh, the effects of direct predation on wildlife populations.
According to Zanet, prey animals often prioritize safety over foraging opportunities, avoiding prime feeding areas when they perceive threats. This fear-based behavior has profound ecological implications. On Canada’s west coast, the absence of natural predators like bears, cougars, and wolves has allowed raccoons to flourish, leading them to scavenge food resources along the coastline.
When Zanet’s team played recordings of barking dogs in these coastal areas, the raccoons largely avoided the beach, spending their time watching for potential threats instead. Where the fear of predators was heightened in this way, the shoreline animals the raccoons prey on rebounded dramatically. No such effect was observed when non-threatening seal sounds were played.
Understanding landscapes of fear is crucial for comprehending the profound impacts humans have on wildlife. In a specific study, Zanet’s team utilized camera traps to observe how wild animals responded to various sounds in Kruger National Park, South Africa. Surprisingly, they found that the fear generated by human presence surpassed that of lions, highlighting the extensive influence of human activity on wildlife behavior and ecosystems.
In the last 25 years, the field of human evolution has witnessed remarkable growth, showcased by a significant increase in discoveries. Archaeologists have unearthed more fossils, species, and artifacts from diverse locations, from the diminutive “hobbits” that inhabited Indonesian islands to the enigmatic Homo naledi, known solely from a single deep cave system in South Africa. Simultaneously, advanced analytical techniques have enhanced our understanding of these findings, revealing a treasure trove of information about our origins and extinct relatives.
This whirlwind of discoveries has yielded two major lessons. First, since 2000, our understanding of the human fossil record has been extended further back in time. Previously, the oldest known human fossil was the 4.4-million-year-old Ardipithecus, but discoveries announced in 2000 and 2001 unearthed even older species: Orrorin tugenensis, from around 6 million years ago, and Sahelanthropus tchadensis, from around 7 million years ago. Additionally, the Orrorin lineage was tentatively identified in 2022, suggesting it is slightly more recent than O. tugenensis.
According to Clement Zanoli from the University of Bordeaux, the discovery of these early human fossils represents “one of the great revolutions” in our understanding of evolution.
The second major lesson has enriched the narrative of how our species emerged from earlier hominins. By 2000, genetic evidence had established that all non-Africans descend from ancestors who lived in Africa around 60,000 years ago. This indicated that modern humans evolved in Africa and subsequently migrated, replacing other hominin species.
However, by 2010, the sequencing of the first Neanderthal genome opened a new chapter, along with the DNA analysis of several other ancient humans. These studies revealed that our species interbred with Neanderthals, Denisovans, and possibly other groups, creating a complex tapestry of human ancestry.
Skeletal research had long hinted at interbreeding, as many fossils exhibit traits that defy clear species categorization, as noted by Sheila Athreya at Texas A&M University. In 2003, Eric Trinkaus and colleagues described a jawbone excavated from Peștera cu Oase, Romania, as a human-Neanderthal hybrid based on its morphology. Genetic testing in 2015 confirmed that the Oase individual had a Neanderthal ancestor just four to six generations back.
This evidence highlights that our species did not merely expand from Africa; rather, our population absorbed genetic contributions from Neanderthals and Denisovans along the way. Genetically, we are a mosaic, a fusion of countless years of diverse human lineages.
Everyone has secrets to protect. In today’s digital age, whether safeguarding personal messages, business communications, or confidential state information, end-to-end encryption (E2EE) offers essential security and peace of mind.
E2EE ensures that your communications remain private from internet service providers and the operators of messaging or video conferencing applications. Messages are encrypted on the sender’s device and only decrypted by the recipient, making them unreadable to unauthorized parties while in transit. This prevents access by any entity, including law enforcement or corporate insiders.
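As a minimal sketch of that flow (an illustration using the open-source PyNaCl library, not the implementation of any particular messaging app), the sender encrypts on their own device and only the recipient’s private key can recover the message:

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")  # unreadable to anyone relaying it

# Only Bob, holding his private key, can decrypt and authenticate the message.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at noon"
```

The server relaying the ciphertext sees only scrambled bytes, which is precisely what keeps providers, insiders, and eavesdroppers out of the conversation.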
Digital encryption is rooted in robust mathematics rather than mere assurances. The RSA algorithm, introduced in 1977, pioneered modern encryption by relying on the complexity of factoring large numbers into their prime components. Since then, various algorithms have emerged, utilizing intricate mathematics to enhance cryptographic security.
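To make the idea concrete, here is textbook RSA with deliberately tiny primes; real keys use primes hundreds of digits long, and the security rests on how hard it is to recover p and q from n at those sizes. This is an illustrative sketch only, not production cryptography.

```python
# Textbook RSA with toy numbers -- illustrative only; never use keys this small.
p, q = 61, 53
n = p * q                  # 3233: the public modulus (its prime factors are the secret)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # 2753: private exponent (modular inverse of e; Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # 2790: anyone can encrypt with the public pair (e, n)
decrypted = pow(ciphertext, d, n)  # 65: only the holder of d can decrypt
assert decrypted == message
```

Modern E2EE systems typically rely on elliptic-curve key exchange rather than raw RSA, but the principle of easy-to-compute, hard-to-reverse mathematics is the same.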
The true strength of E2EE lies not just in its technical implementation, but in how it upholds democracy and human rights across the globe. As Matthew Feeney from the UK privacy group Big Brother Watch states, “There are individuals in perilous regions depending on encryption to preserve their lives.” Additionally, even in recognized democracies, freedom is vulnerable. Feeney warns that those who claim “I have nothing to hide” should take heed of history’s lessons.
Many governments view E2EE unfavorably because it blocks surveillance, much as sealed envelopes do in the postal system. UK governments have repeatedly attempted to restrict E2EE; most recently, Prime Minister Keir Starmer’s government reversed a controversial demand for a backdoor into Apple’s encrypted services following a public outcry.
Feeney acknowledges the uncertainty surrounding whether E2EE has ever been covertly compromised, as intelligence agencies typically do not disclose their capabilities. Concerns also loom over quantum computing, which may eventually be able to break current encryption algorithms. However, cryptography continues to evolve, with new post-quantum algorithms being developed to replace those at risk. “Governments may wield power, but they can’t override the laws of mathematics,” Feeney asserts.