A groundbreaking study examining various South American butterfly lineages and diurnal moths reveals that convergent evolution—where unrelated species develop similar traits—follows a consistent genetic pattern. This discovery has significant implications for understanding how species may adapt to climate change.
Mimetic wing patterns in ithomiine and Heliconius butterflies and a day-flying moth. Image credit: Ben Chehida et al., doi: 10.1371/journal.pbio.3003742.
“Convergent or parallel evolution serves as a natural experiment where unrelated species independently evolve similar traits in response to equivalent selective pressures,” states Kanchon Dasmahapatra, a professor at the University of York.
“This indicates how reproducible—and thus predictable—evolution can be.”
“Highly divergent lineages often display significant trait convergence, such as repeated colonization of habitats like land, water, and air, or the evolution of resistance against threats like pesticides, drought, and heat stress.”
According to the researchers, “Convergence in traits across different species can stem from genetic changes occurring in different genes or in the same gene (gene reuse).”
“Gene reuse is expected to be more prevalent among closely related lineages or when developmental pathways towards optimal fitness are limited.”
“Convergence may happen when the same allele is reused (allele sharing), either through independent mutations in one gene or through ancestral variation and introgression between species.”
In this study, the authors investigated various species of distantly related South American rainforest butterflies and moths that share similar wing color patterns for predator deterrence, a phenomenon known as mimicry.
The study aims to identify the genes responsible for these similar mimic color patterns among seven distantly related species.
Remarkably, the researchers found that distinct butterfly and moth species reuse the same two genes, ivory and optix, to produce similar color patterns, despite being very distant relatives.
Genetic alterations in several butterfly species did not occur in the genes themselves but rather in similar “switches” that control gene expression.
Interestingly, one moth species carries a chromosomal inversion, in which a long stretch of DNA is flipped end to end, mirroring a genetic strategy used by one of the butterflies.
“Convergent evolution, where numerous unrelated species independently develop the same trait, is a widespread phenomenon across the tree of life,” says Professor Dasmahapatra.
“However, there is limited opportunity to explore the genetic foundation of this phenomenon.”
“By studying seven butterfly lineages along with diurnal moths, we demonstrate that evolution is surprisingly predictable and that both butterflies and moths have repeatedly employed the same genetic tricks to develop similar color patterns since the time of dinosaurs.”
The findings from this study reveal that evolution may not always be random and could be more predictable than previously believed.
Professor Joana Meier from the Wellcome Sanger Institute remarked: “All these distantly related butterflies and moths are toxic and unpalatable to birds that attempt to consume them.”
“Their similarities are advantageous; if birds recognize a specific color pattern as indicating ‘don’t eat us, we are poisonous’, it benefits other species to exhibit the same warning colors.”
“Our research illustrates that these warning colors are remarkably optimal. With a highly conserved genetic basis over 120 million years, evolving these similar color patterns could be quite straightforward.”
The results are published in the journal PLoS Biology.
_____
Y. Ben Chehida et al. 2026. Convergent mimic coloration in lepidopterans over 120 million years of evolution is underpinned by genetic parallelism. PLoS Biol 24 (4): e3003742; doi: 10.1371/journal.pbio.3003742
Recent research from the Max Planck Institute for Geoanthropology and the University of Cambridge reveals that malaria significantly impacted early humans, not just as a disease, but as a factor that influenced habitat selection, population fragmentation, and the genetic evolution of our species.
Colucci et al. investigated how Plasmodium falciparum-induced malaria influenced habitat selection among early human societies from 74,000 to 5,000 years ago.
“Malaria, a significant global health issue caused by the Plasmodium parasite, affects approximately 263 million people annually,” stated lead author Dr. Margherita Colucci and her team.
“Genetic evidence indicates that malaria posed a serious challenge during both recent prehistory and the Pleistocene epoch, with sickle cell anemia mutations linked to malaria emerging in Africa between 25,000 and 22,000 years ago.”
Archaeological findings also suggest that early humans developed tactics to minimize exposure to mosquitoes, such as using aromatic leaves with insecticidal properties in their surroundings.
The new study highlights how Plasmodium falciparum malaria played a crucial role in shaping human history in sub-Saharan Africa from 74,000 to 5,000 years ago.
Researchers discovered that malaria affected where early human populations settled, pushing them away from high-risk areas and leading to increased dispersal across various landscapes.
Over thousands of years, this demographic fragmentation influenced how groups intermingled and exchanged genetic material, ultimately shaping the genetic landscape of modern humans.
These findings suggest that malaria was more than just a health threat; it was a key factor in shaping human history.
“We utilized species distribution models for major mosquito groups alongside paleoclimate data,” explained Dr. Colucci.
“By integrating these findings with epidemiological insights, we estimated malaria transmission risks throughout sub-Saharan Africa.”
The researchers then compared these risk estimates with independent reconstructions of human ecological niches in the same regions during that time frame.
The results indicated that humans actively avoided high-risk malaria areas or could not survive in them.
Professor Andrea Manica remarked, “These decisions have significantly influenced human demographics over the past 74,000 years—and possibly beyond.”
“Malaria has played a pivotal role in shaping the structure of human societies.”
“Factors like climate and geographical barriers were not the only determinants of human habitation,” he added.
Professor Eleanor Scerri noted, “This study opens new avenues in the exploration of human evolution, as disease has rarely been considered a driving force in our ancestry. Without ancient DNA from this period, verification would have been challenging.”
“Our research redefines this narrative and provides a new perspective on the role of disease in early human history.”
The findings are published today in Science Advances.
_____
Margherita Colucci et al. 2026. Malaria’s impact on human spatial organization over 74,000 years. Science Advances 12 (17); doi: 10.1126/sciadv.aea2316
How does Simon Singh’s classic popular science book “Fermat’s Last Theorem” resonate today?
Did you know that the number 26 is unique? It’s the sole integer nestled between a square number, 25 (5²), and a cube number, 27 (3³). No other whole number is sandwiched between a perfect square and a perfect cube in this way.
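This uniqueness claim is easy to probe numerically. The sketch below (a sanity check over a finite range, not a substitute for a mathematical proof) searches for any integer whose lower neighbor is a perfect square and whose upper neighbor is a perfect cube:

```python
import math

def sandwiched(limit):
    """Return integers n <= limit with n - 1 a perfect square and n + 1 a perfect cube."""
    hits = []
    b = 2
    while b ** 3 - 1 <= limit:
        n = b ** 3 - 1               # n + 1 is the cube b**3
        a = math.isqrt(n - 1)        # exact integer square root
        if a * a == n - 1:           # n - 1 is a perfect square
            hits.append(n)
        b += 1
    return hits

print(sandwiched(10 ** 9))  # only 26 turns up
```

Only 26 appears even over a billion integers, consistent with the proof (discussed below in connection with Fermat) that 25 and 27 form the sole square-cube sandwich.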
Simon Singh’s 1997 book Fermat’s Last Theorem is an insightful exploration of mathematical proof. It delves into what proof means, how it can be achieved, and what drives mathematicians in their passionate pursuits. This book narrates a captivating quest for evidence, making it a compelling read. Given that it took 350 years for the proof to surface, it also offers an impressive historical lens on mathematics. For many, the essence of mathematics feels like abstract reasoning beyond reach. Yet, Singh’s work transports readers into this captivating realm, remaining a treasure even nearly 30 years after its publication.
Singh begins with Pythagoras, renowned for his contributions to the geometry of triangles. Most people are familiar with the Pythagorean theorem, stating that the sum of the squares of a right triangle’s two shorter sides equals the square of the longest side (x² + y² = z²). While earlier cultures used this relationship before him, Singh highlights how Pythagoras distinguished himself by proving it true for all right triangles, not through trial and error but via inarguable logic. “The quest for mathematical proof is a pursuit for absolute knowledge,” Singh asserts.
My favorite segment involves the tale of Pythagoras: I learned that he founded the secretive Pythagorean Brotherhood, and I was fascinated by the story of Cylon, a man denied admission, who conspired against Pythagoras.
Next, Pierre de Fermat enters the narrative. Living in 17th-century France, this judge revealed remarkable mathematical prowess. He famously proved the uniqueness of the number 26. Fermat became renowned for his “last theorem,” an elegant extension of the Pythagorean theorem. While infinitely many whole numbers satisfy the Pythagorean equation, Fermat proposed that raising the exponent, to xⁿ + yⁿ = zⁿ for any integer n greater than 2, leaves no whole-number solutions. In 1637, he audaciously claimed to possess a “truly marvellous” proof, though he never wrote it down.
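The contrast between the exponent-2 case and Fermat's claim is worth making concrete. Euclid's classical formula shows why the Pythagorean equation never runs out of whole-number solutions, which is exactly the abundance that vanishes for higher powers (a small illustrative sketch):

```python
def pythagorean_triples(count):
    """Generate Pythagorean triples via Euclid's formula: (m^2 - n^2, 2mn, m^2 + n^2)."""
    out = []
    m = 2
    while len(out) < count:
        for n in range(1, m):        # every m > n > 0 yields a valid triple
            out.append((m * m - n * n, 2 * m * n, m * m + n * n))
            if len(out) == count:
                break
        m += 1
    return out

for x, y, z in pythagorean_triples(5):
    print(x, y, z)                   # each satisfies x**2 + y**2 == z**2
```

Raising m without bound produces triples forever; no analogous generator can exist for cubes or higher powers, which is what Wiles finally proved.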
For 350 years, mathematicians chased its secrets. Singh adeptly navigates this journey, introducing a colorful cast of characters. One standout is Sophie Germain, a pioneering French mathematician who worked under a male alias. Évariste Galois, a fervent revolutionary, made significant contributions but was killed in a duel. Yutaka Taniyama, a brilliant Japanese mathematician, played a key role in the eventual proof but tragically took his own life.
Yet, our narrative’s hero is mathematician Andrew Wiles, who ultimately proved Fermat’s theorem true in 1994. Singh skillfully portrays Wiles, illuminating his notable achievements, even as he shunned the limelight. Through Wiles’ work—constructing a logical bridge between elliptic curves and modular forms—readers gain insight into complex mathematical realms.
However, the journey contains a tense twist: Wiles’ original proof revealed an error—a nightmare scenario. Yet, he rose from these ashes, ultimately correcting the flaws. My only critique is that this part of the narrative could have been more concise.
Although Singh’s book dates back to the 90s, its themes remain pertinent in modern mathematics. One concept tying both the book and Wiles’ proof is the Langlands program, proposed by mathematician Robert Langlands in 1967. It suggests that various mathematical areas are interconnected, and uncovering these ties could lead to breakthroughs in previously unsolvable problems. Wiles’ research provided early confirmation of the Langlands conjecture, with recent discoveries shedding further light on this vibrant area of mathematics.
Upon finishing the book, I felt as if I was wandering through a gallery of abstract art. Mathematics proofs, like art, invite quiet observation, arousing curiosity about the minds behind them, and providing glimpses beyond everyday experience. This book deserves the highest praise for evoking such profound emotions.
Lyudmila Dyblenko – Chernobyl’s Guardian During the 2022 Occupation
Mykhailo Palinchak
On February 24, 2022, as Russian forces advanced into Ukraine, Lyudmila Dyblenko, head of the Chernobyl meteorological observatory, ordered her staff to evacuate. Unfortunately, she was unable to escape, as the exclusion zone around the Chernobyl nuclear plant fell under Russian occupation.
“We started gathering equipment and monitors, but it was too late,” Dyblenko recounted in the modest hut that hosts the weather station. Despite the dire circumstances, she heroically resolved to continue essential measurements—radiation, temperature, wind, and rainfall—that are crucial for scientists monitoring the situation in Chernobyl. “I chose to keep working,” she stated. “I truly love my job and my country.”
While monitoring is typically automated, power cuts from March 9 left her equipment inoperable and made electric heating and cooking impossible. With its wood fire continuously lit, the modest hut became the warmest refuge of her winter in Chernobyl, and she kept working from a desk there as conditions under occupation grew increasingly harsh.
Dyblenko meticulously tracked Russian patrols, timing her exits to collect manual measurements, eventually using an older cell phone to transmit data due to its superior reception capabilities. Situated in the highlands of Chernobyl, she discovered nearby spots—a church and a truck park—where weak signals permitted data extraction.
“There is software that automatically compiles and sends data, but that was impossible during the power outage,” Dyblenko explained. “We had to do it manually.”
As time passed, Russian soldiers grew bolder. At one point, one forced his way into her house demanding cognac. She defused the situation by treating him like a mischievous child, asking, “Is this a restaurant?” He retreated.
Eventually, she spotted a small red light in the bushes near her scientific equipment, realizing a surveillance device had been placed there. Ignoring the threat, she persisted in her crucial work.
Thanks to her relentless efforts, there were no gaps in the data, allowing uninterrupted scientific analysis of the Chernobyl Exclusion Zone during the occupation. In recognition of her bravery, Ukrainian President Volodymyr Zelenskiy awarded her one of the few medals given to a meteorologist during the ongoing conflict.
Astronauts captured stunning images of the moon’s crater-filled south polar region during the Artemis mission. NASA is planning future lunar landings focused on this area.
The moon’s south pole features numerous craters believed to contain water ice, presenting unique challenges for navigation compared to the Apollo landing sites near the equator. Insights gathered during the Artemis II mission will help identify potential landing sites for upcoming exploration.
Towards the conclusion of the lunar flight, astronauts had the incredible opportunity to observe a solar eclipse from space. They recorded detailed observations for roughly an hour as the sun disappeared behind the moon and emerged from the opposite side.
During the initial phases of the eclipse, astronauts utilized specialized glasses akin to those worn on Earth to safely view the event as the moon obscured the sun’s rays.
Changes in predator populations may have driven early humans to develop innovative tools
Raul Martin/MSF/Science Photo Library
Approximately 200,000 years ago, a decline in megafauna may have compelled early humans to transition from heavy stone tools to more lightweight hunting kits designed for smaller prey. A recent study supports the notion that this change in hunting strategy could have sparked a rise in cognitive capabilities among our ancestors.
For over a million years, various early human species relied on heavy stone tools such as handaxes, cleavers, scrapers, and stone balls. These robust tools were essential for hunting and butchering large herbivores, including extinct relatives of modern elephants, hippos, and rhinos.
Between 400,000 and 200,000 years ago, archaeological evidence shows a notable increase in smaller, sophisticated tools alongside the fading of traditional heavier tools. Our species, Homo sapiens, emerged during this timeframe.
Circa 200,000 years ago, heavy stone tools vanished from the archaeological record of the Levant, while diverse, lightweight stone toolkits, such as blades and precision scrapers, became increasingly common.
Research led by Vlad Litov, a professor at Tel Aviv University, revealed a correlation between these technological advancements and a significant decline in large herbivores, potentially due to overhunting.
The researchers analyzed archaeological findings from 47 sites across the Levant, spanning the Paleolithic period, which lasted from around 3.3 million years ago to 12,000 years ago. Their analysis of dated stone artifacts in relation to animal remains uncovered a compelling trend.
Findings indicate a drastic reduction in the biomass and specimen count of giant herbivores exceeding 1,000 kilograms correlating with the disappearance of heavy tools 200,000 years ago. Conversely, the availability of smaller prey increased alongside more sophisticated small tools.
Supporting the connection between tool technology and prey type, the researchers noted that sturdy stone tools were still in use in regions with abundant large game, such as southern China, until about 50,000 years ago.
Heavy-duty tools and their evolution to lightweight alternatives used by early humans
Vlad Litov et al., Institute of Archaeology, Tel Aviv University
Previous theories suggested that advancements in technology stemmed from increasing intelligence and creativity due to evolutionary pressures. However, Litov and his research team propose a different perspective: reliance on smaller prey may have catalyzed the evolutionary growth of larger brains in modern humans.
“As large herbivores dwindled, humans increasingly depended on smaller prey, necessitating varied hunting strategies, advanced planning, and the implementation of lightweight, intricate toolsets,” states Litov. “This cognitive evolution was a byproduct of adapting to new prey types, rather than the initial driver of this adaptive transformation.”
“There is more to this adaptation than merely prey size,” says Ceri Shipton from University College London. He notes preliminary evidence indicating mass hunting of medium-sized ungulates like horses and bison, with signs of enhanced cognitive abilities and planning emerging during the Middle Paleolithic.
Nicolas Tessandier from the French National Center for Scientific Research also maintains some reservations. “Human adaptation to new fauna underscores adaptability rather than mere intelligence,” he posits. “Producing powerful tools for hunting large herbivores was equally astute.”
Litov recognizes that prior research has shown advanced cognitive functions present early in human evolution, notably in the development of Homo erectus around two million years ago. However, he emphasizes that switching from large to smaller prey had major consequences for human society. A single ancient elephant carcass could sustain a group of about 35 hunter-gatherers for months. As these high-calorie resources vanished, reliance on smaller prey reduced the yield per animal.
“Energetically, we had to gather numerous smaller ungulates, such as fallow deer, to replace the loss of one elephant,” explains Litov. This shift likely stimulated diverse cognitive and behavioral changes, including cooperative hunting strategies, advanced techniques, and enhanced social collaboration and organization. “Such adaptations may have contributed to the evolution of larger brains in later species, including Neanderthals and Homo sapiens,” he adds.
“In my view, the decline in large prey familiar to hominins likely intensified competition among groups,” asserts Shipton. “It was probably an iterative process where the reduction of larger prey prompted cognitive shifts that facilitated access to smaller prey.”
2026 marks a significant milestone as humanity embarks on its bold journey to colonize Mars.
Later this year, NASA’s twin ESCAPADE orbiters are set to launch towards Mars, laying the groundwork for future crewed missions.
Future settlers aim to create self-sustaining cities on Mars, transforming its harsh landscape and opening new possibilities for humanity beyond Earth. This endeavor also has the potential to extend the survival of human consciousness.
Elon Musk has expressed his ambition to land on Mars within two years, as noted in 2024 on X. He has often referenced Kim Stanley Robinson’s acclaimed novel, Red Mars, published in 1992.
Set in 2026, Robinson’s narrative doesn’t rely on extraterrestrial conflicts or futuristic technologies. Instead, it delves into the ethical dilemmas faced by humans, highlighting debates surrounding the sanctity of intelligent life versus the need for solar system exploitation.
Robinson’s prophetically accurate depiction of the future includes a world dominated by powerful multinational corporations, overshadowing the United Nations. The author suggests that the UN operates as a mere tool for these corporations, predicting a future where corporate interests dictate global affairs.
His vision resonates with early predictions by Pulitzer Prize-winning science writer David Dietz, who forecasted rampant resource overexploitation and an increase in competition, leading to rising prices and a decline in luxury goods.
Robinson’s Red Mars illustrates how future generations will navigate environmental challenges. Climate change is a key factor motivating humanity to leave Earth, and the protagonist, Ann Clayborne, views Mars as a new beginning rather than a mere resource. “You can’t simply erase the surface of a planet that’s 3 billion years old,” she notes during discussions on terraforming.
The character Frank Chalmers reflects on past ecological disasters on Earth, drawing parallels to today’s ambitious “climate megaprojects,” such as glacier stabilization and large-scale re-greening efforts.
Red Mars also continues the tradition of classic speculative fiction, focusing on human conflict and societal division as the settlers grapple with how best to cultivate their new home. This central theme is further developed in Robinson’s sequels, Green Mars and Blue Mars.
Ann’s concerns about the ethical implications of creating breathable air on Mars echo a profound respect for potential undiscovered native life. “It would be unscientific and, worse, immoral,” she asserts.
The depth of Robinson’s characters and narratives makes Red Mars a treasured work, earning both the Nebula Award and the British Science Fiction Association Award, and it has attracted numerous attempts at a screen adaptation, including interest from director James Cameron before he focused on the Avatar universe.
The sequel, Green Mars, was also carried toward the Red Planet aboard NASA’s Phoenix Mars lander, launched in 2007, as part of an interplanetary library, a nod to Robinson’s enduring influence on the genre.
Outside of his Mars Trilogy, Robinson has expressed caution regarding future technological advancements and governance in his works. His novel, 2312, published in 2012, envisions a world facing extreme heat and rising sea levels while reflecting on humanity’s slow response to climate issues.
In the same year, he addressed the future of technology and society at the Humanity+ conference, emphasizing the need for inclusivity in tech advancements, stating, “[It] has to be for All People Plus,” hinting at underlying societal tensions.
The New Scientist Book Club is currently reading Red Mars by Kim Stanley Robinson. Join us for a collective reading experience here.
Recent fieldwork in the remote rainforests of New Guinea’s Vogelkop Peninsula has turned up living populations of two marsupials long thought lost: the Pygmy Longfinger Possum (Dactylonax kambuyai) and the Wow Glider (Thus ayamalensis), both believed to have vanished around 6,000 years ago. The discoveries indicate that New Guinea’s rich forests may still conceal remnants of an ancient animal kingdom.
Pygmy Longfinger Possum (Dactylonax kambuyai), a female spotted in the Kralik area of the Vogelkop Peninsula. Image credit: Carlos Bocos.
Professor Tim Flannery from the Australian Museum states: “The identification of a ‘Lazarus taxon’ is a remarkable event, especially when it was thought to be recently extinct.”
“The uncovering of two species once believed to be extinct for millennia is truly exceptional.”
“This discovery underscores the crucial need to conserve these unique biological regions and highlights the significance of collaborative research in safeguarding hidden biodiversity.”
The pygmy long-fingered possum and the ring-tailed glider, known through Pleistocene fossils found in Australia and New Guinea, inhabit secluded lowland forests of the Vogelkop Peninsula.
“Vogelkop represents an ancient section of the Australian continent, now part of New Guinea,” remarked Professor Flannery.
“Its forests may still harbor even more hidden aspects of Australia’s natural history.”
The Pygmy Longfinger Possum boasts striking stripes and remarkable adaptability, featuring one finger on each hand that is twice as long as the next longest finger.
This species is thought to have vanished from Australia during the Ice Age, a period notorious for the extinction of iconic megafauna, including Diprotodon and the marsupial lion.
Wow Glider (Thus ayamalensis), a subadult from the South Solon area of the Vogelkop Peninsula. Image credit: Arman Muharmansyah.
The ring-tailed glider is closely related to the Australian greater glider (Petauroides) and marks the first new genus of marsupials identified in New Guinea since 1937.
Smaller than its relatives, this species features furless ears and a strong, prehensile tail, forming lifelong pair bonds and typically raising just one pup annually.
Similar to sugar gliders, these marsupials reside in tree hollows high within the forest canopy and face threats from logging practices.
“The glider, known locally as tous among some Tamburou and Maybrat communities, is deemed sacred,” shared Lika Koline, a Maybrat community member.
“It symbolizes the spirits of our ancestors and plays a key role in educational practices such as initiation ceremonies.”
“Our meticulous collaboration with Tamburou Elders was essential, and without the involvement of Traditional Owners, this identification would not have been feasible.”
“We are immensely proud that Papuan researchers have contributed to these groundbreaking findings. Our gratitude extends to the communities in Misool, Maybrat, and Tambulo for their continued support in this research,” stated Dr. Aksamina Yohanita from the University of Papua.
A detailed study discussing these findings was published on March 6th in Records of the Australian Museum.
_____
Tim F. Flannery et al. 2026. Reemergence after 6,000 years: A modern record of the ‘extinct’ Papuan marsupial, Dactylonax kambuyai (Marsupialia: Petauridae), revised phylogeny and zoogeography of the genus Dactylonax. Records of the Australian Museum 78 (1): 17-34; doi: 10.3853/j.2201-4349.78.2026.3003
Scanning Electron Micrograph: Escherichia coli (Yellow) Infecting Human Bladder Cells (Blue), Resulting in Thick Mucus Secretion (Orange).
Professor PM Motta et al./Science Photo Library
New research indicates that severe infections, such as cystitis, pneumonia, and dental issues, could elevate dementia risk. A comprehensive study in Finland involving hundreds of thousands of participants revealed that hospitalizations for these infections were linked to a higher probability of developing dementia, including early-onset dementia, within six years.
Current findings suggest that dementia, particularly Alzheimer’s disease, may be preventable or delayed through brain-training activities, lifestyle changes, and even sauna use. Further evidence now supports that minimizing infections may significantly lower dementia risk. “This indicates that the risk of dementia may be partially modifiable,” says Quantin Wu from Emory University in Atlanta, Georgia, who was not part of the study.
To explore this further, researchers analyzed health records from 62,555 individuals aged 65 and older, all of whom were free from dementia in 2016 but diagnosed between 2017 and 2020. This cohort was compared with 312,772 dementia-free individuals matched by age, gender, education, and marital status, highlighting a two-decade span of diagnoses and hospitalization records.
The team identified 29 health conditions linked to an approximately 20% increased risk of developing dementia within the following five to six years. While most of these were non-infectious, cystitis and nonspecific bacterial infections stood out, and subsequent analyses indicated that infections, rather than the 27 non-infectious conditions, primarily drove the increase in risk.
While inflammation is a crucial immune response to infections, it also plays a role in certain dementia types, including Alzheimer’s disease. Infection-induced inflammation can damage the brain’s circulatory system, leading to microbleeds and the infiltration of toxins through the blood-brain barrier, according to Sipilä. Moreover, there is increasing evidence that vaccines targeting infections like shingles and influenza may lower dementia risk.
In another segment of the research, a focus was placed on early-onset dementia, identified in individuals under 65. Although conditions such as Parkinson’s disease and head trauma pose significant risks, multiple infections—including gastroenteritis, colitis, pneumonia, and dental infections—were found to have associations with early-onset dementia.
The variation in which specific infections affect either early-onset or standard-onset dementia remains unclear. The researchers noted differing causes and genetic susceptibilities associated with these dementia types in their findings.
Despite the robust correlations observed, it’s uncertain whether these infections directly cause dementia or whether the associations arise from confounding variables. “To clarify, intervention trials are essential to assess whether improved infection prevention can effectively lower or delay dementia onset,” Sipilä asserts.
Gill Livingston, a professor at University College London, expressed openness to the possibility that such studies may affirm causal links. “This high-quality research aligns with other evidence, and when considering the timeline and biological plausibility, it seems likely,” she states.
This insight could significantly enhance strategies for preventing, managing, and monitoring serious infectious diseases, according to Wu. For instance, preventing cystitis involves maintaining adequate hydration and administering appropriate incontinence care. “Timely treatment is vital as UTIs in older adults often manifest unusually, such as through confusion or delirium, which can lead to missed or delayed diagnoses,” she emphasizes. “This study is both concerning and motivating.”
Geoscientists have made a groundbreaking discovery by analyzing magnetic signals in 3.5 billion-year-old rocks in Western Australia. This research reveals the oldest direct evidence of global shifts in the Earth’s outer shell, pushing the origins of plate motion back into the planet’s early history.
Hadean Earth. Image credit: Alec Brenner.
“A wide range of ages has been proposed for tectonic activity,” said Dr. Alec Brenner, a researcher from Yale University.
“Our findings confirm that tectonic plates were actively moving on Earth’s surface 3.5 billion years ago.”
This significant study focused on the Pilbara Craton in Western Australia, known for its ancient and well-preserved rock formations dating back to the Archean Eon, a time when Earth sustained early microbial life and endured significant asteroid impacts.
The Pilbara region hosts some of the earliest signs of life, including stromatolites, layered rocks built up by mats of single-celled organisms such as cyanobacteria.
The research team analyzed over 900 rock samples from more than 100 sites within the North Pole Dome region.
Using an electric drill with a hollow bit and diamond teeth, they extracted cylindrical core samples while cooling them with a hand-pumped horticultural sprayer.
An instrument equipped with a compass and goniometer was inserted into the drilled holes to accurately record the orientation of the samples.
The scientists then sliced the cores into thin sections and placed them into a magnetometer capable of detecting magnetic signals 100,000 times weaker than a typical compass needle.
These samples were measured repeatedly while being heated in steps up to 580 degrees Celsius, the temperature at which magnetite loses its magnetization.
“We took a significant risk; demagnetizing thousands of cores took years. But it paid off—our results exceeded our expectations!” exclaimed Dr. Brenner.
In ferromagnetic minerals, the magnetic moments of electrons align like tiny compass needles pointing toward the magnetic poles, recording clues about a rock’s geographic position relative to those poles at the time it formed.
By analyzing a succession of rocks spanning 30 million years, the authors found that the East Pilbara Craton drifted from 53 degrees to 77 degrees of latitude and rotated clockwise by more than 90 degrees, at rates of up to tens of centimeters per year.
Because the magnetic poles can reverse, it remains uncertain whether this movement took place in the northern or southern hemisphere.
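The link between a rock’s locked-in magnetization and its ancient latitude comes from the geocentric axial dipole model, a standard paleomagnetic assumption not spelled out in the article: the tangent of the magnetic inclination I equals twice the tangent of the paleolatitude. A minimal sketch:

```python
import math

def paleolatitude(inclination_deg):
    """Paleolatitude from magnetic inclination, assuming a
    geocentric axial dipole: tan(I) = 2 * tan(latitude)."""
    return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2.0))

# A steeply dipping magnetization implies a high-latitude origin;
# a horizontal one implies formation near the equator.
print(round(paleolatitude(45.0), 1))  # mid-latitude
print(round(paleolatitude(0.0), 1))   # equator
```

Note that an inclination of I and of −I give latitudes in opposite hemispheres, which is why the measurements alone cannot distinguish north from south.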
Movement slowed significantly within the following 10 million years, followed by a period of relative stability.
To compare these findings with Archean sites elsewhere, the researchers analyzed the Barberton Greenstone Belt in modern-day South Africa.
Previous paleomagnetic studies have indicated that the Barberton site is near the equator and remained nearly stationary during this period, suggesting differing drift patterns between these regions.
In contemporary times, the North American and Eurasian plates are moving apart at a rate of about 2.5 cm per year.
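As a rough back-of-envelope check on those numbers (my own arithmetic, not the authors’): a drift from 53 to 77 degrees of latitude over 30 million years, counting only the north–south component at roughly 111 km per degree, works out to about 9 cm per year, several times today’s North America–Eurasia spreading rate; the rotation reported alongside it would add further motion.

```python
KM_PER_DEGREE_LAT = 111.0  # approximate length of 1 degree of latitude

def mean_drift_rate_cm_per_yr(lat_start_deg, lat_end_deg, duration_myr):
    """Average latitudinal drift rate in cm/yr (latitude component only)."""
    distance_km = abs(lat_end_deg - lat_start_deg) * KM_PER_DEGREE_LAT
    return distance_km * 1e5 / (duration_myr * 1e6)  # km -> cm, Myr -> yr

pilbara = mean_drift_rate_cm_per_yr(53, 77, 30)
print(f"East Pilbara: ~{pilbara:.1f} cm/yr vs. ~2.5 cm/yr for North America-Eurasia today")
```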
Many questions remain about when and how Earth’s modern style of plate tectonics began, a regime geophysicists call a “mobile lid,” as opposed to earlier proposed regimes in which the lithosphere was a stagnant, sluggish, or ephemeral lid.
This research dismisses the concept of a stagnant lid but doesn’t conclusively determine which model of plate movement is most probable.
“We’re examining tectonic plate movements, which require defined boundaries between plates, contrary to the notion of a continuous, crackless lithosphere,” Brenner explained.
“Instead, the lithosphere was segmented into various parts capable of moving relative to one another.”
Additionally, Brenner and his collaborators identified the oldest known geomagnetic reversals, where a planet’s magnetic field alternates its polarity. After such a reversal, a compass needle points south instead of north.
This phenomenon is associated with dynamo action in the Earth’s core, where molten iron’s convection creates electrical currents and magnetic fields. The last known reversal occurred about 780,000 years ago.
“New evidence suggests that geomagnetic reversals were less frequent 3.5 billion years ago compared to today,” noted Roger Fu, a professor at Harvard University.
“While not definitive, it implies that the mechanisms behind these reversals may have operated differently back then.”
The findings were published in the journal Science on March 19.
_____
Alec R. Brenner et al. 2026. Relative plate motion and paleomagnetic detection of a core dynamo with a rare reversal at 3.5 Ga. Science 391 (6791): 1278-1282; doi: 10.1126/science.adw9250
A fascinating study conducted by Northwestern University reveals mathematical evidence supporting the long-held belief that clothing trends cycle every 20 years. This concept resonates with many, as we’ve all noticed styles like miniskirts and bell-bottom jeans making their comeback.
Lead author Dr. Emma Zajdela, an applied mathematician at Princeton University, told BBC Science Focus, “We’ve all experienced the idea that fashion is coming back…”
Dr. Zajdela elaborates, “As mathematicians, we aimed to validate or refute this theory. Thanks to recent advancements in computer tools and digitized records, we could achieve this.” This groundbreaking research involved a multidisciplinary team, including mathematicians, computer scientists, engineers, and art historians, who compiled an extensive database of approximately 37,000 images of women’s clothing spanning 150 years.
The dataset included historical sewing patterns from 1869 to 2015, alongside Vogue runway images from 1988 to 2023. These resources enabled researchers to track changes in women’s fashion characteristics over the decades.
Dr. Zajdela explained, “We utilized a unique computer tool created by our team to quantify aspects such as dress length, waistline, and neckline along the vertical body axis. This provided consistent measurements for comparison over time.”
Interestingly, the results demonstrated a cyclical change in style popularity approximately every 20 years. Fashion trends rise, fade, and eventually resurface.
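The kind of periodicity the team reports can be illustrated with a toy spectral analysis (illustrative only; the paper’s actual measurement tool and model are not described in this article). Given a yearly popularity index for a style, the dominant cycle length falls out of a Fourier transform:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(160)  # a synthetic 160-year record
# Hypothetical popularity index: a 20-year cycle plus measurement noise.
popularity = np.sin(2 * np.pi * years / 20) + 0.2 * rng.standard_normal(160)

spectrum = np.abs(np.fft.rfft(popularity))
freqs = np.fft.rfftfreq(len(years), d=1.0)  # cycles per year
peak = np.argmax(spectrum[1:]) + 1          # skip the zero-frequency term
print(f"dominant period: {1 / freqs[peak]:.0f} years")
```

On real data the peak would be broader and noisier, which is consistent with the blurring of the cycle the researchers describe after the 1980s.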
“Our mathematical model is based on the psychological principle that successful innovations should be distinctive yet familiar,” Zajdela noted.
Hemlines fluctuated from the 1920s to the 1980s, and continue to evolve – Credit: Emma Zajdela/Daniel Abrams
Take, for example, skirt lengths: they shortened from the early 1900s into the flapper styles of the 1920s, lengthened mid-century, shortened again with the 1960s miniskirts, and lengthened during the hippie era of the 1970s.
However, the researchers observed that since the 1980s, this distinct 20-year cycling of hemline lengths has blurred, with varying lengths emerging simultaneously.
“Since the mid-1980s, fashion trends have accelerated, causing the 20-year rule to become less pronounced. Nevertheless, it still exists,” Zajdela pointed out. “Today, we enjoy a greater diversity in styles.”
The researchers suggest that this trend may reflect broader societal changes, impacting not only fashion but also music, art, dog breeds, and baby names.
So why does this happen? Rather than enriching your life, “harassers” tend to heighten your stress levels. Chronic stress is a significant contributor to biological aging, driving inflammation, a weakened immune system, and a higher risk of cardiovascular disease, including heart attacks.
“Negative social connections were associated not only with self-reported stress and mental health but also with molecular measures of biological aging,” study author Dr. Lee Byung-gyu of New York University told BBC Science Focus.
This comprehensive study analyzed biological age and survey data from 2,345 participants aged between 18 and 103 years.
Researchers discovered that each additional troublesome person in one’s life measurably worsened health outcomes: the pace of aging increased by about 1.5 percent per person, roughly nine months of biological age. Having three harassers in one’s life, for example, could make a person biologically about 2.5 years older than someone of the same chronological age without such stressors.
Additionally, the toll is even greater when the difficult individual is a family member.
According to Dr. Lee, not all harassers appear the same. “A nuisance could be a parent, sibling, friend, or someone in your inner circle who regularly causes conflict and drains your time and mental energy,” he explains.
In day-to-day life, this could manifest as a family member who frequently seeks assistance or criticizes you, a friend who generates drama, or a romantic partner who instigates persistent stress in your relationship.
Being surrounded by “haters” can be mentally draining; it might even shorten your lifespan – Credit: Getty
Does this sound familiar? You’re not alone. Research indicates that nearly 30% of individuals report having at least one harasser in their close circle.
Interestingly, the study revealed that having a troublesome spouse doesn’t exert the same detrimental effects on health. The benefits of shared routines, resources, and emotional intimacy can counteract stress responses that are often present in other relationships, as explained by Lee.
However, some individuals may be more susceptible to having difficult people in their lives. The study found higher instances among women, daily smokers, those in poor health, and individuals with challenging childhoods.
Lee commented, “One possibility is that people who already face higher stress levels and have fewer resources may struggle to avoid or disengage from difficult relationships, allowing chronic tension to permeate their daily lives.”
Utilizing an extensive catalog of Sun-like stars created by ESA’s Gaia mission, astronomers have uncovered compelling evidence suggesting that our Sun migrated outward with thousands of similar stars approximately 4 to 6 billion years ago. This finding offers significant insights into the formation of the Milky Way’s central bar.
An artist’s impression illustrating the Sun’s movement and its solar twins from the center of the Milky Way galaxy, dating back 4 to 6 billion years. Image credit: National Astronomical Observatory of Japan.
“While terrestrial archaeology studies human history, galactic archaeology explores the vast journeys of stars and galaxies,” stated Daisuke Taniguchi, an astronomer at Tokyo Metropolitan University, along with his colleagues.
“It is established that our Sun formed approximately 4.6 billion years ago, originally over 10,000 light-years closer to the Milky Way’s center than its present location.”
“Research into stellar compositions supports this hypothesis, yet it has historically posed challenges for scientists.”
“Observations indicate a significant bar-like structure at the Milky Way’s center, creating a corotation barrier that restricts stars from escaping far from the center.”
The study aimed to compile a comprehensive catalog of solar twin stars with stellar parameters closely resembling those of the Sun.
“Solar twins are characterized by stellar properties such as effective temperature, surface gravity, and metallicity that closely align with those of the Sun,” the researchers explained.
“By conducting differential analysis between stellar twins—stars with similar stellar parameters—we can achieve exceptional precision in measuring both stellar parameters and chemical abundances.”
The astronomers utilized data gathered by ESA’s Gaia satellite, which contains an extraordinary array of observations from 2 billion stars and celestial objects.
They successfully cataloged 6,594 solar twins, approximately 30 times more than in any previous study.
This extensive catalog allowed them to construct the most accurate estimates of the ages of these stars, carefully accounting for biases related to the visibility of selected stars.
Upon examining the age distribution, they identified a peak of stars ranging from 4 to 6 billion years old, including our Sun, indicating the existence of similar-age stars situated at comparable distances from the galaxy’s core.
This discovery supports the notion that the Sun’s current location is part of a broader stellar migration pattern rather than a mere coincidence.
This revelation not only enhances our understanding of the solar system but also elucidates the evolution of the Milky Way galaxy itself.
“The corotational barrier produced by the central bar structure of the galaxy would inhibit such extensive migrations,” the researchers noted. “However, if stellar formation was still occurring at that time, the scenario might differ.”
“The ages of these solar twins indicate not only when the mass migration happened but also the timeframe over which the galactic bar formed.”
“Regions near the center of a galaxy are generally less conducive to life than those found farther away.”
“Our findings thus unveil critical aspects regarding how our solar system, and consequently our planet, came to occupy a life-supporting region within the galaxy.”
The results were published on March 12, 2026 in the journal Astronomy & Astrophysics.
_____
Daisuke Taniguchi and colleagues. 2026. Gaia DR3 GSP-Spec Solar Twins. I. Creation of a Comprehensive Age-Compatible Catalog of Solar Twins. A&A 707, A260; doi: 10.1051/0004-6361/202658913
Fossilized jaws of the ancient monkey species Stirtonia victoriae, from Colombia’s La Victoria Formation, show that early primates in South America adapted to eating leaves, which enabled them to grow larger and explore new ecological niches. This remarkable find may also provide clues about when this lineage developed the anatomical traits behind the powerful howls of today’s howler monkeys.
Mantled howler monkey (Alouatta palliata) in Panama. Image credit: Ariel Rodriguez-Vargas / CC BY 4.0.
The ancient primate Stirtonia victoriae thrived in what is now Colombia during the Miocene epoch, approximately 13 million years ago.
Dr. Siobhan Cook, a researcher from Johns Hopkins University, stated, “Prior to this discovery, there was no evidence in the fossil record indicating that South American primates consumed leaves.”
This research helps address crucial questions about ecological evolution in one of the Earth’s most biodiverse regions.
“What evolutionary changes occurred in the Amazon rainforest during the existence of these monkeys?”
In their recent study, Cook and colleagues investigated two fossilized mandibles of Stirtonia victoriae from Colombia’s La Victoria Formation in the Tatacoa Desert.
The findings indicate when this ancient monkey developed the ability to eat leaves, expanding its diet beyond fruit. This adaptation enabled it to grow larger and reduced competition for food with other primate species in its ancient ecosystem.
“Millions of years ago, ancient monkeys traversed trees in what is now the Tatacoa Desert, once inhabited by wetland grasses, forests, and riverbanks,” said Dr. Cook.
These monkeys coexisted with long-extinct fauna in the Amazon basin, including giant sloths and armored armadillos.
“Until now, fossil findings were scarce. For Stirtonia victoriae, we could only glean knowledge from a few facial and cranial bone fragments,” Cook remarked.
“The latest discoveries not only shed light on their biodiversity and dietary habits but may also provide insight into when howler monkeys developed their distinctive ‘howl’, the loudest vocalization among land mammals.”
The structure of the jaws indicated a broad and deep mandibular body, which could have accommodated an enlarged hyoid bone like that of modern howler monkeys, potentially enabling their iconic calls.
“However, we are still uncertain about their exact behavior,” Dr. Cook added.
Paleontologists employed scans of the jaw fossils to create a 3D model for detailed analysis.
From the structure of the mandibular molars, researchers determined the dietary patterns, size, and distinguishing features of Stirtonia victoriae, comparing it against 3D models of other South American primate fossils, including Stirtonia tatacoensis, a known ancestor of howler monkeys.
They also closely examined the jaws of modern howler monkeys and their relatives, such as the spider monkeys and woolly monkeys of present-day rainforests.
“Like modern howler monkeys, Stirtonia victoriae possessed relatively large molars with shearing crests that act like ‘scissors’ for efficiently slicing tough foliage, an adaptation common in leaf-eating primates,” said Dr. Cook.
The researchers also reconstructed the body weight of Stirtonia victoriae, revealing that these monkeys weighed between 17 and 22 pounds (8 to 10 kg).
Dr. Cook highlighted, “Previous South American monkeys in the fossil record were significantly smaller. This suggests that, for the first time, these monkeys had access to abundant food sources, primarily leaves, enabling them to evolve larger bodies and occupy a new ecological niche.”
This discovery marks the emergence of a large and diverse group of primates in South America.
“We can now accurately trace the origins of various modern lineages.”
These findings will be published in the journal Paleoanthropology.
_____
Siobhan B. Cook et al. 2026. Mandibular specimen of Stirtonia victoriae from La Victoria Formation, La Venta, Colombia. Paleoanthropology 1: 148-170; doi: 10.48738/2026.iss1.3992
Two remarkable species of marsupials, long considered extinct and previously known only from fossil records, have been rediscovered alive in New Guinea. This groundbreaking finding is the result of a collaborative effort involving scientists, indigenous communities, and citizen scientists.
The confirmation of the pygmy longfinger possum and the ring-tailed glider as living specimens marks a significant moment—it’s the first time these creatures have been seen in over 7,000 years. The announcement was made by Bishop Museum, based in Honolulu.
“As both a scientist and conservationist, it’s incredibly fulfilling to confirm their existence. This opens a new chapter in our journey to learn about and protect these fascinating animals,” stated Dr. Christopher Helgen from Bishop Museum.
For the past two years, Helgen and Dr. Tim Flannery of the Australian Museum have been dedicated to verifying the existence of these elusive mammals.
These two animals are categorized as “Lazarus species,” a term for species that re-emerge after being presumed extinct. “The discovery of two Lazarus species thought to be extinct for millennia is truly unprecedented,” Flannery noted in a press release.
Helgen believes this rediscovery underscores the idea that “extinction is avoidable.”
“This discovery offers a message of hope and a testament to second chances,” he added.
These species were initially discovered through fossils by Dr. Ken Aplin, who unearthed a critical tooth during an archaeological dig in western New Guinea in the 1990s.
Helgen’s observation of a photo featuring a gliding ring-tailed possum led to the identification of it as one of Aplin’s previously “extinct” species. Indigenous communities from West Papua’s Tambulo and Maybrat regions provided invaluable assistance by sharing their extensive knowledge about the marsupial’s unique lifestyle, according to a press release.
Recently, scientists confirmed the existence of the pygmy longfinger possum after discovering two preserved specimens at the University of Papua New Guinea.
The survival of the pygmy longfinger possum has been further validated by citizen scientists. Carlos Bokos, a citizen scientist and now co-author of the study, shared a photo of the species on iNaturalist, a global platform for documenting natural science discoveries.
Exciting news from New Guinea! Two marsupial species, believed extinct for over 6,000 years, have been rediscovered.
The Ring-tailed Gliders and Pygmy Longfinger Possums, previously known only from fossils in Australia, were recently observed on the Vogelkop Peninsula in Papua, Indonesia, thanks to the support of local indigenous communities.
Renowned researcher Tim Flannery and his team at the Australian Museum in Sydney undertook years of investigative work, including analyzing peculiar sightings and misidentified specimens, to confirm that these remarkable animals had returned to life.
With photographic evidence and active collaboration with local communities, researchers have verified these animals’ existence. However, their habitat is under threat from logging activities. The specific ecological requirements and range of these rediscovered species are still largely unknown, complicating conservation efforts.
Scott Hocknull, a professor at Central Queensland University, remarked that this discovery is “more significant than finding a live quoll in Tasmania.”
One notable species, the Wow Glider (Thus ayamalensis), is closely related to Australian gliders in the genus Petauroides. However, distinct features like its prehensile tail and furless ears have warranted its classification into a separate genus.
Local indigenous communities often regard gliders as sacred and protected animals, potentially contributing to their previous obscurity in scientific literature.
“This is one of the most photogenic animals and beautiful marsupials I’ve ever encountered,” Flannery stated.
The Pygmy Longfinger Possum (Dactylonax kambuyai) is a striking striped creature characterized by an unusually long finger on each hand, which aids its survival.
As Flannery explains, “They possess unique ear adaptations that may help them detect the low-frequency sounds of larvae within wood, allowing them to extract food from decaying trees.”
The exact location of this species remains confidential to protect it from potential wildlife traders.
Flannery cautions against capturing these animals. “They are challenging to maintain in captivity due to their specialized diet—potential pet owners should be forewarned: they don’t last long in confined environments.”
Fossils dating back approximately 3 to 4 million years have been uncovered at sites in Victoria and New South Wales, Australia, but significant gaps in the fossil record leave much about the genus a mystery.
Hocknull notes, “The smallest fossil species are indistinguishable from their modern counterparts. Dactylonax kambuyai has now been confirmed alive in West Papua.”
“Pocket-sized, peculiar, and adorable,” says Hocknull, emphasizing the ecological significance of this unique species.
Researcher David Lindenmayer from the Australian National University in Canberra commented on the significance of these discoveries while expressing concern over deforestation and habitat destruction in New Guinea. “It provokes questions about what has been lost in Australia due to similar land clearing practices.”
The lifespan benefits derived from fasting and rapamycin usage resemble a lottery rather than a guaranteed outcome. While significant lifespan increases have been observed within a year, reanalysis indicates that results can vary significantly among individuals.
Talia Fulton, a researcher at the University of Sydney, says: “[They] may enhance your lifespan marginally, or [they] could dramatically increase it.”
The 2025 study examined 167 research papers across eight non-human species, including fish, mice, rats, and rhesus macaques. Fulton and her team discovered that when these animals were treated with rapamycin, a promising anti-aging compound, alongside calorie restriction — known for fostering longevity — they exhibited a longer lifespan on average. This suggests the same potential could extend to humans.
Current research has investigated the varied responses to longevity interventions in individual animals, revealing significant variability in benefits. Fulton notes that while taking rapamycin or implementing dietary restrictions appears “likely to be advantageous, the degree remains uncertain.”
According to her, “Some may experience considerable lifespan extension, while others may see minimal impact, or not outlive their expected lifespan.” This variability creates a somewhat unpredictable environment, meaning these treatments cannot guarantee lifespan extension for all individuals.
Fulton emphasizes that the goal of longevity interventions is to “square” the survival curve: to have more of the population remain alive and healthy until close to the maximum lifespan, rather than having deaths spread across a wide range of ages, as at present. “Squaring the survival curve means a larger number will lead extended and fulfilling lives until around 100, at which point mortality becomes almost certain,” she elaborates.
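The difference between a spread-out survival curve and a “squared” one can be made concrete with a toy comparison (all numbers invented for illustration): under a constant-hazard curve, deaths occur at all ages; under a squared curve, most individuals survive to near a maximum lifespan and then mortality rises sharply.

```python
import math

def exponential_survival(age, mean_lifespan=80):
    """Constant hazard: deaths are spread across all ages."""
    return math.exp(-age / mean_lifespan)

def squared_survival(age, cliff=100, steepness=0.25):
    """A logistic 'squared' curve: nearly everyone survives until ~cliff."""
    return 1 / (1 + math.exp(steepness * (age - cliff)))

# Fraction of the population still alive at ages 60, 80, and 100
# under each hypothetical regime.
for age in (60, 80, 100):
    print(age, round(exponential_survival(age), 2), round(squared_survival(age), 2))
```

Per Fulton, the observed effect of these interventions shifts some individuals’ survival a lot and others’ barely at all, rather than squaring the whole population’s curve.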
Current findings indicate that dietary restrictions and rapamycin do not effectively square this longevity curve. In this context, Fulton advises holding off on high expectations until further research clarifies who stands to benefit most from these approaches. “We aspire to decode individual genetic variables and life histories, ultimately determining ‘This is precisely what you need to achieve maximum longevity,'” she states.
Researchers like Matt Kaeberlein from the University of Washington stress that squaring the curve does not inherently mean enhanced health profiles. A more compelling consideration, he argues, is whether longevity initiatives, such as exercise, influence “healthspan inequality.”
Originally developed as an immunosuppressant for organ transplant patients, rapamycin inhibits the mTOR protein, essential for cell growth and division. At lower doses, it has demonstrated the potential to extend lifespan in species like flies and mice, potentially by safeguarding against DNA damage.
A new study reassesses the age of the Ubaydiya layer in the Jordan Valley, dating it to approximately 2 million years ago, comparable to Georgia’s Dmanisi site. The research may mark a critical moment in human evolution, indicating that early humans with advanced tool-making skills expanded into new environments much earlier than previously believed.
Artist’s reconstruction of Homo erectus. Image credit: Yale University.
The Ubaydiya ruins are situated in Israel’s Jordan Valley, nestled between Menahemia village and Beit Zerah kibbutz.
Discovered in 1959, this site has yielded distinctive Acheulean handaxes but only a few human remains.
“The Ubaydiya Formation has been a focus of research for years, offering early evidence of the Acheulean culture, recognized by its large, double-sided stone tools, often found alongside a diverse array of fauna, including species from Africa and Asia,” remarked Ali Matmon, a professor at the Hebrew University of Jerusalem.
“Yet, determining the precise age of this site has posed a considerable challenge over the decades.”
“Historically, researchers estimated Ubaydiya’s age to be between 1.2 and 1.6 million years, based on relative chronology.”
To ascertain the site’s true age, researchers employed three independent dating techniques: magnetic stratigraphy, uranium-lead (U-Pb) dating of mollusc shells, and cosmogenic isotope burial dating.
“Cosmogenic isotope burial dating measures rare isotopes generated when cosmic rays strike rocks on Earth’s surface,” explained the research team.
“Once buried, these isotopes decay at a known rate, effectively beginning a geological clock that indicates how long they have been underground.”
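The “geological clock” works because two cosmogenic isotopes decay at different known rates, so their ratio falls predictably after burial. A common pairing is 26Al/10Be; the article does not name the isotopes used at Ubaydiya, so the sketch below is a generic illustration with commonly cited half-lives and production ratio:

```python
import math

HALF_LIFE_AL26 = 0.705  # Myr, commonly used value for 26Al
HALF_LIFE_BE10 = 1.387  # Myr, commonly used value for 10Be
SURFACE_RATIO = 6.75    # typical 26Al/10Be production ratio at the surface

def burial_age_myr(measured_ratio):
    """Burial age from a decayed 26Al/10Be ratio.
    R(t) = R0 * exp(-(lam26 - lam10) * t)  =>  t = ln(R0/R) / (lam26 - lam10)
    """
    lam26 = math.log(2) / HALF_LIFE_AL26
    lam10 = math.log(2) / HALF_LIFE_BE10
    return math.log(SURFACE_RATIO / measured_ratio) / (lam26 - lam10)

# A measured ratio of ~2.6 corresponds to roughly 2 Myr underground,
# the order of age reported for the Ubaydiya layer.
print(round(burial_age_myr(2.6), 2))
```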
“We also analyzed remnants of Earth’s ancient magnetic field preserved in lake sediments at the site,” they added.
“As sediment settles, it locks in the orientation of the planet’s magnetic field at that time.”
“By correlating these magnetic signatures with known historical reversals in Earth’s magnetic field, we established that this formation emerged during the Matuyama reversed-polarity period, over 2 million years ago.”
“We also examined fossils of Melanopsis freshwater snails, using U-Pb dating of their shells within the sediment to establish the minimum age of the layer where the stone tools were discovered.”
“Overall, our findings indicate an age much earlier than previously anticipated.”
Double-sided stone tool excavated from the Ubaydiya site in Israel. Image credit: Omri Barzilai.
The team’s results indicate that the Ubaydiya site is at least 1.9 million years old, significantly older than prior estimates.
“This new chronology suggests that Ubaydiya is roughly contemporaneous with the renowned Dmanisi site in Georgia, implying that our ancestors migrated to different regions simultaneously,” the scientists noted.
“Additionally, this suggests that both simpler Oldowan and more advanced Acheulean stone tool-making techniques began their migration from Africa as various hominin groups explored new terrains.”
This groundbreaking study is published in Quaternary Science Reviews.
_____
A. Matmon et al. 2026. The Complex History of Radiation Exposure Burials in the Dead Sea Rift Valley and the Recycling of Pleistocene Sediments Affecting the Age of the Acheulian Site Ubaydiya. Quaternary Science Reviews 378: 109871; doi: 10.1016/j.quascirev.2026.109871
A significant, long-term study indicates that engaging in brain-training video games may provide protection against dementia for decades. Experts deem this the most compelling evidence to date that cognitive training can yield enduring alterations in brain function.
“This is quite unexpected,” remarked Marilyn Albert, director of the Alzheimer’s Disease Research Center at Johns Hopkins University. “It’s not at all what I anticipated.”
This groundbreaking study, published Monday in the journal Alzheimer’s & Dementia: Translational Research & Clinical Interventions, follows the Advanced Cognitive Training for Independent and Vital Older Adults (ACTIVE) trial.
The researchers discovered that participants who engaged in up to 23 hours of a specialized cognitive training known as speed training over a three-year span exhibited a striking 25% decrease in the risk of developing Alzheimer’s disease and other forms of dementia during a follow-up period of 20 years.
The ACTIVE study was a comprehensive randomized controlled trial funded by the National Institutes of Health (NIH), involving around 3,000 participants aged 65 and older, hailing from six geographic regions and showing no prior major cognitive impairment. About 25% of participants were minorities, and the majority were women.
Women are especially vulnerable to Alzheimer’s disease, developing dementia at nearly double the rate of men.
Initially, study participants were assigned to train twice a week for 60 to 75 minutes per session, for a maximum of 10 sessions over five weeks. Approximately half of each training group received booster training over the following three years, bringing total training to up to 23 hours.
Researchers monitored medical records through Medicare to track dementia diagnoses in participants throughout the 20-year follow-up. Various forms of dementia, including Alzheimer’s disease, vascular dementia, and frontotemporal dementia, were aggregated into one category.
Participants who underwent speed training along with booster sessions exhibited a 25% lower risk of being diagnosed with dementia compared to the control group, while those who did not receive additional training showed no benefits.
“The findings suggest that a relatively small input of effort can yield substantial benefits over the long term,” stated Dr. Richard Isaacson, a preventive neurologist at the Neurodegenerative Disease Institute in Boca Raton, Florida, who was not involved in this study.
Dr. Thomas Wisniewski, chair of the Department of Cognitive Neurology at New York University Langone Health, praised the study results as “remarkable,” asserting this is the strongest evidence to support cognitive training’s efficacy.
“This is the first conclusive documentation in a randomized controlled trial indicating that some forms of cognitive training can diminish dementia risk,” added Wisniewski, who was also not involved in the study.
Participants were assigned to one of three cognitive training programs: speed training, memory training, and reasoning training, with a control group that received no training.
Dr. Sanjla Singh, a physician-scientist and lecturer in neurology at Harvard Medical School, explained that speed training focuses on enhancing the brain’s ability to process visual information quickly and effectively. This involves quickly identifying items on a screen and making corresponding decisions.
Albert compares this thought process to the situational awareness required when driving. “When we’re driving and must pay attention to multiple things happening around us, we need to discern what’s relevant and what’s not,” she elaborated.
In memory training, participants learned to memorize a series of words and strategies for retaining story details, such as creating mental images and associations.
Reasoning training involved exercises aimed at enhancing problem-solving skills based on identifiable patterns, such as recognizing sequences in letters or numbers.
However, no significant protective effect against dementia was observed in those who participated in memory and reasoning training alone.
Researchers remain uncertain about why speed training proved beneficial while the other forms did not; one theory relates to the distinction between implicit and explicit learning.
Implicit learning refers to acquiring unconscious habits and skills, like riding a bike. In contrast, explicit learning entails consciously memorizing facts, such as vocabulary from flashcards.
Albert noted that implicit and explicit learning processes engage different regions of the brain.
“Once the brain adapts to these skills, the changes can persist even without ongoing practice,” Singh remarked. “For example, a child can learn to ride a bike in around 10 hours, and that skill lasts a lifetime.”
Screenshot from the Double Decision game. Credit: BrainHQ
Speed training is similarly thought to foster long-term alterations in the brain, an effect made possible by neuroplasticity, the brain’s capacity to adapt and reconfigure itself in response to lifelong learning.
Dr. Kellyanne Niotis, a preventive neurologist and clinical assistant professor of neurology at Weill Cornell Medical College, stated that speed training can significantly impact cognitive reserve—the brain’s ability to withstand dementia’s effects, which builds over time through various factors, including education, mentally engaging activities, and social engagement.
“We believe this visual processing speed training engages broader neural networks, thereby enhancing the brain’s resilience and cognitive reserve,” she explained.
Another hypothesis for the efficacy of speed training is its adaptive nature, meaning the difficulty escalates according to an individual’s performance. Those who initially excelled quickly progressed to more challenging tasks, a feature not seen in other forms of training.
Should I start speed training?
The speed training used in this study was devised by psychologists Carlene Ball and Daniel Loncar, with support from an NIH grant. This program has since been refined and is now available as a tool named “Double Decision” via BrainHQ, an online subscription platform.
BrainHQ’s Double Decision game, available in various difficulty levels. Credit: BrainHQ
Based on the study results, Albert recommends this training for individuals aged 65 and older, akin to the study’s demographic.
However, the earliest brain changes of Alzheimer’s disease can emerge decades before symptoms appear, suggesting that people in their 40s or 50s might also benefit, though she cautioned against drawing conclusions about advantages for younger individuals prematurely.
While these trial results are promising, experts emphasize that Alzheimer’s disease and other types of dementia are multifaceted, and no singular solution exists.
“Every individual possesses a brain that can be at risk for Alzheimer’s disease, and it’s crucial to prioritize brain health,” Isaacson urged.
Fortunately, several factors are associated with a decreased risk of developing dementia. The 2024 Lancet Commission report suggests that nearly half of all dementia cases could be delayed or prevented by addressing specific risk factors.
Niotis advises individuals to take the following steps:
Ensure regular hearing assessments.
Manage metabolic risk factors such as cholesterol, blood sugar, and blood pressure.
Correct vision issues, as vision loss is a known risk factor for dementia.
Exercise regularly, as physical activity enhances blood circulation and nourishes the brain. Isaacson also suggests combining cognitively stimulating activities with exercise, such as walking during meetings or doing cognitive training while riding a stationary bike.
Emerging research also indicates that the shingles vaccine might protect the brain against cognitive decline.
A comprehensive study from 2025 published in Nature revealed that individuals vaccinated against shingles were 20% less likely to develop dementia over a seven-year follow-up period than those who were unvaccinated.
Ancient Inuit Circular Tents Found on Isbjørne Island
Credit: Matthew Walls, Marie Christ, Pauline Knudsen
Some 4,500 years ago, early people embarked on a historic journey to a remote island off Greenland’s northwest coast. The daring expedition entailed crossing over 50 kilometers of open sea, making it one of the longest known maritime voyages by Arctic indigenous peoples.
Archaeologists believe these intrepid sailors were the first people to reach these isolated islands. John Derwent of the University of California, Davis, who was not involved in the study, commented on the findings.
In 2019, Matthew Walls and a team from the University of Calgary, Canada, explored the Kittisut Islands, also known as the Carey Islands, located northwest of Greenland. These islands lie within the Pikiarasorsuaq polynya—an open ocean region surrounded by sea ice, which has been present for approximately 4,500 years.
The research focused on three main islands: Isbjørne, Mellem, and Nordvest, revealing five sites with a total of 297 archaeological features. The most significant findings came from beach terraces on Isbjørne, where the team uncovered the remains of 15 circular tents, each with a central hearth and an interior divided by stones. These distinctive “bilobed” structures are emblematic of the Paleo-Inuit, the first settlers of northern Canada and Greenland.
Radiocarbon dating of a long-billed murre’s wing bones found within one of the tent rings indicated they are between 4,400 and 3,938 years old. This confirms that humans occupied the Kittisut Islands shortly after the formation of the polynya.
“We have nesting colonies of long-billed murres,” Walls noted. The early settlers likely harvested their eggs and hunted them for food, and they likely pursued seals as well.
The Paleo-Inuit had already reached Greenland by this time and most likely traveled west from there to Kittisut, a crossing of at least 52.7 kilometers. Prevailing winds and currents, however, suggest they set out from a point further north, making for a longer but safer journey. Ellesmere Island lies to the west of Kittisut, but it is farther away and presents challenging navigational conditions.
The only comparable journey known in Arctic prehistory was the 82-kilometer crossing of the Bering Strait from Siberia to Alaska, likely first accomplished over 20,000 years ago, with the Diomede Islands serving as a midway stopping point.
“Crossing that expanse required advanced watercraft,” Derwent emphasizes. Moving a whole community to Kittisut would also have required something larger than single-person kayaks. “You can’t transport children and the elderly safely in a kayak,” he explained. The Paleo-Inuit likely used larger boats capable of carrying nine or ten people.
Despite extensive studies, no boat wrecks have yet been uncovered on Kittisut Island, and few such finds exist in the Arctic region. “Their vessels would have been skin-on-frame designs similar to those utilized by later Inuit communities,” noted Walls.
The initial Paleo-Inuit settlers likely played a vital role in shaping the Kittisut ecosystem. By carrying marine nutrients onto land, they fertilized the barren soil and fostered plant growth on the islands: “There’s a surprisingly diverse plant life there, reliant on human involvement in nutrient cycling between marine and terrestrial systems.”
Close-up of glass with Microsoft Flight Simulator Map Data
Microsoft Research
Innovative automated systems for storing vast amounts of data on glass could revolutionize the future of data centers.
In our data-driven world, everything relies on information—from the internet and industrial sensors to scientific data from particle colliders, all of which require secure and efficient storage solutions.
While earlier laboratory demonstrations of glass data storage were impractical for industrial applications, Richard Black and his colleagues at Microsoft’s Project Silica have now demonstrated a workable glass-based technology. This innovation could pave the way for long-lasting glass data libraries in the near future.
“Glass can endure extreme temperatures, humidity, particulates, and electromagnetic fields,” explains Black. “Moreover, glass boasts a long lifespan and doesn’t need frequent replacement, making it a more sustainable medium. It requires significantly less energy to produce and is easy to recycle once it has served its purpose.”
The research team’s process starts with a femtosecond laser, which emits light pulses lasting mere quadrillionths of a second. These pulses etch tiny structures into a thin layer inside the glass to encode data. To minimize read and write errors, the researchers also encode additional redundant bits alongside the data.
The data is read back with a microscope-and-camera system, and the captured images are decoded into bits by a neural network. The entire write-and-read process is reproducible and automated, lending itself to a fully robotic data facility.
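The role of those extra redundant bits can be illustrated with a classic error-correcting code. The sketch below is a generic Hamming(7,4) example in Python, not Microsoft’s actual coding scheme (which the article does not detail): three parity bits are added to every four data bits, letting the decoder locate and flip any single bit that was misread.

```python
def hamming74_encode(d):
    """d: list of 4 data bits -> 7 code bits laid out as [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """c: list of 7 code bits -> the 4 data bits, correcting at most one flipped bit."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit; 0 means clean
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[3] ^= 1                          # simulate a single read error
assert hamming74_decode(code) == word
```

Real storage systems use far stronger codes, but the principle is the same: spend a few extra bits per block so that imperfect physical reads still yield perfect data.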
Remarkably, the researchers stored 4.8 terabytes of data on a square glass plate 120 millimeters on a side and 2 millimeters thick. That is roughly a third of an iPhone’s volume, yet it holds about as much data as 37 iPhones.
Project Silica Glass Writing Instruments
Microsoft Research
Accelerated aging experiments, in which the glass was heated in a furnace, suggest the data should remain stable and readable for over 10,000 years even at 290°C, and far longer at room temperature. The researchers also tested borosilicate glass, which is cheaper but could only reliably store less complex data.
Peter Kazansky, an optoelectronics researcher at the University of Southampton, highlighted Project Silica’s main breakthrough: delivering an end-to-end system scalable to data-center size. Although the principles of glass-based data storage have existed for over a decade, this study confirms its feasibility as a practical technology.
Microsoft isn’t alone in exploring this groundbreaking technology. Kazansky also co-founded S Photonics, focused on preserving the human genome in glass. The Austrian startup Serabite proposes similar storage techniques using ultrathin layers of ceramic and glass.
Nonetheless, challenges persist, such as the cost of integrating glass libraries into existing data centers and whether the Project Silica team can raise the capacity of the glass, potentially toward the 360 terabytes per disc reported in Kazansky’s own research.
For now, Black identifies the primary potential applications for Project Silica’s technology in national libraries, scientific repositories, cultural records, and anywhere data needs to survive for centuries. Collaborations with companies like Warner Bros. and Global Music Vault are underway to safeguard data currently stored in the cloud for the long term.
Kazansky adds that the technology has even inspired cinematic portrayals. In Mission: Impossible – The Final Reckoning, glass storage provides the capacity and security needed to trap an advanced artificial intelligence. “It’s a rare moment when Hollywood science fiction aligns with peer-reviewed reality,” he remarks.
Edward Jenner Administering the First Smallpox Vaccination in 1796
Ernest Board/Wellcome Collection/De Agostini via Getty Images
Recent insights into one of history’s most effective vaccination campaigns highlight critical lessons for expediting vaccine adoption today. This successful effort eradicated smallpox in Copenhagen during the early 1800s.
Smallpox, a devastating infectious disease, killed roughly 30% of those it infected and left many survivors disfigured or blind, causing an estimated 500 million deaths before its global eradication in 1980 through vaccination.
Copenhagen saw one of the earliest local triumphs over smallpox, eradicating the disease in 1808 after it had claimed more than 12,000 lives in the city over the preceding fifty years.
The world’s first vaccine, developed by British physician Edward Jenner in 1796, quickly gained traction among Denmark’s medical and social elite, sparking “excited attention and anticipation,” as documented by the leading physician Henrich Callisen.
Doctors in Copenhagen swiftly sought smallpox vaccine material from Jenner in England. The first recipient was a Danish judge’s child, followed by a bishop’s child. The vaccine proved remarkably effective, preventing infection even among close contacts of infected individuals, including breastfeeding mothers, according to Callisen’s observations.
In response, the King of Denmark founded a Vaccine Commission in 1801, tasked with broadening the vaccine’s reach and meticulously tracking vaccination rates and smallpox outbreaks.
Researchers from Roskilde University analyzed these records, revealing that by 1810, 90% of Copenhagen’s children had been vaccinated, leading Denmark to rank as the highest in Europe for vaccination rates per capita.
Due to the rapid dissemination of the smallpox vaccine, the disease was eliminated from Copenhagen just seven years after the campaign’s initiation. “We will be free from one of the most destructive diseases known to us,” Callisen wrote in 1809.
Eilersen and his team identified key factors behind the high vaccination rates. Vaccines were offered free of charge to families in need, and many church leaders and school teachers actively promoted and administered the vaccines. The Vaccine Commission commended clergy who traversed the nation to disseminate knowledge about vaccinations, with one priest vaccinating nearly 2,000 children in just one year.
As smallpox cases dwindled, concerns arose about public apathy towards vaccination. To sustain high rates, the committee mandated that vaccination be a prerequisite for a child’s enrollment in church activities as of 1810.
While some resisted vaccinating their children, out of what he called “ignorance and prejudice,” the broader public largely supported vaccination, Callisen noted. He acknowledged initial fears about vaccines but ultimately recognized their tremendous impact on public health and population growth.
Eilersen believes that the collaboration among Danish leaders fostered public trust and encouraged widespread vaccine acceptance. “Unified authorities, including government, medical institutions, and religious leaders, contributed to convincing a diverse population to embrace vaccination,” he stated.
Denmark continues to enjoy robust confidence in its governmental and health institutions, currently ranked first in public trust by Transparency International. In turn, this commitment has contributed to high childhood vaccination rates, with approximately 96% of Danish children vaccinated against diphtheria, tetanus, and pertussis, contrasting with only 80% in the United States, which ranks 28th in public trust levels.
Archaeologists have made a groundbreaking discovery, unearthing the “oldest known hand-held wooden tools” at Marathusa 1, a Middle Pleistocene site in Greece.
Artist’s impression of a woman at Marathusa 1 crafting a digging stick from an alder trunk using small stone tools. Image credit: G. Prieto / K. Harvati.
According to Professor Katerina Harvati from the University of Tübingen, “The Middle Pleistocene was crucial for human evolution, marking a period when complex behaviors emerged.”
“This era also showcases the earliest reliable evidence of the targeted use of plants for technological purposes.”
The 430,000-year-old wooden tools, discovered at the Marathusa 1 site by a team led by Professor Harvati, consist of worked alder trunks and small willow/poplar artifacts.
The primary tool is made from alder wood (Alnus sp.) and bears carving marks along with associated chop marks, indicating intentional shaping.
This approximately 81-cm-long artifact shows use-wear consistent with a multifunctional rod, likely employed for digging along the ancient lakeshore.
The second tool, a small piece of willow/poplar (Salix sp./Populus sp.), measures 5.7 cm and exhibits signs of rounding.
This object shows two possible signs of working, including the removal of growth rings from one end.
The function of this small wooden tool remains uncertain, but researchers suggest it may have been used for retouching stone tools.
Alongside these wooden tools, the scientists uncovered the butchered remains of a straight-tusked elephant (Palaeoloxodon antiquus), as well as stone artifacts and worked bones.
Dr. Annemieke Milks, a researcher at the University of Reading, states, “Unlike stone artifacts, wooden objects need special conditions to survive over long durations.”
“We meticulously examined all the wood remains, analyzing their surfaces under a microscope.”
“Our findings revealed clear evidence of cutting and carving on these two objects, strongly indicating that early humans intentionally shaped them.”
A multifunctional digging stick (top) and small wooden tools (bottom) from the Marathusa 1 site in Greece. Image credit: D. Michailidis / N. Thompson / K. Harvati.
Additionally, researchers found a large fragment of an alder trunk exhibiting deep carved stripes, interpreted as fossilized claw marks from a large carnivore. This suggests potential competition between early humans and carnivores at this site.
Cut marks and damage on the elephant remains indicate that early hominins had access to the carcass, while gnawing marks reveal subsequent carnivore activity.
Dr. Milks added, “Previous discoveries of ancient wooden tools have occurred in countries such as Britain, Zambia, Germany, and China, comprising weapons, digging sticks, and tool handles.” However, she noted that these finds are younger than the Marathusa 1 artifacts.
“The only older evidence of humans using wood, dating to around 476,000 years ago, comes from the Kalambo Falls site in Zambia, where the wood served as structural material rather than as tools.”
“We have now identified the oldest known wooden tools and the first of their kind from southeastern Europe,” emphasized Professor Harvati.
“This discovery highlights the exceptional conservation conditions at the Marathusa 1 site.”
“The concurrent evidence of human activity and large carnivores in the vicinity of the butchered elephant indicates a competitive dynamic between them.”
Details of these findings are published in Proceedings of the National Academy of Sciences.
_____
A. Chemilux et al. 2026. The earliest evidence of human use of wooden hand tools, discovered at Marathusa 1 (Greece). PNAS 123 (6): e2515479123; doi: 10.1073/pnas.25154791
Recent findings reveal that these hand stencils are more than 15,000 years older than a cave painting in another Sulawesi cave that was dated in 2024. That painting, which depicts three human-like figures interacting with pigs, is approximately 51,200 years old.
“I thought my previous work was impressive, but this photo completely eclipsed it,” Blum remarked.
“This underscores the long-standing tradition of rock art creation in this region. It spans an incredible timeline,” he emphasized.
Researchers are optimistic about uncovering even older art forms, including narrative art, in Indonesia, a largely unexplored archaeological treasure trove.
Liang Methanduno, a prominent cave art location, attracts tourists. However, most artworks discovered so far, depicting domestic animals like chickens, are relatively recent, estimated to be around 4,000 years old.
In 2015, the Indonesian rock art expert and lead author Adi Octaviana spotted a faint drawing behind a modern painting and speculated that it might be an ancient hand stencil.
“These had never been documented before; their existence was unknown until Adi discovered them,” Blum stated.
Previous generations of researchers exploring Ice Age cave art, dating back 30,000 to 40,000 years in regions like France and Spain, believed it marked the dawn of modern artistic culture.
However, recent discoveries in Indonesia indicate that humans outside Europe were crafting “extraordinarily sophisticated” cave art tens of thousands of years ago, before our species had even reached Europe.
Ancient cave paintings in Sulawesi. Maxime Aubert/AFP – Getty Images
Blum noted that this discovery could also shed light on the timeline of when the first humans settled in Australia.
It is widely accepted that Aboriginal populations have inhabited Australia for at least 50,000 years, though evidence suggests one of the country’s archaeological sites is around 65,000 years old.
“The finding of 67,000 to 68,000-year-old rock art on Sulawesi, nearly adjacent to Australia, supports the theory that modern humans may have arrived in Australia at least 65,000 years ago,” Blum explained.
A revealing new study challenges traditional beliefs by showing that mid-ocean ridges and continental rifts, rather than volcanic arcs, have driven atmospheric carbon fluctuations and long-term climate change through Earth’s geological history.
Cryogenic Earth. Image credit: NASA.
Over the past 540 million years, Earth’s climate has swung dramatically between frigid icehouse conditions and warm greenhouse phases.
Icehouse conditions prevailed during several key intervals, including the Late Ordovician, the Late Paleozoic, and the Cenozoic.
Notably, warmer periods were associated with increased atmospheric carbon dioxide, while declines in greenhouse gases led to global cooling and extensive glaciation.
Research conducted by Ben Mather and a team at the University of Sydney reconstructed carbon movements between volcanoes, oceans, and the deep Earth over the past 540 million years.
“Our findings challenge the long-accepted view that volcanic chains formed by tectonic plate collisions are the primary natural source of Earth’s atmospheric carbon,” Dr. Mather stated.
“Instead, it appears that carbon emissions from deep-sea crevices and mid-ocean ridges, driven by tectonic movements, have been crucial in shaping the transitions between icehouse and greenhouse climates throughout most of Earth’s history.”
“For example, we discovered that carbon released from volcanoes in the Pacific Ring of Fire only emerged as a significant carbon source in the last 100 million years, prompting us to reevaluate current scientific understanding.”
This study presents the first robust long-term evidence indicating that Earth’s climate change is primarily driven by carbon released at divergent plate boundaries rather than convergent ones.
“This insight not only reshapes our understanding of past climates but will also enhance future climate models,” Dr. Mather noted.
By integrating global plate tectonics reconstructions with carbon cycle models, the research team traced the storage, release, and recycling of carbon as continents shift.
Professor Dietmar Müller from the University of Sydney remarked, “Our findings illustrate how variations in carbon release from plate spreading influenced long-term climate shifts, clarifying historical climate changes, such as the late Paleozoic ice ages, the warm Mesozoic greenhouse world, and the rise of present-day Cenozoic icehouses.”
This research holds vital implications for understanding the ongoing climate crisis.
“This study contributes to the growing body of evidence that atmospheric carbon levels are a significant factor driving major climate shifts,” Dr. Mather emphasized.
“Comprehending how Earth managed its climate historically underscores the extraordinary pace of current climate change.”
“Human activities are releasing carbon at a staggering rate, far surpassing any natural geological processes previously recorded.”
“The climate balance is tipping alarmingly fast.”
For more on this research, the findings are published in the journal Communications Earth & Environment.
_____
B.R. Mather et al. 2026. Carbon emissions along divergent plate boundaries influence climate shifts between icehouses and greenhouses. Communications Earth & Environment 7, 48; doi: 10.1038/s43247-025-03097-0
A detailed analysis of 17 fossil specimens of Tyrannosaurus rex indicates that the iconic dinosaur grew much more slowly than previously believed, reaching an adult weight of approximately 8 tons only around age 40. This challenges earlier assumptions about its life history.
Tyrannosaurus rex holotype specimen at the Carnegie Museum of Natural History in Pittsburgh, USA. Image credit: Scott Robert Anselmo / CC BY-SA 3.0.
Tyrannosaurus rex is renowned as one of the most iconic non-avian dinosaurs, continually captivating paleontologists and the public alike.
Previous growth studies proposed that this ancient predator could exceed 8 tons within just 20 years and live for nearly 30 years.
Utilizing advanced statistical algorithms, the new research examined bone slices under specialized lighting, uncovering hidden growth rings that previous studies had overlooked.
This analysis not only stretched out the growth timeline of Tyrannosaurus rex but also suggested that some specimens under about 15 years of age may not be juvenile Tyrannosaurus rex at all, and could instead belong to other species or distinct variants.
“This is the largest dataset ever collected regarding Tyrannosaurus rex,” stated Holly Woodward, a professor at Oklahoma State University.
“By studying the growth rings preserved in fossilized bones, much like tree rings, we reconstructed the growth history of these magnificent creatures year by year.”
Unlike the annual rings in a tree stump, a cross-section of a Tyrannosaurus rex bone records only the final 10 to 20 years of the animal’s life, because earlier rings are erased as the growing bone remodels.
“Our innovative statistical approach allowed us to estimate growth trajectories by synthesizing growth records from various samples. We examined every growth stage in greater detail than any prior studies,” explained Dr. Nathan Myhrvold, a mathematician and paleontologist at Intellectual Ventures.
“The resulting compound growth curves provide a more accurate representation of how Tyrannosaurus rex matured and grew in size.”
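The idea of stitching partial records into one growth curve can be sketched numerically. The snippet below is a simplified illustration, not the team’s actual statistical model: the ages and masses are invented, and a single logistic curve stands in for their more elaborate machinery. Because each specimen records only part of the trajectory, pooling overlapping records lets one curve be fit across all of them.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(age, max_mass, rate, midpoint):
    """Body mass (tonnes) as a logistic function of age (years)."""
    return max_mass / (1.0 + np.exp(-rate * (age - midpoint)))

# Pooled (age, mass) estimates from several hypothetical specimens; each one
# covers only a window of ages, but the windows overlap.
ages = np.array([8, 10, 12, 14, 16, 18, 20, 24, 28, 33, 38], dtype=float)
mass = np.array([0.6, 1.2, 2.1, 3.3, 4.5, 5.5, 6.3, 7.2, 7.7, 7.9, 8.0])

# Fit one compound growth curve to the pooled data.
(max_mass, rate, midpoint), _ = curve_fit(logistic, ages, mass, p0=[8.0, 0.2, 15.0])
print(f"asymptotic mass ~ {max_mass:.1f} t, fastest growth near age {midpoint:.0f}")
```

A flatter fitted `rate` and a later `midpoint` would correspond to the slower, steadier growth the new study describes.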
Rather than sprinting to adult size, Tyrannosaurus rex followed a more gradual and steadier growth pattern than previously assumed.
“The prolonged growth phase over 40 years likely enabled young tyrannosaurs to occupy various ecological roles within their environment,” said Dr. Jack Horner of Chapman University.
“This may explain how they maintained their status as apex carnivores at the end of the Cretaceous period.”
The team’s findings were published in the journal PeerJ.
_____
H.N. Woodward et al. 2026. Long-term growth and the extension of subadult development of the Tyrannosaurus rex species complex revealed through expanded histological sampling and statistical modeling. PeerJ 14: e20469; doi: 10.7717/peerj.20469
Using high-resolution images from NIRCam, the near-infrared camera aboard the NASA/ESA/CSA James Webb Space Telescope, astronomers have discovered COSMOS-74706, one of the earliest known barred spiral galaxies. The discovery is pivotal in shaping our understanding of cosmic evolution.
COSMOS-74706: an unsharp mask overlaid on images in the F200W, F277W, and F356W filters. The white curves trace logarithmic spirals along the galaxy’s arms, while the straight lines mark the north-south bar structure. Image credit: Daniel Ivanov.
The barred spiral galaxy COSMOS-74706 existed approximately 11.5 billion years ago.
“This galaxy developed its bar just two billion years after the universe’s inception,” stated Daniel Ivanov, a graduate student at the University of Pittsburgh.
“Stellar bars are elongated, linear features found at the centers of galaxies, true to their name.”
COSMOS-74706’s bar comprises a dense collection of stars and gas, appearing as a bright line bisecting the galaxy when viewed perpendicularly to its plane.
Stellar bars significantly influence a galaxy’s evolution, funneling gas from the outskirts into the center, which feeds the supermassive black hole and can inhibit star formation within the galactic disk.
While previous reports identified candidate barred spiral galaxies at similar distances, those analyses relied on less reliable photometric redshift estimates, whereas COSMOS-74706 has been confirmed spectroscopically.
In other instances, a galaxy’s light had been distorted by a massive foreground object, a phenomenon known as gravitational lensing.
“Essentially, COSMOS-74706 is the highest-redshift spectroscopically confirmed, unlensed barred spiral galaxy,” Ivanov noted.
“We were not surprised to find barred spiral galaxies so early in the universe’s timeline.”
“In fact, some simulations suggest the bar formed at redshift 5, or roughly 12.5 billion years ago.”
“However, I believe we shouldn’t expect to find many of these galaxies just yet.”
This discovery helps refine the timeline for bar formation, making it a significant finding.
Following a recent winter storm that pummeled California with rain and snow, the state is officially drought-free for the first time in 25 years, as reported by the US Drought Monitor.
December 2000 marked the last occasion when California had no areas classified as “abnormally dry” or experiencing drought.
While this drought-free status is encouraging news for water management, many residents are still dealing with the aftermath of severe atmospheric river storms that led to significant rainfall and widespread flooding. In contrast, high-altitude regions are grappling with heavy snowfall and increased avalanche risks.
Between December 20 and the end of the year, parts of Northern California received nearly 7 inches of rain, while Southern California saw up to 4 inches. As New Year’s approached, California faced additional rounds of heavy rain and flooding, lifting the state’s 17 major reservoirs to 129% of their average levels for this time of year, according to state records.
Steve Wargoman carries Christmas presents from his granddaughter’s flooded home after heavy rain on December 22 in Redding, California. Noah Berger/Associated Press
This wet winter has built up California’s snowpack, which is crucial for the state’s water supply, though it remains below average. In late December, the California Department of Water Resources reported that measurements from 130 stations across the Sierra Nevada showed a snow water equivalent of 6.5 inches, only 71% of the average for this time of year.
Nonetheless, officials are optimistic. The key months for snowfall in California—January, February, and March—are still ahead.
“It’s still early in the season, and the state’s water supply this year will ultimately depend on the frequency of storms continuing throughout the winter and early spring,” stated Angelique Fabbiani-Leon, state hydrographer with the Department of Water Resources, in a statement on December 30.
Typically, the Sierra Nevada snowpack provides about 30% of California’s annual water needs.
In contrast to California, other regions in the West, including Nevada, Utah, and Colorado, are enduring persistently dry conditions with snowfall well below normal levels.
Furthermore, in Washington, Oregon, Colorado, Arizona, and New Mexico, over 80% of monitoring stations report a “snow drought,” as defined by snow water equivalents falling below the 20th percentile, according to the National Oceanic and Atmospheric Administration.
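The “snow drought” threshold mentioned above lends itself to a simple check. Below is a minimal sketch in Python; the station record is made-up illustrative data, and `in_snow_drought` is a hypothetical helper, not an actual NOAA tool:

```python
def percentile(values, p):
    """Percentile (0-100) by linear interpolation between sorted values."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def in_snow_drought(current_swe, record):
    """True when the current snow water equivalent falls below the
    20th percentile of the station's historical record."""
    return current_swe < percentile(record, 20)

# Hypothetical 10-year record of snow water equivalent (inches)
# for one station on a given date:
record = [4.2, 5.1, 6.8, 7.3, 3.9, 5.5, 8.1, 6.0, 4.7, 5.9]
print(in_snow_drought(4.1, record))  # True: below the 20th percentile
```

The same comparison, run station by station, is how the share of stations in snow drought (over 80% in the states listed above) would be tallied.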
Weight loss medications, including Mounjaro (tirzepatide), are effective when taken consistently.
Alan Swart / Alamy
A recent study involving over 9,000 participants revealed that individuals who discontinue weight loss medications often regain the weight lost within two years. This finding underscores the notion that obesity should be viewed as a chronic disease necessitating ongoing treatment.
“These medications are very effective; however, obesity is a chronic, relapsing condition,” explained Susan Jebb of the University of Oxford at a press briefing. “Similar to hypertension medications, these treatments are likely needed for life.”
It’s evident that weight loss medications can significantly aid individuals in combating obesity, particularly newer GLP-1 medications mimicking gut hormones such as glucagon-like peptide 1—examples include semaglutide (Ozempic, Wegovy) and tirzepatide (Mounjaro, Zepbound). These drugs not only facilitate weight loss but also positively impact health metrics like blood pressure and cholesterol levels.
Nevertheless, many patients have ceased using GLP-1 medications due to side effects, including nausea, or a lack of availability triggered by heightened demand. “Approximately half of users discontinue these drugs within a year,” remarks Jebb.
While nations like the United States and parts of Europe permit long-term use of GLP-1 medications for weight control, frameworks like the UK’s National Health Service are restricting semaglutide usage for weight management based on cost-effectiveness evaluations over two years.
Previous studies indicate that individuals often regain weight after discontinuing semaglutide. It was unclear, however, whether this pattern extends to other weight loss interventions, or how quickly the weight returns once treatment stops.
To investigate this, Jebb and colleagues reviewed 37 trials, combining data from over 9,000 participants, all classified as overweight or obese and using some form of weight loss medication (including GLP-1) for about 10 months, followed by a monitoring period of roughly 8 months.
From their analysis, the researchers noted that participants lost an average of 8.3 kilograms and experienced improvements in metabolic parameters like blood pressure, cholesterol, and blood glucose levels.
When examining weight patterns during the follow-up phase, the model suggested participants regained the average weight lost within 1.7 years after stopping their medications.
In trials specifically of semaglutide and tirzepatide, participants lost an average of 14.7 kilograms, yet were projected to regain all of it within a year and a half. Jebb points out that further research is required to understand why weight returns faster after these drugs than after others.
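As a rough back-of-envelope check on these timelines, the implied regain speeds can be compared directly. This assumes, as a simplification the study's actual model does not make, a roughly linear rate of regain; `regain_rate` is an illustrative helper, not from the paper:

```python
def regain_rate(kg_lost, years_to_full_regain):
    """Average regain rate in kg/year implied by losing kg_lost and
    regaining all of it within years_to_full_regain."""
    return kg_lost / years_to_full_regain

pooled = regain_rate(8.3, 1.7)   # all weight loss drugs pooled
glp1 = regain_rate(14.7, 1.5)    # semaglutide/tirzepatide trials

print(round(pooled, 1))  # 4.9 kg/year
print(round(glp1, 1))    # 9.8 kg/year
```

On this crude reading, regain after the newer GLP-1 drugs runs roughly twice as fast as the pooled average, which is what makes the accelerated regain worth explaining.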
Additionally, the team discovered that the weight regain rate after ceasing weight loss drugs was about four times steeper than that observed following the termination of a structured behavioral weight loss program, which typically emphasizes healthy eating and increased physical activity.
However, this disparity may be attributed to the greater motivation for weight loss among individuals participating in behavioral programs compared to those relying on medications.
Another factor contributing to this swift weight regain may be the appetite suppression induced by these drugs. Users often report significant increases in hunger and cravings upon discontinuation, possibly leading to rapid weight resurgence, as noted by Taraneh Soleimani from Pennsylvania State University.
Yet, a separate analysis suggested that offering behavioral support during the follow-up phase did not effectively curb weight gain. Soleimani emphasizes that more research is essential to determine optimal strategies for supporting individuals transitioning off weight loss medications.
The findings, the researchers say, underscore the critical need to treat obesity as a long-lasting condition. “Weight loss drugs demonstrate effectiveness, and weight regain is prevalent upon cessation,” states Soleimani. “These results confirm obesity as a chronic condition that requires prolonged treatment.”
The San people of southern Africa utilize poison arrows for hunting, a practice rooted in ancient traditions.
imageBROKER.com / Alamy
Discoveries of plant poisons on 60,000-year-old arrowheads in South Africa suggest that ancient hunters harnessed toxic materials far earlier than previously believed.
Prior to this discovery, direct evidence for poisoned arrows extended back only about 8,000 years, although a 2020 study of arrow tips dating from 50,000 to 80,000 years ago found their designs resembled those of modern poison arrows.
Led by Professor Marlize Lombard, researchers at the University of Johannesburg found that the tips of 60,000-year-old arrowheads were coated in a sticky substance, though the presence of poison couldn’t initially be confirmed.
Recently, Professor Lombard and her team confirmed the presence of toxic alkaloids, such as buphanidrine and buphanisine, in five quartzite arrowheads retrieved from the Umhlatuzana rock shelter in KwaZulu-Natal province.
The scientists believe these toxins likely originated from milky exudates from the roots of the plant Boophone disticha, which could be applied directly to arrow tips or processed to create a potent resin.
“If we found this in just one artifact, it could have been a mere coincidence,” Lombard noted. “However, finding it in five out of ten artifacts strongly indicates it was systematically used 60,000 years ago.”
The same toxic sap is still employed by the San people today, suggesting an unbroken tradition lasting at least 60,000 years.
Toxic plant traces discovered on arrow points from the Umhlatuzana rock shelter
Marlize Lombard
The plant’s poison is lethal to rodents within 30 minutes and can induce nausea and coma in humans. For larger prey, the toxins likely slowed them down, allowing hunters to successfully track and kill them.
Professor Lombard speculates that the poison may have first been discovered when early humans ingested toxic bulbs, which could lead to illness or death. The plant also possesses antiseptic, antibacterial, and hallucinogenic qualities and is utilized in traditional medicine, though accidental overdoses still occur.
To verify their findings, researchers tested arrows collected by Carl Peter Thunberg, a Swedish naturalist who documented the use of poisoned arrows by indigenous hunters in the 1770s. These tests also revealed the presence of toxic alkaloids from the same plant species.
Sven Isaksson, a member of the research team at Stockholm University, noted that this discovery signifies an early example of sophisticated plant utilization. “While humans have utilized plants for nourishment and tools for millennia, this represents a distinct advancement: harnessing the biochemical attributes of plants to create drugs, medicines, and poisons.”
Ancient Humans Hunting Elephants—Evidence of Slaughtering Animals 1.8 Million Years Ago
Natural History Museum/Science Photo Library
Hunting an elephant is a formidable challenge that demands advanced tools and teamwork, but it offers an abundant source of protein.
A research team led by Manuel Dominguez-Rodrigo from Rice University in Texas suggests that ancient humans may have accomplished this feat approximately 1.78 million years ago in Tanzania’s Olduvai Gorge.
“Around 2 million years ago, our ancestors consistently consumed smaller game like gazelles and waterbucks but did not target larger prey,” says Dominguez-Rodrigo.
Later findings from Olduvai Gorge indicate a significant shift. This valley, rich in animal and human fossils deposited between 2 million and 17,000 years ago, shows a marked increase in elephant and hippopotamus remains around 1.8 million years ago. However, establishing conclusive evidence of human involvement in hunting remains elusive.
In June 2022, Dominguez-Rodrigo and his team discovered what may be an ancient elephant butchery site at Olduvai.
The site, dubbed the EAK site, revealed partial remains of an extinct elephant species, Elephas recki, surrounded by an array of stone tools that were much larger and sturdier than those utilized by hominins 2 million years ago. Dominguez-Rodrigo posits these tools were likely crafted by the ancient hominin Homo erectus.
“These include Pleistocene knives, known for their sharpness even today,” he notes, emphasizing their potential for butchering tasks.
Dominguez-Rodrigo and his colleagues believe these stone tools facilitated elephant butchery. Some limb bones appear to have fractured shortly after the elephant’s demise, indicating the bones were still fresh or “green.” Scavengers such as hyenas can strip meat, but they cannot shatter the dense bone shafts of mature elephants.
“We discovered numerous bones in the field with fresh fractures, pointing to human use of hammer stones for processing,” he states. “These ‘green’ fractured bones are widespread in the 1.7-million-year-old landscape and bear distinct impact marks.”
However, there is a scarcity of cut marks on bones, which typically indicate butchering practices to extract meat.
It remains uncertain whether humans actively hunted the elephants or merely scavenged existing carcasses.
“What we can confirm is that they disassembled the bones—or portions of them—leaving behind tools and bones as evidence,” affirms Dominguez-Rodrigo.
He adds that the transition to hunting elephants wasn’t merely due to advancements in stone tools, but also hinted at an increase in social structure and cultural development among hominin groups.
However, Michael Pante, a researcher at Colorado State University, remains skeptical of the findings.
Pante contends that the evidence for human exploitation of this individual elephant is weak. The interpretation relies heavily on the proximity of stone tools and elephant remains, as well as the inferred fractures created by human attempts to access bone marrow.
Pante asserts that the earliest definitive evidence of hippo, giraffe, and elephant exploitation at Olduvai comes roughly 80,000 years later, from research at the 1.7-million-year-old HWK EE site.
“In contrast to the EAK site, the bones at HWK EE exhibit cut marks and are associated with thousands of other bones and artifacts within an archaeological context,” he explains.
This new year is filled with significant events, including the 250th anniversary of America’s founding, the world’s largest sporting event, and an ambitious mission to the moon.
Discover the groundbreaking events set to shape 2026.
Milan Cortina Games
Prepare your skis, snowboards, and skates! The Winter Olympics and Paralympics are just around the corner.
Taking place from February 6th to 22nd in Milan and Cortina d’Ampezzo, Italy, the Olympics will showcase international winter sports stars competing for prestigious gold medals.
Team USA returns with proud cross-country skier Jessie Diggins, para snowboarder Noah Elliott, freestyle skier Alex Hall, and snowboarder Chloe Kim, all former gold medalists.
The closing ceremony is set for February 22nd, and both ceremonies will be broadcast live on NBC, with streaming available on Peacock.
Watch for the Paralympic Games in Milan and Cortina d’Ampezzo from March 6th to 14th, featuring six sports including para alpine skiing, para biathlon, and wheelchair curling.
Artemis II Launch
In 2026, NASA will make its grand return to the moon.
Scheduled to launch between February and April, the Artemis II mission will test NASA’s Space Launch System rocket and Orion spacecraft by sending four astronauts on a 10-day journey around the moon.
This marks the first crewed flight of the Artemis program, taking astronauts closer to the moon than anyone has traveled in the more than 50 years since the Apollo program concluded.
The mission is particularly critical, given discussions about the need for the U.S. to outpace China in lunar exploration.
A successful Artemis II flight could set the stage for Artemis III, which aims to land astronauts at the moon’s south pole, reinforcing America’s leadership in space exploration.
2026 FIFA World Cup
Viva el fútbol!
The highly anticipated FIFA World Cup returns this summer, marking its 23rd edition with a record 48 competing teams.
The opening match will take place on June 11th at Estadio Azteca, Mexico City, with the final scheduled for July 19th at MetLife Stadium in East Rutherford, New Jersey.
Over the span of a month, 104 matches will unfold, showcasing the strongest teams from around the globe.
The 16 host cities include Toronto and Vancouver in Canada, Guadalajara, Mexico City, and Monterrey in Mexico, as well as major U.S. cities like Atlanta, Boston, Dallas, Houston, and Los Angeles.
This year, there will be an additional 16 teams competing compared to the 2022 World Cup in Qatar.
Returning are heavyweights such as Argentina (three-time champions), Brazil (five-time champions), England, Germany, France, Spain, Uruguay, and the U.S., who seek their first championship title.
Several countries will be making their World Cup debuts including Cape Verde, Curacao, Jordan, and Uzbekistan.
As of December, 42 teams have officially qualified, including Mexico, Canada, and several others from around the globe.
The remaining six teams will be determined by March, as they compete in playoffs.
America 250
This year marks the 250th anniversary of the United States, commemorating the adoption of the Declaration of Independence on July 4, 1776.
This day symbolizes America’s emergence as an independent nation, embodying vital values of liberty and equality.
Events and initiatives in honor of this milestone are already underway, with many more planned throughout the year.
On New Year’s Day, America 250, a bipartisan initiative created by Congress in 2016, will unveil floats in the Pasadena Rose Parade. The theme is “Moving Forward Together for 250 Years.”
In January 2025, President Trump signed an executive order to plan events commemorating the anniversary, including a major celebration that transformed the Washington Monument into the “World’s Tallest Birthday Candle.”
Freedom 250 has announced the Great American State Fair on the National Mall from June 25th to July 10th, featuring exhibits from all 50 states.
“This will be an unprecedented event that you’ll never see again,” stated Trump in a video address on December 18th.
The grand celebrations will culminate in a Fourth of July National Unity Celebration on the National Mall, featuring a military flyover, remarks from President Trump, and a spectacular fireworks display.
Additionally, new Patriot Games—a four-day athletic event showcasing top high school athletes from each state—will be held.
Memorial Day parades are planned, and a UFC event at the White House is scheduled for Flag Day, which coincides with Trump’s birthday.
Plans are also in the works for an “Arc de Triomphe” in Washington, D.C., similar to the one in Paris.
Midterm Elections
The 2026 political landscape will be defined by battles for congressional control and crucial gubernatorial elections.
With Republicans holding a slight edge in the House, Democrats are striving to win three additional seats to reclaim leadership amidst ongoing redistricting challenges.
Key gubernatorial races will take place in battleground states including Georgia, Nevada, Arizona, Michigan, and Wisconsin, helping shape the national political landscape alongside major mayoral elections in cities like Los Angeles and Washington, D.C.
A long-lost star, discovered by the legendary astronomer Edward Emerson Barnard in 1892, has been astonishingly rediscovered in its original location.
Barnard was not just any astronomer; he made significant contributions to the field, including the discovery of Jupiter’s fifth moon, Amalthea, in 1892—nearly three centuries after Galileo’s initial discoveries. Recently, his observations have gained renewed interest due to a puzzling article he published in 1906, titled “Unexplained Observations.”
On a particular morning, Barnard noted a star near Venus while using his telescope to search for its satellite. He estimated its brightness to be around 7th magnitude on the astronomical scale, where fainter objects bear higher numbers. Typically, under dark skies, stars of magnitude 6 are the faintest visible to the human eye.
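The magnitude scale Barnard used is logarithmic: by Pogson's relation, a difference of 5 magnitudes corresponds to a factor of exactly 100 in brightness. A quick sketch (the function name is illustrative):

```python
def brightness_ratio(m_faint, m_bright):
    """Pogson's relation: each 5-magnitude step is a factor of 100,
    so one magnitude is a factor of 100 ** (1/5), about 2.512."""
    return 100 ** ((m_faint - m_bright) / 5)

# A magnitude-6 star vs. a magnitude-1 star: exactly 100x fainter.
print(round(brightness_ratio(6, 1)))      # 100

# Barnard's 7th-magnitude estimate vs. an 11th-magnitude star:
print(round(brightness_ratio(11, 7), 1))  # 39.8
```

A 4-magnitude gap, in other words, corresponds to a brightness factor of roughly 40.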
Yet in the Bonner Durchmusterung, a survey that cataloged all stars brighter than magnitude 9.5, Barnard’s 7th-magnitude star was conspicuously absent. The only celestial body he found nearby was a significantly dimmer 11th-magnitude star, roughly 40 times fainter.
Could it have been a large asteroid? “Ceres, Pallas, Juno, and Vesta were elsewhere,” he surmised. Some theorized that the 11th magnitude star he eventually observed in that region might have temporarily brightened. Other scientists speculated that Barnard could have been deceived by a “ghost” image of Venus through the telescope. The mystery lingered until late December 2024 when a dedicated group of astronomers sought to unravel it.
“In a weekly Zoom meeting dubbed ‘Asteroid Lunch,’ I brought it up,” says Tim Hunter.
Hunter, an Arizona-based amateur astronomer and co-founder of the International Dark Sky Association, along with both amateur and professional astronomers, evaluated all previous hypotheses and found flaws in them.
As doubts began to consume the group, Roger Ceragioli, an optical engineer from the University of Arizona, revisited the ghost theory by observing Venus at dawn using a vintage telescope similar to Barnard’s. Much to his surprise, although Venus was not positioned where Barnard had seen it, “the star emerged clearly in my field of view,” he noted. This led him to theorize that the star must be bright enough to be visible at dawn, even though the star map revealed it to be only 8th magnitude and therefore relatively faint.
The group’s conclusive findings suggested that Barnard’s purported 7th magnitude star was indeed the 11th magnitude star noted later—appearing brighter due to the dawn light. Using a 36-inch telescope at the Lick Observatory in California, Barnard first spotted this star alongside Venus, but no equally bright stars were visible in the area.
Estimating star brightness was a specialized skill in Barnard’s era, refined mainly by astronomers who studied variable stars, a field in which Barnard had no formal training. His mistake was thus rather excusable, Ceragioli suggests.
Hunter affirms Barnard’s legacy remains intact, saying, “We’re all big fans of Barnard. It’s a minor error in an impressive career.”
As we approach the end of 2025, it’s a time for reflection and planning for the new year. Many individuals consider New Year’s resolutions aimed at improving health, diet, and immunity. But how can you tell if these new habits are effective?
It’s crucial to understand that “boosting” your immune system can be misleading; more robust defenses might actually be harmful. Current research indicates that your ability to combat infections can be assessed through specific immune cell measurements. Monitor your “immunity grade” to determine if your body can fend off illness effectively.
However, tests are ineffective without comprehension of what they signify. A diverse gut microbiome is increasingly recognized as essential for health, leading to various DIY fecal tests available today. Unfortunately, there is still no consensus on the beneficial microorganisms that yield high scores. With insights from the Zoe health app, scoring your microbiome health will soon be more accessible, utilizing a scale from 0 to 1000.
That said, it’s vital to approach statistics critically. For instance, body mass index (BMI) is a commonly used health metric, yet it has significant limitations. While BMI is a straightforward calculation of weight relative to height, it fails to differentiate between unhealthy fat gain and healthy increases in muscle or bone mass. Researchers recently proposed a revised definition of obesity, highlighting the need for better indicators of health.
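Since the BMI formula itself is central to the critique above, it helps to see how little it contains. A minimal sketch; the function and example numbers are illustrative:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height
    in meters, squared."""
    return weight_kg / height_m ** 2

# The same value describes a muscular 80 kg athlete and a sedentary
# 80 kg person of the same 1.8 m height -- the metric cannot tell
# muscle or bone mass from fat.
print(round(bmi(80, 1.8), 1))  # 24.7
```

One division is the entire metric, which is exactly why it cannot distinguish healthy from unhealthy weight gain.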
This emphasizes two important points: Firstly, if you aim to improve your life, ensure you have the appropriate metrics to measure your progress. There’s little value in resolving to wake up early and exercise if your success is gauged solely by your alarm time. Secondly, scientific understanding is continuously evolving, so it’s essential to stay informed with the latest credible evidence. Rest assured, if you’re reading this, you’re already taking a promising step towards better health.
Paleontologists have discovered a dense concentration of dugong fossils at Al Masjabiya, an early Miocene fossil site in Qatar. These fossils indicate that the Arabian Gulf has hosted various species of sea cows over the past 20 million years, among them Salvacillen catalensis.
An artistic rendering of a group of Salvacillen catalensis foraging on the ocean floor. Image credit: Alex Boersma.
Salvacillen catalensis had a robust body and a downturned snout adorned with sensitive bristles. Dugongs (Dugong dugon) are closely related to manatees.
A key distinction between these aquatic herbivores, often referred to as sea cows, is their tails. Manatees possess a paddle-like tail, whereas dugongs feature a fluke-like tail that resembles that of a dolphin.
Dugongs inhabit coastal waters stretching from western Africa through the Indo-Pacific to northern Australia.
The Arabian Gulf hosts the world’s largest dugong population, making sea cows critical to the ecosystem.
As they graze on seagrass, dugongs alter the ocean floor, creating feeding channels that release buried nutrients into the surrounding waters for use by other marine life.
“We uncovered a distant ancestor of the dugong in a rock formation less than 16 kilometers (10 miles) from a bay with seagrass meadows, which is currently the primary habitat for dugongs,” stated Dr. Nicholas Pyenson, curator of fossil marine mammals at the National Museum of Natural History.
“This region has served as the main habitat for sea cows for the past 21 million years, with different species occupying this role over time.”
Few locations preserve as many bones as Al Masjabiya, a fossil site in southwestern Qatar.
The bone beds were initially identified in the 1970s during mining and oil exploration, when geologists found a large number of “reptilian” bones scattered across the desert.
Paleontologists revisited the area in the early 2000s and soon realized that these fossils belonged to sea cows, not ancient reptiles.
Using the surrounding rock layers as a guide, Dr. Pyenson and his team dated the bone bed to the early Miocene, approximately 21 million years ago.
They found fossils indicating that this area was once a shallow marine habitat teeming with sharks, barracuda-like fish, prehistoric dolphins, and sea turtles.
Researchers identified more than 170 separate occurrences of sea cow fossils across the Al Masjabiya site.
This renders the bone bed the richest trove of fossilized sea cow remains globally.
The fossilized bones from Al Masjabiya bore a resemblance to modern dugongs, although ancient sea cows still had hind limb bones, which contemporary dugongs and manatees have lost through evolution.
The prehistoric sea cows found here exhibited straighter snouts and smaller tusks compared to their living counterparts.
Researchers classified Al Masjabiya’s fossil sea cow as a new species: Salvacillen catalensis.
“Using a national name for this species seemed fitting, as it clearly indicates the location where the fossil was discovered,” said Dr. Ferhan Sakal, a researcher at Qatar Museums.
With an estimated weight of 113 kg (250 lbs), Salvacillen catalensis weighed about as much as an adult panda or a heavyweight boxer.
Nonetheless, it was among the smaller sea cow species ever found, with some modern dugongs weighing nearly eight times as much as Salvacillen catalensis.
Based on the fossils, scientists theorize that the region was rich in seagrass beds more than 20 million years ago, during an era when the bay was a hotspot of biodiversity, supported by sea cows nurturing these aquatic meadows.
“The density of Al Masjabiya’s bone bed provides a significant clue. Salvacillen catalensis acted as seagrass ecosystem engineers in the early Miocene, much like dugongs do today,” Dr. Pyenson added.
“Though the evolutionary agents have completely changed, the ecological roles have remained the same.”
The findings are documented in a paper published in the journal PeerJ.
_____
N.D. Pyenson et al. 2025. The abundance of early Miocene sea cows from Qatar demonstrates the repeated evolution of eastern Tethyan seagrass ecosystem engineers. PeerJ 13: e20030; doi: 10.7717/peerj.20030
The previous season marked the highest temperatures in the Arctic for the past 125 years. March, typically the month with the greatest sea ice extent, recorded the lowest levels in 47 years of satellite data. The North American tundra exhibited unprecedented greenness, showing more vegetation than ever before.
These findings, released on Tuesday in the National Oceanic and Atmospheric Administration’s annual Arctic Report Card, illustrate the swift and dramatic changes taking place in the region as global temperatures rise.
“The Arctic is warming at a pace that exceeds the global average, with the last decade being some of the hottest on record,” stated Steve Sarr, NOAA’s acting principal scientist and associate administrator for ocean and atmospheric research.
Due to this warming, “over 200 watersheds in the Alaskan Arctic are turning orange as permafrost thaws, ecosystems evolve, and elements like iron are released into rivers,” the report noted. The research also highlighted increased acidity and higher levels of toxic metals in these discolored streams.
This is just one of many consequences of climate change affecting the region detailed in the report. This marks the 20th year that NOAA has published the Arctic Report Card, and the first edition to be released during President Donald Trump’s second term.
The Trump administration has worked to diminish or eliminate other climate change reports, including the National Climate Assessment and the extensive climate disaster database. President Trump has labeled climate change a “con job” and is actively trying to reduce the Environmental Protection Agency’s power to regulate greenhouse gas emissions.
Matthew Druckenmiller, an author of the report and researcher at the National Snow and Ice Data Center, affirmed during a Tuesday press conference that the team faced “no political interference concerning our findings.”
Independent scientists consulted by NBC News remarked that the report conveys a similarly urgent tone and message as in previous years, with a few minor distinctions.
“Frankly, we haven’t observed a significant shift in tone compared to prior Arctic report cards, which is encouraging,” commented Tom Di Liberto, a climate scientist and media director at Climate Central. “The implications of their conclusions remain consistent with earlier Arctic report cards. The Arctic acts as a warning sign.”
Di Liberto, who previously worked in NOAA’s communications office before his position was cut in March as part of staff reductions, noted that the previous year’s report emphasized reducing fossil fuel production, whereas this year’s report does not mention fossil fuels at all. Otherwise, he identified no major differences.
NOAA unveiled the report at the American Geophysical Union’s annual meeting in New Orleans, one of the largest scientific gatherings of the year, drawing thousands of scientists, and highlighted how climate change is disrupting ecosystems and threatening livelihoods in the Arctic.
Mark Alessi, a climate scientist and fellow at the Union of Concerned Scientists, remarked that the report card “effectively communicates the realities of what is occurring on the ground in the Arctic.”
“Anyone reading this will understand that we continue to raise the alarm,” he emphasized.
In strong language, the report’s authors point out that proposed budget cuts to scientific programs collecting data in the Arctic, including satellite programs monitoring sea ice, threaten to undermine the data collection essential for this report and related decision-making.
“Aging infrastructure, along with risks to funding and staffing, could further erode existing AONs [Arctic Observing Networks]. Gaps are forming that hinder long-term trend analysis and decision-making,” the report warned.
Specifically, the report highlights several satellites within the Defense Meteorological Satellite Program that are set to be decommissioned in 2026; their loss will restrict sea ice measurements. It also notes that the tundra greenness dataset will no longer be updated due to NASA funding cuts, and that other climate datasets may be jeopardized by proposed federal budget cuts in fiscal year 2026.
The Arctic is warming two to four times quicker than the rest of the globe, a phenomenon known as Arctic amplification. This process alters ocean currents and the degree of sunlight absorbed by the Earth’s surface at the poles.
“This feedback loop leads to the loss of sea ice and land ice, increased absorption of sunlight, and consequently, more rapid warming,” explained Alessi.
Temperature records are categorized by the Arctic water year, with the latest data ranging from October 2024 to September 2025.
“This site, dating back 400,000 years, represents the earliest known evidence of fire-making not just in Britain and Europe but anywhere in the world,” stated Nick Ashton, co-author of the study and curator at the British Museum. He noted that this discovery pushes back the timeline of when our ancestors might have first learned to make fire by approximately 350,000 years.
Researchers are uncertain about the uses of fire by these hominin ancestors. They may have roasted meat, crafted tools, or shared narratives under its glow.
Understanding when our ancestors mastered the use of fire is crucial to unraveling the complexities of human evolution and behavior.
One hypothesis suggests that the ability to start fire contributed to the increase in brain size among early humans, as cooking facilitates easier digestion and boosts caloric intake. Another theory posits that controlling fire may have fostered social gathering spots at night, boosting social behavior and cognitive evolution.
“We know brain size was increasing towards its current capacity during this period,” remarked Chris Stringer, research head in human evolution at London’s Natural History Museum and another author of the Nature study. “The brain is energetically costly, consuming about 20 percent of the body’s energy. Thus, the ability to use fire enhances nutrient absorption from food, provides energy for the brain, and allows for the evolution of larger brains.”
Stringer emphasized that this finding does not signify the beginning of fire usage among humans but is merely the earliest instance researchers can confidently point to. Other early indications of fire use have been found in regions of South Africa, Israel, and Kenya, though these are contentious and open to interpretation.
From an archaeological standpoint, it’s challenging to ascertain the cause of wildfires or whether they were initiated by humans.
“The key question is whether they collected it from a natural source, managed it, or created it themselves. On the surface, this appears to be a robust case suggesting that the group knew how to start fires,” noted Dennis Sandgathe, a senior lecturer in the archaeology department at Simon Fraser University in Canada, who was not part of the study.
In the recent Nature study, researchers highlight the presence of deposits with fire residue, fire-cracked stone tools including a flint handaxe, and two small fragments of pyrite likely brought to the site by humans for fire-making, as indicated by geological analysis.
The prehistoric flint handaxe was discovered near a 400,000-year-old fire site that researchers believe was frequently used by Neanderthals. Image credit: Pathways to Ancient Britain Project.
Other outside researchers expressed skepticism.
Much of the evidence presented is “circumstantial,” wrote Wil Roebroeks, a professor emeritus of Palaeolithic archaeology at Leiden University in the Netherlands, in an email.
Roebroeks pointed out that later Neanderthal sites, dating to around 50,000 years ago, have yielded flint tools with wear marks indicating they had been struck against pyrite to produce sparks, a clear indication of humans creating fire. Such evidence is absent from the current study.
“While the authors conducted a thorough analysis of the Barnham data, they seem to be overstating their claims by suggesting this is the ‘earliest evidence of fire-making,’” Roebroeks noted.
For our ancestors, fire was vital for warmth, nutrition, deterring predators, and even melting resins used in adhesives.
However, Sandgathe emphasized that the evolution of fire-starting was not a straightforward path; it included sporadic adaptations and innovations. Evidence exists that early groups who learned to create fire sometimes lost that ability or ceased its use for cultural reasons.
“We must be cautious not to generalize any single instance … as proof that from this moment forward everyone will know how to start a fire,” Sandgathe remarked, referencing nearly 100 modern hunter-gatherer groups that have been meticulously observed. Some lacked the ability to generate fire.
“It’s probable that the art of fire-making was discovered, lost, rediscovered, and lost again across various groups over time. Its history is undoubtedly intricate.”
The order Lamniformes encompasses the sharks commonly referred to as mackerel sharks. This group includes some of the most recognized shark species, such as great whites and shortfin makos, along with lesser-known varieties like goblin sharks and megamouth sharks. The recent discovery of a 115-million-year-old giant shark in northern Australia indicates that lamniforms experimented with massive body sizes around 15 million years earlier than previously believed, reigning at the top of the marine food chain alongside giant marine reptiles during the era of the dinosaurs.
In the ocean off the coast of Australia 115 million years ago, a gigantic 8-m-long predatory shark chases an unwary long-necked plesiosaur. Image credit: Pollyanna von Knorring, Swedish Museum of Natural History.
Sharks are iconic predators in contemporary oceans, and their lineage dates back over 400 million years.
Nonetheless, the evolutionary journey of modern sharks initiated during the age of the dinosaurs, with the oldest known fossils appearing around 135 million years ago.
These early modern sharks were relatively small, measuring roughly 1 meter in length, but the lineage later produced colossal species like the renowned megalodon, which may have exceeded 17 meters, and the great white, the modern apex predator of the seas, measuring up to 6 meters.
Sharks possess cartilaginous skeletons, and their fossil record primarily consists of teeth, which are continuously shed and replaced throughout their lives.
This results in shark teeth being commonly found in sedimentary rocks on the ocean floor, alongside the remains of other species, such as fish and large marine reptiles, which dominated marine ecosystems during the time of the dinosaurs.
The rugged coastline around Darwin in northern Australia was once the muddy seabed of the ancient Tethys Sea, which extended from the southern reaches of Gondwana (now Australia) to the northern island archipelagos of Laurasia (now Europe).
Fossils of sea creatures like plesiosaurs, ichthyosaurs, and large bony fish have been uncovered.
Most notably, several giant vertebrae were found, indicating the presence of an unexpected predator: a giant lamniform shark.
The five recovered vertebrae were partially calcified, allowing for their preservation, and they closely resemble those of modern great white sharks.
However, while the vertebrae of an adult great white shark measure about 8 cm in diameter, the fossilized lamniform vertebrae from Darwin exceeded 12 cm in diameter.
These vertebrae also exhibited distinctive morphological traits, enabling their classification within the Cardabiodontidae, a family of giant predatory sharks previously known from oceans around 100 million years ago.
Significantly, the Darwin lamniforms are around 15 million years older than previously known cardabiodontids, yet had already achieved the substantial body size characteristic of the family.
“Our findings demonstrate that large body size is an ancient trait, with the Australian cardabiodontids measuring between 6 and 8 meters long and weighing over 3 tonnes,” stated lead author Dr. Mohammad Bazzi from Stanford University and colleagues.
“This is comparable to some of the largest marine reptiles of their time and indicates that lamniforms entered the apex predator niche early in their evolutionary history.”
“These sharks were substantial in size and inhabited shallow coastal waters,” added co-author Dr. Mikael Siversson, a researcher at the Western Australian Museum.
“This provides significant insights into the workings of ancient food webs and underscores the value of Australia’s fossil remains in comprehending prehistoric life.”
“This discovery not only reshapes the evolutionary narrative of sharks but also enhances Australia’s global significance in paleontological studies.”
“With each fossil discovery, we refine our understanding of ancient oceans and the remarkable creatures that once ruled them.”
For more details on this discovery, refer to the new paper published in Communications Biology.
_____
M. Bazzi et al. 2025. Early giant lamniforms mark the beginning of giant body sizes in the evolution of modern sharks. Commun Biol 8, 1499; doi: 10.1038/s42003-025-08930-y
In 2009, paleoanthropologists uncovered eight foot bones from an ancient human ancestor in 3.4-million-year-old deposits at the Woranso-Mille site in Ethiopia’s Afar Rift Valley. A new study reveals that this fossil, known as the Burtele foot, belongs to Australopithecus deyiremeda. The finding adds to the evidence that two hominin species, Australopithecus deyiremeda and Australopithecus afarensis, coexisted in the same region at the same time.
Australopithecus deyiremeda and Australopithecus afarensis. Image credit: Gemini AI.
“When we found this foot in 2009 and announced it in 2012, we recognized it was distinct from Lucy’s species, Australopithecus afarensis, which has received far more attention over the years,” stated Professor Yohannes Haile-Selassie from Arizona State University.
“Typically, naming a species based on postcranial elements is uncommon in our field, so we waited until we found diagnostic elements from the neck up.”
“Traditionally, the skull, jaw, and teeth are the primary markers for species identification.”
“When the Burtele foot was first reported, some teeth had already been found in the same area, but we weren’t certain they were from the same deposit level.”
“Then in 2015, a new species, Australopithecus deyiremeda, was named from the same region, but the foot was not included, despite other specimens being unearthed nearby.”
“Over the last decade, our repeated fieldwork has yielded more fossils, allowing us to confidently link the Burtele foot to the species Australopithecus deyiremeda.”
Australopithecus deyiremeda exhibits more primitive foot structures than Lucy’s species, Australopithecus afarensis.
While retaining an opposable big toe useful for climbing, Australopithecus deyiremeda is believed to have walked on two legs, pushing off from the second toe rather than from the big toe, which modern humans use.
“The presence of an opposable big toe in Ardipithecus ramidus was a surprising and unexpected finding, highlighting that 4.4 million years ago, early human ancestors still possessed opposable big toes,” remarked Professor Haile-Selassie.
“Then, a million years later, the discovery of the Burtele foot further amazed us.”
“Now we can also see what came later: members of Australopithecus afarensis had an adducted big toe and displayed complete bipedalism.”
“This indicates that bipedalism, or walking on two legs, manifested in diverse forms among these early human ancestors.”
“The discovery of specimens like the Burtele foot shows that there were multiple ways to walk bipedally. It wasn’t until later that a single form prevailed.”
To gain insights into the species’ diet, researchers sampled eight of the 25 teeth attributed to Australopithecus deyiremeda found in the area for isotope analysis.
This process involved cleaning the tooth to ensure only the enamel was analyzed.
“We sample the tooth using a dental drill with a very small bit, similar to what dentists use,” explained Naomi Levin, a professor at the University of Michigan.
“Using this drill, we meticulously remove a small amount of powder, which we store in a vial and return to the lab for isotope analysis.”
“The results were intriguing: Lucy’s species displayed a mixed diet, consuming both C3 (from trees and shrubs) and C4 (tropical grasses and sedges) plants, while Australopithecus deyiremeda relied primarily on resources from the C3 category.”
“We were taken aback by how distinctly clear the carbon isotope signal was, mirroring ancient hominin data from Ardipithecus ramidus and Australopithecus anamensis.”
“I considered the dietary differences between Australopithecus deyiremeda and Australopithecus afarensis. Although distinguishing them was challenging, the isotopic data distinctly indicated that Australopithecus deyiremeda was not exploiting the same range of resources as Australopithecus afarensis, which is among the earliest hominins known to consume C4 grass-based resources.”
Another significant analysis involved accurately dating the fossils and understanding the ancient environments inhabited by these early humans.
“We conducted extensive field research at Woranso-Mille to analyze how different fossil layers interrelate, which is essential for grasping when and in what environments different species thrived,” noted Professor Beverly Saylor from Case Western Reserve University.
In addition to the 25 teeth found at Burtele, researchers also recovered the jaw of a four-and-a-half-year-old child, displaying dental anatomy similar to that of a juvenile Australopithecus deyiremeda.
Professor Gary Schwartz from Arizona State University commented: “In juvenile hominins of this age, we observed evident growth discontinuity between front teeth (incisors) and back chewing teeth (molars), akin to patterns in modern apes and early australopiths like Lucy.”
“The most surprising aspect was that, despite gaining a better understanding of the diversity within early australopith (and thus early hominin) species regarding size, diet, locomotion, and anatomy, these early forms appeared surprisingly uniform in their growth patterns.”
Findings have been detailed in a paper published in this week’s edition of Nature.
_____
Y. Haile-Selassie et al. New discovery illuminates the diet and lifestyle of Australopithecus deyiremeda. Nature, published online November 26, 2025; doi: 10.1038/s41586-025-09714-4
Domestic cats (Felis catus), descendants of the African wildcat (Felis silvestris lybica), have successfully adapted to human environments worldwide. The precise origin of the domestic cat, whether in the Levant, Egypt, or another part of the African wildcat’s range, remains uncertain. A research team at the University of Rome Tor Vergata has sequenced the genomes of 87 ancient and modern cats. Their research challenges the traditional belief that domestic cats were brought to Europe during the Neolithic period, suggesting instead that their arrival occurred several thousand years later.
Ancient cat genomes from European and Anatolian sites indicate that domestic cats were introduced to Europe from North Africa around 2,000 years ago, long after the Neolithic period began in Europe. The Sardinian African wildcat represents a separate lineage originating in northwest Africa. Image credit: De Martino et al., doi: 10.1126/science.adt2642.
The history of domestic cats is extensive and complex, yet it contains many uncertainties.
Genetic analyses reveal that all modern domestic cats can trace their ancestry back to the African wildcat inhabiting North Africa and the Near East.
Yet, limited archaeological evidence and the challenges of differentiating between wild and domestic cats through skeletal remains pose significant obstacles in comprehending the origins and diffusion of early domestic cats.
“The timing and specifics surrounding cat domestication and dispersal are still unclear due to the small sample size of ancient and modern genomes studied,” stated Dr. Marco De Martino from the University of Rome Tor Vergata and fellow researchers.
“There are ongoing questions regarding the historical natural habitats of African and European wildcats and the possibility of their interbreeding.”
“Recent investigations have shown that ancient gene flow can complicate the understanding of cat dispersal, especially when relying on mtDNA data.”
“The origins of African wildcat populations on Mediterranean islands like Sardinia and Corsica are equally obscure.”
“Current research suggests these populations constitute a distinct lineage rather than stemming from domestic cats.”
To explore these issues, the team examined the genomes of 70 ancient cats retrieved from archaeological sites in Europe and Anatolia, along with 17 modern wildcats from Italy (including Sardinia), Bulgaria, and North Africa (Morocco and Tunisia).
In contrast to earlier studies, they concluded that domestic cats most likely emerged from North African wildcats rather than from the Levant, and that true domestic cats appeared in Europe and southwest Asia several thousand years after the start of the Neolithic.
The early cats of Europe and Anatolia were predominantly European wildcats, indicating ancient interbreeding rather than early domestication.
Once introduced, North African domestic cats proliferated across Europe, following routes used by Roman military forces, and reached Britain by the first century AD.
This study also reveals that the Sardinian wildcat is more closely related to North African wildcats than to either ancient or modern domestic cats, suggesting that humans transported wildcats to islands where they do not naturally exist, and that the Sardinian wildcat did not descend from early domestic cat populations.
“By identifying at least two distinct waves of introduction to Europe, we redefine the timeline of cat dispersal,” the researchers noted.
“The first wave likely introduced wildcats from northwest Africa to Sardinia, forming the island’s current wildcat population.”
“A separate, as yet unidentified population in North Africa triggered a second dispersal no later than 2,000 years ago, establishing the modern domestic cat gene pool in Europe.”
The team’s findings are highlighted in this week’s edition of Science.
_____
M. De Martino et al. 2025. Approximately 2,000 years ago, domestic cats migrated from North Africa to Europe. Science 390 (6776); doi: 10.1126/science.adt2642