New research reveals that burn injuries have significantly influenced the rapid evolution of humans.
Scientists from Imperial College London assert that our close relationship with fire has made our ancestors more resilient to burn injuries compared to other animals. This ongoing exposure to fire may have fundamentally shaped our wound healing processes and immune responses, leaving an indelible impact on our genetic makeup.
Study author Professor Armand Leroy, an evolutionary biologist at Imperial’s School of Life Sciences, states, “The concept of burn selection introduces a novel form of natural selection that is influenced by cultural factors.” He emphasizes, “This adds a new dimension to the narrative of what defines humanity, something we were previously unaware of.”
While minor burns typically heal swiftly, severe burns that take longer to mend can permit bacterial invasion, escalating the risk of infection.
Researchers hypothesize that these challenges prompted crucial genetic adaptations, leading evolution to favor traits that enhance survival after burn injuries. For instance, this includes accelerated inflammation responses and enhanced wound closure mechanisms.
Published in the journal BioEssays, the study contrasts human genomic data with that of other primates. Findings indicate that genes related to burn and wound healing exhibit accelerated evolution in humans, with increased mutations observed in these genes. These evolutionary changes are believed to have resulted in a thicker dermal layer of human skin and deeper placement of hair follicles and sweat glands.
However, the study suggests a trade-off; while amplified inflammation is beneficial for healing minor to moderate burns, it can exacerbate damage in cases of severe burns. More specifically, extreme inflammation from serious burns can lead to scarring and, in some instances, organ failure.
This research may shed light on why some individuals heal effectively while others struggle after burn-related injuries, potentially enhancing treatment methodologies for burns and scars.
“This field remains underexplored and represents a burgeoning area of research regarding burn injury responses,” Prince Kyei Baffour, a burn specialist and lecturer at Leeds Beckett University who was not part of the study, told BBC Science Focus.
Baffour recommends further investigations into various forms of fire exposure, including smoke inhalation.
Ornithologists at the Cornell Lab of Ornithology have unveiled the most comprehensive evolutionary tree of birds to date. This groundbreaking research reveals unexpected relationships and offers a fascinating resource for bird enthusiasts: the Birds of the World Phylogeny Explorer, where you can trace lineages and evolutionary milestones.
European bee-eater (Merops apiaster). Image credit: Rashuli / CC BY 2.0.
Understanding the phylogeny of birds is crucial for advancing bird research.
With over 11,000 bird species worldwide, consolidating phylogenetic trees into a singular, updated resource has posed significant challenges for ornithologists.
The Birds of the World Phylogeny Explorer directly addresses these challenges, remaining current with the latest scientific discoveries.
“This tool combines centuries of avian research with advanced computational tools, creating a captivating interactive resource that narrates the story of bird evolution,” stated Dr. Elliott Miller, a researcher with the American Bird Conservancy.
“New evolutionary relationships are constantly being discovered. We release annual updates to our phylogenetic tools, ensuring our datasets align with the latest taxonomy,” he added.
“This tool holds immense value for the scientific community,” remarked Dr. Pam Rasmussen from the Cornell Lab of Ornithology.
“The complete tree of bird life, built on cutting-edge phylogenetic research, is now a downloadable, interactive dataset from Birds of the World, encouraging further inquiry and exploration.”
“This evolutionary tree provides crucial insights into how evolutionary history has shaped traits such as beak shape, wing length, foraging behaviors, and habitat preferences in birds.”
“Bird lovers will appreciate the personalized features of the Birds of the World Phylogeny Explorer,” Dr. Marshall Iliff noted, also from the Cornell Lab of Ornithology.
“By logging into the platform, birders can visualize the diversity of their eBird species list, diving deep into bird history across orders, families, and genera, thus revealing noteworthy evolutionary patterns.”
“For birdwatchers, their lifetime list transforms into a personal journey through evolutionary history, highlighting how each species fits into the broader narrative of avian evolution.”
“Users are sure to encounter surprising revelations. For instance, why does the North American woodpecker closely resemble other woodpeckers yet belong to a different lineage?”
“Or why are peregrine falcons fierce hunters like hawks and eagles, even though they originate from a separate branch of the family tree?”
“Solving these taxonomic enigmas can become a lifelong pursuit for anyone deeply passionate about birds.”
Octopuses in shallow waters, such as the common octopus, typically possess larger brains.
Image Credit: Shutterstock
Research suggests that the large brains of octopuses are influenced more by environmental conditions than by social interactions.
It is widely accepted that larger mammalian brains correlate with social behavior, a theory known as the social brain hypothesis. The premise is that the more social connections a species has, the larger their brains must be to handle those interactions. This trend is evident among primates, dolphins, and camelids.
In contrast, cephalopods—like octopuses, cuttlefish, and nautiluses—exhibit significant intelligence despite mostly living solitary lives, with limited parental care and minimal social learning.
To delve deeper into the reasons behind the substantial brain size of these creatures, Michael Muthukrishna and researchers from the London School of Economics analyzed data from 79 cephalopod species with available brain information. They quantified brain size based on the total volume of an animal’s central nervous system, considering that octopuses actually possess nine brains: one central brain and semi-independent brains in each of their eight arms.
“These animals are a stark contrast to humans, with their unique appendages and behaviors,” Muthukrishna notes.
The findings revealed no direct correlation between brain size and sociability. However, they did uncover that cephalopods generally have larger brains when inhabiting shallow waters, where they encounter a wide array of objects to manipulate and use as tools, along with rich calorie availability. Conversely, species dwelling in featureless deep-sea environments tend to have smaller brains.
“The correlation is quite strong,” Muthukrishna states, “but it’s imperative to approach these findings cautiously,” as only about 10 percent of the existing 800 cephalopod species have brain data accessible.
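As a purely illustrative aside, the kind of species-level comparison described here can be sketched in a few lines of Python. The numbers below are invented placeholders, not values from the study, and a real analysis would use phylogenetic comparative methods rather than a plain correlation:

```python
# Purely illustrative sketch: hypothetical values, not data or code from the study.
# A real comparative analysis would also correct for shared ancestry
# (phylogenetic non-independence), which this toy example ignores.
from scipy.stats import pearsonr

# (species, CNS volume in mm^3, typical habitat depth in m) -- all invented
species_data = [
    ("shallow_octopus_A", 950.0, 20),
    ("shallow_cuttlefish_B", 620.0, 50),
    ("reef_octopus_C", 800.0, 35),
    ("midwater_squid_D", 300.0, 400),
    ("deep_sea_squid_E", 90.0, 1500),
    ("deep_sea_octopus_F", 120.0, 2000),
]

brain_sizes = [row[1] for row in species_data]
depths = [row[2] for row in species_data]

# A negative r would mean: the deeper the habitat, the smaller the brain.
r, p = pearsonr(depths, brain_sizes)
print(f"Pearson r = {r:.2f}, p = {p:.3f} (toy data, n = {len(species_data)})")
```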
“The absence of a social brain effect in octopuses is intriguing yet expected,” explains Robin Dunbar from Oxford University, who proposed the social brain hypothesis around three decades ago. He argues that because octopuses do not inhabit cohesive social groups, their brains lack the necessity to manage complex social dynamics.
Professor Paul Katz from the University of Massachusetts articulates the possibility that evolution may have led to smaller brain sizes each time cephalopods adapted to deep-sea environments. “It’s reminiscent of species dimensions reducing on isolated islands; the same could apply to species in the deep ocean,” he mentions.
Muthukrishna’s previous research proposed that brain size not only predicts the extent of social and cultural behaviors but also reflects ecological factors such as prey diversity. Thus, the parallel patterns between cephalopods, having diverged from vertebrates over 500 million years ago, and humans bolster the cultural brain hypothesis. According to Muthukrishna and colleagues, this hypothesis illustrates how ecological pressures and information acquisition lead to the development of larger, more complex brains.
“It’s not solely about social instincts when it comes to large brains,” Muthukrishna asserts.
“I wholeheartedly agree that exploring why humans possess large brains must be informed by our understanding of current species. However, unraveling the evolutionary history of large brains, particularly with cephalopods, is challenging, especially given the radically different predator-prey dynamics when their brains began evolving,” Katz explains.
Dunbar emphasizes that octopuses may require substantial brainpower for the independent use of their eight arms. “Understanding an octopus’s brain is complex due to its unique structure, but a significant part of its brain’s function is to manage the intricate body mechanics necessary for survival,” he states.
Furthermore, Dunbar notes that it is logical for larger brains to evolve in environments abundant in calories. “You can’t increase brain size without addressing energy consumption. Once you have a more substantial brain, its applications become vast, which is why humans can engage in writing, reading, and complex mathematics—skills not inherently present within our evolutionary contexts.”
A significant enigma in vertebrate evolution—why numerous major fish lineages appear suddenly in the fossil record tens of millions of years after their presumed origins—has been linked to the Late Ordovician mass extinction (LOME). This insight comes from a recent analysis conducted by paleontologists at the Okinawa Institute of Science and Technology Graduate University. The study reveals that the LOME, occurring approximately 445 to 443 million years ago, instigated parallel endemic radiations of jawed vertebrates (gnathostomes) and their jawless relatives within isolated refugia, ultimately reshaping the early history of fishes and their relatives.
Reconstruction of Sacabambaspis jamvieri, an armored jawless fish from the Ordovician period. Image credit: OIST Kaori Seragaki
Most vertebrate lineages first documented in the mid-Paleozoic appear in the fossil record long after the Cambrian origin of vertebrates and the Ordovician diversification of invertebrates. This temporal gap is often attributed to inadequate sampling and long ghost lineages.
However, paleontologists Kazuhei Hagiwara and Lauren Saran from the Okinawa Institute of Science and Technology Graduate University propose that the LOME may have fundamentally transformed the vertebrate ecosystem.
Utilizing a newly compiled global database of Paleozoic vertebrate occurrences, biogeography, and ecosystems, they identified that this mass extinction coincided with the extinction of stylostome conodonts (extinct marine jawless vertebrates) and the decline of early gnathostomes and pelagic invertebrates.
In the aftermath, the post-extinction ecosystems witnessed the initial definitive emergence of most major vertebrate lineages characteristic of the Paleozoic ‘Age of Fish’.
“While the ultimate cause of LOME remains unclear, clear changes before and after the event are evident through the fossil record,” stated Professor Saran.
“We have assimilated 200 years of Late Ordovician and Early Silurian paleontology and created a novel database of fossil records that will assist in reconstructing the refugia ecosystem,” Dr. Hagiwara elaborated.
“This enables us to quantify genus-level diversity from this era and illustrate how LOME directly contributed to a significant increase in gnathostome biodiversity.”
LOME transpired in two pulses during a period marked by global temperature fluctuations, alterations in ocean chemistry—including essential trace elements—sudden polar glaciation, and fluctuations in sea levels.
These transformations severely impacted marine ecosystems, creating post-extinction ‘gaps’ with reduced biodiversity that extended until the early Silurian period.
The researchers confirmed a previously suggested gap in vertebrate diversity known as the Thalimar gap.
Throughout this time, terrestrial richness remained low, and the surviving fauna consisted largely of isolated microfossils.
The recovery was gradual, with the Silurian period encompassing a 23-million-year recovery phase during which vertebrate lineages diversified intermittently.
Silurian gnathostome lineages displayed gradual diversification during an early phase when global biodiversity was notably low.
Early jawed vertebrates appear to have evolved in isolation rather than rapidly dispersing into ancient oceans.
The researchers noted that gnathostomes exhibited high levels of endemism from the outset of the Silurian period, with diversification occurring primarily in a few long-lived refugia.
One such refuge is southern China, where the earliest conclusive evidence of jaws is present in the fossil record.
These primitive jawed vertebrates remained geographically restricted for millions of years.
Turnover and recovery following LOME paralleled climatic fluctuations similar to those at the end of the Devonian mass extinction, including prolonged epochs of low diversity and delayed dominance of jawed fishes.
“For the first time, we discovered the entire body fossil of a jawed fish directly related to modern sharks in what is now southern China,” Dr. Hagiwara noted.
“They remained concentrated in these stable refugia for millions of years until they evolved the capability to migrate across open oceans to new ecosystems.”
“By integrating location, morphology, ecology, and biodiversity, we can finally understand how early vertebrate ecosystems restructured themselves after significant environmental disruptions,” Professor Saran added.
“This study elucidates why jaws evolved, why jawed vertebrates ultimately became widespread, and how modern marine life originated from these survivors rather than earlier forms like conodonts and trilobites.”
For more information, refer to the study published on January 9th in Science Advances.
_____
Kazuhei Hagiwara & Lauren Saran. 2026. The mass extinction that initiated the radiation of jawed vertebrates and their jawless relatives (gnathostomes). Science Advances 12(2); doi: 10.1126/sciadv.aeb2297
Humans have larger brains relative to body size compared to other primates, which leads to a higher glucose demand that may be supported by gut microbiota changes influencing host metabolism. In this study, we investigated this hypothesis by inoculating germ-free mice with gut bacteria from three primate species with varying brain sizes. Notably, the brain gene expression in mice receiving human and macaque gut microbes mirrored patterns found in the respective primate brains. Human gut microbes enhanced glucose production and utilization in the mouse brains, suggesting that differences in gut microbiota across species can impact brain metabolism, indicating that gut microbiota may help meet the energy needs of large primate brains.
DeCasien et al. provided groundbreaking data showing that the gut microbiome shapes brain function differences among primates. Image credit: DeCasien et al., doi: 10.1073/pnas.2426232122.
“Our research demonstrates that microbes influence traits critical for understanding evolution, especially regarding the evolution of the human brain,” stated Katie Amato, lead author and researcher at Northwestern University.
This study builds upon prior research revealing that introducing gut microbes from larger-brained primates into mice increases the metabolic energy available to the host—a fundamental requirement for supporting the development and function of energetically costly large brains.
The researchers aimed to examine how gut microbes from primates of varying brain sizes affect host brain function. In a controlled laboratory setting, they transplanted gut bacteria from two large-brained primates (humans and squirrel monkeys) and a smaller-brained primate (macaque) into germ-free mice.
Within eight weeks, mice with gut microbes from smaller-brained primates exhibited distinct brain function compared to those with microbes from larger-brained primates.
Results indicated that mice hosting larger-brained microbes demonstrated increased expression of genes linked to energy production and synaptic plasticity, vital for the brain’s learning processes. Conversely, gene expression associated with these processes was diminished in mice hosting smaller-brained primate microbes.
“Interestingly, we compared our findings from mouse brains with actual macaque and human brain data, and, to our surprise, many of the gene expression patterns were remarkably similar,” Dr. Amato remarked.
“This means we could alter the mouse brain to resemble that of the primate from which the microbial sample was derived.”
Another notable discovery was the identification of gene expression patterns associated with ADHD, schizophrenia, bipolar disorder, and autism in mice with gut microbes from smaller-brained primates.
Although previous research has suggested correlations between conditions like autism and gut microbiome composition, definitive evidence linking microbiota to these conditions has been lacking.
“Our study further supports the idea that microbes may play a role in these disorders, emphasizing that the gut microbiome influences brain function during developmental stages,” Dr. Amato explained.
“We can speculate that exposure to ‘harmful’ microorganisms could alter human brain development, possibly leading to the onset of these disorders. Essentially, if critical human microorganisms are absent in early stages, functional brain changes may occur, increasing the risk of disorder manifestations.”
These findings are published today in the Proceedings of the National Academy of Sciences.
_____
Alex R. DeCasien et al. 2026. Primate gut microbiota induces evolutionarily significant changes in neurodevelopment in mice. PNAS 123(2): e2426232122; doi: 10.1073/pnas.2426232122
Sahelanthropus: Fossil comparison with chimpanzees and humans
Williams et al., Sci. Adv. 12, eadv0130
The long-standing debate regarding whether our earliest ancestors walked on knuckles like chimpanzees or stood upright like modern humans may be closer to resolution, yet skepticism remains.
Scott Williams and researchers at New York University recently reanalyzed fossil remains of Sahelanthropus tchadensis, indicating that this species possessed at least three anatomical features suggesting it was our earliest known bipedal ancestor.
The journey to this conclusion has been extensive.
Fossilized remains of a skull, teeth, and jawbone from approximately 7 million years ago were first identified in 2002 in Chad, north-central Africa. The distinctive features of this ancient species, including its prominent brow ridge and smaller canine teeth, were quickly acknowledged as diverging from ape characteristics.
Analysis of the skull’s anatomy suggests the head was positioned directly over the spine, as in other upright, bipedal hominins.
In 2004, French scientists uncovered the femur and ulna associated with the Sahelanthropus skull from Chad. However, it wasn’t until 2020 that researchers claimed the femur exhibited curvature similar to that of non-bipedal great apes.
Since then, scholarly debate has fluctuated. For instance, in 2022, researchers Frank Guy and Guillaume Daver of the University of Poitiers argued for anatomical features of the femur that indicate bipedalism. In 2024, Clement Zanoli and colleagues from the University of Bordeaux countered, suggesting Guy and Daver’s assertions were flawed, as the anatomical characteristics of bipedalism may also appear in non-bipedal great apes.
Lead study author Williams started with a “fairly ambivalent” stance on Sahelanthropus.
His team investigated the femur’s attachment point for the gluteus maximus muscle, finding similarities to human femur anatomy.
They also compared the femur and ulna size and shape; while similar in size to chimpanzee bones, they aligned more closely with human proportions.
Additionally, they identified the “femoral tuberosity,” a previously overlooked feature of Sahelanthropus.
“We initially identified it by touch, later confirming it with 3D scans of the fossil,” Williams shared. “This bump, which is absent in great apes, whose femurs are smooth in this area, plays a critical role in mobility.”
This area serves as an attachment point for the iliofemoral ligament, the strongest ligament in the human body. While relaxed when seated, it tightens during standing or walking, securing the femoral head in the hip joint and preventing the torso from tilting backward or sideways.
However, Williams expressed doubts about whether this study would fully end the conversation about how Sahelanthropus moved.
“We are confident Sahelanthropus was an early bipedal hominin, but we must recognize that the debate is ongoing,” Williams noted.
In response to the new paper, Guy and Daver, who argued in 2022 that Sahelanthropus walked on two legs, issued a joint statement: “This reaffirms our earlier interpretations of Sahelanthropus adaptations and locomotion, suggesting habitual bipedalism despite its ape-like morphology.”
They acknowledged that only new fossil discoveries could unequivocally conclude the matter.
John Hawkes, a professor at the University of Wisconsin-Madison, also endorsed the new findings, noting their implications for understanding the complex origins of the hominin lineage.
“It may be deceptive to perceive Sahelanthropus as part of a gradual evolution towards an upright posture. It reveals crucial insights into these transformative changes,” Hawkes commented.
However, Zanoli disagreed: “Most of the evidence aligns Sahelanthropus with traits seen in African great apes, suggesting its behavior was likely a mix between chimpanzees and gorillas, distinct from the habitual bipedalism of Australopithecus and Homo.”
Paleontologists have studied 160-million-year-old fossils of Anchiornis huxleyi, a non-avian theropod dinosaur unearthed from the Late Jurassic Tiaojishan Formation in northeastern China. The preserved feathers indicate that these dinosaurs had lost the capability for flight. This rare find offers insights into the biology of organisms that existed 160 million years ago and their role in the evolution of flight among dinosaurs and birds.
This fossil of Anchiornis huxleyi has nearly complete feathers and coloration preserved, allowing for detailed identification of feather morphology. Image credit: Kiat et al., doi: 10.1038/s42003-025-09019-2.
“This discovery has significant implications, suggesting that the evolution of flight in dinosaurs and birds was more intricate than previously understood,” said paleontologist Yosef Kiat from Tel Aviv University and his team.
“It is possible that some species had rudimentary flight abilities but lost them as they evolved.”
“The lineage of dinosaurs diverged from other reptiles approximately 240 million years ago.”
“Shortly after (on an evolutionary timeline), many dinosaurs began developing feathers, unique structures that are lightweight and strong, made of protein, and primarily used for flight and thermoregulation.”
About 175 million years ago, feathered dinosaurs known as Pennaraptora emerged as distant ancestors of modern birds; they are the only dinosaur lineage that survived the mass extinction marking the end of the Mesozoic Era 66 million years ago.
As far as we know, the Pennaraptora group developed feathers for flight, but some may have lost that capability due to changing environmental conditions, much as modern ostriches and penguins have.
In this study, the researchers examined nine specimens of a feathered pennaraptoran dinosaur species called Anchiornis huxleyi.
This rare paleontological find, along with hundreds of similar fossils, had its feathers remarkably preserved due to the unique conditions present during their fossilization.
Specifically, the nine fossils analyzed were selected because they retained the color of their wing feathers: white with black spots on the tips.
“Feathers take about two to three weeks to grow,” explains Dr. Kiat.
“Once they reach their final size, they detach from the blood vessels that nourished them during growth and become dead material.”
“Over time, birds shed and replace their feathers in a process known as molting, which is crucial for flight.” He notes that birds that depend on flight molt in an organized and gradual manner, maintaining symmetry and allowing them to continue flying during the process.
Conversely, the molting of flightless birds tends to be more random and irregular.
“Molting patterns can indicate whether a winged creature was capable of flight.”
By examining the color of the feathers preserved in dinosaur fossils from China, researchers could reconstruct the wing structure, which featured series of black spots along the edges.
Additionally, newly grown feathers, which had not fully matured, were identifiable by their deviation in black spot patterns.
A detailed analysis of the new feathers in nine fossils revealed an irregular molting process.
“Based on our understanding of contemporary birds, we identified a molting pattern suggesting these dinosaurs were likely flightless,” said Dr. Kiat.
“This is a rare and particularly intriguing discovery. The preservation of feather color offers a unique opportunity to explore the functional characteristics of ancient organisms alongside body structures found in fossilized skeletons and bones.”
“While feather molting might seem like a minor detail, it could significantly alter our understanding of the origins of flight when examined in fossils,” he added.
“Anchiornis huxleyi’s inclusion in the group of feathered dinosaurs that couldn’t fly underscores the complexity and diversity of wing evolution.”
The findings were published in the journal Communications Biology.
_____
Y. Kiat et al. 2025. Wing morphology of Anchiornis huxleyi and the evolution of molting strategies in paravian dinosaurs. Communications Biology 8: 1633; doi: 10.1038/s42003-025-09019-2
In a room adorned with gray walls in the Dutch city of Nijmegen, peculiar activities unfold beneath your feet. You find yourself seated in a chair, donning a hat covered with sensors, and your bare feet are placed in holes in the platform. Below, a robot equipped with a metal probe begins tickling the soles of your feet. Soon, the air fills with shrieks, laughter, and a certain painful mirth. Here at Radboud University’s Touch and Tickle Laboratory, volunteers are subjected to relentless tickling for the sake of science.
“We can monitor the intensity, speed, and specific areas of stimulation on the legs,” explains Constantina Kirteni, the lab’s director, regarding the robotic tickling experiment. Simultaneously, researchers document participants’ brain activity and physiological metrics such as heart rate, respiration, and sweating. Armed with these neurological and physiological insights, the researchers aim to tackle age-old questions that have intrigued philosophers from Socrates to René Descartes. Why do we experience ticklishness, what does it reveal about the boundary between pleasure and pain, and does this peculiar behavior serve any real purpose? The findings could illuminate areas such as infant brain development, clinical conditions like schizophrenia, and the structure of conscious experience in our brains.
Though the researchers have yet to publish their findings, Kirteni is willing to share some early insights. Regarding what triggers the tickling sensation, she states, “For us to recognize it as tickling, the contact must be both strong and rapid.” Preliminary analyses also indicate that electroencephalography (EEG) reveals distinct patterns of brain activity when experiencing ticklish feelings. To delve deeper into which brain regions process these sensations, the researchers intend to employ functional MRI, although the robot will require modifications to avoid interfering with the scanner. Moreover, scientists at the institute have initiated inquiries into the intriguing question of whether people actually enjoy being tickled.
“We observe a mix of responses, allowing us to see both those who find it pleasurable and those who find it distressing,” Kirteni notes. While people’s reactions may include smiles or laughter, these do not necessarily correlate with their enjoyment levels. Additionally, perceptions can shift over time. “Some individuals have reported that though it may be enjoyable initially, prolonged exposure can become uncomfortable and even painful,” she adds.
Tickling Laboratory at Radboud University, Nijmegen, Netherlands
Cohen Verheiden
One of the enduring enigmas about tickling that Kirteni is eager to unravel is why self-tickling is impossible. This peculiar fact suggests that unpredictability in stimulation is crucial, a notion supported by contemporary studies. Numerous investigations indicate that our brains predict sensations triggered by our own actions, leading us to perceive our touch as less significant than that of others. This can become particularly perplexing in certain mental health conditions. Research suggests that individuals experiencing auditory hallucinations or sensations of being controlled by external forces find their own touch more ticklish. “This indicates a possible breakdown in how our brains forecast our feelings based on our movements,” Kirteni mentions. “We are keen to explore this further in clinical populations, especially those with schizophrenia.”
What Makes Us Ticklish?
Perhaps the most significant unanswered inquiry revolves around why we are ticklish. Known primarily among humans and their close relatives, tickling may have evolved from behaviors in great ape ancestors. For instance, chimpanzees and bonobos frequently tickle each other during play. In a study published this year, Elisa Demur and colleagues from the University of Lyon in France observed a bonobo colony for three months. They discovered a notable correlation between tickling and age: younger bonobos were tickled far more often than older ones.
Demur remarked, “This is intriguing because it aligns closely with human behavior, chiefly as an interaction for young children.” The researchers observed that social bonds significantly influenced the tickling interactions; pairs that primarily engaged in tickling sessions shared strong attachments.
For Demur, this suggests that tickling evolved as a prosocial behavior enhancing connections between youngsters and their group members. This is closely related to pretend play, she adds, since acts appearing aggressive and unpleasant from strangers can be enjoyable in the presence of friends or close acquaintances. In her studies of bonobos at the Lola ya Bonobo Sanctuary in the Democratic Republic of the Congo, she observes how orphaned infants respond to tickling by their human surrogate parents, highlighting the importance of familiarity. “It’s a fascinating behavior. It’s always joyful to see them laugh; they’re incredibly adorable!” she shares.
Regardless of one’s mental state or the relationship with the person (or machine) doing the tickling, even non-consensual tickling can elicit laughter. Some researchers argue that this indicates that tickling is a physiological reflex; however, this does not preclude the idea that its evolution served a social purpose. Another hypothesis suggests that this behavior could help young individuals learn to protect vulnerable areas of their body during play or combat. “The truth remains that we don’t have definitive answers because there are valid counterarguments for all these theories,” Kirteni states.
Rats “laugh” when tickled
Shinpei Ishiyama and Michael Brecht
Nevertheless, focusing exclusively on tickling behaviors in great apes may overlook a significant aspect of this behavior. While rodents are not known to engage in tickling among themselves, they appear to enjoy human tickling. Though previously thought to be non-ticklish, mice have shown a fondness for tickling when they feel comfortable. Researcher Marlies Austrand from the University of Amsterdam found that if mice are relaxed and flipped over, they can delight in tickling, producing high-pitched sounds that resemble laughter.
Interestingly, these sounds are beyond human hearing range, and it’s uncertain whether mice can hear them as well, adding to the mystery of their laughter. While Austrand’s findings are not yet published, it’s evident that rodents respond positively to tickling. “If given the choice between a safe, scented hutch in their home cage and being tickled, mice will choose the latter,” she asserts.
Austrand speculates on why humans and animals react as they do under tickling. Our brains are constantly engaged in predicting external stimuli, evaluating potential threats and survival tactics. She proposes that tickling introduces surprises that contradict these expectations. Yet, if we feel secure, these unexpected sensations can be exhilarating. “This is more of a hypothesis; it remains unproven,” she admits. “But I believe tickling aids animals, especially young ones, in adapting to a fluid environment,” she concludes. Such peculiar behavior may well be an evolutionary quirk that we should embrace.
This year brought many revelations about our ancient human relatives
WHPics / Alamy
This is an excerpt from Our Human Story, a newsletter about the revolution in archaeology. Sign up to receive it in your inbox every month.
If we try to summarize all the new fossils, methods, and ideas emerging from the study of human evolution in 2025, we might still be here in 2027. This year has been packed with developments, and I doubt it’s feasible for one individual to digest everything without isolating themselves from other distractions. This is particularly true in human evolution, which is a decentralized field. Unlike particle physicists, who often unite in teams for large-scale experiments, paleoanthropologists scatter in diverse directions.
There are two ways this year-long endeavor can falter. One risk is getting overwhelmed by an insurmountable amount of research, rendering it indecipherable. The other is simplifying the information to the point where it becomes incorrect.
With that in mind, here are three key points I want to highlight as we head into 2026. First, there have been remarkable discoveries about the Denisovans, reshaping our understanding of this mysterious group and challenging some of our previous assumptions. Second, we’ve seen a variety of new discoveries and ideas regarding how our distant ancestors created and utilized tools. Finally, we must consider the broader picture: how and why our species diverged so significantly from other primates.
The Denisovan Flood
Hebei Geography University
This year marks 15 years since we first learned about the Denisovans, an ancient group of humans that inhabited East Asia tens of thousands of years ago. My fascination with them has persisted, and this year, I was excited to witness a surge of discoveries that broadened our knowledge of their habitats and identities.
Denisovans were initially identified primarily through molecular evidence. The first fossil discovered was a small finger bone from Denisova Cave in Siberia, which defied identification based solely on its morphology, but DNA was extracted from it in 2010. Genetic analyses revealed that Denisovans were closely related to Neanderthals, who lived in Europe and Asia, and that they interbred with modern humans. Today, populations in Southeast Asia and Oceania, particularly in Papua New Guinea and the Philippines, carry the highest concentration of Denisovan DNA.
Since then, researchers have been on the hunt for additional Denisovan remains, though this endeavor has progressed slowly. It wasn’t until 2019 that a second specimen was identified: a jawbone excavated from Baishiya Karst Cave in Xiahe, on the Tibetan Plateau. Over the next five years, several more fossils were tentatively attributed to Denisovans, notable for their large size and pronounced teeth compared to modern humans.
Then came 2025, which brought numerous exciting findings. In April, Denisovans were confirmed in Taiwan, when a jawbone dredged from the Penghu Strait in 2008 was finally identified using preserved proteins. This discovery significantly extends the known range of Denisovans to the southeast, aligning with where their genetic markers remain today.
In June, the first Denisovan facial features emerged. A skull discovered in Harbin, northern China, was described in 2021 and designated as a new species, named Homo longi. It was initially only presumed to belong to a Denisovan because of its large size, but proteins extracted from the bone and mitochondrial DNA from dental plaque, recovered by Qiaomei Fu and her team, confirmed its Denisovan origins.
So far, these findings align well with genetic evidence indicating that Denisovans roamed extensively across Asia. They also contribute to a coherent image of Denisovans as a larger species.
However, two additional discoveries in 2025 were surprising. In September, a crushed skull found at Yunxian, China, and thought to belong to an early Denisovan was reconstructed; it dates back approximately 1 million years. This finding suggests that Denisovans existed as a distinct group much earlier than previously believed, indicating that their common ancestor with Neanderthals, known as Ancestor X, must have lived over a million years ago. If confirmed, it implies a longer evolutionary history for all three groups than previously thought.
Just a month ago, geneticists released a second high-quality Denisovan genome, extracted from a 200,000-year-old tooth found in Denisova Cave. Notably, this genome is distinct both from the first high-quality genome and from the Denisovan DNA carried by modern humans.
This indicates the existence of at least three groups of Denisovans: early ones, later ones, and those that hybridized with modern humans—this latter group remains a total archaeological enigma.
As our understanding of Denisovans deepens, their history appears much longer and more diverse than initially assumed. In particular, Denisovan populations that interbred with modern humans remain elusive.
For the past 15 years, Denisovans have captivated my interest. Despite their widespread presence across continents for hundreds of thousands of years, only a handful of remains have been documented.
Fortunately, I have a penchant for mysteries, because this puzzle won’t be solved anytime soon.
Tool Manufacturing
TW Plummer, JS Oliver, EM Finestone, Homa Peninsula Paleoanthropology Project
Creating and using tools is one of humanity’s most critical functions. This ability isn’t unique to our species, as many other animals also use and even make tools. Primatologist Jane Goodall, who passed away this year, famously demonstrated that chimpanzees can manufacture tools. However, humans have significantly elevated this skill, producing a more diverse array of tools that are often more complex and essential to our survival than those of any other animal.
As we delve deeper into the fossil record, we’re discovering that the practice of tool-making dates back further than previously thought. In March, I reported on excavations in Tanzania revealing that an unidentified ancient human was consistently creating bone tools 1.5 million years ago, well over a million years before bone tools were believed to become commonplace. Similarly, while it was previously thought that humans began crafting artifacts from ivory 50,000 years ago, this year, a 400,000-year-old flake from a mammoth tusk was discovered in Ukraine.
Even older stone tools have surfaced, likely due in part to their greater preservation potential. Crude tools have been identified from 3.3 million years ago at Lomekwi, Kenya. Last month in Our Human Story, I mentioned excavations in another part of Kenya demonstrating that ancient humans consistently produced a specific type of Oldowan tools between 2.75 million and 2.44 million years ago, indicating that tool-making was already a habitual practice.
Often, tools are found without associated bones, making it challenging to determine their makers’ identities. It’s tempting to assume that most tools belong to our genus, Homo, or perhaps to Australopithecus, our more distant ancestors. However, increasing evidence suggests that Paranthropus—a hominin with a small brain and large teeth, which thrived in Africa for hundreds of thousands of years—could also have made tools, at least simple ones like the Oldowans.
Two years ago, Oldowan tools were discovered alongside Paranthropus teeth in Kenya—admittedly not definitive evidence, but strongly suggestive. This year, a fossil of Paranthropus revealed that its hand exhibited a combination of gorilla-like strength and impressive dexterity, indicating capable precision gripping essential for tool-making.
How did these ancients conceive of their tools? One possibility, suggested by Metin Eren and others this year, is that they didn’t consciously create them. Instead, tool-like stones form naturally under various conditions, such as frost cracking rocks or elephants trampling them. Early humans may have utilized these “natural stones,” knowledge of which eventually led to their replication.
As humans continued to develop increasingly complex tools, the cognitive demands of creating them likely escalated, potentially facilitating the emergence of language as we needed to communicate how to make and use these advanced tools. This year’s research explored aspects like the difficulty of learning various skills, whether close observation is necessary, or if mere exposure suffices. The findings suggest two significant changes in cultural transmission that may correlate with technological advancements.
Like most aspects of evolution, tool-making appears to have gradually evolved from our primate predecessors, reshaping our cognitive capabilities in the process.
Big Picture
Alexandra Morton Hayward
Now let’s address the age-old question of how and why humans evolved so distinctly, and which traits truly set us apart. This topic is always challenging to navigate for three main reasons.
First, human uniqueness is multifaceted and often contradictory. Social scientist Jonathan R. Goodman suggested in July that evolution has forged humans to embody both “Machiavellian” traits—planning and betraying one another—and “natural socialist” instincts driven by strong social norms against murder and theft. Claims that humans are inherently generous or instinctively cruel tend to oversimplify the matter excessively.
Second, our perceptions of what makes us unique are shaped by the societies in which we exist. For instance, many cultures remain predominantly male-focused, leading our historical narratives to center around men. While the feminist movement is working to amend this imbalance, progress remains slow. Laura Spinney’s article on prehistoric women suggested that “throughout prehistory, women were rulers, warriors, hunters, and shamans,” a viewpoint made viable only through dedicated research.
Third, reconstructing the thought processes of ancient people as they adopted certain behaviors is inherently difficult, if not impossible. Why did early humans bury their dead and enact funerary rituals? How were dogs and other animals domesticated? What choices shaped ancient humans’ paths toward change?
Still, I want to spotlight two intriguing ideas surrounding the evolution of the human brain and intelligence. One concerns the role of placental hormones that developing babies are exposed to in the womb. Preliminary evidence suggests these hormones may contribute to brain growth, equipping us with the neural capacity to navigate our unusually complex social environments.
Another compelling possibility proposes that the genetic changes associated with our increased intelligence may have also led to vulnerabilities to mental illness. In October, Christa Leste-Laser reported that genetic mutations linked to intelligence emerged in our distant ancestors, followed by mutations associated with mental disorders.
This notion has intrigued me for years, rooted in the observation that wild animals, including our close relatives like chimpanzees, do not appear to suffer from serious mental illnesses such as schizophrenia or bipolar disorder. Perhaps our brains operate at the edge of our neural capabilities. Like a finely-tuned sports car, we can excel but are also prone to breakdowns. While still a hypothesis, this concept is difficult to shake off.
Oh, one more point. Although we often shy away from discussing methodological advancements, as readers generally prefer results, we made an exception in May. Alexandra Morton Hayward and her colleagues at the University of Oxford developed a method to extract proteins from ancient brains and potentially other soft tissues. Though such tissues are rarer in the fossil record compared to bones and teeth, some remain preserved and may offer a wealth of information. The first results could be available next year.
It’s challenging to find a more contemporary form of entertainment than open-world video games. Merging storytelling, social interaction, and the freedom to roam, these expansive technological projects offer a uniquely immersive experience with infinite possibilities. But do they truly embody novel concepts in storytelling?
This week, I had a conversation with Dan Hauser, co-founder of Rockstar and the lead writer for Grand Theft Auto and Red Dead Redemption. He was in London discussing his new venture, Absurd Ventures. He’s working on a range of exciting projects, including a novel and a podcast series titled Better Paradise (which delves into a vast online game that ends in tragedy), as well as a comedic quest set in an online universe known as Absurd Verse. He mentioned that he had an epiphany regarding the series 15 years ago while giving press interviews for the Grand Theft Auto IV expansion pack.
“I was conversing with a journalist from Paris Match, a very learned French individual, who stated, ‘The game Grand Theft Auto is akin to Dickens.’ I thought, bless you for saying that! However, in retrospect, they may not reach Dickensian heights, but they’re comparable in that they create worlds. When you examine Dickens, Zola, Tolstoy—any of those authors—you sense that the entire world they describe is magnificent. This is an open world. That’s the experience you seek from the game. It’s a bizarre prism through which to view a society that somehow becomes fascinating.”
A whole new world…an absurd world. Photo: Absurd Ventures/X
I found it incredibly engaging to discuss this concept with Hauser, as I concur that there are notable parallels between Victorian literature and contemporary narrative-driven video games. The extensive descriptive passages in these works served as a form of virtual reality, evoking vivid imagery in readers’ minds well before the advent of cinema. It’s wholly immersive. When I first read Jane Eyre a decade ago, I was struck by the richness of the inner thoughts presented, inviting readers to explore the main character’s psyche.
Hauser also noted structural resemblances to Grand Theft Auto. “There’s a sense of an expanded storytelling akin to the remarkable 19th-century novels from Thackeray onward,” he explained. “These stories can be viewed as shaggy dog tales that culminate at a single moment. They are deeply realistic; they contain a grounded progression rather than jumping around in time. The games are similarly grounded in that sense.”
For Hauser, this synthesis of Victorian literature and game design came to fruition with the creation of Red Dead Redemption 2, Rockstar’s magnum opus and a poignant narrative of vengeance set in late 19th-century America. “I consumed Victorian novels,” he shared. “I listened to the Middlemarch audiobook daily during my commute, and I loved every moment.” He faced challenges in striking the right tone for the dialogue, ultimately finding inspiration in blending Middlemarch, Sherlock Holmes, and cowboy pulp fiction.
“I listened to the Middlemarch audiobook every day on my way to and from the office,” Dan Hauser said. Photo: Chelsea Guglielmino/Getty Images
“From a writing perspective, I wanted it to feel more like a novel,” he remarked. “We believed this approach could yield something innovative story-wise. Given how visually stunning the game is and its strong art design, we aimed to anchor the narrative in a solid context. Our goal was to encapsulate the three-dimensionality of the characters’ lives while also portraying a sense of life and death in the 19th century, which is fundamentally different from our own experience.”
It’s fascinating to see how Victorian literature significantly influenced Rockstar’s acclaimed adventures. The gaming industry often feels inward-looking, with new titles being slightly modified iterations of successful older games, recycling the same fantasy and science fiction narratives. Drawing on Tolkien, Akira, or Blade Runner isn’t inherently problematic, but broadening one’s literary horizons is always beneficial. I eagerly anticipate how Hauser’s new endeavor will transform the notion of open-world gaming in the 21st century, yet part of me wishes he would fully embrace the adventure of a grand Victorian novel.
Forget Pride and Prejudice and Zombies; perhaps it’s time for Middlemarch and Machine Guns.
What to play
Gorgeous atmosphere… “Metroid Prime 4 Beyond”. Photo: Nintendo
Eighteen years have elapsed since the last installment of “Metroid Prime”; that’s how long it has been since I last explored a mysterious planet through Samus Aran’s visor. In that time, people have been born, attended school, completed exams, and faced their first hangovers. I’ve now played quite a bit of Metroid Prime 4: Beyond, the long-awaited return of Nintendo’s fierce (but often overlooked) heroine. I reviewed it this week and I’m pleased to report it wasn’t a disaster. While it’s somewhat uneven and carries an old-fashioned feel, it boasts a stunning atmosphere that is visually and audibly captivating, and it is genuinely fun. The gameplay resonates with me precisely because it doesn’t adhere to conventional modern game design principles. Keza MacDonald
Available on: Nintendo Switch/Switch 2. Estimated play time: 15-20 hours
Could Shadow be highlighted in Paramount’s upcoming Sonic the Hedgehog spin-off? Photo: Paramount Pictures and Sega of America, Inc.
Sega enthusiasts rejoice: Paramount Pictures has announced a Sonic the Hedgehog movie spin-off (or should that be a spin Dash?). As reported by Variety, the project, currently dubbed “Sonic Universe Event Film,” is set to release on December 22, 2028, following Sonic the Hedgehog 4, slated for March 2027. Perhaps there will be a new journey for Sonic’s rival, Shadow the Hedgehog? I might be alone in this, but I’m excited about Big the Cat’s fishing adventure.
The Information Commissioner’s Office, the UK’s independent data protection and information rights regulator, is currently investigating the 10 most popular mobile games, with a focus on children’s privacy. According to the organization’s blog, “84% of parents are worried about their children being exposed to strangers and harmful content via mobile games.” This scrutiny follows recent controversies surrounding Roblox.
As someone inundated with around 200 press releases weekly about this genre, I found this piece relatable. Rock, Paper, Shotgun elaborates on the seemingly unstoppable emergence of roguelike games. Edwin Evans-Thirlwell interviews developers to uncover the reasons behind the popularity of games featuring the three Ps: procedural generation, (character) progression, and permadeath.
What to click
Question Block
Using power…Dishonored 2. Photo: Steam Powered
Keza answers this week’s question, from reader Tom:
“I was reflecting on a recent question block about non-violent games and thought, are there games that maintain violent elements but still provide alternative paths to completion? I adored Red Dead Redemption 2, yet was frustrated that firearms were often the only means to resolve conflicts. I’ve seen countless amusing videos of players attempting to finish inherently violent games without bloodshed, highlighting a desire for pacifism.”
I distinctly remember playing the original Splinter Cell on Xbox, where the protagonist opts for a non-lethal approach by incapacitating foes rather than killing them. While it took me a long time to navigate, it was indeed a viable path offered by the game. The steampunk classic Dishonored and its sequel are known for allowing players to wrap up their quests without resorting to lethal force, utilizing supernatural abilities to manipulate their surroundings. However, if memory serves, choosing the pacifist route does make the game considerably harder.
In fact, most stealth games permit a non-violent approach, though few specifically reward players for sparing lives. One notable exception is the beloved comedic adventure Undertale, where players can ultimately engage monsters in dialogue instead of combat. I also believe it’s feasible to play through both original Fallout titles (possibly even Fallout: New Vegas) without killing anyone, should players possess enough charisma to navigate tough scenarios through dialogue.
We’re still accepting nominations for Game of the Year for our year-end special – let us know by emailing us at pushbuttons@theguardian.com.
The origins of the sperm swimming mechanism date back to ancient times.
Christoph Burgstedt/Alamy
The evolutionary roots of sperm can be traced to the unicellular forerunners of all existing animals.
Nearly all animals go through a unicellular phase in their life cycle, which involves two forms of sex cells, or gametes. Eggs are sizeable cells that hold genetic information and the nutrients necessary for early development, while sperm transport genetic material from one organism to another to fertilize eggs and create new life.
“Sperm plays a crucial role in the process that allows life to be transmitted from generation to generation,” states Arthur Matt from Cambridge University. “It carries the legacy of over 700 million years of evolutionary history and is likely linked to the origins of animals themselves. Our aim was to explore this extensive evolutionary narrative to understand the origins of sperm.”
Matt and his team utilized an open science dataset containing information about sperm proteins from 32 animal species, including humans. They combined this data with the genomes of 62 organisms, including various related single-cell groups, to track the evolution of sperm across different animal lineages.
The research revealed a “sperm toolkit” comprising about 300 gene families that make up a core sperm genome inherited from the last common ancestor of animals.
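To give a flavor of what identifying such a shared toolkit involves, here is a minimal Python sketch. The species and gene-family names are hypothetical stand-ins, and the study’s actual orthology and ancestral-reconstruction methods are far more sophisticated than a simple set intersection:

```python
# Purely illustrative sketch: hypothetical species and gene-family names,
# showing the general idea of intersecting per-species gene-family sets to
# find a conserved "core" -- not the study's actual pipeline.
from functools import reduce

# Hypothetical mapping: species -> gene families detected in its sperm proteome
sperm_gene_families = {
    "human": {"TUBB_fam", "DNAH_fam", "CATSPER_fam", "AKAP_fam"},
    "sea_urchin": {"TUBB_fam", "DNAH_fam", "CATSPER_fam", "SPERACT_fam"},
    "fruit_fly": {"TUBB_fam", "DNAH_fam", "AKAP_fam"},
    "unicellular_relative": {"TUBB_fam", "DNAH_fam"},  # single-celled outgroup
}

# Gene families shared by every sampled species: a rough proxy for an ancient core
core_toolkit = reduce(set.intersection, sperm_gene_families.values())
print(sorted(core_toolkit))  # -> ['DNAH_fam', 'TUBB_fam'] in this toy example
```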
“We have now identified numerous significant advancements in sperm mechanisms occurring long before multicellular animals emerged, even before the sperm themselves,” explains Matt.
This indicates that the sperm mechanics, represented by a “flagellum that propels a single cell,” were already evolving prior to the development of multicellular organisms.
Thus, the sperm toolkit was already present in our earliest unicellular predecessors, which were single-celled oceanic swimmers, long before the advent of animals.
“Animals evolved multicellularity and cellular differentiation, but they did not create sperm from nothing. They repurposed the body structure of their swimming forebears as the foundation for sperm,” states Matt. “In essence, sperm are not a novel creation of multicellular organisms but are constructed upon the designs of a single-celled organism repurposed for reproduction.”
The study also indicated that the major evolutionary innovations behind the vast variety of sperm seen today primarily affected the cell heads, while the tails have remained largely unchanged since their common ancestor.
Fertilization can occur in various ways, with some sperm reaching the egg within the body while others swim in open water, notes Adria Leboeuf, also from the University of Cambridge. “Finding eggs in these different settings presents unique challenges and requires specialized machinery,” she explains. “However, the tail remains well-preserved since it must be capable of swimming in all environments.”
“This illustrates how evolution can modify existing structures instead of creating mechanisms from scratch,” says Jenny Graves, from La Trobe University in Melbourne, Australia.
Research led by scientist Hannah Long at the University of Edinburgh has found that specific regions of Neanderthal DNA are more effective at activating genes responsible for jaw development than those in humans, potentially explaining why Neanderthals had larger lower jaws.
Neanderthal. Image credit: Natural History Museum Trustees.
“With the Neanderthal genome being 99.7% identical to that of modern humans, the variations between the species are likely to account for differences in appearance,” Dr. Long stated.
“Both human and Neanderthal genomes consist of roughly 3 billion characters that code for proteins and regulate gene expression in cells. Identifying the regions that influence appearance is akin to searching for a needle in a haystack.”
Dr. Long and her team took a targeted approach, focusing on a genomic region linked to the Pierre Robin sequence, a condition marked by an unusually small mandible.
“Individuals with the Pierre Robin sequence often have significant deletions or rearrangements in this portion of the genome that affect facial development and restrict jaw formation,” Dr. Long explained.
“We hypothesized that minor differences in DNA could produce more nuanced effects on facial structure.”
Upon comparing human and Neanderthal genomes, researchers discovered that in this segment, approximately 3,000 letters long, there are only three one-letter variations between the species.
This DNA region doesn’t code for genes but regulates when and how certain genes, particularly SOX9, which plays a crucial role in facial development, are activated.
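As a toy illustration of the kind of sequence comparison involved, the snippet below scans two short aligned sequences for single-letter differences; the sequences are invented and far shorter than the real regulatory region of roughly 3,000 letters.

```python
# Toy comparison of two aligned DNA sequences, reporting single-letter
# differences. The sequences are invented and far shorter than the real
# ~3,000-letter regulatory region described above.
human       = "ACGTTGCAATCGGATCAA"
neanderthal = "ACGTTGCAGTCGGTTCAC"

differences = [(i, h, n) for i, (h, n) in enumerate(zip(human, neanderthal)) if h != n]

for pos, h, n in differences:
    print(f"position {pos}: human {h} -> Neanderthal {n}")
print(f"{len(differences)} single-letter differences found")
```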
To confirm that these Neanderthal-specific differences were significant for facial development, scientists needed to demonstrate that the Neanderthal version could activate genes in the appropriate cells at the right developmental stage.
They introduced both Neanderthal and human versions of this region into zebrafish DNA and programmed the cells to emit different colors of fluorescent protein based on the activation of either region.
By monitoring zebrafish embryo development, researchers observed that cells responsible for forming the lower jaw were active in both human and Neanderthal regions, with the Neanderthal regions showing greater activity.
“It was thrilling when we first noticed the activity of specific cell populations in the developing zebrafish face, particularly near the forming jaw, and even more exhilarating to see how Neanderthal-specific variations could influence activity during development,” said Dr. Long.
“This led us to contemplate the implications of these differences and explore them through experimental means.”
Recognizing that Neanderthal sequences were more effective at activating genes, the authors questioned whether this would lead to enhanced target activity affecting the shape and function of the adult jaw, mediated by SOX9.
To test this idea, they supplied zebrafish embryos with additional copies of SOX9 and found that the cells involved in jaw formation occupied a larger area.
“Our lab aims to further investigate the effects of genetic differences using methods that simulate various aspects of facial development,” Dr. Long remarked.
“We aspire to deepen our understanding of genetic variations in individuals with facial disorders and improve diagnostic processes.”
“This study demonstrates how examining extinct species can enhance our knowledge of how our own DNA contributes to facial diversity, development, and evolution.”
The findings are published in the journal Development.
_____
Kirsty Utley et al. 2025: Neanderthal-derived variants enhance SOX9 enhancer activity in craniofacial progenitor cells, influencing jaw development. Development 152 (21): dev204779; doi: 10.1242/dev.204779
Several hominid species — Australopithecus africanus, Paranthropus robustus, early Homo species, Gigantopithecus blacki, Pongo, Papio, Homo neanderthalensis, and Homo sapiens — underwent significant lead exposure over the past two million years, as revealed by a new analysis of fossilized teeth collected from Africa, Asia, Oceania, and Europe. This finding challenges the notion that lead exposure is merely a contemporary issue.
Lead exposure affecting modern humans and their ancestors. Image credit: J. Gregory/Mount Sinai Health System.
Professor Renaud Joannes-Boyau from Southern Cross University remarked: “Our findings indicate that lead exposure has been integral to human evolution, not just a byproduct of the industrial revolution.”
“This suggests that our ancestors’ brain development was influenced by toxic metals, potentially shaping their social dynamics and cognitive functions over millennia.”
The team analyzed 51 fossil samples from around the world using a carefully validated laser ablation microspatial sampling technique, covering species including Australopithecus africanus, Paranthropus robustus, early Homo, Gigantopithecus blacki, Pongo, Papio, Homo neanderthalensis, and Homo sapiens.
Signs of episodic lead exposure were evident in 73% of the specimens analyzed (71% among the Homo sapiens samples), including Australopithecus, Paranthropus, and Homo specimens.
Some of the oldest samples, from Gigantopithecus blacki and dated to around 1.8 million years ago in the early Pleistocene and 1 million years ago in the mid-Pleistocene, displayed recurrent lead exposure events interspersed with periods of little to no lead uptake.
To further explore the impact of ancient lead exposure on brain development, researchers also conducted laboratory studies.
Australopithecus africanus. Image credit: JM Salas / CC BY-SA 3.0.
Using human brain organoids (miniature brain models grown in the lab), researchers examined the effects of lead on a crucial developmental gene named NOVA1, recognized for modulating gene expression during neurodevelopment in response to lead exposure.
The modern iteration of NOVA1 has undergone changes distinct from those seen in Neanderthals and other extinct hominins, with the reasons for this evolution remaining unclear until now.
In organoids carrying the ancestral version of NOVA1, exposure to lead significantly altered neural activity related to FOXP2, a gene involved in the functioning of brain regions critical for language and speech development.
This effect was less pronounced in modern organoids with NOVA1 mutations.
“These findings indicate that our variant of NOVA1 might have conferred a protective advantage against the detrimental neurological effects of lead,” stated Alysson Muotri, a professor at the University of California, San Diego.
“This exemplifies how environmental pressures, such as lead toxicity, can drive genetic evolution, enhancing our capacity for survival and verbal communication while also affecting our susceptibility to contemporary lead exposure.”
An artistic rendition of a Gigantopithecus blacki herd in the forests of southern China. Image credit: Garcia / Joannes-Boyau, Southern Cross University.
Genetic and proteomic analyses in this study revealed that lead exposure in archaic variant organoids disrupts pathways vital for neurodevelopment, social behavior, and communication.
Alterations in FOXP2 activity indicate a possible correlation between ancient lead exposure and the advanced language abilities found in modern humans.
“This research highlights the role environmental exposures have played in human evolution,” stated Professor Manish Arora from the Icahn School of Medicine at Mount Sinai.
“The insight that exposure to toxic substances may conjure survival advantages in the context of interspecific competition introduces a fresh perspective in environmental medicine, prompting investigations into the evolutionary origins of disorders linked to such exposures.”
For more information, refer to the study published in the journal Science Advances.
_____
Renaud Joannes-Boyau et al. 2025. Effects of intermittent lead exposure on hominid brain evolution. Science Advances 11 (42); doi: 10.1126/sciadv.adr1524
Homo sapiens may have developed greater tolerance to lead exposure compared to other hominids
frantic00/Shutterstock
Research on fossilized teeth indicates that ancient humans were exposed to harmful lead for over two million years, suggesting that modern humans might have adapted to handle this toxic metal more effectively than their predecessors.
Traditionally, lead poisoning was associated with modern issues such as industrialization, poor mining techniques, and lead additives in fuels. Fortunately, efforts to phase out lead exposure have been underway since the 1980s.
This toxin is particularly harmful to children, hindering physical and cognitive growth, while adults may experience a range of serious physical and mental health issues.
Dr. Renaud Joannes-Boyau and colleagues from Southern Cross University in Lismore, Australia, aimed to investigate whether our ancient ancestors faced similar lead exposure.
They examined 51 fossilized hominin teeth, representing species such as Australopithecus africanus, Paranthropus robustus, Gigantopithecus blacki, Homo neanderthalensis, and Homo sapiens. The fossils were sourced from regions including Australia, Southeast Asia, China, South Africa, and France.
The research team utilized laser ablation techniques to identify lead concentrations in the teeth, revealing layers of lead that accumulated during the growth of these hominids. This exposure could be attributed to environmental contaminants, such as polluted water, soil, or volcanic eruptions.
Dr. Joannes-Boyau noted the surprising levels of lead discovered within the teeth, for instance in Gigantopithecus, a massive extinct relative of today’s orangutans that lived primarily in what is now China. “If a modern human exhibited similar lead levels, we would assume considerable exposure from industrial activities,” he remarked.
The research then shifted focus to understanding how both modern humans and Neanderthals managed lead exposure. The team created lab-grown brain models called organoids to analyze differences in the NOVA1 gene in both species, subsequently assessing the effects of lead neurotoxicity on these organoids.
“Our findings indicate that modern NOVA1 is significantly less impacted by lead neurotoxicity,” states Joannes-Boyau.
Crucially, when organoids carrying the archaic NOVA1 variant were exposed to lead, another gene, FOXP2, exhibited notable differences in activity.
“These genes are linked to cognitive functions, language, and social bonding,” explains Joannes-Boyau. “The diminished neurotoxicity in modern humans compared to Neanderthals could provide a crucial evolutionary advantage.” This suggests that lead exposure has influenced our evolutionary history.
However, Dr. Tanya Smith from Griffith University in Brisbane, Australia, remains cautious about the conclusions drawn by the researchers regarding lead exposure levels or potential evolutionary benefits inferred from their organoid studies.
“This paper is complex and makes speculative claims,” Smith emphasizes. “While it seems logical that ancient humans and wild primates faced some level of lead exposure, the limited scope and variety of fossils studied do not necessarily demonstrate that our ancestors were consistently exposed to lead over two million years.”
Exploring Neanderthals and Ancient Humans in France
Join New Scientist’s Kate Douglas on an engaging exploration of significant Neanderthal and Upper Paleolithic sites across southern France, spanning from Bordeaux to Montpellier.
Deep in Argentina’s Andes Mountains, paleontologists have uncovered the remains of a small dinosaur, giving insight into the early adaptations that characterized sauropod dinosaurs, most notably the extended neck seen in Diplodocus.
The fossil, named Huayracursor jaguensis, represents a partial skeleton of a creature that roamed the Earth during the Triassic period, roughly 230 million years ago. It is estimated to have measured around 2 meters in length and weighed about 18 kilograms.
Later sauropods like Brontosaurus and Patagotitan would grow to impressive sizes, over 35 meters long and weighing more than 70 tons, making them the largest and longest-necked animals in history.
Previously, scientists believed that the ancestors of these long-necked, herbivorous dinosaurs were small, short-necked, and possibly even omnivorous.
At the same time, other early sauropodomorphs measured only about 1 meter in length and displayed no signs of elongated neck bones, unlike the newly identified species. This had led paleontologists to think that the substantial growth in size and neck elongation of sauropods didn’t occur until millions of years later.
The discovery of Huayracursor jaguensis at Santo Domingo Creek in northwestern Argentina has prompted a reevaluation of how these dinosaurs developed their iconic long necks, according to Martin Hechenleitner from Argentina’s National Council for Scientific and Technical Research.
“Huayracursor presents a different narrative than the gradual transition model,” Hechenleitner points out. “This is evident since it coexisted with closely related species that were smaller and had relatively shorter necks.”
This dinosaur had a small skull, muscular hind limbs, slender hips, and notably short arms, with relatively large and robust hands compared to other dinosaurs of its era.
This suggests that the traits of increased size and neck elongation emerged early in this evolutionary line, Hechenleitner explains.
“Huayracursor allows us to trace the origins of elongated necks and larger body sizes back to the dawn of dinosaurs in the fossil record,” he says. Giants like Argentinosaurus and Patagotitan appeared more than 100 million years later, in a lineage whose early bipedal members measured just over a meter long and weighed between 10 and 15 kilograms.
Dinosaur hunting in Mongolia’s Gobi desert
Join an exciting expedition to unearth dinosaur remains in the expansive wilderness of the Gobi Desert, known as one of the premier paleontological sites in the world.
Did asteroid impacts shape the trajectory of human evolution?
Anna Ivanova/Alamy
This excerpt is from our “Human Stories” newsletter focusing on the archaeological revolution. Subscribe and receive it monthly in your inbox.
I remember when the concept of an asteroid impact causing the extinction of the dinosaurs was a new and thrilling idea. Luis Alvarez and his team first put forth this theory in 1980—the year before I was born. It was a bold assertion, despite the absence of concrete impact crater evidence at the time, relying instead on an unusual rock formation. It wasn’t until the 1990s, with the identification of the Chicxulub impact crater, that the theory gained substantial traction in paleontological circles. To this day, scientists debate whether the impact was the primary driver of extinction or if dinosaurs were already in decline prior to the asteroid’s strike.
Clearly, nothing on the scale of the exceptionally catastrophic Chicxulub impact occurred during the period of human evolution.
Yet, Earth faces numerous other cosmic hazards. A theory suggests that around 42,000 years ago, anomalies in the Earth’s magnetic field may have triggered a global ecological crisis, potentially contributing to the extinction of Neanderthals. This theory was initially proposed in 2021 in Science, and my colleague Karina Shah covered it in a news article.
Moreover, various cosmic events can affect our planet. Smaller meteorite impacts can severely disrupt ecosystems in their vicinity. Additionally, radiation from exploding stars, or supernovae, subjects life on Earth, including humans and their extinct relatives, to ongoing existential threats.
So, did cosmic events play a role in shaping human evolution?
Magnetic Field Fluctuations
Earth’s magnetic field shields us from intense solar radiation and cosmic rays
Milos Kojadinovic/Alamy
Let us first examine the Earth’s magnetic field. Generated by the movement of molten metals within the Earth’s core, this magnetic field extends far into space, offering protection from harsh solar radiation and cosmic rays.
However, this magnetic field is not entirely stable. Every few hundred thousand years, on average, it undergoes a reversal in which the north magnetic pole becomes the south pole. During these reversals, the field’s strength diminishes, allowing more radiation to reach the surface.
These reversals are not catastrophic, but there are also “excursions,” in which the field strength wanes for an extended period, sometimes changing direction, before returning to its original state without a full reversal.
The Laschamps event, occurring about 42,000 years ago, is a notable example where the magnetic field almost completely reversed. A 2021 study indicated this event lasted several hundred years, manifesting severe changes in atmospheric ozone levels. The researchers posited that these shifts likely incited “global climate change, resulting in environmental upheaval, extinction events, and alterations in archaeological records.”
Recent follow-up research has refined these ideas, suggesting that during the field’s excursion, phenomena such as auroras would have been visible farther south, affecting areas like Europe and North Africa and potentially exposing populations to harmful UV rays.
The authors further proposed that early modern humans in western Eurasia might have used a red pigment called ochre as a form of sunscreen, while also developing better clothing techniques. Such adaptations may have aided their survival against increased radiation exposure, unlike Neanderthals who lacked such adaptations.
Interestingly, the timing of the Laschamps event aligns closely with the last known presence of Neanderthals, raising questions about its possible role in their extinction.
Nevertheless, if we take a broader view of the past seven million years of human evolution, multiple magnetic field fluctuations have occurred. How did these excursions and reversals affect life during those times?
The last complete magnetic reversal, the Brunhes-Matuyama transition, occurred around 795,000-773,000 years ago, long before Neanderthals but perhaps around the time of our common ancestor with them. The geological record reveals numerous reversals throughout the past seven million years.
While smaller excursions are more frequent, securing evidence of them is challenging. A 2008 analysis identified 14 confirmed excursions over the past two million years, plus six others with weaker support.
Considering that Neanderthals experienced at least three excursions prior to the Laschamps event, why would this particular event lead to their extinction?
Besides, if the Laschamps event was hazardous enough to doom Neanderthals, other species should have suffered as well. Many megafauna species became extinct in Australia around 50,000 years ago, yet large animals in the Americas survived much longer, until roughly 13,000 years ago. Notably, there was no significant spike in extinctions around 42,000 years ago.
This raises skepticism regarding the hypothesis linking the Laschamps event to Neanderthal extinction. While it may have contributed, it likely wasn’t the primary factor.
Similar issues plague claims about cosmic events impacting human evolution.
Impact Events
I’m fascinated by meteorite impacts. For an interesting afternoon rabbit hole, check out Impact Earth, an interactive map showcasing impact craters on our planet. For example, consider the Zhamanshin impact crater in Kazakhstan, which is 13 km wide and about 910,000 years old, or the Pantasma crater in Nicaragua, which is 14 km wide and dates back 804,000 years. Both are sizeable compared to the Barringer Crater in Arizona, which measures just 1.2 km and is 61,000 years old.
Impact Earth catalogues 48 craters and sediments from the last 2.6 million years of geological history. If we expand our view back to the dawn of humanity, the number increases. Some noteworthy examples include:
Shunak in Kazakhstan, 7-17 million years ago, 2.8 km wide
Bigach in Kazakhstan, possibly 6 million years ago, 8 km wide
Karla in Russia, 4 to 6 million years ago, 12 km wide
Aouelloul in Mauritania, 3.1 million years ago, 0.39 km wide
Keep in mind, none of these impacts come close to the scale of the Chicxulub crater. The largest craters are merely one-tenth the size. Nevertheless, such impacts can have significant localized effects.
Moreover, the timing and location of impacts matter. For instance, a significant event in Kazakhstan 6 million years ago likely did not affect humans, as they were confined to Africa at that time. However, I have not been able to find any research investigating the ecological repercussions of the Aouelloul and Roter Kamm impacts in Africa.
Another notable impact occurred around 790,000 years ago, scattering distinctive tektites across Southeast Asia and Australia. A 2019 study linked these to a possible impact crater in Laos, measuring approximately 15 km in diameter. The event was too distant and too early to have been critical for Neanderthals, but it was undoubtedly significant for the Homo erectus living in that region; even so, it did not end their survival as a species, which lasted until around 117,000 to 108,000 years ago.
Exploding Stars
Supernovae emit massive pulses of matter and radiation
NASA/DOE/Fermi LAT collaboration, CXC/SAO/JPL-Caltech/Steward/O. Krause et al., NRAO/AUI
What about the more distant events, like exploding stars? When massive stars become supernovae, they release a massive outpouring of matter and radiation that traverses the galaxy. For years, we have known that nearby supernovae leave signatures in the rock record in the form of iron isotopes.
This leads to speculation about potential impacts. One proposal suggests extra cosmic rays from a supernova might increase cloud cover, thus lowering temperatures, which could have influenced australopithecines living in Africa at that time. Perhaps.
Physicist Adrian Melott of the University of Kansas has spent two decades delving into what he terms “astrobiophysics,” investigating how cosmic events such as supernovae might influence life on Earth. Much of this research concerns periods before the advent of Homo, but not all.
Melott highlights a significant moment around 2.6 million years ago, when the Pliocene epoch transitioned into the Pleistocene and large marine extinctions may have coincided with supernova activity. He posits that supernovae could have bombarded Earth with cosmic particles, potentially driving climate change, more frequent wildfires, and increased cancer rates. However, many of the paleontologists who identified the extinction instead link it to the loss of productive coastal habitats.
The universe presents an extensive array of threats. It’s vital to understand that numerous potentially perilous cosmic events have transpired during human evolution. Yet, limited evidence supports the notion that any of these incidents led to the extinction of human ancestors or any other species.
Thus, I tend to believe that asteroid impacts, supernovae, and shifts in the Earth’s magnetic field played a minimal role in the grand story of human evolution. While some cosmic events may have had localized impacts, there is little sign that they eradicated human species or catalyzed new adaptations.
Keep this perspective in mind the next time you read sensational headlines claiming cosmic events led to the demise of Neanderthals or other species.
Neanderthals, Ancient Humans, Cave Art: France
Join New Scientist’s Kate Douglas on an enthralling journey through time as she delves into the significant Neanderthal and Upper Paleolithic sites across southern France, from Bordeaux to Montpellier.
Stromatolites are rock-like structures formed by bacteria in shallow water
Lkonya/Shutterstock
Microorganisms in the remote bays of Western Australia are interconnected through tiny tubes, suggesting early stages of complex life evolution.
In Shark Bay, known by the Indigenous name Gathaagudu, microbes create slimy, multi-layered assemblages called microbial mats. This challenging environment, buffeted by tidal shifts and temperature fluctuations, has fostered communities of bacteria alongside another group of single-celled organisms, the archaea, which have thrived here for tens of thousands of years. These microorganisms often coexist symbiotically, forming layered sedimentary structures known as stromatolites.
“The mats develop under hypersaline conditions with elevated UV levels. They withstand cyclones. Despite facing numerous threats, they persist,” comments Brendan Burns from the University of New South Wales in Sydney.
He posits that these contemporary microbial communities may resemble those that existed billions of years ago when complex life first emerged. This evolution might have been driven by a mutual dependence between bacteria and Archaea, leading to the formation of more complex cells known as eukaryotes.
Burns and his team returned some of these microbial mat communities to the lab to cultivate the organisms in high-salinity, low-oxygen conditions.
They successfully cultured only one type of bacterium, Stromatodesulfovibrio nilemahensis, and a newly identified archaeon named Nearachaeum marumarumayae, a member of the Asgard archaea. These archaea, named after the gods’ abode in Norse mythology, are regarded as the closest relatives of the eukaryotic cells that make up the bodies of animals, plants, and humans.
“These organisms seem to directly interact and share nutrients,” states team member Iain Duggin of the University of Technology Sydney. Although there is no direct evidence yet, the complete genomic sequences obtained allow the team to speculate about the metabolic processes of both organisms.
The genomic analysis indicated that the bacteria synthesize amino acids and vitamins, while the archaea produce hydrogen and compounds such as acetic and sulfuric acids. Each partner makes products the other cannot, indicating a mutual dependency.
The researchers also observed indications of direct interaction between the two species. “We have observed what we refer to as nanotubes,” notes Duggin. “These microscopic tubes, seemingly produced by bacteria, establish direct connections to the surface of the Asgard cells.”
3D reconstruction based on electron microscope images showing cell membranes of Archaeon (blue) and bacteria (green), with nanotubes (pink) between them
Dr. Matthew D. Johnson, Bindusmita Paul, Durin C. Shepherd et al.
In addition, the archaeal cells generate chains of sac-like vesicles that appear to be used for transporting molecules along extracellular fibers. Duggin notes that these nano-sized vesicles seem to engage with the nanotubes formed by the bacteria.
“While nanotubes may be too slender for conduits, they facilitate a type of multicellular binding that enhances resource sharing,” asserts Duggin.
In the archaeon, the researchers also identified a genomic sequence coding for a previously unknown protein of about 5,500 amino acids, remarkably large for such an ancient lineage and similar to human muscle proteins. “While I can’t claim it’s directly connected to human muscle proteins, it suggests that their evolutionary origins may trace back much further,” says team member Kate Mischey from the University of New South Wales.
“What fascinates me most are the direct connections formed by nanotubes between bacteria and archaea,” comments Purificación López-García from Paris-Saclay University in France. “Such interactions have not been documented in previous cultures.”
However, discerning exactly how the bacteria and archaea behave towards each other is challenging, remarks Buzz Baum from the MRC Laboratory of Molecular Biology in Cambridge, UK. “It’s a complex relationship of conflict and cooperation,” he notes. “They interact, share, and sometimes clash, demonstrating a nuanced awareness of each other’s presence.”
Duggin believes the prevalent dynamic is more cooperative than combative. “These organisms coexisted in our culture for over four years, suggesting a level of harmony rather than contention,” he adds.
Burns and his colleagues propose that their findings may reflect an early stage in the evolution of eukaryotic cells within microbial mats. Roland Hatzenpichler at Montana State University aligns with this perspective.
“The study’s outcomes indicate that the newly identified Asgard Archaea engage directly with sulfate-reducing bacteria,” he remarks.
However, López-García cautions against projecting these interactions back more than 2 billion years. “While these archaeal and bacterial forms are modern, the microbial environments they inhabit may provide insights into ancient ecosystems,” she explains.
According to Hatzenpichler, we may be on the verge of understanding how closely today’s microorganisms resemble the cells that partnered to form the first primitive nucleated cells. “We’re now in an advantageous position to uncover deeper truths,” he concludes.
Lava planets are rocky exoplanets that orbit so close to their host star that surface conditions can melt silicate rock.
Boukaré et al. introduce a simple theoretical framework to explain the coupled evolution of lava planets’ interiors and atmospheres. Image credit: Sci.News.
A lava planet is typically a super-Earth to Earth-sized world, orbiting its star in less than one Earth day.
Similar to the Earth’s moon, these planets are expected to be tidally locked, displaying the same hemisphere to their stars at all times.
Their surface temperatures are extreme enough to melt or even vaporize rock, a state with no counterpart among the planets of our solar system.
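To give a sense of the temperatures involved, here is a rough back-of-the-envelope sketch of the substellar (permanent dayside) temperature of a tidally locked rocky planet on a one-day orbit around a Sun-like star. All parameter values are illustrative assumptions, not figures from the study.

```python
import math

# Rough, illustrative estimate of the substellar ("dayside") temperature of a
# tidally locked rocky planet on a one-day orbit around a Sun-like star.
# All parameter values are assumptions for illustration, not study values.
G      = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_star = 1.989e30    # stellar mass, kg (Sun-like)
R_star = 6.96e8      # stellar radius, m
T_star = 5800.0      # stellar effective temperature, K
P      = 86400.0     # orbital period, s (one Earth day)
albedo = 0.1         # assumed Bond albedo of a molten-rock surface

# Kepler's third law gives the orbital distance for the chosen period.
a = (G * M_star * P**2 / (4 * math.pi**2)) ** (1.0 / 3.0)

# Local radiative equilibrium at the substellar point (no heat redistribution):
# sigma * T^4 = (1 - A) * sigma * T_star^4 * (R_star / a)^2
T_substellar = T_star * math.sqrt(R_star / a) * (1.0 - albedo) ** 0.25

print(f"orbital distance: {a / 1.496e11:.3f} AU")
print(f"substellar temperature: ~{T_substellar:.0f} K (well above silicate melting points)")
```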
These unusual worlds are comparatively easy to observe because of their short, close-in orbits, and they offer valuable insights into the fundamental processes that drive planetary evolution.
“Due to the extreme orbital characteristics of lava planets, our understanding of rocky planets in the solar system does not apply directly, which leaves scientists uncertain about what to expect from observations,” states Dr. Charles-Édouard Boukaré from York University.
“Our simulations provide a conceptual framework for understanding their evolution and a way to investigate internal dynamics and chemical transformations over time.”
“While these processes are greatly intensified on lava planets, they fundamentally mirror those shaping rocky planets in our solar system.”
As rocks melt or evaporate, elements like magnesium, iron, silicon, oxygen, sodium, and potassium partition differently across vapor, liquid, and solid states.
The unique orbital dynamics of lava planets maintain vapor-liquid and solid-liquid equilibria for billions of years, facilitating long-term chemical evolution.
Using cutting-edge numerical simulations, the researchers predict that lava planets fall into two distinct evolutionary categories:
(i) Fully melted interior (likely a younger planet): The atmosphere reflects the planet’s overall composition, with heat distribution within the melt ensuring a hot and dynamic nightside surface.
(ii) Nearly solid interior (likely an older planet): Only shallow lava oceans persist, while the atmosphere becomes depleted of elements such as sodium, potassium, and iron.
“We sincerely hope that with the NASA/ESA/CSA James Webb Space Telescope, we will be able to observe and differentiate between young and old lava planets,” Dr. Boukaré expressed.
“Demonstrating this capability would signify a significant advancement beyond conventional observational methods.”
The study was published today in the journal Nature Astronomy.
____
C.E. Boukaré et al. The significance of internal dynamics and differentiation in the surface and atmosphere of lava planets. Nature Astronomy, published online July 29, 2025; doi: 10.1038/s41550-025-02617-4
Paleontologists have identified a novel genus and species of Triassic drepanosauromorph diapsid bearing remarkable skin appendages that are neither feathers nor hair. The discovery is based on two exceptionally well-preserved skeletons and related specimens, and the research shows that feather-like and hair-like outgrowths are not exclusive to birds and mammals.
Mirasaura grauvogeli hunting insects in a Triassic forest environment. Image credit: Gabriel Ugueto.
Feathers and hair are intricate outer body appendages of vertebrates, serving essential functions such as insulation, sensory support, display, and facilitating flight.
The development of feathers and hair traces back to the ancestral lines of birds and mammals, respectively.
However, the genetic frameworks responsible for these appendages may have origins deeper within the amniotic lineage, encompassing various animal branches, including those of birds and mammals.
The Triassic reptile species outlined by Dr. Stephan Spiekman from the Staatliches Museum für Naturkunde Stuttgart and his collaborators featured unique appendages that could reach up to 15.3 cm (6 inches) in length along their backs.
Named Mirasaura grauvogeli, this peculiar creature inhabited what is now Europe approximately 247 million years ago.
The species had a superficially bird-like skull but is classified within the diapsid group known as Drepanosauromorpha.
Anatomy of Mirasaura grauvogeli. Image credit: Spiekman et al., doi: 10.1038/s41586-025-09167-9.
The Mirasaura grauvogeli material, discovered in northeastern France in the 1930s, comprises 80 specimens, including two well-preserved skeletons as well as isolated appendages and preserved soft tissues; only recent preparation work led to its identification.
“This allowed the crest-like appendages to be linked to the skeletons,” the paleontologists noted.
“The tissue preserved within the appendages contains melanosomes (pigment-containing organelles found in skin, hair, and feathers) whose shapes resemble those in feathers more closely than those in reptilian skin or mammalian hair, yet the appendages lack the typical branching pattern of feathers.”
“These observations suggest that such complex appendages might have evolved among reptiles prior to the emergence of birds and their closest relatives, potentially offering new insights into the development of feathers and hair.”
Given the characteristics of the appendages observed in Mirasaura grauvogeli, the researchers ruled out roles in flight or camouflage, proposing instead a possible role in visual communication, such as signaling or predator deterrence.
The team’s research paper was published today in the journal Nature.
____
S.N.F. Spiekman et al. Triassic diapsids reveal early diversification of skin appendages in reptiles. Nature, published online July 23, 2025; doi: 10.1038/s41586-025-09167-9
Mollisonia, a marine creature from the Cambrian period, could represent an early arachnid
Junnn11 @ni075 CC BY-SA 4.0
Research indicates that the brains of ancient sea creatures, dating back over 500 million years, were structured similarly to those of spiders. This challenges past theories that arachnids originated on land.
Mollisonia dates from a time of rapidly increasing biological diversity, known as the Cambrian Explosion, when most major animal groups began appearing in the fossil record. The creature possessed chelicerae, pincer-like mouthparts likely used for tearing into small prey.
It was previously thought that the closest modern relatives of Mollisonia were horseshoe crabs rather than spiders. However, Nicholas Strausfeld and his team at the University of Arizona propose otherwise.
The researchers reexamined specimens of Mollisonia symmetrica, collected in 1925 from British Columbia, Canada, and now housed at Harvard University’s Comparative Zoology Museum. Strausfeld and his colleagues identified a brain structure that had previously been overlooked.
In horseshoe crabs, the chelicerae connect to nerves at the back of the brain; in Mollisonia, however, this arrangement was inverted, with the chelicerae linked to two neural regions at the front of the nervous system.
Strausfeld notes that this orientation is “characteristic of arachnid brains.” Unlike the brains of crustaceans and insects, which are folded inward, arachnids have crucial areas for planning agile movements situated at the back. This architecture likely contributes to the remarkable agility and speed seen in spiders.
While it was previously thought that arachnids evolved on land, the earliest unambiguous fossils of land-dwelling arachnids do not appear until millions of years later, according to Strausfeld. “Perhaps the first arachnids inhabited tidal environments, like Mollisonia, in search of prey,” he mentions.
Mike Lee, a researcher at Flinders University in Adelaide, Australia, who was not involved in the study, suggests that Mollisonia may now be viewed as a primitive arachnid. “We now recognize it possessed a brain akin to that of a spider, indicating it was an aquatic relative of the early spiders and scorpions,” Lee states.
Nonetheless, he cautions that while researchers strive to extract as much insight as possible from a single fossil, there remains a degree of ambiguity in interpretation. “It’s akin to attempting to piece together a unique Pavlova after it has been dropped,” he explains.
Social Media and Short-Form Video Platforms Drive Language Innovation
lisa5201/getty images
Algospeak by Adam Aleksic (UK, July 17th; Knopf, USA, July 15th)
Nothing ages you quite like unfamiliar slang. Adam Aleksic’s book Algospeak: How Social Media Will Change the Future of Language is full of it. Phrases like “Pierce Your Gyat for Rizzler” and “WordPilled Slangmaxxing” remind me that, as a millennial, I’m now as far removed from today’s Alphas as I am from the boomers.
A linguist and content creator (@etymologynerd), Aleksic charts a new wave of linguistic innovation fueled by social media, particularly short-video platforms like TikTok. The term “algospeak” has traditionally referred to euphemisms used to avoid online censorship, with recent examples including “unalive” (in reference to death) and “segg” (for sex).
However, the author insists on broadening the definition to encompass all language aspects affected by the “algorithm.” This term refers to the various, often opaque processes social media platforms use to curate content for users.
In his case, Aleksic draws on his experience of earning a living through educational videos about language. Like other creators, he is motivated to appeal to the algorithm, which requires careful word selection. A video he created dissecting the etymology of the word “pen” (tracing back to the Latin “penis”) breached sexual content rules, while a discussion on the phrase “from river to sea” remained within acceptable limits.
Meanwhile, videos that explore Gen Alpha terms like “Skibidi” (a largely nonsensical term rooted in scat singing) and “Gyat” (“goddamn” or “ass”) have performed particularly well. His findings illustrate how creators modify their language for algorithmic advantage, with some words crossing between online and offline use to notable success. When Aleksic surveyed teachers, he found many of these terms had entered regular classroom slang, with some students learning the word “unalive” before understanding “suicide”.
The book is at its strongest on etymology, tracing how algorithms propel words from online subcultures into the mainstream lexicon. Aleksic notes that the misogynistic incel community is a significant contributor to contemporary slang, a sign of how rapidly language can evolve within a radicalized group.
Aleksic approaches language trends with a non-judgmental perspective. He notes that “unalive” parallels earlier euphemisms like “deceased,” while “Skibidi” is reminiscent of “Scooby-Doo.” He pushes back against pigeonholing slang into arbitrarily defined generations, and against narratives that treat such innovation as a toxic corruption of the normal evolution of language.
The situation becomes more intricate when slang enters mainstream usage through cultural appropriation. Many contemporary slang terms, like “cool” before them, trace back to the Black community (“Thicc,” “bruh”) or originate from the LGBTQ ballroom scenes (“Slay,” “Yas,” “Queen”). Such wide-ranging adoptions can sever these terms from their historical contexts, often linked to social struggles and further entrenching negative stereotypes about the communities that birthed them.
Preventing this loss of context is difficult; the fate of successful slang is often to be stripped of its original nuances. Social media has drastically accelerated the timeline of language innovation, so Algospeak is a timely update, even if it risks dating quickly. As long as the algorithms persist, however, its insights into how technology shapes language will remain relevant.
Victoria Turk is a London-based author
New Scientist Book Club
Enjoy reading? Join a welcoming community of fellow book enthusiasts. Every six weeks, we explore exciting new titles, offering members exclusive access to book excerpts, author articles, and video interviews.
Ancient humans adapted to deeper forests as they journeyed from Africa, moving away from the savanna.
Lionel Bret/Eurelios/Science Photo Library
This is an excerpt from our “Human Stories” newsletter covering the archaeological revolution. Subscribe to receive it in your inbox every month.
Our human origins trace back to Africa. While this has not always been clear, it is now widely accepted.
This truth can be understood in two ways. The earliest known species closely related to us emerged from Africa, dating back 7 million years. Additionally, the oldest representatives of our own species, Homo sapiens, also originated from Africa.
Here, I will focus on the narrative of modern humans originating in Africa and their subsequent migrations across the globe. The introduction of DNA sequencing technology in the latter half of the 20th century enabled comparisons between different populations. This research demonstrated that African populations exhibit the greatest genetic diversity, while non-Africans show relative genetic similarity (despite visible differences such as skin color).
This genetic distinction serves as a telling indicator. It suggests that Africa was our birthplace with a diverse population, from which all non-Africans descended from a smaller subset that left this ancestral home to settle elsewhere. Geneticists affirmed this idea as early as 1995, and further evidence has since supported this claim.
However, there is a discrepancy between archaeological evidence and genetic findings.
Genetics indicates that all living non-Africans are descendants of a small group that left Africa around 50,000 years ago. Aside from minor uncertainties about the exact timeline, this conclusion has remained consistent for two decades. Conversely, archaeologists highlight numerous instances of modern humans existing outside Africa long before this timeline.
In Greece, a modern human skull found in the Apidima Caves dates back 210,000 years. The jawbone from Misliya Cave in Israel has been dated to at least 177,000 years. Additionally, there are several debated sites in China that may contain remains of modern humans. “Moreover, there’s an ongoing discussion on the earliest inhabitants of Australia,” says Eleanor Scerri from the Max Planck Institute for Geoanthropology in Germany, with some proposing human presence as early as 65,000 years ago.
What is the explanation for this disparity? Has our extensive genetic data misled us? Or is it true that we all share a common ancestry tied to a significant migration event, while older remains represent populations that did not survive?
Scerri and her team sought to understand this conundrum.
African Environment
The researchers debated the habitats of modern humans in Africa. “Did they simply migrate across diverse African grasslands, or were they adapting to vastly different environments?” asks Scerri.
To address this question, they needed extensive data.
“We began by analyzing all archaeological sites in Africa dating back 120,000 to 14,000 years ago,” explains Emily Yuko Hallett from Loyola University in Chicago. The team constructed a database identifying the climate at various locations and times.
A significant shift was observed around 70,000 years ago. “Simply examining the data without complicated modeling shows this climatic change,” notes Andrea Manica from the University of Cambridge. The range of temperatures and rainfall suitable for human habitation had notably expanded, leading people to venture into deeper forests and arid deserts.
However, mere observation is insufficient; the archaeological record is inherently incomplete and often biased.
“In certain regions, no archaeological sites exist,” remarks Michela Leonardi from the Natural History Museum in London. This absence might not reflect a lack of human occupancy, but rather the lack of preservation. “In more recent periods, preservation is easier due to the increased data availability,” she adds.
Leonardi devised a statistical modeling technique to determine if an animal shifted its environmental range. Could humans have transitioned from grasslands to diverse habitats, such as tropical rainforests? The team initially thought this modeling would take two weeks, but it took five and a half years.
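As a toy illustration of the underlying idea, not the study’s actual method, the sketch below compares the climatic “envelope” of occupied sites before and after 70,000 years ago, reduced to a single invented variable (annual rainfall); the real model is far more sophisticated and corrects for preservation bias.

```python
import statistics

# Toy illustration of the underlying idea (not the study's actual model):
# compare the climatic "envelope" of occupied sites before and after 70,000
# years ago, here reduced to a single made-up variable, annual rainfall in mm.
sites_before_70ka = [650, 720, 580, 700, 690, 640]        # mostly grassland-like
sites_after_70ka = [150, 300, 690, 1450, 2100, 620, 90]   # deserts to rainforests

def envelope(rainfall_values):
    """Summarise the span and spread of rainfall across occupied sites."""
    return {
        "min": min(rainfall_values),
        "max": max(rainfall_values),
        "stdev": round(statistics.stdev(rainfall_values), 1),
    }

print("before 70 ka:", envelope(sites_before_70ka))
print("after 70 ka: ", envelope(sites_after_70ka))
# A much wider, more variable envelope after 70,000 years ago is the kind of
# niche expansion the researchers tested for statistically, after correcting
# for gaps and biases in the archaeological record.
```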
Ultimately, the statistics affirmed their initial observation: around 70,000 years ago, modern humans began occupying a broader range of environments. The findings were published on June 18th.
Jack of All Trades
“At 70,000 years ago, our species appears to have transformed into the ultimate generalist,” states Manica. From this period onwards, modern humans adapted to a variety of complex habitats.
This could be misinterpreted. The team is not claiming that humans before this point were incapable of adapting; in fact, studies of extinct human species show that adaptability had been increasing over time.
“Humans were inhabiting environments vastly different from the early stages,” observes Scerri. “We’ve found evidence of habitation in mangrove forests, rainforests, desert edges, and highlands like those in Ethiopia.”
It appears that this adaptability is what allowed Homo sapiens to thrive during environmental changes in Africa, while other species like Paranthropus did not; they remained too rigid in their lifestyle to adapt.
What likely transpired in our species 70,000 years ago is that existing adaptability became pronounced.
Some of this understanding only becomes clear when considering the diverse habitats humans occupied. “One might think of deserts and rainforests in rigid terms, but there are actually numerous variations,” explains Scerri. “There are lowland rainforests, montane forests, marshes, and periodically flooded woodlands.” The same diversity applies even within desert environments.
Before, H. sapiens “did not exploit the full range of potential habitats,” states Scerri. “But around 70,000 years ago, we see the beginning of this expansion into more types of forests and rainforests.”
This narrative intrigued me, as I had been contemplating an opposite idea.
Great Quarantine
Last week, I authored a piece about the extinction of local human groups: it appears that some H. sapiens populations vanished without leaving a trace in modern genomes. After departing from Africa, these groups were among the first modern humans to reach Europe, where they struggled in harsh environments and eventually died out. These lost groups fascinated me. Why did they fail while others that entered Europe thousands of years later found much success?
The discovery that African groups expanded their environmental niches 70,000 years ago provides a partial explanation. If these later migrations involved more adaptable populations, they may have been better equipped to face the unfamiliar environments of Northern Europe—and subsequently Southeast Asia, Australia, and the Americas where their descendants would eventually journey.
A crucial point: this does not suggest that all populations 70,000 years ago thrived. “Not all humans instantly turned into successful populations,” Scerri explains. “Many of these groups disappeared, both inside and outside of Africa.”
Moreover, as with any significant discovery, this study introduces as many questions as it resolves. Specifically: what triggered modern humans to become more adaptable around 70,000 years ago?
Manica notes that skeletal morphology supports this idea. The oldest fossils classified as H. sapiens exhibit only some of the traits we typically associate with modern humans. “Starting around 70,000 years ago, we broadly witness many of these characteristics emerging as a package,” he asserts.
Manica posits that moving into new environments may have facilitated increased interaction between previously isolated populations. For instance, if two groups were separated by desert, they wouldn’t encounter or exchange ideas or genetic material until they learned to adapt to desert conditions.
“There may also be positive feedback,” suggests Manica. “With increased connectivity comes greater flexibility… breaking down barriers and fostering further interaction.”
To conclude, in a story about these lost populations, I mentioned that one of the greatest challenges for human groups was isolation. Without neighbors, a small group can face extinction due to minor setbacks. If Manica is correct, the opposite trend unfolded in Africa. Populations expanded and became increasingly connected, leading to a surge of creativity that allowed our species to spread across the globe.
In this light, the success of the last migration out of Africa could be attributed to the need for community. Without others, we may be vulnerable and at risk of failing. The notion of preparing for an apocalypse alone in isolation may be fundamentally flawed.
Current forest die-offs due to global warming resemble those from the Permian and Triassic extinction events.
Ina Fassbender/AFP via Getty Images
Following a dramatic increase in carbon dioxide levels 252 million years ago, the death of forests resulted in enduring climate alterations, with the greenhouse effect persisting for millions of years.
Researchers striving to comprehend this phenomenon, which triggered the largest mass extinction in Earth’s history, caution that ongoing greenhouse gas emissions may lead to similar outcomes.
The extinction events of the Permian and Triassic are believed to have been triggered by extensive volcanic activity in what is now Siberia, elevating atmospheric CO2 concentrations.
The planet’s surface temperature soared by as much as 10°C, with average temperatures in the equatorial regions climbing to 34°C (93°F)—a rise of 8°C above the current average.
Although some scientists have recently posited that this mass extinction had limited effects on terrestrial ecosystems, Andrew Meldis from the University of Adelaide expresses confidence that life was nearly extinguished 252 million years ago.
“Small pockets of life might survive mass extinctions in isolated enclaves, but many areas within the Permian-Triassic fossil record reveal a complete ecosystem collapse,” notes Meldis.
He and his team scrutinized the fossil record to investigate why the super-greenhouse conditions that accompanied the mass extinction lasted five million years, far longer than the roughly 100,000 years predicted by climate models.
The findings revealed that vast expanses of forest, originally with canopies around 50 meters tall, were supplanted by resilient ground flora typically ranging from 5 cm to 2 meters in height. Additionally, peat marshes, significant carbon-storing ecosystems, vanished from tropical areas.
Employing computer models of Earth’s climatic and geochemical systems, the researchers showed that the loss of these ecosystems allows elevated CO2 levels to persist for millions of years. This is largely because vegetation plays a crucial role in weathering, the mechanism that extracts carbon from the atmosphere and sequesters it in rocks and soil over long timescales.
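As a rough intuition pump, here is a minimal box-model sketch of that feedback, assuming invented parameter values rather than anything from the study: CO2 relaxes back quickly while vegetation-assisted weathering operates, and stays high when it is suppressed.

```python
# Toy carbon-cycle box model (not the study's model): atmospheric CO2 is added
# by volcanic outgassing and removed by silicate weathering, whose rate is
# scaled by a vegetation factor. Weakening that factor keeps CO2 elevated for
# much longer. All values are in arbitrary, illustrative units.
def co2_trajectory(veg_factor, c0=4.0, volcanic_input=1.0, k_weather=1.0,
                   dt=0.01, steps=2000):
    """Integrate dC/dt = input - k * veg_factor * C with forward Euler."""
    c = c0
    for _ in range(steps):
        c += dt * (volcanic_input - k_weather * veg_factor * c)
    return c

with_forests = co2_trajectory(veg_factor=1.0)
without_forests = co2_trajectory(veg_factor=0.2)

print(f"CO2 after the run, with forests:    {with_forests:.2f}")
print(f"CO2 after the run, without forests: {without_forests:.2f}")
# With vegetation-assisted weathering intact, CO2 relaxes back towards a low
# baseline; with it suppressed, the system settles at a much higher level,
# echoing the prolonged super-greenhouse seen in the fossil record.
```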
With atmospheric CO2 levels rising rapidly, the parallels to the present are striking, asserts Meldis. As temperatures escalate, tropical and subtropical forests may find it increasingly challenging to adapt, potentially surpassing thresholds where vegetation ceases to maintain climate equilibrium.
Meldis explains that the climate does not simply “ping-pong” back once ecosystems begin to recover; the atmosphere cannot be swiftly restored after the loss of the equatorial forests.
“You’re not transitioning from an ice house to a greenhouse and then back; the Earth will find a new equilibrium, which may differ significantly from prior states,” he elaborates.
Katrin Meissner, a researcher at the University of New South Wales who was not involved in the study, describes reconstructing these events as akin to “trying to assemble a jigsaw puzzle with many missing pieces,” yet she finds the team’s arguments “plausible.”
However, she notes considerable uncertainty regarding oceanic processes during this period. “The ocean harbors far more carbon than land and atmosphere combined, and we still lack a comprehensive understanding of how marine biology, chemistry, and physical circulation were affected during that event,” cautions Meissner.
On June 26, the U.S. Advisory Committee on Immunization Practices (ACIP) announced new recommendations against flu vaccines containing a controversial preservative that has been falsely linked to autism. While this change is unlikely to restrict access to vaccines, it reflects a broader effort by the U.S. government to overhaul the vaccine recommendation process.
What Changes Have Occurred with ACIP?
ACIP is an advisory body that provides expert recommendations to the U.S. Centers for Disease Control and Prevention (CDC) on vaccinations. Established in 1964, its members are appointed by the Secretary of Health and Human Services, currently Robert F. Kennedy Jr.
Kennedy recently dismissed all 17 members of ACIP, claiming it was necessary to eliminate conflicts of interest. “A complete overhaul is essential to restore public trust in vaccine science,” Kennedy stated in a press release.
This action faced pushback from many public health experts. “Prior to Kennedy’s actions, I had confidence in ACIP. Now, I have none,” remarked Amesh Adalja of Johns Hopkins University. “It’s reckless and dangerous to unilaterally dismantle an entire panel of experts,” added Tina Tan, president of the Infectious Diseases Society of America, in a statement.
Kennedy appointed eight new members, though one resigned before the inaugural meeting. Several of the new members have raised concerns about various vaccines. For instance, Robert Malone, a pioneer in mRNA vaccine technology, discussed potential links between Covid-19 vaccines and heart-related issues on The Joe Rogan Experience podcast in 2021. A 2024 study of approximately 46 million adults found lower rates of heart attack and stroke after vaccination, alongside an increase in rare side effects such as myocarditis and pericarditis.
A CDC spokesperson stated, “Dr. Malone is a seasoned physician who advocates for rigorous, evidence-based evaluations rather than uncritical acceptance. He does not oppose vaccines based on flawed data or policies—noting the necessity for better information.”
Adalja expressed concerns that some new members lack significant expertise in vaccines, infectious diseases, and epidemiology, suggesting that trust in ACIP under Kennedy’s leadership is dwindling. “In reality, he effectively made ACIP an independent entity,” he stated.
What Changes Have Been Made to Vaccine Recommendations?
In May, Kennedy stated in a video on social media platform X that the CDC would halt Covid-19 vaccine recommendations for most children and pregnant individuals. This decision was taken without ACIP’s input, breaking longstanding precedent.
The new ACIP also voted to discontinue recommendations for flu vaccines containing Thimerosal, a preservative used in several vaccines, during its first meeting. Additionally, the panel advocated for seasonal flu vaccinations for all individuals older than six months.
What is Thimerosal? Is it Safe?
Thimerosal is a preservative used in various vaccines to prevent bacterial contamination and contains trace amounts of mercury, which the body metabolizes into a byproduct known as ethyl mercury.
Ethyl mercury is distinct from the more harmful methylmercury found in certain environmental sources, such as fish. Methylmercury is highly toxic and can accumulate in the body, while numerous studies have shown that low doses of ethyl mercury are safe. Additionally, it is typically cleared from the bloodstream within 30 days, even in infants.
Despite claims from anti-vaccine proponents, no studies have substantiated a link between thimerosal and autism. A 2014 review of studies covering nearly 1.3 million children found no association between vaccines, including thimerosal-containing vaccines, and the development of autism.
The U.S. Food and Drug Administration conducted a thorough review of thimerosal use in pediatric vaccines in 1999, identifying no side effects aside from minor allergic reactions at the injection site.
Which Vaccines Contain Thimerosal?
The utilization of thimerosal in vaccines has decreased as formulations have evolved, with a shift toward single-dose vials minimizing the risk of bacterial contamination.
Thimerosal is not included in routine childhood vaccinations except for certain flu shots, which are used infrequently. For instance, during the 2024-2025 influenza season, only about 3% of adults over the age of 65 and about 2% of children received a flu vaccine containing thimerosal.
How Will New U.S. Vaccine Policies Compare to Other Countries?
Other nations have continued to recommend thimerosal-containing vaccines. The UK removed thimerosal from routine vaccinations between 2003 and 2005 to reduce mercury exposure, despite finding no evidence of harm, and it has since used some thimerosal-containing vaccines during specific flu seasons, including against H1N1.
In 1999, the European Medicines Agency recommended moving toward thimerosal-free vaccines as a precaution, even though there was no evidence of harm. In 2004, after new data reaffirmed the preservative’s safety, the agency noted that “the benefits of vaccination significantly outweigh any exposure concerns, including thimerosal.” A further review of its guidelines in 2016 reaffirmed that position.
Can I Get the Vaccine for Myself or My Child?
Since most seasonal flu vaccines do not contain thimerosal, the ACIP’s recommendations are unlikely to hinder vaccine access. Additionally, the CDC’s updated guidance on Covid-19 vaccines for children and pregnant people should not affect the availability of those vaccines.
Wimbledon is currently underway, and I’m seizing the moment to present a bold assertion: tennis holds a pivotal role as the most significant sport in the evolution of video games.
Although modern gaming giants like EA Sports FC, Madden, and NBA 2K dominate the charts, tennis laid the groundwork for the industry. In 1958, physicist William Higinbotham built a straightforward bat-and-ball tennis game that Brookhaven National Laboratory in Upton, New York, recognizes as the first video game created solely for entertainment. The game was exhibited on an oscilloscope during the lab’s annual open house, and the growing queue of players hinted at a burgeoning interest in video gaming.
Ralph Baer, the creator of the first mass-produced gaming console, Magnavox Odyssey, incorporated tennis into his innovations. While working for defense contractor Sanders Associates in the late 1960s, Baer’s prototype could only display vertical lines and square dots. Upon Magnavox’s release of the console in 1972, the standout games included table tennis and tennis, with players using a plastic overlay on the TV screen. This allowed two players to hit the ball back and forth, introducing a degree of “spin” via a dial on the controller. The simplistic controls of these tennis games limited player skill but laid the foundation for future development.
This progression inevitably led to Pong, widely regarded as the first major success in arcade gaming. Nolan Bushnell, Atari’s founder, was inspired by the tennis game on the Odyssey and sought to improve upon it. Collaborating with engineer Al Alcorn, he divided the on-screen bat into eight sections, each deflecting the ball at a different angle. This marked the dawn of precise player input, a critical aspect of future video games that allowed players to showcase skill and timing. The success of Pong prompted Bushnell to create a single-player variation, Breakout, in which players hit a ball against a disappearing brick wall, effectively a one-player tennis game. Its brilliance significantly influenced the Japanese gaming landscape, prompting Namco’s entry into the arcade scene, and it inspired Tomohiro Nishikado to develop Space Invaders in 1978, laying the groundwork for the entire shoot-’em-up genre.
Ralph Baer, who died in 2014, showcasing a prototype of the “Brown Box”, the first console, in 2009. Photo: Jens Wolf/AP
Tennis simulations also played a crucial role in the rise of home computer gaming in the 1980s. Games like Match Point on the ZX Spectrum and International Tennis on the Commodore 64 delivered an engaging, easy two-player experience, in contrast to the more complex football simulations. This accessibility drew in gamers, and Nintendo later capitalized on it with titles like Mario Tennis and Wii Sports, which became some of the most beloved sports games.
As consoles evolved, tennis games became staple titles across generations, often drawing in those new to gaming. While not boasting the flashy allure of soccer or basketball simulations, they maintained appeal for casual players. Titles such as Namco’s Smash Court Tennis, Codemasters’ Pete Sampras Tennis, 2K’s Top Spin, and Sega’s Virtua Tennis enriched the fundamental concept of rallying a ball over a net. Tennis uniquely offers a confined play area that provides extensive enjoyment, intricate skill mechanics, and an easily understood ruleset within a concentrated, single-screen environment.
Would the people waiting in line outside a scientific research establishment in Upton, New York, in the autumn of 1958 have queued up for a space blaster or a kung fu game? I doubt it; those concepts would have seemed alien and off-putting to many attendees. Consider Computer Space, the first commercial space shooter arcade game, released in 1971 and designed by Nolan Bushnell and Ted Dabney: it performed only modestly because its controls were overly complex and its abstract concept put people off, while Pong’s success transformed the gaming landscape. Tennis quietly became the gateway for video games, adeptly infiltrating homes and entertainment venues and creating a new cultural phenomenon.
What to Play
Retro treat… Worms Armageddon: Anniversary Edition. Photo: Team17
I’m inclined to recommend a tennis game—classics like Virtua Tennis or Top Spin 4 come to mind—but for a twist, consider Worms Armageddon: Anniversary Edition. This modern take on the beloved 1999 title is a chaotic, multiplayer turn-based game where players eliminate opponents using an arsenal that includes sheep launchers, banana bombs, and concrete donkeys.
It’s an absurdly entertaining experience, demanding profound tactical thought and mastery over angles and trajectories. The game also unlocks access to previous titles from the Mega Drive and Game Boy series—an excellent deal.
Available on: PS5, Switch, Xbox | Estimated playtime: 10 hours to 25 years
What to Read
£80 a pop… Mario Kart World. Photo: Nintendo
A pressing issue: video game prices continue to rise. The BBC has reported on consumer grievances over video game costs, with major titles now reaching up to £80. Increased production and development expenses contribute to the surge, yet attention should also be directed toward the hefty salaries of CEOs in certain parts of the industry.
Curious about how Metacritic operates? GamesIndustry.biz interviewed the founder to uncover the science behind score aggregation. Several intriguing discussions arose, including the practice of linking game publisher bonuses to the latest project metascore.
Certain sites, such as IGN, have covered recent comments from former Xbox executive Laura Fryer on the end of Xbox hardware. The announcement about Microsoft’s future strategies, including the ROG Xbox Ally X Handheld PC, sheds light on potential changes ahead.
I love your console… Sega Mega Drive. Photo: Keith Stuart/Guardian
This inquiry is from Johnny Biscuits:
“Five years ago, numerous media commentators claimed that the PS5/Xbox Series X would be the final generation of consoles. What is the current opinion?”
As mentioned, early Xbox employee Laura Fryer has suggested that Microsoft is winding down hardware development in favor of offering Xbox applications across various platforms. This shift is becoming increasingly evident, particularly with announcements like the ROG Xbox Ally and the Meta Quest 3S Xbox Edition, as well as Samsung integrating Xbox titles into its smart TVs. However, Microsoft recently announced a multi-year partnership with AMD that includes plans for “future Xbox consoles.” Conversely, Sony, lacking the extensive ecosystem available to Microsoft through Windows, recently reiterated its commitment to dedicated consoles, especially given the PS5’s sales of around 78 million units. Nintendo, meanwhile, has just launched the Switch 2, which surpassed 3.5 million units sold within its first four days.
In conclusion, I don’t anticipate dedicated gaming consoles disappearing anytime soon. They remain more cost-effective than assembling and maintaining a gaming PC while providing a more stable experience than streaming alternatives. After a five-year stretch in which digital access and streaming were prioritized over ownership, game consoles continue to be cherished objects, functional and nostalgic at once. It might seem unreasonable to cling to a bundle of plastic and circuitry, yet when that apparatus resembles the Mega Drive, Neo Geo, or PlayStation 5, it becomes more than a mere device: it becomes part of our entertainment culture.
If you have inquiries, feedback regarding the newsletter, or other comments, please reach out to pushbuttons@theguardian.com.
Modern disk galaxies frequently display distinct thin and thick disks. The mechanisms driving the formation of these two disks, and the timeline of their emergence, remain open questions. To investigate, astronomers examined a statistical sample of 111 edge-on disk galaxies spanning a range of epochs, reaching back up to 11 billion years, or roughly 2.8 billion years after the Big Bang, using archived data from the NASA/ESA/CSA James Webb Space Telescope.
Webb/NIRCam composite images of a quarter of the team’s samples, sorted by increasing redshift. Image credit: Tsukui et al., doi: 10.1093/mnras/staf604.
Present-day disk galaxies typically comprise an extended thick disk of older stars alongside a thin disk rich in younger stars.
For instance, the thick discs of the Milky Way reach approximately 3,000 light-years in height, while the thin discs are roughly 1,000 light-years thick.
But what mechanisms lead to the formation of this dual disk structure?
“The thickness of high-redshift disks provides unique measurements from the early universe that serve as benchmarks for theoretical research, and these measurements can only be made with Webb,” says Dr. Takafumi Tsukui, an astronomer at the Australian National University.
“Typically, older, thicker disk stars are dim, while the younger, thinner disk stars dominate the galaxy.”
“However, Webb’s exceptional resolution allows us to observe and highlight faint older stars, enabling us to distinguish between two disk structures in a galaxy and measure their thickness separately.”
Through an analysis of 111 edge-on targets over cosmological time, astronomers studied both single-disc and double-disc galaxies.
The findings indicate that galaxies initially form a thick disk, which is followed by the formation of a thin disk.
The timing of this process is contingent on the galaxy’s mass: high-mass, single-disk galaxies transitioned to two-disk structures around 8 billion years ago.
In contrast, a thin disk emerged about 4 billion years ago within low-mass, single-disk galaxies.
“This is the first time we’ve resolved a thin stellar disk at such a high redshift,” remarked Dr. Emily Wisnioski from the Australian National University.
“The novelty becomes evident when observing the onset of thin star disks.”
“It was astonishing to witness a thin star disk from 8 billion years ago, and even further back.”
To elucidate the transition from a single thick disk to a dual-disk structure, as well as the timing differences between high-mass and low-mass galaxies, the researchers expanded their investigation beyond the initial edge-on galaxy sample. They examined data on the motion of gas in galaxies from the Atacama Large Millimeter/submillimeter Array (ALMA) and from ground-based surveys.
By considering the movement of the galaxy’s gas disks, they found their results aligned with the “turbulent gas disk” scenario.
In this framework, the turbulent gas disks of the early universe catalyze intense star formation, leading to the creation of thick star disks.
As stars form, they stabilize the gas disks, diminishing turbulence and consequently resulting in thinner disks.
Larger galaxies can convert gas into stars more efficiently and thus calm down more quickly than their lower-mass counterparts, leading to the formation of the earlier thin disk.
“This study delineates structural differences between thin and thick discs, but we aim to explore further,” Dr. Tsukui mentioned.
“We look to incorporate the types of information typically acquired from nearby galaxies, such as stellar movement, age, and metallicity.”
“By doing so, we can bridge insights from both nearby and distant galaxies, enhancing our understanding of disk formation.”
Survey results were published in Monthly Notices of the Royal Astronomical Society.
____
Takafumi Tsukui et al. 2025. The emergence of thin and thick discs of galaxies across the history of the universe. MNRAS 540(4): 3493-3522; doi: 10.1093/mnras/staf604
Influence of Uterine Hormones on Human Brain Development
Peter Dazeley/Getty Images
The human brain stands as one of the universe’s most intricate structures, potentially shaped by the surge of hormones released by the placenta during pregnancy.
Numerous theories have emerged regarding the evolution of the human brain, yet it remains one of science’s greatest enigmas. The social brain hypothesis posits that our expansive brains evolved to navigate complicated social interactions. This suggests that managing dynamics in larger groups necessitates enhanced cognitive abilities, and that species with strong social inclinations require increased brain development. Comparable highly social animals, like dolphins and elephants, possess significant brain sizes too; however, the biological mechanisms linking these features are still unclear.
Recently, Alex Tsompanidis from the University of Cambridge and his team proposed that a placental sex hormone might be the key. The placenta, a temporary organ bridging the fetus and the mother, releases hormones crucial for fetal development, including sex hormones like estrogens and androgens.
“It may sound like a stretch, linking human evolution to the placenta,” notes Tsompanidis. “However, we’ve observed fluctuations in these hormone levels in utero and predicted outcomes regarding language and social development, among other areas.”
Recent studies indicate these hormones significantly impact brain development. For instance, a 2022 study revealed that administering androgens like testosterone to brain organoids—a simplified brain model derived from human stem cells—during crucial developmental stages led to an increased number of cortical cells and expansion in regions vital for memory and cognition. Other investigations involving brain organoids have highlighted the importance of estrogens in forming and solidifying neural connections.
Limited evidence suggests that humans experience greater exposure to these hormones during pregnancy compared to non-human primates. A 1983 study indicated that gorillas and chimpanzees excrete 4-5 times less estrogen than pregnant humans. Additionally, human placentas exhibit greater gene activity associated with aromatase—an enzyme converting androgens to estrogens—compared to macaques.
“These hormones appear crucial for brain development. Evidence indicates significantly elevated levels in humans, especially during pregnancy,” asserts Tsompanidis.
This influx of hormones may also clarify why humans form larger social networks. Some evolutionary biologists theorize that differences between sexes are subtler in humans than in other primates, fostering broader social connections. For instance, men and women exhibit greater size similarity in comparison to male and female Neanderthals, suggests Tsompanidis, likely a result of elevated estrogen levels in utero.
“High estrogen levels not only reduce masculinization but may also foster a more interconnected brain,” Tsompanidis explains. “Thus, the drive to elevate estrogen levels promotes social cohesion and interconnectedness, integral to human brain development.”
David Geary from the University of Missouri agrees that placental genes influence human brain development and its evolutionary path. However, he believes the significance of male-male competition in brain and cognitive evolution is often underestimated.
He notes that human males within the same groups tend to exhibit more coordination and less aggression compared to other primates—a trait that may have evolved due to intergroup conflicts. Enhanced teamwork and coordination could significantly benefit survival during life-threatening confrontations.
Our understanding of placental differences among primates remains limited. Many non-human primates, such as chimpanzees, consume their placenta post-birth, complicating research efforts, as Tsompanidis highlights.
Unraveling the factors that influenced human brain evolution is not merely an academic endeavor; it also brings insights into human nature.
“Not every human possesses extensive social or linguistic skills, and that’s perfectly acceptable—these traits don’t define humanity,” Tsompanidis remarks. Understanding the brain’s evolutionary journey can illuminate whether certain cognitive attributes come with trade-offs.
RNA is believed to have been crucial in the initiation of life
Shutterstock/nobeastsofierce
The quest to decipher how lifeless molecules might have sparked life has brought researchers a step closer to its goal. A team has developed a method using RNA molecules that can partially replicate themselves, suggesting that genuine self-replication could eventually be achieved.
RNA is a pivotal molecule in the discussion of life’s origins, as it can store information like DNA and catalyze reactions akin to proteins. While it performs neither function perfectly, this dual capability has led many scientists to theorize that life originated with self-replicating RNA molecules. “This was the molecule that governed biology,” says James Attwater at University College London.
Nonetheless, engineering self-replicating RNA molecules is a challenging task. RNA can form double helices similar to DNA, which can also be copied in a similar manner. By separating the two helices and adding RNA nucleotides to each strand, one could theoretically produce two identical helices. However, the binding between RNA strands is so strong that it complicates their separation for replication.
Recently, Attwater and his team found that trios of RNA nucleotides (triplets) bind tightly to the separated strands, preventing them from zipping back together. “Three is the sweet spot,” Attwater explains, noting that longer combinations are prone to errors. In their approach, the team mixed RNA-enzyme double helices with the triplet sequences.
By acidifying the solution and heating it to 80°C (176°F), the helices can be separated, allowing the triplets to bind to each strand. When the solution is then made alkaline and cooled to -7°C (19°F), the water begins to freeze, and the highly concentrated liquid that remains between the ice activates the RNA enzymes, which stitch the triplets together to form new strands.
Currently, researchers have succeeded in replicating RNA enzymes of up to 30 nucleotides in length from an original strand of 180 nucleotides. They believe that enhancing enzyme efficiency could lead to full replication.
Attwater highlights that this “very simple molecular system” possesses intriguing characteristics. One is the potential correlation between triplet RNA sequences and the triplet code that dictates protein sequences in modern cells. “There may be a connection between the biological mechanisms employed for RNA replication and the way RNA is utilized in present-day biology,” he explains.
Additionally, the team has identified that the triplet sequences most likely to facilitate replication exhibit the strongest bonding. This suggests that the earliest genetic code may have consisted of this set of triplets, which adds another layer of interest.
Researchers contend that the conditions required to support this process might naturally occur. Given the need for freshwater, it’s likely that such processes transpired on land within geothermal systems.
“The materials we see today can be found on Earth. Icelandic hot springs display a mixed pH, similar to what we use,” Attwater notes.
“RNA nucleotide triplets convey highly specific functional information in every cell,” remarks Zachary Adam from the University of Wisconsin-Madison. “This research is captivating as it may indicate a purely chemical role (rather than informational) for RNA nucleotide triplets that could predate the emergence of living cells.”
This discovery implies that the first animals to emerge from the oceans, around 400 million years ago, adapted to terrestrial life much more quickly than previously thought.
Stuart Sumida, a paleontologist from California State University, remarked, “I believed the transition from fins to limbs took more time.”
Before this, the oldest known reptile footprints were found in Canada and dated to 318 million years ago.
The ancient footprints were uncovered in sandstone slabs near Melbourne, revealing reptile-like feet with elongated toes and claws.
Scientists estimate that the creature was about 2.5 feet long (80 cm) and might resemble a modern monitor lizard. These findings were published on Wednesday in the journal Nature.
Co-author Per Ahlberg, a paleontologist at Uppsala University in Sweden, said the evidence includes the identification of claw marks around the footprints.
“It’s a walking animal,” he stated.
Located near Melbourne, Australia, these sandstone slabs preserve fossil footprints of reptile-like creatures that roamed approximately 350 million years ago. The footprints are highlighted in yellow (front feet) and blue (back feet) and record the movements of three similar animals, according to the researchers. Grzegorz Niedzwiedzki / Prof. Per Erik Ahlberg via AP
Only animals that evolved to live entirely on land developed the claws seen in these fossils. Earlier vertebrates, such as fish and amphibians, did not have hard claws and depended on aquatic environments for laying eggs.
In contrast, branches of the evolutionary tree leading to modern reptiles, birds, and mammals, known as amniotes, developed feet equipped with claws suited for traversing dry ground.
Sumida commented, “This is the earliest evidence we’ve encountered of animals with claws.”
During the time these ancient reptiles existed, the environment was warm and humid, with expansive forests beginning to take shape. Australia was then part of the supercontinent Gondwana.
The fossil footprints tell the story of a single day, Ahlberg explained. One reptile scampered across the ground before a light rain fell, and some raindrops partly obscured its tracks. Then two more reptiles ran through in opposite directions before the ground hardened and was covered with sediment.
Co-author John Long, a paleontologist at Flinders University in Australia, stated:
Welcome to The Long Wave. A peace agreement is being negotiated for the Democratic Republic of the Congo this week, after three months of intense conflict. I spoke with East Africa correspondent Carlos Mureithi about the situation, its rapid escalation, and the prospects for peace.
Echoes of the 90s
Flag bearer… demonstrators scale a monument during an anti-government protest in Bukavu in February. Photo: Luis Tato/AFP/Getty Images
In late January, in a swift and shocking turn of events, the M23 militia group captured Goma, one of the largest cities in the Democratic Republic of the Congo. Weeks later, the rebels took control of Bukavu, another strategically important city, repelling attempts by Congolese troops to halt their advance. The M23’s rapid mobilization and territorial gains are rooted in decades of political and economic strife.
Carlos highlights that the conflict’s origins trace back to the 1994 genocide in Rwanda. Millions of refugees crossed from Rwanda into the DRC, and Hutu and Tutsi factions remain driven by ethnic narratives.
The M23 is primarily led by Tutsis, an ethnic group that took up arms more than a decade ago and has fought numerous skirmishes since. The group justifies its military actions as necessary to protect minority communities from ongoing threats and marginalization, given that hundreds of thousands of Tutsis were slaughtered during the genocide by Hutu extremists.
Carlos notes that despite the longstanding conflict, the M23’s advances this year signify a new level of intensity, having made significant territorial gains in a remarkably short period. “This year, [the fighting is] the worst we’ve encountered.”
Conflict Minerals
Heavy Metal… the DRC is the leading producer of cobalt, accounting for over 70% of global production last year. Photo: Junior Kanna/AFP/Getty Images
The M23’s advancements represent a grave infringement on the sovereignty of the DRC, a situation exacerbated by the Rwandan government’s support for the rebel group. “Rwanda denies any involvement; however, according to the United Nations and the international community, Rwanda is financially backing the M23,” Carlos pointed out. The Rwandan government claims its assistance is limited to “protecting targeted Tutsis from genocide,” Carlos added.
However, local experts suggest that Rwanda has heavily invested in maintaining proxy control over parts of the DRC, driven not only by overlapping ethnic groups but also by the rich natural resources the DRC possesses. Often referred to as conflict minerals, these resources have fueled avarice and perpetuated military strife in Eastern and Southern DRC.
It is noteworthy that these regions are seldom discussed in light of their extraordinary beauty, showcasing unique landscapes of stunning red and orange hills, lakes, and fertile soil. Caught in the crossfire of political and economic aspirations, the area has become a battleground for ethnic and commercial conflicts. While the tensions initially stemmed from community disputes, minerals have since played a crucial role, according to Carlos.
These minerals are extremely abundant in the DRC; essential in modern technology. Cobalt, lithium, and coltan are vital components for lithium-ion batteries used in smartphones, laptops, and electric vehicles. The DRC accounts for an astonishing 60-70% of the global supply of these minerals. Carlos emphasizes the extensive resources being allocated for the capture and trade of these natural riches.
A Surge of Violence
Uprooted… individuals fleeing the conflict arrive by handmade boats near Minova, South Kivu province of DRC. Photo: Alexis Huguet/AFP/Getty Images
“Rapid and brutal” is how Carlos describes the events of recent months. He underscores that this is merely the latest chapter in the ongoing strife, which has resulted in one of the world’s largest humanitarian crises. Since 1996, the conflict has led to over 6 million fatalities and displaced a similar number of individuals both within and outside the DRC.
In March, Carlos visited Cibitoke in Burundi, the DRC’s neighbor and a primary destination for refugees fleeing the violence. Those who escaped shared “truly horrifying experiences.”
As the M23 advanced through southeastern DRC, refugees witnessed numerous Congolese soldiers deserting. Carlos remarked: “To illustrate the dire situation, these soldiers told civilians, ‘We are fleeing from the M23. We are outmatched. You should consider leaving this town if you can.’”
Those who managed to escape, carrying whatever they could, reached the Burundi border, navigating a perilous river along the way. “The Congolese army appeared utterly powerless. It was a desperate situation.”
Eastern DRC – Remote Regions of a Vast Country
The sunsets of Bulambo, DRC… the shadows of warfare loom in a nation celebrated for its diverse wildlife and landscapes. Photo: Pietro Olivetta/Getty Images/500px
A unique aspect of the DRC conflict is its localization, which may partly explain the tepid response from the Army and local security forces. The capital, Kinshasa, feels worlds apart from Goma, situated a 47-hour drive and ferry journey away. Refugees shared a common sentiment: they attributed their plight to the government, feeling that Kinshasa has neglected the Eastern DRC.
The government does operate in the region, but it is made up of numerous political factions that exploit mineral resources and award contracts to foreign companies. Carlos says many believe that as long as politicians can keep profiting from the region, the conflict will persist.
The Prospect of Peace
Peace process… Qatar’s emir speaks with the Rwandan president (left) and his Congolese counterpart in Doha last month. Photo: Mofa Qatar/AFP/Getty Images
This situation may be shifting, as the M23’s advances pose a threat to the stability of Kinshasa. Carlos mentioned that just weeks ago, the warring parties were inclined to engage in dialogue. Initial discussions held in Doha yielded promises from both sides to produce a preliminary peace agreement. The Trump administration has also shown interest, expressing a strong desire to sponsor peace negotiations.
According to Carlos, these discussions are among the most hopeful in recent years. An end to hostilities is urgently needed and would be welcome, but any resolution will remain fragile if the region simply slides back into the current chaos. The key to durable peace lies in lifting the DRC out of its historical and geographical entanglements.
Subscribe to receive the complete version of Long Wave directly to your inbox every Wednesday.
Malus is a genus comprising over 35 species that thrive in the temperate Northern Hemisphere, spanning regions from East Asia to Europe and North America. This genus includes the cultivated apple, Malus domestica, along with its wild relatives. Recent research has unveiled the evolutionary connections among Malus species and traced their genetic development over the past 60 million years.
Malus evolutionary landscape informed by phylogenetics. Image credits: Li et al., doi: 10.1038/s41588-025-02166-6.
“There are around 35 species within the Malus genus; however, despite the significance of apples as a fruit crop, comprehensive research on the evolution of this group’s genome has been lacking.”
“This study provided insights into the Malus genome, established the apple family tree, documented genomic events including whole-genome duplication and hybridization among species, and identified genomic regions linked to specific traits, such as resistance to apple scab disease.”
Professor Ma and his team assembled genomes for 30 species across the Malus genus, which includes cultivated varieties such as Golden Delicious, by sequencing their DNA.
Among the 30 species, 20 are diploid, meaning they possess two chromosome copies per set, similar to humans, while 10 are polyploid, indicating they have three or four chromosome copies, likely resulting from recent hybridization with diploid relatives of Malus.
By scrutinizing nearly 1,000 gene sequences across these species, researchers constructed a phylogenetic tree for the genus and employed biogeographical analysis to trace its origins back to Asia approximately 56 million years ago.
“The evolutionary narrative of the genus is intricate, showcasing numerous instances of hybridization among species and shared whole-genome events that complicate comparisons,” stated Professor Ma.
“Access to high-quality genomes for a large number of species within the genus has enabled us to explore how Malus evolved and the interrelationships among these species.”
Further research into the evolutionary history of Malus genomes utilized analytical techniques called pan-genomics.
This methodology encompasses a thorough comparison of conserved genes and so-called ‘jumping genes’ that can move within the genome across the 30 species, along with genes found only in a subset of the genomes.
The analysis of pan-gene dynamics benefited greatly from the use of a pangenome graph tool, which amalgamates genomic data from closely related groups to elucidate evolutionary conservation and divergence.
“Utilizing 30 pangenomes significantly aided in identifying structural variations, gene duplications, and rearrangements among species that could have been missed with fewer genome comparisons,” remarked Professor Ma.
“Notably, one structural variant uncovered allowed us to pinpoint genomic segments related to apple scab resistance, a fungal disease impacting apples globally.”
The researchers also developed a pangenome analysis tool designed to detect evidence of selective sweeps, a process whereby advantageous traits rapidly increase in frequency within a population.
With this approach, they pinpointed genomic regions linked to cold and disease resistance in wild Malus species, which might also correlate with undesirable fruit taste.
“Attempts to cultivate the best flavor in fruit may have inadvertently diminished the hardiness of cultivated apple varieties,” noted Professor Ma.
Understanding structural variations in Malus, including hybridization histories, interspecies relationships, and pangenomic insights can inform future breeding strategies aimed at retaining both flavor and disease-resistant traits in apples.
W. Li et al. 2025. Pangenome analysis reveals evolution and diversity in Malus. Nat Genet, published online April 16, 2025; doi: 10.1038/s41588-025-02166-6
Four decades ago, a four-drawer filing cabinet was needed to house 10,000 documents. Later, 736 floppy disks could hold the same volume of files. Today, the cloud stores 10,000 documents without occupying any physical space.
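As a rough sanity check on that comparison, here is a back-of-the-envelope sketch. The average document size is an assumption made for illustration; only the 736-disk figure comes from the article.

```python
# Back-of-the-envelope check of the storage comparison above. The 100 KB average
# document size is an assumed figure; 1.44 MB is the nominal capacity of a
# 3.5-inch high-density floppy disk.
DOCS = 10_000
AVG_DOC_KB = 100            # assumed average size of one document, in kilobytes
FLOPPY_KB = 1_440           # nominal capacity of a 3.5-inch high-density floppy

total_kb = DOCS * AVG_DOC_KB
floppies_needed = -(-total_kb // FLOPPY_KB)   # ceiling division

print(f"{total_kb / 1_000_000:.1f} GB of documents fits on roughly {floppies_needed} floppy disks")
# With these assumptions the result (about 700 disks) lands in the same ballpark
# as the 736 disks quoted above; the exact count depends on the assumed document size.
```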
With the evolution of data storage comes a transformation in the information landscape. This evolution poses challenges related to the storage, transfer, and proper utilization of individuals’ personal data.
The Information Commissioner’s Office (ICO) organized an exhibition at the Manchester Central Library this week, showcasing 40 items that demonstrate the evolution of data privacy. Each item illustrates how access to information has changed over the past four decades and how data has become pivotal in major news events.
John Edwards, the UK information commissioner, expressed his appreciation for the exhibition, emphasizing the human element in data-related matters. He highlighted the significance of understanding terms like data controller, data processor, and data subject.
The exhibition features various items, including Pokemon toys, floppy disks, Tesco Club cards, modems, Millennium bug brochures, soccer shirts, and Covid vaccination cards. It also showcases how ICO interventions have brought about societal changes, such as ending the construction industry’s “employment denial list” and implementing public food hygiene assessments for restaurants.
One of Edwards’ favorite exhibition items is the spiked lawn aerator shoes, symbolizing an early enforcement action in the 1980s against a company selling customer information obtained from shoe sales.
Information commissioner John Edwards’ favourite item at the exhibition: the spiked lawn aerator shoes. Photo: Christopher Thomond/The Guardian
The 40th pedestal at the exhibition remains unused, inviting the public to suggest objects that have influenced the data landscape. Edwards emphasized the personal and subjective nature of privacy, stating that each individual has unique expectations and experiences.
The ICO was founded as a UK data protection regulator near Manchester 40 years ago and now oversees new data protection laws. The regulatory landscape has undergone significant transformations since its inception.
NHS Covid Vaccination Card. Photo: Andy Rain/EPA
According to Edwards, far more personal data about individuals now exists worldwide than when the ICO was established. The constant flow of data around the globe illustrates the extensive data environment we now live in.
Edwards highlighted the challenge of keeping pace with the rapid changes in technology and data usage. The ICO regulates a wide range of entities, from small schools and GP surgeries to large social media companies, requiring continuous adaptation to address privacy implications.
Reflecting on the future, Edwards acknowledged the uncertain geopolitical landscape, emphasizing the potential impact of quantum computing and advanced AI technologies on data handling and privacy in the coming years.
Ralph Holloway, a pioneering anthropologist who emphasized the importance of changes in brain structure in human evolution, died on March 12 at his Manhattan home at the age of 90.
His death was announced by the Department of Anthropology at Columbia University, where he had been a professor for nearly 50 years.
Holloway’s theory challenged the notion that brain size alone distinguished humans from apes and early ancestors, highlighting the significance of brain organization.
Although no brains from millions of years ago exist, Dr. Holloway focused on creating fossil skull endocasts from latex to overcome this limitation.
In a 2008 paper, he detailed how he obtained information from these casts, providing insight into brain structure by examining the outer edges of the brain.
Using endocasts, Dr. Holloway concluded that the fossil skull from the Taung quarry in South Africa belonged to an early human ancestor, supporting Raymond Dart’s controversial claim about the Taung Child.
His meticulous research included studying natural endocasts found in the quarry to validate his conclusions, emphasizing the importance of independent investigation in scientific discovery.
Dr. Holloway’s focus on the position of the lunate sulcus, a groove toward the back of the endocast, provided evidence of a more human-like brain organization, confirming the accuracy of Dr. Dart’s initial findings.
The contentious debate surrounding the Taung Child findings has since subsided, with Dr. Holloway’s and Dr. Dart’s conclusions about the lunate sulcus now widely accepted in the scientific community.
Dr. Holloway’s emphasis on brain structure over volume played a pivotal role in validating human ancestry, highlighting the significance of reorganization in evolutionary development.
Throughout his career, Dr. Holloway’s dedication to studying brain evolution through three-dimensional modeling remained unwavering, emphasizing the importance of understanding the human brain’s journey to its current complexity.
His contributions, such as his work on the Taung Child, continue to shape our understanding of human origins and evolution.
Dr. Holloway’s legacy extends beyond his scientific achievements. His commitment to rigorous research, innovative methods, and interdisciplinary collaboration sets a standard for future generations of anthropologists, and his groundbreaking work on brain evolution continues to shape our understanding of human origins and the complexities of brain development.
This supernova may have occurred in the Upper Centaurus-Lupus association, a group of massive stars about 457 light-years from Earth.
Illustration of an exoplanet like Earth after X-ray radiation exposure. Image credit: NASA/CXC/M. Weiss.
Life on Earth is constantly evolving under continuous exposure to ionizing radiation from both terrestrial and cosmic origins.
The radioactivity in the bedrock gradually decreases over timescales of billions of years, but the level of cosmic radiation fluctuates as the solar system moves through the Milky Way.
Nearby supernova activity could increase the level of radiation on the Earth’s surface by several orders of magnitude, which is expected to have a major impact on the evolution of life.
In particular, radiation levels increase as the solar system passes near large groups of stars known as OB associations.
The winds from these massive star factories are expected to inflate superbubbles of high-temperature plasma, and most core-collapse supernova explosions take place within OB associations.
The solar system entered one such superbubble, known as the Local Bubble, about 6 million years ago, and is now close to its centre.
“Earth entered the Local Bubble and passed through its dust-rich edge about 6.5 million years ago, seeding the planet with iron-60, a radioactive isotope of iron produced by exploding stars,” wrote astronomer Caitlyn Nojiri and colleagues at the University of California, Santa Cruz.
“Then, 2-3 million years ago, one of our neighboring stars exploded with incredible force, delivering another cohort of radioactive iron to the planet.”
When Nojiri and her co-authors simulated the supernova, they found that it bombarded Earth with cosmic rays for about 100,000 years after the explosion.
The model matched previously recorded spikes of radiation that struck Earth around that time.
“We’ve seen from other papers that radiation can damage DNA,” Nojiri said.
“It could be an evolutionary change in the cell or an accelerated mutation.”
Meanwhile, the authors came across research into viral diversity in one of the Rift Valley lakes in Africa.
“I can’t say they’re connected, but there are similar time frames,” Nojiri said.
“We found it interesting that the virus’s diversification is increasing.”
The study was published in the Astrophysical Journal Letters.
____
Caitlyn Nojiri et al. 2025. Bubble Life: How a nearby supernova left short-lived marks on the cosmic-ray spectrum and an indelible trace on life. ApJL 979, L18; doi: 10.3847/2041-8213/ada27a
Fossils of Jurassic birds excavated in southeastern China have a significant impact on our understanding of bird evolution, according to researchers.
The recently discovered Baminornis zhenghensis, a bird the size of a quail, flew through the skies approximately 150 million years ago, during the Jurassic period. A study about it was published on Wednesday in the journal Nature. It is one of the oldest known birds, of a similar age to the iconic Archaeopteryx found in Germany in the 1860s.
“For over 150 years, Archaeopteryx has stood alone,” said Steve Brusatte, a paleontologist at the University of Edinburgh who wrote a commentary accompanying the research.
“All this time, it has remained the sole undisputed bird fossil from the Jurassic era,” he told NBC News via email.
Other bird-like Jurassic fossils had been found, but there remained a “significant mystery and frustrating gap” in the fossil record, according to Brusatte: where were the other birds’ fossils?
The 2023 discovery of Baminornis in Zhenghe County, Fujian Province, China, filled that gap. Brusatte called it one of the most important finds since Archaeopteryx, labeling it “the second undisputed bird from the Jurassic era.”
Unlike Archaeopteryx, a half-bird, half-reptile with a long, thin tail resembling a velociraptor’s, Baminornis had a short tail in which some vertebrae fused into a short, sturdy nub, shifting the body’s center of mass toward the wings, as in modern birds, for better flight.
Until the discovery of Baminornis, such short tails were only known in birds that lived around 20 million years later, such as Eoconfuciusornis and Protopteryx.
Brusatte expressed excitement that Baminornis was more advanced than Archaeopteryx and could fly much better.
Baminornis was anatomically more advanced than Archaeopteryx, a “primitive” bird whose claws and sharp teeth resembled those of its dinosaur ancestors.
The discovery of two similarly aged birds about 5,500 miles apart led the research team behind the Nature paper to conclude that bird evolution began millions of years earlier than previously thought.
In addition to dozens of fossils of aquatic or semi-aquatic animals, the Zhenghe Fauna collection included at least three avialan fossils, enriching our understanding of early bird diversification and filling important gaps in the evolutionary history of terrestrial ecosystems toward the end of the Jurassic, according to the researchers.
Baminornis fossils preserved most of the skeleton, but the wings were not preserved, leaving questions about their size and wing structure. The lack of a skull also limits clues regarding their diet.
“Nevertheless, Baminornis suggests that various birds lived during the Jurassic period and flew in different ways,” Brusatte said.
Paleontologists have excavated fossilized remains of two Jurassic bird species in Zhenghe County, Fujian Province, southeastern China. These 149-million-year-old fossils exhibit early appearances of highly derived bird characteristics and, together with fossils of another bird from the same region, suggest that birds originated earlier than thought and had already begun to radiate by the end of the Jurassic.
“Birds are the most diverse group of terrestrial vertebrates,” Professor Min Wang of the Institute of Vertebrate Paleontology and Paleoanthropology at the Chinese Academy of Sciences and colleagues said in a statement.
“Specific macroevolutionary studies suggest that their early diversification dates back to the Jurassic period.”
“However, the earliest evolutionary history of birds has long been obscured by a highly fragmentary fossil record, with Archaeopteryx being the only widely accepted Jurassic bird.”
“Nevertheless, although Archaeopteryx had feathered wings, it closely resembled reptiles, particularly in its distinctive long reptilian tail. This is in stark contrast to the short-tailed body plan of modern and Cretaceous birds.”
“Recent research has even questioned the avialan status of Archaeopteryx, classifying it instead as a deinonychosaurian dinosaur, a sister group of birds.”
“This raises the question of whether there is a clear record of Jurassic birds.”
In their new study, Professor Wang and co-authors discovered and investigated two early bird fossils that were part of the so-called Zhenghe Biota.
One of these birds, named Baminornis zhenghensis, is the earliest known short-tailed bird.
“In Baminornis zhenghensis, the short tail ends in a compound bone called a pygostyle, a characteristic also observed in living birds,” the paleontologists said.
“Previously, the oldest record of short-tailed birds was from the early Cretaceous period.”
“Baminornis zhenghensis is the only Jurassic and the oldest short-tailed bird ever discovered, pushing back the appearance of this distinctive derived bird feature by nearly 20 million years.”
According to the team, Baminornis zhenghensis also represents one of the oldest known birds.
“Setting aside the phylogenetic uncertainty surrounding Archaeopteryx, there is no doubt that Baminornis zhenghensis is a bird,” said Dr. Zhonghe Zhou of the Institute of Vertebrate Paleontology and Paleoanthropology at the Chinese Academy of Sciences.
The second unnamed bird is represented by a single fossilized fullcula (wishbone).
“Our results support assigning this furcula to Ornithuromorpha, a diverse group of Cretaceous birds,” the researchers said.
The team’s work was published today in the journal Nature.
____
R. Chen et al. 2025. The earliest short-tailed bird from the Late Jurassic of China. Nature 638, 441-448; doi: 10.1038/s41586-024-08410-z
Today, clothes are a means of self-expression and group identity, and we do not go without them.
Photo of Martin Parr/Magnum
Venus figurines are most famous for their sexual characteristics. These sculptures of women’s bodies, often with exaggerated forms, were made roughly 30,000 to 20,000 years ago and have been interpreted as fertility charms, mother goddesses, and self-portraits. They are not generally seen as fashion plates. Yet some of them offer a glimpse of Stone Age attire. One from Kostenki in Russia wears a strapped garment. Others appear to wear string skirts. And the famous Venus of Willendorf wears a woven hat, and a very fine hat at that.
These statues are far removed from our usual image of people in the past draped in animal furs. According to archaeologists such as Olga Soffer, professor emerita at the University of Illinois Urbana-Champaign, the lavish detail given to their clothing signals how important dress already was tens of thousands of years ago. What started as a necessity to keep people warm became a canvas for aesthetic expression and meaning. Now, thanks to some new discoveries, more of the story of how that happened can be told.
Clothing rots easily, and the oldest surviving garments are only around 10,000 years old. However, as the Venus figurines show, we can trace clothing further back in time in other ways. These archaeological clues have revealed the origins of both the simple cape and complex tailoring…
If dinosaurs really did first appear near the equator, life there would have been particularly hot and dry.
Mark Whitton/Natural History Museum Trustees
Dinosaurs may have first evolved near the equator, rather than far south in the Southern Hemisphere as previously thought. Modeling studies suggest they originated in areas covering what is now the Amazon rainforest, the Congo Basin, and the Sahara Desert.
“Given the gaps in the fossil record and in the evolutionary tree of dinosaurs, it is very likely that this is the central point of dinosaur origin,” says Joel Heath at University College London.
Dinosaurs evolved during the Triassic period, which lasted from 252 million to 201 million years ago, but there is “considerable” uncertainty about when and where they evolved, Heath said. The oldest known fossils of these animals are about 230 million years old, yet they are distinct enough to suggest that dinosaurs had already been around for millions of years. “There must have been a lot going on in terms of dinosaur evolution, but we don’t have the fossils,” he says.
At this time, the Earth looked very different. All the continents were combined into a single supercontinent called Pangea, shaped like a C with its center straddling the equator. South America and Africa were located in this southern hemisphere part and were fitted together like pieces of a jigsaw puzzle. The earliest known dinosaurs lived in the southern parts of these two continents, in present-day Argentina and Zimbabwe, where dinosaurs were thought to have originated.
To learn more, Heath and his colleagues built a computer model that works backwards in time from the oldest known dinosaur fossils to the group’s origins. They accounted for uncertainties such as gaps in the fossil record, possible geographic barriers, and ongoing questions about how the earliest dinosaurs were related to each other, creating dozens of versions of the model.
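As a toy illustration of what “working backwards in time” means here (this is not the team’s actual model, and the species and regions below are hypothetical), a single parsimony pass over a tiny tree already shows how an ancestor’s region can be inferred from where its descendants are found:

```python
# Toy illustration only, not the study's model. It shows the core idea of working
# backwards in time: inferring an ancestor's region from where its descendants are
# found, via one Fitch-parsimony pass over a made-up three-species tree.
# The species labels and regions are hypothetical.

def fitch(left: set, right: set) -> set:
    """Combine two child state sets: intersection if they overlap, else union."""
    common = left & right
    return common if common else left | right

# Hypothetical tree (A, (B, C)) with the region each tip species is known from.
tip_region = {"A": {"south"}, "B": {"equator"}, "C": {"equator"}}

inner = fitch(tip_region["B"], tip_region["C"])   # ancestor of B and C -> {"equator"}
root = fitch(tip_region["A"], inner)              # root -> {"south", "equator"}: ambiguous

print("ancestor of B and C:", inner)
print("root ancestor:", root)

# Real analyses replace this single pass with probabilistic reconstructions, fold in
# fossil ages, sampling gaps and geographic barriers, and rerun the exercise over many
# alternative trees, which is why the researchers produced dozens of model versions.
```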
Most of these simulations concluded that dinosaurs first appeared near the equator, with only a few supporting a southern origin.
Paleontologists have tended to believe that dinosaurs couldn't have originated near the equator, Heath said. One reason for this is that no early dinosaur fossils have been found in the area. Moreover, it was a difficult place to live. “It was very, very dry and very hot,” he says. “It is believed that dinosaurs could not have survived in such conditions.”
However, most of the models suggest otherwise. “This suggests something that we didn’t really think was possible until now,” Heath says.
In fact, there may be a more prosaic explanation for the lack of early dinosaur fossils found near the equator. Paleontologists tend to conduct excavations in North America, Europe, and more recently China. “There are many areas of the planet that are completely ignored,” says Heath. He added that geologists have not found many rocks of suitable age in the area associated with the findings that can be excavated. “It may not be exposed in a way that we can easily investigate.”
But evidence supporting Heath's idea has recently come to light. On January 8th, researchers led by David Lovelace at the University of Wisconsin-Madison described the oldest known dinosaur from northern Pangea. The species new to science, Ahvaytum bahndooiveche, is a sauropodomorph, a relative of long-necked dinosaurs such as Diplodocus that evolved later. The team found its remains in 230-million-year-old rock in Wyoming's Popo Agie Formation.
If dinosaurs were already present in both the northern and southern parts of Pangea that long ago, there's no way the equatorial middle of the supercontinent was closed off to them, says Heath. “They must have been crossing the area.”
Cristi Thomas called 911 for the second time on a warm October day, but when she couldn’t get through, she began to panic. She watched anxiously as a plume of black smoke grew over a rural community in central California.
Just then, she heard a familiar ping.
Watch Duty, an app that warns users of wildfire risk and provides critical information during a fire, was already tracking the blaze. She relaxed. The cavalry was coming.
“I can’t describe the sigh of relief,” she said, recalling the moments after sirens wailed through her neighborhood and helicopters roared overhead. “We saw it happen, so we had questions, and Watch Duty answered them all.”
Thomas is one of millions of Watch Duty evangelists who have helped the app spread rapidly. The organization has existed for only three years, yet it now boasts up to 7.2 million active users and, at peak times, receives up to 512 million page views. For a nonprofit run mostly by volunteers, those numbers are impressive even by startup standards. But they are not surprising.
Watch Duty has changed the lives of people in fire-prone areas. When the skies darken and ash fills the air, users no longer have to scramble for information; they can rely on the app for fast, accurate updates, free of charge.
The app provides access to critical information about where the danger is, including fire perimeters, evacuation zones, and maps of evacuation centers. Users can find wildfire camera feeds, track aircraft locations, and see wind data all in one place. It can also help people work out when there is little need for alarm, when risks have subsided, and which agencies are active on the ground.
“This app isn’t just about alerts, it’s about your state of mind,” said Watch Duty CEO John Mills. The Silicon Valley alumnus founded the organization after moving from San Francisco to a large, fire-prone ranch in Sonoma County. After starting in just four California counties, Watch Duty covered the entire state in its first year and quickly expanded from the American West to Hawaii.
The community has grown to reach people in 14 states by 2024, and Mills says new features and improved accuracy have made the app more popular and filled an unmet need.
It’s not just residents who have become reliant on the app in recent years. A variety of responders, from firefighters to city officials to journalists, also log on to make sure key stakeholders are on the same page.
“People thank me for Watch Duty, but I’m like, ‘You’re welcome, I’m sorry you need it,’” Mills said. But it’s clear that the need is real. In each new area the app has served, word of mouth has driven adoption.
“We didn’t spend any money on marketing,” Mills said. “We just let the genie out of the bottle, and things are not going back to the way they were.”
Cal Fire supervisors watch over the Line Fire in San Bernardino County, California, in September 2024. Photo: Jon Putman/SOPA Images/Rex/Shutterstock