New research reveals that burn injuries have significantly influenced the rapid evolution of humans.
Scientists from Imperial College London assert that our close relationship with fire has made our ancestors more resilient to burn injuries compared to other animals. This ongoing exposure to fire may have fundamentally shaped our wound healing processes and immune responses, leaving an indelible impact on our genetic makeup.
Study author Professor Armand Leroi, an evolutionary biologist at Imperial’s School of Life Sciences, states, “The concept of burn selection introduces a novel form of natural selection that is influenced by cultural factors.” He emphasizes, “This adds a new dimension to the narrative of what defines humanity, something we were previously unaware of.”
While minor burns typically heal swiftly, severe burns that take longer to mend can permit bacterial invasion, escalating the risk of infection.
Researchers hypothesize that these challenges prompted crucial genetic adaptations, leading evolution to favor traits that enhance survival after burn injuries. For instance, this includes accelerated inflammation responses and enhanced wound closure mechanisms.
Published in the journal BioEssays, the study contrasts human genomic data with that of other primates. Findings indicate that genes related to burn and wound healing exhibit accelerated evolution in humans, with increased mutations observed in these genes. These evolutionary changes are believed to have resulted in a thicker dermal layer of human skin and deeper placement of hair follicles and sweat glands.
However, the study suggests a trade-off; while amplified inflammation is beneficial for healing minor to moderate burns, it can exacerbate damage in cases of severe burns. More specifically, extreme inflammation from serious burns can lead to scarring and, in some instances, organ failure.
This research may shed light on why some individuals heal effectively while others struggle after burn-related injuries, potentially enhancing treatment methodologies for burns and scars.
Prince Kyei Baffour, a burn specialist and lecturer at Leeds Beckett University who was not part of the study, told BBC Science Focus: “This field remains underexplored and represents a burgeoning area of research regarding burn injury responses.”
Baffour recommends further investigations into various forms of fire exposure, including smoke inhalation.
Homo sapiens and Neanderthals likely interbred across a vast region, extending from Western Europe to Asia.
Modern humans (Homo sapiens) and Neanderthals (Homo neanderthalensis) interbred, and most non-Africans today carry Neanderthal DNA amounting to roughly 2% of their genome. Gene flow ran in both directions: earlier interbreeding introduced a Homo sapiens Y chromosome into Neanderthals, which eventually replaced their original lineage.
Despite increasing knowledge about the timing of this hybridization, the specific regions and scales of these interactions long remained a mystery. Ancestors of Neanderthals departed Africa around 600,000 years ago, migrating toward Europe and Western Asia. The first evidence of Homo sapiens moving from Africa includes skeletal remains from sites in modern-day Israel and Greece, dating to approximately 200,000 years ago.
Evidence suggests that Homo sapiens contributed genetically to the Neanderthal population in the Altai Mountains around 100,000 years ago. However, the primary wave of migration from Africa occurred around 60,000 years ago. Recent studies utilizing ancient genomic data indicate that the main episode of gene flow between Homo sapiens and Neanderthals began around 50,000 years ago and lasted for several thousand years.
This interaction is thought to have primarily taken place in the eastern Mediterranean, although pinpointing the exact locations remains challenging.
To investigate, Mathias Currat and his team from the University of Geneva analyzed 4,147 ancient genetic samples from over 1,200 locations, with the oldest dating back approximately 44,000 years. They studied the frequency of genetic mutations (introgressed alleles) originating from Neanderthal DNA that were passed down through hybridization.
“Our objective was to use Neanderthal DNA integration patterns in ancient human genomes to determine the sites of hybridization,” Currat explains.
Findings revealed that the proportion of transferred DNA increased gradually as one moved away from the eastern Mediterranean region, plateauing approximately 3,900 kilometers westward into Europe and eastward into Asia.
“We were surprised to identify a distinct pattern of increasing introgression rates in the human genome, likely linked to human expansion from Africa,” Currat notes. “This increase toward Europe and East Asia allows us to estimate the parameters of this hybrid zone.”
Computer simulations showed a hybrid zone potentially spanning much of Europe and the eastern Mediterranean, extending into western Asia.
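The pattern described above, with introgression rising with distance from the contact region before levelling off, can be illustrated with a toy model. The piecewise-linear form and all parameter values below are illustrative assumptions for exposition only, not the study’s fitted model:

```python
# Toy illustration of an introgression cline: the fraction of
# Neanderthal-derived alleles rises with distance from the presumed
# contact region and plateaus, here at ~3,900 km, near the ~2% figure
# reported for non-Africans. All numbers are illustrative, not the
# study's fitted values.

def introgression_fraction(distance_km, plateau_km=3900, max_fraction=0.02):
    """Piecewise-linear cline: linear rise with distance, then a plateau."""
    if distance_km <= 0:
        return 0.0
    if distance_km >= plateau_km:
        return max_fraction
    return max_fraction * distance_km / plateau_km

# Sample the cline at a few distances from the eastern Mediterranean.
for d in (0, 1000, 3900, 6000):
    print(f"{d:>5} km -> {introgression_fraction(d):.4f}")
```

A real analysis would fit such a cline (and richer spatial models) to allele frequencies in dated ancient genomes, but the qualitative shape, a gradient that saturates far from the contact zone, is the signature the team reports.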
Interbreeding Zone between Neanderthals and Homo sapiens
Lionel N. Di Santo et al. 2026
“Our findings suggest a continuous series of interbreeding events across both space and time,” notes Currat. “However, the specifics of mating occurrences in this hybrid zone remain unknown.”
This hybrid zone encompasses nearly all known Neanderthal remains found across Western Eurasia, with the exception of the Altai region.
“The extensive geographical breadth of the putative hybrid zone suggests widespread interactions among populations,” states Leonardo Iasi from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.
Notably, the Atlantic periphery—including western France and much of the Iberian Peninsula—was not part of the hybrid zone, despite the established presence of Neanderthals in these regions. Currat suggests that interbreeding may not have occurred here or may not be reflected in the analyzed genetic samples.
“This study reveals ongoing interactions between modern humans and Neanderthals over extensive geographical areas and extended periods,” adds Iasi. The hybrid zone may extend further, though limited ancient DNA sampling in regions like the Arabian Peninsula complicates assessment of its reach.
“This pivotal research challenges the notion that interbreeding occurred only in one area of West Asia with a singular Neanderthal population (not represented in existing genetic samples). Homo sapiens appear to have dispersed from Africa in increasing numbers across expanding territories, likely outcompeting smaller Neanderthal groups they encountered throughout most of the recognized Neanderthal range,” comments Chris Stringer from the Natural History Museum in London.
Approximately 2% of the world’s fish species, or about 500 species, are known to change sex at some point during their adult life.
Some species, like the black-spotted fish (as shown above), switch from female to male periodically. Others, such as clownfish, can change from male to female, while species like coral-dwelling gobies switch genders based on environmental conditions.
This phenomenon is distinct in fish because, unlike mammals and birds, many fish species do not have their sex determined by sex chromosomes.
Environmental cues trigger changes in gene activity, influencing the production of essential hormones and enzymes. A key enzyme, aromatase, plays a critical role by converting androgens (male hormones) into oestrogens (female hormones), which can drive the gonads to develop as ovaries.
Social dynamics can also act as environmental signals. Clark’s clownfish, for instance, live among sea anemones in small groups during the breeding season. If the breeding female dies, the dominant male changes sex and assumes her role.
Changes in water quality can trigger a change of sex as well.
Research indicates that pollutants entering rivers can cause male fish to develop female traits, including egg production.
Furthermore, a 2008 study found that a mere 1 to 2 degrees Celsius increase in water temperature could skew the sex ratio of certain fish towards a higher male count.
Some sex changes are advantageous; for example, clownfish evolve to switch genders as a survival strategy to enhance reproduction. However, human activities are disrupting natural sex change processes.
Polluted rivers and warming oceans therefore pose serious risks to the future of aquatic species.
This article addresses the question posed by Alex Jackson via email: “How can animals switch gender?”
Artificial intelligence is becoming an inescapable reality, seamlessly integrating into our lives. Forget searching for chatbots; new icons will soon appear in your favorite applications, easily accessible with a single click, from WhatsApp to Google Drive, and even in basic programs like Microsoft Notepad.
The tech industry is making substantial investments in AI, pushing users to leverage these advancements. While many embrace AI for writing, management, and planning, some take it a step further, cultivating intimate relationships with their AI companions.
In Love Machine: How Artificial Intelligence Will Change Our Relationships, James Muldoon delves into the intricate connections humans form with chatbots, whether they are designed for romantic encounters or simple companionship, as well as AI systems that serve as friends or therapists.
In one interview, a 46-year-old woman in a passionless marriage shares her experience of using AI to explore her intricate sexual fantasies set in an 18th-century French villa. This opens up broader conversations about utilizing AI in more practical life scenarios, such as during a doctor’s visit.
Another participant, Madison, recounts uploading her late best friend’s text messages to a “deathbot” service, which generates a way for her to maintain communication.
Muldoon’s anecdotes often carry an element of voyeuristic intrigue. They reveal the diverse ways individuals navigate their lives, some paths being healthier than others. What works for one person might prove detrimental for another.
However, a critical question remains. Are we naïve to think that AI services won’t evolve like social media, cluttered with advertisements for profit? Envision a long-term relationship with a chatbot that frequently pushes products your way. What happens if the company collapses? Can you secure backups of your artificial companions, or migrate them elsewhere? Do you hold rights to the generated data and networks? Moreover, there are psychological risks associated with forming attachments to these indifferent “yes-men,” which may further alienate individuals lacking real social connections.
Nonetheless, there are positive applications for this technology. In Ukraine, for instance, AI is being harnessed to help individuals suffering from PTSD, far exceeding the current availability of human therapists. The potential to revolutionize customer service, basic legal operations, and administrative tasks is immense. Yet, Muldoon’s narrative suggests that AI often functions as an unhealthy emotional crutch. One man, heartbroken over his girlfriend’s betrayal, envisions creating an AI partner and starting a family with her.
This book appears less about examining the social impacts of innovative technology and more like a warning signal regarding pervasive loneliness and the critical lack of mental health resources. A flourishing economy, robust healthcare system, and more supportive society could reduce our reliance on emotional bonds with software.
Humans are naturally inclined to anthropomorphize inanimate objects, even naming cars and guitars. Our brain’s tendency to perceive faces in random patterns—pareidolia—has been a survival mechanism since prehistoric times. So, is it surprising that we could be deceived by machines that mimic conversation?
If this provokes skepticism, guilty as charged. While there’s potential for machines to gain sentience and form genuine relationships in the future, such advancements are not yet realized. Today’s AI struggles with basic arithmetic and lacks genuine concern for users, despite producing seemingly thoughtful responses.
Simulating the human brain involves using advanced computing power to model billions of neurons, aiming to replicate the intricacies of real brain function. With a better understanding of neuronal wiring, researchers hope such simulations will uncover the secrets of cognition.
Historically, researchers have focused on isolating specific brain regions for simulations to elucidate particular functions. However, a comprehensive model encompassing the entire brain has yet to be achieved. As Markus Diesmann from the Jülich Research Center in Germany notes, “This is now changing.”
This shift is largely due to the emergence of state-of-the-art supercomputers with exascale capabilities: performing a quintillion (10^18) operations per second. Currently, only four such machines exist, according to the Top500 list. Diesmann’s team is set to execute extensive brain simulations on one such supercomputer in Germany, named JUPITER (Joint Undertaking Pioneer for Innovative and Transformative Exascale Research).
Recently, Diesmann and colleagues demonstrated that a simple model of brain neurons and their synapses, known as a spiking neural network, can be configured to leverage JUPITER’s thousands of GPUs. This scaling can achieve 20 billion neurons and 100 trillion connections, effectively mimicking the human cerebral cortex, the hub of higher brain functions.
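Spiking neural networks of the kind described above are built from simplified units such as the leaky integrate-and-fire neuron. The following minimal single-neuron sketch is illustrative only; all constants are assumed values for exposition, not parameters of the JUPITER model:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the kind of simplified
# unit scaled up by the billions in large spiking-network simulations.
# All constants below are illustrative assumptions.

def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Integrate membrane voltage (mV) over time steps of dt (ms);
    return the times at which the neuron fired a spike."""
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(input_current):
        # Leaky integration: voltage decays toward rest, driven by input.
        dv = (-(v - v_rest) + r_m * i_ext) * (dt / tau)
        v += dv
        if v >= v_thresh:
            spike_times.append(step * dt)  # record spike time in ms
            v = v_reset                    # reset after firing
    return spike_times

# A constant suprathreshold drive for 100 ms produces regular spiking.
spikes = simulate_lif([2.0] * 1000)
print(len(spikes), "spikes")
```

Real large-scale simulators connect many such units through weighted, delayed synapses; the challenge the team addressed is distributing those trillions of connections efficiently across thousands of GPUs.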
These simulations promise more impactful outcomes than previous models of smaller brains, such as those of fruit flies. Recent insights from large language models reveal that larger systems exhibit behaviors unattainable in their smaller counterparts. “We recognize that expansive networks demonstrate qualitatively different capabilities than their reduced-size equivalents,” asserts Diesmann.
Thomas Nowotny from the University of Sussex emphasizes that downscaling risks omitting crucial characteristics entirely. “Conducting full-scale simulations is vital; without it, we can’t truly replicate reality,” Nowotny states.
The model in development at JUPITER is founded on empirical data from limited neuron and synapse experiments in humans. As Johanna Senk, a collaborator of Diesmann’s at the University of Sussex, explains, “We have anatomical data constraints coupled with substantial computational power.”
Comprehensive brain simulations could facilitate tests of foundational theories regarding memory formation—an endeavor impractical with miniature models or actual brains. Testing such theories might involve inputting images to observe neural responses and analyze alterations in memory formation with varying brain sizes. Furthermore, this approach could aid in drug testing, such as assessing impacts on a model of epilepsy characterized by abnormal brain activity.
The enhanced computational capabilities enable rapid brain simulations, thereby assisting researchers in understanding gradual processes such as learning, as noted by Senk. Additionally, researchers can devise more intricate biological models detailing neuronal changes and firings.
Nonetheless, despite the ability to simulate vast brain networks, Nowotny acknowledges considerable gaps in knowledge. Even simplified whole-brain models for organisms like fruit flies fail to replicate authentic animal behavior.
Simulations run on supercomputers are fundamentally limited, lacking essential features inherent to real brains, such as real-world environmental inputs. “While we can simulate brain size, we cannot fully replicate a functional brain,” warns Nowotny.
Humans have larger brains relative to body size compared to other primates, which leads to a higher glucose demand that may be supported by gut microbiota changes influencing host metabolism. In this study, we investigated this hypothesis by inoculating germ-free mice with gut bacteria from three primate species with varying brain sizes. Notably, the brain gene expression in mice receiving human and macaque gut microbes mirrored patterns found in the respective primate brains. Human gut microbes enhanced glucose production and utilization in the mouse brains, suggesting that differences in gut microbiota across species can impact brain metabolism, indicating that gut microbiota may help meet the energy needs of large primate brains.
DeCasien et al. provided groundbreaking data showing that the gut microbiome shapes brain function differences among primates. Image credit: DeCasien et al., doi: 10.1073/pnas.2426232122.
“Our research demonstrates that microbes influence traits critical for understanding evolution, especially regarding the evolution of the human brain,” stated Katie Amato, lead author and researcher at Northwestern University.
This study builds upon prior research revealing that introducing gut microbes from larger-brained primates into mice shifts host metabolism toward greater energy availability, a fundamental requirement for supporting the development and function of energetically costly large brains.
The researchers aimed to examine how gut microbes from primates of varying brain sizes affect host brain function. In a controlled laboratory setting, they transplanted gut bacteria from two large-brained primates (humans and squirrel monkeys) and a smaller-brained primate (macaque) into germ-free mice.
Within eight weeks, mice with gut microbes from smaller-brained primates exhibited distinct brain function compared to those with microbes from larger-brained primates.
Results indicated that mice hosting larger-brained microbes demonstrated increased expression of genes linked to energy production and synaptic plasticity, vital for the brain’s learning processes. Conversely, gene expression associated with these processes was diminished in mice hosting smaller-brained primate microbes.
“Interestingly, we compared our findings from mouse brains with actual macaque and human brain data, and, to our surprise, many of the gene expression patterns were remarkably similar,” Dr. Amato remarked.
“This means we could alter the mouse brain to resemble that of the primate from which the microbial sample was derived.”
Another notable discovery was the identification of gene expression patterns associated with ADHD, schizophrenia, bipolar disorder, and autism in mice with gut microbes from smaller-brained primates.
Although previous research has suggested correlations between conditions like autism and gut microbiome composition, definitive evidence linking microbiota to these conditions has been lacking.
“Our study further supports the idea that microbes may play a role in these disorders, emphasizing that the gut microbiome influences brain function during developmental stages,” Dr. Amato explained.
“We can speculate that exposure to ‘harmful’ microorganisms could alter human brain development, possibly leading to the onset of these disorders. Essentially, if critical human microorganisms are absent in early stages, functional brain changes may occur, increasing the risk of disorder manifestations.”
These findings are published today in the Proceedings of the National Academy of Sciences.
_____
Alexandra R. DeCasien et al. 2026. Primate gut microbiota induces evolutionarily significant changes in neurodevelopment in mice. PNAS 123(2): e2426232122; doi: 10.1073/pnas.2426232122
Sahelanthropus: Fossil comparison with chimpanzees and humans
Williams et al., Sci. Adv. 12, eadv0130
The long-standing debate regarding whether our earliest ancestors walked on knuckles like chimpanzees or stood upright like modern humans may be closer to resolution, yet skepticism remains.
Scott Williams and researchers at New York University recently reanalyzed fossil remains of Sahelanthropus tchadensis, indicating that this species possessed at least three anatomical features suggesting it was our earliest known bipedal ancestor.
The journey to this conclusion has been extensive.
Fossilized remains of a skull, teeth, and jawbone from approximately 7 million years ago were first identified in 2002 in Chad, north-central Africa. The distinctive features of this ancient species, including its prominent brow ridge and smaller canine teeth, were quickly acknowledged as diverging from ape characteristics.
Analyzing the skull’s anatomy suggests it was positioned directly over the vertebrae, analogous to other upright, bipedal hominins.
In 2004, French scientists uncovered the femur and ulna associated with the Sahelanthropus skull from Chad. However, it wasn’t until 2020 that researchers claimed the femur exhibited curvature similar to that of non-bipedal great apes.
Since then, scholarly debate has fluctuated. For instance, in 2022, researchers Frank Guy and Guillaume Daver of the University of Poitiers argued that anatomical features of the femur indicate bipedalism. In 2024, Clément Zanolli and colleagues from the University of Bordeaux countered, suggesting Guy and Daver’s assertions were flawed, as the anatomical characteristics of bipedalism may also appear in non-bipedal great apes.
Lead study author Williams started with a “fairly ambivalent” stance on Sahelanthropus.
His team investigated the femur’s attachment point for the gluteus maximus muscle, finding similarities to human femur anatomy.
They also compared the femur and ulna size and shape; while similar in size to chimpanzee bones, they aligned more closely with human proportions.
Additionally, they identified a “femoral tubercle,” a previously overlooked feature of Sahelanthropus.
“We initially identified it by touch, later confirming it with 3D scans of the fossil,” Williams shared. “This bump is present in bipedal hominins, whereas the corresponding area in great apes is smooth, and it plays a critical role in upright mobility.”
This area serves as an attachment point for the iliofemoral ligament, the strongest ligament in the human body. While relaxed when seated, it tightens during standing or walking, securing the femoral head in the hip joint and preventing the torso from tilting backward or sideways.
However, Williams expressed doubts about whether this study would fully end the conversation about how Sahelanthropus moved.
“We are confident Sahelanthropus was an early bipedal hominin, but we must recognize that the debate is ongoing,” Williams noted.
In response to the new paper, Guy and Daver issued a joint statement reaffirming their 2022 interpretation: “This reaffirms our earlier interpretations about Sahelanthropus adaptations and locomotion, suggesting habitual bipedalism despite its ape-like morphology.”
They acknowledged that only new fossil discoveries could unequivocally conclude the matter.
John Hawks, a professor at the University of Wisconsin-Madison, also endorsed the new findings, noting their implications for understanding the complex origins of the hominin lineage.
“It may be deceptive to perceive Sahelanthropus as part of a gradual evolution towards an upright posture. It reveals crucial insights into these transformative changes,” Hawks commented.
However, Zanolli disagreed, stating, “Most of the evidence aligns Sahelanthropus with traits seen in African great apes, suggesting its behavior was likely a mix between chimpanzees and gorillas, distinct from the habitual bipedalism of Australopithecus and Homo.”
A research team at the Max Planck Institute for Evolutionary Anthropology has successfully generated a high-quality Denisovan genome assembly using ancient DNA extracted from a molar found in Denisova Cave. This genome, dating back approximately 200,000 years, significantly predates the only previously sequenced high-quality Denisovan specimen. The findings are prompting a reevaluation of when and where early human groups interacted, mixed, and migrated throughout Asia.
Artist’s concept of Penghu Denisovans walking under the bright sun during the Pleistocene in Taiwan. Image credit: Cheng-Han Sun.
Dr. Stéphane Peyrégne, an evolutionary geneticist from the Max Planck Institute for Evolutionary Anthropology, along with his team, recovered this Denisovan genome from molars excavated in Denisova Cave, located in the Altai Mountains of southern Siberia. This cave is historically significant as it was the site where Denisovans were first discovered in 2010 through DNA analysis of a finger bone.
This cave continues to be pivotal in the study of human evolution, revealing repeated occupations by Denisovans, Neanderthals, and even offspring resulting from the interbreeding of these groups.
“The Denisovans were first identified in 2010 based on ancient DNA sourced from Denisova 3, a phalanx excavated from Denisova Cave in 2008,” Dr. Peyrégne and his colleagues noted.
“This analysis confirms that Denisovans are closely related to Neanderthals, an extinct human group that thrived in Western Eurasia during the mid-to-late Pleistocene.”
Since then, twelve fragmentary remains and a single skull have been associated with Denisovans through DNA or protein analysis, with Denisova 3 being the only specimen yielding a high-quality genome.
The newly studied molar belonged to a Denisovan male who lived approximately 200,000 years ago, predating modern humans’ major migration out of Africa.
“In 2020, a complete upper left molar was found in Layer 17, one of the oldest cultural layers within the southern chamber of Denisova Cave, dating between 200,000 and 170,000 years old based on optically stimulated luminescence,” the scientists elaborated.
“Designated as Denisova 25, this molar resembles others found at Denisova Cave, specifically Denisova 4 and Denisova 8, and exhibits larger dimensions compared to Neanderthal and most post-Middle Pleistocene hominid molars, indicating it likely belonged to a Denisovan.”
“Two samples of 2.7 mg and 8.9 mg were extracted by drilling a hole at the cement-enamel junction of the tooth, with an additional 12 subsamples varying from 4.5 to 20.2 mg collected by carefully scraping the outer root layer using a dental drill.”
Thanks to excellent DNA preservation, researchers successfully reconstructed the genome of Denisova 25 with high coverage, matching the quality of the 65,000-year-old female Denisova 3 genome.
Denisovans likely had dark skin, in contrast to the pale Neanderthals. The image depicts a Neanderthal. Image credit: Mauro Cutrona.
Comparisons between the genomes indicate that Denisovans were not a singular, homogeneous population.
Instead, at least two distinct Denisovan groups inhabited the Altai region at various intervals, with one group gradually replacing the other over millennia.
Earlier Denisovans possessed a greater amount of Neanderthal DNA than later populations, suggesting that interbreeding was a regular event rather than an isolated occurrence in the Ice Age landscape of Eurasia.
Even more intriguing, the study uncovered evidence that Denisovans engaged in interbreeding with “hyperarchaic” hominin groups that diverged from the human lineage before the ancestors of Denisovans, Neanderthals, and modern humans branched off.
“This second Denisovan genome illustrates the recurrent admixture between Neanderthals and Denisovans in the Altai region, suggesting these mixed populations were eventually supplanted by Denisovans from other regions, reinforcing the notion that Denisovans were widespread and that populations in the Altai may have existed at the periphery of their geographic range,” the researchers explained.
The Denisovan 25 genome presents valuable insights into the long-standing mysteries regarding the Denisovan ancestry in contemporary populations.
People in Oceania, parts of South Asia, and East Asia all carry Denisovan DNA, albeit from different Denisovan sources.
Through genetic comparison, scientists have identified at least three separate Denisovan origins, highlighted by their genetic segments found in thousands of modern genomes.
One lineage closely relates to the later Denisovan genome and is linked to widespread ancestry across East Asia and beyond.
A second, more distantly related Denisovan population contributed independently to Oceanian and South Asian ancestry.
Notably, East Asians do not share this highly divergent Denisovan ancestry, implying their ancestors may have taken a different route into Asia, potentially from the north, whereas Oceanian ancestors likely migrated through South Asia.
“Neanderthal-like DNA fragments appear in all populations, including Oceanians, aligning with a singular out-of-Africa migration; however, the distinct Denisovan gene flow points to multiple migrations into Asia,” the researchers stated.
Reconstruction of a young Denisovan woman based on skeletal profiles derived from ancient DNA methylation maps. Image credit: Maayan Harel.
The researchers believe certain Denisovan genetic traits offered advantages that increased their prevalence in modern human populations through the process of natural selection.
By analyzing both Denisovan genomes, the authors pinpointed numerous regions in present-day populations that appear to have originated from Denisovan introgression, particularly in Oceania and South Asia.
Genetic alterations observed in other Denisovans provide intriguing insights into their physical appearances.
Several unique mutations in Denisovans influence genes connected to cranial shape, jaw protrusion, and facial characteristics—attributes that align with the limited fossil record associated with Denisovans.
Regulatory changes are also of interest. Variants affecting FOXP2, a gene implicated in brain development and language in modern humans, raise important questions regarding the cognitive capabilities of Denisovans, although researchers emphasize that genetic data cannot replace direct fossil or archaeological evidence.
“The impact of Denisovan alleles on modern human phenotypes might also shed light on Denisovan biology,” the researchers pointed out.
“Examining alleles linked to contemporary human traits, we identified 16 associations with 11 Denisovan alleles, covering aspects like height, blood pressure, cholesterol levels, and C-reactive protein levels.”
“Additionally, we identified 305 expression quantitative trait loci (eQTLs) and 117 alternative-splicing QTLs that affect gene expression across 19 tissues in modern humans, with the most significant effects observable in the thyroid, tibial artery, testis, and muscle tissues.”
“These molecular effects can be utilized to explore additional phenotypes that are not retained in the fossil record. This updated catalog provides a more reliable foundation for investigating Denisovan traits, adaptations, and disease susceptibilities, some of which may have influenced modern humans through admixture.”
This year brought many revelations about our ancient human relatives
WHPics / Alamy
This is an excerpt from Our Human Story, a newsletter about the revolution in archaeology. Sign up to receive it in your inbox every month.
If we try to summarize all the new fossils, methods, and ideas emerging from the study of human evolution in 2025, we might still be here in 2027. This year has been packed with developments, and I doubt it’s feasible for one individual to digest everything without isolating themselves from other distractions. This is particularly true in human evolution, which is a decentralized field. Unlike particle physicists, who often unite in teams for large-scale experiments, paleoanthropologists scatter in diverse directions.
There are two ways this year-long endeavor can falter. One risk is getting overwhelmed by an insurmountable amount of research, rendering it indecipherable. The other is simplifying the information to the point where it becomes incorrect.
With that in mind, here are three key points I want to clarify as we head into 2026. First, there have been remarkable discoveries about the Denisovans, reshaping our understanding of this mysterious group and challenging some of our previous assumptions. Second, we’ve seen a variety of new discoveries and ideas regarding how our distant ancestors created and utilized tools. Finally, we must consider the broader picture: how and why our species diverged so significantly from other primates.
The Denisovan Flood
Hebei GEO University
This year marks 15 years since we first learned about the Denisovans, an ancient group of humans that inhabited East Asia tens of thousands of years ago. My fascination with them has persisted, and this year, I was excited to witness a surge of discoveries that broadened our knowledge of their habitats and identities.
Denisovans were initially identified primarily through molecular evidence. The first fossil discovered was a small finger bone from Denisova Cave in Siberia, which defied identification based on its morphology alone; DNA extracted from it in 2010, however, revealed that Denisovans were closely related to Neanderthals, who lived in Europe and Asia, and that they interbred with modern humans. Today, populations in Southeast Asia, particularly Papua New Guinea and the Philippines, carry the highest concentration of Denisovan DNA.
Since then, researchers have been on the hunt for additional Denisovan remains, though this endeavor has progressed slowly. It wasn’t until 2019 that a second example was identified: a jawbone excavated from Baishiya Karst Cave in Xiahe, on the Tibetan Plateau. Over the next five years, several more fossils were tentatively attributed to Denisovans, notable for their large size and large teeth compared with modern humans.
Then came 2025, which brought numerous exciting findings. In April, Denisovans were confirmed in Taiwan, when a jawbone dredged from the Penghu Channel in 2008 was finally identified using preserved proteins. This discovery significantly extends the known range of Denisovans to the southeast, aligning with where their genetic legacy persists today.
In June, the first Denisovan face emerged. A skull discovered in Harbin, northern China, had been described in 2021 and designated a new species, named Homo longi. It was suspected of belonging to a Denisovan because of its large size; proteins extracted from the bone and mitochondrial DNA from dental calculus, analyzed by Qiaomei Fu and her team, confirmed its Denisovan origins.
So far, these findings align well with genetic evidence indicating that Denisovans roamed extensively across Asia. They also contribute to a coherent image of Denisovans as large-bodied humans.
However, two additional discoveries in 2025 were surprising. In September, a crushed skull from Yunxian, China, dating back approximately 1 million years, was digitally reconstructed and proposed to belong to an early Denisovan. This finding suggests that Denisovans existed as a distinct group much earlier than previously believed, indicating that their common ancestor with Neanderthals and modern humans, known as Ancestor X, must have lived over a million years ago. If confirmed, it implies a longer evolutionary history for all three groups than previously thought.
Just a month ago, geneticists released a second high-quality Denisovan genome, extracted from a 200,000-year-old tooth found in Denisova Cave. Notably, this genome is distinct both from the first high-quality genome and from the Denisovan DNA carried by modern humans.
This indicates the existence of at least three groups of Denisovans: early ones, later ones, and those that hybridized with modern humans—this latter group remains a total archaeological enigma.
As our understanding of Denisovans deepens, their history appears much longer and more diverse than initially assumed. In particular, Denisovan populations that interbred with modern humans remain elusive.
For the past 15 years, Denisovans have captivated my interest. Despite their widespread presence across continents for hundreds of thousands of years, only a handful of remains have been documented.
Fortunately, I have a penchant for mysteries, because this puzzle won’t be solved anytime soon.
Tool Manufacturing
TW Plummer, JS Oliver, EM Finestone, Homa Peninsula Paleoanthropology Project
Creating and using tools is one of humanity’s most critical functions. This ability isn’t unique to our species, as many other animals also use and even make tools. Primatologist Jane Goodall, who passed away this year, famously demonstrated that chimpanzees can manufacture tools. However, humans have significantly elevated this skill, producing a more diverse array of tools that are often more complex and essential to our survival than those of any other animal.
As we delve deeper into the fossil record, we’re discovering that the practice of tool-making dates back further than previously thought. In March, I reported on excavations in Tanzania revealing that an unidentified ancient human was consistently creating bone tools 1.5 million years ago, well over a million years before bone tools were believed to become commonplace. Similarly, while it was previously thought that humans began crafting artifacts from ivory 50,000 years ago, this year, a 400,000-year-old flake from a mammoth tusk was discovered in Ukraine.
Even older stone tools have surfaced, likely due in part to their greater preservation potential. Crude tools have been identified from 3.3 million years ago at Lomekwi, Kenya. Last month in Our Human Story, I mentioned excavations in another part of Kenya demonstrating that ancient humans consistently produced a specific type of Oldowan tools between 2.75 million and 2.44 million years ago, indicating that tool-making was already a habitual practice.
Often, tools are found without associated bones, making it challenging to determine their makers’ identities. It’s tempting to assume that most tools belong to our genus, Homo, or perhaps to Australopithecus, our more distant ancestors. However, increasing evidence suggests that Paranthropus—a hominin with a small brain and large teeth, which thrived in Africa for hundreds of thousands of years—could also have made tools, at least simple ones like Oldowan tools.
Two years ago, Oldowan tools were discovered alongside Paranthropus teeth in Kenya—admittedly not definitive evidence, but strongly suggestive. This year, a fossil of Paranthropus revealed that its hand exhibited a combination of gorilla-like strength and impressive dexterity, indicating capable precision gripping essential for tool-making.
How did these ancients conceive of their tools? One possibility, suggested by Metin Eren and others this year, is that they didn’t consciously create them. Instead, tool-like stones form naturally under various conditions, such as frost cracking rocks or elephants trampling them. Early humans may have utilized these “natural stones,” knowledge of which eventually led to their replication.
As humans continued to develop increasingly complex tools, the cognitive demands of creating them likely escalated, potentially facilitating the emergence of language as we needed to communicate how to make and use these advanced tools. This year’s research explored aspects like the difficulty of learning various skills, whether close observation is necessary, or if mere exposure suffices. The findings suggest two significant changes in cultural transmission that may correlate with technological advancements.
Like most aspects of evolution, tool-making appears to have gradually evolved from our primate predecessors, reshaping our cognitive capabilities in the process.
Big Picture
Alexandra Morton-Hayward
Now let’s address the age-old question of how and why humans evolved so distinctly, and which traits truly set us apart. This topic is always challenging to navigate for three main reasons.
First, human uniqueness is multifaceted and often contradictory. Social scientist Jonathan R. Goodman suggested in July that evolution has forged humans to embody both “Machiavellian” traits—planning and betraying one another—and “natural socialist” instincts driven by strong social norms against murder and theft. Claims that humans are inherently generous or instinctively cruel tend to oversimplify the matter excessively.
Second, our perceptions of what makes us unique are shaped by the societies in which we exist. For instance, many cultures remain predominantly male-focused, leading our historical narratives to center around men. While the feminist movement is working to amend this imbalance, progress remains slow. Laura Spinney’s article on prehistoric women suggested that “throughout prehistory, women were rulers, warriors, hunters, and shamans,” a viewpoint made viable only through dedicated research.
Third, reconstructing the thought processes of ancient people as they adopted certain behaviors is inherently difficult, if not impossible. Why did early humans bury their dead and enact funerary rituals? How were dogs and other animals domesticated? What choices shaped ancient humans’ paths toward change?
Still, I want to spotlight two intriguing ideas surrounding the evolution of the human brain and intelligence. One concerns the role of placental hormones that developing babies are exposed to in the womb. Preliminary evidence suggests these hormones may contribute to brain growth, equipping us with the neural capacity to navigate our unusually complex social environments.
Another compelling possibility proposes that the genetic changes associated with our increased intelligence may have also led to vulnerabilities to mental illness. In October, Christa Lesté-Lasserre reported that genetic mutations linked to intelligence emerged first in our distant ancestors, followed later by mutations associated with mental disorders.
This notion has intrigued me for years, rooted in the observation that wild animals, including our close relatives like chimpanzees, do not appear to suffer from serious mental illnesses such as schizophrenia or bipolar disorder. Perhaps our brains operate at the edge of our neural capabilities. Like a finely-tuned sports car, we can excel but are also prone to breakdowns. While still a hypothesis, this concept is difficult to shake off.
Oh, one more point. Although we often shy away from discussing methodological advances, since readers generally prefer results, we made an exception in May. Alexandra Morton-Hayward and her colleagues at the University of Oxford developed a method to extract proteins from ancient brains and potentially other soft tissues. Though such tissues are far rarer in the fossil record than bones and teeth, some are preserved and may offer a wealth of information. The first results could arrive next year.
Available rooms: Minimum stay of 2 weeks, featuring a private bathroom. Enjoy a complimentary pool. Package includes meals, Wi-Fi, and infectious viruses. Call now!
Would you be inclined to respond to such advertisements? What about those that guarantee severe diarrhea? How many stars would it take to make you consider adding STDs to your stay? Perhaps a substantial cash incentive might sway your decision?
Welcome to the peculiar realm of human challenge testing – arriving soon at a biosecure isolation facility nearby.
In response to the collective trauma of the coronavirus pandemic, researchers are increasingly enlisting healthy individuals to participate in trials that intentionally expose them to illness. Volunteers are now more willing than ever to contract diseases ranging from dysentery and cholera to gonorrhea.
As detailed on page 38, challenge trials offer a rapid and relatively affordable method for assessing vaccines and treatments while monitoring infection dynamics. Contrary to popular belief, the risks may not be as high as presumed: trials are conducted under stringent medical oversight and proceed only when effective therapies can quickly alleviate symptoms.
However, it’s not without its hazards, and the ethical landscape remains murky. Unlike patients with existing conditions who may opt for experimental therapies that could potentially cure them, challenge trials seek to induce illness with little or no immediate medical benefit, even if for a brief duration.
Moreover, we cannot always prevent potential long-term consequences. For example, some ethicists have expressed concerns regarding the manner in which British scientists conducted COVID-19 challenge trials during the pandemic, underscoring the risks of chronic symptoms associated with COVID-19.
Nonetheless, the pandemic has also underscored the significant positive impact and value of vaccines. Current data indicates that human challenge testing is safe, particularly for young, healthy individuals. These studies could hasten the development of new defenses against persistent epidemics such as malaria, Zika, and norovirus. The pressing question may be: How can we expand these efforts?
Is it conceivable that the ultra-wealthy are covertly cloning humans?
Juan Lovaro/Shutterstock
Throughout my extensive career reporting on extraordinary breakthroughs in biology, I’ve observed numerous concepts gaining massive attention, receiving thorough media scrutiny for years, and later fading from the public consciousness. Take, for instance, human cloning.
Following the landmark announcement in 1997 of Dolly the sheep, the first mammal cloned from an adult cell, speculation soared about the potential for human cloning. There were even some implausible claims that human clones already existed. Yet in recent years, such fervor has significantly diminished.
Nonetheless, reproductive technologies have evolved remarkably since the 1990s. Notably, just six years after CRISPR was unveiled, the world saw the first unlawful creation of a gene-edited child. This raises questions about what might be occurring behind closed doors. Are human clones already out there, undetected? Of course, identical twins don’t count.
What could motivate someone to engage in this? Recently, in a discussion between Vladimir Putin and Xi Jinping, the topic of extending life via organ transplants came up. The most effective method could involve cloning individuals for organ harvesting, thereby eliminating the problem of immune rejection, a scenario often depicted in science fiction. Consider the film The Island or the novel Never Let Me Go.
Moreover, cloning brings forth the notion of creating a duplicate of a person, offering a semblance of immortality, as illustrated in the television series Foundation, where the empire is governed by successive clones. However, our experiences with identical twins tell us that sharing the same genome does not equate to being the same person. As shown by Tatiana Maslany in the series Orphan Black, each clone evolves into a distinct individual. Nevertheless, wealthy individuals can hold irrational beliefs similar to others and often display a particular desire to extend their lifespans.
For scientists, there’s also the allure of being the first to achieve a groundbreaking feat. A report from a Chinese commission determined that the creators of CRISPR children “conducted research illegally in pursuit of personal fame and profit.”
Goals of Therapeutic Cloning
So, could human clones exist? For many years, the notion of cloning mammals was deemed unfeasible. Early embryo cells have the ability to differentiate into any bodily part but quickly become specialized—a process previously thought irreversible.
Dolly’s existence disproved that theory. She was produced by fusing cells from an adult ewe’s udder with a DNA-depleted egg. Her announcement in February 1997 led to a frenzy of attempts to generate cloned human embryos. The objective wasn’t to create cloned infants, but rather to harvest embryonic stem cells for novel medical therapies. As cloned cells are a perfect match for an individual, they could theoretically be employed to produce replacement tissues and organs with no risk of immune rejection.
However, extracting stem cells from cloned human embryos proved more challenging than anticipated. It wasn’t until 2004 that Hwang Woo-suk claimed success. At the time, I found his paper impressive, as it addressed every conceivable objection. Unfortunately, the study was later revealed to be fraudulent and was retracted. The experience remains ingrained in my memory: nowadays, whenever a paper seems too good to be true, my first instinct is skepticism.
Ultimately, true embryonic stem cells from cloned human embryos weren’t obtained until 2013. By then, alternative methods for generating compatible stem cells through the activation of specific genes had emerged, leading to a decline in interest in therapeutic cloning.
Cloned Pets and Other Animals
Conversely, animal cloning has become increasingly established. Occasionally, headlines emerge when celebrities disclose that they’ve cloned their pets. Recently, former NFL quarterback Tom Brady made news by revealing that his dog is a clone, produced by a company acquired by Colossal Biosciences.
Apart from serving as a way to “revive” cherished pets, cloning is also utilized in agriculture and horse breeding. For instance, male horses are often castrated, meaning that if they excel in show jumping, the only method to utilize their genetic material for future breeding is through cloning.
Nonetheless, animal cloning continues to pose significant challenges. A 2022 study of the first 1000 dog clones found that the cloning process is still highly inefficient, with merely 2 percent of implanted cloned embryos resulting in live births. This inefficiency contributes to the high cost of pet cloning, around $50,000.
Moreover, about 20 percent of cloned dogs showed noticeable physical anomalies, including enlarged tongues, unusual eye colors, cleft palates, and excessive muscle mass. Some male dog clones even exhibited female physical traits.
But what if the wealthy and powerful could clone themselves, unburdened by such concerns?
Challenges in Adult Cloning
Multiple sources have indicated several successful monkey cloning endeavors since 2017, suggesting potential applicability for humans as well. However, these sources often fail to mention that all these primate clones have been derived from fetal cells, not adult ones.
The crux of the issue lies in the fact that reprogramming adult cells to mimic a fetal state is far more complex than reprogramming fetal cells. To me, cloning signifies creating a genetically identical replica of an adult, which is what made Dolly’s achievement exceptional.
In essence, I remain convinced that cloning an adult is still unattainable. In a world filled with dictators and eccentric billionaires, this might be a fortunate circumstance.
Recent findings from neuroscientists reveal that the brain’s structure divides into five main stages throughout a typical person’s life, marked by four significant turning points from birth to death where the brain undergoes reorganization. Brain topology in children evolves from birth up to a crucial transition at age 9, then shifts into adolescence, which generally lasts until around age 32. In your early 30s, the neural wiring transitions to adult mode, marking the longest phase that extends for over 30 years. The third turning point occurs at about age 66, indicating the start of an early aging phase of brain structure, while the late brain phase begins around age 83.
Mousley et al. compared the brains of 3,802 individuals aged 0 to 90 years using a dataset of MRI diffusion scans, which map neural connections by tracking the movement of water molecules through brain tissue. Image credit: Mousley et al., doi: 10.1038/s41467-025-65974-8.
“While we know brain wiring plays a crucial role in our development, we still lack a comprehensive understanding of how and why it fluctuates throughout life,” explained Dr. Alexa Mousley, a researcher at the University of Cambridge.
“This study is the first to pinpoint essential stages in brain wiring throughout the human lifespan.”
“These epochs offer vital insight into our brain’s strengths and vulnerabilities at different life stages.”
“Understanding these changes could shed light on why certain developmental challenges arise, such as learning difficulties in early childhood or dementia later in life.”
During the transition from infancy to childhood, strengthened neural networks emerge as the excess of synapses (the connections between neurons) in a baby’s brain diminishes, allowing only the most active synapses to thrive.
The brain rewires in a consistent pattern from birth until approximately age 9.
In this timeframe, the volumes of gray and white matter grow swiftly, resulting in maximal cortical thickness (the distance from the outer gray matter to the inner white matter), with the cortical folds stabilizing.
By the first turning point at age 9, cognitive abilities begin to evolve gradually, and the likelihood of mental health issues becomes more pronounced.
The second stage, adolescence, is characterized by an ongoing increase in white matter volume, leading to an enhancement in the sophistication of the brain’s communication networks, measurable through water diffusion scans.
This phase is marked by improved connectivity efficiency across specific regions and swift communication throughout the brain, correlating with enhanced cognitive performance.
“As expected, neural efficiency is closely linked to shorter pathways, and this efficiency increases throughout adolescence,” Mousley notes.
“These advancements peak in your early 30s, representing the most significant turning point in your lifetime.”
“Around age 32, the change in wiring direction is the most pronounced, and the overall trajectory alteration is greater than at any other turning points.”
“Although the onset of puberty is clearly defined, the conclusion is far harder to identify scientifically.”
“Based solely on neural structure, we found that puberty-related changes in brain structure conclude by the early 30s.”
Post age 32, adulthood enters its longest phase, characterized by a more stable brain structure with no significant turning points for three decades. This aligns with findings indicating an “intellectual and personality plateau.”
Additionally, the researchers observed a greater degree of “segregation” during this phase, indicating a gradual fragmentation of brain regions.
The turning point at age 66 is more gradual, lacking dramatic structural shifts; however, notable changes in brain network patterns were observed around this age on average.
“Our findings indicate a gradual reconfiguration of brain networks that peaks in the mid-60s,” stated Dr. Mousley.
“This is likely linked to aging, as white matter begins to decline, reducing connectivity further.”
“We are currently facing an era where individuals are increasingly at risk for various health conditions impacting the brain, such as high blood pressure.”
The final turning point arises around age 83, ushering in the last stage of brain structure.
Data from this stage is scarce, but a key characteristic is the shift from global to local connectivity as interactions across the brain diminish while reliance on specific regions intensifies.
Professor Duncan Astle of the University of Cambridge remarked: “On reflection, many of us recognize that our lives encompass distinct stages.”
“Interestingly, the brain also navigates through these phases.”
“Numerous neurodevelopmental, mental health, and neurological conditions are tied to the brain’s wiring.”
“In fact, variations in brain wiring can predict challenges with attention, language, memory, and a wide array of other behaviors.”
“Recognizing that structural transformations in the brain occur not in a linear fashion but through several major turning points can assist us in identifying when and how brain wiring may be vulnerable to disruptions.”
A paper detailing the study was published in the journal Nature Communications on November 25.
_____
A. Mousley et al. 2025. Topological turning points across the human lifespan. Nat Commun 16, 10055; doi: 10.1038/s41467-025-65974-8
In a recent breakthrough regarding human evolution, researchers have unveiled that a peculiar foot unearthed in Ethiopia is from a yet-to-be-identified ancient relative.
The findings, released on Wednesday in the journal Nature, indicate the foot dates back approximately 3.4 million years and likely bears similarities to Lucy, another ancient human relative who inhabited the region around the same period.
However, scientists have revealed that the Burtele foot, named after the site in northeastern Ethiopia where it was discovered in 2009, is distinctly different.
The fossil foot has an opposable big toe akin to a gorilla’s, suggesting its owner was a proficient climber, likely spending more time in trees than Lucy, according to the study.
Elements of the Burtele foot discovered in Ethiopia in 2009. Yohannes Haile-Selassie/Institute of Human Origins, Arizona State University (via AFP)
For many years, Lucy’s species was believed to be the common ancestor of all later hominins, the lineage that includes Homo sapiens but not chimpanzees.
Researchers were unable to confirm that the foot belonged to a novel species until they examined additional fossils found in the same vicinity, including a jawbone with twelve teeth.
After identifying these remains as Australopithecus deyiremeda, they determined that the Burtele foot belonged to the same species.
John Rowan, an assistant professor of human evolution at the University of Cambridge, expressed that their conclusions were “very reasonable.”
“We now have stronger evidence that closely related, yet adaptively distinct species coexisted,” Rowan, who was not part of the study, communicated in an email to NBC News on Thursday.
The research also examined how these species interacted within the same environment. The team, led by Yohannes Haile-Selassie of Arizona State University, suggested that the newly identified species spent considerable time in wooded areas.
The study proposed that Lucy, or Australopithecus afarensis, was likely traversing the open land, positing that the two species probably had divergent diets and utilized their habitats in distinct ways.
Analyses of the newly found teeth revealed that A. deyiremeda was more primitive than Lucy and likely fed on leaves, fruits, and nuts, the study indicated.
“These distinctions suggest they were less likely to compete directly for the same resources,” remarked Ashleigh Wiseman, an assistant professor at the McDonald Institute for Archaeological Research at the University of Cambridge.
In an email on Thursday, Wiseman highlighted the significant implications of this discovery for our understanding of evolution, stating that it “reminds us that human evolution is not a linear progression of one species evolving into the next.”
Instead, she asserted, it should be viewed as a branching family tree with numerous so-called “cousins” existing simultaneously, each adopting various survival strategies. “Did they interact? We may never know the answer to that,” she concluded.
Rowan also noted that as the number of well-documented species related to humans increases, so do the inquiries concerning our ancestry. “Which species were our direct ancestors? Which species were our close relatives? That’s the challenge,” he remarked. “As species diversity ascends, so too do the avenues for plausible reconstructions of how human evolution unfolded.”
Wiseman cautioned that definitive species classifications should rely on well-preserved skulls and fossils from multiple related individuals. While the new study bolsters the case for A. deyiremeda, it “does not dismiss all other alternative interpretations,” she stated.
The ancient human foot bones have puzzled scientists since their discovery in 2009.
Yohannes Haile-Selassie
The origins of a 3.4-million-year-old foot bone uncovered in Ethiopia may finally be elucidated, prompting a reevaluation of how various ancient human ancestors cohabited.
In 2009, Yohannes Haile-Selassie, now at Arizona State University, and his team unearthed eight hominin bones from a right foot at a site known as Burtele in northeastern Ethiopia’s Afar region.
This discovery, dubbed the Burtele foot, features an opposable big toe akin to a gorilla’s, indicating that whichever species it belonged to was a capable climber.
Another ancient human species, Australopithecus afarensis, was known to inhabit the vicinity—its best-known fossil, Lucy, was also discovered in the Afar region—but the Burtele foot appeared to belong to a different species. “From the outset, we realized it was not part of Lucy’s species,” Haile-Selassie says.
Two primary hypotheses intrigued Haile-Selassie: the foot might belong to another species within the genus Australopithecus, or perhaps to an older, more primitive group known as Ardipithecus, which lived in Ethiopia more than a million years earlier and also had opposable big toes.
Meanwhile, in 2015, scientists announced the identification of a previously unknown hominin species, named Australopithecus deyiremeda, after jaw and tooth remains were found in the same region. Initially, there was uncertainty about whether the enigmatic foot belonged to A. deyiremeda, as its age appeared to differ from that of the jaw and tooth remains.
Subsequently, however, researchers made a crucial discovery. A lower jaw of A. deyiremeda was located within 300 meters of the Burtele foot, and both sets of remains were dated to the same geological layer. This led the research team to conclude that the Burtele foot belonged to A. deyiremeda.
The Burtele foot (left), attributed to Australopithecus deyiremeda, compared with the bones of a gorilla’s foot (right)
Johannes Haile-Selassie
In a separate part of the study, researchers analyzed carbon isotopes in the fossils’ tooth enamel. They found that A. deyiremeda primarily consumed foods from trees and shrubs, while the teeth of A. afarensis were better suited to a diet richer in grasses.
Haile-Selassie noted that this finding suggests the two hominin species occupied the same landscape without competing for resources. He believes these groups could have coexisted harmoniously, engaging in separate activities. “They must have crossed paths and interacted within the same habitat, each doing their own thing,” he remarked. “While members of Australopithecus deyiremeda may have spent time in trees, A. afarensis was likely wandering the adjacent grasslands.”
This revelation enhances our understanding of human evolution. “Historically, some have argued that only a single hominid species existed at any given time, with newer forms emerging eventually,” Haile-Selassie explained. “We are now realizing that our evolutionary path was not straightforward. Multiple closely related hominid species coexisted at the same time, indicating that coexistence was a fundamental aspect of our ancestors’ lives.”
Carrie Mongle, a professor at Stony Brook University in New York, expressed enthusiasm about these developments. “Understanding more about the diversity of Pliocene hominins is truly exciting,” she stated. “This period, around 3 million years ago, was rich in evolutionary significance.”
As we grow older, our brains undergo significant rewiring.
Recent studies indicate that this transformation takes place in various stages, or “epochs,” as our neural structures evolve, altering how we think and process information.
For the first time, scientists have pinpointed four key turning points in the typical aging brain: ages 9, 32, 66, and 83. During each of these phases, our brains display distinctly different structural characteristics.
The findings, published Tuesday in Nature Communications, reveal that human cognitive ability does not merely peak and then decline with age. In fact, the research suggests that the interval between ages 9 and 32 is the sole period in which our neural networks become increasingly efficient.
In adulthood, from 32 to 66 years, the structure of the average brain stabilizes without significant modifications, leading researchers to believe that intelligence and personality tend to plateau during this time.
After the final turning point, from age 83 onward, the brain increasingly relies on specific regions as the connections between them slowly deteriorate.
“It’s not a linear progression,” comments lead author Alexa Maudsley, a postdoctoral researcher at the University of Cambridge. “This marks an initial step in understanding how brain changes differ with age.”
These insights could shed light on why certain mental health and neurological issues emerge during specific rewiring phases.
Rick Betzel, a neuroscience professor at the University of Minnesota who was not involved in the study, remarked that while the findings are intriguing, further data is necessary to substantiate the conclusions. He cautioned that the theory might face challenges over time.
“They undertook a very ambitious effort,” Betzel said about the study. “We shall see where things stand in a few years.”
For their research, Maudsley and colleagues examined MRI diffusion scans (images illustrating water molecule movement in the brain) of around 3,800 individuals, ranging from newborns to 90 years old. Their objective was to map neural connections at varying life stages.
In the brain, bundles of nerve fibers that convey signals are encased in fatty tissue called myelin, analogous to insulated wiring or plumbing. Water molecules diffusing through the brain tend to travel along these fibers, allowing researchers to trace neural pathways.
“We can’t open up the skull…we depend on non-invasive techniques,” Betzel mentioned, discussing this form of neuroscience research. “We aim to determine the location of these fiber bundles.”
The study used these MRI scans to chart the neural networks of an average individual across the lifespan, pinpointing where connections strengthen or weaken. The five “eras” described in the paper reflect the patterns of neural connectivity the researchers observed.
They propose that the initial stage lasts until age nine, during which both gray and white matter rapidly increase. This phase also involves the pruning of redundant synapses as the brain reorganizes itself.
Between ages 9 and 32, there is an extensive period of rewiring. The brain is characterized by swift communication across its regions and efficient connections.
Most mental health disorders are diagnosed during this interval, Maudsley pointed out. “Is there something about this second phase of life that might predispose individuals to mental health issues?”
From ages 32 to 66, the brain reaches a plateau. It continues to rewire, but this process occurs at a slower and less dramatic pace.
Subsequently, from ages 66 to 83, the brain undergoes “modularization,” where neural networks split into highly interconnected subnetworks with diminished central integration. By age 83, connectivity further declines.
Betzel expressed that the theory presented in this study is likely reflective of people’s experiences with aging and cognition.
“It’s something we naturally resonate with. I have two young kids, and I often think, ‘They’re transitioning out of toddlerhood,'” Betzel remarked. “Science may eventually uncover the truth. But are they precisely at the correct age? I’m not sure.”
Ideally, researchers would gather MRI diffusion data from a large cohort by scanning the same individuals repeatedly across their lifespans, but the technology did not exist decades ago, so no such lifelong dataset is available.
Instead, the team amalgamated nine diverse datasets containing neuroimaging from prior studies, striving to harmonize them.
Betzel noted that these datasets vary in quality and methodology, and attempts to align them may obscure essential variations and introduce bias into the findings.
Nonetheless, he acknowledged that the paper’s authors are “thoughtful” and proficient scientists who did their utmost to mitigate that risk.
“Brain networks evolve throughout life, that’s undeniable,” Betzel said. “But are there five precise moments of transition? It’s an intriguing notion, and one worth watching.”
The notion that humans might use chemical signals known as pheromones for communication has intrigued scientists and the general public alike for many years, leading to numerous investigations aimed at discovering evidence.
Pheromones are well-documented in the animal kingdom. Ants use chemical trails for navigation and communication, dogs mark their territory with scent signals, and moths emit airborne pheromones to attract partners.
However, the question of whether humans share this capability is much more complex. Can one person elicit a physical or emotional reaction in another without their awareness? Might this influence attraction?
After over six decades of research, the answers remain uncertain, but recent findings indicate we might be getting closer to understanding this phenomenon.
First Whiff
In 1959, Adolf Butenandt and his team identified the first pheromones, specifically bombykol, a chemical released by female silk moths to attract male counterparts.
Shortly after, scientists introduced the term “pheromone” to describe chemical signals emitted by one individual that trigger distinct responses in another of the same species.
This discovery opened the door to exploring potential human equivalents.
One of the earliest notable claims regarding human pheromones was put forth by Martha McClintock in 1971. Her study involving 135 women residing in university dorms suggested their menstrual cycles seemed to synchronize throughout the year.
This phenomenon, termed the “McClintock effect,” was widely regarded as evidence supporting the existence of human pheromones. However, subsequent studies did not replicate these findings and revealed that any apparent synchronization could be attributed to chance.
For many years, researchers have concentrated on four primary chemicals believed to be human pheromones. Androstenone and androstenol are thought to influence social perception and sexual attraction.
Androstadienone has been investigated for its impact on mood and alertness in women, while estratetraenol is believed to affect men’s perceptions of women.
Nonetheless, none of these substances have been definitively established as true human pheromones.
The doses used in studies are often much higher than what the body naturally produces, leading to less reliable outcomes. Furthermore, many experiments suffer from design flaws and weak statistics, resulting in inconsistent and inconclusive findings.
T-Shirt Test
If discussions of human pheromones arise, Professor Claus Wedekind’s 1995 “sweaty T-shirt” study is likely to be mentioned.
In this experiment, women were asked to smell T-shirts worn by men and indicate their preferences.
Interestingly, women who were not on birth control were more inclined to like the scents of men whose immune system genes (MHC genes) differed most from their own.
This preference aligns with evolutionary theory, as choosing mates with varied immune genes can enhance resistance to diseases in offspring.
This study has been replicated and is frequently hailed as a compelling instance of human chemical signaling, wherein body odor conveys social or biological information.
Yet, the scents involved in this research do not adhere to the strict definition of pheromones.
Most of the odor in sweat comes from a small number of underarm bacteria on your T-shirt, not pheromones. – Photo credit: Getty
First, a person’s complex “smell print” consists of multiple chemicals rather than a single compound. Second, pheromones trigger automatic, unconscious responses, such as hormonal changes and instinctive behaviors, whereas this kind of scent preference is subjective and conscious, shaped by personal experience.
Invisible Clues
Although the T-shirt study does not clarify the role of pheromones in humans, some scientists believe that research in this area is far from complete.
Among them is Dr. Tristram Wyatt, a senior research fellow at the University of Oxford’s Department of Zoology, who has dedicated his career to studying the evolution of pheromones.
“If we consider humans as just another animal, it would be surprising to think we do not communicate chemically,” he explains. “For instance, our body odor evolves during puberty and becomes even more pronounced as we reach sexual maturity.
“In other animals, such odors frequently convey critical signals, so it is highly possible that humans emit similar signals; we just haven’t established this scientifically yet.”
The queen bee releases a pheromone that inhibits the reproduction of all other females in the hive – Photo credit: Getty
Even with this potential, pinpointing human pheromones has proven extraordinarily challenging.
“Studying human pheromones is akin to searching for a needle in a haystack,” Wyatt remarks. “Humans release thousands of odor molecules, making it difficult to identify which one triggers certain effects.
“Moreover, our reactions to odors are influenced by cultural, emotional, and individual differences, rendering our responses highly variable. Without reliable bioassays that provide clear, measurable reactions to odors, it is nearly impossible to pinpoint genuine pheromones.”
Another problem is reproducibility; many pheromone studies are based on small sample sizes, which makes their results statistically unreliable and susceptible to false positives.
Early research often lacks strict controls, and the field faces publication bias, increasing the likelihood of positive results being published.
The outcome? An evidentiary basis that appears more robust than it truly is. It comprises a collection of intriguing yet unreliable findings, with only a few holding up under repeat testing.
The Scent is Hot
Despite years of challenges, Wyatt remains hopeful, particularly about recent advances in research, including a French study that may represent the closest step toward identifying a human pheromone.
This investigation centered on secretions from Montgomery’s glands (small glands around the nipples that release tiny droplets during breastfeeding) in nursing mothers.
Researchers found that when newborns were exposed to the scent of these secretions, they instinctively turned their heads, displayed suckling behavior, and began searching for the nipple.
“This is the most exciting lead we’ve encountered to date,” says Wyatt. “Babies respond to these secretions even if they come from a different mother.
“Such a universal, instinctive reaction is precisely what we expect from an authentic pheromone. If we can identify the specific compound responsible, we might finally establish the first verified human pheromone.”
A recent breakthrough in pheromone research occurred in 2023 at the Weizmann Institute of Science in Israel. Researchers studied the effects of tears from women.
Men who smelled tears shed by a woman during a sad film showed decreased testosterone levels, and brain scans indicated changes in areas linked to both aggression and olfactory processing.
The study also revealed four receptors in the nose capable of detecting chemical signals in tears, and researchers are currently working to identify the specific compounds in tear fluid that elicit this response, potentially leading to compounds that mitigate aggression.
Recent research indicates that chemicals in women’s tears significantly affect men’s testosterone levels – Image courtesy of Getty Images
Nevertheless, while there is evidence that humans utilize scent in both social and sexual contexts, it has yet to be definitively proven that pheromones play a role in human communication.
“To conclusively ascertain whether human pheromones exist, rigorous research is necessary,” Wyatt asserts.
“This entails clear testing with consistent responses, larger and better-designed studies, and moving beyond the same old unproven molecules. Only diligent, evidence-driven research will yield real answers.”
“The quest for genuine human pheromones is just at its inception,” he concludes. “With the proper guidance, we could finally be on the brink of an exciting discovery.”
In the beginning, God created man in His own image, granting him authority over all living things on Earth. While many do not turn to the Bible for insight into human existence, the belief in human superiority over nature and other beings lingers.
Characteristics often claimed to distinguish humans—such as reasoning, tool use, experiencing pain, and moral judgment—are not exclusive to us. Other species like chimpanzees and crows exhibit advanced intelligence, hold complex social structures, and utilize tools. Fish and crustaceans experience pain, while bees demonstrate cultural behaviors, and plants may possess senses akin to ours.
Primatologist Christine Webb posits that the so-called “human dominance complex” lies at the root of our hierarchical view of nature. In The Arrogant Ape, she seeks to dismantle this perceived superiority through a compelling and meticulously researched examination based on a course she taught at Harvard. Webb traces this notion back to religious traditions and other human constructs, revealing how it misrepresents scientific understanding and accelerates ecological decline.
The belief in human uniqueness contradicts Darwin’s vision of continuity between species, in which species differ in degree rather than in kind. As Webb argues, the persistent emphasis on differences among species reflects a hidden bias in research.
This bias is apparent in our fascination with primates and “charismatic” mammals, which we tend to view as more relatable, while disregarding plants, fish, and the vast majority of Earth’s life. It also reveals itself in our inconsistent standards for evaluating animals. For instance, comparisons between human and chimpanzee intelligence often pit captive chimps against their wild counterparts, ignoring the limitations that captivity imposes.
Concerned about ethical issues surrounding captivity and its potential to skew research findings, Webb focuses exclusively on great apes in their natural and protected habitats. These profound interactions have shaped her belief that many non-human species likely possess some form of consciousness or “conscious life.”
Webb anticipates that critics may dismiss her views as anthropomorphism, labeling it a “serious scientific error.” However, she argues that the reluctance to acknowledge similarities between humans and other species complicates scientific inquiry and undermines its conclusions. She questions the certainty with which humans claim to understand consciousness beyond their own.
Dismantling these beliefs is crucial for appreciating the wonder and diversity of life, marking the first step towards a “radically humble approach.” By recognizing ourselves as fellow animals and integral to nature, we can confront the destructive forces of capitalism that fuel zoonotic diseases, mass extinctions, climate change, and ecosystem collapse.
Webb advocates for broadening the concept of “good science” to incorporate indigenous knowledge about the uniqueness and interconnection of all life forms. She acknowledges the immense challenge this poses, declaring that human exceptionalism is “the most pervasive implicit belief of our era.” Yet, she believes that unlearning this can foster a deeper connection to nature, spark awe, and inspire advocacy for both animal welfare and environmental protection. In The Arrogant Ape, she highlights this “stubborn ideology” and its detrimental impacts, modeling the humility, curiosity, and compassion essential for countering it.
Research led by scientist Hannah Long at the University of Edinburgh has found that specific regions of Neanderthal DNA are more effective at activating genes responsible for jaw development than those in humans, potentially explaining why Neanderthals had larger lower jaws.
Neanderthal. Image credit: Natural History Museum Trustees.
“With the Neanderthal genome being 99.7% identical to that of modern humans, the variations between species are likely to account for differences in appearance,” Dr. Long stated.
“Both human and Neanderthal genomes consist of roughly 3 billion characters that code for proteins and regulate gene expression in cells. Identifying the regions that influence appearance is akin to searching for a needle in a haystack.”
Dr. Long and her team had a targeted approach, focusing on a genomic area linked to the Pierre Robin sequence, a condition marked by an unusually small mandible.
“Individuals with the Pierre Robin sequence often have significant deletions or rearrangements in this portion of the genome that affect facial development and restrict jaw formation,” Dr. Long explained.
“We hypothesized that minor differences in DNA could produce more nuanced effects on facial structure.”
Upon comparing human and Neanderthal genomes, researchers discovered that in this segment, approximately 3,000 letters long, there are only three one-letter variations between the species.
This DNA region doesn’t code for genes but regulates when and how certain genes, particularly SOX9, which plays a crucial role in facial development, are activated.
To confirm that these Neanderthal-specific differences were significant for facial development, scientists needed to demonstrate that the Neanderthal version could activate genes in the appropriate cells at the right developmental stage.
They introduced both Neanderthal and human versions of this region into zebrafish DNA and programmed the cells to emit different colors of fluorescent protein based on the activation of either region.
By monitoring zebrafish embryo development, researchers observed that cells responsible for forming the lower jaw were active in both human and Neanderthal regions, with the Neanderthal regions showing greater activity.
“It was thrilling when we first noticed the activity of specific cell populations in the developing zebrafish face, particularly near the forming jaw, and even more exhilarating to see how Neanderthal-specific variations could influence activity during development,” said Dr. Long.
“This led us to contemplate the implications of these differences and explore them through experimental means.”
Recognizing that Neanderthal sequences were more effective at activating genes, the authors questioned whether this would lead to enhanced target activity affecting the shape and function of the adult jaw, mediated by SOX9.
To validate this idea, they gave zebrafish embryos extra copies of SOX9 and found that the cells involved in jaw formation occupied a larger area.
“Our lab aims to further investigate the effects of genetic differences using methods that simulate various aspects of facial development,” Dr. Long remarked.
“We aspire to deepen our understanding of genetic variations in individuals with facial disorders and improve diagnostic processes.”
“This study demonstrates how examining extinct species can enhance our knowledge of how our own DNA contributes to facial diversity, development, and evolution.”
The findings are published in the journal Development.
_____
Kirsty Utley et al. 2025: Neanderthal-derived variants enhance SOX9 enhancer activity in craniofacial progenitor cells, influencing jaw development. Development 152 (21): dev204779; doi: 10.1242/dev.204779
Fun faces a depression: As reported by The Cut, individuals are turning to AI to crack escape room puzzles and cheat at trivia nights. Is this not the essence of spoiling one’s own enjoyment? “It’s akin to entering a corn maze with the intent of taking a straight path to the exit,” remarked a TikToker featured in the article. The piece also features avid readers who rely on ChatGPT as a substitute for book clubs, sourcing “enlightening opinions and perspectives.” For one, everything was pleasant until a character’s demise disrupted the fantasy saga he was savoring (though, in truth, that seems rather grim).
Conversely, Substack appears to be filled with AI-produced essays. The newsletter platform is meant to be a vibrant hub for passionate creators to showcase their writing, so handing that work off to a bot feels like peak absurdity. Will Storr, who delves into storytelling, examines this unexpected trend and its implications. In his own Substack, he discusses the phenomenon of “impersonal universalism,” wherein grand statements may sound profound but fall flat. “Insight possesses a universality akin to white noise, wrapped in an unsettling vagueness that can cloud our thoughts,” he observes.
I find it puzzling how anyone can derive pleasure from using large language models (LLMs) to appear vaguely “intelligent” or engage in AI-altered hobbies. Yet, I believe this isn’t an existential threat posed by AI. It is crucial that we savor our experiences. Let robots take our jobs, but they shouldn’t steal our joy. I’m not here to dictate how others should find pleasure—I’m no authority on fun. If I were to teach you, it might very well come across like an AI-generated Substack (embracing nature, chatting with strangers, enjoying moments with loved ones). Yet, I often reflect on what genuinely makes me feel alive, as I seek to engage more in those activities. It becomes a personal defense against “impersonal universalism.”
First up: singing. While I wish AI could concoct melodic canons and create ethereal robot madrigals, it cannot replicate the whimsical joy of my quirky choir made up of very special individuals. We may not be the most skilled vocalists, but when we harmonize, we share a deep sense of connection (research indicates that group singing fosters quick social bonding). Occasionally, everything aligns for fleeting moments of breathtaking beauty under the humble guidance of our choir director, whose silent chef’s kiss says it all. Regardless, it remains delightful.
Next, let’s discuss not my own but someone else’s experiences. I find endless inspiration in the unique artifacts people treasure, acquire, and eventually discard. My regular visits to York’s weekly car boot sale reveal a captivating blend of stuffed badgers, Power Rangers merchandise, fishing gear, and a ceramic mouse in Victorian attire. More noble collectibles might include the textiles featured in Renaissance paintings: garments, tapestries, and drapes. Recently, I spent an exhilarating 10 minutes at The Frick Collection in New York, immersed in an astonishingly vacant room while studying Holbein’s Portrait of Thomas More, contemplating the feel of his fur collar and red velvet sleeves, pondering his choices.
A substantial portion of my joy stems from simply being present in nature. I stroll, dig in the soil, observe wildlife (yes, that includes birds), but predominantly, despite being a lifelong introvert, my delight comes from people. If I had to identify my most reliable source of happiness, it would be wandering through a new city, soaking in the lives of its inhabitants. What do they wear, consume, and discuss? What triggers their anger? What kind of dogs accompany them? It’s an endless buffet of human experience, from toddler tantrums to tender moments of affection to the play of queue dynamics. Recently, I watched the documentary I Am Martin Parr, about a photographer with a magpie’s eye for capturing the nuances of British life, and he clearly shares this sentiment. Now in his seventies, Parr is still eager to explore and document the marvelous and strange nuances of society. He declares, “I’m still thrilled to venture out and observe this chaotic world we inhabit.”
That is my secret. AI can offer a rote summary of who we are, but it mixes all our hues into a muddy shade. It cannot encapsulate the joy of something utterly unique.
The exploration of the dynamics within liberal democracies has typically emphasized economic, emotional, and educational influences. However, another factor plays a critical role: neurology.
Liberal democracies engage our cognitive processes differently than authoritarian regimes. Dictatorships provide a sense of predictability, exemplified by the fixed future Adolf Hitler laid out, while liberal democracies leave the future open to our choices, presenting it as a canvas we shape ourselves.
This is politically significant yet cognitively daunting. Historically, the future was dictated by a select few, prioritizing preservation over progress. The inherent ambiguity and adaptability of liberal democracy can challenge individuals neurologically, as uncertainty is a state the human mind often resists. Studies indicate that uncertainty about whether one will receive an electric shock triggers more anxiety than the certainty of receiving one, which helps explain the many historical attempts to diminish uncertainty through mechanisms like insurance and weather forecasting.
Your position on the spectrum of uncertainty tolerance is influenced by cultural background, age, and gender, as well as neurological factors. Research in political neuroscience reveals that conservative brains lean towards security, generally steering clear of conclusions that lack clarity. This tendency is associated with a larger amygdala, the brain region linked to threat detection, resulting in a heightened discomfort when confronted with the unfamiliar.
On the other hand, a liberal brain exhibits greater gray matter in the anterior cingulate cortex, a region involved in processing ambiguity. This anatomical difference enables liberals to tolerate uncertainty and confrontation more effectively. Liberal democracies can provide space for both perspectives under less stressful conditions. Although conservatives and liberals may have distinct neural predispositions regarding their preferences for the future, evolutionarily, all humans share the ability to envision multiple futures.
However, increased uncertainty can push some individuals beyond their comfort zones, particularly as the future of pressing issues—like environmental change, technology, and social norms—becomes less predictable. To cope with this anxiety, some individuals gravitate towards populist and authoritarian political leaders, committing to rigid decision-making and a black-and-white perspective. They often seek certainty—albeit a mere illusion—by rejecting innovations (such as medical advancements) or dismissing foreign cultures and religions, thus limiting uncertainty and suppressing potential futures. Shutting out ambiguity and its attendant anxiety in this way can create a calmer mindset for those affected.
This doesn’t imply a total surrender to an illiberal mindset. Instead, it underscores the necessity for liberal democracies to candidly inform their constituents that embracing liberalism may not come intuitively. Educational initiatives, public discourse, and civic engagement must draw on such brain-level insights to help overcome illiberal tendencies.
We must communicate the collective benefits of cooperation in various domains, including identity. Ultimately, only through collaboratively addressing the vulnerabilities inherent in our brains can we tackle the significant global challenges we face today.
Florence Gaub is the author of Future: Manual (Hurst, 2026). Riya Yu has authored Fragile Minds: The Neuropolitics of Divided Societies (Columbia UP).
It’s fair to state that the ancient human family tree has always been subject to revision. Take the Denisovans, for instance. These enigmatic ancient hominins were once primarily identified through mere bone fragments. However, in June, molecular analysis revealed that a peculiar skull from China belonged to the Denisovans, thus giving them a more defined identity.
Yet, not everyone is convinced. Anthropologist Christopher Bae, a professor at the University of Hawaii at Manoa, contests this finding, asserting that the skull more likely belongs to a species named Homo longi. Bae has been central to ongoing debates about our ancestral lineage. For over five years, he and his colleagues have advocated for the recognition of two ancient human species: Homo bodoensis and Homo juluensis.
These proposals have stirred debate, especially since Bay and his team have intentionally disregarded traditional naming conventions. He argues that such rules have become outdated, failing to accommodate the removal of names that are now considered offensive or unpronounceable. In a conversation with New Scientist, he elaborated on how his personal quest for identity fueled his passion for human evolution.
Michael Marshall: What initially encouraged you to explore the study of ancient humans?
Christopher Bae: The ultimate aim of paleoanthropology is to piece together a story even when not all of the pieces are available. The field resonates with me personally because I was adopted and have no concrete record of my own beginnings. I was born in South Korea, abandoned at around one year of age, spent six months in an orphanage, and was later taken in by an American family.
During my undergraduate studies, I had the opportunity to visit Korea for the first time as an exchange student. On this trip, I visited an adoption center in my hometown, inquiring if there was any possibility of locating my birth parents. Unfortunately, I was informed that my Korean name and birth date were not legitimate, and there was virtually no chance of finding them. That was a moment of resignation for me.
Although I was intrigued by my origins, I didn’t know how to pursue them. Then, I enrolled in an introductory biological anthropology course, which allowed me to navigate my curiosity about origins—almost like constructing my own foundation.
Two species frequently debated regarding our direct ancestry are Homo heidelbergensis and Homo rhodesiensis. In 2021, you joined a team proposing the substitution of these names with a new species, H. bodoensis. Could you elaborate on this?
My colleague Mirjana Roksandic from the University of Winnipeg and I discussed H. heidelbergensis at a 2019 anthropology conference. It became apparent that the species had become a “trash can taxon”: an easy classification for fossils that didn’t fit Homo erectus, Homo neanderthalensis, or Homo sapiens.
What are the implications?
If we discard H. heidelbergensis, the next valid name by priority is H. rhodesiensis. However, that name derives from Northern Rhodesia, now Zambia, which was itself named for the colonialist Cecil Rhodes. Are we comfortable naming a potential ancestor of modern humans after such a figure? So, in writing that paper, we decided to introduce a fresh name paying tribute to Bodo, a 600,000-year-old skull discovered in Ethiopia.
What response did your paper receive?
When we submitted the paper for peer review, half of the reviewers argued it had to be published because of its significance, while the other half deemed it entirely flawed. Unsurprisingly, reactions were just as polarized once it was released.
Have any workable agreements emerged from this debate?
We held a workshop in Novi Sad, Serbia in 2023, with approximately 16 to 17 paleoanthropologists participating. A consensus emerged around the notion that H. heidelbergensis is indeed an inappropriate taxonomic category. Another significant finding was that H. rhodesiensis should be excluded due to its colonial implications—remarkably, only one paleoanthropologist present believed otherwise.
The ICZN published a statement in 2023, indicating it “does not involve itself in removing names that may be ethically problematic.” This direction prompted us to challenge the ICZN. Editor’s note: The ICZN’s 2023 announcement recognized that some scientific names might be offensive but asserted it’s beyond their remit to weigh the morality of individuals honored in eponyms. Moreover, it stressed the necessity for zoologists to adhere to its ethical code while naming new species.
Are the names of species significant enough to merit conflict?
Yes and no. For instance, several beetles from Slovenian caves were named after Adolf Hitler in the 1930s by an Austrian entomologist, Oskar Scheibel. One species, Anophthalmus hitleri, has gained popularity as a collector’s item among neo-Nazis, which could lead to this innocent beetle’s extinction.
What alternatives do you propose?
I advocate working with local collaborators to choose species names that they find acceptable since they live with and experience these species regularly. Ideally, I believe we should refrain from using personal names for species, as this could lead to ongoing issues. Change is on the horizon; the ICZN is re-evaluating the inclusion of members from the Global South to provide them a stronger say. Recently, the American Ornithological Society voted to remove names with negative connotations associated with historical figures from their species designations.
Last year, you again disputed ICZN regulations concerning ancient human fossils excavated at a site, Xujiayao, in northern China. What occurred there?
In the 1970s, researchers uncovered hominin fossils representing more than ten individuals at the site, though only as separate pieces. My colleague Wu Xiujie from the Chinese Academy of Sciences has worked extensively on these fossils. Wu virtually reconstructed part of one skull, and when we saw it, we noted that it appeared distinctly different from other hominins of a comparable age.
What differentiates these specimens?
The variations lie in size and shape: the average modern human cranial capacity is around 1,300 to 1,500 cubic centimeters, whereas these fossils have cranial volumes between 1,700 cm³ and 1,800 cm³, significantly larger than typical humans. Shape analysis likewise indicated that the Xujiayao fossils grouped with those from another Chinese site, Xuchang, rather than with other hominins, leading us to propose a new species name.
Christopher Bae studies human fossils discovered in Serbia, potentially linked to Homo bodoensis
Christopher J. Bae
The name you ultimately selected has been met with criticism. Can you clarify the rationale behind it?
The origin of a species name matters. In this case, we could have opted for Homo xujiayaoensis, named after Xujiayao, in line with ICZN guidelines.
That would have been the conventional Latinised choice, but you found that option unsatisfactory?
The challenge is that only fluent Chinese speakers can pronounce it, and even spelling it correctly can be an issue. Names must be both pronounceable and memorable. So we came up with “julu”, which translates directly to “big head”. Adhering to ICZN guidelines then required Latinising it, and we ultimately settled on Homo juluensis.
How does your new species intersect with the enigmatic Denisovans, who inhabited what is now East Asia during the Stone Age?
If you compare the second molars from Denisova Cave with those from Xujiayao, they appear strikingly similar; the distinction is often so subtle that the Xujiayao and Denisova molars could pass for one another.
This year, another research team linked the same Denisovan fossils to a different ancient species, Homo longi, an assignment that has been positively received by numerous researchers.
Most ancient-hominin experts in China tend to side with our argument for H. juluensis, as do many Western scholars familiar with China’s fossil record.
Concerning the skull identified in June, researchers managed to extract ancient proteins from the H. longi specimen that matched known Denisovan fossils. What are your thoughts?
Most geneticists argue that protein analysis isn’t robust enough for accurate species identification. While it can differentiate between broader categories—like cats and dogs—its utility in distinguishing more nuanced levels is quite limited.
Replica of Denisova molars discovered in Denisova Cave in 2000
Tilo Parg CC BY-SA 3.0
Do you still consider H. longi a legitimate species?
I personally appreciate H. longi and the fossils associated with it. The debate revolves around which other fossils should be allocated to longi or juluensis. Interestingly, advocates for longi are attempting to consolidate all of the fossils under that designation, despite the evident morphological diversity present in the Chinese record.
Many paleoanthropologists have expressed strong criticism of your research. How do you and your colleagues respond to this?
Over time, we’ve developed thick skin about our work.
Depiction of a teenage girl with a Neanderthal mother and a Denisovan father
John Bavaro Fine Art/Science Photo Library
This marks the second occasion researchers have successfully retrieved the complete genome of Denisovans, an ancient human lineage that inhabited Asia. The DNA was sourced from a tooth estimated to be 200,000 years old, discovered in a Siberian cave.
The genome indicates that there were at least three distinct groups of Denisovans, each with unique histories. It also suggests that early Denisovans intermixed with an unidentified ancient human group as well as a previously unknown Neanderthal population.
“This research is groundbreaking,” asserts David Reich from Harvard University.
“This study significantly broadened my perspective on the Denisovan ecosystem,” states Samantha Brown from the National Center for Human Evolution Research in Spain.
Denisovans were first described solely via their DNA. Finger bones retrieved from Denisova Cave in Siberia exhibited DNA distinct from both modern humans and Neanderthals found in western Eurasia. Genomic analysis indicates Denisovans mated with modern humans, with populations in Southeast Asia, including the Philippines and Papua New Guinea, carrying Denisovan DNA.
Since their initial discovery in 2010, researchers have found that Denisovans also ranged across East Asia. In June, a skull unearthed in Harbin, China, was confirmed as Denisovan through molecular evidence, providing the first insight into their physical appearance. However, despite DNA fragments being recovered from various specimens, only the original specimen had yielded a high-quality genome.
Researchers led by Stéphane Peyrégne at Germany’s Max Planck Institute for Evolutionary Anthropology have now obtained a second high-quality Denisovan genome. (Peyrégne declined to comment because the study is still awaiting peer review.)
In 2020, a team of researchers discovered a male Denisovan molar tooth and sequenced its entire genome from the preserved DNA.
The researchers estimated that this individual lived around 205,000 years ago, judging by the number of genetic mutations in comparison with other ancient human genomes. This timeframe aligns with dating showing that the deposits containing the tooth are between 170,000 and 200,000 years old. The other high-quality Denisovan genome, by contrast, belongs to an individual who lived between 55,000 and 75,000 years ago, so the new genome reveals an earlier chapter in Denisovan history.
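The mutation-counting logic behind such a molecular-clock estimate can be sketched with toy numbers; the rate, site count, and mutation shortfall below are illustrative assumptions, not figures from the study:

```python
# Toy molecular-clock estimate: date an ancient genome by its "missing" mutations.
# All constants here are illustrative assumptions, not values from the study.

MUTATION_RATE = 0.5e-9   # assumed mutations per site per year
GENOME_SITES = 1.0e9     # assumed number of comparable sites in the genome

def age_before_reference(missing_mutations, reference_age_years=0):
    """An older genome had less time to accumulate mutations than a
    present-day reference; the shortfall converts into extra age."""
    extra_years = missing_mutations / (MUTATION_RATE * GENOME_SITES)
    return reference_age_years + extra_years

# A genome missing ~102,500 mutations relative to present-day humans
# would date to roughly 205,000 years ago under these assumptions.
print(round(age_before_reference(102_500)))
```

In practice, the calibration is far more involved (branch-shortening against multiple ancient genomes, uncertainty in the mutation rate), but the underlying arithmetic is this simple proportionality.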
Based on comparisons of the genomes and DNA fragments recovered from Denisova Cave, the researchers suggest that at least three distinct Denisovan populations existed. The oldest group included the newly sequenced individual. Many millennia later, a second group supplanted this earlier population in Denisova Cave.
“Comprehending how early Denisovans were supplanted by subsequent groups underscores pivotal events in human history,” says Qiaomei Fu from the Institute of Vertebrate Paleontology and Paleoanthropology in China.
A third group, absent from the cave, still interbred with modern humans as suggested by genetic testing. Thus, all Denisovan DNA present in modern humans derives from a Denisovan group about which little is known.
The new genome also shows that Denisovans mated repeatedly with Neanderthals, who lived in and around Denisova Cave. Notably, the genome contained traces of Neanderthals who lived between 7,000 and 13,000 years before the Denisovan individual. These traces do not match any known Neanderthal genome, indicating that the Denisovans interbred with a Neanderthal group yet to be sequenced.
Moreover, it’s probable that Denisovans also mated with an as-yet-unidentified ancient human group that had evolved independently of both Denisovans and modern humans for hundreds of thousands of years. One candidate is Homo erectus, the earliest known human species to migrate out of Africa and reach regions as far as Java, Indonesia. However, no H. erectus DNA has ever been retrieved, so certainty remains elusive.
“It’s endlessly fascinating to uncover these new populations,” Brown remarked.
Several hominid species — Australopithecus africanus, Paranthropus robustus, early Homo, Gigantopithecus blacki, Pongo, Papio, Homo neanderthalensis, and Homo sapiens — show evidence of significant lead exposure spanning two million years, as revealed by a new analysis of fossilized teeth collected from Africa, Asia, Oceania, and Europe. This finding challenges the notion that lead exposure is merely a contemporary issue.
Lead exposure affecting modern humans and their ancestors. Image credit: J. Gregory/Mount Sinai Health System.
Professor Renaud Joannes-Boyau from Southern Cross University remarked: “Our findings indicate that lead exposure has been integral to human evolution, not just a byproduct of the industrial revolution.”
“This suggests that our ancestors’ brain development was influenced by toxic metals, potentially shaping their social dynamics and cognitive functions over millennia.”
The team used a carefully validated laser-ablation microspatial sampling technique to analyze 51 fossil samples from around the world, encompassing species including Australopithecus africanus, Paranthropus robustus, early Homo, Gigantopithecus blacki, Pongo, Papio, Homo neanderthalensis, and Homo sapiens.
Signs of transient lead exposure were evident in 73% of the specimens analyzed (compared with 71% of the modern human specimens). These included Australopithecus, Paranthropus, and Homo fossils.
Some of the oldest samples, from Gigantopithecus blacki individuals dated to around 1.8 million years ago in the early Pleistocene and 1 million years ago in the mid-Pleistocene, displayed recurrent lead exposure events interspersed with periods of little to no lead uptake.
To further explore the impact of ancient lead exposure on brain development, researchers also conducted laboratory studies.
Australopithecus africanus. Image credit: JM Salas / CC BY-SA 3.0.
Using human brain organoids (miniature brain models grown in the lab), researchers examined the effects of lead on a crucial developmental gene named NOVA1, which modulates gene expression during neurodevelopment.
The modern iteration of NOVA1 has undergone changes distinct from those seen in Neanderthals and other extinct hominins, with the reasons for this evolution remaining unclear until now.
In organoids carrying the ancestral version of NOVA1, exposure to lead significantly altered neural activity related to FOXP2, a gene involved in the function of brain regions critical for language and speech development.
This effect was less pronounced in organoids carrying the modern NOVA1 variant.
“These findings indicate that our variant of NOVA1 might have conferred a protective advantage against the detrimental neurological effects of lead,” stated Alysson Muotri, a professor at the University of California, San Diego.
“This exemplifies how environmental pressures, such as lead toxicity, can drive genetic evolution, enhancing our capacity for survival and verbal communication while also affecting our susceptibility to contemporary lead exposure.”
An artistic rendition of a Gigantopithecus blacki herd in the forests of southern China. Image credit: Garcia / Joannes-Boyau, Southern Cross University.
Genetic and proteomic analyses in this study revealed that lead exposure in archaic variant organoids disrupts pathways vital for neurodevelopment, social behavior, and communication.
Alterations in FOXP2 activity indicate a possible correlation between ancient lead exposure and the advanced language abilities found in modern humans.
“This research highlights the role environmental exposures have played in human evolution,” stated Professor Manish Arora from the Icahn School of Medicine at Mount Sinai.
“The insight that exposure to toxic substances may conjure survival advantages in the context of interspecific competition introduces a fresh perspective in environmental medicine, prompting investigations into the evolutionary origins of disorders linked to such exposures.”
For more information, refer to the study published in the journal Science Advances.
_____
Renaud Joannes-Boyau et al. 2025. Effects of intermittent lead exposure on hominid brain evolution. Science Advances 11 (42); doi: 10.1126/sciadv.adr1524
Three distinguished scientists (two from the U.S. and one from Japan) have been awarded the Nobel Prize in Medicine for their pivotal discoveries related to peripheral immune tolerance.
Mary E. Brunkow, Fred Ramsdell, and Shimon Sakaguchi were jointly recognized for breakthroughs that “have invigorated the field of peripheral tolerance and contributed to the advancement of medical treatments for cancer and autoimmune disorders,” as stated in a news release by the Nobel Committee. The three recipients will share a prize of 11 million Swedish kronor (approximately $1.2 million).
“This could also enhance the success rates of organ transplants. Several of these therapies are currently in clinical trials,” he noted.
Autoimmune diseases may arise when T cells, which serve as the body’s main defense against harmful pathogens, malfunction.
Their collective discovery establishes an essential foundation for understanding a second mechanism by which the immune system restrains itself, known as peripheral tolerance.
To mitigate damage, our bodies attempt to eliminate malfunctioning T cells within the thymus, a lymphoid organ, through a mechanism termed central tolerance.
The groundbreaking research began in 1995 when Sakaguchi, a prominent professor at the Immunology Frontier Research Center at Osaka University in Japan, uncovered a previously unknown class of immune cells that defend against autoimmune diseases.
Six years later, in 2001, Mary Brunkow, who now serves as a senior program manager at the Institute for Systems Biology in Seattle, along with Ramsdell, a scientific advisor to Sonoma Biotherapeutics in San Francisco, identified a specific genetic mutation responsible for a severe autoimmune disease known as IPEX.
They designated this gene Foxp3.
By 2003, Sakaguchi had shown that the Foxp3 gene governs the development of the immune cells he had identified nearly a decade earlier. These cells are now known as regulatory T cells, which monitor other T cells and keep them from malfunctioning.
“Their discoveries were vital for understanding the immune system’s functioning and why serious autoimmune diseases don’t affect everyone,” remarked Olle Kämpe, Chair of the Nobel Committee.
Thomas Perlmann, Secretary-General of the Nobel Committee, announced the award on Monday morning, noting that he had only been able to reach Sakaguchi.
“I reached him in his lab, and he expressed immense gratitude, saying it was a tremendous honor. He was quite moved by the news,” Perlmann said.
The awards ceremony is scheduled for December 10th, coinciding with the anniversary of Alfred Nobel’s death, a Swedish industrialist who founded the award to honor individuals who have significantly contributed to humanity. The inaugural award was revealed in 1901, marking the fifth anniversary of his passing.
The Nobel Prize in Physiology or Medicine was announced in Stockholm at the Karolinska Institute on Monday, with the prizes for Physics, Chemistry, and Literature to follow on the ensuing days.
Approximately 12,000 years ago, during the Pleistocene-Holocene transition, humans navigated a network of seasonal waters in Northern Arabia, marking significant locations and access routes with monumental rock carvings of camels, ibex, wild equids, gazelles, and aurochs.
Jebel Arnaan rock art panel. Image credit: Maria Guagnin.
As part of the Green Arabia Project, archaeologist Michael Petraglia from Griffith University and his team have uncovered more than 60 rock art panels featuring 176 engravings at three previously unexplored locations.
The engravings predominantly illustrate camels, ibex, equids, gazelles, and aurochs, including 130 life-size, naturalistic figures, the largest more than 3 meters across and over 2 meters tall.
This sculptural activity occurred between 12,800 and 11,400 years ago, a time when seasonal water bodies re-emerged following a period of severe aridity.
These water sources, identified through sediment analysis, facilitated early human migration into the interior desert and offered rare survival opportunities.
“These large-scale engravings are not just rock art; they likely represent assertions of presence, access, and cultural identity,” noted Dr. Maria Guagnin, an archaeologist at the Max Planck Institute.
“Rock art marks water sources and movement routes, potentially indicating territorial rights and intergenerational memory,” added Dr. Ceri Shipton, an archaeologist at University College London.
In contrast to previously known sites where sculptures were hidden in crevices, the Jebel Mleiha and Jebel Arnaan panels were carved on the face of a towering 39-meter cliff, making them visually dominant.
One panel required ancient artists to ascend narrow ledges to create their work, emphasizing the effort and significance attributed to the imagery.
Various artifacts, including Levantine-style El-Khiam and Helwan stone points, green pigments, and dentalium beads, indicate extensive connections to Pre-Pottery Neolithic (PPN) populations in the Levant.
Nevertheless, the size, content, and arrangement of these Arabian sculptures distinguish them from others.
“This unique form of symbolic representation reflects a distinct cultural identity evolved to thrive in harsh, arid environments,” stated Dr. Faisal Al Ghibrien, a heritage researcher at the Saudi Ministry of Culture.
“The project’s interdisciplinary approach aims to bridge significant gaps in the Northern Arabian archaeological record between the last Glacial Maximum and the Holocene, shedding light on the resilience and innovation of early desert communities,” remarked Dr. Petraglia.
The team’s paper has been published in the journal Nature Communications.
____
M. Guagnin et al. 2025. Monumental rock art indicates that humans thrived in the Arabian desert during the Pleistocene-Holocene transition. Nature Communications 16, 8249; doi: 10.1038/s41467-025-63417-y
Since Madeline Lancaster created the first brain organoids in 2013, these structures have become invaluable in brain research worldwide. But what are they really? Are they simply miniaturized brains? Could implanting them into animals yield a super-intelligent mouse? Where do we draw the ethical line? Michael Le Page explored these questions at Lancaster’s lab at the MRC Laboratory of Molecular Biology in Cambridge, UK.
Michael Le Page: Can you clarify what a brain organoid is? Is it akin to a mini brain?
Madeline Lancaster: Not at all. There are various types of organoids, and they are not miniature brains. We model specific parts of the human brain, and our organoids are small and immature; they don’t function like a developed human brain with memories. In scale, they’re comparable to insect brains, though they lack the organized structures present even in those brains, so I would liken them more to insect-scale neural tissue.
What motivated you to create your first brain organoid?
I initiated the process using mouse embryonic brain cells, cultivating them in Petri dishes. Some cells didn’t adhere as expected, leading to a fascinating outcome where they interconnected and formed self-organizing cell clusters indicative of early brain tissue development. The same technique was then applied to human embryonic stem cells.
Why is the development of brain organoids considered a significant breakthrough?
The human brain is vital to our identity and remained enigmatic for a long time. Observing a mouse brain doesn’t capture the intricacies of the human brain. Brain organoids have opened a new perspective into this complex system.
Can you provide an example of this research?
One of our initial ventures involved modeling a condition called microcephaly, in which the brain is undersized. In mice, the equivalent mutations don’t shrink the brain. We tested whether we could replicate the size reduction in human brain organoids, and we succeeded, enabling further insights into the disease.
Madeline Lancaster in her lab in Cambridge, UK
New Scientist
What has been your most significant takeaway from studying brain organoids?
We are gaining a better understanding of what distinguishes the human brain. I’m fascinated by the finding that human stem cells which generate neurons behave differently from those in mice and chimpanzees. One key difference is that human development is notably slower, allowing for more neurons to be produced as our stem cells proliferate.
Are there practical outcomes from this research?
Much of our foundational biology research has crucial implications for disease treatment. My lab primarily addresses evolutionary questions, particularly genetic variances between humans and chimpanzees. Specific genes that arise are often linked to human disorders, implying that mutations essential for brain development could lead to significant damage.
What types of treatments might emerge from this work in the future?
We’re already utilizing brain organoids for drug screening. I’m especially optimistic about their potential in treating mental health conditions and neurodegenerative diseases, where novel therapies are lacking. Currently, treatments for schizophrenia utilize medications that are five decades old. Brain organoid models could unveil new approaches. In the longer term, organoids might even provide therapeutic options themselves. While not for all brain areas, techniques have already been developed to create organoids of dopaminergic neurons from the substantia nigra, which are lost in Parkinson’s, for potential implantation.
Are human brain organoids already being implanted in animal brains?
Yes, but not for treatment purposes; rather, these practices enhance human organoid research. Organoids usually lack vascularity and other cell types from outside the brain, especially microglia, which serve as the brain’s immune cells. Thus, to examine how these other cells interact with human brain matter, various studies have implanted organoids into mice.
Should we have concerns regarding the implantation of human organoids in animals?
Neurons are built to connect with one another. So, when a human brain organoid is implanted into a mouse brain, the human cells will form connections with mouse neurons. However, they aren’t organized coherently. These mice show diminished cognitive performance after implantation, more akin to a brain lesion than an enhancement; they won’t become super-intelligent.
Coloured microscope images of brain organoids, showing their neural connections
MRC Laboratory of Molecular Biology
Is cognitive enhancement a possibility?
We’re quite a distance from that. Higher-level concepts relate to how different brain regions interlink, how individual neurons connect, and how collections of neurons communicate. Achieving an organized structure like this could be possible, but challenges like timing persist. While mice have a short lifespan of about two years, human development toward advanced intelligence takes significantly longer. Furthermore, the sheer size of human brains presents challenges; a human-sized brain cannot fit within a mouse. Because of these factors, I don’t foresee such concerns emerging in the near future.
Regarding size, the main limitation is the absence of blood vessels. Organoids start to die off when they exceed a few millimeters. How much headway has been made in addressing this issue?
While we’ve made strides and should acknowledge our accomplishments, generating brain tissue is relatively straightforward as it tends to develop autonomously. Vascularization, however, is complex. Progress is being made with the introduction of vascular cells, but achieving fully functional blood perfusion remains a significant hurdle.
When you reference ‘far away’…
I estimate it could take decades. It may seem simple, given that the body accomplishes this naturally. However, the challenges arise from the body’s integrated functioning. Successfully vascularizing organoids requires interaction with a whole organism; we can’t replicate this on a plate.
If we achieve that, could we potentially create a full-sized brain?
Even if we manage to develop a large, vascularized human brain in a lab, without communication or sensory input it would lack meaningful function. For instance, if an animal’s eyes are kept shut during development and opened later, they may appear functional, but the brain can’t interpret the visual input, rendering the animal effectively blind. This principle applies to all senses and interactions with the world. I believe a body with sensory experiences is necessary for awareness to develop. Certain patients who lose sensory input can end up with locked-in syndrome, an alarming condition, but these are individuals who have previously engaged with the world. A brain that has never engaged lacks that context.
As brain organoid technology progresses, how should we define the boundaries of ethical research?
The field closely intersects with our understanding of consciousness, which is complex and difficult to measure. I’m not even certain I have the definitive answer about consciousness for myself. However, we can undoubtedly assess factors relevant to consciousness, like organization, sensory inputs and outputs, maturity, and size. Mice might meet several of these criteria but are generally not recognized to possess human-like consciousness, largely due to their size. Even fully interconnected human organoids won’t achieve human-level consciousness if they remain small. Establishing these kinds of standards offers more practical methods than attempting to directly measure consciousness.
In the lab, the genetic identity of a human egg cell can be modified
Science Photo Library / Aramie
Human embryos have been created from eggs carrying the DNA of adult skin cells, a feat previously achieved only in mice. The advance may offer a pathway for same-sex couples or women facing fertility challenges to have biologically related children.
Researchers have previously cloned animals by substituting the nucleus of an egg cell with the nucleus of a somatic cell, such as a skin cell. However, beyond the legal hurdles surrounding human cloning, many couples want children that carry genes from both partners, which requires both sperm and eggs, says Shoukhrat Mitalipov of Oregon Health and Science University.
The complication is that eggs and sperm are haploid, containing only one set of chromosomes. The challenge lies in halving the complete set of chromosomes found in cells such as skin cells, while retaining a viable combination of the original genes.
Females develop all of their eggs while still in the womb, where the progenitor cells initially containing 46 chromosomes undergo a complicated process of replication, mixing, and division to reduce to 23 chromosomes.
Mitalipov was intrigued by the possibility of employing natural chemical processes that facilitate chromosomal division in mature human eggs both before and after fertilization to replicate this process in his laboratory.
Having achieved this with mice, Mitalipov and his team are now trialing the method with human cells. They started by extracting the nuclei from hundreds of eggs donated by healthy women, which had been arrested at a specific developmental stage linked to chromosomal division. Next, the nuclei of skin cells, known as fibroblasts, from healthy female volunteers were inserted into these eggs. Microscopic images showed the chromosomes aligned on the spindle, the internal structure necessary for chromosomal separation.
The team then injected sperm from a healthy donor to fertilize some of the eggs, utilizing a method akin to that employed in creating babies using third-party mitochondrial DNA, which can also minimize the risk of specific genetic disorders.
This injection typically triggers the egg to complete chromosome selection and eject its duplicate DNA, preparing it to receive the incoming set from the sperm. In the skin-derived eggs, however, this process stalled: chromosomes aligned but did not separate. The researchers therefore tried again with a new batch of fertilized eggs, applying an electrical pulse that allowed calcium to surge into the egg, emulating the natural signal triggered when sperm contacts the egg’s outer layer, along with an incubation period with a drug to rouse the eggs from their pre-fertilization dormancy.
Through a series of trials, the researchers successfully halved the chromosome counts in the eggs, discarding any excess. By the conclusion of the experiment, 9% of the fertilized eggs had developed into blastocysts, the dense clusters of cells that form about five to six days after fertilization and are typically transferred into the uterus during IVF treatment. However, the team did not transfer the blastocysts or sustain them beyond six days.
Despite the progress made, the combinations of genes forming the remaining chromosomes appeared particularly susceptible to defects. “I believe this method is still in its early stages and is not presently suitable for clinical applications,” stated Mitinori Saitou from Kyoto University in Japan.
Katsuhiko Hayashi from Osaka University noted that while the techniques are “very sophisticated and organized,” they remain “inefficient and potentially hazardous for immediate clinical use.” Nevertheless, he said the team has achieved a “substantial breakthrough in reducing the human genome.” “This advancement will herald new technologies,” he stated.
Mitalipov acknowledged the validity of the criticisms, emphasizing that his team is actively working to address the existing flaws. “At the end of the day, we’re making progress, but we aren’t there yet,” he remarked.
Is artificial intelligence poised to dismantle the SDH [subtitles for the deaf and hard of hearing] industry? While SDH remains the standard subtitle format across most platforms, the individuals behind it raise a valid concern as the sector, like many creative fields, faces increasing devaluation in the AI era. “SDH is an art; the industry often overlooks this. Many see it merely as transcription,” remarked Max Deryagin, chairman of Subtle, a nonprofit association for freelance subtitlers and translators.
<p class="dcr-130mj7b">While AI promises to streamline subtitle creation, it misses the mark, according to Meredith Cannella, a committee member. "There's a notion that AI tools mean we should work less. Yet, having spent 14-15 years in this field, I can attest that the time taken to complete projects has not changed significantly over the past five to six years."</p>
<p class="dcr-130mj7b">"Automatic transcription shows some positive advancements," Cannella adds. However, the overall efficiency is not a net gain compared with previous software, as extensive corrections are necessary.</p>
<figure id="8a1af689-3e30-498c-9401-81b7bc4d8a2d" data-spacefinder-role="inline" data-spacefinder-type="model.dotcomrendering.pageElements.ImageBlockElement" class="dcr-173mewl">
<figcaption data-spacefinder-role="inline" class="dcr-fd61eq">
<span class="dcr-1qvd3m6">"You can't overwhelm your audience"... Barbie's open caption screening for deaf and hard of hearing audiences in Westwood, California in 2023.</span> Photo: Allen J. Schaben/Los Angeles Times/Getty Images
</figcaption>
</figure>
<p class="dcr-130mj7b">Moreover, the quality of AI-generated SDHs is often subpar, requiring significant effort to meet standards. Unfortunately, human subtitlers frequently find themselves taking on "quality control" roles with minimal compensation. Many in the field state that earning a sustainable income is currently a challenge.</p>
<p class="dcr-130mj7b">"The fees for SDH work were never great, but they've dropped to a point where it's hardly worth the effort," says Rachel Jones, an audiovisual translator and committee member. "This seriously undermines the value we provide."</p>
<p class="dcr-130mj7b">This value is crucial. "For those who are deaf or hard of hearing, subtitles are an essential service," says Teri Devine, associate director of inclusion at the Royal National Institute for Deaf People (RNID).</p>
<aside data-spacefinder-role="supporting" data-gu-name="pullquote" class="dcr-19m4xhf">
<blockquote class="dcr-zzndwp">The same sound can mean a million different things. As humans, we interpret how it should feel.</blockquote>
</aside>
<p class="dcr-130mj7b">Deaf and hard of hearing communities are diverse, meaning subtitles must accommodate various needs in crafting SDH. Jones explains, "While some believe that naming songs in subtitles is pointless, others might resonate with it because of the song's title."</p>
<p class="dcr-130mj7b">Subtitles involve numerous creative and emotion-driven choices—qualities AI currently lacks. When Jones first watches a show, she notes her emotional reactions to sounds and determines how best to express those in words. She then decides which sounds to subtitle and which may be excessive: "You can't overwhelm the audience," she points out. It's a delicate balancing act. "I want to avoid over-explaining everything to the viewers," Cannella adds.</p>
<figure id="880e1917-c3ac-492b-829a-737f8a57f715" data-spacefinder-role="supporting" data-spacefinder-type="model.dotcomrendering.pageElements.ImageBlockElement" class="dcr-a2pvoh">
<figcaption data-spacefinder-role="inline" class="dcr-9ktzqp">
<span class="dcr-1qvd3m6">"Algorithms cannot replicate the level of professional work."</span> Photo: Milan Sulkara/Alamy
</figcaption>
</figure>
<p class="dcr-130mj7b">AI struggles to discern which sounds are crucial. "It’s far from achieving that now," Deryagin notes, emphasizing the importance of understanding the broader context of a film rather than just individual images or scenes. For instance, in *Blow Out* (1981), a mysterious sound recurs, enhancing viewers' understanding of the main plot points. "SDH must create these connections rapidly without over-informing the audience initially," he explains. "The same sound can have countless meanings, and as a human, it’s my job to interpret those nuances."</p>
<p class="dcr-130mj7b">"You can't simply feed an algorithm a soundtrack and expect it to get it right. Providing metadata will not bridge the gap to professional quality."</p>
<p class="dcr-130mj7b">Netflix offered a glimpse into its SDH process, known for *Stranger Things* subtitles such as "[Eleven pants]" or "[Tentacles squelching wetly]", in an <a href="https://www.netflix.com/tudum/articles/stranger-things-season-4-captions" data-link-name="in body link">interview with the subtitler</a>. The company declined to comment further on AI in subtitle production. The BBC told the *Guardian* that "we do not use AI for TV subtitles," though much of that work was outsourced to Red Bee Media last year. Red Bee Media <a href="https://www.redbeemedia.com/news/red-bee-medias-artificial-intelligence-captioning-workflows-bring-costs-down-for-network-10/" data-link-name="in body link">has announced</a> the use of AI to create SDH for Australia's Network 10.</p>
<p class="dcr-130mj7b">Jones notes that linguists and subtitlers aren't inherently opposed to AI, but at this juncture, it complicates rather than simplifies their work. "In every industry, AI tends to replace the creative aspects that bring us joy, rather than alleviating the tedious tasks that we’d rather avoid," she concludes.</p>
For the first time, real-time footage of human embryos being implanted into an artificial uterus has been recorded.
This remarkable achievement, published in the journal Science Advances, offers an unparalleled glimpse into one of the crucial stages of human development.
Implantation failure is a leading cause of infertility, responsible for 60% of miscarriages. Researchers aim to enhance understanding of the implantation process to improve fertility results in both natural conception and in vitro fertilization (IVF).
“We can’t observe this process, because implantation takes place inside the mother,” Dr. Samuel Ojosnegros, head of bioengineering at the Institute for Bioengineering of Catalonia (IBEC) and the lead author of the study, told BBC Science Focus.
“Thus, we required a system to observe how it functions and to address the primary challenges to human fertility.”
Implantation marks the initial phase of pregnancy, where the fertilized egg (developing embryo) attaches to the uterine lining, allowing it to absorb nutrients and oxygen from the mother—vital for a successful pregnancy.
To investigate this process, the research team developed a platform that simulates the natural uterine lining, utilizing a collagen scaffold combined with proteins essential for development.
The study then examined how human and mouse embryos implant onto this platform, uncovering significant differences. Unlike mouse embryos that adhere to the uterine surface, human embryos penetrate fully into the tissue before growing from within.
Video showing the implantation process of mouse embryos (left) and human embryos (right).
“Human embryos are highly invasive,” said Ojosnegros. “They dig a hole in the matrix, embed themselves, and then grow internally.”
The footage indicated that the embryo exerts considerable force on the uterus during this process.
“We observed that the embryo pulls, moves, and rearranges the uterine matrix,” stated Dr. Amélie Godeau, co-first author of the research. “It also responds to external force cues. We hypothesize that contractions in vivo may influence embryo transfer.”
According to Ojosnegros, the force applied during this stage could explain the pain and bleeding many women experience during implantation.
Researchers are currently focused on enhancing the realism of implantation platforms, including the integration of living cells. The goal is to establish a more authentic view of the implantation process, which could boost the likelihood of success in IVF, such as by selecting embryos with better implantation potential.
“We understand more about the development of flies and worms than our own species,” remarked Ojosnegros. “So enjoy watching the film.”
Left: The remains of a middle-aged woman at the Liu Po site in southern China, where smoke was used before burial approximately 8,000 years ago. Right: Contemporary smoke-dried mummies of Dani individuals in West Papua, Indonesia.
Zhen Li, Hirofumi Matsumura, Hsiao-Chun Hung
Human bodies carefully preserved by smoking up to 14,000 years ago have been found at archaeological sites in Southeast Asia, making them the world’s oldest known mummies.
This custom continues today among the Dani people in West Papua, Indonesia, who mummify their deceased relatives by exposing them to smoke and treat them with care and respect as part of the household. Many of these mummies are found in a tightly bound squatting position.
Similar “highly flexed” ancient remains have also been discovered in Australia, China, the Philippines, Laos, Thailand, Malaysia, South Korea, and Japan.
Hsiao-Chun Hung from the Australian National University in Canberra first noticed striking similarities between ancient skeletons she was excavating in Vietnam in 2017 and the smoked mummies of the Dani tradition.
Hung and her team analyzed the burial practices of 54 hunter-gatherers from 11 archaeological sites across Southeast Asia dated between 12,000 and 4,000 years ago to uncover evidence of smoking. Most sites were based in northern Vietnam and southern China.
Numerous remains displayed clear signs of partial burning, though not enough to indicate cremation. The researchers utilized two analytical methods, X-ray diffraction and infrared spectroscopy, on several bone samples to assess thermal exposure.
Over 90% of the 69 skeletal samples displayed indications of heat exposure. The findings suggest the remains were not subjected to extreme temperatures but instead endured low heat, potentially from smoking over extended periods.
The oldest mummy examined by the team, from the Vietnamese site of Hang Cho, dates back over 11,000 years. However, a tightly bound skeleton from another site, Hang Mooy, points to practices dating back over 14,000 years. “We didn’t need X-rays or infrared to analyze this one because it’s evidently partially burned and visible to the naked eye,” explains Hung.
Previously, the oldest known mummies were thought to be those from northern Chile, dating back approximately 7,000 years, and ancient Egypt, dating back around 4,500 years.
Hung suggests that the evidence indicates this burial tradition arose across southern China and Southeast Asia at least 14,000 years ago and persisted until agricultural societies became prevalent in the region around 4,000 to 3,500 years ago. The constricting bindings of mummified bodies may have facilitated their transport, she notes.
Ethnographic studies indicate that these traditions persisted in southern Australia until the late 19th and early 20th centuries, according to Hung. “Additionally, our review of the literature reveals that in the New Guinea highlands these practices continue among some communities today.”
“Our results signify a unique blend of techniques, traditions, culture, and a profound connection to ancestry that spans an extraordinary timeframe, covering vast regions from the Paleolithic era to the present,” she states.
Vito Hernandez from Flinders University in Adelaide suggests that this study challenges long-standing beliefs that such practices were exclusive to arid regions like Chile’s Atacama and Egypt’s Nile Valley. “It highlights the role of tropical environments in nurturing distinct mortuary traditions among early modern humans across the Far East and potentially into the Pacific,” he remarks.
“By extending the timeline of mummification at least 5,000 years beyond that of the Chinchorro culture [of South America], the findings emphasize Southeast Asia’s role as a center for cultural innovation, demonstrating a deep continuity that connects early Holocene hunter-gatherers with present-day indigenous groups in New Guinea and Australia,” Hernandez adds.
A dramatic reconstruction of early modern Homo sapiens in Africa
BBC/BBC Studios
human
Available on BBC iPlayer (UK); PBS in the US (from September 17)
Based on my observations, science documentaries often fall into two categories, akin to French and Italian cuisines. (Hear me out before you judge that comparison.) The first category employs intricate techniques for a deep experience. The second is more straightforward, allowing the content to shine naturally.
Both documentary styles can yield impressive results in their own ways. human, a five-part BBC series exploring the roots of our genus, Homo, undoubtedly fits into the latter category. It weaves together compelling stories, stunning visuals, and the charismatic presence of paleontologist Ella Al Shamahi, inviting viewers to embark on a heartfelt journey through six million years of our human history. No flashy add-ons are necessary.
The first episode delves into complex inquiries. When exactly did our species emerge? Multiple perspectives yield varying answers. Was it 300,000 years ago, when humans began to exhibit features resembling ours? Was it when our skulls, as Al Shamahi describes, became smoother and more rounded? Or, more poetically, when we developed remarkable traits like intricate language, abstract thought, and cooperative behavior?
“The series intertwines fascinating narratives, stunning visuals, and the captivating presence of Ella Al Shamahi”
It’s an engaging episode, particularly when the narrative shifts to other extinct human species. For instance, Al Shamahi’s exploration of Indonesia introduces us to Homo floresiensis, a meter-tall human uniquely adapted to life on Flores. The discovery of these “hobbits” in the Liang Bua Caves two decades ago reshaped our understanding of ancient human biology. Their small brains provide insights into tool use, with their long arms and short stature diverging from other human species.
Episode three highlights the fate of our most famous relative, the Neanderthals. As we spread into Europe and Asia, they adapted to colder climates but ultimately faced extinction.
Throughout the series, Al Shamahi showcases remarkable paleontological discoveries made over recent decades (many of which you may have read about in New Scientist). For instance, rainbow feathers from birds like the red kite have garnered interest regarding their significance to Neanderthals. Meanwhile, the perikymata, growth lines in tooth enamel, confirm that H. sapiens experienced extended childhoods, giving our cognitive capacity time to develop.
Over just five episodes, human cannot cover every aspect of our evolutionary story. Yet, it illuminates how H. sapiens has been shaped by climate influences, the flora and fauna that provide for us, other human species, and collaborative nomadic groups that shared skills, knowledge, and DNA, allowing us to thrive and eventually build cities.
The standard telling of the H. sapiens story portrays humanity as the ultimate survivor, destined to progress and dominate the Earth. In contrast, human offers a more humble narrative, placing our species alongside our ancient relatives.
Tracking Human Evolution: Go behind the scenes of the new BBC series human with Ella Al Shamahi at newscientist.com/video
In a captivating and poignant narrative, Ella Al Shamahi addresses the lack of frontline science conducted in regions perceived as inhospitable to Western researchers. Discover Neanderthal skeletons exhibiting severe disabilities unearthed in present-day Iraq, a striking reminder of the discoveries we’ve overlooked.
Bethan Ackerley is a sub-editor at New Scientist. She has a passion for science fiction, sitcoms, and all things eerie. Follow her on Twitter @inkerley
In 2024, 2.6 billion people (nearly a third of the global population) were still offline, as reported by The International Telecommunication Union (ITU). That same year, Freedom House estimated that over three-quarters of those with internet access live in countries where individuals have been arrested for sharing political, social, or religious content online, with nearly two-thirds of global internet users experiencing some form of online censorship.
The accessibility and quality of internet connections significantly impact how individuals lead their lives, a fact that deserves serious consideration. Having free and unobstructed internet access is no longer merely a luxury.
Human rights ensure a baseline of decent living conditions, as established by the UN General Assembly in the 1948 Universal Declaration of Human Rights. In today’s digital landscape, the exercise of these rights, ranging from free speech to access to primary education, depends heavily on internet connectivity. For instance, many essential public services are transitioning online, and in several areas, digital services are the only viable alternatives where physical banks, educational institutions, and healthcare facilities are absent.
Given the critical significance of internet access today, it must be officially recognized as a standalone human right by the United Nations and national governments. Such recognition would provide legal backing and obligations for international support that are often missing at the state level.
The ITU projects that achieving universal broadband coverage by 2030 will require an investment of nearly $428 billion. While this is a substantial sum, the benefits of connecting the remaining portion of humanity—enhanced education, economic activity, and health outcomes—far outweigh the costs.
Ensuring a minimum standard of connectivity is already an attainable goal. This includes providing 4G mobile broadband coverage, consistent access to smartphones, and affordable data plans for individuals that cost less than 2% of the average national income for 2GB per person, along with opportunities to develop essential digital skills.
However, having internet access alone is not sufficient for upholding human rights. As highlighted by the United Nations, misuse of technology for monitoring populations, gathering personal data for profit maximization, or spreading misinformation constitutes oppression rather than empowerment.
This right entails that states should respect users’ privacy, opposing censorship and the manipulation of information online. Businesses should prioritize human rights, especially users’ privacy, and actively combat misinformation and abuse on their platforms in line with regulations governing social media.
In 2016, the United Nations affirmed that people must be protected online just as they are offline. This concept was first suggested in 2003.
The time to act is now. Advocating for universal internet access as a human right calls for political action. We cannot afford to see the internet degrade from a tool for human advancement to one of division. Establishing this right will be a powerful measure to ensure that the internet serves the interests of all, not just a select few.
Recent findings in Sulawesi, Indonesia, have pushed back the timeline for early human sea crossings and deepened the puzzle of who made them.
Archaeologists have unearthed stone tools at a site in South Sulawesi called Calio, dating back at least 1.04 million years. Given that Sulawesi is encircled by swift, deep waters, whoever made these tools would have had to cross open ocean.
“This represents the earliest known evidence of early human presence in Sulawesi,” Professor Adam Brumm from the Australian Research Centre for Human Evolution at Griffith University, who co-directed the research, told BBC Science Focus.
“It now seems evident that early hominins managed to cross the Wallace Line, leading to isolated populations on distant islands.”
The Wallace Line is a critical biogeographical boundary between mainland Asia and the islands of Wallacea. “For non-flying land mammals, crossing from the edge of mainland Asia to the nearest Wallacean island, such as Sulawesi, would have been nearly impossible due to the vast distances and swift currents,” Brumm explained.
Earlier discoveries indicated that hominins reached nearby Flores approximately 1.02 million years ago, with the region’s isolated islands eventually producing species such as Homo floresiensis (nicknamed “the hobbit” due to its small stature) on Flores and Homo luzonensis on Luzon in the Philippines.
However, no hominin fossils have yet been discovered in Sulawesi, leaving the identity of the tools’ makers an enigma.
“We suspect it was an early Asian human species, possibly Homo erectus,” Brumm remarked. “I doubt they used boats for this journey. The colonization of the island likely occurred accidentally as they might have clung to logs or natural vegetation ‘rafts’ that were formed during tsunamis.”
These stone tools, excavated from Calio in Sulawesi, have been dated to over 1.04 million years ago. The scale bar is 10 mm. – Credit: MW Moore/University of New England
If Homo erectus made it to Sulawesi more than a million years ago, they may have been carving out their own evolutionary niche.
“In Flores and Luzon, fossil discoveries indicate that hominins on these islands underwent evolutionary changes, leading to unique new species that are small and distinct,” noted Brumm. “Though we have yet to find human fossils in Sulawesi, the possibility of similar events occurring on the island cannot be ruled out.”
What’s next for Brumm and the team? “We’re continuing our excavations,” he stated. “Human fossils are incredibly rare, but millions of hominins have existed and perished over the last million years, so there might be preserved remains of these toolmakers out there.”
“We hope to discover a fossil—or two—with persistence (and a bit of luck), as finding one would be an extraordinary breakthrough, perhaps even a game changer.”
Adam Brumm is a professor of archaeology at Griffith University and has conducted funded research in Indonesia for over 21 years. His published studies include many in Nature, spanning topics from the discovery of new human fossils in Wallacea (the island region between Asia and Australia) to recent insights into human evolution.
Concerns have been raised that AI could exacerbate racism and sexism in Australia, the human rights commissioner has warned, amid internal discussions within the Labor party regarding new technologies.
Lorraine Finlay cautioned that while seeking productivity gains from AI is important, it must not come at the cost of discrimination, a risk if the technology remains unregulated.
Finlay’s remarks came after Labor senator Michelle Ananda-Rajah advocated for the “liberation” of Australian data to tech companies, arguing that otherwise AI will reflect and perpetuate biases from abroad while shaping local culture.
Ananda-Rajah opposes a dedicated AI law but emphasizes that content creators ought to be compensated for their contributions.
Discussions about enhancing productivity through AI are scheduled for the upcoming federal economic summit, as unions and industry groups voice concerns over copyright and privacy issues.
Media and arts organizations have raised alarms about the “rampant theft” of intellectual property if large tech corporations gain access to content for training AI systems.
Finlay noted the challenges of identifying embedded biases due to a lack of clarity regarding the datasets used by AI tools.
“Algorithmic bias means that discrimination and inequality are inherent in the tools we utilize, leading to outcomes that reflect these biases,” she stated.
Lorraine Finlay, Human Rights Commissioner. Photo: Mick Tsikas/AAP
“The combination of algorithmic and automation biases leads individuals to rely more on machine decisions and potentially disregard their own judgment,” Finlay remarked.
The Human Rights Commission has consistently supported an AI Act that would enhance existing legislation, including privacy laws, and ensure comprehensive testing for bias in AI tools. Finlay urged the government to quickly establish new regulations.
“Bias tests and audits, along with careful human oversight, are essential,” she added.
Evidence of bias in AI technologies is increasingly reported in fields like healthcare and workforce recruitment in Australia and worldwide.
A recent survey in Australia revealed that job applicants interviewed by AI recruiters faced potential discrimination if they had accents or disabilities.
Ananda-Rajah, a vocal proponent of AI development, warned that AI systems not trained on Australian data risk amplifying foreign biases.
While the government prioritizes intellectual property protection, she cautioned against limiting domestic data access, warning that Australia would be reliant on overseas AI models without adequate oversight.
“AI requires a vast array of data from diverse populations to avoid reinforcing biases and harming those it aims to assist,” Ananda-Rajah emphasized.
“We must liberate our data to better train our models, ensuring they authentically represent us.”
“I am eager to support content creators while freeing up data, aiming for an alternative to foreign exploitation of resources,” Ananda-Rajah stated.
She cited AI screening tools for skin cancer as an example where algorithmic bias has been documented. To combat bias and discrimination affecting specific patients, she argued, these models must be trained on diverse local datasets.
Finlay emphasized that any release of Australian data needs to be handled fairly, but she feels the emphasis should be on establishing appropriate regulations.
“It’s certainly beneficial to have diverse and representative data… but that is merely part of the solution,” she clarified.
“We must ensure that this technology is equitable and is implemented in a manner that recognizes and values human contributions.”
Judith Bishop, an AI expert at La Trobe University and former data researcher at an AI firm, asserted that increasing the availability of local data will enhance the effectiveness of AI tools.
“It is crucial to ensure that systems developed in different contexts are relevant to the [Australian] population, which should not depend exclusively on US data models,” Bishop stated.
eSafety Commissioner Julie Inman Grant has also voiced concerns regarding the lack of transparency related to the data applied by AI technologies.
In her statement, she urged tech companies to be transparent about their training datasets, develop robust reporting mechanisms, and utilize diverse, accurate, and representative data for their products.
“The opacity surrounding generative AI’s development and deployment poses significant issues,” Inman Grant remarked. “This raises critical concerns about the potential for large language models (LLMs) to amplify harmful biases, including restrictive or detrimental gender norms and racial prejudices.”
“Given that a handful of companies dominate the development of these systems, there is a significant risk that certain perspectives, voices, and evidence could become suppressed or overlooked in the generated outputs.”
Our brains are glowing. While this phenomenon isn’t visible to the naked eye, scientists have the ability to detect faint light that permeates the skull. Recent studies indicate that this light varies based on our activities.
All living tissues generate a subtle light known as Ultraweak Photon Emissions (UPE). This emission ceases once the organism dies. The human brain, however, emits a considerable amount of this light due to its high energy consumption, accounting for around 20% of the body’s total energy.
“Ultraweak photon emissions, or UPE, are extremely faint light signals produced by all types of cells throughout the body—trillions of times weaker than the light from bulbs,” Dr. Nirosha Murugan, an Assistant Professor of Health Sciences at Wilfrid Laurier University in Ontario, Canada, told BBC Science Focus.
“Although UPE is a weak signal, the energy expenditure of the brain generates more light than other organs,” she explained. “Consider the hundreds of billions of brain cells; each one emits a weak light signal, but together they create a measurable collective glow outside the head.”
Murugan’s research team aimed to explore whether this glow fluctuated with brain activity and if it could be utilized to assess brain functions.
To investigate, scientists equipped participants with caps containing electrical sensors to track both electrical impulses and light emitted from the brain. Twenty adults were invited to sit in a darkened room.
Participants were directed to open and close their eyes and follow simple audio instructions.
Comparisons were made between the captured electrical signals and UPEs, revealing notable correlations.
“We discovered that the optical signals detected around the head correlate with electrical activity in the brain during cognitive tasks,” Murugan noted. “These patterns of light emission from the brain are dynamic, intricate, and informative.”
The brain emitted this light in slow, rhythmic patterns, at rates below once per second, and these patterns remained stable throughout the two-minute tasks.
All living cells emit ultraweak light as a byproduct of chemical reactions such as energy metabolism – Credit: Sean Gladwell via Getty
Murugan indicated that measuring this brain light could offer scientists and medical professionals a novel method for brain imaging, potentially identifying conditions like epilepsy, dementia, and depression.
This light is not merely a by-product; it might also play a functional role in the brain. Murugan emphasized that examining it could “uncover hidden dimensions” of our cognitive processes.
“I hope that the possibility of detecting and interpreting light signals from the brain will inspire new questions previously deemed unfathomable,” she stated. “For instance, can UPEs permeate the skull and influence other brains within the vicinity?”
This study serves as a preliminary exploration, suggesting that plenty remains to be uncovered about our illuminating brains.
Nonetheless, Murugan expressed hope that the team’s discoveries will “ignite a new discussion regarding the significance of light in brain functionality.”
About our experts
Dr. Nirosha Murugan is an assistant professor in the Department of Health Sciences at Wilfrid Laurier University, Ontario, Canada. She was recently appointed Tier 2 Canada Research Chair in Biophysics at Algoma University in Ontario.
OpenAI asserts that the recent upgrade to ChatGPT marks a “significant step” towards achieving artificial general intelligence (AGI), yet acknowledges that it still falls short in the endeavor to create a system capable of performing most human tasks.
The company claims that the GPT-5 model, which serves as the foundation of its innovative AI chatbot, represents a substantial improvement over previous iterations in areas like coding and creative writing, with significantly less sycophancy.
The enhancements in ChatGPT are now available to over 1 million weekly users.
OpenAI CEO Sam Altman referred to the model as a “significant step forward” in reaching the theoretical state of AGI, which is characterized as a highly autonomous system that can outperform humans in economically significant roles.
However, Altman conceded that GPT-5 has not yet attained that objective. “[It is] missing something very important,” he noted, emphasizing that the model cannot “learn on a continuous basis.”
Altman explained that while GPT-5 is “generally intelligent” and represents an “important step towards AGI,” most definitions indicate it has not reached that level yet.
“I believe the way we define AGI is significantly lacking, which is quite crucial. One major aspect… is that this model doesn’t adapt continuously based on new experiences.”
During the GPT-5 launch event on Thursday, Altman described the new version of ChatGPT as akin to having “doctoral experts in your pocket.” He compared the previous version to a college student and the one before that to a high school student.
The theoretical capabilities of AGI, along with high-tech companies’ drive to realize it, have led AI executives to predict that numerous white-collar jobs—ranging from lawyers to accountants—could be eliminated due to these technological advances. Dario Amodei, CEO of AI firm Anthropic, cautioned that technology might replace half of entry-level office roles in the coming five years.
According to OpenAI, the key enhancements to GPT-5 include reduced factual inaccuracies and hallucinations, improved coding capabilities for creating functional websites and apps, and a boost in creative writing abilities. Instead of outright “rejecting” prompts that violate guidelines, the model now aims to provide the most constructive response possible within safety parameters, or at least clarify why it cannot assist.
ChatGPT retains its agent functionalities (like checking restaurant availability and online shopping) but can also access users’ Gmail, Google Calendar, and contacts—provided permission is granted.
Similar to its predecessor, GPT-5 can generate audio, images, and text, and is capable of processing inquiries in these formats.
On Thursday, the company showcased how GPT-5 could swiftly write hundreds of lines of code to create applications, such as language learning tools. Staff noted that the model’s writing isn’t robotic; it produced a “more nuanced” compliment. Altman mentioned that ChatGPT could also be valuable for healthcare advice, citing a woman diagnosed with cancer last year who used the chatbot to help weigh her radiation therapy options.
The company stated that the upgraded ChatGPT excels at addressing health-related inquiries and will become more proactive in “flagging potential concerns,” including serious physical and mental health issues.
The startup emphasized that chatbots should not replace professional assistance, amidst worries that AI tools could worsen the plight of individuals susceptible to mental health challenges.
Nick Turley, director of OpenAI’s ChatGPT, claimed that the model shows “significant improvement” on sycophancy, the tendency for a chatbot to become overly agreeable with users, which can lead to negative experiences.
The release of the latest model is expected to funnel billions into tech companies’ efforts to attain AGI. On Tuesday, Google’s AI division outlined its latest progress towards AGI by unveiling an unreleased “world model,” while last week, Mark Zuckerberg, CEO of parent company Meta, suggested that a future state of AI, even more advanced than AGI, is “on the horizon.”
Investor confidence in the likelihood of further breakthroughs and AI’s ability to reshape the modern economy has sparked a surge in valuations for companies like OpenAI. Reports on Wednesday indicated that OpenAI was in preliminary talks to sell shares held by current and former employees, potentially valuing the company at $500 billion, surpassing Elon Musk’s SpaceX.
OpenAI also launched two open models this week and continues to offer a free version of ChatGPT, while generating revenue through subscription fees for its advanced chatbot version, which can be integrated into business IT systems. Access to the free version of ChatGPT on GPT-5 will be limited, whereas users of the $200 Pro package will enjoy unlimited use.
Cut marks on the foot bone from El Mirador cave, Spain
IPHES-CERCA
The discovery of human remains in caves in northern Spain indicates that Neolithic people may have resorted to cannibalism after battles.
Francesc Marginedas from the Catalan Institute of Human Paleoecology and Social Evolution (IPHES) in Tarragona, along with his team, examined fragments from 650 human remains found in El Mirador cave on Mount Atapuerca. These remains date back approximately 5,700 years and belong to 11 individuals.
All of the examined bones displayed evidence that these individuals had been consumed by other humans. Some exhibited cut marks made by stone tools, while others showed translucent patches with gently rounded edges, a hallmark of boiling. Some of the long bones had been cracked open with stones to access the marrow, and smaller bones such as metatarsals and ribs bore clear human bite marks.
This research supports the notion that cannibalistic practices were more prevalent in human history than previously believed.
El Mirador marks at least the fifth significant site in Spain with notable evidence of cannibalism during the Neolithic era, the transition period from foraging to agriculture, according to Marginedas. “There’s a growing understanding that such behavior was more frequent than we anticipated.”
The motives behind these cannibalistic acts remain unclear. Some archaeological sites show skull cups indicating a ritualistic aspect to cannibalism, while others hint at survival strategies during dire circumstances.
However, Marginedas and his team propose that the findings at El Mirador link these acts to warfare. There was a plentiful supply of animal remains and no sign of nutritional stress among the humans, indicating this early agricultural community was not struggling with food scarcity. Nor do the findings point to ritual: the human bones were simply discarded alongside animal food remains.
The ages of the individuals ranged from under seven to over fifty, implying that an entire family unit may have been lost to conflict. Radiocarbon dating indicated that all 11 individuals were killed and consumed within a few days.
This evidence reflects patterns of conflict and cannibalism that have also been noted at two other Neolithic sites: Fontbrégoua Cave in France and Herxheim in Germany. This period appears marked by instability and violence as communities clashed with neighboring groups and newcomers.
While Marginedas and his colleagues are uncertain about the reasons behind these cannibalistic practices, historical ethnographic studies suggest that such acts during warfare can serve as a method of “ultimate exclusion.” “We believe that one group attacking and consuming another serves as a humiliating statement,” states Marginedas.
“The thoroughness with which the bodies were processed and consumed is remarkable,” comments Paul Pettitt from Durham University, UK. “The aggression evident in these remains, regardless of whether the consumed were relatives or adversaries, mirrors a process of dehumanization through consumption.”
Silvia Bello from the Natural History Museum in London agrees that the deaths likely tie back to conflict but remains skeptical about interpreting the consumption as humiliation. Even if the cannibalism stemmed from aggression and animosity rather than a ritualized farewell, she suggests, the picture may be more complex. “It could carry ritual significance, even amid warfare,” she asserts.
Like all cells, human eggs are subject to mutations
CC Studio/Science Photo Library
Research indicates that human eggs may be shielded from certain types of mutations associated with aging. In a recent study, scientists discovered that as women age, there are no signs of accumulating mutations in the mitochondrial DNA of their egg cells.
“When we consider age-related mutations, we typically think about older individuals having more mutations compared to younger ones,” notes Kateryna Makova from Pennsylvania State University. “However, this assumption doesn’t always hold true.”
Mitochondria, which provide the primary energy source for the body’s cells, are inherited solely from the mother. While mitochondrial DNA mutations are generally benign, they can sometimes result in complications that impact muscles and neurons, particularly due to their high energy demands. “Oocytes [egg cells] serve as this biological reservoir,” explains Ruth Lehmann from the Massachusetts Institute of Technology, who was not part of this study.
Prior research has shown that older mothers tend to pass down more chromosomal mutations, leading to the general assumption that a similar pattern exists with mitochondrial DNA mutations. To investigate this, Makova and her team utilized DNA sequencing to identify new mutations across 80 eggs sourced from 22 women aged 20 to 42 years.
The findings revealed that mitochondrial mutations in female eggs do not actually escalate with advancing age, unlike those found in salivary and blood cells. “It seems we have evolved a mechanism that mitigates the accumulation of mutations, allowing for their replication later in life,” remarks Makova.
Previous research found that mitochondrial DNA mutations in macaque eggs increased with age but plateaued once the animals reached about nine years of age. “This could apply to humans as well,” comments team member Barbara Arbeithuber of Penn State University. “It would be worthwhile to also study older women.”
Recent research conducted by scientists at the University of Utah sheds light on unlocking hibernation abilities, potentially paving the way for treatments that could reverse neurodegeneration and diabetes.
Investigating how hibernation evolved in species such as bats, ground squirrels, and lemurs can unveil the mysteries of their extraordinary resilience. Image credit: Chrissy Richards.
A gene cluster known as the fat mass and obesity-associated (FTO) locus is crucial to understanding hibernation capabilities. Interestingly, these genes are also present in humans.
“What stands out in this region is that it represents the most significant genetic risk factor for obesity in humans,” states Professor Christopher Gregg, the lead author of both studies from the University of Utah.
“Hibernators seem to leverage genes in the FTO locus uniquely.”
Professor Gregg and his team discovered hibernator-specific DNA regions near the FTO locus that regulate the expression of nearby genes, modulating their activity.
They hypothesize that hibernators can accumulate weight prior to entering winter by adjusting the expression of adjacent genes, particularly those at or near the FTO locus, utilizing fat reserves gradually for winter energy needs.
Moreover, regulatory regions linked to hibernation outside the FTO locus appear to play a significant role in fine-tuning metabolism.
When the research team mutated these hibernation factor-specific regions in mice, they observed variations in body weight and metabolism.
Some mutations accelerated or inhibited weight gain under specific dietary conditions, while others affected the mice’s ability to restore body temperature post-hibernation or regulate their overall metabolic rate.
Interestingly, the hibernator-specific DNA regions identified by researchers are not genes themselves.
Instead, this region comprises a DNA sequence that interacts with nearby genes, modulating their expression like conductors guiding an orchestra to adjust volume levels.
“This indicates that mutating a single hibernator-specific region can influence a broad array of effects well beyond the FTO locus,” notes Dr. Susan Steinwand of the University of Utah, first author of the first study.
“Targeting a small, inconspicuous DNA region can alter the activity of hundreds of genes, which is quite unexpected.”
Gaining insight into the metabolic flexibility of hibernators may enhance the treatment of human metabolic disorders like type 2 diabetes.
“If we can manipulate more genes related to hibernation, we may find a way to overcome type 2 diabetes similar to how hibernators transition back to normal metabolic states,” says Dr. Elliot Ferris of the University of Utah, first author of the second study.
Locating genetic regions associated with hibernation poses a challenge akin to extracting needles from a vast haystack of DNA.
To pinpoint relevant areas, scientists employed various whole-genome technologies to investigate which regions correlate with hibernation.
They then sought overlaps among the outcomes of each method.
First, they searched for DNA sequences that are shared by most mammals but have recently changed in hibernators.
“This region has remained largely unchanged across species for over 100 million years; so when it changes significantly in multiple hibernating mammals, that signals features critical for hibernation,” remarked Dr. Ferris.
To comprehend the biological mechanisms of hibernation, researchers tested and identified genes that exhibited fluctuations during fasting in mice, producing metabolic alterations similar to those seen in hibernation.
Subsequently, they identified genes that serve as central regulators or hubs for these fasting-induced gene expressions.
Numerous recently altered DNA regions in hibernators appear to interact with these central hub genes.
Consequently, the researchers predict that the evolution of hibernation necessitates specific modulations in hub gene regulation.
These regulatory mechanisms constitute a potential candidate list of DNA elements for future investigation.
Most alterations related to hibernation factors in the genome seem to disrupt the function of specific DNA rather than impart new capabilities.
This implies that hibernation may have shed constraints, allowing for great flexibility in metabolic control.
In essence, the human metabolic regulator is constrained to a narrow energy expenditure range, whereas, for hibernators, this restriction may not exist.
Hibernation not only reverses neurodegeneration but also prevents muscle atrophy and maintains health despite dramatic weight fluctuations, traits associated with slower aging and greater longevity.
The researchers surmise that humans may already possess the genetic blueprint for these hibernator superpowers, one that could potentially be unlocked by flipping certain metabolic switches.
“Many individuals may already have the genetic structure in place,” stated Dr. Steinwand.
“We must identify the control switches for these hibernation traits.”
“Mastering this process could enable researchers to bestow similar resilience upon humans.”
“Understanding these hibernation-associated genomic mechanisms provides an opportunity to potentially intervene and devise strategies for tackling age-related diseases,” remarks Professor Gregg.
“If such mechanisms are embedded within our existing genome, we could learn from hibernation to enhance our health.”
The findings are published in two papers in the journal Science.
____
Susan Steinwand et al. 2025. Conserved non-coding cis-elements associated with hibernation regulate metabolism and behavioral adaptation in mice. Science 389 (6759): 501-507; doi: 10.1126/science.adp4701
Elliot Ferris et al. 2025. Genome convergence in hibernating mammals reveals the genetics of metabolic regulation of the hypothalamus. Science 389 (6759): 494-500; doi: 10.1126/science.adp4025