Explore Human Organs in 3D: A Detailed Mapping Experience Down to the Cellular Level

A groundbreaking new Human Organ Atlas (HOA) portal empowers scientists, healthcare professionals, and curious individuals to explore intact human organs like never before. This innovative platform allows users to investigate everything from entire organs to individual cells in stunning detail, potentially transforming our understanding of human anatomy and disease.

Referred to as the “Google Earth of Human Organs,” the HOA currently features 307 3D datasets spanning 56 organs from 25 donors, including vital organs such as the brain, heart, and lungs, as well as others like the placenta and prostate. This cutting-edge resource is easily accessible through any standard web browser.

The implications of the HOA for the field of medicine are significant. “Human organs possess a three-dimensional, hierarchical structure,” explains Dr. Claire Walsh, Associate Professor and Director at University College London’s Human Organ Atlas Hub in an interview with BBC Science Focus.

“This is the only database I know of that provides 3D hierarchical images of real human organs that are accessible to anyone in the world.”

Early findings showcase the atlas’ potential. Previously, scientists could only estimate the number of nephrons (the kidney’s filtration units) in human kidneys and their locations. With access to HOA data, researchers can now visualize and count individual nephrons throughout the kidney, providing crucial insights into kidney function.
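The shift from estimating nephron counts to actually counting them comes down to a standard image-analysis step: once nephrons are segmented from a 3D scan as foreground voxels, counting them is a connected-component problem. A minimal Python sketch on synthetic data (the HOA's real segmentation pipeline is not described in the article, so everything below is purely illustrative):

```python
from collections import deque

def count_components(voxels):
    """Count 6-connected components in a set of (x, y, z) foreground voxels."""
    remaining = set(voxels)
    count = 0
    while remaining:
        count += 1
        queue = deque([remaining.pop()])
        while queue:  # flood-fill one component
            x, y, z = queue.popleft()
            for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (x + dx, y + dy, z + dz)
                if n in remaining:
                    remaining.remove(n)
                    queue.append(n)
    return count

# Synthetic stand-in for a segmented scan: three well-separated blobs.
def blob(x0, y0, z0, s):
    return {(x0 + i, y0 + j, z0 + k)
            for i in range(s) for j in range(s) for k in range(s)}

volume = blob(0, 0, 0, 3) | blob(10, 10, 10, 4) | blob(20, 3, 12, 3)
print(count_components(volume))  # 3
```

On real data the hard part is the segmentation itself; the counting step above is the easy final pass.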

This data is also being applied in the brain, enhancing the precision of surgical placements for deep brain stimulation electrodes. Furthermore, research is underway to uncover congenital heart defects.

In regard to lung health, the atlas aids scientists in understanding the effects of COVID-19 and pulmonary fibrosis on the vascular network.

https://c02.purpledshub.com/uploads/sites/41/2026/03/HOA-Purple-resize.mp4
The Human Organ Atlas features 11 organ types, including the brain, heart, lungs, kidneys, liver, colon, spleen, placenta, uterus, prostate, and testes.

The HOA was constructed using Hierarchical Phase-Contrast Tomography (HiP-CT), a technique developed at the European Synchrotron (ESRF) in Grenoble, France. The method uses a light source up to 100 billion times brighter than those in conventional hospital CT scanners, enabling researchers to image entire organs non-destructively and then zoom in to resolve structures roughly 50 times smaller than the width of a human hair.
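As a rough sanity check on those figures: assuming a human hair is about 70 micrometres across (a typical textbook value, not stated in the article), "about 50 times smaller" implies features of roughly 1.4 micrometres can be resolved:

```python
# Back-of-envelope check of the quoted HiP-CT resolution claim.
# Assumption: a human hair is ~70 micrometres wide (typical value,
# not given in the article).
hair_um = 70.0
resolvable_um = hair_um / 50               # "50 times smaller than a hair"
print(f"{resolvable_um:.1f} micrometres")  # 1.4 micrometres

# For comparison, a clinical CT voxel is on the order of 500 micrometres
# across (also an assumption), so the gain is a few hundredfold.
clinical_ct_um = 500.0
print(round(clinical_ct_um / resolvable_um))  # 357
```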

“We are opening a new window into the inner workings of the human body,” stated Paul Tafforeau, an ESRF scientist involved in the project. “After six years of development, we are just beginning. Currently, we focus on isolated organs, but future plans include imaging entire human bodies at resolutions 10 to 20 times greater than today. Such data could revolutionize the study and understanding of anatomy.”


Source: www.sciencefocus.com


Revolutionary Startup Develops First Data Center Powered by Human Brain Cells

Close-up of an artificial brain showcasing neural activity and orange light dots, illustrating the concept of artificial intelligence. 3D rendering of synapses and neurons made up of cubic particles.

A small number of companies are developing biological computers

Floriana/Getty Images

Data centers consume vast amounts of energy while the demand for computer chips continues to soar. Could utilizing brain cells be the solution?
Australian startup Cortical Labs is pioneering this field, planning to establish two innovative “biological” data centers in Melbourne and Singapore. These cutting-edge facilities will feature chips populated with lab-grown human neurons.
Pong vs Doom

Cortical Labs stands out as one of the few firms creating biological computers that link nerve cells to microelectrode arrays, enabling the stimulation and measurement of cell responses during data input. Recently, the company successfully showcased that its primary model, the CL1, can learn to play games like Doom within just a week.

The first data center in Melbourne is set to accommodate around 120 CL1 units, while a second facility in collaboration with the National University of Singapore will initially support 20 CL1 systems, with plans to expand to 1,000 pending regulatory approval. This initiative aims to enhance cloud-based brain computing services.

According to Michael Barros from the University of Essex, UK, while biological computers have been constructed and tested globally, they remain challenging to build and use. He states, “We invest a lot of time and resources developing these systems.”

Barros further elaborates that Cortical Labs is democratizing access to biocomputers at scale, pioneering an accessible approach in the industry.

These systems can be trained for simple tasks, such as playing Doom, yet there are challenges in understanding how neurons function and training them for more complex tasks like machine learning. Reinhold Scherer, also from the University of Essex, notes, “When you access this technology, it opens doors to exploration in learning, training, and programming, but neurons cannot be programmed like standard computers.”

Cortical Labs asserts that its biological data centers use significantly less energy than traditional computing systems, with each CL1 requiring only 30 watts compared to thousands needed by leading conventional AI chips.

Paul Roach from Loughborough University, UK, emphasizes that scaling biocomputers into entire rooms, akin to traditional data servers, could yield substantial energy savings. Notably, while biological data centers may necessitate nutrients to sustain neuron chips, they require less cooling energy than conventional computing infrastructures, suggesting significant potential for energy conservation.
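Taking the article's 30 W figure at face value, the claimed deployments are easy to put in perspective. The GPU wattage below is an assumption for illustration (roughly what a current high-end accelerator draws), not a number from the article:

```python
# Power-draw arithmetic from the article's figures plus one assumption.
cl1_watts = 30            # per CL1 unit (stated in the article)
gpu_watts = 700           # per high-end AI accelerator (assumption)

melbourne_units = 120
singapore_target = 1_000

print(melbourne_units * cl1_watts)   # 3600 W: the Melbourne site
print(singapore_target * cl1_watts)  # 30000 W: the full Singapore target

# The 1,000-unit target's entire power budget equals only a few dozen
# such GPUs.
print(singapore_target * cl1_watts / gpu_watts)
```

The caveat, as Roach notes, is that this ignores the nutrient supply and life-support overhead the neuron chips need, which conventional silicon does not.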

Nevertheless, experts recognize that the technology remains nascent. “Will it perform as expected? We are still in the early developmental phase,” comments Tjeerd Olde Scheper of Oxford Brookes University, who has worked with a competitor, FinalSpark.

Although direct comparisons between the sizes of biological and silicon AI systems remain complex, it’s notable that the envisioned biological data center would integrate hundreds of biological chips in contrast to the hundreds of thousands of GPUs typically found in large-scale AI data centers.

“We have a long way to go before these systems are production-ready. Transitioning from a small network playing games to a large language model is a substantial leap,” says Steve Furber from the University of Manchester, UK.

A pressing concern is the lack of clarity on how to store training outcomes within neurons as memory, or how to execute computational algorithms beyond specific tasks, such as video gaming.

Additionally, retraining neurons post-task completion poses challenges, as their training and learning may be lost upon the end of their lifespan. “Proper retraining is essential,” Scherer states. “If retraining is required every 30 days, it may hinder technological continuity.”


Source: www.newscientist.com

How Farming Transformed Human Evolution: The Impact of Agriculture on Our Development

Evolution and Agriculture Impact

The Advent of Agriculture and Evolutionary Pressures on Humans

Christian Jegou/Science Photo Library

The comprehensive analysis of ancient genomes has revealed significant insights into human evolution over the last 10,000 years. This research indicates that various populations worldwide have experienced similar evolutionary changes, particularly following the introduction of agriculture.

“Similar traits and genes are being selected in diverse populations,” says Laura Colbran from the University of Pennsylvania.

Evolution happens when genetic variation becomes prevalent in a population—often because it confers an advantage. By comparing genomes, we can identify recent signs of human evolution.

Colbran notes that ancient DNA is exceptionally valuable for this research, stating, “Using ancient genomes allows us to witness genetic history directly, as opposed to relying solely on inferential methods.”

Much recent research has focused on European genomes, but Colbran’s team leveraged a growing collection of genomes from outside Europe, analyzing over 7,000 ancient and contemporary genomes. The ancient genomes mainly date from the last 10,000 years, while the modern genomes come from living populations.

The research team used the ancient genomes to predict what modern genetic profiles would look like in the absence of selection; where present-day genomes differ from these predictions, the differences are known as selection signals. They identified 31 such signals, many of them shared among distinct populations, likely because agriculture arose independently in different regions at around the same time.

For instance, fewer than 25% of ancient individuals carried an advantageous variant of FADS1, a gene encoding an enzyme that helps convert short-chain fatty acids (common in plants) into long-chain fatty acids (predominant in meat). Producing more of this enzyme is thought to benefit people eating a plant-heavy diet. Today, over 75% of people in Europe, Japan, and northern China carry advantageous FADS1 variants. The strength of selection on the gene has remained stable over the last 300 generations in Europe while intensifying in East Asia over the last century.
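A frequency shift of that size over that many generations implies only weak per-generation selection. A minimal haploid selection model makes this concrete (the study's actual inference method is more sophisticated and is not described here; treat this as an illustration of the principle):

```python
import math

def evolve(p, s, generations):
    """Haploid selection recursion: each generation,
    p' = p(1 + s) / (p(1 + s) + (1 - p))."""
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
    return p

# Under this model the log-odds of the allele grow by ln(1 + s) per
# generation, so the coefficient needed to go from 25% to 75% in
# 300 generations has a closed form:
s = math.exp((math.log(3) - math.log(1 / 3)) / 300) - 1
print(f"s = {s:.4f}")                  # about 0.0074 per generation
print(f"{evolve(0.25, s, 300):.3f}")   # 0.750
```

The point of the sketch is the order of magnitude: selection well under 1% per generation is enough to produce a shift of this size.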

Variants affecting alcohol dehydrogenase 1B, an enzyme encoded by the gene ADH1B, have also been analyzed. ADH1B variants prevalent in East Asia are associated with rapid alcohol metabolism, producing effects such as facial flushing. Colbran stated, “This showcases the strongest selection signal we’ve observed in East Asia,” suggesting that the variant was favored because it curbs excessive alcohol consumption.

Even though this variant was absent in ancient Europeans, strong selection signals related to the ADH1B enzyme were identified. Colbran emphasized the need for further investigation to discern the involved variants and their specific impacts, indicating a likely adaptation to evolving alcohol consumption patterns.

The research team also explored traits influenced by multiple genetic variations, such as waist-to-hip ratios, often correlated with fertility. Surprisingly, they found a robust selection process stabilizing women’s waist-to-hip ratios within certain limits. “This is intriguing as it suggests a stabilization of selection,” Colbran remarked, emphasizing that while waist-to-hip ratios can differ across various populations, the ideal measurement likely exists in a balanced range.

As noted by Alexander Gusev at Harvard University, this study is remarkable for its analysis of ancient DNA that has yet to be thoroughly examined. Gusev explained, “The authors found enriched variants being selected within one population compared to others, indicating parallel selection across populations, previously hypothesized but not empirically demonstrated.”

Yassine Souilmi, from the University of Adelaide, emphasized that the novel approach reveals regions of the genome newly identified as subject to selection, complementing previously known areas. “Their innovative method optimally utilizes the vast amounts of available ancient DNA,” Souilmi stated.

Colbran concluded that these findings are merely the initial discoveries. As more non-European genomes are sequenced, we will uncover even more evidence of recent human evolution.



Source: www.newscientist.com

How Ancient Mating Preferences Shaped the Human Genome: Insights from Recent Study

A groundbreaking study from the University of Pennsylvania reveals that prehistoric humans and Neanderthals interbred with a notable sexual bias, with male Neanderthals mating more often with female modern humans. This pattern may explain the scarcity of Neanderthal DNA in the human X chromosome and highlight the impact of social behaviors on our genetic lineage.

Prehistoric mating preferences help explain why modern humans carry small amounts of Neanderthal DNA in their genomes, particularly absent from the X chromosome. Image credit: Gemini AI.

“In addition to the X chromosome, there’s a significant gap in Neanderthal DNA referred to as the ‘Neanderthal desert’,” stated lead author Dr. Alexander Platt, a researcher at the University of Pennsylvania.

“Historically, we believed these gaps resulted from certain Neanderthal genes being biologically harmful to humans, leading to their removal through natural selection,” he added.

New genomic analyses indicate that long-standing mating preferences, not genetic incompatibilities, influenced which Neanderthal DNA sequences were retained in modern human genomes.

This research illustrates how social interactions have shaped the human genome and challenges the notion that evolution is solely driven by the “survival of the fittest.”

“Our findings indicate a distinct sexual bias, with gene flow predominantly occurring from male Neanderthals to anatomically modern human females, which explains the limited presence of Neanderthal DNA on modern human X chromosomes,” remarked Dr. Platt.

“Approximately 600,000 years ago, anatomically modern humans and Neanderthals diverged, creating two separate evolutionary paths,” added Professor Sarah Tishkoff, the study’s senior author.

“While our ancestors evolved in Africa, Neanderthals adapted to life in Eurasia, yet this separation was not permanent.”

“Over millennia, human groups migrated into and out of Neanderthal territories, resulting in genetic exchanges during their encounters.”

To assess whether Neanderthal X chromosomes contained modern human alleles, the researchers analyzed DNA preserved in three Neanderthal genomes: Altai, Chagyrskaya, and Vindija.

They compared this data with genomes from African populations whose ancestors had little historical contact with Neanderthals.

“Our analysis revealed a significant discrepancy,” noted co-author Dr. Daniel Harris from the University of Pennsylvania.

“While modern humans lack the Neanderthal X chromosome, the Neanderthal X chromosome contained 62% more modern human DNA compared to other chromosomes.”

If reproductive incompatibility were the barrier, modern human DNA should likewise be absent from Neanderthal X chromosomes. Its presence there instead rules out biological incompatibility as an explanation.

The lingering explanation lies in the sexual bias in mating practices.

Given that women possess two X chromosomes and men only one, the direction of mating plays a crucial role.

If Neanderthal males mated more frequently with modern human females, fewer Neanderthal X chromosomes would integrate into the human gene pool, while more human X chromosomes would enter the Neanderthal population.

Mathematical models verified that this bias adequately explains the observed inheritance patterns.
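The logic behind those models can be illustrated with simple transmission counting: a mother passes an X chromosome to every child, while a father passes his X only to daughters. A toy first-generation calculation (an illustration of the asymmetry, not the study's actual model):

```python
def f1_x_fraction(father_is_neanderthal):
    """Fraction of X chromosomes in first-generation hybrids that are
    Neanderthal, assuming equal numbers of sons and daughters.

    A son carries one X (from his mother); a daughter carries two
    (one from each parent), so each son-daughter pair carries three."""
    if father_is_neanderthal:
        # Human mother: son's X is human; daughter has one of each.
        neanderthal_x = 1
    else:
        # Neanderthal mother: son's X is Neanderthal; daughter has one of each.
        neanderthal_x = 2
    return neanderthal_x / 3

autosomal = 0.5  # F1 hybrids are 50% Neanderthal on the autosomes either way

# Neanderthal father, human mother: X introgression lags the autosomes.
print(f1_x_fraction(True) / autosomal)   # 2/3 of the autosomal level
# Neanderthal mother, human father: X introgression exceeds the autosomes.
print(f1_x_fraction(False) / autosomal)  # 4/3 of the autosomal level
```

Repeated over generations of predominantly male-Neanderthal-to-female-human gene flow, this asymmetry depletes Neanderthal ancestry on the human X while enriching human ancestry on the Neanderthal X, which is the pattern the study reports.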

While other factors such as gender-biased migration could lead to similar results, these scenarios are often complex and vary over time and geography.

“Our findings suggest that mating preferences offer the simplest explanation for these patterns,” concluded Dr. Platt.

For more details on this research, refer to the journal Science.

_____

Alexander Platt et al. 2026. Interbreeding between Neanderthals and modern humans showed significant sexual bias. Science 391 (6788): 922-925; doi: 10.1126/science.aea6774

Source: www.sci.news

Adorable Seal Pups Mimic Human Speech and Accents: Discover Their Unique Sounds!

Recent studies reveal that seal pups produce more human-like sounds than previously believed, often taking turns “communicating” by adjusting their calls to match their neighboring pups. This fascinating behavior sheds light on the evolution of complex communication, including human language.

Harbor seals, also known as common seals, are among the few animal species capable of learning and altering their vocalizations.

“They can learn to create new sounds or modify existing ones,” explains Dr. Koen de Reus of Radboud University and the Vrije Universiteit Brussel. The research forms part of his Ph.D. dissertation, as reported by BBC Science Focus.

Every harbor seal pup has its own distinct calls, which mothers use to locate their young on busy beaches. This study examines how seals modify their calls in different social contexts.

https://c02.purpledshub.com/uploads/sites/41/2026/02/Seal-pup-conversation.mp4
During testing, Jenny the seal’s responses were monitored as recordings of other pups were played.

Dr. de Reus found that the calls of pups sitting together became increasingly similar over time. “This phenomenon resembles regional accents in humans,” he stated. “Despite their visual similarities, each pup can be recognized individually, just as in humans.”

Additionally, akin to polite human conversation, the pups engage in turn-taking without overlapping in communication.
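Turn-taking of this kind can be quantified directly from call timings: two pups are taking turns if their call intervals never overlap. A small sketch with hypothetical timestamps (the study's actual acoustic analysis is not described in the article):

```python
def overlapping_intervals(calls_a, calls_b):
    """Return the time spans where two pups' calls overlap.
    Each argument is a list of (start, end) times in seconds."""
    out = []
    for a_start, a_end in calls_a:
        for b_start, b_end in calls_b:
            lo, hi = max(a_start, b_start), min(a_end, b_end)
            if lo < hi:            # non-empty intersection
                out.append((lo, hi))
    return out

# Hypothetical timings: pup 2 always waits for pup 1 to finish.
pup1 = [(0.0, 0.8), (2.0, 2.7), (4.1, 4.9)]
pup2 = [(1.0, 1.6), (3.0, 3.5)]
print(overlapping_intervals(pup1, pup2))  # [] -> clean turn-taking
```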

To conduct his research, Dr. de Reus analyzed thousands of hours of audio from numerous harbor seal pups at the Sealcentre Pieterburen in the Netherlands.

“After spending extensive time with the pups, I could identify at least half of their calls,” he shared.

This study aims to uncover the subtleties of communication shared across species and those unique to humans, potentially revealing the intricate history of human language development.

“Language is often regarded as a unique trait that sets us apart from other species, yet our findings indicate the existence of advanced communication systems in various animals,” Dr. de Reus continued. “Consider this research a foundational step for future comparisons.”

This seal was recorded at a rehabilitation center that cares for orphaned and injured seals until their release back into the wild – Credit: Getty


Source: www.sciencefocus.com

Human Flatus Atlas: Measuring the Explosive Power of Flatulence

Feedback is the New Scientist’s platform for engaging with our readers, especially those passionate about the latest in science and technology news. If you have insights or suggestions for articles that might interest our audience, please reach out via feedback@newscientist.com.

It’s Gas

Feedback is feeling bold, so here’s a prediction: the research discussed here is likely to win an Ig Nobel Prize within the next decade. The project aims to objectively measure human flatulence using innovative biosensors, affectionately dubbed “smart underwear.”

We learned about this intriguing study from a press release sent our way by Carmela Padavik Callahan at the University of Maryland, who noted, “Certainly we could do something with this feedback.”

The main challenge is that, unlike established biomarkers such as blood sugar, there is no objective benchmark for flatulence. Most existing studies depend on self-reporting, which is unreliable: individuals forget their flatulence events and cannot accurately judge their frequency or size. Additionally, it’s “impossible to record gas while sleeping.” Anyone who has shared a bed with another person knows that everyone farts during slumber.

This is where smart underwear comes in, developed by Brantley Hall and colleagues. According to the press release, it’s a compact device that discreetly fits over standard underwear and utilizes electrochemical sensors to track intestinal gas production around the clock. Curious about the size? The sensor measures just 26 x 29 x 9 millimeters—pretty small, though participants may want to steer clear of skinny jeans during testing.

Initial research revealed that “healthy adults fart an average of 32 times per day,” approximately double previous assumptions. However, this varies widely, with reported farts per day ranging from 4 to 59.

As smart underwear becomes more widely adopted, data will contribute to the larger initiative known as the Human Flatus Atlas. Interested participants can register at flatus.info to track their gas output. This exciting project invites users to discover whether they are hydrogen over-producers, or if they’re more like Zen digesters who barely fart after a meal of baked beans.

Feedback does wonder about the sensor’s durability in the face of truly substantial flatulence. We recently heard about an individual who ended up in a French hospital with unexploded ordnance from World War I lodged internally, necessitating the help of a bomb disposal unit. We can’t help but wonder whether the smart underwear would have survived such an incident.

On a brighter note, the principal researchers are keen to enhance technology in this field. Their website is minimalist, featuring a gas animation, a motivating slogan (“Measure. Master. Thrive.”), and the promise that “the future of gut health is just around the corner.” Feedback suggests a monthly subscription app might be on the horizon.

Ghost in the Machine

As AI companies integrate cutting-edge technology into our daily lives, many find it challenging to grasp its implications. With most people lacking a deep understanding of AI, we often rely on metaphors and analogies to conceptualize these advancements.

A particularly insightful analogy comes from a user on Bluesky, who described AI as “a hungry ghost trapped in a bottle.” This serves as a guideline for assessing our use of AI: if substituting “AI” with “hungry ghost in a bottle” still makes sense in your context, you’re likely employing AI appropriately.

“Think of it this way: ‘I have a bunch of hungry ghosts in a bottle. They’re mainly writing SQL queries for me.’ That’s reasonable,” the user elaborates. “But ‘My girlfriend is a hungry ghost in a bottle’? Definitely not okay.”

Equally concerning is the flood of unsolicited AI-generated content we encounter. From fake romance novels to AI summaries of searches and conferences, it’s overwhelming. We need an effective way to summarize our responses to such texts.

In this context, the popular internet abbreviation “tl;dr,” meaning “too long; didn’t read,” evolves into “ai;dr,” conveying a similar sentiment about AI-generated material.

With countless anecdotes highlighting spectacular failures when using AI for critical tasks, one can only marvel at the mishaps. We’ve heard tales of venture capitalists asking AI tools to organize desktops, only to end up erasing 15 years’ worth of photos with a mere “oops” message (luckily, those files were later recovered). Other accounts reveal AI hallucinating entire months’ worth of analytical data.

Reflecting on this, author Nick Pettigrew shared a compelling perspective on Bluesky: “I believe that AI is the radium of our generation. While it has genuinely useful applications in controlled settings, we’ve carelessly infused it into everything from children’s toys to toothpaste, leading to unforeseen complications that future generations may question.”

There’s certainly more to unpack on this topic, but perhaps the AI will humorously eliminate those thoughts as well—definitely a modern twist on the classic “the dog ate my homework” excuse.

Qubit

It seems Feedback has gone years without acknowledging the contributions of quantum information theorists, a notable oversight on our part.

Have a Story for Feedback?

If you have an article idea, please email us at feedback@newscientist.com. Don’t forget to include your home address. You can find this week’s feedback and previous editions on our website.

Source: www.newscientist.com

Debunking the Biggest Myths About the Human Tongue: What You Need to Know

Can I Swallow My Tongue? Debunking the Myth

No, you cannot swallow your tongue. This common myth persists, but the truth is that your tongue is anchored to the base of your mouth, which limits its movement.

During events such as seizures, the tongue may slacken and fall backwards, but other muscles work to keep the airway open, preventing the tongue from obstructing breathing.

Tragically, this misconception has contributed to deaths. A 2025 survey found that in two-thirds of fatal cases, bystanders had tried to stop collapsed athletes from “swallowing their tongue” rather than starting resuscitation.

By contrast, around 74% of people survive a cardiac arrest if they are defibrillated within 3 minutes.


This article was inspired by the inquiry: “Can I swallow my tongue?” submitted via email by Johnny Norris.

If you have further questions, feel free to reach out to us at: questions@sciencefocus.com or connect with us on Facebook, Twitter, and Instagram (please include your name and location).

Explore more: Fun facts and amazing science articles


Source: www.sciencefocus.com

The Shocking True Story: When a Python Swallowed a Human Whole

Here’s some good news: snakes rarely consume humans. However, there have been alarming reports, particularly in Indonesia, where several incidents over the last decade involved people being killed or swallowed by pythons. A notable case included a 45-year-old woman discovered fully clothed inside a 5-meter (16-foot) bloated python.

Nonvenomous snakes like pythons and boas typically ambush their prey. They grip victims with backward-curved teeth and kill by constriction: coiling their powerful bodies around the prey cuts off the blood supply to vital organs, including the brain, causing loss of consciousness and death within minutes.

After immobilizing their prey, snakes swallow it whole, headfirst. Their unique skull structure allows them to consume animals significantly larger than their heads: the lower jaw consists of two halves connected by elastic ligaments, enabling the mouth to stretch around the meal. Muscles in the digestive tract then move the prey to the stomach, where strong acids and enzymes break it down over days or even weeks.

A snake’s diet is closely linked to its size, ranging from insects, rodents, birds, and lizards up to monkeys, pigs, deer, and even crocodiles. Adult humans pose a challenge because of their broad shoulders, which are difficult for a snake’s jaws to work past. Nevertheless, small adults and children can fall victim to the largest species.

Digestion of a human can take up to a month, including teeth and bones. Recent research has identified specific intestinal cells in pythons that effectively process substantial amounts of calcium and phosphorus from dissolved bone. However, snakes can’t digest keratin protein, so hair and nails remain intact. Additionally, clothing can complicate ingestion, leading to further challenges if a snake attempts to consume a human.


This article addresses a question posed by Southampton resident Lillian Hart: “What happens if I get eaten by a python?”

To submit your questions, please email questions@sciencefocus.com or connect with us on Facebook, Twitter, or Instagram (please include your name and location).

For more fascinating science content, visit our Ultimate Fun Facts page.




Source: www.sciencefocus.com

Discovering the Pioneers of Tool and Art Creation: Uncovering Human Innovation

Recent findings highlight the emergence of early digging and hunting tools.

Raul Martin/MSF/Science Photo Library

Subscribe to Our Human Story, a monthly newsletter exploring revolutionary archaeology. Sign up today!

In headlines about human evolution, terms like “oldest,” “earliest,” and “first” dominate. I’ve authored numerous articles featuring these phrases.

This isn’t just an attention-grabbing tactic; it serves a purpose. When researchers identify evidence suggesting a species or behavior predates previous estimates, it elucidates our understanding of timelines and causations.

For instance, it was once believed that all rock art was the work of Homo sapiens, since the oldest known examples were no more than about 40,000 years old. The discovery of cave art older than that, made before modern humans were present in the region, indicated that Neanderthals produced art too.

The past month has unveiled a flurry of “earliest” discoveries, prompting reflections on the reliability of such timelines. How can we ascertain the true age of early technologies?


Let the Exploration Begin!

During excavations in southern Greece, archaeologists unearthed two wooden tools estimated to be about 430,000 years old—possibly the oldest known wooden tools. One is believed to be a drilling rod, while the function of the other remains uncertain.

These tools are close in age to the previous record holders: the Clacton spear from Britain, estimated at about 400,000 years old, and the wooden spears found in Schöningen, now reassessed at nearly 300,000 years old.

Bone tools also emerged in Europe during this epoch. For instance, in Boxgrove, England, remnants from an elephant-like creature, possibly a steppe mammoth, were fashioned into hammers. These elephant bones date back 480,000 years, marking the oldest known utilization of elephant bone in Europe. However, in East Africa, ancient humans were crafting tools from elephant bones over 1.5 million years ago—perhaps much earlier.

Shifting our chronological lens, a recent discovery in Xigou, central China, reported a collection of 2,601 stone artifacts dating between 160,000 and 72,000 years ago, featuring composite tools attached to wooden handles—possibly the earliest evidence of such technologies in East Asia.

Moreover, an archaeological discovery in South Africa indicated that 60,000 years ago, early humans hunted with poisoned arrows, as evidenced by five arrowheads bearing traces of toxic plant compounds.

Each of these findings carries deeper implications.

Examining the Past

Traces of plant toxins discovered on arrow points

Marlize Lombard

The oldest verified wooden tools we have may not represent the absolute earliest. Preservation issues plague prehistoric wooden artifacts; they tend to decay, leading to gaps in the historical record.

According to Katerina Harvati, who directs the excavation that produced the wooden tools, people likely used such tools well before 400,000 years ago, but earlier examples remain undiscovered.

Woodworking is simpler than stone crafting, and since chimpanzees can fashion rudimentary wooden tools, it is plausible that wooden tools represent humanity’s earliest technological forms. An unexpected finding of a million-year-old wooden tool, though astonishing, would not be entirely improbable.

Consequently, major narratives about human technological advancement shouldn’t pivot solely on the age of the earliest wooden tools. Confidence in a timeline of tool use requires many finds spanning a range of ages.

As for the poisoned arrows, these are the earliest confirmed examples of poisoned arrowheads. Nonetheless, arrowheads whose designs resemble modern poisoned arrows have been identified from tens of thousands of years ago. Like wood, poison is organic and decays rapidly.

Here we can be somewhat more confident. Poisoned arrows are a composite technology that emerged late in the evolutionary timeline, likely not tracing back to early hominins such as Ardipithecus or Australopithecus.

Turning to prehistoric art, we find a wealth of complexity.

Exploring Prehistoric Graffiti

Hand stencils from a cave in Indonesia

Ahdi Agus Oktaviana

While cave paintings are iconic, other forms like carvings and engravings pose their own dating challenges. If a sculpture is buried in sediment, its age can usually be inferred from the sediment around it. Cave art is trickier: charcoal-based works younger than about 50,000 years can be radiocarbon dated fairly reliably, but anything beyond that window lies outside the method’s reach.

Recently, hand stencils found in caves on the Indonesian island of Sulawesi were dated to at least 67,800 years old, rivaling a similar stencil in Spain attributed to Neanderthals, arguably the oldest known rock art.

Notably, the phrase “at least” matters significantly here. These dates come from mineral crusts that formed on top of the art, so they provide only a minimum age. The artworks beneath could be much older.

The goal here isn’t to assert that we lack all knowledge, but rather, we possess a wealth of understanding, much of it newly uncovered in the last two decades. We must strive for a coherent timeline in human evolution and cultural development while acknowledging uncertainties.

In paleontology, having numerous specimens enhances reliability. Instead of studying charismatic prehistoric animals like dinosaurs, paleontologists often focus on smaller organisms that leave abundant fossil records, enabling deeper insights into their evolutionary progress.

However, in human evolution, the fossil record is uneven. Known specimens of an individual hominin species may number only in the dozens, and the earliest specimens are scarce, hindering our understanding of each species’ longevity and geographical spread. How the species relate to one another also remains unclear, given the possibility of complicated lineages.

Conversely, the stone tool record is extensive, reaching back to the 3.3-million-year-old Lomekwian stone tools from Kenya, and we may yet find older examples. But the earliest hominins, such as Orrorin (6-4.5 million years ago) and Ardipithecus (5.8-4.4 million years ago), likely spent most of their time in trees, making tool-making unlikely.

Wooden tools present their own challenges. Our knowledge remains limited and fragmented, largely due to preservation issues. A reliable timeline for the evolution of wooden tools seems elusive.

When it comes to ancient art, the challenges are primarily technical. Preserved artworks exist, but accurate dating techniques are limited. Building a chronology of artistic development is immensely challenging, although advances in technology may bring progress over time. With any luck, I’ll have a clearer picture of the evolution of human artistic practice by the time I retire.

In essence, all narratives about human evolution are, to some degree, provisional. This holds true across paleontological studies, especially for narratives with more uncertainty. The timeline of non-avian dinosaur extinction is quite clear-cut; however, human evolution allows for more variability. Further excavations and improved dating methods should refine our understanding, but some uncertainties may remain.


Topics:

  • Ancient Humans /
  • Our Human Story

Source: www.newscientist.com

How Controlled Fire Use Paved the Way for Human Evolution

New research reveals that burn injuries have significantly influenced the rapid evolution of humans.

Scientists from Imperial College London assert that our close relationship with fire has made our ancestors more resilient to burn injuries compared to other animals. This ongoing exposure to fire may have fundamentally shaped our wound healing processes and immune responses, leaving an indelible impact on our genetic makeup.

Study author Professor Armand Leroi, an evolutionary biologist at Imperial’s School of Life Sciences, states, “The concept of burn selection introduces a novel form of natural selection that is influenced by cultural factors.” He emphasizes, “This adds a new dimension to the narrative of what defines humanity, something we were previously unaware of.”

While minor burns typically heal swiftly, severe burns that take longer to mend can permit bacterial invasion, escalating the risk of infection.

Researchers hypothesize that these challenges prompted crucial genetic adaptations, leading evolution to favor traits that enhance survival after burn injuries. For instance, this includes accelerated inflammation responses and enhanced wound closure mechanisms.

Published in the journal BioEssays, the study contrasts human genomic data with that of other primates. Findings indicate that genes related to burn and wound healing exhibit accelerated evolution in humans, with increased mutations observed in these genes. These evolutionary changes are believed to have resulted in a thicker dermal layer of human skin and deeper placement of hair follicles and sweat glands.

However, the study suggests a trade-off; while amplified inflammation is beneficial for healing minor to moderate burns, it can exacerbate damage in cases of severe burns. More specifically, extreme inflammation from serious burns can lead to scarring and, in some instances, organ failure.

This research may shed light on why some individuals heal effectively while others struggle after burn-related injuries, potentially enhancing treatment methodologies for burns and scars.

“This field remains underexplored and represents a burgeoning area of research regarding burn injury responses,” Prince Kyei Baffour, a burn specialist and lecturer at Leeds Beckett University who was not involved in the study, told BBC Science Focus.

Baffour recommends further investigations into various forms of fire exposure, including smoke inhalation.


Source: www.sciencefocus.com

Neanderthal and Early Human Interbreeding Across Wide Regions: What the Evidence Shows

Artist’s Impression of Neanderthal Life

Christian Jegou/Science Photo Library

Homo sapiens and Neanderthals likely interbred across a vast region, extending from Western Europe to Asia.

Modern humans (Homo sapiens) and Neanderthals (Homo neanderthalensis) interbred, and most non-Africans today carry Neanderthal DNA amounting to roughly 2% of their genome. Gene flow ran in both directions: earlier mixing appears to have replaced the Neanderthal Y chromosome with a lineage derived from Homo sapiens.

Despite increasing knowledge about the timing of this hybridization, the specific regions and scales of these interactions long remained a mystery. Ancestors of Neanderthals departed Africa around 600,000 years ago, migrating toward Europe and Western Asia. The first evidence of Homo sapiens moving from Africa includes skeletal remains from sites in modern-day Israel and Greece, dating to approximately 200,000 years ago.

Evidence suggests that Homo sapiens contributed genes to the Neanderthal population of the Altai Mountains around 100,000 years ago. However, the main wave of migration out of Africa came around 60,000 years ago. Recent studies of ancient genomes indicate that the major period of gene flow between Homo sapiens and Neanderthals began around 50,000 years ago and may have lasted several thousand years.

This interaction is thought to have primarily taken place in the eastern Mediterranean, although pinpointing the exact locations remains challenging.

To investigate, Mathias Currat and his team at the University of Geneva analyzed 4,147 ancient genetic samples from over 1,200 locations, the oldest dating back approximately 44,000 years. They tracked the frequency of Neanderthal-derived genetic variants (introgressed alleles) passed down through hybridization.

“Our objective was to use the patterns of Neanderthal DNA in ancient human genomes to determine the sites of hybridization,” Currat explains.

Findings revealed that the proportion of transferred DNA increased gradually as one moved away from the eastern Mediterranean region, plateauing approximately 3,900 kilometers westward into Europe and eastward into Asia.

“We were surprised to identify a distinct pattern of increasing introgression rates in the human genome, likely linked to human expansion from Africa,” Currat notes. “This increase toward Europe and East Asia allows us to estimate the parameters of this hybrid zone.”

Computer simulations showed a hybrid zone potentially spanning much of Europe and the eastern Mediterranean, extending into western Asia.
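
The reported pattern — introgressed ancestry rising with distance from the eastern Mediterranean and leveling off around 3,900 km — can be illustrated with a toy saturating curve. This sketch is purely illustrative: the functional form and the `f_max` and `scale_km` values are invented here, not taken from the study, which relied on spatially explicit population simulations.

```python
import math

def introgression_fraction(distance_km, f_max=0.02, scale_km=1300.0):
    """Toy model: Neanderthal-derived ancestry accumulates along the
    expansion front and saturates toward f_max (about 2%, the level
    observed in non-Africans today) far from the contact zone."""
    return f_max * (1.0 - math.exp(-distance_km / scale_km))

# Ancestry rises with distance from the eastern Mediterranean,
# then flattens near the ~3,900 km plateau reported in the study.
profile = {d: introgression_fraction(d) for d in (500, 2000, 3900, 6000)}
```

The exponential form is just one convenient way to express "rises, then plateaus"; the real inference compares observed allele frequencies against simulated expansion scenarios.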

Interbreeding Zone between Neanderthals and Homo sapiens

Lionel N. Di Santo et al. 2026

“Our findings suggest a continuous series of interbreeding events across both space and time,” notes Currat. “However, the specifics of mating occurrences in this hybrid zone remain unknown.”

This hybrid zone encompasses nearly all known Neanderthal remains found across Western Eurasia, with the exception of the Altai region.

“The extensive geographical breadth of the putative hybrid zone suggests widespread interactions among populations,” states Leonardo Iasi from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Notably, the Atlantic periphery—including western France and much of the Iberian Peninsula—was not part of the hybrid zone, despite the established presence of Neanderthals in these regions. Currat suggests that interbreeding may not have occurred here or may not be reflected in the analyzed genetic samples.

“This study reveals ongoing interactions between modern humans and Neanderthals over extensive geographical areas and extended periods,” adds Iasi. The hybrid zone may extend further, though limited ancient DNA sampling in regions like the Arabian Peninsula complicates assessment of its reach.

“This pivotal research challenges the notion that interbreeding occurred only in one area of West Asia with a singular Neanderthal population (not represented in existing genetic samples). Homo sapiens appear to have dispersed from Africa in increasing numbers across expanding territories, likely outcompeting smaller Neanderthal groups they encountered throughout most of the recognized Neanderthal range,” comments Chris Stringer from the Natural History Museum in London.

Topics:

  • Neanderthal Man/
  • Ancient Humans

Source: www.newscientist.com

How Human Activity is Impacting Sex Change in Animals

Approximately 2% of the world’s fish species, or about 500 species, are known to change sex at some point during their adult life.

Some species, like the black-spotted fish (shown above), switch from female to male. Others, such as clownfish, change from male to female, while species like coral-dwelling gobies switch sex depending on environmental conditions.

This phenomenon is possible in fish because, unlike in mammals and birds, many fish species do not have their sex fixed by sex chromosomes.

Environmental cues trigger changes in gene activity, altering the production of key hormones and enzymes. One enzyme, aromatase, plays a critical role by converting androgens (male hormones) into estrogens, which drives the gonads to develop as ovaries.

Social dynamics can also act as environmental signals. Clark’s clownfish, for instance, live among sea anemones in small groups during the breeding season. If the breeding female dies, the largest male is known to change sex and assume her role.

Changes in water quality can also trigger sex change.

Research indicates that pollutants entering rivers can induce male fish to exhibit female traits, such as spawning behaviors.

Furthermore, a 2008 study found that a mere 1 to 2 degrees Celsius increase in water temperature could skew the sex ratio of certain fish towards a higher male count.

Some sex changes are advantageous: clownfish switch sex as a strategy to maximize reproduction. But human activities are disrupting these natural processes.

Polluted rivers and warming oceans pose serious risks to aquatic species in the future.


This article addresses the question posed by Alex Jackson via email: “How can animals switch gender?”

For inquiries, feel free to email us at questions@sciencefocus.com or connect with us on Facebook, Twitter, or Instagram (please include your name and location).

For more fascinating science content, explore our Ultimate Fun Facts page.



Source: www.sciencefocus.com

Love Machine Review: Exploring the Impact of Chatbots on Human Relationships

A woman with hearts in her eyes, representing the rise of AI relationships.

Imagine forming a deep bond with a chatbot that suddenly starts suggesting products.

Maria Kornieva/Getty Images

Love Machines
by James Muldoon, Faber & Faber

Artificial intelligence is becoming an inescapable reality, seamlessly integrating into our lives. Forget searching for chatbots; new icons will soon appear in your favorite applications, easily accessible with a single click, from WhatsApp to Google Drive, and even in basic programs like Microsoft Notepad.

The tech industry is making substantial investments in AI, pushing users to leverage these advancements. While many embrace AI for writing, management, and planning, some take it a step further, cultivating intimate relationships with their AI companions.

In James Muldoon’s Love Machine: How Artificial Intelligence Will Change Our Relationships, we delve into the intricate connections humans form with chatbots, whether designed for romance or simple companionship. These AI systems also serve as friends or therapists, a broad range of interactions that New Scientist has often covered.

In one interview, a 46-year-old woman in a passionless marriage shares her experience of using AI to explore her intricate sexual fantasies set in an 18th-century French villa. This opens up broader conversations about utilizing AI in more practical life scenarios, such as during a doctor’s visit.

Another interviewee, Madison, recounts uploading her late best friend’s text messages to a “deathbot” service, which lets her continue their conversations.

Muldoon’s anecdotes often carry an element of voyeuristic intrigue. They reveal the diverse ways individuals navigate their lives, some paths being healthier than others. What works for one person might prove detrimental for another.

However, a critical question remains. Are we naïve to think that AI services won’t evolve like social media, cluttered with advertisements for profit? Envision a long-term relationship with a chatbot that frequently pushes products your way. What happens if the company collapses? Can you secure backups of your artificial companions, or migrate them elsewhere? Do you hold rights to the generated data and networks? Moreover, there are psychological risks associated with forming attachments to these indifferent “yes-men,” which may further alienate individuals lacking real social connections.

Nonetheless, there are positive applications for this technology. In Ukraine, for instance, AI is being harnessed to help individuals suffering from PTSD, far exceeding the current availability of human therapists. The potential to revolutionize customer service, basic legal operations, and administrative tasks is immense. Yet, Muldoon’s narrative suggests that AI often functions as an unhealthy emotional crutch. One man, heartbroken over his girlfriend’s betrayal, envisions creating an AI partner and starting a family with her.

This book appears less about examining the social impacts of innovative technology and more like a warning signal regarding pervasive loneliness and the critical lack of mental health resources. A flourishing economy, robust healthcare system, and more supportive society could reduce our reliance on emotional bonds with software.

Humans are naturally inclined to anthropomorphize inanimate objects, even naming cars and guitars. Our brain’s tendency to perceive faces in random patterns—pareidolia—has been a survival mechanism since prehistoric times. So, is it surprising that we could be deceived by machines that mimic conversation?

If this provokes skepticism, guilty as charged. While there’s potential for machines to gain sentience and form genuine relationships in the future, such advancements are not yet realized. Today’s AI struggles with basic arithmetic and lacks genuine concern for users, despite producing seemingly thoughtful responses.

Topics:

Source: www.newscientist.com

Simulating the Human Brain with Supercomputers: Exploring Advanced Neuroscience Technology

3D MRI Scan of the Human Brain

K H FUNG/Science Photo Library

Simulating the human brain involves using advanced computing power to model billions of neurons, aiming to replicate the intricacies of real brain function. Researchers hope that such simulations, combined with a better understanding of neuronal wiring, will uncover the secrets of cognition.

Historically, researchers have focused on isolating specific brain regions for simulations to elucidate particular functions. However, a comprehensive model encompassing the entire brain has yet to be achieved. As Markus Diesmann from the Jülich Research Center in Germany notes, “This is now changing.”

This shift is largely due to the emergence of state-of-the-art supercomputers nearing exascale capability, meaning they can perform a quintillion (10^18) operations per second. Currently, only four such machines exist, according to the Top500 list. Diesmann’s team is set to run extensive brain simulations on one of them, JUPITER (Joint Undertaking Pioneer for Innovative and Transformative Exascale Research), in Germany.

Recently, Diesmann and colleagues demonstrated that a simple model of brain neurons and their synapses, known as a spiking neural network, can be configured to run across JUPITER’s thousands of GPUs. At that scale, the simulation reaches 20 billion neurons and 100 trillion connections, approximating the human cerebral cortex, the hub of higher brain functions.
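
At its smallest scale, a spiking neural network is built from simplified units such as the leaky integrate-and-fire neuron. The Python sketch below is purely illustrative — the parameters are invented, not those of Diesmann’s model — but it shows the basic mechanism the large simulations repeat billions of times.

```python
def simulate_lif(n_steps=200, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, input_current=0.06):
    """Leaky integrate-and-fire neuron: the membrane voltage leaks
    toward rest, is driven up by the input current, and emits a
    spike (then resets) whenever it crosses the threshold."""
    v = v_rest
    spikes = []
    for step in range(n_steps):
        # Euler step of dv/dt = -(v - v_rest) / tau + I
        v += dt * (-(v - v_rest) / tau + input_current)
        if v >= v_thresh:
            spikes.append(step)  # record the spike time
            v = v_rest           # reset after the spike
    return spikes

spikes = simulate_lif()
```

With this constant drive, the voltage settles toward tau * input_current = 1.2, above threshold, so the neuron fires at regular intervals. A full-scale simulation couples billions of such units through weighted spike events rather than running them in isolation.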

These simulations promise more impactful outcomes than previous models of smaller brains such as fruit flies. Recent insights from large language models reveal that larger systems exhibit behaviors unattainable in their smaller counterparts. “We recognize that expansive networks demonstrate qualitatively different capabilities than their reduced size equivalents,” asserts Diesmann. “It’s evident that larger networks offer unique functionalities.”

Thomas Nowotny from the University of Sussex emphasizes that downscaling risks omitting crucial characteristics entirely. “Conducting full-scale simulations is vital; without them, we can’t truly replicate reality,” Nowotny states.

The model in development for JUPITER is founded on empirical data from limited experiments on human neurons and synapses. As Johanna Senk, a collaborator of Diesmann’s at Sussex, explains, “We have anatomical data constraints coupled with substantial computational power.”

Comprehensive brain simulations could facilitate tests of foundational theories regarding memory formation—an endeavor impractical with miniature models or actual brains. Testing such theories might involve inputting images to observe neural responses and analyze alterations in memory formation with varying brain sizes. Furthermore, this approach could aid in drug testing, such as assessing impacts on a model of epilepsy characterized by abnormal brain activity.

The enhanced computational capabilities enable rapid brain simulations, thereby assisting researchers in understanding gradual processes such as learning, as noted by Senk. Additionally, researchers can devise more intricate biological models detailing neuronal changes and firings.

Nonetheless, despite the ability to simulate vast brain networks, Nowotny acknowledges considerable gaps in knowledge. Even simplified whole-brain models of organisms like fruit flies fail to replicate authentic animal behavior.

Simulations run on supercomputers remain fundamentally limited, lacking essential features inherent to real brains, such as input from a real-world environment. “While we can simulate brain size, we cannot fully replicate a functional brain,” warns Nowotny.

Topics:

Source: www.newscientist.com

New Research Reveals How Gut Microbes Influence Human Brain Evolution

Humans have larger brains relative to body size compared to other primates, which leads to a higher glucose demand that may be supported by gut microbiota changes influencing host metabolism. In this study, we investigated this hypothesis by inoculating germ-free mice with gut bacteria from three primate species with varying brain sizes. Notably, the brain gene expression in mice receiving human and macaque gut microbes mirrored patterns found in the respective primate brains. Human gut microbes enhanced glucose production and utilization in the mouse brains, suggesting that differences in gut microbiota across species can impact brain metabolism, indicating that gut microbiota may help meet the energy needs of large primate brains.



DeCasien et al. provided groundbreaking data showing that the gut microbiome shapes brain function differences among primates. Image credit: DeCasien et al., doi: 10.1073/pnas.2426232122.

“Our research demonstrates that microbes influence traits critical for understanding evolution, especially regarding the evolution of the human brain,” stated Katie Amato, lead author and researcher at Northwestern University.

This study builds upon prior research showing that introducing gut microbes from larger-brained primates into mice enhances energy metabolism in the host—a fundamental requirement for supporting the development and function of energetically costly large brains.

The researchers aimed to examine how gut microbes from primates of varying brain sizes affect host brain function. In a controlled laboratory setting, they transplanted gut bacteria from two large-brained primates (humans and squirrel monkeys) and a smaller-brained primate (macaque) into germ-free mice.

Within eight weeks, mice with gut microbes from smaller-brained primates exhibited distinct brain function compared to those with microbes from larger-brained primates.

Results indicated that mice hosting larger-brained microbes demonstrated increased expression of genes linked to energy production and synaptic plasticity, vital for the brain’s learning processes. Conversely, gene expression associated with these processes was diminished in mice hosting smaller-brained primate microbes.

“Interestingly, we compared our findings from mouse brains with actual macaque and human brain data, and, to our surprise, many of the gene expression patterns were remarkably similar,” Dr. Amato remarked.

“This means we could alter the mouse brain to resemble that of the primate from which the microbial sample was derived.”

Another notable discovery was the identification of gene expression patterns associated with ADHD, schizophrenia, bipolar disorder, and autism in mice with gut microbes from smaller-brained primates.

Although previous research has suggested correlations between conditions like autism and gut microbiome composition, definitive evidence linking microbiota to these conditions has been lacking.

“Our study further supports the idea that microbes may play a role in these disorders, emphasizing that the gut microbiome influences brain function during developmental stages,” Dr. Amato explained.

“We can speculate that exposure to ‘harmful’ microorganisms could alter human brain development, possibly leading to the onset of these disorders. Essentially, if critical human microorganisms are absent in early stages, functional brain changes may occur, increasing the risk of disorder manifestations.”

The findings are published today in the Proceedings of the National Academy of Sciences.

_____

Alex R. DeCasien et al. 2026. Primate gut microbiota induces evolutionarily significant changes in neurodevelopment in mice. PNAS 123(2): e2426232122; doi: 10.1073/pnas.2426232122

Source: www.sci.news

Fossil Analysis Sheds Light on Early Human Walking Evolution: Expanding the Debate

Sahelanthropus: Fossil comparison with chimpanzees and humans

Williams et al., Sci. Adv. 12, eadv0130

The long-standing debate regarding whether our earliest ancestors walked on knuckles like chimpanzees or stood upright like modern humans may be closer to resolution, yet skepticism remains.

Scott Williams and researchers at New York University recently reanalyzed fossil remains of Sahelanthropus tchadensis, indicating that this species possessed at least three anatomical features suggesting it was our earliest known bipedal ancestor.

The journey to this conclusion has been extensive.

Fossilized remains of a skull, teeth, and jawbone from approximately 7 million years ago were first identified in 2002 in Chad, north-central Africa. The distinctive features of this ancient species, including its prominent brow ridge and smaller canine teeth, were quickly acknowledged as diverging from ape characteristics.

Analysis of the skull’s anatomy suggests the head was positioned directly over the spine, as in other upright, bipedal hominins.

In 2004, French scientists uncovered the femur and ulna associated with the Sahelanthropus skull from Chad. However, it wasn’t until 2020 that researchers claimed the femur exhibited curvature similar to that of non-bipedal great apes.

Since then, scholarly debate has swung back and forth. In 2022, researchers Franck Guy and Guillaume Daver of the University of Poitiers argued that anatomical features of the femur indicate bipedalism. In 2024, Clément Zanolli and colleagues from the University of Bordeaux countered that Guy and Daver’s assertions were flawed, since the features they cited can also appear in non-bipedal great apes.

Lead study author Williams started with a “fairly ambivalent” stance on Sahelanthropus.

His team investigated the femur’s attachment point for the gluteus maximus muscle, finding similarities to human femur anatomy.

They also compared the femur and ulna size and shape; while similar in size to chimpanzee bones, they aligned more closely with human proportions.

Additionally, they identified the “femoral tuberosity,” a previously overlooked feature of Sahelanthropus.

“We initially identified it by touch, and later confirmed it with 3D scans of the fossil,” Williams shared. The bump is present in bipedal species but corresponds to a smooth area on the femurs of great apes, and it plays a critical role in upright movement.

This area serves as an attachment point for the iliofemoral ligament, the strongest ligament in the human body. While relaxed when seated, it tightens during standing or walking, securing the femoral head in the hip joint and preventing the torso from tilting backward or sideways.

However, Williams expressed doubts about whether this study would fully end the conversation about how Sahelanthropus moved.

“We are confident Sahelanthropus was an early bipedal hominin, but we must recognize that the debate is ongoing,” Williams noted.

In response to the new paper, Guy and Daver issued a joint statement: “This reaffirms our earlier interpretations about Sahelanthropus adaptations and locomotion, suggesting habitual bipedalism despite its ape-like morphology.”

They acknowledged that only new fossil discoveries could unequivocally conclude the matter.

John Hawks, a professor at the University of Wisconsin-Madison, also endorsed the new findings, noting their implications for understanding the complex origins of the hominin lineage.

“It may be misleading to see Sahelanthropus as one step in a gradual march toward upright posture, but it offers crucial insight into how these transformative changes unfolded,” Hawks commented.

However, Zanolli disagreed: “Most of the evidence aligns Sahelanthropus with traits seen in African great apes, suggesting its behavior was likely a mix between chimpanzees and gorillas, distinct from the habitual bipedalism of Australopithecus and Homo.”


Topics:

Source: www.newscientist.com

Scientists Decode 200,000-Year-Old Denisovan Genome: Unraveling Ancient Human Ancestry

A research team at the Max Planck Institute for Evolutionary Anthropology has successfully generated a high-quality Denisovan genome assembly using ancient DNA extracted from a molar tooth found in Denisova Cave. This genome, dating back approximately 200,000 years, significantly predates the only previously sequenced Denisovan specimen. The findings are prompting a reevaluation of when and where early human groups interacted, mixed, and migrated throughout Asia.

Artist’s concept of Penghu Denisovans walking under the bright sun during the Pleistocene in Taiwan. Image credit: Cheng-Han Sun.

Dr. Stéphane Peyrégne, an evolutionary geneticist from the Max Planck Institute for Evolutionary Anthropology, along with his team, recovered this Denisovan genome from a molar excavated in Denisova Cave, located in the Altai Mountains of southern Siberia. The cave is historically significant as the site where Denisovans were first identified, in 2010, through DNA analysis of a finger bone.

This cave continues to be pivotal in the study of human evolution, revealing repeated occupations by Denisovans, Neanderthals, and even offspring resulting from the interbreeding of these groups.

“The Denisovans were first identified in 2010 based on ancient DNA sourced from Denisova 3, a phalanx found in the Denisova Cave,” Dr. Peyrégne and his colleagues noted.

“This analysis confirms that Denisovans are closely related to Neanderthals, an extinct human group that thrived in Western Eurasia during the mid-to-late Pleistocene.”

Since then, twelve fragmentary remains and a single skull have been associated with Denisovans through DNA or protein analysis, with Denisova 3 being the only specimen yielding a high-quality genome.

The newly studied molar, belonging to a Denisovan male who lived approximately 200,000 years ago, predates modern humans’ migration out of Africa.

“In 2020, a complete upper left molar was found in Layer 17, one of the oldest cultural layers within the southern chamber of the Denisova Cave, dating between 200,000 and 170,000 years old based on optically stimulated luminescence,” the scientists elaborated.

“Designated as Denisova 25, this molar resembles others found at Denisova Cave, specifically Denisova 4 and Denisova 8, and exhibits larger dimensions compared to Neanderthal and most post-Middle Pleistocene hominid molars, indicating it likely belonged to a Denisovan.”

“Two samples of 2.7 mg and 8.9 mg were extracted by drilling a hole at the cement-enamel junction of the tooth, with an additional 12 subsamples varying from 4.5 to 20.2 mg collected by carefully scraping the outer root layer using a dental drill.”

Thanks to excellent DNA preservation, researchers successfully reconstructed the genome of Denisova 25 with high coverage, matching the quality of the 65,000-year-old female Denisova 3 genome.

Denisovans likely had dark skin, in contrast to the pale Neanderthals. The image depicts a Neanderthal. Image credit: Mauro Cutrona.

Comparisons between the genomes indicate that Denisovans were not a singular, homogeneous population.

Instead, at least two distinct Denisovan groups inhabited the Altai region at various intervals, with one group gradually replacing the other over millennia.

Earlier Denisovans possessed a greater amount of Neanderthal DNA than later populations, suggesting that interbreeding was a regular event rather than an isolated occurrence in the Ice Age landscape of Eurasia.

Even more intriguing, the study uncovered evidence that Denisovans engaged in interbreeding with “hyperarchaic” hominin groups that diverged from the human lineage before the ancestors of Denisovans, Neanderthals, and modern humans branched off.

“This second Denisovan genome illustrates the recurrent admixture between Neanderthals and Denisovans in the Altai region, suggesting these mixed populations were eventually supplanted by Denisovans from other regions, reinforcing the notion that Denisovans were widespread and that populations in the Altai may have existed at the periphery of their geographic range,” the researchers explained.

The Denisova 25 genome presents valuable insights into the long-standing mysteries regarding Denisovan ancestry in contemporary populations.

People in Oceania, parts of South Asia, and East Asia all carry Denisovan DNA, albeit from different Denisovan sources.

Through genetic comparison, scientists have identified at least three separate Denisovan sources, traced through Denisovan genetic segments found in thousands of modern genomes.

One lineage closely relates to the later Denisovan genome and is linked to widespread ancestry across East Asia and beyond.

A second, more distantly related Denisovan population contributed independently to Oceanian and South Asian ancestry.

Notably, East Asians do not share this highly divergent Denisovan ancestry, implying their ancestors may have taken a different route into Asia, potentially from the north, whereas Oceanian ancestors likely migrated through South Asia.

“Neanderthal-like DNA fragments appear in all populations, including Oceanians, aligning with a singular out-of-Africa migration; however, the distinct Denisovan gene flow points to multiple migrations into Asia,” the researchers stated.

Reconstruction of a young Denisovan woman based on skeletal profiles derived from ancient DNA methylation maps. Image credit: Maayan Harel.

The researchers believe certain Denisovan genetic traits offered advantages that increased their prevalence in modern human populations through the process of natural selection.

By analyzing both Denisovan genomes, the authors pinpointed numerous regions in present-day populations that appear to have originated from Denisovan introgression, particularly in Oceania and South Asia.

Genetic variants observed in the Denisovan genomes provide intriguing insights into their physical appearance.

Several unique mutations in Denisovans influence genes connected to cranial shape, jaw protrusion, and facial characteristics—attributes that align with the limited fossil record associated with Denisovans.

Changes also appear in regulatory regions affecting the FOXP2 gene, implicated in brain development and language in modern humans, raising questions about the cognitive capabilities of Denisovans, although researchers emphasize that genetic data cannot replace direct fossil or archaeological evidence.

“The impact of Denisovan alleles on modern human phenotypes might also shed light on Denisovan biology,” the researchers pointed out.

“Examining alleles linked to contemporary human traits, we identified 16 associations with 11 Denisovan alleles, covering aspects like height, blood pressure, cholesterol levels, and C-reactive protein levels.”

“Additionally, we recognized 305 expression quantitative trait loci (eQTLs) and 117 alternative-splicing QTLs that affect gene expression across 19 tissues in modern humans, with the most significant effects observable in the thyroid, tibial artery, testis, and muscle tissues.”

“These molecular effects can be utilized to explore additional phenotypes that are not retained in the fossil record. This updated catalog provides a more reliable foundation for investigating Denisovan traits, adaptations, and disease susceptibilities, some of which may have influenced modern humans through admixture.”

A preprint of the team’s research paper was posted on bioRxiv.org on October 20, 2025.

_____

Stéphane Peyrégne et al. 2025. High coverage genome of Denisovans from 200,000 years ago. bioRxiv doi: 10.1101/2025.10.20.683404

Source: www.sci.news

2025: A Year of Groundbreaking Discoveries in Human Evolution

This year brought many revelations about our ancient human relatives

WHPics / Alamy

This is an excerpt from Our Human Story, a newsletter about the revolution in archaeology. Sign up to receive it in your inbox every month.

If we try to summarize all the new fossils, methods, and ideas emerging from the study of human evolution in 2025, we might still be here in 2027. This year has been packed with developments, and I doubt it’s feasible for one individual to digest everything without isolating themselves from other distractions. This is particularly true in human evolution, which is a decentralized field. Unlike particle physicists, who often unite in teams for large-scale experiments, paleoanthropologists scatter in diverse directions.

There are two ways this year-long endeavor can falter. One risk is getting overwhelmed by an insurmountable amount of research, rendering it indecipherable. The other is simplifying the information to the point where it becomes incorrect.

With that in mind, here are three key points I want to clarify as we head into 2026. First, there have been remarkable discoveries about the Denisovans, reshaping our understanding of this mysterious group and challenging some of our previous assumptions. Second, we’ve seen a variety of new discoveries and ideas regarding how our distant ancestors created and utilized tools. Finally, we must consider the broader picture: how and why our species diverged so significantly from other primates.

The Denisovan Flood

Hebei GEO University

This year marks 15 years since we first learned about the Denisovans, an ancient group of humans that inhabited East Asia tens of thousands of years ago. My fascination with them has persisted, and this year, I was excited to witness a surge of discoveries that broadened our knowledge of their habitats and identities.

Denisovans were initially identified primarily through molecular evidence. The first fossil discovered was a small finger bone from Denisova Cave in Siberia, which defied identification based on its morphology alone; DNA extracted from it in 2010 revealed a previously unknown group. Genetic analyses showed that Denisovans were closely related to Neanderthals, who lived in Europe and Asia, and that they interbred with modern humans. Today, populations in Southeast Asia, particularly Papua New Guinea and the Philippines, carry the highest concentration of Denisovan DNA.

Since then, researchers have been on the hunt for additional Denisovan remains, though this endeavor has progressed slowly. Not until 2019 was a second example identified: a jawbone excavated from Baishiya Karst Cave in Xiahe, on the Tibetan Plateau. Over the next five years, several more fossils were tentatively attributed to Denisovans, notable for their large size and large teeth compared to modern humans.

Then came 2025, which brought numerous exciting findings. In April, Denisovans were confirmed in Taiwan, when a jawbone dredged from the Penghu Strait in 2008 was finally identified using preserved proteins. This discovery significantly extends the known range of Denisovans to the southeast, aligning with where their genetic markers remain today.

In June, the first Denisovan facial features emerged. A skull discovered in Harbin, northern China, had been described in 2021 and designated as a new species, Homo longi. Initially presumed to belong to a Denisovan because of its large size, it was confirmed as one when Qiaomei Fu and her team extracted proteins from the bone and mitochondrial DNA from its dental plaque.

So far, these findings align well with genetic evidence indicating that Denisovans roamed extensively across Asia. They also contribute to a coherent image of Denisovans as a larger species.

However, two additional discoveries in 2025 were surprising. In September, a crushed skull from Yunxian, China, dating back approximately 1 million years, was reconstructed and interpreted as an early Denisovan. This finding suggests that Denisovans existed as a distinct group much earlier than previously believed, indicating that their common ancestor with Neanderthals, known as Ancestor X, must have lived over a million years ago. If confirmed, it implies a longer evolutionary history for all three groups than previously thought.

Just a month ago, geneticists released a second high-quality Denisovan genome, extracted from a 200,000-year-old tooth found in Denisova Cave. Notably, this genome is distinctly different both from the first Denisovan genome and from the Denisovan DNA that survives in modern humans.

This indicates the existence of at least three groups of Denisovans: early ones, later ones, and those that hybridized with modern humans—this latter group remains a total archaeological enigma.

As our understanding of Denisovans deepens, their history appears much longer and more diverse than initially assumed. In particular, Denisovan populations that interbred with modern humans remain elusive.

For the past 15 years, Denisovans have captivated my interest. Despite their widespread presence across continents for hundreds of thousands of years, only a handful of remains have been documented.

Fortunately, I have a penchant for mysteries, because this puzzle won’t be solved anytime soon.

Tool Manufacturing

TW Plummer, JS Oliver, EM Finestone, Homa Peninsula Paleoanthropology Project

Creating and using tools is one of humanity’s most critical functions. This ability isn’t unique to our species, as many other animals also use and even make tools. Primatologist Jane Goodall, who passed away this year, famously demonstrated that chimpanzees can manufacture tools. However, humans have significantly elevated this skill, producing a more diverse array of tools that are often more complex and essential to our survival than those of any other animal.

As we delve deeper into the fossil record, we’re discovering that the practice of tool-making dates back further than previously thought. In March, I reported on excavations in Tanzania revealing that an unidentified ancient human was consistently creating bone tools 1.5 million years ago, well over a million years before bone tools were believed to become commonplace. Similarly, while it was previously thought that humans began crafting artifacts from ivory 50,000 years ago, this year, a 400,000-year-old flake from a mammoth tusk was discovered in Ukraine.

Even older stone tools have surfaced, likely due in part to their greater preservation potential. Crude tools have been identified from 3.3 million years ago at Lomekwi, Kenya. Last month in Our Human Story, I mentioned excavations in another part of Kenya demonstrating that ancient humans consistently produced a specific type of Oldowan tools between 2.75 million and 2.44 million years ago, indicating that tool-making was already a habitual practice.

Often, tools are found without associated bones, making it challenging to determine their makers’ identities. It’s tempting to assume that most tools belong to our genus, Homo, or perhaps to Australopithecus, our more distant ancestors. However, increasing evidence suggests that Paranthropus—a hominin with a small brain and large teeth, which thrived in Africa for hundreds of thousands of years—could also have made tools, at least simple ones like the Oldowans.

Two years ago, Oldowan tools were discovered alongside Paranthropus teeth in Kenya—admittedly not definitive evidence, but strongly suggestive. This year, a fossil of Paranthropus revealed that its hand exhibited a combination of gorilla-like strength and impressive dexterity, indicating capable precision gripping essential for tool-making.

How did these ancients conceive of their tools? One possibility, suggested by Metin Eren and others this year, is that they didn’t consciously create them. Instead, tool-like stones form naturally under various conditions, such as frost cracking rocks or elephants trampling them. Early humans may have utilized these “natural stones,” knowledge of which eventually led to their replication.

As humans continued to develop increasingly complex tools, the cognitive demands of creating them likely escalated, potentially facilitating the emergence of language as we needed to communicate how to make and use these advanced tools. This year’s research explored aspects like the difficulty of learning various skills, whether close observation is necessary, or if mere exposure suffices. The findings suggest two significant changes in cultural transmission that may correlate with technological advancements.

Like most aspects of evolution, tool-making appears to have gradually evolved from our primate predecessors, reshaping our cognitive capabilities in the process.

Big Picture

Alexandra Morton-Hayward

Now let’s address the age-old question of how and why humans evolved so distinctly, and which traits truly set us apart. This topic is always challenging to navigate for three main reasons.

First, human uniqueness is multifaceted and often contradictory. Social scientist Jonathan R. Goodman suggested in July that evolution has forged humans to embody both “Machiavellian” traits—planning and betraying one another—and “natural socialist” instincts driven by strong social norms against murder and theft. Claims that humans are inherently generous or instinctively cruel tend to oversimplify the matter excessively.

Second, our perceptions of what makes us unique are shaped by the societies in which we exist. For instance, many cultures remain predominantly male-focused, leading our historical narratives to center around men. While the feminist movement is working to amend this imbalance, progress remains slow. Laura Spinney’s article on prehistoric women suggested that “throughout prehistory, women were rulers, warriors, hunters, and shamans,” a viewpoint made viable only through dedicated research.

Third, reconstructing the thought processes of ancient people as they adopted certain behaviors is inherently difficult, if not impossible. Why did early humans bury their dead and enact funerary rituals? How were dogs and other animals domesticated? What choices shaped ancient humans’ paths toward change?

Still, I want to spotlight two intriguing ideas surrounding the evolution of the human brain and intelligence. One concerns the role of placental hormones that developing babies are exposed to in the womb. Preliminary evidence suggests these hormones may contribute to brain growth, equipping us with the neural capacity to navigate our unusually complex social environments.

Another compelling possibility proposes that the genetic changes associated with our increased intelligence may have also led to vulnerabilities to mental illness. In October, Christa Lesté-Lasserre reported that genetic mutations linked to intelligence emerged in our distant ancestors, followed by mutations associated with mental disorders.

This notion has intrigued me for years, rooted in the observation that wild animals, including our close relatives like chimpanzees, do not appear to suffer from serious mental illnesses such as schizophrenia or bipolar disorder. Perhaps our brains operate at the edge of our neural capabilities. Like a finely-tuned sports car, we can excel but are also prone to breakdowns. While still a hypothesis, this concept is difficult to shake off.

Oh, one more point. Although we often shy away from discussing methodological advancements, as readers generally prefer results, we made an exception in May. Alexandra Morton-Hayward and her colleagues at the University of Oxford developed a method to extract proteins from ancient brains and potentially other soft tissues. Though such tissues are rarer in the fossil record compared to bones and teeth, some remain preserved and may offer a wealth of information. The first results could be available next year.

Source: www.newscientist.com

Human Challenge Trials Are on the Rise Like Never Before

Shutterstock/Andrey Kuzmyk

Available rooms: Minimum stay of 2 weeks, featuring a private bathroom. Enjoy a complimentary pool. Package includes meals, Wi-Fi, and infectious viruses. Call now!

Would you be inclined to respond to such advertisements? What about those that guarantee severe diarrhea? How many stars would it take to make you consider adding STDs to your stay? Perhaps a substantial cash incentive might sway your decision?

Welcome to the peculiar realm of human challenge testing – arriving soon at a biosecure isolation facility nearby.

In response to the collective trauma of the coronavirus pandemic, researchers are increasingly enlisting healthy individuals to participate in trials that intentionally expose them to illness. Volunteers are now more willing than ever to contract diseases ranging from dysentery and cholera to gonorrhea.

As detailed on page 38, challenge trials offer a rapid and relatively affordable method for assessing vaccines and treatments while monitoring infection dynamics. Contrary to popular belief, the risks may not be as high as presumed. Trials are conducted under stringent medical oversight and proceed only if effective therapies can quickly alleviate symptoms.



However, it’s not without its hazards, and the ethical landscape remains murky. Unlike patients with existing conditions who may opt for experimental therapies that could potentially cure them, challenge trials seek to induce illness with little or no immediate medical benefit, even if for a brief duration.

Moreover, we cannot always prevent potential long-term consequences. For example, some ethicists have expressed concerns regarding the manner in which British scientists conducted COVID-19 challenge trials during the pandemic, underscoring the risks of chronic symptoms associated with COVID-19.

Nonetheless, the pandemic has also underscored the significant positive impact and value of vaccines. Current data indicates that human challenge testing is safe, particularly for young, healthy individuals. These studies could hasten the development of new defenses against persistent epidemics such as malaria, Zika, and norovirus. The pressing question may be: How can we expand these efforts?

Source: www.newscientist.com

Human Cloning: Are the Ultra-Wealthy Engaging in Secret Experiments?

Is it conceivable that the ultra-wealthy are covertly cloning humans?

Juan Lovaro/Shutterstock

Throughout my extensive career reporting on extraordinary breakthroughs in biology, I’ve observed numerous concepts gaining massive attention, receiving thorough media scrutiny for years, and later fading from the public consciousness. Take, for instance, human cloning.

Following the landmark announcement of Dolly the sheep in 1997—the first mammal cloned from an adult cell—speculation soared about the potential for human cloning. There were even some implausible claims that human clones already existed. Yet, in recent years, such fervor has significantly diminished.

Nonetheless, reproductive technologies have evolved remarkably since the 1990s. Notably, just six years after CRISPR was unveiled, the world saw the first unlawful creation of a gene-edited child. This raises questions about what might be occurring behind closed doors. Are human clones already out there, undetected? Of course, identical twins don’t count.

What could motivate someone to engage in this? Recently, in a discussion between Vladimir Putin and Xi Jinping, the topic of extending life via organ transplants emerged. The most effective method could involve cloning individuals for organ harvesting, thereby eliminating the common issue of immune rejection often depicted in science fiction narratives. Consider the film The Island or the book Never Let Me Go.

Moreover, cloning brings forth the notion of creating a duplicate of a person, offering a semblance of immortality, as illustrated in the television series Foundation, where the empire is governed by successive clones. However, our experiences with identical twins tell us that sharing the same genome does not equate to being the same person. As shown by Tatiana Maslany in the series Orphan Black, each clone evolves into a distinct individual. Nevertheless, wealthy individuals can hold irrational beliefs similar to others and often display a particular desire to extend their lifespans.

For scientists, there’s also the allure of being the first to achieve a groundbreaking feat. A report from a Chinese commission determined that the creators of CRISPR children “conducted research illegally in pursuit of personal fame and profit.”

Goals of Therapeutic Cloning

So, could human clones exist? For many years, the notion of cloning mammals was deemed unfeasible. Early embryo cells have the ability to differentiate into any bodily part but quickly become specialized—a process previously thought irreversible.

Dolly’s existence disproved that theory. She was produced by fusing cells from an adult ewe’s udder with a DNA-depleted egg. Her announcement in February 1997 led to a frenzy of attempts to generate cloned human embryos. The objective wasn’t to create cloned infants, but rather to harvest embryonic stem cells for novel medical therapies. As cloned cells are a perfect match for an individual, they could theoretically be employed to produce replacement tissues and organs with no risk of immune rejection.

However, extracting stem cells from cloned human embryos has proven more challenging than anticipated. It wasn’t until 2004 that Hwang Woo-suk claimed success. At the time, I found his paper impressive, as it addressed every conceivable objection effectively. Unfortunately, the study was later revealed to be fraudulent, resulting in its retraction. The experience remains ingrained in my memory: nowadays, whenever a paper appears too good to be true, my initial instinct is to be skeptical.

Ultimately, true embryonic stem cells from cloned human embryos weren’t obtained until 2013. By then, alternative methods for generating compatible stem cells through the activation of specific genes had emerged, leading to a decline in interest in therapeutic cloning.

Cloned Pets and Other Animals

Conversely, animal cloning has become increasingly established. Occasionally, headlines emerge when celebrities disclose that they’ve cloned their pets. Recently, former NFL player Tom Brady made news by revealing that his dog is a clone, produced by a company acquired by Colossal Biosciences.

Apart from serving as a way to “revive” cherished pets, cloning is also utilized in agriculture and horse breeding. For instance, male horses are often castrated, meaning that if they excel in show jumping, the only method to utilize their genetic material for future breeding is through cloning.

Nonetheless, animal cloning continues to pose significant challenges. A 2022 study of the first 1000 dog clones found that the cloning process is still highly inefficient, with merely 2 percent of implanted cloned embryos resulting in live births. This inefficiency contributes to the high cost of pet cloning, around $50,000.

Moreover, about 20 percent of cloned dogs presented noticeable physical anomalies, including enlarged tongues, unusual eye colors, cleft palates, and excessive muscle mass. Some male dog clones even exhibited female physical traits.

But what if the wealthy and powerful could clone themselves, unburdened by such concerns?

Challenges in Adult Cloning

Multiple sources have indicated several successful monkey cloning endeavors since 2017, suggesting potential applicability for humans as well. However, these sources often fail to mention that all these primate clones have been derived from fetal cells, not adult ones.

The crux of the issue lies in the fact that reprogramming adult cells to mimic a fetal state is far more complex than reprogramming fetal cells. To me, cloning signifies creating a genetically identical replica of an adult, which is what made Dolly’s achievement exceptional.

In essence, I remain convinced that cloning an adult is still unattainable. In a world filled with dictators and eccentric billionaires, this might be a fortunate circumstance.


Source: www.newscientist.com

Neuroscientists Discover Five Key Phases of Brain Structure Development Across the Human Lifespan

Recent findings from neuroscientists reveal that the brain’s structure divides into five main stages throughout a typical person’s life, marked by four significant turning points from birth to death where the brain undergoes reorganization. Brain topology in children evolves from birth up to a crucial transition at age 9, then shifts into adolescence, which generally lasts until around age 32. In your early 30s, the neural wiring transitions to adult mode, marking the longest phase that extends for over 30 years. The third turning point occurs at about age 66, indicating the start of an early aging phase of brain structure, while the late brain phase begins around age 83.
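In network terms, a “turning point” is an age at which the trajectory of a connectivity measure bends most sharply. A toy sketch of the idea in Python, using entirely made-up numbers rather than the study’s data:

```python
def turning_point(ages, values):
    """Return the age where the slope of the metric changes most sharply,
    measured as the absolute difference between adjacent segment slopes."""
    best_age, best_change = None, -1.0
    for i in range(1, len(ages) - 1):
        slope_before = (values[i] - values[i - 1]) / (ages[i] - ages[i - 1])
        slope_after = (values[i + 1] - values[i]) / (ages[i + 1] - ages[i])
        change = abs(slope_after - slope_before)
        if change > best_change:
            best_age, best_change = ages[i], change
    return best_age

# Hypothetical connectivity metric that rises steeply, then plateaus
# (illustrative numbers only, not taken from Mousley et al.).
ages = [9, 20, 32, 45, 60]
metric = [0.50, 0.70, 0.85, 0.86, 0.86]
print(turning_point(ages, metric))
```

In this toy series the steepest bend falls at the age where growth gives way to a plateau, which is the flavor of analysis (applied to far richer diffusion-MRI data) behind the five phases described above.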

Mousley et al. compared diffusion MRI scans of the brains of 3,802 individuals aged 0 to 90 years; such scans map neural connections by tracking the movement of water molecules through brain tissue. Image credit: Mousley et al., doi: 10.1038/s41467-025-65974-8.

“While we know brain wiring plays a crucial role in our development, we still lack a comprehensive understanding of how and why it fluctuates throughout life,” explained Dr. Alexa Mousley, a researcher at the University of Cambridge.

“This study is the first to pinpoint essential stages in brain wiring throughout the human lifespan.”

“These epochs offer vital insight into our brain’s strengths and vulnerabilities at different life stages.”

“Understanding these changes could shed light on why certain developmental challenges arise, such as learning difficulties in early childhood or dementia later in life.”

During the transition from infancy to childhood, strengthened neural networks emerge as the excess of synapses (the connections between neurons) in a baby’s brain diminishes, allowing only the most active synapses to thrive.

The brain rewires in a consistent pattern from birth until approximately age 9.

In this timeframe, the volumes of gray and white matter grow swiftly, resulting in maximal cortical thickness (the distance from the outer gray matter to the inner white matter), with the cortical folds stabilizing.

By the first turning point at age 9, cognitive abilities begin to evolve gradually, and the likelihood of mental health issues becomes more pronounced.

The second stage, adolescence, is characterized by an ongoing increase in white matter volume, leading to an enhancement in the sophistication of the brain’s communication networks, measurable through water diffusion scans.

This phase is marked by improved connectivity efficiency across specific regions and swift communication throughout the brain, correlating with enhanced cognitive performance.
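“Connectivity efficiency” in brain networks is commonly quantified as global efficiency: the average inverse shortest-path length over all pairs of nodes. A minimal, self-contained sketch (the small graph below is hypothetical, not from the study):

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS distances from src in an unweighted graph (dict: node -> neighbors)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        for nb in adj[node]:
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)
    return dist

def global_efficiency(adj):
    """Mean of 1/d(i, j) over ordered node pairs; unreachable pairs count as 0."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for src in nodes:
        dist = shortest_paths(adj, src)
        for dst in nodes:
            if dst != src and dst in dist:
                total += 1.0 / dist[dst]
    return total / (n * (n - 1))

# A 4-node ring: every node is at most two hops from any other.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(global_efficiency(ring))
```

Adding a shortcut edge shortens average paths and raises the score, which is the sense in which efficiency is “closely linked to shorter pathways” in the quote that follows.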

“As expected, neural efficiency is closely linked to shorter pathways, and this efficiency increases throughout adolescence,” Mousley notes.

“These advancements peak in your early 30s, representing the most significant turning point in your lifetime.”

“Around age 32, the change in wiring direction is the most pronounced, and the overall trajectory alteration is greater than at any other turning points.”

“Although the onset of puberty is clearly defined, the conclusion is far harder to identify scientifically.”

“Based solely on neural structure, we found that puberty-related changes in brain structure conclude by the early 30s.”

Post age 32, adulthood enters its longest phase, characterized by a more stable brain structure with no significant turning points for three decades. This aligns with findings indicating an “intellectual and personality plateau.”

Additionally, the researchers observed a greater degree of “segregation” during this phase, indicating a gradual fragmentation of brain regions.

The turning point at age 66 is more gradual, lacking dramatic structural shifts; however, notable changes in brain network patterns were found around this age on average.

“Our findings indicate a gradual reconfiguration of brain networks that peaks in the mid-60s,” stated Dr. Mousley.

“This is likely linked to aging, as white matter begins to decline, reducing connectivity further.”

“We are currently facing an era where individuals are increasingly at risk for various health conditions impacting the brain, such as high blood pressure.”

The final turning point arises around age 83, ushering in the last stage of brain structure.

Data from this stage is scarce, but a key characteristic is the shift from global to local connectivity as interactions across the brain diminish while reliance on specific regions intensifies.

Professor Duncan Astle of the University of Cambridge remarked: “In reflection, many of us recognize that our lives encompass distinct stages.”

“Interestingly, the brain also navigates through these phases.”

“Numerous neurodevelopmental, mental health, and neurological conditions are tied to the brain’s wiring.”

“In fact, variations in brain wiring can predict challenges with attention, language, memory, and a wide array of other behaviors.”

“Recognizing that structural transformations in the brain occur not in a linear fashion but through several major turning points can assist us in identifying when and how brain wiring may be vulnerable to disruptions.”

A paper detailing the study was published in the journal Nature Communications on November 25.

_____

A. Mousley et al. 2025. Topological turning points across the human lifespan. Nat Commun 16, 10055; doi: 10.1038/s41467-025-65974-8

Source: www.sci.news

Mysterious Foot Indicates Another Early Human Relative Coexisted with Lucy

In a recent breakthrough regarding human evolution, researchers have determined that a peculiar fossil foot unearthed in Ethiopia belonged to a different species of ancient human relative.

The findings, released on Wednesday in the journal Nature, indicate the foot dates back approximately 3.4 million years and belonged to a creature that lived alongside Lucy, another ancient human relative who inhabited the region around the same period.

However, scientists have revealed that the Burtele foot, named after the site in northeastern Ethiopia where it was discovered in 2009, is distinctly different.

The fossil foot has an opposable big toe akin to an ape’s, suggesting its owner was a proficient climber, likely spending more time in trees than Lucy, according to the study.

Elements of the Burtele foot, discovered in Ethiopia in 2009.
Yohannes Haile-Selassie/Institute of Human Origins, Arizona State University (via AFP)

For many years, Lucy’s species was believed to be the common ancestor of all subsequent hominins: the lineage, including Homo sapiens, that is more closely related to humans than to chimpanzees.

Researchers were unable to confirm that the foot belonged to a novel species until they examined additional fossils found in the same vicinity, including a jawbone with twelve teeth.

After identifying these remains as Australopithecus deyiremeda, they determined that the Burtele foot belonged to the same species.

John Rowan, an assistant professor of human evolution at the University of Cambridge, expressed that their conclusions were “very reasonable.”

“We now have stronger evidence that closely related, yet adaptively distinct species coexisted,” Rowan, who was not part of the study, communicated in an email to NBC News on Thursday.

The research also examined how these species interacted within the same environment. The team, led by Yohannes Haile-Selassie of Arizona State University, suggested that the newly identified species spent considerable time in wooded areas.

The study proposed that Lucy, or Australopithecus afarensis, was likely traversing the open land, positing that the two species probably had divergent diets and utilized their habitats in distinct ways.

Analyses of the newly found teeth revealed that A. deyiremeda was more primitive than Lucy and likely fed on leaves, fruits, and nuts, the study indicated.

“These distinctions suggest they were less likely to compete directly for identical resources,” remarked Ashleigh L.A. Wiseman, an assistant professor at the McDonald Institute for Archaeological Research at the University of Cambridge.

In an email on Thursday, Wiseman highlighted the significant implications of this discovery for our understanding of evolution, stating that it “reminds us that human evolution is not a linear progression of one species evolving into the next.”

Instead, she asserted, it should be viewed as a branching family tree with numerous so-called “cousins” existing simultaneously, each adopting various survival strategies. “Did they interact? We may never know the answer to that,” she concluded.

Rowan also noted that as the number of well-documented species related to humans increases, so do the inquiries concerning our ancestry. “Which species were our direct ancestors? Which species were our close relatives? That’s the challenge,” he remarked. “As species diversity ascends, so too do the avenues for plausible reconstructions of how human evolution unfolded.”

Wiseman cautioned that definitive species classifications should rely on well-preserved skulls and fossil fragments belonging to multiple related individuals. While the new study bolsters the case for A. deiremeda, it “does not dismiss all other alternative interpretations,” she stated.

Source: www.nbcnews.com

Ancient Foot Bones Uncover Evidence of Coexistence Between Two Human Species

Bones arranged in the approximate anatomical position of the right foot

The ancient human foot bones have puzzled scientists since their discovery in 2009.

Yohannes Haile-Selassie

The origins of 3.4-million-year-old foot bones uncovered in Ethiopia may finally have been explained, prompting a reevaluation of how various ancient human relatives coexisted.

In 2009, Yohannes Haile-Selassie and his team at Arizona State University unearthed eight hominin bones that once formed a right foot at a site known as Burtele in northeastern Ethiopia’s Afar region.

This discovery, dubbed the Burtele foot, features an opposable big toe akin to a gorilla’s, indicating that whichever species it belonged to was well adapted for life in the trees.

Another ancient human species, Australopithecus afarensis, represented by the well-known Lucy fossil (also discovered in the Afar region), inhabited the vicinity, but the Burtele foot appeared to belong to a different species. “From the outset, we realized it was not part of Lucy’s lineage,” Haile-Selassie states.

There were two primary hypotheses that intrigued Haile-Selassie: the foot could belong to another species within the genus Australopithecus, or perhaps to an older, more primitive group known as Ardipithecus, which lived in Ethiopia more than a million years earlier and also possessed opposable big toes.

Meanwhile, in 2015, scientists announced the identification of a previously unknown hominin species, named Australopithecus deyiremeda, after jaw and tooth remains were found in the same region. Initially, there was uncertainty about whether the enigmatic foot bones were part of A. deyiremeda, as they could not be directly matched to the jaw and tooth remains.

However, subsequent work produced a crucial piece of evidence. A lower jaw of A. deyiremeda was found within 300 meters of the Burtele foot, and both sets of remains were dated to the same geological era. This led the research team to conclude that the Burtele foot belonged to A. deyiremeda.

The Burtele foot (left), attributed to Australopithecus deyiremeda, beside the bones of a gorilla foot (right)

Yohannes Haile-Selassie

In a separate part of the study, researchers analyzed carbon isotopes preserved in the fossils’ tooth enamel. They found that A. deyiremeda primarily consumed foods from trees and shrubs, while the teeth of A. afarensis pointed to a diet richer in grasses.

Haile-Selassie noted that this finding suggests the two hominin species shared the same landscape without competing for resources. He believes these groups could have coexisted harmoniously, engaging in separate activities. “They must have crossed paths and interacted within the same habitat, each doing their own thing,” he remarked. “While members of Australopithecus deyiremeda may have spent time in trees, afarensis was likely wandering the adjacent grasslands.”

This revelation enhances our understanding of human evolution. “Historically, some have argued that only a single hominin species existed at any given time, with newer forms emerging eventually,” Haile-Selassie explained. “We are now realizing that our evolutionary path was not straightforward. Multiple closely related hominin species coexisted at the same time, indicating that coexistence was a fundamental aspect of our ancestors’ lives.”

Carrie Mongle, a professor at Stony Brook University in New York, expressed enthusiasm about these developments. “Understanding more about the diversity of Pliocene hominins is truly exciting,” she stated. “This period, around 3 million years ago, was rich in evolutionary significance.”


Source: www.newscientist.com

Study identifies five distinct ‘eras’ of brain development throughout human life.

As we grow older, our brains undergo significant rewiring.

Recent studies indicate that this transformation takes place in various stages, or “epochs,” as our neural structures evolve, altering how we think and process information.

For the first time, scientists have pinpointed four key turning points in the typical aging brain: ages 9, 32, 66, and 83. Between these turning points, our brains display distinctly different structural characteristics.

The findings, published Tuesday in Nature Communications, reveal that human cognitive ability does not merely peak and then decline with age. In fact, the research suggests that the interval between ages 9 and 32 is the sole period in which our neural networks grow increasingly efficient.

In adulthood, from 32 to 66 years, the structure of the average brain stabilizes without significant modifications, leading researchers to believe that intelligence and personality tend to plateau during this time.

Following the final turning point, from age 83 onward, the brain increasingly relies on specific regions as the connections between them slowly deteriorate.

“It’s not a linear progression,” comments lead author Alexa Mousley, a postdoctoral researcher at the University of Cambridge. “This marks an initial step in understanding how brain changes differ with age.”

These insights could shed light on why certain mental health and neurological issues emerge during specific rewiring phases.

Rick Betzel, a neuroscience professor at the University of Minnesota and not a part of the study, remarked that while the findings are intriguing, further data is necessary to substantiate the conclusions. He cautioned that the theory might face challenges over time.

“They undertook a very ambitious effort,” Betzel said about the study. “We shall see where things stand in a few years.”

For their research, Mousley and colleagues examined MRI diffusion scans (images illustrating the movement of water molecules in the brain) from around 3,800 individuals, ranging from newborns to 90-year-olds. Their objective was to map neural connections at varying life stages.

In the brain, bundles of nerve fibers that convey signals are encased in fatty tissue called myelin, analogous to wiring or plumbing. Water molecules diffusing through the brain typically travel along these fibers, allowing researchers to identify neural pathways.

“We can’t open up the skull…we depend on non-invasive techniques,” Betzel mentioned, discussing this form of neuroscience research. “We aim to determine the location of these fiber bundles.”

The study used these MRI scans to chart the neural networks of an average individual across the lifespan, pinpointing where connections strengthen or weaken. The five “eras” described in the paper reflect distinct patterns in the neural connections the researchers observed.

They propose that the initial stage lasts until age nine, during which both gray and white matter increase rapidly. This phase also involves the pruning of redundant synapses as the brain reorganizes itself.

Between ages 9 and 32, there is an extensive period of rewiring. The brain is characterized by swift communication across its regions and efficient connections.

Most mental health disorders are diagnosed during this interval, Mousley pointed out. “Is there something about this second phase of life that might predispose individuals to mental health issues?”

From ages 32 to 66, the brain reaches a plateau. It continues to rewire, but this process occurs at a slower and less dramatic pace.

Subsequently, from ages 66 to 83, the brain undergoes “modularization,” where neural networks split into highly interconnected subnetworks with diminished central integration. By age 83, connectivity further declines.

Betzel expressed that the theory presented in this study is likely reflective of people’s experiences with aging and cognition.

“It’s something we naturally resonate with. I have two young kids, and I often think, ‘They’re transitioning out of toddlerhood,'” Betzel remarked. “Science may eventually uncover the truth. But are they precisely at the correct age? I’m not sure.”

Ideally, researchers would gather MRI diffusion data from a large cohort by scanning each individual repeatedly across their lifespan, but such a study would take decades to complete.

Instead, the team amalgamated nine diverse datasets containing neuroimaging from prior studies, striving to harmonize them.

Betzel noted that these datasets vary in quality and methodology, and attempts to align them may obscure essential variations and introduce bias into the findings.

Nonetheless, he acknowledged that the paper’s authors are “thoughtful” and proficient scientists who did their utmost to mitigate that risk.

“Brain networks evolve throughout life, that’s undeniable. But are there five precise moments of transition? That, I think, remains to be seen.”

Source: www.nbcnews.com

We Might Have Uncovered the First Genuine Human Pheromone

The notion that humans might use chemical signals known as pheromones for communication has intrigued scientists and the general public alike for many years, leading to numerous investigations aimed at discovering evidence.

Pheromones are well-documented in the animal kingdom. Ants use chemical trails for navigation and communication, dogs mark their territory with scent signals, and moths emit airborne pheromones to attract partners.

However, the question of whether humans share this capability is much more complex. Can one person elicit a physical or emotional reaction in another without their awareness? Might this influence attraction?

After over six decades of research, the answers remain uncertain, but recent findings indicate we might be getting closer to understanding this phenomenon.

First Whiff

In 1959, Adolf Butenandt and his team identified the first pheromones, specifically bombykol, a chemical released by female silk moths to attract male counterparts.

Shortly after, scientists introduced the term “pheromone” to describe chemical signals emitted by one individual that trigger distinct responses in another of the same species.

This discovery opened the door to exploring potential human equivalents.

One of the earliest notable claims regarding human pheromones was put forth by Martha McClintock in 1971. Her study of 135 women living in university dorms suggested that their menstrual cycles appeared to synchronize over the course of the year.

This phenomenon, termed the “McClintock effect,” was widely regarded as evidence supporting the existence of human pheromones. However, subsequent studies did not replicate these findings and revealed that any apparent synchronization could be attributed to chance.

For many years, researchers have concentrated on four primary chemicals believed to be human pheromones. Androstenone and androstenol are thought to influence social perception and sexual attraction.

Androstadienone has been investigated for its impact on mood and alertness in women, while estratetraenol is believed to affect men’s perceptions of women.

Nonetheless, none of these substances have been definitively established as true human pheromones.

The doses used in studies are often much higher than what the body naturally produces, leading to less reliable outcomes. Furthermore, many experiments suffer from design flaws and weak statistics, resulting in inconsistent and inconclusive findings.


T-Shirt Test

If human pheromones come up in conversation, Professor Claus Wedekind’s 1995 “sweaty T-shirt” study is likely to be mentioned.

In this experiment, women were asked to smell T-shirts worn by men and indicate their preferences.

Interestingly, women who were not on birth control were more inclined to like the scents of men whose immune system genes (MHC genes) differed most from their own.

This preference aligns with evolutionary theory, as choosing mates with varied immune genes can enhance resistance to diseases in offspring.

This study has been replicated and is frequently hailed as a compelling instance of human chemical signaling, wherein body odor conveys social or biological information.

Yet, the scents involved in this research do not adhere to the strict definition of pheromones.

Most of the odor in a sweaty T-shirt comes from a small number of underarm bacteria, not pheromones. – Photo credit: Getty

First, a person’s complex “smell print” consists of multiple chemicals rather than a single one. Second, pheromones trigger automatic and unconscious responses, such as hormonal changes and instinctive behaviors, whereas responses to body odor are subjective and conscious, shaped by personal preference.

Invisible Clues

Although the T-shirt study does not clarify the role of pheromones in humans, some scientists believe that research in this area is far from complete.

Among them is Dr. Tristram Wyatt, a senior research fellow at the University of Oxford’s Department of Zoology, who has dedicated his career to studying the evolution of pheromones.

“If we consider humans as just another animal, it would be surprising to think we do not communicate chemically,” he explains. “For instance, our body odor evolves during puberty and becomes even more pronounced as we reach sexual maturity.

“In other animals, such odors frequently convey critical signals, so it is highly possible that humans emit similar signals; we just haven’t established this scientifically yet.”

The queen bee releases a pheromone that inhibits the reproduction of all other females in the hive – Photo credit: Getty

Even with this potential, pinpointing human pheromones has proven extraordinarily challenging.

“Studying human pheromones is akin to searching for a needle in a haystack,” Wyatt remarks. “Humans release thousands of odor molecules, making it difficult to identify which one triggers certain effects.

“Moreover, our reactions to odors are influenced by cultural, emotional, and individual differences, rendering our responses highly variable. Without reliable bioassays that provide clear, measurable reactions to odors, it is nearly impossible to pinpoint genuine pheromones.”

Another problem is reproducibility; many pheromone studies are based on small sample sizes, which makes their results statistically unreliable and susceptible to false positives.

Early research often lacks strict controls, and the field faces publication bias, increasing the likelihood of positive results being published.

The outcome? An evidentiary basis that appears more robust than it truly is. It comprises a collection of intriguing yet unreliable findings, with only a few holding up under repeat testing.

Hot on the Scent

Despite years of challenges, Wyatt remains hopeful, particularly about recent advances in research, including a French study that may represent the closest step toward identifying a human pheromone.

This investigation centered on secretions from Montgomery’s glands (small glands around the nipples that release tiny droplets during breastfeeding) in nursing mothers.

Researchers found that when newborns were exposed to the scent of these secretions, they instinctively turned their heads, displayed suckling behavior, and began searching for the nipple.

“This is the most exciting lead we’ve encountered to date,” says Wyatt. “Babies respond to these secretions even if they come from a different mother.

“Such a universal, instinctive reaction is precisely what we expect from an authentic pheromone. If we can identify the specific compound responsible, we might finally establish the first verified human pheromone.”

A recent breakthrough in pheromone research occurred in 2023 at the Weizmann Institute of Science in Israel. Researchers studied the effects of tears from women.

Men who smelled tears shed by a woman during a sad film showed decreased testosterone levels, and brain scans indicated changes in areas linked to both aggression and olfactory processing.

The study also revealed four receptors in the nose capable of detecting chemical signals in tears, and researchers are currently working to identify the specific compounds in tear fluid that elicit this response, potentially leading to compounds that mitigate aggression.

Recent research indicates that chemicals in women’s tears significantly affect men’s testosterone levels – Image courtesy of Getty Images

Nevertheless, while there is evidence that humans utilize scent in both social and sexual contexts, it has yet to be definitively proven that pheromones play a role in human communication.

“To conclusively ascertain whether human pheromones exist, rigorous research is necessary,” Wyatt asserts.

“This entails clear testing with consistent responses, larger and better-designed studies, and moving beyond the same old unproven molecules. Only diligent, evidence-driven research will yield real answers.”

“The quest for genuine human pheromones is just at its inception,” he concludes. “With the proper guidance, we could finally be on the brink of an exciting discovery.”


Source: www.sciencefocus.com

Review of The Arrogant Ape: A Bold New Book Challenges the Myth of Human Exceptionalism

Chimpanzee intelligence tests are primarily performed in laboratories, not in their natural environments or sanctuaries like this.

Patrick Meinhardt/AFP via Getty Images

The Arrogant Ape
by Christine Webb (Abacus, UK; Avery, US)

In the beginning, God created man in His own image, granting him authority over all living things on Earth. While many do not turn to the Bible for insight into human existence, the belief in human superiority over nature and other beings lingers.

Characteristics often claimed to distinguish humans—such as reasoning, tool use, experiencing pain, and moral judgment—are not exclusive to us. Other species like chimpanzees and crows exhibit advanced intelligence, hold complex social structures, and utilize tools. Fish and crustaceans experience pain, while bees demonstrate cultural behaviors, and plants may possess senses akin to ours.

Primatologist Christine Webb posits that the so-called “human dominance complex” lies at the root of our hierarchical view of nature. In The Arrogant Ape, she seeks to dismantle this perceived superiority through a compelling and meticulously researched examination based on a course she taught at Harvard. Webb traces the notion back to religious traditions and other human constructs, revealing how it distorts scientific understanding and accelerates ecological decline.

The belief in human uniqueness contradicts Darwin’s vision of continuity between species, and, as Webb argues, research framed around ranking species by their differences from us reflects a hidden bias.

This bias is apparent in our fascination with primates and “charismatic” mammals, which we tend to view as more relatable, while disregarding plants, fish, and the vast majority of Earth’s life. It also reveals itself in our inconsistent standards for evaluating animals. For instance, comparisons between human and chimpanzee intelligence often pit captive chimps against free-living humans, ignoring the limitations that captivity imposes.

Concerned about ethical issues surrounding captivity and its potential to skew research findings, Webb focuses exclusively on great apes in their natural and protected habitats. These profound interactions have shaped her belief that many non-human species likely possess some form of consciousness or “conscious life.”

Webb anticipates that critics may dismiss her views as anthropomorphism, labeling it a “serious scientific error.” However, she argues that the reluctance to acknowledge similarities between humans and other species complicates scientific inquiry and undermines its conclusions. She questions the certainty with which humans claim to understand consciousness beyond their own.

Dismantling these beliefs is crucial for appreciating the wonder and diversity of life, marking the first step towards a “radically humble approach.” By recognizing ourselves as fellow animals and integral to nature, we can confront the destructive forces of capitalism that fuel zoonotic diseases, mass extinctions, climate change, and ecosystem collapse.

Webb advocates for broadening the concept of “good science” to incorporate indigenous knowledge about the uniqueness and interconnection of all life forms. She acknowledges the immense challenge this poses, declaring that human exceptionalism is “the most pervasive implicit belief of our era.” Yet, she believes that unlearning this can foster a deeper connection to nature, spark awe, and inspire advocacy for both animal welfare and environmental protection. In The Arrogant Ape, she highlights this “stubborn ideology” and its detrimental impacts, modeling the humility, curiosity, and compassion essential for countering it.

Elle Hunt – A writer based in Norwich, UK.


Source: www.newscientist.com

Researchers Examine Neanderthal DNA to Gain Insights into Human Facial Development and Evolution

Research led by scientist Hannah Long at the University of Edinburgh has found that specific regions of Neanderthal DNA are more effective at activating genes responsible for jaw development than those in humans, potentially explaining why Neanderthals had larger lower jaws.

Neanderthal. Image credit: Natural History Museum Trustees.

“With the Neanderthal genome being 99.7% identical to that of modern humans, the variations between the species are likely to account for differences in appearance,” Dr. Long stated.

“Both human and Neanderthal genomes consist of roughly 3 billion characters that code for proteins and regulate gene expression in cells. Identifying the regions that influence appearance is akin to searching for a needle in a haystack.”

Dr. Long and her team took a targeted approach, focusing on a genomic region linked to Pierre Robin sequence, a condition marked by an unusually small lower jaw.

“Individuals with Pierre Robin sequence often have significant deletions or rearrangements in this portion of the genome that affect facial development and restrict jaw formation,” Dr. Long explained.

“We hypothesized that minor differences in DNA could produce more nuanced effects on facial structure.”

Upon comparing human and Neanderthal genomes, researchers discovered that in this segment, approximately 3,000 letters long, there are only three one-letter variations between the species.

This DNA region doesn’t code for genes but regulates when and how certain genes, particularly SOX9, which plays a crucial role in facial development, are activated.

To confirm that these Neanderthal-specific differences were significant for facial development, scientists needed to demonstrate that the Neanderthal version could activate genes in the appropriate cells at the right developmental stage.

They introduced both Neanderthal and human versions of this region into zebrafish DNA and programmed the cells to emit different colors of fluorescent protein based on the activation of either region.

By monitoring zebrafish embryo development, researchers observed that cells responsible for forming the lower jaw were active in both human and Neanderthal regions, with the Neanderthal regions showing greater activity.

“It was thrilling when we first noticed the activity of specific cell populations in the developing zebrafish face, particularly near the forming jaw, and even more exhilarating to see how Neanderthal-specific variations could influence activity during development,” said Dr. Long.

“This led us to contemplate the implications of these differences and explore them through experimental means.”

Recognizing that Neanderthal sequences were more effective at activating genes, the authors questioned whether this would lead to enhanced target activity affecting the shape and function of the adult jaw, mediated by SOX9.

To test this idea, they gave zebrafish embryos extra copies of SOX9 and found that the cells involved in jaw formation occupied a larger area.

“Our lab aims to further investigate the effects of genetic differences using methods that simulate various aspects of facial development,” Dr. Long remarked.

“We aspire to deepen our understanding of genetic variations in individuals with facial disorders and improve diagnostic processes.”

“This study demonstrates how examining extinct species can enhance our knowledge of how our own DNA contributes to facial diversity, development, and evolution.”

The findings are published in the journal Development.

_____

Kirsty Utley et al. 2025. Neanderthal-derived variants enhance SOX9 enhancer activity in craniofacial progenitor cells, influencing jaw development. Development 152 (21): dev204779; doi: 10.1242/dev.204779

Source: www.sci.news

Researchers Explore Neanderthal DNA to Uncover Insights into Human Facial Development and Evolution

Scientist Hannah Long and her team at the University of Edinburgh have discovered that specific regions of Neanderthal DNA are more effective at activating genes related to jaw formation compared to human DNA, which might explain why Neanderthals had larger lower jaws.

Neanderthal. Image credit: Natural History Museum Trustees.

“The Neanderthal genome shows a 99.7% similarity to the human genome, suggesting that the differences between the species contribute to variations in appearance,” explained Dr. Hanna.

“Both the human and Neanderthal genomes comprise around 3 billion characters that code for proteins and regulate gene usage in cells. Therefore, pinpointing regions that affect appearance is akin to finding a needle in a haystack.”

Dr. Long and her collaborators had a targeted hypothesis regarding where to initiate their search. They focused on a genomic area linked to the Pierre Robin sequence, a condition characterized by a notably small jaw.

“Some individuals with Pierre Robin sequence exhibit significant deletions or rearrangements in this genomic region that disrupt facial development and impede jaw formation,” stated Dr. Hanna.

“We speculated that minor variations in DNA could subtly influence facial shape.”

Through the comparison of human and Neanderthal genomes, researchers identified that in a segment approximately 3,000 letters long, there are just three one-letter differences between the two species.

This DNA segment lacks any specific genes but regulates the timing and manner in which genes, particularly SOX9, a crucial factor in facial development processes, are activated.

To demonstrate the significance of these Neanderthal-specific differences for facial development, researchers needed to confirm that the Neanderthal region could activate genes in the correct cells at the appropriate developmental stage.

They introduced both Neanderthal and human variants of this region into zebrafish DNA concurrently and programmed the cells to emit different colors of fluorescent protein based on whether the human or Neanderthal region was active.

By monitoring zebrafish embryo development, the researchers observed that both regions were active in the cells crucial for lower jaw formation, with the Neanderthal region showing greater activity than the human one.

“We were thrilled when we first detected the activity in a specific group of cells within the developing zebrafish face, near the jaw, and even more so when we realized that Neanderthal-specific differences could modify this activity during development,” Dr. Long noted.

“This led us to ponder the potential implications of these differences and how we may explore them experimentally.”

Having established that the Neanderthal sequence was more adept at activating genes, the authors asked whether this heightened activity in the target cells would influence the shape and function of the adult jaw, which SOX9 helps govern.

To test this hypothesis, they introduced additional copies of the sequence into zebrafish embryos and found that the cells involved in jaw formation occupied a larger area.

“In our lab, we aim to investigate the effects of additional DNA sequence differences using methods that replicate aspects of facial development,” Dr. Long said.

“We aspire to enhance our understanding of sequence alterations in individuals with facial disorders and assist with diagnostic efforts.”

“This research illustrates that by examining extinct species, we can gain insights into how our own DNA contributes to facial variation, development, and evolution.”

The findings are detailed in the journal Development.

_____

Kirsty Utley et al. 2025: Variants derived from Neanderthals enhance SOX9 enhancer activity in craniofacial progenitor cells that shape jaw development. Development 152 (21): dev204779; doi: 10.1242/dev.204779

Source: www.sci.news

As an Avid Introvert, I Fear AI May Diminish My Joy in Human Connection – Emma Beddington

As reported by The Cut, people are turning to AI to crack escape-room puzzles and cheat at trivia nights. Is this not the essence of spoiling one’s own fun? “It’s akin to entering a corn maze with the intent of taking a straight path to the exit,” remarked a TikToker featured in the article. There are also conversations with avid readers who rely on ChatGPT as a substitute for book clubs and a source of “enlightening opinions and perspectives.” One reader was content until a character’s demise disrupted the fantasy saga he was savoring (though, in truth, that seems rather grim).

Conversely, Substack appears to be filling up with AI-produced essays, even though the platform is meant to be a vibrant hub for passionate writers to showcase their work. Handing that off to a bot feels like peak absurdity. Will Storr, who studies storytelling, examines this unexpected trend and its implications in his own Substack, where he describes the phenomenon of “impersonal universalism,” in which grand statements sound profound but fall flat. “Insight possesses a universality akin to white noise, wrapped in an unsettling vagueness that can cloud our thoughts,” he observes.

I find it puzzling how anyone can derive pleasure from using large language models (LLMs) to appear vaguely “intelligent,” or from AI-assisted hobbies. Yet I believe this isn’t the existential threat posed by AI. What matters is that we savor our experiences. Let robots take our jobs, but they shouldn’t steal our joy. I’m not here to dictate how others should find pleasure; I’m no authority on fun. If I were to teach you, it might well come across like an AI-generated Substack (embracing nature, chatting with strangers, enjoying moments with loved ones). Still, I often reflect on what genuinely makes me feel alive, because I want to do more of those things. It becomes a personal defense against “impersonal universalism.”

First up: singing. While I wish AI could concoct melodic canons and ethereal robot madrigals, it cannot replicate the whimsical joy of my quirky choir of very special individuals. We may not be the most skilled vocalists, but when we harmonize, we share a deep sense of connection (research indicates that group singing fosters quick social bonding). Occasionally everything aligns for fleeting moments of breathtaking beauty, acknowledged by our choir director with a silent chef’s kiss. Regardless, it remains delightful.

Next, let’s discuss not my own but someone else’s experiences. I find endless inspiration in the unique artifacts people treasure, acquire, and eventually discard. My regular visits to York’s weekly car boot sale reveal a captivating blend of stuffed badgers, Power Rangers merchandise, fishing gear, and a ceramic mouse in Victorian attire. More noble collectibles might include the textiles featured in Renaissance paintings: garments, tapestries, and drapes. Recently, I spent an exhilarating 10 minutes at The Frick Collection in New York, immersed in an astonishingly vacant room while studying Holbein’s Portrait of Thomas More, contemplating the feel of his fur collar and red velvet sleeves, pondering his choices.

A substantial portion of my joy stems from simply being present in nature. I stroll, dig in the soil, observe wildlife (yes, that includes birds), but predominantly, lifelong introvert though I am, my delight comes from people. If I had to identify my most reliable source of happiness, it would be wandering through a new city, soaking in the lives of its inhabitants. What do they wear, consume, and discuss? What triggers their anger? What kind of dogs accompany them? It’s an endless buffet of human experience, from toddler tantrums to tender moments of affection to the play of queue dynamics. Recently, I watched the documentary *I Am Martin Parr*, about a photographer with a magpie’s eye for the nuances of British life, and he clearly shares this sentiment. Now in his seventies, Parr is still eager to explore and document the marvelous and strange corners of society. He declares, “I’m still thrilled to venture out and observe this chaotic world we inhabit.”

That is my secret. AI can offer a rote summary of who we are, but it mixes all our hues into a muddy shade. It cannot encapsulate the joy of something utterly unique.

Emma Beddington is a columnist for the Guardian

Source: www.theguardian.com

The Human Mind’s Aversion to Uncertainty: A Challenge for Liberal Democracies

The exploration of the dynamics within liberal democracies has typically emphasized economic, emotional, and educational influences. However, an additional field of neurology plays a critical role.

Liberal democracies engage our cognitive processes differently than authoritarian regimes. Dictatorships offer a sense of predictability (Adolf Hitler, for instance, projected his regime far into the future), while liberal democracies leave the future open to our choices, presenting it as a canvas we shape ourselves.

This is politically significant yet cognitively daunting. Historically, the future was dictated by a select few, prioritizing preservation over progress. The inherent ambiguity and adaptability of liberal democracy can challenge individuals neurologically, as uncertainty is a state the human mind often resists. Studies indicate that uncertainty triggers more anxiety than the anticipation of an electric shock, leading to various historical attempts to diminish uncertainty through mechanisms like insurance and weather forecasting.

Your position on the spectrum of uncertainty tolerance is influenced by cultural background, age, and gender, as well as neurological factors. Research in political neuroscience reveals that conservative brains lean towards security, generally steering clear of conclusions that lack clarity. This tendency is associated with a larger amygdala, the brain region linked to threat detection, resulting in a heightened discomfort when confronted with the unfamiliar.

On the other hand, a liberal brain exhibits greater gray matter in the anterior cingulate cortex, a region involved in processing ambiguity. This anatomical difference enables liberals to tolerate uncertainty and confrontation more effectively. Liberal democracies can provide space for both perspectives under less stressful conditions. Although conservatives and liberals may have distinct neural predispositions regarding their preferences for the future, evolutionarily, all humans share the ability to envision multiple futures.

However, increased uncertainty can push some individuals beyond their comfort zones, particularly as the future of pressing issues (environmental change, technology, social norms) becomes less predictable. To cope with this anxiety, some people gravitate towards populist and authoritarian political leaders, committing to rigid decision-making and a black-and-white perspective. They often seek certainty, albeit a mere illusion, by rejecting innovations (such as medical advancements) or dismissing foreign cultures and religions, thus limiting uncertainty and suppressing potential futures. Shutting out ambiguity and the anxiety it brings can create a more tranquil mindset for those affected.

This doesn’t imply a total surrender to an illiberal mindset. Instead, it underscores the necessity for liberal democracies to candidly inform their constituents that embracing liberalism may not come intuitively. Educational initiatives, public discourse, and civic engagement must draw on insights into how illiberal tendencies arise at the level of the brain.

We must communicate the collective benefits of cooperation in various domains, including identity. Ultimately, only through collaboratively addressing the vulnerabilities inherent in our brains can we tackle the significant global challenges we face today.

Florence Gaub is the author of Future: Manual (Hurst, 2026). Riya Yu has authored Fragile Minds: The Neuropolitics of Divided Societies (Columbia UP).


Source: www.newscientist.com

Anthropologist Revolutionizes Naming Conventions for Human Species

It’s fair to state that the ancient human family tree has always been subject to revision. Take the Denisovans, for instance. These enigmatic ancient hominins were once primarily identified through mere bone fragments. However, in June, molecular analysis revealed that a peculiar skull from China belonged to the Denisovans, thus giving them a more defined identity.

Yet, not everyone is convinced. Anthropologist Christopher Bae, a professor at the University of Hawaii at Manoa, contests this finding, asserting that the skull is better assigned to a species named Homo longi. Bae has been central to ongoing debates about our ancestral lineage. For over five years, he and his colleagues have advocated for the recognition of two ancient human species: Homo bodoensis and Homo juluensis.

These proposals have stirred debate, especially since Bae and his team have intentionally disregarded traditional naming conventions. He argues that such rules have become outdated, failing to allow the removal of names that are now considered offensive or unpronounceable. In a conversation with New Scientist, he elaborated on how his personal quest for identity fueled his passion for human evolution.

Michael Marshall: What initially encouraged you to explore the study of ancient humans?

Christopher Bae: The ultimate aim of paleoanthropology is to piece together historical narratives, even when all the elements are not available. The field resonates with me personally because I was adopted and have no concrete memory of my first year. I was born in South Korea, abandoned at around one year of age, spent six months in an orphanage, and was later taken in by an American family.

During my undergraduate studies, I had the opportunity to visit Korea for the first time as an exchange student. On this trip, I visited an adoption center in my hometown, inquiring if there was any possibility of locating my birth parents. Unfortunately, I was informed that my Korean name and birth date were not legitimate, and there was virtually no chance of finding them. That was a moment of resignation for me.

Although I was intrigued by my origins, I didn’t know how to pursue them. Then, I enrolled in an introductory biological anthropology course, which allowed me to navigate my curiosity about origins—almost like constructing my own foundation.

Two species frequently debated as candidates for our direct ancestry are Homo heidelbergensis and Homo rhodesiensis. In 2021, you joined a team proposing to replace these names with a new species, H. bodoensis. Could you elaborate on this?

My colleague Mirjana Roksandic from the University of Winnipeg and I discussed H. heidelbergensis at a 2019 anthropology conference. It became apparent that this species had become a “trash can taxon”: an easy classification for fossils that didn’t belong to Homo erectus, Homo neanderthalensis, or Homo sapiens.

What are the implications?

If we discard H. heidelbergensis, the next valid name by priority is H. rhodesiensis. However, that name refers to Northern Rhodesia (now Zambia), which was itself named for the controversial colonialist Cecil Rhodes. Are we comfortable naming a potential ancestor of modern humans after such a figure? So, in writing that paper, we decided to introduce a fresh name paying tribute to Bodo, a 600,000-year-old skull discovered in Ethiopia.

What response did your paper receive?

When we submitted the paper for peer review, half of the reviewers contended the argument was important and had to be published, while the other half deemed it entirely flawed. Unsurprisingly, the paper met equally polarized reactions once it was released.

Have any workable agreements emerged from this debate?

We held a workshop in Novi Sad, Serbia in 2023, with approximately 16 to 17 paleoanthropologists participating. A consensus emerged around the notion that H. heidelbergensis is indeed an inappropriate taxonomic category. Another significant finding was that H. rhodesiensis should be excluded due to its colonial implications—remarkably, only one paleoanthropologist present believed otherwise.

Xujiayao ruins in northern China

Christopher J. Bae

The ultimate arbiter in such cases is the International Commission on Zoological Nomenclature (ICZN). Have you received feedback on your argument for H. bodoensis?

The ICZN published a statement in 2023, in response to our challenge, indicating it “does not involve itself in removing names that may be ethically problematic.” Editor’s note: The ICZN’s 2023 announcement recognized that some scientific names might be offensive but asserted it’s beyond their remit to weigh the morality of individuals honored in eponyms. Moreover, it stressed the necessity for zoologists to adhere to its ethical code when naming new species.

Are the names of species significant enough to merit conflict?

Yes and no. For instance, a blind cave beetle from Slovenia was named after Adolf Hitler in the 1930s by an Austrian entomologist, Oskar Scheibel. That species, Anophthalmus hitleri, has become a popular collector’s item among neo-Nazis, which could drive this innocent beetle to extinction.

What alternatives do you propose?

I advocate working with local collaborators to choose species names that they find acceptable since they live with and experience these species regularly. Ideally, I believe we should refrain from using personal names for species, as this could lead to ongoing issues. Change is on the horizon; the ICZN is re-evaluating the inclusion of members from the Global South to provide them a stronger say. Recently, the American Ornithological Society voted to remove names with negative connotations associated with historical figures from their species designations.

Last year, you again disputed ICZN regulations concerning ancient human fossils excavated at a site, Xujiayao, in northern China. What occurred there?

In the 1970s, researchers uncovered hominin fossils representing more than ten individuals at the site, though only as separate pieces. My colleague Wu Xiujie from the Chinese Academy of Sciences has worked extensively on these fossils and has virtually reconstructed part of one skull. When we saw it, we noted that it appeared distinctly different from other hominins of comparable age.

What differentiates these specimens?

The variations lie in size and shape: average modern cranial capacity is around 1,300 to 1,500 cubic centimeters, whereas these fossils have cranial volumes between 1,700 cm³ and 1,800 cm³, significantly larger than typical humans. Shape analysis likewise indicated that the Xujiayao fossils grouped with those from another site called Xuchang, leading us to propose a new species name.

Dr. Bae studies human fossils discovered in Serbia, potentially linked to Homo bodoensis

Christopher J. Bae

The name you ultimately selected has been met with criticism. Can you clarify the rationale behind it?

The origin of a species name is important; in this case, we could have opted for Homo xujiayaoensis, named after Xujiayao, in line with ICZN guidelines.

Latinising the site name would have followed convention, but you found that option unsatisfactory?

The challenge lies in the fact that only fluent Chinese speakers can pronounce it, and even spelling it correctly can be an issue. Names must be both pronounceable and memorable. So we turned instead to “julu,” which translates roughly to “huge head,” and, following ICZN guidelines for Latinisation, we arrived at Homo juluensis.

How does your new species intersect with the enigmatic Denisovans, who inhabited what is now East Asia during the Stone Age?

If you compare the second molars from Denisova Cave with those from Xujiayao, they appear strikingly similar; the distinction is so subtle that the molars could almost be interchanged.

This year, another research team suggested a link between the same Denisovan fossils and another ancient species, Homo longi, which has garnered a positive reception among numerous researchers.

Most ancient-hominin experts in China tend to side with our argument for H. juluensis, and many Western scholars familiar with China’s fossil record also find it agreeable.

Concerning the skull confirmed in June, researchers managed to extract ancient proteins from the H. longi material that corresponded with known Denisovan fossils. What are your thoughts?

Most geneticists argue that protein analysis isn’t robust enough for accurate species identification. While it can differentiate between broader categories—like cats and dogs—its utility in distinguishing more nuanced levels is quite limited.

Replica of a Denisovan molar discovered in Denisova Cave in 2000

Tilo Parg CC BY-SA 3.0

Do you still consider H. longi a legitimate species?

I personally accept H. longi and the fossils associated with it. The debate revolves around which other fossils should be assigned to longi or juluensis. It’s interesting to note that advocates for longi are attempting to consolidate all the fossils under that designation, despite the evident morphological diversity present in the Chinese record.

Many paleoanthropologists have expressed strong criticism of your research. How do you and your colleagues respond to this?

Over time, we’ve developed thick skin about our work.


Source: www.newscientist.com

Denisovans Might Have Mated with an Unidentified Ancient Human Species

Depiction of a teenage girl with a Neanderthal mother and a Denisovan father

John Bavaro Fine Art/Science Photo Library

This marks the second occasion researchers have successfully retrieved the complete genome of Denisovans, an ancient human lineage that inhabited Asia. The DNA was sourced from a tooth estimated to be 200,000 years old, discovered in a Siberian cave.

The genome indicates that there were at least three distinct groups of Denisovans, each with unique histories. It also suggests that early Denisovans intermixed with an unidentified ancient human group as well as a previously unknown Neanderthal population.

“This research is groundbreaking,” asserts David Reich from Harvard University.

“This study significantly broadened my perspective on the Denisovan ecosystem,” states Samantha Brown from the National Center for Human Evolution Research in Spain.

Denisovans were first described solely from their DNA: a finger bone retrieved from Denisova Cave in Siberia yielded DNA distinct from both modern humans and the Neanderthals found in western Eurasia. Genomic analysis indicates Denisovans mated with modern humans, and populations in Southeast Asia, including the Philippines and Papua New Guinea, still carry Denisovan DNA.

Since the initial discovery in 2010, researchers have found that a small number of Denisovan fossils also come from East Asia. In June, a skull unearthed in Harbin, China, was confirmed as Denisovan through molecular evidence, providing the first insight into their physical appearance. However, despite DNA fragments being recovered from various specimens, only the original specimen had yielded a high-quality genome.

The new work comes from researchers led by Stéphane Peyrégne of Germany’s Max Planck Institute for Evolutionary Anthropology. (Peyrégne declined to comment as the study is pending peer review.)

In 2020, a team of researchers discovered a male Denisovan molar tooth and sequenced its entire genome from the preserved DNA.

The researchers estimated that this individual lived around 205,000 years ago, judging by the number of genetic mutations and comparing them with other ancient human genomes. This timeframe aligns with dating showing the deposits containing the tooth are between 170,000 and 200,000 years old. In contrast, the previous high-quality genome belongs to a Denisovan who lived between 55,000 and 75,000 years ago, so the new genome reveals an earlier chapter in Denisovan history.

The researchers suggest that at least three distinct Denisovan populations existed, based on comparisons among Denisovan genomes. The oldest group included the individual whose tooth was analyzed; many millennia later, a second group supplanted this earlier population in Denisova Cave.

“Comprehending how early Denisovans were supplanted by subsequent groups underscores pivotal events in human history,” says Qiaomei Fu from the Institute of Vertebrate Paleontology and Paleoanthropology in China.

A third group, absent from the cave, still interbred with modern humans as suggested by genetic testing. Thus, all Denisovan DNA present in modern humans derives from a Denisovan group about which little is known.

The new genome also shows that Denisovans mated repeatedly with Neanderthals, who lived in and around Denisova Cave. Notably, the genome contains traces of Neanderthals who lived between 7,000 and 13,000 years before this Denisovan individual. These traces do not match any known Neanderthal genomes, indicating that the Denisovans interbred with a Neanderthal group yet to be sequenced.

Moreover, it’s probable that Denisovans also mated with an as-yet unidentified ancient human group that evolved independently of both Denisovans and modern humans for hundreds of thousands of years. One possibility is Homo erectus, the earliest known human species to migrate out of Africa, which reached as far as Java, Indonesia. However, no DNA has ever been retrieved from H. erectus, so certainty remains elusive.

“It’s endlessly fascinating to uncover these new populations,” Brown remarked.


Source: www.newscientist.com

Lead Exposure Could Have Shaped Human Brain Evolution, Behavior, and Language Development

A new analysis of fossilized teeth collected from Africa, Asia, Oceania, and Europe reveals that numerous primate species, including Australopithecus africanus, Paranthropus robustus, early Homo, Gigantopithecus blacki, Pongo (orangutans), Papio (baboons), Homo neanderthalensis, and Homo sapiens, underwent significant lead exposure over the past two million years. The finding challenges the notion that lead exposure is merely a contemporary issue.

Lead exposure affecting modern humans and their ancestors. Image credit: J. Gregory/Mount Sinai Health System.

Professor Renaud Joannes-Boyau from Southern Cross University remarked: “Our findings indicate that lead exposure has been integral to human evolution, not just a byproduct of the industrial revolution.”

“This suggests that our ancestors’ brain development was influenced by toxic metals, potentially shaping their social dynamics and cognitive functions over millennia.”

The team analyzed 51 fossil samples from around the world using a carefully validated laser-ablation microspatial sampling technique, encompassing species such as Australopithecus africanus, Paranthropus robustus, early Homo, Gigantopithecus blacki, Pongo, Papio, Homo neanderthalensis, and Homo sapiens.

Signs of transient lead exposure were evident in 73% of the specimens analyzed (compared to 71% in humans), including specimens of Australopithecus, Paranthropus, and Homo.

Some of the earliest samples, from Gigantopithecus blacki specimens dated to around 1.8 million years old (early Pleistocene) and 1 million years old (mid-Pleistocene), displayed recurrent lead exposure events interspersed with periods of little to no lead uptake.

To further explore the impact of ancient lead exposure on brain development, researchers also conducted laboratory studies.


Australopithecus africanus. Image credit: JM Salas / CC BY-SA 3.0.

Using human brain organoids (miniature brain models grown in the lab), researchers examined the effects of lead on a crucial developmental gene named NOVA1, which modulates gene expression during neurodevelopment.

The modern iteration of NOVA1 has undergone changes distinct from those seen in Neanderthals and other extinct hominins, with the reasons for this evolution remaining unclear until now.

In organoids carrying the ancestral version of NOVA1, lead exposure significantly altered neural activity related to FOXP2, a gene involved in the brain regions critical for language and speech development.

This effect was far less pronounced in organoids carrying the modern NOVA1 variant.

“These findings indicate that our variant of NOVA1 might have conferred a protective advantage against the detrimental neurological effects of lead,” stated Alysson Muotri, a professor at the University of California, San Diego.

“This exemplifies how environmental pressures, such as lead toxicity, can drive genetic evolution, enhancing our capacity for survival and verbal communication while also affecting our susceptibility to contemporary lead exposure.”


An artistic rendition of a Gigantopithecus blacki herd in the forests of southern China. Image credit: Garcia / Joannes-Boyau, Southern Cross University.

Genetic and proteomic analyses in this study revealed that lead exposure in archaic variant organoids disrupts pathways vital for neurodevelopment, social behavior, and communication.

Alterations in FOXP2 activity point to a possible link between ancient lead exposure and the advanced language abilities of modern humans.

“This research highlights the role environmental exposures have played in human evolution,” stated Professor Manish Arora from the Icahn School of Medicine at Mount Sinai.

“The insight that exposure to toxic substances may confer survival advantages in the context of interspecific competition introduces a fresh perspective in environmental medicine, prompting investigations into the evolutionary origins of disorders linked to such exposures.”

For more information, refer to the study published in the journal Science Advances.

_____

Renaud Joannes-Boyau et al. 2025. Effects of intermittent lead exposure on hominid brain evolution. Science Advances 11 (42); doi: 10.1126/sciadv.adr1524

Source: www.sci.news

Americans Awarded Nobel Prize in Medicine for Advancements in Understanding the Human Immune System

Three distinguished scientists (two from the U.S. and one from Japan) have been awarded the Nobel Prize in Medicine for their pivotal discoveries concerning peripheral immune tolerance.

Mary E. Brunkow, Fred Ramsdell, and Shimon Sakaguchi were jointly recognized for their breakthrough, which “has invigorated the field of peripheral tolerance and contributed to the advancement of medical treatments for cancer and autoimmune disorders,” as stated in a news release by the Nobel Committee. The three recipients will share a prize of 11 million Swedish kronor (approximately $1.2 million).

“This could also enhance the success rates of organ transplants. Several of these therapies are currently in clinical trials,” he noted.

Autoimmune diseases may arise when T cells, which serve as the body’s main defense against harmful pathogens, malfunction.

Their collective discovery establishes an essential foundation for understanding a second mechanism by which the immune system restrains itself, known as peripheral tolerance.

To mitigate damage, our bodies attempt to eliminate malfunctioning T cells within the thymus, a lymphoid organ, through a mechanism termed central tolerance.

The groundbreaking research began in 1995, when Sakaguchi, a distinguished professor at the Immunology Frontier Research Center at Osaka University in Japan, uncovered a previously unknown class of immune cells that defend against autoimmune diseases.

Six years later, in 2001, Mary Brunkow, who now serves as a senior program manager at the Institute for Systems Biology in Seattle, along with Ramsdell, a scientific advisor to Sonoma Biotherapeutics in San Francisco, identified a specific genetic mutation responsible for a severe autoimmune disease known as IPEX.

They designated this gene Foxp3.

By 2003, Sakaguchi had confirmed that the FOXP3 gene was crucial for the development of the immune cells he had identified nearly a decade earlier. These cells, now known as regulatory T cells, monitor other T cells to prevent their malfunction.

“Their discoveries were vital for understanding how the immune system functions and why serious autoimmune diseases don’t affect everyone,” remarked Olle Kämpe, chair of the Nobel Committee.

Thomas Perlmann, secretary-general of the Nobel Committee, announced the award on Monday morning, saying that he had only been able to reach Sakaguchi.

“I reached him in his lab, and he expressed immense gratitude, saying it was a tremendous honor. He was quite moved by the news,” Perlmann mentioned.

The awards ceremony is scheduled for December 10th, coinciding with the anniversary of Alfred Nobel’s death, a Swedish industrialist who founded the award to honor individuals who have significantly contributed to humanity. The inaugural award was revealed in 1901, marking the fifth anniversary of his passing.

The Nobel Prize in Physiology or Medicine is announced in Stockholm at the Karolinska Institute on Monday, followed by the prizes for Physics, Chemistry, and Literature on the following days.

The Nobel Peace Prize will be revealed on Friday.

Source: www.nbcnews.com

Ancient Rock Art Reveals Human Life in the Arabian Desert 12,000 Years Ago

Approximately 12,000 years ago, during the Pleistocene-Holocene transition, humans navigated a network of seasonal water bodies in northern Arabia, marking significant locations with monumental rock carvings of camels, ibex, wild equids, gazelles, and aurochs, and establishing access routes.

Jebel Arnaan rock art panel. Image credit: Maria Guagnin.

As part of the Green Arabia Project, archaeologist Michael Petraglia of Griffith University and his team have uncovered over 60 rock art panels featuring 176 engravings at three previously unexplored locations.

The engravings predominantly depict camels, ibex, equids, gazelles, and aurochs, including 130 life-sized, naturalistic figures, some more than 3 meters long and 2 meters tall.

This engraving activity occurred between 12,800 and 11,400 years ago, a time when seasonal water bodies re-emerged following a period of severe aridity.

These water sources, identified through sediment analysis, facilitated early human migration into the interior desert and offered rare survival opportunities.

“These large-scale engravings are not just rock art; they likely represent assertions of presence, access, and cultural identity,” noted Dr. Maria Guagnin, an archaeologist at the Max Planck Institute.

“Rock art marks water sources and movement routes, potentially signaling territorial rights and intergenerational memory,” added Dr. Ceri Shipton, an archaeologist at University College London.

In contrast to previously known sites where engravings were hidden in crevices, the Jebel Mleiha and Jebel Arnaan panels were carved on the face of a towering 39-meter cliff, making them visually dominant.

One panel required ancient artists to ascend narrow ledges to create their work, emphasizing the effort and significance attributed to the imagery.

Various artifacts, including Levantine-style El-Khiam and Helwan stone points, green pigments, and dentalium beads, indicate extensive connections to Pre-Pottery Neolithic (PPN) populations in the Levant.

Nevertheless, the size, content, and arrangement of these Arabian engravings distinguish them from their Levantine counterparts.

“This unique form of symbolic representation reflects a distinct cultural identity that evolved to thrive in harsh, arid environments,” stated Dr. Faisal al-Jibrin, a heritage researcher at the Saudi Ministry of Culture.

“The project’s interdisciplinary approach aims to bridge significant gaps in the Northern Arabian archaeological record between the last Glacial Maximum and the Holocene, shedding light on the resilience and innovation of early desert communities,” remarked Dr. Petraglia.

The team’s paper has been published in the journal Nature Communications.

____

M. Guagnin et al. 2025. Monumental rock art illustrates that humans thrived in the Arabian desert across the Pleistocene-Holocene transition. Nature Communications 16, 8249; doi: 10.1038/s41467-025-63417-y

Source: www.sci.news

Uncovering the Role of Brain Organoids in Defining Human Uniqueness

100-day-old brain organoids

Madeline Lancaster

Since Madeline Lancaster created the first brain organoids in 2013, these structures have become invaluable in brain research worldwide. But what are they, really? Are they simply miniaturized brains? Could implanting them into animals yield a super-intelligent mouse? Where do we draw the ethical line? Michael Le Page explored these questions at Lancaster’s lab at the MRC Laboratory of Molecular Biology in Cambridge, UK.

Michael Le Page: Can you clarify what a brain organoid is? Is it akin to a mini brain?

Madeline Lancaster: Not at all. There are various types of organoids, and they are not miniature brains. We model specific parts of the human brain, and our organoids are small and immature. They don’t function like a developed human brain with memories. In scale, they’re comparable to insect brains, but they lack the organized structure those brains have, so I would place them closer to insect neural tissue than to anything like a mini human brain.

What motivated you to create your first brain organoid?

I initiated the process using mouse embryonic brain cells, cultivating them in Petri dishes. Some cells didn’t adhere as expected, leading to a fascinating outcome where they interconnected and formed self-organizing cell clusters indicative of early brain tissue development. The same technique was then applied to human embryonic stem cells.

Why is the development of brain organoids considered a significant breakthrough?

The human brain is vital to our identity and remained enigmatic for a long time. Observing a mouse brain doesn’t capture the intricacies of the human brain. Brain organoids have opened a new perspective into this complex system.

Can you provide an example of this research?

One of our initial ventures involved modeling a condition called microcephaly, in which the brain is undersized. In mice, the equivalent mutations don’t reduce brain size. We tested whether we could replicate the size reduction in human brain organoids, and we succeeded, enabling further insights into the disease.

Madeline Lancaster in her lab in Cambridge, UK

New Scientist

What has been your most significant takeaway from studying brain organoids?

We are gaining a better understanding of what distinguishes the human brain. I’m fascinated by the finding that human stem cells which generate neurons behave differently from those in mice and chimpanzees. One key difference is that human development is notably slower, allowing for more neurons to be produced as our stem cells proliferate.

Are there practical outcomes from this research?

Much of our foundational biology research has crucial implications for disease treatment. My lab primarily addresses evolutionary questions, particularly genetic variances between humans and chimpanzees. Specific genes that arise are often linked to human disorders, implying that mutations essential for brain development could lead to significant damage.

What types of treatments might emerge from this work in the future?

We’re already utilizing brain organoids for drug screening. I’m especially optimistic about their potential in treating mental health conditions and neurodegenerative diseases, where novel therapies are lacking. Currently, treatments for schizophrenia utilize medications that are five decades old. Brain organoid models could unveil new approaches. In the longer term, organoids might even provide therapeutic options themselves. While not for all brain areas, techniques have already been developed to create organoids of dopaminergic neurons from the substantia nigra, which are lost in Parkinson’s, for potential implantation.

Are human brain organoids already being implanted in animal brains?

Yes, but not for treatment purposes; rather, these practices enhance human organoid research. Organoids usually lack vascularity and other cell types from outside the brain, especially microglia, which serve as the brain’s immune cells. Thus, to examine how these other cells interact with human brain matter, various studies have implanted organoids into mice.

Should we have concerns regarding the implantation of human organoids in animals?

Neurons are designed to connect with one another. So, when a human brain organoid is inserted into a mouse brain, the human cells will bond with mouse neurons. However, they aren’t structured coherently. These mice exhibit diminished cognitive performance after implantation, akin to a brain malfunction; hence, they won’t become super-intelligent.

Colour images of brain organoids, showing their neural connections

MRC Laboratory of Molecular Biology

Is cognitive enhancement a possibility?

We’re quite a distance from that. Higher-level concepts relate to how different brain regions interlink, how individual neurons connect, and how collections of neurons communicate. Achieving an organized structure like this could be possible, but challenges like timing persist. While mice have a short lifespan of about two years, human development toward advanced intelligence takes significantly longer. Furthermore, the sheer size of human brains presents challenges; a human-sized brain cannot fit within a mouse. Because of these factors, I don’t foresee such concerns emerging in the near future.

Regarding size, the main limitation is the absence of blood vessels. Organoids start to die off when they exceed a few millimeters. How much headway has been made in addressing this issue?

While we’ve made strides and should acknowledge our accomplishments, generating brain tissue is relatively straightforward as it tends to develop autonomously. Vascularization, however, is complex. Progress is being made with the introduction of vascular cells, but achieving fully functional blood perfusion remains a significant hurdle.

When you reference ‘far away’…

I estimate it could take decades. It may seem simple, given that the body accomplishes this naturally. However, the challenges arise from the body’s integrated functioning. Successfully vascularizing organoids requires interaction with a whole organism; we can’t replicate this on a plate.

If we achieve that, could we potentially create a full-sized brain?

Even if we managed to develop a large, vascularized human brain in a lab, without communication or sensory input it would lack meaningful function. For instance, if an animal’s eyes are kept shut during development and opened later, they may appear functional, but the brain can’t interpret visual input, rendering the animal effectively blind. The same principle applies to all senses and interactions with the world. I believe a brain needs a body with sensory experiences to develop awareness. Certain patients who lose sensory input can end up with locked-in syndrome, an alarming condition. But these are individuals who have previously engaged with the world. A brain that has never engaged lacks that context.

As brain organoid technology progresses, how should we define the boundaries of ethical research?

The field closely intersects with our understanding of consciousness, which is complex and difficult to measure. I’m not even certain I have the definitive answer about consciousness for myself. However, we can undoubtedly assess factors relevant to consciousness, like organization, sensory inputs and outputs, maturity, and size. Mice might meet several of these criteria but are generally not recognized to possess human-like consciousness, largely due to their size. Even fully interconnected human organoids won’t achieve human-level consciousness if they remain small. Establishing these kinds of standards offers more practical methods than attempting to directly measure consciousness.

https://www.youtube.com/watch?v=xa82-7txy50


Source: www.newscientist.com

Lab-Fertilized Egg Cells Created from Human Skin DNA

In the lab, a human egg cell can be given a new genetic identity

Science Photo Library / Alamy

Human embryos have been created from eggs carrying DNA taken from adult skin cells, a feat previously achieved only in mice. The advance may one day offer a pathway for same-sex couples or women facing fertility challenges to have biologically related children.

Researchers have previously cloned animals by substituting the nucleus of an egg cell with the nucleus of a somatic cell, such as a skin cell. However, in addition to the legal hurdles surrounding human cloning, many couples want children that carry genes from both partners, which requires both sperm and eggs, says Shoukhrat Mitalipov of Oregon Health and Science University.

This goal is complicated by the fact that eggs and sperm are haploid, meaning they contain only one set of chromosomes. The challenge lies in halving the complete set of chromosomes found in cells such as skin cells while retaining a viable combination of the original genes.

Females develop all of their eggs while still in the womb, where the progenitor cells initially containing 46 chromosomes undergo a complicated process of replication, mixing, and division to reduce to 23 chromosomes.

Mitalipov was intrigued by the possibility of employing natural chemical processes that facilitate chromosomal division in mature human eggs both before and after fertilization to replicate this process in his laboratory.

Having achieved this with mice, Mitalipov and his team tried the method with human cells. They started by removing the nuclei from hundreds of eggs donated by healthy women, which were arrested at a developmental stage linked to chromosomal division. Next, nuclei from skin cells known as fibroblasts, taken from healthy female volunteers, were inserted into these eggs. Microscope images showed the chromosomes aligned on the spindle, the internal structure that drives chromosomal separation.

The team then injected sperm from a healthy donor to fertilize some of the eggs, utilizing a method akin to that employed in creating babies using third-party mitochondrial DNA, which can also minimize the risk of specific genetic disorders.

This injection normally triggers the egg to sort its chromosomes and expel duplicate DNA, preparing it to receive an additional set from the sperm. In the skin-derived eggs, however, this process stalled: the chromosomes aligned but did not separate. The researchers therefore tried again with a new batch of fertilized eggs, applying an electrical pulse that let calcium surge into the egg, mimicking the natural signal triggered when sperm contact the egg’s outer layer, along with incubation with a drug to rouse the eggs from their dormant pre-fertilization state.

Through a series of trials, the researchers successfully halved the chromosome counts in the eggs, discarding the excess. By the conclusion of the experiment, 9% of the fertilized eggs had developed into blastocysts — dense clusters of cells that form about 5-6 days post-fertilization, the stage at which embryos are typically transferred into the uterus during IVF treatments. However, the team did not transfer or sustain any blastocyst beyond six days.

Despite the progress made, the combinations of genes in the remaining chromosomes appeared particularly susceptible to defects. “I believe this method is still in its early stages and is not presently suitable for clinical applications,” stated Mitinori Saitou of Kyoto University in Japan.

Katsuhiko Hayashi of Osaka University noted that while the techniques are “very sophisticated and organized,” they remain “inefficient and potentially hazardous for immediate clinical use.” Nevertheless, he remarked that the team has achieved a “substantial breakthrough in halving the human genome.” “This advancement will herald new technologies,” he stated.

Mitalipov acknowledged the validity of the criticisms, emphasizing that his team is actively working to address the existing flaws. “At the end of the day, we’re making progress, but we aren’t there yet,” he remarked.


Source: www.newscientist.com

Under Threat: Human Subtitle Authors Facing AI Challenges in Film

Is artificial intelligence poised to dismantle the SDH [subtitles for the deaf and hard of hearing] industry? While SDH remains the standard subtitle format across most platforms, the people behind it raise a valid concern as the sector, like many creative fields, faces increasing devaluation in the AI era. “SDH is an art; the industry often overlooks this. Many see it merely as transcription,” remarked Max Deryagin, chair of Subtle, a nonprofit association for freelance subtitlers and translators.

<p class="dcr-130mj7b">While AI promises to streamline subtitle creation, it misses the mark, according to Meredith Cannela, a committee member. "There's a notion that AI tools mean less work for us. Yet, having spent 14-15 years in this field, I can attest that the time taken to complete projects has not changed significantly over the past five to six years."</p>

<p class="dcr-130mj7b">"Automatic transcription shows some positive advancements," Cannela adds. However, the overall efficiency does not represent a net gain compared to previous software, as extensive corrections are necessary.</p>

<figure id="8a1af689-3e30-498c-9401-81b7bc4d8a2d" data-spacefinder-role="inline" data-spacefinder-type="model.dotcomrendering.pageElements.ImageBlockElement" class="dcr-173mewl">
    <figcaption data-spacefinder-role="inline" class="dcr-fd61eq">
        <span class="dcr-1qvd3m6">"You can't overwhelm your audience"... Barbie's open caption screening for deaf and hard of hearing audiences in Westwood, California in 2023.</span> Photo: Allen J. Schaben/Los Angeles Times/Getty Images
    </figcaption>
</figure>

<p class="dcr-130mj7b">Moreover, the quality of AI-generated SDHs is often subpar, requiring significant effort to meet standards. Unfortunately, human subtitlers frequently find themselves taking on "quality control" roles with minimal compensation. Many in the field state that earning a sustainable income is currently a challenge.</p>

<p class="dcr-130mj7b">"The fees for SDH work were never great, but they've dropped to a point where it's hardly worth the effort," says Rachel Jones, an audiovisual translator and committee member. "This seriously undermines the value we provide."</p>

<p class="dcr-130mj7b">This value is crucial. "For those who are deaf or hard of hearing, subtitles are an essential service," says Teri Devine, associate director of inclusion at the Royal National Institute for Deaf People.</p>

<aside data-spacefinder-role="supporting" data-gu-name="pullquote" class="dcr-19m4xhf">
    <blockquote class="dcr-zzndwp">The same sound can mean a million different things. As humans, we interpret how it should feel.</blockquote>
</aside>

<p class="dcr-130mj7b">Deaf and hard of hearing communities are diverse, meaning subtitles must accommodate various needs in crafting SDH. Jones explains, "While some believe that naming songs in subtitles is pointless, others might resonate with it because of the song's title."</p>

<p class="dcr-130mj7b">Subtitles involve numerous creative and emotion-driven choices—qualities AI currently lacks. When Jones first watches a show, she notes her emotional reactions to sounds and determines how best to express those in words. She then decides which sounds to subtitle and which may be excessive: "You can't overwhelm the audience," she points out. It's a delicate balancing act. "I want to avoid over-explaining everything to the viewers," Cannela adds.</p>

<figure id="880e1917-c3ac-492b-829a-737f8a57f715" data-spacefinder-role="supporting" data-spacefinder-type="model.dotcomrendering.pageElements.ImageBlockElement" class="dcr-a2pvoh">
    <figcaption data-spacefinder-role="inline" class="dcr-9ktzqp">
        <span class="dcr-1qvd3m6">"Algorithms cannot replicate the level of professional work."</span> Photo: Milan Sulkara/Alamy
    </figcaption>
</figure>

<p class="dcr-130mj7b">AI struggles to discern which sounds are crucial. "It’s far from achieving that now," Deryagin notes, emphasizing the importance of understanding the broader context of a film rather than just individual images or scenes. For instance, in *Blow Out* (1981), a mysterious sound recurs, enhancing viewers' understanding of the main plot points. "SDH must create these connections rapidly without over-informing the audience initially," he explains. "The same sound can have countless meanings, and as a human, it’s my job to interpret those nuances."</p>

<p class="dcr-130mj7b">"You can't simply feed an algorithm a soundtrack and expect it to get it right. Providing metadata will not bridge the gap to professional quality."</p>

<p class="dcr-130mj7b">Netflix provided a glimpse into its SDH process behind the acclaimed subtitles for *Stranger Things*—for example, "[Eleven pants]" or "[Tentacles squelching wetly]"—in an <a href="https://www.netflix.com/tudum/articles/stranger-things-season-4-captions" data-link-name="in body link">interview with the subtitler</a>. The company chose not to comment further on AI in subtitle production. The BBC informed the *Guardian* that "we do not use AI for TV subtitles," though much of that work was outsourced to Red Bee Media last year. <a href="https://www.redbeemedia.com/news/red-bee-medias-artificial-intelligence-captioning-workflows-bring-costs-down-for-network-10/" data-link-name="in body link">A statement was issued</a> regarding the use of AI for creating SDHs for Australia's Network 10.</p>

<p class="dcr-130mj7b">Jones notes that linguists and subtitlers aren't inherently opposed to AI, but at this juncture, it complicates rather than simplifies their work. "In every industry, AI tends to replace the creative aspects that bring us joy, rather than alleviating the tedious tasks that we’d rather avoid," she concludes.</p>

Source: www.theguardian.com

Revolutionary Video Unveils Hidden Aspects of Human Fertility

For the first time, real-time footage of human embryos being implanted into an artificial uterus has been recorded.

This remarkable achievement, published in the journal Science Advances, offers an unparalleled glimpse into one of the most crucial stages of human development.

Implantation failure is a leading cause of infertility, responsible for an estimated 60% of miscarriages. Researchers aim to better understand the implantation process in order to improve fertility outcomes in both natural conception and in vitro fertilization (IVF).

“We can’t normally observe this process, because it takes place inside the mother,” stated Dr. Samuel Ojosnegros, head of a bioengineering group at the Institute for Bioengineering of Catalonia (IBEC) and the lead author of the study, as reported by BBC Science Focus.

“Thus, we required a system to observe how it functions and to address the primary challenges to human fertility.”

Implantation marks the initial phase of pregnancy, where the fertilized egg (developing embryo) attaches to the uterine lining, allowing it to absorb nutrients and oxygen from the mother—vital for a successful pregnancy.

To investigate this process, the research team developed a platform that simulates the natural uterine lining, utilizing a collagen scaffold combined with proteins essential for development.

The study then examined how human and mouse embryos implant onto this platform, uncovering significant differences. Unlike mouse embryos that adhere to the uterine surface, human embryos penetrate fully into the tissue before growing from within.

https://www.youtube.com/watch?v=1p3in1fzrec

Video showing the implantation process of mouse embryos (left) and human embryos (right).

“Human embryos are highly invasive,” said Ojosnegros. “They dig a hole in the matrix, embed themselves, and then grow internally.”

The footage indicated that the embryo exerts considerable force on the uterus during this process.

“We observed that the embryo pulls, moves, and rearranges the uterine matrix,” stated Dr. Amélie Godeau, co-first author of the research. “It also responds to external force cues. We hypothesize that contractions occurring in vivo may influence embryo implantation.”

According to Ojosnegros, the force applied during this stage could explain the pain and bleeding many women experience during implantation.

Researchers are currently focused on enhancing the realism of implantation platforms, including the integration of living cells. The goal is to establish a more authentic view of the implantation process, which could boost the likelihood of success in IVF, such as by selecting embryos with better implantation potential.

“We understand more about the development of flies and worms than our own species,” remarked Ojosnegros. “So enjoy watching the film.”


Source: www.sciencefocus.com