Why Humans Are Not Naturally Built for Strict Monogamy: Unveiling the Truth

The Fragile Nature of Monogamy: A Deep Dive into Human Relationships

Monogamy is often regarded as a sacred institution, the default setting for romantic relationships. Yet, behind the facade, monogamy can feel fleeting, negotiable, and at times, nearly impossible to maintain. If sexual exclusivity is a natural trait for humans, why does it often require significant effort and societal reinforcement to uphold?

A Cultural Perspective on Non-Monogamy

Human culture is rich in examples of non-monogamous relationships, including polyamory, infidelity, and casual dating. Research conducted since the 1960s indicates that 87% of pre-industrial societies embraced some form of polygamy, suggesting that today’s fascination with monogamy might stem from rapid cultural shifts.

Interestingly, while polygamous relationships were widely accepted, only a fraction of individuals engaged in such partnerships, keeping monogamy as the prevalent norm. Moreover, fewer than 10% of mammal species form long-term pair bonds, although this rate is about 25% among primates. Notably, our closest relatives, chimpanzees and bonobos, are often promiscuous, engaging with multiple partners to strengthen social ties.

Why Monogamy?

A compelling theory proposes that monogamy evolved as a defense mechanism against male infanticide. A landmark study published in 2013 examined 230 primate species and revealed that males of promiscuous species often killed the offspring of rivals, prompting a shift toward monogamous relationships for protection.

In contrast, female chimpanzees often mate with various males to obfuscate paternity and mitigate the risk of infanticide. Bonobos, on the other hand, utilize casual sex as a way to nurture social bonds and ease conflicts.

Additionally, male provision of resources to females is believed to increase their attractiveness as mates, further supporting the evolutionary narrative favoring monogamous pairings.

Understanding Human Pair Bonding

Despite their complex origins, it’s evident that humans have a biological inclination to form tight-knit partnerships. However, it’s essential to distinguish between pair bonding and the modern interpretation of monogamy, which usually implies emotional and sexual exclusivity.

Biologists categorize monogamy into three types: social, sexual, and reproductive. Socially monogamous pairs share resources and raise offspring together, but they may or may not also be sexually exclusive. In the animal kingdom, sexual and reproductive monogamy are effectively intertwined, since contraception does not exist.

In humans, strong romantic bonds are a hallmark of our species. According to Professor Agustín Fuentes of Princeton University, these connections foster extensive networks of care, such as compassion and cooperation. Research on prairie voles highlights the powerful neurochemistry involved in forming these bonds, showcasing how brain chemicals like oxytocin foster strong pair bonds.

The Statistics Behind Monogamy

Dr. Mark Dyble from the University of Cambridge emphasizes the challenges in quantifying human mating systems, particularly due to discrepancies in self-reported sexual behavior. Recent studies utilizing genetic data reveal that approximately 66% of human siblings are full siblings, indicating a tendency toward monogamy, albeit with considerable variability.

Despite roughly 25% of marriages in the UK ending in divorce within ten years, and significant rates of infidelity reported among both men and women in the US, these findings still place humans among the more socially monogamous mammals, ranking seventh behind species such as the Eurasian beaver.

Cultural Constructs and Changing Norms

Human sexual and marital practices vary widely across cultures, influenced by factors like resource allocation and socio-economic conditions. As Fuentes indicates, contemporary definitions of monogamy can shift from “one person for life” to “one person at a time,” a phenomenon known as serial monogamy.

As individuals age, many gravitate toward forms of semi-monogamy, often signaling a return to monogamous norms. For some, monogamy feels natural; for others, it can be stifling. Yet for many, it remains a conscious choice with continually evolving terms influenced by societal norms.

Conclusion: The Dual Nature of Human Relationships

So, how monogamous are humans? The answer is nuanced: it varies. We possess an innate capacity for deep attachment, bolstered by ancient brain chemistry that facilitates pair bonding. Additionally, intricate cultural frameworks shape our relationships, oscillating between enforcing and relaxing monogamous norms. Ultimately, while humans may strive to practice monogamy, our success in doing so remains complex and multifaceted.



Source: www.sciencefocus.com

New Study Reveals Dragonflies and Humans Have Identical Red Vision Mechanisms

Recent research from Osaka Metropolitan University has unveiled a groundbreaking visual protein, enabling dragonflies to perceive deep red and near-infrared light. This discovery showcases an evolutionary parallel to human vision, hinting at exciting medical applications.



Asiagomphus melaenopus Female from Miroku Forest, Kasugai City, Aichi Prefecture. Image credit: Alpsdake / CC BY-SA 4.0.

Humans perceive colors through a specific protein called opsin found in our eyes.

In humans, three distinct opsins handle color perception, each most sensitive to blue, green, or red light.

Dragonflies possess notably enhanced red vision compared to most insects.

A recent study led by Professor Mitsumasa Koyanagi at Osaka Metropolitan University identified a unique dragonfly opsin that detects light wavelengths around 720 nm, extending beyond the visible spectrum’s deep red range.

“This is one of the most red-sensitive visual pigments ever found,” stated Professor Akihisa Terakita from Osaka Metropolitan University.

“Dragonflies likely see red light more profoundly than many other insects.”

The researchers posited that this heightened sensitivity assists dragonflies in identifying ideal mates.

To test this hypothesis, they measured the reflectance of dragonfly body surfaces to estimate how the insects appear to one another.

The findings reveal significant differences between male and female Asiagomphus melaenopus dragonflies in how their bodies reflect red to near-infrared light, which may allow quick differentiation between the sexes during flight.

“Interestingly, the mechanism by which dragonfly red opsin detects red light mirrors that of mammals, including humans,” explained Ryu Sato, a graduate student at Osaka Metropolitan University.

“This surprises us and indicates an independent evolutionary development in vastly different species.”

The research team also identified a critical position within the protein that regulates light sensitivity.

By altering this position, they were able to enhance the sensitivity further, enabling the opsin to respond to light approaching the infrared spectrum.

They engineered a protein variant that reacts to even longer wavelengths, demonstrating activation of cells by near-infrared light.

These discoveries hold promise for the field of optogenetics, leveraging light-sensitive proteins to investigate various disease states.

Given that dragonfly opsins are responsive to longer light wavelengths, they could operate effectively in deeper tissue applications.

“In this research, we’ve successfully shifted the sensitivity of the modified near-infrared opsin found in the Odonata family to longer wavelengths, confirming that this opsin triggers cellular responses via near-infrared light,” noted Professor Koyanagi.

“This illustrates the potential of this opsin as an innovative optogenetic tool for deep tissue light detection.”

For further details, refer to the study published in January 2026 in the journal Cellular and Molecular Life Sciences.

_____

Takashi Sato et al. 2026. Dragonfly red opsin shares a common regulatory mechanism with mammalian red opsin, further enhancing near-infrared sensitivity. Cell. Mol. Life Sci. 83, 66; doi: 10.1007/s00018-025-06017-9

Source: www.sci.news

How Early Humans Revolutionized Their Toolkits 200,000 Years Ago: Key Changes and Innovations

Changes in predator populations may have driven early humans to develop innovative tools

Raul Martin/MSF/Science Photo Library

Approximately 200,000 years ago, a decline in megafauna may have compelled early humans to transition from heavy stone tools to more lightweight hunting kits designed for smaller prey. A recent study supports the notion that this change in hunting strategy could have sparked a rise in cognitive capabilities among our ancestors.

For over a million years, various early human species relied on heavy stone tools such as hand axes, cleavers, scrapers, and stone balls. These robust tools were essential for hunting and butchering large herbivores, including extinct relatives of modern elephants, hippos, and rhinos.

Between 400,000 and 200,000 years ago, archaeological evidence shows a notable increase in smaller, sophisticated tools alongside the fading of traditional heavier tools. Our species, Homo sapiens, emerged during this timeframe.

Circa 200,000 years ago, heavy stone tools vanished from the archaeological record of the Levant, while diverse, lightweight stone toolkits—blades and precision scrapers, for example—became increasingly common.

Research led by Vlad Litov, an archaeologist at Tel Aviv University, revealed a correlation between these technological advancements and a significant decline in large herbivores, potentially due to overhunting.

The researchers analyzed archaeological findings from 47 sites across the Levant, spanning the Paleolithic period, which lasted from around 3.3 million years ago to 12,000 years ago. Their analysis of dated stone artifacts in relation to animal remains uncovered a compelling trend.

Findings indicate a drastic reduction in the biomass and specimen count of giant herbivores exceeding 1,000 kilograms correlating with the disappearance of heavy tools 200,000 years ago. Conversely, the availability of smaller prey increased alongside more sophisticated small tools.

Supporting the connection between tool technology and prey type, the researchers noted that sturdy stone tools were still in use in regions with abundant large game, such as southern China, until about 50,000 years ago.

Heavy-duty tools and their evolution to lightweight alternatives used by early humans

Vlad Litov et al., Institute of Archaeology, Tel Aviv University

Previous theories suggested that advancements in technology stemmed from increasing intelligence and creativity due to evolutionary pressures. However, Litov and his research team propose a different perspective: reliance on smaller prey may have catalyzed the evolutionary growth of larger brains in modern humans.

“As large herbivores dwindled, humans increasingly depended on smaller prey, necessitating varied hunting strategies, advanced planning, and the implementation of lightweight, intricate toolsets,” states Litov. “This cognitive evolution was a byproduct of adapting to new prey types, rather than the initial driver of this adaptive transformation.”

“There is more to this adaptation than merely prey size,” says Ceri Shipton from University College London. He notes preliminary evidence indicating mass hunting of medium-sized ungulates like horses and bison, with signs of enhanced cognitive abilities and planning emerging during the Middle Paleolithic.

Nicolas Tessandier from the French National Center for Scientific Research also maintains some reservations. “Human adaptation to new fauna underscores adaptability rather than mere intelligence,” he posits. “Producing powerful tools for hunting large herbivores was equally astute.”

Litov recognizes that prior research has shown advanced cognitive functions present early in human evolution, notably in the development of Homo erectus around two million years ago. However, he emphasizes that switching from large to smaller prey had major consequences for human society. A single ancient elephant carcass could sustain a group of about 35 hunter-gatherers for months. As these high-calorie resources vanished, reliance on smaller prey reduced the yield per animal.

“Energetically, we had to gather numerous smaller ungulates, such as fallow deer, to replace the loss of one elephant,” explains Litov. This shift likely stimulated diverse cognitive and behavioral changes, including cooperative hunting strategies, advanced techniques, and enhanced social collaboration and organization. “Such adaptations may have contributed to the evolution of larger brains in later species, including Neanderthals and Homo sapiens,” he adds.

“In my view, the decline in large prey familiar to hominins likely intensified competition among groups,” asserts Shipton. “It was probably an iterative process where the reduction of larger prey prompted cognitive shifts that facilitated access to smaller prey.”



Source: www.newscientist.com

The Shroud of Turin: Secrets of DNA from Humans, Plants, and Animals Uncovered

The Shroud of Turin is engraved with an image resembling Jesus Christ.

Public Domain/Art Collection 2/Alamy

Recent DNA analysis has unveiled a significant number of contaminants—animal, plant, and human—on the Shroud of Turin, which complicates the narrative surrounding this enigmatic relic, believed to be the cloth in which Jesus Christ was wrapped following his crucifixion over 2,000 years ago.

Stretching 4.4 meters long and 1.1 meters wide, the Shroud of Turin is considered one of the most renowned and debated Christian artifacts globally. Its first documented appearance was in France in 1354, and it has resided for nearly five centuries in the Basilica of St. John the Baptist in Turin, Italy.

In 1988, researchers conducted radiocarbon dating along with accelerator mass spectrometry techniques, concluding that the Shroud was created between 1260 and 1390. This finding brought into question the identity of the figure depicted on the cloth as Jesus, although many Christian scholars continue to dispute this late medieval dating.

In a 2015 study, Gianni Barcaccia and colleagues from the University of Padova in Italy reexamined material sampled from the Shroud in 1978. The researchers were the first to propose that the cloth may have origins in India.

Currently, Mr. Barcaccia—who opted not to be interviewed for this publication—has spearheaded a new analysis of the material from 1978, revealing that the Shroud contains a remarkable spectrum of medieval and modern DNA.

The genetic materials identified include DNA from domestic animals like cats, dogs, chickens, cows, goats, sheep, pigs, and horses, alongside wild species such as deer and rabbits.

Additionally, traces of various fish species such as mullet, Atlantic cod, and stingrays were discovered, as well as marine crustaceans, flies, aphids, and arachnids like dust and skin mites.

Common plant DNA located within the Shroud includes species like carrots, wheat, peppers, tomatoes, and potatoes, suggesting these were possibly introduced to Europe following exploratory voyages to Asia and the Americas.

However, pinpointing the timeline of these contaminating events regarding animals and plants remains elusive.

The research team also isolated human DNA from various individuals who came in contact with the Shroud, notably including those from the 1978 sampling. “The presence of multiple individuals’ DNA complicates the task of identifying the Shroud’s original DNA,” the team noted.

Nearly 40% of the human DNA identified on the Shroud appears to be of Indian origin, possibly resulting from historic interactions or from Romans importing linen from regions near the Indus Valley, report Barcaccia and colleagues.

“The DNA findings on the Shroud of Turin indicate extensive exposure in the Mediterranean area, potentially suggesting that the fabric may have been produced in India,” the researchers articulated.

Anders Götherström from Stockholm University stated that earlier studies place the Shroud’s date in the 13th century, a timeframe that is widely accepted in the scientific community. “Despite discussions surrounding the 1988 radiocarbon dating, most researchers find it sufficiently credible,” he explained.

Götherström also remains skeptical about the cloth’s potential Indian origins: “There is still no compelling evidence that the Shroud is anything other than a French artifact from the 13th or 14th century,” he concluded.

“This significant relic has its own unique history, which might prove to be more intriguing than its legendary roots lacking scientific backing.”


Source: www.newscientist.com

Uncovering the Link: Marine Animal Virus Linked to Unusual Eye Issues in Humans


A virus traditionally affecting marine life is causing glaucoma-like symptoms in humans.

Virginie Vaes/Getty Images

A newly identified virus typically affecting marine animals has led to alarming glaucoma-like symptoms, including irreversible vision loss, in a handful of individuals in China. This marks the first documented instance of an aquatic virus infecting humans and resulting in serious health complications. The infections are believed to have occurred through consumption of raw seafood or handling aquatic creatures, with some evidence suggesting possible human-to-human transmission.

“It’s astonishing that this virus is capable of infecting invertebrates, fish, and mammals,” says Edward Holmes, a researcher at the University of Sydney. “I cannot recall any virus exhibiting such a broad host range.”

Cases of persistent ocular hypertension viral anterior uveitis (POH-VAU), characterized by inflammation and increased eye pressure leading to optic nerve damage, are escalating in China. To delve into the triggers behind this spike, researchers from the Chinese Academy of Fisheries Sciences in Qingdao evaluated 70 patients diagnosed with the condition between January 2022 and April 2025.

The research team tested these individuals for covert mortality nodavirus, a pathogen that commonly infects marine species; remarkably, all 70 tests returned positive. “Up to this point, viruses from aquatic animals have not been shown to cause illness in humans directly,” the researchers stated, though they declined to provide further commentary. Despite treatment aimed at reducing inflammation, a significant portion of patients required surgical intervention, and one case resulted in irreversible vision loss.


In an effort to better understand the virus, the research team infected mice, which exhibited marked pathological changes in the cornea, iris, and retina within a month. They also observed that the virus could spread among mice sharing water.

Of those studied, more than half owned aquatic animals, highlighting a potential source of infection. In addition, approximately 16% reported consuming raw seafood or had close contact with people in high-risk groups.

While there is no definitive proof of human-to-human transmission, an epidemiological analysis revealed a distinct subgroup of urban POH-VAU patients with no direct contact with aquatic animals or other risk factors, apart from close interactions with family members. Those family members belonged to high-risk groups, for instance having injured their hands while handling aquatic animals, implying that the virus may spread within households, potentially via shared utensils.

To gauge the virus’s prevalence, the researchers also screened 523 captive and wild aquatic animals from Asia, the Americas, Europe, Antarctica, and Africa, confirming the global presence of covert mortality nodavirus. They identified it in 49 species, including shrimp, crabs, fish, and barnacles, where it causes symptoms such as lethargy and loss of color; why it targets the eyes in humans remains unclear.

Holmes emphasized the potential ubiquity of this pathogen: “I suspect it is much more prevalent than currently recognized. I wouldn’t discount the possibility it first passed through another species, possibly a mammal.”

The virus may also propagate among marine organisms consuming infected animals. For instance, researchers found that farmed shrimp often consume frozen brine shrimp or Antarctic krill, potentially resulting in infections. Moreover, the introduction of this virus into warmer waters has led to increased infections in marine life, suggesting that Antarctic species may serve as reservoirs for pathogens without displaying illness themselves.

Researchers warn that the transmission of covert mortality nodavirus from marine life to humans presents new biosecurity threats. Nonetheless, Holmes reiterates the absence of clear evidence for human-to-human transmission, stating, “This is not regarded as a contagious disease.”



Source: www.newscientist.com

Are Humans Genetically Degenerating and Becoming Less Intelligent?

Are Harmful Genetic Mutations Accumulating and Impacting Our Intelligence?

H. Armstrong Roberts/Classic Stock/Getty Images

Every human is born with approximately 100 genetic mutations, unique from their parents. As you have children, half of these mutations are passed down, coupled with new mutations from the next generation. This raises an important question: Are harmful mutations accumulating in humans, leading to a decline in both physical and mental fitness?

Some experts, like geneticist Michael Lynch, suggest that we could see a significant decline in human physical fitness over the next few centuries in industrialized societies. In a 2010 study, various countries, including the UK and Australia, reported declines in IQ, suggesting we might be witnessing a direct consequence of these accumulating mutations.

Historically, the concept of human degeneration spurred highly unethical eugenics policies in the 20th century. While early proponents fabricated stories to justify their views, modern genomic sequencing allows us to directly analyze mutations and understand their implications.

Research indicates that humans possess a relatively high mutation rate compared to many other species. The male reproductive system, responsible for producing sperm continuously from stem cells, plays a vital role in this process. As men can father children for extended periods, mutations may accumulate over generations more than in short-lived species.

While most of our 100 additional mutations have little impact due to the prevalence of ‘junk’ DNA, some can lead to harmful effects. These mutations can occur within protein-coding genes or regulatory sequences, potentially altering gene function.

While severe mutations can be life-threatening, others with minor negative effects can persist through generations. So, what prevents a continuous buildup of harmful mutations in populations?

Traditional genetic theories posit that offspring with significantly damaging mutations are less likely to survive and reproduce, stabilizing the ‘genetic load’ of harmful mutations within populations. However, with evolving health care and conditions in high-income countries, natural selection may be weakening.

Lynch proposes that relaxed natural selection is contributing to the accumulation of harmful mutations, predicting a reduction in human fitness by at least 1% per generation, and perhaps even up to 5%.
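To see what such rates would imply over time, a constant per-generation decline can simply be compounded. The sketch below uses the 1 and 5 per cent figures quoted above; the ten-generation horizon (roughly 250 to 300 years) is an illustrative assumption, not a published figure.

```python
# Illustrative compounding of a constant per-generation fitness decline.
# Rates follow the prediction quoted above; the ten-generation horizon
# (roughly 250-300 years) is an assumed example, not a published figure.
for rate in (0.01, 0.05):
    remaining = (1 - rate) ** 10
    print(f"{rate:.0%} per generation -> {1 - remaining:.1%} total decline after 10 generations")
```

Under those assumptions, the cumulative loss works out to roughly 10 per cent at the lower rate and about 40 per cent at the higher one.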

Nevertheless, some studies upon which Lynch’s predictions are based involved non-mammalian species. Peter Keightley and his team at the University of Edinburgh explored mutation accumulation in mammals, breeding 55 strains of mice over 21 generations under relaxed selection conditions. Their findings, published in 2024, suggest that the fitness loss in humans per generation may equate to less than 0.4%.

It’s worth noting that natural selection remains effective, as a considerable percentage of pregnancies end in miscarriage. As Joanna Masel from the University of Arizona points out, selection has not stopped.

Is Losing Fitness Necessarily a Negative Thing?

Moreover, fitness in the evolutionary sense is context-dependent. Genetic variants that provide resistance to infectious diseases or malnutrition may have adverse effects when those threats are minimal or negligible. For instance, the mutation that confers malaria resistance causes sickle cell disease when a person inherits two copies of it.

In the larger scheme of evolution, organisms like bacteria can quickly eliminate harmful mutations thanks to their smaller genomes and large population sizes. However, Masel notes that this rapid elimination isn’t feasible for humans.

“Our genomes are cluttered with various parasitic elements,” she states. “The influx of harmful mutations surpasses our capacity for removal, yet we possess mechanisms to compensate for them.”

Instead of individually cleansing genetic disadvantages, organisms have evolved a ‘sewage system’ to manage multiple issues simultaneously. This evolutionary process suggests that even rare beneficial mutations with substantial effects can counterbalance numerous slightly detrimental mutations.

A Sewage Treatment System for Clearing Dangerous Mutations

pxl.store/Alamy

This perspective is profound; harmful mutations can paradoxically drive complexity by creating issues that require the evolution of advanced solutions. For example, when a mutation introduces junk DNA into a gene, cellular systems have evolved to excise this extraneous material from the RNA copy.

Interestingly, simulations conducted by her team indicate that as mutation rates rise, beneficial mutations accumulate more rapidly than harmful ones.

“We’re effectively enhancing our waste management system at a faster rate than we create disruptions,” Masel comments. “Surprisingly, the mathematical outcomes were counterintuitive.”

If these findings hold true, then the high mutation rates in humans may not present the alarming issue many biologists fear. The correlation between declining IQ and mutation may be coincidental. The scientific inquiry continues, yet there’s little cause for alarm regarding human degeneration.

Meanwhile, there are pressing global issues that warrant our attention, such as climate change, which Masel suggests should be our primary concern instead. I wholeheartedly concur.


Source: www.newscientist.com

How Early Humans Created Symbol Systems Before Writing: Uncovering Prehistoric Communication

Approximately 40,000 years ago, early humans in Europe created a sophisticated system of geometric symbols. These symbols are believed to represent an intentional, repeatable form of communication that transcends mere decoration. Discover more in a recent study published in Proceedings of the National Academy of Sciences.



Movable artefact featuring geometric symbols from the Swabian Aurignacian culture. Image credit: Christian Bentz & Ewa Dutkiewicz, doi: 10.1073/pnas.2520385123.

According to researchers Christian Bentz from the Universities of Saarland and Passau, and Ewa Dutkiewicz from the National Museum in Berlin, “Around 45,000 years ago, modern humans migrated into eastern and central Europe.”

During this migration, they encountered Neanderthals, their distant relatives.

In a period of rapid population turnover, modern humans produced a variety of movable artifacts, including tools and figurines crafted from materials such as ivory, bone, and antler.

These artifacts date back to the early Upper Paleolithic and are part of the Aurignacian technocomplex.

Numerous objects adorned with geometric symbols have been discovered, particularly in France’s Dordogne region, Germany’s Swabian Jura, and Belgian archaeological sites.

The researchers examined a collection of 260 mobile Aurignacian artifacts found in caves in the Swabian Jura.

These remarkable items, made from mammoth ivory, bone, and antler, date to between 43,000 and 34,000 years ago.

Artifacts include tools, beads, musical instruments, and figurines representing both animals and humans, many etched with sequences of geometric signs—dots, lines, crosses, and more.

The scientists emphasized, “The inhabitants of these caves produced specialized tools for cutting meat, processing animal hides, and crafting clothing and ropes during this period.”

They also pioneered the flute, the first musical instrument made from bone and ivory.

Utilizing information theory and quantitative linguistics, the authors analyzed over 3,000 geometric symbols from the artifacts.

They assessed characteristics like repetition, diversity, and overall information density within the engraved symbols.
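As a rough illustration of the kind of measures involved, and not the authors’ actual pipeline, the sketch below computes two simple statistics for an invented toy sequence of signs: the immediate-repetition rate and the Shannon entropy per symbol, a common proxy for information density.

```python
import math
from collections import Counter

# Toy sequence of engraved signs; invented for illustration, not real data.
signs = ["cross", "cross", "cross", "line", "line", "line", "dot", "cross", "dot", "dot"]

# Immediate-repetition rate: fraction of signs identical to the preceding one.
repeats = sum(a == b for a, b in zip(signs, signs[1:]))
repetition_rate = repeats / (len(signs) - 1)

# Shannon entropy per symbol (in bits): lower values mean more repetitive,
# less information-dense sequences.
counts = Counter(signs)
entropy = -sum((n / len(signs)) * math.log2(n / len(signs)) for n in counts.values())

print(f"repetition rate: {repetition_rate:.2f}")
print(f"entropy per symbol: {entropy:.2f} bits")
```

A highly repetitive sequence like this one scores low on entropy; written language, by contrast, tends toward less repetition and higher information density, which is the contrast the study draws.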

Dr. Bentz noted, “While many theories exist, there has been minimal empirical research on the measurable properties of these symbols.”

The results revealed intriguing findings. Statistically, these Paleolithic symbols differ significantly from modern writing, which usually favors less repetition and denser information.

However, they bear a resemblance to Protocuneiform, the earliest known accounting symbols from Mesopotamia, used about 5,500 years ago.

This similarity doesn’t indicate that Ice Age Europeans had a writing system, as true writing encodes spoken language, while the Aurignacian symbols do not.

Instead, these artifacts illustrate a stable, traditional system for visually storing and conveying information without language.

The placement of symbols matters; figurines, particularly ivory ones, display a greater complexity and denser arrangement than everyday tools.

Specific symbols were exclusive to certain subjects, with dots frequently appearing on human and feline figures, while crosses were found on mammoths and horses, but never on human forms.

This pattern indicates a shared set of rules passed down through generations.

Researchers noted that unlike proto-cuneiform, which evolved into a comprehensive script as ancient societies grew more complex, the structure of the Aurignacian symbol system remained remarkably consistent over roughly 10,000 years.

Dr. Bentz stated, “Our analysis reveals that these symbol sequences have no correlation to contemporary writing systems, which represent spoken language and feature high information density.”

In contrast, the symbols found in archaeological artifacts often showcase repetitive patterns: cross, cross, cross, line, line, line, a hallmark absent in spoken language.

“Our findings also indicate that Paleolithic hunter-gatherers developed symbols with an information density statistically akin to the earliest proto-cuneiform tablets from ancient Mesopotamia, which emerged tens of thousands of years later.”

Proto-cuneiform symbols exhibit a similar repetitive quality, with individual symbols appearing at consistent rates, showcasing comparable complexity.

This discovery supports the growing consensus among archaeologists that symbolic communication likely evolved gradually through systems aimed at recording numbers, events, or social knowledge, rather than emerging suddenly as writing.

Some symbols may have tracked seasonal patterns, hunting data, or ritual concepts, though their precise meanings remain elusive.

Dr. Dutkiewicz added, “Modern humans have the benefit of thousands of years of accumulated knowledge that was unavailable to our ancestors. However, anatomically, Stone Age humans may have possessed cognitive abilities akin to ours.”

“The capacity to record and share information was crucial for Paleolithic humans, possibly enhancing their ability to coordinate groups and improve survival strategies.”

“They were adept craftsmen, evident in the portability of many of these artifacts, which often fit seamlessly in the palm of the hand, reminiscent of proto-cuneiform tablets.”

_____

Christian Bentz and Ewa Dutkiewicz. 2026. Early humans developed a traditional symbol system 40,000 years ago. PNAS 123 (9): e2520385123; doi: 10.1073/pnas.2520385123

Source: www.sci.news

Why Humans Are the Only Primates with Chins: New Insights Revealed

The Human Chin: An Evolutionary Enigma

Westend61/Getty Images

Humans possess a distinctive chin, a feature that sets them apart from other primates. Recent analyses reveal that this anatomical feature likely emerged not for a specific purpose but as an incidental byproduct of other evolutionary changes driven by natural selection.

According to Noreen von Cramon-Taubadel from the University at Buffalo, New York, “It’s a misconception that every significant trait between species has been shaped by natural selection with a specific intent. Evolution is frequently more complex and directionless than anticipated.”

The chin, a prominent bony projection at the front of the lower jaw, clearly differentiates humans from other species: among primates, only Homo sapiens has one, and its evolutionary purpose has long been a subject of intrigue.

Some researchers posit that the chin might alleviate stress during chewing or play a role in speech formation, while others suggest it may have evolved through sexual selection, with individuals preferring partners showcasing this unique facial attribute.

Conversely, some scientists challenge the idea of any practical function for the chin, contemplating whether its emergence was simply a byproduct of cranial and jaw evolution.

Von Cramon-Taubadel and her team hypothesize that the development of the human chin might actually be attributable to genetic drift, a random evolutionary process.

In their investigation, they studied 532 museum skulls belonging to humans and 14 other modern ape species, including chimpanzees, bonobos, gorillas, orangutans, and gibbons.

Measurements were taken at 46 anatomical landmarks on the skull and jaw, including nine points defining the human chin, forming a comprehensive evolutionary map.

Utilizing these data, they estimated the head and jaw characteristics of the last common ancestor of all great apes, and applied a standard quantitative genetic model to evaluate genetic drift across family branches.
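The paper’s quantitative genetic model is not detailed here, so purely as a hypothetical sketch of the underlying logic, the code below simulates a trait drifting neutrally (a random walk across generations) and asks how often drift alone produces a divergence as large as an observed one; every number in it is a made-up placeholder.

```python
import random

# Hypothetical sketch of a drift test, not the study's actual model.
# A trait mean takes a small random step each generation (neutral drift);
# we ask how often drift alone yields divergence as extreme as observed.
random.seed(1)

GENERATIONS = 500          # assumed lineage length
STEP_SD = 0.05             # assumed per-generation drift step (trait units)
OBSERVED_DIVERGENCE = 3.0  # made-up observed trait difference
TRIALS = 10_000

def drift_once() -> float:
    value = 0.0
    for _ in range(GENERATIONS):
        value += random.gauss(0.0, STEP_SD)
    return value

extreme = sum(abs(drift_once()) >= OBSERVED_DIVERGENCE for _ in range(TRIALS))
print(f"fraction of drift-only runs at least as extreme: {extreme / TRIALS:.3f}")
```

A tiny fraction would hint that a trait moved more than drift readily explains; a large fraction is consistent with drift, which is the pattern the team reports for most of the chin-related traits.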

The findings indicated that three of the nine chin-related traits likely underwent direct selection, while the other six appeared to be either neutral or byproducts of evolutionary changes unrelated to the chin itself.

As early human ancestors became more bipedal, the base of the skull shifted and the face retracted. Pronounced front teeth and strong jaw muscles diminished, leaving a ridge of bone at the front of the lower jaw that projects beyond the teeth: the chin as we know it.

This unique chin is likely a byproduct of adapting to upright walking, having larger brains, and smaller teeth. According to von Cramon-Taubadel, this illustrates how changes in one area can inadvertently impact others in the evolutionary process.

As noted by Alessio Veneziano from the French National Museum of Natural History in Paris, the chin is a “textbook example” of a non-adaptation—a characteristic that arises without the direct influence of natural selection. “It’s intriguing to see a significant evolutionary change confirmed to occur without adaptation,” he remarks.

This evolutionary byproduct is often termed a spandrel, a concept borrowed from architecture describing a space created by the shape of adjoining structures. Other commonly cited examples include the human navel and the diminutive arms of Tyrannosaurus rex.

The study reveals the intricate connections between skull and jaw as a cohesive unit. As highlighted by James DiFrisco at the Francis Crick Institute in London, “Observable features like the jaw may appear as separate entities, but that doesn’t imply they evolved independently.”


Source: www.newscientist.com

Can Humans Be Genetically Enhanced Using George Church’s Renowned Genetic Improvement List?

Biologist George Church Curates Beneficial Genetic Variants

Don Emmert/AFP via Getty Images

“Why should only tall people have access to tall genes? And why should only intelligent people have access to smart genes? Instead of accepting genetic inequality, we aim to provide everyone the opportunity to select beneficial genes for themselves and their future offspring. Genetics should not be a game of chance.”

This is the vision of Bootstrap Bio, a startup striving to empower future parents by enhancing genetic qualities for their children. While it seems that affluent families might already have genetic advantages, the pressing question remains: Can we genuinely enhance our children’s genetics if we choose to?

To understand the possibilities, I began with the List of Protective and Enhanced Gene Variants, curated by Harvard biologist George Church. When I inquired about the list’s purpose, Church explained that it addresses common questions from his lectures—such as whether all rare genetic variants are detrimental and what types of enhancements might be feasible. This list is particularly popular among transhumanists interested in genetic engineering for superhuman traits.

Let’s delve into its details.

Are You Sure You Want Extra Fingers?

The list is intricate, containing over 100 items, yet only about half represent specific genetic mutations linked to concrete effects, with the rest stemming from animal research or medical trials. Church identified mutations that may yield significant “positive effects,” from disease resistance to lower aggression levels in men.

Some traits on this list, however, may not be universally desirable. For instance, a mutation could theoretically lead to six fingers on each hand, enhancing “manipulative capabilities.” But is that really an improvement? Imagine trying to find gloves that fit!

Additionally, two genetic deletions that cause pain insensitivity are also featured, yet lacking the ability to feel pain is not an enhancement—children who are pain-insensitive can suffer severe injuries.

Many remaining traits appear to fall into the “nice to have” category but may not warrant genetic modification. For instance, “low odor production” seems unnecessary in an era of deodorants. While I would appreciate being able to hold my breath longer or endure high altitudes, I doubt my descendants will value these traits as much.

Only a limited number of mutations confer highly desirable characteristics, like extended lifespans or enhanced intelligence—traits for which wealthier prospective parents might be willing to pay. Still, we lack sufficient confidence that incorporating these mutations into children will actually lead to increased intelligence or longevity.

Less Sleep, But at What Cost?

It is crucial to note that some associations may be misleading, and certain genetic variations might not produce the anticipated effects. Moreover, achieving the desired outcome may depend on combinations of other specific mutations.

Trade-offs are often present too. For example, high-intelligence mutations may increase the risk of future blindness, and resistance to norovirus might predispose individuals to Crohn’s disease, as noted in Church’s list. Personally, I would prefer to be a bit less intelligent and tolerate occasional bouts of norovirus rather than risk potential consequences for my children.

Most variants do not explicitly list drawbacks, but that does not imply they are without consequences. Consider the mutations associated with needing less sleep; given the essential role of sleep in maintaining brain health, trade-offs likely exist.

Moreover, many people fail to realize that our understanding of these genetic variations is still developing. In many instances, it is uncertain whether a specific change is genuinely beneficial. This is because biologists must study vast populations—tens of thousands or more—carrying a particular genetic mutation to ascertain both its positive and negative effects.

Creating a Fair Genetic Lottery

To maximize the likelihood that an individual will benefit from genetic engineering, multiple genetic modifications may be necessary simultaneously. This is especially true concerning traits promoted by Bootstrap Bio, as height and intelligence rely on hundreds of mutations, each contributing marginally. The challenge is that we currently lack the technology to safely implement multiple changes in human embryos, much less hundreds at once, as discussed in my previous article on preventing genetic illnesses.

I support the idea of genetic enhancement for children—it’s preferable to leaving a child’s destiny to a random genetic lottery. However, I remain skeptical about the immediate feasibility of heritable genome editing. Expanding studies like the UK Biobank, which tracks large populations over the years to clarify genetic variant effects, is essential.

Finally, the notion that companies offering genetic enhancements can create a fairer world deserves scrutiny. Currently, a fifth of all children worldwide are born shorter than their potential due to inadequate nutrition, and many lack access to quality education. Those genuinely interested in enhancing children’s life chances should prioritize ensuring that all children meet their existing genetic potential rather than focusing narrowly on selective gene enhancements.


Source: www.newscientist.com

Ancient Wooden Tool: The Oldest Known Stick Shaped by Early Humans

Reconstruction of a Paleolithic woman crafting wooden tools

Credit: G. Prieto; K. Harvati

Remarkably, some of the oldest known wooden tools have been unearthed in an open-pit mine in Greece, dating back 430,000 years. These artifacts were likely crafted by an ancient human ancestor, potentially related to Neanderthals.

Archaeologists note that prehistoric wooden artefacts are “extremely rare.” According to Dirk Leder from the Lower Saxony Cultural Heritage Office in Hannover, Germany, any new findings in this area are highly valued.

Evidence suggests our extinct relatives may have utilized wooden tools for millions of years. “This could be the oldest type of tool ever used,” states Katerina Harvati from the University of Tübingen, Germany. Unfortunately, the preservation of wooden artifacts is often poor, hindering our understanding of their use.

Harvati and her team discovered the tool at a site called Marathusa 1, first identified in 2013 in the Megalopolis Basin of southern Greece. The open-pit lignite mine exposes sediment layers up to nearly a million years old, giving researchers rare access to deposits of that age, says Harvati.

From 2013 to 2019, excavations yielded not only tools but also the skeleton of a straight-tusked elephant (Palaeoloxodon antiquus), indicating a rich archaeological context that also includes more than 2,000 stone tools and the remains of varied flora and fauna, depicting an ancient lakeshore ecosystem.


To date Marathusa 1, researchers relied on various methods, including analyzing fossils and historical changes in Earth’s magnetic field. By 2024, they had confirmed that the artefacts are around 430,000 years old, a time marked by challenging climatic conditions—the most severe ice age of the Pleistocene in Europe. The Megalopolis Basin likely provided refuge due to its relatively temperate climate.

The archaeological team identified two significant wooden tools among the 144 artifacts. The first, an 81 cm long pole made from alder, exhibits marks indicative of intentional shaping. One end appears rounded, possibly serving as a handle, while the other is flattened, hinting at potential use for digging underground tubers or perhaps for butchering elephant carcasses. Harvati admits uncertainty about its exact application.

Mysterious second wooden tool from Marathusa 1

Credit: N. Thompson; K. Harvati

The second tool remains enigmatic, measuring just 5.7 cm in length and made from willow or poplar. It also shows signs of intentional shaping after the bark was removed. According to Harvati, this represents a completely new type of wooden tool. While it might have served to modify stone tools, the specific purpose remains a mystery.

Leder points out that while the first tool is a clear example of wooden craftsmanship, questions remain about the functionality of the second. “Is this a complete item or part of something larger?” he muses.

No hominid remains have been found at Marathusa 1. Given its age, it predates our species and is likely too early even for Neanderthals. “The prevailing hypothesis suggests this site might be associated with pre-Neanderthal humans or Homo heidelbergensis.” However, Harvati cautions against making definitive conclusions, noting that Greece was frequented by various hominin groups.

Other ancient wooden tools, like the Clacton spear discovered in Britain, are estimated to be about 400,000 years old, while a wooden spear from Schöningen, Germany, has been dated using multiple methods to around 300,000 years. The only tools that predate those found at Marathusa 1 are from Kalambo Falls in Zambia, which date back 476,000 years and resemble remains of larger structures or buildings.


Source: www.newscientist.com

Moroccan Hominin Fossils: Potential Close Ancestors of Modern Humans

Ancient Human Jawbone Discovered in Morocco’s ‘Cave of Mankind’

Hamza Mehimdate, Casablanca Pre-History Program

Approximately 550,000-year-old fossils discovered in North Africa potentially belong to a shared ancestor of Neanderthals, Denisovans, and modern humans, existing right before these three significant hominin lineages diverged.

The last common ancestor of Neanderthals, Denisovans, and modern humans is believed to have lived between 765,000 and 550,000 years ago. However, key questions about where and how it lived still challenge our understanding of human evolution.

Recent fossil discoveries suggest that researchers, including Jean-Jacques Hublin from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, are nearing the pivotal moment of divergence in ancient human lineages.

Hublin and his team examined various fossils, including two adult jawbones, one juvenile jawbone, and several vertebrae unearthed from a cave referred to as the Cave of Mankind near Casablanca, Morocco. One of the adult jawbones had been detailed in a 1969 study, while the remaining specimens are presented for the first time.

The fossilized molars share similarities with early Homo sapiens and Neanderthals, yet their jaw structures resemble ancient African Homo erectus.

Fortunately, these Moroccan hominids existed around the same period as changes in Earth’s magnetic field, recorded within the geological formations containing the fossils, allowing for precise dating to approximately 773,000 years ago.

Hublin described the find as filling a “significant gap” in the African human fossil record between one million and 600,000 years ago. Paleogenetic studies indicate that the lineage leading to Neanderthals and Denisovans diverged from our own around this time, with H. sapiens subsequently evolving in Africa.

The newly identified fossils are contemporaneous with a hominid population in Spain, previously hypothesized to serve as a common ancestor between Homo sapiens and Neanderthals.

Excavation Team at Moroccan Fossil Site

R. Gallotti, Casablanca Pre-History Program

Both Homo antecessor and the Moroccan hominins exhibit “a mosaic of primitive and derived features,” Hublin notes, suggesting possible genetic exchanges across the Strait of Gibraltar. However, notable distinctions exist between the fossils from the two regions, with the Spanish specimens appearing more Neanderthal-like.

“The last common ancestor likely inhabited both sides of the Mediterranean during that era, indicating a deep African lineage for Homo sapiens opposed to the Eurasian origin theories proposed by some,” Hublin states.

Julian Lewis, a professor at Griffith University in Brisbane, Australia, expresses intrigue over the physical differences in early Pleistocene hominids closely related to our species.

“The key takeaway is that these differences have been developing for a substantial period,” Lewis concluded, alluding to the arrival of Homo antecessor in Spain and suggesting it may represent one of several populations from North Africa that eventually crossed into Europe.

Chris Stringer from the Natural History Museum in London has also contributed to this discussion. His research, including findings from human fossils in China published last year, suggests that the last common ancestor of Homo sapiens, Neanderthals, and Denisovans could date back over a million years.

“The specific continent for that common ancestor’s existence remains unknown,” Stringer remarked. “Yet even if it lived outside Africa, our analysis indicates that the evolution of Homo sapiens predominantly took place in Africa, suggesting a potential early migration back into Africa for continued evolution.”

These newly identified Moroccan fossils may even represent early specimens of Homo sapiens, though sufficient skeletal fragments are lacking for definitive classification.

Ongoing comparisons with previously studied fossils will help ascertain their evolutionary positioning.


Source: www.newscientist.com

Jellyfish Sleep Patterns: Similar to Humans and Napping Habits Explained

Upside-down jellyfish on the ocean floor

Photo by Gil Koplowicz, Eilat

Recent research shows that jellyfish share surprising similarities with humans, including a sleep pattern of approximately eight hours a day, complemented by short naps. Understanding the sleep behaviors of these marine creatures can shed light on the evolutionary significance of sleep.

“Interestingly, like humans, jellyfish spend about a third of their time sleeping,” states Lior Appelbaum from Bar-Ilan University in Israel.

In animals with brains, such as mammals, sleep is crucial for memory consolidation and the elimination of metabolic waste. However, it remains unclear why sleep evolved in jellyfish, which belong to the brainless cnidarian group and possess neurons arranged in simple networks.

Appelbaum and his team utilized high-resolution cameras to observe Cassiopea andromeda, an upside-down jellyfish, in a controlled aquarium environment. The jellyfish were subjected to cycles of light and darkness to replicate natural conditions.

During the simulated daytime, the jellyfish exhibited an average pulse rate of over 37 times per minute, demonstrating responsiveness to sudden stimuli. In contrast, their pulse rate decreased at night, and they became less reactive, indicating a sleep state. These pulsations are vital for nutrient distribution and oxygen supply within the jellyfish’s body, as explained by Appelbaum.

Overall, the jellyfish typically slept for about eight hours each night, supplemented by naps lasting one to two hours. Prior studies had confirmed nocturnal sleep in C. andromeda, but the intricacies of their sleep cycles were previously unknown.

In another experiment, researchers simulated sleep disruption by pulsating water against the jellyfish, which led to even better sleep the following day. “It mirrors human behavior: when sleep-deprived at night, we tend to feel more fatigued during the day,” notes Appelbaum.

Crucially, further examination indicated that sleep in C. andromeda is associated with reduced DNA damage. Sleep likely protects neurons from deterioration that accumulates during wakefulness, an interpretation supported by the observation that exposing jellyfish to ultraviolet light—thereby increasing DNA damage—led them to sleep more.

Future studies are required to determine whether similar sleep benefits apply to other jellyfish species or even mammals. The researchers also found comparable results with starlet sea anemones (Nematostella vectensis), marking a significant step in confirming sleep in these organisms, according to Appelbaum.


Source: www.newscientist.com

Evidence Suggests Early Humans Began Hunting Elephants 1.8 Million Years Ago

Homo heidelbergensis hunting elephant

Ancient Humans Hunting Elephants—Evidence of Slaughtering Animals 1.8 Million Years Ago

Natural History Museum/Scientific Photography Library

Hunting an elephant is a formidable challenge that demands advanced tools and teamwork, but it offers an abundant source of protein.

A research team led by Manuel Dominguez-Rodrigo from Rice University in Texas suggests that ancient humans may have accomplished this feat approximately 1.78 million years ago in Tanzania’s Olduvai Gorge.

“Around 2 million years ago, our ancestors consistently consumed smaller game like gazelles and waterbucks but did not target larger prey,” says Dominguez-Rodrigo.

Later findings from Olduvai Gorge indicate a significant shift. This valley, rich in animal and human fossils deposited between 2 million and 17,000 years ago, shows a marked increase in elephant and hippopotamus remains around 1.8 million years ago. Conclusive evidence of human involvement in hunting, however, has remained elusive.

In June 2022, Dominguez-Rodrigo and his team discovered what may be an ancient elephant butchery site at Olduvai.

The site, dubbed the EAK site, revealed partial remains of an extinct elephant species, Elephas recki, surrounded by an array of stone tools much larger and sturdier than those used by hominins 2 million years ago. Dominguez-Rodrigo posits these tools were likely crafted by the ancient hominin Homo erectus.

“These include large stone knives that remain sharp even today,” he notes, emphasizing their suitability for butchering tasks.

Dominguez-Rodrigo and his colleagues believe these stone tools facilitated elephant butchery. Some limb bones appear to have been fractured shortly after the elephant’s death, indicating the bones were still fresh, or “green”. Scavengers such as hyenas can strip meat from a carcass, but they cannot shatter the dense limb-bone shafts of mature elephants.

“We discovered numerous bones in the field with fresh fractures, pointing to human use of hammer stones for processing,” he states. “These ‘green’ fractured bones are widespread in the 1.7-million-year-old landscape and bear distinct impact marks.”

However, there is a scarcity of cut marks on bones, which typically indicate butchering practices to extract meat.

It remains uncertain whether humans actively hunted the elephants or merely scavenged existing carcasses.

“What we can confirm is that they disassembled the bones—or portions of them—leaving behind tools and bones as evidence,” affirms Dominguez-Rodrigo.

He adds that the transition to hunting elephants wasn’t merely due to advancements in stone tools, but also hinted at an increase in social structure and cultural development among hominin groups.

However, Michael Pante, a researcher at Colorado State University, remains skeptical of the findings.

Pante contends that the evidence for human exploitation of this individual elephant is weak. The interpretation relies heavily on the proximity of stone tools and elephant remains, as well as the inferred fractures created by human attempts to access bone marrow.

Pante asserts that the earliest definitive evidence of hippo, giraffe, and elephant exploitation at Olduvai comes from the 1.7-million-year-old HWK EE site, roughly 80,000 years after the EAK site, based on his own research there.

“In contrast to the EAK site, the bones at HWK EE exhibit cut marks and are associated with thousands of other bones and artifacts within an archaeological context,” he explains.


Source: www.newscientist.com

New Study Examines Paleolithic Shift: Transition from Neanderthals to Anatomically Modern Humans

The transition from the Middle to Upper Paleolithic, occurring approximately 50,000 to 38,000 years ago, was a pivotal period characterized by the decline and extinction of Neanderthals alongside the emergence and expansion of anatomically modern humans, known as Homo sapiens. Paleoanthropologists at the University of Cologne have created a high-resolution model of population dynamics to reconstruct this significant transition on the Iberian Peninsula. Their ensemble simulations investigated Neanderthal survival, the arrival of modern humans, and the potential for interbreeding.

This image shows a Neanderthal and a human child. Image credit: Neanderthal Museum.

During this critical transition from the Middle Paleolithic to the Upper Paleolithic, Neanderthal populations experienced a steady decline across Europe, particularly in the Iberian Peninsula, leading to their eventual extinction.

Simultaneously, anatomically modern humans spread throughout Europe, marking a significant shift in human history.

This era was further defined by dramatic climate fluctuations, featuring alternating cold and warm periods: rapid warming events occurring within centuries, followed by gradual cooling phases (Dansgaard-Oeschger events), punctuated by episodes of severe cooling caused by iceberg releases into the North Atlantic (Heinrich events).

The precise timing of Neanderthal extinction and the arrival of modern humans remains uncertain, leaving open the possibility of interactions between the two species.

Genetic analyses of archaeological bones compared to modern populations indicate admixture events in eastern Europe during the early phases of modern human migration.

Given the uncertainty of these dates, it remains possible that the two populations on the Iberian Peninsula may have intermixed at a later time, though this has yet to be substantiated.

In this innovative study, Professor Yaping Xiao and colleagues from the University of Cologne used numerical models to simulate, in exploratory fashion, potential encounters between Neanderthals and modern humans on the Iberian Peninsula.

These models considered ongoing climate changes while simulating the populations of both groups, along with their interactions and connectivity.

“By running the model multiple times with varying parameters, we can assess the plausibility of different scenarios, such as the early extinction of Neanderthals, small at-risk populations, or prolonged survival leading to admixture,” explained Professor Xiao.

“However, for the majority of the simulations, the two groups did not converge.”

Across all scenarios, the populations exhibited high sensitivity to climate change.

Mixing between the two species was plausible if both populations could maintain stability over an extended period.

In a small fraction of simulation runs (about 1%), a portion of the total population, approximately 2-6%, carried genes from both groups by the end of the simulation.

This admixture likely occurred in the northwestern region of the Iberian Peninsula, where modern humans may have arrived early enough to interact with still-surviving Neanderthal populations.
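
The Cologne team’s model is far more sophisticated, but the ensemble logic is easy to illustrate. Below is a toy sketch in Python; the population dynamics, parameter values, and the random stand-in for the climate signal are all invented for illustration, and only the general approach of running many randomized simulations and counting outcome frequencies mirrors the study.

```python
# Toy ensemble sketch (illustrative only, not the Cologne model): two
# populations under a noisy climate signal; count the runs in which they
# overlap long enough for admixture to be possible.
import random

def run(seed):
    rng = random.Random(seed)
    neanderthals, sapiens = 5000.0, 0.0
    for step in range(1200):                        # ~12,000 years in 10-year steps
        climate = rng.gauss(0, 1)                   # stand-in for abrupt climate swings
        neanderthals *= 1 - 0.002 + 0.01 * climate  # slow, climate-driven decline
        if step > 500:                              # modern humans arrive partway in
            sapiens = max(sapiens, 100.0) * (1 + 0.004 + 0.002 * climate)
        if neanderthals < 100:
            return False                            # extinct before stable contact
        if sapiens > 1000 and neanderthals > 1000:
            return True                             # sustained overlap: mixing possible
    return False

outcomes = [run(seed) for seed in range(1000)]
print(f"overlap plausible in {sum(outcomes) / len(outcomes):.1%} of runs")
```

As in the real ensemble, the interesting output is not any single trajectory but the fraction of runs in which a given scenario occurs.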

“By integrating climate, demography, and cultural factors, our dynamic model provides a comprehensive framework that enhances our interpretation of the archaeological and genomic records,” stated Professor Gerd Christian Weniger from the University of Cologne.

For further details, refer to the paper published in the online journal PLoS ONE.

_____

Y. Xiao et al. 2025. Pathways at the Crossroads of Iberia: Dynamic Modeling of the Middle to Upper Paleolithic Transition. PLoS ONE 20 (12): e0339184; doi: 10.1371/journal.pone.0339184

Source: www.sci.news

The First Evidence of Fire-Making by Early Humans in Britain

Artistic representation of sparks from flint and pyrite

Craig Williams/Trustees of the British Museum

Approximately 400,000 years ago, Neanderthals or their ancestors in Britain struck flint with pyrite, repeatedly igniting fires in the same location. Archaeologists believe this is the earliest evidence of human fire-making discovered to date.

Early humans may have opportunistically utilized fire for around 1.5 million years, likely benefiting from naturally occurring fires caused by events like lightning strikes.

Starting around 400,000 years ago, signs of more extensive fire usage grew in Europe, yet direct evidence indicates that humans gained the ability to control fire only around 50,000 years ago.

Nick Ashton from the British Museum reported three crucial findings from the Barnham quarry site in Suffolk: pyrite, charred deposits, and a heat-damaged flint handaxe.

Pyrite holds significant importance in humanity’s fire history: striking it with flint can produce sparks capable of igniting dry material. Pyrite does not occur naturally in the area around the site, however, implying that early humans must have carried it there. “Pyrite is crucial,” Ashton states.

Additionally, the reddish sediment left by these fires is vital, according to him. Combustion alters the iron minerals present in the deposits, consequently changing their magnetic characteristics. Laboratory tests indicate that the reddish clay layer may have experienced firing events more than a dozen times, suggesting that humans revisited and ignited fires in this area repeatedly.

Heating flint makes it easier to shape into sharp tools, but excessive heat can shatter it, as evidenced by the handaxe discovered at Barnham. Tests indicated it had exceeded 700 degrees Celsius, leading Ashton to conclude it had been heated inadvertently.

Excavations at a disused quarry in Barnham, UK

Jordan Mansfield, Road to Ancient Britain Project.

Ashton notes that there’s accumulating evidence suggesting humans half a million years ago possessed a range of cultural and technological abilities, including making and using fire.

“Early Neanderthals, along with other early human species, were far more capable than we often acknowledge,” Ashton says. “Starting a fire is no simple task; it requires knowing where to find pyrite, understanding its properties when struck against flint, and selecting the right tinder to create a flame.”

John Gowlett at The University of Liverpool has stated that recent findings make it “highly plausible” that individuals 400,000 years ago were not just aware of fire but likely utilized it daily.

“Early humans were indeed familiar with fire, but the mere discovery of a burnt object associated with a tool doesn’t automatically indicate human control over it,” he explains. “If a location shows signs of repeated human occupation accompanied by multiple indicators of fire, that presents compelling evidence of human control, as natural fires seldom occur in the same place repeatedly.”


Topics:

  • Neanderthal Man
  • Ancient Humans

Source: www.newscientist.com

New Evidence Shows Humans Mastered Fire 400,000 Years Ago, Earlier Than Previously Believed

“This site, dating back 400,000 years, represents the earliest known evidence of fire-making, not just in Britain and Europe but across the globe,” stated Nick Ashton, co-author of the study and curator at the British Museum. He noted that this discovery pushes back the timeline of when our ancestors might have first created fire by approximately 350,000 years.

Researchers are uncertain about the uses of fire by these hominin ancestors. They may have roasted meat, crafted tools, or shared narratives under its glow.

Understanding when our ancestors mastered the use of fire is crucial to unraveling the complexities of human evolution and behavior.

One hypothesis suggests that the ability to start fire contributed to the increase in brain size among early humans, as cooking facilitates easier digestion and boosts caloric intake. Another theory posits that controlling fire may have fostered social gathering spots at night, boosting social behavior and cognitive evolution.

“We know brain size was increasing towards its current capacity during this period,” remarked Chris Stringer, research head in human evolution at London’s Natural History Museum and another author of the Nature study. “The brain is energetically costly, consuming about 20 percent of the body’s energy. Thus, the ability to use fire enhances nutrient absorption from food, provides energy for the brain, and allows for the evolution of larger brains.”

Stringer emphasized that this finding does not signify the beginning of fire usage among humans but is merely the earliest instance researchers can confidently point to. Other early indications of fire use have been found in regions of South Africa, Israel, and Kenya, though these are contentious and open to interpretation.

From an archaeological standpoint, it’s challenging to ascertain the cause of wildfires or whether they were initiated by humans.

“The key question is whether they collected fire from a natural source, managed it, or created it themselves. On the surface, this appears to be a robust case suggesting that the group knew how to start fires,” noted Dennis Sandgathe, a senior lecturer in the archaeology department at Simon Fraser University in Canada, who was not part of the study.

In the recent Nature study, researchers highlight the presence of deposits with fire residue, fire-cracked stone tools including a flint handaxe, and two small fragments of pyrite likely brought to the site by humans for fire-making, as indicated by geological analysis.

The prehistoric flint handaxe was discovered near a 400,000-year-old fire site that researchers believe was used repeatedly by early Neanderthals.
Road to Ancient Britain Project

Other outside researchers expressed skepticism.

Much of the evidence presented is “circumstantial,” wrote Wil Roebroeks, a professor emeritus of paleolithic archaeology at Leiden University in the Netherlands, in an email.

Roebroeks pointed out that later Neanderthal sites, dating to around 50,000 years ago, have yielded flint tools with wear marks indicating they were struck against pyrite to produce sparks, a clear sign of humans creating fire. That kind of evidence is not present in the current study.

“While the authors conducted a thorough analysis of the Barnham data, they seem to be overstating their claims by presenting this as the earliest evidence of fire-making,” Roebroeks noted.

For our ancestors, fire was vital for warmth, nutrition, deterring predators, and even melting resins used in adhesives.

However, Sandgathe emphasized that the evolution of fire-starting was not a straightforward path; it included sporadic adaptations and innovations. Evidence exists that early groups who learned to create fire sometimes lost that ability or ceased its use for cultural reasons.

“We must be cautious not to generalize any single instance … as proof that from this moment forward everyone knew how to start a fire,” Sandgathe remarked, referencing nearly 100 modern hunter-gatherer groups that have been meticulously documented. Some lacked the ability to generate fire.

“It’s probable that the art of fire-making was discovered, lost, rediscovered, and lost again across various groups over time. Its history is undoubtedly intricate.”

Source: www.nbcnews.com

Study Shows Humans Struggle to Accurately Interpret Dog Emotions

We often believe we can accurately gauge our dogs’ emotions, yet recent studies indicate that many of us may be misunderstanding their feelings.

Researchers at Arizona State University (ASU) discovered that when people are in a good mood, they are more prone to perceive their dog as looking sad. Conversely, when in a low mood, they are likely to view the same dog as happy.

This contrasts with how we interpret human emotions. In social interactions, we generally perceive others’ feelings as mirroring our own.

“I am continually fascinated by how people interpret emotions in dogs,” stated the study’s co-author, Clive Wynne. “We have only begun to uncover what is shaping up to be a significant mystery.”

The researchers believe these findings could greatly influence how we care for our pets.

“By enhancing our understanding of how we recognize emotions in animals, we can improve their care,” explained the first author, Dr. Holly Molinaro, who was a doctoral student at ASU focused on animal behavior at the time.

Dogs involved in the study, from left to right: Canyon, a 1-year-old Catahoula; Henry, a 3-year-old French Bulldog; and Oliver, a 14-year-old mongrel. The video background was black, ensuring only the dogs were visible. – Credit: Arizona State University

The research stemmed from two experiments with about 300 undergraduate students.

Participants first viewed images designed to evoke positive, negative, or neutral moods. They then watched a brief video featuring an adorable dog to assess its emotional state.

Those who saw uplifting images rated the dog in the video as sadder, while participants who viewed more somber images rated it as happier.

The video included three dogs—Oliver, Canyon, and Henry—depicted in scenarios reflecting cheerful, anxious, or neutral moods. Factors like snacks, toys, and the promise of visiting “Grandma” elevated their spirits, while a vacuum cleaner and a photo of a cat were used to bring them down.

Scientists are still puzzled about why humans misinterpret dogs’ emotions. “Humans and dogs have coexisted closely for at least 14,000 years,” Wynne noted.

“Over this time, dogs have learned much about cohabitation with humans. However, our research indicates significant gaps in our understanding of how dogs truly feel.”


Source: www.sciencefocus.com

60,000 Years Ago: Ancient Humans Arrived in Australia via Two Distinct Routes

Ancient humans took two distinct pathways to reach modern Australia.

Helen Farr and Eric Fisher

The timeline and means by which ancient humans made their way to what is now Australia and New Guinea have sparked much debate over the years. Recent genetic studies indicate this event likely occurred at least 60,000 years ago and involved two separate routes.

The regions of modern-day Australia, Tasmania, and New Guinea were once part of Sahul, an ancient continent that emerged during the peak of the ice age when sea levels were significantly lower. Researchers have been keen to understand human migration into these regions as it necessitated navigating dangerous ocean stretches of over 100 kilometers, even during low sea levels.

There are two primary theories regarding the arrival of humans in Sahul: one suggests it took place at least 60,000 years ago, while the other posits a timeline of around 45,000 years ago.

Regarding the approach taken, scientists have put forth two main routes. The southern route is believed to have led to Australia by sea from present-day mainland Southeast Asia through the Sunda region comprising Malaysia, Indonesia, and Timor. The northern route, which has more compelling supporting evidence, holds that humans migrated through the Philippines and Sulawesi, where hominin stone tools more than a million years old were recently found, to reach modern-day New Guinea.

To unravel these migrations, Martin Richards and his colleagues from the University of Huddersfield in the UK examined approximately 2,500 genome sequences from Indigenous Australians, Papua New Guineans, and various populations across the Western Pacific and Southeast Asia.

By analyzing DNA mutation rates and the genetic ties between these populations, the researchers determined that the initial human settlement of Sahul occurred via both routes, but predominantly through the northern pathway.

The question of timing has also been addressed by the researchers. “We traced both dispersals to around the same period, approximately 60,000 years ago,” Richards noted. “This lends support to the ‘long chronology’ of settlement as opposed to the ‘short chronology’ suggesting arrival around 45,000 to 50,000 years ago.”

The findings further illustrate that migration wasn’t a straightforward process, partially based on the discovery of ancient genetic lineages in a 1,700-year-old burial site in Sulawesi. The team also detected evidence indicating that shortly after their arrival on Sahul, coastal and marine communities began migrating towards what we now refer to as the Solomon Islands.

Adam Brumm, a professor at Griffith University in Brisbane, observed that the field of paleogenetics, which investigates history through preserved genetic material, “seems to adjust the narrative with each new study.”

“We believe this research bolsters the idea that the northern route played a crucial role in the early populating of Australia,” Brumm remarked. “Considering the ancient cave art found on Sulawesi, the possibility is rapidly becoming more plausible.”

This remarkable rock art has been dated to at least 51,200 years ago, Brumm explained. “I have a strong suspicion that individuals were crafting art in Sulawesi’s caves and shelters over 65,000 years ago.”

Peter Veth and his team at the University of Western Australia in Perth assert that even the most conservative estimates from the Madjedbebe site in Australia’s Northern Territory suggest traces of human activity exceeding 60,000 years. The new research further underscores the significance of early human arrival in Sahul.


Source: www.newscientist.com

5,000 Years Ago: Ancient Humans Introduced Wolves to Isolated Baltic Sea Islands.

The wolf, the wild ancestor of the dog, stands as the sole large carnivore domesticated by humans. Nonetheless, the exact nature of this domestication remains a topic of debate: whether it resulted from direct human control over wild wolves or from a gradual adaptation of wolf populations to human environments. Recent archaeological findings in the Stora Förvar cave on the Swedish island of Stora Karlsö, located in the Baltic Sea, have revealed the remains of two canids with genetic ties to gray wolves. This island, measuring just 2.5 km², possesses no native land mammals, like its neighbor Gotland, and thus any mammalian presence must have been human-introduced.

Canadian Eskimo Dog by John James Audubon and John Bachman.

“The discovery of wolves on such a remote island was entirely unexpected,” remarked Dr. Linus Girdland Flink, a researcher at the University of Aberdeen.

“They not only had genetic links indistinguishable from other Eurasian wolves but also seemed to coexist and feed alongside humans in areas that were only reachable by boat.”

“This paints a complex picture of the historical dynamics between humans and wolves.”

Genomic analysis of the canid remains indicates they are wolves, not dogs.

However, their traits suggest a level of coexistence with humans.

Isotope analysis of their bones indicates a diet high in marine proteins, such as seals and fish, mirroring the diet of the humans on the island, suggesting they were likely fed.

Furthermore, these wolves were smaller than typical mainland counterparts, and one individual demonstrated signs of low genetic diversity—a common outcome due to isolation or controlled breeding.

These findings challenge long-standing notions about the power dynamics between wolves and humans and about the domestication of dogs.

While it is unclear if these wolves were domesticated, confined, or managed, their presence in human-occupied areas suggests deliberate and ongoing interactions.

“The fact that it was a wolf and not a dog was a complete surprise,” stated Dr. Pontus Skoglund from the Francis Crick Institute.

“This provocative case suggests that under certain conditions, humans may have kept wolves in their habitats and found them valuable.”

“The genetic findings are intriguing,” noted Dr. Anders Bergström from the University of East Anglia.

“We discovered that the wolf with the most complete genome showed less genetic diversity than any ancient wolf previously analyzed.”

“This resembles what is observed in isolated or bottlenecked populations, or in domesticated species.”

“Although we cannot completely dismiss the idea that low genetic diversity may occur naturally, it implies humans were likely interacting with and managing wolves in ways not previously considered.”

One Bronze Age wolf specimen also presented advanced pathology in its limb bones, which would have restricted its mobility.

This suggests care or adaptation to an environment where large prey hunting was unnecessary for survival.

Professor Jan Storå of Stockholm University stated: “The combined data offers new and unexpected perspectives on human-animal interactions during the Stone and Bronze Ages, especially regarding wolves and dogs.”

“These findings imply that prehistoric interactions between humans and wolves were more intricate than previously understood, involving complex relationships that extend beyond simple hunting or avoidance, hinting at new aspects of domestication unrelated to modern dogs.”

A study detailing this research was published on November 24th in the Proceedings of the National Academy of Sciences.

_____

Linus Girdland Flink et al. 2025. A gray wolf in the anthropogenic setting of a small prehistoric Scandinavian island. PNAS 122 (48): e2421759122; doi: 10.1073/pnas.2421759122

Source: www.sci.news

Scientists Discover Humans Possess a Type of ‘Remote Touch’

Recent studies indicate that humans possess the capability to detect objects without physical contact, a skill seen in certain animals.

The first study examined human fingertip sensitivity to tactile signals from buried objects, while the second used a robotic arm with a long short-term memory model to detect objects. Image credit: Chen et al. / Gemini AI.

Typically, human touch is viewed as a sense limited to direct physical interaction with objects.

However, recent insights into animal sensory mechanisms challenge this perception.

Some species of sandpipers and plovers, for instance, utilize a form of remote touch to locate prey concealed beneath the sand.

Remote touch allows for the detection of objects hidden beneath particles by subtle mechanical signals transmitted through the medium when nearby pressure is applied.

In a groundbreaking study, Dr. Elisabetta Versace from Queen Mary University of London and her team explored whether humans share similar capabilities.

Participants delicately glided their fingers over the sand to locate a hidden cube before making physical contact.

Remarkably, the results revealed a sensitivity analogous to that found in shorebirds, despite humans lacking the specialized beak structure that facilitates this ability in birds.

Modeling the physical attributes of this phenomenon, researchers concluded that human hands are so sensitive they can perceive buried objects through minute sand displacements.

This sensitivity approaches the theoretical threshold for detecting mechanical “reflections” of granules when the movement of sand is reflected by a stable surface (the concealed object).

When evaluating the performance of humans against robotic tactile sensors trained using long short-term memory (LSTM) algorithms, humans achieved a remarkable accuracy of 70.7% within the anticipated detection range.

Interestingly, the robot could sense objects from slightly greater distances on average but encountered frequent false positives, resulting in an overall accuracy of only 40%.
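
For readers curious how such a detector is built, here is a minimal sketch of the general idea, assuming a generic LSTM binary classifier over fingertip-style force readings. The layer sizes, input format, and names are illustrative assumptions, not the authors’ implementation.

```python
# Minimal sketch: an LSTM classifier over a tactile time series.
# Hypothetical stand-in for the paper's robot pipeline, not the authors' code.
import torch
import torch.nn as nn

class TactileLSTM(nn.Module):
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # one logit: object present vs absent

    def forward(self, x):                  # x: (batch, time, n_features) force readings
        _, (h, _) = self.lstm(x)           # final hidden state summarizes the sweep
        return self.head(h[-1])

model = TactileLSTM()
sweeps = torch.randn(8, 200, 3)            # 8 sweeps, 200 timesteps, 3-axis force
probs = torch.sigmoid(model(sweeps))       # probability a buried object was "felt"
```

A recurrent model suits this task because the telltale mechanical “reflections” unfold over time as the sensor sweeps through the sand, so the whole sequence, not any single reading, carries the signal.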

These findings affirm that humans can genuinely detect objects prior to physical contact, showcasing an extraordinary aspect of our senses typically linked to direct interactions.

Both humans and robots demonstrated performance nearing the maximum sensitivity predicted by physical models of displacement.

This research uncovers that humans can identify objects buried in sand without direct contact, broadening our understanding of the extent of tactile perception.

Additionally, it provides quantitative evidence of tactile abilities previously undocumented in humans.

The study also presents a valuable benchmark for enhancing tactile sensing in assistive technologies and robotic systems.

By emulating human sensory perception, engineers could design robots with near-human touch sensitivity for practical use in tasks such as surveying, excavation, and exploration where visual cues are limited.

“This is the first time remote touch has been examined in humans, reshaping our concept of the perceptual fields of living beings, including humans,” stated Dr. Versace.

“This discovery opens avenues for creating tools and assistive technologies that amplify the human sense of touch,” remarked PhD student Zhenchi Chen.

“These insights could lead to the development of advanced robots capable of performing delicate tasks, such as locating untouched archaeological artifacts or navigating sandy or granular terrains like Martian soil or ocean floors.”

“More generally, this research facilitates the development of touch-based systems that enhance safety and effectiveness in exploring hidden and hazardous locations.”

“What makes this study particularly intriguing is the mutual influence between human research and robotic research,” noted Dr. Lorenzo Jamone, a researcher at University College London.

“Human experiments informed the robot’s learning strategy, while the robot’s efficacy offered new interpretations of human data.”

“This serves as a prime example of how psychology, robotics, and artificial intelligence can collaborate, illustrating how interdisciplinary teamwork can ignite both fundamental discoveries and technological advancements.”

Details of the findings were presented in September at the 2025 IEEE International Conference on Development and Learning (ICDL) in Prague, Czech Republic.

_____

Z. Chen and colleagues. Exploring haptics for object localization in granular media: A human-robot study. 2025 IEEE International Conference on Development and Learning; doi: 10.1109/ICDL63968.2025.11204359

Source: www.sci.news

Research Indicates Humans Evolved from Ape-Like Ancestors in Africa

A recent investigation by paleoanthropologists from the United States and Canada has focused on the morphology of the hominid talus, a key ankle bone that articulates with the tibia and the calcaneus of the foot. Ardipithecus ramidus, a hominid species that lived in eastern Africa approximately 4.4 million years ago, was at the center of this study. The researchers found that the fossil talus resembles that of chimpanzees and gorillas, which is adapted for vertical climbing and terrestrial quadrupedal locomotion, a form of movement in which animals walk on all fours with the entire sole of the foot, including the heel, touching the ground. Additionally, the authors confirmed the presence of derived features in the specimen consistent with earlier suggestions of an enhanced push-off mechanism in the foot of Ardipithecus ramidus.

Ardipithecus ramidus, a hominid that existed in Africa over 4 million years ago. Illustration by Arturo Asensio, from Quo.es.

The partial, 4.4-million-year-old skeleton of Ardipithecus ramidus, affectionately dubbed “Ardi”, was uncovered in 1994.

This species featured an ape-sized brain and had grasping big toes adapted for climbing trees.

It walked on two legs, and its upper canine teeth were diamond-shaped as opposed to the V-shape commonly found in chimpanzees.

“Ardi represents one of the oldest and most complete skeletons discovered,” remarked Dr. Thomas (Cody) Prang, a researcher at Washington University in St. Louis.

“Ardi is roughly a million years older than ‘Lucy’, another renowned early human ancestor, and signifies an earlier phase in human evolution.”

“One of the surprising aspects of this find was that, despite walking upright, Ardi retained many ape-like characteristics, such as its grasping feet.”

“Great apes, including chimpanzees and gorillas, possess divergent big toes that facilitate gripping tree branches while climbing.”

“However, it also exhibited traits consistent with our lineage. Ardipithecus truly represents a transitional species.”

Initially, scientists argued that Ardi’s locomotion reflected a generalized form of movement rather than the specialized locomotion of African apes, leading them to conclude that this early human ancestor was not particularly ape-like, a claim that startled the paleoanthropology community.

“From their analysis, they inferred that contemporary African apes, like chimpanzees and gorillas, represent a dead end, a kind of evolutionary cul-de-sac,” stated Dr. Prang. “‘Dead end’ here describes their specialized evolutionary trajectory, not the point at which humans emerged.”

“Instead, they posited that Ardi offered evidence of a more generalized ancestry that was less akin to chimpanzees and gorillas.”

By examining the ankles of chimpanzees and gorillas, researchers can gain insights into their movement, especially regarding their vertical tree climbing techniques.

This crucial bone also sheds light on how early species transitioned to bipedalism.

For the recent study, Dr. Prang and his team compared Ardi’s ankle to those of great apes, monkeys, and early humans.

Their findings indicated that Ardi’s ankle is the only one within the primate fossil record that shares similarities with African apes.

These apes are recognized for their adaptations to vertical climbing and terrestrial quadrupedal locomotion, suggesting that Ardi might have used its feet similarly.

Alongside these primitive traits, Ardi’s talus exhibited signs of an enhanced push-off mechanism in the foot.

This complexity points to a blend of climbing and locomotor behaviors in this early human species, which is crucial in understanding the evolution of bipedalism.

“This discovery is both controversial and consistent with earlier critiques,” Dr. Prang noted.

“While there is no disagreement regarding the significance of Ardi’s find, many in the field would argue that the initial interpretation was likely flawed.”

“Thus, this paper represents a reevaluation of the original views that distanced Ardi from chimpanzees and gorillas.”

“It’s vital to understand that our paper does not claim that humans evolved from chimpanzees.”

“However, this study further supports the hypothesis that the common ancestor of humans and chimpanzees was likely very similar to today’s chimpanzees.”

For more details, refer to the paper published in the journal Communications Biology.

_____

T.C. Prang et al. 2025. Ardipithecus ramidus ankle provides evidence of African ape-like vertical climbing in early humans. Commun. Biol. 8: 1454; doi: 10.1038/s42003-025-08711-7

Source: www.sci.news

Could Humans Face Extinction in Precisely 314 Years?

Feedback is New Scientist’s regular column, casting a sideways and sometimes critical eye over the latest news in science and technology. To share items you think may interest our readers, please contact us at feedback@newscientist.com.

Our Expiry Date

Unfortunately, we have some bad news. Humanity’s days are numbered: experts predict our extinction by 2339, leaving us only a few centuries (as of now).

News editor Jacob Aron brought this startling information to our attention: a paper, not yet peer-reviewed, posted on the social science preprint server SocArXiv. In it, demographers David Swanson and Jeff Tayman discuss how the human population could decline from the current 8.1 billion to zero.

Their reasoning is straightforward: “Considering the decrease in birth rates from 2019 to 2024 and applying probabilistic forecasting methods, by 2139, the world’s population will fall between 1.55 billion and 1.81 billion… By 2339, humanity will be extinct,” they assert.

Swanson and Tayman highlight that this extinction timeline is “only 314 years away.” One might think the estimate could have been rounded to 300 to incorporate some necessary uncertainty in the predictions, but the confidence displayed is noteworthy.

This may seem evident, but we cannot base projections for the next three centuries on just five years of data — especially from 2019 to 2024, a period marked by significant global events that likely impacted birth rates.

They employed three different methodologies: the Cohort Component Method, the Hamilton-Perry Method, and even the notable Espenshade-Tiemann Method. Despite this, the prediction remains flawed. However, it’s likely our audience has already deduced this.
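
To see quite how much work the extrapolation is doing, here is Feedback’s own back-of-envelope sketch, emphatically not the authors’ method: fit a steady exponential decline to the paper’s own figures, and the population never actually reaches zero.

```python
# Back-of-envelope check (ours, not the demographers'): fit a constant
# exponential decline to the paper's own numbers and project it forward.
import math

p2024, p2139 = 8.1e9, 1.7e9                        # midpoint of 1.55-1.81 billion
r = math.log(p2139 / p2024) / (2139 - 2024)        # implied annual decline rate
print(f"implied decline: {r:.2%} per year")        # about -1.4% per year

p2339 = p2139 * math.exp(r * (2339 - 2139))
print(f"projected 2339 population: {p2339:,.0f}")  # ~110 million people, not zero
```

Reaching zero by 2339 therefore requires the decline to keep accelerating for three centuries, an assumption doing a great deal of quiet work.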

For a moment, we wondered whether the paper was intended as satire, designed to lure unsuspecting science journalists into reckless reporting. That seems unlikely, however, as Swanson presented it at a conference in September, after which “a robust discussion unfolded”.

This might hint at a precursor to a new belief system, positioning the apocalypse conveniently three centuries away to avoid embarrassment if it doesn’t come to pass.

Oh, No More

Feedback notes that US President Donald Trump has referred to climate change as “a scam”, deeming renewable energy sources like wind power “pathetic”.

This came in the wake of a government report published in July, generated by “independent researchers,” attempting to justify ceasing climate change mitigation efforts. Carbon Brief reviewed the report and identified over 100 misleading statements. Across the pond, the British Conservative Party has pledged to repeal climate change legislation upon regaining power.

Feedback further notes that renewable energy surpassed coal to become the world’s leading source of electricity by mid-2025, which doesn’t seem particularly pathetic. Meanwhile, we’re reminded of that memorable scene from Monty Python and the Holy Grail in which monks rhythmically strike their own heads. We can only assume those individuals read Swanson and Tayman’s paper and concluded that 2339 was too far off.

A Simple Thank You

One of the hallmarks of being an excellent researcher is to explore questions that others haven’t considered. Consequently, a study was published in the journal Socius in September: “‘This Task Would Have Been Impossible‘… A study examining the length of acknowledgments in sociology books.” Yes, that’s correct. This is an entire sociology paper dedicated to the acknowledgments section of sociology literature.

The first takeaway, as the authors note, is that they are not the first to pose this question. Back in 1972, Kenneth Henry Mackintosh published a study titled Approval Patterns in Sociology. When Feedback tracked it down online, we were disappointed to find that it ran to over 300 pages and, if the table of contents is to be believed, lacked an acknowledgments section of its own.

What of the new research? The researchers evaluated 411 books written by 317 sociologists and examined the acknowledgments (excluding the 7 percent of books that, rudely, had none). A significant statistical trend revealed that female authors wrote longer acknowledgments than their male counterparts.

Similarly, books released by university presses contained longer acknowledgments compared to those from other publishers. It remains unclear whether this means they were thanking more individuals or simply elaborating more extensively.

Naturally, we were curious about the acknowledgments section of this very paper, so we scrolled down. We were pleased to see it consisted of 218 words and included a heartfelt mention of “steadfast love and support.”

Then we discovered it wasn’t entirely original. Co-author Jeff Lockhart listed the paper on Bluesky, and another researcher quipped: “I love that the paper itself has a lengthy acknowledgments section.” In response, Lockhart remarked, “we felt it was necessary.”

Feedback would like to acknowledge our cats, whom we repeatedly prevented from stepping on the laptop keyboard while writing this item.

Have a story for Feedback?

You can send stories to Feedback at feedback@newscientist.com. Please include your home address. This week’s and past Feedbacks can be found on our website.

Source: www.newscientist.com

Transplanting Pig Livers into Living Humans Achieves Near-Normal Functionality

Surgeons carry out a pig liver transplant at the First Affiliated Hospital of Anhui Medical University in China in May 2024.

Lu Xianfu

Transplants of organs from non-human animals to human recipients could transform medicine and potentially save countless lives each year as many die awaiting transplants. Past experiments have seen pig hearts and kidneys transplanted into humans, but this marks the first instance of an animal liver being transplanted into a living person.

“This is truly groundbreaking,” remarks Heiner Wedemeyer from Hannover Medical School in Germany, who was not involved in the procedure. “The patient was critically ill, but thanks to the transplant, he survived for six months.”

The complexities of the liver have prevented such surgery until now. Earlier pig liver transplants had been carried out only in brain-dead individuals, although those showed indications of success. “The heart acts merely as a muscle for pumping blood,” Wedemeyer explains. “Kidneys are simpler, as they filter waste. The liver, however, is unique: it synthesizes a variety of proteins essential for numerous metabolic functions.”

Similar early successes were noted in heart and kidney transplants, although subsequent complications arose. In the realm of heart transplantation, risks potentially include the spread of swine viruses.

Recently, Beicheng Sun and colleagues at Anhui Medical University reported a pig liver transplant performed on a 71-year-old man. His own liver was deemed too damaged for a traditional transplant because of severe tumor growth and significant scarring from hepatitis B. Thousands of people die each year awaiting liver transplants, so each surgical case must be meticulously justified, according to Sun.

However, Sun indicated that the man required some form of transplant as there was a risk of the tumor rupturing, which could be life-threatening. With the patient’s consent, Sun and his team replaced the affected portion of the liver with one harvested from an 11-month-old minipig in May 2024. During a five-hour procedure, they connected the blood vessels of the pig liver to those of the left side of the recipient’s own liver.

To mitigate the risk of rejection by the immune system, three pig genes were disabled while seven human genes were introduced, enhancing compatibility. The patient was also administered immunosuppressants while the team diligently examined his liver to ensure it was free from swine viruses.

Almost immediately after surgery, the new liver began to produce bile, which is crucial for the digestion of fats. Within weeks, the patient’s levels of bile and albumin (a protein that retains fluid within blood vessels) rose to healthy ranges, Sun reported.

Nevertheless, about a month post-transplant, a life-threatening blood clot formed in a blood vessel, necessitating the removal of the graft. This complication likely stemmed from an overactive immune response, leading to abnormal blood-clotting protein levels—a challenge that may be common in pig transplants given the biological differences between species.

The patient lived for roughly five additional months with only the left side of his liver remaining before succumbing to gastrointestinal bleeding, a frequent issue associated with liver scarring, according to Sun. Both Sun and Wedemeyer believe this bleeding was probably not related to the transplant.

Despite the outcome, the operation is seen as a partial success: without the graft, the patient would likely have died soon after his tumor was removed, noted Wedemeyer. Furthermore, he added that the patient’s own liver may have partially regenerated while the transplant was functioning, enabling survival for several months after the graft’s removal.

Wedemeyer emphasized that this procedure enhanced the understanding of xenotransplantation and opened up the possibility of pig livers providing temporary solutions for patients awaiting human transplants. There may even be a chance that the remaining liver tissue could grow sufficiently to negate the need for further treatment, indicated Sun.

However, Sun cautioned that it may take at least ten years before pig livers can replace human livers permanently. He stressed the need to minimize potential complications through further genetic advancements.


Source: www.newscientist.com

Five Incredible Inventions that Turn Humans into True Cyborgs

We already know how to artificially restore our bodies to their natural baseline: prescription glasses correct vision, while hearing aids help those with hearing loss.

Today, emerging technologies are advancing to enhance the human body more than ever before.

Those who utilize these innovations embody a blend of human and machine, elevating their capabilities to reach what once seemed unattainable.

These are not merely futuristic concepts; they are new technologies available on the market or soon to be released.

I’ll take you to the sky

Flight has been a long-held human aspiration, from Icarus in Greek mythology to flying cars in sci-fi films like Blade Runner. Though personal flight technology may seem fanciful, it exists today in the form of a gravity jet suit.

We once envisioned rocket boots that could lift us off the ground, but such designs would be inherently unstable due to thrust being directed away from the wearer’s center of gravity.

Instead, the Gravity Jetsuit employs five engines that gently assist the wearer in navigating the air. Pilots can hover, spin, and even glide short distances above land or water.

The 1,050 horsepower gas turbine on the back provides essential lift, while two small jet turbines on each arm ensure stability and control.

In 2021, the Royal Navy tested the suit in a training exercise exploring its potential for boarding ships at sea.

Please close your eyes and look

In the future, this technology could assist individuals with color blindness in perceiving a broader spectrum of colors – Photo Credit: Getty Images

Human vision is relatively limited; without light, we are nearly blind. Night-vision goggles have long allowed us to see infrared wavelengths, but this technology has now been miniaturized.

Chinese scientists developed contact lenses coated with nanoparticles that absorb infrared rays and re-emit them as visible red, green, or blue light.

In trials, these lenses enabled users to see flashes from an infrared LED.

Interestingly, participants found it easier to see the flashes with their eyes closed, as their eyelids blocked out interfering visible light.

In the future, this technology may be adapted to assist those with color blindness in experiencing a wider range of colors.


I’m walking towards the future

Exoskeleton suits are wearable machines designed to enhance natural strength. Think of them as a type of powered suit or wearable forklift.

Many associate them with sci-fi movies like Aliens, but they are already in use today.


Exoskeleton sensors detect your movement and activate the joint motors to provide added strength – Image Credit: DNSYS

Exoskeleton technology currently assists some individuals with disabilities, enabling them to function without assistance. It is also being developed to help warehouse workers lift and move heavier objects safely.

Moreover, this technology has everyday applications, such as aiding individuals in hiking longer distances or navigating stairs that might otherwise be challenging.

Get a better grip

Scientists at University College London recently created a glove-like device that provides the wearer with an additional thumb.

Researchers from University College London developed a glove-like device that provides the wearer with an extra thumb – Image credit: Dani Clode/UCL

While this may appear redundant (as most of us already possess two), tests show that the additional thumb enhances dexterity, allowing tasks to be completed more easily with two hands.

Sensors located beneath the wearer’s toes control the movement of the third thumb. Currently there is no tactile feedback, meaning the wearer cannot yet use it for delicate tasks, like carrying eggs.

Though the third thumb is not part of our natural anatomy, participants have quickly adapted to using it; only four out of nearly 600 individuals could not operate it successfully.

Control things with your thoughts

Brain-computer interface technology gives individuals with neurodegenerative disorders reliable control over their surroundings – Photo Credit: Case Western Reserve University

A brain-computer interface (BCI) is an innovative technology that enables individuals to control computers using their thoughts.

By embedding microelectrodes into the brain, nerve signals associated with limb movement can be decoded by computers and used to manipulate external devices.
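
As a rough illustration of that decoding step, here is a minimal sketch, assuming a simple ridge regression from synthetic firing rates to 2D cursor velocity. Real systems use far more sophisticated decoders, such as Kalman filters or neural networks, and everything below, including the 96-channel array size, is an illustrative assumption.

```python
# Minimal sketch of BCI cursor decoding: regress neural firing rates onto
# cursor velocity. All data is synthetic; real decoders are more elaborate.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_samples, n_channels = 5000, 96                   # e.g. a 96-electrode array
rates = rng.poisson(5, (n_samples, n_channels)).astype(float)     # spike counts
true_map = rng.normal(0, 0.1, (n_channels, 2))
velocity = rates @ true_map + rng.normal(0, 0.5, (n_samples, 2))  # (vx, vy)

decoder = Ridge(alpha=1.0).fit(rates[:4000], velocity[:4000])
predicted = decoder.predict(rates[4000:])          # would drive the cursor
print("decode R^2:", decoder.score(rates[4000:], velocity[4000:]))
```

The same principle scales up: richer features and better models turn the recorded signals into smoother, more reliable control.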

This technology enhances far more than just computer usability; it offers those with neurodegenerative conditions reliable control over their environment.

So far, this technology has allowed volunteers with spinal cord injuries to control computer cursors merely by thinking of movements associated with their paralyzed hands or arms.

The next evolution for these devices is the precise control of robotic limbs. With BCI technology, scientists envision a future where patients with muscle atrophy or other degenerative conditions fully regain their motor capabilities.


Source: www.sciencefocus.com

British AI Startup Outperforms Humans in Global Forecasting Competition

The artificial intelligence system has outperformed numerous prediction enthusiasts, including a number of experts, in a competition focused on forecasting events ranging from the fallout between Donald Trump and Elon Musk to whether Kemi Badenoch would be ousted as Conservative leader.

The UK-based AI startup, established by former Google DeepMind researchers, finished in the top 10 of an international forecasting competition in which participants predicted the probabilities of 60 events occurring over the summer.

Mantic secured 8th place in the Metaculus Cup, run by the San Francisco-based forecasting platform Metaculus; the startup aims to sell its predictions to investment funds and corporations.

While AI performance still lags behind the top human predictors, some contend that it could surpass human capabilities sooner than anticipated.

“It feels odd to be outperformed by a few bots at this stage,” remarked Ben Sindel, one of the professional predictors who finished behind Mantic’s team in the competition. “We’ve made significant progress compared to a year ago, when the best bots ranked around 300th.”

The Metaculus Cup included questions like which party would win the most seats in the Samoan general election, and how many acres of the US would be affected by fires from January to August. Contestants were graded based on their predictions as of September 1st.

“What Mantic achieved is remarkable,” stated Deger Turan, CEO of Metaculus.

Turan estimated that AI would perform at par or even surpass top human predictors by 2029, but also acknowledged that “human predictors currently outshine AI predictors.”

In complex predictions reliant on interrelated events, AI systems tend to struggle with logical validation checks when interpreting knowledge into final forecasts.

Mantic dissects prediction challenges into distinct tasks and assigns them to various machine learning models from companies such as OpenAI, Google, and DeepSeek, according to each model’s strengths.

Co-founder Toby Shevlane indicated that their achievements mark a significant milestone for the AI community, utilizing large language models for predictive analytics.

“Some argue that LLMs merely replicate their training data, but you cannot replicate the future,” he noted. “We require genuine inference. We can assert that our system’s forecasts are more original than those of most human contenders, as individuals often anchor on the average community prediction, whereas AI systems frequently diverge from those averages.”

Mantic’s systems deploy a range of AI agents to evaluate current events, conduct historical analyses, simulate scenarios, and make future predictions. The strength of AI prediction lies in its capacity for hard work and endurance, vital for effective forecasting.
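
Mantic’s pipeline is proprietary, so purely as an illustration of the final aggregation step, here is one common way to pool several models’ probabilities into a single forecast; the log-odds pooling and the accuracy-based weights are our assumption, not Mantic’s published method.

```python
# Illustrative aggregation sketch: pool several models' probabilities by a
# weighted average in log-odds space (not Mantic's published method).
import math

def aggregate(probs, weights):
    """Combine event probabilities, weighting each model's log-odds."""
    logit = sum(w * math.log(p / (1 - p)) for p, w in zip(probs, weights))
    return 1 / (1 + math.exp(-logit / sum(weights)))

# e.g. three models' probabilities for one event, weighted by past accuracy
print(aggregate([0.62, 0.55, 0.70], weights=[1.0, 0.8, 1.2]))  # ~0.64
```

Averaging in log-odds space treats probabilities near 0 and 1 as stronger evidence than a simple average of raw probabilities would.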

AI can simultaneously tackle numerous complex challenges, revisiting each daily to adapt based on evolving information. Human predictors also leverage intuition, but Sindel suggests this may emerge in AI as well.

“Intuition is crucial, but I don’t think it’s inherently human,” he commented.

Top-tier human superforecasters maintain that they still have the edge. Philip Tetlock, co-author of the bestseller Superforecasting, recently published research indicating that, on average, expert forecasters continue to outperform the best bots.

Turan reiterated that AI systems face challenges in complex predictions involving interdependent events, struggling to identify logical inconsistencies in output during validation checks.

“We’ve witnessed substantial effort and investment,” remarked Warren Hatch, CEO of Good Judgment, a forecasting firm co-founded by Tetlock. “We anticipate AI excelling in specific question categories, such as monthly inflation.”

Or, as Lubos Saloky, the human forecaster who placed third in the Metaculus Cup, put it: “I’m not retiring. If you can’t beat them, I’ll collaborate with them.”

Source: www.theguardian.com

Cats Can Experience Dementia: A Potential Key to New Treatments for Humans

Cats that exhibit dementia-like symptoms in their senior years undergo brain changes analogous to those seen in humans with Alzheimer’s disease, a new study has found. This discovery may open pathways for new research and help in finding treatments for these challenging and notorious diseases.

“Our advancements in treating Alzheimer’s disease have been relatively limited compared to other illnesses,” stated Dr. Robert McGeechan, the study’s lead author, in an interview with BBC Science Focus.

“Cats are experiencing similar neurological changes, making them potentially more relevant models for understanding the disease. By investigating Alzheimer’s in cats, we can develop treatments that might be more effective for humans.”

Alzheimer’s disease is the most prevalent form of dementia, encompassing a range of neurodegenerative conditions that impair memory, problem-solving, language, and behavior. Approximately one in nine individuals over 65 are affected by Alzheimer’s, and with an aging global population, over 150 million people could be diagnosed by 2050.

Yet, despite decades of investigation and billions spent, only a handful of effective treatments exist today.

How Cats Develop Dementia

The understanding that cats can show dementia-like symptoms with age is not new. According to some research, nearly one-third of cats aged 11 to 14 exhibit at least one sign of cognitive dysfunction syndrome (CDS), the veterinary term for dementia in felines. For cats older than 15, this figure increases to over half.

CDS symptoms in cats, which resemble those in humans with Alzheimer’s, include changes in sleep patterns and disorientation. Many cats also become more vocal and often seek additional comfort and attention from their owners.

It is also known that, similar to humans, older cats typically develop an accumulation of amyloid beta plaques in their brains, which are suspected to play a role in the onset of Alzheimer’s.

“As we age, humans develop these protein plaques in our brains. However, not everyone with these plaques develops Alzheimer’s, and the reasons for this remain unclear,” McGeechan explained.

“We were similarly situated with cats, knowing they could develop dementia and that some produce these proteins as they grow older, but we lacked clarity on whether this was solely age-related or if it contributed to dementia.”

To delve deeper, McGeechan’s team examined the brains of 25 cats of varying ages post-mortem, including those with CDS symptoms.

They discovered that amyloid beta plaques were not just passively situated in the brain but were also linked to detrimental changes. Notably, they observed increased inflammation and signs of glial cells, the immune cells of the brain, “enveloping” the synapses surrounding these protein plaques.

Synapses are tiny junctions enabling brain cells to communicate, and their progressive loss is believed to underlie many memory and behavioral symptoms associated with dementia.

The findings imply that a similar toxic chain reaction may also occur in feline brains. As amyloid beta accumulates, it activates glial cells, leading to the degradation of healthy synapses. While this broader pattern was evident, the finer details proved to be more intricate.

Upon comparing the brains across different age groups, subtle differences emerged. Cats with dementia appeared significantly different from younger cats, exhibiting greater amyloid plaque accumulation, inflammation, and synaptic loss. However, they did not starkly differ from older, yet otherwise healthy cats.

This observation indicates that researchers might struggle to draw a clear line between aging and dementia.

Yet, the team noted an important distinction concerning the relationship between amyloid plaques and synaptic damage. In older, healthy cats, increased amyloid did not equate to more harm. However, in cats with dementia, higher plaque levels correlated with increased inflammation and greater brain cell loss.

McGeechan posits that this mirrors human scenarios. Numerous older adults accumulate amyloid plaques in their brains without developing Alzheimer’s, while others experience significant cognitive decline.

“Amyloid may have a more toxic impact on cats experiencing CDS,” he noted. “This correlation suggests amyloid plays a role in inflammation and synaptic loss in the dementia group, unlike in the aging group.”

Thus, while amyloid accumulation may contribute to feline dementia, it likely does not tell the full story. Much like Alzheimer’s in humans, a complex interplay of various factors may also be at play.

MRI image of a cat’s brain exhibiting signs of cognitive dysfunction. The lighter areas on the edges illustrate regions of tissue loss.

The Significance of Cats in Research

Alzheimer’s disease research has historically depended on rodents, where diseases are artificially induced by genetic manipulation.

While these models aid in exploring molecular mechanisms, they often fall short of encapsulating the intricacies of naturally occurring diseases that unfold over time. Consequently, numerous promising drugs that succeed in mice fail when tested on humans.

In contrast, cats naturally develop dementia as they age, mirroring the human experience. They also share the same living environments and risk factors, including diet and air quality.

This similarity renders them a more realistic model for understanding disease biology and identifying environmental triggers that might push certain individuals towards dementia.

“Cats could serve as a bridge in our pursuit of effective treatments,” McGeechan expressed.

Future Directions

At this point, the findings raise just as many inquiries as they resolve. Given that the study involved only 25 cats, a larger sample size may be necessary to clarify the precise mechanisms underlying the observed clinical outcomes, according to McGeechan.

Another area of focus is tau. Besides amyloid beta, tau is another key protein associated with Alzheimer’s disease. Unlike amyloid plaques, tau forms tangles within brain cells. Many researchers believe tau drives the most severe stages of the disease in humans, but this investigation did not address tau in cats.

Dogs may also present a valuable avenue for exploration. Like cats, they can age into a dementia-like syndrome, displaying symptoms recognizable to many owners, such as sleep disturbances, anxiety, and forgetfulness. Comparing the brains of dogs and cats might reveal shared biological processes across species.

Ultimately, this body of research holds promise not just for human health, but for feline health too.

“Dementia in cats is a distressing condition for both the animals and their owners,” remarked Professor Danièlle Gunn-Moore, a co-author of the study and Chair of Feline Medicine at the Royal (Dick) School of Veterinary Studies.

“Conducting such research aims to enhance our understanding of how best to treat these conditions. This work benefits not just cats and their owners but also individuals with Alzheimer’s disease and their loved ones. Dementia in cats serves as an ideal natural model to study Alzheimer’s disease.”

About Our Experts

Robert McGeechan is a resident in Veterinary Neurology and Neurosurgery and serves as an ECAT Veterinary Clinical Lecturer at the University of Edinburgh, UK. His research has been published in European Journal of Neuroscience, Scientific Reports, and Nature Neuroscience.

Source: www.sciencefocus.com

Brain Implant Restores Long-Forgotten Joy to Man With Decades of Depression

A man who underwent brain stimulation had previously tried 20 treatments for his depression

Damien Fair et al./cc-by 4.0

A man who suffered from severe depression for over 30 years has seemingly found relief through a personalized brain “pacemaker” designed to selectively stimulate various brain regions.

“He’s felt joy for the first time in years,” states Damien Fair from the University of Minnesota.

Treatment-resistant depression is typically defined as showing minimal improvement after trying at least two antidepressants. Procedures like electroconvulsive therapy (ECT) can provide some benefit, but they don’t always yield relief. “They’re one-size-fits-all: you target the same brain area in everyone,” Fair explains. Yet, as every brain is unique, such treatments often miss the exact target needed for an individual’s relief.

Fair and his team have now created a tailored method for a 44-year-old man with a long history of hospitalization for depression. He had attempted 20 different treatments, including antidepressants, therapy, ECT, and more, all without lasting success. “It’s one of the most severe depression cases I’ve seen; he has attempted suicide three times,” Fair notes.

Initially, the researchers conducted a 40-minute MRI scan to delineate the boundaries of four brain activity networks linked to depression. One of these networks, the salience network, was found to be four times more expansive in the man than in individuals without depression, potentially exacerbating his symptoms, according to Fair.

The team then surgically implanted clusters of four electrodes at these defined boundaries, entering through two small openings in the skull. Just three days later, they began sending weak electrical pulses through wires attached to the electrodes, stimulating each brain network separately.

Upon stimulating the first network (the default mode network, related to introspection and memory), the man cried tears of joy. “I feel so much better,” Fair recalls him saying.

Stimulation of the action-mode and salience networks also reduced feelings of anxiety, while the team noticed enhanced focus when targeting the frontoparietal network involved in decision-making.

Using the man’s feedback, the team connected the electrode wires to tiny batteries placed just beneath the skin near the collarbone, allowing him to maintain these benefits outside the hospital. This setup acts like a “brain pacemaker,” as Fair describes it, stimulating various networks for a minute each day.

For six months, the man utilized an app linked to the pacemaker to alternate between different stimulation patterns crafted by the team every few days. He also documented his depression symptoms daily. The team optimized the stimulation based on this data during the first six months post-surgery.

Within seven weeks of surgery, the man was reporting no suicidal thoughts. By the nine-month mark, he was in remission according to the Hamilton Depression Rating Scale. The improvement has persisted for over two and a half years, apart from a brief period when his symptoms slightly recurred after contracting Covid-19.

“This is an incredible outcome,” states Mario Juruena from King’s College London. “It serves as a crucial proof of concept for patients unable to tolerate traditional depression treatments.”

Researchers have noted that compared to previous attempts at personalized brain stimulation, their method required fewer computational resources and led to shorter hospital stays.

It’s plausible that the man’s expanded salience network played a role in the treatment’s success. Such expansion is often present in depression; however, it’s premature to conclude whether individuals with less salience network expansion would respond similarly, Juruena states.

To confirm the safety and effectiveness of the approach, randomized controlled trials assigning people with depression to either stimulation or a placebo will be necessary, according to Juruena. Fair says the team aims to begin such trials within two years, after testing the method on additional individuals.

If you need someone to listen, reach out: Samaritans in the UK at 116 123 (samaritans.org); US 988 Suicide & Crisis Lifeline: 988 (988lifeline.org). Visit bit.ly/suicidehelplines for resources in other countries.

Source: www.newscientist.com

Man Develops Rare Condition After Querying ChatGPT About Eliminating Salt

A US medical journal is cautioning against the use of ChatGPT for health-related information after a man developed a rare condition following discussions with the chatbot about eliminating table salt from his diet.

A case report published in the Annals of Internal Medicine describes how a 60-year-old man experienced bromism, also referred to as bromide toxicity, after consulting ChatGPT.

The case study notes that bromism was a “well-recognized” syndrome in the early 20th century, believed to have contributed to about one in ten psychiatric admissions at the time.

After reading about the negative effects of sodium chloride (table salt), the patient sought guidance from ChatGPT on eliminating chloride from his diet and went on to consume sodium bromide for three months. He did this despite having read that “chloride can be exchanged for bromide, though likely for other purposes, such as cleaning.” Sodium bromide was used as a sedative in the early 20th century.

The article’s authors, from the University of Washington in Seattle, emphasized that the case underscores “how the use of artificial intelligence can contribute to preventable adverse health outcomes.”

They noted that the lack of access to the patient’s ChatGPT conversation logs hindered their ability to ascertain the specific advice the man received.

Regardless, when the authors themselves asked ChatGPT for alternatives to chloride, the responses also included bromide, lacked specific health warnings, and did not ask why the information was being sought; “I think a healthcare professional typically would do that,” they remarked.

The authors cautioned that ChatGPT and other AI applications can “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

OpenAI, the creator of ChatGPT, was approached for comment.

The company recently announced an upgrade for its chatbot, asserting that one of its notable strengths lies in health-related queries. Powered by the GPT-5 model, ChatGPT excels in answering health questions and aims to be more proactive in “flagging potential concerns” like serious physical and mental illnesses. However, it stressed that chatbots cannot replace expert advice.

The case report, published last week before the release of GPT-5, indicated that the patient had likely interacted with an earlier version of ChatGPT.

While recognizing that AI could serve as a conduit between scientists and the public, the article warned that the technology also risks disseminating “decontextualized information,” emphasizing that medical professionals would rarely suggest sodium bromide in response to inquiries about replacing table salt.

The authors encouraged physicians to consider patients’ use of AI when trying to establish where they obtained their health information.

The authors recounted that the patient, suffering from bromism, presented at a hospital expressing concern that his neighbor might be poisoning him. He also mentioned having several dietary restrictions and was noted to be paranoid about the water he was offered, despite intense thirst.

The patient attempted to leave the hospital within 24 hours of admission and was subsequently sectioned before being treated for his mental health symptoms. Once stabilized, he reported various other bromism symptoms, including facial acne, relentless thirst, and insomnia.

Source: www.theguardian.com

Scientists Uncover the (Surprising) Creepy Reason Humans First Domesticated Cats

Recent studies have uncovered new insights regarding the timing and locations of cat domestication. Contrary to the belief that these early felines were simply pampered companions or helpful pest eliminators, it appears they may have primarily been bred for mass sacrificial purposes.

Historically, it was thought that the domestication of cats began over 9,000 years ago, as wildcats started to adapt to the first agricultural settlements.

As grain stores attracted rodents, African wildcats (Felis lybica) began hunting these pests, fostering a mutually beneficial relationship that ultimately led to domestication.

However, this model is now being rigorously examined. “North African wildcats, the wild ancestors of domestic cats, were believed to have been tamed during the Neolithic era,” Dr. Sean Doherty, an archaeological scientist at the University of Exeter and lead author of the study, told BBC Science Focus.

“Our research challenges this narrative by reviewing existing osteological, genetic, and iconographic evidence. We propose that cat domestication actually began in Egypt around the first millennium BC.”

Dr. Doherty’s team reassessed ancient cat remains from archaeological sites across Europe and North Africa using zooarchaeological analysis, genetics, and radiocarbon dating. They found that bones from early agricultural villages in Cyprus, dating back some 9,500 years, closely resembled those of wildcats, undermining prior assumptions of early domestication.

Some misconceptions stem from the small size of cat bones, which can migrate between soil layers over time. “We employed radiocarbon dating to verify the ages, revealing that many cat remains are significantly more recent than previously believed.”

This data implies that the domestication of cats actually occurred much later than previously thought.

Millions of cats were sacrificed and mummified in ancient Egypt from the Late Period to the Ptolemaic period (715-30 BC). – Getty

Researchers suggest that while rodent control may have played a role in domestication, religion could have been even more significant. In ancient Egypt, cats were revered as sacred to the goddess Bastet, and millions were kept for sacrificial purposes.

“The bond between domestic cats and the Egyptian goddess Bastet peaked in the first millennium BC,” Dr. Doherty noted. “Millions of mummified cats have been discovered in temples dedicated to her. During the Victorian era, these remains were often exhumed and transported to England for use as fertilizer.”

Through the breeding of vast numbers of kittens for ritualistic sacrifice, traits that made them more manageable may have gradually been selected, leading to the emergence of the domestic cat.

A second genome-related study, co-authored by Dr. Doherty, analyzed 87 ancient and modern cat genomes, finding no evidence that domestic cats migrated to Europe with Neolithic farmers. Instead, they likely arrived within the last 2,000 years from North Africa.

“I think this illustrates that the bond between humans and cats is not necessarily a result of the length of time they have been together, unlike with dogs,” Dr. Doherty stated.

Considering the motivations behind their early domestication, it is perhaps no surprise that cats retain such an otherworldly air.

Both studies are preliminary and are currently awaiting formal peer review.

About our experts

Sean Doherty is a senior researcher at the University of Exeter. His expertise lies in exploring deep time and animal-environment interactions through the integration of faunal remains and biomolecules (isotope analysis, proteomics, and genetics), alongside historical and anthropological studies.

Source: www.sciencefocus.com

Scientists Discover That Wild Killer Whales Occasionally Share Food with Humans

In a paper published online in the Journal of Comparative Psychology, marine biologists detail 34 interactions over the last 20 years in which killer whales (Orcinus orca) offered food to humans. The incidents were reported in oceans worldwide, from California to New Zealand, Norway, and Patagonia. The killer whales offered items spanning 18 species: six fish, five mammals, three invertebrates, two birds, one reptile, and one type of seaweed. The authors propose that these occurrences may illustrate a form of generalized altruism.

Cases presented by Towers et al., in which wild killer whales (Orcinus orca) used prey and other items to initiate interactions with humans. Image credits: Towers et al., doi: 10.1037/com0000422.

Altruistic behaviors, such as sharing prey, are prosocial acts that can foster various forms of reciprocity.

Such relationship dynamics are thought to have laid the groundwork for the cognitive development underlying the evolution of social norms in species like primates and dolphins.

Although some populations of these species benefit from interactions with other mammals, documented cases of wild animals actively provisioning humans are exceedingly rare.

“Orcas frequently share food with one another. It’s a prosocial behavior that helps them build relationships,” remarked Dr. Jared Towers, a researcher at Bay Cetology.

“We are also intrigued by what they choose to share with people and how that relates to us.”

Dr. Towers and his collaborators, Dr. Ingrid Visser from the Orca Research Trust and Dr. Vanessa Prigollini from the Marine Education Association, collected and analyzed data on the 34 food distribution events they and others encountered.

In 11 cases, people were in the water when killer whales approached them. In 21 instances, they were on boats, and in two cases, they were on the shore.

Some instances were recorded through videos and photos, while others were documented via interviews with researchers.

All incidents had to meet strict criteria for inclusion in the analysis; each case required the whale to approach humans and present items voluntarily.

In every instance except one, the killer whales waited to observe how people reacted after the food was offered, and in seven instances they tried multiple times to present the item after it was initially refused.

“While domesticated animals like dogs and cats occasionally share food with humans, our study provides one of the first comprehensive accounts of such behaviors in non-domesticated species,” the scientists stated.

“This behavior makes sense, as killer whales are intelligent, social creatures that use food sharing to form relationships with both kin and non-kin.”

“They often hunt prey much larger than themselves.”

“By offering items to humans, killer whales gain chances to practice learned cultural behaviors, to explore and play, and in doing so to learn about, manipulate, or build relationships with us.”

“We believe that the cognitive abilities and social nature of these animals can help explain these behaviors.”

____

J.R. Towers et al. 2025. Testing the waters: Attempts by wild killer whales (Orcinus orca) to provision people (Homo sapiens). Journal of Comparative Psychology, in press; doi: 10.1037/com0000422

Source: www.sci.news

Qantas Incident Highlights That One Phone Can Exploit the Weakest Link in Cybersecurity: Humans

A phone call may be coming your way. This week it was revealed that cybercriminals stole personal data from as many as 6 million Qantas customers after breaching an offshore IT call centre and accessing a third-party system.

The incident adds to a troubling run of cyberattacks on major Australian corporations, following breaches that exposed the personal information of millions at Optus and Medibank and, more recently, attacks on Australia’s $4 trillion superannuation sector.

The attack on Qantas follows recent targeting of the airline sector by a group known as Scattered Spider, which employs social engineering techniques to manipulate employees and contractors into granting access, often bypassing multifactor authentication.

New technology brings old methods

Although companies can implement the latest software updates and safeguard their systems, hackers continue to exploit social engineering tactics, often targeting the weakest link: human behavior.

Social engineering is not a new concept; it revolves around tricking individuals into revealing sensitive information, predating the internet.

Phishing is the most common manifestation of social engineering, crafted to appear legitimate to lure unsuspecting users into divulging credentials.

The telephone variation, known as vishing, presents a greater challenge for attackers as they must employ persuasive tactics over the phone to manipulate employees into providing sensitive information.

The emergence of user-friendly artificial intelligence tools, including voice cloning, has made such attacks even simpler for cybercriminals.

The latest report from Australia’s information commissioner, covering the latter half of 2024, indicated a significant rise in complaints about social engineering attacks, particularly within government agencies and the finance and health sectors.

Qantas’ breach involved compromised details such as names, email addresses, phone numbers, birth dates, and frequent flyer numbers. While such breaches might not directly lead to financial theft, the growing number of incidents in Australia enables hackers to aggregate stolen data and target new, vulnerable entities.

Data breaches lead to more data breaches

In April, Australia’s superannuation funds acknowledged the risk of hackers using credentials harvested from previous breaches to gain access to superannuation accounts, a tactic termed “credential stuffing.”

Fortunately, only a small number of customers incurred losses, totaling around $500,000. For fund holders yet to reach retirement age, however, such losses can be significant.

The Albanese government has been cautioned that the attack signals broader risks within the financial sector. In a recent advisory provided to the incoming government, released under freedom of information laws, the Australian Prudential Regulation Authority (APRA) warned that superannuation assets are susceptible to cyber threats.

“The prevalence and frequency of cyberattacks on large pension funds reinforce the necessity for enhancing our capabilities in managing both cyber and operational risks,” stated APRA.

“Despite only a small number of accounts reporting fraudulent withdrawals, it highlights the need for the sector to mature its cybersecurity and operational resilience.”

“As the sector expands and more members retire, its continuity, and its increasing interconnectedness with the banking sector, become crucial.”

APRA cautioned the industry in 2023 about the critical importance of multifactor authentication, yet some funds had still not implemented it by the time of the April breach.

Regulators noted that there is an ongoing wave of cyberattacks targeting banking and insurance sectors, necessitating continuous testing of their defenses against emerging threats.

Who is at the most risk?

According to Craig Searle, global leader in cyber advisory at Trustwave, healthcare, finance, technology, and critical infrastructure sectors such as telecommunications are particularly vulnerable to cyber threats.

“The technology sector is especially at risk due to its pivotal role in digital infrastructure and interconnected supply chains,” he explained. “Recent high-profile supply chain attacks demonstrate how breaches of a single tech provider can ripple through to hundreds or thousands of downstream clients.”

“Overall, the sectors facing the highest risks are those that manage valuable data, maintain complex supply chains, and deliver critical services.”

Searle noted that attackers intentionally target third-party systems and outsourced IT support, which presents significant risks for large corporations, as exemplified by the breaches at Qantas.

“The interconnected dynamics of the digital supply chain can lead to vulnerabilities among partners or contractors, creating a ripple effect that compromises sensitive data far beyond the initial breach,” he remarked.

Christiaan Beek, senior director of threat analytics at Rapid7, highlighted that third-party systems are now integral to the operations of many organizations and have thus become prime targets for cybercriminals.

“Organizations must apply adequate levels of due diligence when evaluating the security protocols of these third-party systems to mitigate the risk of data being compromised.”

Searle emphasized the necessity for organizations to adopt a proactive cybersecurity posture, swiftly applying software patches and establishing robust access controls, such as multifactor authentication.

Beek echoed that organizations need to be proactive, insisting that executive leadership must take responsibility for cybersecurity and board oversight.

“The new tactics utilized by modern cybercrime groups extend beyond standard security management protocols,” he warned. “These unconventional approaches compel us to rethink the typical defensive strategies, especially regarding social engineering tactics and how we counter them.”

Source: www.theguardian.com

Why Did Ancient Humans Evolve Language Just Once?

My child is extraordinary. He enters the kitchen, glances at me, and articulates enchanting words: “Could I please have a cheese and tomato sandwich?” Moments later, that very snack materializes in front of him.

Other young animals express their hunger through sounds and murmurs, but only humans possess advanced grammar and vocabulary systems that enable precise communication.

Research into animal behavior reveals that other creatures exhibit many traits once thought exclusive to humans: from culture to emotional depth, and even aspects of morality. Language, though, may genuinely set us apart. “I believe language gives us a unique status as a species,” says Brian Relch from the University of North Carolina, Chapel Hill.

Given this context, one critical area of research focuses on how language originated and why it evolved solely within our human lineage.

Psychologist Shimon Edelman from Cornell University proposes in The Magical Power of Language that there is a straightforward evolutionary rationale. Alongside his colleague Oren Kolodny, now at the Hebrew University of Jerusalem, he theorizes that the origins of language may date back approximately 1.7 million years, coinciding with early humans developing the ability to create stone tools, a skill beyond the capabilities of non-human animals.

The notion is that tool-making locations functioned as learning environments, where novice tool creators required guidance from experienced individuals. Proto-language may have developed as a way for mentors to instruct their students, possibly explaining why both language and tool-making appear to necessitate cognitive structures that organize thoughts in a coherent sequence.

However, around a decade ago, a pivotal experiment questioned this narrative. In 2014, Shelby Putt from Illinois State University and her team investigated how individuals learn to create stone tools, exposing 24 volunteers either to verbal instruction from an expert or to silent demonstration alone. Surprisingly, both approaches proved effective, indicating that intricate tool-making may not rely on verbal language.

This does not imply that Putt views language and tool-making as entirely disconnected. She posits that creating complex tools required individuals to structure their thoughts and organize them to achieve their task. She asserts that this ability led to an expansion of brain regions associated with working memory, enabling easier mental manipulation of concepts.

Nonetheless, Putt suggests that humans later co-opted these cognitive frameworks to devise language, enhancing communication and potentially increasing survival odds.

All these scenarios presume that language functions fundamentally as a communication tool among individuals. However, an alternative perspective on the evolution of language emphasizes the ways it aids individuals in organizing their thoughts when confronted with complex tasks.

Some, including prominent linguist Noam Chomsky, argue that this may have driven language evolution, suggesting it had no relation to tool-making. These researchers propose that language emerged approximately 70,000 years ago, possibly due to random genetic mutations that reconfigured brain circuitry.

Ultimately, the origins of language remain a subject of debate. If Chomsky and his associates are correct, the development of language was less about magic and more about fortunate circumstances.

Source: www.newscientist.com

Can Humans Thrive Beneath the Waves? Exploring a Live Underwater Experiment

In recent years, the desire to establish human colonies beyond Earth, whether to escape environmental issues or explore uncharted territories, has gained significant traction.

While much attention is given to proposed bases on the Moon and Mars, there’s a more challenging and lesser-known frontier much closer to home: the ocean’s depths.

This concept isn’t new. Since the 1960s, with pioneers like French oceanographer Jacques Cousteau, people have built underwater habitats and spent extended periods living in them.

NASA has been sending teams to the Aquarius Reef Base since 2001. This research facility, located 20 meters (around 65 feet) underwater off the Florida coast, has allowed scientists, engineers, and future astronauts to live in the module for 7 to 14 days.

With advancements in technology, prolonged underwater stays may become feasible. The UK company Deep is leveraging this technology to design habitats for extended underwater living. But is technology the only challenge we face?

Above the Atmosphere, Under the Sea

Humans are quite vulnerable. We struggle without oxygen or sunlight and are not fond of extreme pressure changes. Thus, we might not be the best candidates for life at the ocean floor.

This doesn’t imply that we can’t thrive in inhospitable environments.

Since 2000, astronauts have spent significant periods aboard the International Space Station (ISS).

Several astronauts have been documented living on the ISS for over 300 consecutive days, but Valeri Polyakov holds the record, having spent 437 days aboard Russia’s Mir space station between 1994 and 1995.

Moreover, astronauts returning from lengthy missions often face health issues, such as reduced bone density and muscle atrophy. What does this mean for those who aim to live underwater?

The longest stay yet studied is that of Rudiger Koch, a German aerospace engineer who lived in a capsule submerged 11 meters (36 feet) beneath the Caribbean Sea for 120 days between 2024 and 2025.

Rudiger Koch on the balcony of the capsule where he lived between 2024 and 2025.

Koch emerged reporting no health issues, celebrating with champagne and a cigar.

In second place is Professor Joseph Dituri, who spent 100 days studying the physical and psychological effects of living underwater in a lodge situated at the bottom of a 9-meter deep (30-foot) lagoon in Florida.

Dituri conducted daily tests during his time submerged and following his return to the surface. Notably, aside from minor setbacks, he felt quite well.

He noted improvements in sleep quality, cholesterol levels, and inflammation. His stem cell count, testosterone levels, and cognitive performance also improved.

Interestingly, Dituri appeared to have lowered his biological age (an indicator of the aging process of the body), although he was recorded as having shrunk by over 1 cm (approximately 0.5 inches) due to the pressurized environment inside the lodge.

A Step Towards Living Underwater

With limited data, we still have a tenuous understanding of life in aquatic environments. This is where Deep comes in.

The ocean technology and exploration company aims to develop two habitats by 2027, with the goal of establishing a permanent underwater presence. They are using a submerged quarry in Gloucestershire as a testing ground for their underwater habitats.

Deep is developing two habitat models: Vanguard, designed for three-person short stays, and Sentinel, a 16-meter (52-foot) capsule intended as a long-term habitat complete with living quarters, bedrooms, and research facilities, capable of accommodating researchers at depths of up to 200 meters (656 feet) for 28 days.

The aim is to enable researchers to remain submerged for extended periods, allowing for comprehensive studies of underwater living impacts and marine life. However, achieving these depths poses significant challenges.

“The most hazardous aspects of diving occur during descent and ascent,” explains Dr. Dawn Kernagis, Deep’s scientific research director. “Divers breathe compressed gas, and fluctuating pressure increases the risk of decompression sickness (DCS), where gas bubbles form in the bloodstream.”

While most DCS cases are mild, severe instances can impact the brain, spinal cord, respiratory system, and circulatory systems.

To mitigate these risks, Deep aims to keep researchers “saturated” in the Sentinel habitats. This means reaching a new equilibrium with the underwater environment: the divers’ tissues fully absorb dissolved gas at the ambient pressure, so additional time at depth adds no further decompression penalty.

“Saturation habitats like ours make it possible to dive to greater depths and adjust to the pressure, enabling much longer stays, ranging from hours to about a month,” states Kernagis.
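
To see why changes in depth are the dangerous part, a rough rule of thumb helps: ambient pressure rises by about one atmosphere for every 10 meters of seawater. The sketch below is our own back-of-the-envelope arithmetic, not Deep’s figures, using the depths mentioned in this article.

    def ambient_pressure_atm(depth_m):
        # ~1 atm of air at the surface, plus ~1 atm per 10 m of seawater
        return 1.0 + depth_m / 10.0

    # Koch's capsule (11 m), Aquarius (20 m), Sentinel's maximum depth (200 m)
    for depth in (11, 20, 200):
        print(f"{depth:>4} m -> ~{ambient_pressure_atm(depth):.1f} atm")

At Sentinel’s maximum depth, a crew’s tissues would equilibrate at roughly 21 atmospheres, which is why a saturated crew can stay for weeks but must decompress slowly, once, at the end of the stay.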

Deep plans for close monitoring of researchers during their stays to better understand the long-term physical and psychological effects of deep-sea living.

The foundation laid now may support future inhabitants underwater for weeks, months, or even years. In the not-so-distant future, some of us may find ourselves living in a modern-day Atlantis.

About Our Experts

Dr. Dawn Kernagis is the director of scientific research at Deep, a UK-based ocean technology and exploration firm. She has published in numerous scientific journals, including Journal of Clinical Oncology, Proceedings of the National Academy of Sciences, and Circulation.

Source: www.sciencefocus.com

As Technology Advanced, Early Humans Developed Enhanced Teaching Skills

As technology evolves, humans enhance their ability to teach skills to others

English Heritage/Heritage Images/Getty Images

Research into human evolution spanning 3 million years illustrates that advancements in communication and technology have occurred simultaneously. As early humans developed more sophisticated stone tools and various techniques, they also refined their abilities to communicate and educate the next generation on these new skills.

“We now have a scenario for the evolution of modes of cultural transmission throughout human history,” states Francesco d’Errico from the University of Bordeaux, France. “It seems there’s a co-evolution between the complexity of cultural traits and the complexity of their transmission methods.”

A defining characteristic of humanity is the progression toward more complex tools and behaviors. For instance, ancient humans crafted sharp stones for cutting or stabbing and affixed them to wooden shafts to create spears.

Crucially, the ability to instruct others in these skills is vital. For more intricate tasks like playing the violin or coding, extensive education and practice are typically necessary. However, in prehistoric times, the capacity for effective communication was limited, particularly before intricate languages emerged.

Ivan Colagè from the Pontifical University of the Holy Cross in Rome, along with d’Errico, investigated how the transmission of cultural information has evolved over the last 3.3 million years, alongside changes in behavior and technology. They examined 103 cultural traits, such as specific types of stone tools, decorative items like beads, and burial customs, documenting the first appearance of each trait in the archaeological record as an indicator of when it became common practice.

The researchers assessed the complexity involved in learning each trait. Some simple tools, like stone hammers, require minimal instruction. “They don’t need much explanation,” d’Errico notes. More advanced tools, in contrast, must be demonstrated, and the most intricate behaviors, such as deeply symbolic burials, demand explicit verbal explanation.

To analyze this, D’Errico and Colagè outlined three dimensions of learning: First, spatial proximity—can tasks be learned from a distance, or does one need to be physically present? Second, temporality—does one brief lesson suffice, or are multiple sessions necessary, perhaps emphasizing various steps? Third, the social aspect—who learns from whom?

They rated each trait along these dimensions and asked a panel of 24 experts to do the same; the panel’s consensus reinforced their assessments. “I believe the conclusion is quite robust,” says d’Errico.

The analysis indicates two significant shifts in cultural transmission. The first occurred around 600,000 years ago, when early humans began teaching one another, likely without relying on spoken language; gestures may have sufficed. This predates the emergence of our species, Homo sapiens, and aligns with the advent of hafting.

The second shift happened between 200,000 and 100,000 years ago, coinciding with the development of modern languages, which became essential for performing complex tasks like burials. “These actions involve many detailed steps, requiring explanation,” D’Errico explains.

“The relationship between cultural communication and cultural complexity is strong,” asserts Ceri Shipton from University College London. He emphasizes that while the timeline for language development remains uncertain, this new estimate provides a “reasonable timeframe.”

Source: www.newscientist.com

Research Uncovers That Humans Are Seasonal Beings

Biological processes such as sleep, heart rate, and metabolism are regulated by the circadian clock found in nearly every cell in the human body. However, modern lifestyles challenge this natural timing mechanism in ways for which we are not well-suited. Factors like industrialization, shift work, artificial lighting, and smartphone usage significantly impact our sleep and circadian rhythms. A recent study from the University of Michigan reveals that, even so, our circadian rhythms continue to align with seasonal changes in sunlight. The results were published in the journal npj Digital Medicine.

Kim et al. propose that substantial individual differences in shift work adaptation, which are vital for shift workers’ health, can be explained in part by the biological mechanisms of seasonal timing. Image credit: Sasin Tipchai.

“We may not want to admit it in today’s world, but humans are inherently seasonal,” stated Dr. Ruby Kim, the study’s lead author.

“The duration of daylight and the sunlight we receive significantly influence our physiology.”

“Our research demonstrates that the timing of biologically significant seasons plays a role in how individuals adapt to changes in their daily routines.”

“These findings could lead to new avenues for investigating and understanding seasonal affective disorders, a form of depression linked to seasonal variations.”

“It could also point to new areas of exploration regarding a range of health issues related to sleep schedules and alignment with circadian rhythms.”

“This work holds great promise for future discoveries, potentially impacting metabolic and cardiovascular health as well as mental health conditions such as mood disorders and anxiety.”

The study also indicated that humans possess a seasonal genetic component, which might explain the significant differences in how individuals are impacted by variations in daylight.

“Some individuals may adapt better, while others might fare much worse,” remarked Professor Daniel Forger, a senior author of the study.

Investigating this genetic component could help researchers and healthcare providers identify where an individual falls on the adaptation spectrum, although achieving this will require more time and effort.

For now, this study serves as an important first step in reshaping our understanding of human circadian rhythms.

“Many people tend to perceive their circadian rhythm as a singular entity,” explained Professor Forger.

“Our findings indicate that it’s not one clock, but rather two.”

“One clock tracks dawn, and the other tracks dusk. They communicate with each other.”
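
The two-clock idea can be illustrated with a toy model: two oscillators, one pulled toward dawn and one toward dusk, weakly coupled to each other. This is our own Python sketch with made-up coupling strengths, not the study’s model.

    def pull(phase, target, strength):
        # Nudge a clock phase (hours, mod 24) part of the way toward a target,
        # taking the shortest way around the 24-hour circle.
        diff = (target - phase + 12) % 24 - 12
        return (phase + strength * diff) % 24

    def settle(dawn, dusk, days=60):
        m, e = 0.0, 12.0                # arbitrary starting phases
        for _ in range(days):
            m = pull(m, dawn, 0.3)      # the morning clock tracks dawn
            e = pull(e, dusk, 0.3)      # the evening clock tracks dusk
            m = pull(m, e, 0.1)         # weak mutual coupling: the clocks
            e = pull(e, m, 0.1)         # "communicate with each other"
        return m, e

    print(settle(dawn=6.0, dusk=21.0))  # long summer day: clocks sit far apart
    print(settle(dawn=8.0, dusk=17.0))  # short winter day: clocks pull together

In this toy setup the only seasonal input is the dawn and dusk times, yet the spacing between the two settled clocks changes with the season, loosely analogous to the seasonal signature the study detected in sleep data.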

The researchers uncovered the seasonal dependence of circadian rhythms by analyzing sleep data collected from thousands of participants using wearable health technology like Fitbits.

All participants were medical interns involved in a one-year internship as part of a healthcare study funded by the National Institutes of Health.

Interns are shift workers whose schedules frequently change, which also changes their sleep patterns.

Moreover, these schedules often run counter to the natural day-night cycle.

That the circadian rhythms of even this group demonstrated seasonal dependence is a strong indicator of how deeply ingrained this feature is in humans.

“It makes a lot of practical sense. Our brain physiology has been attuned to track dusk and dawn for millions of years,” stated Professor Forger.

“Then industrialization came along in an evolutionary blink, and we’re still trying to catch up.”

Participants in the healthcare study also provided saliva samples for DNA analysis, enabling researchers to include genetic factors in their evaluations.

Previous genetic studies have identified specific genes involved in how circadian clocks in various animals respond to seasonal changes.

Since humans share these genes, the authors could pinpoint a smaller group of interns with slight variations in their genetic makeup.

For this group, shift work was more disruptive due to the misalignment between seasonal circadian rhythms and their sleep schedules.

This leads to many questions, particularly regarding the health implications and how shift work affects different individuals.

However, these are questions researchers will seek to investigate further in the future.

____

R. Kim et al. 2025. Seasonal timing and individual differences in shift work adaptation. npj Digital Medicine 8, 300; doi: 10.1038/s41746-025-01678-z

Source: www.sci.news

Study Suggests All Humans Emit Subtle Light Until Death

All living beings, including you, emit a subtle, barely perceptible light that continues until death. Recent research supports this idea.

This mysterious luminescence might lead one to believe it is an indication of an aura or something similar.

However, Dr. Daniel Oblak, a physicist from the University of Calgary and the study’s lead author, explained to BBC Science Focus that while the concept of an aura is metaphysical and unscientific, the emitted light is not. Known as Ultraweak Photon Emission (UPE), it is a natural byproduct of metabolism.

“I would like to emphasize that UPE usually results from biochemical processes and is thus akin to what occurs with glow sticks.”

“UPE is so faint that it is imperceptible to the human eye and can be completely obscured by other light sources unless in total darkness.”

Don’t think you can observe your own glow simply by closing the curtains and turning off the lights: this light is 1,000 to 1,000,000 times dimmer than what the human eye can detect.

These four mice emitted significantly more ultraweak photon emission (UPE) while alive (top) than after death (bottom). – Credits: Salari et al, The Journal of Physical Chemistry Letters, 2025

UPE arises when a chemical within a cell creates an unstable molecule, known as a reactive oxygen species (ROS), which is essentially a byproduct of metabolic activity.

As ROS levels increase, other molecules become “excited,” meaning they carry excess energy, and it is this energy that emits light.

The primary factor driving this phenomenon is oxidative stress, a kind of cellular damage caused by aging and disease. The greater the oxidative stress experienced by the body, the more ROS—and consequently, more light—is produced.

“When an organism ceases to live, it halts metabolism, thereby stopping the ultraweak photon emission,” he remarked.

To investigate UPE, the scientists in Calgary measured the emissions generated by living (immobilized) and deceased mice, as well as by damaged leaves.

Using specialized cameras, they found that living mice emitted significantly more light than their deceased counterparts. Conversely, the leaves released more light in areas that were damaged compared to intact regions.

This is due to increased oxidative stress in the scratched areas. However, the dead mice did not emit light as their bodies no longer underwent metabolic processes.

Leaves glowed more brightly at sites of scratches and chemical damage. – Credits: Salari et al, The Journal of Physical Chemistry Letters, 2025

Dr. Oblak highlighted that the significance of UPE lies in its ability to provide a non-invasive method to assess the health of living organisms.

“This technology could be utilized to monitor tissue status, such as in transplants, or to gauge crop and forest health, especially regarding the stress levels in organisms,” he explained.

Nonetheless, this field remains rife with uncertainties. For instance, Oblak pondered: “Perhaps UPE is not merely a byproduct of metabolic processes; it may also serve a purpose,” although scientists have yet to reach a consensus.

About our experts

Dr. Daniel Oblak is an associate professor in the Faculty of Physics and Astronomy at the University of Calgary. He earned his PhD in Quantum Optics from the University of Copenhagen in 2010, having previously completed his Bachelor of Science and Master’s degrees at Aarhus University. Currently, his research interests encompass quantum information science, long-range encryption, quantum networks, and quantum light interfaces.

Source: www.sciencefocus.com

AI May Be More Persuasive Than Humans in Debates, Scientists Find

Artificial intelligence can do just as well as humans, if not better, when it comes to persuading others in a debate, and not merely because it cannot shout, researchers have found.

Experts have raised concerns about the implications this has, especially regarding the integrity of elections.

“If we can deploy persuasive AI on a large scale, we could envision an army of bots micro-targeting undecided voters with tailored political narratives that seem authentic,” said Francesco Salvi, a researcher at the Swiss Federal Institute of Technology in Lausanne (EPFL) and the study’s first author. He added that such influence is hard to monitor, harder to regulate, and nearly impossible to detect in real time.

“It would be surprising if malicious entities hadn’t begun using these tools to disseminate misinformation and biased propaganda,” Salvi stated.

Nonetheless, he also noted that persuasive AI could have positive effects, such as reducing conspiracy beliefs and political polarization, as well as encouraging healthier lifestyle choices.

In a report published in the journal Nature Human Behaviour, Salvi and his colleagues describe an online experiment in which 300 participants debated 300 human opponents, while another 300 were paired with GPT-4.

Each pair was assigned a topic for discussion, such as “Should students wear school uniforms?” or “Should abortion be legal?”, and each participant was randomly given a stance to argue.

Before and after the discussions, participants rated their level of agreement with the proposals.

In half of the pairs, whether human or AI, additional information about the opponents, like age, gender, ethnicity, and political affiliation, was provided.

The results from the 600 discussions showed that GPT-4 was as effective as its human counterparts at persuading others when no personal information was shared.

However, access to such personal details made the AI more convincing than humans: in debates where the two types of opponent were not equally persuasive, the personalized AI shifted participants’ opinions 64% of the time, compared with its human counterparts.
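
As a rough illustration of how such an outcome can be scored, consider the following toy example (our own sketch, not the study’s code): each record holds a participant’s agreement before and after the debate, plus the stance the opponent argued, and we count how often opinions moved toward the opponent’s side.

    # Toy debate records: (pre, post, stance, opponent), where agreement is
    # rated 1-5 and stance is +1 ("for") or -1 ("against"). Values invented.
    debates = [
        (2, 4, +1, "ai"),    (3, 3, +1, "ai"),    (4, 2, -1, "ai"),
        (2, 3, +1, "human"), (5, 5, -1, "human"), (3, 3, -1, "human"),
    ]

    def moved_toward_opponent(pre, post, stance):
        # A positive product means the rating shifted toward the opponent's side.
        return (post - pre) * stance > 0

    for kind in ("ai", "human"):
        rows = [d for d in debates if d[3] == kind]
        rate = sum(moved_toward_opponent(p, q, s) for p, q, s, _ in rows) / len(rows)
        print(f"{kind}: opinions moved toward the debater in {rate:.0%} of debates")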

Upon further investigation, the team found that AI’s persuasiveness was particularly evident on topics that elicited less strongly held views.

The researchers noted that when human participants were paired with AI, they correctly guessed their partner’s identity about 75% of the time. They also observed that the AI used a more analytical and structured style than human participants, and that not all participants were arguing a position they personally supported. However, the team cautioned that these factors did not fully account for the AI’s persuasiveness.

Rather, its effectiveness seemed to stem from its ability to tailor arguments to individual preferences.

“It’s akin to debating with someone who makes a compelling case,” Salvi remarked, noting that the impact could be even greater with more detailed personal information, such as insights derived from someone’s social media activities.

Professor Sander van der Linden, a social psychologist at Cambridge University who did not participate in the study, remarked that it reopened discussions about the potential for large-scale manipulation of public opinion through personalized conversations with language models.

He noted, however, that various studies, including his own, have shown that the persuasiveness of language models rests on analytical reasoning and the use of evidence, and that one study found personal information did not enhance ChatGPT’s persuasiveness.

Professor Michael Wooldridge, an AI researcher at Oxford University, acknowledged that such systems have beneficial applications, like health-related chatbots, but said many aspects remain concerning, including the potential for harmful groups to exploit these tools to target young people.

“As AI continues to evolve, we will witness an increasingly broad range of potential technological abuses,” he asserted. “Policymakers and regulators must act decisively to stay ahead of these threats rather than constantly playing catch-up.”

Source: www.theguardian.com

This Machine Solves the Rubik’s Cube Faster Than Most Humans!

Blink and you’ll miss it: Purdue University’s engineering students have developed a robot capable of solving a Rubik’s Cube in just 0.1 seconds.

This robot, dubbed the “Purdubik’s Cube,” set a Guinness World Record last month for the fastest robot to solve a puzzle cube, with a time of 0.103 seconds, surpassing the prior record of 0.305 seconds set by Mitsubishi Electric engineers in May 2024.

Built on Purdue’s campus in West Lafayette, Indiana, the robot uses machine vision for color recognition, custom solving algorithms optimized for speed, and industrial-grade motion-control hardware, according to a press release from Purdue University.

The Purdubik’s Cube team members Junpei Ota, Aiden Hurd, Matthew Patrohay, and Brock Berta with their robotic system, which can solve a scrambled Rubik’s Cube in 0.103 seconds.
Purdue

Formed by engineering students Junpei Ota, Aiden Hurd, Matthew Patrohay, and Brock Berta, the robot was initially created for the December 2024 Spark Challenge, organized by Purdue’s Elmore Family School of Electrical and Computer Engineering. After clinching first place, the team set out to enhance the robot with support from Purdue’s Laboratory for Control, Optimization and Networking.

The innovative Purdubik’s Cube isn’t just a novelty; high-speed robotic systems like it are already being utilized across various industries, including manufacturing and packaging.

The Rubik’s Cube first emerged as a cultural sensation in the 1980s, only to wane in popularity during the 1990s. However, a surprising revival occurred thanks to the internet, spurring the growth of speedcubing, in which participants race to solve a 3 x 3 puzzle as quickly as possible.

Today, enthusiasts frequently attend events dedicated to solving Rubik’s Cubes in numerous styles. Nonetheless, no human can match the speed of Purdue’s robot. The current world record for human solvers is held by Max Park, who completed a cube in 3.13 seconds in 2023.

Source: www.nbcnews.com

Chimpanzees Share “Fundamental Elements of Musical Rhythms” with Humans

Young Chimpanzee Drumming in Guinea

Cyril Ruoso/Naturepl.com

Musicality may have originated from a shared ancestor of chimpanzees and humans, given the similarities in their drumming techniques.

Catherine Hobaiter at the University of St Andrews and her research team analyzed 371 instances of drumming from two of the four subspecies of chimpanzee in Africa: western chimpanzees (Pan troglodytes verus) and eastern chimpanzees (Pan troglodytes schweinfurthii).

The animals drum with their hands and feet, often on the buttress roots of trees, creating rapid rhythms mainly during rest, travel, or threat displays.

Hobaiter notes that while chimpanzee drumming is commonly observed, the rainforest poses significant recording challenges, and gathering data for some populations took decades.

Ultimately, the researchers found that chimpanzees drum significantly faster than most humans. “The longest drumming event we recorded exceeded five seconds, while the shortest was less than 0.1 seconds,” notes Hobaiter. “Chimpanzees also tend to repeat these beats, especially while traveling.”

Despite the contrasts between chimpanzee and human drumming, chimpanzees exhibit some “core components of human musical rhythms,” according to team member Vesta Eleuteri from the University of Vienna.

“They employ recognizable rhythms present in various musical cultures, which contrasts with randomly played beats. These consist of hits that are evenly spaced, akin to clock ticks,” she elaborates. “Moreover, we discovered that the Eastern and Western chimpanzee subspecies, residing on different sides of Africa, exhibit distinct rhythmic patterns.”

Eleuteri explains that eastern chimpanzees alternate between short and long intervals between beats, while western chimpanzees maintain evenly spaced hits. Western chimpanzees also drum at a faster tempo, use more hits, and begin drumming earlier in their distinctive pant-hoot calls.
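
A common way to quantify such patterns in rhythm research is to take ratios of adjacent inter-onset intervals: values near 0.5 indicate evenly spaced, clock-like hits, while values near 1/3 and 2/3 indicate short-long alternation. Below is a minimal sketch of that calculation with invented hit times, not the study’s code or data.

    import numpy as np

    def rhythm_ratios(onsets):
        # r = IOI_k / (IOI_k + IOI_{k+1}) for adjacent inter-onset intervals;
        # r = 0.5 means two neighbouring intervals are equal (even spacing).
        iois = np.diff(onsets)
        return iois[:-1] / (iois[:-1] + iois[1:])

    even = np.array([0.0, 0.1, 0.2, 0.3, 0.4])         # evenly spaced hits
    alternating = np.array([0.0, 0.1, 0.3, 0.4, 0.6])  # short-long alternation
    print(rhythm_ratios(even))         # [0.5 0.5 0.5]
    print(rhythm_ratios(alternating))  # approx. [0.33 0.67 0.33]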

Miquel Llorente from the University of Girona finds the notion that different subspecies display unique drumming styles compelling. “These patterns suggest the potential for not just individual idiosyncrasies but also cultural distinctions in how groups utilize drumming as a communication tool.”

It is well understood that rhythm plays a crucial role in human social interaction, whether through music, dance, or even conversational rhythms, explains Hobaiter. “I’m not implying that chimpanzee drumming reflects the sophistication of modern human rhythms. However, this research is the first to show that we share fundamental rhythmic elements, suggesting that rhythms were intrinsic to our social environment even before we evolved into humans.”

“Previously, it was claimed that rhythmicity was exclusive to humans,” states Gisela Kaplan from the University of New England. “However, a growing body of evidence suggests this is not the case.”

Source: www.newscientist.com