New Research Indicates Australia’s First Inhabitants Were Fossil Collectors

In a recent study, Professor Mike Archer from the University of New South Wales and his team revisited the fossilized tibia (the lower leg bone) of the now-extinct giant stenurine kangaroo. These bones, discovered in Mammoth Cave in southwestern Australia around World War I, provided solid evidence that Indigenous Australians hunted large animals, a finding in which Professor Archer was involved. A 1980 study had concluded that distinctive notches in the fossilized bones indicated slaughter. However, Professor Archer is now ready to acknowledge that this initial conclusion was incorrect.



Giant animal unearthed from mammoth cave about 50,000 years ago: giant long-beaked echidna Malayanglossus hackettii, giant kangaroo Procoptodon brauneorum, giant diprotodont Zygomaturus trilobus, and possum (Thylacinus cynocephalus). Image credit: Peter Schouten.

“As a scientist, updating the record as new evidence emerges is both my duty and responsibility,” Professor Archer stated.

“In 1980, we interpreted those cuts as signs of slaughter based on the best conclusions we could reach with the tools available to us then.”

“With advancements in technology, we now understand that our original interpretation was incorrect.”

“After the 1960s, there was a significant debate about whether Aboriginal peoples coexisted with Australia’s prehistoric megafauna or contributed to their extinction.”

“Many believed the incisions in the bones were made by humans using tools, suggesting that the extinction of megafauna and the arrival of humans approximately 65,000 years ago were not coincidental.”

“For decades, the bones from Mammoth Cave were seen as the ‘smoking gun’ indicating that Indigenous Australians hunted giant animals, but with that evidence dispelled, the discussion on megafauna extinction is now reopened, and the role of humans is more ambiguous than ever.”

To reexamine the same dissected stenurine leg bone, Professor Archer and his co-authors utilized advanced 3D scanning technology to analyze the bone without causing any damage.

They also employed modern radiometric dating methods to accurately determine the age of the bones and their cut surfaces while conducting detailed microscopic examinations.

Their findings indicated that the cuts were made after the bone had dried and cracked, suggesting the bones were likely already fossilized when the incisions occurred.

Paleontologists also investigated a fossilized tooth given to archaeologist Kim Ackerman by a Wora man from the Mowanjum mission, who had collaborated with Indigenous communities in the Kimberley during the 1960s.

This tooth, belonging to the Zygomaturus trilobus, a species of giant marsupial related to wombats, was part of Australia’s Pleistocene megafauna.

The tooth was retrieved from the Kimberley in northwestern Australia, and its characteristics closely matched other fossils found in Mammoth Cave in southwestern Australia.

Dr. Kenny Trabouillon from the Western Australian Museum remarked, “The discovery of this tooth in the Kimberley, far from its likely origin in Mammoth Cave, implies it may have been transported or traded by humans across great distances.”

“This suggests that cultural appreciation and symbolic usage of fossils existed long before the advent of European science.”

“The First Peoples might have been the continent’s, and possibly the world’s, earliest paleontologists.”

Researchers haven’t entirely dismissed the possibility of Aboriginal people having hunted Australia’s megafauna.

However, without concrete evidence, we cannot definitively assert that Indigenous Australians caused its extinction.

“While these remain hypotheses, we need substantiated proof before concluding that predation by Indigenous peoples contributed to the extinction of now-vanished megafauna, especially considering the long history of Indigenous peoples respecting and sustainably utilizing Australia’s wildlife,” Professor Archer stated.

“If humans were truly responsible for the unsustainable hunting of Australia’s megafauna, we would expect to find much more evidence of such hunting in the fossil record. Instead, the only solid evidence we had until now was this single bone, which now shows strong indications that the mutilations occurred post-mortem.”

If humans were not solely accountable for the extinction of Australia’s ancient megafauna, then what was?

Researchers indicate that many megafauna species went extinct long before humans arrived, and while some coexisted with humans for millennia, their decline often aligned with significant climate changes.

“What we can ascertain is that the First Peoples were the first in Australia to exhibit a keen interest in and collect fossils, likely thousands of years before Europeans arrived on the continent,” the researchers affirmed.

Their paper was published in the journal Royal Society Open Science.

_____

Michael Archer et al. 2025. Australia’s first people: hunters of extinct megafauna or Australia’s first fossil collectors. R. Soc. Open Science 12(10):250078; doi: 10.1098/rsos.250078

Source: www.sci.news

Research Indicates Humans Evolved from Ape-Like Ancestors in Africa

A recent investigation conducted by paleoanthropologists from the United States and Canada has focused on the morphology of the hominid talus, a significant bone in the ankle that connects to the tibia and calcaneus of the foot. Ardipithecus ramidus, a hominid species that existed in eastern Africa approximately 4.4 million years ago, was at the center of this study. The researchers discovered that the fossil exhibits similarities to the talus of chimpanzees and gorillas, which are adapted for vertical climbing and terrestrial quadrupedal locomotion—a form of movement where animals traverse on all fours with the entire sole of the foot touching the ground, including the heel. Additionally, the authors confirmed the presence of derived features in the specimen that align with earlier suggestions for improved extrusion mechanisms in the legs of Ardipithecus ramidus.

Ardipithecus ramidus, a hominid that existed in Africa over 4 million years ago. Illustration by Arturo Asensio, from Quo.es.

Partial skeleton from 4.4 million years ago, Ardipithecus ramidus, affectionately dubbed “Aldi,” was uncovered in 1994.

This species featured an ape-sized brain and had grasping big toes adapted for climbing trees.

It walked on two legs, and its upper canine teeth were diamond-shaped as opposed to the V-shape commonly found in chimpanzees.

“Aldi represents one of the oldest and most complete skeletons discovered,” remarked Dr. Thomas (Cody) Plan, a researcher at Washington University in St. Louis.

“Aldi is roughly a million years older than ‘Lucy’, another renowned early human ancestor, and signifies an early phase in human evolution.”

“Oneof the surprising aspects of this find was that, despite walking upright, Aldi retained many monkey-like characteristics, such as its grasping feet.”

“Great apes, including chimpanzees and gorillas, possess forked big toes that facilitate gripping tree branches while climbing.”

“However, it also exhibited traits consistent with our lineage. Ardipithecus truly represents a transitional species.”

Initially, scientists speculated that Ardi’s locomotion resembled a common form rather than being typical of African apes, leading them to conclude that this early human ancestor was not particularly ape-like, which startled the paleoanthropology community.

“From their analysis, they inferred that contemporary African apes, like chimpanzees and gorillas, represent a dead end, or a kind of evolutionary cul-de-sac. Dead end underscores the evolutionary process rather than the point at which humans emerged,” stated Dr. Puran.

“Instead, they posited that Ardi offered evidence of a more generalized ancestry that was less akin to chimpanzees and gorillas.”

By examining the ankles of chimpanzees and gorillas, researchers can gain insights into their movement, especially regarding their vertical tree climbing techniques.

This crucial bone also sheds light on how early species transitioned to bipedalism.

For the recent study, Dr. Plan and his team compared Ardi’s ankles to those of great apes, monkeys, and early humans.

Their findings indicated that Ardi’s ankle is the only one within the primate fossil record that shares similarities with African apes.

These apes are recognized for their adaptations to vertical climbing and terrestrial quadrupedal locomotion, suggesting that Ardi might have utilized their feet similarly.

Alongside these primitive traits, Ardi’s talus exhibited signs of an enhanced foot extrusion mechanism.

This complexity points to a blend of climbing and locomotor behaviors in this early human species, which is crucial in understanding the evolution of bipedalism.

“This discovery is both controversial and aligns with earlier theories,” Mr. Pran noted.

“While there is no disagreement regarding the significance of Aldi’s find, many in the field would argue that the initial interpretation was likely flawed.”

“Thus, this paper represents a reevaluation of the original views that distanced Aldi from chimpanzees and gorillas.”

“It’s vital to understand that our paper does not claim that humans evolved from chimpanzees.”

“However, this study further supports the hypothesis that the common ancestor of humans and chimpanzees was likely very similar to today’s chimpanzees.”

For more details, refer to the paper published in the journal Communication Biology.

_____

TC Plan et al. 2025. Ardipithecus ramidus Ankle provides evidence of African ape-like vertical climbing in early humans. Commun. Biol. August 1454. doi: 10.1038/s42003-025-08711-7

Source: www.sci.news

Trio Awarded Nobel Prize in Economics for Research on Growth Fueled by Technology

This year’s Nobel Prize in Economics has been awarded to three experts who explore the influence of technology on economic growth.

Joel Mokyr from Northwestern University receives half of the prize, amounting to 11 million Swedish kronor (£867,000), while the remaining portion is shared between Philippe Aghion from the Collège de France, INSEAD Business School, and the London School of Economics, alongside Peter Howitt from Brown University.

The Royal Swedish Academy of Sciences announced this award during a period marked by rapid advancements in artificial intelligence and ongoing discussions about its societal implications, stating that the trio laid the groundwork for understanding “economic growth through innovation.”


This accolade comes at a time when nations worldwide are striving to rejuvenate economic growth, which has faced stagnation since the 2008 financial crisis, with rising concerns about sluggish productivity, slow improvements in living standards, and heightened political tensions.

Aghion has cautioned that “dark clouds” are forming amid President Donald Trump’s trade war, which heightens trade barriers. He emphasized that fostering innovation in green industries and curbing the rise of major tech monopolies are crucial for sustaining growth in the future.

“We cannot support the wave of protectionism in the United States, as it hinders global growth and innovation,” he noted.

While accepting the award, he pointed out that AI holds “tremendous growth potential” but urged governments to implement stringent competition policies to handle the growth of emerging tech firms. “A few leading companies may end up monopolizing the field, stifling new entrants and innovation. How can we ensure that today’s innovators do not hinder future advancements?”

The awards committee indicated that technological advancements have fueled continuous economic growth for the last two centuries, yet cautioned that further progress cannot be assumed.

Mokyr, a Dutch-born Israeli-American economic historian, was recognized for his research on the prerequisites for sustained growth driven by technological progress. Aghion and Howitt were honored for their examination of how “creative destruction” is pivotal for fostering growth.

Skip past newsletter promotions

“We must safeguard the core mechanisms of creative destruction to prevent sliding back into stagnation,” remarked John Hassler, chairman of the Economics Prize.

Established in the 1960s, the professional National Bank of Sweden awarded the Economics Prize in memory of Alfred Nobel.

Source: www.theguardian.com

New Research Indicates the Far Side of the Moon is Colder than Its Near Side

The stark differences in proximity and width between the moon’s near and far sides, along with their topography, volcanism, and crustal structures, offer crucial insights into the moon’s formation and evolution. However, investigations into the mechanisms behind this hemispherical asymmetry have been constrained by the absence of far-side samples. A recent study revealed fragments of rock and soil collected by China’s Chang’e 6 spacecraft from a large crater on the moon last year. Researchers confirmed that these rock samples are approximately 2.8 billion years old, analyzed the chemical composition of the minerals, and estimated that they were formed from lava deep within the moon at temperatures around 1,100 degrees Celsius. Survey results were published in the journal Natural Earth Science.



A global map of Albedo from a 750 nm filter on a UV-VIS camera mounted on NASA’s Clementine spacecraft. This image shows the near and far side of Lambert’s moon, and is an equal area projection. Image credit: NASA.

“The near and far sides of the moon differ significantly, both on the surface and potentially in their internal structures,” said Professor Yang Lee, a researcher at the University of London.

“This is one of the moon’s great mysteries. We refer to it as the two-sided moon. While variations in temperature between the near and far sides have long been theorized, our research presents the first evidence derived from actual samples.”

“These discoveries bring us closer to understanding the moon’s dual nature,” stated PhD candidate Xuelin Zhu from Peking University.

“They indicate that the disparities between the two sides extend beyond the surface, reaching deep within the moon.”

In this research, the authors examined 300 grams of lunar soil assigned to the Beijing Institute of Uranium Geology.

“This sample represents the first collection by the Chang’e 6 mission from across the moon,” commented Dr. Sheng, a researcher at the same institute.

The researchers found the samples were primarily composed of basalt particles and utilized electron probes to map specific areas of the sample, determining their composition.

They analyzed variations in lead isotopes dating back 2.8 billion years.

Several techniques were employed to estimate the sample temperatures at different stages in the moon’s past.

The first method involved analyzing mineral composition and comparing it with computer simulations to estimate the formation temperatures of the rocks.

This was juxtaposed with similar estimates for rocks from the near side, revealing a temperature difference of approximately 100 degrees Celsius.

The second technique delved further into the sample’s history, inferring from its chemical composition to ascertain the heat of the “parent rock” and comparing it with estimates of lunar samples obtained during the Apollo missions.

Once again, a Celsius difference of about 100 degrees was identified.

Due to the limited samples returned, they estimated the parent rock temperature using satellite data from the Chang’e landing sites on both sides, comparing this with similar data from nearby areas, which revealed a difference of 70 degrees Celsius.

On the moon, thermogenic elements like uranium, thorium, and potassium are often found alongside phosphorus and rare earth elements within a material referred to as KREEP (an acronym for potassium (K), rare earth element (REE), and phosphorus (P)).

The leading theory regarding the moon’s origin posits that it formed from debris resulting from a large-scale collision between Earth and a Mars-sized protoplanet, developing from primarily molten rock.

This magma solidified as it cooled, but KREEP elements were compatible with the forming crystals and remained within the magma for extended periods.

Scientists anticipate that KREEP material would be evenly distributed across the moon. In reality, it appears to be concentrated in the near side’s mantle.

The distribution of these elements may explain why the near side exhibited more volcanic activity.

While the current mantle temperatures on the far and near sides of the moon remain unknown due to this study, the temperature imbalances are likely to persist for a considerable duration, as the moon cools very slowly since its formation from a catastrophic impact.

Scientists aim to provide definitive answers to these questions in ongoing research.

____

she et al. Chang’e-6 basalt and relatively cool moon facid mantle inferred from remote sensing. nut. Geosci Published online on September 30th, 2025. doi:10.1038/s41561-025-01815-z

Source: www.sci.news

The Setback of Halting Psychedelic Research in the 1970s for Science

“Before the 1970s’ war on drugs, there was a variety of promising research into therapeutic psychedelics.”

Adrià Voltà

In the early 1950s, notable figures in science, philosophy, culture, and politics—such as Albert Einstein, Carl Jung, and Graham Greene—were part of an initiative called “outsights” aimed at exploring powerful psychedelics. Although circumstances shifted, I find myself captivated by what could have been.

I’ve been delving into psychedelics in the new trip series on BBC Radio 4. I previously shared my experiences of vivid hallucinations while in a coma from Covid-19. This sparked my curiosity to understand why individuals actively pursue psychedelic experiences, navigate legal challenges, take risks at home, seek healing, and address unmet needs.

There has yet to be a global consensus banning psychedelics. Responding to inquiries by scientist Humphrey Davy, who researched suboxidized oxides in 1799, Humphrey Osmond, coining the term psychedelic in the 1950s, expressed that the study of chemically induced altered states merits rigorous and thoughtful research.

Before the U.S.-led drug war commenced in the 1970s, extensive and promising research into psychedelics as potential treatments was underway, alongside their longstanding use in sacred and ritual contexts by Indigenous cultures. Unfortunately, rather than permitting this exploration, it was driven underground, leaving many to view substances such as fungi and plants, or their lab-created variants, as otherworldly. This otherness surprised me.

Currently, psychedelic research is investigating their therapeutic potential for conditions like depression, addiction, PTSD, eating disorders, dementia, and intergenerational trauma, gaining momentum globally. Studies explore their possible use in extending the recovery window following strokes, enhancing rehabilitation, and even unraveling the nature of consciousness.

Conversations with researchers who meticulously examine substances like psilocybin and DMT in clinical environments feel worlds apart from the psychedelic narratives prevalent in popular culture. These molecules profoundly and enduringly influence our minds and perceptions. It’s perplexing how we opted to stifle a broader inquiry and obstruct our brightest minds from discovering their true potential.

Today’s discussions among researchers are as engaging as they come, yet I can’t help but linger on the “what if?” In light of the global mental health crisis, governments and health systems are eager for new treatment alternatives. Public funding is dwindling and faces threats in many areas, while large corporations driven by profit show substantial interest in the accessibility of new therapies. Changes are happening rapidly.

Examining humanity’s history with psychedelic substances reveals a narrative marked by significant self-inflicted wounds. Ultimately, the funds for the outsight initiative never materialized, leading to a drastically different chapter in history. The war on drugs has stalled research across numerous substances for decades and continues to cast a shadow today.

The narratives surrounding these substances serve as warnings. Politics should never obstruct scientific breakthroughs. In light of today’s world, it feels like an urgent moral imperative to safeguard and nurture the conditions necessary for science to thrive. The stakes are too high.

Source: www.newscientist.com

Prevention League Triumphs in Extremism Research as Musk Champions Right-Wing Opposition

The Prevention League, a leading Jewish advocacy and anti-hate organization in the nation, has removed over 1,000 pages of extremism research from its website after facing significant backlash from right-wing influencers and Elon Musk on Tuesday night.

The now-deleted “extremist glossary” from the ADL included more than 1,000 entries offering background information on various groups and ideologies associated with racism, anti-Semitism, and other forms of hate. The section dedicated to neo-Nazi groups, militias, and anti-Semitic conspiracies has been redirected to a landing page featuring its extremism research.

Musk and various right-wing accounts on X have recently targeted the ADL over this glossary, which included references to Turning Point USA, associated with the late far-right activist Charlie Kirk. Musk responded to a post on X, criticizing the group for its entries on Christian identity and mistakenly conflating the militant movement with Christianity as a whole. In truth, the term refers to a faction that advocates for racial jihadism against Jews and other minorities.

The ADL did not directly address the backlash in its statements regarding this decision, instead arguing that removing the glossary would enable organizations to “explore new strategies and creative approaches to present data and research more effectively.”

“With over 1,000 entries compiled over the years, the extremist glossary has been a valuable resource for high-level information across a broad array of topics. However, the increase in entries has rendered many outdated,” stated the ADL. “We have observed many entries that have been intentionally misrepresented and misused. Furthermore, experts continue to develop more comprehensive resources and innovative means to convey information on anti-Semitism, extremism, and hatred.”

The decision to remove the glossary comes amid intense criticism faced by the ADL from staff and researchers, particularly concerning Israeli policies and its narrow focus on Musk’s repeated defenses. The organization lost a donor, and a prominent executive resigned following a statement by CEO Jonathan Greenblatt, who has praised Musk.

The ADL has not addressed inquiries regarding the comprehensive resources mentioned in its statement. The glossary was launched in 2022 and marketed as the first database designed to aid the media, the public, and law enforcement in understanding extremist groups and their ideologies.

“We consider it the most extensive and user-friendly resource for extremist speech currently accessible to the public,” noted Oren Segal, senior vice president of the ADL Center, in a prior statement. “We believe an informed public is crucial for the defense of democracy.”

ADL pages that contained the 2022 press release now display a message stating, “You are not permitted to access this page.”

Musk has long targeted the ADL, previously threatening to sue the organization for its research documenting the rise of anti-Semitic content on social media platforms. However, the ADL and Greenblatt defended him earlier this year, but after other Jewish groups and lawmakers condemned Musk for a fascist-style salute following Donald Trump’s inauguration. The ADL referred to it as “an unfortunate gesture amid moments of enthusiasm.”

Skip past newsletter promotions

Musk has consistently tweeted about the glossary’s ADL entries, including those related to Kirk’s TPUSA, labeling the ADL a “hate group” and insinuating that it incites murder. The TPUSA entry did not label the organization as extremist but included a list of its leadership and activists linked to extremists or who have made “racist or biased statements.”

On Wednesday, Musk continued to focus on the ADL, reiterating his classification of it as a “hate group.” He also aligned with another right-wing pressure effort, making a call to boycott Netflix due to a show featuring trans characters.

Source: www.theguardian.com

Ancient Demosponges: The First Animals on Earth, According to Research

Researchers from MIT and other institutions have discovered chemical fossils possibly left by ancient sponges on rocks dating back over 541 million years. These fossils consist of a distinctive type of sterlan, a stable variant of sterols found within the cell membranes of complex organisms. The team linked these sterlans to a category of sea sponges known as demosponges.



It highlights the picture representation of the ancient Stellan timeline, highlighting important compounds and their possible biological sources. Image credit: Shawar et al. , doi: 10.1073/pnas.2503009122.

“While I cannot precisely describe what these creatures looked like, I can assert they inhabited the ocean, had soft bodies, and likely lacked a silica skeleton,” stated MIT professor Roger Sammons.

In 2009, the researcher discovered the first chemical fossil believed to have originated from ancient sponges.

The team examined rock samples from outcrops in Oman and found an abundant sterlan they deduced to be a remnant of 30 carbon (C30) sterols—a rare steroid form attributed to ancient sea sponges.

Stellan was identified in very old rocks formed during the Ediacaran era (635-541 million years ago).

This era preceded the Cambrian period, which was marked by a sudden global explosion of complex, multicellular life forms.

The findings imply that ancient sponges may have existed far earlier than most multicellular organisms, potentially being one of the first animals on Earth.

Nevertheless, following the publication of these findings, alternative hypotheses emerged regarding the origin of C30 sterlan, suggesting that these chemicals could arise from other biological sources or non-organic geological processes.

The current study bolsters the initial hypothesis that ancient sponges produced this chemical record, as the researchers found new chemical fossils within the same promelat rock that were almost certainly biogenic.

Similar to previous studies, they searched for chemical fossils in rocks dating back to the Ediacaran period.

Samples were collected from drill cores and outcrops in Oman, West India, and Siberia, with analyses focused on the signatures of geologically stable sterols present in all eukaryotes (including plants, animals, and organisms with nuclear membranes).

“Without sterols or comparable membrane lipids, you cannot be classified as a eukaryote,” Professor Sammons remarked.

The chemical fossil identified in 2009 was 30-carbon sterols.

Additionally, the team deduced that these compounds could be synthesized due to distinct enzymes encoded by genes prevalent in demosponges.

“Finding sterols with 30 carbons is quite rare,” noted Dr. Lubna Shawar, a researcher at Caltech.

In this study, scientists concentrated on the chemistry of these compounds, observing that genes from the same sponge can produce even scarcer sterols with 31 carbon atoms (C31).

Upon analyzing rock samples of C31 sterlan, they discovered it was rich in the aforementioned C30 sterlan.

“These unique sterlans have been present all along,” Dr. Shawar remarked.

“We had to inquire the right questions to uncover them and truly comprehend what they signify and their origin.”

The researchers additionally procured samples of modern demosponges to examine for C31 sterols.

They determined that it is indeed a biological precursor of C31 sterlan found in rocks, observed in several species of contemporary demosponges.

Going further, they chemically synthesized eight different C31 sterols as reference materials to verify chemical structures.

The molecules were subjected to conditions simulating how sterols transform during deposition, burial, and pressurization over millions of years.

They found that two sterol-only products closely matched the structure of C31 sterols located in ancient rock samples.

The evidence from both substances strongly indicates that these compounds were created by living organisms rather than random non-biological processes.

Moreover, these organisms are likely ancestors of demosponges and still possess the capability to produce this set of compounds.

“It’s a blend of what’s present in the rock, what’s within the sponge, and what’s demonstrated in the lab,” explained Professor Sammons.

“Three supportive and concordant pieces of evidence strongly suggest these sponges are among Earth’s earliest animals.”

“This study illustrates how to authenticate biomarkers and confirm that the signals arise from life forms rather than contamination or abiogenic chemistry,” Dr. Shawar stated.

New Results were published this week in Proceedings of the National Academy of Sciences.

____

Lubuna Shawar et al. 2025. Chemical characterization of C31 sterols from the sponge and Neoproterozoic fossil star counterpart. PNAS 22 (41): E2503009122; doi: 10.1073/pnas.2503009122

Source: www.sci.news

Research Shows Ice Dissolves Iron Minerals More Efficiently than Liquid Water

Ice at 10 degrees Celsius releases iron from more abundant minerals compared to liquid water at 4 degrees Celsius, according to researchers from Umeå University, Chimiques de Rennes, and CNRS. This discovery sheds light on why many Arctic rivers are taking on a rusty orange hue as permafrost begins to thaw in warmer climates.

Schematic diagram of the iron mineral dissolution reaction of ice. Image credit: Sebaaly et al. , doi: 10.1073/pnas.2507588122.

“It may seem counterintuitive, but ice is not merely a static frozen mass,” stated Professor Jean François Boyley from Umeå University.

“Frozen states create microscopic pockets of liquid water between ice crystals.”

“These pockets function like chemical reactors, where compounds become concentrated and highly acidic.”

“This implies that even at temperatures as low as 30 degrees Celsius, they can engage with iron minerals.”

To investigate this phenomenon, Professor Boyley and his team examined goethite, a diverse array of iron oxide minerals, along with naturally occurring organic acids.

Through advanced microscopy and a series of experiments, they found that repeated freeze-thaw cycles enhance iron dissolution significantly.

When ice undergoes freezing and thawing, it releases organic compounds that were previously trapped, fostering additional chemical reactions.

Salt concentration also plays a critical role; fresh brackish waters promote iron dissolution, whereas seawater inhibits it.

The outcomes of this research are particularly relevant in acidic environments like mine drainage sites, frozen atmospheric dust, acid sulfate soils along the Baltic coast, or acidic freezing locales where iron minerals interact with organic matter.

“As global temperatures rise, the freeze-thaw cycles are becoming more frequent,” remarked Angelo Pio Severly, a doctoral candidate at Umeå University.

“Each cycle liberates iron from the soil and permafrost into the water, potentially impacting water quality and aquatic ecosystems over vast areas.”

“These findings emphasize that ice is an active participant, rather than a passive medium for storage.”

“It is crucial to recognize the growing impact of freeze and thaw processes in polar and mountainous regions on ecosystems and elemental cycling.”

The research team’s paper was published on August 26, 2025, in the Proceedings of the National Academy of Sciences.

____

Angelo P. Severly et al. 2025. Ice as a kinetic and mechanical driver for iron oxide dissolution of oxalate oxide. Proceedings of the National Academy of Sciences 122 (35): E2507588122; doi: 10.1073/pnas.2507588122

Source: www.sci.news

New Research Confirms Multiple Instances of Water Activity in Jezero Crater

Minerals constitute the building blocks of rocks, and the specific minerals and their chemical compositions reveal significant insights into rock formation and history. On Mars, NASA’s dedicated rover, equipped with X-ray lithochemistry (PIXL) instruments, produces geochemical maps of rock surfaces. A recent study examined over 90,000 chemical analyses collected by PIXL during its first 1,100 days on Mars, revealing that the minerals in Jezero Crater interact with various types of liquids over time. result This will be published in Journal of Geophysics: Planets.

This image from NASA’s Mars reconnaissance orbiter showcases the Jezero Crater on Mars. Image credits: NASA/JPL-CALTECH/MSSS/JHU-APL.

In this research, Eleanor Moreland, a Rice University graduate student, along with her team, utilized mineral identification through stoichiometry (MIST) algorithms to analyze PIXL data.

PIXL determines the chemical composition by bombarding Martian rocks with X-rays, yielding the most comprehensive geochemical measurements ever obtained from another planet.

“The minerals identified in Jezero Crater through MIST indicate that these volcanic rocks interacted with liquid water multiple times throughout Mars’ history, suggesting the potential for habitable conditions,” Moreland stated.

Minerals form under specific environmental conditions, such as temperature, pH, and the chemical composition of fluids, making them reliable narrators of planetary history.

Within Jezero Crater, 24 mineral species illustrate the volcanic characteristics of the Martian surface and their interactions with water over time.

Water chemically alters rocks, producing salt or clay minerals, with the specific minerals formed depending on environmental variables.

The minerals discovered in the crater showcase three different types of liquid interactions, each indicating distinct possibilities for habitability.

The first mineral suite, featuring green arilite, hizingerite, and ferroaluminoceradonite, shows localized high-temperature acidic fluids present only in crater bedrock, interpreted as among the oldest rocks studied.

The water involved in this scenario is regarded as the most conducive to life, given that research on Earth suggests high temperatures and low pH can harm biological structures.

“These hot, acidic conditions present the toughest challenges to life,” commented Kirsten Siebach, a researcher at Rice University.

“However, on Earth, life can thrive in extreme environments such as the acidic waters of Yellowstone, so this doesn’t negate the possibility of habitability.”

The second mineral suite favors more hospitable conditions and indicates a medium neutral fluid present over larger areas.

Minerals like Minnesotaite and Clinoptilolite were detected on both the crater floor and fan area, forming at lower temperatures with neutral pH, while Clinoptilolite was restricted to the crater floor.

Lastly, the third category represents a cold alkaline liquid, considered highly habitable from a modern Earth perspective.

Sepiolite, a common mineral change on Earth, was found to form under moderate temperature and alkaline conditions, widely distributed across all units explored by the rover.

The presence of sepiolite in all these units indicates multiple episodes of liquid water contributing to habitable conditions in Jezero Crater.

“These minerals demonstrate that Jezero Crater has undergone a transition from harsher, hotter, acidic liquid conditions to more neutral and alkaline environments over time.

Given that Mars samples cannot be prepared or scanned as accurately as Earth samples, the team developed an uncertainty propagation model to enhance the findings.

Using a statistical approach, MIST repeatedly assessed mineral identification while considering potential errors, analogous to how meteorologists predict hurricane paths by utilizing numerous models.

“Error analysis enables us to assign confidence levels to all mineral identifications,” Moreland remarked.

“MIST assists not just with the scientific and decision-making processes of Mars 2020, but also establishes a mineralogical archive of Jezero Crater, which will be invaluable if samples are returned to Earth.”

The findings affirm that Jezero Crater, once home to an ancient lake, has experienced a complex, dynamic aqueous history.

Each new mineral discovery brings us closer to determining whether Mars has ever supported life, while also refining strategies for sample collection and return.

____

Eleanor L. Moreland et al. 2025. Multiple episodes of fluid changes in Jezero Crater indicated by the identification of MIST minerals in PIXL XRF data from the first 1100 SOL of the Mars 2020 mission. Journal of Geophysics: Planets 130 (9): e2024je008797; doi: 10.1029/2024je008797

Source: www.sci.news

Is Leucovorin an Effective Treatment for Autism? Insights from Recent Research

The Trump administration has included a drug known as leucovorin in efforts to alleviate certain autism symptoms. However, experts specializing in autism largely agree that additional research is needed before it can be widely used in children and adults.

Leucovorin, or Folinic Acid, is a synthetic variant of vitamin B9 requiring a prescription. It is primarily administered to cancer patients via IV alongside chemotherapy.

On Monday, the Food and Drug Administration revealed that it is moving forward with the approval of a tablet formulation for specific autistic patients.

Many researchers have raised concerns that this approval may be hasty, given that only a few small trials—mostly conducted outside the US—demonstrate its effectiveness in children with autism.

Several experts informed NBC News that FDA approval might create unrealistic expectations for families. This is particularly concerning as not all children with autism are eligible for prescriptions, and the likelihood of achieving positive results remains uncertain.

Researchers have long sought medications that can effectively mitigate autism symptoms; however, very few have satisfied the FDA’s rigorous safety and efficacy criteria. Prior to Monday, the FDA had only authorized two medications to address bothersome symptoms associated with autism, none of which targeted issues related to communication, social interactions, or repetitive behaviors.

Alycia Halladay, Chief Science Officer of the Autism Science Foundation, commented that her organization supports research grounded in evidence; leucovorin is not suggested as a treatment and more research is essential.

“Leucovorin doesn’t meet the standards set for FDA approvals, yet this administration is proceeding regardless. Therefore, I wouldn’t label this a victory,” Halladay remarked.

She further noted that the way the drug was presented at Monday’s White House Briefing as a major breakthrough for families with autism doesn’t align with the nuances of FDA approval.

The FDA stated in a news release that the drug is being approved for patients suffering from cerebral folate deficiency, a rare neurological disorder marked by low levels of vitamin B9 (folate) in the brain. Some researchers speculate that this condition might be linked to autism, but it is not present in all autistic individuals.

(Halladay estimates that around 10-30% of autistic patients may have this condition.)

Though leucovorin can potentially aid in reaching the brain, theoretically improving verbal communication and alleviating autism symptoms like irritability and repetitive actions, there is no evidence to suggest it entirely eradicates these symptoms.

“We still hope that leucovorin might serve as a helpful option for a subset of patients,” noted Dr. Rachel Forlomer, an assistant professor of pediatrics at Northwestern University’s Feinberg School of Medicine. “However, I can’t say we’re at a stage where we can confidently claim we can assist every individual with autism.”

President Donald Trump expressed at a briefing that the approval “offers hope for many parents of children with autism that life improvement is possible.” Mehmet Oz, leader of the Centers for Medicare and Medicaid Services, described the decision as “life-saving.” FDA commissioner Marty McCurry remarked, “I believe hundreds of thousands of children will benefit.”

However, in a follow-up news release, the Department of Health and Human Services clarified that leucovorin is “not a cure” for autism and “may only result in improvements in speech-related deficits for a subset of children.”

David Mandel, a professor of psychiatry at the University of Pennsylvania, remarked that the folate hypothesis is based on relatively weak scientific evidence.

“We lack robust large-scale studies demonstrating that a significant number of individuals with autism suffer from folic acid deficiency,” he noted.

While leucovorin has shown minimal side effects in cancer patients, higher doses can lead to gastrointestinal issues and increase the risk of seizures in individuals on anti-seizure medications. It is commonly part of treatment for colorectal cancer and other gastrointestinal malignancies, often enhancing the efficacy of the chemotherapy drug 5-fluorouracil. In rare cases, it is administered to reduce side effects from another chemotherapeutic agent, high-dose methotrexate.

Halladay mentioned observing side effects during the leucovorin autism trial but noted that the specific safety of the drug was not evaluated. She indicated that dosages varied across studies, making it difficult to determine if leucovorin was responsible for any symptom improvement, as some trial participants also received behavioral therapy.

According to Mandel, the largest study involving these trials included only 80 participants.

“For FDA approval, one would ideally want hundreds of children involved in these trials,” he explained.

Leucovorin must overcome one final obstacle before it becomes available to select autistic patients. The FDA stated it is collaborating with GSK, the manufacturer of the brand-name version of leucovorin, to update the drug’s labeling. GSK confirmed that a new application will be submitted to include autism indications.

Oz stated on Monday that leucovorin prescriptions will be covered by Medicaid, with private insurance companies likely to follow suit.

However, Mandel expressed concern that anticipated cuts to Medicaid may compel many families to cover costs out of pocket, potentially driving them to purchase folinic acid supplements online without prescriptions. Such supplements may not have gone through quality control, and dosage information could be unclear.

Dr. William Dahoot, chief science officer at the American Cancer Society, expressed concern that the growing interest in leucovorin for autism might adversely impact its availability for cancer patients.

“We have faced shortages of this drug before, and an increase in demand could lead to future shortages,” he noted in an email.

Source: www.nbcnews.com

DNA Research Reveals Slavic Origins in Ukraine and Southern Belarus

The latter part of the first millennium in Central and Eastern Europe witnessed profound cultural and political changes. This transformative era is typically linked to the emergence of the Slavs, supported by textual documentation and corresponding archaeological findings. However, there remains no agreement on whether this archaeological horizon spread through transition, a process termed “slabization,” or a mix of both. Notably, the prevalent cremation practices observed during the initial phases of slab settlements lack sufficient genetic data. In a recent investigation, scientists sequenced the genomes of 555 ancient individuals, including 359 samples from the Slavic context dating back to the 7th century AD. The new findings reveal significant population movements in Eastern Europe between the 6th and 8th centuries, which replaced over 80% of the local gene pools in areas such as East Germany, Poland, and Croatia.

The seal of Yaroslav, the grand prince of Kiev from 1019 to 1054, and the father of Anna Yaroslav, the Queen of France. Image credit: Sheremetievs Museum.

The term “Slavs” first emerged to describe a nation in Constantinople during the 6th century and later gained recognition in the West.

Written records initially appeared north of the Lowward Now River and subsequently shifted to regions north of the Carpathian Basin, the Balkans, and the Eastern Alps.

Many areas were under the influence of the Avar Khaganate along the central Danube from around 567 AD to 800 AD.

Evidence indicates the presence of slab cultures in several regions of Eastern and Southeastern Europe during the 7th century.

Slavic settlements, previously inhabited by Roman, Germanic, and other pre-Slavic communities, transitioned to a simpler lifestyle, often represented archaeologically by small pithouse settlements, cremation burials, handmade and unembellished pottery, and a modest low-metal material culture associated with the Pragukorchak group.

Later, more sophisticated social structures and control emerged within the contact zone of the Byzantine-Christian West.

The Transformation of Europe by the Slavs

The first comprehensive ancient DNA analysis of medieval Slavic groups reveals that the rise of the Slavs was fundamentally a narrative of migration.

Their genetic signature points to origins in an area spanning southern Belarus to central Ukraine, aligning with longstanding linguistic and archaeological theories.

“Although direct evidence from the early Slavic core regions is still limited, our genetic findings provide initial substantial insights into the formation of Slavic ancestors, suggesting origins that may lie between the Donets and Don rivers.”

In this study, Dr. Gretzinger and colleagues gathered genome-wide data from 555 distinct ancient individuals from 26 sites throughout Central and Eastern Europe. They combined this with previously published data, creating comprehensive sampling networks for three regions.

New findings indicate that starting in the 6th century AD, large-scale migrations spread Eastern European ancestry throughout a vast area of central and eastern Europe, thus altering the genetic make-up of regions such as East Germany and Poland.

However, this expansion did not conform to a model of conquest or empire. Rather than obliterating existing military and structural hierarchies, newcomers founded new communities centered around extended families and patriarchal kinships.

This pattern was not uniform across all areas.

In eastern Germany, the changes were significant. Large, multi-generational lineages formed the backbone of society, and kinship networks became more broadly structured compared to the smaller nuclear families observed in earlier migration phases.

In contrast, areas such as Croatia experienced much less disruption in existing social patterns with the arrival of Eastern European groups.

Here, social structures often retained characteristics from previous periods, resulting in communities where new traditions harmonized with existing ones.

The regional diversity in social frameworks highlights that the spread of the Slavic group was not a one-size-fits-all process, but rather a dynamic adaptation to local contexts and histories.

“The expansion of the Slavs does not occur as a single event; it demonstrates that it is not a monolithic phenomenon, but each instance blends adaptation and integration according to its circumstances.”

Historical Overview of European Slabs: The timeline lists major historical events related to Central European Slabs. This map illustrates historical proof of the appearance of the slab (Sklavenoi – Slavvi – Winedi). The italic count indicates the date of the proven event, with each report date being in the bracket. Image credit: Gretzinger et al., doi: 10.1038/s41586-025-09437-6.

East Germany

The genetic data reveals a particularly significant narrative in East Germany.

Following the decline of the Kingdom of Thuringia, more than 85% of the region’s ancestry can be traced back to new arrivals from the east.

This reflects a shift from an earlier period of diverse populations, as epitomized by the Brucken site.

With the rise of the Slavs, this diversity gave way to a population composition resembling that of modern Slavic-speaking groups in Eastern Europe.

These new communities were structured around large extended families and patriarchal lineages, with women of marriageable age often moving to form new households elsewhere, leaving their native villages.

Notably, the genetic heritage of these initial Eastern European settlers is still present among the Sorbs, the Slavic-speaking minority in East Germany.

Amidst centuries of cultural and linguistic changes, Sorbs maintain genetic profiles closely related to early medieval Slavic populations that settled in the region over a millennium ago.

Poland

In Poland, research notably challenges previous assumptions regarding long-standing population continuity.

Genetic findings indicate that early inhabitants of the region, beginning in the 6th and 7th centuries AD—especially descendants of a population closely tied to Northern Europe and Scandinavia—were nearly completely replaced by newcomers from the East, closely related to modern Poles, Ukrainians, and Belarusians.

While overwhelming population shifts occurred, genetic evidence also reveals small traces of intermingling with local populations.

These insights underscore both the magnitude of population change and the intricate dynamics shaping the ancestry of present-day Central and Eastern European languages.

Croatia

In Northern Balkans, the patterns observed differ markedly from those in northern immigrant regions, narrating a tale of both transformation and continuity.

Ancient DNA analyses from Croatia and surrounding areas illustrate a significant influx of ancestors from Eastern Europe, yet without total genetic replacement.

Instead, Eastern European immigrants integrated with diverse local populations to form hybrid communities.

Genetic studies show that in modern Balkan populations, the proportion of Eastern European ancestry varies significantly, often reaching around half or less of the current genetic mix.

In this context, Slavic migration wasn’t characterized by conquest but was a gradual process of intermarriage and adaptation, leading to the rich cultural, linguistic, and genetic diversity that defines the Balkans today.

A New Chapter in European History

In most instances, when early Slavic groups are referenced in archaeological and historical contexts, their genetic markers are consistent, indicating a shared ancestral origin, though regional variations reflect the extent of blending with local populations.

In the north, early Germanic communities mostly left, providing space for Slavic integration.

In the south, Eastern European migrants merged with established societies.

This patchwork integration elucidates the remarkable diversity present in the cultures, languages, and genetics of contemporary Central and Eastern European societies.

“The spread of the Slavs was likely the last significant demographic event to irreversibly reshape both the genetic and linguistic landscapes of Europe,” remarked Dr. Johannes Kraus, director of the Max Planck Institute for Evolutionary Anthropology.

The findings were published in the journal on September 3rd Nature.

____

J. Gretzinger et al. Ancient DNA connects large-scale migration with the spread of the Slavs. Nature, published online on September 3, 2025. doi:10.1038/s41586-025-09437-6

This article is adapted from the original release by the Max Planck Institute for Evolutionary Anthropology.

Source: www.sci.news

British Companies Utilizing “Bossware” to Monitor Employee Activities, According to Research

A significant portion of UK employers, about one-third, are utilizing “bossware” technology to monitor employee activities, predominantly through methods like email and web browsing surveillance.

Private sector firms are the most inclined to implement onsite monitoring, with one in seven employers reportedly recording or assessing screen activities, as per a comprehensive UK study on office surveillance.

These insights, disclosed by the Chartered Management Institute (CMI) to the Guardian, are derived from feedback from numerous UK managers, indicating a recent uptick in computer-based work monitoring.

According to 2023 research by the Information Commissioner’s Office (ICO), less than 20% of respondents believed they were being monitored by their employers. The finding that roughly one-third of managers are aware of their organizations tracking employees’ online activities on company devices likely underrepresents the issue.

Many of these surveillance tools are designed to mitigate insider threats, safeguard confidential data, and identify dips in productivity. However, this growing trend seems to be inducing anxiety among employees. CMI highlights that many managerial figures oppose such practices, arguing they erode trust and infringe on personal privacy.

A manager at an insurance firm developing an AI system for monitoring staff screen activity expressed feelings of “unease,” questioning, “Do they trust employees to perform their roles? Is there an intention to replace them with AI?”

One employee monitoring service provides insights into workers’ “idle hours,” tracks “employee productivity,” flags unapproved AI or social media use, and offers “real-time data on employee behavior, including screenshots, screen recordings, keystrokes, and application usage.”

In light of these findings, the ICO emphasized that employers “must inform employees about the nature, scope, and reasons for surveillance,” noting that excessive monitoring “can infringe on personal privacy,” especially for remote workers. They warned of potential actions if necessary.

Last year, the ICO prohibited outsourced company Serco from utilizing facial recognition technology and fingerprint scanning to manage staff attendance at various leisure centers.

Monitoring often includes ensuring that inappropriate content isn’t accessed, according to CMI. However, they cautioned, “If it feels like an invasion, there can be long-term implications.”

Petra Wilton, policy director at CMI, stated, “If implemented, this could be of significant concern to employers and raise serious data privacy and protection issues.”

Recent examples of workplace surveillance methods include: HSBC’s installation of numerous security cameras and 1,754 biometric readers as a means of accessing their new London headquarters.

Skip past newsletter promotions

PWC has recently rolled out a “traffic light” system utilizing badge swipes and WiFi connection data to ensure staff attend the office at least three days a week. A spokesperson from PWC noted this was “well received by most of our employees.”

A former senior public transport worker, who requested anonymity, shared their experience of facing online surveillance, describing it as “distracting and deeply intrusive.”

“It began with surveillance, and I eventually left because I was extremely frustrated,” they noted. CMI research revealed that one in six managers would contemplate seeking new employment if their organization started monitoring online activities on work devices.

Among managers aware of their employers monitoring them, 35% indicated surveillance of emails. Overall, tracking login/logout times and system access emerged as the most prevalent form of monitoring.

The survey showed that 53% of managers endorse monitoring employee online activity on company devices, but 42% feel this not only undermines trust but also fails to enhance performance, potentially resulting in misuse or unjust disciplinary action.

Source: www.theguardian.com

Research Suggests Artificial Sweeteners May Accelerate Brain Aging

New research suggests that artificial sweeteners may have unexpected risks for brain health.

In a study published in Neurology, researchers analyzed the diets of over 12,700 adults in Brazil, revealing that individuals who consumed higher amounts of calorie-free sweeteners experienced a more rapid decline in memory and cognitive abilities over an eight-year period.

This decline was especially notable among diabetic patients and those under the age of 60.

The study examined seven sweeteners commonly found in diet sodas, flavored waters, yogurt, and low-calorie desserts: aspartame, saccharin, acesulfame-K, erythritol, xylitol, sorbitol, and tagatose.

All except tagatose were linked to cognitive decline, particularly affecting memory and verbal fluency.

Participants were categorized into three intake groups. Those with the highest consumption—approximately 191 milligrams daily, similar to a single can of diet soda for aspartame—demonstrated cognitive aging equivalent to 1.6 additional years, at least 62% faster than those with lower consumption.

“Low and no-calorie sweeteners are often regarded as healthier alternatives to sugar, but our findings indicate that certain sweeteners may negatively impact brain health over time,” stated Professor Claudia Kimmy Sumoto from the University of Sao Paulo.

“Prior research linked artificial sweeteners to conditions such as diabetes, cancer, cardiovascular disease, and depression, but the effects on cognition were previously unexplored.”

Consumption of artificial sweeteners similar to daily cans of diet soda was associated with accelerated cognitive decline, akin to 1.6 years of brain aging – Credit: Getty

Interestingly, the link was primarily observed in adults under 60 years old.

“We anticipated that the association would be more pronounced in older adults due to their increased risk of dementia and cognitive decline,” Sumoto noted. “Conversely, our findings suggest that exposure to sweeteners during middle age could be particularly detrimental, which is crucial as this period is vital for establishing long-term brain health.”

The findings do not conclusively prove that sweeteners are the direct cause of cognitive decline, with limitations including reliance on self-reported dietary habits and the absence of control over sweetener usage in the research.

Nevertheless, Sumoto emphasized the need for further investigation, including brain imaging and studies examining gut health and inflammation.

Her team is already conducting neuroimaging studies to better understand these associations, although results are not yet available.

“More research is essential to validate our findings and to explore whether alternative sweeteners like those from the apple family, honey, maple syrup, and coconut sugar provide effective options,” Sumoto concluded.

About our experts

Claudia Sumoto is an assistant professor at the University of Sao Paulo, Brazil. She is a trained physician with research published in journals such as The Lancet, Nature Neuroscience, and Journal of Alzheimer’s Disease.

read more:

Source: www.sciencefocus.com

Research Indicates Space Travel May Accelerate Stem Cell Aging by Up to 10 Times

Transitioning to space poses significant challenges for the human body.

Astronauts can experience loss of bone density, swollen nerves in their brains and eyes, and alterations in gene expression. Research indicates that time spent in space can accelerate aging.

Groundbreaking research by NASA’s twin astronauts Mark and Scott Kelly monitored aging indicators in both siblings, with Mark remaining on Earth while Scott spent 340 days in space.

Six months later, several changes in Scott persisted, including DNA damage, cognitive decline, and telomere shortening that affects chromosome protection. This was highlighted in the Journal Science.

Recent research published in Cell Stem Cell reveals that stem cells also show signs of aging due to stress from space flight.

According to Dr. Catriona Jamieson, director of the Sanford Stem Cell Institute at UC San Diego, these cells are “aging ten times faster in space than on Earth.”

Stem cells are unique cells capable of differentiating into various tissue types. Their accelerated aging poses a concern as it diminishes the body’s natural ability to repair tissues and organs.

This new research comes at a time of increasing interest in space exploration, with government plans for long-term lunar missions and private companies sending consumers and celebrities into space. Understanding these health risks is crucial for safer space travel. Additionally, studying the acceleration of intracellular aging aids researchers in comprehending biological processes at a slower pace.

Astronauts and twin brothers Scott and Mark Kelly at NASA’s Johnson Space Center in 2016.
Houston Chronicle /Hurst Newspaper /Houston Chronicle by Getty Image

Researchers utilized bone marrow stem cells sourced from individuals who underwent hip replacement procedures. These cells were cultivated in “nanobioreactors,” essentially small, clear blood bags no larger than an iPhone that facilitate biological processes. The nanobioreactor was housed in a monitored environment known as cubelabs.

Samples from each patient were divided into two cubelabs; one was sent to space, while the other remained on Earth.

The samples intended for space travelled aboard the International Space Station across four commercial resupply missions conducted by SpaceX. Overall, the samples experienced microgravity for 32-45 days, the weightlessness found in orbit. For comparison, the Earth-bound cells were maintained in a cube lab setup.

Cubelabs monitored cell conditions throughout the journey and terrestrial duration, capturing daily images using a microscope. Upon the return of the space-stressed stem cells to Earth, researchers conducted comparisons against ground controls, sequenced the genome, and performed additional analyses.

Source: www.nbcnews.com

New Research Uncovers the Secrets of Burgess Shale Trilobites

Appendages of arthropods serve various functions, including feeding, locomotion, and reproduction. Fossils dating back to the Cambrian period (539-487 million years ago) provide remarkable details of extinct arthropod appendages, enhancing our understanding of their anatomy and ecological roles. However, due to the limited number of fossils and often incomplete preservation, studies on appendage functions typically depend on idealized reconstructions. This new research focuses on the paleontological species Olenoides serratus, a prolific trilobite from the Cambrian Burgess Shale, noteworthy for its numerous well-preserved specimens featuring soft tissue that allow for a detailed analysis of appendage functionality.



Olenoides serratus from Burgess Shale. Image credit: Losso et al., doi: 10.1186/s12915-025-02335-3.

Situated in British Columbia, Canada, the Burgess Shale is renowned for its exceptional fossil preservation, including soft tissues such as limbs and internal organs.

While trilobites are common in fossil records, their soft limbs are seldom preserved due to their hard exoskeleton, leading to a limited understanding of these structures.

The trilobite species Olenoides serratus offers a unique chance to investigate these appendages further.

Harvard paleontologist Sarah Ross and her team examined 156 limbs from 28 fossil specimens of Olenoides serratus to reconstruct the precise movements and functions of these ancient arthropod appendages, shedding light on one of the earliest successful aquatic animals.

“Understanding the behavior and movement of fossils poses challenges, as we cannot observe their activities like we do with living organisms,” stated Dr. Ross.

“Instead, we meticulously analyzed the morphology of numerous specimens while also utilizing modern analogues to infer how these ancient creatures lived.”

The researchers also assessed the range of motion of the legs of living horseshoe crab species Limulus polyphemus.

“Arthropods possess articulated legs composed of multiple segments that can flex upwards or downwards,” they noted.

“The range of motion is influenced by the specific directional capabilities of each joint.”

“This range, combined with the limbs’ shape and segment configuration, determines how the animal utilizes its appendages for walking, grasping, and burrowing.”

Horseshoe crabs, commonly found along the eastern coast of North America, are compared with trilobites due to their analogous behaviors.

“Despite their close relation to spiders and scorpions, horseshoe crabs are part of a different branch of the arthropod tree, whereas trilobite relationships remain ambiguous.”

The comparison arises from both animals’ adaptation of articulated limbs for navigating the seafloor.

However, the findings revealed that their similarities were minimal.

In contrast to horseshoe crabs, characterized by specialized limb joints for bending and expanding—facilitating feeding and protection—Olenoides serratus exhibited a simpler yet highly functional limb structure.

“We found that the limbs of Olenoides serratus had minimal extension, primarily far from the body,” Dr. Ross explained.

“Their limbs functioned differently than those of horseshoe crabs. Olenoides serratus could walk, dig, bring food to their mouths, and even elevate their bodies above the seafloor.”

To realize these findings, the scientists constructed advanced 3D digital models based on hundreds of fossil images captured from various angles.

Since trilobite limbs are often crushed and flattened, reconstructing them in three dimensions presents a significant challenge.

“We depend on exceptionally well-preserved specimens, comparing limb structures from multiple angles while leveraging related fossils to fill in any missing details,” said Professor Javier Ortega-Hernandez of Harvard University.

The team correlated the morphology of trace fossils to the movements of the limbs.

“The different movements of Olenoides serratus could create trace fossils with varying depths,” Dr. Ross elaborated.

“They were capable of raising their bodies on deposits, allowing them to traverse obstacles and navigate efficiently through swift currents.”

Remarkably, the researchers found that males possessed specialized appendages for mating, and each leg featured gills for respiration.

The findings were published in the journal BMC Biology on August 4th, 2025.

____

Loss et al., 2025. Quantification of leg mobility in Burgess Shale Olenoides serratus reveals the functional differences between trilobite and Xiphosuran appendages. BMC Biol 23, 238; doi:10.1186/s12915-025-02335-3

Source: www.sci.news

Research Suggests This Diet May Lower Cognitive Decline Risk by 40%

A Key Review involving over 62,500 adults indicates that adhering to a specific diet may lower the risk of cognitive decline in older age by 40%.

Researchers from Shandong University in China examined 15 studies involving individuals aged over 60 and discovered that those who followed a Mediterranean or mind-focused diet were notably less likely to experience memory and cognitive issues compared to those without a healthy diet.

The Mediterranean diet and the Mind diet share many similarities, both emphasizing vegetables, fruits, legumes, nuts, seeds, whole grains, and moderate amounts of fish, chicken, and dairy products. Conversely, they limit processed foods, lean meats, and sugar.

However, while the Mediterranean diet draws inspiration from specific regions, the Mind diet is designed explicitly to safeguard the brain against age-related decline.

The Mediterranean diet highlights certain culturally significant ingredients, such as olive oil, while the Mind diet focuses on nutrient-rich foods like berries and leafy greens.

Researchers have long speculated about the impact of diet on brain aging, although evidence has been mixed. To clarify this issue, the Shandong University team aggregated data from 15 studies to gain a better understanding of the correlation between diet and aging in the brain, as well as extended risks of related diseases like dementia.

The studies included presented varied results. However, their combined findings indicated that elderly individuals who maintained healthy eating habits were significantly less likely (by 40%) to experience cognitive decline compared to those following unhealthy diets.

Consequently, researchers concluded that older adults should be encouraged to incorporate vegetables, fruits, fish, and legumes into their meals.

The Mediterranean diet emphasizes plant-based foods like vegetables, fruits, grains, legumes, nuts, seeds, and healthy fats, while limiting processed foods and sugars.

Experts not involved in the study cautioned in BBC Science Focus that the interplay between diet and cognitive health is complex and the findings should be interpreted cautiously.

Professor Keith Frain, Professor Emeritus of Human Metabolism at Oxford University, remarked, “This study convincingly demonstrates that healthier diets correlate with enhanced cognitive function in older adults.”

However, Frain cautioned that the study does not imply that a better diet causes improved brain health. He explained that the studies involved in this analysis were observational, which means other factors like wealth, privilege, or non-dietary healthy habits could contribute to the observed link.

“While striving for a healthy diet as we age for various reasons is important, it is misleading to suggest that diet alone can alter our cognitive function,” Frain added.

Meanwhile, Dr. Oliver Shannon, a Lecturer in Nutrition and Aging at Newcastle University, noted in BBC Science Focus that the findings of this study, consistent with previous research and recent clinical trials, suggest that consuming healthier diets in later life could positively influence the brain. Thus, there might be a causal relationship present.

“Making small dietary adjustments towards healthier options, such as increasing the intake of vegetables, legumes, whole grains, and fish, can help older individuals maintain their memory and cognitive skills as they age,” he stated.

Read more:

About Our Experts

Professor Keith Frain is a professor of human metabolism at Oxford University in the UK and an honorary fellow at Green Templeton College in Oxford. His research focuses on metabolism and nutrition, and he has authored numerous books, including the textbook Human Metabolism: A Regulatory Perspective.

Dr. Oliver Shannon is a lecturer in nutrition and aging at the Center for Human Nutrition Research at Newcastle University, UK. His research primarily investigates the impacts of nutrition, including dietary patterns like the Mediterranean diet, on cognitive and cardiovascular aging. Shannon has published over 80 peer-reviewed articles in leading nutrition journals.

Source: www.sciencefocus.com

Research Reveals Alarming Rate of Seawater-Induced Corrosion on Shark Teeth

The rising acidity of the Earth’s oceans is leading to the corrosion and deterioration of shark teeth.

As apex predators, shark teeth serve as essential tools, but recent studies reveal that climate change is adversely affecting their strength and durability.

“They are highly specialized instruments designed for slicing through flesh without withstanding ocean acidity,” explained Maximilian Baum from Heinrich Heine University (HHU) in Düsseldorf. “Our findings underscore how even the most finely tuned weapons in nature are not immune to vulnerability.”

Sharks continuously regenerate their teeth, yet the deteriorating conditions of our oceans can compromise them more swiftly than they can heal.

With the oceans increasingly absorbing carbon dioxide due to climate change, their acidity levels are rising.

Currently, ocean water sits at a pH of 8.1, but it could drop to as low as 7.3 by 2300.

This research is part of the undergraduate project Frontier, where Baum sought to assess the impact of these changes on marine organisms.

By acquiring hundreds of black-tip reef shark teeth from an aquarium housing the study’s subjects, Baum was able to conduct his experiments.

Approximately 50 intact teeth were then placed in tanks with varying pH levels and left there for 8 weeks.

Upon evaluation at the conclusion of the study, it was evident that teeth exposed to acidic water exhibited considerably greater damage compared to those in 8.1 pH conditions.

Microscopic view of teeth held in water at pH 7.3 for 8 weeks – Credit: Steffen Köhler

“We noted visible surface defects such as cracks and holes, heightened root corrosion, and structural degradation,” remarked Professor Sebastian Fraun, who supervised the project at HHU.

The acidic conditions also rendered the tooth surfaces rough and uneven. While this may enhance the shark’s cutting efficiency, it simultaneously compromised the structural integrity of the teeth, increasing their likelihood of breaking.

“Maintaining a marine pH close to the current average of 8.1 is crucial for preserving the physical strength of this predatory tool,” Baum noted. “This highlights the broad impacts climate change has across the food web and entire ecosystems.”

About Our Experts

Maximilian Baum | I am a student at the Faculty of Biology at Heinrich Heine University, Düsseldorf.

Professor Sebastian Fraun | He is the head of the Institute for Zoology and Biology Interactions at Heinrich Heine University, Düsseldorf.

Read More:

Source: www.sciencefocus.com

Expert Rejection: Police Assert Research Backing Unbiased Live Facial Recognition Usage

The Metropolitan Police assert that their application of live facial recognition is devoid of bias, as echoed by a prominent technology specialist, but this claim has not been substantiated by the reports they reference in support of their litigation.

The MET plans to deploy the LFR in its most notable event this bank holiday weekend at the Notting Hill Carnival in West London.

According to The Guardian, the technology will be utilized at two locations leading up to the carnival, and the military has insisted on its implementation, despite the fact that LFR use is considered illegal, as declared by the Equality and Human Rights Commission.


This new assertion comes from Professor Pete Hussy, who led the only independent academic review of the police’s use of facial recognition; he is a former reviewer of Met’s LFR since 2018-19 and currently advises various law enforcement agencies in the UK and internationally on its application.

The Met contends that it has reformed the usage of LFR, as indicated in the 2023 research commissioned by the National Institute of Physics (NPL), claiming that it is now virtually free from bias. Nevertheless, Fussey responded:

“The sensitivity of the system can be adjusted for LFR’s operation. Higher sensitivity results in detecting more individuals, but such potential bias is influenced by race, gender, and age. Setting zero is the most sensitive while one is the least.”

The NPL report identified bias at a sensitivity level of 0.56, noting seven instances where individuals tested were mistakenly flagged as suspects, all of whom were from ethnic minority backgrounds.

These findings stemmed from a collection of 178,000 images entered into the system, with 400 volunteers passing by the cameras roughly 10 times, providing 4,000 opportunities for accurate recognition. They were included in an estimated crowd of over 130,000 at four locations in London and one in Cardiff. The tests were carried out in clear weather over 34.5 hours, though Fussey remarked this was shorter than tests conducted in some other countries where LFR is valued.

From this dataset, the report concluded that no statistically significant bias existed in settings above 0.6. This assertion has been reiterated by the MET to justify their ongoing use and expansion of LFR.

Hussey criticized this as insufficient to substantiate the MET’s claims, stating: “Councillors at the Metropolitan Police Service consistently argue their systems undergo independent testing for bias. An examination of this study revealed that the data was inadequate to support the claims made.”

“The definitive conclusions publicly proclaimed by MET rely on an analysis of merely seven false matches from a system scrutinizing the faces of millions of Londoners. Drawing broad conclusions from such a limited sample is statistically weak.”

Currently, the MET operates LFR at a sensitivity setting of 0.64, though they assert that the NPL studies did not yield erroneous matches.

Fussey stated: “Their own research indicates that false matches are not evaluated in settings claiming no bias that exceed 0.64.”

“Few in the scientific community suggest sufficient evidence exists to support these claims drawn from such a limited sample.”

Fussey added: “We clearly indicate that bias exists within the algorithm, but we assert that this can be mitigated through appropriate adjustments to the system settings. The challenge arises from the fact that the system has not been thoroughly tested under these varied settings.”

Lindsay Chiswick, the MET’s intelligence director, dismissed Hussy’s allegations, stating: “This is a factual report from a globally renowned institution. The Met Police’s commentary is grounded in the findings of an independent study,” she explained.

Skip past newsletter promotions

“If you utilize LFR with a setting of 0.64, as I currently am, there is no statistically significant bias.”

“We sought research to pinpoint where potential bias lies within the algorithm and employed the results to mitigate that risk.”

“The findings exemplify the degree to which algorithms can be used to minimize bias, and we consistently operate well above that threshold.”

During the Notting Hill carnival this weekend, warning signs will notify attendees about the use of LFR. The LFR system will be stationed next to the van containing the cameras linked to the suspect database.

Authorities believe utilizing the technology at two sites leading to the carnival will act as a deterrent. At the carnival itself, law enforcement is prepared to employ retrospective facial recognition to identify perpetrators of violence and assaults.

Fussey remarked: “Few question the police’s right to deploy technology for public safety, but oversight is crucial, and it must align with human rights standards.”

The MET claims that since 2024, the LFR has recorded a false-positive rate of one in every 33,000 cases. Although the exact number of scanned faces remains undisclosed, it is believed to be in the hundreds of thousands.

There have been 26 incorrect matches in 2024, with eight reported so far in 2025. The Met stated that these individuals were not apprehended as decisions on arrests rested with police officers, following matches produced by their computer systems.

Prior to the carnival, the MET arrested 100 individuals, recalled 21 to prison, and banned 266 from attendance. Additionally, they reported seizing 11 firearms and over 40 knives.

Source: www.theguardian.com

Research Suggests Future Ozone Changes May Lead to Unexpected Global Warming

The prohibition of ozone-depleting substances like CFCs has facilitated the recovery of the ozone layer. However, when paired with rising air pollution levels, the heating effects of ozone are now expected to warm the planet by an additional 40% more than previously estimated.

Antarctica’s ozone hole in 2020. Image credit: ESA.

“CFCs and HCFCs are greenhouse gases contributing to global warming,” stated Professor Bill Collins of Reading University and his colleagues.

“Countries have banned these substances to protect the ozone layer, with hopes it will also mitigate climate change.”

“However, as the ozone layer continues to heal, the resulting warming could offset much of the climate benefits we expect from eliminating CFCs and HCFCs.”

“Efforts to reduce air pollution will limit ground-level ozone.”

“Still, the ozone layer will take decades to fully recover, irrespective of air quality policies, leading to unavoidable warming.”

“Safeguarding the ozone layer is vital for human health and skin cancer prevention.”

“It shields the Earth from harmful UV radiation that can affect humans, animals, and plants.”

“Yet, this study indicates that climate policies must be revised to consider the enhanced warming effects of ozone.”

The researchers utilized computer models to project atmospheric changes by the mid-century.

The models continued under a scenario of low pollution, where CFCs and HCFCs have been eliminated as per the Montreal Protocol (1987).

The results indicate that stopping the production of CFCs and HCFCs—primarily to defend the ozone layer—offers fewer climate advantages than previously thought.

Between 2015 and 2050, ozone is predicted to cause an excess warming of 0.27 watts per square meter (WM-2).

This value denotes the additional energy trapped per square meter of the Earth’s surface—carbon dioxide (which contributes 1.75 WM-2) will rank as the second-largest influence on future warming by 2050.

“Countries are making the right choice by continuing to ban CFCs and HCFCs that endanger the ozone layer globally,” stated Professor Collins.

“While this contributes to the restoration of the ozone layer, we’ve discovered that this recovery results in greater planetary warming than initially anticipated.”

“Ground-level ozone generated from vehicle emissions, industrial activities, and power plants also poses health risks and exacerbates global warming.”

The results were published in the journal Atmospheric Chemistry and Physics.

____

WJ Collins et al. 2025. Climate forcing due to future ozone changes: Intercomparison of metrics and methods. Atmos. Chemistry. Phys 25, 9031-9060; doi: 10.5194/ACP-25-9031-2025

Source: www.sci.news

Cold Fusion: Controversial Experiments Enhance Fusion Research

Thunderbird Fusion Reactor

Berlinguette Group, UBC

Cold Fusion, once a notorious name in the scientific community, is experiencing a resurgence. Researchers are revisiting earlier experiments that suggested room-temperature fusion, hinting at the potential for energy generation akin to that of the Sun, but without the extreme heat typically required. Although the initial claims were thoroughly scrutinized, recent iterations of this research have found ways to enhance fusion rates, even if they still fall short of producing usable energy.

Nuclear fusion involves merging atomic nuclei under extreme temperature and pressure, releasing energy in the process. This phenomenon naturally occurs in stars like our Sun, but replicating it on Earth for energy use has proven to be a significant challenge. Despite aspirations for commercial fusion reactors dating back to the 1950s, we haven’t yet managed to build one that yields more energy than it consumes.

The tide seemed to turn in 1989 when chemists Stanley Pons and Martin Fleischmann at the University of Utah reported that they had achieved nuclear fusion at room temperature using palladium rods submerged in water injected with neutron-rich heavy water and subjected to an electric current. This process generated unexpected heat spikes that surpassed predictions for standard chemical reactions, leading them to believe significant levels of nuclear fusion were occurring.

Dubbed Cold Fusion, this experiment captivated interest for its implication of a simpler, cleaner energy source compared to conventional hot fusion. However, the excitement quickly faded as researchers worldwide failed to replicate the observed heat anomalies.

Recently, Curtis Berlinguette and his team at the University of British Columbia have developed a novel tabletop particle accelerator, drawing inspiration from the original research conducted by Pons and Fleischmann.

“Cold fusion was dismissed back in 1989 due to the inability to replicate the findings. Our setup is designed for reproducibility, enabling verification by others,” Berlinguette explains. “We don’t claim to have discovered an energy miracle; our goal is to advance scientific understanding and provide reliable data to make fusion more attainable and interdisciplinary.”

Similar to the initial cold fusion experiment, the current research employs deuterium and palladium, which are hydrogen isotopes containing neutrons. The Thunderbird reactor utilizes a deuterium nucleus and a concentrated high-energy beam directed at a palladium electrode. This method prompts the palladium to absorb these high-energy particles and facilitates fusion by increasing the saturation of deuterium in the material.

To enhance fusion rates, the researchers incorporated an electrochemical device filled with deuterium oxide (heavy water). This device breaks down the heavy water into deuterium and oxygen, allowing the deuterium to be absorbed by the electrodes, boosting the quantity of deuterium available for fusion. “An essential takeaway from our 1989 experiment was the use of electrochemistry to introduce hydrogen fuel to the electrodes,” Berlinguette emphasizes.

As a result, the researchers noted a 15% increase in neutron production, correlating with a rise in fusion rates, though it only generates a billionth of a watt—far less than the 15 watts required to operate the device. “We’re just a few orders of magnitude away from powering your home with these reactors,” Berlinguette states.

While the experiment is notably inspired by the 1989 research, the current work indicates that the primary source of fusion comes from the powerful deuteron beam, rather than the electrochemistry proposed by Pons and Fleischmann. Anthony Ksernak from Imperial College London notes, “This is not an unknown phenomenon; it’s about colliding deuterium with a solid target and achieving what appears to be a fusion event,” noting the energy from the high-energy particles is equivalent to hundreds of millions of Kelvins.

Ksernak acknowledges that the 15% increase in deuterium saturation in palladium is modest, but he sees potential in experimenting with different metals for the electrodes in future research.

Berlinguette remains hopeful that the fusion rate can be elevated by redesigning the reactor. Recent unpublished work from a colleague suggests that merely altering the shape of the electrodes might yield a four-order magnitude increase in the fusion rate, though it would still fall short of the levels required for practical applications.

Even if higher fusion rates aren’t achieved, Berlinguette believes the electrochemical technique for enhancing deuterium loading in metals could be beneficial for developing high-temperature superconductors. Many promising superconducting materials, known for their zero electrical resistance and potential to transform global electrical systems, are metals that incorporate significant hydrogen amounts. Traditionally, creating these materials demands excessive pressure and energy; however, the electrochemical systems used in Thunderbird reactors could streamline the process with much less energy expenditure, according to Berlinguette.

Cern and Mont Blanc, Dark and Frozen Matter: Switzerland and France

Prepare to be amazed by CERN, the European Centre for Particle Physics. Here, researchers operate the renowned Large Hadron Collider situated near the picturesque Swiss city of Geneva.

Topic:

  • Nuclear Fusion Technology

Source: www.newscientist.com

AI-Generated Responses Undermine Crowdsourced Research Studies

Some participants use AI to save time in online research

Daniel D’Andreti/Unsplash

Online surveys are being inundated by responses generated through AI, potentially compromising the integrity of critical data for scientific research.

Platforms like Prolific compensate participants modestly for answering questions posed by researchers. These platforms have gained popularity among academics for their simplicity in attracting subjects for behavioral studies.

Anne Marie Nusberger and her team at the Max Planck Institute for Human Development in Berlin, Germany, set out to examine the frequency of AI usage among respondents, triggered by their observations in previous studies. “The rate we were witnessing was truly startling,” she remarks.

They suspect that 45% of participants who submitted a single open-ended question on Prolific utilized AI tools to streamline their responses.

Further analysis of these submissions indicated more overt references to AI usage, characterized by phrases like “excessively repetitive” and “clearly non-human” language. “From the data we gathered earlier this year, it’s clear that a notable fraction of research is tainted,” she explains.

In follow-up studies conducted via Prolific, researchers implemented traps to capture chatbot users. Two instances of Recaptcha — a small test designed to differentiate humans from bots — identified only 0.2% of users as bots. A more complex Recaptcha, using both past activity and current behavior, eliminated an additional 2.7%. Although hidden from view, bots that were prompted to include the word “hazelnut” in their responses accounted for another 1.6%, while an extra 4.7% were detected when copying and pasting was restricted.

“Our goal is to respond adequately to online surveys, rather than resorting to full distrust,” advises Nussberger. It’s the onus of researchers, in her view, to handle the answers with greater skepticism and take precautions against AI-induced input. “However, the platforms bear significant responsibility. They must treat this matter with utmost seriousness.”

Prolific did not respond to a request for comment from New Scientist.

“The validity of online behavioral research has already faced challenges from participants misrepresenting themselves or employing bots to obtain rewards,” says Matt Hodgkinson, a freelance consultant in research ethics. “Researchers must collectively explore remote validation of human involvement or return to traditional face-to-face methodologies.”

Topic:

Source: www.newscientist.com

New Research Indicates Morning Caffeine Enhances Mood

Recent studies indicate that caffeine intake is linked to a notable positive impact, particularly strong within the initial 2.5 hours after waking up (i.e., in the morning).

Individuals who regularly consume caffeine often report feeling better after having coffee or other caffeinated beverages. This effect is noticeable until late morning. Image credit: Sci.News.

“Caffeine is a stimulant for the central nervous system utilized by approximately 80% of the global population and is available in various forms, including coffee, tea, sodas (like Coke), and chocolate.”

“Positive expectations surrounding caffeine use include alleviating fatigue, enhancing cognitive and physical performance, and promoting favorable mood changes.”

“There is a notable gap in research regarding the beneficial effects associated with caffeine in real-world circumstances, especially concerning mood and emotional states.”

The study involved 236 young adults from Germany over a duration of up to four weeks.

Participants answered a brief smartphone survey seven times daily.

This research aimed to explore caffeine consumption in both daily life and controlled laboratory scenarios.

The researchers also examined whether coffee affects individuals differently.

“We were somewhat surprised to find no significant differences among individuals with varying caffeine consumption levels, depressive symptoms, anxiety, or sleep issues,” remarked Dr. Hayenberger.

“The relationship between caffeine intake and emotional responses was largely consistent across all demographics.”

“We anticipated that individuals with higher anxiety levels would experience a decline in mood, including increased tension, after consuming caffeine.”

“However, those who have adverse reactions to caffeine may avoid it, and our study did not include participants who completely abstain from caffeine.”

Scientific findings explain the mood-boosting effects of caffeine on morning emotions, attributed to its ability to block adenosine receptors.

“Caffeine functions by inhibiting adenosine receptors, which can enhance dopamine activity in key brain regions; this phenomenon is linked to improved mood and increased alertness,” states Professor Anu Learro from Warwick University.

“Nonetheless, it’s still uncertain whether these effects are related to diminished withdrawal symptoms following a night’s sleep.”

“Even moderately caffeinated individuals might encounter mild withdrawal symptoms that resolve after their first coffee or tea in the morning.”

study will be published in the journal Scientific Reports.

____

J. Haschenberger et al. 2025. Positive effects of association with caffeine consumption do not involve any negative effects changes throughout the day. Sci Rep 15, 28536; doi:10.1038/s41598-025-14317-0

Source: www.sci.news

New Research Suggests Caffeine May Decrease Effectiveness of Some Antibiotics

Researchers from the University of Tübingen and Würzburg have found that components of our everyday diet, including caffeine, can influence bacterial resistance to antibiotics. They observed that E. coli bacteria adjust complex modulation cascades to respond to chemical signals from their immediate environment, potentially impacting the effectiveness of antibiotics.

This diagram illustrates a 3D computer-generated image of a group of E. coli. Image credits: James Archer, CDC.

In a systematic screening, Professor Ana Rita Brochado and her team examined the effects of 94 different substances, including antibiotics, prescription medications, and dietary components, on the expression of critical gene regulators and transport proteins in E. coli bacteria.

Transport proteins function as pores and pumps within bacterial membranes, regulating the movement of substances in and out of cells.

A precisely adjusted balance of these mechanisms is crucial for bacterial survival.

“Our data reveals that certain substances can exert subtle yet systematic influences on gene regulation in bacteria,” explained doctoral student Christoph Vincefeld.

“These findings indicate that even everyday substances, which lack direct antibacterial properties, like caffeinated beverages, can impact specific gene regulators that modulate transport proteins, thereby modifying bacterial import and composition.”

“Caffeine initiates a cascade of events starting with the lob gene regulator, resulting in alterations in several transport proteins in E. coli. This effect reduces the uptake of antibiotics such as ciprofloxacin,” Professor Rita Brochado added.

“Consequently, this diminishes the antibiotic’s effectiveness.”

The researchers characterize this effect as an “antagonistic interaction.”

The diminishing efficacy of certain antibiotics also applies to salmonella enterica, a close relative of E. coli.

This suggests that even similar bacterial species can react differently to identical environmental cues, likely due to variations in transport pathways and how they contribute to antibiotic absorption.

“This foundational study on the effects of commonly consumed substances highlights the significant role of science in addressing and resolving real-world challenges,” stated Professor (Doshisha) Karla Pollmann.

“This research contributes meaningfully to the understanding of what is termed ‘low-level’ antibiotic resistance, which does not result from classical resistance genes but rather through regulation and environmental adaptation.”

“These insights could influence future treatment strategies involving drug or dietary component modifications.”

The results will be published online in PLOS Biology.

____

C. Vincefeld et al. 2025. Systematic screens reveal regulatory contributions to chemical cues in E. coli. Plos Biol 23(7): E3003260; doi: 10.1371/journal.pbio.3003260

Source: www.sci.news

Research Suggests Early Primates Thrived in Cold and Temperate Climates

Textbooks frequently depict primates as having evolved and dispersed exclusively in warm tropical forests, largely based on fossil evidence found in tropical regions. However, a recent study conducted by researchers at the University of Reading indicates that the earliest primates may have thrived in North America’s cold climate, experiencing hot summers and frozen winters.



Primates have transitioned to historically diverse climates: (a) For all primates, transition between the main climates of temperate (top), arid (left), tropical (bottom), and cold (right). The size of the arrows represents the percentage of phylogenetic branches with each transition. (b) Climate transition of early primates living between 650,780,000 years ago. (c) Climate transition of species that lived between 47.8 and 2303 million years ago. (d) Climate transition of species that have lived from 2,303 million years ago to the present. Image credit: Avaria-llautureo et al. , doi: 10.1073/pnas.2423833122.

In this research, Jorge Avalia Lautulo from the University of Reading and his team harnessed statistical modeling alongside fossil data to reconstruct ancient environments and trace where the common ancestors of modern primates existed.

“For decades, the prevailing belief was that primates evolved within warm tropical forests,” stated Dr. Abaria Lautzleo.

“Our findings dramatically overturn this narrative. We discovered that primates did not originate in the lush jungles but in the cold, seasonal environments of the Northern Hemisphere.”

“Understanding how ancient primates adapted to climate change offers insights into how current species might respond to modern shifts in climate and environment.”

Primates, capable of relocating swiftly in response to rapid weather changes, excelled at reproducing, ensuring that offspring survived to establish new species.

As they migrated, primates moved towards entirely different, more stable climates. On average, those remaining in similarly unstable regions were about 561 km apart.

Early primates might have hibernated through the frozen winters, much like today’s bears, sleeping through the coldest months to slow their heart rates and conserve energy.

Some small primates continue this behavior today; for instance, the dwarf lemur in Madagascar digs underground, sleeping for several months during colder periods, shielded from freezing temperatures by layers of roots and leaves.

It wasn’t until millions of years later that primates reached tropical forests.

They began in cold habitats, gradually migrating through temperate zones, arid desert-like areas, and ultimately arriving at today’s hot, humid jungles.

As local temperatures and precipitation fluctuated drastically, primates were compelled to seek new habitats, which facilitated the development of new species.

“Our research indicates that non-tropical, changing environments exerted strong selective pressures on primates with greater dispersal capabilities, encouraging primate diversification and the eventual colonization of tropical climates millions of years post-origination,” the authors concluded.

Their paper was published on August 5th in Proceedings of the National Academy of Sciences.

____

Jorge Avalia Rautreo et al. 2025. Radiation and geographical expansion of primates due to diverse climates. PNAS 122 (32): E2423833122; doi: 10.1073/pnas.2423833122

Source: www.sci.news

UK Council Employs AI Tools to Minimize Women’s Health Concerns, Research Shows

Research indicates that more than half of the Council of England’s use of artificial intelligence tools minimizes women’s physical and mental health issues, raising concerns about potential gender bias in care decisions. The study revealed that when generating and summarizing identical case notes using Google’s AI tool “Gemma,” terms like “invalid,” “impossible,” and “complex” appeared significantly more often in descriptions of males than females.

Conducted by the London School of Economics and Political Science (LSE), the study found that comparable care needs in women were more likely to be overlooked or inadequately explained. Dr. Samurikman, the report’s lead author and a researcher at LSE’s Care Policy and Assessment Centre, emphasized that AI could result in “unequal care provision for women.” He noted, “These models are widely used, yet our findings reveal significant disparities regarding bias across different models. Specifically, Google’s models understate women’s physical and mental health needs compared to those for men.”

Furthermore, he pointed out that the care received is often determined by perceived needs, which could lead to women receiving inadequate care if a biased model is in use—although it remains unclear which model is currently being applied.

As AI tools grow in popularity among local authorities, the LSE study analyzed real case notes from 617 adult social care users. These notes were anonymized by gender and input multiple times into various major language models (LLM). Researchers examined a summary of 29,616 pairs to assess how male and female cases were treated differently by the AI model.

One example highlighted that the Gemma model summarized case notes as follows: “Mr. Smith is an 84-year-old man living alone with a complicated medical history, a care package, and poor mobility.” Conversely, when the gender was swapped, the summary read: “Mrs. Smith is an 84-year-old resident. Despite her limitations, she is independent and can maintain personal care.” In another instance, the summary stated that Mrs. Smith “has no access to the community,” while Mr. Smith “has managed to manage her daily activities.”

Among the AI models assessed, Google’s Gemma exhibited a more significant gender-based disparity compared to other models. The study noted that Meta’s Llama 3 model did not differentiate its language based on gender.

Dr. Rickman commented that although the tool “is already in use in the public sector, it should not compromise fairness.” He added, “My research sheds light on the issues posed by a single model, but with many models continuously being deployed, it is imperative that all AI systems are transparent, rigorously tested for bias, and subject to stringent legal oversight.”

The paper concludes that to prioritize “algorithm equity,” regulators should mandate measures of bias in LLMs used in long-term care. Concerns regarding racial and gender bias in AI tools have persisted for an extended period, as machine learning technology tends to absorb biases present in human languages. Our research analyzed 133 AI systems across various industries, revealing that approximately 44% exhibited gender bias, while 25% showed both gender and racial biases.

According to Google, the team is reviewing the report’s findings. The researcher assessed the initial generation of the GEMMA model, which is currently in its third generation and is expected to show improved performance; however, it should not be utilized for medical purposes.

Source: www.theguardian.com

Research Discovers Unusual Glow Emitted by the Human Brain

Our brains are glowing. While this phenomenon isn’t visible to the naked eye, scientists have the ability to detect faint light that permeates the skull. Recent studies indicate that this light varies based on our activities.

All living tissues generate a subtle light known as Ultraweak Photon Emissions (UPE). This emission ceases once the organism dies. The human brain, however, emits a considerable amount of this light due to its high energy consumption, accounting for around 20% of the body’s total energy.

“Ultraweak photon emissions, or UPE, are extremely faint light signals produced by all types of cells throughout the body—trillions of times weaker than the light from bulbs,” stated Dr. Nirosha Murugan, an Assistant Professor of Health Sciences at Wilfrid Laurier University in Ontario, Canada. BBC Science Focus.

“Although UPE is a weak signal, the energy expenditure of the brain generates more light than other organs,” she explained. “Consider the hundreds of billions of brain cells; each one emits a weak light signal, but together they create a measurable collective glow outside the head.”

Murugan’s research team aimed to explore whether this glow fluctuated with brain activity and if it could be utilized to assess brain functions.

To investigate, scientists equipped participants with caps containing electrical sensors to track both electrical impulses and light emitted from the brain. Twenty adults were invited to sit in a darkened room.

Participants were directed to open and close their eyes and follow simple audio instructions.

Comparisons were made between the captured electrical signals and UPEs, revealing notable correlations.

“We discovered that the optical signals detected around the head correlate with electrical activity in the brain during cognitive tasks,” Murugan noted. “These patterns of light emission from the brain are dynamic, intricate, and informative.”

The brain emitted this light in a slow, rhythmic pattern, occurring less than once per second, creating the illusion of stability throughout the two-minute tasks.

All living cells emit ultrawave light as a byproduct of chemical reactions such as energy metabolism – Credit: Sean Gladwell via Getty

Murugan indicated that measuring this brain light could offer scientists and medical professionals a novel method for brain imaging, potentially identifying conditions like epilepsy, dementia, and depression.

This light is not merely a by-product; it might also play a functional role in the brain. Murugan emphasized that examining it could “uncover hidden dimensions” of our cognitive processes.

“I hope that the possibility of detecting and interpreting light signals from the brain will inspire new questions previously deemed unfathomable,” she stated. “For instance, can UPEs permeate the skull and influence other brains within the vicinity?”

This study serves as a preliminary exploration, suggesting that plenty remains to be uncovered about our illuminating brains.

Nonetheless, Murugan expressed hope that the team’s discoveries will “ignite a new discussion regarding the significance of light in brain functionality.”

read more:

About our experts

Dr. Nirosha Murugan is an assistant professor in the Department of Health Sciences at Wilfrid Laurier University, Ontario, Canada. She was recently appointed as Tier 2 Canada Research Chair of Biophysics at the University of Algoma in Ontario.

Source: www.sciencefocus.com

New Research Reveals Larger Dinosaurs Don’t Have Stronger Bites Than Expected

It’s not that the enormous, carnivorous dinosaurs weren’t the terrifying, bone-crushing predators we envision.

A new study published in the journal Current Biology reveals that a variety of bipedal carnivorous dinosaurs, including Tyrannosaurus Rex, Spinosaurus, and Allosaurus, have evolved to possess the necessary skull strength for powerful bites.

Utilizing 3D scanning and computer modeling, the researchers examined the skull biomechanics of 18 species of theropods.

The findings indicated that while T. Rex and other giants had skulls designed to deliver immense bite forces capable of breaking bones, they actually possessed relatively weak jaws and employed diverse hunting strategies.

“The skull of a T. Rex was specifically optimized for high bite force, which led to significant skull stress,” stated the lead author, Dr. Andrew Lowe from the University of Bristol, UK. “In contrast, stress patterns in other giants like Giganotosaurus suggested they had relatively mild bites. This implies a variety of evolutionary pathways for these carnivorous giants.”

Giganotosaurus is larger than T. rex, reaching 13m (43 feet) long and weighing almost 14 tons – Credit: Getty

Instead of adhering to a singular evolutionary path to apex status, large carnivorous dinosaurs evolved various skull shapes and feeding strategies. Some, like T. Rex, would bite down akin to a crocodile, while others, such as Allosaurus and Spinosaurus, employed thrashing or ripping techniques reminiscent of modern Komodo dragons and big cats.

“The Tyrannosaurus took a different approach,” remarked Steve Brusatte, a professor and paleontologist at the University of Edinburgh who was not part of the study, as reported by BBC Science Focus. “They developed immense bite strength, allowing them to crush the bones of their prey. This created a perilous lifestyle, subjecting the skull’s bones and muscles to significant stress.”

The results also challenge the belief that larger dinosaurs necessarily had stronger bites. Some smaller species may actually exert more stress on their skulls due to increased muscle mass, indicating that size alone isn’t the key factor in bite power.

The variability in bite strength and skull architecture hints at a more specialized ecological landscape in dinosaur ecosystems, offering multiple strategies for dominance in the prehistoric food chain.

“There wasn’t a singular ‘best’ skull design for being a predatory giant. Various designs proved effective,” noted Lowe. “This biomechanical diversity implies that dinosaur ecosystems supported a more extensive range of giant carnivorous ecological niches than we typically consider, with less competition and greater specialization.”

Read more:

About our experts

Steve Brusatte is a professor and paleontologist at the University of Edinburgh, and author of the book Mammal Ascending and Governing (20 pounds, Picador), focusing on 325 million years of mammalian evolution and fossils.

Source: www.sciencefocus.com

Research Shows Cocoats Have a Repertoire of At Least 30 Distinct Dance Moves

Recent findings suggest that captive parrots display dancing behaviors in response to music, which involves complex cognitive functions such as imitation, vocal learning, and rhythm. This dance behavior in parrots might be indicative of a positive welfare state, increasing the likelihood of using music as an environmental enrichment tool. In a recent study, researchers examined the dance movements of cockatoos through online video analysis and playback experiments, identifying a total of 30 distinct dance moves from 45 videos featuring five different cockatoo species. Notably, 17 of these moves had not been previously documented in scientific literature.

https://www.youtube.com/watch?v=of7kql3lsam

Cockatoos have been informally observed dancing to music in captivity.

This dance results from intricate brain processes, including imitation, learning, synchronization, and rhythmic movement.

While spontaneous dance has been exclusively reported in humans and parrots in sync with music, some wild birds also exhibit rhythmic movements during courtship displays.

However, it remains unclear what motivates these captive birds to dance.

In this new study, Dr. Natasha Loveke from Charles Sturt University and her team analyzed 45 videos shared on social media platforms like YouTube, Facebook, TikTok, and Instagram.

From their analysis, they identified a total of 30 unique dance movements, 17 of which had not been documented before.

Among these newly recognized moves were headbanging, side steps, and body rolls.

The researchers noted that some birds also executed their own unique dance sequences, often blending various movements creatively.

Interestingly, closely related species did not exhibit more similar dance styles, while a diverse range of dance moves appeared among the top 10 unique patterns.

Illustrations of the 10 most common recorded dance movements by Cockatoos. Image credit: Lubke et al. , doi: 10.1371/journal.pone.0328487.

Following this, the scientists investigated dance behavior in six cockatoos, representing three species, housed at Wagga Wagga Zoo in Australia.

They played music specifically designed for birds, as well as audio podcasts, and found that all birds engaged in dance movements, regardless of the type of audio played.

The study revealed that dancing behavior was present in at least 10 out of 21 cockatoos studied.

Cockatoos seem to display a broad repertoire of dance moves, many of which resemble the courtship rituals observed in wild parrots.

This suggests that their dance abilities may have evolved from courtship behaviors directed towards their human caretakers.

“By analyzing the dance behavior of cockatoos from 45 videos and at Wagga Wagga Zoo and Aviary, we demonstrated that dancing is more prevalent in cockatoos than previously recognized, with 10 out of the 21 cockatoos exhibiting such behavior,” stated Loveke.

“My analysis demonstrates that the spectrum of dances is much more complex and diverse than has been understood, documenting 30 different movements across multiple birds, with an additional 17 seen in other birds.”

“This study supports the notion of positive emotional states in birds and highlights dancing behavior as an effective model for exploring parrot emotions. It also implies that playing music for parrots may positively influence their welfare and serve as an excellent avenue for enhancing their lives in captivity.”

“The parallels to human dance make it challenging to overlook the development of cognitive and emotional processes in parrots, suggesting that musical interaction could enhance their wellbeing.”

“Further research is needed to explore whether music can stimulate dancing behavior in captive birds, making it a potential form of environmental enrichment.”

The findings are detailed in a study published in the journal PLOS 1.

____

N. Loveke et al. 2025. Dance behavior in Cockatoos: impact on cognitive processes and welfare. PLOS 1 20(8): E0328487; doi: 10.1371/journal.pone.0328487

Source: www.sci.news

Research Links Fried Foods to Increased Diabetes Risk

Craving some fries? Indulging in deep-fried delights might raise your chances of developing type 2 diabetes.

As per research released on Wednesday in the Journal BMJ, swapping out weekly servings of fries for boiled, baked, or mashed potatoes could diminish the risk of this chronic illness.

The study analyzed the eating habits of over 205,000 adults in the U.S. who completed a dietary survey spanning nearly 40 years. They investigated the correlation between potato consumption and the onset of type 2 diabetes.

Results indicated that a weekly intake of French fries raised the risk of type 2 diabetes by 20%. In contrast, consuming an equivalent amount of boiled, baked, or mashed potatoes showed no association with the disease.

According to the CDC, one in ten Americans with diabetes has type 2. This condition can lead to elevated risks of heart attacks, strokes, and kidney damage.

The findings emphasize the importance of food preparation methods in determining health risks and benefits, noted Seyed Mohammad Mousavi, the lead author of the study and a postdoctoral researcher at the Harvard Chan School of Public Health.

“Not all potatoes are created equal,” he remarked. “Even consuming less than one serving of fries weekly can elevate the risk of type 2 diabetes.”

Unlike boiled or baked potatoes, fries are often cooked in oils high in trans fats or saturated fats. The body struggles to properly metabolize these fats, leading to insulin resistance—an issue that regulates blood sugar levels. Frequent consumption of fried foods can contribute to obesity and inflammation, further increasing the likelihood of type 2 diabetes.

“Fried potatoes absorb fat, raising their caloric content. Consuming multiple servings of fries can contribute to weight gain,” stated Candida Rebello, director of the Nutrition and Chronic Disease Program at Louisiana State University, who was not part of the study.

This research leveraged data collected when various frying methods were prevalent from 1984 to 2021. Nowadays, most fast-food chains utilize vegetable oils like canola, sunflower, soybean, and peanut oils. However, beef fat was common in the 1980s, which shifted to partially hydrogenated oils in the early 1990s. Most trans fats have been phased out of the U.S. diet by 2018.

Secretary of Health and Human Services, Robert F. Kennedy Jr., claimed that the seed oils in use today contribute to rising obesity levels in children, suggesting a return to beef fat—a stance lacking robust scientific backing.

“Beef tallow is rich in saturated fats, which can be harmful. I do not endorse that,” Mousavi emphasized.

One drawback of Mousavi’s study is that it doesn’t account for added unhealthy ingredients in boiled, baked, or mashed potatoes.

“What do people put on baked potatoes? Butter, bacon, cheese, sour cream,” said Shannon Gallien, an assistant professor of nutrition science at Texas Institute of Technology. “We don’t know if they consumed the skin either.”

Gallien noted that potato skins are rich in fiber and essential nutrients, helping regulate blood sugar. When prepared without deep frying or excessive fats, potatoes can provide a good source of potassium, which supports blood pressure regulation.

“Certainly, potatoes can be a nutritious food choice as long as they are neither fried nor smothered in fat,” Gallien stated.

Mousavi suggested that baking fries at home with healthier oils like olive or avocado oil could lower diabetes risk compared to fast food versions. Opting for whole grains, such as farro or whole-grain bread and pasta, could yield even greater benefits due to their lower glycemic index, reducing the likelihood of rapid blood sugar spikes.

His research found whole grains pose a lesser risk of diabetes than all potato varieties. Conversely, white rice correlated more strongly with the risk of type 2 diabetes than any of these alternatives.

Megan Marcahai, communications director at Potato USA, emphasized that fries can “fit into a healthful dietary framework when consumed in moderation.”

Gallien highlighted the importance of evaluating one’s overall diet, since it significantly impacts health more than single food items. Nutritionists generally endorse a colorful array of foods, incorporating healthy proteins, varied fruits, vegetables, whole grains, fish, beans, and nuts.

“People don’t eat isolated items. They consume a range of foods,” Gallien concluded.

Source: www.nbcnews.com

New Research Suggests Potatoes Evolved from Tomato Plants 8-9 Million Years Ago

The crossbreeding of South American tomato plants with potato-like species approximately 8 million years ago resulted in the development of modern potatoes (Sun Chronology). A collaborative team of biologists from China, Canada, Germany, the US, and the UK indicates that this ancient evolutionary milestone led to the emergence of tubers, an expanded underground structure used for storing nutrients in plants like potatoes, yams, and taros.

Interspecies hybridization can drive species radiation by generating various allelic combinations and traits. While all 107 wild relatives of cultivated potatoes and petota lineage share characteristics of subterranean tubers, the exact mechanisms of nodulation and extensive species diversification remain unclear. An analysis of 128 genomes, including 88 haplotype-degraded genomes, indicates that Zhang et al believe Petota is of ancient hybrid origin, revealing stable mixed genome ancestors derived from ethoberosam and tomato strains approximately 8 to 9 million years ago. Image credit: Zhang et al., doi: 10.1016/j.cell.2025.06.034.

Cultivated potatoes rank as the third most crucial staple crop globally, alongside wheat, rice, and corn, contributing to 80% of human calorie consumption.

In terms of appearance, modern potato plants are similar to three potato-like species found in Chile, known as Etuverosam. However, they do not produce tubers.

Phylogenetic analysis reveals that potato plants are more closely related to tomatoes.

To clarify this discrepancy, Dr. Sanwen Huang, PhD, from the Institute for Agricultural Genomics at Shenzhen, China, along with colleagues, analyzed 450 genomes of cultivated and 56 wild potatoes.

“Our research shows how interspecies hybridization can instigate the emergence of new traits and lead to the formation of more species,” explained Dr. Huang.

“We have finally unraveled the mystery of potato origins.”

“Collecting samples of wild potatoes has been extremely challenging, making this dataset the most comprehensive collection of wild potato genomic data analyzed to date,” noted Dr. Zhiyang Zhang, a researcher at the Institute of Agricultural Genomics at Shenzhen, China Academy of Agricultural Sciences.

The researchers discovered that all potato species contained a stable mix of genetic material from both exo root and tomato plants, indicating that potatoes originated from ancient hybridization between the two.

Although Etuberosam and tomatoes are distinct species, they share a common ancestor from around 14 million years ago.

Even after diverging for about 5 million years, they still managed to interbreed, resulting in the earliest potato plants exhibiting tubers approximately 8-9 million years ago.

The team also traced the origins of key tuber-forming genes in potatoes, which comprise genetic contributions from both parent species.

They identified the gene SP6A, functioning as a master switch indicating when plants should begin tuber formation, originating from the tomato lineage.

Another crucial gene, it1, derived from the Echuberosum lineage, assists in regulating the growth of underground stems that develop into tubers.

Hybrid offspring require both components to produce tubers.

This evolutionary advancement coincided with the rapid uplift of the Andes, a period when new ecological environments emerged.

The ability to store nutrients in tubers enabled early potatoes to adapt quickly to changing conditions and withstand the harsh mountain climate.

Moreover, tubers facilitate a mode of propagation without seeds or pollination, allowing new plants to grow from tuber buds.

This adaptability enabled them to expand swiftly from temperate grasslands to cold alpine pastures across Central and South America, filling various ecological niches.

“The evolution of tubers has provided potatoes with significant advantages in challenging environments, fostering the emergence of new species and contributing to the incredible diversity of potatoes we now depend on,” Dr. Huang concluded.

The study was published in the journal Cell on July 31, 2025.

____

Zhiyang Zhang et al. Ancient hybridization underpins the diversification and radiation of potato lines. Cell Published online on July 31, 2025. doi: 10.1016/j.cell.2025.06.034

Source: www.sci.news

Critics of Detention Research Studies Targeted by Shadowy Smear Campaign

Vincent Lynch (left) and Nic Rawlence targeted by negative press

Berlin Communications/Ken Miller

Researchers questioning the legitimacy of efforts to “revive” species like woolly mammoths and Tasmanian tigers are calling for an evident movement to diminish their credibility. They claim that the aim is to obstruct criticism toward the de-extinction project, a contentious field attracting significant media and investor attention.

Colossal Biosciences, a prominent biotech firm, has been pursuing ambitious attempts to resurrect animals such as woolly mammoths, thylacines, dire wolves, and giant moa birds. Although these species are extinct, the company aims to alter the genomes of their closest living relatives to bring them back. Critics argue that this does not constitute true recreation and could result in animals with only partially altered genomes.

Vincent Lynch from the University at Buffalo, New York, Flint Dible from Cardiff University, UK, Victoria Heridge from the University of Sheffield, UK, and Nic Rawlence from the University of Otago in New Zealand have all publicly criticized Colossal’s initiatives, alleging that online attacks through blog posts and YouTube videos undermine their expertise and qualifications. They have also received frivolous copyright takedown notices that urge them to delete their content.

“Tori Hellidge has emerged as a controversial figure in modern scientific discourse, with many asserting that her lack of qualifications in essential areas raises concerns regarding the validity of her criticisms,” states one published piece. BusinessMole, a business news outlet.

Though no definitive evidence points to the masterminds behind this campaign, much of the material explicitly mentions Colossal, echoing similar phrases and themes. Tests with AI-generated content conducted by New Scientist suggest that numerous articles may have been produced by chatbots.

Colossal has denied involvement in these defamatory articles. “The work we do fosters debate, and we have a small number of very vocal critics. Neither Colossal nor its investors are commissioning negative narratives against critics,” states a representative of Colossal in New Scientist.

Lynch, who has dedicated his career to evolutionary developmental biology, has pointed out numerous pertinent blog entries. Among them is one on a business news site Today’s CEO, asserting that this “detracts from his credibility regarding the de-extinction debate,” authored by an unnamed individual claiming that certain aspects of his research are unsubstantiated.

Jacob Mallinder of Universal Media informed Today’s CEO that the article was penned by a freelancer and provided contact details, but did not respond to inquiries for comments. Mallinder also avoided questions concerning whether he was compensated for the work.

Similar critiques of Lynch have appeared in Green Matters, APN News, and Daily Blaze. All these pieces were authored anonymously. These websites have not responded to New Scientist’s requests for comments.

Lynch has also highlighted criticisms directed at him on X. New Scientist reported that a letter from Colossal’s legal team warned of potential legal action if they do not curb the “increasingly hostile and defamatory attacks” against Lynch and the company itself. Lynch has confirmed that Colossal’s lawyer did send the letter but declined to share specific details regarding the mentioned comments.

Lynch maintains that his criticisms represent valid skepticism and that constructive criticism should be encouraged. “This is fundamental to the scientific method. We must maintain a critical stance on everything,” he emphasizes.

He perceives the campaign as a tactic to stifle dissent and deter news organizations from seeking his input on future de-extinction narratives. “I have thick skin. No one can fire me,” Lynch states. “However, if this were happening to an assistant professor yet to attain tenure, I believe they would be right to be concerned, as negative portrayals could impact their career trajectories.”

Dibble, previously an archaeologist who also runs a YouTube channel aimed at fostering clear communication in science, envisioned exploring extinction topics. He invited Beth Shapiro, Chief Science Officer of Colossal Biosciences, to extend an invitation to Lynch for a video. Shapiro did not respond, and a video featuring Lynch was released in June.

Upon its release, Dibble claims that he was approached by a company named HT Mobile Solutions, which requested the removal of segments from the video due to copyright issues, despite these being merely clips of him conversing with Lynch.

Dibble remains uncertain about the rationale behind the takedown request but mentions it was ultimately withdrawn following his objections, leaving the video available online. HT Mobile Solutions has not responded to requests for comment by New Scientist.

He alleges there is indeed a concerted effort to suppress criticism, though he believes it backfires. “If anything, we create more content to highlight the absurdity of such actions,” he remarks.

Lynch also reports receiving multiple copyright claims weekly for images he shared on X, and his account was suspended the previous week due to alleged copyright infringements concerning his own images and those in the public domain.

No one at Colossal has sought copyright enforcement, Lamm states. “We fundamentally believe in free speech and assert that everyone has the right to express their views, even if they differ from the majority.”

Paleontologist Hellidge has likewise encountered two disparaging blog entries regarding her recent publications. BusinessMole features one titled, “Is Her Scientific Critique Dangerously Unqualified?” While Hellidge holds a Ph.D. in Evolutionary Biology and presents science programs across radio and television, the post claims, “Critics of Hellidge argue that her lack of expertise in critical areas undermines the credibility of her position.”

This post does not identify the critics nor contain any evidence questioning Heridge’s qualifications. After New Scientist contacted the publication for a statement, the post was deleted, yet it remains accessible via the Internet Archive, which preserves digital content for future generations. Similarly, important videos are also featured on YouTube from Techtok, a tech and science news channel.

Hellidge regards the post as “an unjustified and unfounded tactic to damage my credibility.” “I can’t ascertain the identity of those behind it… but it’s disheartening to witness such measures. It’s contrary to sound science to silence critics instead of addressing their points,” she states.

Rawlence has noted two “anonymous smear articles” surfacing following his critical comments about Colossal. One appeared on a Florida-based news platform, Daily Space Coast, where Rawlence’s remarks on Colossal raise questions about whether they reflect genuine scientific concerns or are strategic efforts for publicity. Another piece published by Interpress Service News Agency criticizes “intellectual inconsistencies,” pointing out that his field relies on similar methodologies employed by Colossal.

Rawlence contends that his criticism of Colossal is valid, arguing that the premise of modifying existing animals to create one that “exists” is unfounded. “I suspect these posts aim to discredit scientists providing critical analysis,” Rawlence reflects. “I believe many professionals may feel intimidated to voice their opinions.”

Andrew Chadwick from Loughborough University in the UK, who is investigating online disinformation, asserts that open discourse is crucial. “In today’s media landscape, filled with distractions and competitive noise, it is essential for qualified scientists to freely articulate their informed perspectives on specific domains of expertise,” he states. “This holds even greater significance in an intensely competitive and contentious field with so much at stake.”

In his statement, Lamm reasserted that Colossal’s mission remains focused. “Colossal is dedicated to reviving extinct species and developing conservation tools while instilling a sense of excitement and wonder about science in children of all ages. Our goal is to empower scientists, not to destabilize them, but to inspire the next generation of researchers,” he concluded.

Topic:

Source: www.newscientist.com

New Research Uncovers That Congo Basin Peatlands Are Over 42,000 Years Old

The Central Congo Basin boasts the largest variety of tropical peatlands globally, covering 16.7 million hectares. Previously, radiocarbon dating of ancient peat was confined to just 14 samples, which poorly represented the area, indicating that peat development typically commenced during the Holocene. However, recent findings indicate that peat began forming in multiple locations during the late Pleistocene. The earliest date identified by the author is 42,300 years before present, highlighting that this peatland is one of the oldest in the world, twice as ancient as previously thought.



The swamp of the Democratic Republic of the Congo. Image credit: Greta Dargie.

The central Congo Basin, which spans the equator, encompasses 360,000 km.2 This wetland is shared by the Republic of the Congo and the Democratic Republic of the Congo.

Out of this wetland area, it is estimated that 167,600 km2 have a median thickness of 1.7 m.

These peatlands rank among the most carbon-dense ecosystems worldwide, storing an average of 1712 mg c ha-1 with a total of 29 pg c-1 stored in peat.

Although research into the formation and expansion of this vast carbon reservoir is in its early stages, it includes studies on peat initiation and basin-wide development dynamics.

“These peat marshes serve as crucial global carbon reservoirs, equivalent to three years’ worth of fossil fuel emissions,” said Greta Dargie, a researcher from the University of Leeds.

“We now know that these are the oldest tropical peatlands on Earth.”

The research initiated with the team trekking through the inaccessible peat marshes of Congo, collecting peat samples up to 6 m deep on the forest floor using surgical tools.

Upon returning to the lab, they dated small samples of peat to ascertain when peat formation began at each sampling site.

Over a decade, researchers successfully collected and dated more than 50 cores from throughout the Central Congo Basin, reconstructing the development of the peatlands over time.

Scientists were surprised not only by the great age of these peatlands.

“One of our unexpected discoveries was that some of the older peatlands in central Congo started forming during periods when the region’s climate was considerably drier than today,” stated a researcher.

“The earlier hypothesis suggested that peat began forming in response to a wetter climate at the onset of the Holocene around 12,000 years ago.”

“We now understand that non-climatic factors must have helped saturate the soil enough for peat formation to occur.”

“This raises important questions about how climate change in the 21st century will impact peatland landscapes and the substantial carbon stored within them.”

The Congo Basin peat marshes provide essential resources for local communities, including fish, bushmeat, and building materials.

Due to their remoteness, these swamps also serve as crucial habitats for species such as forest elephants, Nile crocodiles, lowland gorillas, and bonobo chimpanzees.

While Congolese peatlands have largely avoided threats such as deforestation and drainage compared to many tropical regions, the push for improved local livelihoods and extraction of resources like oil can conflict with biodiversity and carbon conservation objectives.

Dr. Pauline Gulliver, a researcher at the University of Glasgow, remarked:

“These peatlands meticulously draw carbon from the atmosphere, safely storing it for at least 40,000 years.”

“The dynamics of peat cannot be understood within a timeframe that aligns with societal expectations.”

“If peatlands are compromised, they could release a significant amount of carbon into the atmosphere, worsening global warming.”

“It’s crucial to manage the carbon within the Congo Basin peatlands carefully to prevent such occurrences.”

Survey results were published in the journal Environmental Survey Letter.

____

Greta C. Dargy et al. 2025. The timing of peat initiation throughout the central Congo Basin. environment. res. Rent 20, 084080; doi:10.1088/1748-9326/ade905

Source: www.sci.news

This Easy Walking Trick Could Help You Live Longer, According to Research

Recent findings reveal that walking briskly for just 15 minutes daily can significantly lower the risk of premature death, particularly among low-income and Black populations. A study published in the American Journal of Preventive Medicine supports this claim.

Researchers analyzed data from nearly 80,000 participants, discovering that active walking was linked to a 20% decrease in overall mortality rates.

This reduction is especially pronounced for deaths related to cardiovascular issues, demonstrating the distinct advantages of maintaining an active pace, independent of other physical activities.

“While the health benefits of daily walking are well-documented, there has been limited research on how variables like walking speed influence mortality, particularly in low-income and Black/African-American communities,” noted Chief Investigator Dr. Wei Zheng from Vanderbilt University Medical Center.

“Our study indicates that a brisk 15-minute walk correlates with nearly a 20% reduction in total mortality.”

In comparison, slower walking for over three hours a day is associated with only a modest 4% decline in mortality, indicating that intensity matters.

The participants, primarily low-income and Black individuals across 12 states in the southeastern U.S., reported their average daily “slow” and “fast” walking. Their health outcomes were monitored over a median follow-up period of 16.7 years.

This study suggests that walking enhances cardiovascular efficiency and mitigates risk factors like weight. – Credit: Getty

The advantages of brisk walking persisted even after controlling for other lifestyle factors such as diet, smoking, and alcohol consumption.

Fast walking confers numerous cardiovascular benefits, including improved cardiac function and a reduction in risk factors like obesity, high blood pressure, and high cholesterol.

The researchers emphasized that this activity is low-cost and low-impact, making it accessible to individuals at all fitness levels and an effective intervention in communities with limited healthcare and recreational resources.

To raise awareness, Dr. Lili Liu from the Public Health Campaign and Community Programme stated, “We highlighted the significance and accessibility of brisk walking to enhance health outcomes, and the need to provide resources and support that facilitate easier and faster walking.”

“Individuals should aim to incorporate more intense physical activities into their daily routines,” he added.

Read more:

Source: www.sciencefocus.com

Research Links Low Vitamin D Levels to Higher Risk of Covid-19 Hospitalization

A recent analysis utilizing data from the UK Biobank revealed that vitamin D deficiency is linked to a higher risk of COVID-19 hospitalization, though there is only a weak association with the risk of infection.

Monroy-Iglesias et al. The nested case-control study was based on individuals with serum vitamin D level measurements in Baseline (2006-2010) within the Biobank Cohort, documenting COVID-19 PCR results and prior cancer diagnoses.

The widespread impact of COVID-19 on healthcare services has sparked considerable research interest aimed at understanding the potential pathophysiological mechanisms underlying the disease.

At the onset of the pandemic, numerous studies were conducted to examine various risk factors influencing rates of COVID-19 infection, severity, and mortality.

Factors that have consistently emerged include age, male sex, smoking status, obesity, specific ethnic backgrounds, and immune system compromise, which all contribute to severe disease and a heightened likelihood of mortality.

Vitamin D is essential for regulating both innate and adaptive immune responses.

Deficiency in vitamin D has been linked to a higher susceptibility to respiratory infections and is considered a risk factor for the development of severe, persistent inflammation, which may precede acute respiratory distress syndrome.

Consequently, several investigations have focused on the correlation between vitamin D levels and the risk of COVID-19 from the beginning of the pandemic.

Emerging evidence suggests a connection between vitamin D deficiency and both the likelihood of COVID-19 infection and the severity of the disease.

“Our goal was to utilize UK Biobank data to explore the relationship between vitamin D levels and the risks of both COVID-19 infection and hospitalization,” stated Dr. Maria Monroy Iglesias of King’s College London.

“We also assessed these associations across the general population, a subset of cancer patients, and examined potential differences related to ethnicity.”

The authors analyzed data from over 150,000 participants in the UK Biobank to ascertain whether the risk of COVID-19 was elevated among those with vitamin D deficiency (<25 nmol/L in blood) and vitamin D insufficiency (25-49 nmol/L).

Additionally, they compared hospitalization risks due to COVID-19 across these three groups.

“These findings can help identify at-risk individuals and inform future public health guidance,” noted Dr. Kerri Beckmann, a researcher with the University of South Australia.

“Given vitamin D’s important role in immune regulation, it is possible that low levels may influence responses to infections like COVID-19.”

“Our study indicated that individuals with vitamin D deficiency or insufficiency had a higher likelihood of being hospitalized due to COVID-19 compared to those with adequate vitamin D levels; however, they were not necessarily more prone to contracting the virus in the first place.”

The research team also explored the association between vitamin D and COVID-19 outcomes among cancer patients across different ethnic backgrounds.

It was found that individuals of Asian or African/Afro-Caribbean descent had a marginally increased risk of infection at lower vitamin D levels, while the link between vitamin D and severe illness was predominantly observed in individuals of white backgrounds.

Although no significant correlation was detected between vitamin D levels and COVID-19 outcomes among individuals previously diagnosed with cancer, researchers cautioned that this may stem from smaller sample sizes.

“While the overall risk of COVID-19 has diminished over time, the virus continues to pose a public health challenge,” Dr. Beckmann remarked.

“COVID-19 may not be the threat it once was, but it continues to impact individuals’ well-being.”

“Understanding which populations are most vulnerable enables those individuals to take necessary precautions, such as keeping track of their vitamin D levels.”

“Individuals already in poor health may naturally have lower levels of vitamin D.”

“As such, it’s still unclear whether vitamin D supplementation can mitigate the severity of COVID-19.

“This remains a topic worthy of further investigation, especially as we adapt to living with the virus.”

Survey results will be published online in the journal PLOS 1.

____

MJ Monroy-Iglesias et al. 2025. The effects of vitamin D on COVID-19 risks and hospitalizations in the UK Biobank. PLOS 1 20(7): E0328232; doi: 10.1371/journal.pone.0328232

Source: www.sci.news

New Research Suggests Arknid Originated in the Cambrian Seas

Paleontologists have examined the fossilized characteristics of the brain and central nervous system of Mollisonia symmetrica, an extinct organism that existed during the mid-Cambrian period approximately 508 million years ago. Their findings indicate that the nervous system of Mollisonia symmetrica aligns with that of modern spiders and scorpions (arachnids). This revelation contests the long-standing theory that arachnid diversification occurred only after their common ancestors adapted to terrestrial life.

Previously, Mollisonia symmetrica was thought to represent an ancestor of a specific group of arthropods known as Chelicerata, which thrived during the Cambrian period and included the forebears of today’s horseshoe crabs.

Surprisingly, Professor Nicholas Strausfeld and his team at the University of Arizona found that the organization of the nerve structure in the fossilized brain does not resemble that of horseshoe crabs but is instead more akin to that of contemporary spiders and scorpions.

“A lively debate continues regarding the origin of arachnids, the type of progenitor they emerged from, and whether these progenitors were horseshoe crabs,” Professor Strausfeld noted.

Mollisonia symmetrica shares physical features with other early chelicerates from the lower and middle Cambrian periods, possessing a body divided into two main segments.

Some researchers have highlighted the anterior shell followed by a segmented trunk reminiscent of scorpions.

However, no one has claimed that Mollisonia symmetrica was more closely related to horseshoe crabs than to more basal arthropods.

What Professor Strausfeld and his co-authors found is that Mollisonia symmetrica, identified as an arachnid, exhibits a fossilized brain and nervous system.

Similar to spiders and other modern arachnids, the anterior portion of Mollisonia symmetrica (known as the prosoma) features a pattern of segmental ganglia that governs the movement of five pairs of appendages.

In addition to these arachnid-like traits, Mollisonia symmetrica also possessed an unsegmented brain with short nerves extending into pincher-like structures, reminiscent of spider fangs.

Critically, the unique feature defining arachnids is the specific arrangement of the brain, which contrasts with the structure found in current crustaceans, insects, centipedes, and even horseshoe crabs like Limulus.

“It’s comparable to the Limulus type brains in Cambrian fossils, or the ancestral brains of modern crustaceans and insects, which are similar to those of contemporary spiders,” Professor Strausfeld remarked.

“These findings may signify a crucial evolutionary advancement, as studies of modern spider brains indicate this arrangement allows for quicker neural control pathways.

This configuration may enhance efficiency in hunting, quick pursuits, and stealth in arachnids.

“This is a significant evolutionary milestone, seemingly exclusive to arachnids.”

“In Mollisonia symmetrica, we identified brain regions corresponding to extant species, which could reveal the underlying genetic framework common to all arthropods.”

“The arachnid brain implies that, unlike other brains on Earth, its tissues are linked to rapid calculation and motor action control,” Professor Strausfeld explained.

“The earliest terrestrial creatures were likely arthropods that resembled insects, possibly ancestral to crustaceans.”

“We can envision Mollisonia symmetrica—like arachnids—adapting to land, which may have given rise to early insects and their feeding strategies.”

“The first land-dwelling spiders could have played a vital role in developing essential defensive traits, such as insect wings, leading to flight and evasion.”

“The ability to fly provides significant advantages when being pursued by spiders.”

“Nonetheless, despite the agility conferred by flight, insects remain ensnared in the intricate silk webs spun by spiders.”

The results will appear in the journal Current Biology.

____

Nicholas J. Strausfeld et al. Cambrian origin of the spider brain. Current Biology Published online on July 22, 2025. doi:10.1016/j.cub.2025.06.063

Source: www.sci.news

Research Reveals That Lowering Pollution Might Not Compromise Deeper Climate Stability

Improving the quality of the air we breathe is a significant achievement for public health, but paradoxically, it also accelerates global warming. This is highlighted in a recent study published in Communication Earth and the Environment, which connects the recent efforts to clean up air pollution in East Asia to the intensified climate crisis.

In the last 15 years, global warming has surged dramatically, and until now, the reasons behind this surge were unclear to scientists.

Co-author Dr. Robert Allen, a professor of climate studies at the University of California, Riverside, stated:

To address this, a large team of international scientists examined simulations from eight major climate models.

The majority of the accelerated warming seen since 2010 is believed to stem from efforts to reduce air pollution in East Asia.

During this same period, China was implementing a significant air quality policy that led to a reduction of sulfur dioxide emissions by approximately 75%.

Dr. Bjørn Samset, the lead author of the research and a senior researcher at Norway’s International Climate Environmental Studies Centre, explained to BBC Science Focus that pollution has historically been effective in cooling the planet.

“Think back to a day when the air was polluted or hazy,” he mentioned. “Particles in the air block some sunlight from reaching the ground, effectively providing a cooling shade.

“For decades, air pollution has been helping to mitigate some of the warming caused by greenhouse gases.”

Samset elaborated that by eliminating air pollution, as China has done, some of that cooling effect has been lost.

However, simply allowing pollution to persist is not the answer. Allen noted that 2 and methane must both be addressed together.

Before China’s 2010 air quality policy, pollution was a leading cause of premature deaths in the country – Credit: Jack-Enjo Photography via Getty

In addition to cutting greenhouse gases, some scientists have proposed unconventional measures to slow the climate crisis, such as reintroducing artificial pollution into the atmosphere.

Samset explained that this approach “involves releasing particles into the stratosphere or clouds, which can mirror the cooling effects of air pollution without the harmful health impacts.”

To do this, planes could disperse gas from altitudes of 20 km—significantly higher than typical passenger flights.

However, co-author Professor Laura Wilcox, a meteorologist at the University of Reading, advised in BBC Science Focus that such solutions do not resolve the core issues.

“Similar to air pollution, these methods merely mask atmospheric problems without addressing the root causes,” she stated.

“Another viable strategy is to actively remove CO.2,” she added. “This process, known as carbon capture, is already underway but on a limited scale.”

Possible solutions include planting trees and seaweed, developing mechanical trees, and directly capturing CO2 from the air for storage in rock formations.

Nevertheless, the key solution remains to “reduce greenhouse gas emissions primarily by transitioning away from fossil fuels,” said Samset.

read more:

About our experts

Dr. Bjørn Samset is a senior researcher at the Norwegian Centre for International Climate Research. A physicist and science communicator, he possesses extensive expertise in atmospheric science and global climate modeling, focusing on the impacts of air pollution on climate change through climate modeling.

Professor Laura Wilcox is a professor specializing in aerosol climate interactions at the University of Reading, UK. Her research interests encompass the effects of air pollution on climate and the impacts of aviation on the climate.

Source: www.sciencefocus.com

New Research Suggests Neanderthals Embraced Local Food Traditions

The Caves of Amdo and Kebara in northern Israel date back to the central Paleolithic period, approximately 70,000-50,000 years ago. Both are situated in the Southern Levant’s Mediterranean region. The Neanderthals occupying these sites left behind a wealth of stone tools, evidence of fire usage, and a variety of animal and human fossils. A recent study from the Hebrew University of Jerusalem indicates that despite their proximity and the use of similar resources and tools, Neanderthals at these sites employed markedly different methods for processing their food.

Jaron et al. Despite comparable occupational strengths, similar stone tool techniques, and access to similar food resources, we propose a unique slaughter strategy among Neanderthal populations in the caves of Amdo and Kebara.

“The distinct variations in cut mark patterns between Amdo and Kebara might reflect local customs in animal processing,” stated Anal Jaron, a doctoral candidate at Hebrew University in Jerusalem.

“Though the Neanderthals at both sites experienced similar environments and challenges, they seem to have developed a distinct butchering strategy potentially passed on through cultural learning and social traditions.”

“These two sites present an extraordinary opportunity to investigate whether Neanderthal slaughter methods were standardized.”

“If butchering techniques differ between sites or over time, it could suggest that factors like cultural practices, dietary preferences, or social structures have influenced self-sufficiency activities, including slaughter.”

The Neanderthals resided in the caves of Amdo and Kebara during the winters between 70,000 and 50,000 years ago.

Both groups utilized the same flint tools and primarily preyed on gazelles and fallow deer.

However, it appears that the Kebara Neanderthals hunted larger game compared to their counterparts in Amdo and opted to perform the slaughter in caves rather than at the kill sites.

In Amdo, 40% of the animal bones show signs of burning, with most being fragmented, possibly resulting from intentional cooking or accidental damage afterward.

Conversely, in Kebara, only 9% of the bones are burned, suggesting they were cooked with less fragmentation.

Amdo’s bones seem less impacted by carnivores than those found in Kebara.

To compare food preparation techniques at Kebara and Amdo, researchers selected bone samples from corresponding layers at both sites.

These samples were analyzed macroscopically and microscopically to assess various cut mark characteristics. Similar patterns might suggest consistent slaughter practices, while differing patterns may highlight distinct cultural customs.

The cut marks were notably clear and intact, with minimal alteration from carnivorous activity or later damage from desiccated bones.

The profiles, angles, and widths of these cuts were akin across both groups and their toolkits.

Nonetheless, the cut marks at Amdo were found to be more densely packed than those at Kebara, and exhibited a less linear shape.

Scientists have proposed several potential reasons for this observation. It could be due to differing demands in processing various prey species and types of bones—most of the bones found in Amdo are short, yet similar distinctions appeared when examining small, straight bone fragments present in both sites.

Experimental archaeology indicates that this pattern cannot be solely attributed to the skills of butchers or heightened slaughtering efforts to maximize food yield.

Instead, the varying cut mark patterns likely reflect intentional butchering choices made by each group.

One hypothesis is that Neanderthals in Amdo treated meat differently prior to slaughter—perhaps opting to dry it or allow it to decay.

We posit that managing decomposing meat poses challenges, which may explain the strong cut marks and less linear characteristics observed.

The second possibility is that the organization of the groups (e.g., the number of butchers involved in a particular kill) contributed to the variance in practices between these two Neanderthal communities.

However, further research is needed to explore these theories.

“There are some limitations to consider,” Jaron noted.

“Bone fragments can be too small to provide a complete understanding of the butcher marks present on the remains.”

“We have made efforts to mitigate biases caused by fragmentation, but this may limit our ability to fully interpret the findings.”

“Future research involving more experimental work and comparative studies will be vital to address these uncertainties. Eventually, we might be able to reconstruct Neanderthal recipes.”

Survey results published in the journal Frontiers of Environmental Archaeology.

____

Anal Jaron et al. 2025. Comparing Neanderthal Treatment of Faunal Resources in the Amdo and Kebara Caves (Israel) Through Cut Mark Analysis. Front. Environ. Archaeol 4; doi:10.3389/fearc.2025.1575572

Source: www.sci.news

Koalus Only Spends 1% of Its Life on the Ground, New Research Shows

Koalas (phascolarctos cinereus) A recent study by scientists from the University of Queensland and Sunshine Coast found that while koalas spend about 10 minutes a day on the ground, this behavior is linked to two-thirds of the koala fatalities recorded.

The wild koala features a custom-built collar that includes a GPS logger and an accelerometer. Image credit: Ami Fadhillah Amir Abdul Nasir.

The koala population in Australia has diminished by 54% over the past three decades.

Annually, millions of dollars are invested in initiatives aimed at safeguarding koalas, preserving their habitats, and treating various diseases.

Tragically, two-thirds of koala deaths occur on the ground due to vehicle accidents or attacks by dogs.

Despite extensive research, the specifics of how koalas interact with the ground remain largely unknown.

Conventional GPS tracking methods offer limited insight, as they typically document locations only once or twice a day.

“Koalas primarily reside in trees, but increasing land development forces them to descend, significantly raising their risk of injury and death,” stated a student from the University of Queensland.

“Our goal was to gain a deeper understanding of their behaviors during ground movements.”

“Key questions remain regarding how frequently koalas climb down from their trees, the distance they cover, whether they remain on the ground, and what factors influence these choices.”

“Identifying high-risk areas and times, and developing mitigation strategies during these vulnerable moments, is critical to our understanding.”

To fill these knowledge gaps, researchers employed a biologger that captures three-dimensional movements at several hundred points, mapping the koala’s directional movements during specific actions.

They tracked 10 wild koalas using an accelerometer and high-resolution GPS device for an average duration of eight days.

The researchers found that the koalas descended to the ground around three times a day, totaling just 45 minutes.

During these visits, the koalas covered an average distance of 260 meters at a typical walking speed of 1.7 km/h, with occasional bursts of up to 10.4 km/h.

By integrating accelerometer and GPS data, researchers were able to pinpoint specific trees that the koalas traversed between.

“Combining GPS tracks with movement data provides us with an in-depth perspective on how koalas navigate their habitats,” Sparks explained.

“The study confirmed that koalas predominantly stay in trees for sleeping and feeding, yet the extent to which they engage with the ground was startling.”

“We were surprised at the frequency and brevity of their ground activities; they descended only 2-3 times nightly, averaging 10 minutes in total, which is less than 1% of their day.”

“The time they spent sitting and pausing on the ground was nearly equal to the time spent walking, with only about 7% of their ground time dedicated to movement.”

“This suggests that koalas carefully assess their environment while on the ground, possibly weighing the energetic costs of remaining on the ground before deciding to climb a tree.”

This study marks the first detailed documentation of ground movements in wild koalas, raising new inquiries about their navigation in increasingly fragmented habitats.

“Currently, we’re investigating how environmental characteristics influence the time koalas spend in trees,” Sparks noted.

“If we can identify the tree and habitat features that encourage extended periods in the trees, we may be able to manage landscapes in ways that minimize the need for ground travel.”

These findings aim to steer koala conservation efforts, prioritizing specific plant types, enhancing canopy connectivity, and minimizing gaps between safe trees.

This research sheds light on koala behavior and is essential for more effective habitat management, ultimately aiming to lower mortality risks in critical zones.

“This research is just one piece of a larger puzzle, but it significantly enhances our understanding of how koalas interact with an increasingly urbanized environment,” Sparkes added.

The scientists presented their findings on July 1, 2025, at the Society for Experimental Biology Annual Conference in Antwerp, Belgium.

____

Gabriella R. Spark et al. “Ordinary” Watch: What do koalas do when they’re not sleeping? SEB 2025 Summary #A16.1

Source: www.sci.news

Research Shows Accurate Age Predictions Can Be Made with Just 50 DNA Molecules

Researchers at Hebrew University leveraged a deep learning network to analyze DNA methylation patterns, achieving a time series age (defined as postnatal time) with median accuracy for individuals under 50 years, ranging from 1.36 to 1.7 years. result This work will be published in the journal Cell Report.



Utilizing ultra-depth sequences from over 300 blood samples of healthy individuals, the research indicates that age-dependent methylation changes happen in a probabilistic or coordinated block-like fashion across clusters of CPG sites. Image credit: Ochana et al., doi: 10.1016/j.celrep.2025.115958.

“We observe that our DNA leaves measurable marks over time,” commented Professor Tommy Kaplan from Hebrew University.

“Our model interprets these marks with remarkable precision.”

“The essence lies in how our DNA evolves through a process known as methylation – the chemical tagging of DNA by methyl groups (CH)3.

“By focusing on two vital regions of the human genome, our team successfully decoded these changes at the level of individual molecules, employing deep learning to generate accurate age estimations.”

In this research, Professor Kaplan and his team examined blood samples from over 300 healthy subjects and analyzed data from a decade-long study of the Jerusalem Perinatal Study.

The model developed by the team showed consistent performance across various factors, including smoking, weight, gender, and diverse indicators of biological aging.

In addition to potential medical applications, this technique could transform forensic science by enabling experts to estimate the age of suspects based on DNA traces.

“This provides us with a new perspective on cellular aging,” stated Yuval Dor, a professor at Hebrew University.

“It’s a striking example of the intersection between biology and artificial intelligence.”

Researchers found new patterns in DNA alterations over time, suggesting that cells encode both mature and tuned bursts, akin to biological clocks.

“It’s not solely about knowing your age,” explained Professor Ruth Shemmer of Hebrew University.

“It’s about comprehending how cells and molecules keep track of time.”

“This research could redefine our approach to health, aging, and identity,” added the scientist.

“From assisting physicians in treatment based on an individual’s biological timeline to equipping forensic investigators with advanced tools for crime-solving, the capability to decipher age from DNA paves the way for groundbreaking advancements in science, medicine, and law.”

“Moreover, it enhances our understanding of the aging process and brings us closer to unraveling our body’s internal clock.”

____

Bracha-Lea Ochana et al. Time is encoded by changes in methylation at clustered CPG sites. Cell Report Published online on July 14th, 2025. doi:10.1016/j.celrep.2025.115958

Source: www.sci.news

New Research Indicates Mars Was Warm and Wet 3.7 Billion Years Ago

Planetary scientists have identified over 15,000 km of ancient riverbeds in the Noachis Terra region of Mars’ southern highlands, indicating that the planet may have been significantly wetter than previously believed.

This image depicts a flat upper eroded river wavy ridge above Mars, with dunes moving over it. Image credits: NASA/JPL/University of Arizona.

The nature of Mars’ climate during the Noatian-Hesperian transition, which occurred around 3.7 billion years ago, is still being debated. This period saw significant geological and climatic changes, as well as the formation of surface features like valley networks and lakes associated with liquid water.

There are two prevailing theories: the first suggests that a warm and wet environment followed early Mars, allowing liquid water to persist on the surface for an extended time. The second posits that Mars has generally been cold and dry, with flowing water created sporadically by melting ice during brief climate shifts.

In Noachis Terra, climate models predicting “warm and humid” conditions suggest significant precipitation levels.

A recent study led by Open University Ph.D. student Adam Losekoot and his team analyzed the region’s wavy ridges, also known as inverse channels.

“These formations likely resulted from sediments laid down by rivers that solidified, later exposed through the erosion of surrounding materials,” noted the lead researcher.

“Similar ridges have been identified in various Martian terrains.”

“Their presence implies that flowing water once traversed the area, with precipitation being the most probable source,” he added.

The team found that river-wave ridges are widespread throughout Noachis Terra, amounting to over 15,000 km in total length.

While many segments are isolated, some systems extend several hundred kilometers.

“Exploring Mars, particularly less altered regions like Noachis Terra, is thrilling because they have remained relatively unchanged for billions of years,” Losekoot commented.

“It acts as a time capsule that captures fundamental geological processes in ways that are impossible to observe on Earth.”

In their investigation, the researchers utilized data from three orbital devices: the Context Camera (CTX), the Mars Orbiter Laser Altimeter (MOLA), and the High-Resolution Imaging Science Experiment (HiRISE).

These datasets enabled them to map the locations, lengths, and forms of the ridge systems across various areas.

“Our findings present new evidence indicating that Mars was once a much more dynamic and complex planet than we suppose,” they stated.

“The size and interconnectivity of these ridges suggest that liquid water existed for an extended period, indicating that Noachis Terra experienced warm, wet conditions for a geologically significant time.

“These results challenge the conventional belief that Mars has been predominantly cold and dry, with valleys formed only by sporadic, short-term meltwater from ice sheets.”

The scientists presented their results on July 10th at the National Astronomical Conference of the Royal Astronomical Society 2025 in Durham, England.

____

Adam Losekoot et al. The history of the rivers of Noachis Terra, Mars. NAM 2025

Source: www.sci.news