Exploring the Epic Saga of Ancient Humanity: The Century’s Best Idea Revealed

In the last 25 years, the field of human evolution has witnessed remarkable growth, showcased by a significant increase in discoveries. Archaeologists have unearthed more fossils, species, and artifacts from diverse locations, from the diminutive “hobbits” of the Indonesian island of Flores to Homo naledi, known solely from a single deep cave system in South Africa. Simultaneously, advanced analytical techniques have enhanced our understanding of these findings, revealing a treasure trove of information about our origins and extinct relatives.

This whirlwind of discoveries has yielded two major lessons. First, since 2000, the human fossil record has been extended much further back in time. Previously, the oldest known hominin fossil was 4.4-million-year-old Ardipithecus, but discoveries in 2000 and 2001 unearthed even older species: an earlier species of Ardipithecus, Orrorin tugenensis from around 6 million years ago, and Sahelanthropus tchadensis from around 7 million years ago. A further member of the Orrorin lineage, tentatively identified in 2022, appears to be slightly more recent than O. tugenensis.

According to Clément Zanolli at the University of Bordeaux, the discovery of these early human fossils represents “one of the great revolutions” in our understanding of evolution.

The second major lesson has enriched the narrative of how our species emerged from earlier hominins. By 2000, genetic evidence had established that all non-Africans descend from ancestors who lived in Africa around 60,000 years ago. This indicated that modern humans evolved in Africa and subsequently migrated, replacing other hominins along the way.

However, by 2010, the sequencing of the first Neanderthal genome opened a new chapter, along with the DNA analysis of several other ancient humans. These studies revealed that our species interbred with Neanderthals, Denisovans, and possibly other groups, creating a complex tapestry of human ancestry.

Skeletal research had long hinted at interbreeding, as many fossils exhibit traits that defy clear species categorization, notes Sheela Athreya at Texas A&M University. In 2003, Eric Trinkaus and colleagues described a jawbone excavated from Peștera cu Oase, Romania, as a human-Neanderthal hybrid, based on its morphology. Genetic testing in 2015 confirmed that an individual from Oase had a Neanderthal ancestor only four to six generations back.

This evidence highlights that our species did not merely expand from Africa; rather, our population absorbed genetic contributions from Neanderthals and Denisovans along the way. Genetically, we are a mosaic, a fusion of countless years of diverse human lineages.

Source: www.newscientist.com

End-to-End Encryption: The Ultimate Security Solution of the Century

Everyone has secrets to protect. In today’s digital age, whether safeguarding personal messages, business communications, or confidential state information, end-to-end encryption (E2EE) offers essential security and peace of mind.

E2EE ensures that your communications remain private from internet service providers and the operators of messaging or video conferencing applications. Messages are encrypted on the sender’s device and only decrypted by the recipient, making them unreadable to unauthorized parties while in transit. This prevents access by any entity, including law enforcement or corporate insiders.

Digital encryption is rooted in robust mathematics rather than mere assurances. The RSA algorithm, introduced in 1977, pioneered modern encryption by relying on the complexity of factoring large numbers into their prime components. Since then, various algorithms have emerged, utilizing intricate mathematics to enhance cryptographic security.
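
To make the mathematics concrete, here is a toy sketch in Python of the RSA idea described above. It is only an illustration: the primes are tiny and utterly insecure, whereas real keys rely on primes hundreds of digits long.

```python
# Toy RSA: security rests on the difficulty of factoring n = p * q.
# These numbers are far too small to be secure.
from math import gcd

p, q = 61, 53                 # two secret primes
n = p * q                     # 3233, published as part of the public key
phi = (p - 1) * (q - 1)       # kept secret; easy only if you know p and q
e = 17                        # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
```

An attacker who could factor n back into p and q could compute the private exponent themselves, which is why the whole scheme leans on that factoring problem remaining hard.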

The true strength of E2EE lies not just in its technical implementation, but in how it upholds democracy and human rights across the globe. As Matthew Feeney from the UK privacy group Big Brother Watch states, “There are individuals in perilous regions depending on encryption to preserve their lives.” Additionally, even in recognized democracies, freedom is vulnerable. Feeney warns that those who claim “I have nothing to hide” should take heed of history’s lessons.

Many governments view E2EE unfavorably because it blocks surveillance, much as a sealed letter in the postal system does. Notably, UK governments have repeatedly sought to curb E2EE; most recently, the government under Prime Minister Keir Starmer dropped a controversial demand for a backdoor into Apple’s encrypted services following a public outcry.

Feeney acknowledges the uncertainty surrounding the potential for E2EE to be compromised, as intelligence agencies typically do not disclose their capabilities. Concerns loom regarding the advent of quantum computing, which may soon breach current encryption algorithms. However, cryptography continues to evolve, with emerging mathematical solutions challenging outdated algorithms. “Governments may wield power, but they can’t override the laws of mathematics,” Feeney asserts.

Source: www.newscientist.com

Unlocking Epigenetics: The Century’s Most Revolutionary Concept

As we entered the new millennium, the number of genes in the human genome was hotly debated. Initial estimates came in far lower than anticipated, spurring a re-evaluation of evolutionary processes.

The Human Genome Project revealed in 2001 that we possess fewer than 40,000 protein-coding genes — a number that has since been adjusted to around 20,000. This finding necessitated the exploration of alternative mechanisms to account for the complexity of our biology and evolution; epigenetics now stands at the forefront.

Epigenetics encompasses the various ways that molecules can interact with DNA or RNA, ultimately influencing gene activity without altering the genetic code itself. For instance, two identical cells can exhibit vastly different characteristics based purely on their epigenetic markers.

Through epigenetics, we can extract even greater complexity from our genome, factoring in influences from the environment. Some biologists are convinced that epigenetics can play a significant role in evolutionary processes.

A notable study in 2019 demonstrated how yeast exposed to toxic substances survived by silencing specific genes through epigenetic mechanisms. Over generations, certain yeast cultures developed genetic mutations that amplified gene silencing, indicating that evolutionary changes began with epigenetic modifications.

Epigenetics is crucial for expanding our understanding of evolutionary theory. Nevertheless, skepticism persists regarding its broader implications, particularly in organisms other than plants.

For instance, Adrian Bird, a geneticist at the University of Edinburgh, remains doubtful, arguing in a recent paper that there is no clear evidence that environmental factors such as drought leave lasting, heritable marks on mammalian genomes. Though epigenetic markers can be inherited, many are erased early in mammalian development.

Some researchers dispute these concerns. “Epigenetic inheritance is observed in both plants and animals,” asserts Kevin Lala, an evolutionary biologist at the University of St Andrews. In a comprehensive study published recently, Lala and colleagues compiled a wealth of research indicating that epigenetics could play a role across the entire tree of life.

So, why is there such division in the scientific community? Timing may be a factor. “Epigenetic inheritance is an evolving area of study,” observes Lala. While epigenetics has been recognized for decades, its relevance to evolutionary research has only gained traction in the past 25 years, making it a complex field to assess.

Source: www.newscientist.com

Transformer Architecture: The Revolutionary AI Innovation Redefining the 21st Century

Explore the incredible capabilities of modern AI tools that can summarize documents, generate artwork, write poetry, and even predict protein folding. At the heart of these advancements is the groundbreaking transformer architecture, which revolutionized the field of artificial intelligence.

Unveiled in 2017 at a modest conference center in California, the transformer architecture enables machines to process information in a way that closely resembles human thinking patterns. Historically, AI models relied on recurrent neural networks, which read text sequentially from left to right while retaining only the most recent context. This method sufficed for short phrases, but when dealing with longer and more complex sentences, critical details often slipped through the cracks, leading to confusion and ambiguity.

The introduction of transformers to the AI landscape marked a significant shift, embracing the concept of self-attention. This approach mirrors the way humans naturally read and interpret text. Instead of strictly scanning word by word, we skim, revisit, and draw connections based on context. This cognitive flexibility has long been the goal in natural language processing, aiming to teach machines not just to process language, but to understand it.

Transformers emulate this mental leap effectively; their self-attention mechanism enables them to evaluate every word in a sentence in relation to every other word simultaneously, identifying patterns and constructing meaningful connections. As AI researcher Sasha Luccioni notes, “You can take all the data you get from the Internet and Wikipedia and use it for your own tasks. And it was very powerful.”
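
The core of that self-attention step can be sketched in a few lines of NumPy. This is a bare-bones illustration only: real transformers add learned query, key and value projections, multiple attention heads, masking and positional information.

```python
# Minimal self-attention: every token is compared with every other token,
# and each output is a weighted mix of all inputs.
import numpy as np

def self_attention(x):
    """x: (sequence_length, embedding_dim) array of token embeddings."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                    # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ x                               # context-aware outputs

tokens = np.random.default_rng(0).normal(size=(5, 8))  # 5 "words", 8-dim embeddings
print(self_attention(tokens).shape)                     # (5, 8)
```

Because every pair of positions is compared at once, nothing has to be squeezed through a single running memory, which is exactly the limitation of the older recurrent networks described earlier.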

Moreover, this transformative flexibility extends beyond text. Today’s transformers drive tools that can generate music, render images, and even model molecules. A prime example is AlphaFold, which treats proteins—long chains of amino acids—analogously to sentences. The function of a protein hinges on its folding pattern and the spatial relationships among its constituent parts. The attention mechanism allows this model to assess these distant associations with remarkable precision.

In retrospect, the insight behind transformers seems almost intuitive. Both human and artificial intelligence rely on discerning when and what to focus on. Transformers haven’t merely enhanced machines’ language comprehension; they have established a framework for navigating any structured data in the same manner that humans navigate the complexities of their environments.

Source: www.newscientist.com

Understanding Neurodiversity: Why ‘Normal’ Brains Don’t Exist – A Revolutionary Perspective for the Century

Historically, science operated under the notion of a “normal brain,” one that fits standard societal expectations. Those who diverge from this model have often been labeled with a disorder or mental health condition, treated as if they were somehow flawed. For years, researchers have refined the notion that neurodevelopmental conditions, including autism, ADHD, dyslexia, and movement disorders, should be recognized as distinctive variations representing different neurocognitive frameworks.

In the late 1990s, a paradigm shift occurred. What if these “disorders” were simply natural variations in brain wiring? What if human traits existed on a spectrum rather than a stark boundary between normal and abnormal? Those at either end of the spectrum may face challenges, yet their exceptional brains also offer valuable strengths. Viewed through this lens, diverse brains represent assets, contributing positively to society when properly supported.

The concept of neurodiversity gained momentum, sparking lively debates in online autism advocacy groups. By 2013, the Diagnostic and Statistical Manual of Mental Disorders recognized autism as a spectrum condition, abolishing the Asperger’s syndrome diagnosis and classifying it on a scale from Level 1 to Level 3 based on support needs. This shift solidified the understanding of neurodivergent states within medical literature.

Since the early 2000s, research has shown that individuals with autism often excel in mathematical reasoning and attention to detail. Those with ADHD frequently outperform others in creativity, while individuals with dyslexia are adept at pattern recognition and big-picture thinking. Even those with movement disorders have been noted to develop innovative coping strategies.

These discoveries have led many scientists to argue that neurodivergent states are not mere evolutionary happenstance. Instead, our ancestors likely thrived thanks to pioneers, creative thinkers, and detail-oriented individuals in their midst. A group possessing diverse cognitive strengths could more effectively explore, adapt, and survive. Some researchers now propose that the autism spectrum comprises distinct subtypes with varying clusters of abilities and challenges.

While many researchers advocate for framing neurodivergent characteristics as “superpowers,” some caution against overly positive portrayals. “Excessive optimism, especially without supporting evidence, can undermine the seriousness of these conditions,” says Dr. Jessica Eccles, a psychiatrist and neurodiversity researcher at Brighton and Sussex Medical School. Nevertheless, she emphasizes that “with this vocabulary, we can better understand both the strengths and challenges of neurodiversity, enabling individuals to navigate the world more effectively.”

Source: www.newscientist.com

Unlocking Molecule Creation: Why Click Chemistry is the Century’s Most Innovative Concept

Chemistry can often be a complex and slow process, typically involving intricate mixtures in round-bottomed flasks that require meticulous separation afterward. However, in 2001, K. Barry Sharpless and his team introduced a transformative concept known as click chemistry. This innovative approach revolutionizes the field, with a name coined by Sharpless’s wife, Janet Dueser, perfectly encapsulating its essence: a new set of rapid, clean, and reliable reactions.

The idea is elegantly simple. Sharpless, along with colleagues Hartmuth C. Kolb and M. G. Finn, described the reactions as “spring-loaded”: apply them to a range of starting materials, assembling them akin to Lego blocks, and you can swiftly construct a vast array of novel and beneficial molecules. Sharpless’s primary focus? Pharmaceuticals.

The overarching principle guiding these reactions was to steer clear of forming carbon-carbon bonds, which was the norm among chemists at the time, and instead to create bonds between carbon and what are known as “heteroatoms,” primarily oxygen and nitrogen. The most recognized click reaction involves the fusion of two reactants to create a triazole, a cyclic structure of carbon and nitrogen atoms. This motif proves to be highly effective at binding to large biomolecules such as proteins, making it invaluable in drug development. Sharpless independently published this specific reaction concurrently with chemist Morten Meldal, who researched it at the University of Copenhagen. This reaction has since been instrumental, notably in the production of the anticonvulsant drug Rufinamide.

Chemists like Tom Brown from the University of Oxford describe this reaction as simple, highly specific, and versatile enough to work in almost any solvent. “I would say this was just a great idea,” he asserts.

Years later, chemist Carolyn Bertozzi and her team at Stanford University developed a click-type reaction that operates without toxic catalysts, enabling its application within living cells without risking cellular damage.

For chemist Alison Hulme at the University of Edinburgh, this research was pivotal in elevating click chemistry from a promising idea to a revolutionary advancement. It granted biologists the ability to assemble proteins and other biological components while labeling them with fluorescent tags for investigation. “It’s very straightforward and user-friendly,” Hulme explains. “We bridged small molecule chemistry to biologists without necessitating a chemistry degree.”

For their groundbreaking contributions, Bertozzi, Meldal, and Sharpless were awarded the 2022 Nobel Prize in Chemistry—an outcome that surprised no one.

Source: www.newscientist.com

Transforming Transient Astronomy: The Universe’s Biggest Drama Becomes a Cinematic Masterpiece

Imagine looking up at the night sky 1,000 years ago: every so often, you might have spotted a point of light that is not there today. Back then, Chinese astronomers referred to these phenomena as “guest stars”, believing they foretold significant changes.

Today, we understand these were likely supernovae—spectacular explosions from dying stars—one of many serendipitous discoveries made by astronomers observing at opportune moments.

In the modern era, the quest for these “transient” events has evolved into a strategic approach, revolutionizing the field of astronomy. We have since identified numerous fleeting events that span from mere nanoseconds to durations longer than a human lifetime.

“Astronomy considers both spatial and temporal scales, yet the latter remains largely unexplored,” states Jason Hessels from the University of Amsterdam.

To capture these ephemeral occurrences effectively, astronomers are innovating by synchronizing telescopes into a cohesive unit, akin to a well-oiled machine, as evidenced by the Palomar Transient Factory project, which ran from 2009 to 2012. One significant flash observed by its telescope near San Diego prompted immediate follow-up investigations by others. “It was orchestrated like a conveyor belt,” Hessels remarked.

More specialized telescopes are emerging, focusing on time rather than just space. Notably, the Zwicky Transient Facility has taken over from the Palomar Transient Factory, and the Pan-STARRS survey in Hawaii has amassed 1.6 petabytes of astronomical data—one of the largest astronomical datasets ever compiled.

These advanced telescopes have generated extensive data that unveil the twinkling and fluctuating events of the cosmos, including gamma-ray bursts, fast radio bursts, gravitational waves, and stars that either explode spontaneously or are ripped apart by black holes.

Transient astronomy is reshaping our perception of the universe. “We’ve progressed from painting to photography, and now to some form of stop-motion film,” Hessels describes. He continues, “We’re approaching a complete narrative. Each adjustment in my perspective of the sky feels as though the cinematic experience expands further.”

Source: www.newscientist.com

Connecting Extreme Weather to Climate Change: The Most Important Insight of Our Time

January 2003: Climate physicist Myles Allen watched the River Thames flood, threatening his home in Oxford, England, and wondered why meteorologists refused to link the event to climate change.

Later that year, climatologist Peter Stott from the British Met Office found himself in Italy during one of Europe’s most severe heatwaves. Instead of enjoying a vacation, he faced temperatures exceeding 40 degrees Celsius, a shocking experience for him.

Both Allen and Stott were intent on understanding climate change’s role in extreme weather events. Stott utilized existing climate models to simulate two scenarios of the 2003 heatwave: one reflecting the climate of that year and another devoid of human-induced warming.

After running extensive model simulations, they concluded in their landmark 2004 paper in Nature that human activities had more than doubled the likelihood of a heatwave like that of 2003.
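
The arithmetic behind a phrase like “more than doubled the likelihood” is a risk ratio: how often the heat threshold is exceeded in simulations with human influence versus without it. The short Python sketch below uses invented counts purely for illustration, not the study’s actual numbers.

```python
# Hypothetical exceedance counts from two sets of climate simulations.
exceed_with_humans = 180     # runs crossing the 2003 heat threshold, actual climate
total_with_humans = 1000
exceed_natural_only = 70     # same threshold, simulations without human forcing
total_natural_only = 1000

p_with = exceed_with_humans / total_with_humans
p_without = exceed_natural_only / total_natural_only

risk_ratio = p_with / p_without                  # > 2 means the risk more than doubled
fraction_attributable = 1 - p_without / p_with   # share of today's risk due to warming
print(f"risk ratio: {risk_ratio:.2f}, fraction attributable: {fraction_attributable:.2f}")
```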

This groundbreaking work marked the inception of a new climate science field, which began to identify human influences on extreme weather events. Soon analyses emerged for diverse phenomena, from heatwaves to severe droughts and storms.

However, a significant challenge remained—post-event analyses often took months or years to determine the influence of climate change.

To address this, researchers, including Friederike Otto from Imperial College London, founded World Weather Attribution in 2014. This initiative facilitates swift analysis of extreme weather events, quantifying the probable impacts of climate change, with results frequently released within days.

This has dramatically altered reporting on such events globally, enabling news outlets to directly attribute deadly weather phenomena to climate change and emphasizing the real-world consequences of rising emissions.

As Otto stated, “When we began this work a decade ago, scientists and journalists maintained that individual weather events could not be blamed on climate change. That perspective has shifted immensely.”

This advancement also supports climate change litigation, with causal investigations providing evidence in numerous lawsuits against polluters worldwide. In 2022, the United Nations announced a new International Loss and Damage Fund, paving the way for climate change compensation.

In 2003, Allen queried: “Could litigation for climate change be feasible?” Thanks to developments in attribution science, the answer is now a definitive “yes.”

Source: www.newscientist.com

New Treatment for Cancer That Spreads to Bone Shows Unexpected Reduction in Tumor Pain

Artist’s Impression of Nanomedicine in Action

Alfred Pasieka/Science Photo Library

Cancer that metastasizes to the bones can be both deadly and painful. A new innovative drug is showing promise in addressing these issues by disrupting the interaction between tumors and nerves. This groundbreaking approach may lead to a much more comfortable cancer treatment journey.

According to William Fan from Harvard University, who was not part of the study, “This highlights a new and exciting paradigm in which a single cancer treatment can simultaneously improve mortality and quality of life.”

Research indicates that 65-80% of individuals with breast or prostate cancer ultimately develop bone metastases when the disease spreads. As these tumors progress, they irritate nearby pain-sensing nerves.

Standard treatments such as radiation therapy and chemotherapy are commonly utilized to shrink bone tumors. However, pain may still persist due to residual cancer cells interacting with nerves. Furthermore, conventional methods can harm healthy tissues and often require long-term use of painkillers, like opioids, risking addiction, as noted by Xiang Jia at Zhejiang University in China.

In response, Xiang and colleagues have developed a “nanotherapy” comprising tiny fat capsules loaded with DNA that encodes gasdermin B, a protein designed to kill cancer cells selectively. The therapy targets cancer cells while sparing healthy ones by exploiting the higher levels of reactive oxygen species characteristic of tumor cells. The nanocapsules additionally contain OPSA, which enhances the body’s inherent anti-cancer immune response.

To evaluate the efficacy of this novel drug, researchers injected breast cancer cells into the legs of various mice. Once bone tumors formed, the mice received either the full nanotherapy, a simpler version containing OPSA but lacking the gasdermin B gene, or a saline control. Treatments were administered into the tail every other day over five days.

After two weeks, tumors in the full nanotherapy group were on average 94% smaller than those in the control group, while the simpler form resulted in a 50% reduction. Furthermore, all mice treated with the complete nanotherapy survived, in contrast to merely 60% of those receiving the simpler therapy and 20% in the control group. This treatment effectively killed tumor cells and induced an anti-tumor immune response, Xiang reported.

Interestingly, both forms of the nanotherapy improved mobility in the affected limbs significantly more than the control, particularly in the full nanotherapy group, indicating potential pain relief from bone tumors. Tumor samples revealed a noticeable decrease in the density of nerve cells within the cancerous growths.

The mechanism appears to involve enhancing the cancer cells’ ability to absorb calcium ions, essential for nerve growth and pain signal transmission. “The concept is that cancer cells act like sponges for local calcium, reducing the availability of calcium for sensory neurons,” explains Professor Huang. Further studies are necessary to establish how nanotherapy adjusts calcium uptake in cancer cells, which may expose new avenues for targeting this critical pathway.

Earlier research has shown that nerves surrounding tumors can promote their growth, suggesting that disrupting this interaction could not only alleviate pain but also slow tumor proliferation, although the specific effects remain uncertain, according to Xiang.

These findings bolster the emerging perspective that targeting the nervous system may transform cancer treatment paradigms, states Huang. However, translating these treatments from mice to humans remains challenging due to differences in immune responses. Xiang aspires to initiate human clinical trials within five to ten years.

Source: www.newscientist.com

Emerging Giant Hybrid Pest in Brazil Poses Global Threat to Crops

Corn Earworm (Helicoverpa zea) Larvae Feeding on Cotton Plants

Debra Ferguson/Design Pics Editorial/Universal Images Group (via Getty Images)

The cotton bollworm and corn earworm, recognized as “megapests”, are wreaking havoc on farmers globally. Recent interbreeding in Brazil has produced a hybrid that is resistant to various pesticides. If this trend continues unchecked, the hybrid strain may severely reduce soybean and other crop yields, jeopardizing global food security.

“This can pose significant challenges,” notes Chris Jiggins from Cambridge University.

Many nations rely on Brazilian soybeans for both human and animal feed—“it essentially feeds the world,” Jiggins remarks.

In Brazil, over 90% of soybeans cultivated are genetically modified (GM) varieties containing built-in pesticides. The emergence of resistant pests could precipitate a decline in yields, leading to heightened food prices. Additionally, increased deforestation and greenhouse gas emissions may occur as farmers seek to clear more land for cultivation.

The corn earworm (Helicoverpa zea), a moth native to the Americas, features caterpillars that are highly destructive to a variety of crops, particularly corn. They also pose threats to tomatoes, potatoes, cucumbers, and eggplants.

Historically, H. zea has not been a significant issue for soybean farmers in Brazil, as soybeans are not its preferred food source. However, the detection of the cotton bollworm (Helicoverpa armigera) in Brazil in 2013 marked a troubling development. This pest, a close relative of H. zea, has proven widely destructive. Both moth species are categorized as “megapests” due to their notorious destructiveness and resistance to control measures.

“The concerns are well-founded, given their significant impact,” Jiggins emphasizes. “Moths can travel substantial distances, complicating control efforts.”

H. armigera feeds on a wider range of plants than H. zea and readily breeds on soybeans, placing financial strains on Brazilian agriculture that run to billions of dollars, according to Jiggins.

The introduction of Bt soybeans—genetically engineered to produce proteins derived from the soil bacterium Bacillus thuringiensis—has alleviated some challenges posed by these pests.

Initial belief held that hybridization between H. armigera and H. zea was implausible. However, genetic analyses from 2018 identified hybrids within the species. Recent genomic studies of around 1,000 moths collected over the past decade have revealed alarming trends.

Analysis indicates that one-third of H. armigera specimens now carry a gene conferring resistance to Bt toxins, a concerning development since H. zea populations evolved comparable resistance after Bt crops were introduced in North America during the 1990s. The appearance of this resistance in South America as hybridization occurs suggests a perilous progression. While hybrid H. armigera has not yet caused severe damage, experts caution that as resistance continues to evolve, the situation may change rapidly.

Gene transfers between species are occurring, and H. zea in Brazil have gained resistance to pyrethroid insecticides. “The speed of this development is astounding,” notes Jiggins.

Angela McGaughran from the University of Waikato asserts that “as global interconnectedness and climate change enable species range expansion, the looming threat of these megapests could amplify on a worldwide scale.”

Farmers are advised to plant refuges of non-Bt crops alongside Bt crops to slow the spread of resistance. However, adherence to these guidelines remains inconsistent across countries.

Biotech companies are now developing multi-gene Bt crops—producing two, three, or even five different Bt proteins to combat resistance. However, Jiggins notes that the cost and time needed to bring such innovations to market underscore the necessity of sustainable resistance management, including reducing pests’ exposure to existing Bt crops.

While hybridization facilitates the spread of resistance, Bruce Tabashnik at the University of Arizona highlights that evolution within species remains the leading concern. In China, for instance, H. armigera has developed resistance to the original Bt toxin independently.

Source: www.newscientist.com

Surprising Resilience: How Sea Turtles May Thrive Amid Global Warming

Young Loggerhead Sea Turtle in the Caribbean Sea near the Bahamas

WaterFrame/Alamy

Recent research indicates that sea turtles may be more resilient to climate change than previously believed. Concerns have been raised that rising temperatures could lead to the extinction of these reptiles, as a majority of turtle eggs tend to develop into females. However, scientists have discovered a genetic safety net that maintains a more balanced sex ratio even as temperatures increase.

According to Christophe Eizaguirre at Queen Mary University of London, “We believe we have uncovered the ability of turtles to adapt to the environment they find themselves in.”

The sex of baby sea turtles is determined by temperature rather than by chromosomes. Laboratory studies show that cooler nest temperatures favor male hatchlings, while warmer conditions produce females. This raises concerns that global warming could result in significantly more female turtles.

For instance, genetic research conducted in 2018 revealed that around 99% of young green sea turtles (Chelonia mydas) aged 4 to 20 years in a nesting area off Australia were female. This finding contributed to alarming predictions about male shortages which could lead to a population collapse.

However, because it is difficult to identify a turtle’s sex before it reaches maturity, field data on hatchling sex ratios have been limited.

To address this gap, Eizaguirre and colleagues conducted both laboratory and field experiments focused on loggerhead sea turtles (Caretta caretta).

In one phase of the study, they collected 240 eggs from seven loggerhead nests along Florida’s Palm Beach County coast. These eggs were incubated at three different temperatures: 27°C (81°F) suitable for male hatchling production, 30°C (86°F) for an equal sex ratio, and 32°C (90°F) to promote female hatchlings.

After one to three days, blood samples were taken from the hatchlings, which were then reared until they were large enough for their sex to be determined by laparoscopy, a form of keyhole surgery.

By comparing genetic data from the blood samples, researchers found distinctive activity patterns in hundreds of genes that indicated sex, attributable to an epigenetic process called DNA methylation. In females, 383 genes were hypermethylated, while males had 394 hypermethylated genes, many of which are known to play roles in sexual development.

Utilizing these findings, the team conducted field research on Sal Island, Cape Verde, collecting 29 newly laid loggerhead sea turtle eggs. The eggs were divided, with half buried in a cooler area and the other half in a warmer spot, and monitored for temperature variations.

Analysis of blood samples from 116 hatchlings revealed a higher number of males than predicted, suggesting previous models had overestimated female hatchling production by 50-60%, likely due to previously unrecognized biological adaptations.

“This discovery highlights that molecular mechanisms exist that help turtles adapt to climate change by modulating the sensitivity of sexual differentiation to temperature,” Eizaguirre explains.

“While feminization is a concern and does occur due to climate change, we are suggesting that if populations are robust and genetically diverse, species can adapt to their environmental conditions,” he adds.

These findings are supported by recent evidence from Graeme Hays at Deakin University, which showed that more male sea turtles are hatching than would be expected if temperature were the sole factor in sex determination. Hays notes that turtles can adapt their critical temperature-related sex ratios to local conditions.

In addition, turtles employ other strategies to mitigate the impacts of climate change, such as nesting earlier in the season and adjusting their migration patterns to breeding grounds to counteract feminization effects. “While females may not breed annually, males migrate to breeding grounds more frequently, contributing to a more balanced reproductive sex ratio,” Hays explains.

Despite these behavioral adaptations, Eizaguirre warns that hatchlings still face threats from excessive heat. Even so, the lasting changes in DNA methylation are an indication of molecular adaptability that is promising for these vulnerable reptiles.

Source: www.newscientist.com

Boost Your Health: The Benefits of Singing, Dancing, and Artistic Expression

Engaging in the arts promotes health

Miguel Riopa/AFP via Getty Images

Engaging in the arts is not merely a delightful hobby; it significantly contributes to improved health. Recent groundbreaking research indicates that participation in creative activities correlates with decreased inflammation and positive changes in brain health-related proteins.

“We uncovered several new biological pathways that clarify the connection between art and beneficial health outcomes,” stated Daisy Fancourt from University College London.

Over the past decade, accumulating evidence highlights the substantial health benefits associated with participation in music, theater, and various creative arts. For instance, dance programs are shown to aid Parkinson’s patients, while art activities can lead to a lower risk of depression.

Previous studies also indicate that individuals engaged in the arts tend to exhibit lower inflammation levels, thereby enhancing both physical and mental health. However, prior research often focused solely on a few blood markers, limiting its utility. With advancements in technology, it is now possible to analyze hundreds of proteins, providing comprehensive insights into how behavior influences biology.

Utilizing this advanced methodology, Fancourt and colleagues examined data from approximately 6,000 British adults by analyzing one-time blood samples to explore how involvement in the arts links to 184 proteins associated with bodily and brain systems.

The researchers measured the frequency and variety of artistic activities, discovering that increased participation in arts such as dancing, singing, reading, photography, crafting, and attending performances correlates with significant changes in 18 specific proteins.

Tracking data also revealed that individuals actively engaged in the arts exhibited a reduced future risk for several health issues, including heart disease, type 2 diabetes, arthritis, depression, and dementia. Notably, the changes in proteins could explain between 16% and 38% of the link between artistic engagement and improved health, even after adjusting for factors like income and education.
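
One common way to arrive at a “proteins explain 16 to 38 per cent of the link” figure is a mediation analysis: compare the association between arts engagement and health before and after adjusting for the measured proteins. The Python sketch below is a simplified, simulated illustration with a single hypothetical protein, not the study’s actual models or data.

```python
# Simulated mediation example: how much of the arts-health association
# runs "through" a protein that arts engagement appears to shift?
import numpy as np

rng = np.random.default_rng(1)
n = 6000
arts = rng.normal(size=n)                      # frequency/variety of engagement
protein = 0.5 * arts + rng.normal(size=n)      # hypothetical protein influenced by it
health = 0.3 * arts + 0.4 * protein + rng.normal(size=n)

def slope_of_arts(y, predictors):
    """OLS coefficient on arts, given a list of predictor arrays."""
    X = np.column_stack([np.ones(n)] + predictors)
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

total = slope_of_arts(health, [arts])             # association ignoring the protein
direct = slope_of_arts(health, [arts, protein])   # association holding the protein fixed
print(f"proportion mediated ≈ {(total - direct) / total:.2f}")
```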

Some affected proteins are crucial for metabolism, while others support brain cell health. Certain proteins also activate pathways that enhance anti-inflammatory processes. “Engaging in the arts may stimulate a rebalancing of the inflammatory system,” notes Fancourt.

“The benefits of engaging in arts activities on health and well-being have long been acknowledged, but the underlying mechanisms remain unclear,” commented Daryl O’Connor from the University of Leeds, UK. While the findings need replication in different populations, he emphasizes that the study presents exciting new avenues for exploring how behavior impacts health.

Researchers from King’s College London, including Carmine Pariante, highlight that these findings align with established protective effects of artistic engagement on mental and physical health. However, the current study offers only a temporal snapshot, leaving questions about the duration of exposure to art needed to yield protective benefits.

Fancourt suggested that the next logical step would be to conduct causal studies, monitoring specific proteins before and after individuals participate in various artistic activities.

Source: www.newscientist.com

Avatar: Fire and Ash Review: Fire, Ash, and the Fascinating World of Whales

Oona Chaplin as Varang in Avatar: Fire and Ash

Image credit: 20th Century Studio, 2025. Unauthorized reproduction is prohibited.

Bethan Ackerley
Deputy Editor, London

No one crafts blockbusters quite like James Cameron. Avatar: Fire and Ash, the highly anticipated third installment set on the enchanting moon of Pandora, is both spectacular and visually stunning. The narrative unfolds with captivating themes ranging from interspecies conflicts to deep family dynamics.

Around 15 years after the ex-marine Jake Sully was embraced by the Na’vi, he now resides on Pandora with his partner Neytiri and their children, having helped defeat the human invaders and permanently transferred into his Na’vi body.

However, they now face the heart-wrenching loss of their eldest son, Neteyam. Their arch-nemesis, Colonel Quaritch, has allied himself with an influential Na’vi tribe that inhabits a volcano and is led by the formidable Varang (pictured above).

The dialogue is no Shakespeare (it is rather clunky, to say the least), but the allure of this intricately designed universe is undeniable.

Prepare to be mesmerized by the breathtaking visuals and the story of Payakan, a member of the sentient whale-like species known as the tulkun, who serves as the emotional core of the film.

Source: www.newscientist.com

Explore Tanzania’s Remote Regions: Stunning Images Showcasing Rich Biodiversity

A yellow baboon is standing guard.

Photo by Frederic Noy/Panos

Observe a young yellow baboon (above) surveying Tanzania’s Udzungwa Mountains National Park. Shrouded in lush rainforest, this remote area has only recently been explored in depth by biologists, revealing its unique biodiversity.

The park is a sanctuary for six primate species, including the Udzungwa red colobus and the Sanje crested mangabey, for which it is the last refuge. The kipunji monkey, recognized as a new species in 2003, was the first new monkey species discovered in Africa since the Sanje crested mangabey in the 1980s.

“It felt magical,” reflects photographer Frédéric Noy, who has documented the wildlife and landscapes of this rich area. “The ongoing discoveries of new fish in the deep sea and tiny insects on land aren’t surprising. But mammals are truly remarkable!”

In 2006, the kipunji was assigned to a new genus, Rungwecebus, the first new monkey genus described since 1923. Current estimates suggest that only about 2,000 kipunji monkeys survive, with a population residing in Udzungwa and sparse groups scattered elsewhere in Tanzania.

Other fascinating finds in Udzungwa include the giant tree Tessmannia princeps, which can reach heights of 40 meters and was described as a new species just last year.

A local initiative, the Udzungwa Corridor Project, is actively restoring deforested areas by planting native trees, effectively connecting Udzungwa Mountains National Park with adjacent protected zones.

Cultivating seedlings for reforestation efforts at the Udzungwa Corridor Project.

Photo by Frederic Noy/Panos

This project utilizes carbon credits to provide local residents with financial incentives for planting trees on their land. The photo above depicts a nursery where trees and other vegetation are cultivated for these essential environmental initiatives.

Many community members in the Udzungwa region also maintain beehives (see below), primarily as an additional source of income. Beekeeping is promoted to offset residents’ reduced access to resources inside Udzungwa Mountains National Park. Beehives are also believed to discourage elephants, which sometimes wander in from nearby regions even though they do not inhabit the park itself.

Beehives strategically placed along a fence to deter elephants.

Photo by Frederic Noy/Panos

Sugarcane is a primary agricultural crop in this vibrant region. The image below shows a truck transporting freshly harvested sugarcane against the picturesque backdrop of the Udzungwa Mountains, with part of Mitsui Falls also visible.

Trucks transporting sugarcane against the backdrop of the Udzungwa Mountains.

Photo by Frederic Noy/Panos

Source: www.newscientist.com

Why Aging Clocks Miscalculate Your True Age: How AI Could Provide Better Insights

Biological Age Representation

You May Be Biologically Older Than Your “Real Age”

Reuters/Toru Hanai

Years ago, when I began writing about aging, the “biological clock” emerged as a key topic. The term, used interchangeably with “aging clock” and measures of “true age”, highlights the difference between chronological age—the number of years since birth—and biological age, which reflects the actual aging process within our bodies.

Generally, biological aging follows a predictable pattern: a gradual decline in physical and mental functions throughout adulthood. Our intuitive judgments of age often incorporate visible signs like wrinkles, gray hair, and variations in posture, gait, mental sharpness, and voice.

The goal of determining biological age is to encapsulate this aging process into a single measurable figure. This provides insight into an individual’s health trajectory, emphasizing that some people age significantly faster than others.

Most individuals find their biological age within a few years of their chronological age. However, discrepancies can be stark: one 56-year-old may exhibit a biological age akin to someone in their 30s, while another may resemble a person in their 70s. Notably, biological age can increase or decrease at a different rate than chronological age.

Understanding biological age serves as a valuable tool, offering individuals clear, understandable insights into their health. This information can motivate lifestyle modifications and help assess the effectiveness of interventions like diet and exercise. The demand for biological age assessments is evident, as numerous companies now offer testing services, albeit often at a premium.

For scientists investigating anti-aging strategies, biological age measurements serve as immediate indicators of intervention success, eliminating the need for long-term studies involving human or animal subjects. Furthermore, tracking biological age enables us to comprehend the inner workings of our bodies as they age.

Despite its advantages, the concept of biological age requires refinement. The initial biological clocks were based on epigenetic markers—molecular indicators that alter gene expression. Innovators like Steve Horvath from UCLA discovered that these markers change predictably throughout life, allowing for the estimation of biological age through complex algorithms.
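
A rough sketch of how such an epigenetic clock works: methylation levels at many genomic sites are combined into a weighted sum fitted to predict age. The Python example below uses simulated data and ordinary least squares; actual clocks such as Horvath’s are trained with penalised (elastic-net) regression on hundreds of CpG sites and thousands of reference samples.

```python
# Simulated "epigenetic clock": fit weights that map methylation to age.
import numpy as np

rng = np.random.default_rng(42)
n_people, n_sites = 300, 50
age = rng.uniform(20, 80, n_people)
site_slopes = rng.normal(0, 0.005, n_sites)       # each site drifts slowly with age
methylation = 0.5 + np.outer(age, site_slopes) + rng.normal(0, 0.02, (n_people, n_sites))

X = np.column_stack([np.ones(n_people), methylation])
weights, *_ = np.linalg.lstsq(X, age, rcond=None)
predicted_age = X @ weights

# "Age acceleration": how much older or younger the clock says you are than you are.
print(np.round((predicted_age - age)[:5], 1))
```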

Yet, epigenetics isn’t the sole estimation approach. Various other biological markers—such as blood proteins, telomeres, urine metabolites, facial imagery, and even X-rays—can also inform biological age assessments. However, the inconsistency between these different measurement methods raises concerns about their reliability.

For instance, according to a recent analysis of the CALERIE trial, which examined caloric restriction as an anti-aging intervention, five different aging clocks were applied to a cohort of 220 adults. Only two showed a significant decline in biological age among calorie-restricted participants, leaving questions about which clock to trust—a dilemma faced by both individuals and researchers utilizing aging assessments.

Another challenge is the misleading perception of accuracy. Most companies report a single biological age figure without indicating a margin of error, leading to potential misinterpretations. A recent study published in npj Aging pointed out that many existing biological clocks do not perform as anticipated, which could lead to unnecessary anxiety regarding health outcomes.

But does this imply that biological clocks are without value? Not entirely. Research indicates that many limitations associated with these methods could be addressed. According to Dmitri Kulikov and fellow researchers from the Skolkovo Institute of Science and Technology, overcoming these challenges is feasible, although determining whether it is worth pursuing these improvements remains an open question.

Meanwhile, innovative solutions are on the horizon. Emerging methodologies that utilize artificial intelligence, particularly large-scale health models (LHMs), hold promise. These AI-driven models, akin to those powering systems like ChatGPT, analyze vast datasets to assess individual risks of mortality and of developing age-related conditions. A recent study in Nature Medicine suggests these modern methods may outperform traditional biological clocks.

As these models continue to evolve, they may address many of the current limitations of biological age assessments. So if you are contemplating determining your biological age, proceed with caution. If you’ve already done so, take the results with a degree of skepticism. In future reflections on aging, I promise to approach the subject with a more critical perspective, blending newfound knowledge with experience.

Source: www.newscientist.com

SpaceX Starlink Satellites Made Nearly 300,000 Collision Avoidance Maneuvers in 2025

Long exposure photo depicting satellites in the night sky of the Northern Hemisphere

Credit: Alan Dyer/VWPics/Universal Images Group (via Getty Images)

A recent report submitted by SpaceX to the US Federal Communications Commission (FCC) revealed intriguing insights about the Starlink satellite network. Notably, the report states that Starlink satellites executed approximately 300,000 collision avoidance maneuvers in 2025.

Starlink, a substantial constellation of satellites, provides internet service worldwide. Since the launch of the first Starlink satellite in 2019, the fleet has expanded to around 9,400 satellites, constituting 65 percent of all operational satellites in Earth’s orbit.

Due to the potential hazards posed by satellite collisions, such as generating debris and making certain orbital paths unusable, the FCC mandates that SpaceX provides biannual updates on Starlink’s safety protocols.

In its latest report, dated December 31, SpaceX disclosed that its Starlink satellites carried out around 149,000 collision avoidance maneuvers from June to November 2025. These maneuvers are essential when two satellites are deemed to be in close proximity and at risk of colliding.

The industry standard allows for operation with a collision risk of 1 in 10,000; however, SpaceX adopts a more cautious approach, allowing only a risk of 3 in 10,000,000.
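
In practice that is a threshold rule applied to each predicted close approach: if the estimated probability of collision exceeds the operator’s limit, a maneuver is planned. The Python sketch below uses the two thresholds quoted above with invented conjunction probabilities, simply to show how much more often the stricter rule triggers.

```python
INDUSTRY_THRESHOLD = 1e-4     # 1 in 10,000
STARLINK_THRESHOLD = 3e-7     # 3 in 10,000,000, the stricter figure quoted above

# Hypothetical collision probabilities for four predicted close approaches.
conjunctions = [2e-4, 5e-6, 8e-7, 1e-7]

for p in conjunctions:
    strict_rule = "maneuver" if p > STARLINK_THRESHOLD else "no action"
    industry_rule = "maneuver" if p > INDUSTRY_THRESHOLD else "no action"
    print(f"P(collision) = {p:.0e}: stricter rule -> {strict_rule}, industry rule -> {industry_rule}")
```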

This follows SpaceX’s earlier report of 144,000 maneuvers conducted between December 2024 and May 2025, bringing the total for 2025 to nearly 300,000, an increase of about 50% over the roughly 200,000 maneuvers performed in 2024. “It’s an enormous amount of maneuvering,” says Hugh Lewis, a researcher at the University of Southampton in the UK. “This is an exceptionally high figure.”

While most other satellite operators worldwide refrain from publicizing their maneuver data, traditional satellites typically perform only a few maneuvers a year. SpaceX reports that each of its satellites may execute up to 40 maneuvers annually.

Lewis expects the number of maneuvers to keep climbing, potentially reaching one million a year by 2027. As several other mega-constellations from the US and China are deployed, the likelihood of collisions will rise. “From a physics standpoint, that’s not ideal,” Lewis warns. “We are heading toward a concerning scenario in orbit. This situation is unsustainable.”

Additionally, SpaceX revealed in its report that it had numerous close encounters with other satellites, citing a Chinese satellite named Honghu 2, which has approached Starlink satellites over 1,000 times, likely due to overlapping operational orbits.

“This demonstrates how SpaceX effectively dominates that orbital space,” asserts Samantha Lawler from the University of Regina, Canada. Most Starlink satellites orbit at altitudes between 340 and 570 kilometers. “According to the Outer Space Treaty, all nations have access to every part of space, and they are effectively occupying those areas.”

Furthermore, SpaceX provided details about a Starlink satellite that exploded in December, creating dozens of debris pieces. The explosion was attributed to “suspected hardware failure,” and the malfunctioning component has been “identified and removed” from future designs.

Starlink employs autonomous systems to navigate collisions and manage the extensive number of maneuvers needed. However, SpaceX mentioned that one incident involved a spacecraft from Japan’s Astroscale that “performed an unannounced maneuver” potentially increasing the collision risk with a Starlink satellite.

Astroscale disputes this account, stating that the maneuver was publicly announced and executed in compliance with Japan’s orbit maintenance guidelines. SpaceX did not respond to media inquiries regarding the situation.

However, the most noteworthy statistic remains the total number of maneuvers conducted. “They are conducting operations efficiently and effectively,” Lawler comments. “But if they make an error, we could face significant consequences.”

Source: www.newscientist.com

Score Review: Can We Combat the Challenges of a Rules-Based World? Insights from a New Book

Chef skillfully slicing tomatoes from an overhead perspective

Rules-based cooking is enticing due to its capacity to yield highly reproducible outcomes.

FG Trade/Getty Images

Score
C. Thi Nguyen
Allen Lane

Last year, I penned an article for New Scientist detailing how a physicist unveiled the precise method to flawlessly cook the Italian classic, cacio e pepe. The emulsion of black pepper, pecorino cheese, and water can often turn clumpy. Ivan di Terlizzi and his team at the Max Planck Institute for the Physics of Complex Systems experimented with cacio e pepe numerous times to perfect a method that guarantees consistent results.

This topic resonated with many readers. When I recently caught up with one of the scientists involved, he suggested the draw might stem from their research’s ability to unveil order in what can otherwise appear chaotic, especially when examined through the lens of mathematics and precision.

While this perspective is captivating, it also carries risks, as C. Thi Nguyen discusses in his book, Score: How to Stop Playing Someone Else’s Game. Formerly a food critic, Nguyen is now a philosophy professor at the University of Utah in Salt Lake City. He cautions that recipes promising flawless results can obscure the essential values of food as “an exercise of taste and preference.”

By employing scientific rigor—exact measurements and meticulous procedures—the outcomes might be repeatable, but this approach diminishes the diversity of culinary experiences and the delightful chaos that food can represent.

Cooking serves as merely one instance illustrating how modern tendencies to systematize and impose order on chaotic realities—often driven by state bureaucracies—can result in less-than-ideal outcomes. Nguyen constructs a vivid image of a world rife with such consequences.

Using his own academic journey as a reference, Nguyen contends with the rankings that universities and magazines impose. In philosophy, these ratings often derive from websites evaluating departments based on criteria like publishing prestige or scholars’ ability to address specialized queries, contrasting sharply with the “wild, unruly questions” that initially piqued his interest in philosophy. He began to sense a phenomenon he labeled “value capture,” where metrics intended to guide us begin to dictate our actions.

Nguyen argues for embracing these intricate rule-based systems by engaging in games as a means to explore and remain open to life’s experiences. This book encompasses a diverse array of his recreational pursuits, from Dungeons & Dragons to rock climbing, yoga, and yo-yo.

He effectively illustrates why choosing to abide by the rules within a game serves as a “spiritual vaccine” against societal pressures to conform to institutional scoring systems, like those found in educational assessments. While the notion that games can save us may seem optimistic, Nguyen compellingly presents it as a refreshing perspective.

Many of Nguyen’s concepts aren’t groundbreaking, drawing from numerous influential philosophers and scholars shaping his thought process—including Tim Marshall’s Prisoners of Geography, which explains the influence of geography on geopolitics, and James C. Scott’s Seeing Like a State, which scrutinizes the shortcomings of scientifically planned societies.

However, Nguyen’s imaginative approach to discussing the core themes of his book ensures the conversation remains engaging and thought-provoking. This work provides a compelling starting point for further exploration.

Topics:

Source: www.newscientist.com

Unlocking the Mystery: Why Did Magic Mushrooms Evolve? Discover the Answers Here!

Many mushroom species produce the psychoactive compound psilocybin

YARphotographer/Shutterstock

Magic mushrooms have been providing transformative experiences for thousands of years. Researchers suggest that fungi developed hallucinogenic compounds like psilocybin as a biological defense against insect herbivores.

Psilocybin is the main psychoactive component in magic mushrooms, present in various species found on every continent except Antarctica. Historically, these mushrooms have been utilized by shamans in traditional cultures. Recent studies are investigating psilocybin’s potential as a therapy for mental health disorders, including depression and PTSD.

This psychedelic compound primarily interacts with serotonin receptors in the human brain. However, the evolutionary reasons that lead fungi to produce compounds similar to animal neurotransmitters remain unclear. As John Ellis from the University of Plymouth points out, “There’s speculation that psilocybin serves a protective role against invertebrate fungivores, but these ideas need further exploration.”

To explore the effects of psilocybin on insects, Ellis and his team fed fruit fly (Drosophila melanogaster) larvae with dried magic mushrooms (Psilocybe cubensis). The researchers monitored the larvae’s survival, growth rates, and adult size and development.

Additionally, the team created liquid extracts from the mushrooms, combined them with a minimal amount of sucrose, and observed the larvae’s movements after exposure. “It resembled immersing them in a sweet magic mushroom solution,” says team member Kirsty Matthews Nicholas.

“By quantifying how rapidly the insects crawled, the distances traveled, and their overall movement coordination, we assessed the immediate impacts on their nervous systems,” Nicholas explains.

Results showed that larvae exposed to a magic mushroom diet exhibited significantly reduced survival rates. At lower doses, more than half of the larvae did not survive to adulthood. At higher doses, survival rates dropped to just about 25%.
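To see how such a dose-dependent pattern might be tested, here is a minimal Python sketch using hypothetical counts that loosely mirror the reported pattern; it is not the study’s data or analysis. Survival is tabulated per diet group and compared with a chi-square test of independence.

# Illustrative only: hypothetical counts loosely mirroring the reported pattern
# (over half dying at lower doses, ~25% surviving at higher doses); not the study's data.
from scipy.stats import chi2_contingency

# rows: diet groups; columns: [survived to adulthood, died before adulthood]
counts = {
    "control":   [90, 10],
    "low_dose":  [45, 55],
    "high_dose": [25, 75],
}

table = [counts[group] for group in counts]
chi2, p, dof, _ = chi2_contingency(table)

for group, (alive, dead) in counts.items():
    total = alive + dead
    print(f"{group:>9}: {alive}/{total} survived ({100 * alive / total:.0f}%)")
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.2g}")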

“Among the flies that did reach adulthood, the consequences were evident. Adult flies were smaller, had shortened bodies, and asymmetrical wings – all indicators of developmental stress,” Nicholas reported. “They crawled shorter distances, moved less overall, and displayed erratic movement patterns, leading to slower and less coordinated motion.”

However, it is unlikely that insects experience psychedelia as humans do. “Our findings imply that compounds like psilocybin disrupt essential insect physiology and behavior in ways that could be detrimental rather than psychedelic,” she notes.

The research team also collected and analyzed seven mushroom species from Dartmoor, UK, and found that the invertebrate DNA present on them varied according to whether the fungi produced psilocybin, indicating a specific pattern of interaction between these fungi and the invertebrates that encounter them.

Unexpected outcomes highlighted the complexity of psilocybin’s ecological role. For instance, fruit flies with reduced numbers of the serotonin receptors that psilocybin typically targets were found to be more affected, not less. Furthermore, the flies also reacted adversely to extracts from control mushroom species devoid of psilocybin.

Fabrizio Alberti from the University of Warwick says the findings show that mushrooms lacking psilocybin also produce other metabolites that impair insects’ movement and survival.

“Ongoing research utilizing pure psilocybin on insects will be essential to clarify its ecological significance and explore whether this psychedelic compound evolved as an insect deterrent,” Alberti emphasizes.

This study raises critical challenges in understanding the evolutionary implications of psilocybin-producing fungi. Bernhard Rupp from the University of Innsbruck, Austria, suggests, “Mushrooms producing psilocybin and similar compounds may have significant evolutionary advantages, such as deterring consumption by insects and snails.”


Topics:

Source: www.newscientist.com

Discovering Prototaxites: Unveiling a Hidden Frontier of Complex Life

For over 165 years, the enigmatic Prototaxites has stood as one of the earliest giants to rise from Earth’s barren landscapes, defying simple classification. These towering, columnar organisms dominated the terrestrial environment over 400 million years ago, reaching impressive heights of 8 meters (26 ft), long before the advent of trees. A recent study by paleontologists from the University of Edinburgh and the National Museums of Scotland posits that this mysterious organism was not a giant fungus, as often presumed, but belonged to an entirely extinct lineage of complex life.



Prototaxites dominated terrestrial ecosystems 410 million years ago as the largest living organisms. Image credit: Matt Humpage.

Prototaxites was the first giant life form on Earth’s land surface, emerging during the late Silurian to late Devonian periods, approximately 420 to 370 million years ago.

Recognized for their pillar-like fossils that can reach up to 8 meters, they played a crucial role in early terrestrial ecosystems well before the emergence of trees.

These organisms were widely distributed across ancient terrestrial environments and were likely consumed by arthropods, marking a pivotal stage in land colonization and holding significant ecological importance.

Despite over 165 years of inquiry, the biological identity of Prototaxites remains a topic of heated debate among paleontologists, who dispute whether it was a fungus or belonged to a distinct, entirely extinct lineage of complex eukaryotes.

In a groundbreaking study, Dr. Corentin Rollon and colleagues examined Prototaxites taiti, found preserved in remarkable three-dimensional detail within the 407-million-year-old Rhynie Chert in Aberdeenshire, Scotland.

“The Rhynie Chert is a remarkable treasure trove,” noted Dr. Rollon, the lead author of the study, published this week in Science Advances.

“This site represents one of the oldest fossilized terrestrial ecosystems, and its well-preserved biodiversity enables innovative approaches like machine learning applied to fossil molecular data.”

“Numerous other specimens from the Rhynie Chert are preserved in museum collections, contributing vital context to our findings.”

The research team investigated new specimens of Prototaxites taiti, identifying the largest known example of this species at the site, which facilitated detailed anatomical and molecular comparisons with fossil fungi found in the same deposits.

Microscopic imaging revealed a complex internal structure that diverges significantly from any known fungi.

The fossil comprises three distinct types of tubes, including large, thick-walled tubes featuring annular stripes and dense spherical regions known as medullary points.

These intriguing features form a complex 3D network of interconnected tubes, suggesting a branching pattern unheard of in fungal biology.

Researchers employed infrared spectroscopy and machine learning techniques to compare molecular fingerprints from Prototaxites with those of fossil fungi, arthropods, plants, and bacteria found in the Rhynie Chert.

Fossilized fungi from this location retain characteristic chemical signatures linked to chitin-rich cell walls, which were intriguingly absent in Prototaxites.

The team also searched for perylene, a biomarker associated with pigment compounds produced by specific fungi and previously detected in other Rhynie Chert fossils. No such compounds were found in the Prototaxites sample.

Collectively, the structural, chemical, and biomarker findings imply that Prototaxites does not align with any known fungal group, including the earliest forms of modern fungi.
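The spectral comparison at the heart of this analysis can be illustrated in outline. The sketch below is not the team’s pipeline: it generates toy infrared spectra in which each group carries a distinctive absorbance band, trains a random forest classifier on the labeled groups, and then scores an unknown spectrum against them. The group names, band positions, and data are all invented.

# A minimal sketch, not the authors' pipeline: classifying infrared "molecular
# fingerprints" with a random forest. The spectra are randomly generated
# stand-ins for real measurements; labels and band positions are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_wavenumbers = 200                       # absorbance values per spectrum

def toy_spectra(n, band, label):
    """Generate toy spectra that share a characteristic absorbance band."""
    x = rng.normal(0.1, 0.02, size=(n, n_wavenumbers))
    x[:, band - 5:band + 5] += 0.5        # e.g. a chitin-like band for fungi
    return x, [label] * n

X_fungi, y_fungi = toy_spectra(40, band=60, label="fungus")
X_plant, y_plant = toy_spectra(40, band=140, label="plant")
X = np.vstack([X_fungi, X_plant])
y = y_fungi + y_plant

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# An unknown spectrum lacking either band, standing in for a Prototaxites sample
unknown = rng.normal(0.1, 0.02, size=(1, n_wavenumbers))
clf.fit(X, y)
print("predicted class:", clf.predict(unknown)[0])
print("class probabilities:", dict(zip(clf.classes_, clf.predict_proba(unknown)[0])))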

“This research marks a significant advancement in a 165-year-long discussion,” stated Dr. Sandy Hetherington, the senior author of the paper.

“These organisms represent life forms distinct from those we currently recognize, displaying different anatomical and chemical characteristics from fungi and plants, thereby belonging to a unique, now-extinct lineage of complex life.”

“Our study combines chemical analysis and anatomical insights into Prototaxites, revealing that it cannot be classified within any known fungal group,” explained co-author Laura Cooper.

“Since earlier researchers have already ruled out assignments to other groups of large, complex organisms, we conclude that Prototaxites belonged to an entirely distinct lineage of extinct complex life.”

“Prototaxites thus represents an independent evolutionary experiment in building large and complex organisms, known to us only through exceptionally preserved fossils.”

_____

Corentin C. Rollon et al. 2026. Prototaxites fossils are structurally and chemically distinct from both extinct and extant fungi. Science Advances 12 (4); doi: 10.1126/sciadv.aec6277

Source: www.sci.news

Biologist Resurrects 3.2 Billion-Year-Old Enzyme: Discoveries in Ancient Biology

A groundbreaking research team from the University of Wisconsin-Madison has successfully reverse-engineered a primitive nitrogen-fixing enzyme. This discovery sheds light on how life thrived before the Earth was transformed by oxygen and establishes reliable chemical markers for detecting extraterrestrial life.



Resurrection and characterization of an ancestral nitrogenase. Image credit: Rucker et al., doi: 10.1038/s41467-025-67423-y.

Led by Professor Betül Kaçar, the research focuses on an essential enzyme known as nitrogenase, which plays a pivotal role in converting atmospheric nitrogen into bioavailable forms.

“We selected an enzyme that significantly influences life on Earth and investigated its evolutionary history,” Professor Kaçar stated.

“Without nitrogenase, the existence of modern life as we know it would be impossible.”

Traditionally, scientists have depended on geological evidence to reconstruct Earth’s historical life.

However, significant fossils and rock samples are scarce and often require fortuitous discovery.

Professor Kaçar and her team view synthetic biology as a valuable tool to bridge these gaps, allowing them to reconstruct specific ancient enzymes, insert them into living microorganisms, and study them in modern laboratory settings.

“The Earth of 3 billion years ago was vastly different from the world we recognize today,” remarked Dr. Holly Rucker.

“Before the Great Oxidation Event, the atmosphere was rich in carbon dioxide and methane, and life predominantly consisted of anaerobic microorganisms.”

“Understanding how these microorganisms accessed vital nutrients like nitrogen enhances our comprehension of how life persisted and evolved before oxygen-dependent organisms began to alter the planet.”

“Though fossilized enzymes are unavailable for study, these enzymes can leave discernible isotopic traces, measurable in rock samples.”

“Much of the prior research assumed ancient enzymes produced isotopic signatures akin to modern enzymes,” added Dr. Rucker.

“This holds true for nitrogenase; the isotopic traces we observe from ancient times correspond with modern signatures, providing deeper insights into the enzyme itself.”

The researchers discovered that ancient nitrogenase enzymes, despite having different DNA sequences, maintain the same mechanisms for isotopic signatures observed in the rock record.
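For readers unfamiliar with how such isotopic traces are expressed, nitrogen isotope biosignatures are conventionally reported in delta notation relative to atmospheric N2. A short worked example follows, using made-up isotope ratios purely for illustration.

# A worked illustration of the delta notation behind nitrogen isotope
# biosignatures; the sample ratios below are invented for demonstration.
R_AIR = 0.0036765          # 15N/14N ratio of atmospheric N2, the reference standard

def delta_15N(r_sample: float) -> float:
    """delta-15N in per mil, relative to atmospheric N2."""
    return (r_sample / R_AIR - 1.0) * 1000.0

# Hypothetical 15N/14N ratios of biomass fixed by a modern and an ancestral-style
# nitrogenase; identical fractionation gives identical delta values.
for name, ratio in [("modern enzyme", 0.003665), ("ancestral enzyme", 0.003665)]:
    print(f"{name}: d15N = {delta_15N(ratio):+.1f} per mil")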

“As astrobiologists, our understanding of Earth helps us comprehend the potential for life elsewhere in the universe,” Professor Kaçar emphasized.

“The quest for life begins right here on our 4-billion-year-old planet.”

“To grasp future possibilities and life beyond our planet, we must first understand our own history.”

The results were published in the online journal Nature Communications.

_____

Rucker et al. 2026. The revived nitrogenase reproduces the standard N isotope biosignature spanning two billion years. Nat Commun 17,616; doi: 10.1038/s41467-025-67423-y

Source: www.sci.news

Ancient Bacteria Discovery Redefines Syphilis Origins: A Breakthrough in Medical History

Treponema pallidum Bacteria Linked to Syphilis and Related Diseases

Source: Science Photo Library / Alamy

New research reveals that traces of Treponema pallidum—the bacteria responsible for syphilis—have been identified in the bones of ancient inhabitants of Colombia, dating back over 5,000 years. This discovery suggests that syphilis was infecting humans far earlier than previously believed, prior to the advent of intensive agriculture, which many experts think may have facilitated its spread.

Currently, Treponema pallidum encompasses three subspecies, which cause syphilis, bejel, and yaws. The origins and transmission pathways of these diseases remain topics of scientific debate. Although ancient DNA and infection markers on bones offer insights, such evidence is often limited and ambiguous.

In a groundbreaking study, researchers analyzed DNA from 5,500-year-old remains discovered in the Bogotá savannah. The unexpected finding of Treponema pallidum in a human leg bone provides critical evidence of its historical prevalence.

“This discovery was entirely unanticipated, as there was a lack of skeletal evidence indicating an infectious disease,” notes Nasreen Broumandkoshbacht from the University of California, Santa Cruz.

Many scholars have long posited that the majority of diseases affected humans only after the rise of intensive agriculture, which led to denser populations. However, this individual lived in a contrasting setting—small, nomadic hunter-gatherer bands that maintained close contact with wild animals.

“These results shed light on the extensive evolutionary history of these organisms,” states Davide Bozzi from the University of Lausanne, Switzerland. “They reveal longstanding relationships between the bacterium and human populations.”

When the researchers, including Broumandkoshbacht and Bozzi, compared the ancient genome with contemporary ones, they found that the strain belonged to a distinct lineage, separate from any known modern relatives. This indicates that early relatives of syphilis were already diversifying and infecting humans in the Americas millennia ago, carrying many of the same genetic traits that make present-day strains particularly pathogenic.
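The logic of placing an ancient genome relative to modern strains can be sketched crudely with a clustering exercise. The example below uses a hypothetical table of pairwise SNP differences, not the study’s data, and simple hierarchical clustering rather than the full phylogenomic methods such work actually relies on.

# Illustrative only: a toy matrix of pairwise SNP differences and simple
# hierarchical clustering, standing in for the full phylogenomic analysis.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import squareform

labels = ["modern_syphilis_1", "modern_syphilis_2", "modern_bejel", "ancient_Colombia"]

# Hypothetical pairwise SNP differences (symmetric, zero diagonal)
snp_diff = np.array([
    [  0,  40, 180, 400],
    [ 40,   0, 190, 410],
    [180, 190,   0, 420],
    [400, 410, 420,   0],
])

condensed = squareform(snp_diff)             # condensed form required by linkage
tree = linkage(condensed, method="average")
order = dendrogram(tree, labels=labels, no_plot=True)["ivl"]
print("leaf order:", order)                  # the ancient genome sits on its own branch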

The findings imply that these pathogens were not only early residents in the Americas but may have been affecting human populations globally for much longer than previously assumed.

Rodrigo Barquera, a researcher at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, suggests that this ancient strain might be linked to an elusive “missing” pathogen, Treponema carateum, known primarily from its physical characteristics rather than its genetic makeup.

Kertu Majumdar, a researcher at the University of Zurich in Switzerland, posits, “The genomes of even older organisms might provide insights into a variety of extinct lineages and diseases caused by these pathogens.”

For Bozzi, unearthing the evolutionary adaptations of pathogens like syphilis is crucial for understanding their genetic attributes that enhance their virulence in new hosts.

Topics:

Source: www.newscientist.com

The Brain’s Vast Interconnectedness: The Revolutionary Idea of the Century


You’ve likely encountered the parable of the blind men and the elephant, where each individual’s perspective is limited to one part, leading to a distorted understanding of the whole. This concept resonates deeply in neuroscience, which has historically treated the brain as a collection of specialized regions, each fulfilling unique functions.

For decades, our insights into brain functionality arose from serendipitous events, such as the case of Phineas Gage, a 19th-century railroad worker who dramatically altered personality following a severe brain injury. More recent studies employing brain stimulation have linked the amygdala with emotion and the occipital lobe with visual processing, yet this provides only a fragmented understanding.

Brain regions demonstrate specialization, but this does not encapsulate the entire picture. The advent of imaging technologies, particularly functional MRI and PET scans in the late 1990s and early 2000s, revolutionized our comprehension of the brain’s interconnectedness. Researchers discovered that complex behaviors stem from synchronized activity across overlapping neural networks.

“Mapping brain networks is playing a crucial role in transforming our understanding in neuroscience,” states Luis Pessoa from the University of Maryland.

This transformative journey commenced in 2001 when Marcus Raichle, now at Washington University in St. Louis, characterized the Default Mode Network (DMN). This interconnected network activates during moments of rest, reflecting intrinsic cognitive processes.

In 2003, Kristen McKiernan, then at the Medical College of Wisconsin, and her team showed that the DMN is most active during internally focused mental states such as daydreaming and introspection, providing a “resting state” baseline against which task-related brain activity can be compared. Researchers began to correlate DMN activity with sophisticated behaviors, including emotional intelligence and theory of mind.
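The basic idea behind mapping such networks is simple to sketch: record activity over time in different regions, correlate the signals, and treat strong correlations as network links. The toy example below uses synthetic signals and illustrative region names, not real imaging data or any specific study’s method.

# A minimal sketch of the idea behind mapping brain networks: correlate regional
# activity over time and keep strong correlations as edges. Signals are synthetic
# and region names are illustrative; this is not any study's actual method.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
t = np.arange(300)                              # 300 time points of simulated signal
shared = np.sin(t / 10)                         # slow fluctuation shared by DMN-like regions

regions = {
    "medial_prefrontal": shared + 0.3 * rng.normal(size=t.size),
    "posterior_cingulate": shared + 0.3 * rng.normal(size=t.size),
    "angular_gyrus": shared + 0.3 * rng.normal(size=t.size),
    "primary_visual": rng.normal(size=t.size),  # unrelated activity
}

names = list(regions)
corr = np.corrcoef(np.array([regions[n] for n in names]))   # pairwise correlations

G = nx.Graph()
G.add_nodes_from(names)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if corr[i, j] > 0.5:                    # arbitrary threshold for an edge
            G.add_edge(names[i], names[j], weight=round(float(corr[i, j]), 2))

print("functional connectivity edges:", list(G.edges(data=True)))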

As discoveries proliferated across other networks—pertaining to attention, language, emotion, memory, and planning—our understanding of mental health and neurodiversity evolved. These neural differences are now thought to be linked with various neurological conditions, including Parkinson’s disease, PTSD, depression, anxiety, and ADHD.

Network science has emerged as a pivotal field, enhancing our comprehension of disorders from autism, characterized by atypical social salience networks—those that detect and prioritize salient social cues—to Alzheimer’s disease, where novel research indicates abnormal protein spread via network pathways. We also acknowledge the inspiration it provides for developing artificial neural networks in AI systems like ChatGPT.

Neural networks have not only reshaped our understanding of brain functionalities but also the methodologies for diagnosing and treating neurological disorders. While we might not yet perceive the entirety of the elephant, our view is undeniably clarifying as science progresses.

Topic:

Source: www.newscientist.com

Inventing Net Zero: The Century’s Most Innovative Idea for a Sustainable Future


In 2005, physicists David Frame and Myles Allen were headed to a scientific conference in Exeter, England. According to Frame, they were “playing around” with climate models in preparation for their presentation.

At that time, most research centered on stabilizing the concentration of greenhouse gases in the atmosphere to avert severe climate change. However, scientists faced challenges in predicting how much the planet would warm if these concentrations reached specific levels.

Frame and Allen approached the issue from a different angle. Instead of focusing on atmospheric concentrations, they examined emissions, asking what would happen if humanity simply stopped emitting carbon dioxide. Running a climate model on the train, they found that global temperatures settled at a new, stable level. In other words, global warming would halt once humanity reached “net-zero” carbon dioxide emissions. “It was pretty cool to sit on the train and see these numbers for the first time and think, ‘Wow, this is a big deal,’” Frame recalled.

This groundbreaking presentation and the subsequent Nature paper published in 2009 reshaped the thinking within the climate science community. Prior to the net-zero concept, it was generally accepted that humans could emit around 2.5 gigatons annually (approximately 6% of current global emissions) while still stabilizing global temperatures. However, it became clear that to stabilize the climate, emissions must reach net zero, balanced by equivalent removals from the atmosphere.
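The arithmetic behind this insight can be sketched with the now-standard observation that CO2-driven warming scales roughly with cumulative emissions (the transient climate response to cumulative emissions, or TCRE). The toy calculation below uses rounded, illustrative numbers to show why temperature stops rising once annual emissions hit zero.

# A back-of-envelope sketch: CO2-driven warming is roughly proportional to
# cumulative emissions, so once annual emissions reach zero the cumulative total,
# and hence the temperature, stops rising. Values are rounded and illustrative.
TCRE = 0.00045   # approx. deg C per GtCO2 emitted (~0.45 C per 1000 GtCO2)

def temperature_path(annual_emissions):
    """Warming proportional to cumulative emissions, year by year (GtCO2)."""
    cumulative, temps = 0.0, []
    for e in annual_emissions:
        cumulative += e
        temps.append(TCRE * cumulative)
    return temps

# 30 years at ~40 GtCO2/yr, a ten-year decline to net zero, then 20 years at zero
emissions = [40] * 30 + [40 - 4 * i for i in range(1, 11)] + [0] * 20
temps = temperature_path(emissions)

print(f"warming when emissions reach zero: {temps[39]:.2f} C above the start")
print(f"warming 20 years later:            {temps[-1]:.2f} C (no further rise)")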

The global consensus surrounding the need to achieve net zero CO2 emissions rapidly gained traction, culminating in a landmark conclusion in the 2014 Intergovernmental Panel on Climate Change (IPCC) report. The subsequent question was about timing: when must we reach net zero? At the 2015 Paris Agreement, nations committed to limiting temperature increases as close to 1.5°C as feasible, aiming for net-zero emissions by around mid-century.

Almost immediately, governments worldwide faced immense pressure to establish net-zero targets. Hundreds of companies joined the movement, recognizing the economic opportunities presented by the transition to clean energy. This “net-zero fever” has led to some dubious commitments that rely excessively on the world’s forests and wetlands to absorb human pollution. Nevertheless, the shift has altered the course of this century: approximately 75% of global emissions are now covered by net-zero pledges, and projections for warming over this century have decreased from around 3.7–4.8°C to 2.4–2.6°C under existing climate commitments.

Topics:

Source: www.newscientist.com

Did Ancient Giant Kangaroos Have the Ability to Jump Despite Their Size?

Procoptodon prehistoric kangaroo

Procoptodon goliah: The 2-Meter-Tall Kangaroo

Credit: Michael Long/Science Photo Library

New bone analysis suggests even the colossal kangaroos of ancient Australia might have been capable of jumping.

During the Pleistocene, some kangaroos weighed more than twice as much as today’s species. One subset, the sthenurines, reached such enormous sizes that their ability to jump was doubted, leading researchers to believe they primarily walked on their hind legs.

“When discussing giant kangaroos, the sthenurines are a frequent topic,” says Megan Jones from the University of Manchester, UK. “These unique kangaroos feature very short, box-shaped skulls and a single toe on each foot. The largest male red kangaroos today average around 90 kilograms, while the biggest sthenurines weighed nearly 250 kilograms.”

Among these giants is Procoptodon goliah, the largest known kangaroo species, which stood approximately 2 meters tall and went extinct around 40,000 years ago.

Debate has persisted regarding how much stress hopping would have placed on their feet, prompting Jones and her team to analyze bone measurements from 67 macropod species, encompassing modern kangaroos, wallabies, potoroos, bettongs, rat-kangaroos, and the extinct giant kangaroos.

They measured leg bones (including the femur, tibia, and calcaneus) and gathered body weight data to estimate tendon sizes and their endurance under stress.

“The kangaroo’s Achilles tendon operates close to its breaking point, but it serves a vital role,” states Jones. “It enables kangaroos to store elastic energy for the next hop. Simply scaling up today’s kangaroos would present challenges.”

Yet the ancient giants weren’t simply scaled-up versions of modern kangaroos. Their shorter legs and wider calcaneus (heel) bones reduced the bending forces experienced during hopping and left room for larger tendons capable of withstanding the corresponding loads.
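The underlying scaling argument is easy to make concrete. The rough calculation below assumes simple geometric scaling and an illustrative figure for how much thicker the giants’ tendons were; the numbers are not taken from the study’s measurements.

# A rough illustration of the scaling argument, using assumed figures rather
# than the study's measurements. Under simple geometric scaling, the forces on
# a tendon grow with body mass while its cross-sectional area grows more slowly,
# so tendon stress rises roughly with mass^(1/3).
modern_mass = 90.0      # kg, a large male red kangaroo
giant_mass = 250.0      # kg, among the biggest sthenurines

# Stress increase if the giant were just a scaled-up modern kangaroo
geometric_stress_ratio = (giant_mass / modern_mass) ** (1 / 3)
print(f"geometrically scaled giant: tendon stress x{geometric_stress_ratio:.2f}")

# If modern tendons already run close to their safety limit, ~40% extra stress is
# a problem. A relatively thicker tendon compensates, since stress falls in
# proportion to the added cross-sectional area (the 1.5 here is hypothetical).
relative_area_increase = 1.5
adjusted_stress_ratio = geometric_stress_ratio / relative_area_increase
print(f"with disproportionately larger tendons: tendon stress x{adjusted_stress_ratio:.2f}")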

“This evidence indicates they weren’t mechanically restricted from jumping,” asserts Jones. “Whether they actually jumped, however, is a different question.”

While hopping likely wasn’t their primary locomotion mode, it might have been used sporadically for rapid movement, Jones explains.

This study reinforces the view that the iconic kangaroo hop is likely an adaptable feature within a surprisingly varied locomotor repertoire, according to Benjamin Kear from Uppsala University, Sweden. Over millions of years, this versatility has contributed to the ecological success of macropods.

The flexibility remains clear today; red kangaroos, often perceived as constant hoppers, can also utilize their tails as a fifth limb for walking. “Tree kangaroos exhibit diverse locomotion styles—they walk, jump, bounce, and can even move on two legs,” adds Jones.

Topics:

  • Evolution/
  • Animal Behavior

Source: www.newscientist.com

Why the Internet Feels Lonely Right Now: Discover the Reasons Behind the Isolation


Topanga Canyon, Topanga, California, USA - loneliness in the digital age

Exploring the Loneliness of Digital Connection

Brenna Panaguiton/Unsplash

In today’s fast-paced digital landscape, I often find myself glued to my smartphone. Like many in the United States, I turn to various apps for news, from social media posts to podcasts and newsletters. However, amidst the chaos—like the unfolding protests in Minneapolis—I’ve noticed an unsettling trend: the more I consume, the lonelier I feel.

This isn’t a new phenomenon; sociologists have been discussing it for nearly 80 years. In 1950, the scholars David Riesman, Nathan Glazer, and Reuel Denney published their influential book The Lonely Crowd. They argued that the rise of consumerism and mass media had produced a new personality archetype, acutely aware of loneliness, which they labeled “other-directed.” That description seems eerily relevant in our current social media age teeming with AI interactions.

Other-directed individuals are constantly attuned to their peers, using social cues to shape their choices about purchases, fashion, and opinions. Because their values come from contemporaries rather than from earlier generations, they tend to prioritize present experiences over tradition. Riesman and his colleagues cautioned that an excessive focus on others can lead to a crippling fear of solitude.

These traits are starkly embodied in our engagement with social media, characterized by peer pressure, superficial connections, and a growing surveillance culture. As we monitor one another, companies develop applications that simulate camaraderie, leaving us more isolated. This illustrates the inherent risks of AI chatbots engineered to masquerade as companions.


When we shape our identity based on others’ expectations, we obscure our deeper selves.

There exists a contradiction within our social desires. While we yearn for inclusion, we also crave individuality. Riesman et al. contend that consumerism often creates a faux sense of unique identity. Consider the experience of browsing a rack of nearly identical polo shirts; selecting one may foster feelings of individuality, but fundamentally, they remain similar to one another.

This false sense of personalization frequently manifests in the algorithms governing our online interactions. Platforms like TikTok curate “For You” feeds filled with content aligned with our tastes, yet that personalization is governed by algorithms we cannot control, ones that ultimately push us toward conformity.

As individuals shaped by external influences, we often find ourselves expressing our identities through group interactions, as advertisements prompt us to “join the conversation.” We generate content for the internet, portraying our lives through the lens of shared experiences.

Still, many of us wrestle with the lingering sensation of loneliness. This disconnect can be attributed to the variance between real-life relationships and those formed in digital spaces. Moreover, it may relate to the personality shift chronicled in The Lonely Crowd. By focusing excessively on others, we risk neglecting our genuine, idiosyncratic desires. Without self-awareness, meaningful connections with others become elusive.

Riesman and his collaborators proposed two solutions. First, they emphasized the need to reclaim our leisure time from the all-consuming media landscape. They argued that our vigilance towards peers often resembles labor, advocating for more playful engagement with life. Their second suggestion urged individuals, particularly children, to explore new identities and experiences. Reflect on activities you enjoy when not dictated by external definitions of “fun.” Try something novel, don vibrant or whimsical clothing, or chat with an unfamiliar neighbor. Allow yourself to be surprised and embrace experimentation.

Remember, neither a “For You” feed nor an AI chatbot can define your identity. So, take a break from your devices, engage in unexpected activities, and rediscover who you are.

What I Am Reading
Notes from the Kingslayer, A captivating narrative of rebellion and familial bonds by Isaac Ferman.

What I See
Fierce rivalry, Because I know how to embrace enjoyment.

What I Am Working On
I’m exploring Sogdiana, my favorite ancient diaspora culture.

Annalee Newitz is a science journalist and author. Their latest book is Automatic Noodles. They co-host the Hugo Award-winning podcast Our Opinion Is Correct. Follow @annaleen and visit their website: techsploitation.com.

Topics:


Source: www.newscientist.com

Impact of Abnormal Oral Microbiome on Obesity: Key Characteristics and Insights

Bacteria in the oral cavity

Oral Bacteria (Blue) on Human Cheek Cells (Yellow) in Scanning Electron Micrograph

Steve Gschmeisner/Science Photo Library

Recent research has revealed that individuals with obesity exhibit unique oral microbiome characteristics. This finding could pave the way for early detection and prevention strategies for obesity.

The diverse community of microorganisms in our gut is known to influence weight gain and has been widely linked to obesity and other metabolic conditions. The mouth, meanwhile, harbors up to 700 species of bacteria, which have also been implicated in obesity and overall health.

“Given that the oral microbiome is the second largest microbial ecosystem in the human body, we aimed to investigate its association with systemic diseases,” says Ashish Jha, from New York University, Abu Dhabi.

Jha and his team analyzed saliva samples from 628 adults in the United Arab Emirates, 97 of whom were classified as obese. They compared these samples with a control group of 95 individuals of healthy weight, similar in age, gender, lifestyle, oral health, and tooth brushing habits.

The analysis showed that the oral microbiome of obese individuals had a higher abundance of inflammation-associated bacteria, such as Streptococcus parasanguinis and Actinomyces oris, as well as Oribacterium sinus, which produces lactic acid and is linked to poor metabolic health.

Jha and his colleagues identified 94 distinct differences in metabolic pathways between the two groups. Obese participants demonstrated enhanced mechanisms for carbohydrate metabolism and the breakdown of histidine, while their capability to produce B vitamins and heme—crucial for oxygen transport—was reduced.
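Group comparisons of this kind typically boil down to testing each taxon or pathway between obese and control samples and correcting for multiple testing. The sketch below simulates relative abundances for three taxa to show the general workflow; the numbers are invented, not the study’s measurements.

# A minimal sketch of the kind of group comparison behind such findings: test each
# taxon's relative abundance between obese and control samples, then correct for
# multiple testing. The abundances are simulated, not the study's measurements.
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(42)
n_obese, n_control = 97, 95

# (mean relative abundance in obese group, mean in control group) - invented values
taxa = {
    "Streptococcus parasanguinis": (0.08, 0.05),
    "Oribacterium sinus":          (0.04, 0.02),
    "unchanged taxon":             (0.03, 0.03),
}

names, p_values = [], []
for name, (mu_obese, mu_control) in taxa.items():
    obese = rng.normal(mu_obese, 0.01, n_obese).clip(min=0)
    control = rng.normal(mu_control, 0.01, n_control).clip(min=0)
    _, p = mannwhitneyu(obese, control, alternative="two-sided")
    names.append(name)
    p_values.append(p)

rejected, q_values, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
for name, q, significant in zip(names, q_values, rejected):
    print(f"{name:<28} FDR-adjusted p = {q:.3g}  significant: {significant}")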

Metabolites notably generated in obese individuals include lactate, histidine derivatives, choline, uridine, and uracil, which are associated with metabolic dysfunction indicators such as elevated triglycerides, liver enzymes, and blood glucose levels.

“When we analyze these findings collectively, a metabolic pattern surfaces. Our data indicates that the oral environment in obesity is characterized by low pH, high carbohydrate levels, and pro-inflammatory conditions,” notes Lindsey Edwards from King’s College London. “This study offers compelling evidence that the oral microbiome may reflect and contribute to the metabolic changes associated with obesity.”

Currently, these findings suggest a correlation rather than causation. “While some associations are surprising, we cannot determine cause and effect as of now, which remains our next focus,” Jha states.

To explore whether the oral microbiome contributes to obesity or is modified by it, Jha and his team plan further experiments analyzing both saliva and gut microbiomes to investigate potential microbial and metabolic transfers.

Professor Jha believes this is plausible, as the mouth’s extensive blood vessel network facilitates nutrient absorption and taste sensing, potentially allowing metabolites direct access to the bloodstream, influencing other bodily systems.

Establishing a causal connection will also necessitate randomized controlled trials and detailed metabolic pathway analyses, according to Edwards.

As dietary patterns evolve, specific food components may become more readily metabolized by certain bacteria, leading to increased microbial activity that can influence cravings and potentially lead to obesity, Jha explains. For instance, uridine has been shown to promote higher calorie intake.

If oral bacteria are demonstrated to influence obesity, Edwards suggests it could lead to innovative interventions, such as introducing beneficial oral microbes through gels, using prebiotics to foster specific bacterial growth, or employing targeted antimicrobials. “Behavioral strategies, like reducing sugar intake, can also significantly contribute to obesity prevention,” she adds.

Even if the oral microbiome acts as a consequence rather than a cause of obesity, its assessment can still provide valuable insights. Saliva tests can easily detect distinct microbial changes, which Jha believes could be useful for early obesity detection and prevention strategies.

Topic:

Source: www.newscientist.com

Does Limiting Social Media Use Benefit Teens? New Evidence Revealed

Teens in social media trial

Teens in Trial to Limit Social Media Use: A Shift Towards Real-life Interaction

Daniel de la Hoz/Getty Images

A groundbreaking study is exploring the effects of reduced social media usage on teens’ mental health and well-being. While results are not expected until mid-2027, ongoing discussions suggest that some governments might institute bans on social media for teenagers before the outcomes are known.

The merits of such a ban are still being debated, including in the courts. Despite limited evidence, Australia has introduced restrictions for minors under 16, and the UK government is considering similar measures.

This trial prioritizes young people’s voices by involving them in the planning process. Historically, children and adolescents have been excluded from critical discussions concerning social media design and management.

“Involving kids is crucial,” states Pete Etchells from Bath Spa University, UK, who is not directly involved in the study.

“There is ample evidence pointing to the potential harms of social media on young users, some of which can be severe,” notes Amy Orben, co-leader of the trial, emphasizing the uncertainty regarding the broader impact of social media time.

To obtain clearer answers, large-scale studies are necessary. The IRL trial takes place in Bradford, England, aiming to recruit around 4,000 participants aged 12 to 15 across 10 schools. A bespoke app will be used to monitor social media engagement.

Half of the participants will face specific time limits on certain apps like TikTok, Instagram, and YouTube, with no restrictions on messaging apps like WhatsApp. “Total usage will be capped at one hour a day, with a curfew from 9 PM to 7 AM,” explains Dan Lewar from the Bradford Health Data Science Center, who co-leads the trial. This is significant, considering that the average social media usage for this age group is about three hours daily.
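As a purely illustrative aside, the cap-and-curfew rule is simple enough to express in a few lines of code; the logic below is a hypothetical sketch, not the trial’s actual monitoring app.

# Purely illustrative logic, not the trial's actual app: decide whether a capped
# app may be used, given a one-hour daily limit and a 9 PM to 7 AM curfew.
from datetime import datetime, time

DAILY_CAP_MINUTES = 60
CURFEW_START = time(21, 0)
CURFEW_END = time(7, 0)

def usage_allowed(now: datetime, minutes_used_today: int) -> bool:
    """Return True if a capped app may be opened at this moment."""
    in_curfew = now.time() >= CURFEW_START or now.time() < CURFEW_END
    under_cap = minutes_used_today < DAILY_CAP_MINUTES
    return under_cap and not in_curfew

print(usage_allowed(datetime(2026, 10, 5, 17, 30), minutes_used_today=40))  # True
print(usage_allowed(datetime(2026, 10, 5, 22, 0), minutes_used_today=10))   # False: curfew
print(usage_allowed(datetime(2026, 10, 5, 17, 30), minutes_used_today=65))  # False: over cap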

Importantly, participants will be randomized by year group rather than individually, with one year group at a school (for example, Year 8) serving as the control while another (Year 9) faces the restrictions. The aim is to keep friends in the same year under the same rules: “If a child’s social media is restricted, but their friends are active online post-curfew, they may feel excluded,” Orben explains.

Lewar emphasizes that the trial was designed collaboratively with teens. “They opposed a blanket ban,” he notes.

The comprehensive study will span six weeks around October, with preliminary results anticipated in mid-2027.

Orben emphasizes that this trial will yield more precise data on teenage social media habits through app monitoring rather than relying on self-reported information. The team will also gather data on anxiety, sleep quality, socializing, happiness, body image, school absenteeism, and experiences of bullying.

Etchells asserts the necessity of understanding whether restrictions or bans are beneficial or detrimental to youth. “The honest answer is we don’t know. That’s why research like this is critical.”

This initiative is welcomed due to the absence of high-quality studies in this area. A recent report from the UK Department for Science, Innovation, and Technology highlighted the need for quality causal evidence linking young people’s mental health to digital technology use, especially concerning social media, smartphones, and AI chatbots.

As stated by Margarita Panayiotou from the University of Manchester, engaging with youth is essential in social media research. Her findings show that teens often find ways to circumvent outright bans, making testing restrictions a more viable option. This approach may also be more ethical, as the harm caused by a ban is not yet understood.

“Teens view social media as a space for self-discovery,” says Panayiotou, highlighting concerns about platform distrust, feelings of loss of control, and unintentional overuse. They also report struggles with online judgment, body comparisons, and cyberbullying.

According to Etchells and Panayiotou, the primary challenge for governments is to compel tech companies to ensure safer social media environments for youth.

The Online Safety Act 2023 (OSA) mandates that technology firms such as TikTok, Meta (which owns Facebook, WhatsApp, and Instagram), and Google (which owns YouTube) enhance user safety. “Effective enforcement of the OSA could address many existing issues,” asserts Etchells.

Topics:

  • Mental Health/
  • Social Media

Source: www.newscientist.com

How Bacteria and Viruses Collaborate to Combat Cancer: Insights from Sciworthy

The history of cancer can be traced back to ancient Egyptian civilizations, where it was thought to be a divine affliction. Over the years, great strides have been made in understanding cancer’s causes and exploring diverse treatment options, although none have proven to be foolproof. Recently, a research team at Columbia University has pioneered a novel method for combating cancerous tumors by utilizing a combination of bacteria and viruses.

The researchers engineered this innovative strategy by modifying Salmonella Typhimurium bacteria to carry Seneca virus A. The theory was that when tumor cells engulf these bacteria, they would also take in the virus, which would then replicate within the cells, killing them and releasing the virus to infect surrounding cells. This technique has been termed Coordinated Activities of Prokaryotes and Picornaviruses for Safe Intracellular Delivery (CAPPSID).

Initially, the research team verified that Typhimurium was a suitable host for Seneca virus A. They infected a limited number of these bacteria with a modified variant of the virus that emitted fluorescent RNA. Subsequently, they applied a solution that facilitated viral entry into the bacteria. Using fluorescence microscopy, they confirmed the presence of viral RNA inside the bacterial cells, validating the infection. To further assist the viral RNA in escaping the bacteria and reaching cancer cells, the researchers added two proteins, ensuring that viral spread was contained to prevent infection of healthy cells.

After optimizing the bacteria and virus, the team tested the viral delivery system on cervical cancer samples. They found that viral RNA could replicate both outside of bacterial cells and inside cancer cells. Notably, newly synthesized RNA strands were identified within tumor cells, confirming the successful delivery and replication of the virus through the CAPPSID method.

Next, the researchers examined CAPPSID’s impact on a type of lung cancer known as small cell lung cancer (SCLC). By tracking fluorescent viral RNA within SCLC cells, they assessed the rate of viral dissemination post-infection. Remarkably, the virus continued to propagate at a consistent rate for up to 24 hours following the initial infection, demonstrating effective spread through cancerous tissue without losing vigor.

In a follow-up experiment, the researchers evaluated the CAPPSID method on two groups of five mice, implanting SCLC tumors on both sides of their backs. They engineered the Seneca virus A to generate a bioluminescent enzyme for tracking purposes and injected the CAPPSID bacteria into the tumors on the right side. Two days post-injection, the right-side tumor glowed, indicating active viral presence. After four days, the left-side tumor also illuminated, suggesting that the virus had successfully navigated throughout the mice’s bodies while sparing healthy tissues.

The treatment continued for 40 days, leading to complete tumor regression within just two weeks. Remarkably, upon observation over a subsequent 40-day period, the mice demonstrated a 100% survival rate, with no recurrence of cancer or significant side effects. The research team observed that the CAPPSID virus, being encapsulated by bacteria, could circumvent the immune response, thus preventing cancer cells from building immunity against it.

Finally, to prevent uncontrolled replication of Seneca virus A, the researchers took a gene from a tobacco virus that encodes an enzyme needed to activate a crucial protein in Seneca virus A. By incorporating this gene into the Typhimurium bacteria, they ensured that the enzyme was supplied only by the bacteria, so the virus could not replicate or spread without them. Follow-up tests confirmed that this modified CAPPSID system improved viral spread while keeping it confined to cancer-affected areas.

The research findings hold promising potential for the development of advanced cancer therapies. The remarkable regression of tumors in mice and the targeted delivery system of CAPPSID—without adverse effects—could lead to safer cancer treatments for human patients, eliminating the need for radiation or harmful chemicals. However, the researchers also cautioned about the risk of viral and bacterial mutations that may limit the effectiveness of CAPPSID and cause unforeseen side effects. They suggested that enhancing the system with additional tobacco virus-derived enzymes could help mitigate these challenges, paving the way for future research into innovative cancer therapies.


Source: sciworthy.com

Why We Misjudged the Power of Prompting People to Drive Positive Change


Environmental and social challenges are urgent, yet many nations grapple with underfunding and political stalemates. Imagine if we could innovate ways to tackle these issues effectively and economically without the burden of partisan politics!

Nearly two decades ago, we and our colleagues in the behavioral sciences thought this was a real possibility. We proposed what seemed like an elegant idea: social problems often stem from individuals making “poor” choices, whether unhealthy eating, smoking, or polluting. Traditional approaches rely on taxes or bans, but our fresh perspective promised a gentler, psychologically informed method. By rethinking how choices are presented, we could encourage healthier and more sustainable options while still leaving the alternatives available.

“Nudges” were viewed as potential solutions, suggesting that societal issues could be mitigated through slight shifts in individual behavior. For instance, to combat obesity, we might reduce portion sizes and reposition salad bars at the forefront of cafeterias. To address climate concerns, why not default homeowners to renewable energy options?

Initially, it appeared we were on the verge of a nudge revolution. A community of researchers, including ourselves, set out to identify subtle modifications to “choice architecture” that could spur behavioral changes and ultimately produce major societal impacts. It seemed a golden opportunity to leverage psychological insights for transformative progress.

Fast forward almost 20 years and progress has stalled, leaving many disappointed. When nudges do work, the effects tend to be small, short-lived, and hard to scale. Worse, treating individual behavior as the primary lens for societal problems has handed corporations whose interests are threatened by regulation a convenient way to resist the more traditional but effective policy tools, such as taxation and regulation, that reshape the underlying rules and incentives driving behavior.

In hindsight, we realize this outcome shouldn’t come as a surprise, though it certainly was at the time. Given that human psychology has remained fundamentally unchanged, the social dilemmas we face arise from systemic shifts—not individual choices. Events like 200 years of fossil fuel reliance or the surge of ultra-processed foods over recent decades are to blame, and individuals alone cannot resolve issues like carbon emissions or unhealthy eating patterns. Moreover, a focus on individual behaviors risks distracting policymakers and the public from recognizing the need for systemic reforms and policy-driven solutions.

Misdiagnosing the problem in this way lets companies that resist regulation promote individual-level responses that appear effective but fall short. This is already observable in attention-grabbing concepts like the personal “carbon footprint”, a framing that did not emerge from environmental movements or NGOs but from a massive PR campaign by BP, one of the world’s largest fossil fuel corporations, in the early 2000s.

No matter the social or environmental challenge at hand, those opposing comprehensive change often redirect the responsibility back to individuals. As behavioral scientists, we must avoid this trap moving forward.

Behavioral scientists Nick Chater and George Loewenstein explore these themes in their new book, On You (WH Allen), released on January 27th.

Topics:

Source: www.newscientist.com

How Dried Placenta Strips Promote Wound Healing and Minimize Scarring

Scanning electron micrograph of a human placenta's cross-section

Scanning Electron Micrograph of a Human Placenta Cross-Section

Science Photo Library

Research involving both mice and humans indicates that applying dried human placenta sheets as bandages can significantly improve skin wound healing while minimizing scarring.

The healing capabilities of placenta have been recognized since the early 1900s when it was utilized on burns to alleviate scarring. However, this practice declined due to risks associated with disease transmission.

Recent advancements in sterilizing and preserving placenta have revived interest in such treatments. Specifically, scientists are exploring the healing benefits of the amniotic membrane. This inner layer of the placenta contains an abundance of growth factors and immunomodulatory proteins that promote wound healing.

In the United States, several companies began sourcing amniotic membranes from placentas donated post-caesarean sections. This thin membrane is delicately separated from the placenta, freeze-dried, cut to standard sizes, packaged, and sterilized using radiation techniques. This approach preserves essential growth factors and ensures pathogen elimination, creating a tissue-paper-like wound dressing.

To assess the efficacy of these dressings in reducing scarring, Dr. Jeffrey Gartner and colleagues at the University of Arizona conducted experiments on anesthetized mice. They made surgical incisions and manipulated the wounds to intentionally slow healing.


Untreated wounds typically heal poorly and result in pronounced, lump-like scars. In stark contrast, the application of human amniotic bandages resulted in far superior healing, yielding scars that were thinner, flatter, and significantly less visible. Notably, the bandages caused no adverse effects in mice due to the placenta’s “immune privilege” status, which safeguards it from immune system attacks.

As a result, some surgeons in the U.S. are already utilizing amniotic bandages for clinical applications. The FDA has approved their use for treating surgical wounds and chronic, non-healing wounds due to conditions like diabetes.

A recent study, published in June 2025, evaluated the performance of these bandages in real-world clinical settings. Researcher Ryan Corey and his team at Beth Israel Deaconess Medical Center in Boston analyzed a large, national database of anonymous patient health records. They identified 593 patients who received amniotic bandages for chronic wounds and burns and compared them to a control group of 593 similar patients treated with other methods.

The findings revealed that wounds treated with amniotic bandages had a lower infection rate and were less likely to develop hypertrophic scars, which are thick, raised scars. Although these results support the use of amniotic bandages, the researchers emphasize that “additional prospective randomized studies with extended follow-up are warranted to validate these findings.”
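The kind of comparison behind such matched-cohort findings can be illustrated with an odds ratio and its confidence interval. The counts below are hypothetical, chosen only to show the calculation, and are not the study’s results.

# Illustrative only: comparing infection rates between a treated cohort and a
# matched control cohort of 593 patients each, via an odds ratio with a 95% CI.
# The counts are hypothetical, not the study's results.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a, b = infected / not infected (treated); c, d = same (control)."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    low = math.exp(math.log(odds_ratio) - z * se_log_or)
    high = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, low, high

# Hypothetical: 40/593 infections with amniotic bandages vs 70/593 with standard care
oratio, low, high = odds_ratio_ci(40, 593 - 40, 70, 593 - 70)
print(f"odds ratio = {oratio:.2f} (95% CI {low:.2f} to {high:.2f})")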

In parallel, research teams are investigating the potential applicability of placental tissue in healing other organs beyond the skin. In 2023, Dr. Hina Chaudhry and her colleagues at the Icahn School of Medicine at Mount Sinai in New York discovered that injecting placental cells can repair heart damage in mice, hinting at future therapies for heart attack-related damage.

Topic:

Source: www.newscientist.com

Discover the Oldest Cave Art: Hand-Painted Stencils Dating Back 68,000 Years

Recent findings reveal that these stencils are more than 15,000 years older than cave paintings in another Sulawesi cave that were dated in 2024. That painting, which depicts three anthropomorphic figures interacting with pigs, is believed to be approximately 51,200 years old.

“I thought my previous work was impressive, but this photo completely eclipsed it,” Brumm remarked.

“This underscores the long-standing tradition of rock art creation in this region. It spans an incredible timeline,” he emphasized.

Researchers are optimistic about uncovering even older art forms, including narrative art, in Indonesia, a largely unexplored archaeological treasure trove.

Liang Methanduno, a prominent cave art location, attracts tourists. However, most artworks discovered so far, depicting domestic animals like chickens, are relatively recent, estimated to be around 4,000 years old.

In 2015, Indonesian rock art expert and lead author Adhi Agus Oktaviana spotted a faint drawing behind a modern painting, speculating that it might be an ancient hand stencil.

“These had never been documented before; their existence was unknown until Adhi discovered them,” Brumm stated.

Previous generations of researchers exploring Ice Age cave art, dating back 30,000 to 40,000 years in regions like France and Spain, believed it marked the dawn of modern artistic culture.

However, recent discoveries in Indonesia indicate that humans outside Europe were crafting “extraordinarily sophisticated” cave art tens of thousands of years ago, even before our species arrived in Europe.

Ancient cave paintings in Sulawesi.
Maxime Aubert/AFP – Getty Images

Brumm noted that this discovery could also shed light on the timeline of when the first humans settled in Australia.

It is widely accepted that Aboriginal populations have inhabited Australia for at least 50,000 years, though evidence suggests one of the country’s archaeological sites is around 65,000 years old.

“The finding of 67,000 to 68,000-year-old rock art on Sulawesi, nearly adjacent to Australia, supports the theory that modern humans may have arrived in Australia at least 65,000 years ago,” Brumm explained.

Source: www.nbcnews.com

Fossil Shorebirds Unveil New Insights Into Australia’s Climate Change History

Shorebirds serve as important indicators of coastal and wetland ecosystems, and their widespread distribution highlights their ecological significance. Although wading shorebirds are infrequently found in the fossil record, a remarkable collection of shorebird fossils has emerged from Pleistocene deposits at the Naracoorte Caves World Heritage Site in South Australia. Recent studies on these fossils provide insights into the evolution of wetland environments, revealing that flourishing habitats vanished with climate shifts as far back as 60,000 years ago. The research links a drying phase around 17,000 years ago to the decline of many of the nine or more shorebird species discovered in one of the Naracoorte Caves.



Red knot (Calidris canutus), near Grinet, Brittany, France. Image credit: Stephan Sprinz / CC BY 4.0.

“Shorebirds are rare in the fossil record, making the discovery of numerous shorebird fossils in Blanche Cave surprising,” stated PhD candidate Karl Lenser from Flinders University.

“This finding suggests that wetlands and tidal flats—vital feeding grounds for plovers, sandpipers, and other shorebirds—were more prevalent during the last Ice Age.”

Currently, climate change and habitat loss are contributing to the decline of Australia’s shorebird populations.

Gaining insights into how these species adapted to historical climate changes may be essential for forecasting their future.

Lenser and his team were particularly intrigued by the remains of the Plains Wanderer, an endangered bird found mostly in Victoria and New South Wales, which was among the most common fossils identified in this study.

Out of approximately 300 examined bones, more than half were identified as those of Plains Wanderers.

“Today’s Plains Wanderers are selective about their habitats; however, other fossils from Naracoorte indicate that the area once featured wooded environments—starkly different from the treeless grassland they inhabit today,” Lenser explained.

Naracoorte represents the only fossil site in Australia with such a substantial population of Plains Wanderers, indicating a significant decline in their numbers over the last 14,000 years due to habitat restriction.

Dr. Trevor Worthy from Flinders University highlighted the uniqueness of this sandpiper fossil sample, noting its representation of migratory species that travel from the Northern Hemisphere to spend winters in Australia.

“This includes three species from the Calidris genus and Latham’s snipe (Gallinago hardwickii),” he added.

“Fossil assemblages also include double-banded plovers, which migrate from Australia to New Zealand to breed.”

“Fossil evidence shows that two young birds flew approximately 2,000 km from New Zealand and were captured by owls near Blanche Cave in Naracoorte,” Dr. Worthy explained.

“There remains much to uncover about Australia’s bird species from the last Ice Age, and fossils from sites like Naracoorte are crucial for filling in these knowledge gaps,” Lenser noted.

“Naracoorte Caves holds a 500,000-year record of biodiversity in Southeast South Australia,” stated Dr. Liz Reid from the University of Adelaide.

“As this study clearly demonstrates, caves offer a glimpse into pre-European landscapes, providing valuable information for the conservation of endangered species today.”

Visitors to Naracoorte Caves can explore the excavation site and delve into the science behind South Australia’s only World Heritage Site.

The findings have been published in the online journal Palaeontologia Electronica.

_____

Karl M. Lenser et al. 2026. Fossil shorebirds (Charadriiformes) revealing a Pleistocene wetland trend at Naracoorte Caves, South Australia. Palaeontologia Electronica 29 (1): a2; doi: 10.26879/1608

Source: www.sci.news

Why Natural Ovulation is the Optimal Choice Before IVF Frozen Embryo Transfer

IVF Treatment Options


Credit: Zephyr/Science Photo Library

Recent findings from a comprehensive randomized trial indicate that natural ovulation methods for preparing the uterus for frozen embryo transfer after in vitro fertilization (IVF) are equally effective and come with fewer risks compared to traditional hormone therapy.

Emerging data suggests that for women with strong responses to IVF treatment (which can yield multiple eggs), freezing embryos and transferring them in a later cycle can enhance success rates. Consequently, frozen embryos now represent the majority of embryo transfers conducted globally.

After IVF, frozen embryos are transferred into the uterus at the point in the menstrual cycle when the endometrium (the uterine lining) is thick enough to support implantation.

Women can opt for either a medicated cycle, which involves administering estrogen and progesterone for uterine preparation, or a natural cycle, where the body’s natural hormone production is monitored, assuming regular cycles.

Determining the optimal choice remains complex due to a lack of substantial trials evaluating the complications linked to these varying methods.

To address this uncertainty, Daimin Wei and a team from Shandong University in Jinan, China, conducted a large-scale clinical trial involving 4,376 women across 24 fertility treatment centers. All participants were aged 20 to 40 and were slated for a single frozen embryo transfer. Participants were divided equally between the medicated and natural cycle groups.

“This is the randomized controlled trial we’ve been waiting for,” remarks William Buckett from McGill University in Montreal, Canada, who was not involved in the study.

Live birth rates were comparable between both methods, with 41.6% in the natural cycle group and 40.6% in the medicated group. This suggests that natural ovulation is as effective as hormone therapy for preparing the uterus for embryo implantation.

However, an analysis of maternal complications during and after pregnancy revealed notable distinctions.

Women utilizing natural cycles exhibited a lower likelihood of preeclampsia, a severe condition marked by elevated blood pressure, along with fewer incidences of early pregnancy loss. They were also less prone to develop placenta accreta spectrum, a condition that makes the placenta difficult to detach following childbirth. Additionally, this group had reduced rates of cesarean sections and severe postnatal hemorrhage.

“These risks impact both maternal and fetal health during pregnancy and hold significance for long-term postpartum health,” states Wei.

“This research is vital,” notes Tim Child, chair of the Scientific and Clinical Advances Advisory Committee of the UK Human Fertilisation and Embryology Authority. His clinic now advises people with regular menstrual cycles that natural and medicated methods yield similar success rates.

However, Child points out that there was already evidence suggesting natural cycles may lower the risk of preeclampsia. The reduction may be attributable to the corpus luteum, which forms after ovulation in a natural cycle and produces hormones that help prepare the body for pregnancy.

“This extensive study corroborates and expands on previous findings, especially concerning significantly lower rates of preeclampsia, early miscarriage, placenta accreta, cesarean sections, and postpartum hemorrhage linked to the natural cycle approach,” Child asserts.

Wei’s team is set to analyze blood samples gathered during the trial to identify potential biomarkers that could shed light on the differences observed in pregnancy complications.


Source: www.newscientist.com

Stunning Close-Up of Pierced Crocodile Claims Victory in Ecological Photo Contest

Biting Fly on American Crocodile

Photo Credit: Zeke Rowe/British Ecological Society

While most animals keep their distance from crocodiles, the biting fly boldly lands on this intimidating predator to drink its blood. Captured by Zeke Rowe at Panama’s Coiba National Park, this striking image of the interaction was recognized as the top entry in the British Ecological Society’s annual photo contest.

According to Rowe, a doctoral candidate at Vrije Universiteit Amsterdam, “This crocodile was hiding in a tidal marsh off the coast. I got as close as possible, kept low, and waited for that direct eye contact.”

Cape Sparrows Alarmed by Lioness

Photo Credit: Willem Kruger/British Ecological Society

This captivating photograph by Willem Kruger, a South African photographer, won the Interaction category. It was taken during the dry season in the Kgalagadi Transfrontier Park, where a pride of lions startled a flock of birds drinking at a waterhole.

Wallace’s Flying Frog

Photo Credit: Jamal Kabir/British Ecological Society

Jamal Kabir, of the University of Nottingham, won the animal category for his captivating image of Wallace’s flying frog (Rhacophorus nigropalmatus), named after the renowned naturalist Alfred Russel Wallace. These amphibians, found in Southeast Asia, use their webbed feet to glide gracefully between trees in the lush rainforests.

Bighorn Sheep Health Test

Photo Credit: Peter Hudson/British Ecological Society

In this striking image, a bighorn sheep (Ovis canadensis) is captured having its nose swabbed. Peter Hudson, a photographer and biologist at Penn State University, was highly commended for his work related to behavioral ecology. This study addresses pneumonia outbreaks in bighorn herds, a significant concern impacting newborns in the spring.

Fly Resting on Mushroom

Photo Credit: Francisco Gamboa/British Ecological Society

This stunning image, taken by wildlife photographer Francisco Gamboa, won accolades in the Plants and Fungi category. The photograph shows a fly resting delicately on a mushroom in Chile’s Altos de Cantillana Nature Reserve.

Intertidal Zone Education

Photo Credit: Liam Brennan/British Ecological Society

In a notable educational initiative, wildlife researcher Liam Brennan captured this image of students conducting beach trawls to monitor coastal fish population changes in New Brunswick, Canada, further emphasizing the importance of ecological education.



Source: www.newscientist.com

Discover How a New Solar Orbiting Spacecraft Connects Magnetic Avalanches to Solar Flares

Recent high-resolution findings from ESA’s Solar Orbiter mission provide groundbreaking insights into solar flares. These explosive events are triggered by cascading magnetic reconnection processes, releasing immense energy and “raining down” plasma clumps into the Sun’s atmosphere.

Detailed overview of M-class solar flares as observed by ESA’s solar probes. Image credit: ESA / Solar Orbiter / Chitta et al., doi: 10.1051/0004-6361/202557253.

Solar flares are powerful explosions originating from the Sun.

These dramatic events occur when energy stored in entangled magnetic fields is suddenly unleashed through a process known as “magnetic reconnection.”

In mere minutes, intersecting magnetic field lines break apart and reconnect, heating plasma to millions of degrees and accelerating high-energy particles, which can produce a solar flare.

The most intense flares can initiate a cascade of reactions, causing magnetic storms on Earth and potentially disrupting radio communications. Hence, monitoring and understanding these flares is crucial.

However, the mechanisms behind such swift energy release remain largely enigmatic.

An exceptional series of observations from the Solar Orbiter’s four instruments has finally provided clarity. This mission, with its comprehensive approach, offers the most detailed perspective on solar flares to date.

The Solar Orbiter’s Extreme Ultraviolet Imager (EUI) captured high-resolution images of features just hundreds of kilometers across in the Sun’s outer atmosphere (corona), recording changes every two seconds.

Three other instruments—SPICE, STIX, and PHI—examined various depth and temperature regions, from the corona to the Sun’s visible surface, or photosphere.

“We were fortunate to witness this massive flare precursor in such exquisite detail,” said Dr. Pradeep Chitta, an astronomer at the Max Planck Institute for Solar System Research.

“Such detailed and frequent observations of flares are rarely possible due to the limited observation window and the significant data storage required.”

“We were in the right place at the right time to capture these intricate details of the flare.”

The Solar Orbiter observations revealed an intricate view of a flare’s central engine, capturing its pre-flare and eruptive stages unfolding as a magnetic avalanche.

“Even prior to the major flare event, ribbon-like features rapidly traversed the Sun’s atmosphere,” Dr. Chitta noted.

“The flow of these ‘rainy plasma blobs’ indicates increasing energy buildup, intensifying as the flare progresses.”

“This rain of plasma will continue for a while even after the flare diminishes.”

“This marks the first time we’ve observed such a level of spatial and temporal detail in the solar corona.”

“We did not anticipate such high-energy particles emerging from the avalanche process.”

“There is still much to explore regarding this phenomenon, but future missions equipped with high-resolution X-ray imaging will further our understanding.”

“This is one of Solar Orbiter’s most thrilling achievements thus far,” stated Dr. Miho Janvier, ESA’s Solar Orbiter Collaborative Project Scientist.

“The Solar Orbiter’s observations unveil the flare’s central engine and underscore the significant role of an avalanche-like magnetic energy release mechanism.”

A compelling open question is whether this mechanism is universal across all flares, and whether it also operates on other flaring stars.

Results can be found in the journal Astronomy and Astrophysics.

_____

L.P. Chitta et al. 2026. Magnetic avalanches as the central engine driving solar flares. A&A 705, A113; doi: 10.1051/0004-6361/202557253

Source: www.sci.news

Bird Retinas: How Scientists Discovered Their Oxygen-Free Functionality

Research on Zebra Finches’ Eyes Reveals Unique Mechanisms

Ger Bosma/Alamy

The eyes of zebra finches work unlike any other known vertebrate tissue. Part of their light-detecting retina runs on glucose alone, rather than relying on oxygen as nerve tissue normally does.

This groundbreaking discovery addresses a 400-year-old question regarding avian eye physiology. Christian Damsgaard from Aarhus University in Denmark notes, “This is compelling evidence that certain neurons can operate without oxygen, notably in common garden birds.”

The retina sends light signals to the brain, demanding considerable energy supplied by oxygen and nutrients through blood vessels. However, the thick avascular retina in zebra finches raises the question of how these essential nerve cells sustain life.

Damsgaard and his research team investigated zebra finches, or Taeniopygia guttata, in the lab. By attaching oxygen sensors to their eyes, they discovered that the inner retinal layer does not receive oxygen.


“Oxygen enters through the back of the eye, but it cannot permeate the retina,” explains Damsgaard.

Analyzing metabolic gene activity in various retinal layers revealed frequent reliance on glycolysis in areas devoid of oxygen. Although this process is less efficient, it serves the retina’s energy needs.

“This method requires 15 times more glucose for equivalent energy output,” states Damsgaard. So, where does all this sugar come from?
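That factor of roughly 15 squares with a back-of-the-envelope calculation using standard textbook ATP yields (general biochemistry figures, not numbers reported in the study): glycolysis alone nets about 2 ATP per glucose molecule, whereas full aerobic respiration yields roughly 30, so matching the same energy output without oxygen takes on the order of 30 / 2 = 15 times as much glucose.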

The answer lies in the pecten, a structure of rake-shaped blood vessels found in avian eyes. Previously thought to transport oxygen, recent findings show that the pecten instead inundates the retina with glucose—four times what brain cells absorb—fueling its high-energy requirements.

Luke Tyrrell at the State University of New York at Plattsburgh is astonished that birds have evolved to depend on such an inefficient process for something as demanding as vision. “The avian retina is among the most energy-intensive tissues in the animal kingdom,” he adds.

This specialized, blood vessel-free retina may provide superior vision in birds, with the pecten sugar supply being a crucial evolutionary adaptation. An oxygen-independent retina could also contribute to their capabilities for high-altitude migratory flights.

For Pavel Niemec at Charles University in Prague, Czech Republic, the findings illustrate that evolution can yield counterintuitive solutions to physical constraints.

Damsgaard and his colleagues believe there may be future applications for modifying human cells to allow greater resilience under low-oxygen conditions, such as after a stroke.



Source: www.newscientist.com

Discovering the Versatility of Paranthropus: The Adaptable Ape-Like Hominin

Illustration of Paranthropus: Early Hominins from 2.7 to 1.4 Million Years Ago

Credit: John Bavaro Fine Art/Science Photo Library

For the first time, remains of ancient humans, specifically Paranthropus, have been discovered in the Afar region of Ethiopia. This groundbreaking discovery indicates that Paranthropus lived across diverse ecosystems.

Paranthropus lived between about 2.7 and 1.4 million years ago and was a close relative of Homo, the genus that includes modern humans and Neanderthals. Both genera are thought to have evolved from the earlier hominin Australopithecus.

Zeresenai Alemseged, a prominent researcher from the University of Chicago, has been excavating the Mille Logya site in the Afar Depression since 2012. This area is rich in human fossils, including remains of Homo and Australopithecus. Alemseged states, “Paranthropus was thought not to have reached this far north.”

On January 19, 2019, Alemseged’s local assistant discovered a piece of a toothless lower jawbone. “The size was the first feature that caught my attention,” Alemseged recalls. On the same day, the research team also found the crown of a lower left molar.

CT scans revealed distinctive Paranthropus characteristics, including the jawbone’s dimensions and the intricate structure of the tooth roots within. While the team couldn’t definitively classify the species, it is likely to be Paranthropus aethiopicus or Paranthropus boisei based on the location of the find.

Dating analyses indicate the jawbone to be approximately 2.6 million years old, making it one of the oldest known specimens of Paranthropus.

“There is no doubt that it belongs to Paranthropus,” asserts Carrie Mongle from Stony Brook University, who was not involved in the research. “The dating is unquestionable.”

Assembled Fragment of Paranthropus Mandible

Credit: Alemseged Research Group/University of Chicago

Previously, the northernmost Paranthropus specimen was a skull excavated from Konso in southern Ethiopia. This new specimen extends the range over 1,000 kilometers northward.

Paranthropus,” states Mongle.

Alemseged believes this specimen also illustrates the species’ adaptability. The large jaws and teeth of Paranthropus have been interpreted as indicators of a tough diet. Although the specifics of Mille Logya’s environment are unclear, it appears that Paranthropus thrived in more open habitats compared to the wooded areas frequented by earlier specimens.

“While they were specialized, we may have overemphasized their dietary limits,” concurs Alemseged. “Different Paranthropus populations appear to have adapted to various habitats, much like Homo and Australopithecus.”

Mongle noted existing evidence that Paranthropus coped well with the expansion of grasslands across East Africa, even incorporating grass-based foods into its diet. The new Mille Logya specimen reinforces this picture of versatility.

Recent findings suggest that Paranthropus may have used and even made simple stone tools. In 2023, stone tools found in Kenya were reported in association with Paranthropus remains, and in 2025 evidence of manual dexterity in Paranthropus hands was documented.

Alemseged concludes that, since Australopithecus was capable of making and using tools, and given the timeline, Paranthropus likely inherited this capability from the ancestor it shares with Homo.


Topics:

  • Evolution of Humanity/
  • Ancient Humans

Source: www.newscientist.com

Discover the 68,000-Year-Old Hand Claw Pattern: The Oldest Known Rock Art

Ancient Hand Stencil: Modified to Resemble Claws

Adhi Agus Oktaviana

A stunning discovery of a nearly 68,000-year-old hand stencil on the walls of a cave in Sulawesi, Indonesia, may represent the oldest known rock art. This stencil appears to have been intentionally modified, giving the fingers a claw-like appearance rather than a traditional handprint.

In recent years, Sulawesi has emerged as a significant location in human history. The island has been home to various hominin species since the earliest humans likely appeared over 1.4 million years ago, with Homo erectus making its initial known journey to the area.

In 2024, researcher Maxim Aubert and his team from Griffith University uncovered the world’s oldest known figurative art on the island, dating back at least 51,200 years. This art includes depictions of pigs alongside human-like figures. More recently, Aubert’s team reported finding 44 additional rock art sites in Southeast Sulawesi, including a hand stencil at Liang Metanduno dated to 67,800 years ago.

The previous record for the oldest known rock art, a hand stencil from a Neanderthal site in northern Spain, is estimated to be at least 66,700 years old, making the Sulawesi find a significant addition to the timeline of art history.

Aubert noted that the Sulawesi hand stencil exhibits signs of modification; the tip of one finger appears intentionally tapered, possibly through pigment application techniques. This unique form of hand stencil art has only been recognized in Sulawesi to date.

“This is more than just a hand pattern,” states Aubert. “They appear to be retouching it, whether with a brush or spray, achieving a similar effect.”

The purpose of this artistic technique remains unknown. Aubert speculates, “They likely aimed to mimic an animal’s claw-like appearance.”

Additional Discoveries: Animal Figures in Sulawesi Cave

Maxim Aubert

Aubert indicated that identifying the exact species that created this hand stencil remains uncertain. However, the unique artistic alterations imply it was likely made by modern humans, suggesting a connection to the ancestors of the first Australians.

Evidence from the Madjedbebe site in Arnhem Land, Australia, indicates that Homo sapiens arrived on the continent at least 60,000 years ago. Additionally, increasing evidence suggests Sulawesi is a crucial early pathway linking Southeast Asia to New Guinea and Australia.

“These discoveries have far-reaching implications for our understanding of art history,” says Aubert. “The creators of this stencil were likely among the ancestors of the first Australians, underscoring the cultural significance of their rock art, which dates back at least 68,000 years.”

Team member Adam Brumm, also from Griffith University, notes that both the Neanderthal hand stencils in Spain and the Sulawesi rock art were created using similar techniques, such as spraying ochre pigments.

Intricate Details of Ancient Rock Art

Maxim Aubert

“Modern humans exhibited a distinct artistic approach,” Brumm explains. “They intentionally altered the finger contours of the stencil, creating a more pointed and narrower appearance. This transformed the hand imprint into a potential representation of an animal claw.”

“Such changes highlight the creativity and imaginative capacity of modern human artists, showcasing abstract thinking not evidenced in Neanderthal hand imprints,” he adds.

Martin Porr, a researcher from the University of Western Australia in Perth, said the discovery confirms the world’s oldest known rock art attributed to modern humans. “The dates on the stencil correspond with the earliest known timelines for Homo sapiens. This region encompasses not just Australia but mainland Asia and Southeast Asia,” Porr concluded, emphasizing the need for further research to clarify the migration routes of early humans to Australia.



Source: www.newscientist.com

Ancient Vertebrate Ancestors: The Surprising Discovery of Four Eyes

Illustration of Haikouichthys, a notable Cambrian fish with fossilized evidence of a second pair of eyes

Xiangtong Lei, Sihang Zhang

Over 500 million years ago, the earliest known vertebrates exhibited an intriguing feature: an extra eye. Interestingly, humans may retain traces of this ancient evolutionary trait.

Significant fossils of two species of jawless fish, known as myllokunmingids, were discovered by Kong Peiyun and colleagues at Yunnan University in China, who collected specimens around Dianchi Lake between 2019 and 2024.

The fossils unearthed in the Chengjiang biota area, renowned for its exquisite preservation, date back to approximately 518 million years ago—a timeframe marked by a dramatic increase in life’s diversity during the Cambrian period.

Remarkably, the vertebrate fossils discovered by Kong’s team included well-preserved soft tissue and vital eye structures.

Complex eye structures evolved independently in various animal groups. Many invertebrates, like insects, possess compound eyes, which consist of numerous individual units, each with its own lens, enabling a mosaic vision.

Meanwhile, vertebrates such as humans and reptiles possess what scientists label as “camera eyes.” These comprise a spherical lens, retina, iris, and muscles that regulate eye movement. Additionally, they contain pigment structures called melanosomes that influence eye color.

Light focuses on the retina, generating a signal relayed to the brain via the optic nerve.

Under electron microscopy scrutiny, Kong and his team identified two eyes situated on the sides of the head, with melanin-rich melanosomes preserved, alongside two smaller enigmatic black marks between them.

Based on lens impressions preserved in the fossils, team members led by Jakob Vinther from the University of Bristol suggested that these ancient creatures possessed two pairs of camera-like eyes, allowing them to see their environment much like modern vertebrates. The decisive difference? They had four eyes instead of two.

Fossil of Haikouichthys displaying preserved melanosomes

Xiangtong Lei, Sihang Zhang

The research team posits that this ancient additional pair of eyes evolved into the set of organs known as the pineal complex. Some vertebrates, such as reptiles, possess a light-sensitive organ called the parietal eye atop their heads, while mammals retain a reduced version, the pineal gland, which regulates sleep cycles through melatonin secretion.

“Early vertebrates likely used the pineal organs as functional eyes, enabling them to perceive their surroundings before evolving into sleep-regulating organs,” states Vinther.

These large eyes may have been optimized for high-resolution vision, complemented by smaller eyes that enabled detection of nearby threats—critical for survival in the predator-rich Cambrian seas.

According to Vinther, these creatures could likely make out objects in some detail, judge their shape and gain a degree of depth perception, all thanks to their remarkable four-eyed arrangement.

Tetsuto Miyashita, from the Canadian Museum of Nature in Ottawa, finds the interpretation of these fossils both “half-believable and half-doubtful.”

The structure located between the two eyes had previously perplexed researchers, but realizing it may indicate another camera eye was considered a “lightbulb” moment, he explains.

If indeed this is the case, it raises the question: where is the animal’s nose? “Most early fish evolution centered around nose development, suggesting that it would be unusual for the nose to not be preserved,” he notes.

Miyashita anticipates the interpretation will be debated by experts for some time. “What function do so many prominent eyes actually serve?” he questions.

John Paterson, a researcher from the University of New England in Armidale, Australia, says it is logical for prey species to have developed such visual capabilities to escape formidable predators.

The Cambrian period was an evolutionarily peculiar time: animals did not strictly develop pairs of eyes on their heads, and some placed eyes on other regions of the body as well.

Karma Nangle, a professor at the University of California, Riverside, aims to map the entire fossil body to see whether similar traces occur elsewhere. Such findings could indicate that the second set of eyes is simply a result of chemical processes during fossilization.



Source: www.newscientist.com

2.6 Million-Year-Old Ethiopian Fossil Reveals Widespread Existence of Paranthropus Hominid

The recently unearthed fossil, a 2.6-million-year-old partial lower jaw, is the first specimen of the genus Paranthropus known from Ethiopia’s Afar region. It is among the oldest Paranthropus remains yet found and is likely the earliest of its kind across Africa. This groundbreaking discovery significantly reshapes paleoanthropologists’ perspectives on early hominid evolution, suggesting that these ancient relatives had a more extensive and adaptable lineage than previously recognized.

Paranthropus boisei. Image credit: © Roman Yevseyev.

The newly described fossil, designated MLP-3000, was discovered in the Mille Logya research area and comprises an edentulous (toothless) mandibular body with preserved tooth roots and a partial molar crown.

Geological and magnetostratigraphic analyses indicate that these fossils date to approximately 2.9 to 2.5 million years ago, a period marked by dramatic environmental change in eastern Africa.

“To understand our evolutionary trajectory as a genus and species, we must also comprehend the ecological and competitive factors that influenced our evolution,” said Zeresenai Alemseged, a professor at the University of Chicago.

“This discovery offers more than just a snapshot; it sheds new light on the underlying forces driving the evolution of Paranthropus.”

Until now, Paranthropus fossils had primarily been documented from southern Ethiopia to South Africa, with no prior findings in the Afar region. This lack of evidence was confounding given the region’s abundance of fossils spanning around 6 million years, including significant discoveries of Australopithecus and early Homo.

Recent findings reveal that Paranthropus, from its earliest known existence, had a broader geographic range than previously understood.

“We seek to comprehend who we are and how we evolved, influencing our behavior and the environment around us,” Professor Alemseged stated.

“The fossil record showcases more than 15 hominin species, typically classified into four categories: facultative bipeds, habitual bipeds, obligate bipeds, and obligate hominids.”

“Numerous fossils belonging to more than a dozen species, including Ardipithecus, Australopithecus, and Homo, have been discovered in the Afar region of northern Ethiopia. The lack of Paranthropus fossils in this area was striking and perplexing for paleoanthropologists, many of whom theorized that this genus never expanded that far north.”

“Some experts have posited that dietary specialization may have restricted Paranthropus, suggesting that competition with more adaptable Homo species limited its range,” he added.

“However, this assumption is incorrect; Paranthropus was as adaptable and versatile as Homo, and this discovery shows that its apparent absence from the Afar was merely a gap in the fossil record.”

According to the anatomical analysis, the jaw exhibits a unique blend of features, combining characteristics of Paranthropus with those found in more primitive hominids, including a notably robust mandibular body and exceptionally large postcanine teeth.

This mosaic of traits leads researchers to tentatively classify the fossil as Paranthropus sp., without assigning it to a specific species.

The context of this discovery is just as important as the fossil itself.

The Mille Logya area preserves sediments from a time of significant environmental change, roughly between 3 million and 2.4 million years ago, during which the climate shifted and open grasslands became the dominant habitat.

Fossils of associated animal species document these habitat transformations and suggest that Paranthropus was not confined to a narrow ecological niche but could thrive in a range of environments.

The presence of Paranthropus in the Afar region additionally suggests that multiple hominin lineages coexisted in this area during the late Pliocene.

Fossils of early Homo and Australopithecus dating to a similar period have already been located at nearby sites, indicating a surprising level of diversity in early hominin evolution.

By extending the known range of Paranthropus over 1,000 km north of its previously recognized boundaries, this discovery challenges long-held assumptions regarding the ecology and migration patterns of early hominids.

“This new finding enhances our understanding of adaptation and behavior, including the competitive dynamics between species, diet, physical adaptations, and potential use of stone tools,” Professor Alemseged remarked.

“Discoveries like this spark intriguing questions that prompt us to examine, revise, and formulate new hypotheses about the significant differences among major hominin groups.”

The finding is detailed in a paper published in today’s edition of the journal Nature.

_____

Zeresenai Alemseged et al. First long-distance Paranthropus fossils expand the distribution of this adaptable genus. Nature, published online January 21, 2026; doi: 10.1038/s41586-025-09826-x

Source: www.sci.news