Unlocking the ABC Conjecture: Pioneering Projects to Tackle a Controversial Mathematical Proof with Computer Technology

In August 2012, renowned Japanese mathematician Shinichi Mochizuki published a groundbreaking paper.

In 2012, Shinichi Mochizuki claimed to have proved the ABC conjecture in number theory.

Credit: Newscom/Alamy

One of the most hotly debated proofs in contemporary mathematics may soon find resolution. Two projects are under way that use computer programs to illuminate the controversy, one of which has operated in secrecy for over two years. Mathematicians are optimistic that these efforts could finally lead to a breakthrough.

The story traces back to 2012, when Shinichi Mochizuki, a professor at Kyoto University in Japan, proclaimed that he had proved a major result known as the ABC conjecture, releasing a staggering 500 pages of work online. The conjecture is simply stated: it concerns the prime factors of whole numbers a, b and c in the equation a + b = c. Solving it, however, requires profound insight into the interplay of addition and multiplication, and its ramifications extend deep into many areas of mathematics.
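For reference, a standard formulation of the conjecture reads as follows (this phrasing is a conventional textbook statement, not quoted from Mochizuki's papers):

```latex
\textbf{ABC conjecture.} For every $\varepsilon > 0$, there are only finitely many
triples $(a, b, c)$ of coprime positive integers with $a + b = c$ such that
\[
  c > \operatorname{rad}(abc)^{\,1+\varepsilon},
\]
where $\operatorname{rad}(n)$ denotes the product of the distinct prime factors of $n$.
```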

Mochizuki’s proof caused a sensation, but many colleagues found it impenetrable because of its novel techniques and concepts, collectively known as inter-universal Teichmüller theory (IUT). A slew of prominent mathematicians spent months trying to distill Mochizuki’s work, including in discussions with him, but ultimately reached a standstill over the question of the proof’s correctness.

In 2018, two notable mathematicians—Peter Scholze from the University of Bonn and Jacob Stix from Goethe University Frankfurt—claimed they had found a potential flaw. Despite this, no further progress was achieved. While Mochizuki and his close associates at Kyoto University maintained that the proof was valid, the broader mathematical community viewed it as either incomprehensible or fundamentally flawed.

However, last year, Mochizuki reached out to his critics, proposing a possible way forward. Notable advancements in a field called formalization have emerged, allowing mathematical proofs to be transcribed into computer language for automatic correctness verification. A specific language known as Lean captured Mochizuki’s interest. He remarked at the time: “[Lean] is perhaps the best and only technology to advance the goal of liberating mathematical truth from the constraints of social and political dynamics.”
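To give a flavor of what Lean code looks like, here is a trivial, self-contained theorem in Lean 4 (an illustrative toy, entirely unrelated to IUT):

```lean
-- A toy Lean 4 proof: addition of natural numbers is commutative.
-- Lean checks this mechanically; `Nat.add_comm` is a lemma from Lean's core library.
theorem toy_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

Formalizing research-level mathematics means building thousands of such statements on top of each other until the full argument is machine-checked.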

Currently, efforts to formalize Mochizuki’s ABC conjecture proof in Lean are underway, with multiple mathematical groups announcing significant progress. This includes Mochizuki’s team and another group that has been progressing in secret for over two years but has encountered challenges.

In late 2023, Bunji Kato of Japan’s ZEN Mathematics Center initiated the Lean anabelian geometry (LANA) project, uniting mathematicians familiar with Mochizuki’s work with Lean experts who have formalized other complex pieces of mathematics. The primary aim is to “finally resolve the dispute,” as Kato puts it. The team enlisted Adam Topaz of the University of Alberta to lead the formalization of the proof.

During a press conference held last month to announce the project, Kato indicated that through the years, team members have developed a “deeper understanding” of Mochizuki’s ideas. Nevertheless, they faced hurdles specifically tied to issues flagged by Scholze and Stix in 2018. Topaz commented, “We essentially stalled while attempting to assimilate certain aspects of IUT. We recognized this issue about a year and a half ago, initially believing a better understanding of the theory would avert this potential complication.”

Despite numerous workshops and indirect communications with Mochizuki, the team has struggled to move forward.

In a parallel initiative, Mochizuki and his associates have also begun to formalize the proof in Lean. Their goal, however, is not to confirm its correctness, which Mochizuki already asserts; rather, he emphasizes the project’s value in improving communication.

Mochizuki stated at a recent conference at the University of Exeter, “The validation aspect is not our primary focus. The significance of Lean formalization lies in establishing an accurate record of the logical structure of IUT, free from misinterpretations, ensuring it can effectively communicate its essence to other mathematicians.”

Mochizuki and his team’s strategy is to focus on the contentious parts of the proof previously identified by Scholze and Stix, where the LANA initiative has stalled. They aim to create a formal blueprint encompassing four additional steps. Mochizuki has begun this process by drafting 70 lines of Lean code, though the code has not yet been made public.

This code, according to Kevin Buzzard of Imperial College London, is minimal. “Seventy lines hardly suffice; you would struggle to prove even a few undergraduate-level theorems within that.”

However, these developments are among the most promising advancements in comprehending Mochizuki’s proof since its debut. “We haven’t seen much movement, no new relevant information, and this is the first time I sense actual momentum,” Buzzard observes.

Topaz says that, despite the remaining challenges, he is hopeful of progress, although the groups’ precise path forward remains uncertain, especially as Mochizuki maintains communication with the LANA project.

“I’m quite optimistic that we might find a resolution to this controversy due to the dialogues I’ve had with Mr. Mochizuki regarding Lean,” Topaz adds. “What excites me the most is that we are engaging in reciprocal discussions with Mr. Mochizuki’s team.”


Source: www.newscientist.com

Discovering Hidden Fossils: Uncovering Secrets of Pre-Mass Extinction Oceans

Discoveries of Radiolarian Fossils in a Rock Sample

Provided by Jonathan Aitchison

A minuscule pellet of ancient rock, measuring only half the size of a rice grain, has unveiled 20 microscopic fossils from eight distinct species, including several previously unknown types. This significant discovery enhances our knowledge of the second-largest mass extinction known to science, while demonstrating how innovative analytical techniques can uncover neglected segments of the fossil record.

Jonathan Aitchison, a professor at the University of Queensland in Australia, was pivotal in extracting these pellets from rocks gathered in late 2018 from the Sichuan Basin in China, located approximately 300 kilometers south of Xi’an. These rocks date back 445 million years, situating them just prior to the late Ordovician mass extinction, ranked as the second most severe extinction event in the last half billion years.

The pellets contained eight species of radiolarians—single-celled plankton characterized by silica shells that continue to inhabit oceans today.

The discovered fossils encompass five genera, four families, and three orders, including a newly identified species named Haplotaniatum woufengensis.

The fossils are remarkably well-preserved, with both external and internal structures perfectly encased in asphalt, creating flawless impressions.

Patrick Smith, from the New South Wales Geological Survey in Sydney, Australia, remarked that the fossils were formed before the extinction event escalated.

“The quantity and diversity of fossils indicate that marine ecosystems, especially microscopic plankton communities, thrived just prior to the extinction,” Smith stated. “Ordovician oceans were significantly more biologically diverse than previously understood, especially on a microscopic scale. These fossils showcase a vibrant plankton community during a pivotal moment of environmental upheaval in Earth’s oceans.”

Traditionally, researchers have studied small fossils by using acid to dissolve surrounding rock, a process Aitchison notes is highly destructive.

In contrast, the study employed advanced X-ray technology (from the Australian Nuclear Science and Technology Organization’s synchrotron in Melbourne) to scan the rock pellets, yielding high-resolution 3D images of the contained fossils within seconds.

“Growing up, I was fascinated by ads for X-ray glasses that could see through objects,” Aitchison commented. “Now, I can ‘see’ these radiolarian plankton directly within the rocks without needing to remove them.”

“This represents the most significant technological advancement in my career,” he added.

Professor Aitchison concluded that the extensive life forms discovered in such a limited sample size imply that the marine biodiversity found in other Late Ordovician rocks might be “significantly underestimated.”

Smith emphasizes that a key takeaway from this study is that numerous fossils remain to be explored worldwide, “not due to a lack of specimens, but because our traditional methods are insufficient for detection and recovery.”


Source: www.newscientist.com

How Your Vitamin D Levels Influence Dementia Risk: Key Insights

Recent research indicates that individuals with high vitamin D levels in their late 30s had notably lower levels of a critical Alzheimer’s disease protein in their brains 16 years later.

The results, published in Neurology Open Access, suggest that maintaining adequate vitamin D levels in midlife could be a strategy for lowering dementia risk.

This correlation is attributed to elevated blood levels of vitamin D being linked to reduced levels of tau protein in the brain, a recognized biomarker for Alzheimer’s disease.

“Previous studies indicate that vitamin D may help by reducing inflammation and enhancing antioxidant defenses and cell signaling, potentially preventing tau protein accumulation,” first author Dr. Martin Mulligan, a professor at the University of Galway, told BBC Science Focus.

Importantly, no association was observed with amyloid plaques, another hallmark of Alzheimer’s disease. Researchers suggest that this may reflect the earlier accumulation of tau compared to amyloid, making it more detectable in younger individuals.

Vitamin D synthesis occurs in the body upon sun exposure and can also be sourced from foods like oily fish and eggs.

In the study, nearly 800 participants without dementia, averaging 39 years of age, had their vitamin D levels tested initially, followed by PET brain scans conducted an average of 16 years later.

The robust association between higher vitamin D levels and lower tau levels persisted after accounting for variables such as age, gender, cardiovascular risk factors, and depression.
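Adjusting for such covariates is typically done with multiple regression; the sketch below illustrates the idea on synthetic data (hypothetical numbers and variable names, not the study's actual model or results):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 800  # roughly the study's sample size

# Synthetic data: tau burden decreases with vitamin D, plus confounding covariates.
vitamin_d = rng.normal(50, 15, n)       # blood vitamin D, nmol/L (made-up scale)
age = rng.normal(39, 5, n)
cardio_risk = rng.normal(0, 1, n)
tau = 10 - 0.03 * vitamin_d + 0.05 * age + 0.2 * cardio_risk + rng.normal(0, 1, n)

# Least-squares fit of tau on vitamin D while adjusting for the covariates:
# the vitamin D coefficient is its association *holding the covariates fixed*.
X = np.column_stack([np.ones(n), vitamin_d, age, cardio_risk])
coef, *_ = np.linalg.lstsq(X, tau, rcond=None)
print(f"adjusted vitamin D coefficient: {coef[1]:.3f}")  # typically near -0.03
```

A persistent negative coefficient after adjustment is what "the association persisted after accounting for age, gender, cardiovascular risk factors, and depression" means in practice.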

Vitamin D is produced in the skin during summer sun exposure; in winter, dietary sources and supplements become more important – Photo credit: Getty.

Dr. Mulligan noted that approximately one-third of participants had low vitamin D levels, consistent with global trends.

This study is observational and does not establish a cause-and-effect relationship. Vitamin D was measured only once within a predominantly white cohort, limiting the findings’ generalizability.

Dr. Mulligan emphasized the necessity for further validation in diverse cohorts before revising clinical guidelines.

“This hypothesis requires additional testing through clinical trials, and based on these results alone, we cannot formally recommend taking supplements as a preventive measure for dementia,” he stated.


Source: www.sciencefocus.com

Physicists Discover Elusive Nuclear State: Breakthrough in Nuclear Physics Research

Groundbreaking experiments in Germany reveal the first evidence of the long-predicted pairing of carbon-11 nuclei and η’ mesons, shedding light on how the strongest force in nature contributes to the origin of mass.

Experiments at Germany’s GSI/FAIR research center have uncovered evidence of exotic nuclear states (Sekiya et al.). Image credit: J. Hosan, GSI/FAIR.

“In physics, we identify four fundamental forces: gravity, electromagnetism, strong interactions, and weak interactions,” stated Professor Kenta Itabashi from RIKEN and his team at Osaka University.

“Various bound systems are maintained by these forces. For instance, gravity holds the Earth and the moon together, while electromagnetic interactions bind positively charged atomic nuclei with negatively charged electrons.”

“The nucleus of an atom, composed of protons and neutrons, is held together by strong interactions.”

“In addition to protons and neutrons, which are each made up of three quarks, other particles, such as mesons, also participate in strong interactions.”

“Certain mesons carry a negative charge,” the physicists commented.

“In special instances, these mesons can displace electrons within an atom and bond to the nucleus via electromagnetic interactions.”

“However, some mesons, including the η’ meson, are electrically neutral.”

“Due to its lack of charge, the η’ meson cannot bond electromagnetically to an atom’s nucleus, relying instead on strong interactions for binding.”

“These situations, where strong interactions are the sole binding mechanism, are particularly intriguing as they allow us to gain insights into the nature of this force.”

In 2005, scientists anticipated the existence of meson-nuclear configurations formed solely by strong interactions.

However, thorough investigations into this exotic state had remained inconclusive until now.

Professor Itabashi and his collaborators conducted pioneering experiments at the GSI fragment separation facility in Germany.

“Our proton beam strikes carbon-12 nuclei at approximately 96% of the speed of light; each proton can pick up a neutron, forming a deuteron that proceeds forward,” the researchers explained.

“The residual carbon-11 nucleus is excited into a high-energy state, producing an η’ meson that occasionally binds with it. This results in a transient, bound quantum state.”
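For scale, the beam energy implied by “96% of the speed of light” follows from standard relativistic kinematics (a back-of-the-envelope estimate, not a number from the paper):

```python
import math

# Relativistic kinematics for a proton moving at beta = v/c = 0.96.
PROTON_REST_MASS_MEV = 938.272  # proton rest energy in MeV

beta = 0.96
gamma = 1.0 / math.sqrt(1.0 - beta**2)                  # Lorentz factor
kinetic_energy = (gamma - 1.0) * PROTON_REST_MASS_MEV   # kinetic energy in MeV

print(f"gamma = {gamma:.3f}")                              # 3.571
print(f"kinetic energy ~ {kinetic_energy / 1000:.2f} GeV") # ~2.41 GeV
```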

The implications of this experimental breakthrough extend well beyond the initial identification of an exotic nuclear state.

Simultaneously, it was demonstrated that the mass of the η’ meson diminishes within nuclear matter.

This finding enhances our comprehension of how meson mass is generated. The combined masses of the quarks in the η’ meson account for only about 1% of its total mass when unbound.

“Moving forward, our collaborative effort will conduct enhanced follow-up experiments, utilizing substantially more data to accurately gauge the spectroscopic properties of bound η’ meson nuclear systems, focusing on energy levels, binding energies, and decay widths,” the researchers concluded.

For further details, refer to the team’s paper in Physical Review Letters.

_____

Takashi Sekiya et al. 2026. Excitation spectrum of the 12C(p,d) reaction near the η’-meson emission threshold measured simultaneously with high-momentum protons. Physical Review Letters 136, 142501; doi: 10.1103/6vsl-ng7x

Source: www.sci.news

Step-by-Step Guide to Artemis II Earth Reentry: Everything You Need to Know


As the Artemis II crew prepares for their return to Earth, NBC’s Tom Costello utilizes augmented reality to guide you through the re-entry process step-by-step.


Source: www.today.com

How Excessive Luxury Salt Consumption Affects Your Health


Why Iodized Salt Deserves a Comeback

Tatyana Baibakova/Alamy

In university, I had a passionate biology lecturer dedicated to resolving global iodine deficiencies. He always advocated for iodized salt, claiming it plays a pivotal role in enhancing public health. His emphasis on its significance still resonates with me whenever I browse the salt aisle at the supermarket.

Recently, I’ve observed a decline in the availability of iodized salt. Fancy varieties like Cornish sea salt, Himalayan pink salt, and gourmet kosher salts are dominating the shelves. The remaining iodized salt products are often unattractive, posing the question: Are we risking the benefits brought by this simple yet vital mineral?

Iodine is a crucial mineral that the thyroid uses to produce hormones essential for metabolism, growth, digestion, heart rate, and body temperature regulation.

Ensuring adequate iodine intake during pregnancy is especially critical, as thyroid hormones influence fetal brain development; even mild iodine deficiency can diminish a child’s intelligence, by as much as 13 IQ points. Iodine is equally important for children, supporting both brain development and thyroid function. There are reports of children whose extremely picky eating led to iodine deficiency, with resulting poor school performance and fatigue. Additionally, both adults and children can develop goiter, an enlargement of the thyroid gland, from insufficient iodine intake.

Natural sources rich in iodine include seaweed, seafood, and dairy products. Milk contains iodine due to iodine being added to cow feed and the use of iodine-based disinfectants during milking. Fruits, vegetables, and grains can capture minimal amounts of iodine from soil, which varies significantly in iodine content. Regions like Switzerland and Michigan historically had iodine-poor soil, resulting in high incidences of goiter among children.

In 1922, Switzerland pioneered iodized salt by adding iodine to table salt. This initiative led to a near elimination of goiters and remarkable increases in children’s height and IQ, as economist Dimitra Politi described it. High school graduation rates soared as a result of this public health intervention.

Iodized salt made its way to Michigan in 1924, followed by widespread adoption across the U.S. and other countries. Its introduction significantly contributed to the global rise in IQ witnessed in the 20th century. Rarely has such an inexpensive invention delivered such monumental benefits. Endocrinologist Gerald Barrow famously stated, “Five cents per person per year can make the entire population smarter than before.”

Despite these gains, iodized salt faces a popularity crisis today. The allure of pink Himalayan salt often overshadows the practical benefits of iodized options, and many consumers avoid iodized salt in the mistaken belief that it contains harmful additives, even though iodine is a natural element.

As people reduce their use of iodized salt at home, they increasingly rely on processed foods, which typically contain non-iodized salt. The growing popularity of vegan diets and plant-based milk alternatives further diminishes iodine intake.

A recent study indicates that the proportion of Americans not consuming enough iodine has doubled since 2001, with alarming findings showing that 46% of pregnant women are iodine deficient.

This trend is mirrored in the UK, where women of reproductive age show average iodine levels below recommended standards. In Australia, 62% of pregnant and breastfeeding women lack sufficient iodine. Conversely, some regions in Japan report excessive iodine intake leading to thyroid complications.

Consequently, public health experts urge residents of the U.S., U.K., and Australia to reintroduce iodized salt into their diets to safeguard against cognitive impairments, thyroid issues, and the potential return of goiter.

It’s perplexing. The supplement industry thrives, with people consuming large doses of zinc, selenium, and ginkgo biloba for brain health, often despite minimal evidence supporting these benefits. In contrast, iodine supplements and iodized salt remain overlooked, despite the risks associated with iodine deficiency.

Regardless of current trends, I will persist in my quest for iodized salt at the supermarket, wary of the judgment that may accompany a purchase of those appealing pink flakes.

Topics:

  • Nutrition and health/
  • Dietary supplements


Source: www.newscientist.com

New Scientist Recommends Exploring Sampling Experiences at London’s Edible Earth Museum

Try Samples at the Museum of Edible Earth

Photo Credit: David Parry/PA Media Assignments

Geophagy and Mental Health: Earth eating, or geophagy, is recognized by the American Psychiatric Association as a mental health condition unless tied to cultural practices.

Discover more about this fascinating topic at the Museum of Edible Earth in Somerset House, London, running until April 26th.

During my visit, I encountered approximately 600 soil samples collected by the museum’s founder, Mashal. Highlighted were red ocher from South Africa, a source of iron, and black nakumat clay used by pregnant women in India for nausea relief. In the UK, only two varieties are approved for tasting as nutritional supplements.

Luvos Healing Earth, known for digestive benefits, resembles chocolate sprinkles but tastes like the grit of an unwashed leek. In contrast, I enjoyed the milled Mexican diatomaceous earth, a silky, slightly sour flour. Beyond taste, I reveled in imagining the ancient aquatic creatures that once inhabited this soil.

Thomas Luton
Features Editor, London


Source: www.newscientist.com

Revealing Proton Size: New Insights into the Fundamental Particle

Vacuum chamber used to measure electronic transitions in atomic hydrogen, aiding in estimating proton size.

Axel Beyer/MPQ

Newly acquired data reveals the true size of the proton, marking a significant milestone in particle physics. Over 15 years ago, a surprising experiment reshaped our understanding of this subatomic particle’s fundamental properties.

Protons are essential constituents of matter, and until 2010, our comprehension of their structure seemed complete. We recognized that protons consist of three quarks, but uncertainties about their size lingered.

In 2010, investigations involving exotic hydrogen atoms, in which the electron is replaced by its heavier cousin the muon, suggested that protons may actually be 4% smaller than previously thought. Research teams then spent years exploring sources of error and theories that might resolve this “proton radius puzzle.” In 2019, a further experiment reinforced the indication that the proton’s size had been overestimated.

Excitingly, the confusion surrounding the proton’s size now appears to be resolved by two complementary experiments, which convincingly support the smaller value. Their findings indicate that the proton’s radius is approximately 0.84 femtometers, where a femtometer is a millionth of a billionth of a meter.

As physicist Dylan Yost from Colorado State University explains, “Reviewing the data makes you reconsider the betting odds on the proton’s radius. These measurements significantly bolster our understanding.”

To ascertain this new radius, both research teams focused their efforts on hydrogen atoms, which consist of one proton and one electron. The electromagnetic interaction between these oppositely charged particles is influenced by the proton’s size, allowing researchers to deduce its dimensions by observing electron energy transitions.
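The reason transition energies reveal the radius is that certain energy levels acquire a small finite-size shift proportional to the squared charge radius. The scaling can be sketched as follows (illustrative values, not the teams' actual analysis):

```python
# The leading finite-size correction to an S-state energy level scales as
# delta_E ∝ r_p**2 (the squared proton charge radius), so changing the
# assumed radius rescales the correction quadratically.
r_old = 0.877  # fm, the older, larger radius value (approximate)
r_new = 0.84   # fm, the smaller radius favored by recent experiments

ratio = (r_new / r_old) ** 2
print(f"finite-size correction rescales by {ratio:.3f}")  # 0.917
```

A ~4% change in radius thus shifts the finite-size contribution by roughly 8%, which is what precision spectroscopy of hydrogen transitions can detect.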

Using lasers, the teams manipulated electrons in hydrogen atoms, measuring three previously unrecorded energy transitions.

The proton radii calculated by the two teams not only agreed with each other but also confirmed the crucial 2010 measurement. As physicist Lothar Meisenbacher from the University of California, Berkeley noted, “It’s extremely unlikely that this proton radius puzzle persists.”

Conducting these experiments was no small feat. The teams placed hydrogen atoms in complete vacuum environments, utilized expensive lasers, and meticulously calibrated their equipment. While data collection might take weeks, it often requires years to eliminate potential disturbances and errors, according to Meisenbacher.

Yet, if multiple experiments produce comparable results, diversity in methodologies can serve as an advantage, ensuring that equipment-specific errors do not skew findings. Juan Rojo from Vrije Universiteit Amsterdam emphasizes that “the proton’s radius is a universal property, and consistent results across different approaches enhance credibility.”

Understanding proton size is vital for refining theories about potential new particles, as noted by Yost. The recent MPQ experiment has accurately tested existing theories, such as quantum electrodynamics, with a precision of 0.5 parts per million. Although no discrepancies with predicted outcomes emerged, the research lays the groundwork for future explorations in particle physics.

While high-energy colliders search for heavier particles, these precision hydrogen studies probe for lighter, hidden ones. “With a clearer understanding of proton size, we can now ask: what constraints can we establish for new physics?” concludes Yost.

Topics:

  • particle physics/
  • quantum physics

Source: www.newscientist.com

Understanding Gödel’s Incompleteness Theorem: How One Man Transformed Mathematics

Kurt Gödel, logician and mathematician

Logician, mathematician, philosopher, and visionary Kurt Gödel

Pictorial Press/Alamy

Kurt Gödel, the individual who transformed mathematics, stands as one of the most pivotal thinkers of the 20th century. Born in 1906 during an era of great mathematical turmoil, his contributions would later reshape this landscape, albeit confining mathematicians to a more bounded realm.

The realm of mathematics is an extraordinarily powerful intellectual framework, allowing us to construct a vast array of logical ideas upon others. It resembles a cognitive perpetual motion machine; new mathematical insights seem perpetually within reach, awaiting only the right methodologies. Yet, the core of mathematics conceals a profound, limiting truth known as Gödel’s incompleteness theorem.

The genesis of this theorem traces back to the late 19th century, a time when mathematicians began to elucidate the foundational structures of their discipline. To their dismay, they found that thousands of years of mathematical thought rested on unstable ground, leading to a wave of paradoxes that unsettled the field.

In response to this chaos, the mathematician David Hilbert, at the 1900 International Congress of Mathematicians in Paris, formulated a list of 23 unsolved problems to guide research efforts for the 20th century. “As long as a branch of science poses challenges, it shall endure,” he told the audience.

Gödel would later confront Hilbert’s second problem, which pertains to the axioms of mathematics—the foundational assumptions that dictate logical deductions. Hilbert’s challenge to mathematicians was to prove that the axioms of arithmetic could be considered “non-contradictory,” ensuring that a finite number of logical steps drawn from them could not yield conflicting outcomes.

This proof is vital; envision a board game where scoring relies on contentious interpretations of rules—one interpretation could earn points, while another may result in a loss. Such a game would be inherently flawed.

Over the decades that followed, Hilbert and his associates attacked his second problem by developing “proof theory,” a method of abstracting proofs into mathematical objects, which let them treat proofs as recipes for generating valid conclusions. Hilbert showcased the approach in a 1928 lecture on Die Grundlagen der Mathematik (The Foundations of Mathematics), asserting that it could settle foundational questions in mathematics through definitively demonstrable formulas, while acknowledging the effort a meaningful resolution would require.

At that moment, Gödel was a 22-year-old doctoral candidate at the University of Vienna, under the mentorship of a mathematician aligned with Hilbert’s program. While there is no evidence they ever directly interacted, Gödel’s subsequent publication of his completeness theorem as part of his doctoral work marked a significant milestone toward achieving Hilbert’s objectives.

Completeness theorems revolve around models of axiom sets, essentially mapping concrete mathematical objects to the symbolic constructs like “2” or “+” within a given framework. For simplification, consider axioms stating: “Two entities exist” and “Entities differ.” Though minimal, these are valid axioms, leading to various applicable models, such as coin sides (heads or tails) or numerical representations (0 and 1). Diverse models can inform the same axiom set, with Gödel demonstrating that if a statement holds true across all potential models, it is provable from those axioms.
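The toy axiom set above can even be checked mechanically. The sketch below (hypothetical code, purely to illustrate what it means for a model to satisfy a set of axioms) tests several candidate models against the two axioms:

```python
def satisfies_axioms(domain):
    """Check the two toy axioms: 'two entities exist' and 'the entities differ'."""
    entities = list(domain)
    two_exist = len(entities) == 2
    they_differ = two_exist and entities[0] != entities[1]
    return two_exist and they_differ

# Different concrete models can satisfy the same axiom set.
print(satisfies_axioms({"heads", "tails"}))  # True
print(satisfies_axioms({0, 1}))              # True
print(satisfies_axioms({0}))                 # False: only one entity exists
```

Gödel's completeness theorem is the far deeper statement that whatever holds in *every* model of the axioms is in fact provable from them.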

While this may seem self-referential, it provided optimism for Hilbert’s endeavor to solidify the underpinnings of mathematics.

On September 6, 1930, Gödel presented his completeness theorem at a conference in Königsberg (now Kaliningrad, Russia). Hilbert was also in the city for a different conference, delivering a notable address on September 8 in which he famously repudiated the idea that human understanding has limits, declaring, “We must know. We will know,” words that eventually adorned his gravestone.

Yet, unbeknownst to Hilbert, Gödel had undermined all such ambitions the day before. During discussions with other logicians on September 7, he revealed his discovery of an “undecidable” statement—one that cannot conclusively be proven true or false within a particular set of axioms. This marked the dawn of an idea that would forever constrain the scope of mathematics.

Incompleteness is a vital concept in modern mathematics, reflecting essential limitations.

Super Stock / Alamy

While it’s tempting to visualize Gödel chuckling at Hilbert’s lecture, records show no meetings occurred. However, a few months later, in January 1931, Gödel published the incompleteness theorem—an illuminating counterpoint to his earlier work.

This theorem asserts two crucial points. First, for any consistent set of axioms rich enough to describe arithmetic, certain statements remain undecidable within its confines. Gödel’s undecidable sentence is akin to the self-referential phrase “This statement is unprovable,” which can be neither proved nor refuted without contradiction. This is what we now call Gödel’s first incompleteness theorem, and it retains its significance nearly a century later. I previously explored how a certain computer program could theoretically destabilize mathematics because of an undecidable problem.

While the First Incompleteness Theorem reshaped our perception of mathematical capabilities, Gödel’s Second Incompleteness Theorem posed even greater challenges for Hilbert. Gödel demonstrated that a sufficiently robust set of axioms could never confirm its own non-contradictory nature.

Returning to our board game analogy, reading the rules provides no assurance against contradictions. Hilbert sought assurance of the consistency of the axioms of arithmetic, yet Gödel showed that, from within those axioms, this question is undecidable. There is a workaround: a new axiom can be added that asserts the earlier axioms’ consistency. But the consistency of the enlarged system is, in turn, unprovable from within, and so on in an endless regress, leaving mathematicians with an enduring sense of the unknown rather than infinite horizons.
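Schematically, in standard logical notation (my summary, not the article's):

```latex
% For any consistent theory T strong enough to encode arithmetic:
T \nvdash \mathrm{Con}(T), \qquad
T_1 := T + \mathrm{Con}(T) \;\Rightarrow\; T_1 \nvdash \mathrm{Con}(T_1),
\qquad \ldots
```

Each strengthened theory can prove the consistency of its predecessor but never its own, so the regress never terminates.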

What was Hilbert’s response to this earth-shattering revelation? Remarkably, he made no public acknowledgment. As Gödel’s biographer John Dawson notes, Gödel forwarded a draft of his findings to Hilbert’s assistant, Paul Bernays, who acknowledged it but offered no feedback. Dawson suggests that Gödel’s findings stirred Hilbert’s ire; it wasn’t until 1934, in a book written jointly with Bernays, that Hilbert publicly addressed Gödel’s work, claiming, “The temporary view that Gödel’s results rendered my proof theory unviable turned out to be false.”

In sum, Gödel’s groundbreaking insights challenged, and ultimately overturned, Hilbert’s vision of mathematics as a complete and fully decidable edifice. Incompleteness has become a cornerstone of modern mathematics, revealing both its richness and its limits. I often ponder whether Gödel himself felt a sense of incompleteness after his one-sided exchange with Hilbert.


Source: www.newscientist.com

Revolutionary Quantum Batteries: Harnessing Time Reversal for Instant Charging

Quantum batteries harvesting energy by reversing time


Photo by Dakuku/Getty Images

Innovative methods designed to reverse time flow in quantum systems may pave the way for the next generation of quantum batteries.

Across the cosmos, we perceive events as unfolding in a single direction, the familiar arrow of time. Yet the fundamental laws governing our universe work equally well whether time runs forward or backward.

Scientists have developed various theories to explain the apparent discord between the one-way arrow of time we observe and the permitted bidirectional flow dictated by physical laws. A prominent example is the second law of thermodynamics, which posits that systems naturally progress towards greater disorder, thereby favoring a forward time direction.

In quantum mechanics, the understanding of the arrow of time diverges. Just like classical laws, quantum processes can technically unfold in either direction. However, the forward direction is determined by comparing measurements of a quantum system against theoretical predictions regarding its temporal evolution. When these measurements align with specific statistical patterns, the system is interpreted as progressing forward in time.

Recently, Luis Pedro Garcia-Pintos and his team at Los Alamos National Laboratory in New Mexico formulated a method to exploit this statistical signature. By reverse-engineering the measurement-induced changes in a quantum system, they create the illusion that the system is running backward in time.

“We apply field and control techniques to the system that allow us to undo the effects of measurements,” explains Garcia-Pintos. “If a measurement causes the system to elevate, we can counteract this by bringing it down, effectively creating a trajectory that aligns more with a backward time process.”
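The “bring it back down” idea Garcia-Pintos describes can be illustrated with a deliberately simplified toy model: we track a qubit’s Bloch vector classically, model the measurement backaction as a small rotation, and undo it with the inverse control rotation. The angle and the choice of axis here are our illustrative assumptions, not the team’s actual protocol.

```python
import math

def rotate_x(bloch, theta):
    """Rotate a Bloch vector (x, y, z) about the x-axis by angle theta."""
    x, y, z = bloch
    c, s = math.cos(theta), math.sin(theta)
    return (x, y * c - z * s, y * s + z * c)

# Start in the qubit's +z state.
state = (0.0, 0.0, 1.0)

# Model a weak measurement's kick ("the system elevates") as a small
# unintended rotation about x.
kick = 0.3
disturbed = rotate_x(state, kick)

# Control pulse: apply the inverse rotation, undoing the measurement's
# effect, as in the "bring it down" step described in the quote above.
restored = rotate_x(disturbed, -kick)

assert all(abs(a - b) < 1e-9 for a, b in zip(state, restored))
print("state restored:", restored)
```

In the real proposal the undoing is performed with engineered fields on a genuinely quantum state; the toy only shows why a reversible disturbance can be cancelled exactly.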

The researchers suggest the arrow of time could be manipulated in a qubit, an essential element of quantum computing, by measuring properties such as its spin. This depends, however, on being able to measure the qubit continually without unduly disrupting it, and then undoing the measurements’ effects with applied microwave pulses.

This technology holds the promise of enabling energy extraction from quantum systems requiring measurement, according to Garcia-Pintos. Such an advancement could significantly impact quantum batteries and miniature quantum engines, as each measurement introduces energy into the system.

By carefully adjusting the quantum arrow of time, this energy can be effectively redirected and harnessed for alternative applications. “Consequently, you derive energy from this process,” states Garcia-Pintos. “These measurements can serve as thermodynamic resources.”

Mauro Paternostro cautions that the proposed design is specialized and does not apply universally to all quantum systems.

Moreover, achieving order in a system necessitates an energy expenditure, ensuring compliance with the second law of thermodynamics. “When I enter my son’s room, chaos reigns—balls roll and clothes scatter. If I take the time to clean, the room becomes tidier, but this requires energy,” he remarks. “This is precisely what their external control mechanisms demonstrate.”


Source: www.newscientist.com

High-Stakes Moments in NASA’s Artemis II Mission: Astronauts Gear Up for Landing

The four astronauts aboard NASA’s Artemis II mission are nearing their return to Earth, but a crucial and perilous phase of the mission remains ahead.


NASA astronauts Reid Wiseman, Christina Koch and Victor Glover, along with Canadian astronaut Jeremy Hansen, are set to return to Earth on Friday night following a 10-day mission in space.

Their Orion capsule is scheduled to re-enter the atmosphere around 7:53 p.m. ET, embarking on a critical journey expected to take under 15 minutes. If everything goes as planned, the mission will conclude with a splashdown in the Pacific Ocean off San Diego’s coast at 8:07 p.m. ET.

“There’s a 13-minute window where everything must go right,” stated Jeff Radigan, NASA’s Artemis II flight director, at a recent press conference.

Reentry poses significant risks during spaceflight, exposing the craft to temperatures near 5,000 degrees Fahrenheit as it plunges through the atmosphere. This risk is heightened for Artemis II due to a known design flaw in the Orion spacecraft’s heat shield, crucial for shielding astronauts from extreme heat.

This marks the first crewed mission for the Orion capsule.

After the uncrewed Artemis I mission in 2022, NASA detected unexpected damage to the spacecraft’s heat shield.

NASA’s Orion spacecraft was recovered post-Artemis I test flight and transported to Kennedy Space Center in Florida, where its heat shield underwent inspection.
NASA

NASA’s research revealed that certain materials in the heat shield cracked upon atmospheric reentry, leading to “charred material flaking off” in various areas. An investigation found that improper gas venting within the heat shield’s outer layers created pressure buildup, resulting in the damage.

Damage to the heat shield from the Artemis I mission is shown.
NASA

As a result of these findings, NASA intends to revise the heat shield design for subsequent Artemis missions. The Orion spacecraft for future flights will feature a more permeable outer material layer. Unfortunately, by the time NASA identified the damage from Artemis I, the Artemis II capsule was already built and assembled.

Rather than redesign the heat shield, NASA adjusted the capsule’s reentry trajectory to mitigate risks for the astronauts. The Orion spacecraft typically descends into the atmosphere, “skipping” like a stone on water to lessen thermal stress and gravitational forces before its final descent. NASA Deputy Administrator Amit Kshatriya explained that this “skip” will be brief, allowing the capsule to descend more rapidly and at a steeper angle, thereby reducing exposure to extreme temperatures.

“Everything the crew has demonstrated over the past nine days, from life support and navigation to propulsion and communications, comes down to the flight’s final moments,” Kshatriya said during a Thursday media briefing.

He further expressed “high confidence” in the spacecraft’s heat shield with the optimized flight path.

Nonetheless, substantial risks remain, with the lives of four astronauts at stake.

Former NASA astronaut Charlie Camarda voiced concerns regarding the heat shield, suggesting NASA should have delayed the Artemis II launch pending further assessment of the existing design.

“History shows that incidents occur when organizations misjudge the complexities of problems. This issue mirrors patterns seen prior to previous tragedies,” he articulated in an open letter to NASA Administrator Jared Isaacman in January.

Conversely, Isaacman declared earlier this month that he holds “full confidence” in the performance of Orion’s heat shield.

Wiseman conveyed satisfaction with the current plan.

“If we adhere to the newly-established atmospheric entry path, this heat shield is safe for flight,” he affirmed during a pre-flight media event in July.

Radigan noted that precise orbital positioning is essential for the atmospheric reentry protocol. Mission control has dedicated significant effort over the past day and a half to maintain the Orion spacecraft’s orbital path, executing necessary engine burns.

“Hitting the required angle is crucial for a successful atmospheric reentry,” Radigan emphasized.

Artemis II flight controllers monitor the Orion spacecraft from the White Flight Control Room at Johnson Space Center in Houston.
Ronaldo Schemidt/AFP - Getty Images

During reentry, the Orion capsule is projected to achieve speeds nearing 24,000 miles per hour, with astronauts experiencing gravitational forces approximately 3.9 times that of Earth.
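As a back-of-envelope sanity check on the figures quoted in this story, assume (our assumption, not NASA’s) that the capsule sheds essentially all of its 24,000 mph entry speed down to the roughly 20 mph splashdown speed over the ~13-minute window. The average deceleration then comes out well below the quoted 3.9 g peak, which is consistent: the peak occurs during the brief hottest phase, not across the whole descent.

```python
# Back-of-envelope check of the reentry figures quoted above.
# Assumptions (ours): deceleration from 24,000 mph at entry interface
# to about 20 mph at splashdown, spread over the ~13-minute window.
MPH_TO_MS = 0.44704   # miles per hour -> metres per second
G = 9.81              # standard gravity, m/s^2

v_entry = 24_000 * MPH_TO_MS   # ~10,729 m/s
v_splash = 20 * MPH_TO_MS      # ~9 m/s
window_s = 13 * 60             # 13 minutes in seconds

avg_decel = (v_entry - v_splash) / window_s
print(f"average deceleration: {avg_decel / G:.1f} g")  # well under the 3.9 g peak
```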

As the capsule penetrates the atmosphere, communication blackouts are anticipated due to plasma buildup surrounding the craft. Flight director Rick Henfling announced that this disruption is expected to last around six minutes.

“Post-blackout, Orion will be at approximately 150,000 feet, still descending rapidly,” he noted.

On Saturday, Artemis II commander Reid Wiseman gazed back at Earth from the Orion spacecraft’s main cabin.
NASA

At approximately 6,000 feet altitude, the capsule will deploy three main parachutes to decelerate to about 20 miles per hour before making contact with the ocean.

The U.S. Navy is set to assist with recovery operations in the Pacific. Following confirmation of a safe landing area, NASA plans to extract Koch from the capsule first, followed in order by Glover, Hansen, and Wiseman.

At a press conference on Thursday, Kshatriya commended the crew, emphasizing it was time for flight officials, engineers, and recovery teams to bring them home.

“The crew has performed their duties,” he stated. “Now it’s our turn to execute our part.”

Source: www.nbcnews.com

Rare Chimpanzee ‘Civil War’: Uncovering a Conflict Seen Once Every 500 Years

A community of approximately 200 chimpanzees in Uganda’s Kibale National Park has fractured into two rival factions, leading to a years-long, deadly conflict.

The Ngogo chimpanzees have been under continuous study for three decades, yet scientists have recently observed the violent split unfold in slow motion.

Starting around 2015, this previously unified group began to polarize. Social bonds weakened, neighborhoods within the community solidified into distinct factions, and once-shared territory became hotly contested. By 2018, this division had become permanent.

What transpired next was both surprising and alarming. The smaller faction, known as the western chimpanzees, initiated targeted raids into the territory of their larger rivals, the central group. Over the following six years, they killed at least seven adult males and 17 infants.

This count is likely an underestimate. An additional 14 adolescent and adult male chimpanzees went missing or died unexpectedly between 2021 and 2024, none showing any prior signs of illness.

Today, the western group has emerged as the dominant force in the forest.

Recent findings, published in Science, have drawn comparisons to civil wars. Unlike conflicts between strangers, the events at Ngogo involved former allies, grooming partners and long-term social companions turning against one another.

Researchers involved in this study estimate that such conflicts occur only once every 500 years.

Opposing factions from the Western Group and Central Group meet in Ngogo in 2021.

“One of the most intriguing aspects of this conflict is the so-called ‘friend-to-foe’ transformation,” stated Professor Aaron Sandel from the University of Texas at Austin in an interview with BBC Science Focus. “This provides a rare glimpse into the minds of chimpanzees.”

This research supports the notion that group identities can change, undermining long-held social bonds without the ethnic, religious, or ideological markers typically associated with collective violence.

“War between groups almost makes sense,” Sandel commented. “But civil strife troubles us because it is often neighbors turning against neighbors.”

He added that insights from chimpanzee behavior could help researchers formulate hypotheses about the factors that drive humans toward or away from similar conflicts.

“By focusing on human interactions and conflict resolution, we may uncover more effective avenues for promoting peace,” he concluded.


Source: www.sciencefocus.com

Revolutionary CAR-T Cell Therapy Restores Bedridden Woman to Full Health

CAR T Cells: Genetic Modification Process to Combat Autoimmune Diseases


Christoph Burgstedt/Science Photo Library

A woman suffering from three autoimmune diseases has found remarkable relief after undergoing CAR T cell therapy. Following genetic modification of her immune cells, she didn’t require treatment for nearly a year, as the engineered cells targeted and eliminated the rogue cells in her body. “When we first met, she was bedridden and at death’s door. After treatment, she was out of bed within seven days,” said Fabian Müller from Erlangen University Hospital in Germany. She made a full recovery within months, and a check 11 months after treatment confirmed her continued good health.

This case illustrates the growing potential of CAR T cell therapy for autoimmune diseases, particularly as she was the first patient treated for three concurrent conditions. “It’s astonishing that we could overcome three autoimmune diseases with just one treatment,” Müller remarked.

In response to viral infections, our bodies produce vast numbers of immune cells with random mutations. Unfortunately, some of these mutant cells become self-targeting and can persist indefinitely. This phenomenon occurred in the patient’s case over a decade ago during pregnancy, leading to her autoimmune hemolytic anemia—a severe condition where antibodies attack oxygen-carrying red blood cells.

Her immune system went on to produce antibodies that targeted platelets (leading to immune thrombocytopenia) and proteins preventing blood clots (causing antiphospholipid syndrome), exposing her to both severe anemia and dangerous clot risks.

Despite trying various immunosuppressive medications with no success, the patient required blood transfusions and anticoagulants to manage her symptoms until she was referred to Professor Müller and his team. In 2022, they became the first to treat an autoimmune disorder with CAR T cell therapy, a technique previously limited to cancer treatment.

For her treatment, researchers engineered CAR T cells to specifically target her abnormal antibody-producing immune cells. Following this intervention, these cells were effectively eliminated, restoring her immune system’s functions without entirely wiping it out.

Interestingly, her immune system recognized the infused CAR T cells as foreign and eliminated them within months, paving the way for the development of new, healthy antibody-producing cells. Consequently, her immune system is now functioning normally, free from the destructive cells responsible for her illness.

The CAR T therapy approach has shown promise for treating disorders like lupus, multiple sclerosis, colitis, and severe asthma. Unlike cancer treatments, which may induce severe side effects due to extensive cell death, the CAR T therapy used for autoimmune diseases is generally associated with far fewer adverse effects, as fewer cells need targeting.

Although some residual effects persisted, researchers believe these stem from previous drug therapies rather than the CAR T treatment itself. “This powerful treatment has minimal side effects and can resolve underlying symptoms, which is truly remarkable,” stated Ruben Benjamin from King’s College London.

Currently, most patients treated for autoimmune disorders with CAR T cell therapy have remained symptom-free, although some cases show a return of targeted cells, necessitating additional treatment options, as noted by Benjamin.

“Long-term follow-up is essential for a comprehensive assessment of these therapies,” he added. Jun Shi at the Chinese Academy of Medical Sciences in Tianjin is leading an ongoing trial of CAR T therapy in 15 patients with autoimmune hemolytic anemia.

While CAR T therapy is notably expensive, ranging from $200,000 to $600,000 because of its tailored nature, Müller emphasizes the long-term savings and benefits, noting that effective treatment can allow people to return to work with an improved quality of life. “The initial costs are high, but they could save substantial amounts in the long run,” he stated.


Source: www.newscientist.com

Chimpanzee Population Conflicts Reveal Evolutionary Origins of War

Conflict between Ngogo chimpanzees


Aaron Sandel

Once a cohesive group, the Ngogo chimpanzees have divided, leading to an escalation of violence and conflict. Researchers suggest this division might indicate that warfare is an ancient aspect of our lineage rather than a recent cultural development.

Aaron Sandel and his team at the University of Texas at Austin conducted a comprehensive analysis of 24 years of social-network records, 10 years of GPS tracking and 30 years of demographic data on the chimpanzees (Pan troglodytes) of Kibale National Park, Uganda.

Sandel emphasizes caution with terminology: “These are chimpanzees. Terms like war and civil war carry specific meanings for humans. While the conflict is not a civil war, there exist notable parallels, particularly regarding the shifts in group identity that precipitate lethal conflict.”

Chimpanzees are notorious for violence, predominantly targeting infants of rivals or outsider males.

The Ngogo population, comprising 150 to 200 individuals, belongs to a species that, together with bonobos (Pan paniscus), is recognized as humans’ nearest living relative.

Between 1995 and 2015, the Ngogo chimpanzees were known for their cooperative behavior, showcasing fission-fusion dynamics, where individuals form temporary associations throughout the day and regroup each evening.

During puberty, female chimpanzees typically leave the group, while males remain for life. Prior to 2015, adult males formed alliances with one another, jointly hunting and patrolling territory.

However, on June 24, 2015, a pivotal confrontation occurred when one faction, known as the central group, violently expelled the western group from their shared territory.

Following this event, the unity among the chimpanzees deteriorated. By 2018, the groups had permanently separated. During the ensuing years, the western group undertook 24 attacks, resulting in the deaths of seven adults and 17 infants from the other faction.

Chimpanzees from the Western Group on Patrol

Aaron Sandel

Sandel noted that the central chimpanzees were the first to pursue the western group; yet, the initial aggressors remain unclear. “As new factions emerged and divisions solidified, both groups engaged in territorial disputes,” he explained. “However, the western group has become the dominant aggressor, responsible for all fatal attacks.”

Various factors are believed to have contributed to the conflict’s escalation. Initial disagreements over food resources may have sparked tensions, and the deaths of five males and one female in 2014 likely weakened the social structure. Changes in alpha-male dynamics further destabilized the group, and an outbreak of respiratory disease then compounded the damage.

This outbreak, which claimed 25 Ngogo chimpanzees in January 2017, including the last males who still maintained ties to both factions, extinguished any hope of reconciliation.

Sandel and his team propose that the patterns observed in chimpanzee conflicts could provide insights into the evolutionary foundations of human warfare. While contemporary human conflicts are often attributed to ethnic, religious, and political divisions, this perspective may overlook the fundamental social dynamics shared with our primate relatives.

“In specific scenarios, the path toward peace may stem from simple, everyday acts of reconciliation,” the researchers articulated in their findings.

Maud Muzino from Boston University emphasizes that there are two predominant theories regarding the origins of human conflict. The first posits that war is a recent cultural innovation stemming from agrarian society and the establishment of nation-states. The alternative viewpoint asserts that the roots of warfare trace back through human evolution. “Ngogo’s findings significantly contribute to understanding the deep-seated origins of human conflict,” Muzino notes.

This study reveals that social fragmentation and subsequent conflicts can arise independently of the cultural differences often presumed to trigger human wars, be it in beliefs, language, or religious practices, states Luke Glowacki, also from Boston University.


Source: www.newscientist.com

Video Reveals Rising Hostility Among Chimpanzees After Social Split

A study reveals increasing aggression among two subgroups of Ngogo chimpanzees in Uganda’s Kibale National Park following a social split, leading to over 20 chimpanzee deaths.


Source: www.nbcnews.com

New Research Uncovers Internal Conflicts Among Violent Chimpanzee Groups

For many years in Uganda’s Kibale National Park, two groups of chimpanzees coexisted, engaging in grooming, socializing, and territory patrols within their communities.


Then, in a shocking turn of events, one group violently attacked the other, igniting years of conflict likened to human civil wars.

When the violence erupted in 2015, John Mitani, a professor emeritus of anthropology at the University of Michigan who has studied the chimpanzees for more than 20 years, described the scene: “It was just chaos. They started screaming and chasing each other.”

In the three years following the outbreak, Mitani and his colleague Aaron Sandel, an associate professor of anthropology at the University of Texas, documented how the chimpanzees’ social networks began to erode. By 2018, the two factions known as Western Ngogo and Central Ngogo chimpanzees “stopped sharing territory and began engaging in aggressive behavior, even killing each other,” Mitani reported.

At least 28 chimpanzees, including 19 infants, have been killed in this period, according to the research published in Science.

Mitani remarked, “Individuals who once aided each other now view one another as enemies.”

Basie, a central chimpanzee, is attacked by two males from the western subpopulation.
Aaron Sandel

This marks the second observed instance of chimpanzee factions splitting and turning violent. Given that chimpanzees and bonobos are humans’ closest genetic relatives, the findings may offer insights into human behavior.

“Civil wars afflict people. How does a neighbor become an enemy? This study of chimpanzees highlights how group identity evolves and how lethal aggression arises,” Sandel noted.

Infants were torn from their mothers and killed.

Following the social collapse, the violence became one-sided. The western chimpanzees, initially the smaller faction, were responsible for all attacks after the 2018 split. While their numbers rose from 76 to 108, the central chimpanzee population has steadily declined.

Attacks have been brutal, with a Western chimpanzee reportedly tearing an infant from its mother’s arms and killing it.

Sandel highlighted that chimpanzees often gang up when targeting adult or adolescent males.

“Five or ten chimpanzees will overwhelm an individual, holding them down, biting, beating, and dragging them,” he explained. “The violence can be horrifying.”

Mitani remarked, “It’s distressing to witness.”

“The situation deeply troubles me,” he expressed.

Researchers seek answers regarding the reasons behind the collapse of social structures.

Since 1995, chimpanzees at Ngogo have been closely monitored, with structured documentation of their behavior. The recent study utilized 10 years of GPS tracking, 30 years of demographic data, and 24 years of detailed observations.

The team examined chimpanzee social networks by observing individual males for an hour, recording proximity, interactions, and grooming behaviors. They noted recurring patterns of overlapping social dynamics that ultimately led to a breakdown.
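Dyadic observations of this kind are often summarized in primate research with a simple ratio association index: the fraction of sightings in which two individuals were together out of all sightings involving either of them. The sketch below is purely illustrative; the function name and the observation counts are invented, not the study’s actual data.

```python
# Toy dyadic association index of the kind used in primate
# social-network studies (a "simple ratio index").
def simple_ratio_index(together, a_only, b_only):
    """Joint sightings divided by all sightings involving either individual."""
    total = together + a_only + b_only
    return together / total if total else 0.0

# Hypothetical observation counts for two male dyads.
bonded = simple_ratio_index(together=40, a_only=5, b_only=5)      # 40/50 = 0.8
estranged = simple_ratio_index(together=2, a_only=24, b_only=24)  # 2/50 = 0.04

print(f"bonded dyad: {bonded:.2f}, estranged dyad: {estranged:.2f}")
```

Tracking indices like these year by year is one way a gradual social split, such as the one described above, becomes visible in the data.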

Mitani and Sandel propose that the group’s large size might have played a role in the violent divide. While typical chimpanzee groups consist of around 50 individuals, the Ngogo community boasted about 200, possibly straining social connections and heightening competition for resources.

Additionally, the death of five adult males from disease prior to the split could have disrupted critical social ties. Following this, a new alpha male emerged in 2015, further complicating social dynamics.

“That’s a significant factor,” Mitani explained, noting that such shifts typically occur every 6-8 years, often increasing aggression and altering relationships.

Decades ago, Jane Goodall witnessed similar violence

Approximately 50 years ago, the late Jane Goodall and her team observed a similar fracture in Tanzania’s Gombe National Park, after which the main group hunted down and killed all the males of the splinter group.

Researchers subsequently termed this conflict the “Four Years’ War.”

Anne Pusey, a professor emeritus of evolutionary anthropology at Duke University, studied these interactions until 1975, and noted that conditions preceding the killings were “remarkably similar” to those seen in Ngogo.

In Gombe, changes in alpha leadership, a shortage of mating females, and the deaths of amicable older males precipitated violent behavior.

Pusey remarked, “These social bonds deteriorated, leading to hostility.”

Joseph Feldblum, an evolutionary anthropologist with experience studying Gombe, stated that the recent findings align with historical observations.

“Such behavior is infrequent, yet exists within the natural repertoire of chimpanzees,” he noted.

Mitani expressed concern over the future of the Ngogo central group, suggesting they may be “doomed” based on past events at Gombe.

“The signs are evident,” he stated.

With the ongoing violence against infants and exclusion of females, Mitani concluded, “We might be witnessing an extinction event.”

Impact on humans

What can we learn from the violent behaviors displayed by our closest relatives?

Sandel emphasized that while cultural differences are often blamed for human warfare, this explanation does not apply to chimpanzees.

“Chimpanzees lack ethnicity, religions, and political ideologies, which are often identified as causes of human conflicts, especially internal strife like civil wars.”

Instead, researchers believe the violence is rooted in the breakdown of friendships and rivalry escalation. Sandel suggested that these dynamics may play a more pivotal role in human civil wars than commonly recognized, proposing that small gestures of reconciliation could be vital for peace.

Mitani reminded us that humans diverged from chimpanzees 6 to 8 million years ago. He cautioned against viewing violence against neighbors as an inherent human trait simply because it is observed in chimpanzees.

“We have evolved,” Mitani asserted. “As a species, we have become increasingly cooperative and socially inclined, often helping not only our neighbors but even strangers. This capability is not shared by chimpanzees.”

Source: www.nbcnews.com

Unlocking the Secrets of ‘Compound X’: A Breakthrough in Eliminating Parkinson’s Disease Proteins from the Brain

Parkinson’s Disease: Neurological Insights and Treatment Advances

Image Credit: Dr. Gopal Murthy/Science Photo Library

A potential breakthrough drug, referred to as Compound X, has demonstrated significant improvements in mobility and balance for mice exhibiting Parkinson’s-like symptoms. This innovative treatment enhances the brain’s waste-processing capabilities, effectively removing toxic protein aggregates. However, the research team has yet to disclose the specifics of this compound.

“For intellectual property reasons we can’t yet reveal its identity, but we recognize that Compound X represents a pivotal advance, potentially the first disease-modifying intervention for Parkinson’s disease,” stated Zhao Yang from Swinburne University of Technology, Melbourne.

Parkinson’s disease affects over 10 million people globally and is characterized by the progressive loss of nerve cells involved in movement control. This degeneration is widely believed to stem from the build-up of a misfolded protein called alpha-synuclein, linked to a malfunction in the brain’s waste-disposal network, the glymphatic system. The new study aimed to determine whether enhancing this system could alleviate symptoms.

To explore this hypothesis, Yang and her colleagues employed a novel mouse model mimicking Parkinson’s disease. This model utilizes repeated nasal administration of misfolded alpha-synuclein, promoting its spread throughout the brain and causing severe motor deficits—more accurately reflecting human Parkinson’s disease compared to traditional models that rely on toxin exposure. Yang showcased her findings at the Oxford Glymphatic and Brain Clearance Symposium in the UK on April 1st.

The team administered weekly doses of alpha-synuclein to 20 mice over four months. After two months, they introduced Compound X—an FDA-approved drug administered four times a week in synergy with methylcellulose, which enhances drug solubility. Preliminary studies indicated that Compound X could increase slow brain waves, known to support glymphatic function, although its specific impact on brain waste clearance warranted further investigation, Yang noted.

The remaining group of mice received only methylcellulose as a control. The progression of Parkinson’s symptoms paralleled early-stage human patients, including alterations in smell and sleep patterns, according to Yang.

Subsequently, all mice underwent a locomotion test involving navigation on a slender rod. Remarkably, 80% of the mice treated with Compound X successfully completed the task, compared to only 10% in the control group.

In another assessment requiring balance on a rotating rod for five minutes, nearly all Compound X-treated mice maintained their position throughout the duration, while the control group averaged just three minutes.
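Taking the rod-walking numbers above at face value, and assuming (the article does not say) an even 10/10 split of the 20 mice between treatment and control, a one-sided Fisher’s exact test suggests the 80% vs 10% difference would be statistically meaningful. A standard-library-only sketch:

```python
from math import comb

# One-sided Fisher's exact test for the rod-walking result quoted above.
# Assumption (ours, not stated in the article): 10 mice per group,
# so 8/10 treated mice succeeded vs 1/10 controls.
n_treated, n_control = 10, 10
succ_treated, succ_control = 8, 1
N = n_treated + n_control
K = succ_treated + succ_control  # 9 successes in total

def hypergeom_pmf(k):
    """P(exactly k of the K successes fall in the treated group) under the null."""
    return comb(n_treated, k) * comb(n_control, K - k) / comb(N, K)

# P(treated successes >= 8) if the drug had no effect.
p_value = sum(hypergeom_pmf(k) for k in range(succ_treated, min(n_treated, K) + 1))
print(f"one-sided p = {p_value:.4f}")
```

Under these assumed group sizes the p-value is well below 0.01, though the real analysis would depend on the actual group allocation.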

Further analyses revealed that Compound X enhanced slow-wave activity during deep sleep and facilitated fluid circulation within the glymphatic system. Notably, the treatment reduced alpha-synuclein aggregates in the mice’s motor cortex by approximately 40% compared with the control group.

“This discovery holds significant potential,” emphasized Duan Wenzhen from Johns Hopkins University, Maryland. “The medical community requires treatments that can decelerate disease progression. Current therapies only alleviate symptoms temporarily, lacking efficacy in altering the disease’s trajectory.”

The research team aspires to obtain regulatory approval for human trials targeting early-stage Parkinson’s patients within the upcoming year. “Our ultimate goal is to provide treatment that addresses the early stages of the disease, where the most significant benefits are realized,” Yang concluded.


Source: www.newscientist.com

Mind-Blowing Sci-Fi Series ‘Miniature Wife’ Starring Elizabeth Banks and Matthew Macfadyen


“Miniature Wife” Starring Matthew Macfadyen and Elizabeth Banks

Credit: Peacock

Miniature figures have long captivated audiences within science fiction and fantasy, a tradition that traces back to Jonathan Swift's Gulliver's Travels. The concept of shrunken characters has graced numerous classic films, from Bride of Frankenstein to modern hits like Ant-Man and Honey, I Shrunk the Kids. The new Peacock limited series, Miniature Wife, pays homage to these films but falls short of being a noteworthy addition to the genre.

Based on Manuel Gonzalez’s 2014 short story, Miniature Wife follows Elizabeth Banks as Lindy Littlejohn, a once-celebrated author turned university professor overshadowed by her scientist husband, Les (Matthew Macfadyen). Lindy’s feelings of insignificance in both her personal and professional life are amplified when she becomes literally small due to Les’s experimental invention, designed to shrink objects to 1/12th of their original size.

Lindy's most pressing dilemma is that Les has not yet found a stable antidote for the shrinking process; his failed attempts often lead to catastrophic results. Complicating matters further, Lindy navigates a plagiarism scandal involving a student's accidental publication of a story under her name in the New Yorker, and she becomes emotionally entangled with Les's colleague Richard (OT Fagbenle), whose affections for her are stronger than hers for him.

Meanwhile, Les strikes a deal with a sinister oligarch (Ronny Chieng) who pressures him to produce an antidote within 30 days, jeopardizing all his work if he fails. The series often drags with dull office politics, including dealings with a demanding scientist named Vivienne (Zoe Lister-Jones) who becomes Lindy's new boss. Subplots concerning their college-age daughter, Lulu (Sofia Rosinski), and Lindy's best friend Terry (Sian Clifford) feel like unnecessary padding, contributing to a scattered narrative.

Creators Jennifer Ames and Steve Turner could have benefited from trimming the episode count; at around 45 minutes each, the episodes awkwardly balance comedy and drama. Miniature Wife showcases Lindy's clumsy adaptations to dollhouse life, coupled with the strains of her rocky marriage to Les amid extraordinary circumstances.

"We all suck," Lulu candidly remarks about the Littlejohn family, and she's not wrong. Both Lindy and Les are portrayed as unlikable individuals whose relationship deteriorates under pressure. While this could work in a dark comedy context, attempts to depict the Littlejohns as a couple worth rooting for fall increasingly flat. Banks and Macfadyen lack chemistry, and Macfadyen's mannered performance rarely lands as genuine emotion.

As a science fiction piece, Miniature Wife is convoluted, filled with complex jargon that ultimately lacks substance. Its special effects struggle to match even the simpler visuals of Lily Tomlin's 1981 comedy The Incredible Shrinking Woman. Les claims to have created "a little monster," yet he has only triggered mild annoyances.


Source: www.newscientist.com

How Early Humans Revolutionized Their Toolkits 200,000 Years Ago: Key Changes and Innovations

Changes in predator populations may have driven early humans to develop innovative tools

Raul Martin/MSF/Science Photo Library

Approximately 200,000 years ago, a decline in megafauna may have compelled early humans to transition from heavy stone tools to more lightweight hunting kits designed for smaller prey. A recent study supports the notion that this change in hunting strategy could have sparked a rise in cognitive capabilities among our ancestors.

For over a million years, various early human species relied on heavy stone tools such as handaxes, cleavers, scrapers, and stone balls. These robust tools were essential for hunting and butchering large herbivores, including extinct relatives of modern elephants, hippos, and rhinos.

Between 400,000 and 200,000 years ago, archaeological evidence shows a notable increase in smaller, sophisticated tools alongside the fading of traditional heavier tools. Our species, Homo sapiens, emerged during this timeframe.

Circa 200,000 years ago, heavy stone tools vanished from the archaeological record of the Levant, while diverse, lightweight stone toolkits, such as blades and precision scrapers, became more common.

Research led by Vlad Litov, a professor at Tel Aviv University, revealed a correlation between these technological advancements and a significant decline in large herbivores, potentially due to overhunting.

The researchers analyzed archaeological findings from 47 sites across the Levant, spanning the Paleolithic period, which lasted from around 3.3 million years ago to 12,000 years ago. Their analysis of dated stone artifacts in relation to animal remains uncovered a compelling trend.

Findings indicate a drastic reduction in the biomass and specimen count of giant herbivores exceeding 1,000 kilograms correlating with the disappearance of heavy tools 200,000 years ago. Conversely, the availability of smaller prey increased alongside more sophisticated small tools.

Supporting the connection between tool technology and prey type, the researchers noted that sturdy stone tools were still in use in regions with abundant large game, such as southern China, until about 50,000 years ago.

Heavy-duty tools and their evolution to lightweight alternatives used by early humans

Vlad Litov et al., Institute of Archaeology, Tel Aviv University

Previous theories suggested that advancements in technology stemmed from increasing intelligence and creativity due to evolutionary pressures. However, Litov and his research team propose a different perspective: reliance on smaller prey may have catalyzed the evolutionary growth of larger brains in modern humans.

“As large herbivores dwindled, humans increasingly depended on smaller prey, necessitating varied hunting strategies, advanced planning, and the implementation of lightweight, intricate toolsets,” states Litov. “This cognitive evolution was a byproduct of adapting to new prey types, rather than the initial driver of this adaptive transformation.”

"There is more to this adaptation than merely prey size," says Ceri Shipton from University College London. He notes preliminary evidence indicating mass hunting of medium-sized ungulates like horses and bison, with signs of enhanced cognitive abilities and planning emerging during the Middle Paleolithic.

Nicolas Tessandier from the French National Center for Scientific Research also maintains some reservations. “Human adaptation to new fauna underscores adaptability rather than mere intelligence,” he posits. “Producing powerful tools for hunting large herbivores was equally astute.”

Litov recognizes that prior research has shown advanced cognitive functions present early in human evolution, notably in the development of Homo erectus around two million years ago. However, he emphasizes that switching from large to smaller prey had major consequences for human society. A single ancient elephant carcass could sustain a group of about 35 hunter-gatherers for months. As these high-calorie resources vanished, reliance on smaller prey reduced the yield per animal.

“Energetically, we had to gather numerous smaller ungulates, such as fallow deer, to replace the loss of one elephant,” explains Litov. This shift likely stimulated diverse cognitive and behavioral changes, including cooperative hunting strategies, advanced techniques, and enhanced social collaboration and organization. “Such adaptations may have contributed to the evolution of larger brains in later species, including Neanderthals and Homo sapiens,” he adds.

“In my view, the decline in large prey familiar to hominins likely intensified competition among groups,” asserts Shipton. “It was probably an iterative process where the reduction of larger prey prompted cognitive shifts that facilitated access to smaller prey.”


Source: www.newscientist.com

Discover the World’s Most Unique Scientific Tourist Attraction: A Hidden Gem!

Feedback from New Scientist

Welcome to Feedback, your go-to source for the latest in science and technology news. If you have suggestions or feedback on topics we should explore, email us at feedback@newscientist.com.

Unique Tourist Attractions: Exploring the Niche

The Earth is vast, populated with a diverse range of interests. Here at Feedback, we have a penchant for niche tourist attractions along America's scenic highways, such as the world's largest collection of the world's smallest versions of the world's largest things.

Recently, science historian Richard Fallon drew our attention to what is likely the world’s only sculpture park dedicated to foraminifera. For those unfamiliar, foraminifera are single-celled organisms, primarily ocean dwellers with hard outer shells. Their fossil record is abundant and detailed, as they are preserved in vast quantities.

Located in Zhongshan, China, this Foraminifera Sculpture Park opened in 2009, and we acknowledge our delayed recognition of it. Nestled in a hillside park, visitors can stroll through 114 large sculptures. Describing these works is challenging without diving into terminology for irregular three-dimensional forms, but fans of Barbara Hepworth’s curvilinear sculptures might find some familiarity here.

On TripAdvisor, the Foraminifera Sculpture Park boasts a 5-star rating, albeit from a single review by a user named Eudyptes, who seems to have a specific fondness for foraminifera sculptures. Eudyptes is the genus of the crested penguins.

We'd love more testimony about this attraction. Unfortunately, our editorial team turned down our request to visit China solely for this purpose, as well as a proposed visit to the Sulabh International Museum of Toilets in New Delhi.

On that note, we invite our readers to share any scientifically inclined sites that might be even more niche. Just to clarify, we are not seeking suggestions for popular attractions like the Icelandic Phallological Museum or London's Vagina Museum. Maybe a unique museum focused solely on moss or Western blot images exists?

Humor in Scientific Research

It's not uncommon for academics to work humor into their paper titles, but humor in abstracts is far rarer. Typically, abstracts summarize a study's key points in about 200 words, ranging from concise brilliance to confusing jargon.

However, physicist Leonard Susskind submitted a paper to arXiv titled “Is time reversal in de Sitter space a spontaneously broken gauge symmetry?” His summary includes an intriguing answer: “Yes, but with a twist: Time reversal is indeed a gauge symmetry, albeit hidden by spontaneous symmetry breaking.”

While the last part might puzzle many, we were particularly drawn to Susskind's acknowledgments, where he thanks his colleagues for ongoing discussions and humorously notes, "I'm almost 86 years old and I can't wait for my readers to catch up." It has earned a place on our list of favorite abstracts, proving that humor can make complex subjects more relatable.

A Missed Opportunity

We owe our readers a heartfelt apology for an oversight. A few weeks back, we critiqued accounting firm PwC’s venture into estimating the moon’s future economy. We expressed skepticism about monetizing lunar assets, but reader Alex Collier raised an intriguing question: Could this entrepreneurial spirit imply the moon is actually made of cheddar?

Share Your Story with Us!

If you have a story to share with Feedback, email us at feedback@newscientist.com. Don’t forget to include your home address. You can find this week’s and past Feedback stories on our website.


Source: www.newscientist.com

Explore ‘Beyond Inheritance’ by Roxanne Kamsi: A Must-Read This Week

Colored abstract silhouette of people in a crowd scene

Trillions of mutations in our cells can change each of us every day

Peter Aprahamian/Getty Images

Beyond Inheritance
by Roxanne Kamsi
Riverhead Books (April 21st)

Of the approximately 30 trillion cells in the human body, around 1% are replaced daily. Unfortunately, this cellular renewal is prone to errors, so numerous new DNA mutations arise in our bodies each day.

“You are genetically a little different today than you were yesterday, and you will be different again tomorrow,” notes Roxanne Kamsi in her book, Beyond Inheritance: A New Understanding of Ever-Mutating Cells and Health.

These mutations can vary from minor changes in single DNA letters to the loss of entire chromosomes. While some mutations are lost when cells die, many accumulate over time. By the end of life, each of your cells could harbor thousands of mutations.

While many may associate these mutations with cancerous growths, Kamsi highlights that non-cancerous mutations can lead to various health issues as well.


Since Darwin, many thinkers have recognized that the forces of evolution must also operate within the body.

Some mutations have visible effects. For example, port-wine birthmarks arise from mutations during development that affect blood vessels, while mutations in skin cells may alter melanin production, creating patches of skin that follow pathways known as Blaschko's lines.

This phenomenon occurs throughout the body and across all developmental stages, demonstrating that we are all mosaics composed of various cellular types. These variations can provide certain cells with distinct advantages.

When blood stem cells divide, one cell remains a stem cell while the other becomes a blood cell. If both divide at the same rate, they maintain a balanced progeny. However, mutant cells that divide more rapidly can skew this balance over time. By age 70, it’s estimated that one in ten individuals may have a predominance of mutant blood cells, increasing risks for heart attack or stroke.

This scenario resembles an evolutionary struggle among cells, where those with even slight growth advantages gradually emerge as dominant. Remarkably, Kamsi points out that post-Darwinian thinkers understood that evolutionary principles operate within our bodies, a notion that was largely forgotten after the advent of modern genetics in the 20th century.

Numerous so-called clonal diseases, including certain cases of endometriosis, illustrate this idea, as uterine cells may grow on other organs. Moreover, many mutations can be elusive to detect, especially in hard-to-study organs like the heart or brain.

However, not all mutations spell trouble. A surprising section of the book reveals how new mutations can correct inherited conditions, with research suggesting that liver cells have evolved mechanisms to tackle issues such as fatty liver disease. Yet, advantageous mutations remain the exceptions, not the rule.

While I have some critiques about the writing style and structure of this book—there are digressions about personal attributes that seem unconnected to the main content—the core message is invaluable. This book synthesizes various studies to convey essential information that should resonate with the medical community and beyond. Our bodies are composed of cells that constantly mutate and compete, often aligning with their own interests rather than our health.

Kamsi asserts, “By abandoning the outdated idea that all cells possess identical DNA and embracing the unsettling reality that each cell’s genetic code varies slightly, we can pave the way for a new era in medicine.”

While I remain skeptical about this new era, the implications are profound. Although Kamsi does not explicitly state it, her work addresses how multicellularity weakens as cellular diversity and selfishness increase—a theme of fragmentation against a backdrop of unity.

Kamsi explains that this accumulation of mutations may be a fundamental cause of aging. Conditions associated with premature aging often correlate with DNA repair issues, which facilitate the rapid accumulation of mutations.

Regardless of whether the influx of selfish mutations is a primary cause or merely a contributing factor to aging, the notion of halting aging remains misguided. While certain drugs may slow mutation accumulation and gene editing may repair some, such interventions are ultimately transient.

Even if organ transplants become commonplace, the brain will face inevitable malfunctions. Research on individuals who died in accidents has revealed approximately 1,500 mutations in each analyzed neuron. The relentless wave of mutations cannot be fully contained.

We cannot stem this tide once we begin life in the womb. Dr. Kamsi posits, “Humans are the first beings to try to shape their own genetic destiny.” However, the logical conclusion remains that the most effective way to extend lifespan would involve redesigning the human genome to significantly decrease mutation rates.

While this may one day be feasible, it’s crucial to note that such new beings would no longer be considered human.

3 More Essential Reads on Inheritance and Change

Power, Sex, and Suicide: Mitochondria and the Meaning of Life
by Nick Lane

Mitochondria, the powerhouse of our cells, were once independent bacteria before merging symbiotically with our ancestors, facilitating complex life. As Lane discusses, their distinct nature continually shapes our lives.

Mutants: On Human Form, Diversity, and Error
by Armand Marie Leroi

Emphasizing our shared condition as mutants, Leroi discusses extraordinary cases such as babies born with single eyes. Sadly, some conditions, like albinism, can be life-threatening, yet they provide insights into our developmental processes.

Old Man’s War
by John Scalzi

Does old age signify the end? Not in Old Man’s War, a thrilling sci-fi adventure. I won’t spoil the plot, but the sequel is equally compelling and a must-read.


Source: www.newscientist.com

Emperor Penguins Face Rapid Decline: Now Listed as Endangered Species

Emperor Penguins at Risk of Extinction by 2100

Stefan Christmann/naturepl.com

Antarctica is witnessing a dramatic decline in two iconic species—the emperor penguin (Aptenodytes forsteri) and the Antarctic fur seal (Arctocephalus gazella), both of which are now classified as endangered on the IUCN Red List.

Meanwhile, the southern elephant seal (Mirounga leonina) has been moved from "least concern" to "vulnerable".

The IUCN Red List is recognized globally as the most comprehensive evaluation of the conservation status of animal, fungal, and plant species.

According to the IUCN, satellite imagery indicates a staggering loss of about 10% of adult emperor penguins (more than 20,000 birds) between 2009 and 2018, and projections estimate that the population will be cut in half by the 2080s.

"We've determined that human-induced climate change represents the most critical threat to emperor penguins," stated Philip Trathan of the British Antarctic Survey, a member of the IUCN Species Survival Commission. "Early spring sea ice collapse is already impacting colonies throughout Antarctica, and further alterations in sea ice will influence breeding, feeding, and molting habitats."

The population of Antarctic fur seals has plummeted by over 50%, dropping from more than 2 million adult seals in 1999 to approximately 944,000 in 2025, primarily due to climate change.

In addition, southern elephant seal numbers have been severely affected by avian influenza, resulting in over 90% mortality among newborns in certain colonies, according to the IUCN.

Sharon Robinson from the University of Wollongong, Australia, along with colleagues, highlighted in 2022 that emperor penguins are among Antarctica’s most endangered species, potentially facing extinction by 2100.

“Global warming, which warms the oceans and melts sea ice, is eradicating the breeding grounds essential for successful reproduction of emperor penguins,” Robinson noted. “Like many birds and mammals, penguin chicks require safe environments for development, yet human activities are swiftly dismantling these critical habitats.”

Robinson, along with Dana Bergstrom from the University of Wollongong, also stressed the urgent need for attention. The 2025 survey offered alarming updates on the plight of emperor penguins and fellow Antarctic species.

"Of over 60 known emperor penguin colonies around the coastline, about half have exhibited increased reproductive failure or complete loss of breeding success due to early fast-ice loss since 2016, with 16 colonies affected more than once," Bergstrom explained. Fast ice is sea ice that is anchored to the coast or seabed.

"This adds to the already dire situation on the Antarctic Peninsula, where premature sea ice collapse has led to chicks drowning," she stated.

The fate of the emperor penguin is “inextricably linked to climate policy,” according to the World Wildlife Fund. “To mitigate severe impacts, it’s critical to transition from fossil fuels and restrict global temperature rise to as close to 1.5°C as feasible,” WWF emphasized.


Source: www.newscientist.com

Creating a Comprehensive Cancer Data Library: A Step-by-Step Guide by Sciworthy

Computational cancer researchers who apply machine learning face a paradoxical challenge: vast amounts of data are available for training models, yet that training is hindered by inconsistent data formats, structures, and properties. Consequently, when scientists use different cancer types and data-cleaning procedures, the resulting models can yield vastly different outcomes.

Researchers have identified the disparity between available and usable datasets as a considerable obstacle for scientists lacking specialized bioinformatics training. Furthermore, varied processing strategies make it difficult to equitably compare new machine learning techniques and identify the most effective method for specific cancer research tasks—such as classifying patient samples into benign or malignant categories.

To address this issue, a collaboration between researchers in Japan and the United States has resulted in the development of a comprehensive database tailored for machine learning applications. This database, named MLOmics, encompasses genetic and molecular information from over 8,000 cancer patients. Similar to a well-organized library, MLOmics offers cancer data that can be directly utilized by computer models, eliminating the need for extensive preprocessing.

In constructing MLOmics, the team gathered patient samples from 32 cancer types sourced from publicly available databases like the Cancer Genome Atlas. Data collection covered four distinct types of molecular information: two forms of gene-expression (transcriptomics) data, data on repeated DNA regions termed copy number variations, and information about chemical DNA tags known as methylation. The team meticulously labeled experimental sources affecting data quality, eliminated contamination from non-human samples, and removed unlabeled values specific to the transcriptomics data.

For the copy number variation data, researchers focused on cancer-specific repeats, identifying and labeling recurrent aberrant repeats along with corresponding genes in those regions. They also adjusted the methylation data to eliminate biases from various experimental platforms. Each processed molecular data type was then assigned a standardized identifier to mitigate discrepancies in naming conventions.

Subsequently, a coding pipeline was established to assess data quality and consolidate each patient’s molecular data types into a unified dataset—an approach known as multi-omics, as it integrates various molecular measurements. The researchers matched each patient’s sample to its relevant cancer type, resulting in an organized dataset suitable for analysis.
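The consolidation step described above can be sketched in a few lines of Python. This is a minimal, standard-library-only illustration of joining per-patient molecular tables on a shared patient identifier; the table contents, column names, and patient IDs are invented for the example and are not MLOmics's actual schema.

```python
# Sketch: merging single-omics tables into one "multi-omics" record per
# patient, keyed on a shared patient identifier. All data here is toy data.
import csv
import io

# Two toy single-omics tables (in practice these would be CSV files on disk).
transcriptomics_csv = """patient_id,gene_a_expr,gene_b_expr
P001,5.2,1.1
P002,3.8,0.7
"""
methylation_csv = """patient_id,site_x_beta,site_y_beta
P001,0.81,0.12
P002,0.64,0.33
"""

def load_table(text):
    """Read a CSV into {patient_id: {column: value}}. Values stay strings."""
    rows = csv.DictReader(io.StringIO(text))
    return {row["patient_id"]: row for row in rows}

def merge_omics(*tables):
    """Inner-join the tables on patient_id into one record per patient."""
    shared = set.intersection(*(set(t) for t in tables))
    merged = {}
    for pid in sorted(shared):
        record = {}
        for table in tables:
            record.update(table[pid])  # columns from every omics layer
        merged[pid] = record
    return merged

multi = merge_omics(load_table(transcriptomics_csv),
                    load_table(methylation_csv))
print(multi["P001"]["gene_a_expr"], multi["P001"]["site_x_beta"])
```

An inner join (keeping only patients present in every table) mirrors the matching step the article describes; a real pipeline would also have to reconcile identifier conventions before merging, as the MLOmics team did.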

The research team developed 20 task-aware datasets across three categories of machine learning problems, providing crucial metrics for model evaluation in each. Their objective was to showcase how other scientists can effectively utilize MLOmics for a range of common tasks.

The first category focuses on classification, including six datasets that assist scientists in training models to categorize samples as malignant or benign. The second category, clustering, incorporates nine datasets that reveal natural groupings among samples based on molecular patterns when predefined labels are absent. The final category, data completion, features five datasets aimed at addressing incomplete molecular data resulting from experimental or technical challenges, showcasing how models estimate or fill in missing values—a common occurrence in real-world scenarios.
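The "data completion" category can be illustrated with the simplest possible baseline: per-feature mean imputation. This standard-library sketch is only a toy stand-in for the model-based imputation methods such datasets are meant to benchmark, and the numbers are invented.

```python
# Sketch: filling in missing molecular measurements with each feature's mean.
from statistics import mean

# Rows of hypothetical expression values; None marks a failed measurement.
samples = [
    [5.0, None, 2.0],
    [7.0, 1.0, None],
    [6.0, 3.0, 4.0],
]

def impute_mean(rows):
    """Replace each None with the mean of its column's observed values."""
    n_cols = len(rows[0])
    col_means = []
    for j in range(n_cols):
        observed = [r[j] for r in rows if r[j] is not None]
        col_means.append(mean(observed))
    return [[col_means[j] if r[j] is None else r[j] for j in range(n_cols)]
            for r in rows]

completed = impute_mean(samples)
print(completed[0][1])  # mean of column 1's observed values: (1.0 + 3.0) / 2 = 2.0
```

Benchmark datasets for this task typically hide known values, let a model estimate them, and score the estimates against the hidden truth, which is how a crude baseline like this one gets compared against more sophisticated methods.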

The MLOmics database is organized into three sections, each offering detailed usage guidelines. The first section includes task-aware cancer multi-omics datasets in comma-separated values (CSV) format, which suits large genomic datasets because languages like Python and R have built-in functions for reading, writing, and analyzing it. The second section offers code files to facilitate model development and application of evaluation metrics, while the final section contains links to supplementary resources to enhance biological analyses and ensure the database is accessible to all researchers, regardless of their educational background.

In conclusion, the researchers assert that MLOmics represents a vital resource for the cancer research community, enabling researchers to concentrate on developing superior algorithms instead of data preparation. They highlight the accessibility of MLOmics for non-specialists and its support for interdisciplinary and broader biological research. The team is committed to continuously updating MLOmics with new resources and tasks to align with advancements in the field.



Source: sciworthy.com

Discover How Neanderthals Hunted Turtles for Tools, Not Meals

Recent research from Germany reveals that Neanderthals captured the European pond turtle (Emys orbicularis) approximately 125,000 years ago, likely valuing its shell as a tool more than its modest meat yield.



European pond turtle (Emys orbicularis) beside the leg of a straight-tusked elephant (Palaeoloxodon antiquus). Image credit: Nicole Viehofer / MONREPOS – LEIZA.

Professor Sabine Gaudzinski-Windhauser of MONREPOS and Johannes Gutenberg University Mainz stated, "Recent findings on Neanderthal prey selection reveal a remarkable ecological adaptability, with similarities to the subsistence activities of Homo sapiens in the Upper Paleolithic."

"Their diet ranged beyond traditional medium-to-large mammals like horses, bovids, and deer to include numerous small animals such as leporids, birds, and reptiles, and even the massive straight-tusked elephant (Palaeoloxodon antiquus), which could weigh up to 13 tonnes."

“Additionally, evidence indicates that Neanderthals consumed freshwater and marine resources, including shellfish and crabs, throughout the Mediterranean Basin and southwestern Iberian Peninsula.”

The latest study investigated fragments of a 125,000-year-old turtle shell unearthed in Neumark-Nord, a renowned Paleolithic site in present-day Saxony-Anhalt, Germany.

Utilizing advanced 3D scanning technology, researchers discovered cut marks on the interior surfaces of many of the 92 shell fragments. This indicates that Neanderthals systematically butchered these turtles, severing their limbs, removing internal organs, and thoroughly cleaning the shells.

“Our findings provide the first evidence that Neanderthals hunted and processed turtles beyond the Mediterranean region and north of the Alps,” remarked Professor Gaudzinski-Windhauser.

The researchers believe these European pond turtles were not primarily utilized as a food source.

Professor Gaudzinski-Windhauser added, “Given that the site is rich in large, high-yielding animal remains, this possibility can be virtually dismissed.”

“It seems they had an ample surplus of calories.”

"The pond turtle weighs about 1 kilogram and offers relatively low nutritional value, yet it is fairly easy to catch. Children may have participated in catching them, and their shells could have been crafted into tools."

“Furthermore, they might have been pursued for their taste or potential medicinal properties, a notion supported by subsequent research on indigenous populations.”

“Our results illuminate Neanderthal ecological flexibility and intricate survival strategies that extend well beyond mere calorie maximization.”

The team’s results were published in today’s edition of Scientific Reports.

_____

S. Gaudzinski-Windhauser et al. 2026. Shell Play: Neanderthal Use of the European Pond Turtle (Emys orbicularis) in the Last Interglacial Landscape of Neumark-Nord (Germany). Scientific Reports 16, 8628; doi: 10.1038/s41598-026-42113-x

Source: www.sci.news

Permian Fossils: The Earliest Evidence of Rib-Based Breathing Mechanisms

Paleontologists have discovered remarkable specimens of early reptiles, specifically Captorhinus aguti, dating back 289 million years. These preserved fossils showcase three-dimensional skin coverings, a complete shoulder girdle, rib cages including cartilage, and, astonishingly, protein remains that are nearly 100 million years older than any previously known examples.



Captorhinus aguti. Image credit: Michael Debraga.

The transition from aquatic to terrestrial life marked a key milestone in vertebrate evolution. Early amniotes required new breathing techniques to adapt to the dry environment.

Initially, these early amniotes relied mainly on throat and skin respiration, but as they evolved, later amniotes utilized their ribs and thorax for more efficient lung ventilation.

Due to the rarity of soft tissue fossilization, direct evidence of this evolutionary transition has been limited.

"Captorhinus aguti is a notable lizard-like species that plays a crucial role in understanding the early development of amniotes," stated Ethan Mooney, a doctoral candidate at the University of Toronto.

“Growing over five centimeters long, these reptiles were among the first to explore terrestrial habitats, exhibiting thriving populations at that time.”

Three exceptionally preserved Captorhinus aguti specimens were discovered in a unique cave system near Richards Spur, Oklahoma, encased in fine clay and oil, which revealed unprecedented structural features.

In one specimen, researchers identified a segmented cartilaginous sternum, sternal ribs, intermediate ribs, and structures connecting the thorax to the shoulder girdle.

This discovery enabled scientists to reconstruct the complete respiratory apparatus for early amniotes for the first time in the fossil record.

Professor Robert R. Reisz, a paleontologist at the University of Toronto and Jilin University, explained, "We propose that the respiratory system in Captorhinus aguti represents an ancestral state of the rib-assisted respiration seen in modern reptiles, birds, and mammals."

“The utilization of thoracic musculature marked an evolutionary innovation that facilitated the terrestrial conquest by the early ancestors of modern reptiles and mammals,” he added.

“This innovation likely spurred the rapid diversification of early amniotes, paving the way for their dominance on land.”

“Such adaptations enabled these creatures to lead a more active lifestyle,” Mooney concluded.

Employing synchrotron infrared spectroscopy, researchers also uncovered remnants of original proteins preserved in bones, cartilage, and skin. These organic molecules, unprecedented in Paleozoic fossils, are approximately 100 million years older than the oldest examples found in dinosaurs.

“The discovery of protein remnants is extraordinary,” remarked Mooney. “It significantly enhances our understanding of soft tissue preservation in the fossil record.”

This groundbreaking finding is detailed in a recent paper published in the journal Nature.

_____

R.R. Reisz et al. Mummified Early Permian Reptiles Reveal Ancient Amniote Breathing Apparatus. Nature, published on April 8, 2026; doi: 10.1038/s41586-026-10307-y

Source: www.sci.news

How Disappearing Giant Animals May Have Triggered the Stone Tool Revolution

Early Humans Tool Evolution

Evolution of Tools: Early Humans Innovate for Smaller Prey

Raul Martin/MSF/Science Photo Library

A notable decline in megafauna populations approximately 200,000 years ago prompted ancient humans to pivot from robust stone tools to lighter, more versatile hunting kits for capturing smaller animals, according to a groundbreaking study. This research bolsters the theory that the shift to hunting smaller prey played a pivotal role in enhancing the cognitive abilities of early humans.

For over a million years, diverse early human species relied on heavy stone toolkits, including handaxes, cleavers, scrapers, and stone balls. Evidence indicates these tools specifically targeted large herbivores, such as now-extinct relatives of elephants, hippos, and rhinos.

Between 400,000 and 200,000 years ago, the emergence of smaller, advanced tools coincided with the disappearance of heavier implements. Our species, Homo sapiens, emerged during this transformational period.

About 200,000 years ago, heavy tools vanished from archaeological records across the Levant, while the quantity of sophisticated, lightweight stone toolkits—such as blades and precision scrapers—increased significantly.

Research led by Vlad Litov at Tel Aviv University establishes a compelling connection between these technological advancements and the dramatic decline of large herbivorous mammals, likely caused by overhunting.

Researchers meticulously cataloged archaeological evidence from 47 Paleolithic sites—covering 3.3 million to 12,000 years ago. Cross-referencing stone artifacts with animal remains revealed a distinct pattern.

The findings show a marked decline in large herbivores exceeding 1,000 kilograms, coinciding with the disappearance of the heavy stone tools 200,000 years ago. Conversely, the presence of smaller prey and innovative small tools rose significantly.

Supporting the correlation between tool types and prey availability, previous research indicates durable stone tools persisted in areas like southern China—where large game remained abundant—until about 50,000 years ago.

Comparative Analysis: Heavy Stone Tools vs. Lightweight Tools

Vlad Litov et al., Institute of Archeology, Tel Aviv University

Previously, it was posited that technological advancements were driven by an inherent rise in intelligence among humans, potentially influenced by unknown evolutionary pressures. However, Litov and his colleagues suggest that the reliance on smaller prey may have been a significant factor in the brain’s evolution across modern humans.

“As megaherbivores dwindled, humans increasingly turned to smaller prey, demanding novel hunting methods, enhanced planning capabilities, and the use of more intricate, lighter toolkits,” states Litov. “This cognitive evolution was thus a response to new adaptive needs, rather than its initial driver.”

“It’s essential to consider more than just prey size,” states Ceri Shipton from University College London. He points to evidence of mass hunts of medium-sized ungulates like horses and bison, indicating that cognitive developments and advanced planning were already occurring during the Middle Paleolithic period.

Nicolas Teyssandier from the French National Center for Scientific Research adds a critical perspective. “Human adaptations to new fauna reflect resourcefulness rather than sheer intelligence,” he explains. “The development of effective technologies for hunting large herbivores was equally strategic.”

Litov acknowledges that earlier studies demonstrate cognitive abilities in ancient hominins, particularly Homo erectus specimens dating back around 2 million years. However, he contends that the transition from large to small prey had far-reaching implications for human development. An ancient elephant carcass could have sustained about 35 hunter-gatherers for an extended period. With this high-calorie resource’s disappearance, relying solely on smaller prey could drastically reduce caloric returns.

“To match the energy yield of one elephant carcass, we had to acquire numerous smaller ungulates, like fallow deer,” Litov explains. This necessity may have spurred cognitive and behavioral transformations, such as enhanced cooperative hunting strategies and better planning, laying the groundwork for increased brain sizes in later hominins like Neanderthals and Homo sapiens.

“In my view, the decline of large prey likely escalated inter-group competition,” notes Shipton. “It’s possible this dynamic created a feedback loop where diminishing large prey spurred cognitive advancements, allowing access to diversified smaller prey.”


Source: www.newscientist.com

First Measurement of Quantum Entanglement in Solid Materials Achieved

The Behavior of Two Different Particles Linked by Quantum Entanglement

Science Photo Library / Alamy

We have a groundbreaking method to measure quantum entanglement in solids, paving the way for significant advancements in quantum technology and fundamental physics.

Researchers have long faced limitations in quantifying quantum entanglement, the phenomenon that correlates the behavior of distant quantum particles. Bell tests, for instance, can assess whether two particles are entangled, and entanglement can be created deliberately in quantum computing setups.

However, detecting entangled particles within a material is far more complex. This capability is critical in developing advanced quantum computing and communication devices that rely on entanglement.

Allen Scheie from Los Alamos National Laboratory, along with his team, has spent years refining this technique, and they have now confirmed its effectiveness.

“We have verified that it works flawlessly, and we’re taking steps to extend its application across various materials,” Scheie stated.

The innovative technique involves bombarding a sample material with neutrons and capturing them with a detector. Since the 1950s, studying the properties of these neutrons has allowed researchers to unveil the arrangement and behavior of quantum particles within substances. Scheie and his colleagues utilized this approach to calculate quantum Fisher information (QFI), a metric that indicates the minimum number of entangled quantum particles necessary to influence a neutron in a detected manner.
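
The sense in which QFI bounds entanglement can be made precise with a standard result from quantum metrology (an illustration, not a formula quoted by the researchers): for any state $\rho$ of $N$ spins that is at most $k$-producible (i.e. entanglement extends over groups of no more than $k$ particles), the QFI with respect to a collective spin operator $J$ obeys

```latex
F_Q[\rho, J] \le N k \quad \text{for any $k$-producible state of $N$ spins,}
```

so an experimentally measured $F_Q/N > k$ certifies that at least $k+1$ particles are mutually entangled. This is why extracting QFI from neutron-scattering data yields a lower bound on the number of entangled particles in the material.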

The research team applied their method to several magnetic materials, including well-documented crystals of potassium copper fluoride (KCuF3). Team member Pontus Laurell emphasized that their findings closely aligned with computer simulations of the quantum structure of these crystals, affirming the reliability of the new approach. “The experimental and theoretical predictions matched surprisingly well,” he stated.

Laurell added that while previous studies explored QFI and similar metrics as potential “witnesses to entanglement,” their group has established a clear, dependable, and broadly applicable measurement technique. Much of their effort focused on perfecting the nuances, enabling experiments with diverse materials, including those suitable for future device development.

Notably, their method remains effective irrespective of whether a robust mathematical model exists for the material, even when the samples are incomplete. “That’s the remarkable aspect: you can measure quantum Fisher information under any circumstances,” Scheie remarked. The research was presented at the American Physical Society Global Physics Summit on March 17th in Denver.

Within the next month, the researchers aim to extend their methodology by measuring QFI in materials approaching a quantum phase transition, the quantum analogue of water freezing into ice. At such a transition, theoretical models often falter or predict skyrocketing entanglement, creating a prime opportunity for groundbreaking quantum discoveries, according to Scheie.

Topics:

  • Materials
  • Quantum Physics

Source: www.newscientist.com

Slowdown of Major Ocean Currents in the Atlantic: Key Insights and Implications

Visualization of the Western Boundary Current in the Atlantic Meridional Overturning Circulation

Credit: NASA’s Scientific Visualization Studio

The latest buoy measurements indicate that the Atlantic Meridional Overturning Circulation (AMOC), crucial for regulating Europe’s climate, is weakening across four distinct latitudes. This represents the strongest evidence yet that this pivotal ocean current system is slowing and may be nearing collapse.

The AMOC is part of a global oceanic conveyor belt that transports warm, salty water from the Gulf of Mexico to the North Atlantic, helping maintain milder temperatures in Western Europe compared to those in Canada or Russia. As this water cools and sinks, it continues south along the ocean floor on the western side of the Atlantic.

Analysis of historical ocean temperature data suggests a 15% decline in the AMOC since 1950, with computer models predicting a potential collapse within decades. However, direct measurements have existed for only about 20 years, making definitive conclusions difficult.

Recent research in the Western Atlantic has provided compelling evidence of an AMOC slowdown.

“Our findings indicate that Atlantic circulation is indeed weakening at the western boundary, and data from multiple latitudes supports this consistent signal across the broader North Atlantic,” said Qianjiang Xing from the University of Miami, Florida, who led the study.

In 2004, a collaborative effort led by the University of Miami established a series of moorings named RAPID-MOCHA from the Bahamas to the Canary Islands. These measurements, encompassing temperature, salinity, and velocity, allow scientists to estimate pressure differences across the Atlantic, providing insight into how much water is flowing, according to team member Shane Elipot, also from the University of Miami.

Water tends to move from areas of high pressure to those of low pressure, but in the northern hemisphere Earth’s rotation deflects the flow to the right, so currents run along, rather than down, the pressure gradient. Pressure differences across the basin can therefore serve as an index of AMOC strength.

The latest analysis of RAPID-MOCHA data reveals that AMOC flow is decreasing at a rate of approximately 90,000 cubic meters per second each year—a faster decline than previously observed. This indicates that the AMOC weakened by about 10% from 2004 to 2023.
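
These two figures can be checked against each other with simple arithmetic. The sketch below assumes a mean AMOC strength of about 17 sverdrups, a value commonly reported for the RAPID array but not stated in this article:

```python
# Rough consistency check between the reported trend and the ~10% weakening.
# Assumption (not from this study): mean AMOC strength of about 17 sverdrups.
SV = 1_000_000             # 1 sverdrup (Sv) = 1 million cubic metres per second
mean_amoc = 17 * SV        # assumed mean overturning strength, in m^3/s
decline_per_year = 90_000  # reported trend, in m^3/s per year
years = 2023 - 2004        # length of the observation period

total_decline = decline_per_year * years  # cumulative decline, m^3/s
fraction = total_decline / mean_amoc      # decline as a share of the mean flow
```

Under that assumed baseline, the cumulative decline comes out at roughly 1.7 million cubic metres per second, close to one tenth of the mean flow, so the two reported numbers are mutually consistent.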

However, the uncertainty surrounding this reported change is substantial. To address this, the study also examined pressure records from three mooring arrays installed along the western Atlantic boundary: near the West Indies, the U.S. East Coast, and Nova Scotia, Canada. These show considerably lower uncertainty and a more pronounced weakening of the AMOC.

“This represents the strongest direct observational evidence to date of AMOC weakening, aligning with long-held model predictions,” commented Stefan Rahmstorf from the University of Potsdam in Germany, who was not involved in the study.

Scientists speculate that freshwater from the melting Greenland ice sheet is diluting the AMOC’s intensely salted waters, impeding their sinking action and thus weakening the southward flow along the ocean floor of the western Atlantic. The observed declining trends across four latitudes in the Western Atlantic point to this phenomenon.

“We anticipated that these changes would appear at depth along the western boundary,” said team member David Smeed of the UK’s National Oceanography Centre. “This strengthens our confidence in that interpretation.”

“They provide the first robust evidence of a consistent weakening of the overturning across multiple latitudes in the deep western Atlantic,” says René van Westen of Utrecht University in the Netherlands, who did not participate in the study.

Elipot emphasized the need for ongoing observations to clarify whether the AMOC is on the brink of collapse, a scenario that could lead to significantly colder winters in Europe and disrupt monsoon patterns in Asia and Africa.

“This trend suggests we might be approaching a tipping point,” he notes.

Topics:


Source: www.newscientist.com

Stunning Edge-On View of Two Planetary Nurseries Captured by Webb

Stunning new images from the NASA/ESA/CSA James Webb Space Telescope showcase two young stars, Tau 042021 (left) and Oph-163131 (right), encircled by planet-forming disks. This unique perspective provides invaluable insights into the formation of worlds similar to ours.



Composite images of protoplanetary disks Tau 042021 (left) and Oph 163131 (right). Image credits: NASA / ESA / CSA / Webb / Hubble / ALMA / ESO / NAOJ / NRAO / G. Duchêne / M. Villenave.

“Protoplanetary disks emerge around newly formed stars,” stated Webb astronomers.

“As gas clumps collapse within larger molecular clouds, a thick disk of unused gas and dust orbits the newborn star.”

“This dust gradually collides and coalesces, forming planetesimals that can develop into planets over time.”

“Some planetesimals that don’t evolve into full-fledged planets remain as asteroids or comets orbiting the star.”

“Gas not consumed in this process will eventually be expelled by the star’s radiation over millions of years, leading to the disappearance of the protoplanetary disk.”

“This phenomenon explains how our solar system formed, shaping the asteroids, comets, gas giants, and terrestrial planets we recognize today.”

By studying other protoplanetary disks from earlier epochs, we can enhance our understanding of how solar system formation occurs and how various planets throughout the galaxy came into being.

The captivating images of protoplanetary disks Tau 042021 and Oph 163131—designated as 2MASS J04202144+2813491 and 2MASS J16313124-2426281—were captured using Webb’s NIRCam and MIRI instruments.

Tau 042021 lies approximately 450 light-years away in the constellation Taurus, while Oph 163131 is about 480 light-years distant in the constellation Ophiuchus.

“What distinguishes these objects is that their disks are oriented edge-on from Webb’s perspective,” the astronomers explained.

“This alignment blocks most of the bright light from the central young star, allowing the fine dust in the disk to be illuminated by reflected starlight, creating a nebula above and below the disk.”

“The resulting images resemble colorful floating tops in space, providing not only a breathtaking view but also critical data for understanding the organization of planet-forming disks.”

“The dust distribution within and surrounding the disk profoundly influences how and where planets form.”

Source: www.sci.news

Ancient ‘Fossil Octopus’ Reidentified: New Findings Reveal It Was a Nautiloid Relative

Pohlsepia mazonensis has captivated the scientific community as a cephalopod species first identified in 2000 from a remarkable 300-million-year-old specimen. This fascinating creature earned a spot in the Guinness Book of World Records as the world’s oldest octopus. Recent research has led to its reclassification as a distant relative of the nautilus, offering new insights into the timeline of octopus evolution, according to paleontologists.



Depiction of a decaying Paleocadmus carcass at Mazon Creek, alongside various Mazon Creek fauna, including the polychaete Esconites zelus and the elasmobranch shark Bandringa rayi. Image credit: Franz Anthony.

Originally described from an isolated siderite concretion, Pohlsepia mazonensis has been recognized as the oldest known octopus, predating other early octopus fossils by over 150 million years. This revelation raises significant questions about our understanding of cephalopod evolution, according to Dr. Thomas Clements, a paleontologist from the universities of Leicester and Reading.

This intriguing fossil from the Late Carboniferous Mazon Creek Lagerstätte (311 to 306 million years ago) possesses distinct features, including a ‘sack-like’ fused head and mantle, symmetrical fins, a pair of eyespots, and arms with specialized tentacles, yet lacks evidence of an internal or external shell.

In a recent comprehensive study, researchers revisited this enigmatic fossil alongside several new specimens.

Employing advanced analytical methods, they uncovered a previously unrecognized radula, the toothed tongue characteristic of most molluscs.

Analysis of the radula suggests that Pohlsepia mazonensis is more closely aligned with the shelled nautiloids than previously thought.

This organism experienced significant decomposition prior to fossilization, leading to its ambiguous classification for decades.

“We conclude that Pohlsepia mazonensis is synonymous with the nautiloid Paleocadmus pohli, based on morphological evidence,” the researchers confirmed.

This reinterpretation resolves a longstanding mystery in octopus evolution and unveils the oldest preserved nautiloid soft tissue ever documented.

Through synchrotron micro-X-ray fluorescence elemental mapping, the team identified radular teeth concealed within the concretion matrix of Pohlsepia mazonensis.

The morphology of these radular elements indicates that Pohlsepia mazonensis is not a crown octopod but represents the oldest soft-tissue nautiloid fossil discovered to date.

This reclassification removes the case for a Paleozoic origin of octopuses, further supporting a mid-to-late Mesozoic origin for crown octopods, and diminishes the credibility of the coleoid affinity proposed for controversial Cambrian soft-bodied fossils like Nectocaris pteryx.

The findings accentuate the complexities of interpreting exceptionally preserved soft tissue at the Mazon Creek Lagerstätte and underscore the need for thorough reevaluation of enigmatic concretion-preserved soft-bodied fossil material.

The team’s research paper has been published today in Proceedings of the Royal Society B.

_____

Thomas Clements et al. 2026. Synchrotron data reveal nautiloid characteristics of Pohlsepia mazonensis, refuting the Paleozoic origin of octopods. Proc Biol Sci 293 (2068): 20252369; doi: 10.1098/rspb.2025.2369

Source: www.sci.news

Discover the Geometry of a Trumpet-Shaped Single-Celled Microorganism

A fascinating protist species known as the blue spot stentor chooses where to anchor itself by sensing physical shapes. This finding implies that even the most basic life forms can exploit geometry for survival.



blue spot stentor. Image credit: Hokkaido University Physical Behavior Laboratory.

Measuring just 1 mm in length, the blue spot stentor belongs to the protist family Stentoridae.

According to Dr. Shun Echigoya from Hokkaido University, lead author of the study, blue spot stentor exhibit complex behaviors that toggle between free-swimming and anchoring to substrates.

While swimming, the blue spot stentor generates propulsive force with hair-like ciliary structures called membranelles, arranged in a band at its anterior end.

These cells adjust their movement in response to light and chemical signals while exploring their environment.

When settling, the blue spot stentor elongates into a trumpet shape and anchors itself to the substrate using a holdfast at its posterior end.

When anchored, the blue spot stentor uses the membranellar band around its oral apparatus to create feeding vortices that draw in bacteria and small ciliates for food.

However, the researchers noted that being anchored may increase vulnerability to predation.

Thus, selecting anchor points in a varied environment is crucial for the blue spot stentor.

For this study, Dr. Echigoya and colleagues crafted small chambers with controlled shapes, simulating the structures microorganisms encounter in natural aquatic habitats.

Some chambers featured smooth surfaces, while others included narrow spaces imitating edges, angles, and corners.

“We adjusted geometric characteristics such as corner angles and depths to provide varied anchorage options,” Dr. Echigoya elaborated.

“We documented the microorganisms’ behaviors through video recordings and supplemented them with numerical simulations for detailed analysis.”

The researchers observed behavior that was anything but random.

Initially, the cells swam freely, but as they neared the surface, their behavior transformed.

Their bodies became subtly asymmetrical, allowing them to glide along walls using the coordinated beating of their cilia.

Over time, they navigated toward smaller crevices, where they secured themselves to the surface.

“We were surprised by the effectiveness of this minimal strategy,” Dr. Echigoya stated.

The blue spot stentor does not require cognitive awareness of its surroundings; it can interact physically with surfaces simply by altering its shape to find suitable nooks.

“Our findings indicate that even slight physical features in natural environments significantly influence where microorganisms thrive and how they proliferate,” remarked Dr. Yukinori Nishigami, study co-author from Hokkaido University.

“The microscopic landscape is filled with tiny crevices and safe spaces.”

“Possessing the ability to locate and inhabit these protected niches may explain how microorganisms survive, disperse, and establish communities.”

The complete findings are published in Proceedings of the National Academy of Sciences.

_____

Echigoya Shun et al. 2026. Geometric preference of anchor sites in unicellular organisms: blue spot stentor. PNAS 123 (9): e2518816123; doi: 10.1073/pnas.2518816123

Source: www.sci.news

The Ultimate Science Book: Exploring the Frustrations of Watson’s The Double Helix

James Watson’s The Double Helix: A Look at Its Enduring Legacy

There’s a compelling case to be made for The Double Helix, a celebrated science memoir by James Watson, as one of the greatest science books ever written. However, I hesitate to recommend it due to its troubling content, particularly given Watson’s controversial reputation.

According to Nathaniel Comfort from Johns Hopkins University, Watson’s narrative doesn’t just recount scientific progress; it portrays science as a vivid adventure shaped by individual personalities. This compelling storytelling has inspired countless readers to pursue careers in science.

The Double Helix details Watson’s collaboration with Francis Crick on deciphering DNA’s structure between 1951 and 1953, integrating data from Rosalind Franklin and Maurice Wilkins. Yet, Watson’s narrative often distorts the true nature of this collaboration, portraying himself as the primary talent.

Critically, Watson’s account has been scrutinized by scholars. Matthew Cobb, a biologist and science historian, asserts that the book blends fact and fiction misleadingly. Comfort echoes this sentiment, emphasizing that Watson’s work lacks precise boundaries between memoir and novel.

Watson’s villainization of Rosalind Franklin, for instance, reflects a narrative tactic borrowed from Truman Capote’s groundbreaking 1966 work In Cold Blood, which redefined the true crime genre. Cobb argues that Wilkins, not Franklin, was the story’s real antagonist, a role obscured by Watson’s portrayal.

When The Double Helix was released in 1968, Watson’s derogatory comments about Franklin mirrored the prevailing attitudes of that era. Patricia Fara, a historian from the University of Cambridge, recounts how these perspectives were accepted as commonplace within scientific circles at the time.

Today’s audience, however, is rightly disturbed by these views, along with Watson’s general rudeness towards others, which often comes across as immature and unkind.

Comfort posits that Watson’s memoir has been mischaracterized; he suggests it’s comedic in essence, from the opening line to its conclusion. Yet, some scenes, particularly those depicting conflicts with Franklin, might not resonate with modern sensibilities.

Despite Watson’s unflattering self-portrayal as lazy and vain, Comfort insists that this unreliable narration adds complexity to the text. Historians’ investigations reveal that the working relationships between Crick, Watson, and Franklin were closer than Watson suggests.

Regardless of its many flaws, The Double Helix has proven captivating and engaging, achieving the remarkable feat of becoming a bestseller with over a million copies sold.

Cobb acknowledges its significant impact on science and literature, yet queries whether it should truly be classified among the great science books, given its ethical violations and misrepresentations of scientific endeavor.

So, is it worth your time today? Cobb recommends reading it, but suggests viewing it more as a novel. However, be prepared for unlikable characters, as they hardly embody the best of human nature.

Topics:

Source: www.newscientist.com

Groundbreaking Discovery: First Observation of Particles Emanating from Vacuum Space

Particle Collisions Inside STAR Detector of RHIC

Credit: Brookhaven National Laboratory

A groundbreaking discovery involving rare particles formed from high-energy proton collisions may illuminate one of physics’ greatest enigmas: the emergence of mass from empty space. This finding could reshape our understanding of particle mass acquisition.

According to quantum chromodynamics (QCD), the prevailing theory describing the strong forces binding quarks in protons and neutrons, a vacuum is not empty; it teems with transient disturbances in the underlying energy of space, known as virtual particles. These disturbances include fleeting quark-antiquark pairs.

While these pairs typically vanish as soon as they appear, QCD posits that injecting sufficient energy into the vacuum can transform them into real, detectable particles with mass.

The STAR Collaboration, an international group of physicists at the Relativistic Heavy Ion Collider in New York, has successfully observed this intriguing phenomenon for the first time.

By colliding protons, they created sprays of particles, anticipating that some would include quark-antiquark pairs drawn from the vacuum. However, because quarks cannot exist independently, they rapidly combine into composite particles.

Luckily for the researchers, these specific particles reveal clues about their formation. Quarks and antiquarks exhibit correlated spins, reflecting their shared quantum state inherited from the vacuum.

The researchers discovered that this spin correlation remains intact even as the quarks and antiquarks evolve into larger particles known as hyperons, which decay in less than a billionth of a second. Identifying these spin-aligned hyperons following proton collisions confirmed that their constituent quarks originated from the vacuum.

“This is the first time I’ve witnessed the entire process,” remarked Tu Chowdungmin, a member of the STAR collaboration.

“I’m thrilled to see this measurement,” added Daniël Boer of the University of Groningen, Netherlands, who was not part of the research team. He noted that many mysteries still surround quarks, such as their inability to exist in isolation. “This experiment is particularly intriguing for that reason.”

Tu believes this research opens new avenues to directly examine vacuum properties, potentially enabling scientists to investigate how particles acquire mass. QCD theory suggests that quarks gain additional mass by interacting with the vacuum, though the exact mechanisms remain unclear.

Alessandro Bacchetta, a researcher at the University of Pavia in Italy, emphasized that the results are not yet definitive, as reconstructing particle collision events is complicated. Researchers must first rule out alternative explanations that could produce similar signals, he stated.

Topics:

Source: www.newscientist.com

Uncovering Ice Age Dice: How Prehistoric Americans Played Games Before Casinos and Ancient Rome

Archaeologists from Colorado State University have uncovered evidence that Native Americans were crafting dice and engaging in games of chance as far back as nearly 13,000 years ago, predating similar practices believed to have originated solely in the Old World.



Prehistoric Native American dice from various locations: (a, d) Signal Butte, NE (mid-Holocene); (b) Agate Basin, WY (early Holocene); (c, f) Agate Basin, WY (Late Pleistocene); (e, g) Lindenmeier, CO (Late Pleistocene); (h) Irvine, WY (Late Holocene). Image credit: Department of Anthropology, Smithsonian Institution, American Museum of Natural History/University of Wyoming.

“Historians often regarded dice and probability as innovations unique to the Old World,” explains Colorado State University’s Robert Madden, a doctoral student and author of the study.

“Our findings reveal that ancient Native American societies were deliberately producing objects designed for random outcomes, utilizing these results in organized games much earlier than previously believed.”

The earliest artifact identified by Madden originates from the Folsom site, dating between 12,800 and 12,200 years ago.

Unlike modern cubic dice, these were double-sided devices known as binary lots, crafted from bone, either flat or slightly rounded, and typically oval or rectangular in form, designed for easy handling and tossing onto a surface.

Each side of these binary lots was marked differently, distinguished by surface treatments, colors, or other visible alterations, similar to heads or tails on a coin, with one side designated for scoring.

When tossed, these dice would always land with one side facing up, yielding a binary (two-outcome) result.

Scores were determined by the combination of marked faces showing when several lots were tossed together.
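
The mechanics described above can be sketched with a short simulation (an illustration of how binary lots behave, not a reconstruction of any specific game's rules):

```python
import random

def toss_lots(n_lots, p_marked=0.5):
    """Toss n two-sided lots; the score is the number of marked faces showing."""
    return sum(random.random() < p_marked for _ in range(n_lots))

random.seed(1)

# Tally the outcomes of many four-lot throws.
throws = [toss_lots(4) for _ in range(100_000)]
freq = {score: throws.count(score) / len(throws) for score in range(5)}

# With fair lots the scores follow a binomial distribution: the chance of
# exactly two marked faces in a four-lot throw is 6/16 = 0.375, and the
# empirical frequency converges on that value as throws accumulate --
# the law of large numbers the study alludes to.
```

Players tossing such lots repeatedly would experience these stable long-run frequencies directly, which is the sense in which the games tap into probabilistic principles without any formal theory.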

“These tools are simple yet purposeful. They are intentionally made for generating random outcomes, not mere leftovers from bone processing,” Madden stated.

This study also introduces a new morphological test for identifying North American dice in archaeological contexts, moving beyond subjective comparisons.

The test was developed through an analysis of 293 historical Native American dice sets cataloged by ethnologist Stewart Culin in his 1907 work, Games of the North American Indians.

The research reexamines previously collected artifacts, assessing whether they meet the new, objective criteria for dice, allowing for a systematic evaluation of the archaeological record.

Most of these artifacts had been excavated prior but lacked a clear standard for identification, which hampered their inclusion in broader analyses.

By applying this novel approach, Madden identified over 600 probable and diagnostic dice from sites that span significant periods in North American prehistory, from the late Pleistocene through to European contact and beyond.

“Most of these items had already been discovered and documented,” Madden noted.

“What was lacking was a standardized method to recognize these artifacts.”

“Our research does not claim that Ice Age hunter-gatherers practiced formal probability theory,” Madden clarified.

“However, they intentionally made, observed, and utilized random outcomes in repeatable, rule-based scenarios, tapping into probabilistic principles like the law of large numbers. This insight reshapes our understanding of the global evolution of probabilistic thought.”

The study further highlights the extensive range and sustainability of Native American dice games.

Dice artifacts were discovered at 57 sites across 12 regional areas, reflecting diverse cultures and survival strategies from Paleoindian to Archaic and late prehistoric periods.

“The versatility and endurance of these games underscore their cultural significance,” Madden stated.

“Games of chance provided structured, neutral environments for ancient Native Americans, facilitating interactions, trade, alliances, and the management of uncertainty. In this context, they served as essential social tools.”

The study has been published in Ancient History of America.

_____

Robert J. Madden. Pleistocene Probability: The Origins and Antiquity of Native American Dice, Games of Chance, and Gambling. Ancient History of America published online on April 2, 2026. doi: 10.1017/aaq.2025.10158

Source: www.sci.news

How Jupiter’s Powerful Magnetic Field Contributes to Its Numerous Large Moons

Jupiter’s system boasts four large moons, including Ganymede, the largest moon in the solar system, while Saturn’s system is primarily influenced by Titan, its giant moon. Recent simulations indicate that Jupiter’s powerful magnetic field created gaps in its primordial disk, facilitating the capture and retention of significant moons like Io and Ganymede. In contrast, Saturn’s weaker magnetic field has resulted in a more sparsely populated satellite system.



Jupiter’s strong magnetic field creates a cavity in its surrounding disk, whereas Saturn’s weak field leads to a different disk evolution. Image credits: Yuri Fujii / L-INSIGHT / Kyoto University / Shinichiro Kinoshita

“The largest planets in our solar system, Jupiter and Saturn, also have the most extensive satellite systems,” stated Dr. Yuri Fujii, a researcher with Kyoto University and Nagoya University, alongside colleagues.

“Currently, Jupiter is known to have over 100 moons, while Saturn’s total exceeds 280.”

“However, not all these moons are alike. Jupiter’s moon family includes four large bodies, while Saturn’s is heavily influenced by its single large moon, Titan.”

“The disparity between these satellite systems has intrigued astronomers for years, especially since both planets are gas giants.”

“Theories surrounding satellite formation provide various explanations, yet recent studies of stellar magnetic fields suggest a reevaluation of these notions is necessary.”

“A longstanding debate exists regarding magnetic accretion and satellite formation, particularly whether internal cavities in Jupiter’s disk could lead to the accumulation of materials that foster moon formation.”

A comprehensive model that elucidates the differing structures of satellite systems like those of Jupiter and Saturn could be applicable to exoplanetary systems beyond our own.

“Validating planet formation theories is challenging since we rely solely on our solar system. However, numerous satellite systems in proximity offer detailed observational opportunities,” Dr. Fujii noted.

To investigate the thermal evolution of Jupiter and Saturn and track changes in their magnetic fields, researchers simulated the internal structures of the young gas giants.

Additionally, they modeled the circumplanetary disks surrounding both planets and conducted N-body simulations to observe satellite formation and migration.

Results indicated that the structural differences in the satellite systems of Jupiter and Saturn are attributed to their disk compositions, influenced by the strength of their magnetic fields.

Specifically, Jupiter’s robust magnetic field is believed to have formed a magnetospheric cavity, trapping moons such as Io, Europa, and Ganymede.

Conversely, Saturn’s young magnetic field lacked the strength to create a cavity, making it difficult for moons to survive within its disk.

“Our findings suggest that upcoming surveys may discover compact exomoon systems around gas giants, along with several distant moons around Saturn-like gas giants,” the research team concluded.

For more details, refer to the study published on April 2nd in Nature Astronomy.

_____

Yuri Fujii et al. Different architectures from magnetospheric cavity formation of Jupiter and Saturn’s satellite systems. Nat Astron published online on April 2, 2026, doi: 10.1038/s41550-026-02820-x

Source: www.sci.news

NASA Artemis II Crew Suggests Naming Lunar Crater in Honor of Astronaut Reid Wiseman’s Late Wife

The crew of NASA’s historic Artemis II mission honored the late Carol Wiseman, the wife of astronaut Reid Wiseman, by proposing to name a moon crater in her memory. This poignant moment was broadcast live on a NASA livestream.


Canadian astronaut Jeremy Hansen informed mission control on Monday that his team aimed to “honor our mission by naming two craters on the moon.”

One of the craters is named after Carol Wiseman, the wife of Artemis II commander Reid Wiseman, who succumbed to cancer in 2020 at the age of 46.

“We lost a loved one. Her name was Carol, and she was the mother of Katie and Ellie, Reed’s daughters,” Hansen expressed.

He referred to the crater as a “bright spot on the moon.”

“We like to call it Carol,” Hansen noted.

NASA astronaut Reid Wiseman and his late wife, Carol Taylor Wiseman (Wiseman family via NASA). The moon’s craters as seen from the Orion spacecraft on Monday (NASA).

Following Hansen’s heartfelt eulogy, the crew linked arms and floated in zero gravity, with both Wiseman and NASA astronaut Christina Koch visibly emotional.

Carol Wiseman “dedicated her life to helping others as a registered nurse in the Neonatal Intensive Care Unit,” NASA reported.

“Despite his numerous professional accolades, Wiseman views his journey as a single parent as the greatest challenge and most rewarding period of his life,” according to his NASA biography.

Wiseman was named commander of the Artemis II mission in 2023. Prior to the launch, he voiced concerns about the time away from his family that the mission entailed.

“As a single father of two daughters,” he said, “it would be simpler to stay home and watch soccer on weekends, but we have four individuals capable of exploring unique opportunities in our civilization.”

The Artemis II crew also suggested naming the second crater “Integrity,” inspired by the name of their Orion spacecraft.

Following the mission, the naming proposal will be formally submitted to the International Astronomical Union, which will decide on the naming of the crater and its features.

The Artemis II team of four completed the mission’s lunar flyby on Monday, reaching a record distance from Earth. At the mission’s farthest point, the astronauts were approximately 42,752 miles beyond the far side of the moon, surpassing the distance record set by the Apollo 13 crew in 1970.

“By achieving the greatest distance ever traveled by humans from Earth, we pay tribute to the extraordinary efforts and achievements of our predecessors in space exploration,” Hansen communicated to mission control upon confirming the milestone.

“We will continue our journey further into space until Mother Earth brings us back to what we cherish most,” he stated. “But most importantly, we challenge this generation and the next to ensure this record does not last.”

Wiseman, Koch, Glover, and Hansen commenced their journey home, officially exiting the moon’s sphere of influence at 1:25 p.m. Tuesday, approximately 41,000 miles from the moon, NASA confirmed.

After 10 days in space, the crew is set to return to Earth on Friday, splashing down off the coast of San Diego.

Source: www.nbcnews.com

Resolving a Century-Long Debate: The True Nature of Light Explained

Understanding Light as Both Wave and Particle

We now clearly understand that light is both a wave and a particle.

Anna Bliokh/Getty Images

Check out an excerpt from our Lost in Space and Time newsletter, where we highlight intriguing ideas monthly. Sign up here.

In 1937, physicist Clinton Davisson received the Nobel Prize for showing that electrons, once viewed purely as particles, could exhibit wave-like behavior. He famously remarked: “The perfect child of physics […] turned into a two-headed gnome.” The lesson was that waves and particles are not mutually exclusive, with both light and electrons as prime examples.

Davisson was not alone in this contemplation. A decade earlier, Albert Einstein had engaged in a heated debate with Niels Bohr over the perplexing nature of light. Their discourse relied on Gedankenexperiments, or thought experiments, because they lacked the technology to test their ideas in the laboratory. By 2025, however, the scenario Einstein and Bohr had only imagined was finally realized experimentally, demonstrating light’s duality as both wave and particle.

The nature of light has long sparked debate. In the 17th century, the physicist and mathematician Christiaan Huygens championed the wave theory of light, opposed by Isaac Newton’s particle theory. Huygens published his case in the Treatise on Light in 1690, but after his death his wave theory was overshadowed by Newton’s prominence.

In 1801, physicist Thomas Young conducted the iconic double-slit experiment, a key effort to elucidate light’s true essence; to Young’s contemporaries, the result was as if light itself had announced, “I am a wave.” That consensus held until Einstein and Bohr reopened the debate in 1927, revisiting not just the double-slit experiment but the very nature of light itself.

The experiment involved directing light through two narrow parallel slits towards a screen. If light behaved as particles, one would expect to see two distinct light spots. However, Young and later physicists observed a stunning interference pattern—a series of alternating dark and light stripes indicative of wave characteristics, resulting from the constructive and destructive interference of light waves.
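The fringe positions follow from simple path-length differences: in the idealized case of two point-like slits with equal illumination, the intensity on the screen varies as the squared cosine of half the phase difference between the two paths. A quick numerical sketch (the slit separation and wavelength are illustrative values, not figures from Young's setup):

```python
import math

def two_slit_intensity(theta, d, wavelength):
    """Idealized far-field two-slit pattern (point slits, equal amplitudes),
    normalized so the central bright fringe has intensity 1."""
    phase = math.pi * d * math.sin(theta) / wavelength
    return math.cos(phase) ** 2

d = 10e-6     # slit separation: 10 micrometres (illustrative)
lam = 500e-9  # green light, 500 nm (illustrative)

central_max = two_slit_intensity(0.0, d, lam)                      # bright fringe
first_min = two_slit_intensity(math.asin(lam / (2 * d)), d, lam)   # dark fringe
```

The alternating bright and dark stripes correspond to angles where the two path lengths differ by whole and half wavelengths respectively, which is exactly the constructive and destructive interference described above.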

What kept the debate alive more than a century after Young's experiment was Einstein’s reliance on earlier photoelectric experiments, in which light striking a metal surface ejects electrons as if the light arrived in discrete packets. Those results argued for a particle picture of light, even as the double-slit experiment continued to display its wave-like side.

The complexity of quantum theory added another layer, asserting that interference patterns emerged even when single photons traversed one at a time. Scientists found it challenging to conceptualize a single photon navigating through two slits simultaneously, further complicating the understanding of light’s dual characteristics.

Bohr’s solution came through the principle of complementarity, claiming that while photon behavior could be visualized through various experiments, the properties of waves and particles could never be simultaneously observed.

Niels Bohr and Albert Einstein in a historical photo

Alamy

In one thought experiment, Einstein proposed mounting a slit on springs to detect a photon’s passage: a measurable recoil of the spring would reveal the photon behaving like a particle, even as the screen continued to show wave-like interference. He believed this could offer a glimpse of both of light’s faces at once.

Bohr countered using the uncertainty principle, asserting that measuring photon behavior—whether it be momentum or position—would inherently obscure the other property, thus erasing the interference pattern. Their discussions, while unresolved, became foundational in quantum mechanics.
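Bohr's argument can be stated quantitatively. Heisenberg's uncertainty relation bounds how precisely position and momentum can be known together, and the later wave–particle duality relation, a formalization of complementarity due to Greenberger–Yasin and Englert rather than part of the 1927 debate itself, bounds the fringe visibility V against the which-path distinguishability D:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
V^{2} + D^{2} \;\le\; 1 .
```

Perfect which-path knowledge (D = 1) forces V = 0: the fringes vanish entirely, exactly as Bohr maintained.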

According to Philipp Treutlein from the University of Basel, modern physicists consider the debate settled, yet a century passed before experimental validation was achieved. This was largely due to the difficulty of manipulating individual photons, which requires extremely precise experimental conditions. Collaborative efforts from teams at the University of Science and Technology of China (USTC) and MIT have now made it possible to test these phenomena in laboratory settings.

Utilizing ultra-cold setups and advanced measurement techniques, researchers demonstrated the effects of photons on atomic structures, akin to detecting a gentle breeze through rustling leaves. Their experiments confirmed the trade-off Bohr predicted between interference pattern clarity and momentum disturbance, validating the quantum theory’s predictions.

In closing, the latest findings show that photons can manifest wave and particle properties concurrently, a result made possible by advances in modern atomic physics. The ability to observe both aspects of light without the strict either/or exclusion Bohr envisioned has transformed our understanding of light’s nature.

Source: www.newscientist.com

Webb Observations Reveal TOI-5205b: A Carbon-Rich, Oxygen-Poor Atmosphere of a Giant Exoplanet

Astronomers have utilized the Near-Infrared Spectrograph (NIRSpec) on the NASA/ESA/CSA James Webb Space Telescope to analyze the atmosphere of TOI-5205b, an extrasolar gas giant orbiting a dim red dwarf star. These groundbreaking observations reveal that the atmosphere is surprisingly deficient in heavy elements, raising intriguing questions regarding the formation and evolution of such “forbidden” alien worlds.

The Jupiter-sized planet TOI-5205b has an equilibrium temperature of 737 K and orbits at a distance of 0.02 astronomical units from its parent star, TOI-5205. Image credit: Sci.News.

TOI-5205b is a short-period gas giant with only 1.03 times the radius and 1.08 times the mass of Jupiter, completing its orbit in just 1.63 days.

Discovered in 2022, the planet orbits TOI-5205, an M4-type star with approximately 39% of the Sun’s size and mass.
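The quoted 0.02-AU orbital distance follows directly from the 1.63-day period and the star's mass via Kepler's third law; a quick check in Python (constants rounded to four significant figures):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
AU = 1.496e11     # astronomical unit, m

M_star = 0.39 * M_SUN  # TOI-5205, ~39% of the Sun's mass
P = 1.63 * 86400       # orbital period, seconds

# Kepler's third law: a^3 = G * M * P^2 / (4 * pi^2)
a = (G * M_star * P**2 / (4 * math.pi**2)) ** (1 / 3)
a_au = a / AU  # ~0.02 AU, matching the figure caption
```

The semi-major axis comes out near 0.02 AU, consistent with the distance given in the figure caption above.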

The system, also known as TIC 419411415, is located about 283 light-years away.

“Short-period Jupiter-mass planets are among the first exoplanets found around Sun-like main-sequence stars, yet their formation processes are still not fully understood,” explained Dr. Caleb Cañas from NASA’s Goddard Space Flight Center.

“The increasing number of short-period giant exoplanets around M dwarfs adds further complexity to gas giant planet formation theories.”

“These worlds are challenging to form through core accretion due to the low disk masses and longer orbital timescales of M dwarfs, which hinder the efficient creation of the massive planetary cores necessary for runaway gas accretion.”

“These planets exemplify an extreme formation regime for mid-to-late M-type dwarfs since the significant planet-to-star mass ratio demands a core mass exceeding the estimated dust mass of the protoplanetary disk.”

Astronomers used Webb’s NIRSpec to observe three separate transits of TOI-5205b.

To their surprise, they discovered that the concentration of heavy elements in the planet’s atmosphere, relative to hydrogen, is lower than found in the gas giants of our solar system, including Jupiter. Remarkably, it is even less metallic than its host star.

This finding sets TOI-5205b apart from all other studied giant planets.

Furthermore, the observations revealed the presence of methane and hydrogen sulfide in the planet’s atmosphere, corroborating previous findings.

To better understand their results, the researchers employed an advanced model of the planet’s interior, predicting that TOI-5205b’s overall composition is about 100 times richer in metals than its atmosphere.

“We observed a significantly lower metallicity than what models predicted for the planet’s bulk composition, based on measurements of its mass and radius,” noted Dr. Shubham Kanodia of Carnegie Science.

“This suggests that heavy elements migrated to the interior during formation, indicating that the interior and atmosphere are not currently mixing.”

“In essence, our findings imply that the planet’s atmosphere is notably carbon-rich and oxygen-poor.”

For more information on these findings, see the study published in The Astronomical Journal.

_____

Caleb I. Cañas et al. 2026. GEMS JWST: TOI-5205b’s transmission spectroscopy reveals significant stellar contamination and a metal-poor atmosphere. AJ 171, 260; doi: 10.3847/1538-3881/ae4976

Source: www.sci.news

Are Manure Digesters the Ultimate Solution for Reducing Dairy Farm Emissions?

Dairy Farm Digesters: Harnessing Biogas from Cow Manure

Rudmer Zwerver/Shutterstock

During World War II, farmers in Germany and France found an innovative way to secure their own fuel supplies: they covered manure storage pits to capture the methane given off as the waste decomposed. Today, anaerobic digesters, the advanced descendants of that technology, are promoted by governments as a means to mitigate greenhouse gas emissions from dairy farms. However, researchers warn that investments in these digesters may lead to unintended consequences for both climate and public health.

Rebecca Larson from the University of Wisconsin-Madison asks, “Is this funding more effective for climate change mitigation than other strategies, like solar panel installations?” She recognizes that although digesters are effective for livestock emissions, it’s crucial to explore all options.

Agriculture is responsible for approximately one-third of global human emissions, with cow burps contributing one-third of that in the U.S. alone. Industrial dairy farms manage large quantities of cow manure and often flush it into lagoon systems.

The commercial-scale use of digesters began in the 1970s. Now, there are over 17,000 digesters primarily on farms in the European Union, while the U.S. and England each have around 400. In China, millions of small farms use brick digesters to optimize waste management.

When organic waste decomposes anaerobically, it releases not only methane but also carbon dioxide, as happens routinely in sewage treatment plants and waste lagoons. Sealed digesters, however, allow the resulting biogas to be recovered: the gas is produced faster under controlled conditions and can be used for heat, electricity, or even as a natural-gas substitute for vehicles. Capturing the methane this way mitigates emissions of a greenhouse gas significantly more potent than CO2, and the digested waste is then repurposed as fertilizer and bedding.

Following digestion, methane emissions from stored manure can be reduced by up to 91%. Yet a recent analysis of methane emissions from 98 California dairy farms indicates a more complex scenario. Approximately 1.7 million dairy cows are housed in factory farms across the state, which has invested $389 million in digester construction—more than anywhere else in the U.S.

Although digesters reduced point-source methane emissions from 91 kg to 68 kg per hour across two-thirds of the participating farms, emissions spiked temporarily during construction. This anomaly remains unexplained, but may be tied to disturbance of the manure slurry during installation.
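For scale, the point-source figures reported for those farms correspond to a cut of roughly a quarter, noticeably smaller than the up-to-91% reduction quoted for stored-manure methane alone:

```python
before_kg_h = 91  # average point-source methane before digesters, kg/hour
after_kg_h = 68   # after digesters, kg/hour (figures from the California study)

# Fractional reduction in point-source emissions: (91 - 68) / 91 ~= 0.25,
# i.e. about a 25% cut across the farms where emissions fell.
reduction = (before_kg_h - after_kg_h) / before_kg_h
```

The gap between the two figures reflects that digesters address stored-manure methane specifically, while whole-farm point-source measurements also capture sources the digester cannot touch.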

Because they are heated, digesters may produce methane at a faster rate than traditional lagoons, and they can occasionally leak. The study found leakage rates in some cases exceeding 1,000 kg per hour, highlighting a substantial risk to their overall effectiveness.

Alyssa Valdez from the University of California, Riverside, emphasizes that such high leak rates serve as a warning. Even so, California’s leak notification program remedied 20% of identified issues, and studies suggest digesters can still cut methane emissions from stored manure by about half.

“When gas leaks occur, operators incur financial losses, creating an incentive to minimize emissions,” states Angela Bywater from the University of Surrey, UK. However, digesters can also increase ammonia emissions from the digested manure, raising environmental contamination concerns.

The prevailing debate focuses on how aggressively governments should support digesters. California’s favorable policies appear to encourage the growth of factory farms, as incentives are linked to biogas production under the Low Carbon Fuel Standard. According to a preprint study, receiving such incentives can increase dairy herd size by an average of 860 cows.

Brent Kim of Johns Hopkins University warns, “Taxpayer funding that inflates the value of manure may distort market dynamics, making manure more valuable than milk. We should consider viable climate change solutions that don’t inadvertently sustain harmful industry practices.”

Source: www.newscientist.com

Enhancing Brain Detoxification: A New Approach to Migraine Relief

Novel Migraine Research

Innovative Strategies for Migraine Relief

Sergey Khakimullin/Getty Images

About one-third of migraine sufferers get no relief from standard treatments. However, new research suggests that boosting the brain’s waste-clearing system could open up new treatment approaches. A drug typically used to manage high blood pressure effectively cleared chemicals that drive migraines from the brains of mice, and the treated mice subsequently showed minimal facial pain.

Around 60% of migraine patients experience considerable discomfort during episodes.

Globally, approximately 1 in 7 people suffer from migraines. Symptoms include pain, pressure, and tingling in areas such as the cheeks, jaw, forehead, and behind the eyes, often worsened even by light touch. “Just brushing your hair can result in excruciating pain for those living with migraines,” stated Adriana Della Pietra, who presented findings at the Oxford Glymphatic and Brain Clearance Symposium in the UK on April 1.

Conventional migraine treatments, including triptans, aim to reduce inflammation and lower levels of calcitonin gene-related peptide (CGRP), a neuropeptide that is a key driver of migraine and the target of many standard treatments. “Unfortunately, many individuals do not respond to these medications and are frequently trapped in a cycle of debilitating pain,” commented Valentina Mosienko from the University of Bristol, UK, who was not involved in the study.

In previous studies, researchers discovered that prazosin, a medication prescribed for high blood pressure, alleviated facial pain caused by traumatic brain injuries in mice. Traumatic injuries can impair the brain’s waste disposal system, known as the glymphatic system, and prazosin enhanced fluid flow from brain cells through this system. Interestingly, it also appeared to benefit some migraine models used as control groups.

To delve deeper, the research team administered prazosin to one group of mice in their drinking water over six weeks, comparing against a control group that received standard water. Subsequently, both groups were subjected to migraines induced by CGRP injections.

After 30 minutes, the researchers touched progressively thicker plastic filaments to the mice’s foreheads. The touch of these filaments is normally not painful, but thicker filaments press harder and are easier to feel. The mice given prazosin tolerated significantly thicker filaments without flinching compared with the controls; Della Pietra noted that the prazosin group behaved much like mice that had never received CGRP injections.

Further analysis revealed that prazosin not only reversed the impairment of the glymphatic system caused by CGRP but also likely enhanced the clearance of CGRP and other pain-transmitting molecules, as reported by Della Pietra.

Research teams are eager to examine whether similar results can be replicated in humans. “If it proves effective in humans, that would be a tremendous breakthrough,” Mosienko added. “Since this drug is already in use, we have established safety for its application.”

Source: www.newscientist.com