Unforeseen vaccine side effects: Staying sharp is a bonus!
Joseph Polc / Alamy
Recent studies indicate that chronic inflammation in various body areas could contribute to Alzheimer’s disease. While it may take time to fully understand these connections, it’s evident that persistent inflammation has adverse effects and that reducing it can provide numerous health benefits.
Inflammation is the body’s immune response to injury or infection, such as when a wound becomes infected. While short-term inflammation is beneficial, prolonged inflammation can lead to serious health issues, including cancer, heart disease, stroke, rheumatoid arthritis, and mental health disorders like depression and anxiety.
To combat long-term inflammation and enhance both physical and mental health, consider the following tips:
1. Get Vaccinated
Vaccines, including those for shingles, tuberculosis, and influenza, have been linked to a reduced risk of dementia. For instance, individuals who received the Shingrix vaccine had a 17% lower chance of developing dementia than those who had the older Zostavax vaccine, which itself lowers dementia risk. Though the exact mechanism remains unclear, vaccines likely reduce inflammation.
2. Maintain Good Oral Hygiene
Gum disease is another inflammatory condition linked to an increased risk of Alzheimer’s and heart disease. Bleeding gums can allow harmful bacteria to enter the bloodstream, which is why good dental hygiene is essential for preventing periodontal disease and maintaining overall health.
3. Embrace a Mediterranean Diet
A Mediterranean diet is rich in anti-inflammatory foods such as fruits, beans, nuts, whole grains, fish, and olive oil, while minimizing inflammatory foods like red and processed meats. This dietary approach not only helps in reducing inflammation but is also associated with longevity and overall wellness.
4. Exercise Regularly
Sedentary lifestyles contribute to increased inflammation. Numerous studies suggest that regular exercise diminishes inflammation. Whether it’s vigorous workouts or gentler activities like yoga, incorporating movement into your routine can provide significant health benefits.
5. Achieve a Healthy Weight
Although the connection is still being explored, obesity is often linked to ongoing inflammation. It raises an interesting question: could GLP-1 receptor agonists, medications often used for weight loss, reduce Alzheimer’s risk? Current evidence shows that people using GLP-1 medications may have a lower dementia risk, but results for those without diabetes are still unclear.
6. Cultivate Happiness
While occasional stress is normal, chronic stress can lead to inflammation. Striving for happiness and emotional balance can help mitigate inflammation and improve overall mental well-being.
Saturn’s moon Enceladus: A Prime Candidate in the Search for Extraterrestrial Life
Credit: NASA/JPL/Space Science Institute
A revolutionary method for detecting chemical properties of living organisms could unlock the secrets to identifying extraterrestrial life forms, even those with biochemical processes distinct from life on Earth.
In the quest for extraterrestrial life, scientists traditionally depend on biosignatures—substances or patterns that reliably signify the presence of life. By analyzing the atmospheres of distant planets, astronomers search for molecular biosignatures. However, many molecules associated with life can also arise from geological activity, so they must be interpreted with caution.
A novel test developed by Christopher Carr and colleagues at Georgia Tech focuses on amino acids, the fundamental building blocks of the proteins that sustain all known life. Amino acids can also be produced in lifeless environments: they have been found in lunar soil, comets, and meteorites.
Given this, Carr and his team proposed that analyzing the reactivity of molecules within samples could provide more reliable biological indicators than merely detecting amino acids.
In non-living systems, molecules are continuously formed and destroyed as they react with environmental factors like cosmic rays. The more reactive a molecule, the more likely it is to decompose. “Without stable systems to maintain molecules, their reactivity increases,” explains Carr. Living systems, by contrast, depend on reactive molecules and actively maintain them, which creates a distinct biochemical signature.
The reactivity of a compound hinges on the arrangement of electrons in the molecule. More reactive molecules exhibit a smaller energy gap between their highest occupied electron orbital and the lowest unoccupied one, the gap an electron must cross during a reaction.
Carr and his team calculated energy differences for 64 amino acids, including those not present in Earth’s biosphere. They analyzed the prevalence of these amino acids in samples sourced from both abiotic processes (like meteorites and lunar soil) and biotic sources (like fungi and bacteria), employing molecular energy calculations to establish a statistical framework for amino acid reactivity. This allowed them to estimate the probability of a sample being alive or inorganic.
After testing over 200 living and nonliving samples, they found their method could accurately identify life with 95 percent certainty. “This approach is remarkably straightforward,” Carr asserts. “It’s easily explainable and directly linked to the principles of physics.”
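The team’s statistical framework is not spelled out in the article, but the core idea, comparing a sample’s distribution of reactivities against abiotic and biotic reference distributions and turning that into a probability, can be sketched as a simple Bayesian comparison. Everything numeric below (the gap distributions, the sample values) is an illustrative assumption, not data from the study:

```python
from math import exp, log
from statistics import NormalDist

# Hypothetical reference distributions of energy gaps (eV).
# Assumption for illustration: biotic samples are enriched in more-reactive,
# smaller-gap amino acids, so their distribution sits lower than the abiotic one.
abiotic = NormalDist(mu=7.0, sigma=0.8)
biotic = NormalDist(mu=5.8, sigma=0.6)

def classify(gaps, prior_biotic=0.5):
    """Posterior probability that a sample is biotic, via Bayes' rule."""
    ll_biotic = sum(log(biotic.pdf(g)) for g in gaps)    # log-likelihood if alive
    ll_abiotic = sum(log(abiotic.pdf(g)) for g in gaps)  # log-likelihood if not
    num = exp(ll_biotic) * prior_biotic
    den = num + exp(ll_abiotic) * (1 - prior_biotic)
    return num / den

sample = [5.6, 5.9, 6.1, 5.7, 6.0]  # made-up gap measurements
print(f"P(biotic | data) = {classify(sample):.3f}")
```

With gaps clustered near the “biotic” distribution the posterior approaches 1; shifting the sample toward larger, less reactive gaps pushes it toward 0.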
This reactivity-based method is applicable to the search for extraterrestrial life, as Carr posits that if life exists elsewhere, it likely relies on carbon-based chemistry and amino acids, governed by the same principles of chemical reactivity present on Earth. “Life inherently requires control over the timing, methods, and locations of molecular interactions. Therefore, structures that facilitate electron flow and molecular interactions are essential,” Carr notes.
While utilizing molecular reactivity to identify life isn’t new, measuring reactivity through statistical distributions is an innovative advancement. Henderson Cleaves from Howard University suggests that this method could enhance the toolkit of life-detection instruments on forthcoming space missions to Mars or the moons of Saturn, most notably Enceladus. However, Cleaves notes that the technology to accurately measure molecular abundance is a significant challenge.
Exploring the Mysteries of the Universe: Cheshire, England
Embark on a weekend with some of the brightest minds in science, diving deep into the mysteries of the universe, featuring a tour of the iconic Lovell Telescope.
Two remarkable species of marsupials, long considered extinct and previously known only from fossil records, have been rediscovered alive in New Guinea. This groundbreaking finding is the result of a collaborative effort involving scientists, indigenous communities, and citizen scientists.
The confirmation of the pygmy longfinger possum and the ring-tailed glider as living animals marks a significant moment: it is the first evidence of these creatures in more than 7,000 years. The announcement was made by Bishop Museum, based in Honolulu.
“As both a scientist and conservationist, it’s incredibly fulfilling to confirm their existence. This opens a new chapter in our journey to learn about and protect these fascinating animals,” stated Dr. Kristofer Helgen of Bishop Museum.
For the past two years, Helgen and Dr. Tim Flannery of the Australian Museum have been dedicated to verifying the existence of these elusive mammals.
These two animals are categorized as “Lazarus species,” a term for species that re-emerge after being presumed extinct. “The discovery of two Lazarus species thought to be extinct for millennia is truly unprecedented,” Flannery noted in a press release.
Helgen believes this rediscovery underscores the idea that “extinction is avoidable.”
“This discovery offers a message of hope and a testament to second chances,” he added.
These species were initially discovered through fossils by Dr. Ken Aplin, who unearthed a critical tooth during an archaeological dig in western New Guinea in the 1990s.
When Helgen spotted a photograph of a gliding ring-tailed possum, he identified it as one of Aplin’s supposedly extinct species. Indigenous communities from West Papua’s Tambulo and Maybrat regions provided invaluable assistance by sharing their extensive knowledge of the marsupial’s unique lifestyle, according to a press release.
Recently, scientists confirmed the existence of the pygmy longfinger possum after discovering two preserved specimens at the University of Papua New Guinea.
The survival of the pygmy longfinger possum has been further validated by citizen scientists. Carlos Bokos, a citizen scientist and now co-author of the study, shared a photo of the species on iNaturalist, a global platform for documenting natural science discoveries.
Global warming is accelerating at an alarming rate, now proceeding at twice the speed of previous decades. This means significant climate changes could arrive sooner than anticipated.
Until around 2014, Earth warmed by approximately 0.18°C per decade. Since then, the rate has roughly doubled, to about 0.36°C per decade, according to recent analyses by Stefan Rahmstorf and his team at the University of Potsdam, Germany.
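A decadal warming rate like 0.18 or 0.36°C per decade is simply the slope of a least-squares line fitted to annual temperature anomalies. A minimal sketch on synthetic data (the anomaly values are invented for illustration, not the study’s dataset):

```python
def trend_per_decade(years, anomalies):
    """Ordinary least-squares slope of anomaly vs. year, scaled to deg C per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var * 10  # slope is deg C/year; x10 converts to deg C/decade

# Synthetic anomalies rising at exactly 0.036 deg C/year (i.e. 0.36 deg C/decade).
years = list(range(2014, 2025))
anomalies = [1.0 + 0.036 * (y - 2014) for y in years]
print(f"{trend_per_decade(years, anomalies):.2f} C per decade")  # → 0.36 C per decade
```

Real analyses like Rahmstorf’s must first strip out El Niño, volcanic and solar effects before fitting such trends, which is where the statistical work lies.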
If the current rate of global warming persists, humanity risks breaching the Paris Agreement’s goal of limiting global temperature rise to 1.5°C as early as 2028—much sooner than many forecasts suggest.
“Every fraction of a degree is crucial, amplifying the consequences of global warming manifesting as severe weather events and ecological disturbances,” Rahmstorf states. “With the notable exception of the United States, the global community aims to mitigate and curb the effects of climate change. The current trajectory suggests a worrying acceleration in warming trends.”
Following the unprecedented heat of recent years, climate scientists have been actively debating whether global warming is accelerating further. Natural phenomena like El Niño, however, have complicated efforts to determine whether the observed temperature rises are attributable to climate change or merely transient weather patterns.
Rahmstorf’s analysis is the first to identify a statistically significant acceleration in warming attributable to climate change, with 98 percent confidence.
The collaborative research assessed five distinct global temperature datasets, some indicating even higher temperature spikes. On a 20-year average, global warming may reach 1.5°C above pre-industrial levels this year, according to data from the European Centre for Medium-Range Weather Forecasts.
Warm-water coral reefs are already on the brink of collapse, and exceeding the 1.5°C threshold risks triggering further tipping points, including irreversible melting of the Greenland and West Antarctic ice sheets and dieback of the Amazon rainforest.
Many scientists contend that the recent acceleration in global warming primarily results from the restrictions imposed on sulfur dioxide emissions from shipping in 2020. While harmful to public health, this pollutant previously formed an aerosol mist, shielding the Earth from excess sunlight and cooling the atmosphere.
Because the loss of this sunlight shield was a one-off, the extra push it gave to warming should not repeat, and the pace may ease, though this cannot be confirmed yet, notes Rahmstorf. At the same time, the ongoing shift away from fossil fuels will keep reducing the air pollutants that have masked part of the temperature rise.
Aerosol levels will therefore continue to decline, but another abrupt change on the scale of the shipping rules is improbable. “A gradual easing in warming rates over the next decade is plausible,” he adds.
Alongside the effects of El Niño, the researchers also accounted for volcanic eruptions, which generate sunlight-blocking haze, and for heightened solar radiation during sunspot peaks. After removing these influences, they applied two distinct models to global temperature data. Both indicated a marked acceleration in warming, albeit beginning at different times.
Nevertheless, completely isolating the temperature influences of El Niño, eruptions, and sunspots remains a challenge, cautions Zeke Hausfather of Berkeley Earth in California, raising the possibility that the acceleration has been slightly overestimated. Even so, the evidence strongly supports a quickening pace of change, he says.
“The key take-home message is that while exact figures on the acceleration rate of warming are still pending, there is compelling evidence suggesting it is intensifying,” Hausfather concludes. “We must await additional data over the next few years for clearer insights.”
Imagine showing someone a box and asking them to guess its contents without any hints. This might seem impossible, yet the box’s nature offers crucial clues. For instance, its size implies the contents are smaller, and the material — metal versus cardboard — hints at what it can hold.
Is there a mathematical way to explain how to make educated guesses based on limited information? Indeed, while outcomes like coin flips or dice rolls are random and unpredictable, many scenarios allow us to optimize our guessing strategies using a few clever tools.
These constrained guesses are essentially estimates, a concept with deep historical roots. A remarkable early example comes from the ancient Greek philosopher Eratosthenes, who resided in Alexandria, Egypt, during the third century BC. Using basic principles, he estimated Earth’s circumference with astonishing accuracy. Though his precise method has been lost, subsequent writings enable us to reconstruct it.
Eratosthenes observed that at noon on the summer solstice, the sun was directly overhead in Syene, casting no shadow in the city’s well. Meanwhile, in Alexandria, a vertical pole cast a shadow at an angle of about 7.2 degrees, roughly 1/50th of a full circle. Knowing the distance between the two cities was 5000 stadia, he multiplied by 50 to estimate Earth’s circumference at 250,000 stadia.
Eratosthenes’ geometric approximations can be forgiven; the real challenge lies in determining the length of a stadion, estimated at around 160 meters. That figure yields a circumference of approximately 40,000 kilometers, remarkably close to the modern measurement of 40,075 kilometers. Variations in the stadion, which ranged from about 150 to 210 meters depending on how we interpret Eratosthenes’ work, limit the precision.
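The arithmetic is simple enough to check directly. A quick sketch, taking 7.2 degrees for the shadow angle and 160 meters per stadion (both modern reconstructions rather than certainties):

```python
shadow_angle_deg = 7.2   # Alexandria shadow angle: ~1/50 of a full circle
distance_stadia = 5000   # reported Syene-to-Alexandria distance
stadion_m = 160          # one modern estimate; values of 150-210 m are plausible

# The shadow angle equals the fraction of Earth's circumference between the cities.
circumference_stadia = distance_stadia * 360 / shadow_angle_deg
circumference_km = circumference_stadia * stadion_m / 1000

print(f"{circumference_stadia:,.0f} stadia")              # → 250,000 stadia
print(f"{circumference_km:,.0f} km (modern: 40,075 km)")  # → 40,000 km (modern: 40,075 km)
```

Swapping in a 185 m stadion instead gives about 46,000 km, which is why the stadion’s length dominates the error budget.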
Estimating Earth’s Circumference
Chronicle/Alamy
The key takeaway is that with simple yet logical calculations, we can deduce significant insights, all without a globe in hand. In the 20th century, physicist Enrico Fermi exemplified this art of estimation, playing a pivotal role in the Manhattan Project, which led to the development of the atomic bomb. During the Trinity test, he ingeniously gauged the explosion’s power by dropping small pieces of paper and observing how far the blast wave carried them. His on-the-spot estimate of a 10 kiloton yield was intriguingly close to the accepted figure of 21 kilotons.
Fermi’s knack for educated guesses gave rise to the concept known as the “Fermi problem.” One classic illustration asks how many piano tuners work in Chicago. Starting with a population of around 3 million, you estimate the number of households, the fraction that own pianos, how often a piano is tuned, and how many tunings one tuner can do in a year; chaining these assumptions together leads to a rough answer of about 150 piano tuners.
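The chain of guesses can be written out explicitly. The individual factors below are illustrative assumptions in the spirit of Fermi’s puzzle, chosen to land near the figure of roughly 150:

```python
population = 3_000_000       # Chicago, order of magnitude
people_per_household = 2     # rough assumption
households = population // people_per_household  # 1,500,000

pianos = households // 10    # assume ~1 household in 10 owns a piano
tunings_per_year = pianos    # assume each piano is tuned about once a year

# A full-time tuner might manage ~4 tunings a day over ~250 working days.
tunings_per_tuner = 4 * 250  # = 1,000 tunings per tuner per year

tuners = tunings_per_year / tunings_per_tuner
print(round(tuners))  # → 150
```

Each factor could easily be off by two or three times, but because the errors push in both directions they tend to partially cancel, which is why the final answer is trustworthy to within an order of magnitude.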
The crux of this estimation lies in understanding the limits of its imprecision. While we’ve made numerous assumptions during the process, the errors are likely to balance out. An estimate suggesting a million piano tuners would be almost certainly incorrect.
Fermi estimation serves as a valuable tool for generating initial hypotheses, but as we obtain more information, we can refine our guesses. Returning to the box analogy, if a blue ball with the number 32 is drawn from it, our assumption about the contents shifts. Acknowledging that multiple colored balls are likely, we can utilize the statistics pioneered by Thomas Bayes in the 18th century to quantify this uncertainty.
Portrait of Thomas Bayes
Public Domain
Bayes revolutionized probability by transforming it from a method for understanding randomness into a framework for addressing uncertainty. Bayes’ theorem offers a way to turn observations into evidence, and involves four components: the prior, the evidence, the likelihood, and the posterior.
The prior encodes your initial assumptions. Imagine serving three ice cream flavors (chocolate, strawberry, and vanilla) at a gathering. Initially, you might assume each flavor will be equally popular. However, if the first ten guests all choose chocolate, that prior assumption may need reevaluation.
Evaluating the likelihood of ten consecutive chocolate selections under equal preference assumptions reveals a probability of approximately 1 in 60,000—a strong indicator to revise your original beliefs. Such updates provide a more accurate understanding moving forward.
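The “1 in 60,000” figure is just the probability of ten independent chocolate picks under the equal-preference prior: (1/3)^10 = 1/59,049. A quick check, with an assumed alternative hypothesis (80% of guests prefer chocolate) added to show how Bayes’ rule shifts belief:

```python
# Probability of ten straight chocolate picks if all three flavours are equally liked:
p_equal_pref = (1 / 3) ** 10
print(f"1 in {1 / p_equal_pref:,.0f}")  # → 1 in 59,049

# An assumed alternative hypothesis: 80% of guests prefer chocolate.
p_choc_heavy = 0.8 ** 10

# Bayes' rule with a 50/50 prior over the two hypotheses:
posterior_equal = p_equal_pref / (p_equal_pref + p_choc_heavy)
print(f"P(equal preference | ten chocolates) = {posterior_equal:.4f}")
```

After ten chocolates, the equal-preference hypothesis retains well under 1% of the belief, which is exactly the “strong indicator to revise your original beliefs” the article describes.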
This theorem proves powerful. Referring back to the box example, drawing a colored ball like red ’50’ sharpens the possibilities of what remains inside. Each draw further narrows down our options based on new evidence.
One practical use of Bayes’ theorem appears in spam filters. Early versions used Bayesian inference: they started from the overall fraction of email that is spam (the prior), learned from user-marked messages (the evidence), and weighed how likely certain words were to appear in spam (the likelihood).
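A toy version of such a filter, with made-up word likelihoods rather than learned ones, might look like this:

```python
from math import prod

# Assumed (not learned) per-word likelihoods: P(word | spam), P(word | ham).
likelihoods = {
    "winner":  (0.50, 0.01),
    "meeting": (0.05, 0.40),
    "free":    (0.60, 0.10),
}

def p_spam(words, prior_spam=0.2):
    """Naive-Bayes posterior P(spam | words), treating words as independent."""
    l_spam = prod(likelihoods[w][0] for w in words)
    l_ham = prod(likelihoods[w][1] for w in words)
    evidence = l_spam * prior_spam + l_ham * (1 - prior_spam)
    return l_spam * prior_spam / evidence

print(f"{p_spam(['winner', 'free']):.3f}")  # high: spammy words dominate the prior
print(f"{p_spam(['meeting']):.3f}")         # low: reads like a normal work email
```

Real filters estimate those likelihoods by counting words in messages users have already marked, which is the “evidence” step of the theorem.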
This application illustrates how estimation matters in real-world scenarios, far beyond pure mathematics. In an era of AI tools like ChatGPT, understanding Fermi estimation and Bayesian inference is increasingly vital: such systems can confidently echo the patterns they already hold while under-weighting new evidence. Equip yourself with the skills to make, and update, informed guesses.
Imagine launching from Earth on a clear day; the sky transforms from a bright blue to the deep black of outer space as you ascend. This transition, from vivid blue to the engulfing void, reveals an optical phenomenon caused by sunlight interacting with our atmosphere.
Despite our understanding today of this optical effect, the perception of space at the time was quite different. While Yuri Gagarin is celebrated as the first human in space, the question remains: was he truly the first to experience the vastness of outer space?
To explore this, we must consider the definitions of where space begins. The Fédération Aéronautique Internationale designates the Kármán line at 100 kilometers above Earth, while the U.S. government sets it at 80 kilometers (50 miles). Yet these definitions are somewhat arbitrary, often tailored to specific technological capabilities and aviation standards.
According to the Oxford English Dictionary, space is the “physical universe… beyond the Earth’s atmosphere.” Interestingly, scientific understanding of our atmosphere’s extent has evolved, suggesting it reaches as far as 630,000 kilometers. Future NASA missions, like Artemis II, are set to venture around the Moon, yet even they will fall more than 200,000 kilometers short of that outermost boundary.
While it seems absurd to claim that Apollo astronauts didn’t reach space, we still seek deeper definitions based on historical and cultural perspectives. What truly exemplifies the essence of “space”?
Witnessing the Sky Disappear
One significant boundary marks the moment the atmosphere no longer refracts sunlight, revealing the stark blackness of space. Historically, many Europeans believed in a bright, blue universe, a misconception maintained until scientists corrected it in the 17th century. The first astronauts to witness this darkness shattered centuries of misunderstanding.
In the 1930s, high-altitude balloonists pushed the envelope. In 1935, the U.S. Explorer II balloon, piloted by Albert Stevens and Orvil Anderson, soared to 22.1 kilometers, where they experienced an atmospheric shift. Their descriptions of the horizon hinted at the transition Gagarin would later confirm; nonetheless, they reported a “very deep blue” sky rather than a true black.
In 1956, Malcolm Ross and Lee Lewis ascended to 23.2 kilometers in their Stratolab I balloon and noted that the sky appeared black, a milestone in the pursuit of understanding space. David Simons, in the Manhigh II balloon, likewise witnessed a dark sky at 22.9 kilometers.
While rocket-powered aircraft approached these altitudes, they lacked the prolonged observing time the balloonists had. In 1951, William Bridgeman reached 24.2 kilometers but could barely observe the sky in his brief flight. Conversely, in 1956, Iven Kincheloe flew higher still in the Bell X-2, reporting once again on the intriguing color transformations of the sky.
Confronting the Hostile Sky
Over time, the insight balloonists gained about the transition from blue to black became crucial. David Simons, during his 1957 flight, marveled at the hazy horizon blending with the vast blackness of space. To him, the enclosed gondola was akin to a spacecraft floating amid the void.
Joseph Kittinger’s iconic 1960 parachute jump from 31.3 kilometers illustrated the black, uninviting expanse above. He notably remarked on the hostility of the sky above him, acknowledging the inherent challenges of conquering space—conclusions echoed by Gagarin years later.
Not all spaceflights occur during daylight, yet witnessing the transition from blue to black remains pivotal for astronauts. In 2021, actor William Shatner flew aboard Blue Origin’s New Shepard to 107 kilometers. He encapsulated the moment: “It’s fascinating to see blue color passing by you, then immediately facing blackness.”
While the Kármán line exists as a conceptual framework, the emotional impact of seeing the sky fade remains profound. Those early visionaries, witnessing this transition, forever altered our understanding of what it means to reach space, a claim equally valid for their experiences as it is for Gagarin’s historic flight.
A dust plume from the Sahara Desert is set to arrive in the UK this week, potentially creating stunning sunrises and sunsets as well as what is known as “blood rain.”
Fine dust and sand particles, lifted thousands of miles by winds from North Africa, will contribute to a unique atmospheric phenomenon.
According to the Met Office, “Dust in the air is expected to continue moving across the UK today and into tomorrow. This could lead to hazy skies and, in some cases, a build-up of dust on surfaces such as cars, especially if showers occur.”
During sunrise and sunset, this dust can turn the sky into deep shades of gold and orange.
“Dust particles are highly efficient at scattering sunlight, significantly contributing to the stunning red hues of sunsets,” says Dr. Claire Ryder, an associate professor at the University of Reading who studies mineral dust processes.
“The iron oxides in the dust absorb blue light, further enhancing the red color in the sky.”
These iron oxides can even lead to blood rain, although the sight may not be as dramatic as it sounds.
“Despite the ominous name, this is a simple phenomenon,” explained Ryder. “When rain falls through dust-laden air from the Sahara, it picks up tiny reddish-brown particles, leaving rusty-orange stains on cars, windows, and garden furniture.”
“This explains why your car may appear slightly muddy after the rain showers this week.”
Typically, dust-laden rain in the UK is present in such low concentrations that the droplets appear normal to the naked eye.
You might want to delay that car wash until the weekend to avoid needing multiple cleanings!
Blood rain may not look dramatic, but it can leave a layer of dust on your vehicle – Credit: Getty
This Saharan dust not only affects your car but can also impact air quality, increasing particulate matter, or pollution, in the atmosphere.
With pleasant spring weather, weak winds in the south and east of the UK may cause dust to linger and accumulate in certain areas.
Fortunately, the Met Office assures that there are no significant health concerns, though individuals with pre-existing respiratory conditions might notice the slight degradation in air quality.
While it may seem surprising, it’s not uncommon for Saharan dust to find its way to the UK. When sandstorms in North Africa interact with specific wind patterns, sand travels northwards.
Ryder noted, “Over the next few days, southerly winds ahead of an advancing front will push the dust plume across the UK. Current forecasts suggest that this dust could linger into Sunday night.”
A groundbreaking discovery of a 90-million-year-old fossil in Argentina is reshaping our understanding of the evolutionary history of a unique group of bird-like dinosaurs. This find helps settle a longstanding debate regarding their distribution across the ancient world.
The fossils, detailed in Nature, belong to Alnashetri cerropoliciensis, a member of the alvarezsaurid family. This small dinosaur is characterized by its tiny teeth and stout arms, each ending in a prominent single thumb claw.
While most well-preserved alvarezsaurid fossils have been discovered in Asia, the presence of alvarezsaurids in South America raises intriguing questions, given the vast ocean that separated these continents.
A nearly complete skeleton uncovered at the La Buitrera fossil site in northern Patagonia has provided remarkable evidence regarding this species. This region was also home to primitive snakes and small saber-toothed mammals.
“Going from a fragmentary skeleton to a nearly complete, articulated animal is akin to discovering the Rosetta Stone of paleontology,” stated Peter Makovicky, a professor at the University of Minnesota and the study’s first author.
Unlike their later relatives, Alnashetri had longer arms and larger teeth. This indicates that alvarezsaurids likely reduced their body size before evolving the characteristically small limbs and teeth suited to a diet of ants and termites.
“Our study suggests that alvarezsaurids form a compact group of dinosaurs, with species ranging in size from crows to humans,” Makovicky told BBC Science Focus. “Body size appears to fluctuate within this limited range without a clear trend.”
Peter Makovicky excavates fossilized bones at Patagonia’s La Buitrera fossil site – Photo credit: Minyoung Son, University of Minnesota
This discovery also addresses an intercontinental mystery. A detailed anatomical study of Alnashetri led Makovicky and his team to examine fossil collections globally. “We found other alvarezsaurids hiding in plain sight,” he noted.
“These species, from the Jurassic of North America and the Early Cretaceous of Europe, enhance our understanding of the alvarezsaurids’ widespread presence before the major split between the northern and southern landmasses.”
Approximately 200 million years ago, all of Earth’s continents formed a single supercontinent named Pangea. This landmass gradually fragmented over tens of millions of years, evolving into its current configuration while transporting its fauna along with it.
The research team is preparing additional specimens from the same site, though Professor Makovicky has remained tight-lipped about their specifics. “The new specimens confirm some of our findings regarding size and specialization,” he disclosed. “Beyond that, we have nothing further to announce.”
Astronomers have utilized spectral data from the Hobby-Eberly Telescope at McDonald Observatory to construct the most intricate 3D map of faint cosmic structures dating back 9 to 11 billion years, unveiling galaxies and intergalactic gas previously undetectable by telescopes.
A line intensity map showcasing the distribution of excited hydrogen in the universe approximately 10 billion years ago. The stars denote areas where HETDEX has identified galaxies. The inset simulates the structure after optimizing the data by reducing background noise. Image credit: Maja Lujan Niemeyer / Max Planck Institute for Astrophysics / HETDEX / Chris Byrohl / Stanford University.
“Studying the early Universe reveals how galaxies have evolved into their current forms and the role that intergalactic gas plays in this transformation,” stated Dr. Maja Lujan Niemeyer, an astronomer at the Max Planck Institute for Astrophysics and Ludwig Maximilian University of Munich, and a key member of the Hobby-Eberly Telescope’s Dark Energy Experiment (HETDEX).
“Many objects from this epoch are faint and challenging to observe due to their vast distances,” she continued.
“Through a technique known as line intensity mapping, this innovative map enhances our understanding of these objects, adding complexity and depth to this crucial era of cosmic history.”
Although line intensity mapping is not a novel methodology, this is the first instance it has been employed to visualize Lyman alpha emissions with such exceptional precision across an extensive dataset.
The HETDEX project harnesses the capabilities of the Hobby-Eberly Telescope to catalog over 1 million luminous galaxies to decode the mysteries of dark energy.
What differentiates this project is its extensive measurement scope, equivalent to observing more than 2,000 full moons and amassing a colossal dataset of over 600 million spectra across an expansive area of the sky.
“We leverage only a fraction of our data—approximately 5%,” remarked Dr. Karl Gebhardt, principal investigator of HETDEX and an astronomer at the University of Texas at Austin.
“This leaves significant potential for future research utilizing the remaining data.”
“While HETDEX captures images of the entire sky, only a small subset of the collected data comprises sufficiently bright galaxies for our research,” noted Dr. Lujan Niemeyer.
“These galaxies are merely the beginning. In the vast expanses in between, lies an entire ocean of light awaiting discovery.”
To construct this groundbreaking map, astronomers employed a supercomputer at the Texas Advanced Computing Center to meticulously analyze approximately half a petabyte of HETDEX data.
Using the coordinates of luminous galaxies already detected by HETDEX, they inferred the positions of fainter galaxies and adjacent glowing gas.
Due to the gravitational forces that cause matter to cluster, the existence of one bright galaxy implies the presence of nearby celestial objects.
“This allows us to utilize known galaxy positions as reference points to ascertain distances to fainter celestial entities,” explained Dr. Eiichiro Komatsu, HETDEX scientist and astronomer at the Max Planck Institute for Astrophysics.
“The resultant map emphasizes regions surrounding bright galaxies while providing intricate details of the areas in between.”
“Simulation models exist for this cosmic era, yet they remain hypothetical; they do not represent the actual universe.”
“We now possess a foundational understanding that allows us to verify whether the astrophysics underlying these simulations holds true.”
The findings were published on March 3, 2026, in the Astrophysical Journal.
_____
Maja Lujan Niemeyer et al. 2026. Lyα intensity mapping in HETDEX: the galaxy-Lyα intensity cross-power spectrum. ApJ 999, 177; doi: 10.3847/1538-4357/ae3a98
A groundbreaking discovery of a 7.2-million-year-old femur at the Azmaka fossil site in southern Bulgaria reveals a unique blend of locomotor features, suggesting both quadrupedal and bipedal abilities. The finding comes from a research team led by Professor Madelaine Böhme of the Senckenberg Centre for Human Evolution and Palaeoenvironment at the University of Tübingen. The fossil has been tentatively linked to Graecopithecus, a fossil ape known from fragmentary remains at sites in the Balkans, which some researchers consider a contender for the earliest known member of the human lineage.
Graecopithecus freybergi inhabited the dusty savanna of the Athens Basin 7.2 million years ago. This image from Pyrgos Vassilissis, the site of discovery, shows a southeastern view over the plains of Athens, beneath reddish clouds of Saharan dust. Background features include Mount Hymettus and Mount Lycabettus. Image credit: Velizar Simeonovski.
Researchers regard Graecopithecus as a controversial late Miocene ape fossil, estimated to be around 7.2 million years old.
Some experts speculate that this ancient species could represent the earliest known hominin, potentially predating fossils traditionally linked to early human ancestry in Africa.
The fossil record of Graecopithecus includes a partial lower jaw discovered near Athens, Greece, in 1944, alongside isolated upper premolar fossils from Bulgaria examined in the 2010s.
“This ancestor from 7.2 million years ago is classified within the genus Graecopithecus and may represent the oldest known hominid,” stated David Begun, a professor at the University of Toronto and co-author of the study.
The new study analyzed a nearly complete femur of Graecopithecus unearthed at the Azmaka site.
The newly discovered fossil, recovered from floodplain sediments dating back approximately 7.2 million years, showcases distinctive features. The Azmaka femur’s rounded head is clearly set off from the shaft by an elongated neck, with a diagonally ascending medial edge characteristic of hominids.
While this find does not fully represent the range of adaptations seen in later bipedal species, the angle of the neck axis falls within the lower spectrum observed in modern humans and approaches estimates for early human ancestors such as Orrorin, but remains below the typical angle found in suspensory apes like orangutans.
Researchers suggest that this combination of anatomical features indicates a transitional form of bipedalism that is neither specialized for climbing nor fully adapted for terrestrial life.
Weight estimates based on the dimensions of the femur suggest Graecopithecus weighed approximately 23-24 kg, akin to a small chimpanzee.
Professor Nikolai Spasov of the Bulgarian National Museum of Natural History remarked, “Numerous external and internal morphological traits, such as the elongated neck between the femoral shaft and head, the specific attachment points for the gluteal muscles, and the robust nature of the external bone layer, share similarities with our bipedal hominin ancestors and modern humans.”
“These anatomical features differ significantly from those of tree-dwelling apes,” he added. “Nevertheless, Graecopithecus did not walk in the same manner as modern humans.”
The environmental context of the Azmaka site indicates a scrub and forest savannah near a braided river system, suggesting that early terrestrial bipeds may have evolved outside of jungle habitats.
The authors hypothesize that the descendants of this group might have migrated from Eurasia to Africa during the late Miocene in response to climatic and environmental changes in the eastern Mediterranean, potentially influencing the ancestry of later African apes and hominids.
Whether the Azmaka femur ultimately reshapes the geographic story of human origins remains contingent upon future discoveries.
Yet currently, it provides a rare insight into the origins of upright walking within a landscape characterized by seasonal rivers and open forests, millions of years prior to the emergence of the first widely recognized human ancestors in Africa.
“Graecopithecus exemplifies a pivotal moment in human evolution, representing the transition from arboreal to terrestrial ancestors, akin to those from approximately 12 million years ago, including Danuvius guggenmosi, discovered at the Hammerschmiede site in southern Germany and more recently in East Africa,” emphasized Professor Begun.
“In essence, you could call this a missing link. Graecopithecus is likely a descendant of apes from the Balkans and Anatolia that existed 8 to 9 million years ago, including Ouranopithecus and Anadoluvius, evolving from Western and Central European ancestors.
“Significant climate fluctuations in the eastern Mediterranean and Western Asia resulted in the periodic formation of extensive semi-desert landscapes between 8 and 6 million years ago,” he concluded. “This prompted a dispersal of Eurasian mammals into Africa, laying the groundwork for the contemporary savannah mammal fauna.”
The team’s findings were published in the November 2025 issue of Palaeobiodiversity and Palaeoenvironments.
_____
N. Spasov et al. Early forms of terrestrial bipedal locomotion in hominids during the Late Miocene of Bulgaria. Palaeobiodiversity and Palaeoenvironments, published online November 13, 2025; doi: 10.1007/s12549-025-00691-0
Utilizing advanced X-ray technology, robotics, and artificial intelligence, entomologists have successfully developed interactive digital imagery for 792 ant species across 212 genera.
A detailed Antscan specimen rendering: Eciton hamatum. Image credit: Katzke et al., doi: 10.1038/s41592-026-03005-0.
To create this extensive digital library, researchers at the Okinawa Institute of Science and Technology, led by Julian Katzke, gathered ethanol-preserved ant specimens from museums, partner institutions, and global experts.
The team organized the specimens by species and caste and transported them to the lab. The Karlsruhe Institute of Technology (KIT) in Germany provided cutting-edge X-ray micro-CT scanning, similar to medical CT scans but with significantly higher magnification.
A synchrotron particle accelerator generated a powerful X-ray beam, enabling rapid scanning of a vast array of samples, while a robotic sample changer swapped specimens every 30 seconds.
This sophisticated process facilitated the production of 2D image stacks, essential for constructing 3D models.
Despite the utility of the raw image files, initial renderings of the ant specimens were often distorted, falling short of the realistic models scientists envisioned; artificial intelligence tools were used to refine them.
3D imaging allows for the visualization of internal structures, including muscles, nervous systems, and digestive systems, at a micrometer level of resolution.
These models can easily be animated or integrated into virtual reality environments for purposes spanning research, education, and entertainment.
“If we had conducted this project using a standard lab-based CT scanner, it would have taken six years of continuous operation,” Dr. Katzke explained.
“With the KIT setup, we scanned 2,000 specimens in just one week.”
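Taking the two quoted figures at face value (2,000 specimens in one week at KIT, versus six years of continuous operation on a lab scanner), a rough back-of-envelope comparison; the per-specimen times are extrapolations, not numbers from the study:

```python
# Rough throughput comparison based on the figures quoted above.
# Assumes continuous, uninterrupted operation in both cases.
SECONDS_PER_WEEK = 7 * 24 * 3600
SECONDS_PER_YEAR = 365 * 24 * 3600

specimens = 2000
synchrotron_s = SECONDS_PER_WEEK       # one week at the KIT beamline
lab_ct_s = 6 * SECONDS_PER_YEAR        # six years on a standard lab scanner

print(f"Synchrotron: {synchrotron_s / specimens:.0f} s per specimen")
print(f"Lab CT: {lab_ct_s / specimens / 3600:.1f} h per specimen")
print(f"Speed-up: about {lab_ct_s / synchrotron_s:.0f}x")
```

Under these assumptions the synchrotron setup averages about five minutes per specimen, a roughly 300-fold speed-up over a conventional scanner.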
Professor Evan Economo, a researcher at the Okinawa Institute of Science and Technology and the University of Maryland, remarked, “Without these computational tools, completing this project manually would have been nearly impossible.”
Dubbed Antscan, this initiative could pave the way for future digitization efforts across species well beyond ants.
“The significance of this research extends far beyond ants,” Professor Economo stated. “Once specimens are digitized, we can create libraries that enhance the utilization of biological materials across science labs, classrooms, and even Hollywood studios.”
The team’s study was published in the journal Nature Methods.
_____
J. Katzke et al. High-throughput phenomics of global ant biodiversity. Nat Methods, published online March 5, 2026; doi: 10.1038/s41592-026-03005-0
NASA/ESA/CSA’s James Webb Space Telescope has meticulously scanned around Jupiter’s limb, documenting the mesmerizing aurora as it came into view. This dynamic spectacle arises from charged particles traveling along magnetic field lines and colliding with the planet’s ionosphere, creating a stunning glow.

Utilizing Webb’s Near-Infrared Spectrograph (NIRSpec), researchers captured an intriguing feature of Jupiter’s aurora known as an auroral footprint. These bright luminescent patterns result from interactions between Jupiter’s Galilean moons (Io, Europa, Ganymede, and Callisto) and the planet’s magnetosphere.

Planetary scientists leveraged NIRSpec data to analyze the physical characteristics of the auroral footprints of Jupiter’s innermost Galilean moons, Io and Europa, measuring local temperature and ionospheric density in near-infrared light. They uncovered a previously unseen low-temperature structure centered on Io’s bright spots, characterized by exceptionally high density and likely caused by a significant flux of electrons impacting the upper atmosphere.
Webb’s first spectral measurements of Io and Europa’s auroral footprints reveal unprecedented changes in physical characteristics linked to electron collisions in Jupiter’s atmosphere. Image credits: NASA / ESA / CSA / Webb / NIRCam / Jupiter ERS Team / Judy Schmidt / Katie L. Knowles, Northumbria University.
“Previously, these emissions were measured in ultraviolet and infrared wavelengths solely by their brightness,” stated lead author Dr. Katie Knowles, a researcher at Northumbria University.
“For the first time, we can describe the physical properties of an auroral footprint: the upper atmosphere’s temperature and ion density, which have never been documented before.”
Unlike Earth’s auroras, which primarily result from solar wind, Jupiter’s auroras are influenced by its four major Galilean moons, which generate their own “mini auroras.”
Jupiter’s immense magnetic field rotates every 10 hours, channeling charged particles. In contrast, its moons orbit much more slowly; for instance, Io takes approximately 42.5 hours to complete one orbit.
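Those two periods determine how quickly the co-rotating magnetic field sweeps past Io. A quick sketch of the relative (synodic) period, using the rounded figures quoted above rather than precise ephemeris values:

```python
# Synodic period: how often Jupiter's co-rotating field laps Io.
# Uses the rounded periods from the text, not precise values.
jupiter_rotation_h = 10.0   # rotation period of Jupiter and its field
io_orbit_h = 42.5           # Io's orbital period

synodic_h = 1 / (1 / jupiter_rotation_h - 1 / io_orbit_h)
print(f"The field sweeps past Io roughly every {synodic_h:.1f} hours")
```

This constant lapping, roughly every 13 hours under these assumptions, is what keeps the moon continuously interacting with the field and plasma.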
“The moons continuously interact with the planet’s magnetic field and plasma, driving high-energy particles down magnetic field lines into the atmosphere, forming auroral footprints that trace their orbits around Jupiter,” Knowles explained.
“Jupiter’s auroras are the most potent and persistent within the solar system.”
“Our observations with Webb offer an unprecedented glimpse into how Jupiter’s moons directly affect the upper atmosphere.”
During a 22-hour observation span in September 2023, Webb meticulously scanned around Jupiter’s edge, tracking auroras as they appeared.
Intriguingly, the auroral footprints they captured from Io and Europa did not exhibit the characteristics typical of Jupiter’s main auroras, which are generally hotter and denser.
Instead, researchers discovered a cold spot within Io’s auroral footprint that exhibited significantly lower temperatures and unusually high density compared to typical expectations.
Io is notably the most volcanically active celestial body in the solar system, ejecting approximately 1,000 kilograms of material into space every second, thus replenishing the dense plasma enveloping Jupiter.
This ejected material becomes ionized, forming a toroidal cloud around Jupiter known as the Io plasma torus.
As Io moves through this complex environment, it generates powerful electrical currents that contribute to the brightest regions in Jupiter’s auroras.
The team found that these auroral footprints contained trihydrogen cation densities three times greater than those present in Jupiter’s primary auroras, with some localized areas experiencing density fluctuations of up to 45 times.
“We observed rapid fluctuations in both temperature and density within Io’s auroral footprint occurring within mere minutes,” Knowles noted.
“This indicates that the flow of high-energy electrons impacting Jupiter’s atmosphere is changing at an incredibly fast pace.”
The recorded temperature at the cold spot was only 538 K (265 degrees Celsius, or 509 degrees Fahrenheit), compared to 766 K (493 degrees Celsius, or 919 degrees Fahrenheit) in the surrounding aurora.
This cold spot also contained three times the density of material found in Jupiter’s main aurora.
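Converting the kelvin values measured for the cold spot (538 K) and the surrounding aurora (766 K) across temperature scales, as a quick cross-check of the figures (a minimal sketch, not part of the study):

```python
# Convert the quoted aurora temperatures between Kelvin, Celsius and Fahrenheit.
def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

for label, kelvin in [("cold spot", 538), ("surrounding aurora", 766)]:
    c = kelvin_to_celsius(kelvin)
    f = celsius_to_fahrenheit(c)
    print(f"{label}: {kelvin} K = {c:.0f} deg C = {f:.0f} deg F")
```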
This discovery could have implications extending well beyond Jupiter, posing intriguing questions about other planetary systems.
Saturn’s moon Enceladus similarly generates an auroral footprint on Saturn, leading scientists to suspect that comparable phenomena may occur in that system too.
“This research opens up new avenues for studying not only Jupiter and its Galilean moons but also other giant planets and their satellite systems,” Knowles remarked.
“We are witnessing Jupiter’s atmosphere responding to its moons in real-time, providing insights into processes that may occur throughout our solar system and beyond.”
“This phenomenon was only observed in one of five snapshots, prompting questions: how frequently does this occur? Does it vary? How does it change under different conditions?”
The study is published in the journal Geophysical Research Letters.
_____
Katie L. Knowles et al. 2026. Short-term fluctuations in Jupiter’s moon footprint discovered by JWST. Geophysical Research Letters 53 (5): e2025GL118553; doi: 10.1029/2025GL118553
Representation of Electrons in a Half Möbius Molecule
IBM Research and the University of Manchester
Recent discoveries by chemists reveal an intriguing new molecular structure, which exceeds the complexity of a traditional Möbius strip.
A Möbius strip is a twisted loop with only one side: an object, like an ant, must traverse the loop twice to return to its starting point.
Igor Roncevic and his team at the University of Manchester have uncovered a more complex half-Möbius molecular structure. This breakthrough could revolutionize the manipulation of molecular shapes and topologies for various applications.
“This molecule is entirely novel and unexpected. Not only is it captivating that we have synthesized a molecule with unconventional topology, but we have also verified that such a structure is feasible, which was previously unconsidered,” he states.
To synthesize the molecule, the researchers combined 13 carbon atoms and two chlorine atoms into a ring on a gold substrate at ultra-low temperatures. Utilizing advanced atomic force and scanning tunneling microscopes, they precisely controlled individual atoms and analyzed the electron properties. Here, electrons do not remain rigidly attached but are diffused in a localized region, resembling tiny waves of matter.
The interactions among these electrons induced unprecedented twists within the molecule. A hypothetical quantum particle would need to revolve around the structure four times to return to its starting point.
Researchers demonstrated the ability to toggle the molecular state from left-handed to right-handed or to untwist it through small electromagnetic pulses. This innovation allows chemists to engineer molecular topology on demand.
To comprehend the newly discovered molecule and its potential existence, the researchers employed simulations on classical computers and an IBM quantum computer. Electron interactions are essential for introducing twists in molecules, which are challenging to simulate accurately on traditional platforms. However, quantum computers, built upon interacting quantum entities, can perform these simulations with greater precision, Roncevic notes.
This research illustrates how quantum computing can tackle real-world chemistry challenges, says Ivano Tavernelli at IBM.
“This groundbreaking experiment integrates multiple facets of organic chemistry, surface science, nanoscience, and quantum chemistry,” asserts Gemma Solomon from the University of Copenhagen.
“This is an exciting endeavor that effectively translates abstract topological ideas into the field of molecular chemistry,” adds Kenichiro Itami from RIKEN, Japan, noting the technical significance of the research.
Kim Dong Ho, a professor at Yonsei University in South Korea, highlights the potential applications of shape-switchable molecules in sensor technology, indicating that they could toggle states in response to magnetic fields.
An international team of archaeologists analyzed 85 pottery sherds bearing charred food-crust remains from 13 archaeological sites in Northern and Eastern Europe, dating from the 6th to 3rd millennium BC. In 58 of these sherds they identified various plant tissues, including those of legumes and fruits, as well as herbaceous roots, leaves, and stems. The findings reveal that prehistoric hunter-gatherers exhibited a selective preference for specific plant species and parts, often pairing them with certain animal foods.
Prehistoric Europeans demonstrated careful selection of their plant foods, consciously opting for specific species and combining them with targeted animal foods. This practice may have led to the development of unique tastes, flavors, and textures, facilitated by pottery techniques, thereby motivating their invention and adoption.
Foraging wild plants was a crucial aspect of survival for prehistoric communities. However, direct evidence, including the types of plants foraged and their uses, often remains elusive.
Traditionally, scientists analyze fat residues in ancient pottery to interpret ancient diets. Nonetheless, this method primarily sheds light on animal remains, limiting insights into plant consumption.
In a groundbreaking study, researcher Lara González Carretero from the University of York and her colleagues employed advanced techniques, including microscopy and chemical analysis, to uncover evidence of plant consumption by ancient European hunter-gatherers.
The study examined charred organic residues on 85 pottery sherds excavated from 13 archaeological sites in Northern and Eastern Europe, dating between the 6th and 3rd millennium BC.
This innovative approach identified tissues from various plant parts, including grasses, fruits, leaves, and seeds, frequently found alongside remains of animal foods, particularly fish and other aquatic species.
The specific combinations of ingredients varied by region, likely reflecting local cultural practices and available resources.
This important discovery underscores the significant role of plants and aquatic foods in early European diets.
The results confirm that these communities regularly utilized pottery techniques for meal preparation, each developing their own intricate culinary traditions.
“Our findings reveal that the selection of plant foods was highly selective, with hunter-gatherers favoring specific plant species and parts, often combining them with particular animal foods,” the researchers stated.
“These results also suggest that our understanding of plant processing in pottery may be drastically underestimated if we rely solely on lipid residue analysis.”
The paper is published in the journal PLOS ONE.
_____
L. González Carretero et al. 2026. Selective culinary uses of plant foods by Northern and Eastern European hunter-gatherer-fishermen. PLoS One 21 (3): e0342740; doi: 10.1371/journal.pone.0342740
When Americans die in a plane or train crash, a dedicated independent commission investigates the incident to pinpoint failures and develop strategies to prevent similar tragedies.
In stark contrast, there is currently no similar process in place following deadly floods or hurricanes.
Recently, Rep. Eric Sorensen from Illinois introduced a significant bill aimed at establishing such a review commission to thoroughly investigate weather-related disasters and implement preventive measures for the future.
The proposed legislation, titled the National Weather Safety Commission Act, seeks to establish an independent commission with at least seven members, all possessing relevant expertise in fields like meteorology, social science, and emergency management. The president would appoint them, pending Senate confirmation. The commission is modeled after the National Transportation Safety Board, which investigates all civil aviation accidents and other major transportation incidents.
This proposed commission would have the authority to investigate severe weather events, issue subpoenas for testimony and evidence, and compile reports and recommendations for agencies such as the National Weather Service, Federal Emergency Management Agency, and Army Corps of Engineers.
The idea of a weather disaster review has been a subject of discussion among meteorologists and emergency management professionals for years. However, it gained renewed urgency following the devastating flood in Texas this past July, which resulted in over 130 fatalities, including 27 campers and counselors at a camp along the Guadalupe River.
Caution tape marking the entrance to Hunt’s Camp Mystic on July 7. Brandon Bell/Getty Images File
In the aftermath of the flood, blame quickly shifted among Texas officials, with criticisms directed at the National Weather Service, which was operating with reduced staff due to funding cuts from the previous administration. Questions were raised about the accuracy of rainfall forecasts as well as the effectiveness of local emergency management systems and alert protocols.
“We quickly discovered that political maneuvering was complicating the issue,” Sorensen told NBC News. “We need to implement substantial changes to ensure that a tragedy of this nature does not recur. We will enlist the top experts on an independent board to deliver insights that Congress can use to formulate policies prioritizing public safety.”
As the sole meteorologist in Congress, Sorensen underscores the urgency of this initiative.
“Meteorologists have been advocating for years that we need to enhance our response mechanisms,” Sorensen asserted.
In recent years, Senators Brian Schatz (D-Hawaii) and Bill Cassidy (R-Louisiana) have introduced various proposals aimed at creating similar disaster review boards, and other lawmakers, including former Rep. Katie Porter (D-Calif.) and Rep. Nancy Mace (R-S.C.), have championed comparable legislation. The concept of an independent review body for weather-related incidents dates back to at least 2006.
Currently, the new bill lacks bipartisan sponsorship.
“This situation is a crucial test for us during this administration and in our currently polarized political climate: Can we still achieve bipartisan collaboration? Can we unite across party lines to enact necessary changes?” Sorensen expressed.
Congressman Eric Sorensen (D-Illinois) at the U.S. Capitol on April 10, 2024. Tom Williams/CQ Roll Call (via Getty Images)
Neil Jacobs, former head of the National Oceanic and Atmospheric Administration (NOAA) appointed by President Trump, endorsed the proposal for an independent review panel.
“Accurate data is essential for post-storm evaluations,” Jacobs mentioned during his Senate confirmation following the Texas disaster. “Drawing from my experiences with the NTSB on aviation incidents, I can envision something similar for weather-related catastrophes, as we require comprehensive data to assess what succeeded, what failed, and the efficacy of warning systems.”
Sorensen confirmed he has collaborated on this bill with Jacobs.
“I believe he is the ideal ally to help propel this initiative forward,” Sorensen stated.
Douglas Hilderbrand, executive director of the American Weather Companies Association, an emerging organization focused on weather forecasting and information delivery, is also collaborating with Sorensen on this legislation.
“Weather is fundamentally a bipartisan concern,” Hilderbrand emphasized. “We remain optimistic about this initiative.”
The bill delineates specific types of events qualifying as weather hazards for the commission’s examination, including any disaster declared by the President under the Stafford Act, along with severe weather events resulting in at least 10 fatalities or 100 injuries. Such events deemed “rapidly occurring” mass casualty incidents are included as well.
The commission would vote within 14 days of such an event on whether to initiate an investigation.
The Army Corps of Engineers, Department of Homeland Security (including FEMA), Federal Communications Commission, and NOAA (including the National Weather Service) would be obliged to furnish data and information to the commission upon request.
The commission would have 90 days after a major weather event to draft an interim report, with a comprehensive final report due within 20 months.
Feedback is New Scientist’s popular sideways look at the latest science and technology news. You can submit items you believe may amuse readers to Feedback by emailing feedback@newscientist.com.
Exploring Unconventional Measurements
Since the Golden Retriever became a quirky unit for measuring ice blocks, our feedback inbox has been flooded with examples of unique and often surprising units of measurement.
Craig Downing, who describes himself as “one of those readers who checks the back of every issue,” shared a fascinating insight about the Rideau Canal in his hometown of Ottawa, Canada. Every winter, the canal transforms into the largest ice skating rink in the world, requiring meticulous snow removal for a smooth surface.
According to a statement from the National Capital Commission, “For every centimeter of snowfall, our crew clears 125,000 kg of snow from the skateway, equivalent to the weight of approximately 450 polar bears.”
Craig expressed his confusion: “I usually visualize snow depth in terms of ‘shovel loads’ or ‘knee-deep driveways.’” Moreover, living in Ottawa, he has yet to encounter a polar bear firsthand, limiting his experience with these majestic creatures.
The average polar bear reportedly weighs around 277.8 kilograms (612 pounds). However, sex differences play a significant role: adult males can weigh between 350 and 600 kg (775 to 1,300 lb), while females typically range from 150 to 290 kg (330 to 650 lb), with exceptional males reaching up to 800 kg.
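The Commission’s arithmetic does check out: dividing 125,000 kg by 450 bears gives exactly the average mass quoted above. A throwaway sketch of how far the “polar bear” unit can drift depending on which bear you pick (the example masses are illustrative):

```python
# Back-of-envelope check of the Rideau Canal snow-clearing figure.
snow_kg_per_cm = 125_000   # snow cleared per centimetre of snowfall
quoted_bears = 450

implied_bear_kg = snow_kg_per_cm / quoted_bears
print(f"Implied polar bear mass: {implied_bear_kg:.1f} kg")

# How the unit shifts with bear size (illustrative masses, kg)
for bear_kg in (150, 290, 350, 600):
    print(f"At {bear_kg} kg per bear: {snow_kg_per_cm / bear_kg:.0f} bears per cm")
```

Depending on the bear, a centimetre of snowfall is anywhere from about 200 to over 800 bears, which rather underlines Craig’s point.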
This lack of specification raises questions regarding many unconventional units. Steve Tees submitted a query, stating, “I keep hearing about ‘xxx warehouses’ causing traffic delays. Can someone clarify the size of these warehouses?”
The Sounds We Dread
Various sounds can hinder focus. While nails on a chalkboard are notoriously unpleasant, other common annoyances include loud chewing and vigorous teeth brushing by strangers.
One particularly despised sound is the high-pitched screech produced when adhesive tape is pulled from a surface, which deters many from DIY projects.
But understanding the science behind this noise could help. For example, an experiment published in Physical Review E explores the physics of peeling off cellophane tape. Researchers employed high-speed cameras and microphones to study the tape’s removal speed, discovering that “microscopic cracks travel through the tape at supersonic speeds, producing a shock wave that manifests as a high-pitched screech.”
We eagerly anticipate feedback from follow-up studies aiming to demonstrate quiet tape removal methods.
On Retractions and Their Implications
Our feedback section takes a keen interest in the world of retracted scientific papers. Whether due to questionable graphics generated by AI, manipulated images, or dubious research claims, these cases pique our curiosity.
A prominent example is a 2026 retraction from Pharmacology Research & Perspectives. Originally published in 2022, the paper investigated ivermectin, the anti-parasitic drug controversially labeled a potential cure for COVID-19, and suggested it as a liver cancer treatment. We believe such claims warrant skepticism.
The retraction notice indicated it was made “by agreement” between the authors and the journal, though only after the discussion had dragged on well past its natural end.
It was stated that “the corresponding author was not involved in the submission process, did not sign an open access agreement, and did not review or approve the final manuscript version before submission,” raising serious concerns.
Furthermore, the journal’s investigation uncovered evidence of image duplication from previous publications. This is, without a doubt, troubling.
Yet the authors maintained that “the conclusions of the article are otherwise unaffected,” leaving Feedback musing on how conclusions could remain valid despite such significant discrepancies.
Our interpretation is clear: once a paper is retracted, it loses credibility, and its conclusions are no longer taken seriously.
Have a story for feedback?
You can email your article to Feedback at feedback@newscientist.com. Don’t forget to include your home address. Discover this week’s and past feedback on our website.
Exploring the Medical Potential of Magic Mushrooms
Image Credit: John Moore/Getty Images
A recent placebo-controlled trial has revealed that a single dose of psilocybin, the active compound in magic mushrooms, significantly alleviates symptoms of obsessive-compulsive disorder (OCD). Remarkably, these effects last for at least 12 weeks, suggesting psilocybin could offer enduring relief for OCD sufferers.
“Investing in experiences like travel can disrupt patterns of obsessive thinking and behavior,” notes Dr. David Nutt from Imperial College London, who wasn’t part of the study. “The essence of OCD treatment is to guide individuals towards behavioral change—like reducing the number of times they check the lights from 15 to 2.”
Approximately 1-3% of the population suffers from OCD, a condition marked by distressing obsessions and compulsive behaviors that can severely affect daily life. Conventional treatments often involve talk therapy and antidepressants; however, 40-60% of OCD patients fail to respond to these options.
Psilocybin and other mind-altering drugs, such as ketamine, have demonstrated therapeutic potential for various mental health disorders. To explore these possibilities, Dr. Christopher Pittenger at Yale University launched the first randomized, placebo-controlled study of psilocybin’s effects on OCD.
The research involved 28 adults who had lived with OCD for an average of 20 years and had previously undergone at least two failed treatment attempts. Participants assessed their symptom severity on a standard scale ranging from 0 to 40 and were randomly assigned to receive either a single oral dose of psilocybin (0.25 milligrams per kilogram of body weight) or niacin (250 milligrams), serving as a placebo.
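To put the weight-based dosing in concrete terms, a minimal sketch (the body masses below are illustrative, not participant data):

```python
# Weight-based dosing used in the trial: 0.25 mg of psilocybin per kg.
def psilocybin_dose_mg(body_mass_kg: float, mg_per_kg: float = 0.25) -> float:
    return body_mass_kg * mg_per_kg

for mass in (60, 70, 80):  # illustrative adult body masses, kg
    print(f"{mass} kg person -> {psilocybin_dose_mg(mass):.1f} mg dose")
```

So a typical 70 kg adult in the psilocybin arm would have received a 17.5 mg dose.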
The psilocybin dose was sufficient to induce a psychedelic experience, often associated with profound changes in perception, cognition, and emotion. “The intensity varies, but it’s generally quite strong,” remarks Pittenger.
Forty-eight hours after dosing, participants who received psilocybin reported an average symptom-score reduction of 9.76 points, while those given niacin showed minimal changes. “The rapid and enduring improvements after a single psilocybin dose are astonishing,” states Dr. Alex Kwan from Cornell University.
After one week, around 70% of participants who took psilocybin saw their symptom scores fall by at least 35%, an effect that persisted through the 12-week follow-up. “Psilocybin outperforms traditional OCD medications in both efficacy and speed,” says Nutt, who was involved in a separate clinical trial, lacking a placebo control, which indicated that low doses of psilocybin can significantly diminish OCD symptoms.
Kwan suggests that the positive outcomes in individuals who have previously undergone several standard treatments point to psilocybin’s unique influence on the brain, though the precise mechanisms behind its efficacy in managing OCD remain unclear. “Understanding the biology behind its effects could revolutionize the treatment strategies not only for OCD but also for various mental health disorders,” he notes.
One hypothesis suggests that psilocybin enhances brain plasticity, potentially weakening entrenched thought patterns that typically dominate an individual’s mindset. This flexibility is a common thread among psychedelic substances, according to Nutt, who previously demonstrated that a single dose of the psychedelic DMT alleviated depression symptoms. “Individuals experiencing depressive thoughts found their thinking became more adaptable post-psychedelic experience,” he states.
Another theory posits that psilocybin recalibrates the brain’s default mode network, influencing areas linked to rumination and self-awareness, according to Pittenger. Research has also shown that a single psilocybin dose can enhance mental health by rewiring neural connections and reducing inflammation.
However, safety concerns regarding psilocybin use have emerged. In a Yale University study, a participant with a history of suicidal thoughts began to actively plan suicide during the trial. Although this risk was later mitigated through standard monitoring, Pittenger emphasizes the necessity for stringent clinical safeguards when administering psilocybin in medical contexts. Comprehensive trials are also needed to validate the drug’s efficacy, safety, optimal dosing, as well as to identify ideal candidates and those at increased risk, he adds.
A recurring challenge in psychedelic research is that participants’ experiences can often reveal whether they received the active drug or placebo. To combat this, researchers administered niacin, which can provoke sensations like facial flushing and elevated heart rate. However, many participants still discerned their treatment, according to Pittenger. “As with most studies of this nature, this presents a limitation,” he concludes.
Leanne ten Brinke’s eye-opening book on dark personalities begins with an unexpected case study of a psychopath: not a convicted criminal, but a well-known judge, a choice that underscores the complexities of morality.
U.S. Supreme Court Justice William O. Douglas, once a key figure in mid-20th-century liberalism, might exemplify what ten Brinke describes as the “modern definition of a psychopath.” His actions, although not criminal, cast shadows over his legacy, impacting many lives around him.
Psychopathy was dropped as a formal diagnosis in 1952, partly because of stigma, in favor of broader labels such as antisocial personality disorder. By the 1980s the concept had re-emerged in criminal contexts, with assessments like the Revised Psychopathy Checklist highlighting the lack of empathy in violent offenders and their high recidivism rates. Individuals identified as psychopaths make up only about 1% of the population yet are estimated to be responsible for half of all serious crimes, according to ten Brinke.
Ten Brinke, who directs the Truth and Trust Institute at the University of British Columbia, argues that high dark personality traits are not limited to outright offenders. “If we broaden the psychopathy lens, perhaps 10-20% of the population exhibits high levels of traits associated with psychopathy, yet lack the clinical designation,” she states.
In Toxic People, ten Brinke assesses the societal costs inflicted by “predatory individuals” and proposes strategies to mitigate their impact in our lives. However, she presents a crucial caveat.
Over the last two decades, personality psychology has developed the Dark Tetrad framework, combining psychopathy, Machiavellianism, narcissism, and sadism.
Contrary to pop culture’s portrayal of psychopathy as a binary condition, ten Brinke illustrates that it functions on a spectrum. Each individual has varying scores across different traits, with roughly 10-20% exhibiting pronounced dark personality traits, arguably contributing to societal erosion of ethical standards.
However, there’s a silver lining: around 80% of individuals don’t exhibit high levels of these traits. But ten Brinke cautions against complacency, indicating that these characteristics can be influenced by environmental factors.
Through a detailed case study, she discusses how a “culture of corruption” can draw the majority into complicity. “Kind individuals can become vulnerable to dark personalities,” she warns, identifying factors like fatigue or group dynamics as potential triggers for harmful behavior.
The book offers readers useful strategies to shield themselves from toxic individuals, including the establishment of clear boundaries. Yet, it also emphasizes the importance of self-reflection. How can we maintain our moral integrity and resist enabling those with nefarious intentions? Ten Brinke poses critical questions about why we often elevate such personalities in leadership roles.
While some may argue that dark personalities make compelling leaders, ten Brinke debunks this myth in lighter sections of the book. She highlights how research into investment bankers reveals that the most manipulative managers often achieve poorer financial outcomes over time.
Findings suggest that these cunning managers earned 30% less than their cooperative counterparts over a decade. “If you aim to maximize investment returns, seeking a predatory manager may not be your best strategy,” she concludes.
Misconceptions about psychopathic effectiveness arise in workplace dynamics, as dark personalities tend to propagate self-aggrandizing lies. They find reward in deception, furthering their personal agendas. Ten Brinke articulates how such individuals often falsely claim to be exemplary leaders, creating an atmosphere of mistrust.
Ten Brinke emphasizes our complicity in endorsing dishonest narratives. By refining our own darker traits, particularly strategic Machiavellian thinking, we can better identify deception.
She reminds us that if detecting lies were effortless, deception wouldn’t exist. However, vigilance can pay off. If a few “bad apples” spoil the barrel, the rest of us have the power to prevent decay. Interestingly, ten Brinke hints that certain traits, such as empathy and conscience, can counterbalance darker tendencies, offering a means to reverse corruption.
Challenging the notion that “absolute power corrupts absolutely,” she asserts it applies principally to the worst individuals. Taking ownership of our moral character can lead to rewards.
The pathway to cultivating what she calls “moral Machiavellianism” could significantly enhance our society, moving us beyond the assembly line of psychopathic behaviors.
Amyloid plaques in the brain are a defining feature of Alzheimer’s disease, but what if the roots of the condition start elsewhere in the body?
Alamy
Alzheimer’s disease has traditionally been believed to originate in the brain. However, comprehensive genomic analysis indicates that inflammation in distant organs such as the skin, lungs, or intestines may initiate the condition, potentially decades before noticeable memory decline occurs. This shift in understanding could shed light on why Alzheimer’s treatments have been largely ineffective. Current drugs intervene too late; a focus on early-stage inflammation in peripheral organs may be crucial.
“As neuroscientists, we tend to focus on the brain, but this study highlights that the brain is interconnected with the body, and changes elsewhere can impact brain function,” states Donna Wilcock from Indiana University, not involved in the study. “Although Alzheimer’s is a brain disorder, we must consider the entire body when discussing its genesis.”
To explore the genetic underpinnings of Alzheimer’s disease, researchers including Cesar Cunha from Denmark’s Novo Nordisk Foundation Basic Metabolic Research Center analyzed genetic data from the European Alzheimer’s and Dementia Biobank, encompassing over 85,000 individuals with the disease and approximately 485,000 without it. They also evaluated gene activity in 5 million single cells across 40 body regions and 100 brain regions.
The study scrutinized 1,000 genes linked to increased Alzheimer’s disease risk and, surprisingly, found that these genes were more highly expressed in organs like the skin, lungs, and digestive system than in the brain. “It was counterintuitive at first because the expression of these risk genes in brain cells seemed low,” notes Cunha. “Our continued analysis revealed their primary presence in other body parts.”
Many of these Alzheimer’s risk genes are tied to immune regulation and are particularly abundant in barrier tissues like the skin and lungs, which defend against bacteria and toxins through inflammatory responses. “This suggests that Alzheimer’s might initiate due to inflammation in these peripheral organs,” Cunha explains. Genetic variations may even dictate the extent of inflammation and its impact on brain health. Hence, individuals with a family history of Alzheimer’s could be more vulnerable to the disease amidst infections or inflammatory episodes.
Interestingly, the highest expression of these gene variants occurs when individuals reach ages 55 to 60. Inflammation during this period seems likely to trigger Alzheimer’s, corroborated by long-term studies from Hawaii. Inflammatory markers rise in individuals in their late 50s, with those in their 70s and 80s exhibiting increased Alzheimer’s likelihood. “A person could suffer from lung inflammation due to a viral infection at age 55, which might initiate Alzheimer’s 30 years later, but the exact mechanisms remain elusive,” Cunha remarks.
Rezanur Rahman, a researcher at QIMR Berghofer Medical Research Institute, has identified a genetic mutation associated with Alzheimer’s that appears concentrated in the skin and lungs. More research is essential to understand their functional role in symptom progression, Rahman states. “Association does not imply causation.”
Previously, the brain was deemed immune-privileged and largely unaffected by inflammatory processes elsewhere in the body. Bryce Vissel from St. Vincent’s Hospital in Sydney, Australia, among those who first proposed inflammation as a trigger for Alzheimer’s, acknowledges that while the idea was initially contentious, new evidence supports the view that peripheral inflammation from infections or injuries may indeed instigate the disease and affect brain function.
When inflammation occurs, immune cells are activated, releasing signaling proteins like cytokines that can cross into the brain via the bloodstream. An unpublished study by Vissel and his team indicates that cytokines may disrupt neuronal connections, potentially leading to memory impairment.
Concurrently, research has shown that the blood-brain barrier becomes more permeable with age, allowing inflammatory cytokines and immune cells easier access, which might elucidate why inflammation poses more of a risk during mid-life compared to youth, Cunha notes.
Current theories posit that Alzheimer’s disease stems from the accumulation of misfolded beta-amyloid and tau proteins within the brain. Yet, treatments aimed at eliminating these proteins have yielded minimal success, indicating that such accumulation might be a symptom rather than the core issue. “We’ve been trying to treat the result of the disease, not its cause,” Cunha argues.
Cunha likens this to past mistakes in obesity treatments, which initially targeted excess fat directly, failing until genetic research revealed that mutations connected with obesity are often highly expressed in the brain, disrupting appetite and energy balance. This led to the development of the weight-loss medication semaglutide (marketed as Ozempic and Wegovy), which modulates brain pathways to curb appetite.
If Alzheimer’s originates from peripheral inflammation, its treatment would necessitate a paradigm shift, Cunha asserts. Data indicate that midlife vaccinations may offer protective benefits against Alzheimer’s disease. A recent Californian study revealed that adults receiving both doses of the shingles vaccine recommended for individuals aged 50 and older were 50% less likely to develop Alzheimer’s by age 65. Another investigation found that those aged 50 and older treated with the Bacillus Calmette-Guérin (BCG) vaccine for bladder cancer had a 20% reduced risk of onset.
This phenomenon might arise as vaccines bolster the aging immune system and mitigate inflammation, suggests Wilcock. “At age 55, we should invigorate our immune systems and remind them to stay active, as most vaccinations occur in childhood.”
Beyond vaccinations, several lifestyle interventions have been shown to diminish inflammation and avert Alzheimer’s disease. These include adopting a Mediterranean diet, limiting alcohol consumption, exercising, quitting smoking, and managing blood pressure and cholesterol levels.
Professor Cunha emphasizes that the challenge lies in convincing fellow neuroscientists to recognize peripheral inflammation as a potential contributor to Alzheimer’s disease. “I’ve encountered skepticism at academic conferences, being told, ‘If you aren’t focusing on amyloid, you’re not studying Alzheimer’s disease,'” he shares. “After decades entrenched in amyloid research, adapting one’s perspective can be daunting.”
Exciting news from New Guinea! Two marsupial species, believed extinct for over 6,000 years, have been rediscovered.
The Ring-tailed Gliders and Pygmy Longfinger Possums, previously known only from fossils in Australia, were recently observed on the Vogelkop Peninsula in Papua, Indonesia, thanks to the support of local indigenous communities.
Renowned researcher Tim Flannery and his team at the Australian Museum in Sydney undertook years of investigative work, including analyzing peculiar sightings and misidentified specimens, to confirm that these remarkable animals still survive.
With photographic evidence and active collaboration with local communities, researchers have verified these animals’ existence. However, their habitat is under threat from logging activities. The specific ecological requirements and range of these rediscovered species are still largely unknown, complicating conservation efforts.
Scott Hucknull, a professor at Central Queensland University, remarked that this discovery is “more significant than finding a live thylacine in Tasmania.”
One notable species, the Wow Glider (Thus ayamalensis), is closely related to Australian gliders in the genus Petauroides. However, distinct features like its prehensile tail and furless ears have warranted its classification into a separate genus.
Local indigenous communities often regard gliders as sacred and protected animals, potentially contributing to their previous obscurity in scientific literature.
“This is one of the most photogenic animals and beautiful marsupials I’ve ever encountered,” Flannery stated.
The Pygmy Longfinger Possum (Dactylonax kambuyai) is a striking striped creature characterized by an unusually long finger on each hand, which aids its survival.
As Flannery explains, “They possess unique ear adaptations that may help them detect the low-frequency sounds of larvae within wood, allowing them to extract food from decaying trees.”
The exact location of this species remains confidential to protect it from potential wildlife traders.
Flannery cautions against capturing these animals. “They are challenging to maintain in captivity due to their specialized diet—potential pet owners should be forewarned: they don’t last long in confined environments.”
Fossils dating back approximately 3 to 4 million years have been uncovered at sites in Victoria and New South Wales, Australia, but significant gaps in the fossil record leave much about the genus a mystery.
Hucknull notes, “The smallest fossil species are indistinguishable from their modern counterparts. Dactylonax kambuyai has now been confirmed alive in West Papua.”
“Pocket-sized, peculiar, and adorable,” says Hucknull, emphasizing the ecological significance of this unique species.
Researcher David Lindenmayer from the Australian National University in Canberra commented on the significance of these discoveries while expressing concern over deforestation and habitat destruction in New Guinea. “It provokes questions about what has been lost in Australia due to similar land clearing practices.”
Unless aliens happen to reside close to our solar system, direct evidence of extraterrestrial life is likely to remain elusive, but the search for signs of life beyond Earth continues. Astrobiologists typically seek biological markers, such as oxygen and ozone in the atmospheres of exoplanets, as indicators of potential life.
However, the presence of these chemicals doesn’t guarantee life; they could arise from unknown non-biological processes. More definitive proof of intelligent extraterrestrial beings might come from identifying signs of technological activity in space, known as technosignatures. The SETI Institute, established in 1984, focuses specifically on detecting these technosignatures, particularly through radio signals.
From 2006 to 2020, the SETI@home project enlisted volunteers’ computers to help researchers search for unusual radio emissions recorded by the Arecibo Telescope. Over 14 years, SETI@home accumulated approximately 400 days of observation time and billions of detected radio emissions. Most of these signals, however, are likely radio-frequency interference or emissions from benign celestial objects such as pulsars and gas clouds, rather than signals from a single extraterrestrial source.
To refine their data analysis, the team recently developed an algorithm designed to filter out interference and pinpoint signals that recur from fixed points on the sky. This advancement positions researchers to re-observe those locations using the Five-hundred-meter Aperture Spherical Telescope (FAST) in China.
The algorithm’s goal is to differentiate between natural cosmic signals and potential technosignatures. The team established three criteria for detecting such signals: they must remain stable within a narrow frequency range, exhibit a consistent pulsation, and contain a periodic structure spanning several seconds.
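To make the three criteria concrete, here is a minimal sketch of how such a filter might look in code. This is purely illustrative: the field names, thresholds, and structure are invented for this example and are not the team’s actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    bandwidth_hz: float    # frequency spread of the detection
    drift_hz_per_s: float  # how far the frequency wanders over time
    period_s: float        # repetition period in seconds (0 if none found)

def is_candidate(sig: Signal,
                 max_bandwidth_hz: float = 10.0,
                 max_drift_hz_per_s: float = 0.1,
                 min_period_s: float = 1.0,
                 max_period_s: float = 60.0) -> bool:
    """Apply the three criteria: narrowband, frequency-stable,
    and periodic over a span of seconds. All thresholds are invented."""
    narrowband = sig.bandwidth_hz <= max_bandwidth_hz
    stable = abs(sig.drift_hz_per_s) <= max_drift_hz_per_s
    periodic = min_period_s <= sig.period_s <= max_period_s
    return narrowband and stable and periodic

detections = [
    Signal(bandwidth_hz=2.0, drift_hz_per_s=0.01, period_s=5.0),   # narrowband, stable, periodic
    Signal(bandwidth_hz=500.0, drift_hz_per_s=3.0, period_s=0.0),  # broadband interference
]
candidates = [s for s in detections if is_candidate(s)]
```

In a real pipeline each detection would carry far more metadata (sky position, observation epoch, signal-to-noise ratio), and the thresholds would be tuned against injected test signals rather than chosen by hand.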
A key consideration is that signals sent intentionally for detection may differ significantly from radio waves leaking incidentally from an alien world. Effects such as the Doppler shift, which smears an extraterrestrial transmitter’s frequency as planets move relative to one another, complicate the analysis. Researchers theorize that an intelligent civilization aiming to be noticed would generate radio signals at a near-constant frequency, easily distinguishable from natural noise.
In their algorithm development, researchers integrated artificial data points that simulate the potential detection of distinct technosignatures, referred to as birdie candidates. If a birdie is flagged for further analysis, it validates the algorithm’s effectiveness. Adjustments to the algorithm’s sensitivity were made based on whether birdies were included or excluded from deeper scrutiny.
To tackle the complexities of data filtering and scoring, the team divided the work into manageable segments that could be processed simultaneously on multiple machines. Running on 2,000 connected processors, filtering took about 15 hours, while scoring required 1.6 days. Two passes of the algorithm over the SETI@home data were completed, including one seeded with 3,000 birdies for comparison. The birdie injections helped determine which algorithm settings reliably flagged signals above the specified energy thresholds, leading to the identification of 92 targeted signal candidates for re-observation during 23 hours of observation time gained through FAST.
Currently, work is ongoing to analyze these signals, and as of July 2025, researchers have re-observed 80 out of the 92 candidates. Although no direct evidence of extraterrestrial intelligence has been discovered yet, the team remains optimistic that future inquiries utilizing specialized radio telescopes will yield promising results. However, the high costs and demands associated with radio telescope usage mean that SETI will likely continue to collaborate with other astronomers to maximize data collection from available observations.
Recent research highlights an extraordinary extremophile organism, Deinococcus radiodurans, known for its remarkable resilience. This unique microbe can endure the harsh conditions of radiation, frigid temperatures, and arid environments typically encountered during interplanetary transport. New findings suggest that Deinococcus radiodurans also possesses outstanding resistance to the extreme transient pressures generated by impact ejection from Mars. Consequently, this raises the possibility that such resilient life forms could traverse between planets in our solar system following a significant asteroid impact.
Artist’s impression of an asteroid. Image credit: Mark A. Garlick, Space-art.co.uk / University of Warwick / University of Cambridge.
Impact craters are prevalent on the surfaces of numerous celestial bodies, with the Moon and Mars being among the most cratered.
Scientific findings indicate that asteroid impacts can propel materials across space, as evidenced by the discovery of Martian meteorites on Earth.
Furthermore, researchers have long speculated that asteroids could also launch microscopic life forms into space.
This theory, known as the lithopanspermia hypothesis, suggests that life could be ejected into space and potentially land on other planets.
In a groundbreaking study, researchers from Johns Hopkins University, led by Kariat (KT) Ramesh, simulated conditions under which microbes like Deinococcus radiodurans could be expelled into space due to an impact force.
The researchers placed the bacteria between two steel plates and applied pressure with a third plate, demonstrating that these microbes can withstand pressures of up to 3 GPa (30,000 times Earth’s atmospheric pressure).
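The “30,000 times Earth’s atmospheric pressure” figure can be verified with a quick unit conversion. The sketch below uses the standard-atmosphere constant of 101,325 pascals, which comes from standard reference values rather than the article itself:

```python
# Express 3 GPa as a multiple of standard atmospheric pressure.
PASCALS_PER_GPA = 1e9        # 1 gigapascal in pascals
PASCALS_PER_ATM = 101_325    # 1 standard atmosphere in pascals

ratio = 3 * PASCALS_PER_GPA / PASCALS_PER_ATM
print(f"{ratio:,.0f} atmospheres")  # close to 30,000
```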
By analyzing gene expression, they were able to observe biological stress responses within the bacteria under varying pressures.
Samples subjected to 2.4 GPa began to exhibit membrane damage, yet the unique structure of the bacterial cell envelope allowed roughly 60% of the microorganisms to survive.
The transcriptional profiles indicated that these resilient bacteria prioritize repairing cellular damage in the aftermath of an impact.
“While we have yet to confirm the existence of life on Mars, if it exists, it likely shares similar survival capabilities,” Ramesh remarked.
“This study suggests that life could endure being ejected from one planet and travel to another.”
“These findings significantly alter our understanding of the origins of life on Earth,” remarked Dr. Lily Chao, also from Johns Hopkins University.
“Our research indicates that life can survive massive impacts and eruptions, implying that life may travel between planets. Perhaps we are all Martians!”
These findings were published in this week’s edition of PNAS Nexus. For detailed insights, refer to the study.
_____
Lily Chao et al. 2026. Extremophiles can withstand temporary pressures associated with impact ejection from Mars. PNAS Nexus 5(3):pgag018; doi: 10.1093/pnasnexus/pgag018.
Recent images from the NASA/ESA Hubble Space Telescope and the ESA’s Euclid mission showcase the intricate multi-shell structure of the fascinating planetary nebula NGC 6543, famously known as the Cat’s Eye Nebula.
This mesmerizing image from Euclid encapsulates the panoramic view of the Cat’s Eye Nebula. Image credits: NASA / ESA / Hubble / Euclid Consortium / J.-C. Cuillandre & E. Bertin, CEA Paris-Saclay / Z. Tsvetanov.
The Cat’s Eye Nebula, located roughly 4,300 light-years away in the constellation Draco, has intrigued astronomers for decades due to its complex, multi-layered architecture.
“Planetary nebulae derive their name from their round appearance in early telescopic observations; they are actually colossal gas clouds expelled from stars nearing the end of their life cycle,” the Hubble and ESA astronomers explained.
This insight was first gained in 1864 using the Cat’s Eye Nebula itself. Studying its light spectrum allows scientists to identify the individual elements the gas contains, a signature that differentiates planetary nebulae from stars and galaxies.
Near-infrared and visible-light imagery from the Euclid mission illustrates the arcs and filaments of the nebula’s luminous core, enveloped in a mist of vibrant gas debris that is retreating from the star.
“This ring was expelled from the star prior to the formation of the central nebula,” the astronomers noted.
“The entire nebula is prominently set against a backdrop brimming with distant galaxies, exemplifying how local astrophysical wonders and the farthest reaches of the universe coexist in today’s astronomical surveys.”
In this remarkable image, Hubble captures the swirling gas core of the Cat’s Eye Nebula. Image credits: NASA / ESA / Hubble / Z. Tsvetanov.
Using its wide-field camera, Hubble has captured stunning high-resolution visible-light images of the nebula’s swirling gas core.
“The data reveals an intricate tapestry of features that appear almost surreal, including concentric shells, high-velocity gas jets, and dense knots shaped by impact interactions,” the researchers stated.
“These structures are believed to document the transient mass loss from the dying star at the nebula’s center, creating a cosmic ‘fossil record’ of its final evolution.”
“The combination of Hubble’s focused observations and Euclid’s deep-field data not only emphasizes the nebula’s delicate structure but also situates it in the broader cosmic landscape explored by both telescopes.”
“Together, these missions offer a rich, complementary view of NGC 6543, illuminating the subtle interplay between a star’s end-of-life processes and the vast universe that surrounds it.”
Unprecedentedly mild winter weather across large parts of the western United States, combined with the most severe snow drought in decades, has experts bracing for an intense wildfire season.
Snowfall across nearly every western state is significantly below average, providing insufficient time for accumulation before the spring thaw. Concurrently, warmer-than-average winter temperatures have contributed to a drier season than typical, escalating concerns over wildfire risks and diminishing water supplies.
“Snowfall in Colorado’s mountains has reached a 40-year low,” remarked Russ Schumacher, director of the Colorado Climate Center at Colorado State University.
Schumacher noted that Colorado’s October-to-February period, usually rich in snowfall, has been “by a wide margin” the warmest on record. In Fort Collins, the number of winter days reaching 60 degrees Fahrenheit has nearly doubled, from 22 to 43.
Low snowfall due to a snow drought in the western United States affected Breckenridge, Colorado, on January 22nd. Hyun Chan/Denver Post via Getty Images
Consequently, Colorado has not experienced the typical snowstorms, with many storms delivering rain rather than snow, particularly in mid- and low-elevation areas.
This issue extends far beyond Colorado. Measurements of snow water equivalent across the western U.S. indicate that snowpack levels are well below average, with some basins recording less than 50% of their usual levels.
“Most regions are below 50% of average, meaning they would typically have more than double the snow,” stated Noah Molotch, a geographer from the University of Colorado Boulder.
Only a few basins in the western U.S. are seeing snow levels near average. Natural Resources Conservation Service
Molotch, alongside his team, monitors snowfall across the western U.S. and reports that this year is among the driest on record, with only the southern Sierra Nevada, parts of northwestern Wyoming, and minor regions in Montana, Idaho, and northern Washington experiencing near-normal snowfall.
Numerous areas continue to suffer from a “severe snow drought,” he added.
Research indicates that snow drought and premature snowmelt could intensify the wildfire season during summer. When forests and grasslands receive significantly less snow, or if the snow melts sooner, there’s increased potential for vegetation to dry out and serve as fuel for wildfires.
In Colorado, the recent Blue Bell Fire led to an evacuation order in Boulder last weekend, burning approximately 1.5 acres. Although quickly contained, it highlighted the state’s vulnerability under warmer, drier, and windier conditions.
“The absence of severe fire weather isn’t the issue,” Molotch stated. “These snow drought conditions clearly contribute to a potentially severe wildfire season.”
A sign indicating a “Protected Watershed Area” near a snow-covered hillside on February 8 near Salt Lake City, Utah, where around 95% of the water supply is reliant on mountain snowpack. Mario Tama/Getty Images
Schumacher pointed out that climate change could be influencing these trends. While linking yearly snowfall variations directly to global warming can be challenging, it is clear that climate change increases the likelihood of above-average temperatures, even in winter.
“Attributing the lack of precipitation to climate change is complex, but extreme temperatures exhibit a clear connection to global warming,” he stated.
Diminishing mountain snowpack poses a serious threat to water supplies in the West. The snow that accumulates in winter is essential for replenishing rivers and streams that support cities, agriculture, and hydroelectricity.
“This is critical for our water supply,” noted Molotch.
If snowpack levels do not stabilize, reservoirs across the western U.S. risk running dry.
“These challenges aren’t unique to Colorado. The Colorado River Basin is in a particularly precarious position as it is already overallocated, leading the federal government to face tough decisions regarding reductions in allocations,” Molotch explained.
The upcoming weeks could bring substantial snowfall to parts of the West, including Colorado and Utah; however, the current forecast offers little optimism.
A skier at Alta Ski Area in the Wasatch Mountains on February 8. High-altitude resorts like Alta enjoy sufficient snow, while lower-altitude resorts are resorting to artificial snowmaking. Mario Tama/Getty Images
Schumacher expressed a growing sense of resignation as his winter optimism faded.
“As we approach early March, unfortunately, time is running out for a turnaround,” he lamented. “The hope now is that this year will merely be a disappointing one and not one for the record books.”
Cats possess a remarkable ability to adjust their bodies mid-fall, allowing them to land gracefully on their feet, a phenomenon known as the cat righting reflex.
According to Yasuo Higurashi from Yamaguchi University, the cat’s thoracic spine is especially flexible, providing the agility needed to rotate their body during a fall.
It’s widely understood that cats almost always land on their feet. When you drop a cat upside down, it instinctively twists its body to ensure a safe landing.
This impressive skill has puzzled scientists for over a century, leading to three primary theories about how cats achieve this feat.
The first theory suggests cats use a propeller-like motion with their tails, spinning the tail one way to turn the body the other. However, physicist and author Greg Gbur notes that the tail isn’t crucial, as cats without tails can still right themselves. His book Falling Felines and Fundamental Physics supports this observation.
The second theory, the bend-twist model, posits that cats bend their bodies at nearly right angles. This allows their front and back halves to rotate independently, enabling all four legs to align correctly upon landing.
The third model, referred to as tuck-and-turn, has the cat first tuck its front legs and extend its hind legs to rotate the front half, then reverse the posture to bring the rear half around for a safe landing.
To explore feline behavior, Higurashi and his team conducted two experiments. The first involved assessing the spinal flexibility of five deceased cats, revealing that their thoracic spines can rotate three times more than their lumbar spines.
In the second experiment, the researchers filmed high-speed videos of two adult cats falling from one meter, finding that the front half completes its rotation slightly faster than the rear.
According to Gbur, these experiments made him reconsider the significance of the tuck-and-turn model, suggesting a stronger reliance on the bending and twisting motions during a fall. His observations indicated that the front of a falling cat appears to orient itself before the back.
These models are not mutually exclusive, as Gbur emphasizes, pointing out that nature often employs complex and effective methods rather than simple solutions.
Interestingly, the study revealed a tendency for cats to rotate predominantly to the right when falling. While one cat consistently displayed this behavior, another did so six out of eight times. Gbur speculates that this may relate to the asymmetric arrangement of a cat’s internal organs, affecting their rotational preferences.
Rising Sea Levels: Increased Risk of Storm Surge Flooding in Coastal Cities
Credit: Thomas Wyness / Alamy Stock Photo
Many studies of the impact of future sea level rise have overlooked the fact that current sea levels are already higher than assumed, the result of a significant “methodological blind spot.” This oversight means that flooding and erosion may begin sooner than anticipated.
Katarina Seeger and Philip Minderhoud, researchers at Wageningen University in the Netherlands, evaluated 385 peer-reviewed studies addressing coastal vulnerability. They found that 90% of these studies failed to consider critical factors—such as ocean currents, tides, temperature, salinity, and wind—when assessing sea level variations. This oversight led to an average underestimation of coastal water levels by 24 to 27 centimeters.
Addressing this gap could increase projections of the number of people likely to experience flooding by up to 68%, to approximately 132 million by 2100. Areas significantly affected include Southeast Asia and Oceania, where actual sea levels often average a meter higher than previously calculated, with some regions seeing several meters’ difference.
“If representatives from these vulnerable regions attend global discussions to seek assistance, it may be frustrating, as their risks are grossly underestimated. This scientific miscalculation could affect outcomes for future generations,” Minderhoud stated during a briefing.
While predictions suggest that sea levels may rise by as much as 1 meter by the century’s end, many studies begin with baselines that are inaccurately low. Thus, the adverse effects will likely manifest sooner than expected.
Of the studies evaluated, 46 were referenced in the latest report from the Intergovernmental Panel on Climate Change (IPCC), the premier source on global warming impacts, including rising sea levels.
The Earth’s rotation causes it to bulge at the equator, while denser mantle sections exert a greater gravitational pull on overlying water. To accurately determine the elevation of a specific area, measurements must be compared to the geoid, which depicts mean sea level worldwide.
However, in some regions, actual sea levels can be several meters above the geoid due to wind and ocean currents accumulating water or thermal expansion caused by rising temperatures. Additionally, coastlines may shift due to sediment deposition in rivers or groundwater extraction beneath coastal areas.
Instead of comparing satellite observations to the geoid to gauge coastal water levels and land elevation, many researchers relied on unadjusted geoid sea levels. Even those who attempted the calculation often introduced errors by using different geoid models for land and ocean elevations. Alarmingly, less than 1% of the studies accurately determined the current sea level at the coastline in question.
“The coastal research community may not have full access to these critical sea level datasets, as we are primarily focused on the coastal land aspect,” Seeger remarked during the briefing.
Climate scientists and oceanographers must collaborate more closely with geographers and environmental scientists who assess coastal impacts, emphasized Matt Palmer from the UK’s Met Office.
“It could be said that the crucial final details got lost in translation,” he noted. “Ensuring that the last mile of information is handled adeptly is vital; otherwise, the integrity of the entire effort is compromised.”
The implications of this issue extend to matters of climate justice, said Palmer. The underestimation of sea levels is particularly critical in low-income nations, including various deltas in Africa and Asia. Limited data on gravity fluctuations and lower geoid accuracy contribute to this challenge in regions that are most susceptible to rising sea levels.
The scientific community advocates for enhanced data collection in low-income regions, particularly through the installation of tide gauges for accurate sea level measurements, according to Joan Williams from the UK National Marine Centre.
“Coastal sea levels are influenced by various local factors, necessitating long-term, well-calibrated regional measurements as the gold standard,” she stated.
“Where is everybody?” was the question posed by the renowned Italian-American physicist Enrico Fermi during a lunchtime discussion with colleagues in the early 1950s, hinting at the existence of extraterrestrial life. Fermi’s back-of-the-envelope calculations suggested that alien civilizations should exist and should already have visited Earth. He argued that the absence of extraterrestrial outposts raises important questions about civilization itself.
For decades, astronomers have referenced this pivotal conversation to explore the Fermi Paradox, which asks why we see no signs of other civilizations in the galaxy if they exist. Various hypotheses have emerged, including the Great Filter theory, which suggests a barrier prevents civilizations from achieving the technology to communicate with one another. Alternatively, the Zoo Hypothesis posits that extraterrestrial beings are aware of humanity and opt not to make contact so as to avoid interfering with us. It is also possible that aliens are already among us, or that unidentified aerial phenomena (UAP) or interstellar objects like ‘Oumuamua could indicate an alien presence.
Some solutions to the Fermi Paradox involve assumptions regarding technological growth, evolution, or intelligence itself. Recently, researcher Robin HD Corbett suggested a more routine solution. His argument is based on the Copernican Principle of Mediocrity, which implies that if alien civilizations are akin to humans, it’s not surprising we haven’t encountered them.
Corbett presents two main considerations for a “radical secularity” solution to the Fermi Paradox. Firstly, there are limits to technological advancement; even if alien civilizations are more advanced, they lack faster-than-light travel or other impossible technologies. Secondly, while numerous alien civilizations may exist, they are not ubiquitous.
Regarding technology, Corbett points out that the laws of physics prevent any civilization from developing a warp drive to quickly traverse the galaxy. Practical limitations, including engineering challenges and ecological concerns, compel civilizations to adopt sustainable technologies rather than grand projects detectable from afar, like an artificial ring around a star or radio beacons broadcasting for thousands of years.
The existence of civilizations similar to ours carries significant implications. If they exercise similar rational thought, guiding their space exploration decisions with cost-benefit analyses, they might find that the effort required to explore other civilizations may outweigh the benefits, especially without groundbreaking technology.
Corbett further claims that space exploration would likely be conducted by autonomous, perhaps self-replicating, machines known as von Neumann probes equipped with advanced AI, capable of traveling at 1/1000th the speed of light. Concerns about uncontrollable AI escalation may increase costs, leading civilizations to limit their exploratory efforts.
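To put the quoted probe speed in perspective, here is a back-of-the-envelope calculation. This is my own illustration, not from Corbett's paper, and the distances used are rough, commonly cited figures:

```python
# Rough travel times for a von Neumann probe moving at 1/1000th the speed of light.
# The distances below are approximate, commonly cited values (assumptions for
# illustration, not figures from Corbett's argument).
speed_c = 1 / 1000  # probe speed as a fraction of the speed of light

def travel_years(distance_ly: float) -> float:
    """Years needed to cover a distance (in light years) at the given fraction of c."""
    return distance_ly / speed_c

print(f"Nearest star (~4.2 ly): {travel_years(4.2):,.0f} years")
print(f"Across the Milky Way's disc (~100,000 ly): {travel_years(100_000):,.0f} years")
```

Even a single galaxy crossing at this speed takes on the order of 100 million years, which is why the argument about civilizations abandoning their searches long ago operates on such vast timescales.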
Corbett concludes that if alien civilizations are located far from Earth, they may have abandoned their search for others millions of years ago, leaving us in silence. Scientists, particularly those working with new radio arrays, should be mindful that extraterrestrial beings may be less like Star Trek’s Vulcans and more like us, and that limits on future technologies further complicate our quest for contact. Corbett also posits that UAPs observed on Earth are unlikely to be alien in origin, concluding that extraterrestrials may simply find humans too ordinary to warrant their attention.
Artwork of Hybodus sharks, predators from the late Permian period that outlasted mass extinctions.
Credit: Christian Darkin/Science Photo Library
The largest mass extinction in history led to the loss of over 80% of marine life. Remarkably, certain ecosystems continued to thrive, and various species, including apex predators, managed to survive this catastrophic event.
This research indicates that the survival of specific ecosystems was influenced by their unique species compositions. A similar pattern may be observed in today’s marine ecosystems, which are under significant threat from climate change.
Approximately 252 million years ago, the end-Permian extinction was likely triggered by extensive volcanic eruptions in present-day Siberia, causing rapid global warming and diminishing ocean oxygen levels. Notably, some groups, like trilobites and eurypterids (sea scorpions), faced total extinction, while others experienced dramatic losses. In the aftermath, new species groups emerged, including dinosaurs and ichthyosaurs.
Given the loss of so many species, researchers expected ecosystems to have become less complex. A functioning ecosystem relies on diverse interdependent species: plants that produce energy, herbivores that consume them, and predators that eat the herbivores. Top predators are especially vulnerable to extinction because they depend on ample prey for survival. A significant extinction event, such as the one at the end of the Permian, should therefore simplify ecosystems.
To investigate this hypothesis, Baran Kalapunar and a team from the University of Leeds assessed preserved remains from seven marine ecosystems globally, both before and after the extinction. They analyzed the ecosystem structures based on the species present. Kalapunar declined to provide an interview as the study is yet to undergo peer review.
Even with species losses reaching 96%, five of the seven ecosystems sustained at least four trophic levels.
Across regions, and particularly near the poles, slow-moving bottom-dwelling organisms suffered the greatest losses, while free-swimming organisms, such as fish, were less severely impacted.
Ecosystem recovery varied based on proximity to the equator. Tropical ecosystems were primarily populated by low-trophic-level species, while those nearer to the poles experienced the addition of trophic levels as fish predators relocated away from extreme heat near the equator.
These findings imply that present-day marine ecosystems also respond differently to climate change and other anthropogenic impacts.
“I’m not aware of any other study that encompasses so many regions,” states Peter Roopnarine from the California Academy of Sciences in San Francisco. He concurs with the conclusions that many ecosystems sustain trophic levels despite extinctions, as previous smaller-scale studies indicated.
However, Roopnarine cautions against placing too much emphasis on the specifics of the researchers’ ecosystem models. The fossil record does not clarify which organisms survived and which did not, so the researchers had to group all photosynthetic organisms together and could not predict outcomes if particular species became extinct. “These findings are firmly supported by the fossil record, yet it remains incomplete,” he remarks.
X and Y chromosomes engage in competition to favorably skew sex ratios.
Katerina Conn/Science Photo Library
Have you ever noticed a family where almost all the children are boys or girls? While often just random chance, a detailed analysis of a Utah family tracing back to the 1700s offers a fascinating biological explanation: the “selfish” Y chromosome may suppress female births.
According to James Baldwin-Brown at the University of Utah, “This family is of great significance. Selfish genes, like the ones highlighted, have been documented across various organisms, yet studying them in humans remains challenging.”
In most mammals, male cells feature one X and one Y chromosome. During sperm formation in the testes, half receive Y chromosomes and half receive X chromosomes, leading to a theoretical 50:50 male-female birth ratio. However, certain chromosome variations can skew this outcome, producing an unequal number of male or female offspring. For instance, some selfish chromosomes hinder other sperm’s capability to reach the egg, while others eliminate non-selfish sperm. “This phenomenon has puzzled scientists for over a century,” adds Nitin Phadnis, also from the University of Utah.
The competition between selfish X and Y chromosomes can significantly skew sex ratios. Such variations are not just limited to humans; selfish chromosomes affecting sex ratios have been observed in various animals. The challenge lies in identifying currently active selfish chromosomes. “Even having several boys consecutively can often occur by chance,” Baldwin-Brown clarifies.
To prove that a sex ratio bias is not mere coincidence requires analyzing multiple generations. Using the Utah Population Database, which catalogs millions of people, Baldwin-Brown, Phadnis, and their team focused on 76,000 individuals.
The researchers employed two distinct statistical methods, both isolating the same families as significant outliers. Over seven generations, 33 men shared the same Y chromosome, resulting in 60 male and 29 female offspring out of 89 children.
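As a rough illustration of why 60 boys in 89 births stands out, here is a one-sided exact binomial test using only Python's standard library. This is a simplified stand-in of my own, not either of the two (unspecified) statistical methods the team actually used:

```python
from math import comb

def binom_tail(n: int, k: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at least
    k male births out of n births if each birth were a fair 50:50 draw."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Seven generations, 33 men sharing one Y chromosome: 60 boys out of 89 children
p_value = binom_tail(89, 60)
print(f"P(at least 60 boys in 89 births under a fair ratio) = {p_value:.5f}")
```

The tail probability comes out well below 1%, which is why such a run of male births across generations is hard to dismiss as chance alone.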
Due to data anonymization, genetic analysis remains elusive. “It would be invaluable to connect with these individuals to sequence their sperm and investigate further,” says Baldwin-Brown. “However, navigating the ethical requirements and funding this endeavor is quite challenging.”
Sarah Zanders from the Stowers Institute for Medical Research in Missouri speculates that a selfish Y chromosome might be at play but acknowledges the sample size is still too small for conclusive evidence. While analyzing microbes, her team has detected significant sex ratio biases, yet evaluations of larger samples yielded less remarkable findings.
Infidelity poses an additional complication, Zanders noted. “Though I’m not a human expert, I suspect many father assignments could be iffy,” she reflects. Baldwin-Brown acknowledged the possibility. “Despite this, there remains robust data that appears trustworthy,” he assures.
Understanding the selfish Y chromosome extends beyond theoretical implications, Phadnis suggests. Such mechanisms could be a factor in rising male infertility rates, as a trait that diminishes half of all sperm would severely impact fertility. Moreover, studies indicate selfish chromosomes may induce infertility in certain individuals.
The research team now aims to analyze sperm samples for discrepancies in the X and Y carrying sperm ratios.
The latest analysis focused on a selfish Y chromosome for several reasons: male lineages are simpler to trace, and an excess of female births could also be explained by a lethal mutation rather than a selfish X chromosome.
Selfish genes aren’t exclusive to X and Y chromosomes. More broadly, DNA that enhances inheritance probabilities above 50% is referred to as a gene drive and has been discovered in various species. CRISPR technology can create artificial gene drives, with potential applications in combating malaria and controlling pest populations.
Let’s begin with an important fact: No matter what you’ve heard, you are not eating the equivalent of a credit card’s worth of microplastics every week.
However, the claim has sparked concerns, particularly after multiple studies reported microplastics accumulating in various environments—ranging from the highest mountains to the deepest ocean trenches, and even in isolated polar regions. Microplastics have also been detected in human tissues, including the heart, liver, kidneys, breast milk, and bloodstream.
Given their prevalence and potential health implications, it’s understandable to be worried, but is it truly warranted?
The ubiquity of microplastics can be traced back to the remarkable properties of plastics. The invention of Bakelite in the early 20th century marked a shift in how materials were produced—created from synthetic compounds rather than sourced from nature.
As plastic became more affordable and widespread, its applications flourished, impacting food packaging, electronics, medical devices, and more. Unfortunately, this durability also leads to a significant environmental issue; microplastics have been released into ecosystems for over a century, persisting for long periods. Consequently, these particles have made their way into the tissues and bloodstreams of various species, including us.
These microplastics are often present in everyday items we consume, such as salt, beer, and drinking water.
Yes, microplastics could likely be within you, but there’s no need to panic just yet. Assessing the health implications of pollutants involves several factors.
Firstly, consider the size of the microplastics, which varies significantly. Secondly, what concentration is required to elicit effects? Lastly, we must examine whether the effects are indeed harmful. Much of the current research is animal-based, which raises questions about its applicability to humans.
Microplastics and Credit Cards
In recent years, alarming headlines have often cited vague information about microplastic sizes or relied on inflated studies that use unrealistically high doses, not reflective of typical human consumption.
For example, widely circulated claims suggested that the average person ingests around 5 grams of microplastics a week—the amount in a credit card. This assertion stems from a 2019 study that employed questionable methodologies and can easily be debunked.
According to a more accurate assessment, most individuals consume only around 0.0041 milligrams of microplastics per week, less than the mass of a grain of salt. At that rate, it would take over 1.2 million weeks, or roughly 23,000 years, to consume the equivalent of one credit card’s worth of plastic.
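The 23,000-year figure follows from straightforward unit arithmetic; as a sketch, using the two ingestion estimates quoted above:

```python
# Two ingestion estimates for microplastics, in milligrams per week
credit_card_mg = 5_000        # disputed claim: 5 g (one credit card) per week
revised_mg_per_week = 0.0041  # revised estimate quoted above

# How long it would take, at the revised rate, to ingest one credit card's worth
weeks = credit_card_mg / revised_mg_per_week
years = weeks / 52

print(f"{weeks:,.0f} weeks (about {years:,.0f} years) per credit card of plastic")
```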
If you were immortal, perhaps you could worry about it.
Research indicates that the average person accumulates about 12.2 milligrams of microplastics in their lifetime, but only around 41 nanograms might actually be absorbed by the body based on a study by the same researcher.
New concerns have also emerged about the methodologies used to detect microplastics in bodily tissues. Some studies employ techniques that vaporize samples and analyze the resulting fumes for plastic signatures, which can produce false positives because fats release chemically similar compounds when heated.
Effects of Microplastics on Human Health
While we know that microplastics are present in our bodies, their effects remain unclear. Some studies indicate that microplastics may lead to behavioral changes and inflammation in animal models; however, these studies often utilize unrealistically high doses—1 gram per day for rodents, for example.
Other studies in pigs showed that a weekly dose of 1 gram affected gene expression and induced oxidative stress in the pancreas, yet this dosage vastly exceeds typical human exposure.
Reports from the World Health Organization have cautioned that most animal studies utilize concentrations of microplastics well above what humans typically encounter. Moreover, microplastics are processed differently in human bodies compared to rodents, complicating data interpretation.
Preliminary human studies have detected microplastics accumulating in arterial plaques and have correlated the presence of these plastics with higher rates of heart attacks and strokes. However, correlation does not imply causation; it’s critical to avoid jumping to conclusions.
Investigating the impact of microplastics on human health is multifaceted. While these small particles carry chemicals capable of disrupting bodily processes, it is essential to recognize that not all of these chemicals are absorbed. Studies have demonstrated that the amount of chemicals leaching from microplastics is minimal under typical conditions. Additionally, the body can excrete certain chemicals, reducing the risk of long-term accumulation.
Concerns also revolve around the potential introduction of other hazardous substances linked to microplastics. Moreover, they may disrupt immune functions or even cause cell damage and inflammation. However, comparative assessments regarding the risks of microplastics versus other pollutants—such as air quality or dietary excesses—remain uncertain.
While it’s natural to fear the health risks posed by microplastics, we need definitive evidence to gauge their danger accurately. This discussion taps into our anxiety surrounding pollution. Just because we don’t consume a credit card’s worth of plastic each week doesn’t mean that the issue isn’t serious. However, the field of microplastic research is still nascent, and comprehensive data on their effects in humans is lacking.
Until further research emerges, I’ll focus my concerns elsewhere.
This week, AI chatbot Claude experienced an outage. Users reported being unable to access services via the Anthropic website, with issues persisting for approximately a week. Similar outages have impacted various technology giants, government websites, and even hospitals. What is driving this surge in service disruptions?
The primary vulnerability of today’s internet lies in its heavy reliance on cloud computing. This shift has resulted in numerous services depending on just a few key providers like Amazon and Microsoft. During the early days of the internet, businesses operated on their own infrastructure—akin to a self-sufficient local store. When an issue arose in one area, others remained unaffected, but now, if a cloud provider faces difficulties, the repercussions resonate across multiple platforms.
Frequently, user-access issues stem from simple human errors. One notable incident underscoring these risks was the 2024 outage caused by cybersecurity firm CrowdStrike, which inadvertently released software configuration files that rendered millions of Windows computers inoperative—affecting airlines, banks, and emergency service centers globally.
Joseph Jarnecki from the Royal United Services Institute indicates that large-scale outages are typically not premeditated. Cybercriminals tend to focus on smaller targets rather than provoking major tech companies, preferring to extract ransom payments by preying on vital services.
Tim Stevens from King’s College London highlights that ransomware attacks are increasingly directed at local authorities and crucial infrastructure. Hackers tend to infiltrate essential services such as water supplies and municipal governments, where they can hold operations hostage for payment.
The UK has witnessed such incidents, including ransomware attacks on Hackney Council, Gloucester City Council, and Leicester City Council, along with similar challenges faced by the NHS and local water suppliers. Stevens notes an ongoing cat-and-mouse game between hackers and cybersecurity experts. Unfortunately, it appears hackers currently hold the upper hand. “In recent discussions, it’s been indicated that we’re losing ground. We’re not just behind; we’re actually losing,” Stevens confessed.
State-sponsored hackers from countries like Russia and China typically do not aim to disrupt cloud providers on a large scale. “While they do target these entities, their intentions are highly focused rather than destructive,” emphasizes Jarnecki.
According to Sarah Kreps from Cornell University, cyberattacks are increasingly used by nations operating within a “gray zone”: a fluctuating state of unease that is neither full-scale peace nor active warfare. This tension often manifests as calculated disruptions aimed at weakening adversaries.
Kreps explains, “This approach acts similarly to economic sanctions; much of our GDP and overall economic stability hinges on the internet. Disabling it critically impairs adversaries’ abilities to generate wealth, subsequently hindering their resource capabilities for warfare.”
Importantly, Kreps notes that Russia and China aren’t the sole practitioners of such tactics. Western nations, too, engage in cyber operations. Notably, intelligence agencies such as GCHQ and MI6 have previously compromised al-Qaeda computers, resulting in significant operational disruptions; these covert operations remain classified and occur behind the scenes.
Stevens mentioned, “It’s clear that Western intelligence and security agencies are conducting cyber operations against Russian assets. However, the legal frameworks often restrict the scope and intensity of these operations, which can be a source of frustration within the community.”
Claude has since resumed functioning, but Anthropic has yet to address inquiries from New Scientist regarding the recent outage effects.
Paleontologists from the University of Toronto Mississauga have uncovered numerous tooth impressions in the fossilized bones of three juvenile Diadectes, one of the earliest large herbivorous vertebrates to traverse land. This groundbreaking finding represents the earliest direct evidence of predator-prey interactions between terrestrial carnivores and herbivores.
Skeletal reconstruction of Diadectes sideropelicus, in side view, illustrating tooth marks and punctures on the left and right sides. Image credit: Young et al., doi: 10.1038/s41598-026-38183-6.
Paleontologists have long been aware of the existence of apex predators in the Permian landscape; however, clear physical evidence confirming their dependence on the early large herbivores has remained elusive.
In contrast to the Mesozoic Era, renowned for its dinosaur bite marks, the earlier fossil record reveals scant direct evidence of such predator-prey encounters.
“Our findings indicate that the predator-prey hierarchy emerged earlier than previously understood,” stated Professor Robert Reisz, a paleontologist at the University of Toronto Mississauga and senior author of the study.
“While these interactions are well-documented in the ‘age of reptiles,’ there has been limited information regarding them in the Paleozoic era, when terrestrial vertebrates first evolved into large apex predators and herbivores.”
In this study, Professor Reisz and colleagues analyzed the disarticulated skeletons of three juvenile Diadectes, dating back to the early Permian period.
The fossils were unearthed in the Mud Hill area of the Vale Formation located in Texas, USA.
The paleontologists documented five distinct types of bone damage: shallow notches, deeper holes, grooves along the shafts, conical punctures, and small holes.
Notably, many marks were concentrated around cartilage-rich joints, indicating predators had stripped away muscle and pried open connective tissues.
Some grooves ran parallel to the long axis of the bone, consistent with the motion of tearing flesh.
“The holes, pits, cuts, and wrinkles present on these three juvenile herbivores’ skeletons point to the presence of large predators in this area, such as Varanopus and Dimetrodon,” said lead study author Jordan M. Young, a researcher at the University of Toronto Mississauga.
“Scavengers and small arthropods also took part in this ‘Paleozoic feast.’”
Evidence of arthropod perforation was found where the cartilage of the bone ends would have been.
The study was published in Scientific Reports on February 26, 2026.
_____
JM Young et al. 2026. The earliest direct evidence of trophic interactions between terrestrial apex predators and large herbivores. Scientific Reports 16, 6977; doi: 10.1038/s41598-026-38183-6
Changes in hominid facial size and shape over time are not just significant for taxonomic and evolutionary relationships; they also indicate vital functional adaptations. Recently recovered, well-preserved Australopithecus skulls, especially the 3.67-million-year-old StW 573, commonly referred to as “Littlefoot,” discovered at Sterkfontein, South Africa, have greatly enriched the fossil record. Although StW 573 is nearly complete, post-depositional damage has resulted in some displacement and fragmentation of the facial skeleton. In a new study, paleoanthropologists set out to digitally reconstruct the face of StW 573.
Facial reconstruction of StW 573. Image credit: A. Beaudet.
The Littlefoot fossil was uncovered in 1994 in a cave in Sterkfontein, South Africa.
This specimen, also known as StW 573, got its name from the four small foot bones discovered in a box full of animal fossils, a find that ultimately led to the skeleton’s recovery.
In the 2010s, paleoanthropologist Ronald Clarke suggested that Littlefoot might belong to Australopithecus prometheus, while others argued for Australopithecus africanus, a hominid species found at the same site, or even a distinct species within the Australopithecus genus.
Although many aspects of StW 573’s skeleton have been extensively studied, the face has been distorted due to millions of years of geological processes, making physical reconstruction methods ineffective.
In a recent investigation, Dr. Amélie Beaudet of the University of Poitiers and the University of the Witwatersrand, along with her team, digitally reconstructed the facial bones, producing one of the most complete Australopithecus faces to date.
The researchers evaluated nine linear facial measurements and applied 3D geometric morphometrics to compare Littlefoot with various extant great apes and three other Australopithecus fossils.
Findings indicated that Littlefoot’s overall facial size, eye socket shape, and general facial structure bore more resemblance to East African fossils than to younger South African specimens, a counterintuitive result, especially given how few complete facial fossils exist for comparison.
“Given Littlefoot’s geographical origins, this pattern is unexpected and implies a more dynamic evolutionary history than previously believed,” remarked Dr. Beaudet.
“For instance, Littlefoot may represent a lineage closely linked to East African populations, whereas later South African hominins developed more distinct facial features through regional evolutionary mechanisms.”
The study also uncovered evidence of selective pressures acting on the orbital region (around the eyes), potentially related to shifts in visual capabilities and ecological behaviors.
“Although our study is limited to a single anatomical region and a small number of comparative fossil specimens, it enriches our understanding of the links between Australopithecus populations across Africa, indicating that the orbital region may have been under evolutionary pressure during that time,” said Dr. Beaudet.
“Human facial evolution suggests that our faces have become less prominent and more adaptable over time, but the timeline and inherent evolutionary mechanisms remain elusive.”
Professor Dominic Stratford from the University of the Witwatersrand and Stony Brook University commented, “This study challenges the idea that early human evolution took place in isolated regions. Instead, it supports the concept of Africa as a unified evolutionary landscape, where populations adapted to ecological pressures while remaining interconnected through common ancestry.”
“The face is crucial for primates’ interactions with their environment, serving essential functions in digestion, vision, respiration, smell, and nonverbal communication.”
“In this light, the face is an essential anatomical area for understanding how humans have adjusted and interacted with their surroundings.”
“With only a handful of Australopithecus fossils preserving nearly complete facial structures, Littlefoot offers a rare and invaluable reference point,” asserted Dr. Baudet.
“The anatomical regions of Littlefoot’s face associated with vision, respiration, and feeding will provide further vital insights into our evolutionary history.”
The study results were published in this month’s issue of Comptes rendus palevol.
_____
Baudet, A. et al. 2026. Virtual reconstruction and comparative study of the face of StW 573 (“Little Foot”). Comptes rendus palevol 25(3): 43-56; doi: 10.5852/cr-palevol2026v25a3
Newly Discovered Tiny Fossil: Purgatorius
This shrew-sized mammal is recognized as the oldest known ancestor of all primates, including humans. Initially believed to be confined to northern North America, its range is now known to extend hundreds of kilometers farther south. A paper published this week in the Journal of Vertebrate Paleontology challenges conventional theories about the biogeography of early primates and suggests that their diversification occurred rapidly following the end-Cretaceous mass extinction.
Shortly after the end-Cretaceous mass extinction, the earliest known primates, such as Purgatorius mckeeveri, adapted quickly, specializing in an omnivorous diet that included tree fruits, and lived alongside archaic ungulate mammals. Image credit: Andrei Atutin.
The origins and early biogeographical history of primates are a fascinating yet contentious subject. The oldest known primates, species of Purgatorius, were small tree-dwelling mammals that first emerged in North America around 65.9 million years ago.
Previously, Purgatorius fossils were only found in northern regions such as Montana and Saskatchewan, creating an incomplete understanding of their evolutionary history.
Paleontologist Stephen Chester from the City University of New York and his colleagues describe the southernmost known fossils of Purgatorius in their new paper.
The specimens were meticulously recovered from ancient sediments in the Coral Bluffs area of the Denver Basin in Colorado.
“This discovery fills a critical gap in our understanding of the geographic distribution and evolution of our earliest primate ancestors after the dinosaur extinction,” Dr. Chester stated.
The fossils analyzed by the team consist of small teeth that display a distinctive combination of features, indicating they may belong to an earlier, previously unidentified species of Purgatorius.
“The presence of these fossils in Colorado reveals that ancient primates likely originated in the north before expanding southward, rapidly diversifying post-end-Cretaceous mass extinction,” Chester explained.
While scientists previously believed Purgatorius was absent from southern regions during this period, new findings suggest that this assumption was primarily due to limited fossil sampling.
“Our results demonstrate that small fossils can easily be overlooked,” Dr. Chester remarked.
“More intensive searches, especially those utilizing screen-washing techniques, will likely uncover numerous significant specimens.”
The study further questions long-held assumptions about the habitats of early primates.
“The ankle bone of Purgatorius suggested tree-dwelling characteristics, and we initially suspected its absence south of Montana was due to extensive forest destruction following the asteroid impact 66 million years ago,” Chester noted.
“Yet, our paleobotanical colleagues indicate that plant recovery in North America was rapid, leading us to believe that Purgatorius likely existed further south—we just haven’t looked hard enough.”
_____
Stephen GB Chester et al. “Southernmost Origin of Purgatorius: Insights into the Biogeographic History and Diversification of the Oldest Primates.” Journal of Vertebrate Paleontology, published online March 2, 2026. doi: 10.1080/02724634.2026.2614024
The gut and oral microbiomes play a crucial role in determining the severity of reactions in individuals with peanut allergies. This may clarify why reactions can vary greatly in intensity among allergic individuals.
According to Rodrigo Jiménez-Saiz from the Autonomous University of Madrid, “The central question is why some individuals experience more severe allergic reactions than others.”
A peanut allergy arises when the immune system incorrectly identifies proteins from peanuts as harmful, leading to an excessive production of specific antibodies. This immune response can result in symptoms like itching, swelling, and nausea, or in severe cases, anaphylaxis—a life-threatening condition characterized by breathing difficulties.
Since various microbiomes significantly influence our immune systems, Jiménez-Saiz and his team hypothesized that body microorganisms could affect allergy severity.
To test this, they administered peanuts to three groups of non-allergic mice: germ-free mice without a microbiome, mice with a minimally diverse microbiome, and mice with a rich, healthy microbiome.
After 40 minutes, researchers discovered that two proteins, Ara h 1 and Ara h 2, crucial for peanut allergies, were present at elevated levels in the germ-free and minimally diverse microbiome mice compared to those with a diverse microbiome.
Additionally, the mice with a diverse microbiome harbored abundant levels of a beneficial bacterium called Lotia, especially the Lotia R3 strain, which aids in digesting peanuts in the intestines.
To explore whether Lotia R3 could mitigate anaphylaxis risk, the researchers induced severe peanut allergies in another group of mice with minimal microbiome diversity.
They then introduced Lotia R3 and injected peanut paste into all subjects’ intestines. After 40 minutes, while all mice experienced anaphylaxis, those treated with Lotia R3 had an average body temperature drop of just 2%, compared to 3.5% in untreated mice—a notable difference, given that severe drops in temperature can lead to hypothermia and organ failure.
Moreover, levels of MMCP-1, an immune molecule that surges during anaphylaxis, were significantly lower in the blood of mice treated with Lotia R3. According to Mohamed Shamji from Imperial College London, “The findings are compelling. If similar immune responses occur in humans, we could anticipate a decrease in anaphylactic severity.”
In a complementary study involving 19 individuals with peanut allergies, researchers noted that those with higher peanut tolerance exhibited significantly higher levels of Lotia and considerably fewer bacteria in their saliva than those suffering from severe allergies. This indicates that the presence of these bacteria—both in the gut and oral cavity—may impact an individual’s anaphylaxis risk.
Lotia probiotics hold promise for reducing the severity of anaphylaxis during peanut allergies, according to Shamji. “There’s a significant need for such interventions,” he remarks, especially considering they could alleviate fears of accidental peanut exposure and minimize side effects during oral immunotherapy, which involves gradually introducing allergens to desensitize patients.
The research team aspires to eventually conduct a clinical trial, administering either Lotia probiotics or a placebo to participants with peanut allergies prior to their exposure to low doses of peanuts, as explained by Jiménez-Saiz.
The X and Y chromosomes can compete to skew the sex ratio in their favor.
Katerina Conn/Science Photo Library
Have you ever noticed a family with mostly boys or girls? While it often seems coincidental, a study of Utah families whose records date back to the 1700s suggests a biological reason: the “selfish” Y chromosome may reduce female births.
“This is a significant family,” says James Baldwin-Brown from the University of Utah. “Selfish genes have been identified in many organisms, but studying their effects in humans has been challenging.”
In mammals, male cells contain one X and one Y chromosome. As sperm develop, half carry X and half carry Y, leading to a balanced male-female offspring ratio. However, certain genetic variations can disturb this balance, producing more males or females. Some selfish chromosomes may even interfere with sperm navigation, while others eliminate non-carrier sperm, though the mechanisms are not yet clear. “This question has persisted for a century, and we still seek answers,” explains Nitin Phadnis, also from the University of Utah.
In various species, selfish X and Y chromosomes compete, attempting to tilt the sex ratio to their advantage. Evidence suggests that humans may harbor similar selfish genes, yet identifying active ones proves difficult. “It’s statistically plausible to have five or six boys consecutively purely by chance,” Baldwin-Brown notes.
To demonstrate that the observed sex ratio bias isn’t just chance, researchers must investigate multiple generations. Utilizing the Utah Population Database, which contains data on millions, this study focused on 76,000 individuals.
By applying two distinct statistical analyses, the researchers identified specific families as significant outliers. Over seven generations, 33 men passed down an identical Y chromosome. Of their 89 offspring, 60 were male, while only 29 were female.
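To get a feel for how unlikely that split is under a 50:50 birth ratio, an exact binomial tail probability can be computed with the standard library. This is an illustrative back-of-the-envelope check, not the two statistical analyses the researchers actually applied:

```python
from math import comb

n, sons = 89, 60  # offspring of the 33 men sharing the Y chromosome

# Exact one-sided binomial tail: the chance of at least 60 sons out of
# 89 births if each birth were an independent 50:50 coin flip.
p_tail = sum(comb(n, k) for k in range(sons, n + 1)) / 2**n
print(f"P(>= {sons} sons out of {n} births) = {p_tail:.5f}")
```

The tail probability comes out well under 1 in 1,000, consistent with these families being flagged as significant outliers rather than a run of luck.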
The data, having been anonymized, restricts direct genetic testing. “It would be enlightening to de-anonymize these samples and request consent for sperm analysis,” Baldwin-Brown states. “However, the ethical considerations involve extensive paperwork and resources.”
Sarah Zanders from the Stowers Institute for Medical Research in Missouri hopes the team has truly identified a selfish Y chromosome, though she cautions that the sample size remains small. In her own research on microbes, her team has observed unexpected sex ratio biases that disappeared once sample sizes grew.
Infidelity also complicates findings, Zanders suggests. “While I lack expertise in human behavior, television has taught me that father misattribution could be more common than assumed.” Baldwin-Brown reassures, “We have substantial, reliable data.”
Understanding the selfish Y chromosome has broader implications beyond mere academic interest. According to Phadnis, these chromosomes might contribute to rising male infertility rates: a mechanism that eradicates half of all sperm could logically lead to reduced fertility, and studies have shown that selfish chromosomes cause infertility in other species.
The research team intends to analyze sperm samples for discrepancies in X and Y sperm proportions.
In this recent study, they focused specifically on the selfish Y chromosome for several reasons: tracing male lineage is simpler, and an increased female proportion may also stem from lethal mutations—not just a selfish X chromosome.
Notably, selfishness isn’t confined to X and Y chromosomes. Genes that enhance inheritance chances above 50% are referred to as gene drives, with various types identified in the animal kingdom. CRISPR technology allows for the creation of artificial gene drives, potentially aiding in malaria control and pest management.
Discover the QuEra Quantum Computer Based on Ultracold Atoms
Credit: Cuella
An innovative algorithm called phantom code has the potential to enable quantum computers to execute complex programs error-free, addressing a critical barrier to the broader adoption of quantum technology.
Initially, many physicists were skeptical about the viability of quantum computers due to their susceptibility to errors that are challenging to rectify. Various types of quantum computers are already operational and have shown promise in facilitating scientific research and exploration. Nevertheless, the industry is still grappling with the challenge of minimizing computational mistakes.
Traditional error correction techniques permit quantum computers to store information accurately, but their computational demands can be substantial. According to Shayan Majidi of Harvard University, this creates inefficiencies.
To tackle this issue, Majidi and his research team concentrated on complex calculations that require numerous steps, often resulting in prolonged execution times and heightened error risks.
Quantum computers utilize basic units known as qubits. These computations frequently involve logical qubits: clusters of qubits cooperating to lower error rates. In order to avoid computational inaccuracies, devices manipulate these logical qubits. For instance, physical qubits are usually subjected to lasers or microwaves to connect multiple logical qubits or alter their quantum states.
The phantom code innovation allows the entanglement of multiple logical qubits without necessitating any physical manipulations, hence its moniker “phantom.” This efficiency translates to fewer actions required for calculations, thereby diminishing the likelihood of errors.
In their experiments, Majidi and his colleagues ran computer simulations to evaluate the phantom code on two distinct tasks: preparing specialized qubit states that are essential for computations, and simulating simplified models of quantum materials. Their findings indicated that this method yielded results that were up to 100 times more accurate than conventional error correction methods by minimizing the need for physical operations.
While phantom codes may not be applicable to every quantum computing task, according to Majidi, they are particularly useful in scenarios that demand extensive entanglement. This method doesn’t generate new entanglements; instead, it optimally utilizes existing ones. As Majidi puts it, “It’s not a free lunch; it’s just a lunch that was already there, and we weren’t consuming it.”
Mark Howard, a researcher at the University of Galway in Ireland, likens the selection of error-correcting codes for quantum computing to choosing protective armor. While plate armor may provide superior protection at the expense of weight and versatility, phantom code offers flexibility but requires more qubits compared to traditional strategies, making it a partial solution to quantum error challenges.
Dominic Williamson and his team at the University of Sydney in Australia point out that the competitive viability of phantom codes versus other error correction methods remains uncertain and may hinge on future advancements in quantum hardware.
Majidi’s team is collaborating closely with colleagues developing quantum computers based on extremely cold atoms. He envisions that insights gained from phantom code, along with an understanding of qubit capabilities, will pave the way for new strategies tailored specifically to both tasks and hardware implementations in quantum computing.
Understanding the Link Between Air Pollution and Dementia
Air pollution is commonly linked to respiratory illnesses, but recent studies suggest a troubling connection to another serious health concern: dementia.
A recent study published in JAMA Neurology indicates that increased exposure to fine particulate matter may exacerbate neurological changes associated with conditions like Alzheimer’s disease.
The researchers stress that further investigation is essential, yet evidence of this correlation is compelling.
A meta-analysis published in July 2025 in The Lancet Planetary Health reviewed data from over 29 million individuals across multiple countries, collected from the late 1980s and early 1990s onward. The findings highlighted the detrimental effects of fine particulate matter (PM2.5), nitrogen dioxide (NO2), and soot on cognitive health.
The study concluded that “the diagnosis of dementia is significantly linked to long-term exposure to fine particulate matter pollution.”
This growing body of evidence builds on earlier publications. For instance, a landmark 2017 study in The Lancet established a connection between living near major roads and elevated dementia rates.
But what specific problems does air pollution cause, and how can we address them?
Most air pollution originates from burning fossil fuels, alongside natural sources like sandstorms. – Photo credit: Getty Images
The Role of Particulate Matter in Health
Air pollution manifests in various forms, with particulate matter (PM) being a prominent type. This term encompasses microscopic particles suspended in the air, including dust, smoke, and liquid droplets that are often invisible to the naked eye.
Particulate matter is categorized by size, ranging from ultrafine particles (PM0.1) to coarse particles (PM10).
Notably, PM2.5 is exceptionally small, measuring less than 1/30th the width of a human hair. Its minute size allows it to remain airborne for extended periods, making it easily inhalable.
According to Dr. Holly Elser, an epidemiologist and co-author of the recent JAMA Neurology study, “[PM2.5 pollution] is linked to numerous health outcomes.” These outcomes range from asthma and lung cancer to heart disease and, increasingly, dementia.
The complexities surrounding PM2.5 arise from its myriad sources. “While traffic is a significant contributor, it is not the sole source,” says Dr. Hanen Kreis from the University of Cambridge, who studies urban mobility’s health impacts.
Additional sources of PM2.5 include power plants, factories, construction sites, wildfires, and biomass burning, as well as natural occurrences like sandstorms.
The toxicity of PM2.5 particles varies depending on their origin. Understanding their chemical composition is vital for addressing their health impacts.
Researchers have identified two principal pathways for PM2.5 to infiltrate the central nervous system: “through the olfactory nerve (via the nose) or through the bloodstream by crossing the blood-brain barrier.”
How PM2.5 Affects Brain Health
Due to PM2.5’s diminutive size, it can penetrate deep into the lungs, facilitating its entry into the bloodstream and ultimately reaching the brain. There, it can induce inflammation and oxidative stress, resulting in neuronal and vascular damage over time, according to Dr. Kreis.
Other hypotheses exist regarding pollution’s influence on cognition. For instance, pollutants may travel through the olfactory pathway to the hippocampus, the brain’s memory center, leading to the accumulation of harmful amyloid and tau proteins associated with Alzheimer’s disease.
Research has also indicated that PM2.5 can restrict cerebral blood flow, cause microvascular damage, and heighten the risk of vascular dementia.
Color MRI scan of the brain of a 68-year-old Alzheimer’s patient – Photo credit: Science Photo Library
Air pollution levels are notably higher near busy roads, but research shows that its concentration diminishes significantly with distance from traffic.
A 2017 study published in The Lancet analyzed data from over 6 million residents in Ontario, revealing that individuals living within 50 meters (165 feet) of a major road face a 7 to 12% increased risk of dementia compared to those residing over 200 meters (approximately 650 feet) away.
Moreover, the overall burden of PM2.5 is directly associated with dementia risk. Dr. Kreis notes that each 10 micrograms per cubic meter (μg/m3) increase in PM2.5 correlates with a 17% increase in dementia risk.
For perspective, the average PM2.5 level around central London’s roads in 2023 was 10 μg/m3.
For nitrogen dioxide (NO2), another pollutant primarily released from fossil fuel combustion, every 10μg/m3 increases the relative risk of dementia by 3%. In 2023, the average roadside NO2 level in central London was 33 μg/m3.
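Those per-increment figures can be turned into a rough risk multiplier for a given exposure level. The sketch below assumes the relative risk compounds log-linearly per 10 μg/m3, which is a simplifying assumption for illustration rather than the exact models used in the studies:

```python
def relative_risk(concentration, rr_per_10ug):
    """Risk multiplier at a given pollutant concentration (ug/m3),
    assuming the reported relative risk compounds per 10 ug/m3."""
    return rr_per_10ug ** (concentration / 10)

# Reported increments: 1.17 per 10 ug/m3 for PM2.5, 1.03 per 10 ug/m3 for NO2.
print(round(relative_risk(10, 1.17), 2))  # central London roadside PM2.5, 2023
print(round(relative_risk(33, 1.03), 2))  # central London roadside NO2, 2023
```

At those 2023 roadside levels, the multipliers work out to roughly 1.17 for PM2.5 and 1.10 for NO2, relative to someone with no exposure to either pollutant.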
Ultimately, fossil fuel combustion represents the largest contributor to air pollution, particularly PM2.5.
Mitigating Exposure to Air Pollution
If you reside or work near a busy road, it may be challenging to significantly lower your air pollution exposure. Yet, given that many individuals live in metropolitan areas, addressing this issue must be a priority. Dr. Kreis advocates for “targeted policy measures and a shift from fossil fuels to clean energy” as essential solutions.
Nevertheless, it’s beneficial to be informed about air quality variations (which often worsen on warm afternoons but improve following rain).
On days when the air quality index exceeds 100, a level classed as unhealthy for sensitive groups, minimizing outdoor activities is advisable. If going outside is unavoidable, wearing a fit-tested N95 or KN95 mask can help protect against PM2.5 exposure.
For those indoors on poor air quality days, utilizing an air purifier or fan can enhance indoor conditions. Good-quality models can be obtained for around £100, making them a cost-effective solution.
Additionally, when navigating urban environments, consider opting for less trafficked routes with more greenery, as Dr. Kreis does when biking. Fewer vehicular emissions mean lower pollution levels, and vegetation can significantly absorb air pollutants; research suggests that substantial plant coverage can reduce pollution concentrations by as much as 50%.
PM2.5 concentrations are notably elevated on the London and New York subway systems. Some research indicates that levels in certain London Underground stations can be up to 18 times greater than street level, prompting medical professionals to recommend masks in these environments.
During traffic jams, close your car windows and turn off your engine to minimize exposure. At home, ensure proper ventilation while cooking.
Awareness is a crucial first step. As Dr. Elser emphasizes, it’s important to acknowledge that while air pollution is a risk factor for dementia, it is just one of many.
When we think about infamous fictional psychopaths, like the chillingly calculating Patrick Bateman from American Psycho, they often embody the image of a scammer. But what about real-life psychopaths?
Research indicates that psychopaths are more inclined to lie to achieve their goals, exhibiting remarkable fearlessness, almost as if they have ice in their veins.
You might assume that their cold demeanor makes it hard to detect their deceit. Surprisingly, studies suggest that psychopaths are not significantly better at lying than others.
For instance, a study from the 1980s revealed that convicted psychopaths were caught by lie detector tests just as readily as non-psychopaths. However, it’s important to note that while lie detector tests are commonly employed, they are notoriously unreliable.
In a more recent 2016 study, researchers found that criminals tend to lie frequently. Notably, psychopaths often exhibit a heightened tendency to lie during psychological tasks. Yet, they still encounter cognitive costs from lying, such as making more errors and responding more slowly.
Though psychopaths lack the moral and emotional barriers that typically hinder lying for most people, they cannot escape the psychological challenges associated with creating believable lies.
Interestingly, while psychopaths may not have a natural talent for lying, there is emerging evidence that they can learn to become more effective liars.
A 2017 study discovered that students with high psychopathic traits improved markedly during training tasks that required them to lie convincingly. With practice, they could lie faster than others, and the neural activity associated with deception diminished, indicating that the mental strain of lying had decreased.
In summary, psychopaths may not excel at lying initially, but they have a propensity to lie more frequently and improve at it more swiftly than others.
This article addresses the question posed by Lyle Morse via email: “Are psychopaths really good at lying?”
To submit your own questions, please email questions@sciencefocus.com or reach out via social media: Facebook, Twitter, or Instagram. (Don’t forget to include your name and location.)
For more fascinating scientific insights, visit our Ultimate Fun Facts page.
The Shahed-136 Drone: Iranian Innovation and U.S. Replication
Pictorial Press/Alamy
The Shahed-136 drone, a cost-effective attack drone developed by Iran’s Shahed Aviation Industries, is now being deployed against advanced U.S. military technologies. Despite the U.S. military’s reliance on high-tech weaponry, why is it now replicating this drone powered by a motorcycle engine?
Measuring 2.6 meters in length, the Shahed-136 can carry a payload of 15 kilograms over approximately 2,500 kilometers. It achieves a speed of around 185 km/h, which is significantly slower than conventional cruise missiles or bomb-laden aircraft, yet its price point is notably low—approximately $50,000 each.
Currently, hundreds of Shahed drones are utilized by Russia in its offensive operations against Ukraine. Countering them necessitates a comprehensive air defense strategy, incorporating fighter jets, ground-based missiles, and interceptor drones. These drones are also employed by various groups, including Houthi forces in Yemen.
In recent conflicts, Iran has deployed Shahed drones as part of its military response against U.S. and Israeli forces. In an interesting twist, the U.S. military introduced the Low-Cost Unmanned Combat Attack System (LUCAS), a product made by Arizona-based SpektreWorks. This system is a reverse-engineered version of the Shahed-136, illustrating how Iran’s own design is now being weaponized against it.
LUCAS is versatile and can be fitted with reconnaissance gear or warheads for ground strikes. Its airframe, the FLM 136, is reportedly named in homage to the Shahed-136 that inspired it.
The U.S. military’s reverse-engineering of the Shahed-136 followed the recovery of examples used by Iranian-backed militias in Iraq and Syria. A test launch from a U.S. Navy ship was successfully carried out last year.
Professor Anthony King from the University of Exeter posits that inexpensive attack drones like the Shahed serve as a modern “doodlebug,” reminiscent of Nazi Germany’s V-1 flying bombs used during World War II.
These economical devices can be mass-produced and deployed in large quantities, overwhelming enemy defenses until they crumble, or diverting significant assets and rendering prolonged combat infeasible. This strategic approach leaves opponents vulnerable to subsequent offensives.
“We are intercepting them with weapons that are significantly more expensive than Shahed, and often the targets of these attacks are cheaper than the defenses we employ,” stated King. “This dynamic transforms the economic landscape of warfare in fascinating ways.”
Interestingly, it has been suggested that Iran may have drawn inspiration from a Cold War-era project in which Germany and the United States collaborated on a device aimed at neutralizing Soviet radar systems: the Dornier DAR (Drohne Anti-Radar).
Ian Muirhead, a professor at Manchester University and a former military member, suggests that while Shahed drones will not replace advanced drones or manned aircraft, they are increasingly making an appearance in warfare. Western military forces are now recognizing the effectiveness of such weaponry, inspired by lessons from the Ukraine conflict.
“Complex and costly modern weaponry can prove inefficient in extensive conflicts, particularly when resources are stretched thin,” Muirhead remarked. “Deploying thousands of inexpensive drones can swiftly overwhelm defenses through sheer weight of numbers.”
“It’s purely an economic discussion. If defense costs exceed attack costs, the balance of power shifts,” Muirhead concluded.
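King’s and Muirhead’s point about cost asymmetry can be made concrete with a toy calculation. Only the roughly $50,000 per-drone price comes from the article; the interceptor cost and intercept rate below are illustrative assumptions, not reported figures:

```python
drone_cost = 50_000            # approximate per-unit cost reported for the Shahed-136
interceptor_cost = 1_000_000   # assumed cost of one defensive missile (illustrative)
drones_launched = 100
intercept_rate = 0.9           # assumed fraction shot down, one missile per drone

attacker_spend = drones_launched * drone_cost
defender_spend = round(drones_launched * intercept_rate) * interceptor_cost
ratio = defender_spend / attacker_spend
print(f"Attacker: ${attacker_spend:,}  Defender: ${defender_spend:,}  Ratio: {ratio:.0f}x")
```

Under these assumptions the defender spends 18 times what the attacker does, and ten drones still get through, which is precisely the shift in the economics of warfare that both researchers describe.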