Forest Crisis Sparks Europe to Reassess Net Zero Goals

Extreme weather and bark beetles have devastated many trees in the Harz Mountains, Germany

Rob Cousins/Alamy

An abrupt drop in carbon absorption by European forests has alarmed scientists, who fear that a sustained decline could hinder efforts to combat global warming.

For many years, European forests, which span around 40% of the continent’s land area, have played a dual role as sources of timber and as carbon sinks. However, increasing extreme weather events are pushing these forests beyond their limits, swiftly altering the landscape.

“Many [European Union] countries will struggle to meet their [land-use climate] targets due to this sink reduction,” says Glen Peters at the CICERO Center for International Climate Research in Norway.

Earlier this year, Finnish officials revealed that their forest ecosystem had shifted from functioning as a net carbon sink to becoming a net carbon source. This follows Germany’s announcement that, for the first time in the country’s history, its forests recorded a net release of carbon. The Czech Republic has likewise reported its forests as net carbon sources since 2018.

While these instances are particularly severe, carbon absorption rates are dwindling rapidly in many other nations. For instance, in France, the carbon uptake by forests has nearly halved in just 14 years, with a study released last month documenting a decrease from a peak of 74.1 million tonnes of carbon dioxide absorbed annually in 2008 to 37.8 million tonnes in 2022. Concurrently, Norway’s carbon absorption has plummeted from 32 million tonnes in 2010 to 18 million tonnes in 2022.

“The trend had remained relatively stable from 2013 to 2015,” comments Korosuo at the European Commission’s Joint Research Centre in Belgium. “This is a widespread issue, not confined to just one or two countries. Similar patterns are observable across nearly all forested nations.”

Many forests in Europe are privately owned and commercially managed. Some of the decrease in carbon sinks has been linked to increased logging, particularly following the sanctions on Russian timber imports due to the invasion of Ukraine in 2022. For example, Finland has seen strong demand for wood, leading to heightened harvesting levels, notes Raisa from the Natural Resources Institute of Finland.

However, scientists also attribute the rapid decline in carbon storage to the escalating impacts of climate change.

Europe has faced several droughts in recent years, with 2018 and 2022 marking the harshest conditions. Wouter Peters at Wageningen University in the Netherlands highlights that his research indicates the 2022 drought caused a significant reduction in carbon intake by European forests during summer months. “We’re observing immediate effects; the trees are under stress,” he comments.

Researchers had expected that as global temperatures rise, European forests would diminish in health, yet the extent of the recent decline is still astonishing. Wouter Peters explains, “The impact seems to be more severe than anticipated.”

This downturn could be a result of successive droughts occurring within a few years, exacerbated by other extreme weather events such as storms that disturb forests. “We see not just one drought in 2018, but additional ones in 2021 and 2022,” Wouter Peters notes. “Our models have not effectively accounted for this concentration of drought events over such a short time frame.”

Moreover, rising temperatures are leading to more frequent and widespread infestations of bark beetles across Europe, which are severely damaging spruce forests. The Czech Republic, in particular, has faced seven major bark beetle outbreaks from 2018 to 2021.

A declining carbon sink poses a threat to the EU’s climate objectives, which depend on forests to absorb the bulk of emissions generated by other sectors. The EU is even aiming to enhance this carbon sink to support its climate ambitions, targeting a removal of 310 million tonnes of CO2 equivalents annually by 2030, a significant increase from the approximately 230 million tonnes currently removed.

However, a recent analysis published in April warns that European carbon sinks are projected to fall around 29% short of the 2030 target, with researchers cautioning that the capacity of European forests to absorb carbon will “gradually deteriorate.”

Preventative measures can help mitigate this decline, such as reducing harvesting rates and prohibiting clear-cutting in plantations, which can maintain carbon stocks. Additionally, increasing species diversity and retaining some deadwood can enhance forest health and resilience against pests and droughts.

Nonetheless, Wouter Peters argues that policymakers are overestimating the carbon absorption potential of forests in warmer climates. “There has likely been an over-reliance on forests, particularly in the context of greenhouse gas emissions,” he contends. He emphasizes that other sectors must rapidly reduce emissions to meet European climate goals. “This implies that we need intensified efforts in other areas.”

Carbon dioxide levels in the atmosphere are rising at unprecedented rates, despite an overall stagnation in greenhouse gas emissions. Scientists attribute this acceleration to slower carbon absorption rates in forests, wetlands, and peatlands globally, compounded by deforestation and increased emissions from wildfires and droughts that weaken global land sinks.

This issue is most pronounced in mid-latitude regions. Alongside Europe, significant declines in carbon sink capacity have also been recorded in boreal forests of Alaska and Canada. Tropical forests are facing challenges from both deforestation and diminished carbon storage capacity, primarily due to wildfires.

This poses a serious challenge to global efforts to achieve net-zero emissions. “In a broad global context, the entire concept of net zero hinges on the functionality of forests and oceans. If these systems cease to effectively sequester carbon, it will lead to increased atmospheric carbon levels and accelerated global warming.”


Source: www.newscientist.com

Experts Warn: Hurricanes Are Intensifying – Time for a New Category

As the Atlantic hurricane season kicks off, millions are anxiously monitoring forecasts and looking for telltale signs of impending storms.

This year promises to be particularly severe. Ocean temperatures remain exceptionally high, and conditions in the Pacific are set to amplify Atlantic storm activity.

However, beyond the immediate forecasts, a more profound and surprising phenomenon is unfolding with tropical cyclones globally.

With rising global temperatures driven by human actions, climate change is reshaping our understanding of storms that batter coastlines. These storms are becoming wetter, more intense, and sometimes extraordinarily powerful. The current classification system for these storms is quickly becoming obsolete.

Indeed, it has been noted that Category 5 hurricanes (the most intense classification on the Saffir-Simpson scale) may no longer represent the upper limit. Future storms could necessitate an entirely new category.

“This is a discussion that has occurred several times, and I believe it is a valid argument,” Dr Tom Matthews, a senior lecturer in environmental geography at King’s College London, told BBC Science Focus.

“We’ve gone beyond Category 5 on the Saffir-Simpson scale, so using the term Category 5 is misleading, and we do need a new category.”

How are hurricanes classified?

Hurricanes are currently classified using the Saffir-Simpson scale, which is based on sustained wind speeds.

  • Category 1 – 74-95 mph (119-153 km/h). Very dangerous winds produce some damage.
  • Category 2 – 96-110 mph (154-177 km/h). Extremely dangerous winds cause extensive damage.
  • Category 3 – 111-129 mph (178-208 km/h). Devastating damage occurs.
  • Category 4 – 130-156 mph (209-251 km/h). Catastrophic damage occurs.
  • Category 5 – 157 mph or higher (252 km/h or higher). Catastrophic damage occurs.
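For readers who want the thresholds in executable form, here is a minimal sketch of the scale as a classification function; the function name and threshold table are illustrative, not from any official library.

```python
def saffir_simpson_category(wind_mph: float) -> int:
    """Return the Saffir-Simpson category (1-5) for a sustained wind
    speed in mph, or 0 if the storm is below hurricane strength."""
    # (lower bound in mph, category), checked from strongest to weakest
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for lower_bound, category in thresholds:
        if wind_mph >= lower_bound:
            return category
    return 0  # below hurricane strength (tropical storm or weaker)

# The storms mentioned in this article all max out the scale:
print(saffir_simpson_category(215))  # Hurricane Patricia (2015) -> 5
print(saffir_simpson_category(185))  # Hurricane Dorian (2019) -> 5
print(saffir_simpson_category(195))  # Typhoon Haiyan (2013) -> 5
```

Because the scale is open-ended at the top, winds of 157 mph and 215 mph land in the same bin, which is exactly the range problem the article describes.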

However, climate change is pushing storms far beyond these established limits. Hurricane Patricia recorded wind speeds of 215 mph in 2015. Hurricane Dorian in 2019 hovered over the Bahamas with wind speeds of 185 mph.

Additionally, Typhoon Haiyan, highlighted by Matthews as a prime example of these next-generation storms, struck the Philippines in 2013 with sustained winds of 195 mph (314 km/h), with gusts reaching up to 220 mph (354 km/h).

These storms are unlike any we have experienced before.

Devastation following Typhoon Haiyan in the Philippines. – Getty

How is climate change impacting hurricanes?

One might expect that as the planet warms, the number of hurricanes will increase. However, the situation is more nuanced.

“The upper atmosphere warms faster than the lower atmosphere, creating stability that resists the vertical movements essential for hurricane formation,” explains Matthews.

Hurricanes depend on rising air, but a heated atmosphere can suppress this necessary upward movement, making it more difficult to initiate a hurricane.

“It’s akin to trying to lift a hot air balloon when the surrounding atmosphere is warmer than the burner inside the balloon,” Matthews elaborates.

“Another apt analogy is that the atmospheric lid above convection—the vertical movement needed to kickstart a hurricane—is becoming stronger, impeding hurricane development.”

This translates to reduced chances of hurricane formation. Nonetheless, when they do occur, they tend to exhibit explosive intensity.

Matthews provides another perspective: “A hurricane serves as a mechanism for redistributing heat from the ocean to the atmosphere. More heat is needed to initiate a hurricane.”

“This could mean they are less frequent, but when they do occur, they pack a significant punch.”

Moreover, rising sea levels mean that even storms of similar intensity can push further inland, causing greater damage. “Unfortunately, this is an unavoidable reality,” Matthews concludes.

Why is a new category necessary?

The classification of tropical cyclones is not merely an organizational tool; it is crucial for understanding the evolving nature of storms. With storm intensity rising, the current five-level classification may be insufficient for effective assessment.

Even within Category 5, there exists a vast range that can mislead and obstruct preparedness efforts.

“What may seem like a minor change, especially in wind speeds, can correspond to significant differences in damage.”

This dynamic is amplified because the force of wind impacting a structure scales with the square of the wind speed, while the power the wind delivers scales with the cube. In simple terms, what may seem like a minor increase in speed can lead to catastrophic consequences on the ground.
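A quick sketch of that scaling, assuming the standard aerodynamic relations (force proportional to v squared, power to v cubed); the speeds chosen are drawn from the storms mentioned above.

```python
def relative_force(v1: float, v2: float) -> float:
    """Ratio of wind force at speed v2 vs v1 (force ~ v**2)."""
    return (v2 / v1) ** 2

def relative_power(v1: float, v2: float) -> float:
    """Ratio of wind power at speed v2 vs v1 (power ~ v**3)."""
    return (v2 / v1) ** 3

# Going from the Category 5 threshold (157 mph) to Haiyan-like
# sustained winds (195 mph), a 24% rise in speed:
print(round(relative_force(157, 195), 2))  # -> 1.54 (54% more force)
print(round(relative_power(157, 195), 2))  # -> 1.92 (92% more power)
```

So a jump that looks modest on the wind-speed axis nearly doubles the destructive power delivered to a structure.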

“What may appear to be a slight change can cause substantial damage. This is especially problematic when structures are designed to withstand specific wind speeds and those limits are exceeded.”

This is a serious warning. With ongoing climate change, the strongest storms are intensifying, and our longstanding classification system may no longer suffice.


About our experts

Tom Matthews serves as a senior lecturer in environmental geography at King’s College London, UK. His research delves into extreme meteorological environments and events. He has worked extensively in mountainous regions, such as the Himalayas, where he has been instrumental in setting up state-of-the-art weather stations on Mount Everest. His studies on severe extratropical cyclones and compound events have furthered the understanding of extreme humid heat events and how they may change as the climate warms.

Source: www.sciencefocus.com

This Unusual Miniature Frog Defies Nature’s Greatest Laws

The world is full of fascinating paradoxes. For instance, does this article even exist before you’ve read it? If I traveled back in time and eliminated my grandfather, would I still be here writing this? And why is it that two socks can fit into the washing machine, yet only one emerges? Perhaps one of the grandest paradoxes is how a frog can shrink as it matures.

Meet the paradoxical frog (Pseudis paradoxa). These frogs lay their fertilised eggs in South America’s lakes and lagoons, where they hatch into tadpoles.

The voracious larvae feed mainly on algae and begin to grow quite rapidly. Initially, they develop like ordinary tadpoles, but…

If conditions are ideal, these tadpoles can grow remarkably large. Bigger than blueberries, larger than strawberries: think of a satsuma stuffed into an ankle sock. That gives you a sense of their size, and perhaps you’ve even located your missing socks.

The tadpoles of the paradoxical frog, with their plump, rounded bodies and long muscular tails, can reach lengths of up to 22cm (8.6 inches). To paraphrase Jaws: you’re going to need a bigger jam jar!

That is about three times the length of the adult frogs they eventually become, and much of their development is already complete by this stage.

By the time they morph from tadpoles to frogs, males possess well-formed testes and can produce sperm, while females create mature eggs.

This is distinct from typical frog tadpoles, which metamorphose first and only reach sexual maturity later, during the frog phase of their life cycle.

Paradoxical frog tadpoles can grow up to 22cm (8.6 inches). – Photo credit: Aramie

So, how does a giant tadpole transform into such a small frog? It’s remarkably simple: at least half of a paradoxical frog tadpole’s length is tail. Once they lose their tails, they undergo a normal transformation into relatively small adults, measuring about 7cm (roughly 2.8 inches).

This “contraction” of the amphibians explains the phenomenon often referred to as frog shrinkage.

The paradox appears resolved. Yet, as one riddle is solved, another emerges: Why do the tadpoles expend such energy in growing so large in the first place?

One possible explanation lies in the timing and location of their birth. Paradoxical frogs time their spawning for the rainy season.

In Trinidad, this occurs around May.

Some eggs are laid in permanent bodies of water, while others are deposited in fleeting ponds that eventually dry up. Those born in small, temporary locations with limited food and aquatic predators do not grow much. In contrast, tadpoles born in larger, more stable ponds with abundant food and fewer predators tend to thrive.

In these circumstances, growing larger can enhance survival since larger tadpoles are less likely to be consumed by predatory fish and other animals.





Source: www.sciencefocus.com

IXPE Measures X-Ray Polarization from Magnetic Explosions

A magnetar is a type of neutron star that boasts an extraordinarily strong magnetic field, trillions of times stronger than Earth’s. These colossal magnetic fields are believed to be generated when rapidly rotating neutron stars are born from the collapse of a giant star’s core. Magnetars emit brilliant X-rays and display erratic patterns of activity, with bursts and flares that release millions of times more energy than the Sun emits in one second. Polarization measurements offer insights into their magnetic fields and surface characteristics. This was the focus of astronomers using NASA’s Imaging X-ray Polarimetry Explorer (IXPE) to study 1E 1841-045, a magnetar located within the supernova remnant (SNR) Kes 73, nearly 28,000 light-years from Earth. The findings are published in the Astrophysical Journal Letters.

An artist’s impression of a magnetar. Image credit: NASA’s Goddard Space Flight Center / S. Wiessinger.

Magnetars represent a category of young neutron stars: the remnants of giant stars that collapsed in on themselves at the end of their life cycles, packing roughly the mass of the Sun into a city-sized volume.

Neutron stars exemplify some of the most extreme physical conditions in the observable universe, offering a unique chance to investigate states that cannot be replicated in terrestrial laboratories.

The magnetar 1E 1841-045 was observed entering an outburst on August 21, 2024, by NASA’s Swift, Fermi, and other advanced telescopes.

The IXPE team allows scheduled observations to be interrupted several times each year so the telescope can be redirected toward unique and unexpected celestial phenomena.

When 1E 1841-045 transitioned into this bright active phase, scientists chose to direct IXPE at it to capture the first polarization measurements of a magnetar’s outburst.

Magnetars possess magnetic fields thousands of times stronger than those of most neutron stars, hosting the most powerful magnetic fields among known cosmic objects.

Fluctuations in these extreme magnetic fields can cause a magnetar to emit X-rays up to 1,000 times brighter than usual for several weeks.

This heightened state is referred to as an outburst, though the underlying mechanisms remain poorly understood.

IXPE’s X-ray polarization measurements may help unveil the mysteries behind these phenomena.

Polarized light carries information about the direction and orientation of emitted X-ray waves. A higher degree of polarization indicates that the X-ray waves are moving in harmony, akin to a tightly choreographed dance.
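As a toy illustration of what “degree of polarization” means quantitatively, here is a sketch using the standard Stokes-parameter definitions; the numbers are invented for illustration, and this is not IXPE’s actual analysis pipeline.

```python
import math

def polarization_degree(I: float, Q: float, U: float) -> float:
    """Linear polarization degree: sqrt(Q^2 + U^2) / I, from 0 to 1.
    I is total intensity; Q and U describe the linearly polarized part."""
    return math.sqrt(Q**2 + U**2) / I

def polarization_angle_deg(Q: float, U: float) -> float:
    """Electric-vector position angle in degrees: 0.5 * atan2(U, Q)."""
    return math.degrees(0.5 * math.atan2(U, Q))

# Example: a signal whose waves are mostly aligned (Q large relative to I)
print(round(polarization_degree(1.0, 0.5, 0.2), 3))  # high: ~0.539
print(round(polarization_angle_deg(0.5, 0.2), 1))    # ~10.9 degrees
```

A degree near 1 corresponds to the “tightly choreographed dance” described above; unpolarized light has Q and U near zero and a degree near 0.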

Studying the polarization characteristics of magnetars provides clues about the energy processes behind the observed photons and about the direction and configuration of the magnetar’s magnetic field.

This diagram illustrates the IXPE measurements of X-ray polarized light emitted by 1E 1841-045. Image credit: Michela Rigoselli / Italian National Institute of Astrophysics.

The IXPE results, supported by observations from NASA’s NuSTAR and other telescopes, indicate that X-ray emissions from 1E 1841-045 exhibit increased polarization at higher energies while maintaining a consistent polarization direction.

A significant contribution to this high degree of polarization comes from the hard X-ray tail of 1E 1841-045, a highly energetic emission component of the magnetosphere responsible for the highest-energy photons IXPE detected.

Hard X-rays refer to X-rays characterized by shorter wavelengths and greater energy than soft X-rays.

While prevalent in magnetars, the processes that facilitate the generation of these high-energy X-ray photons remain largely enigmatic.

Despite several proposed theories explaining this emission, the high polarization associated with these hard X-rays currently offers additional clues to their origins.

“This unique observation enhances existing models that aim to explain magnetar hard X-ray emissions by elucidating the extensive synchronization seen among these hard X-ray photons,” remarked Rachel Stewart, a graduate student at George Washington University and lead author of the first paper.

“This effectively demonstrates the power of polarization measurements in refining our understanding of the physics within a magnetar’s extreme environment.”

“It would be fascinating to observe 1E 1841-045 as it returns to its stable baseline state and to track the evolution of its polarization,” added Dr. Michela Rigoselli, an astronomer at the Italian National Institute of Astrophysics and lead author of the second paper.

____

Rachel Stewart et al. 2025. X-ray polarization of the magnetar 1E 1841-045. ApJL 985, L35; doi: 10.3847/2041-8213/adbffa

Michela Rigoselli et al. 2025. IXPE detection of highly polarized X-rays from the magnetar 1E 1841-045. ApJL 985, L34; doi: 10.3847/2041-8213/adbffb

Source: www.sci.news

Paleontologists Unveil Europe’s Most Complete Stegosaurus Skull

Paleontologists have discovered dinosaur skull fragments in the Upper Jurassic Villar del Arzobispo Formation in Teruel, Spain, and confidently identified them as belonging to the stegosaur species Dacentrurus armatus.

Skull of Dacentrurus armatus from the Villar del Arzobispo Formation in Teruel, Spain. Image credit: S. Sánchez-Fenollosa & A. Cobos, doi: 10.3897/vz.75.e146618.

The name Stegosauria was first introduced in 1877, two years after the first description of Dacentrurus armatus.

Stegosauria constitutes a small clade of thyreophoran (armored) dinosaurs, featuring iconic and recognizable representatives such as the genus Stegosaurus.

These dinosaurs were characterized by two rows of bony plates and spines extending from the neck to the tip of the tail.

Stegosaur fossils date from the Middle Jurassic through the Late Cretaceous, and they are generally represented by a limited number of partial skeletons worldwide.

Skull remains of stegosaurs are often fragmentary and infrequently found in the fossil record.

Nearly half of today’s scientifically recognized stegosaur species lack preserved skull material.

A recent study by Fundación Dinópolis paleontologists Sergio Sánchez-Fenollosa and Alberto Cobos focused on the skull of Dacentrurus armatus, a stegosaur that roamed Europe approximately 150 million years ago.

Life reconstruction of Dacentrurus armatus. Image credit: Sci.News.

“A comprehensive study of this extraordinary fossil has revealed anatomical features previously unknown in Dacentrurus armatus, a typical European stegosaur,” noted Dr. Sánchez-Fenollosa.

“Dinosaur skulls are seldom preserved due to their extreme fragility.”

“This discovery is crucial for understanding the evolution of stegosaur skulls.”

“Additionally, alongside detailed anatomical studies, we proposed a new hypothesis that redefines evolutionary relationships among stegosaurs worldwide.”

“This research has established a new grouping termed Neostegosauria.”

According to the team, Neostegosauria includes moderate to large stegosaur species that lived in Africa and Europe during the Middle to Late Jurassic, and in Asia from the Late Jurassic to the Late Cretaceous.

“This dual outcome represents both a remarkable fossil study and the proposal of new evolutionary theories, positioning our work as a key reference in stegosaur research,” remarked Dr. Cobos.

“The fossil site at Riodeva remains under ongoing research and holds many related fossils, including additional postcranial elements from the same adult specimen, a particularly rare find for this type of dinosaur.”

“These findings are significantly enhancing the paleontological heritage of Teruel, making it a central region for understanding life’s evolution on Earth.”

The team’s research paper was published in the journal Vertebrate Zoology on May 26, 2025.

____

S. Sánchez-Fenollosa & A. Cobos. 2025. New insights into the phylogeny and skull evolution of stegosaurian dinosaurs: an extraordinary skull of Dacentrurus armatus (Dinosauria: Stegosauria) from the Late Jurassic of Europe. Vertebrate Zoology 75: 165-189; doi: 10.3897/vz.75.e146618

Source: www.sci.news

These Cosmic Beasts Are Sparking the Largest Explosions Since the Big Bang

Even the immense void of space offers no refuge for a star that strays too close to a giant black hole.

Astronomers have now caught three colossal black holes devouring stars up to ten times larger than our Sun.

A recent study from the University of Hawaii reveals that astronomers, while analyzing data from NASA and European Space Agency missions, identified three ultra-massive black holes, each caught consuming a star far larger than the Sun.

The explosions reported by these researchers happened when the black holes tore apart and engulfed these stars, yielding some of the most energetic events observed since the Big Bang that shaped our universe.

“What excites me about this research is that we are extending the boundaries of our understanding of the most energetic environments in the universe,” stated Anna Payne, a staff scientist at the Space Telescope Science Institute and study co-author, in a NASA article.

Black holes are cosmic entities that remain unseen by the naked eye, possessing a gravitational force so intense it can capture everything, including light itself. Supermassive black holes, the largest varieties, reside at the centers of galaxies, gradually consuming planets and other materials.

When a star falls under the influence of a supermassive black hole, it can end in a dramatic explosion categorized as an “extreme nuclear transient,” suggests new research published this week in the journal Science Advances.

“These occurrences are unique as they provide the only means for us to illuminate a massive black hole that would typically remain dormant,” noted University of Hawaii graduate student Jason Hinkle in a NASA article.

Hinkle serves as the lead author of the new study, which documents two such events, observed over the past decade, for the first time.

Two of the three events were observed by ESA missions in 2016 and 2018. The third, cataloged as ZTF20abrbeie, was discovered by the Zwicky Transient Facility in California in 2020 and officially recorded in 2023.

In magnitude, these explosions are rivaled only by the Big Bang, which initiated the universe.

Unlike typical stellar explosions, the variations in X-ray, optical, and ultraviolet emissions in these events pointed to a star being consumed by a black hole.

NASA explains that black holes actually become brighter during these cosmic occurrences, with their luminosity lasting several months.

This brightness offers scientists a new methodology to uncover additional black holes in the far reaches of the early universe. As astronomers peer into the cosmos, the farther they look, the more ancient light they detect. For instance, light from the Sun takes eight minutes to reach Earth.

“You can use these three objects as a template for what to search for in the future,” Payne remarked.

Source: www.nbcnews.com

I’m Exhausted from Living in Tornado Alley

Residents served by the National Weather Service office in Davenport, Iowa, are increasingly worried about the potential for severe weather, as the office has been reduced to 14 staff members, leaving it vulnerable during such events.

Wolf, a retired meteorologist from Davenport, expressed his concerns. He mentioned that up to 12 staff members were typically involved in managing severe weather incidents simultaneously. The cutbacks have resulted in a significantly reduced workforce for critical situations, especially when other forecast offices are also dealing with their own weather emergencies.

“With only 14 staff members, when you need 10 on duty you’re in a tough spot,” Wolf remarked. “If we face a major weather event in the coming weeks, I’m confident they will still perform admirably, as they have in the past.”

Brian Payne, emergency manager for Scott County, Iowa, stated he has been receiving consistent service and hasn’t observed any issues.

“We depend on them heavily,” Payne noted. “But they seem exhausted.”

A former National Weather Service staff member, familiar with the situation in Davenport, indicated that the team’s professionalism and commitment are crucial in preventing more serious outcomes.

“They all strive to accomplish their tasks despite time constraints and unpredictable conditions,” said the former employee, who preferred to remain anonymous due to fears of repercussions. “I genuinely feel for the team; they carry a heavy burden.”

Sorensen noted that employees are apprehensive about retaliation and hesitant to express their concerns.

“These are my friends and colleagues. I studied alongside a meteorologist 25 years ago,” Sorensen said, referencing Friedline. “They worry that their comments could have political consequences, and that someone might respond like a bully from high school, unjustly targeting them.”

Source: www.nbcnews.com

Exploring Inequality: How Mathematical “Equality” Literature Can Transform the Real World

Numbers enable us to focus in detail on one aspect of a situation, but we can overlook complexities

Mika Baumeister/Unsplash

Inequality
Eugenia Cheng (Profile Books (UK, on sale now); Basic Books (US, September 2))

Are things equal or aren’t they? Mathematically, at least, that is a question worth taking seriously, argues Eugenia Cheng in her new book. In maths, as in life, some things carry more weight than others.

Consider this: the equation 180 = 180 reveals nothing, yet x + y + z = 180°, where x, y and z are the angles of a triangle, conveys a deeper insight. The statement holds true only under specific circumstances: on a flat plane, yes, but not on the surface of a sphere.
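Girard’s theorem makes the sphere example concrete: on a sphere, a triangle’s angles sum to 180° plus an excess equal to the triangle’s area divided by the squared radius. A small sketch of this, illustrative and not taken from the book:

```python
import math

def angle_sum_deg(area: float, radius: float) -> float:
    """Angle sum (degrees) of a spherical triangle with the given area,
    via Girard's theorem: excess (radians) = area / radius**2."""
    excess_rad = area / radius**2
    return 180.0 + math.degrees(excess_rad)

# A triangle covering one octant of a unit sphere (area = 4*pi/8 = pi/2)
# has three right angles, so its angle sum should be 270 degrees:
print(angle_sum_deg(math.pi / 2, 1.0))  # -> 270.0
```

As the triangle shrinks toward zero area, the excess vanishes and the familiar flat-plane rule x + y + z = 180° is recovered.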

Cheng aims to investigate how mathematics decides things are “equal.” Her methodology blends playfulness with the gravity of abstract concepts, linking them to diverse topics such as knitting and baking Battenberg cakes. She isn’t shy about tackling significant political questions surrounding equality and rights.

On simplifying through numbers, Cheng wryly remarks that their very dullness helps distil potentially overwhelming complexity into a single manageable figure. Numbers can be potent tools, focusing attention on one specific element of a situation.

However, forgetting that a simplification has been made can lead to misunderstandings. For instance, assuming two individuals with identical IQ scores are equally intelligent is misleading. As Cheng remarks, “It’s alright to disregard the details, but you must remember that you have.”

Fortunately, mathematics encompasses more than mere numbers. Cheng delves at length into the concepts of “local” and “global”: essentially, she explores surfaces formed by stitching together smaller flat patches.

By promoting this local-versus-global way of thinking, she proposes a valuable lens through which to view reality. In mathematics, debating whether a sphere and a torus are “the same” is futile without qualification: they are locally alike but globally different. Similarly, in political discourse, it’s crucial to recognize when one faction uses local arguments (“individual women benefit from the right to choose regarding abortion”) while the opposing side employs global ones (“all abortion constitutes murder,” etc.).

Cheng ventures deep into abstract discussions of identity within category theory, guiding the reader through challenging theoretical territory. Some of the most remarkable creations in art, literature and music are complex, yet we appreciate them without fully grasping chiaroscuro, counterpoint or other sophisticated techniques. Likewise, Cheng devotes care to the formal definitions of categories: as with art, we can all appreciate abstract notions, but discovering their depth is worthwhile.

“If you believe that mathematics is solely about equations, seeing them as rigid black-and-white facts, then you likely perceive mathematics as stringent and binary,” writes Cheng. This book serves as a compelling counterargument to that misapprehension. Delving into the nuances of “equality” in mathematics will enrich your understanding of the field’s complexity and illuminate how the idea of equality is applied (and misapplied).

Sarah Hart is Professor Emerita of Geometry at Gresham College, UK. She is the author of Once Upon a Prime.

New Scientist Book Club

Are you an avid reader? Join a welcoming group of fellow book enthusiasts. Every six weeks, we explore exciting new titles, offering members exclusive access to excerpts, author articles, and video interviews.


Source: www.newscientist.com

Top New Sci-Fi Releases for June 2025: Exploring Taylor Jenkins Reid’s Alternate 1980s

June’s new science fiction features Megan E. O’Keefe’s Space Opera

Science Photo Library / Alamy Stock Photo

Are you a fan of dystopian worlds plagued by relentless viruses and advanced technology? If so, June has a lot in store for you. Expect narratives that range from infections inciting greed to nerve chips that eliminate sleep. Inga Simpson delivers a tale of environmental apocalypse in Thin, while EK Sathue offers a feminist body horror twist in a story pitched as American Psycho meets Material. Also on the menu is an intriguing new space opera from Megan E. O’Keefe.

Those crafty scientists are up to something again, developing a nerve chip designed to eliminate sleep. This chip soon becomes ubiquitous, leaving humanity in a state of sleep deprivation. Survivors in the Tower of London work tirelessly to find a cure… it’s a mix of eerie entertainment and genuine fright.

We’ve encountered plenty of apocalyptic viruses before. In this installment, a deadly virus leaves infected individuals “wild with desire.” Sophie, our protagonist, is a “good Catholic girl” who will stop at nothing to find her family. Originally published in the US, this novel hits the UK shelves this month.

Although not strictly science fiction, this book offers a unique perspective on the 1980s space shuttle program. Taylor Jenkins Reid, known for Daisy Jones & The Six and Malibu Rising, introduces us to Professor Joan Goodwin, who begins training as an astronaut at the Johnson Space Center in Houston in 1980. Everything shifts with mission STS-LR9 in December 1984…

Taylor Jenkins Reid’s Atmosphere is set during the 1980s space shuttle program

NG Images/Alamy

This standalone space opera features Faven Sythe on a quest to find her missing mentor. Sythe, a “Crystbon,” charts stellar routes across the galaxy. The only individual who stands a chance of aiding her is the enigmatic pirate Amandine, and together they uncover a conspiracy that spans the galaxy.

Finn lives in a secluded area with his mother, Dianera, always ready to escape. The environment beyond their sanctuary is deteriorating, and as extinction looms, Finn must join forces with an unlikely ally—an evolved human—on a mission to restore the balance of nature.

As a virus decimates half of China’s population and heads towards the UK, the government resorts to distributing “pills of dignity.” Meanwhile, Hart Ikeda discovers a method to mutate the virus, reprogramming it to foster compassion in its hosts. Will this be the salvation needed?

Pitched as American Psycho meets Material, this body horror narrative follows a young woman who starts working for the upscale skincare brand Hebe. As Sofia quickly learns, all is not as it seems; the Youth Juice moisturizer she tests could come with costs she never anticipated. How far is she willing to go to preserve her youth?

This compelling tale unfolds as scientists, facing humanity’s potential extinction, utilize technology intended for interstellar exploration to send someone 10,000 years into Earth’s future. Microbiologist Nicholas Hindman finds himself navigating an uncharted wilderness, searching for the remnants of humanity amidst a devastating pandemic in 2068.

Enca and Mathilde bond as art school friends, but when Mathilde’s rise to fame threatens their relationship, Enca becomes desperate to maintain their connection. Will the cutting-edge technology known as scaffolding—allowing Enca to live within Mathilde’s mind—forge a stronger bond, or will it complicate their lives?

Beginning in present-day India and moving into a near future, this story centers around a populist movement that rejuvenates the ancient Saraswati River. Though it’s labeled “not exactly science fiction,” it contains “strong speculative elements deeply rooted in contemporary politics.” Compared to the works of David Mitchell, Zadie Smith, and Eleanor Catton, it’s certainly worth exploring.

Set in a near-future London where technology intertwines with everything from physical health to political dynamics, journalist Pers Budmouth sets out to uncover the truth behind the mysterious disappearance of young Black children. Instead, her assignment takes her to cover protests in Benin, where tourists participate in sacred rituals. When she partakes of the Spirit Vine (one of the plants from which ayahuasca is brewed), she uncovers a destiny that could change everything. This story is a must-read for fans of NK Jemisin and Supacell.


Source: www.newscientist.com

Is it possible to create a gravity-powered space-time computer?


Illustration of a giant object distorting spacetime

koto_feja/getty images

Exploring the mathematical nature of space-time and physical reality could pave the way for innovative computer-like systems that utilize gravity for data processing.

Is space-time an immutable backdrop, or can it be distorted in ways that influence the signals traversing it? Albert Einstein’s special theory of relativity treats it as fixed, but his general theory says otherwise: massive objects can create dents and curves in space-time, altering signal trajectories much as a heavy ball deforms a taut sheet.

Eleftherios-Ermis Tselentis from the Brussels Institute of Technology and Ämin Baumeler of the University of Lugano in Switzerland have devised a mathematical framework for determining whether space-time is fixed in a given region.

They investigated a situation in which three individuals send messages among themselves, and posed the question: could Alice, Bob and Charlie tell whether space-time distortions had affected their information exchange? If the region a signal crosses can be warped, the causal order of their exchanges might be scrambled, with Bob, for instance, influencing the space-time around Alice before she receives a reply from Charlie.

Tselentis and Baumeler formulated equations to help Alice, Bob and Charlie determine whether such scenarios are possible. Over multiple rounds of communication, the trio compiles data on the messages received, which is then fed into the equations.

The outcome indicates whether their exchange took place in an environment where space-time manipulation was possible. The framework is general enough that the participants need not know their locations or use any special messaging equipment.
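As a loose analogy for what such a test does (an illustrative toy, not Tselentis and Baumeler’s actual formalism), one can ask whether a single fixed causal ordering of the three parties explains every round of recorded messages; if no fixed ordering fits all rounds, the data are consistent with a causal structure that changed between rounds:

```python
# Toy illustration: given logs of who influenced whom in each round of
# messaging, check whether one fixed causal order of the three parties
# can explain every round. If not, no single fixed ordering fits the data.
from itertools import permutations

# Each round records observed influences as (earlier_party, later_party).
rounds = [
    [("Alice", "Bob"), ("Bob", "Charlie")],   # consistent with A < B < C
    [("Charlie", "Bob"), ("Bob", "Alice")],   # consistent with C < B < A
]

def fits(order, constraints):
    """True if every (a, b) constraint has a before b in this ordering."""
    pos = {p: i for i, p in enumerate(order)}
    return all(pos[a] < pos[b] for a, b in constraints)

def fixed_order_exists(rounds):
    """True if one ordering of the parties satisfies every round."""
    parties = ("Alice", "Bob", "Charlie")
    return any(all(fits(order, r) for r in rounds)
               for order in permutations(parties))

print(fixed_order_exists(rounds))  # False: the two rounds need opposite orders
```

Here the two rounds demand opposite orderings, so no fixed order exists; the real framework plays an analogous game with the statistics of the messages rather than with explicit orderings.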

Baumeler noted that while the general theory of relativity has long been a cornerstone of our understanding of physical existence, a rigorous mathematical connection between space-time fluctuations and information flow had been absent. Grasping the dynamics of information flow is foundational for computer science.

In this regard, he believes their research could initiate a nascent exploration of using gravitational effects to manipulate and navigate space-time for computational purposes.

“If one can harness the enigmas of physics for computation, why not explore the general theory of relativity?” said Pablo Arrighi at the University of Paris-Saclay in France. He pointed out that other researchers have entertained extreme ideas, such as placing computers near black holes, where space-time distortion slows time at the edge so that extremely lengthy calculations could still deliver their results.

The new theory, by contrast, sidesteps any reliance on specialized devices or particular features of space-time, allowing for a broader range of applications, according to Arrighi. Even so, building “gravity-based information” systems does not appear feasible at present.

Tselentis and Baumeler also acknowledge that substantial additional research is needed before a functional device could be devised. Their current calculations depend on fanciful scenarios, such as moving an entire planet in between Bob and Charlie. Practical applications will require a deeper understanding of gravity’s effects at much smaller scales.

Gravity is notoriously weak at small scales, so one does not typically perceive space-time distortions from everyday items like a pencil on a desk. Yet certain instruments, such as clocks based on ultracold atoms, can detect these effects. Future advances in such devices, alongside theoretical progress linking gravity and information, could make Tselentis and Baumeler’s mathematical work more practically applicable.

Their work shows that frameworks such as information theory and relativity can shed light on how causal relationships are perceived. V. Vilasini at Université Grenoble Alpes in France notes that the new research touches on concepts such as the reversal of event order, prompting inquiries into fundamental notions like what counts as an event (e.g. Alice pressing a button to dispatch a message).

She suggests that the next step involves fully integrating this approach, facilitating further exploration into the essence of space-time.

“Do astrophysical events, like black hole mergers that generate gravitational waves impacting Earth, carry a meaningful signature of the correlations examined in this study?” she inquires.


Source: www.newscientist.com

We Might Have Found the First Star Made of Dark Matter


Mysterious stars might be fueled by dark matter

Artsiom P/Shutterstock

Astronomers have uncovered compelling evidence for the existence of Dark Stars—massive stars in the early universe that might be partly energized by dark matter. If confirmed, these hypothetical stars could shed light on the enigmatic large black holes observed in the early universe, although skepticism remains among some astronomers regarding these findings.

The concept of Dark Stars was proposed in 2007 by Katherine Freese and her colleagues at the University of Texas at Austin. They theorized that immense clouds of hydrogen and helium in the early universe could interact with dark matter, forming gigantic yet stable stars. Absent dark matter, such vast gas clouds would collapse into black holes, but energy released by dark matter annihilation can counter this collapse, producing star-like objects that shine without the nuclear fusion typical of ordinary stars.

Until recently, evidence for these exotic objects from the early universe was scant, but in 2022, the James Webb Space Telescope (JWST) began discovering numerous bright, distant celestial objects. Freese and her team identified three galaxy candidates that exhibited several characteristics predicted by Dark Star models, such as round shapes and the expected luminosity, though the detailed spectral data needed to confirm their hypothesis definitively was lacking.

Now, with new spectral observations from JWST, Freese’s team believes it can match theoretical predictions of what Dark Stars should look like, and has identified two additional candidates. One of these shows intriguing hints of a signature of ionized helium (helium atoms missing an electron) which, if validated, could serve as a distinctive hallmark of a Dark Star. “If it’s real, I don’t know how else to explain it other than with Dark Stars,” Freese remarks, though she cautions that the evidence is still limited.

Meanwhile, Daniel Whalen from the University of Portsmouth in the UK suggests that an alternative theory of ultra-massive protostars, which do not involve dark matter, might also explain the JWST findings. “They overlook considerable literature concerning the formation of ultra-massive protostars, some of which can produce signatures remarkably similar to the ones they present,” claims Whalen.

Freese, however, strongly disagrees, asserting that burning dark matter is the only feasible method for creating such massive stars. “There’s no alternative route,” she insists.

A complicating factor arises from separate observations of the objects studied by Freese’s team, made with the Atacama Large Millimeter Array (ALMA) in Chile, which indicated the presence of oxygen. Dark Stars should not produce this element, suggesting these candidates might at best be hybrid objects. Whalen and his team go further, interpreting the oxygen as a strong indicator that these objects cannot be Dark Stars at all, attributing it instead to conventional stars that exploded as supernovae.

Should Freese and her collaborators confirm that these objects are indeed Dark Stars, it could address a significant cosmological puzzle: current models struggle to explain how the extremely massive black holes observed in the early universe formed so quickly, and collapsing Dark Stars could provide the seeds for them.


Source: www.newscientist.com

US Halts Support for COVID-19 Vaccines for Children—Are Other Vaccines Next?

US Secretary of Health and Human Services Robert F. Kennedy Jr

Tasos Katopodis/Getty

One of the leading vaccine specialists at the US Centers for Disease Control and Prevention (CDC), Lakshmi Panagiotakopoulos, resigned on June 4th, just a week after Robert F. Kennedy Jr announced that the Covid-19 vaccine would no longer be recommended for most children and pregnant women.

This declaration prompted several days of uncertainty over the availability of the Covid-19 vaccine in the US. Although there has not yet been a significant shift in access, parents may face new hurdles when trying to vaccinate their children, and Kennedy’s statement reflects a concerning departure from established public health practice.

“My career in public health and vaccinology began with a deep-seated desire to assist the most vulnerable members of our population. Unfortunately, that is no longer something I am able to do in this role,” she wrote in a message to colleagues, Reuters reported.

Panagiotakopoulos served on a work group of the Advisory Committee on Immunization Practices (ACIP), which has guided US vaccine recommendations since 1964. Last week, however, Kennedy, the country’s highest public health official, broke with decades of protocol. “As of today, the Covid vaccine for healthy children and healthy pregnant women has been removed from the CDC’s recommended vaccination schedule,” he stated in a video shared on the social media platform X on May 27th.

Despite his directive, the CDC has made only minor modifications to its Covid-19 vaccine recommendations. Rather than a full endorsement for children, vaccination is now recommended on the basis of “shared clinical decision-making,” meaning parents should consult their doctors before deciding. It remains uncertain how this will affect vaccine access in practice, but it may make it harder to get children vaccinated at pharmacies.

The CDC’s guidance on vaccination during pregnancy is similarly unclear. The relevant webpage still recommends Covid-19 shots during pregnancy, while noting that “this page will be updated to align with the new vaccination schedule.”

Kennedy’s declaration also stands in stark contrast to the positions of major public health organizations. Both the American College of Obstetricians and Gynecologists (ACOG) and the American Academy of Pediatrics (AAP) have expressed opposition to this stance.

“The CDC and HHS advise individuals to consult healthcare providers regarding personal medical choices,” a spokesperson for HHS told New Scientist. “Under Secretary Kennedy’s leadership, HHS is re-establishing the connection between doctors and patients.”

However, Linda Eckert at the University of Washington in Seattle argues that these conflicting messages create confusion for the public: “It opens doors for misinformation and undermines overall confidence in vaccines. I cannot fathom that vaccination rates will not decline.”

Numerous studies have demonstrated the safety and efficacy of Covid-19 vaccination during childhood and pregnancy. In fact, Marty Makary, the head of the US Food and Drug Administration, listed pregnancy as a risk factor for severe Covid-19 in an assessment published a week before Kennedy’s announcement, further muddling the government’s public health message.

Kennedy’s announcement does align with policies in some other countries. For instance, Australia and the UK do not recommend the Covid-19 vaccine for children unless they are at high risk of severe illness, nor do they recommend Covid-19 vaccination during pregnancy for those who have already been vaccinated.

Asma Khalil, a member of the UK’s Joint Committee on Vaccination and Immunisation, says the UK’s position is informed by the reduced risk posed by omicron variants, the cost-effectiveness of vaccination and high population immunity. Nevertheless, these variables differ from one country to another; Eckert notes that the UK population generally has better access to healthcare than that of the US. “These evaluations necessitate a meticulous consideration of risks and benefits for the national populace,” Khalil asserts. HHS did not respond to New Scientist’s question of whether a similar assessment informed Kennedy’s decision.

Perhaps the most concerning aspect of Kennedy’s announcement is that it circumvented the expected ACIP vote on proposed revisions to Covid-19 vaccine recommendations, which was slated for later this month. “Decision-making like this, without the committee of experts who carefully vet conflicts of interest and scrutinize the data, has never occurred in our country,” Eckert emphasizes. “We are traversing uncharted territory.” She fears that Kennedy’s actions could set a precedent for other vaccine recommendations. “I am aware there are numerous vaccines he has actively opposed,” she continues, recalling Kennedy’s previous false claims linking vaccines to autism and his denunciations of the polio vaccine.

“What this implies is that [Kennedy] is undermining established scientific guidelines,” stated Amesh Adalja from Johns Hopkins University.


Source: www.newscientist.com

Women’s Faces Are Rated as More Attractive Than Men’s

Women’s faces are often viewed as more attractive than men’s

Aleksandarnakic/Getty Images

Research indicates that women’s faces are generally deemed more attractive than men’s. This conclusion comes from an extensive study involving 12,000 participants worldwide, which also revealed that women rate other women’s faces as more appealing than men do.

“When analyzing the gender of the raters, it becomes clear that women’s preferences for female faces are significantly stronger,” says Eugen Wassiliwizky from the Max Planck Institute for Empirical Aesthetics in Germany.

Typically, in many species of mammals and birds, it is the males that develop showy traits to attract females, notes Wassiliwizky. For instance, male mandrills sport vibrant red and blue facial colors.

“Females are usually the selective sex,” he explains. Over evolutionary time, this has driven males, not females, to become the more ornamented sex.

Yet, as biologists going back to Charles Darwin have noted, humans seem to be an exception, with women often dubbed the “fairer sex.”

“There has been ongoing discourse since the 19th century regarding the reversal of sexual roles in humans, but surprisingly, this has not been empirically tested,” Wassiliwizky comments.

Using raw data from various studies on facial attractiveness, Wassiliwizky and his team set out to put this assumption to the test. For instance, one study they analyzed looked at the impact of emotional expressions on perceived facial attractiveness.

Much of the analyzed data stems from studies that intentionally recruited heterosexual participants to evaluate faces, according to Wassiliwizky. While some ratings come from volunteers who identified as LGBTQ+, their numbers were too small to draw firm conclusions.

The preference for women’s faces seems to cross cultural and national boundaries, with the research revealing moderate to large effects in nearly all global regions, the exceptions being sub-Saharan Africa and ethnic groups identified as African.

As Wassiliwizky notes, the perception of women’s faces as more attractive correlates with physical differences between male and female faces, yet simply knowing a face’s gender can also influence how its attractiveness is perceived.

By comparing ratings of feminine and masculine facial features, the researchers concluded that roughly two-thirds of the effect is attributable to physical differences between the sexes, while the remaining third reflects raters’ knowledge of the face’s gender.

Why do women regard other women’s faces as more attractive? “Women might display solidarity with each other, or simply better appreciate one another’s beauty,” speculates Wassiliwizky.

As for why women rate men’s faces less favorably than men do, he suggests it might stem from a reluctance to acknowledge male attractiveness, compounded by an awareness that their assessments are being scrutinized.

Alternatively, women may consider a man’s character based on his appearance. Wassiliwizky advocates for more focused future research, proposing questions such as, “Do you find yourself physically attracted to this individual?” and “How appealing is this face?”

“This paper thoroughly documents the gender differences in attractiveness ratings across numerous images and cultures,” states Anthony Little at the University of Bath, UK. “Nevertheless, researchers have long highlighted that attractiveness involves more than mate choice.”

“Meta-analytic studies decisively affirm the existence of a ‘gender attractiveness gap’,” adds Karel Kleisner from Charles University in the Czech Republic.

Kleisner’s research uncovered that physical differences in facial features are least pronounced in certain African populations, potentially explaining the lack of significant effects observed there.

Moreover, local beauty standards can vary considerably from global norms, Kleisner notes. “A key limitation of this study is its insensitivity to the unique aesthetics of African beauty.”

In addition, studies focused on body attractiveness might yield different results. “Truthfully, we remain uncertain,” Wassiliwizky admits, highlighting the absence of comparative studies on full-body attractiveness.


Source: www.newscientist.com

Japan’s Resilience Lunar Lander Crashes During Attempted Moon Landing

The surface of the moon as captured from orbit prior to the crash

ISPACE SMBC X Hakuto-R Venture Moon

On June 5th at 7:13 PM, a Japanese attempt at what would have been the third private lunar landing failed when ispace’s Resilience lander crashed into the moon’s surface.

The lander began its descent from around 20 kilometres above the moon, but ispace’s mission control lost contact shortly after the probe fired its main engine for the final descent, and no further signals were received.

The company announced that the laser tool used to gauge the distance to the surface seemed to malfunction, leading to inadequate slowing of the lander and likely resulting in a collision.

“Given that there is currently no prospect of a successful lunar landing, our top priority is to analyze the telemetry data collected so far and diligently investigate the cause,” stated ispace CEO Takeshi Hakamada.

Had it succeeded, Resilience would have marked the second private moon landing of the year and the third overall, and would have made ispace the first non-US company to land on the moon; the company’s previous attempt, the Hakuto-R Mission 1 lander, crashed in 2023.

The Resilience lander embarked on its lunar journey aboard a SpaceX rocket on January 15th, alongside Firefly Aerospace’s Blue Ghost lander. While Blue Ghost successfully landed on March 2nd, Resilience took a more circuitous route, looping through deep space before entering lunar orbit on May 6th. This trajectory was chosen to target Mare Frigoris, a plain in the moon’s far north that had not been surveyed by previous lunar missions.

Equipped with six payloads, the lander included a device for splitting water into hydrogen and oxygen, a module for algae-based food production and a deep-space radiation monitor. It also carried a five-kilogram rover named Tenacious, designed to explore and capture images of the moon during the two weeks Resilience was set to operate.


Source: www.newscientist.com

A Mysterious Signal Emerges from a Dying Galaxy: Here’s What We’ve Uncovered…

Fast Radio Bursts (FRBs) represent one of the greatest mysteries of the universe in our time. Initially identified in 2007, these transient radio wave phenomena have perplexed astronomers ever since.

Although we have detected thousands of them, the precise causes, origins, and unpredictable behaviors of FRBs remain elusive.

Just when scientists thought they were starting to unravel the mysteries, two new studies published in January 2025 added twists to the ongoing FRB enigma, challenging earlier theories.

“The FRB is one of those cosmic mysteries that deserves to be solved,” states Dr. Tarraneh Eftekhari, a radio astronomer at Northwestern University, in reference to the first new paper, published in The Astrophysical Journal Letters.

Though the solution may be a long way off, the universe continues to guard its secrets.

What Makes the FRB Mysterious?

While it may not be entirely accurate to say that FRBs were discovered purely by chance, their initial detection happened within data collected for an entirely different purpose.

Pulsars, or “pulsating radio sources,” are far better understood cosmic phenomena. Discovered in 1967 by Jocelyn Bell Burnell, they arise from neutron stars: incredibly dense remnants of giant stars with magnetic fields far stronger than Earth’s.

These rapidly spinning stellar remnants emit regular pulses of radio waves akin to cosmic beacons.

The consistency of these pulses and their emission at specific frequencies initially led to speculation that they could be of artificial origin, which earned the first pulsar the nickname “Little Green Men 1.”

While pulsars quickly found their rightful place in astrophysics, FRBs tell a different story.

Jump forward to 2007, when FRBs emerged unexpectedly from data gathered by the Parkes Multibeam Pulsar Survey, an international collaboration involving Jodrell Bank Observatory, the Massachusetts Institute of Technology, Bologna Astronomical Observatory and the Australia Telescope National Facility.

The emission from this event was so powerful that it overshadowed all other known sources at the time by a substantial margin.

“In terms of energy output, a 1-millisecond-long FRB can emit as much energy as the Sun produces over three days,” says Dr. Fabian Jankowski, an astrophysicist at the French National Centre for Scientific Research (CNRS) who specializes in FRBs.
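That comparison is easy to sanity-check with rough numbers. A minimal back-of-envelope sketch, assuming a representative FRB energy of about 10^32 joules (an illustrative value, not a figure quoted in this article):

```python
# Back-of-envelope check of the claim that a ~1 ms FRB can rival
# the Sun's output over three days. The FRB energy used here is an
# assumed, representative value, not one from the article.
SOLAR_LUMINOSITY_W = 3.828e26          # IAU nominal solar luminosity, watts
SECONDS_IN_THREE_DAYS = 3 * 24 * 3600  # 259,200 s

solar_energy_3_days = SOLAR_LUMINOSITY_W * SECONDS_IN_THREE_DAYS  # joules
frb_energy = 1e32  # assumed typical FRB energy in joules (illustrative)

print(f"Sun over three days: {solar_energy_3_days:.2e} J")
print(f"FRB / Sun-over-3-days ratio: {frb_energy / solar_energy_3_days:.2f}")
```

With these numbers the Sun radiates roughly 9.9 × 10^31 joules in three days, the same order of magnitude as the assumed millisecond burst, so the comparison holds up.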

However, for over five years after the initial detection, no similar events were recorded, and skepticism grew. It faded only as more FRBs began to emerge.

Thousands have been detected since then, and astronomers estimate that two or three FRBs may blaze across the sky every minute.

These enigmatic signals release immense energy from deep space, illuminating the sky with their mysterious nature. And the strangeness does not end there.

Initially, FRBs were believed to be one-off occurrences, cosmic anomalies. This assumption seemed valid, as follow-up observations failed to reveal any repeating sources.

That changed in 2016 when FRB 121102 was found to emit repeated bursts. Currently, between 3% and 10% of FRBs are classified as “repeaters.”

Why do some FRBs remain silent after a single burst, while others emit multiple bursts? This is yet another mystery awaiting resolution.


What Causes FRBs?

Numerous hypotheses have been proposed for the cause of FRBs, ranging from colliding black holes to extraterrestrial signals; in one infamous case, FRB-like signals turned out to be interference from a microwave oven at the observatory. However, one candidate has risen above the rest.

“When massive stars collapse and go supernova, they leave behind highly magnetized neutron stars, or ‘magnetars,'” notes Eftekhari. “The reason magnetars are a compelling candidate for FRBs is that we have observed similar events emanating from known magnetars within our Milky Way.”

Neutron stars already possess strong magnetic fields, but magnetars are in a category of their own, with magnetic fields thousands of times stronger than those of typical neutron stars.

Furthermore, a higher frequency of FRBs has been detected in galaxies with rapid star formation. As Eftekhari explains, “To produce a supernova that results in a magnetar, a massive star is required, and these giant stars are found in star-forming galaxies.”

So, is the case settled? Not quite.

The Canadian CHIME radio telescope detected FRB 20240209A, potentially originating from a globular cluster. – Photo Credit: CHIME Experiment

This is where the two new studies published in January 2025 come into play, both examining the recurring FRB known as 20240209A.

“The first exciting aspect of this FRB is that it was located outside its host galaxy,” says Vishwangi Shah, a doctoral student at McGill University, referencing the second study.

“Only one other FRB has been detected outside its host galaxy like this. That one is also a repeater, and it originates from a globular cluster.”

Both Eftekhari and Shah suggest that 20240209A is likewise associated with a globular cluster (a dense group of ancient stars on the outskirts of a galaxy).

“This is remarkable,” Eftekhari comments. It poses a challenge for the magnetar explanation, since magnetars are thought to form from young, massive stars, and globular clusters contain only old ones.

So what does this mean for FRBs? One possibility is that magnetars are still the culprits, but they may be generated through entirely different mechanisms.

For instance, within these stellar graveyards, two normal neutron stars might combine to form magnetars. Alternatively, a white dwarf—a stellar remnant too small to evolve into a neutron star—could gather material from a nearby companion, culminating in a massive explosion that results in a magnetar.

Ultimately, the exact origin of these outlier events remains unknown. “It’s thrilling to contemplate that we might be dealing with a subpopulation of FRBs,” Eftekhari remarks. “The case isn’t as closed as it seemed.”

Can We Determine the Origins of FRBs?

Despite nearly two decades of research, many questions regarding FRBs linger. Which objects are responsible? What processes drive these phenomena? And why do some FRBs repeat while others do not?

Thanks to advances in FRB detection technology, answers may be nearer than anticipated.

The recent findings on 20240209A relied on the Canadian Hydrogen Intensity Mapping Experiment (CHIME), a novel radio telescope capable of detecting two to three FRBs daily.

CHIME is currently undergoing enhancements aimed at pinpointing bursts with unprecedented precision.

This advancement in FRB detection represents great progress towards unraveling their mysteries: while many FRBs have been observed, the difficulty of accurately pinpointing their environments has left several key questions about their origins unanswered.

Jankowski believes that in the near future, many cases like 20240209A could be unlocked, revealing their underlying mechanisms. “I anticipate significant progress in the coming years,” he adds.

The Square Kilometer Array (SKA), a massive observatory spanning Australia and South Africa, aims to join the search for FRBs shortly.

Eftekhari and Shah have also proposed utilizing the James Webb Space Telescope to explore the region where 20240209A was detected.

“It’s an incredibly exciting time for FRB research,” highlights Jankowski. “We are poised to make remarkable discoveries in the next few years.”

Meet Our Experts

Dr. Tarraneh Eftekhari is a radio astronomer at Northwestern University, USA. Her work has appeared in scientific journals including The Astrophysical Journal Letters, Nature Astronomy and The Astrophysical Journal.

Dr. Fabian Jankowski is an astrophysicist at the French National Centre for Scientific Research (CNRS) who specializes in FRBs. His work has appeared in Monthly Notices of the Royal Astronomical Society, The Astrophysical Journal Letters, and Astronomy and Astrophysics.

Vishwangi Shah is a doctoral student at McGill University in Canada, researching radio astronomy and FRBs. She has been published in The Astrophysical Journal Letters and The Astronomical Journal.


Source: www.sciencefocus.com

Study: Bean Consumption Enhances Metabolic and Inflammatory Indicators in Prediabetic Adults

A 12-week study involving 72 pre-diabetic adults revealed that consuming either chickpeas or black beans improves inflammation markers in pre-diabetic individuals, with chickpea intake also helping to regulate cholesterol.

Incorporating a daily cup of beans can yield significant benefits for both heart and metabolic health. Image credit: PDPics.

“Pre-diabetic individuals often exhibit poor lipid metabolism and persistent low-grade inflammation, both of which can lead to diseases like heart disease and type 2 diabetes.”

“Our findings indicated that while glucose levels remained constant, beans may aid in lowering cholesterol in pre-diabetic individuals while also diminishing inflammation.”

While black beans and chickpeas are widely consumed, they are frequently neglected in extensive studies examining their effects on cholesterol and inflammation in those at risk for heart disease and diabetes.

This research forms part of a broader project investigating how the intake of black beans and chickpeas influences inflammation and insulin response mediated by intestinal microbiome activity.

“Our study highlights the advantages of bean consumption for pre-diabetic adults, but these legumes are excellent choices for everyone,” stated Smith.

“These insights can help shape dietary recommendations, clinical practices, and public health initiatives aimed at preventing heart disease and diabetes.”

To enhance the practical relevance of the research, the study was conducted with participants in their natural living environments.

Participants were randomly assigned to consume either 1 cup of black beans, chickpeas, or rice (the control group) over the span of 12 weeks.

Blood samples were collected at baseline, 6 weeks, and 12 weeks to monitor cholesterol levels, inflammation, and blood glucose; glucose tolerance tests were administered at both the beginning and conclusion of the study.

The group consuming chickpeas saw a significant drop in total cholesterol, from an average of 200.4 milligrams per deciliter at the start to 185.8 milligrams per deciliter after 12 weeks.

In the black bean group, the average level of the inflammatory cytokine interleukin-6, which is a marker for inflammation, decreased from 2.57 picograms per milliliter at baseline to 1.88 picograms per milliliter after 12 weeks.
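The reported drops can be expressed as relative changes; a minimal sketch (the helper name is ours, not from the study):

```python
# Percent change between baseline and follow-up measurements.
# Helper name is illustrative, not from the study's methods.
def percent_change(baseline, final):
    return (final - baseline) / baseline * 100.0

# Chickpea group: total cholesterol, mg/dL
chol = percent_change(200.4, 185.8)   # about -7.3% over 12 weeks
# Black-bean group: interleukin-6, pg/mL
il6 = percent_change(2.57, 1.88)      # about -26.8% over 12 weeks
print(round(chol, 1), round(il6, 1))
```

By this measure, the inflammation-marker drop in the black-bean group is proportionally much larger than the cholesterol drop in the chickpea group.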

No noteworthy changes were noted in markers of glucose metabolism.

“Convenient options like canned, dried, or frozen beans are an excellent starting point for those looking to increase their bean intake,” explained Smith.

“However, it’s crucial to watch for extra ingredients like salt and sugar based on your selections.”

“There are numerous ways to include beans in your regular diet as a budget-friendly method to enhance your overall health and lower the risk of chronic ailments,” Smith added.

“You can blend them to thicken soups, use them as salad toppings, or combine them with other grains like rice or quinoa.”

The findings were reported in a presentation on June 3rd during the Nutrition 2025 annual meeting held by the American Society for Nutrition.

____

Morgan M. Smith et al. Effects of chronic intake of black beans and chickpeas on metabolism and inflammatory markers in prediabetic adults. Nutrition 2025, Abstract #OR18-01-25.

Source: www.sci.news

Fossils of 160-Million-Year-Old Blue-Stain Fungi Discovered in China

Blue-stain fungi inhabit coniferous wood and draw the interest of insects. A fossil find in wood of Xenoxylon phyllocladoides from the Jurassic Tiaojishan Formation in China extends the early fossil record of blue-stain fungi by around 80 million years, recalibrating the evolutionary timeline of this fungal group and offering fresh perspectives on the evolution of its ecological relationships with wood-boring insects.

Blue-stain fungi in wood tissue of Xenoxylon phyllocladoides from the Jurassic of western Liaoning Province, China. Image credit: Tian et al., doi: 10.1093/nsr/nwaf160.

Blue-stain fungi form a distinctive group of wood-inhabiting fungi that cannot degrade lignocellulose but can cause significant discoloration, particularly in conifers.

Generally, these fungi are not fatal to their hosts, though they can hasten tree mortality when linked with wood-boring insects.

Recent molecular phylogenetic studies suggest that blue-stain fungi may represent an ancient group that existed during the late Paleozoic or early Mesozoic eras.

However, the geological history of blue-stain fungi remains largely unexplored.

“Until 2022, the earliest confirmed fossil evidence of blue-stain fungi was from the Cretaceous period in South Africa, around 80 million years ago,” stated Dr. Ning Tian, a paleontologist at Shenyang Normal University.

Dr. Tian and colleagues uncovered well-preserved fossilized hyphae in 160-million-year-old petrified wood of Xenoxylon phyllocladoides from the Tiaojishan Formation in Northeast China.

“Microscopy revealed darkly pigmented fossil hyphae, resembling the characteristics of modern blue-stain fungi responsible for wood discoloration,” they noted.

“Notably, when penetrating woody cell walls, the hyphae usually form a specialized structure known as a penetration peg.”

“As they invade the wood cell walls, the hyphae become finer, allowing them to pass through this robust barrier more easily.”

“The discovery of these penetration pegs allowed us to confirm that the fossilized fungi we encountered belong to the blue-stain fungal group.”

“Unlike wood-decomposing fungi that break down wood cell walls through enzyme secretion, blue-stain fungi lack the enzymatic ability to degrade wood.”

“Instead, their hyphae breach the wood cell wall mechanically using these penetration pegs.”

“This discovery of Jurassic blue-stain fungi from China marks the second report of such fossil fungi and adds to the early fossil record of this group worldwide,” noted the team, which includes researchers from the Nanjing Institute of Geology and Palaeontology.

“It also sheds light on the ecological interactions between blue-stain fungi, plants, and insects during the Jurassic period.”

The bark beetle subfamily Scolytinae is considered a major spore disperser for present-day blue-stain fungi.

However, molecular and fossil data indicate that Scolytinae likely did not originate until the Early Cretaceous.

Given the Jurassic age of the fossil fungi, the spore-dispersing agent was likely not Scolytinae but another wood-boring insect prevalent at the time.

The findings are detailed in a paper published in the June 2025 issue of the journal National Science Review.

____

Ning Tian et al. 2025. Jurassic blue-stain fungi provide new insights into early evolution and ecological interactions. National Science Review 12 (6): nwaf160; doi: 10.1093/nsr/nwaf160

Source: www.sci.news

Physicists Investigate Light’s Interaction with Quantum Vacuums

Researchers have successfully conducted the first real-time 3D simulation demonstrating how a powerful laser beam alters the quantum vacuum. Remarkably, these simulations reproduce an unusual phenomenon anticipated by quantum physics, known as vacuum four-wave mixing. This principle suggests that the combined electromagnetic fields of three laser pulses can polarize virtual electron-positron pairs within the vacuum, resulting in photons scattering off one another as if they were billiard balls.



Illustration of photon-photon scattering in a laboratory: Two green petawatt laser beams collide in focus with a third red beam to polarize the quantum vacuum. This allows the generation of a fourth blue laser beam in a unique direction and color, conserving momentum and energy. Image credit: Zixin (Lily) Zhang.

“This is not merely a matter of academic interest. It represents a significant advance toward experimental validation of quantum effects, which have largely remained theoretical,” remarks Professor Peter Norreys from the University of Oxford.

The simulation was executed using an enhanced version of Osiris, a simulation software that models interactions between laser beams and various materials or plasmas.

“We are doctoral students at the University of Oxford,” said Zixin (Lily) Zhang.

“By applying the model to a three-beam scattering experiment, we were able to capture a comprehensive spectrum of quantum signatures, along with detailed insights into the interaction region and the principal time scale.”

“We’ve rigorously benchmarked the simulation, enabling our focus to shift to more intricate, exploratory scenarios, like exotic laser beam structures and dynamic focus pulses.”

Crucially, these models furnish the specifics that experimentalists depend on to design accurate real-world tests, encompassing realistic laser configurations and pulse timing.

The simulations also uncover new insights into how these interactions develop in real-time and how subtle asymmetries in beam geometry can influence the outcomes.

According to the team, this tool not only aids in planning future high-energy laser experiments but also assists in the search for evidence of hypothetical particles, such as axions and millicharged particles, or potential dark matter candidates.

“The broader planned experiments at state-of-the-art laser facilities will greatly benefit from the new computational methods implemented in Osiris,” noted Professor Luís Silva, a physicist at the Instituto Superior Técnico in Lisbon and the University of Oxford.

“The integration of ultra-intense lasers, advanced detection techniques, cutting-edge analysis, and numerical modeling lays the groundwork for a new era of laser-material interactions, opening new avenues for fundamental physics.”

The team’s paper was published today in the journal Communications Physics.

____

Z. Zhang et al. 2025. Computational modeling of real-time quantum vacuum interactions in 3D. Communications Physics 8: 224; doi: 10.1038/s42005-025-02128-8

Source: www.sci.news

Texas Woman Dies from Brain-Eating Amoeba After Using Tap Water for Sinus Rinse

A Texas woman died from a brain infection caused by an amoeba after using tap water for sinus irrigation, according to a case report from the Centers for Disease Control and Prevention.

The 71-year-old woman, who was otherwise healthy, experienced severe neurological symptoms, including fever, headache, and altered mental status, four days after filling her nasal irrigation device with tap water from her RV’s water system at a Texas campground.

She received treatment for primary amoebic meningoencephalitis, an infection caused by Naegleria fowleri, often referred to as the “brain-eating amoeba.” Despite medical intervention, she suffered a seizure and succumbed to the infection eight days post-symptom onset, according to the CDC.

Laboratory tests confirmed the presence of the amoeba in the woman’s cerebrospinal fluid.

The CDC noted that while infections commonly occur after recreational water use, using undistilled water for sinus irrigation is also a significant risk factor for primary amoebic meningoencephalitis.

A survey conducted by the agency revealed that although the woman had not been recently exposed to freshwater, she had used non-boiled water from the RV’s drinking water tap for nasal irrigation multiple times prior to her illness.

According to the survey findings, the RV’s drinking water tank had been filled before the RV was purchased three months earlier and may have contained contaminated water. The investigation concluded that contamination could also have entered from the municipal water system connected to the RV’s plumbing, bypassing the tank.

The agency underscores the importance of using distilled, sterilized, or previously boiled and cooled tap water for nasal irrigation to lower the risk of infection or illness.

Source: www.nbcnews.com

Retinal Implants Regain Vision in Blind Mice

Retinal damage can result in blindness

BSIP SA/Alamy

Retinal implants have shown potential in restoring vision in blind mice, indicating that they may eventually help those with conditions like age-related macular degeneration, where photoreceptor cells in the retina deteriorate over time.

Shuiyuan Wang from Fudan University in China and his team developed a retinal prosthesis composed of metal nanoparticles that replicate the function of lost retinal cells, converting light into electrical signals to be sent to the brain.

In their experiments, the researchers administered nanoparticles into the retinas of mice that had been genetically modified to be nearly completely blind.

They restricted water access for three days to both the modified blind mice and those with normal vision. Subsequently, they trained all mice to activate a 6cm wide button on a screen to receive water.

Following training, each mouse underwent 40 testing rounds. Fully sighted mice pressed the button successfully 78% of the time, mice with implants achieved a 68% success rate, and untreated blind mice managed only 27%. “That presents a very noticeable effect,” stated Patrick Degenaar at Newcastle University in the UK, who wasn’t involved in the research.

After 60 days, researchers observed minimal signs of toxicity from the implants in the mice. However, Degenaar emphasized the need for long-term safety data, stating, “For clinical application, extensive animal testing lasting approximately five years will be necessary.”

“Patients with age-related macular degeneration and retinitis pigmentosa could benefit from this prosthetic,” noted Leslie Askew from the University of Surrey, UK, who was not part of the study.

Degenaar also remarked that justifying this solution for age-related macular degeneration patients is complex, as they possess a degree of vision that may not warrant the risks associated with implanting prosthetics.

Furthermore, he noted that mice generally have inferior vision compared to humans, raising uncertainty about how beneficial the findings will be for people until comprehensive clinical trials are conducted.


Source: www.newscientist.com

Taurine Might Not Play a Significant Role in Aging After All

Taurine supplements are seen as potentially effective in slowing aging, but this may not hold true

Shutterstock / Eugeniusz Dudzinski

While it was previously thought that taurine, an amino acid, diminishes with age, research in animals suggested that taurine supplements might help slow down the aging process. New studies, however, indicate this decline is not consistent. In fact, taurine levels often increase with age, indicating that low nutrient levels might not be the primary factor driving aging.

Earlier research indicated that taurine levels decrease in aging men, with those exhibiting higher taurine levels at age 60 experiencing better health outcomes. This correlation suggests low taurine levels might contribute to aging, supported by evidence that taurine supplements can extend the lifespans of mice and monkeys.

The challenge lies in the fact that taurine levels can fluctuate due to various factors, including illness, stress, and dietary habits. Thus, a reduction in this vital amino acid may not be directly linked to the aging process. Maria Emilia Fernandez and her team at the National Institute on Aging in Maryland assessed taurine levels in 742 individuals aged 26 to 100. The cohort consisted of roughly equal numbers of men and women with no major health issues, and multiple blood samples were taken from each between January 2006 and October 2018.

On average, taurine levels in women aged 100 were nearly 27% higher than in women aged 26, while men between the ages of 30 and 97 exhibited an approximate 6% increase. Similar trends were noted among 32 monkeys sampled at ages ranging from about 5 to 32 years, with taurine levels rising by an average of 72% in females and 27% in males.

These results underscore that taurine levels may not be a reliable indicator of aging. Importantly, taurine concentrations vary widely among individuals and can change over time due to external factors, according to Fernandez.

Nevertheless, some individuals may still find taurine supplementation beneficial. Fernandez highlights research indicating its potential to help regulate blood glucose levels in people with type 2 diabetes or those who are obese. However, the question of whether taurine can slow aging in otherwise healthy individuals remains unanswered.

Vijay Yadav from Rutgers University and his colleagues are currently leading clinical trials on taurine supplementation in middle-aged adults. “We aim to conclude the trial by the end of 2025,” he states. “Our goal is to produce robust data to determine if taurine supplementation can decelerate human aging or enhance health and fitness.”

The article was revised on June 5th, 2025

Vijay Yadav’s affiliation has been corrected


Source: www.newscientist.com

Worms Unite to Create Tentacles and Explore New Areas

https://www.youtube.com/watch?v=7jlpeimmgyw

What should a tiny, millimetre-long worm do when food is scarce? The solution lies in teaming up with countless companions to form tentacle-like structures that can bridge gaps to nearby objects or latch onto larger passing creatures, hitching a ride to new territory.

Researchers examining nematode worms in laboratory settings have long observed their ability to construct “towers,” yet the phenomenon lacked thorough exploration, states Serena Ding from the Max Planck Institute of Animal Behavior in Germany. Therefore, she and her team aimed to investigate it further.

The research focused on the nematode Caenorhabditis elegans. In the experiments, when food was scarce and a structure to climb was available, large numbers of worms tended to form towers. For these studies, the team used toothbrush bristles as a base.

While worms occasionally formed towers without any physical support, these structures were typically under 5 mm tall and only lasted about a minute. In contrast, when built upon the bristles, the towers reached heights of 11 mm and could endure for up to half a day.

In other nematode species, reports indicate towers can grow as tall as 50 mm. “They can expand significantly,” notes Ding.

Although the base of the tower remains steady, the upper portion can extend well beyond the support and exhibit movement similar to tentacles. This allows the towers to reach out to nearby surfaces, forming bridges that enable the worms to traverse much wider gaps than individual organisms could manage.

“Tower” of nematode worms on rotten apples

Perez et al. Current Biology (2025)

The towers are capable of gripping objects that come into contact with them, such as fruit fly legs, effectively hitching a ride for the worms. This allows them to travel further without exerting their own energy.

While it’s known that individual nematodes can latch onto insects for transportation, the idea that an entire tower could do the same was previously unverified. “That’s a feature we expected to observe,” says Ding.

Utilizing a digital microscope, the researchers documented the tower’s formation on a decaying apple in an orchard adjacent to their laboratory.

Worm towers are formed exclusively by a single species, despite the presence of various species around them. They can consist of worms at any stage of the life cycle, the team found. Previously, it was believed that only “dauer” worms, which enter a hardy larval stage during stressful conditions, could create these towers.

Other organisms show similar collective behavior. For instance, slime molds, amoeba-like single-celled organisms, can group together to form larger masses that move in search of nourishment.


Source: www.newscientist.com

How Our Brains Distinguish Between Imagination and Reality

Overlap of Brain Regions in Imagination and Reality Perception

Naeblys/Alamy

How can we differentiate between what we perceive as real and what we imagine? Recent findings have uncovered brain pathways that may assist in this distinction, potentially enhancing treatments for hallucinations associated with conditions like Parkinson’s disease.

It’s already established that the brain regions activated during imagination closely resemble those engaged when perceiving real visual stimuli; however, the mechanism distinguishing them remains elusive. “What allows our brains to discern between these signals of imagination and reality?” asks Nadine Dijkstra from University College London.

To explore this, Dijkstra and her team observed 26 participants engaged in visual tasks while their brain activity was monitored via MRI scans. The tasks included displaying static grey blocks on the screen for 2 seconds, repeated over 100 times. Participants were prompted to imagine diagonal lines within each block, with half of the blocks containing actual diagonal lines.

Subsequently, participants rated the vividness of the lines they perceived on a scale of 1-4 and indicated whether the lines were real or imagined.

Through the analysis of brain activity, researchers found that when participants viewed the lines more vividly, the fusiform gyrus, a specific brain area, was more active, irrespective of the line’s actual presence.

“Prior research indicated that this area is engaged in both perception and imagination, but this study reveals its role in tracking the vividness of visual experiences,” notes Dijkstra.

Crucially, when activity in the fusiform gyrus exceeded a certain threshold, activity also rose in an associated area known as the anterior insula, and participants perceived the lines as real. “This additional area connects to the fusiform gyrus, possibly aiding decision-making by processing and re-evaluating its signals,” she adds.

While it’s likely that these brain regions are not the sole players in discerning reality from imagination, further investigation into these pathways could refine our understanding of treating visual hallucinations linked to disorders such as schizophrenia and Parkinson’s disease.

“Individuals experiencing visual hallucinations might exhibit heightened activity when visualizing their imagined scenarios, or the monitoring of their signals could be inadequate,” Dijkstra suggests.

“I believe this research could be pivotal for clinical cases,” says Adam Zeman at the University of Exeter, UK. “However, distinguishing whether minor shifts in sensory experience are driven by real-world events, discerning fully formed hallucinations, and determining how long such beliefs persist remain significant challenges,” he explains.

To address this knowledge gap, Dijkstra’s team is currently studying the brain pathways of individuals with Parkinson’s disease.


Source: www.newscientist.com

Fusion Potential Won’t Be Realized Without Resolving the Lithium Bottleneck

The ITER project is an experimental fusion power reactor

iter

Nuclear fusion holds the promise of nearly limitless energy, but achieving this goal requires the world to produce large amounts of enriched lithium fuel, an industry that must be built essentially from scratch.

“A major challenge is the enrichment step, where specific lithium isotopes are concentrated,” explains Samuel Ward from Woodruff Scientific Ltd, a British firm dedicated to nuclear fusion. “There is currently no scalable solution capable of providing the fuel required for future fusion reactors.”

Lithium is essential for the most prevalent fusion technology being developed, which combines two forms of hydrogen to generate energy. Moreover, the rare lithium-6 isotope, constituting only 7.5% of naturally occurring lithium, is the most effective for sustaining the fusion process. Consequently, many fusion power projects depend on “enriched” lithium, increasing the lithium-6 content to over 50%, and occasionally as high as 90%.
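Those figures imply a substantial feed requirement. A minimal sketch using a textbook two-stream isotope mass balance (the 2% tails assay below is our illustrative assumption, not a figure from the article):

```python
# Two-stream isotope mass balance: feed F splits into product P and tails W.
# Conservation of mass and of lithium-6: F = P + W and F*x_f = P*x_p + W*x_w,
# which rearranges to F = P * (x_p - x_w) / (x_f - x_w).
def feed_required(product_mass_tons, x_product, x_feed, x_tails):
    return product_mass_tons * (x_product - x_tails) / (x_feed - x_tails)

# 1 ton of lithium enriched to 50% lithium-6, from natural lithium (7.5% Li-6),
# assuming tails depleted to 2% Li-6 (our assumption):
print(feed_required(1.0, 0.50, 0.075, 0.02))  # roughly 8.7 tons of natural lithium
```

Pushing the product assay toward 90% lithium-6 raises the feed requirement further, which is why tens of tons of enriched lithium per plant translate into a much larger natural-lithium supply chain.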

A single demonstration fusion plant, the kind designed to go beyond experimental reactors by delivering net electricity to the grid, would require between 10 and 100 tons of enriched lithium to start up and sustain operations, according to Ward and his team. Each new demonstration plant would add to this demand.

The first such plants are projected to be operational by around 2040, allowing time to build up lithium supplies. However, enrichment efforts must accelerate: one report indicates that the current lithium-6 supply is nearly non-existent. The US amassed stockpiles during the Cold War, producing approximately 442 tons of enriched lithium from 1952 to 1963 to support nuclear weapon fabrication. That process used toxic mercury, causing environmental pollution that took decades to remediate.

At present, the lithium used in fusion research is drawn from the scarce remaining stocks of highly enriched lithium produced for nuclear weapons, according to Egemen Kolemen at the Princeton Plasma Physics Laboratory, part of the US Department of Energy.

To supply the first power plants, researchers are advocating a modernized, more environmentally responsible version of the enrichment process, though it still relies on mercury. Last year, the German government allocated funds for a project aimed at advancing this form of lithium enrichment while improving its cost-effectiveness. “We plan to launch the first enrichment facility in Karlsruhe by 2028,” says Michael Frank, who is participating in the initiative at Argentum Vivum Solutions, a German consultancy.

“The only viable approach for supplying adequate enriched lithium [in the] short and medium term relies on mercury-based methods,” asserts Thomas Giegerich from the Karlsruhe Institute of Technology in Germany, also a collaborator on the project. However, this type of method will not suffice for the extensive requirements of hundreds or thousands of commercial fusion reactors.

“There is broad recognition that mercury-dependent processes cannot sustainably support the widespread deployment of fusion energy,” states Adam Stein from the Breakthrough Institute, a research center based in California.

Various mercury-free enrichment techniques are under exploration, but none is ready for immediate deployment. Among them is an effort funded by the UK Atomic Energy Authority to develop a clean lithium enrichment process, including efficient lithium-6 separation using microorganisms.

“Other techniques have yet to be demonstrated at a commercial level, given the current lack of demand and the need for further innovation, but they must succeed,” says Stein.


Source: www.newscientist.com

The Earth’s Atmosphere Reaches CO2 Levels Not Seen in Millions of Years

Recent data from the National Oceanic and Atmospheric Administration and the Scripps Institution of Oceanography at the University of California, San Diego, indicate that the Earth’s atmosphere contains more carbon dioxide than it has in millions, and potentially tens of millions, of years.

For the first time ever, the global average concentration of carbon dioxide—a greenhouse gas emitted from burning fossil fuels—surpassed 430 parts per million (ppm) in May. These measurements represent a record high, with an increase of over 3 ppm from last year.

The findings suggest that efforts to curtail greenhouse gas emissions and reverse the growing accumulation of CO2 are insufficient.

“Another year, another record,” stated Ralph Keeling, a professor of climate science, marine chemistry, and geochemistry at the Scripps Institution of Oceanography in San Diego, California. “I am saddened.”

Carbon dioxide, like other greenhouse gases, traps heat from the sun and can persist in the atmosphere for centuries. High levels of these gases contribute to rising global temperatures and other adverse effects of climate change, including increased sea levels, polar ice melt, and more frequent extreme weather events.

Since the pre-industrial era, CO2 levels in the atmosphere have sharply risen, primarily due to human activities that release greenhouse gases.

Just a few decades ago, crossing the 400 ppm threshold seemed unimaginable. This means that for every million molecules of gas in the atmosphere, over 400 would be carbon dioxide. The planet reached this daunting milestone in 2013. Current warnings suggest that CO2 levels could approach 500 ppm within the next 30 years.
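The ppm figure and the 500 ppm projection can be sanity-checked with simple arithmetic; a rough sketch (linear extrapolation is our simplification, and since real CO2 growth is accelerating it overstates the time needed):

```python
# ppm means parts per million: 430.2 ppm is a fraction of 430.2 / 1_000_000.
fraction = 430.2 / 1_000_000   # about 0.00043 of air molecules are CO2

# Linear extrapolation from the current level and recent growth rate.
# The ~3 ppm/year rate is taken from the article; real growth is accelerating.
def years_to_reach(target_ppm, current_ppm=430.2, growth_per_year=3.0):
    return (target_ppm - current_ppm) / growth_per_year

print(years_to_reach(500.0))   # a bit over 23 years, within "the next 30 years"
```

Even this conservative straight-line estimate lands well inside the 30-year window cited in the warnings.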

Human society is now in uncharted territory.

According to Keeling, the planet likely experienced such high atmospheric CO2 levels over 30 million years ago, during a time with very different climatic conditions.

He noted the remarkable speed at which CO2 levels are rising.

“It’s changing very quickly,” he told NBC News. “If humans had evolved in an environment with high CO2 levels, the absence of suitable habitats would have likely shaped our evolution. We could have adapted to that world, but instead, we’ve constructed society and civilization based on the climate of the past.”

CO2 levels are typically illustrated using the Keeling Curve, named in honor of Keeling’s father, Charles David Keeling, who began daily atmospheric CO2 measurements in 1958 from the Mauna Loa Observatory in Hawaii.

The Keeling Curve prominently displays the steep rise in CO2 since the Industrial Revolution, attributed to human-induced climate change.

Ralph Keeling and his colleagues at the Scripps Institution of Oceanography reported that the average atmospheric CO2 concentration for May was 430.2 ppm, while NOAA’s Global Monitoring Laboratory, which has been conducting separate daily measurements since 1974, recorded an average of 430.5 ppm for the same month.

Monitoring atmospheric carbon dioxide levels is crucial for understanding how human activities impact the Earth’s climate. These measurements also serve as key indicators of the planet’s overall health.

“These measurements provide insight into the health of the entire system with just one data point,” Keeling explained. “We achieve a comprehensive view of the atmosphere through relatively simple measurement techniques.”

Source: www.nbcnews.com

Will Life Beneath the Waves Shape Our Future as Sea Levels Rise?

Is this the future in a world where the oceans are rising?

Deep R&D Ltd

The Bajau are indigenous marine people of Southeast Asia, often referred to as sea nomads. For millennia, they have thrived along coastlines, foraging underwater without the aid of diving gear and holding their breath for astonishing durations. Yet the early 21st century introduced multiple crises that jeopardized their way of life: industrial overfishing, pollution, and coral bleaching diminished food sources, while rising sea levels consumed coastal dwellings.

In 2035, a Bajau community near Sabah, North Borneo, initiated fundraising for a contemporary floating and underwater settlement. They collaborated with Deep, a manufacturer of submarine habitats, to create interconnected rafts and underwater homes, developing business models that could be emulated by other maritime communities facing similar threats from rising seas. Revenue streams included extreme adventure tourism, scientific research facilities, and longevity clinics.

The first habitat comprised a network of platforms and rafts, with tunnels leading to underwater levels. While residents occupied surface structures, they increasingly utilized submerged areas for storage, sustenance, and sleep. The habitat was constructed using a 3D-printing technique known as wire-arc additive manufacturing, which allowed pressure loads to be distributed most effectively in areas under strain.

The deeper sections were maintained either at ambient water pressure or at the atmospheric pressure of the surface. In ambient-pressure modules deeper than 20 meters, occupants, referred to as aquanauts, breathed a special gas mixture to prevent nitrogen narcosis, and those exiting deep modules required decompression when returning to surface pressure. An advantage of the ambient-pressure modules was the incorporation of a moon pool, enabling aquanauts to swim directly into the sea for leisure, research, and farming activities.

Undersea hotels catering to extreme tourism surged in popularity. In the Galapagos, tourists reside in submerged hotels, exploring hydrothermal vents and observing some of the planet’s rarest life forms. Scientists, meanwhile, use the same modules to investigate deep-sea ecosystems. Undersea mapping technologies have matured, enabling researchers to explore vast ocean territories that were previously unreachable, fostering understanding of, and interactions with, whales and other deep-sea creatures, and leading to significant advances in marine biology.

Aquanauts can swim directly into the deep sea for recreational, research, and agricultural activities

The Bajau have long been adapted to marine environments. With thousands of years at sea, they possess enlarged spleens that hold a larger reserve of oxygen-carrying red blood cells than is typical. Some Bajau divers can spend five hours a day underwater, free-diving to depths of 70 meters without oxygen tanks and holding their breath for up to 15 minutes. After moving into seabed habitats, many Bajau left surface living behind, spending ever more time submerged. Some turned to gene editing to enhance their aquatic capabilities, while others intentionally ruptured their eardrums to facilitate deeper dives or used lung surfactants to aid decompression, akin to adaptations found in diving marine mammals.

A Bajau diver

Marco Rayman/Alamy

Numerous communities have established depth-based clinical treatments. Previous research has demonstrated that intermittent daily sessions of pressurized oxygen therapy can alleviate various medical conditions and age-related diseases. In one study of hyperbaric oxygen therapy, individuals who underwent consistent high-pressure sessions had longer telomeres and enhanced clearance of senescent cells, both of which are linked to increased longevity. The deep habitats have attracted affluent seniors looking to extend their lives, providing a lucrative income source.

The majority of marine communities have become self-sufficient, cultivating their own food through aquaculture of fish, mollusks, and seaweed, while growing other crops on the surface. Energy comes from a combination of solar, wind, wave, and geothermal sources, tailored to local conditions. Some communities focus on tourism, whereas others specialize in carbon capture or medical services. Large amounts of seaweed are harvested, sunk into the ocean depths, and sold as carbon credits.

Living beneath the waves isn’t for everyone. Nonetheless, these habitats empower those most vulnerable to climate change, giving them the tools to redefine their livelihoods and lifestyles, even in the face of rising sea levels that threaten their homes.

Rowan Hooper is the podcast editor for New Scientist and author of *How to Spend a Trillion Dollars: The 10 Global Problems We Can Actually Fix*. Follow him on Bluesky @rowhoop.bsky.social

Source: www.newscientist.com

Could the Competition Among Microscope Manufacturers Spark the Next Major Breakthrough?

Feedback is New Scientist’s look at the latest developments in science and technology. Share items that may captivate readers by emailing feedback@newscientist.com.

Get Ready…

Attention athletics fans, there’s an intriguing new competition to check out: sperm racing.

It’s been reported that sperm counts are on the decline, with reduced sperm motility (swimming speed) being a significant contributing factor. To raise awareness, a teenage founder has introduced sperm racing as a sport. As they say: “We’re creating the first racecourse for sperm: two competitors, two samples, one microscopic finish line.”

Their site showcases “microscopic racetracks” that mimic reproductive systems, using “high-resolution cameras” to “track all microscopic movements.” They claim “It’s all streamed live” (Feedback assumes the word choice is deliberate), with the victor being “the first sperm to cross the finish line, confirmed via advanced imaging.”

The inaugural race on April 25th featured entries from two California universities. Readers may wonder why Feedback has been slow to cover this; the reason is a twist in the tale after the event.

Unfortunately for the organizers, journalists like River Page, a reporter at The Free Press, revealed that “the winner was predetermined. The ‘race’ was computer-generated.”

The issue is that microscopes simply can’t work that way. A racetrack long enough for a meaningful sperm race is impractical to follow on camera. In film, a camera operator can track Tom Cruise sprinting along the roof of a moving train; keeping a microscope in focus is challenging even when the cells are nearly stationary.

The creators apparently ran a real race in a private setting, relying on computer-generated imagery to “depict” sperm racing for paying spectators.

This has led to speculation that a second round of the sperm race is improbable. Feedback can’t help but recall, though, how millions relish completely fabricated “sports entertainment” like wrestling, and how outcomes in football often hinge on which teams have the wealthiest billionaires. Perhaps sperm racing could indeed be the next big sensation.

Water-Based Cooking

Feedback loves to explore the latest food trends, from cutting carbs to eating only lean meats, salt, and water! There’s even talk of “Air Protein,” which involves “microbial organisms that harness carbon dioxide.”

Just when I thought there couldn’t be more to discover, I stumbled upon “water-based cooking.” Given that the human body is about 60% water, my initial thought was that this might just be another way to say “cooking.” However, I later uncovered articles titled “Food Trends and Science – Why Cooking in Water May Help Slow Aging” and “What is Water-Based Cooking? And Why is it Healthier?”. It was time to delve deeper.

Essentially, water-based cooking means using water for cooking whenever possible, in place of oil. Think boiling, stewing, or steaming rather than stir-frying or roasting. This reduces the formation of harmful advanced glycation end products (AGEs), which form in the crispy bits of fried foods and have been linked to health complications, so water-based cooking enthusiasts steer clear of those.

Driving the trend is Michelle Davenport, a UCSF- and NYU-trained nutrition scientist and co-founder of a digital children’s food company. She educates followers on Instagram on how to manage metabolic health through water-based cooking inspired by family recipes.

Her TikTok posts read like: “You’ve switched to water-based cooking, and now your skin is clear, your digestion is thriving, and illness recovery is rapid.”

Feedback suspects there might be something in the details, but the pitch fits perfectly within wellness culture: if you’re not in peak health, it’s surely your own fault. Regardless, we sympathize with Elle from Bruski, who aptly stated: “It’s just soup. They’re making soup.”

Pizza Insights

We recently sought examples of scientific studies whose conclusions extend no further than what one might have guessed already. The first involved research indicating that an SUV poses a greater risk to pedestrians than a compact car.

In response, reader Roger Eldem shared a collection of findings that were decidedly unsurprising. One notable study, led by Steven DeFroda and published in The Journal of Knee Surgery, stated: “NFL players sustain a higher incidence of knee extensor tears during brief periods of rest compared to normal intervals.” This essentially confirms that NFL players are prone to knee injuries when they get less rest. Well, yes.

Eldem’s second intriguing find came from research led by Iizuka and published in Nutrients, with the captivating title: “The Type of Food, Not the Sequence, Influences Meal Duration, Chewing Frequency, and Pace.”

The study examines whether specific food types are consumed more quickly, potentially contributing to obesity later. A related article on MedicalXpress states: “Studies reveal that pizza is consumed more rapidly compared to meals that require chopsticks.” Clearly, fiddly food takes longer to eat.

Got a story for Feedback?

Send your stories to Feedback at feedback@newscientist.com. Please include your home address. Past and current Feedback columns can also be found on our website.

Source: www.newscientist.com

As Technology Advanced, Early Humans Developed Enhanced Teaching Skills

As technology evolves, humans enhance their ability to teach skills to others

English Heritage/Heritage Images/Getty Images

Research into human evolution spanning 3 million years illustrates that advancements in communication and technology have occurred simultaneously. As early humans developed more sophisticated stone tools and various techniques, they also refined their abilities to communicate and educate the next generation on these new skills.

“There exists a scenario for the evolution of modes of cultural transmission throughout human history,” states Francesco d’Errico from the University of Bordeaux, France. “It seems there’s a co-evolution between the complexity of cultural traits and the complexity of their transmission methods.”

A defining characteristic of humanity is the progression toward more complex tools and behaviors. For instance, ancient humans crafted sharp stones for cutting or stabbing and affixed them to wooden shafts to create spears.

Crucially, the ability to instruct others in these skills is vital. For more intricate tasks like playing the violin or coding, extensive education and practice are typically necessary. However, in prehistoric times, the capacity for effective communication was limited, particularly before intricate languages emerged.

Ivan Colagè from the Pontifical University of the Holy Cross in Rome, together with d’Errico, investigated how the transmission of cultural information has evolved over the last 3.3 million years, alongside changes in behavior and technology. They examined 103 cultural traits, such as specific types of stone tools, decorative items like beads, and burial customs, and documented the first appearance of each trait in the archaeological record as an indication of when it became common practice.

The researchers assessed the complexity involved in learning each trait. Some simple tools, like stone hammers, require minimal instruction. “They don’t need much explanation,” D’Errico notes. In contrast, demonstrating the creation of more advanced tools is necessary, and the most intricate behaviors, such as deeply symbolic burials, demand explicit verbal explanations.

To analyze this, D’Errico and Colagè outlined three dimensions of learning: First, spatial proximity—can tasks be learned from a distance, or does one need to be physically present? Second, temporality—does one brief lesson suffice, or are multiple sessions necessary, perhaps emphasizing various steps? Third, the social aspect—who learns from whom?

They evaluated these traits and consulted a panel of 24 experts for assessment, whose consensus reinforced their findings. “I believe the conclusion is quite robust,” says D’Errico.

Recent studies indicate two significant shifts in cultural communication. The first occurred around 600,000 years ago when early humans began teaching one another, likely without relying on spoken language; gestures may have sufficed. This predates the emergence of our species, Homo sapiens, and aligns with the onset of hafting.

The second shift happened between 200,000 and 100,000 years ago, coinciding with the development of modern languages, which became essential for performing complex tasks like burials. “These actions involve many detailed steps, requiring explanation,” D’Errico explains.

“The relationship between cultural communication and cultural complexity is strong,” asserts Ceri Shipton from University College London. He emphasizes that while the timeline for language development remains uncertain, this new estimate provides a “reasonable timeframe.”

topics:

  • Human evolution
  • Ancient humans

Source: www.newscientist.com

Giant Exoplanet Discovered Orbiting Low-Mass Star TOI-6894

The identification of TOI-6894b, an exoplanet roughly 86% the size of Jupiter orbiting a low-mass red dwarf star (0.2 solar masses), underscores the importance of improving our understanding of the formation mechanisms of giant planets and their protoplanetary disc environments.

Artist’s illustration of TOI-6894b behind its host star. Image credit: Mark Garlick / University of Warwick.

The TOI-6894 system is located approximately 73 parsecs (238 light years) away in the Leo constellation.
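As a quick sanity check on the quoted distance, the parsec-to-light-year conversion can be sketched in a couple of lines (the conversion factor is the standard 1 pc ≈ 3.26156 light years):

```python
# Convert the quoted distance of the TOI-6894 system from parsecs
# to light years using the standard conversion factor.
PC_TO_LY = 3.26156  # light years per parsec

distance_pc = 73
distance_ly = distance_pc * PC_TO_LY
print(f"{distance_pc} pc = {distance_ly:.0f} light years")  # 238 light years
```

The result rounds to 238 light years, matching the figure in the article.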

This planet was discovered through a comprehensive analysis of data from NASA’s Transiting Exoplanet Survey Satellite (TESS), aimed at locating giant planets around low-mass stars.

“I was thrilled by this discovery. I initially searched the TESS observations of low-mass red dwarf stars for signs of giant planets,” remarked Dr. Edward Bryant, an astronomer at University College London.

“Then, using observations from ESO’s Very Large Telescope (VLT), one of the largest telescopes in the world, I identified TOI-6894b, a giant planet orbiting the smallest known star to host such a companion.”

“I never anticipated that a planet like TOI-6894B could exist around such a low-mass star.”

“This finding will serve as a foundational element in our understanding of the boundary conditions for giant planet formation.”

TOI-6894b is a low-density gas giant, with a radius slightly exceeding that of Saturn but only about half of Saturn’s mass.
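To see why this makes the planet “low-density”, a back-of-the-envelope comparison with Saturn helps. This is only a sketch: Saturn’s mass and radius are standard reference values, and the planet’s relative mass and radius are taken loosely from the figures above.

```python
import math

# Saturn's bulk properties (standard reference values).
M_SATURN = 5.683e26   # kg
R_SATURN = 5.8232e7   # m (equatorial)

def mean_density(mass_kg, radius_m):
    """Mean density of a sphere, in kg/m^3."""
    return mass_kg / (4 / 3 * math.pi * radius_m ** 3)

rho_saturn = mean_density(M_SATURN, R_SATURN)
# TOI-6894b: roughly half Saturn's mass, radius slightly larger (assumed 1.02x).
rho_planet = mean_density(0.5 * M_SATURN, 1.02 * R_SATURN)

print(f"Saturn:    {rho_saturn:.0f} kg/m^3")
print(f"TOI-6894b: {rho_planet:.0f} kg/m^3")
```

Both values come out well below the density of water (1,000 kg/m³), and halving the mass at roughly the same radius roughly halves the density, which is why the planet is described as a low-density gas giant.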

The parent star is the lowest-mass star yet found to host a massive planet, with just 60% of the mass of the next-smallest star observed with such a planet.

“Most stars in our galaxy are actually small, and it was previously believed that they couldn’t support a gas giant,” stated Dr. Daniel Bayliss, an astronomer at the University of Warwick.

“Therefore, the fact that this star has a giant planet significantly impacts our estimates of the total number of giant planets likely to exist in the galaxy.”

“This is a fascinating discovery. We still don’t completely understand why relatively few stars can form such large planets,” commented Dr. Vincent Van Eylen, an astronomer at University College London.

“This drives one of our objectives to search for more exoplanets.”

“By exploring different planetary systems compared to our own solar system, we can evaluate our models and gain insights into how our solar system was formed.”

The prevailing theory of planetary formation is known as core accretion theory.

According to this theory, the cores of planets are initially formed by accreting material, and as the core grows, it attracts gases that eventually create its atmosphere.

Eventually, the core becomes sufficiently large to initiate the runaway gas accretion process, leading to the formation of a gas giant.

However, forming gas giants around low-mass stars presents challenges, as the gas and dust necessary for planetary formation in their protoplanetary discs is limited, hindering the formation of a sufficiently large core to kickstart this runaway process.

The existence of TOI-6894B indicates that this model may be insufficient and that alternative theories need to be considered.

“Considering TOI-6894b’s mass, it might have formed through an intermediate core-accretion mechanism, whereby the protoplanet forms and accretes gas gradually without ever becoming massive enough to trigger runaway gas accretion,” Dr. Bryant explained.

“Alternatively, it might have formed in a gravitationally unstable disc.”

“In certain cases, the disc surrounding the star can become unstable under its own gravity.”

“These disks may fragment as gas and dust collapse, leading to planet formation.”

However, the research team found that neither theory fully accounted for the formation of TOI-6894B based on the data available.

“Based on the stellar irradiation affecting TOI-6894B, we anticipate that its atmosphere is primarily influenced by methane chemistry, which is quite rare to identify.”

“The temperatures are low enough that atmospheric observations may even reveal the presence of ammonia.”

TOI-6894B might serve as a benchmark for methane-dominated atmospheric studies and an ideal laboratory for investigating planetary atmospheres containing carbon, nitrogen, and oxygen beyond our solar system.

The results will be published in the journal Nature Astronomy.

____

Bryant et al. A giant exoplanet in orbit around a 0.2 solar mass star. Nature Astronomy, Published online on June 4th, 2025. doi:10.1038/s41550-025-02552-4

Source: www.sci.news

Physicists Achieve Unmatched Precision in Measurement of the Muon Magnetic Anomaly

Researchers from the Muon g-2 experiment have unveiled their third and final measurement of the muon magnetic anomaly. The results align with findings published in 2021 and 2023 but boast significantly improved precision of 127 parts per billion, surpassing the experimental goal of 140 parts per billion.

Muon particles traveling through lead in a cloud chamber. Image credit: Jino John 1996 / CC BY-SA 4.0.

The Muon g-2 experiment investigates the wobble of a fundamental particle known as the muon.

Muons resemble electrons but are roughly 200 times more massive. Like electrons, they exhibit a quantum mechanical property called spin, which can be pictured as a tiny internal magnet.

When subjected to an external magnetic field, these internal magnets wobble akin to the axis of a spinning top.

The speed of this precession in a magnetic field is determined by the muon’s characteristics, captured numerically as the g-factor.

Theoretical physicists derive the g-factor from our current understanding of the universe’s fundamental workings, as encoded in the Standard Model of particle physics.

Nearly a century ago, g was predicted to be exactly 2; however, experimental measurements revealed small deviations from this value, quantified as the muon magnetic anomaly, aμ = (g − 2)/2, which gives the Muon g-2 experiment its name.
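In symbols, the quantity the experiment measures is the fractional deviation of the g-factor from the predicted value of 2:

```latex
a_\mu \;=\; \frac{g - 2}{2}
```

A measured g slightly greater than 2 therefore corresponds to a small positive anomaly, to which every Standard Model particle contributes through quantum loop effects.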

The muon magnetic anomaly encapsulates the effects of all Standard Model particles, and theoretical physicists can compute these contributions with remarkable precision.

Earlier measurements conducted at the Brookhaven National Laboratory during the 1990s and 2000s indicated potential discrepancies with the theoretical calculations of that era.

Disparities between experimental results and theoretical predictions could signal the existence of new physics.

In particular, physicists contemplated whether these discrepancies could stem from an undetected particle influencing the muon’s precession.

Consequently, physicists opted to enhance the Muon G-2 experiments to obtain more accurate measurements.

In 2013, Brookhaven’s muon storage ring was relocated from Long Island, New York, to Fermilab in Batavia, Illinois.

Following extensive upgrades and enhancements, the Fermilab Muon G-2 experiment launched on May 31, 2017.

Simultaneously, an international collaboration among theorists established the Muon G-2 theory initiative aimed at refining theoretical calculations.

In 2020, the Theoretical Initiative released updated and more precise standard model values informed by data from other experiments.

The discrepancy with theory appeared to widen in 2021, when Fermilab announced its initial experimental results, corroborating Brookhaven’s findings with improved accuracy.

Simultaneously, new theoretical predictions emerged, relying heavily on intensive computation.

This information closely aligned with experimental measurements and narrowed the existing discrepancies.

Recently, the Theoretical Initiative published a new set of predictions integrating results from various groups using novel calculation techniques.

This result remains in close agreement with experimental findings and diminishes the likelihood of new physics.

Nevertheless, theoretical endeavors will persist in addressing the disparities between data-driven and computational approaches.

The latest experimental values for the muon magnetic moment from Fermilab’s experiments are:

aμ = (g − 2)/2 (experimental value) = 0.001 165 920 705
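The quoted precision of 127 parts per billion can be translated into an absolute uncertainty on aμ. A minimal sketch, using only the figures quoted in the article:

```python
# Translate the quoted relative precision (127 parts per billion)
# into an absolute uncertainty on the muon anomaly a_mu.
a_mu = 0.001165920705
precision_ppb = 127

uncertainty = a_mu * precision_ppb * 1e-9
print(f"absolute uncertainty = {uncertainty:.2e}")  # 1.48e-10
```

That is, the anomaly is pinned down to about 1.5 parts in 10 billion.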

This final measurement is based on an analysis of data collected over the past three years, spanning 2021 to 2023, and is integrated with previously published datasets.

This has more than tripled the dataset size utilized in the second results from 2023, achieving the precision target set in 2012.

Moreover, it signifies the analysis of the highest quality data from the experiment.

As the second data collection run concluded, the Muon G-2 collaboration finalized adjustments and enhancements to the experiment, boosting muon beam quality and minimizing uncertainties.

“The anomalous magnetic moment of the muon (g-2) is pivotal because it provides a sensitive test of the Standard Model of particle physics,” remarked Regina Rameika, associate director for high energy physics at the U.S. Department of Energy.

“This is an exhilarating result, and it’s fantastic to witness the experiment reach a definitive conclusion with precise measurements.”

“This highly anticipated outcome represents a remarkable achievement in accuracy and will hold the title of the most precise measurement of muon magnetic anomalies for the foreseeable future.”

“Despite recent theoretical challenges that have lessened the evidence for new physics in Muon G-2, this finding presents a robust benchmark for proposed extensions to the standard model of particle physics.”

“This is an incredibly exciting moment; not only did we meet our objectives, we surpassed them, which is remarkable given how challenging such precision measurements are.”

“Thanks to Fermilab, the funding agencies, and the host lab, we accomplished our goals successfully.”

“For over a century, g-2 has imparted crucial insights into the nature of reality,” stated Lawrence Gibbons, a professor at Cornell University.

“It’s thrilling to contribute accurate measurements that are likely to endure for a long time.”

“For decades, the muon magnetic moment has served as a significant benchmark for the Standard Model,” noted Dr. Simon Corrodi, a physicist at Argonne National Laboratory.

“The new experimental results illuminate this fundamental theory and establish a benchmark to guide new theoretical calculations.”

These new results will be featured in the journal Physical Review Letters.

Source: www.sci.news

Research: Meat Preservation and Predator Protection, Not Cooking, Drove Early Use of Fire

The advent of fire marks a significant point in human evolution, though scholars continue to debate its primary function. While cooking is frequently cited as the key factor, researchers from Tel Aviv University propose that protecting meat and fat from predators is more plausible. Homo erectus lived during the Lower Paleolithic era, approximately 1.9 to 0.78 million years ago.


Miki Ben-Dor and Ran Barkai’s research highlights the nutritional value of meat and fat from large prey in the Lower Paleolithic, questioning the significance of culinary practices in shaping human dietary evolution and offering new insights into adaptations in Homo erectus.

“The origin of fire usage is a ‘burning’ question among prehistoric researchers globally,” stated Professor Barkai, a co-author of the study.

“It is widely accepted that by around 400,000 years ago, fire was commonly used in domestic settings, for roasting meat as well as for lighting and heating.”

“However, there remains a debate concerning the past million years, with various theories put forth to explain early human interactions with fire.”

“This study aimed to approach this issue from a new angle.”

“For early humans, the use of fire wasn’t a given; most archaeological sites older than 400,000 years show no signs of fire usage,” explained Dr. Miki Ben-Dor, lead author of the study from Tel Aviv University.

“At some early locations, however, there are clear indications of fire usage, even though there is no evidence of burnt bones or roasted meat.”

“We see early humans—mostly Homo erectus—utilizing fire sporadically for specific purposes rather than regularly.”

“Collecting fuel, igniting a fire, and maintaining it involved substantial effort, requiring a compelling energy-efficient reason.”

“We propose a new hypothesis for that motivation.”

In their research, the authors reviewed existing literature on all identified prehistoric sites between 1.8 million and 800,000 years ago where fire evidence has been found.

They identified nine sites globally, including Gesher Benot Ya’aqov and Evron Quarry in Israel, six sites in Africa, and one site in Spain.

The study also drew from ethnographic research on contemporary hunter-gatherer societies, relating their behaviors to ancient conditions.

“We examined the common features of these nine ancient sites and found they all contained a significant number of bones from large animals, mainly elephants, hippos, and rhinoceroses,” Dr. Ben-Dor noted.

“Previous research has shown these large animals were critical to early human diets, providing a substantial portion of their caloric needs.”

“For instance, the meat and fat from a single elephant can supply millions of calories, enough to sustain a group of 20 to 30 people for over a month.”

“Thus, hunting elephants and hippos was highly valuable—essentially a ‘bank’ of meat and fat that required protection and preservation, as it was sought after by predators and susceptible to decay.”
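The arithmetic behind the “bank of meat” claim is easy to sketch. The calorie figures below are illustrative assumptions for the sake of the calculation, not values from the study:

```python
# Illustrative check: how long could one elephant feed a forager band?
# Both the elephant's calorie content and the daily requirement are
# assumptions chosen for illustration.
elephant_kcal = 3_600_000      # assumed edible calories in one elephant
group_size = 25                # midpoint of the study's 20-30 person range
kcal_per_person_day = 2_500    # assumed daily requirement per person

days = elephant_kcal / (group_size * kcal_per_person_day)
print(f"one elephant sustains the group for about {days:.0f} days")
```

Under these assumptions a single elephant covers well over a month of the group’s needs, consistent with the claim quoted above.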

Through their analysis of findings and assessments of energetic benefits of preserving meat and fat, the researchers arrived at new conclusions that challenge previous theories. Early fires served dual purposes: first, to safeguard valuable resources from predators, and second, to facilitate smoking and prevent spoilage.

“This study introduces a novel perspective on the motivations behind early human fire use: the necessity to protect large game from other predators and the long-term preservation of substantial meat supplies,” Professor Barkay explained.

“Cooking may have occurred occasionally after fire was established for these protective purposes.”

“Such usage could elucidate evidence of fish roasting around 800,000 years ago found at Gesher Benot Ya’aqov.”

“Our approach aligns with evolving global theories that characterize major prehistoric trends as adaptations to hunting and consuming large animals, followed by a gradual shift to smaller prey exploitation.”

The results were published in the journal Frontiers in Nutrition.

____

Miki Ben-Dor & Ran Barkai. 2025. The bioenergy approach supports the conservation and protection of prey, rather than cooking, as a primary driver for early use of fire. Front. Nutr. 12; doi:10.3389/fnut.2025.1585182

Source: www.sci.news

New Study Reveals How Astrophysicists Can Utilize Black Holes as Particle Supercolliders

A recent study conducted by physicists at the University of Oxford, Johns Hopkins University, and the Institute of Astrophysics in Paris describes a natural gravitational particle accelerator: particles free-falling from infinity collide with matter near the innermost stable circular orbit of a rotating black hole, which repeatedly boosts their mass-energy and ejects some at enormous energies. In essence, this describes a supercollider.

This artist’s concept depicts the supermassive black hole at the heart of the Milky Way galaxy, known as Sagittarius A*. Image credit: NASA / ESA / CSA / Ralf Crawford, STScI.

Particle colliders accelerate protons and other subatomic particles towards one another at nearly the speed of light, revealing the fundamental properties of matter.

A flash of energy occurs upon collision, with the fragments potentially unveiling previously unknown particles that may serve as candidates for dark matter—a crucial, yet elusive, component of the universe that remains undetected.

Facilities like the Large Hadron Collider also contribute to advancements in areas such as the internet, cancer therapy, and high-performance computing.

“One of the great aspirations for a particle collider like the Large Hadron Collider is to produce dark matter particles, though we have yet to find any evidence,” commented Professor Joseph Silk, an astrophysicist from Johns Hopkins University and Oxford University.

“This is why there’s ongoing dialogue about the necessity of constructing a much more powerful version for the next generation of Super Collider.”

“However, it could take 40 years and $30 billion to build that supercollider, so nature may be giving us a glimpse of the future through supermassive black holes.”

A black hole can rotate around its axis like a planet but possesses significantly greater strength due to its intense gravitational field.

Increasingly, scientists are discovering that massive black holes rapidly spinning at the center of galaxies release enormous explosions of plasma, potentially due to jets transporting energy from the spin and surrounding accretion disks.

These phenomena can yield similar results to those produced by engineered Super Colliders.

“If supermassive black holes can generate these particles through high-energy proton collisions, we could receive signals on Earth as some of the high-energy particles pass through our detectors,” Professor Silk explained.

“This indicates a new particle collider effect within one of the universe’s most mysterious entities, achieving energies unattainable by any accelerator on Earth.”

“We may observe something with a unique signature believed to indicate the presence of dark matter. While this is somewhat speculative, it remains a possibility.”

New research indicates that gas falling into a black hole can harness energy from its spin, resulting in more violent behavior than previously thought.

Near rapidly spinning black holes, these particles can collide in a coordinated manner.

While not identical, this process resembles the collisions engineered using strong magnetic fields, where particles are accelerated around a circular high-energy particle collider.

“Some particles from these collisions are swallowed by the black hole and vanish forever,” stated Professor Silk.

“However, due to their energy and momentum, some particles emerge, achieving unprecedented high energies.”

“We have recognized the immense energy of these particle beams, rivaling what can be produced in a Super Collider.”

“Determining the limits of this energy is challenging, but these phenomena are certainly aligned with the energy levels of the latest Super Colliders we plan to construct, providing complementary results.”

To detect such high-energy particles, scientists can utilize observatories that are already monitoring supernovae, massive black hole eruptions, and other cosmic occurrences.

These include detectors like the IceCube Neutrino Observatory in Antarctica and the Kilometre Cube Neutrino Telescope (KM3NeT) in the Mediterranean.

Unlike an engineered supercollider, these black holes are vastly distant from us. Nevertheless, the particles they fling out still reach Earth.

The team’s paper was published this week in the journal Physical Review Letters.

____

Andrew Mamalie and Joseph Silk. 2025. Black Hole Super Collider. Phys. Rev. Lett. 134, 221401; doi:10.1103/physrevlett.134.221401

Source: www.sci.news

AI Analysis Suggests Some Dead Sea Scrolls Are Older Than Previously Believed

A timeline of ancient handwritten manuscripts, such as the Dead Sea Scrolls, is vital for reconstructing the progression of ideas. However, manuscripts bearing explicit dates are almost entirely absent. To address this challenge, an international team of researchers developed an AI-driven date-prediction model named Enoch, after the biblical figure.



Dead Sea Scroll 4Q7, a Genesis fragment from Qumran Cave 4. Image credit: Ketefhinnomfan.

While some ancient manuscripts include dates, facilitating precise dating by archaeologists, many do not provide this information.

Researchers can estimate the age of certain undated manuscripts by analyzing the evolution of handwriting styles, but this requires a sufficient number of manuscripts with known dates for creating an accurate timeline.

In the recent study, Dr. Mladen Popović of the University of Groningen and colleagues established the historical periods of manuscripts from various locations in present-day Israel and the West Bank through radiocarbon dating, and used machine learning to analyze the handwriting style of each document.

By merging these two datasets, they developed the Enoch program, which objectively estimates a manuscript’s approximate age range by comparing its handwriting style with those of the dated manuscripts.

To validate the program, specialists in ancient handwriting reviewed Enoch’s age estimates for 135 of the Dead Sea Scrolls.

Experts concluded that around 79% of the AI-generated estimates were credible, while the remaining 21% were considered too old, too young, or uncertain.

Enoch has already aided researchers in uncovering new insights about these ancient manuscripts.

For instance, both Enoch and radiocarbon dating assigned older ages to many Dead Sea Scrolls than traditional handwriting analyses had suggested.

“While additional data and further investigation could enhance our understanding of the timeline, our findings offer novel perspectives on the creation periods of these documents,” the researchers stated.

“The Enoch tool serves as a gateway to an ancient world, akin to a time machine, permitting the exploration of biblical handwritten texts.

“It is thrilling to establish significant steps in developing new tools that can tackle the dating challenges of the Dead Sea Scrolls and examine other partially dated manuscript collections from history.”

“This achievement would not have been feasible without collaboration across diverse scientific fields and genuine teamwork.”

A paper detailing this study was published in the journal PLOS ONE.

____

M. Popović et al. 2025. Dating ancient manuscripts using radiocarbon and AI-based writing style analysis. PLOS ONE 20 (6): e0323185; doi: 10.1371/journal.pone.0323185

Source: www.sci.news

Neck and Facial Massage: A Natural Way to Detoxify Your Brain

Magnetic resonance image scan of the human brain

Phanie/Sipa Press/Alamy

Research on a device that massages the face and neck suggests it might enhance the brain’s waste-removal system and alleviate symptoms associated with conditions like Alzheimer’s disease.

Cerebrospinal fluid (CSF) envelops the brain and drains away through a network of delicate tubes known as lymphatic vessels. Research on mice indicates that this fluid clears waste produced by brain cells, including proteins linked to diseases like Alzheimer’s and Parkinson’s, such as beta-amyloid.

This has prompted researchers to consider whether increasing CSF flow could promote brain health. However, the lymphatic vessels, previously found only deep within the neck, are difficult to access, notes Gou Young Koh at the Korea Advanced Institute of Science and Technology.

Recently, Koh and his team identified a network of lymphatic vessels located just five millimeters beneath the skin on the faces and necks of mice and monkeys. They made this breakthrough by injecting fluorescent dyes that label the CSF and imaging the animals under anesthesia. “We used a different kind of anesthesia than was applied in earlier studies. The previous anesthetic blocked the visualization of vessels close to the skin,” Koh explains.

To determine whether massaging these vessels could boost CSF flow, the researchers built a device with a small rod tipped with a 1-centimeter cotton ball. They used it to gently stroke down the faces and necks of mice aged either about two years or a few months, applying strokes for about one minute. “A gentle facial and neck massage can compress the liquid and enhance the CSF flow,” Koh states.

Thirty minutes after the massage, CSF flow in the brains of the mice had increased nearly threefold compared with flow before the massage. Furthermore, the process seemed to reverse age-related decreases in CSF flow. “After stimulation, the CSF flow in older mice appeared comparable to that of younger mice [who hadn’t received the massage],” Koh elaborates.

In unpublished work, the team observed similar outcomes in monkeys. They have also identified these lymphatic vessels in human cadavers, implying that massage could stimulate CSF flow in humans too, Koh suggests.

However, due to anatomical differences between mice, monkeys, and humans, further studies are needed to confirm this, says Vesa Kiviniemi at the University of Oulu in Finland. “It’s a slightly different scenario.”

Moreover, it remains uncertain whether increased CSF flow can genuinely mitigate brain aging or protect against neurodegenerative conditions like Alzheimer’s, says Steven Proulx at the University of Bern in Switzerland. Koh’s team aims to investigate this in mice that exhibit Alzheimer’s-like traits.


Source: www.newscientist.com

Rivers Release Ancient Carbon Into the Atmosphere

Rivers like the Chuya in Russia can emit carbon dioxide and methane.

Parilov/Shutterstock

Globally, rivers are releasing ancient carbon into the atmosphere, a finding that has surprised scientists and suggests that human impacts on natural landscapes may be more severe than previously understood.

It is already established that rivers emit carbon dioxide and methane as part of the carbon cycle, the rapid exchange of gases linked to the growth and decay of organisms, with releases estimated at around 2 gigatonnes of carbon annually.

Researchers, including Josh Dean from the University of Bristol, explored the age of this carbon.

The team utilized radiocarbon dating to analyze carbon and methane released from over 700 river segments across 26 countries.
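Radiocarbon dating, as used here, converts the measured fraction of modern carbon-14 in a sample into a conventional radiocarbon age via the Libby mean-life of 8,033 years. A minimal sketch of that conversion (the sample value below is invented for illustration):

```python
import math

LIBBY_MEAN_LIFE = 8033  # years, conventional for radiocarbon ages

def radiocarbon_age(f14c: float) -> float:
    """Conventional radiocarbon age (years before present) from the
    measured fraction of modern carbon-14 activity."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

# A sample retaining 80% of modern 14C activity dates to roughly
# 1,790 years BP, i.e. carbon stored long before the modern carbon
# cycle, rather than recently fixed by plants.
age = radiocarbon_age(0.80)
```

In practice the team applied this kind of analysis to carbon dioxide and methane collected from hundreds of river segments to establish how old the emitted carbon is.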

“When we compiled the available data, what we found was surprisingly significant: much of [the carbon released] may originate from far older stores,” Dean says.

Ancient carbon is sequestered in long-term stores such as rocks, peat bogs, and wetlands. The findings suggest that around 1 gigatonne of this old carbon is released annually via rivers, implying that ecosystems must be removing roughly 1 gigatonne more carbon from the atmosphere than previously believed to balance the budget.

“This represents the first comprehensive assessment of river emissions on a global scale, which is quite remarkable,” remarks Taylor Maavara at the Cary Institute of Ecosystem Studies in Millbrook, New York.

The pressing question now is why such ancient carbon is being released. Possible drivers include climate change and human activities that alter natural landscapes. Dean notes that measurements since the 1990s show river carbon appearing increasingly aged.

“Human activity may be accessing these long-term carbon reservoirs, which can lead to older carbon being released through these channels,” he explains.

For instance, rising temperatures due to climate change can result in carbon being released from thawing permafrost and increase the weathering rates of rocks. Additional factors such as peatland drainage and wetland desiccation could also play a role. Dean emphasizes the necessity for further research to ascertain the degree to which human activities contribute to this phenomenon and how carbon release varies over time.

“This is a critical area of research,” he asserts. “If we assume that old carbon stays locked away in these reservoirs, we may be mistaken, and knowing that is crucial.” The insights carry significant implications for national climate strategies, particularly those that rely on natural ecosystems to offset ongoing emissions.

“This research raises intriguing questions about how and to what extent we can manage ancient carbon,” says Scott Tiegs at Oakland University in Rochester, Michigan. He adds that tackling climate change is likely vital to prevent the release of CO2 and methane from these ancient reserves.


Source: www.newscientist.com

Scientists Report Record Levels of Sargassum Seaweed in the Caribbean and Surrounding Areas in May

“That’s the million-dollar question,” he remarked. “I don’t have a very satisfactory answer.”

Three distinct types of Sargassum are found in the Caribbean and surrounding regions, kept afloat by small air sacs. According to Burns, scientists are still working out which factors drive its growth, which depends on sunlight, nutrients, and water temperature.

Experts also point to agricultural runoff, warmer waters, and alterations in wind, currents, and rainfall as factors that can have an impact.

Large mats of algae in the open ocean create what Burns refers to as a “healthy and thriving ecosystem,” home to species ranging from tiny shrimp to endangered sea turtles. However, Sargassum close to shore can wreak havoc.

It can block sunlight essential for coral reefs and seagrasses, and when the algae sink, they may suffocate these ecosystems. Once washed ashore, the organisms that inhabit the algae either perish or are scavenged by birds, according to Burns.

The massive piles of odorous seaweed pose a significant challenge for the Caribbean, especially since tourism is a vital economic driver for many small islands.

“It’s a hurdle, but it hasn’t impacted every corner of the Caribbean,” said Frank Comito, a special advisor to the Caribbean Hotel and Tourism Association.

At a popular tourist destination in Punta Cana, Dominican Republic, officials have invested in barriers to keep Sargassum from reaching the beaches, he noted.

In St. Maarten, a Dutch Caribbean territory, crews with backhoes were mobilized for an emergency cleanup after residents reported strong odors of ammonia and hydrogen sulfide.

“The smell is quite unpleasant,” Burns stated.

Meanwhile, in the French Caribbean, officials plan to quickly utilize storage barges and specialized vessels capable of collecting several tons of seaweed daily.

Sargassum “will harm our coastlines, hinder swimming, and create unbearable living conditions for local residents,” French Prime Minister François Bayrou recently told the press.

However, Comito said that employing such vessels is “very costly” and not yet widespread, while the alternative method, using heavy machinery on beaches, is labor-intensive.

“We must tread carefully, as sea turtle eggs might be affected,” he advised. “You can’t just go there and bulldoze everything away.”

As some Caribbean islands face financial challenges, most cleanup efforts fall to hotels, with certain guests receiving refunds and complimentary shuttles to unaffected beaches.

Each year, the volume of Sargassum increases at the end of spring, peaks during summer, and then starts to decline in late autumn or early winter, noted Burns.

The recent record is unlikely to stand for long: experts anticipate even more Sargassum in June.

Source: www.nbcnews.com

An Emerging Theory May Finally Make “Quantum Gravity” a Reality

Researchers might be on the brink of solving one of the most significant challenges in physics, potentially laying the groundwork for groundbreaking theories.

At present, two distinct theories—quantum mechanics and gravity—are employed to elucidate various facets of the universe. Numerous attempts have been made to fuse these theories into a cohesive framework, but a compelling unification remains elusive.

“Integrating gravity with quantum theory into a single framework is one of the primary objectives of contemporary theoretical physics,” says Dr. Mikko Partanen, lead author of the research, recently published in Reports on Progress in Physics. Speaking to BBC Science Focus, he calls the problem “the holy grail of physics.”

The challenge of formulating a theory of “quantum gravity” arises from the fact that these two concepts operate on entirely different scales.

Quantum mechanics investigates the smallest scales of subatomic particles and underpins the Standard Model, which links three fundamental forces: the electromagnetic force, the strong force (which binds protons and neutrons), and the weak force (responsible for radioactive decay).

The fourth fundamental force, gravity, is articulated by Albert Einstein’s general theory of relativity, which portrays gravity as a curvature of spacetime. Massive objects and high-energy entities distort spacetime, influencing surrounding objects and governing the domain of planets, stars, and galaxies. Yet, gravity seems resistant to aligning with quantum mechanics.

The Duality of Theories

A significant issue is that gravity is rooted in a “deterministic classical” framework, meaning the laws predict specific outcomes. For instance, if you drop a ball, gravity guarantees it will fall.

In contrast, quantum theory is inherently probabilistic, offering only the likelihood of an event rather than a definitive outcome.

“These are challenging to merge,” Partanen comments. “Attempts to apply quantum theory within gravitational contexts have yielded numerous nonsensical results.”

For example, when quantum physicists try to calculate the electron’s mass from first principles, the equations spiral into infinity. Similarly, applying gravity in extreme conditions, like at the edge of a black hole, renders Einstein’s equations meaningless.

Even general relativity fails to explain phenomena within a black hole. -NASA

“While intriguing approaches like string theory [which substitutes particles with vibrating energy strings] exist, we currently lack unique, testable predictions to differentiate these theories from standard models or general relativity,” notes Partanen.

Instead of crafting an entirely new theory for unification, Partanen and his colleague, Professor Jukka Tulkki, approached gravity through the lens of quantum mechanics by reformulating the gravitational equations using fields.

Fields are how quantum theory describes physical quantities that vary over space and time; electric and magnetic fields are familiar examples.

This novel perspective allowed them to replicate the principles of general relativity in a format that combines effortlessly with quantum mechanics.

Testing the Theories

A particularly promising aspect of this new theory is that it does not require the introduction of exotic new particles or altered physical laws, meaning physicists already possess the necessary tools for its verification.

According to Partanen, the new theory generates equations that account for phenomena such as the bending of light around massive galaxies and redshift, the stretching of light’s wavelength as objects recede in the expanding universe.

This new theory aligns with predictions from general relativity. – Credits: ESA/Hubble & NASA, D. Thilker

While this consistency supports the theory, it does not confirm its correctness.

To establish this, experiments must be conducted in extreme gravitational environments where general relativity falters.

If quantum gravity can make superior predictions in such scenarios, it would serve as a crucial step towards validating this new theory and suggesting that Einstein’s framework might be incomplete.

However, this is challenging due to the minimal differences between the two theories.

For instance, in observations of how the Sun’s mass bends light from a distant star, the predicted discrepancy between the two theories is a mere 0.0001%, too small for current astronomical instruments to measure.
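For scale, general relativity predicts that light grazing the Sun is deflected by an angle of 4GM/(c²R), about 1.75 arcseconds, so a 0.0001% discrepancy corresponds to roughly a microarcsecond. A quick back-of-envelope check using standard physical constants:

```python
# GR deflection of light grazing the Sun: delta = 4GM / (c^2 R)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
C = 2.998e8          # speed of light, m/s
R_SUN = 6.963e8      # solar radius, m
RAD_TO_ARCSEC = 206_264.8

deflection = 4 * G * M_SUN / (C**2 * R_SUN) * RAD_TO_ARCSEC  # ~1.75 arcsec

# A 0.0001% (1e-6 fractional) discrepancy on that angle is about
# 1.75e-6 arcseconds: microarcsecond-level astrometry.
discrepancy = deflection * 1e-6
```

This is why the article points to neutron stars, where the fractional differences between the theories reach a few percent, as a more realistic testing ground.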

Fortunately, larger celestial bodies can amplify these differences dramatically.

“For neutron stars with intense gravitational fields, relative differences can reach a few percent,” Partanen observes. While no observatory currently exists to make such observations, advancements in technology could soon enable this.

The theory remains in its nascent stages, with the team embarking on a mission to finalize mathematical proofs to ensure the theory avoids diverging into infinities or other complications.

If progress remains encouraging, they will then apply the theory to extreme situations, such as the singularity of a black hole.

“Our theory represents a novel endeavor to unify all four fundamental forces of nature within one coherent framework, and thorough investigation may unveil phenomena beyond our current understanding,” concludes Partanen.


About Our Experts

Mikko Partanen is a postdoctoral researcher in the Department of Physics and Nanoengineering at Aalto University in Espoo, Finland. He specializes in studying light and its quantum properties, with his research appearing in journals such as Physics Chronicles, New Journal of Physics, and Scientific Reports.

Source: www.sciencefocus.com

Wood-based Adhesive Works in a Standard Hot Glue Gun

Glue guns generally employ harmful oil-based adhesives

Shutterstock/ekaterina43

A by-product from the wood industry has been innovatively transformed into safe, reusable hot glue adhesives that could serve as an alternative to hazardous solvent-based adhesives.

Ziwen Lv of Beijing Forestry University, along with colleagues, developed an adhesive from xylan, a component of plant cell walls.

“Xylan acts as a binding agent for cellulose, yet isn’t traditionally considered a ‘glue’ on its own,” said Nick Aldred at the University of Essex, UK, who wasn’t part of the research team. “This work repurposes it as a viable adhesive.”

Lv’s team chemically modified xylan to create dialcohol xylan, oxidizing it with sodium periodate and then reducing it with sodium borohydride.

The resultant adhesive, when extruded from the hot glue gun, boasts a bond strength of 30 megapascals, surpassing that of traditional epoxy resin adhesives. Additionally, it can be reused by remelting, maintaining its adhesive properties even after 10 cycles.

The team also constructed plywood held together with xylan adhesive and found its performance comparable to that of phenol-formaldehyde resin adhesives.

However, there’s a significant limitation: after being submerged in water for one hour, the adhesive softens and the plywood layers come apart. The researchers didn’t respond to requests for comment from New Scientist.

Jonathan Wilker from Purdue University, Indiana, highlights the pressing need for sustainable alternatives to the petroleum-based adhesives presently in use.

“[The] combined performance [of the new glue] was quite impressive, especially on wood substrates,” remarked Wilker.

“If we can implement this on a larger scale within the plywood industry, it could be revolutionary,” emphasized Aldred. “Plywood remains one of the last consumer products still containing materials like phenols and formaldehyde, substances that were banned years ago in products such as cosmetics.”


Source: www.newscientist.com

Can AI Comprehend Flowers Without Touching or Smelling Them?

If you can’t smell, what are flowers?

ClearViewimages RF/Alamy

The newest artificial intelligence models come closer than ever to a human-like understanding of the world, yet their lack of senses limits their grasp of concepts like flowers and humor.

Qihui Xu at Ohio State University and her team compared how humans and large language models (LLMs) understand nearly 4,500 words, covering terms such as “flower,” “hoof,” “humorous,” and “swing.” Both the human participants and the AI models rated each word on the emotional arousal it evokes and on how strongly it is experienced through various body parts and physical interactions.

The objective was to analyze how LLMs, such as OpenAI’s GPT-3.5 and GPT-4 along with Google’s PaLM and Gemini, compared with human rankings. While humans and AI produced similar concept maps for words unrelated to sensory experience, substantial discrepancies arose for words involving physical sensations and actions.

For instance, AI models often suggested that flowers could be perceived through the torso, a notion that most people find peculiar, as they typically enjoy flowers visually or through scent.

The challenge lies in the fact that LLMs build their understanding from vast amounts of text sourced from the internet, which falls short for sensory concepts. “They are fundamentally different from humans,” Xu explains.

Certain AI models have undergone training using visual data like images and videos alongside text. Researchers have noticed that these models yield results more closely aligned with human evaluations, enhancing the chances that future AI will bridge sensory understanding with human cognition.

“This illustrates that the advantages of multimodal training might surpass expectations. In reality, it seems that one plus one can be greater than two,” says Xu. “For AI development, this underscores the significance of building multimodal models and the necessity of embodying them.”

Philip Feldman at the University of Maryland in Baltimore County suggests that simulating an AI with a robotic body, exposed to sensorimotor experiences, could greatly enhance its capabilities, but he cautions about the inherent risks of physical harm to others.

Preventing such dangers requires implementing safeguards in robotic actions or opting for softer robots to avoid causing injury during training, warns Feldman, although this approach has its downsides.

“This may distort their perception of the world,” Feldman remarks. “One lesson they might learn is that they can gently bump into objects. [With the mass of a real robot,] humanoid robots might believe they can collide with one another at full speed. That could lead to serious issues.”


Source: www.newscientist.com

Is Planet Nine a Myth? Some Astronomers Believe They’ve Discovered a New Dwarf Planet

A potential new dwarf planet has been identified at the distant fringes of our solar system, taking approximately 25,000 years to complete one orbit around the Sun.

This celestial object, designated 2017 OF201, was discovered by a team from the Institute for Advanced Study and Princeton University while searching for “Planet Nine,” a hypothesized planet larger than Earth believed to orbit beyond Neptune. Some astronomers suspect that this elusive ninth planet could explain the peculiar clustering of objects and other oddities observed in the outer solar system.

While in pursuit of the elusive Planet Nine, researchers instead came across another resident of our cosmic neighborhood.

“It’s similar to the way Pluto was discovered,” remarked Sihao Cheng, the Institute for Advanced Study researcher who led the team. “This endeavor was a real adventure.”

If validated, the newly found dwarf planet could be what Cheng calls Pluto’s “extreme cousin.” The findings were posted on the preprint site arXiv and have yet to undergo peer review.

Cheng and his colleagues estimate that 2017 OF201 measures approximately 435 miles in diameter.

Dwarf planets are categorized as celestial bodies orbiting the Sun that possess enough mass and gravity to be nearly round, yet unlike typical planets, they do not clear their orbital paths of asteroids and other objects.

Eritas Yang, a co-author of the study and a graduate student at Princeton University, noted that one fascinating characteristic of 2017 OF201 is its highly elongated orbit. At its most distant point from the Sun, it lies more than 1,600 times farther out than Earth does.
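The two reported figures are mutually consistent under Kepler's third law: a 25,000-year period implies a semi-major axis of about 855 astronomical units, and an aphelion beyond 1,600 au then implies a highly eccentric orbit. A quick check:

```python
# Kepler's third law in solar units: P[years]^2 = a[au]^3
period_years = 25_000
a_au = period_years ** (2 / 3)  # semi-major axis, ~855 au

# With an aphelion Q of at least 1,600 au, the implied eccentricity is
# e = Q / a - 1, around 0.87: a very stretched orbit.
e_min = 1_600 / a_au - 1
```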

The potential dwarf planet was discovered through a meticulous examination of a vast dataset from a telescope in Chile that was scanning the sky for signs of dark energy. By linking observations over time, the researchers identified moving objects with consistent orbital patterns.

While 2017 OF201 may be one of the most distant known objects in the solar system, its discovery suggests that other dwarf planets may exist in that vast region of space.

“We used public data that had been available for some time,” explained Jiaxuan Li, a graduate student and co-author of the research at Princeton University. “It was just hiding in plain sight.”

Li said that the object currently lies close to the Sun in the sky, so researchers must wait about a month before conducting follow-up observations with ground-based telescopes. They also hope to eventually study the object with the Hubble Space Telescope or the James Webb Space Telescope.

In the meantime, Cheng said he remains committed to the quest for Planet Nine, although the new findings may complicate long-held theories about the existence of such a planet.

The Planet Nine hypothesis holds that a planet several times Earth’s mass in the outer solar system could explain why certain groups of icy objects have unusually clustered orbits.

“Under the influence of Planet Nine, any object lacking a specific orbital geometry would eventually become unstable and be expelled from the solar system,” Yang explained.

Although 2017 OF201’s long orbit takes it away from the clustered objects, Yang’s calculations indicate that, in the absence of Planet Nine, its path would remain stable for the next billion years.

In essence, if Planet Nine existed, 2017 OF201 would not have persisted. Yet Yang emphasized that further research is essential, and the discovery of a new dwarf planet candidate does not definitively rule out Planet Nine’s existence.

For one thing, the simulations so far assume a single hypothetical location for Planet Nine, and scientists do not agree on where such a planet would be.

Konstantin Batygin, a planetary science professor at the California Institute of Technology, first proposed the existence of Planet Nine in a 2016 study co-authored with Mike Brown from Caltech.

He remarked that the discovery of 2017 OF201 neither confirms nor refutes the theory. Batygin noted that outer solar system objects that could reveal Planet Nine’s gravitational influence must have their closest approach to the Sun remain sufficiently distant that they do not interact strongly with Neptune.

“Unfortunately, this object does not fall into that category,” Batygin told NBC News. “It’s in a chaotic orbit, so the implications are not significant, as it complicates the scenario.”

Batygin expressed excitement about the new research for providing additional context regarding how objects evolve in the outer solar system, praising the researchers’ efforts in mining public datasets as “heroic.”

Cheng, however, remains optimistic about finding Planet Nine.

“The entire project commenced as a search for Planet Nine, and I’m still in that mindset,” he remarked. “This, however, is an enthralling tale of scientific discovery. Whether or not Planet Nine exists, the pursuit is a captivating venture.”

Source: www.nbcnews.com