Alien worlds found in the “habitable zones” of their stars may not be suitable for life
pandorumbs/alamy
Recent findings suggest the potential number of planets capable of supporting alien life may be fewer than previously assumed, largely due to advances in understanding planetary climates. When carbon dioxide levels in an atmosphere surpass a critical threshold, conditions can become inhospitable.
Life as we know it requires liquid water, prompting astronomers to target “habitable zones” around stars—regions where temperatures allow for water to exist in liquid form. However, Haskelle White-Gianella from the University of Washington and her research team have revealed that having liquid water alone does not guarantee habitability.
The researchers ran nearly 10,000 simulations to determine how CO₂ levels vary with the amount of surface water on Earth-sized planets. Their results indicate that a planet needs at least 20% of Earth’s total water to be potentially habitable.
This is largely due to the role rainfall plays in storing carbon in the ground through chemical reactions with rocks; insufficient rainfall could allow CO₂ to accumulate in the atmosphere, trapping heat and rapidly driving temperatures beyond 126°C (259°F).
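The feedback described above can be captured in a toy model: weathering removes CO₂ at a rate proportional to rainfall, while volcanic outgassing replenishes it at a constant rate. This sketch is purely illustrative; the linear weathering law and every constant in it are assumptions, not values from the team’s simulations.

```python
def co2_trajectory(rainfall, outgassing=1.0, weathering_coeff=0.05,
                   co2_start=300.0, steps=200, dt=1.0):
    """Integrate dCO2/dt = outgassing - weathering_coeff * rainfall * CO2.

    All quantities are in arbitrary units; only the qualitative behaviour
    matters: with ample rain, weathering balances outgassing at a low CO2
    level, while with scarce rain CO2 climbs toward a far higher level.
    """
    co2 = co2_start
    for _ in range(steps):
        co2 += (outgassing - weathering_coeff * rainfall * co2) * dt
    return co2

wet = co2_trajectory(rainfall=1.0)    # settles near outgassing / (coeff * rain) = 20
dry = co2_trajectory(rainfall=0.01)   # still climbing toward an equilibrium of 2000
```

With ample rainfall the trajectory settles at a low equilibrium; cutting rainfall a hundredfold sends CO₂ toward an equilibrium a hundred times higher, which is the qualitative runaway the researchers describe.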
“We discovered that there exists a water threshold essential for maintaining a stable climate,” White-Gianella stated during the Goldschmidt Geochemical Conference in Prague, Czech Republic, on July 10.
This indicates that simply being in a habitable zone does not guarantee that a planet can support life, according to White-Gianella, necessitating a deeper examination of geological histories.
A parallel situation may help explain how Venus became the inhospitable world we observe today, White-Gianella told the conference. The brightening of the sun since the solar system’s formation is believed to have contributed to Venus’s atmospheric changes and rising temperature, but it alone doesn’t account for all the observed transformations. By re-running their models with a Venus-like amount of starlight, the team found that even planets with water levels similar to Earth’s could lose too much water, tipping them into uninhabitability.
This provides a compelling rationale for how planets similar to Venus can become excessively hot, says Ben Tutolo at the University of Calgary in Canada. Over time, the drawdown of CO₂ can also destabilize planetary climates, as recorded in the geological record of Mars.
In the case of Mars, liquid water drew down carbon dioxide and sequestered it as carbonate minerals, which ultimately thinned its atmosphere and cooled the planet, according to Tutolo. White-Gianella noted that the team’s simulations focused on Earth-like planets, agreeing that conditions on planets like Mars could differ significantly.
The knot problem for mathematicians finally has a solution
Pinky Bird/Getty Images
Why can a large knot made by joining two smaller knots be easier to undo than the two knots untied separately? Surprisingly, researchers have found that seemingly complex knots formed by combining simpler ones can, in fact, be easier to untangle than the sum of their parts. This discovery overturns a conjecture that had stood for nearly 90 years.
“We were searching for counterexamples without anticipating we’d actually find one, as this conjecture has stood for so long,” Mark Brittenham at the University of Nebraska, Lincoln, shared. “In the back of our minds, we thought the conjecture was likely true. It was an unforeseen and astonishing outcome.”
Mathematicians like Brittenham study knots by treating them as intertwined loops with their ends joined. A fundamental quantity in knot theory is a knot’s “unknotting number”: the minimum number of times one strand must be cut, passed across another and rejoined (a move at a point known as a “crossing”) before the knot comes undone.
Calculating unknotting numbers can be computationally demanding, and the values for certain knots with as few as 10 crossings remain unknown. It is therefore often advantageous to analyze a knot by decomposing it into two or more simpler knots, a concept akin to factoring integers into primes in number theory. For 88 years, mathematicians had conjectured that unknotting numbers add up: that the unknotting number of a combined knot equals the sum of those of its components.
Now, together with Susan Hermiller, also at the University of Nebraska, Lincoln, Brittenham has shown that this is not always the case. “This conjecture has lingered for 88 years; as people failed to disprove it, the desire for it to be true persisted,” Hermiller noted. “Initially we uncovered one example, and soon revealed an infinite number of pairs of knots whose combined unknotting number is strictly less than the sum of the unknotting numbers of the two component knots.”
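In standard knot-theory notation (the article itself uses none), write $u(K)$ for the unknotting number of a knot $K$ and $K_1 \# K_2$ for the composite knot formed by joining $K_1$ and $K_2$. The long-standing conjecture, and the behaviour Brittenham and Hermiller actually found, can then be stated as:

```latex
% Conjectured since 1937: unknotting numbers add under connected sum
u(K_1 \# K_2) = u(K_1) + u(K_2)

% Brittenham and Hermiller: infinitely many pairs (K_1, K_2) satisfy
u(K_1 \# K_2) < u(K_1) + u(K_2)
```

The inequality $u(K_1 \# K_2) \le u(K_1) + u(K_2)$ was never in doubt, since undoing each component separately always works; the surprise is that the strict inequality can occur.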
“We discovered that our understanding was not as complete as previously thought,” Brittenham remarked. “Even knots formed by combining others can untie more efficiently than we expected.”
Examples of knots that are easier to undo than their components combined
Mark Brittenham, Susan Hermiller
Finding and verifying the counterexamples took a mix of existing knowledge, intuition and computational power. Remarkably, the final verification was achieved in a straightforward, hands-on way: tying the knots in a rope and physically untying them.
Andras Juhasz at the University of Oxford, who previously collaborated with the AI firm DeepMind to settle other knot theory conjectures, attempted this latest problem in a similar way, without success.
“We spent a year or two seeking counterexamples without luck, so we eventually abandoned the effort,” Juhasz mentioned. “AI might not be the best tool for finding counterexamples, akin to searching for needles in haystacks – a profoundly elusive pursuit.”
Applications of knot theory vary widely, spanning from encryption to molecular biology. Nicholas Jackson at the University of Warwick in the UK cautiously suggests that this new development could have practical implications. “We seem to have gained a deeper understanding of how circular entities operate in three-dimensional spaces than we did previously,” he remarked. “Concepts that were unclear a few months ago are now coming into clearer view.”
Carbon dioxide monitoring at the Mauna Loa Observatory in Hawaii may be discontinued due to US budget cuts
NOAA
Scientists in other countries must prepare to take over the major carbon dioxide monitoring services currently operated by the US, climate researchers warn.
The monitoring efforts could be terminated next year if budget cuts proceed, leading to the loss of vital data. “At this moment, no one is stepping forward to say, ‘We can take that responsibility,'” states Pierre Friedlingstein from the University of Exeter, UK. “It’s imperative that we do.”
Friedlingstein leads the Global Carbon Budget, an international initiative focused on accurately assessing carbon emissions and absorption by land and oceans, which is essential for understanding global temperature trends.
This work heavily relies on the National Oceanic and Atmospheric Administration (NOAA), whose budget cuts are proposed by the Trump administration. A 2026 budget document suggests eliminating funding for climate and weather research and decreasing the workforce by over 2,000 employees. Furthermore, it plans to close labs, including the Mauna Loa Observatory, a key CO₂ monitoring site.
“NOAA GML [Global Monitoring Laboratory] is essential for the Greenhouse Gas Program, which supports multiple functions,” says Ralph Keeling at the Scripps Institution of Oceanography in California.
NOAA directly measures gas levels, including CO₂, at various sites and aids in monitoring at additional locations worldwide. According to Friedlingstein, this includes calibrating measurements with samples sent from different areas.
The agency compiles and evaluates global data, leveraging subtle variations in CO₂ levels across locations, combined with knowledge of atmospheric circulation, to trace CO₂ flows accurately.
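As a toy illustration of that principle (a deliberately crude sketch, not NOAA’s actual inversion machinery, and with all numbers invented), a two-box model of the atmosphere can attribute net CO₂ sources to each hemisphere from observed trends and the interhemispheric gradient:

```python
# Two-box atmosphere: a northern and a southern box exchange air with a
# characteristic mixing time. Given the CO2 trend observed in each box and
# the north-south concentration difference, back out the net surface flux
# (sources minus sinks) into each box.

def invert_fluxes(trend_n, trend_s, conc_n, conc_s, mix_time=1.0):
    """Trends in ppm/yr, concentrations in ppm, mixing time in years.
    Returns the net flux into each box, expressed in ppm/yr."""
    transport_into_n = (conc_s - conc_n) / mix_time  # negative: north exports air southward
    flux_n = trend_n - transport_into_n  # flux needed to explain the northern trend
    flux_s = trend_s + transport_into_n  # transport into the south is minus that into the north
    return flux_n, flux_s

# Hypothetical readings: the northern box sits higher and rises faster,
# consistent with most fossil-fuel emissions being in the north.
fn, fs = invert_fluxes(trend_n=2.6, trend_s=2.4, conc_n=422.0, conc_s=419.0)
print(fn, fs)
```

The inversion attributes far more net source to the higher, faster-rising box, which is the sense in which spatial gradients plus transport knowledge reveal where CO₂ is entering the atmosphere.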
“NOAA provides critical baseline data,” Keeling noted. “If NOAA’s efforts cease, our ability to monitor CO₂ and other greenhouse gas emissions globally will diminish.”
“All of these responsibilities must be assumed by other organizations,” Friedlingstein observes.
However, replacing the lost monitoring sites and NOAA’s records with new data poses challenges. “This is where maintaining long-term consistency becomes crucial,” Keeling explains. “You can’t simply switch from one data set to another; that would compromise the reliability of trend analysis.”
There is particular concern regarding ongoing monitoring at Mauna Loa, which has been conducted since 1957, providing the longest continuous CO₂ record from a single location. NOAA supports the Scripps-led monitoring efforts.
“Without NOAA’s involvement, continuing nearby measurements becomes challenging, although not impossible,” Keeling states.
He also worries about Scripps-led monitoring in Antarctica, which currently depends on NOAA personnel. The site’s funding, which comes from the US National Science Foundation, is also at risk.
“The Antarctic station is the most significant long-term station in the Southern Hemisphere. It is just as critical as the Mauna Loa data for establishing a reliable long-term global average, particularly for tracking variations between the Northern and Southern Hemispheres driven by large-scale carbon flows,” says Keeling.
CO₂ levels can also be monitored using certain satellites, which, according to Friedlingstein, measure CO₂ not just at the surface but throughout the entire atmospheric column up to the satellite.
To ask whether the European Union’s Copernicus Atmosphere Monitoring Service has a plan to substitute for NOAA’s functions, New Scientist contacted the European Commission’s Directorate-General for Defence Industry and Space (DEFIS). DEFIS did not reply before the deadline for this article.
Innovative treatments may transform the management of lower back pain by addressing root causes associated with inflammatory “zombie” cells, according to recent research conducted in mice.
A group of scientists, led by researchers from McGill University in Canada, found that a combination of two medications, O-Vanillin and RG-7112, effectively eliminates zombie cells from mouse spinal tissues, alleviating pain and inflammation symptoms.
“Our results are promising because they indicate that by eliminating the cells driving the problem, rather than merely masking the pain, we can approach lower back pain treatment in a novel manner,” stated senior author Professor Lisbet Haglund from McGill’s Department of Surgery.
Zombie cells, also referred to as senescent cells, do not function like typical cells. Rather than undergoing division and death to make way for new cells, they persist in the body.
As we age, these zombie cells can build up, leading to inflammation, pain, and spinal damage.
For the hundreds of millions of adults globally suffering from back pain, current medications typically mask the pain rather than tackle the zombie cells driving it.
This new treatment, however, aims to alleviate back pain by targeting and eliminating these lingering zombie cells, thereby addressing the underlying issues.
Aging or zombie cells accumulate in the shock-absorbing discs between each spinal vertebra, releasing inflammatory molecules that damage discs – Credit: Nemes Laszlo/Science Photo Library via Getty
The McGill research team discovered this promising new treatment while working with mice genetically engineered to develop spinal injuries and lower back pain over seven months.
The researchers administered varying doses of O-Vanillin and RG-7112 to these mice. Some received only one of the drugs, while others received a combination of both.
RG-7112 is a medication already established to remove zombie cells in various contexts, though it hasn’t been applied to lower back pain treatment until now.
O-Vanillin, a natural compound sourced from turmeric, is recognized for its anti-inflammatory benefits, but had not been previously tested against zombie cells.
After 8 weeks of treatment, mice receiving both drugs at higher doses exhibited the lowest levels of zombie cells, inflammation, and pain.
Those treated with a single drug showed some improvement, but the results were not as significant as those achieved with the combination therapy.
“The pressing question now is whether these medications can produce the same effects in human subjects,” Haglund remarked.
Tailoring your workout routine to align with your personality can significantly enhance your commitment to your training regimen, according to new research.
The study, led by Professor Flaminia Ronca at University College London, found that individuals who enjoy their chosen form of exercise are more likely to maintain their participation. This indicates that a customized fitness plan is more effective than a generic approach.
“There’s no point in prescribing the ideal exercise plan if you don’t follow it,” Ronca stated in an interview with BBC Science Focus.
“Fun is crucial for behavioral change. To encourage the population to be more proactive, a strategic approach is essential in identifying enjoyable activities for individuals.”
This study explored the impact of personality traits on preferred types of exercise.
Researchers categorized participants based on the five major personality traits: agreeableness (willingness to cooperate), conscientiousness (self-discipline and detail orientation), extraversion (sociability), neuroticism (tendency toward anxiety or unhappiness), and openness (willingness to try new things).
The findings suggested that more sociable, extraverted individuals enjoyed intense workouts, particularly those involving social interaction, such as team sports and dance classes, implying that many people can find enjoyment in vigorous aerobic activity.
Conversely, those who are higher in neuroticism preferred privacy and were less comfortable being observed while completing the 15-minute cycling exercise used to assess fitness levels. Moreover, they were also less inclined to monitor their heart rate during workouts.
“This indicates that individuals in this category may value a setting that allows for independence and privacy during their exercise routine,” Ronca explained. Participants in this group typically favored more calming exercises, such as stretching, yet remarked that “high intensity was acceptable as long as breaks were allowed.”
All participants who adhered to the program reported positive outcomes, but the more anxious individuals particularly stood out.
“These participants exhibited a notably significant reduction in stress following the exercise program,” Ronca noted. “This is encouraging, as it highlights that those who experience the greatest stress are highly responsive to exercise.”
More anxious individuals tended to benefit from calmer exercises, such as stretching – Source: Getty Images
To examine how personality influences exercise habits, this study required participants to engage in an eight-week home fitness program involving cycling and strength training.
Fitness levels were assessed at both the beginning and end of the program, during which participants completed a questionnaire to evaluate their personality type and attitudes toward exercise.
Not all personality types exhibited strong inclinations. Agreeable and open individuals did not show a preference for a specific type of exercise.
“Nonetheless, agreeableness and openness can influence how we participate in health behaviors in different ways, such as the types of exercises we are willing to try,” Ronca added.
“If you do not enjoy a specific session, don’t worry,” she concluded. “You can always try something different.”
About our experts
Flaminia Ronca is an associate professor at the Institute of Sport, Exercise and Health at University College London. Her research focuses on motor neuroscience and the connection between movement and cognition. Ronca also collaborates with British police to enhance officer wellbeing and performance.
A group of paleontologists from Yale University and Stony Brook University made a significant discovery while studying dinosaur fossils, including two bird-like dinosaur species found in the Gobi Desert, Mongolia.
This scene illustrates the oviraptorid dinosaur Citipati appearing startled as it rests on sand dunes. The creature raises its arms in a threat display, exposing its wrists and emphasizing the small, migrated carpal bone (highlighted in blue X-ray). Image credit: Henry S. Sharp.
For years, the identity of a particular carpal bone in the bird wrist was a scientific enigma, until researchers traced its origin: it is the pisiform, a sesamoid bone that migrated into the wrist.
This bone, a kneecap-like sesamoid that originally formed within a tendon, shifted from its original position, moving into the wrist and replacing another carpal bone known as the ulnare.
Its position in modern birds creates a linkage that automatically folds the wing when the elbow bends.
The bone’s large V-shaped notch allows for the alignment of hand bones to prevent dislocation during flight.
Consequently, this bone plays a crucial role in the bird’s forelimb and is integral for flight.
“The pisiform in modern birds is a rare example of a wrist bone that initially forms within muscle tendons, like the kneecap, but eventually takes the place of one of the ‘normal’ wrist bones, the ulnare,” commented one researcher.
“It is closely associated with the musculature of the arm, and once integrated into the wrist it links the action of the flight muscles to the articulation of the wrist.”
“This integration is particularly vital for wing stabilization during flight.”
In their recent study, Dr. Bhullar and his team analyzed two Late Cretaceous fossils: a troodontid (a bird-like predator related to Velociraptor) and Citipati cf. osmolskae (an oviraptorid with a long neck and a toothless beak).
“We were fortunate to have two exceptionally preserved theropod wrists for this analysis,” said Alex Ruebenstahl, a paleontologist at Yale University.
“The wrist bones are small and, although well preserved, tend to shift out of place during decay and fossilization, making their positions difficult to interpret.”
“Observing this small bone in its correct position enabled me to thoroughly interpret the fossil wrists we had on hand, as well as those from previous studies.”
James Napoli, a vertebrate paleontologist and evolutionary biologist at Stony Brook University, noted:
“While it’s unclear how many times dinosaurs evolved flight, it’s fascinating that these experiments with flight appear only after this rearrangement of the wrist joint.”
“This adaptation may have established an automated mechanism found in present-day birds, although further research on dinosaur wrist bones is necessary to validate this hypothesis.”
Placing their findings within an evolutionary framework, the authors concluded that this adaptation was not confined to birds: it arose among theropod dinosaurs by the origin of Pennaraptora, a group of theropods that includes dromaeosaurids like Velociraptor and the oviraptorosaurs.
Overall, this group of dinosaurs exhibited bird-like features, including the emergence of feathered wings, indicating that flight evolved at least twice, if not up to five times.
“The evolutionary replacement of the ulnare was a gradual process that occurred much deeper in history than previously understood,” stated the researchers.
“In recent decades, our understanding of theropod dinosaur anatomy and evolution has expanded significantly, revealing many classical ‘bird-like’ traits such as thin-walled bones, larger brains, and feathers.
“Our findings suggest that the avian wrist arrangement follows a topological pattern that traces back to the origin of Pennaraptora.”
The team’s paper was published in the journal Nature on July 9, 2025.
____
JG Napoli et al. Theropod wrist reorganization preceded the origins of bird flight. Nature, Published online on July 9, 2025. doi:10.1038/s41586-025-09232-3
Planetary scientists have identified over 15,000 km of ancient riverbeds in the Noachis Terra region of Mars’ southern highlands, indicating that the planet may have been significantly wetter than previously believed.
This image depicts a flat-topped, eroded sinuous river ridge on Mars, with dunes migrating across it. Image credit: NASA/JPL/University of Arizona.
The nature of Mars’ climate during the Noachian-Hesperian transition, which occurred around 3.7 billion years ago, is still being debated. This period saw significant geological and climatic changes, as well as the formation of surface features like valley networks and lakes associated with liquid water.
There are two prevailing theories: the first suggests that early Mars was warm and wet, allowing liquid water to persist on the surface for extended periods. The second posits that Mars was generally cold and dry, with flowing water produced sporadically by melting ice during brief climate shifts.
Climate models that assume “warm and wet” conditions predict significant precipitation in Noachis Terra.
A recent study led by Open University Ph.D. student Adam Losekoot and his team analyzed the region’s sinuous ridges, also known as inverted channels.
“These formations likely resulted from sediments laid down by rivers that solidified, later exposed through the erosion of surrounding materials,” noted the lead researcher.
“Similar ridges have been identified in various Martian terrains.”
“Their presence implies that flowing water once traversed the area, with precipitation being the most probable source,” he added.
The team found that fluvial sinuous ridges are widespread throughout Noachis Terra, amounting to over 15,000 km in total length.
While many segments are isolated, some systems extend several hundred kilometers.
“Exploring Mars, particularly less altered regions like Noachis Terra, is thrilling because they have remained relatively unchanged for billions of years,” Losekoot commented.
“It acts as a time capsule that captures fundamental geological processes in ways that are impossible to observe on Earth.”
In their investigation, the researchers utilized data from three orbital devices: the Context Camera (CTX), the Mars Orbiter Laser Altimeter (MOLA), and the High-Resolution Imaging Science Experiment (HiRISE).
These datasets enabled them to map the locations, lengths, and forms of the ridge systems across various areas.
“Our findings present new evidence indicating that Mars was once a much more dynamic and complex planet than previously supposed,” they stated.
“The size and interconnectivity of these ridges suggest that liquid water existed for an extended period, indicating that Noachis Terra experienced warm, wet conditions for a geologically significant time.
“These results challenge the conventional belief that Mars has been predominantly cold and dry, with valleys formed only by sporadic, short-term meltwater from ice sheets.”
Paleontologists have announced the discovery of what they believe to be a new species of early-diverging neornithischian dinosaur, part of the Jurassic Yanliao Biota of northern China.
Skeleton of Plasaurustinron shown in side view. Image credit: Hailong Zhang.
Named Plasaurustinron, this newly identified dinosaur species lived in what is now China approximately 160 million years ago during the Jurassic period.
The ancient reptiles are part of what is known as the Yanliao Biota, a Jurassic ecosystem that included dinosaurs, mammals, amphibians, insects, lizards, and numerous plants.
“The Yanliao Biota is one of China’s most significant Mesozoic Lagerstätten, comprising fossil assemblages from the Jiulongshan and Tiaojishan Formations, dating from 168 to 157 million years ago,” according to researchers from the Chinese Academy of Sciences and Yunnan University.
“Overall, between 54 and 58 vertebrate species have been reported from the Yanliao Biota, which includes nine non-avian dinosaurs.”
“The Yanliao Biota preserves a large array of vertebrate material from various species, offering valuable insights into major paleontological milestones, such as the emergence of birds and the early evolution of mammals.”
“However, all non-avian dinosaurs previously found within the Yanliao Biota are small theropods, while Ornithischia is represented by merely one species, likely from the Jehol Biota.”
“This contrasts sharply with other contemporary Chinese terrestrial faunas, like the Shishugou and Shaximiao Faunas, where body sizes and taxonomic compositions are far more diverse.”
Plasaurustinron belongs to the group known as Neornithischia (New Ornithischians), a category of dinosaurs within the order Ornithischia.
First identified in 1985, Neornithischians are characterized by a thick layer of asymmetric enamel on the inner surfaces of their lower teeth.
“Neornithischia is a significant group of dinosaurs whose origins trace back to the Middle Jurassic, possibly represented by several early-diverging species found in China, such as Sanxiasaurus, Agilisaurus, and Hexinlusaurus,” said the paleontologists.
“Besides China, neornithischian fossils have been reported from Jurassic deposits elsewhere, including Scotland and eastern Europe, as well as from other geological periods.”
“Neornithischia experienced rapid diversification into numerous species during the Cretaceous period.”
Well-preserved specimens of Plasaurustinron were discovered in the Tiaojishan Formation in Hebei Province, China.
“The fossil comprises nearly complete skeletons encased in slabs of brownish-red sandstone,” the researchers noted.
“The specimen retains most of its skull and complete post-cranial skeleton.”
According to scientists, Plasaurustinron was a small neornithischian dinosaur.
“The specimen’s total length is approximately 72.2 cm (measured from the rostral end of the skull to the caudal end of the last preserved vertebra), whereas the skull measures around 8 cm,” they mentioned.
The identification of this new species enhances our understanding of the biodiversity of the Yanliao Biota and the evolutionary relationships of early-diverging neornithischians.
“Phylogenetic analyses position Plasaurustinron at the base of Neornithischia, close to Agilisaurus, the earliest known neornithischian,” the authors explained.
“This new species marks the first neornithischian found within the Yanliao ecosystem and will help bridge the temporal and geographical gaps in the distribution of Neornithischia in China.”
“Additionally, the preserved remains of Plasaurustinron represent the second documented occurrence of ossified laryngeal structures among non-avian dinosaurs.”
“The laryngeal structures observed in Plasaurustinron suggest the presence of ossified laryngeal devices across other dinosaur species.”
With laryngeal elements arranged in a manner resembling those of modern birds, Plasaurustinron may have possessed bird-like vocalizations.
The discovery of Plasaurustinron is detailed in a paper published in the journal PeerJ.
____
Y. Yang et al. 2025. A new neornithischian dinosaur from the Jurassic Tiaojishan Formation in northern China. PeerJ 13:e19664; doi:10.7717/peerj.19664
Ozempic and other GLP-1 medications might not need as frequent dosing as currently prescribed
Associated Press/Alamy
Individuals using GLP-1 medications such as Ozempic can still achieve weight loss despite facing difficulties in obtaining their prescriptions.
Medications like semaglutide, marketed under the names Ozempic and Wegovy, have transformed obesity treatment, yet surging demand has led to significant supply shortages. In the US, changes in insurance coverage for these drugs can also disrupt access. For instance, CVS Caremark, which helps insurers manage their prescription plans, recently discontinued coverage of Eli Lilly’s Zepbound, based on the GLP-1 drug tirzepatide.
To investigate the impact of such disruptions, Kaelen Medeiros and a colleague from a New York firm analyzed data from over 6,000 participants in the US who enrolled in their program for a year between 2021 and 2024.
The program provided access to an app delivering bi-weekly lessons aimed at optimizing lifestyle choices such as nutrition and physical activity. Additionally, participants enjoyed regular one-on-one consultations with a health coach who assisted in applying these lessons. For an extra charge, all participants received GLP-1 medications, like Ozempic, mainly on a weekly basis.
By the program’s conclusion, 73% of participants had experienced at least one disruption in GLP-1 access, defined as going without the medication for a minimum of 13 weeks. These participants received, on average, eight months’ supply of GLP-1 medication over the year and lost an average of 14% of their body weight, compared with a 17% reduction among those who faced no such disruptions. The findings were shared at the Endocrine Society’s annual meeting in San Francisco on July 14th.
A similar rate of weight loss was observed in the program’s second year, regardless of the consistency of GLP-1 supply. “Although these disruptions are concerning, it’s encouraging to see clinically meaningful weight loss achieved despite them,” Medeiros remarks.
“This study is promising,” says Priya Jaisinghani from NYU Langone Health, New York. However, further research is needed to assess how the health coaching and lifestyle lessons provided to participants influenced weight loss, she notes. The researchers didn’t measure the engagement levels of participants in this segment of the program. Medeiros pointed out that variations in adherence might have impacted the outcomes.
Some participants also took metformin, a medication for type 2 diabetes that can aid in weight loss. Nevertheless, metformin is associated with only about a 2% reduction in body weight.
The nor’easter that caused flooding in Lynn, Massachusetts, in January 2024
CJ Gunther/EPA-EFE/Shutterstock
The notorious New England storm systems known as nor’easters have intensified since the 1940s, posing an increased threat to the northeastern US coast, likely due to elevated ocean temperatures.
“The cause is clear: rising sea surface temperatures, driven by greenhouse gas emissions, are behind the trend,” states Michael Mann at the University of Pennsylvania.
Mann and his team have compiled data on nor’easters and their trajectories over the last 85 years. They employed statistical techniques to discern patterns in maximum wind speeds and variations in precipitation during storms.
“Our findings indicate that while we couldn’t pinpoint significant changes in the average intensity of these storms, the most powerful storms are indeed gaining strength,” Mann revealed.
This phenomenon relates to how the sea surface temperatures that power the storms interact with other factors, such as wind shear, to set overall strength. Weaker storms are influenced more by factors other than ocean temperature, which sets the ceiling on potential storm strength. “To personify them a bit, the strongest storms have the chance to realize their full potential,” Mann commented.
While hurricanes at tropical latitudes are known to behave this way, how nor’easters respond to rising temperatures is less well understood. “Unlike hurricanes, nor’easters draw energy from a variety of sources,” notes Brian Tang at the University at Albany, New York.
A slight increase in both intensity and precipitation has been observed. The alteration in wind speed of the strongest storms is just shy of 2 meters per second since 1940.
Combined with rising sea levels, storm surges are causing flooding along the coast, while increases in snow and rain contribute to flooding inland. “The primary hazard is water,” Tang emphasizes.
This article was revised on July 14th, 2025: we have corrected the change in wind speed of the strongest nor’easters.
Lithium-based batteries, such as those used in electric vehicles, face the danger of overheating
Yonhap/EPA-EFE/Shutterstock
Batteries enhanced with polymeric materials that emit chemicals to suppress flames at elevated temperatures are considerably less prone to catching fire. This innovation can markedly improve the safety of battery-operated devices, including electric vehicles and medical equipment.
“Our method enhances safety in conventional liquid lithium batteries,” says Ying Zhang at the Institute of Chemistry of the Chinese Academy of Sciences. “It functions like a safety valve. These chemicals help to stifle flammable gases before they ignite, thus preventing fires.”
Zhang and her team developed and tested polymeric materials that extinguished flames in prototype lithium metal batteries. Future versions of these batteries are expected to replace current batteries in electric vehicles and portable electronic gadgets. Lithium metal batteries can store up to ten times more energy than widely used lithium-ion batteries by using pure lithium in place of graphite for the negative electrode.
The researchers incrementally raised the temperature of the prototype battery and of standard lithium metal batteries, starting from 50°C. Once the temperature exceeded 100°C, both batteries began to overheat, but the special polymeric material in the prototype broke down on its own, releasing chemicals that functioned as “microscopic fire extinguishers”.
At temperatures surpassing 120°C, the standard battery without safety mechanisms overheated to 1000°C within 13 minutes and ignited. In contrast, under similar circumstances, the prototype battery’s peak temperature reached 220°C without any fire or explosion.
This “innovative material science strategy” could benefit not only lithium metal batteries but also certain lithium-ion and lithium-sulfur batteries, lowering the risk of battery fires and overheating, says Jagjit Nanda at the SLAC National Accelerator Laboratory in California. He expects it could lead to safer batteries, especially for electric vehicles and aircraft.
Fire control technology has been incorporated into current battery manufacturing as a “short-term safety enhancement,” and the industry is actively seeking a long-term solution that encompasses alternative battery designs and materials, according to Zhang. However, she notes that integrating polymeric materials into the battery necessitates a re-manufacturing process.
University of Pennsylvania researchers used a deep learning tool named Apex to explore a worldwide venom dataset in search of new antibiotic candidates.
Venoms are a rich source of previously hidden antibiotic scaffolds, showing that merging experimental validation with extensive computational mining can accelerate the search for urgently needed antibiotics. Image credit: Guan et al., doi: 10.1038/s41467-025-60051-6.
The increasing prevalence of antibiotic-resistant pathogens, especially Gram-negative bacteria, underscores the critical demand for new treatments.
Venoms represent a vast, largely untapped source of bioactive molecules with potential antibacterial properties.
In their recent study, researcher César de la Fuente and his team analyzed a comprehensive database containing 16,123 venom proteins and over 40 million venom-encoded peptides using the Apex deep learning model.
The algorithm successfully pinpointed 386 candidate peptides that differ in structure and function from known antimicrobial peptides.
“These venoms are evolutionary wonders, yet their antibacterial capabilities have not been thoroughly examined,” said Dr. de la Fuente.
“Apex can rapidly explore extensive chemical landscapes, identifying exceptional peptides that combat some of the most stubborn pathogens worldwide.”
From the potential candidates selected by AI, scientists synthesized 58 peptide variants for laboratory assessment.
Remarkably, 53 of these demonstrated efficacy against drug-resistant bacteria such as E. coli and Staphylococcus aureus, at doses safe for human red blood cells.
“By combining computational analysis with traditional laboratory techniques, we achieved one of the most thorough antibiotic studies to date,” noted Dr. Marcelo Torres, co-author of the research.
“The platform has mapped over 2,000 novel antibacterial motifs, enhancing its capacity to eliminate or suppress bacterial growth through short, specific amino acid sequences within proteins or peptides.”
“Our team is now advancing the top peptide candidates towards the development of new antibiotics, optimizing them through medicinal chemistry modulation.”
The results were published in the journal Nature Communications.
____
C. Guan et al. 2025. A global assessment of venom data for antibacterial discovery using artificial intelligence techniques. Nat Commun 16, 6446; doi:10.1038/s41467-025-60051-6
Annapolis, Maryland – Blue crabs are in jeopardy in the Chesapeake Bay in Maryland.
This season’s survey of the iconic crustaceans recorded one of the lowest population levels ever. That has driven up prices at restaurants just as inflation has raised the cost of food and other consumer items and squeezed disposable income.
Luke McFadden, 29, who has been crabbing since he was 18, mentioned that the season got off to a tough start.
“We’re doing our best to serve our customers at the lowest price possible to cover our expenses,” he remarked. “But I get it; it’s not easy.”
Crabber Luke McFadden. Cesar Gonzalez / NBC News
At a family-owned crab house, Pit Boys, Seafood Manager Charlie George indicated that customer prices range from $75 to $140 based on size. This is “much higher” than previous years due to the shortage of crabs in the bay.
According to the 2025 Blue Crab Advisory Report, the overall blue crab population has declined to approximately 238 million, down from 317 million last year. This marks the second lowest level since the annual winter dredge survey commenced in 1990.
Alison Colden, Executive Director of the Chesapeake Bay Foundation, attributes the decline to pollution, climate change, and the encroachment of invasive blue catfish throughout the Chesapeake Bay. These catfish were introduced in the 1970s and 1980s to enhance recreational fishing.
“Since then, they’ve proliferated across nearly every river and stream in the Chesapeake Bay region,” Colden remarked. “They are voracious predators.”
Recent images from the NASA/ESA Hubble Space Telescope highlight NGC 1786, a globular cluster located in the constellation of Dorado.
This Hubble image depicts NGC 1786, a globular cluster approximately 163,000 light-years away in the Dorado constellation. The color image was created from multiple exposures captured in the visible and near-infrared regions of the spectrum using Hubble’s Wide Field Camera 3 (WFC3), with three filters sampling different wavelengths. Colors were assigned by applying distinct hues to the monochromatic image from each filter. Image credit: NASA / ESA / Hubble / M. Monelli / M. H. Özsaraç.
Globular clusters are ancient star systems, bound together by gravity, typically spanning around 100-200 light-years.
These clusters host hundreds of thousands, or even millions, of stars. The significant masses at the cluster’s core attract stars inward, forming a spherical configuration.
Considered among the universe’s oldest known objects, globular clusters are remnants from the early galactic era. All galaxies are believed to harbor a population of these structures.
The Large Magellanic Cloud, a neighboring dwarf galaxy located about 163,000 light-years away, possesses roughly 60 globular clusters, including NGC 1786.
This globular cluster, also referred to as ESO 56-39, was discovered on December 20, 1835, by the British astronomer John Herschel.
“Data for the new image come from an observing program comparing globular clusters in dwarf galaxies near the Milky Way, including the Large and Small Magellanic Clouds and the Fornax dwarf spheroidal galaxy, with those in our own galaxy,” stated Hubble astronomers.
“Our galaxy contains over 150 of these extensively studied ancient globular clusters.
“Due to its stability and longevity, it acts as a galactic time capsule, preserving stars from the galaxy’s formative stages.”
“While it was once believed that all stars in globular clusters formed nearly simultaneously, our research on ancient clusters within our galaxy has revealed multiple populations of stars of varying ages,” they further explained.
“To utilize globular clusters as historical markers, it’s essential to comprehend their formation and the origins of stars of different ages.”
“This observational program analyzed older globular clusters like NGC 1786 in external galaxies to determine whether they contained multiple star populations.”
“Such studies can provide insights into the original formation mechanisms of the Large Magellanic Cloud as well as the Milky Way galaxy.”
The innovative battery storage solution, built on supercapacitor technology, may “leapfrog” traditional lithium-ion batteries, transforming the landscape for renewable energy storage and use, according to its creator.
On July 8th, British firm SuperDielectrics unveiled its new prototype storage system, dubbed the Faraday 2, at an event in central London. Incorporating a polymer designed for contact lenses, this system boasts a lower energy density than lithium-ion batteries but claims numerous advantages, such as quicker charging, enhanced safety, reduced costs, and a recyclable framework.
“The current energy storage market at home is reminiscent of the computer market around 1980,” said SuperDielectrics’ Marcus Scott while addressing journalists and investors. “Access to clean, reliable, and affordable electricity isn’t a future goal; it’s now a practical reality, and we believe we are creating the technology to support it.”
Energy storage is pivotal for the global transition to green energy, crucial for providing stable electricity despite the intermittent nature of wind and solar power. While lithium-ion batteries dominate the storage technology market, they present challenges, including high costs, limited resources, complex recycling processes, and safety risks like overheating explosions.
With its aqueous battery design grounded in supercapacitor technology, SuperDielectrics aims to address these challenges. Supercapacitors store energy on material surfaces, facilitating extremely rapid charge and discharge cycles, albeit with lower energy density.
The company’s design employs a zinc electrolyte, separated from the carbon electrode by a polymer membrane. SuperDielectrics asserts that this membrane technology is cost-effective, utilizing abundant raw materials, thus unlocking a new generation of supercapacitors with significant energy storage capabilities.
During the event, the company’s CEO Jim Heathcote mentioned that the technology could outperform lithium-ion systems in renewable energy storage.
The Faraday 2 builds on the earlier Faraday 1 prototype launched last year, roughly doubling its energy density. The Faraday 2 operates at 1-40 Wh/kg and charges quickly, which would let it harness fleeting spikes in renewable energy production, Heathcote noted.
However, Gareth Hinds from the UK National Physical Laboratory points out that the technology still lags behind lithium-ion batteries, which can achieve around 300 Wh/kg at the cell level. Andrew Abbott of the University of Leicester adds that the energy density now offered by SuperDielectrics is akin to that of lead-acid batteries commonly used in automobiles and backup power systems. “There are no immediate plans among leading manufacturers to transition,” he states.
Marcus Newborough, scientific advisor at SuperDielectrics, acknowledges that they are still “on a journey” to enhance the system’s energy density. “We are aware of our high theoretical energy density,” he mentioned, noting the company’s commitment to realizing this potential in the coming years, aiming for a commercial energy storage solution ready for launch by the end of 2027.
Despite the optimism, Hinds remains skeptical about the technology competing with lithium-ion batteries regarding energy density. “Clearly, it’s an early-stage development, and while they continue to push for higher energy density, achieving lithium-ion levels is a significant challenge due to strict limitations,” he comments.
Nonetheless, he suggests that there could be a market for larger storage solutions that provide lower energy density but at a much more affordable price than lithium-ion batteries and with a longer lifespan.
Sam Cooper from Imperial College, London, concurs: “If we can develop a system offering equal energy storage capacity to the Tesla Powerwall, regardless of size or weight, and at a cost of 95% less, that would represent a groundbreaking achievement.”
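Abbott's and Cooper's points can be made concrete with some back-of-envelope arithmetic. The sketch below uses the energy densities quoted in this article (roughly 40 Wh/kg at the top of the Faraday 2's range versus roughly 300 Wh/kg for lithium-ion cells); the 13.5 kWh Powerwall-class capacity is an assumed figure for illustration, not a number from the article.

```python
# Mass of active storage material needed for a home-storage system of
# assumed 13.5 kWh capacity, at the two energy densities quoted above.
CAPACITY_WH = 13_500   # assumed Powerwall-class capacity (illustrative)

def storage_mass_kg(capacity_wh: float, density_wh_per_kg: float) -> float:
    """Mass needed to store `capacity_wh` at a given gravimetric density."""
    return capacity_wh / density_wh_per_kg

faraday2_kg = storage_mass_kg(CAPACITY_WH, 40)    # upper end for Faraday 2
lithium_kg = storage_mass_kg(CAPACITY_WH, 300)    # lithium-ion, cell level
print(f"Faraday 2: ~{faraday2_kg:.0f} kg vs lithium-ion: ~{lithium_kg:.0f} kg")
```

Roughly 340 kg versus 45 kg: the supercapacitor system would be far heavier for the same capacity, which is why its case rests on cost, safety, and lifespan rather than energy density.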
Astronomers are keen to find planets that closely resemble Earth in size, composition, and temperature, but such planets pose several challenges. They are small and rocky, making them hard to detect, and current planet-hunting methods tend to favor gas giants. For a planet to have temperatures similar to Earth’s, it must orbit its host star at roughly the distance Earth orbits the sun, which means it takes about a year to complete an orbit. That raises a further difficulty: finding Earth-like planets around a star requires telescopes to monitor it for more than a year.
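The year-long-orbit point follows directly from Kepler's third law. A minimal sketch (standard orbital mechanics, not something from the study itself):

```python
# Kepler's third law in convenient units: P^2 = a^3 / M, with period P in
# years, semi-major axis a in astronomical units, and stellar mass M in
# solar masses. An Earth twin at 1 au around a sun-like star takes 1 year.
def orbital_period_years(a_au: float, stellar_mass_msun: float = 1.0) -> float:
    return (a_au ** 3 / stellar_mass_msun) ** 0.5

print(orbital_period_years(1.0))    # Earth analogue: 1.0 year
print(orbital_period_years(0.39))   # Mercury-like orbit: ~0.24 years
```

A close-in planet can be confirmed in weeks of monitoring; an Earth analogue forces a survey telescope to watch the same star for more than a year to catch even a couple of transits.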
To maximize efficiency and reduce time spent on monitoring, scientists are seeking alternative methods to identify promising stars for in-depth searches before committing resources. A team of astronomers explored whether observable characteristics of planetary systems could indicate the presence of Earth-like planets. They found that the arrangement of known planets, along with their mass, radius, and proximity to their nearest star, could help predict the likelihood of Earth-like planets existing in those systems.
The team tested their approach using machine learning. They began by compiling a sample of planetary systems, some with Earth-like planets and some without. Because astronomers have discovered only about 5,000 stars that host orbiting planets, this sample was too small to train machine learning models effectively. The team therefore generated three sets of synthetic planetary systems using the Bern model, a computational framework that simulates how planets form.
The Bern model starts with 20 dust clumps around 600 meters (about 2,000 feet) across, which seed the accumulation of gas and dust into full-sized planets over roughly 20 million years. Each system is then evolved to a stable state over more than 10 billion years, yielding a synthetic planetary system that astronomers can add to their datasets. Using the model, the team created 24,365 systems around sun-like stars, plus further sets of 14,559 and 14,958 systems around other types of star. Each set was subdivided into systems containing Earth-like planets and systems without.
With these larger datasets in hand, the team turned to a machine learning technique called a random forest to classify planetary systems by their potential to host Earth-like planets. A random forest outputs true or false by pooling the votes of many decision trees, each trained on a random subset of the training data. The team set things up so that a system that could host one or more Earth-like planets should be classified as “true”, and they evaluated the algorithm’s accuracy using a metric known as the precision score.
The random forests made decisions based on characteristics of each synthetic planetary system that astronomers can also measure in real, observed systems: the system’s total planet count, the mass and orbital distance of planets more than 100 times Earth’s mass, and the properties of the host star. The team used 80% of the synthetic planetary systems as training data, reserving the remaining 20% for testing the trained algorithm.
The findings revealed that the random forest models accurately predicted where Earth-like planets are likely to exist with an impressive precision score of 0.99. Building on this success, they tested the model against data from 1,567 stars of similar sizes, each with at least one known orbiting planet. Out of these, 44 met the algorithm’s threshold for having Earth-like planets, suggesting that the majority of systems in this subset are stable enough to host such planets.
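The workflow described above can be sketched with a standard random forest implementation. Everything below, including the feature choices, the toy labelling rule, and the synthetic data, is an invented stand-in for the Bern-model outputs; it is shown only to illustrate the train/test split, the true/false classification, and the precision score.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000

# Toy features loosely echoing the article: planet count, mass and orbital
# distance of the most massive planet, and stellar mass (arbitrary units).
X = np.column_stack([
    rng.integers(1, 10, n).astype(float),  # number of planets
    rng.uniform(0.1, 500, n),              # most massive planet (Earth masses)
    rng.uniform(0.05, 20, n),              # its orbital distance (au)
    rng.uniform(0.5, 1.5, n),              # stellar mass (solar masses)
])
# Toy rule: several planets and no close-in giant -> "hosts an Earth-like".
y = (X[:, 0] >= 4) & ~((X[:, 1] > 100) & (X[:, 2] < 1))

# 80/20 train/test split, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
prec = precision_score(y_te, clf.predict(X_te))
print("precision:", round(prec, 3))
```

Precision is the fraction of systems flagged “true” that really do host an Earth-like planet, which is the right metric when the goal is to avoid wasting telescope time on false positives.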
The team concluded that their models can effectively identify candidate stars for hosting Earth-like planets; however, they issued a caution. One concern is that synthesizing planetary systems is time-consuming and resource-intensive, limiting the available training data. A more significant caveat is the assumption that the Bern model accurately simulates how planetary systems form and are structured. They urged researchers to rigorously validate such models in future theoretical work.
Coal Power Plants Contribute to Cooling via Sulphate Pollution
Frank Hermann/Getty Images
Declining sulfate air pollution is allowing clouds to darken, reducing the sunlight they reflect. This factor could contribute to recent temperature increases beyond greenhouse gas effects alone.
“Two-thirds of the global warming observed since 2001 is attributed not to rising CO2 levels, but to decreasing SO2 levels,” says Peter Cox from the University of Exeter, UK.
Some incoming sunlight is reflected back to space, while the rest is absorbed and later re-emitted as heat; increased carbon dioxide levels enhance the retention of that heat. This greenhouse effect is a primary driver of global warming, but the albedo, or reflectivity of the planet, also significantly influences temperature.
Since 2001, satellite instruments such as CERES have measured how much sunlight Earth reflects and absorbs. These observations reveal a decline in reflectivity, indicating a darker planet with diminishing albedo, which intensifies warming.
Factors contributing to this reduced albedo include diminished snow and sea ice as well as fewer clouds. However, Cox and Margaux Marchant’s analysis of Ceres data spanning 2001 to 2019 suggests that the most significant contributor is the darkening of clouds.
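The link between albedo and temperature can be illustrated with the textbook zero-dimensional energy-balance model (standard physics, not the CERES analysis itself): the equilibrium temperature is T = [S(1 − α) / 4σ]^(1/4), so even a small drop in the albedo α warms the planet.

```python
# Zero-dimensional energy balance: sunlight absorbed = heat radiated,
# giving T = (S * (1 - albedo) / (4 * sigma)) ** 0.25.
SIGMA = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0                # solar constant, W m^-2

def equilibrium_temp_k(albedo: float) -> float:
    return (S * (1 - albedo) / (4 * SIGMA)) ** 0.25

t_base = equilibrium_temp_k(0.30)      # Earth's present-day albedo is ~0.30
t_darker = equilibrium_temp_k(0.29)    # a 0.01 drop from dimmer clouds
print(f"warming from a 0.01 albedo drop: {t_darker - t_base:.2f} K")
```

This toy model ignores the greenhouse effect entirely (it gives an equilibrium near 255 K, well below Earth's ~288 K surface temperature), but it shows why percent-level changes in cloud reflectivity matter so much for the planet's energy budget.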
Industrial and maritime sulfate emissions are known to enhance the density of cloud droplets, improving their reflectivity. This principle underpins a proposed geoengineering technique called Marine Cloud Brightening. However, recent shifts away from high-sulfur fuels like coal have led to reductions in these emissions.
Marchant and Cox therefore explored whether the observed loss of cloud brightness is linked to reduced SO2 levels, and found that the two are correlated. They presented initial findings at the Exeter Climate Forum recently.
These findings are somewhat reassuring. The recent acceleration in warming has led some researchers to fear that climate sensitivity (the temperature rise associated with a given increase in atmospheric CO2) lies at the upper end of estimates. If the cloud darkening were a feedback driven by rising CO2, it would imply greater warming to come as emissions continue; if it is instead a short-term effect of reduced pollution, the extra warming should be transient.
“If this darkening is a result of decreased SO2 emissions, rather than a genuine shift in cloud feedbacks indicating greater climate sensitivity than previously thought, it is promising news,” says Laura Wilcox at the University of Reading, UK, who was not involved in the research.
Wilcox notes limitations in the datasets used by Marchant and Cox; for instance, the SO2 emissions data may have changed since their analysis.
Furthermore, two recent studies suggest the dimming is largely due to reduced cloud cover, not darker clouds. “The factors behind these recent darkening trends are currently being intensely debated,” she says.
Overall, Wilcox adds that her research supports the view that the recent acceleration of global warming is chiefly driven by reduced air pollution, and this effect is likely to be temporary.
“Isometric movements like planks can help alleviate pain.”
Sutulastock/Shutterstock
In my previous article, I discussed how isometric exercises, which involve holding muscles in a fixed position, can effectively lower blood pressure. Since then, I’ve started integrating them into my workouts. This leads me to ask: Do these exercises provide additional benefits?
The answer is yes—and some benefits were surprising. Isometric exercises, such as planks (as shown) and wall squats, can alleviate pain, prevent injuries, and significantly enhance fitness in an efficient manner. In fact, most individuals will likely gain from incorporating these into their workout regimes.
One of the most notable advantages of isometric exercises is their ability to build strength with minimal movement, making them less physically demanding compared to more dynamic workouts. They are indeed effective: a review revealed that isometric training over 42-100 days could boost muscle strength by as much as 92%.
These strength gains can be highly targeted. Athletes frequently employ isometric exercises to strengthen the challenging aspects of their movements, like the lowest point in a squat. This focused training may enhance overall performance, as researchers have discovered that isometric training could surpass jump-based training in terms of durability.
Moreover, these exercises are gentle on the body, making them easy to include at the beginning or end of a standard workout, providing extra benefits. They serve as excellent warm-ups and research has shown they can reduce muscle soreness post-exercise without hindering running performance. This contrasts with static stretching, which doesn’t alleviate muscle pain and can actually decrease performance.
Incorporating some isometric movements into your warm-up routine can also help prevent injuries. Slow, controlled strength training is commonly used to guard against hamstring injuries in soccer players, but one study found isometric exercise to be even more effective.
While the exact mechanism remains unclear, it appears that isometric exercises can activate the signaling pathways between nerves and muscles, enhancing muscle responsiveness during workouts. This could help in correcting muscle imbalances, which are often a source of injuries.
These advantages are not limited to athletes. A review published this year demonstrated that isometric training significantly reduces pain and strengthens muscles in individuals with osteoarthritis. Because they are low-impact, these exercises are perfect for beginners and those with limited mobility due to injuries.
Considering their myriad benefits, isometric exercises have become a consistent part of my training routine. Furthermore, because they require no equipment and minimal space, I can perform them almost anywhere at any time.
Grace Wade is a health reporter for New Scientist, based in the US.
Mel Brooks and Carl Reiner had a nightly movie ritual, often indulging in cheesy films where lines like “secure the perimeter!” get thrown around. So, why bring this up in relation to Foundation? Because this adaptation of Isaac Asimov’s work started out full of provocative ideas but has since become a formulaic, by-the-numbers experience.
It’s been two years since the last season of Foundation, so if you’re hazy on the plot, here’s a quick recap: the empire has long been governed by a genetic dynasty. Three clone emperors—Dawn, Day, and Dusk—rule under the watchful eye of Demerzel (Laura Birn), the last surviving robot. Roughly 150 years after season 2, the first Foundation, conceived to succeed the empire, now governs the outer planets.
Hari Seldon (Jared Harris), who foresaw the empire’s fall through the mathematical theory of psychohistory, has uploaded his consciousness to a secure location ahead of the impending “Seldon crisis”. These crises are pivotal moments that can plunge the galaxy into epochs of darkness. Meanwhile, the second Foundation—a secretive colony with telepathic prowess—operates covertly, aiming to avert the third Seldon crisis, led by another version of Seldon and his protégée, Gaal Dornick (Lou Llobell).
Visually stunning—an array of exquisitely rendered planets render the cosmos seemingly infinite.
That’s the essential backdrop as we venture into Foundation’s third season. There’s much to unpack, especially with new characters joining the narrative: Quent (Cherry Jones), the Foundation’s first ambassador, navigating a complicated rapport with the empire; Han Pritcher (Brandon P. Bell), a spy operating between the two Foundations; and Toran Mallow (Cody Fern), a descendant of the wily Hober Mallow from season 2.
This ensemble is designed to create a rich, intricate universe filled with well-crafted characters. The show skillfully merges drama with grand concepts, particularly those involving Demerzel, and it is visually spectacular, with numerous planets rendered beautifully.
However, herein lies the paradox: while Foundation strives for intellectual stimulation through its lore and epic scope, many of its plotlines come off as ridiculous and superficial. The most captivating elements—the two Seldons, the potential alliance between the Foundation and the empire, and the intrigues among the three emperors—remain largely unexplored, and the narrative often feels intellectually shallow. And don’t get me started on the awkward dialogue; phrases like “we have a partnership” made me cringe, not to mention the repeated insistence to “secure the perimeter!”
It’s disheartening to watch a promising show decline while retaining traces of its former brilliance. After viewing nine episodes, I’m hopeful the tenth will tie everything together, much like Seldon’s Vault with its buried secrets finally unearthed. Until then, whether you can overlook its shortcomings will determine your enjoyment of Foundation, which now sits a step below the television gems it once resembled, though it still offers a serviceable sort of entertainment.
Recommendations for Further Viewing…
Andor Disney+ Foundation caters to historical enthusiasts intrigued by civilization’s cycles. For a similar experience, check out this Star Wars series that chronicles key figures in a very different empire’s downfall—something quite rare.
The Rise and Fall of the Galactic Empire Chris Kempshall While still rooted in Star Wars, this narrative of Emperor Palpatine’s 24-year reign, depicted from an in-universe historian’s viewpoint, makes for an engaging read.
The Art and Science of Writing Science Fiction
Engage in the craft of science fiction writing this weekend by creating new worlds and artistic creations.
New records for black holes have transformed our understanding of the universe’s most extreme entities.
The Laser Interferometer Gravitational-Wave Observatory (LIGO) began its groundbreaking detections of gravitational waves—ripples in the fabric of spacetime—ten years ago, and has since recorded nearly 100 black hole collisions. On November 23, 2023, LIGO picked up a signal so unusual that it initially defied explanation. According to Sophie Binnie at the California Institute of Technology, her team ultimately concluded that it corresponded to the largest black hole merger ever recorded.
One of the merging black holes was approximately 100 times the mass of the sun, while the other neared 140 solar masses. Previous records featured black holes that were almost half as massive, primarily due to earlier mergers. Team member Mark Hannam from Cardiff University, UK, emphasized that these black holes were not only immense but also spinning at such high speeds that they challenged mathematical models of the universe regarding their formation.
According to Hannam, the masses of these black holes exceed those typically formed from the collapse of aging stars, suggesting they likely resulted from earlier mergers between smaller black holes. “It’s possible that multiple mergers have occurred,” he notes.
“A decade ago, we were astonished to find black holes of around 30 solar masses. Now, we observe black holes of over 100 solar masses,” adds Davide Gerosa at the University of Milano-Bicocca in Italy. He notes that gravitational wave signals from such large, rapidly rotating black holes are shorter and consequently harder to detect. Binnie presented the findings at the Edoardo Amaldi Conference on Gravitational Waves in Glasgow, UK, on July 14.
Both Hannam and Binnie emphasize that future observations of similarly remarkable mergers are essential to further decipher these new signals, including unraveling the origins of black holes. As upgrades progress, LIGO is expected to detect more cosmic record-breakers. Yet, in May, the Trump administration proposed halving resources at the facility, which, in Hannam’s opinion, could render capturing new signals exceedingly difficult.
A closer look at the internet reveals numerous charming videos of dogs seemingly learning to “speak” with their owners by using electronic buttons pre-programmed to produce specific words, often referred to as soundboards.
Take, for instance, the Labrador Copper, who presses the “eat” button when craving cheese, and Bunny, who taps “where” and “dad” when her owner is away.
While it may seem impressive, the question arises: are these dogs truly communicating, or are they merely reacting to cues from their owners?
Federico Rossano at the Department of Cognitive Science of the University of California, San Diego, was intrigued. He and his colleagues enlisted 59 dogs trained by their owners to use these soundboards.
Working in the dogs’ homes, owners and researchers pressed selected buttons at random and recorded whether the dogs responded appropriately: approaching the door after hearing “outside”, for example, or their food bowl after hearing a food-related word.
At least for some words, the answer was indeed “yes.” Dogs were significantly more likely to exhibit play-related behaviors upon hearing the word “play” and looked towards the door when they heard “outside.”
Are these dogs genuinely communicating, or are they simply reacting to their owners’ cues? – Photo credit: Aramie
Importantly, these responses held regardless of whether the button was pressed by the owner or a researcher; it didn’t matter who pressed the button or spoke the word aloud.
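The owner-versus-researcher comparison boils down to asking whether two response rates differ by more than chance. Here is a minimal sketch using a two-proportion z-test; the counts are invented for illustration and are not the study's data.

```python
import math

def two_proportion_z(hits1: int, n1: int, hits2: int, n2: int) -> float:
    """z statistic for the difference between two response rates."""
    p_pool = (hits1 + hits2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (hits1 / n1 - hits2 / n2) / se

# Invented example: the dog goes to the door on 30 of 50 owner presses of
# "outside" and on 28 of 50 researcher presses.
z = two_proportion_z(30, 50, 28, 50)
print(f"z = {z:.2f}")   # |z| < 1.96 means no significant difference at p < 0.05
```

A small z value, as here, is consistent with the study's finding: the dogs' responses did not depend on who pressed the button.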
This suggests that the dog isn’t merely reading body language from the owner but is actually processing the words themselves, according to Rossano.
So, is the debate settled? Not quite. The study indicates that dogs can recognize and respond to verbal cues (which we already knew), but critics argue that this doesn’t clarify what the words convey for dogs.
So when Bunny asks, “Where’s Dad?” and her owner replies, “He’s on a climbing trip now,” does she truly understand? For now, Bunny is the only one who knows.
This article answers the question posed by Hatty Kingston from Bristol: “Do dogs truly understand the words associated with soundboard buttons?”
To submit your questions, email us at questions@sciencefocus.com or message us on Facebook, Twitter or Instagram (don’t forget to include your name and location).
Explore our ultimate fun facts page and discover more amazing science.
This mineral is crucial for energy production, brain development, and maintaining a robust immune system.
According to the World Health Organization, anemia affects 31% of women of reproductive age, 36% of pregnant women, and 40% of children under 5.
Inflammation, whether from acute illness or chronic conditions such as obesity, can interfere with iron absorption. With rising global obesity and chronic disease rates, this creates additional challenges in tackling iron deficiency worldwide.
Iron Deficiency
Iron deficiency can lead to anemia, as iron is vital for red blood cell production. Anemia is characterized by low hemoglobin levels, the protein that gives blood its red color and transports oxygen.
The World Health Organization reports that anemia affects 31% of adult women of reproductive age, 36% of pregnant women, and 40% of children under 5 years old. Approximately half of all global anemia cases result from iron deficiency. Common symptoms include pale skin, fatigue, shortness of breath, and irregular heartbeat (known as palpitations).
Iron deficiency poses serious health risks, especially when it causes anemia, including a weakened immune system, complications during pregnancy and childbirth, maternal and infant mortality, and delayed growth and brain development in children.
Diet can influence iron absorption. – Photo credit: Getty
The repercussions of iron deficiency are particularly severe for women and children, who are the most susceptible.
Menstruating women have a heightened need for iron due to monthly blood loss. Pregnant women require extra iron for the placenta, fetus, and increased blood volume. Children need iron for rapid growth and brain development, making adolescent girls—who are both growing and menstruating—especially vulnerable.
In their study, Benson and Law convened a panel of 26 experts alongside four patient representatives. Their collective recommendations advocate for a more positive and inclusive strategy for managing iron deficiency, particularly for at-risk populations.
The panel stressed the importance of regular screening during pregnancy and early childhood. They emphasized using ferritin, a blood protein that reflects the body’s iron stores, as a reliable marker for diagnosing iron deficiency and determining when to intervene.
If treatment is necessary, oral iron supplements are the first recommendation. They are effective, widely accessible, and cost-effective. For those experiencing side effects like nausea and constipation, the panel suggested taking supplements on alternate days to enhance tolerability. In more severe instances, or if oral iron proves ineffective, intravenous iron may be needed.
Lastly, the panel asserted that iron deficiency should not be treated as an isolated issue, but addressed as part of routine care for mothers and children, alongside pregnancy check-ups, child health visits, and nutrition programs.
Iron Advice
While some individuals may need treatments for iron deficiency, many cases can be prevented through daily dietary choices.
Begin by adding more iron-rich foods to your meals, such as pulses, legumes, green leafy vegetables, nuts, and iron-fortified cereals (opt for lower sugar options for kids and adolescents).
For those consuming animal products, limit intake to moderate amounts of lean meat—about 70g (2.5oz) per day, as recommended by the UK Eatwell Guide—which can provide easily absorbable iron.
If you primarily follow a plant-based diet, consider pairing iron-rich foods with vitamin C sources like lemon juice, tomatoes, and strawberries to enhance iron absorption.
Avoid drinking tea or coffee during meals as polyphenols can hinder iron absorption; this applies to taking iron supplements as well. Consuming them with a vitamin C source, such as orange juice, can significantly improve absorption.
If you belong to a higher-risk group—such as menstruating individuals or caregivers of young children—or if you experience excessive fatigue, consult your doctor. A simple blood test can evaluate your iron levels. In children, iron deficiency may also manifest as unusual cravings, such as for ice or non-food items.
Iron deficiency is prevalent but manageable and often preventable. With awareness and mindful choices, maintaining healthy iron levels can be as straightforward as selecting what goes on your plate.
Ageing can affect nearly every part of our body, but the knees bear a particularly heavy burden as the years pass. Why? Because they are intricate anatomical structures with numerous components that undergo constant wear and tear.
As we grow older, the muscles supporting our knees weaken, and bone density declines. The cartilage that cushions our bones may wear thin, and the ligaments connecting them also lose elasticity.
This leads to stiffness, pain, reduced mobility, and those involuntary sounds we all make when we rise from the sofa.
Fortunately, like any machine, our knees can thrive with proper care. Understanding the risk factors for knee injuries and osteoarthritis is a crucial first step.
The main threats to healthy knees include excess body weight, unsupportive footwear, and activities that repeatedly load the joints.
For instance, in 2021, French researchers found that Parkour athletes experienced an average of 1.7 knee or ankle injuries per 1,000 hours of training.
While this might not sound alarming, it can lead to long-term damage—affecting more than just those jumping off risky buildings.
A study focused on elite dancers revealed that knee injuries, such as meniscus tears, are among the most prevalent injuries.
Activities that involve impact on the legs or require rapid direction changes can strain the knees significantly. A severe ACL injury can sideline football players for up to a year.
Moreover, research suggests that ten to twenty years post-injury, about half of those affected may develop osteoarthritis linked to their original injury.
There’s also an ongoing debate about whether running is beneficial or detrimental to knee health. For example, some claim running on pavement can be as punishing as hitting the soles of your feet with a hammer for an hour.
As you age, the muscles that stabilize your knees become weaker and bone density decreases – Illustration credit: Daniel Bright
However, a 2017 study found no significant differences in the risk of knee osteoarthritis between runners and non-runners.
In fact, exercise is believed to strengthen joints. A 2023 study indicated that individuals engaged in strength training are up to 20% less likely to experience osteoarthritis than those who do not.
Additionally, strengthening the surrounding muscles, such as the quadriceps, appears beneficial. Beyond investing in properly fitting shoes and maintaining a regular exercise routine, another key practice for knee care is paying attention to mild discomfort.
Minor injuries can easily escalate into more serious, long-lasting conditions. If experiencing pain, consider using knee supports or opting for swimming.
Some research suggests that non-weight bearing activities, like swimming, can facilitate recovery from minor knee injuries and lessen the risk of long-term complications.
This article addresses the question posed by Thomas McPherson from Wakefield: “How do I take care of my knees as I age?”
From hot dogs to crispy bacon, many US food staples could contain gene-edited pork by 2026. The US Food and Drug Administration (FDA) has recently approved certain gene-edited pigs for agricultural use, and other global regulators may soon follow suit.
But should we be concerned? Is this modified pork safe? And what about the ethics of creating these pigs?
Firstly, it’s important to note that these animals aren’t grown in a laboratory. The livestock are descended from animals whose DNA was modified at the earliest stage of development, a single cell or fertilized egg, to confer advantageous traits.
This gene editing isn’t focused on enhancing pork flavor; it’s primarily aimed at safeguarding the pigs from diseases.
For instance, a UK company is currently developing genetic modifications in pigs that render them resistant to Porcine Reproductive and Respiratory Syndrome (PRRS), a virus that significantly weakens the immune system of pigs. PRRS poses a serious threat, leading to the deaths of piglets, miscarriages in pregnant sows, and increased vulnerability to other infections.
Pork is the third most consumed meat in the United States after chicken and beef.
The stakes are high, with efforts to manage PRRS costing the US pork industry about $1.2 billion (£878 million) each year.
When the virus does break through, the implications can be dire. In 2006, an epidemic in China infected over 2 million pigs, killing 400,000.
CRISPR Bacon
How much have these pigs really changed? That’s a valid concern. However, the actual modifications are surprisingly minor.
To combat the PRRS virus, scientists edited the pigs’ DNA to remove part of the gene encoding CD163, the protein the virus uses to invade pig cells.
Pigs with this genetic modification show resistance to nearly all known strains of PRRS, but they are otherwise similar to conventional pigs. Despite initial fears that viruses could evolve to bypass edited proteins, this hasn’t occurred.
Dr. Christine Tait-Burkard, a researcher at the University of Edinburgh’s Roslin Institute, describes the original CD163 protein as “like nine beads on a string,” with only one bead—the fifth one—removed during editing.
Interestingly, the same deletion could also arise naturally in some pigs. “It’s possible there is a pig somewhere in the world resistant to this virus,” Tait-Burkard states. “However, we don’t have the luxury of time for natural breeding, so we must utilize biotechnology to introduce it into our breeding programs.”
The editing employs a toolkit known as CRISPR, a Nobel Prize-winning technology that has gained popularity in scientific research for its efficiency, precision, and affordability. The CRISPR tool uses a “guide” sequence to target DNA, employing protein “scissors”—naturally occurring proteins found in bacteria—to make necessary cuts. Minor adjustments, such as those seen in PRRS-resistant pigs, disable particular genes.
A New Norm?
Once they hit grocery store shelves, PRRS-resistant pigs are expected to become the first widely consumed gene-edited animals. However, they are not the first genetically modified products available to consumers.
Hypoallergenic “GalSafe” pork, designed for consumers with a meat allergy, received approval in 2020. In 2022, the FDA also approved “slick” cattle, a line gene-edited to carry a naturally occurring variant found in tropical breeds that gives them shorter coats and better heat tolerance. Additionally, genetically modified “AquAdvantage” salmon is available in the US, albeit primarily sold in restaurants.
The situation is more complex across the Atlantic. As it stands, gene-edited foods cannot be marketed in the EU, while the UK’s Genetic Technology (Precision Breeding) Act lays the groundwork for breeding gene-edited crops but has not yet been extended to animals.
Even if regulations evolve globally, will consumers be eager to purchase gene-edited sausages and bacon?
The labeling for this new gene-edited pork remains undecided, but Dr. Katie Sanders, a communications specialist at North Carolina State University, suggests that there is greater potential for consumer acceptance compared to traditional genetically modified (GM) foods. This perception stems from the belief that gene-edited products appear more natural.
In the past, genetically modified (GM) crops stirred up fears and headlines focused on “frankenfood.” However, many of these crops were ultimately approved, with most scientists considering them safe for consumption. These GM crops often incorporate foreign genes—like “Bt” corn, which carries genes from the bacterium Bacillus thuringiensis to repel insect pests.
In contrast, the current wave of CRISPR-edited foods only features modifications that could naturally occur within the species. Scientists have not created an entirely new variety of pigs.
Sanders and her colleagues, along with associate professor Jean Parrella at Texas A&M University, conducted a national survey of more than 2,000 Americans to gauge attitudes towards CRISPR-edited pork. While the results await publication, Sanders notes that respondents generally indicated a likelihood to purchase CRISPR-edited pork.
This trend was especially noted in urban populations (compared to rural ones) and among those with lower educational attainment (as opposed to individuals with degrees).
When asked how producers can persuade more consumers to adopt gene-edited meat, Parrella emphasized the importance of “responsible use and ethical considerations surrounding CRISPR applications.”
Early marketing of the PRRS-resistant pigs suggests these ethical considerations are being addressed head-on. The company behind the pigs, the Pig Improvement Company (yes, that’s its actual name), underscores benefits like enhanced animal welfare, reduced antibiotic reliance, and positive environmental effects.
If their messaging resonates, could more gene-edited animals find their way to our dinner tables? Perhaps. Scientists at the Roslin Institute are currently researching edits to combat other livestock diseases, including the bovine diarrhea virus.
However, Tait-Burkard cautions that engineering resistance to other viruses, such as avian influenza, may prove far more challenging or require edits harmful to the animals’ own cells. The CD163 protein edited for PRRS resistance was an “excellent target,” but such targets are hard to find.
For traits linked to productivity, such as improved breeding and meat quality, the agricultural sector is already refining efficient breeding techniques to achieve these objectives. As such, it’s unlikely that costly gene editing will be utilized to create “super” meat anytime soon.
Nonetheless, if gene editing can enhance animal protection, minimize antibiotics, and alleviate environmental burdens, it could swiftly transition from novelty to normalcy—provided animal welfare remains uncompromised.
Research conducted by astronomer Matthew Hopkins and his team at the University of Oxford suggests that 3I/ATLAS, the second interstellar comet discovered passing through our solar system, may predate the solar system itself by more than 3 billion years.
Top-down view of the Milky Way showing the predicted orbits of the Sun and 3I/ATLAS. The comet’s orbit is shown as a dashed red line, the Sun’s as a dashed yellow line. The comet’s path carries it out through the galaxy’s thick disc, whereas the Sun stays close to the galactic plane. Image credit: M. Hopkins / Otautahi-Oxford Team / ESA / Gaia / DPAC / Stefan Payne-Wardenaar / CC-SA 4.0.
“All comets that formed alongside our solar system, like Halley’s Comet, are up to 4.5 billion years old,” Dr. Hopkins explained.
“In contrast, interstellar visitors can be significantly older. Our statistical analysis indicates that 3I/ATLAS is very likely the oldest comet we have ever observed.”
Unlike 1I/ʻOumuamua and 2I/Borisov, the two previous interstellar objects observed passing through our solar system, 3I/ATLAS appears to be on a more steeply inclined path through the Milky Way.
A recent study predicts that 3I/ATLAS is likely to be rich in water ice, as it probably formed around an ancient star in the Milky Way’s thick disc.
“This is a part of the galaxy that we’ve never encountered before,” said Chris Lintott, a professor at the University of Oxford and presenter of The Sky at Night.
“I believe there is a two-thirds chance that this comet predates the solar system and has been drifting through interstellar space ever since.”
As it nears the Sun, heating by sunlight will activate 3I/ATLAS, generating a coma and a tail of vapour and dust.
Initial observations indicate that the comet is already active and may even be larger than any of its interstellar predecessors.
If this is validated, it could influence the detection of similar objects by future telescopes, such as the upcoming Vera C. Rubin Observatory.
Furthermore, it could offer insights into the role that ancient interstellar comets play in the formation of stars and planets throughout the galaxy.
“We’re in an exciting phase. 3I/ATLAS is already displaying signs of activity,” remarked Dr. Michele Bannister, an astronomer at the University of Canterbury.
“The gases we may observe as 3I/ATLAS is heated by the Sun will help us test our models.”
“Some of the world’s largest telescopes are currently monitoring this new interstellar entity. One of them may make a significant discovery!”
To celebrate three years of science operations, astronomers have used the NASA/ESA/CSA James Webb Space Telescope to capture new images of the Cat’s Paw Nebula.
This Webb image depicts the Cat’s Paw Nebula, a massive star-forming region located 5,500 light-years away in the constellation Scorpius. Image credit: NASA/ESA/CSA/STScI.
The Cat’s Paw Nebula resides in the southern constellation of Scorpius, approximately 5,500 light-years from Earth.
First identified in 1837 by British astronomer John Herschel, this dynamic star-forming region spans an estimated 80 to 90 light years.
Also known as NGC 6334 or the Bear Claw Nebula, it is one of the most active stellar nurseries in the night sky, harboring thousands of young, hot stars, many of which are hidden from view at visible wavelengths.
Recent images captured by Webb’s NIRCam instrument reveal structures and details previously unseen.
“Massive young stars are actively interacting with nearby gas and dust, and their bright stellar light produces a luminous, hazy glow, represented in blue,” Webb astronomers stated.
“It is a temporary scene in which these disruptive young stars, despite their relatively short lifespans and high luminosity, play an outsized part in the region’s larger story.”
“Due to the dynamic activities of these massive stars, the local star formation process will eventually come to a halt.”
“We begin with a central area nicknamed the ‘opera house’ for its tiered, circular structure,” they noted.
“The principal sources of the blue glow in this area are likely positioned towards the bottom, obscured by dense brown dust, interspersed with light from bright, yellowish stars or nearby sources.”
“Beneath the orange-brown dust lies a bright yellow star displaying distinct diffraction spikes.”
“This giant star is sculpting its surrounding environment but has not managed to push gas and dust away sufficiently nor create a compact shell of surrounding material.”
“Take note of smaller regions, such as the tuning fork-shaped area adjacent to the opera house, which contains fewer stars.”
“These seemingly vacant zones are still in the process of forming stars, indicating the presence of dense filaments of dust that obscure the light of background stars.”
At the center of the image, small, fiery red masses can be seen scattered within the brown dust.
“These glowing red sources highlight areas where large-scale star formation is occurring, albeit in a less visible manner,” the researchers explained.
“Some of the blue-white stars, particularly in the lower left area, appear more sharply resolved than others.”
“Their sharper appearance is likely because intervening material between the stars and the telescope has been cleared away by the stars’ radiation.”
Near the bottom of this area is a compact dust filament.
“These small dust aggregates have managed to survive the intense radiation, indicating they are dense enough to give rise to protostars.”
The small yellow section on the right marks the location of a massive star still in its formative stages, managing to shine through the intervening material.
Numerous small yellow stars are scattered across the scene, displaying distinct diffraction spikes.
“Bright blue-white stars feature prominently in the foreground of this Webb image; some may be part of the larger Cat’s Paw Nebula region.”
A particularly striking feature of this Webb image is the bright red-orange oval in the top right corner.
The low concentration of background stars indicates it is a dense area where the star-forming process has only recently commenced.
Several visible stars are distributed throughout the region, contributing to the illumination of central materials.
Some of the developing stars have left behind traces of their existence, such as the shock wave visible in the lower left area.
Chinese paleontologists have uncovered the fossilized skeleton of a colossal mamenchisaurid dinosaur, representing a remarkable new genus from the Late Jurassic epoch.
Fossil remains of Tongnanlong zhimingi. Image credit: Wei et al., doi: 10.1038/s41598-025-09796-0.
The newly identified species inhabited southwestern China approximately 147 million years ago (late Jurassic epoch).
Scientifically designated Tongnanlong zhimingi, this sauropod dinosaur measured around 23-28 m (75.5-92 ft) in length.
“Sauropods are enormous, herbivorous quadrupeds and represent the largest terrestrial dinosaurs that ever existed,” remarked Dr. Xuefang Wei, a researcher from the Western Center for China Geological Survey.
“They first appeared in the late Triassic period, spread globally by the Middle Jurassic, and ultimately went extinct at the end of the late Cretaceous period.”
More than 150 genera have been documented, including over 20 genera from the Jurassic period within China.
“Southwest China is a significant area for Jurassic sauropod discoveries, particularly in the Sichuan Basin,” they added.
The sauropod fauna found in the Jurassic Sichuan Basin was once considered an endemic population distinct from the terrestrial fauna of Pangaea.
This distribution was often explained by the East Asian seclusion hypothesis, suggested to have occurred between the Jurassic and early Cretaceous periods.
However, this hypothesis faces challenges from recent phylogenetic analyses, as well as from the discovery of a mamenchisaurid dinosaur in Africa.
The holotype specimen of Tongnanlong zhimingi was excavated from a construction site in the Tongnan district of Chongqing, within the Sichuan Basin.
This includes three dorsal vertebrae, six caudal vertebrae, scapulae, coracoids, and hind limb bones.
“Our fieldwork indicates that the fossil-bearing beds belong to the Upper Jurassic Suining Formation, which is overlain by Quaternary sediments,” the paleontologists noted.
“The Suining Formation is composed of purple-red mudstone and sandstone.”
“The layer is rich in fossils, including various freshwater invertebrates, particularly conchostracans and ostracods, as well as stoneworts.”
Several vertebrates are known from this layer, including the fish Ceratodus szechuanensis, the turtle Plesiochelys tatsuensis, and dinosaurs such as Mamenchisaurus anyuensis.
Anatomical and phylogenetic analyses confirmed that Tongnanlong zhimingi belongs to the sauropod dinosaur family Mamenchisauridae.
“Mamenchisauridae was not a fauna confined to East Asia, but rather had a global distribution during the late Jurassic period,” the researchers concluded.
“Tongnanlong zhimingi enhances the diversity of eusauropods and offers new insights into sauropod diversity and evolutionary trends, including increasing body size, from the Middle to the Late Jurassic.”
Their study was published in the journal Scientific Reports on July 10th.
____
X. Wei et al. 2025. A new mamenchisaurid from the Upper Jurassic of the Sichuan Basin, China, and its implications for sauropod gigantism. Sci Rep 15, 24808; doi: 10.1038/s41598-025-09796-0
The catastrophic flooding in Texas, which claimed nearly 120 lives, marked the first major crisis for the Federal Emergency Management Agency (FEMA) under the current Trump administration. Despite the tragic loss of life, former and current FEMA officials told NBC News that a disaster confined to a relatively small geographic area does not fully test the agency’s capabilities, even with staffing significantly reduced.
They argue that the true tests may arise later this summer, when the threat of hurricanes looms over several states.
As discussions about the agency’s future unfold, with President Donald Trump hinting at the possibility of “dismantling it,” Homeland Security Secretary Kristi Noem, who oversees FEMA, has tightened her control.
Current and former officials said that Noem now requires her personal sign-off on any expenditure exceeding $100,000. To expedite the approval process, FEMA established a task force on Monday aimed at streamlining requests for Noem’s approval, according to sources familiar with the initiative.
While Noem has taken a more direct approach to managing the agency, many FEMA leadership positions remain unfilled due to voluntary departures. In May, the agency disclosed in an internal email that 16 senior officials had left, collectively bringing over 200 years of disaster response experience with them.
“DHS and its components are fully engaged in addressing recovery efforts in Kerrville,” a spokesperson from DHS remarked in a statement to NBC News.
“Under Secretary Noem and Acting Administrator David Richardson, FEMA has transformed from an unwieldy, DC-centric organization into a streamlined disaster response force that empowers local entities to assist their residents. Outdated processes have been replaced because they failed to serve Americans in real emergencies… Secretary Noem ensures accountability to U.S. taxpayers, a concern Washington has overlooked for decades.”
Civilians assist with recovery efforts near the Guadalupe River on Sunday. Julio Cortez / AP
On Wednesday afternoon, the FEMA Review Council convened for its second meeting, set up to outline the agency’s future direction. “Our goal is to pivot FEMA’s responsibilities to the state level,” Trump told the press in early June.
At this moment, FEMA continues to manage over 700 active disaster situations, according to Chris Currie, who tracks the agency for the Government Accountability Office.
“They’re operating no differently. They’re merely doing more with fewer personnel,” he noted in an interview.
Even as some push to shrink the agency’s role, certain Republicans in Congress have responded to the significant flooding by emphasizing the need to preserve FEMA.
“FEMA plays a crucial role,” said Senator Ted Cruz of Texas during a Capitol Hill briefing this week. “There’s a consensus on enhancing FEMA’s efficiency and responsiveness to disasters. These reforms can be advantageous, but the agency’s core functions remain vital, regardless of any structural adjustments.”
Bureaucratic Hurdles
A key discussion point in the first FEMA Review Council meeting was how the federal government can alleviate financial constraints. However, current and former FEMA officials argue that Noem’s insistence on personal approvals for expenditures introduces bureaucratic layers that could hinder timely assistance during the Texas crisis and potential future hurricanes.
Current officials voiced that the new requirements contradict the aim of reducing expenses. “They’re adding bureaucracy…and increasing costs,” one official commented.
A former senior FEMA official remarked that agents working in disaster zones routinely need to procure supplies and services, with contracts regularly exceeding $100,000.
“FEMA rarely makes expenditures below that threshold,” disclosed an unnamed former employee currently involved in the industry to NBC News.
In addition to the stipulation that Noem must approve certain expenditures, current and former staff members revealed confusion regarding who holds authority—Noem or Richardson, who has been acting as administrator since early May. One former official noted a cultural shift within the agency from proactive measures to a more cautious stance, as employees fear job loss.
DHS spokesperson Tricia McLaughlin referred to questions regarding who is in charge as “absurd.”
Further changes are underway. Last week, the agency officially ended its practice of sending personnel door-to-door in disaster areas to tell survivors about available services. The decision followed criticism of such interactions last fall, when acting managers labeled the conduct of some FEMA staff “unacceptable”; the dismissed personnel said they had acted on a supervisor’s instructions to avoid “unpleasant encounters.”
Although many individuals access FEMA services through various channels like the agency’s website and hotline, two former officials emphasized that in-person outreach remains essential for connecting disaster victims with available resources. It remains uncertain if the agency plans to send personnel into Texas for door-to-door outreach.
This week, Democratic senators expressed frustration that Noem has yet to share the 2025 hurricane plans she promised in May.
New Jersey Senator Andy Kim, leading Democrat on the Disaster Management Subcommittee, plans to send another letter to Noem on Wednesday to solicit these plans.
“The delay in FEMA’s 2025 hurricane season plan report at the start of hurricane season highlights the ongoing slowness of DHS in providing essential information to this committee,” Kim asserted in his letter.
FEMA’s Future
Critical questions remain regarding FEMA’s role in disaster recovery: What responsibilities will it retain, and which will be delegated to states to manage independently?
Experts consulting with NBC News concur that while federal agencies should maintain responsibility for large-scale disasters, the question persists as to whether states could be empowered to handle smaller ones rather than deferring to federal assistance.
“Disaster prevention is paramount,” remarked Jeff Schlegelmilch, director of Columbia University’s National Center for Disaster Preparedness.
Natalie Simpson, a disaster response expert at the University at Buffalo, added that larger states could assume greater risk during disasters.
“I believe we could establish a local FEMA due to economies of scale in larger states like California, New York, and Florida, but I doubt their efficacy in smaller states,” she stated during an interview.
Critics, including Texas Governor Greg Abbott, have called FEMA “inefficient and slow,” asserting the need for a more responsive approach. Even so, the governor requested a FEMA disaster declaration within days of the flood.
On Sunday, the president sidestepped inquiries about potential restructuring of the agency.
White House spokesperson Karoline Leavitt commented that ongoing discussions are taking place regarding the agency’s broader objectives. “The President aims to ensure that American citizens have the resources they need, whether that assistance is provided at the state or federal level; it’s a matter of continuous policy discourse,” Leavitt remarked.
Denver – The Denver Museum of Nature & Science, famous for its dinosaur exhibits, has unearthed fossil bones right beneath its parking lot, bringing paleontological discovery closer to home than anyone anticipated.
The find came from a drilling operation that reached more than 750 feet (230 meters) deep to explore geothermal heating options for the Denver Museum of Nature & Science.
The museum is a favorite among dinosaur lovers of all ages, where full-sized dinosaur skeletons astonish children who can barely reach their parents’ knees, especially the mighty Tyrannosaurus.
Ornithopod vertebrae found at a depth of 763 feet in a drill core from City Park, beneath the parking lot of the Denver Museum of Nature & Science. Richard M. Wicker / Denver Museum of Nature & Science via AP
While this latest find may not be visually striking, the likelihood of discovering a fossil sample shaped like a hockey puck is notably low.
Museum representatives highlighted the rarity of encountering dinosaur remains in a borehole just a couple of inches (5 cm) wide.
“Finding dinosaur bone in a core is akin to hitting a hole in one from the moon. It’s like winning the Willy Wonka factory. It’s extraordinarily uncommon,” noted James Hagadorn, the museum’s geology curator.
Geologist James Hagadorn closes a box of rock core samples at the Denver Museum of Nature & Science on July 9. Thomas Peipert / AP
Museum officials mentioned that only two similar discoveries have been documented in borehole samples globally, let alone on the grounds of a dinosaur museum.
These vertebrae are believed to come from a small, herbivorous dinosaur that lived during the late Cretaceous period, approximately 67.5 million years ago, shortly before the asteroid impact that led to the dinosaurs’ extinction.
Fossilized plant materials were also uncovered in the vicinity of the bone.
“The animal inhabited a wetland ecosystem that was likely lush with vegetation at that time,” explained Patrick O’Connor, curator of vertebrate paleontology at the Denver Museum of Nature & Science.
The region has long been recognized for its dinosaur discoveries, including fossils resembling Tyrannosaurus rex and Triceratops. This recent find is noted to be Denver’s deepest and oldest, according to O’Connor.
While other experts validate the findings, reactions to the discoveries have been varied.
“It’s impressive. However, it might not be scientifically groundbreaking,” commented Thomas Williamson, curator of paleontology at the New Mexico Museum of Natural History and Science in Albuquerque.
Williamson remarked that it’s challenging to accurately determine the species of dinosaur from the evidence found.
Yet Erin LaCount, director of education programs at Dinosaur Ridge, just west of Denver, said in an email that the discovery is “absolutely legitimate and utterly fascinating!”
The fossil’s shape suggests it may belong to a duck-billed dinosaur or perhaps a thescelosaurus.
Currently, the borehole fossils are on display at the Denver Museum of Nature & Science, but there are no plans to search for additional finds beneath the parking lot.
“I wish I could dig a 763-foot (233 meters) hole in the parking lot and unearth more dinosaurs, but I don’t think it will happen because of parking constraints,” said a museum official.
AI can streamline government paperwork, yet significant risks exist
Brett Hondow / Alamy
A number of nations are exploring how artificial intelligence might assist with various tasks, ranging from tax processing to decisions about welfare benefits. Nonetheless, research indicates that citizens are not as optimistic as their governments, potentially jeopardizing democratic integrity.
“Focusing exclusively on immediate efficiency and appealing technologies could provoke public backlash and lead to a long-term erosion of trust and legitimacy in democratic systems,” states Alexander Utzke, at Ludwig Maximilian University in Munich, Germany.
Utzke and his team surveyed around 1,200 individuals in the UK to gauge their perceptions regarding whether human or AI management was preferable for government functions. These scenarios included handling tax returns, making welfare application decisions, and assessing whether a defendant should be granted bail.
Participants were divided; some learned only about AI’s potential to enhance governmental efficiency, while others were informed about both the advantages and the associated risks. The risks highlighted included the challenges in discerning how AI makes decisions, an increasing governmental reliance on AI that may be detrimental in the long run, and the absence of a straightforward method for citizens to challenge or modify AI determinations.
When participants became aware of these AI-related risks, there was a marked decline in their trust towards the government and an increased feeling of losing control. For instance, the percentage of those who felt government democratic control was diminishing rose from 45% to over 81% when scenarios depicted increasing governmental dependence on AI for specific functions.
After learning about the risks, the percentage of individuals expressing skepticism regarding government use of AI surged significantly. It jumped from under 20% in the baseline scenario to over 65% when participants were informed of both the benefits and risks of AI in the public sector.
Despite these findings, AI can be used responsibly by democratic governments in ways that uphold public trust, says Hannah Quay-de la Vallee of the Center for Democracy and Technology in Washington DC. However, she notes that there have been few successful applications of AI in governance to date, and several failures have already been observed, which can have serious consequences.
For instance, attempts by several US states to automate the processing of public benefit claims have resulted in tens of thousands of people being wrongly accused of fraud. Some of those affected went bankrupt or lost their homes. “Mistakes made by the government can have significant, long-lasting repercussions,” warns Quay-de la Vallee.
As Texans look for solutions to flooding in the Hill Country, prominent meteorologists and policymakers are advocating for the creation of a disaster review board modeled on the National Transportation Safety Board, which investigates civil aviation accidents and other significant transportation incidents.
The proposal for an independent committee to evaluate weather-related disasters is not a recent idea; however, it seems to have gained renewed momentum following floods in Texas that have claimed over 120 lives and left another 170 unaccounted for.
During a Senate confirmation hearing on Wednesday for the head of the National Oceanic and Atmospheric Administration (NOAA), Neil Jacobs expressed his support for the initiative when Sen. Ted Cruz (R-Texas) inquired about how he would enhance public response to emergency weather notifications.
“We also need more data and need to conduct post-storm evaluations,” Jacobs stated. “I have been involved with some aviation incidents at the NTSB, and we’re looking at something similar here, requiring data to identify what went wrong, whether proper warnings were issued, and how to respond to weather-related disasters.”
Neil Jacobs in 2019. Win McNamee / Getty Images file
Lawmakers from both parties have taken the initiative.
In 2022, the House of Representatives passed legislation that included provisions for creating a National Disaster Safety Board; however, it failed in the Senate.
Sens. Cassidy and Schatz did not immediately respond when asked whether they see renewed momentum in Congress for establishing a disaster review panel; Rep. Porter is no longer serving in Congress.
In a communication to NBC News, the only meteorologist in Congress, Rep. Eric Sorensen (D-Ill.), indicated he is collaborating with colleagues to initiate an NTSB-style program to investigate severe weather events.
A flooded home in New Orleans following Hurricane Katrina in 2005. Michael Appleton / New York Daily News / Getty Images file
“It would be incredible if meteorologists had access to research reports that could inform their future actions and help them learn from past mistakes,” Sorensen remarked.
Illinois also faced significant flooding this week, with around five inches of rain descending in just 90 minutes at Garfield Park on Chicago’s west side, leading to multiple rescue operations.
The floods in Chicago and Texas were among four extreme rainfall events occurring within a week, events researchers describe as once-in-a-thousand-years occurrences.
The notion of an independent disaster review board has circulated within meteorological and disaster management circles for years. Mike Smith, a meteorologist and former senior vice president at Accuweather, has championed this concept since the aftermath of Hurricane Sandy in 2012.
Artistic rendering inspired by actual images of the IceCube neutrino detectors in Antarctica.
icecube/nsf
Pinning down the true nature of the rarest and most energetic cosmic rays is helping researchers decipher their elusive origins.
The universe continuously showers us with bursts of particles. The most energetic of these, explains Brian Clark at the University of Maryland, are termed ultra-high-energy cosmic rays, carrying far more energy than anything produced in particle accelerators. They are also quite rare: researchers are still investigating their sources, and their constituent particles remain largely unidentified. Clark and his team are now analyzing that composition using data from the IceCube Neutrino Observatory in Antarctica.
Previous detections of ultra-high-energy cosmic rays by the Pierre Auger Observatory in Argentina and the Telescope Array in Utah have led to disagreements. Clark says it remains uncertain whether these rays are mainly composed of protons or of a mix of other particles. The IceCube data sheds light on this, indicating that protons account for about 70 per cent of these rays, with the remainder made up of heavier nuclei such as iron.
Team member Maximilian Meyer from Chiba University in Japan notes that while IceCube data complements other measurements, it primarily detects neutrinos—by-products resulting from collisions between ultra-high-energy cosmic rays and residual photons from the Big Bang. Detecting and simulating neutrinos is inherently challenging.
The characteristics of cosmic ray particles influence how the magnetic fields generated in space affect their trajectories. Thus, comprehending their structure is crucial for the challenging endeavor of tracing their origins, according to Toshihiro Fujii from Osaka Metropolitan University in Japan.
These mysterious origins have given rise to some astonishing enigmas, such as the Amaterasu particle, one of the most energetic cosmic rays ever detected. Intriguingly, it seems to have come from a region of space near the Milky Way that lacks any clear astronomical candidate for its source.
Clark expresses optimism about solving many of these mysteries within the next decade, as new observational tools, including an upgrade to IceCube, will soon be operational. “This domain has a clear roadmap for how we can address some of these questions,” he states.
In directing this play, which commemorates the centenary of the trial, Buck emphasizes that leaders in Dayton are pursuing the same mission as their predecessors a hundred years ago.
“I’ve generated interest in this town, and I’m thrilled about the people here, positioning Dayton on the map,” Buck stated. “Perhaps we’re utilizing this narrative and trial to shine a spotlight on this unique location.”
Descendants
Jacob Smith, 23, only realized his connection to the iconic trial after delving into its history. His great-great-grandmother’s brother was Walter White, the county’s school superintendent and a pivotal figure in bringing the trial to Dayton.
Smith portrays Dudley Field Malone, Scopes’s defense attorney, whose speeches during the trial were as passionate and memorable as those of Bryan and Darrow. One of Smith’s favorite lines references the contentious nature of the courtroom battle.
“He essentially states, ‘There’s never a duel with the truth,’” Smith explained. “He argues, ‘It always wins. It doesn’t conspire or require the suspension of laws or governments, and it doesn’t need Mr. Bryan.’”
Now a county archivist, Smith is eager to see visitors discover the original courthouse in Dayton, with its creaky, polished wooden floors, lofty windows, and impressive staircase leading up to the expansive courtroom on the second level.
“Like the lawyers before them, they could ascend to that circuit court and grip the railing, and back in 1925, the entire audience would have turned their gaze,” Smith noted.
“The Great Commoner”
Larry Jones, who has been acting in community and local theaters since childhood, thought he knew the story of the Scopes trial after performing in “Inherit the Wind.”
He soon realized that the renowned play took creative liberties, transforming the trial into a commentary on another issue that had gripped the nation at the time: McCarthyism.
Jones portrays Bryan, a famed Christian orator and populist politician. He says the most challenging aspect wasn’t memorizing Bryan’s lengthy speeches, but responding to Darrow’s unexpected challenges, which demand a defense of the literal truth of the Bible.
“I have to react instinctively and appear spontaneous each time,” Jones remarked. “Part of me thinks, ‘Oh, is that the right cue? Will I say the correct thing?’”
Jones asserts that the audience will connect to the trial’s enduring narrative as it echoes into the next century. The discourse continues.
“Discussions about the same themes persist,” Jones explained. “What role should federal or state governments play in public education? What should or shouldn’t be allowed? How should parents guide their children’s education? Whether concerning evolution, literature, or numerous contemporary political issues, the debate remains alive.”
There Is No Conclusion
The trial’s outcome came as little surprise, with the jury finding Scopes guilty after mere minutes of deliberation. Nonetheless, the defense’s aim had always been to establish legal precedent in the higher courts.
Today, Dayton embraces its historical significance during the annual trial celebration. Businesses promote “Monkey Trials,” and locals have adopted the phrase “Dayton has evolved.”
“We’re telling very old tales, yet they feel refreshingly new,” Buck expressed. “It’s so, so very relevant now.”
Social Media and Short-Form Video Platforms Drive Language Innovation
lisa5201/getty images
Algospeak, Adam Aleksic (UK, July 17; Knopf (US), July 15)
Nothing makes you feel your age quite like slang. Adam Aleksic’s book Algospeak: How Social Media Is Transforming the Future of Language explores this phenomenon. Phrases like “pierce your gyat for the rizzler” and “wordpilled slangmaxxing” remind me that, as a millennial, I’m now as distant from today’s Generation Alpha as boomers are from me.
A linguist and content creator (@etymologynerd), Aleksic charts a new wave of linguistic innovation fueled by social media, particularly short-form video platforms like TikTok. The term “algospeak” has traditionally referred to euphemisms used to evade online censorship; well-known examples include “unalive” (for death or suicide) and “segg” (for sex).
However, the author insists on broadening the definition to encompass all language aspects affected by the “algorithm.” This term refers to the various, often opaque processes social media platforms use to curate content for users.
In making his case, Aleksic draws on his experience of earning a living through educational videos about language. Like other creators, he is motivated to appeal to the algorithm, which requires careful word selection. A video he created dissecting the etymology of the word “pencil” (which traces back to the Latin penis, meaning “tail”) breached sexual content rules, while a discussion of the phrase “from the river to the sea” remained within acceptable limits.
Meanwhile, videos exploring Gen Alpha terms like “skibidi” (a largely nonsensical term rooted in scat singing) and “gyat” (from “goddamn”, used to mean “ass”) have performed particularly well. His findings illustrate how creators modify their language for algorithmic advantage, with some words jumping from online to offline use with notable success. When Aleksic surveyed educators, he found many of these terms had entered regular classroom slang, with some students learning the word “unalive” before “suicide”.
A standout aspect of the book is its etymological detective work, investigating how algorithms propel words from online subcultures into the mainstream lexicon. He notes that the misogynistic incel community is a significant contributor to contemporary slang, its radicalism driving unusually rapid linguistic evolution within the group.
Aleksic approaches language trends with a non-judgmental perspective. He notes that “unalive” parallels earlier euphemisms like “deceased”, while “skibidi” is reminiscent of “scooby-dooby-doo”. He is critical, however, of the way slang is pigeonholed into arbitrarily defined generations, and of narratives that cast it as toxic to the normal evolution of language.
The situation becomes more intricate when slang enters mainstream usage through cultural appropriation. Many contemporary slang terms, like “cool” before them, trace back to the Black community (“Thicc,” “bruh”) or originate from the LGBTQ ballroom scenes (“Slay,” “Yas,” “Queen”). Such wide-ranging adoptions can sever these terms from their historical contexts, often linked to social struggles and further entrenching negative stereotypes about the communities that birthed them.
Preventing this loss of context is challenging; the fate of successful slang is to be stripped of its original nuances. Social media has drastically accelerated the timeline of language innovation, and Algospeak, a necessary update, may itself become quickly outdated. But as long as algorithms exist, its fundamental insights into how technology influences language will remain important.
Victoria Turk is a London-based author
New Scientist Book Club
Enjoy reading? Join a welcoming community of fellow book enthusiasts. Every six weeks, we explore exciting new titles, offering members exclusive access to book excerpts, author articles, and video interviews.
Coral bleaching in the Great Barrier Reef off the coast of Queensland, Australia
Nature Picture Library/Alamy
Researchers say strategies to artificially shade Australia’s Great Barrier Reef from rising temperatures may be needed urgently, following findings that link changes in shipping fuel regulations to an increased risk of coral bleaching.
In recent years, significant sections of the Great Barrier Reef have experienced severe bleaching due to rising sea temperatures attributed to climate change.
Regulations introduced in 2020 to limit the sulfur content of shipping fuels have brought an unintended cost, according to Robert Ryan at the University of Melbourne. The changes cut emissions of sulfur dioxide, a pollutant harmful to health, but also removed aerosols that brighten marine clouds and help cool the waters over the reefs.
Ryan and his team used computer models to analyze how fuel emissions affected cloud cover and solar radiation over the reef during a 10-day period in February 2022.
They found that emissions at pre-2020 levels would have enhanced the local cooling effect of clouds, and that the regulations reducing sulfate aerosol pollution diminished this effect. As a consequence, the new fuel rules produced a rise in sea surface temperatures equivalent to 0.25°C, increasing the occurrence of coral bleaching conditions by 21 to 40 per cent during the studied period.
“There’s been an 80 per cent reduction in sulfate aerosols from shipping, likely contributing to conditions that favor coral bleaching in the Great Barrier Reef,” says Ryan.
Bjørn Samset at the CICERO Center for International Climate Research in Oslo, Norway, says the study will help address critical questions about the effects of reduced aerosol pollution on the surrounding environment. “The local aerosol influences may be more significant than previously considered, and we still have limited understanding of their impacts on ocean heat waves,” he remarks.
However, he cautions that while the findings illustrate clear links between air quality and cloud conditions around a notable reef system, they cover only a brief timeframe and will need to be tested against longer-term studies.
Ryan is also involved in efforts to devise methods to artificially cool coral reefs using Marine Cloud Brightening (MCB), a climate intervention technology that involves dispersing ocean salt particles into the atmosphere to amplify the cooling effects of marine clouds.
Researchers suggest that given their recent findings, such artificial cooling measures for large barrier reefs may be more crucial than ever. “If changes in sulfate emissions have diminished the brightening effects of ocean clouds, it could be worth reconsidering their reimplementation in targeted programs,” Ryan explains.
Daniel Harrison from Southern Cross University in Australia emphasizes that their findings indicate that MCBs can effectively cool the reef, mirroring the cooling effects seen with past shipping emissions. “This study highlights the real-world implications of ongoing changes,” he adds. “It confirms that it was indeed effective.”
Harrison has secured funding from the UK’s Advanced Research and Invention Agency for a five-year initiative to test MCB on the Great Barrier Reef, and stresses that MCB is meant to complement, not replace, efforts to lower emissions.
On the other hand, some experts remain skeptical, arguing that there is insufficient evidence to confirm the safety and efficacy of intentional MCBs. Terry Hughes from James Cook University in Queensland, Australia, has stated that previous trials of MCB were “not successful” and produced no compelling evidence that it can reduce the local sea temperatures of the reef.
This captivating and intimate image offers a unique view of the Caribbean reef octopus (Octopus briareus), showing a mother with her developing eggs at the Blue Heron Bridge dive site near West Palm Beach, Florida.
Following mating, these solitary creatures retreat to seclude themselves while safeguarding their developing eggs. However, for Octopus Briareus and several other octopus species, this tale takes a tragic turn.
Once a mother octopus lays her batch of hundreds of eggs, she stops feeding and dies around the time the eggs hatch. Research published in 2022 illuminated this phenomenon: the optic gland, the primary neuroendocrine center of the octopus, regulates lifespan and reproduction in these invertebrates, much as the pituitary gland does in vertebrates.
After mating, octopus mothers dramatically boost cholesterol production, triggering a self-destructive spiral, although the reason behind this cycle remains elusive. One theory suggests that by ceasing to eat, the mother avoids preying on her own young.
The octopus mother image, by freelance nature photographer Kat Zhou, triumphed in the Aquatic Life category at the BigPicture Natural World Photography Competition, which invites both professional and amateur photographers to capture, narrate, and advocate for the conservation of Earth’s diverse life forms.
The overall grand prize went to photographer and conservationist Zhou Donglin for Lemur’s Tough Life, a breathtaking capture (shown below) taken at the Tsingy de Bemaraha Strict Nature Reserve in Madagascar. After a challenging trek through rugged terrain, Donglin documented a common brown lemur (Eulemur fulvus) making a daring leap from one cliff to another, with her baby clinging on.
Lemur’s tough life Zhou Donglin
Zhou Donglin
Next is Mud Skip by Georgina Steytler (shown below), a fascinating reminder of life’s ancient past as an amphibious fish emerges from the mud. Steytler, a finalist in the Aquatic Life category of the competition, spent days at Goode Beach in Western Australia to capture the precise moment a mudskipper (Boleophthalmus pectinirostris) leaped into the air.
Mud Skip By Georgina Steytler
Georgina Steytler
The final image (shown below) appears reminiscent of a scene from another planet. In reality, Remaining in the Snow by plant photographer Ellen Woods, a finalist in the awards for landscapes, waterscapes, and flora, was captured near her home in Connecticut, in the northeastern USA.
Remaining in the snow By Ellen Woods
Ellen Woods
It features skunk cabbage (Symplocarpus foetidus), often among the first plants to bloom at winter’s end. Notably, it can create its own microclimate, warming itself to as much as 23°C even when ambient temperatures remain below freezing.
This unique capability of thermal regulation protects the plant from frost damage and attracts beetles and fly pollinators drawn to its warmth and scent of carrion.
However, it’s not particularly pleasant; the name arises from its odor, likened to a skunk’s scent when the leaves are disturbed.
The winning photographs will be displayed at the California Academy of Sciences in San Francisco later this year.
Urban street trees exhibit greater drought resilience than their counterparts in parks thanks to a unique water source: leaking pipes.
During prolonged dry spells, trees in parks show greater decreases in water content and sap flow than those on streets, although the underlying reasons were previously not well understood.
To delve deeper, André Poirier at the University of Quebec in Montreal, Canada, and his team studied trunk samples from Norway and silver maples (Acer platanoides and Acer saccharinum) growing in nearby parks and city streets. They analyzed lead isotopes in the trees’ trunk rings, using the distinctive isotopic signatures to reconstruct each tree’s recent history.
While park trees commonly showed lead isotope signatures linked to air pollution, street trees displayed signatures matching lead from water pipes, whose metals derive from ancient local sediments.
A maple tree typically requires approximately 50 liters of water each day. Because rainwater largely runs off concrete and drains into city sewers before it can reach street trees’ roots, Poirier suggests that the most plausible explanation lies in Montreal’s leaky pipes, which lose an estimated 500 million liters of water daily.
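A back-of-the-envelope comparison puts these two figures in perspective (a minimal sketch: the 50-liter and 500-million-liter numbers come from the reporting above; the division is my own illustration, not a claim from the study):

```python
# Figures from the article: a maple drinks ~50 L/day, while Montreal's
# water mains leak ~500 million L/day. How many maples could that supply?
leaked_litres_per_day = 500_000_000
litres_per_tree_per_day = 50

trees_supported = leaked_litres_per_day // litres_per_tree_per_day
print(trees_supported)  # 10000000 -- enough, in principle, for 10 million maples
```

Even if street trees capture only a tiny fraction of this leakage, the supply comfortably exceeds what a city's worth of maples could drink.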
“The bright side is that planting trees along city streets can continue, as they thrive better than those in parks,” Poirier noted while presenting the findings at the Goldschmidt Geochemical Conference in Prague, Czech Republic, on July 8.
“The sheer volume of water utilized by these urban trees is astonishing and contradicts conventional wisdom. I believe this could help improve the health of park trees as well,” commented Gabriel Filippelli at Indiana University.
Reducing emissions and capturing carbon is essential to limit warming
Richard Saker/Alamy
The planet must remove hundreds of billions of tonnes of carbon dioxide from the atmosphere to keep global temperature rise under 1.5°C this century. Even the less ambitious 2°C target looks increasingly unattainable without substantial carbon capture and carbon dioxide removal (CDR) alongside urgent emission reductions.
The contentious role of carbon management technologies in meeting climate objectives has been debated for some time. According to the Intergovernmental Panel on Climate Change, a degree of carbon management is “inevitable” for reaching zero emissions required to stabilize global temperatures. However, it stresses that the necessary technologies have yet to be validated at the needed scale and emphasizes the risk of providing justifications for continued emissions.
“There’s an ongoing debate among scientists about whether CDR is essential or fundamentally unfeasible,” says Candelaria Bergero from the University of California, Irvine. “Some argue that CDR is unavoidable,” she adds.
To assess what is at stake, Bergero and her team simulated whether global temperature rise could be kept below 2°C while varying the amount of CO2 management across emission scenarios aligned with the Paris Agreement targets. These scenarios incorporated both technological CDR methods, such as direct air capture, and nature-based approaches, such as tree planting, alongside varying applications of carbon capture to emissions from power plants and industrial sources.
They determined that failing to capture or remove any CO2 could lead to an additional 0.5°C rise in global average temperature by the century’s end. Even halving the carbon management assumed in the scenarios could add about 0.28°C of warming, making it nearly impossible to restrict temperature rise to 1.5°C, even in pathways that allow temporarily overshooting that threshold.
While meeting the 2°C warming target might still be feasible without carbon management, the researchers found this would have required drastic emission reductions of 16 per cent annually starting in 2015. Such a rapid decrease appears implausible given that global emissions have risen over the past decade, says Bergero.
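To get a feel for how steep a 16 per cent annual cut is, consider a short illustration (the 16 per cent rate is from the study; the 20-year projection is my own arithmetic, not a figure from the paper):

```python
# If emissions fell 16% every year starting in 2015, what fraction of
# the 2015 level would remain after 20 years of cuts (by 2035)?
annual_cut = 0.16
years = 20

fraction_remaining = (1 - annual_cut) ** years
print(round(fraction_remaining * 100, 1))  # ~3.1 -- roughly a 97% reduction
```

Compounded over two decades, the pathway implies all but eliminating emissions, which underlines why the researchers consider it implausible without carbon management.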
Furthermore, initiatives to scale up carbon management aren’t progressing swiftly enough. According to Steve Smith at the University of Oxford, only about 40 million tonnes of CO2 are currently captured and stored globally each year, and only around 1 million tonnes are removed directly from the air.
“Like with other emissions reductions, countries frequently discuss ambitious long-term goals, yet lack immediate measures to implement the billions of tons of reductions necessary for these pathways to succeed,” he states.
Paleontologists have unearthed the fossilized jaw and teeth of a new genus and species of plagiaulacid multituberculate in the Lower Cretaceous Lulworth Formation, part of the Purbeck Group, in Dorset, England.
Artist’s depiction of Novaculadon mirabilis. Image credit: Hamzah Imran.
Multituberculates were a highly successful and diverse group of Mesozoic mammals.
Over 200 species have been documented, ranging in size from that of a mouse to a beaver.
These mammals thrived from the Middle Jurassic through the rest of the Mesozoic Era, even surviving the mass extinction at the end of the Cretaceous and persisting into the Paleogene.
They adapted to various ecological niches, from living in dens to climbing like squirrels.
The newly identified species lived during the Berriasian age of the Early Cretaceous, around 143 million years ago.
Dubbed Novaculadon mirabilis, this mammal was omnivorous, likely consuming small invertebrates such as worms and insects.
Its sharp incisors and distinct, blade-like premolars demonstrate feeding strategies that differ from those of modern rodents such as squirrels and rats.
“This study illustrates how early mammals established their ecological roles while dinosaurs dominated the Earth,” remarked Professor David Martill from the University of Portsmouth and his colleagues.
The 1.65-cm jaw of Novaculadon mirabilis was discovered in 2024 by undergraduate Benjamin Weston at the University of Portsmouth.
“The fossil showcases long, pointed incisors at the front, followed by a gap and then four sharp premolars,” the paleontologist stated.
“While it superficially resembles a rabbit’s jaw, the pointed incisors and unique premolars clearly identify it as a multituberculate.”
The specimen was found at the upper beach area of Durlston Bay in Dorset, England.
This location is part of the Lulworth Formation, of Lower Cretaceous age, within the Purbeck Group.
“The new specimen is the most complete multituberculate specimen found in the Purbeck Group,” the researchers noted.
“The fossils were extracted from a distinctive layer in the so-called freshwater bed, specifically the flint bed, which scientists believe indicates deposition within freshwater lagoons.”
“Novaculadon mirabilis is also the first mammal recovered from the flint bed,” they added.
The discovery of Novaculadon mirabilis is detailed in a paper published in the Proceedings of the Geologists’ Association.
____
Benjamin T. Weston et al. A new plagiaulacid multituberculate (Mammalia, Allotheria) from the Lulworth Formation (Cretaceous, Berriasian) of Dorset, England. Proceedings of the Geologists’ Association, published online July 9, 2025; doi: 10.1016/j.pgeola.2025.101128
Sneezing and coughing are prevalent symptoms of hay fever
Mohammad Hosein Safaei/Unsplash
Individuals suffering from hay fever may find relief with a novel “molecular shield” designed to stop pollen from penetrating the nasal lining, likely with fewer side effects than traditional treatments.
Hay fever is an allergic response triggered by pollen interacting with IgE antibodies found in the nose, mouth, and eyes, leading to inflammation and symptoms like sneezing and itching. Common treatments, such as antihistamines and steroids, help reduce inflammation but often come with side effects, including drowsiness.
Seeking alternatives, Kaissar Tabynov at the Kazakh National Agrarian Research University and his team first collected blood samples from mice. They then isolated antibodies that did not participate in the allergic response but could bind to the major allergens in mugwort pollen, a major hay fever trigger. In laboratory tests, this binding prevented the allergens from attaching to IgE antibodies. “It acts as a molecular shield,” Tabynov explains.
To evaluate the shield’s effectiveness, the researchers induced mugwort pollen allergies in 10 mice by injecting them with allergens and chemicals to stimulate an immune response.
After a week, they administered small amounts of liquid containing the pollen-blocking antibodies into the noses of half the mice, gradually increasing the dosage over five days. The other group received saline solutions. An hour following each droplet, the mice were exposed to mugwort pollen at concentrations similar to those encountered during peak pollen seasons, according to Tabynov.
After the final pollen exposure, the mice that received the antibody treatment rubbed their noses an average of 12 times over five minutes, in stark contrast to 92 times in the saline group.
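Taken at face value, the quoted averages imply the treatment cut this symptom by roughly 87 per cent. A minimal sketch of that back-of-the-envelope calculation, using only the two figures reported in the study:

```python
# Average nose rubs over five minutes, as reported in the article
antibody_rubs = 12  # mice given the pollen-blocking antibody drops
saline_rubs = 92    # control mice given saline drops

# Relative reduction in the symptom attributable to the treatment
reduction = (saline_rubs - antibody_rubs) / saline_rubs
print(f"Symptom reduction: {reduction:.0%}")  # Symptom reduction: 87%
```

This is only a crude effect-size estimate from the two published averages; the paper's own statistics (sample variance, significance tests) are not reproduced here.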
The treatment also reduced inflammation, which the researchers confirmed by imaging nasal tissue collected from the mice at the study’s conclusion. The imaging revealed that the treatment had systemic as well as localized effects. “Our research is the first to show that allergen-specific monoclonal antibodies can be administered intranasally to achieve both local and systemic protection against plant pollen allergies,” states Tabynov.
While the researchers did not assess potential side effects, they do not anticipate the adverse reactions associated with oral hay fever treatments, since the antibodies act at the site of allergen entry.
“This study represents a significant breakthrough and underscores the promise of intranasal therapies for allergic rhinitis [hay fever]. It lays the groundwork for early clinical trials exploring this method in humans,” remarks Sayantani Sindher from Stanford University in California.
Nonetheless, translating success in mice to human applications may prove challenging, and the antibodies will need to be modified to ensure they do not provoke an unexpected immune response in humans, Tabynov notes. If all goes well, the team hopes to advance this method to a nasal spray for human use within the next two to three years, he adds.
Such sprays could also address additional pollen types responsible for hay fever. “We envision a future where tailored antibody sprays can be made for individuals with sensitivities to different pollen varieties,” muses Tabynov.