Resistance Training: An Emerging Therapy for Long COVID Symptoms

Bailey Cooper Photography/Alamy

With millions of individuals suffering from long-term symptoms of coronavirus, researchers are exploring effective strategies to alleviate these conditions. Resistance training presents a promising, drug-free, and affordable option that could potentially expedite recovery from long COVID. Despite its potential benefits, skepticism remains, with some experts claiming existing studies lack robustness, reminiscent of past debates on exercise for conditions like chronic fatigue syndrome.

Dr. Caroline Dalton from Sheffield Hallam University, UK, emphasizes the necessity for precision in defining exercise efficacy, warning against generalizing results to all long COVID patients.

One notable study led by Dr. Colin Berry at the University of Glasgow sought to investigate lifestyle interventions as long COVID emerged as a significant post-infection complication. Berry’s hypothesis was that exercise might serve as a viable alternative to lengthy drug development processes.

Berry’s team conducted a three-month resistance training program for participants with long COVID, assessing their performance improvements, including a notable increase of 83 meters on a timed walking test for those who completed the program compared to just 47 meters for the control group. The findings suggested resistance training could be a feasible treatment for persistent COVID-19 symptoms like fatigue and mobility issues.

Despite the enthusiasm surrounding the study, critical voices raised concerns about its findings. The difference in distance walked between the two groups fell short of clinical significance, according to David Tuller at the University of California, Berkeley. Berry acknowledged that individual benefits need to be considered beyond aggregated group data.

Another major criticism of the study relates to its diverse participant group. The inclusion of individuals with varying severity of COVID-19 infections meant the results could reflect broad averages rather than specific insights. According to Todd Davenport from the University of the Pacific, this diversity risks obscuring individual outcomes.

Understanding Post-Exercise Fatigue

Crucially, the study’s approach to assessing post-exertional fatigue, a debilitating symptom of long COVID, was limited. The condition causes severe fatigue after exertion that is often disproportionate to the activity level. Danny Altmann at Imperial College London notes that post-exertional fatigue can be challenging to evaluate effectively.

Unfortunately, fatigue was assessed only at the end of the study, losing sight of early responses to the exercise protocol. Alarmingly, at the three-month evaluation, 67% of participants reported inadequate recovery after activity, compared with 49% in the control group, potentially indicating greater setbacks for the intervention group, according to Leonard Jason at DePaul University.

Emerging research has also suggested that exercise may exacerbate certain symptoms, with a 2024 study highlighting potential muscle damage and mitochondrial dysfunction in individuals suffering from long COVID-related fatigue.

While Berry’s research sparked widespread debate, other studies support the notion that exercise can benefit long COVID patients. A comprehensive review of 33 randomized controlled trials asserted that exercise significantly enhances the quality of life for affected individuals; however, it did not specifically address post-exertional fatigue, a symptom reported by over 80% of those surveyed.

The negative impact of post-exercise fatigue on quality of life was echoed by Margaret O’Hara, who highlighted the inadequacies of studies failing to consider this critical symptom.

Similarities to Chronic Fatigue Syndrome

This ongoing discussion echoes debates over chronic fatigue syndrome (CFS), where post-exertional fatigue plays a central role. Landmark research suggested graded exercise therapy provided some benefit, but it has since faced scrutiny, with critics pointing out that the definitions of improvement were modified during the trial period.

Subsequent analyses argued that the recovery rates reported in these trials failed to capture realistic improvements, underscoring the complexity and variability of responses to exercise interventions for exertion-related fatigue.

As national health organizations pivot away from generally endorsing graded exercise therapy for CFS, acknowledging the necessity for tailored management strategies, experts argue a similar consideration is warranted for long COVID cases. Such insights advocate for a subtyping strategy in research, honing in on individual symptoms to gather nuanced perspectives on exercise impacts.

Assessing Risks and Benefits

Current guidelines do not endorse graded exercise therapy for long COVID, but further inquiry into other forms of exercise intervention remains essential. “Long COVID” encompasses a range of conditions, prompting the need to discern effective exercise practices for different patient profiles.

Factors like persistent viral presence in specific individuals or immune system overreactions post-COVID necessitate individualized investigation.

Mike Ormerod, a long COVID volunteer, stresses the importance of advocating for informed medical advice and managing the narrative surrounding exercise recommendations. “Most doctors encourage physical activity under the belief that it’s universally beneficial, yet this can lead to detrimental outcomes for those with specific fatigue profiles,” cautions Dalton.

Source: www.newscientist.com

Revolutionizing Table Tennis: The Rise of a Champion Robot

Ace’s Performance in December 2025 Match

Credit: Sony AI

Ace, an advanced autonomous robot with AI and state-of-the-art sensors, plays competitive table tennis, defeating elite human competitors—a groundbreaking achievement in robotics.

While computers have dominated the strategic game of chess, Ace’s triumph suggests a pivotal moment for physical sports is near, reminiscent of the “Deep Blue” episode in 1997 when a machine bested chess champion Garry Kasparov.

“Games have always served as benchmarks for AI, as with Deep Blue in chess and the game-changing AlphaGo,” says Peter Duerr, the mastermind behind Ace at Sony AI, Zurich.

Duerr emphasizes that unlike previous AI milestones achieved online, Ace competes directly against real-world professional table tennis champions, marking a significant progression.

“Ace offers a unique insight: the competition between robots and humans in genuine athletic events,” observes Duerr.

Ace incorporates three key advancements in autonomous robotics, according to Duerr. First, it uses “event-based sensors” that focus on specific image areas, detecting movement and brightness changes critical for tracking the trajectory of a table tennis ball.
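The core idea of event-based sensing can be sketched in a few lines: instead of shipping whole frames, the sensor emits an event only for pixels whose brightness changed by more than a threshold. The frames, threshold, and event format below are invented purely for illustration and have nothing to do with Sony's actual hardware.

```python
# Toy illustration of event-based sensing: compare two grayscale frames
# and emit (x, y, polarity) events only where brightness changed enough.
THRESHOLD = 25  # minimum brightness change to trigger an event

def events_between(prev_frame, next_frame):
    """Return events for pixels whose brightness changed by >= THRESHOLD."""
    events = []
    for y, (prev_row, next_row) in enumerate(zip(prev_frame, next_frame)):
        for x, (p, n) in enumerate(zip(prev_row, next_row)):
            diff = n - p
            if abs(diff) >= THRESHOLD:
                events.append((x, y, 1 if diff > 0 else -1))
    return events

# A bright "ball" moves one pixel to the right between frames:
frame_a = [[0, 200, 0, 0],
           [0,   0, 0, 0]]
frame_b = [[0, 0, 200, 0],
           [0, 0,   0, 0]]
print(events_between(frame_a, frame_b))  # [(1, 0, -1), (2, 0, 1)]
```

Because static background pixels generate no events at all, a tracker only has to process the handful of pixels where the ball actually moved, which is what makes this style of sensor attractive for fast trajectory tracking.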

Moreover, Ace’s table tennis skills are enhanced through “model-free reinforcement learning,” where it learns through practical experiences rather than pre-defined models of play. This process equates to an extensive training regimen of thousands of hours in simulated environments.
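As a rough illustration of what “model-free” means, here is a minimal tabular Q-learning loop on a made-up one-dimensional paddle task: the agent improves purely from (state, action, reward) experience and never consults a model of the dynamics. Ace's actual training system is vastly more sophisticated; the states, actions, rewards, and hyperparameters below are all invented for this sketch.

```python
import random

random.seed(0)

# Invented 1-D paddle task: states are relative ball positions, and the
# agent is rewarded for being centred. "Model-free" means the update rule
# below uses only observed transitions, never a model of how states evolve.
N_STATES = 5
CENTRE = N_STATES // 2
ACTIONS = [-1, 0, 1]              # move left, stay, move right
alpha, gamma, epsilon = 0.5, 0.9, 0.1

q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    new = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if new == CENTRE else 0.0
    return new, reward

for _ in range(5000):                          # episodes with random starts
    state = random.randrange(N_STATES)
    for _ in range(10):
        if random.random() < epsilon:          # explore
            action = random.choice(ACTIONS)
        else:                                  # exploit current estimates
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        new_state, reward = step(state, action)
        best_next = max(q[(new_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = new_state

policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES)]
print(policy)  # greedy moves point toward the centre
```

The "thousands of hours in simulation" the article mentions correspond to running loops like this one at enormous scale, with far richer state and action spaces.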

Finally, the high-speed robotic hardware enables Ace to exhibit “human-like agility.” Duerr notes that Ace’s response time is around 20 milliseconds, compared to the 230 milliseconds human athletes typically require, making it exceptionally agile.

Currently, Ace maintains a robotic appearance akin to those on factory floors, utilizing a network of cameras and sensors around the table tennis arena. Yet, researchers predict that with technological advancements, Ace may eventually evolve into a humanoid form.

In a recent match conducted under the regulations of Japan’s professional table tennis league, Ace faced five elite non-professional players, all with at least 10 years of experience and extensive training hours. Two professional players also participated in the event.

Ace secured victories in three of its five matches against the elite non-professional players, though it lost its matches against the professional opponents.

Ace’s advantage lies in its unpredictability; while human players often read body language for cues, Ace relies solely on data, creating a distinctive challenge.

“Some athletes noted they usually gauge opponents’ expressions, yet Ace doesn’t exhibit such behavior,” Duerr explains.

A few players were surprised by Ace’s ability to interpret the spin on serves, which it countered adeptly despite attempts to disguise them. Ace even surprised its creators by returning a ball that hit the net, showcasing an unanticipated skill.

Since the research concluded over a year ago, the team has been continuously refining Ace’s capabilities.

In December 2025, Ace won its first match against a professional player, followed by victories against three professional players, including Miyu Kihara, currently ranked within the top 25 in the world, and two male pros, Touto Ryuzaki and Fumiya Igarashi.

“With further improvements, we aim to surpass even world champions,” states Duerr.

Duerr adds that the evolution of skills works both ways.

“Former Olympian Kinjiro Nakamura remarked that he believed certain shots were impossible until he witnessed Ace in action, leading him to believe that human athletes could emulate those techniques,” Duerr concludes.


Source: www.newscientist.com

Meet the Incredible Creature That Can Survive Boiling Water

In 1977, the film Star Wars Episode IV – A New Hope debuted while marine geologists made a groundbreaking discovery of deep-sea hydrothermal vents.

The explorers were aboard a submarine known as Alvin, operating at a depth of approximately 2,500 meters (8,200 feet) in the eastern Pacific Ocean.

Scientists gazing through Alvin’s portholes were astounded to witness a towering rocky chimney emitting superheated liquid, surrounded by an astonishing array of life.

This vibrant ecosystem was as fascinating as any creation from George Lucas’s imagination.

The Pompeii Worm (Alvinella pompejana) was one of the remarkable species uncovered during the initial exploration of hydrothermal vents in the late 1970s and early 1980s.

These pink creatures can grow up to 15 cm (almost 6 inches) and are uniquely covered in soft, gray hair. Their red gills give them a distinctive resemblance to the Demogorgon monster from the Netflix series Stranger Things.

Scientists named these peculiar worms after the ancient Roman city famously destroyed by volcanic activity, symbolizing their extreme habitat.

While the Pompeii worms don’t inhabit an active volcano, their environment is still incredibly inhospitable.

Hydrothermal vents function as the ocean’s deep-sea equivalent of hot springs, presenting much higher temperatures and toxic conditions than their land counterparts.

These vents form at the boundaries of oceanic tectonic plates, where shallow magma chambers heat seawater that intrudes through porous ocean floor rocks, causing it to rise back up at temperatures reaching hundreds of degrees Celsius.

The Pompeii worm is considered the most heat-tolerant animal species, with probes recording temperatures of 60-80°C (140-176°F) near their habitats.

Interestingly, the worms can endure temperature spikes exceeding 100°C (212°F). While scientists have yet to fully understand this phenomenon, the worms’ gray, fluffy coating may provide insulation against the intense heat.

This fur, comprised of bacteria, is a source of nourishment for the worms, suggesting a symbiotic relationship. It appears to also aid in circulating colder seawater around their bodies and may even detoxify heavy metals released from hydrothermal vents.

Pompeii worms thrive in hydrothermal vents like this, where heat from magma chambers rises from the ocean floor – Credit: Getty

A significant part of the Pompeii worm’s resilience lies in its genetics. They produce highly durable heat shock proteins that prevent critical cellular components from degrading under extreme temperatures. They also produce strong collagen to withstand drastic oceanic pressure.

Intriguingly, these extraordinary worms have shown a sci-fi-like reproductive strategy. In laboratory settings, scientists have successfully chilled their eggs to 2°C (36°F), the ambient temperature of the deep ocean away from hydrothermal vents.

The chilled eggs temporarily ceased dividing but remained viable. Once the temperature was increased, development resumed.

This raises the exciting possibility that the Pompeii worm may release its eggs into the deep sea in a state of suspended animation, reviving them upon encountering another hydrothermal vent, thus forming new colonies.

One day, these insights could lead to advancements in human colonization of other planets.


Source: www.sciencefocus.com

Fermat’s Last Theorem: The Essential Science Book Revealing 350 Years of Mathematical Secrets

How does Simon Singh’s classic popular science book “Fermat’s Last Theorem” resonate today?

Did you know that the number 26 is unique? It is the sole integer nestled directly between a square number, 25 (5²), and a cube number, 27 (3³); no other integer, from zero to infinity, shares this property.

Simon Singh’s 1997 book Fermat’s Last Theorem is an insightful exploration of mathematical proof. It delves into what proof means, how it can be achieved, and what drives mathematicians in their passionate pursuits. The book narrates a captivating quest for a proof, making it a compelling read. Given that it took 350 years for the proof to surface, it also offers an impressive historical lens on mathematics. For many, the essence of mathematics feels like abstract reasoning beyond reach. Yet Singh’s work transports readers into this captivating realm, and it remains a treasure nearly 30 years after publication.

Singh begins with Pythagoras, renowned for his work on triangles. Most people are familiar with the Pythagorean theorem, which states that the sum of the squares of a right triangle’s two shorter sides equals the square of the longest side (x² + y² = z²). While others had used the relationship before him, Singh highlights how Pythagoras distinguished himself by proving it true for all right triangles — not through trial and error, but via inarguable logic. “The quest for mathematical proof is a pursuit for absolute knowledge,” Singh asserts.
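That the Pythagorean equation has many whole-number solutions is easy to confirm numerically; a quick brute-force scan (illustrative only, and no substitute for Pythagoras's general proof) turns up the familiar triples:

```python
# Brute-force search for integer solutions of x**2 + y**2 == z**2,
# the relationship Pythagoras proved holds for every right triangle.
triples = [(x, y, z)
           for z in range(1, 30)
           for y in range(1, z)
           for x in range(1, y + 1)
           if x * x + y * y == z * z]
print(triples[:4])  # [(3, 4, 5), (6, 8, 10), (5, 12, 13), (9, 12, 15)]
```

Raising the bound keeps producing new triples, which is exactly the contrast Fermat's last theorem draws with higher exponents.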

My favorite segment involves the tale of Pythagoras: I learned that he founded the secretive Pythagorean Brotherhood, and I was fascinated by the story of Cylon, a man denied admission, who conspired against Pythagoras.

Next, Pierre de Fermat enters the narrative. Living in 17th-century France, this judge revealed remarkable mathematical prowess. He famously proved the uniqueness of the number 26. Fermat became renowned for his “last theorem,” an elegant extension of the Pythagorean theorem. While infinitely many integers satisfy the Pythagorean equation, Fermat proposed that raising the exponent, giving xⁿ + yⁿ = zⁿ for any integer n greater than 2, leaves no integer solutions at all. In 1637, he audaciously claimed to possess a “truly marvellous” proof, though he never documented it.
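The claim about 26 can likewise be checked by brute force over a limited range; Fermat's actual proof covers all integers, which no finite search can do, so the snippet below is only a spot check.

```python
import math

def is_square(k):
    r = math.isqrt(k)
    return r * r == k

def is_cube(k):
    # search around the floating-point estimate to dodge rounding error
    r = round(k ** (1 / 3))
    return any((r + d) ** 3 == k for d in (-1, 0, 1))

# Integers sitting directly between a perfect square and a perfect cube
sandwiched = [n for n in range(2, 100_000)
              if is_square(n - 1) and is_cube(n + 1)]
print(sandwiched)  # [26]
```

Within this range only 26 appears, consistent with Fermat's proof that it is the sole such number.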

For 350 years, mathematicians chased its secrets. Singh adeptly navigates this journey, introducing a colorful cast of characters. One standout is Sophie Germain, a pioneering French mathematician who worked under a male alias. Évariste Galois, a fervent revolutionary, made significant contributions before dying in a duel. Yutaka Taniyama, a brilliant Japanese mathematician, played a key role in the eventual proof but tragically took his own life.

Yet, our narrative’s hero is mathematician Andrew Wiles, who ultimately proved Fermat’s theorem true in 1994. Singh skillfully portrays Wiles, illuminating his notable achievements, even as he shunned the limelight. Through Wiles’ work—constructing a logical bridge between elliptic curves and modular forms—readers gain insight into complex mathematical realms.

However, the journey contains a tense twist: Wiles’ original proof revealed an error—a nightmare scenario. Yet, he rose from these ashes, ultimately correcting the flaws. My only critique is that this part of the narrative could have been more concise.

Although Singh’s book dates back to the 90s, its themes remain pertinent in modern mathematics. One concept tying both the book and Wiles’ proof is the Langlands program, proposed by mathematician Robert Langlands in 1967. It suggests that various mathematical areas are interconnected, and uncovering these ties could lead to breakthroughs in previously unsolvable problems. Wiles’ research provided early confirmation of the Langlands conjecture, with recent discoveries shedding further light on this vibrant area of mathematics.

Upon finishing the book, I felt as if I had wandered through a gallery of abstract art. Mathematical proofs, like artworks, invite quiet observation, stir curiosity about the minds behind them, and provide glimpses beyond everyday experience. This book deserves the highest praise for evoking such profound emotions.



Source: www.newscientist.com

Monkeys Uncover Unique Natural Supplement to Combat Human Junk Food Diet

Gibraltar monkeys have recently been observed engaging in a surprising behavior: eating dirt. This habit may be a response to the adverse effects of human junk food, as detailed in a study published in Scientific Reports.

This unusual eating behavior seems to help the monkeys manage digestive issues caused by tourist treats, such as ice cream, which the lactose-intolerant animals struggle to digest.

According to Dr. Sylvain Lemoine, a biological anthropologist at the University of Cambridge, “Food consumed by Gibraltar’s monkeys is often high in calories, sugar, salt, and dairy, which is starkly different from their natural diet of herbs, leaves, seeds, and occasional insects.”

“Eating dirt may enable them to indulge in these unhealthy yet appealing foods, similar to how humans enjoy them,” he added.


A research team studied 230 monkeys in Gibraltar, finding that each monkey consumed dirt approximately 12 times per week.

Interestingly, geophagy (the act of eating soil) decreased by 40% during winter compared to the tourist-heavy summer months.

Furthermore, researchers noted that monkeys with more frequent human interaction, and consequently more junk food consumption, exhibited more geophagy. In fact, three of the eight macaque groups, which live in tourist areas, accounted for 72% of geophagy incidents.

Conversely, the only group of monkeys without access to tourist-provided snacks was also the group that did not eat soil.

The study concluded that geophagy is directly linked to junk food, suggesting that it acts as a gut health supplement, mitigating the negative effects of cookies, ice cream, and other snacks on the monkeys’ microbiomes.

“Macaques may have developed this behavior to protect their digestive systems from the high-energy, low-fiber snacks that can lead to stomach issues in primates,” explained Dr. Lemoine.

“Soil can serve as a barrier in the gastrointestinal tract, reducing the absorption of harmful substances, and may even provide beneficial bacteria that assist in gut health.”

Different groups of monkeys showed distinct soil preferences; while most favored red clay, one group preferred tar-rich soils from potholes – Credit: Martin Nicourt/Gibraltar Macaques Project

Such geophagy isn’t unique to Gibraltar’s monkeys; ring-tailed lemurs consume dirt an average of 16 times a week, while East African chimpanzees do so about 14 times. This behavior is also observed in human cultures, particularly during pregnancy.

Geophagy is notably rare among Barbary macaques of North Africa, the ancestral population of Gibraltar’s monkeys.

Remarkably, a group of monkeys at Hong Kong’s Kam Shan Country Park has been documented eating dirt over 33 times a week, likely due to their high consumption of human junk food.

Dr. Lemoine emphasized the study’s implications regarding human influence on animal behavior and culture.

“Gibraltar’s monkeys represent a unique case of human-primate interaction, offering valuable insights into how anthropogenic environments impact primate behavior and culture,” he noted.


Source: www.sciencefocus.com

How Biofertilizers Transform Natural Microbes in Plants – Sciworthy Insights

A specialized group of soil bacteria known as Plant Growth Promoting Bacteria (PGPB) plays a crucial role in enhancing plant growth and overall health. PGPB typically reside in the soil zone around plant roots, commonly referred to as the rhizosphere, or within the plant roots themselves, known as the endosphere. These beneficial bacteria stabilize nutrients, prevent diseases, and significantly improve plant vitality.

PGPB serves as a primary ingredient in live microbial mixtures applied by farmers in crop fields, often termed biofertilizers. The development of PGPB mixtures is pivotal for sustainable crop management, as biofertilizers are regarded as a more eco-friendly alternative to conventional chemical fertilizers.

A team of Italian researchers investigated how three different PGPB mixtures impacted natural microbial populations in the rhizosphere and endosphere of two sunflower varieties. Their objective was to evaluate whether the PGPB inoculant would exert a lasting influence on the microbial community of sunflowers, while also examining any significant differences between the microbial communities of hybrid and open-pollinated sunflower varieties.

Initially, researchers identified bacterial strains that promote plant growth by producing beneficial compounds such as indole-3-lactic acid, enhancing resistance to heavy metals, aiding in mineral dissolution, and facilitating nutrient release. They cultured 40 distinct bacterial types sourced from bee guts, pollen, wheat rhizospheres, and fruit trees, assessing their acid production. From these trials, they formulated three PGPB mixtures, each containing six bacterial strains drawn from Bacillus, the Lactobacillus family, and Paenibacillus sp.

To evaluate the PGPB mixtures’ effectiveness on crops, the team conducted a two-year field experiment in northern Italy during 2023 and 2024. The study involved 24 plots: 12 planted with a hybrid variety and 12 with a naturally grown, open-pollinated variety. The researchers applied each of the three PGPB mixtures to three plots per variety, resulting in nine microbe-treated plots per sunflower variety and three control plots devoid of microbes. The PGPB mixtures were administered at four points during the growing season through the irrigation water, while the control plots received microorganism-free irrigation water.

Upon flowering, the researchers harvested the sunflowers and rinsed the roots with saline, separating the soil microbes of the rhizosphere from those of the endosphere. They then extracted DNA from the samples and used 16S rRNA gene sequencing to identify the microorganisms present.

After reviewing the data, the researchers found notable differences in microbial communities between the 2023 and 2024 field experiments, likely attributable to variations in temperature and rainfall. Therefore, they conducted separate analyses for each growing season to accurately gauge the PGPB treatment’s effectiveness. Their findings indicated that the microbial community of the inoculated sunflowers differed significantly from that of the control group, with hybrid sunflowers demonstrating more pronounced alterations in both rhizosphere and endosphere microbial communities compared to open-pollinated varieties, suggesting a stronger response to inoculation.

The research team identified several microbial taxa as treatment indicators, meaning their abundance varied significantly between treated hybrid sunflowers and controls. The endosphere of treated hybrids showed decreased levels of Pseudonocardiaceae and Nocardiaceae, while levels of Blastocatellaceae and Flavobacteriaceae increased compared to controls. Similarly, the rhizosphere of treated hybrids contained fewer Pseudomonadaceae and Bacillaceae, while exhibiting higher levels of Gemmataceae and Vicinamibacteraceae. The researchers noted that these microorganisms were part of the sunflowers’ native microbiome, existing in the soil prior to PGPB application.

Furthermore, the research team compared control plots to check for inherent microbial differences between the two sunflower varieties, finding no significant discrepancies in microbial phylum richness. In fact, both varieties’ rhizosphere microbial communities closely mirrored one another, with Bacillus, Pseudomonas, and Actinobacteria comprising approximately 31%, 23%, and 16% of the hybrid sunflowers’ rhizosphere, while accounting for 29%, 25%, and 16% of the open-pollinated variety’s rhizosphere, respectively.

Finally, the researchers assessed whether rhizosphere and endosphere microorganisms were similar across sunflower varieties, discovering that populations of specific microbial families, such as Streptomycetes and Burkholderiaceae, experienced parallel increases and decreases in both the endosphere and rhizosphere. This suggests a possible direct transfer of microorganisms between these layers or that sunflowers may actively select for distinct microbial types.

In conclusion, the research team determined that the PGPB mixture significantly altered the rhizosphere and endosphere of sunflowers by enriching specific beneficial microorganisms. They proposed that scientists could eventually design custom microbial biofertilizers to enhance crop resilience against drought and disease or to improve yield. They emphasized the need for continued exploration into biofertilizers and microorganisms’ roles in soil ecosystems.



Source: sciworthy.com

How the Number of Siblings Influences Your Personality Traits

Family dynamics are evolving dramatically. The global trend shows a rise in the number of childless individuals, with families increasingly opting for just one child.

The “one-child family” is becoming more common, potentially setting the stage for it to become the new standard in the future.

Factors such as economic uncertainty, escalating childcare costs, shifting gender roles, and the growing trend of women having children later in life — combined with environmental concerns — foster the perception that raising more than one child is challenging or even unfeasible.

As of 2022, 44% of households in the UK included only one child, compared to 41% with two children. Similarly, in the EU, 49% of families have just one child.

Unlike in previous generations, when larger families were the norm, this shift is being seen worldwide. The global fertility rate has plummeted from an average of 5 children per woman in 1960 to 2.3 in 2020, with no end to the downward trend in sight.

Should we be concerned about the well-being of only children? Contrary to popular belief, they do not suffer from social isolation or undue entitlement. In fact, having one child may ultimately provide better outcomes for families.


The Hidden Truth About Only Children

“It’s a widely acknowledged fact that only children are thriving,” says Susan Newman, a renowned parenting expert and social psychologist. “This should reassure parents who feel pressured to have more kids, especially by earlier generations.”

Newman emphasizes how entrenched these stereotypes are, equating them to sexism and ageism, but also expresses hope that we are breaking free from the unjust stigma.

To trace the roots of the concept of only child syndrome, we revisit 1896 when child psychologists G. Stanley Hall and E.W. Bohannon studied the traits of only children. Their findings suggested a plethora of negative characteristics, branding only children as lonely, bossy, and spoiled — a notion that has perpetuated stereotypes to this day.

Despite criticism of this early research, its influence persists, overshadowing more recent studies that highlight the advantages of being an only child.

Dr. Adrian Mancillas, author of Challenging Stereotypes About Only Children, states that “research consistently shows that only children perform comparably to those with siblings in various social and personality metrics, with no notable behavioral discrepancies.”

While some only children may exhibit traits identified in the 1896 study, these characteristics can also be present in children with siblings. Ultimately, parental influence supersedes sibling presence in determining childhood happiness and social stability.

While siblings can shape a child’s experience, parents wield the greatest influence in raising a happy, well-adjusted adult. – Credit: Getty

That said, there are unique aspects to consider. “Only children receive their parents’ full attention and resources, while avoiding sibling rivalry,” explains Mancillas. However, such intense relationships may expose children to parental stress more acutely than those with siblings may experience.

What about the experiences of only children as they transition into adulthood? Studies indicate that those who grew up as only children often reflect positively on their childhood.

Newman refers to this phenomenon as a “one-child dynasty,” noting that only children are frequently inclined to have only one child as well.

The Rising Trend of One-Child Families

Could the uptick in one-child families signal a successful debunking of detrimental stereotypes? Newman identifies several factors contributing to this one-and-done trend.

“Women are increasingly starting families later, pursuing higher education and stable careers, significantly reshaping their life priorities,” she notes.

“Family structures are diversifying, with more single parents and individuals opting for adoption or IVF, redefining what it means to be a family today.”

As traditional gender roles evolve, so too does the notion of family. The practical aspects of having one child cannot be overlooked; as of 2023, the expected cost of raising a child stands at £166,000 for married couples and £220,000 for single parents.

Credit: Getty/Catherine Delahaye

Environmental concerns are also a prominent factor. As awareness of climate issues rises, many choose to have fewer children to minimize their family’s carbon footprint.

However, the benefits of having only one child extend to parents as well. Research indicates that parents with one child report greater happiness, that a second child can significantly reduce overall happiness, and that any happiness gains diminish further with subsequent children.

This trend is particularly notable in the UK, US, and Canada, where parental support systems are less robust than in places like Germany, where free public childcare is available, and Romania, which provides extended parental leave.

As we look to the future, could the one-and-done trend persist? While birth rates are on the decline, many may find that being part of a one-child family leads to greater financial security and personal freedom. Ultimately, it appears only children are thriving.


Source: www.sciencefocus.com

Potential mRNA Vaccine Poised for Release as Bird Flu Pandemic Threatens

Weyo / Alamy

The emergence of COVID-19 highlighted the urgency of rapid vaccine development: it took approximately a year to roll out the first SARS-CoV-2 vaccine, and only after millions of deaths and widespread economic turmoil. If a bird flu pandemic strikes, we may be able to respond significantly faster, because candidate mRNA vaccines against the virus are already well along the approval pipeline, with Phase III trials actively being conducted in the UK and US.

“An influenza pandemic is highly likely in the future. It’s crucial we are adequately prepared,” states Richard Pebody from the UK Health Security Agency.

The primary threat is the H5N1 avian influenza strain, notably clade 2.3.4.4b. Emerging roughly a decade ago, this strain has sprawled among wild bird populations globally, even reaching Antarctica. It has also been reported in numerous wild mammals and poultry farms. Alarmingly, the infection is widespread among dairy cows in the United States.

Since 2024, over 100 cases of human infection have been documented; however, there is no evidence of person-to-person transmission. The risk continues as long as H5N1 avian influenza remains active.

“While we cannot predict the timing or severity of the next pandemic, proactive preparedness is essential as influenza viruses continue to circulate in animal populations and may adapt,” warns Hiwot Hirui from Moderna.

Moderna’s mRNA-1018 vaccine targeting H5N1 has completed Phase I and II trials with no safety concerns reported. Current Phase III trials involve 3,000 volunteers in the UK and 1,000 in the US.

Typically, vaccine trials assess effectiveness; however, due to the limited prevalence of H5N1 in humans, the focus will be on measuring immune responses in participants. Early results indicate a robust immune response, as noted by Hirui.

The trial prioritizes individuals aged 65 and older, along with poultry workers, who face higher risks of avian influenza exposure.

Some countries are stockpiling traditional vaccines against H5N1; for instance, the UK has secured 5 million doses. However, this conventional vaccine, like many seasonal influenza vaccines, is produced using chicken eggs, making it difficult to scale up production or adapt quickly if the virus evolves significantly.

In contrast, mRNA vaccine production can be rapidly scaled and easily modified. This adaptability presents a considerable advantage in pandemic preparedness, as outlined by Pebody.

The trial is funded by the Coalition for Epidemic Preparedness Innovations (CEPI), which has the support of over 30 countries and various organizations, backing that has become particularly important following the reduction in mRNA vaccine funding by the US government.

England and the US are exploring the rollout of H5N1 vaccines for livestock, especially poultry. This approach has been employed in various nations for years, with studies in France showing that vaccinating ducks significantly decreased H5N1 infections on farms.

Source: www.newscientist.com

3 Simple Steps to Improve Your Decision-Making Skills

Have you ever found yourself overwhelmed while trying to order a product online, such as a new electric toothbrush during a work break? What seemed like an easy task quickly turns into a situation of decision paralysis due to the multitude of options available.

Factors to consider go beyond just price and delivery time. You might also need to evaluate battery life, features like warning lights for excessive pressure, and even app integrations.

When time is limited during a break, making a decision feels almost impossible.

This scenario exemplifies decision paralysis—the inability to choose among numerous options due to the difficulty in weighing all factors.

While more choices were once thought to be beneficial for consumers, they can actually backfire and lead to overwhelm.

Beyond consumer choices, decision-making permeates other areas of life, including selecting a college, job, or even a romantic partner. While you might have only a couple of options, the fear of making the wrong choice can be paralyzing.

If you’re a “maximizer”, someone who strives to make the best possible decision, you may be more susceptible to decision paralysis. Conversely, if you’re a “satisficer” who can settle for “good enough,” you’re likely to face this issue less often.

The anticipation of regretting a poor choice contributes to this paralysis. The more you dwell on potential regret, the more likely you are to become stuck.

This relates to the concept of opportunity cost, which refers to what you’ll miss out on by choosing one option over another.

Fortunately, there are effective ways to combat decision paralysis:

  1. Acknowledge that achieving a perfect decision is virtually impossible without a crystal ball.
  2. Understand that avoiding a decision is still making a choice. Don’t fool yourself into thinking that procrastination is a viable solution.
  3. If you’re willing to invest some time, a practical method is to research and prioritize the factors that matter most to you. This will simplify the decision-making process.

For instance, when choosing a toothbrush, prioritize price and battery life. When selecting a university, consider factors like reputation, friendships, and proximity to home.

Evaluate your options against these factors, assigning higher priority to more significant ones. This will give you a numerical score indicating the most favorable choice.
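The weighted-scoring approach described above can be sketched in a few lines of code. The factors, weights, and ratings below are illustrative assumptions, not recommendations:

```python
# Weighted decision matrix: rate each option (1-5) on each factor, multiply
# by the factor's priority weight, and sum. The highest total wins.

def score(ratings, weights):
    """Weighted sum of an option's per-factor ratings."""
    return sum(weights[factor] * rating for factor, rating in ratings.items())

# Hypothetical priorities for the toothbrush example: price matters most.
weights = {"price": 3, "battery_life": 2, "app_features": 1}

options = {
    "Brush A": {"price": 4, "battery_life": 2, "app_features": 5},
    "Brush B": {"price": 3, "battery_life": 5, "app_features": 3},
}

scores = {name: score(ratings, weights) for name, ratings in options.items()}
best = max(scores, key=scores.get)
print(scores)  # {'Brush A': 21, 'Brush B': 22}
print(best)    # Brush B
```

A tie between the top scores is itself useful information: it tells you either choice is acceptable.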

And if all else fails, a simple roll of the dice might help in case of a tie!

This article addresses the question posed by Carrie Muller of Tunbridge Wells: “What is decision paralysis and how can I overcome it?”

For inquiries, please email us at: questions@sciencefocus.com or reach out via Facebook, Twitter, or Instagram (please include your name and location).

Source: www.sciencefocus.com

450-Million-Year-Old Fossils Uncover Unique Ancestor of Tube-Dwelling Jellyfish

Paleontologists have recently discovered a new genus and species of soft-bodied tubular polypoid medusozoan, named Paleocanna tentaculum, from remarkably preserved specimens located approximately 50 kilometers northeast of Quebec City, Canada.



Depiction of Paleocanna tentaculum. Individuals are found living in a single tube or in clusters of two or three interconnected tubes. Image credit: Ramirez-Guerrero et al., doi: 10.1017/jpa.2025.10211.

“Jellyfish and their related polyps belong to an ancient group known as cnidarians,” stated Professor Christopher Cameron from the University of Montreal and his research team.

“Due to their soft bodies, these organisms are rarely preserved as fossils, creating gaps in our understanding of their evolutionary history.”

The recently identified species thrived in the Ordovician seas around 450 million years ago.

Paleocanna tentaculum represents ancient life forms that existed either solitarily or in groups within upright tubes.

The polyp features an elongated structure with a crown of tentacles extending past the tube’s edge.

“This species is closely related to modern jellyfish,” paleontologists noted, emphasizing the rarity of this find.

“Only a handful of other species from this subphylum have been documented in the fossil record.”

The fossil remains are found on the upper surface of a shallow limestone layer, discovered in a small quarry within the Neuville Formation in Quebec, Canada.

“This area is regarded as one of the most diverse fossil locations globally for Ordovician specimens,” Professor Cameron remarked.

The team analyzed 15 slabs of sandy limestone that contained approximately 135 specimens of Paleocanna tentaculum.

“Soft-bodied organisms are less likely to be preserved compared to their hard-bodied counterparts, making these fossils particularly valuable for understanding life’s history,” noted Louis-Philippe Bateman, a graduate student at McGill University.

“This discovery underscores the significance of Quebec’s fossil records.”

“Often, the fossil record is perceived as less appealing compared to regions like British Columbia or Alberta.”

“Findings like this highlight the potential for new discoveries and insights here.”

By comparing Paleocanna tentaculum with other fossil and living cnidarians, scientists have revealed that it is more closely related to present-day jellyfish, including box jellyfish, true jellyfish, and stalked jellyfish, than to other extinct tubular genera.

This indicates that Paleocanna tentaculum occupies a more modern position on the evolutionary tree than many other known fossil polyps.

“The extraordinary preservation of these fossils marks this discovery as one of the rare instances of delicate, soft-bodied organisms found in Ordovician sediments,” the authors concluded.

For more details, refer to their paper published in the Journal of Paleontology on February 13, 2026.

_____

Greta Ramirez-Guerrero et al. Upper Ordovician Tecate stem medusozoan polyp of Quebec. Journal of Paleontology, published online February 13, 2026; doi: 10.1017/jpa.2025.10211

Source: www.sci.news

How Extreme Weather Patterns Could Explain Titan’s Mysterious Plains

Titan - Saturn's Largest Moon

Images of Titan captured by Cassini Spacecraft

NASA/JPL/SSI/Val Klavans

Titan, Saturn’s largest moon, features vast plains often covered with up to a meter of light, organic “snow.” Remarkably, approximately 65% of Titan’s surface consists of uniformly flat plains blanketed in a porous layer made of particles deposited from its hazy atmosphere.

Due to its dense atmosphere, studying Titan from a distance poses challenges. The Cassini spacecraft, which orbited Saturn from 2004 to 2017, employed radar technology to gather in-depth observations. Recently, Professor Alexander Hayes and his team at Cornell University refined their analysis of the radar data.

The interaction of radio waves from Cassini’s radar with Titan’s surface suggests complexities beyond those of typical rocky celestial bodies. “Existing models developed for the Moon and similar bodies do not apply directly to Titan,” Hayes explains. “Its radar scattering properties reveal it as a unique entity in our solar system.”

The researchers propose a two-layer model to better explain Titan’s surface characteristics, indicating that a hard substrate is covered by a soft, low-density material, differing from the simplistic rocky models. They suggest that this outer layer, varying in thickness from a few centimeters to a meter, comprises organic molecules descending from Titan’s dense atmosphere, resembling snowfall before compaction over time.

Furthermore, Titan’s surface experiences rain, wind, and erosion, necessitating exploration into how these processes contribute to the development of its blanket layers. “Understanding these mechanisms can provide valuable insights into Titan’s broader environmental processes,” Hayes adds.

NASA’s upcoming Dragonfly mission, set to launch in 2028 and reach Titan by 2034, aims to analyze these surface layers, enhancing our comprehension not only of Titan but also aiding the design of future missions targeting this extraordinary moon.

Source: www.newscientist.com

Scientists Unveil Insights into One of the Oldest Neanderthal Communities

Recent studies of mitochondrial DNA (mtDNA) from fossils unearthed in Poland’s Stajnia Cave unveil a tightly knit community of Neanderthals who inhabited the region approximately 100,000 years ago. This discovery offers one of the most definitive genetic insights into a single prehistoric group in Europe.

Approximately 100,000 years ago, at least seven Neanderthals inhabited Stajnia Cave in modern Poland. Image credit: Tyler B. Tretsven.

Located at 359 meters altitude, Stajnia Cave lies between the villages of Mirów and Bobolice on the Kraków-Częstochowa Upland in southern Poland.

This limestone cave is defined by its narrow entrance and is a site of significant archaeological interest.

From 2007 to 2010, excavations were meticulously conducted over an area of approximately 16 square meters in the cave’s rear.

Among the critical discoveries were nine human teeth, five of which have been confirmed as Neanderthal remains.

In this groundbreaking study, Professor Andrea Picin from the University of Bologna and colleagues successfully extracted and analyzed mtDNA from the nine fossils.

The findings suggest these fossils belonged to at least seven, and potentially eight, Neanderthals.

Interestingly, three of the specimens shared identical mtDNA, indicating close kinship or, at the very least, a shared maternal lineage.

Through detailed analysis of their genetic patterns, researchers estimate that this group thrived during a warm interglacial period between approximately 120,000 and 92,500 years ago.

This makes them the oldest Neanderthal group in central Europe to be identified genetically.

“This is a groundbreaking result; we can now observe a small community of at least seven Neanderthals in central and eastern Europe who lived around 100,000 years ago,” stated Professor Picin.

“Typically, Neanderthal genetic data is derived from isolated fossils or various sites scattered over time; here at Stajnia, we can reconstruct a miniature population, presenting the first cohesive genetic perspective of Neanderthals in this European region.”

Dr. Wioletta Nowaczewska from the University of Wrocław and Dr. Adam Nadachowski from the Institute of Systematics and Evolution of Animals at the Polish Academy of Sciences noted, “Stajnia Cave has long been recognized for its exceptional preservation, but these findings have surpassed our expectations.”

The identification of this small, ancient Neanderthal population at such a complex site marks a significant milestone in Polish research and Neanderthal studies in Europe.

Researchers discovered that Neanderthals sharing similar genetic traits also resided in regions as distant as southeastern France, the Iberian Peninsula, and northern Caucasus, suggesting a once-wide-ranging maternal lineage that was later supplanted by other genetic groups.

“Particularly intriguing is the shared mtDNA found in two juvenile teeth and one adult tooth, indicating they may be closely related,” remarked Dr. Mateja Hajdinjak from the Max Planck Institute for Evolutionary Anthropology.

Furthermore, comparisons with a Neanderthal known as Thorin, excavated from Grotte Mandrin in France, show strikingly similar mtDNA, although Thorin is estimated to be around 50,000 years old.

“This study serves as a reminder to approach ancient timelines with caution,” advised Professor Sahra Talamo from the University of Bologna. “As radiocarbon dating approaches calibration limits, it becomes crucial to not assign undue precision to the data.”

“Therefore, integrating archaeology, radiocarbon dating, and genetic data is essential.”

These results were published in this week’s edition of Current Biology.

_____

Andrea Picin et al. For the first time, multiple Neanderthal mitogenomes have been discovered in the northern Carpathians. Current Biology, published online April 20, 2026; doi: 10.1016/j.cub.2026.03.069

Source: www.sci.news

New Research Reveals Connection Between Coffee Consumption, Microbiome Changes, and Enhanced Mental Health

A groundbreaking study conducted by researchers at University College Cork reveals that both caffeinated and decaffeinated coffee can positively reshape the gut microbiome, leading to reduced stress and enhanced psychological well-being. This study provides valuable insights into the long-acknowledged health benefits of coffee.



Boscaini et al. uncover a previously unrecognized effect of coffee on the microbiota-gut-brain axis, indicating a strong link between coffee consumption and gut microbial composition. Image credit: Sci.News.

Coffee, a popular plant-based beverage derived from processed coffee beans, offers a complex flavor and chemical profile influenced by factors such as bean variety, ripeness, processing methods, roasting techniques, and brewing styles.

This beverage is rich in bioactive compounds, including alkaloids (like caffeine), polyphenols (such as phenolic acids), diterpenes, and melanoidins formed during the roasting process.

Research indicates that moderate coffee consumption is linked to a reduced risk of several chronic illnesses, including type 2 diabetes, liver disease, cardiovascular issues, and certain types of cancer.

Moreover, increased coffee intake has been associated with a lower risk of Parkinson’s disease, emphasizing a dose-dependent relationship.

Individuals who drink coffee regularly are statistically less prone to depression, with one study noting a 27% lower incidence of Alzheimer’s disease among habitual coffee consumers.

In this recent research, Professor John Cryan and his team from University College Cork explored how coffee intake, withdrawal, and reconsumption impact cognition, mood, and behavior, specifically in relation to the microbiota-gut-brain connection.

The researchers conducted a comprehensive analysis involving psychological assessments, caffeine and food diaries, as well as stool and urine samples from 31 coffee drinkers and an equal number of non-coffee drinkers to track changes in their microbiome and reported mood and stress levels.

Coffee aficionados were identified as individuals who consume 3 to 5 cups daily, a quantity deemed safe by the European Food Safety Authority (EFSA).

Participants initially abstained from coffee for two weeks, during which they underwent regular psychological evaluations and provided stool and urine samples.

This abstinence period correlated with significant changes in the gut microbiota among coffee drinkers compared to their non-coffee-drinking counterparts.

Upon reintroducing coffee, the trial became blinded: half of the participants consumed decaffeinated coffee while the other half drank caffeinated coffee.

Both groups reported decreased scores for stress, depression, and impulsivity, indicating that coffee consumption notably enhances mood, independent of caffeine content.

Notable increases in specific bacterial species such as Eggerthella sp. and Cryptobacterium curtum were observed in coffee drinkers, suggesting their roles in promoting digestive health by supporting stomach and intestinal acidity and contributing to bile acid synthesis, crucial for combatting harmful gut bacteria and stomach infections.

A rise in Firmicutes bacteria, which are linked to positive emotional states in women, was also noted.

However, cognitive improvements such as enhanced learning and memory were predominantly seen in those consuming decaffeinated coffee, hinting at the influence of non-caffeine components like polyphenols on cognitive function.

Conversely, scientists determined that only caffeinated coffee contributed to reduced anxiety levels and heightened alertness, with caffeine also linked to a lower risk of inflammation.

“The growing public interest in gut health is significant,” states Professor Cryan.

“As the connection between digestive health and mental well-being becomes clearer, we still need to unravel the mechanisms through which coffee impacts the gut-brain axis.”

“Our research illuminates the relationship between the microbiome and neurological responses to coffee, highlighting potential long-term health benefits related to a healthier microbiome.”

“Coffee modifies microbial activity and the metabolites they utilize.”

“As awareness regarding dietary adjustments to promote digestive health increases, coffee could serve as a beneficial addition to a balanced diet.”

“Coffee is more than just a caffeine source; it is a multifaceted dietary element that interacts with gut bacteria, metabolism, and mental health.”

“Our findings imply that both caffeinated and decaffeinated coffee have distinct, yet complementary, health impacts.”

The team’s findings were published in today’s issue of Nature Communications.

_____

S. Boscaini et al. 2026. Habitual coffee consumption shapes the gut microbiota and alters host physiology and cognition. Nat Commun 17, 3439; doi: 10.1038/s41467-026-71264-8

Source: www.sci.news

How Brushing Your Teeth in the Hospital Can Prevent Infections

Close-up of a woman brushing her teeth, highlighting oral health.

The Overlooked Importance of Tooth Brushing in Hospitals

Drazen Zigic/Getty Images

Brushing your teeth while receiving hospital treatment can drastically lower your risk of pneumonia.

For various reasons, many patients in hospitals neglect to brush their teeth. Some may forget their toothbrush, some may not consider it, and others might lack the ability or inclination to do so. Additionally, healthcare personnel often overlook routine oral hygiene practices when caring for patients.

However, a significant randomized controlled trial led by Brett Mitchell at Avondale University, Australia, revealed that patients who received tooth brushing, toothpaste, and dental hygiene advice during hospitalization were 60 per cent less likely to develop pneumonia, a common hospital-acquired infection.

“This underscores the necessity of educating patients about pneumonia risks and the vital role of oral care during hospitalization,” he states.

Pneumonia often arises in patients on ventilators and other medical devices that disrupt natural respiratory defences. However, many hospitalized individuals who are not on ventilators also develop pneumonia at least 48 hours after admission. Researchers continue to investigate the causes and prevention of this condition, which is linked to longer hospital stays, increased healthcare costs, and higher mortality rates.

“Addressing this is crucial,” notes Michael Klompas from Harvard University, who was not part of the study. “Hospital-acquired pneumonia is one of the most prevalent and dangerous healthcare-associated infections, yet there is a lack of substantial data on how to effectively prevent it.”

Mitchell suspected that bacteria in the mouth might be contributing to this issue. The oral microbiome can influence respiratory health since inhaling bacteria-laden droplets can introduce pathogens into the lungs. He points out that a hospitalized patient’s oral microbiome changes, making it important to address the problem.

To investigate, Mitchell and his colleagues launched a year-long randomized controlled trial involving 8,870 patients across three Australian hospitals to assess the effect of oral hygiene on pneumonia risk. He shared the findings from this segment of the Hospital-Acquired Pneumonia Prevention (‘HAPPEN’) study at the international conference organized by the European Society for Clinical Microbiology and Infectious Diseases (ESCMID Global) in Munich, Germany.

Each hospital divided participants into three groups. No interventions were made during the initial three months. Following this period, one group was provided with toothbrushes and toothpaste featuring messages like “Brushing your teeth helps prevent pneumonia” and “Blow away pneumonia!” The brushes were designed with special handles for individuals with limited dexterity, and patients received QR codes linking to educational resources on the HAPPEN website.

After six months, the second group also received toothbrushes, and after nine months, the third group did as well. Consequently, all participants had the opportunity to practice tooth brushing for the final three months of the study.

For medical staff, the research team provided oral care training for nurses on the wards and offered professional guidance through their website. They also encouraged nursing staff to remind patients to brush and floss, assisting those who encountered difficulties.

During the pre-intervention phase, only 15.9% of patients brushed their teeth daily. In contrast, during the intervention phase, 61.5% of patients engaged in oral hygiene at least once a day, averaging 1.5 times per day. Web analytics showed frequent engagement from both patients and nurses with the HAPPEN resource during the intervention.

Notably, the incidence of hospital-acquired pneumonia not associated with ventilators decreased significantly—from 1 case per 100 hospital days in the control group to 0.41 cases in the intervention group.
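A quick check (using only the two rates reported above; the rest is arithmetic) shows how these figures translate into roughly the 60 per cent risk reduction quoted earlier:

```python
# Relative risk reduction implied by the trial's reported pneumonia rates.

control_rate = 1.0        # cases per 100 hospital days, control group
intervention_rate = 0.41  # cases per 100 hospital days, intervention group

relative_risk = intervention_rate / control_rate
reduction = 1 - relative_risk

print(f"Relative risk reduction: {reduction:.0%}")  # 59%, close to the quoted ~60%
```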

“This study is noteworthy due to its large scale and randomized design,” says Klompas, emphasizing that brushing teeth during hospitalization not only promotes oral health but could potentially save lives.

Pyry Sipilä from the University of Helsinki highlights the significance of substantial risk reductions resulting from straightforward actions. “The intervention was simple: patients received a toothbrush, toothpaste, and practical advice,” he notes, but results may differ based on a patient’s reason for hospitalization and existing oral hygiene habits.

Source: www.newscientist.com

Understanding US Policy Shifts in Iran Through Game Theory Analysis

Ship Navigating the Strait of Hormuz

Image Credit: Shady Alasar/Anadolu via Getty Images

“Mission accomplished.” This phrase has cast a long shadow over U.S. foreign policy since 2003, when George W. Bush triumphantly declared victory on the aircraft carrier Abraham Lincoln—only for the conflict to endure for another eight years. It has come to symbolize the disconnection between military objectives and reported achievements.

The conflict surrounding the Strait of Hormuz is escalating as we enter the second month of tension. Insights from game theory, which studies strategic decision-making, can help illuminate these complex dynamics.

In a traditional military confrontation, the combined might of the United States and Israel stands unmatched. Their advanced weaponry and precision strike capabilities have significantly damaged Iran’s military infrastructure, which would suggest a conventional victory for the alliance.

However, this situation diverges from conventional warfare. It has transformed into a war of attrition, where multiple “players” engage in a costly stalemate, each hoping that the opposing side will eventually falter. Game theory posits that in such scenarios, victory is less about military might and more about which side can endure losses longer. Time is particularly on Iran’s side in this equation.
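The logic of attrition can be made concrete with a toy model. In the classic war-of-attrition game, a rational player keeps paying a per-period cost only while the prize remains worth it, so the longest credible holdout is the prize's value divided by that per-period cost. All numbers below are illustrative assumptions, not estimates of real-world costs:

```python
# Toy war-of-attrition model: each side values the same prize but pays a
# different cost per period of continued conflict. A rational player will
# not fight longer than value / cost_per_period, so the side with the
# lower sustainment cost can credibly outlast the other.

def max_holdout(value: float, cost_per_period: float) -> float:
    """Longest a rational player will keep paying before conceding."""
    return value / cost_per_period

# Hypothetical numbers: identical prize, asymmetric per-period costs.
high_cost_side = max_holdout(value=100, cost_per_period=10)  # 10 periods
low_cost_side = max_holdout(value=100, cost_per_period=4)    # 25 periods

# The cheaper-to-sustain side can outlast the other,
# regardless of which side has more raw firepower.
assert low_cost_side > high_cost_side
```

This is why the article argues that the asymmetry of costs, not raw capability, decides who concedes first.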

While Iran incurs significant costs, they remain manageable for the regime. Notably, the Iranian government exhibits an impressive ability to regenerate its command structures—removing one layer of leadership merely allows another to take its place. Moreover, their stockpiles of missiles and cost-effective drones keep replenishing faster than they are used.

In contrast, the United States faces a far steeper financial burden. Maintaining naval dominance in the straits demands continuous, costly deployments. Each drone interception, carrier rotation, and diplomatic effort to sustain a wavering coalition adds to the mounting expenses. In a war of attrition, the asymmetry of costs becomes more critical than raw firepower, and this factor unfavorably affects American interests.

A Blurry Objective

This structural reality explains the Trump administration’s ambiguous definition of victory—a point that has puzzled many. This lack of clarity serves a strategic purpose. Game theory suggests that when battlefield conditions are unfavorable, it is beneficial to obscure objectives.

To devise an effective strategy and foresee potential outcomes, one must first comprehend each player’s goals. Yet, the parameters seem to continuously shift.

This conflict did not initially revolve around the straits. The original aims included regime change, dismantling Iran’s nuclear capabilities, and neutralizing the Islamic Revolutionary Guards Corps. The narrowing of these goals to the more limited objective of controlling the straits signifies a loss of momentum for the campaign.

Interestingly, game theory identifies a second advantage in such ambiguity. Undeclared objectives are harder for an opponent to counter, and players who remain vague about their goals retain the flexibility to proclaim victory and exit the situation on their own terms.

Unclear objectives provide an adaptability that concrete commitments do not. This allows a player with undefined goals to escape accountability for failing to achieve them, and they may even be perceived as successful if they possess the necessary diplomatic skill. President Donald Trump has utilized this strategy throughout his two terms in office.

Moreover, time constraints play a crucial role. Research into the political economy of conflict indicates that leaders facing imminent electoral decisions experience pressure to conclude wars of attrition before voters cast their ballots. With midterm elections approaching, President Trump’s options for securing a withdrawal are rapidly diminishing.

Source: www.newscientist.com

Understanding Why Cats Lick People: The Fascinating Reasons Behind This Quirky Behavior

Cats have an innate obsession with licking. Research indicates that adult felines can spend up to 8% of their waking hours grooming themselves with their tongues. This behavior serves not only a personal hygiene purpose but also plays a vital social role, as adult cats often lick one another before mating.

But what about their interactions with humans? Why do cats lick people? Fortunately, there’s no evidence suggesting that your cat is treating this as part of a pre-mating ritual. Unfortunately, scientists and cat behaviorists are still piecing together the reasons behind this puzzling behavior of cats licking human skin.

While a definitive explanation is still elusive, several theories attempt to explain why domestic cats engage in this licking behavior. Spoiler alert: Your feline friend might not enjoy all of them.

<h2 class="wp-block-heading" id="h-why-do-cats-lick-people">Why Do Cats Lick People?</h2>
<p>There's no singular explanation for your cat's licking behavior. However, three primary theories have emerged regarding why domestic cats exhibit this action:</p>
<ul class="wp-block-list">
    <li>It demonstrates trust.</li>
    <li>They are gathering biochemical information from your skin.</li>
    <li>They are marking you as their territory.</li>
</ul>

<h3 class="wp-block-heading" id="h-the-trust-theory"><strong>Trust Theory</strong></h3>
<p>Cats may lick you to indicate that they trust you or, at the very least, that they do not see you as a serious rival. Dr. <a href="http://www.problempets.co.uk/about.asp" target="_blank" rel="noreferrer noopener">David Sands</a>, an expert in animal psychology with over 25 years of clinical experience, notes that this licking is akin to allogrooming among cats. This mutual grooming strengthens their bond, a behavior learned from their mothers during kittenhood.</p>
<p><strong>Read more:</strong></p>
<p>"Ultimately, adult cats will only lick other cats they trust and do not view as competition. This affectionate grooming behavior can also be transferred to humans. From their perspective, cats categorize beings as either competitive or non-competitive.</p>
<p>"If your cat licks you, it's not necessarily an affectionate gesture, but it’s still a positive sign of recognition, not rivalry."</p>

<figure class="wp-block-image size-landscape_thumbnail">
    <img loading="lazy" decoding="async" width="940" height="627" src="https://c02.purpledshub.com/uploads/sites/41/2021/04/cat-licking-1969c3a.jpg?webp=1&amp;w=1200" alt="A cat licking a person's face." class="wp-image-76541" title="Licking Cat © Getty"/>
    <figcaption class="wp-element-caption">Helpful note: Always consider where your cat last licked before allowing this - Photo credit: Getty</figcaption>
</figure>

<p>If you’re uncertain, take a cue from the bluntly titled University of Lincoln study: <em>Domestic cats show no signs of attachment to their owners</em>. Researchers observed 20 cats with both their owners and strangers, and found the felines bonded with the strangers just as readily, judging by behaviors such as play and mirroring.</p>
<p>The researchers concluded: "These results suggest that adult cats typically maintain autonomous social relationships and do not rely on humans for security."</p>

<h3 class="wp-block-heading" id="h-the-biochemical-theory"><strong>Biochemical Theory</strong></h3>
<p>This theory is straightforward: cats lick you to explore the interesting scents on your skin. According to Dr. Sands, "Cats’ taste buds are so refined that they can detect pheromones or scents from other animals on your skin." </p>
<p>"There may also be residues from food you've eaten, such as salt or moisturizer. These intriguing scents draw cats in, encouraging them to lick."</p>

<h3 class="wp-block-heading" id="h-the-possession-play-theory"><strong>Possession Play Theory</strong></h3>
<p>As Dr. Sands notes, cats are essentially "head-to-tail sniffing machines." Their favorite scent? Their own! Cats are so enamored with their unique scent that they often want to replace other smells with theirs.</p>
<p>"This explains why they lick themselves after being petted—it's a way to eliminate your scent!" says Sands.</p>
<p>"Much of a cat's behavior centers around territory and ownership. When they groom humans, they might be removing scents and marking you with their own to say, 'This is mine! You're mine!'"</p>
<p>Many people mistakenly believe that when cats rub against or lick you, it represents affection. However, in reality, cats are quite possessive, and the more they can mark you with their scent, the better!</p>

Source: www.sciencefocus.com

Unlock Your Productivity: Neuroscientist Tips to Trick Your Brain for Maximum Efficiency

Browse social media, and you’ll encounter numerous claims about productivity hacks, such as waking up at 4 a.m., taking specific supplements, or keeping a jam-packed schedule.

However, many of these tips lack scientific support, and some are even misleading. So, what truly enhances productivity? Are there easily applicable life hacks, rooted in science, that we can incorporate into our daily routines?

While no hack will instantly transform you into the next Bill Gates, there are small yet effective changes you can make to boost your workplace productivity.

The Benefits of Background Music

There’s ongoing debate about productivity levels in home versus office environments, with each side claiming the other is more prone to distractions.

One often-overlooked aspect is that certain distractions can actually enhance productivity. While some individuals prefer a quiet setting, many find they are more productive with ambient noise.

This noise often manifests as background music, which can aid concentration rather than disrupt it. Research shows that we have two distinct attention systems: the conscious one we control, and the unconscious one that alerts us to stimuli, redirecting our focus.

Music enhances our unconscious alertness – Credit: Rachel Tunstall

When concentrating on tasks, our conscious attention can still be interrupted by unconscious inputs. In a silent environment, background noises become more pronounced, making distractions more likely and negatively impacting productivity.

Playing music can help mask these annoying sounds and redirect our unconscious focus, akin to giving a toy to a bored child. However, the genre matters; songs with lyrics can disrupt concentration because our brains respond more to verbal cues. Research indicates that music negatively affecting your mood can undermine motivation.

Interestingly, video game soundtracks tend to be the most effective for enhancing productivity, as they are designed to engage listeners while they focus on other tasks.

In conclusion, background noise or music can improve productivity instead of hindering it.

Prioritize Adequate Sleep Over Early Rising

If you’ve ever forced yourself to wake up before dawn in hopes of being more productive, you know it can backfire, leaving you fatigued and unable to accomplish tasks.

That said, any wake-up time can be productive if it follows a night of sufficient rest. Going to bed at 8 p.m. and rising at 4 a.m. certainly allows for adequate sleep.

There are numerous health benefits tied to quality sleep. Improved memory retention, sharper focus, better overall health, an enhanced mood and reduced irritability all contribute to greater productivity.

Sleep serves as the foundation for productivity – Credit: Rachel Tunstall

Sleep also enhances productivity by enabling memory processing and integrating daily experiences into existing neural pathways.

This is why the concept of “sleeping on a problem” often leads to better insights, as your brain processes the issue while you rest, as opposed to exhausting yourself by staying awake to understand it.

In summary, sleep is crucial for productivity, even more so than the time you wake up.

Nature’s Influence: Walks and Workplace Plants

Incorporating houseplants into the work environment is quite common, as is seeking a workspace with a view of nature. While some workplaces may prioritize uniformity over greenery, plants and natural sights are generally appreciated by employees.

Connecting with nature sharpens our focus – Credit: Rachel Tunstall

But why do we expend so much effort bringing nature indoors? It’s not merely aesthetic. Numerous studies indicate that introducing plants into the workplace can boost productivity.

This increase can be attributed to the restoration of attention, also known as “fascination.”

In modern environments, stimuli like screens, signs, and constant changes can hijack our focus. While our brains enjoy these distractions, they require significant mental resources to process them, leading to fatigue.

Conversely, looking at plants provides a cognitive relief, similar to the experience of reading a captivating book. This natural engagement replenishes mental energy, which is why nature enhances productivity. So, if you feel the urge to take a walk to clear your mind, you might be intuitively seeking a refreshment of your brain’s resources.

Diet and Exercise: Moving Beyond Fads for Enhanced Productivity

Many articles focus on how productivity can be improved through diet and exercise, often reflecting the habits of “highly successful people.” However, many of their recommendations can seem impractical for everyday individuals.

A balanced diet and regular exercise promote alertness more than the latest productivity fads – Credit: Rachel Tunstall

You’ve likely encountered stories about individuals with extravagant breakfast routines involving “superfoods” and elaborate preparations. These narratives can appear daunting or unattainable.

Yet, eccentricities aside, it’s clear that both diet and exercise greatly impact productivity. Regular exercise has repeatedly been shown to provide significant benefits for both body and brain. A healthier body can allocate more resources to cognitive tasks, thereby enhancing brain function.

Your diet also fundamentally affects your mental efficiency. Research suggests that junk food can negatively impact brain function, reducing your ability to concentrate and maintain motivation.

Thus, focus on improving your diet and exercise routine to elevate productivity rather than chasing after the newest trends.

Finding Your Productivity Zone

Bear in mind that everyone’s productivity pathway differs. What works for one person may not suit another, as individual factors play a crucial role in productivity.

Understanding your habits is key to maximizing productivity – Credit: Rachel Tunstall

Identifying the elements that work best for you is essential. Achieving a state of cognitive “flow,” often referred to as “being in the zone,” can significantly increase your productivity.

Flow represents the ultimate state of productive focus, allowing you to perform to the best of your abilities. However, reaching this state can be challenging due to the various distractions competing for your attention.

Ultimately, everyone has unique triggers for achieving this in-the-zone experience; thus, discover the specific conditions that enhance your productivity. While productivity advice can be helpful, no one knows your unique productivity style better than you.

Read more:

Source: www.sciencefocus.com

Study Reveals Parrots Use Flexible Naming: Mimicking Human Communication Styles

A comprehensive study involving nearly 900 parrots living alongside humans has unveiled groundbreaking insights. Researchers from the University of Northern Colorado, the University of Vienna, the Acoustical Institute of the Austrian Academy of Sciences, and the University of Pittsburgh Johnstown discovered evidence that certain parrots don’t just imitate human speech but may also create and utilize names to identify specific individuals.



Gray parrot (Psittacus erithacus) named John munching on a cucumber. Image credit: Papuga.

The question of whether animals can employ proper names for themselves and others has intrigued both scientists and the general public for years.

Significant evidence indicates that numerous animals can recognize and respond to names assigned by humans, and some can even invent and utilize unique vocal traits.

Despite this, previous research has failed to demonstrate that a variety of animal species can create and use names that conform to human language conventions.

“While many animals respond to human language cues, only a select few are capable of learning language-like sounds and using them correctly,” stated Professor Lauryn Benedict of the University of Northern Colorado and her team.

“Parrots excel in vocal learning, including human words, and can accurately apply those words as labels for individuals.”

“This capacity for vocal production learning allows researchers to delve into whether and how animals employ vocal labels, rather than merely responding to them. This enriches our understanding of the cognitive processes behind word usage and labeling.”

In this study, the authors scrutinized data from 889 captive parrots collected as part of the ManyParrots project, which is designed to explore vocal learning and cognition in parrots through survey responses and vocal recordings.

Moreover, many survey participants shared additional details that assisted researchers in comprehending how these birds utilize their names.

Nearly half of the respondents provided examples of parrots using names creatively.

Of the 413 audio clips analyzed, 88 instances demonstrated birds using their names as labels for humans or other animals.

The findings also revealed strong evidence that some parrots refer to specific individuals rather than general categories like ‘human.’

Interestingly, many parrots employed these labels in ways humans typically would not; for example, a parrot might vocalize its own name to attract attention.

This study suggests that parrots possess the cognitive and vocal abilities to use names flexibly, ranging from social communication with humans to conversing about individuals who are not present.

Nonetheless, due to variations among species and individual birds, numerous questions persist regarding when, why, and how animals utilize these skills to call out the names of other beings.

“Our research indicates that parrots frequently learn names from humans and apply them in diverse contexts, aligning with their cognitive ability to associate names with specific individuals,” the researchers asserted.

“Although the parrots in our study primarily used human-given names, unanswered questions linger about their capacity for self-naming.”

“Nonetheless, our findings clearly illustrate that animals can learn and employ unique names in appropriate social contexts.”

“Future studies need to investigate this behavior in controlled settings to comprehend the cognitive foundations behind it in parrots and other animal species.”

“The capability to label individuals is likely not limited to captive animals but extends to those in the wild as well.”

“We anticipate that forthcoming research will unveil effective methods to identify animal names independent of human language.”

For more details, refer to the study published this month in the online journal PLoS ONE.

_____

L. Benedict et al. 2026. Name use by companion parrots. PLoS One 21 (4): e0346830; doi: 10.1371/journal.pone.0346830

Source: www.sci.news

New Geological Discovery Reveals Evidence of Ancient Ocean in Mars’ Northern Plains

A continent-like shelf beneath Mars’ surface indicates that a vast ocean may have once covered up to one-third of the planet, reigniting a long-standing debate about Mars’ watery past.



Artist’s impression of Mars as it appeared around 4 billion years ago. Credit: M. Kornmesser / ESO.

While it is widely accepted that Mars had some liquid water on its surface, the existence of long-lasting oceans remains uncertain. It’s debated whether water existed solely in lakes and streams or whether significant oceans formed during Mars’ history.

Previous Mars missions have identified geological features resembling coastlines, but their subtlety and varying elevations complicate their interpretation.

Real coastlines would exhibit consistent elevation across the globe, similar to Earth’s sea level. However, observations suggest otherwise.

“If Mars had an ocean, it likely dried up billions of years ago, more than half of Mars’ lifetime ago,” states Michael Lamb, a professor at the California Institute of Technology.

“Earth has very few features that are billions of years old, especially after continuous erosion and disturbances over time,” he adds.

“We sought terrain that could provide stronger evidence of such an ancient ocean.”



Illustration from orbiter data showing the coastal shelf region of Mars, a hallmark of global oceans formed over extended periods. Image credit: A. Zaki.

Professor Lamb and Dr. Abdallah Zaki from the California Institute of Technology and the University of Texas at Austin analyzed Earth’s geological features to find indicators of past oceans.

Using computer simulations, they virtually drained Earth’s oceans to assess what terrain would remain.

The simulations revealed that a distinctive flat platform, known as the continental shelf, surrounds the region where land meets sea, akin to the ring left around a drained bathtub.

While sea levels have fluctuated on Earth, continental shelves have remained stable, which supports the hypothesis of an ancient Martian ocean.

The researchers utilized topography data from Mars orbiters, discovering similar shelf formations in the northern hemisphere, hinting at an ocean covering a significant portion of the planet.

Such landforms take considerable time to form and are rare in lake environments, supporting the theory of a stable ocean existing for millions of years.

Additionally, evidence of river deltas and coastal shelf features resembling a “bathtub ring” was observed.

“The discovery of the shelf is a vital observation that consolidates the evidence for a Martian coastal zone,” Dr. Zaki commented.

“This previously overlooked aspect strengthens the case for a northern ocean on Mars, leading to further studies on deposits and satellite data.”

For further details, refer to the publication in Nature.

_____

Zaki, A. & Ram, M.P. Identifying topographical features of the early Martian ocean. Nature, published online April 15, 2026. doi: 10.1038/s41586-026-10381-2

Source: www.sci.news

Discover the Unexpected Resilience of Small-Diameter Diamonds

Nanodiamond

Artistic Render of Nanodiamonds

Katerina Conn/Science Photo Library/Alamy

While diamonds are renowned for their eternal qualities, when reduced to nanoscale dimensions, these crystals exhibit unexpected elasticity. Recent experiments on minuscule diamonds have illuminated what contributes to their surprising flexibility.

According to Chongxin Shan from Zhengzhou University, China, “Bulk diamonds are widely recognized for their extreme stiffness and hardness. However, the properties may differ significantly at the nanoscale.” His team investigated diamonds measuring just 4 nanometers in diameter—hundreds of times smaller than certain viruses—to analyze their behavior under pressure.

The investigation involved sandwiching nanodiamonds between diamond-tipped cylinders designed for compression. A force sensor measured the force exerted on the diamonds while a specialized microscope captured their compressed state.

Shan noted the challenges of accurate nanoscale measurements, as minor disturbances can obscure data. To minimize these variables, researchers conducted experiments with approximately 100 distinct diamonds, ensuring a high vacuum environment to eliminate air interference. They discovered that reducing the diamond diameter from 12 nanometers to 4 nanometers resulted in a 30% decrease in stiffness, enhancing elasticity.

By combining experimental data with computer simulation, researchers discerned the underlying mechanics of this phenomenon. Due to their small size, nanodiamonds possess a higher surface-to-core atom ratio, with weak chemical bonds between these regions contributing to increased elasticity. This contrasts with larger diamonds, where stronger core bonds dictate behavior, as explained by Shan.

Yan Lu, a researcher at the City University of Hong Kong, highlighted that these findings reveal unexpected shifts in diamond characteristics compared to earlier investigations. Their work marks a pivotal contribution to understanding nanoscale diamonds, crucial for emerging applications in electronics and quantum technology. “With lab-created diamonds available at lower costs, now is the opportune moment to expand their use,” Lu states.

Topics:

  • Diamond
  • Materials Science

Source: www.newscientist.com

Stunning Hubble 36th Anniversary Image Captures the Trifid Nebula

Stunning new images from the NASA/ESA Hubble Space Telescope showcase the rapid evolution of the Trifid Nebula, a dynamic stellar nursery where newborn stars are actively shaping gas and dust on human timescales.



Hubble’s detailed view of the Trifid Nebula. Image credit: NASA/ESA/STScI/J. DePasquale, STScI.

The Trifid Nebula, also known as Messier 20 or NGC 6514, was discovered by the French astronomer Charles Messier on June 5, 1764. This stunning nebula resides in the constellation Sagittarius.

Distance estimates to the Trifid Nebula vary, ranging from 2,200 to 9,000 light-years away.

According to Hubble astronomers, “The vibrant colors in this region of star formation evoke an underwater tableau of fine sediments drifting through the deep ocean.”

“Massive stars beyond this image have been sculpting this spectacular area for at least 300,000 years,” they noted.

“Their intense ultraviolet winds are still reshaping the environment, creating bubbles that compress gas and dust, subsequently sparking fresh star formation.”

“This isn’t Hubble’s first look at the Trifid Nebula,” they added. “The telescope revisited this cosmic site 29 years after its first observation in 1997, effectively documenting changes within the nebula on human timescales.”

“Why return to the same view? Beyond tracking time-related changes, Hubble has been upgraded with an improved camera that offers a wider field of view and enhanced sensitivity from its fourth servicing mission.”

In this latest view, Hubble captures the Trifid Nebula’s “head” and flowing “body,” resembling a cosmic sea lemon gliding through the universe.

“The ‘horn’ of the Cosmic Sea Lemon is part of Herbig-Haro object 399, a periodic plasma jet ejected by a young protostar embedded within,” the astronomers explained.

“These observed changes help scientists measure outflow rates and gauge the energy injected by the protostar into its environment.”

“Such measurements provide valuable insights into how newly formed stars interact with their surroundings.”

“Evidence of a counterjet can be seen below and to the right, marked by a jagged line of orange and red running across the dust.”

“To the right of the head, at the endpoint of a dimmer triangular ‘horn,’ lies another young star.”

“A green arc hovering above a faint red point, accompanied by a small jet, suggests that a nearby massive star is eroding the circumstellar disk with its intense ultraviolet radiation.”

“As the region around this protostar clarifies, it implies that its formation might be nearing completion.”

“Just to the left of the Cosmic Sea Lemon is a faint pillar; the densest material remains at the top, while most of the gas and dust has been blown away.”

“Distinct stripes and sharp lines provide further clues about the activities of other young stars.”

“Look for wavy diagonal lines that transition from bright orange to fiery red for an illustrative example.”

“At the pinnacle of the Cosmic Sea Lemon’s head, bright yellow gas ascends, showcasing ultraviolet light illuminating the dark brown dust and breaking it down.”

“Many ridges of dark material will persist for millions of years as the star’s ultraviolet radiation gradually erodes the gas.”

“Dense regions harbor protostars, which remain hidden in visible light.”

“The far right corner appears nearly pitch black, marking dust so dense that any stars visible there are likely foreground objects rather than members of this star-forming region.”

“Search for bright orange orbs; these represent fully formed stars, surrounded by empty space.”

“In the coming millions of years, the gas and dust constituting the nebula will vanish, leaving only the newly formed stars behind.”

Source: www.sci.news

Triassic Fossil Discovery: Ancient Crocodile Cousin with Powerful Jaws Unveiled at Museum

CT scans of specimens from the Yale Peabody Museum of Natural History have unveiled a new species of short-nosed crocodilian with remarkably robust jaws, offering a glimpse into late Triassic ecological specialization.



Eosphorosuchus lacrimosa (left) is disturbed by Hesperosuchus agilis (right) near the carcass of Coelophysis at Ghost Ranch, New Mexico, USA. Image credit: Julio Lacerda.

Eosphorosuchus lacrimosa thrived 210 million years ago, inhabiting areas near rivers and lakes in present-day New Mexico, USA.

This ancient reptile was known for its speed, featuring large hind legs and small, slender arms.

Characterized by a short snout, a heavily fortified skull, and powerful jaw muscles, Eosphorosuchus lacrimosa was adept at swiftly catching sizable prey.

“This discovery highlights the early diversification of primitive crocodiles at the onset of the reptilian era,” stated Dr. Bhart-Anjan Bhullar, a paleontologist at Yale University and the Yale Peabody Museum of Natural History.

“During this Late Triassic period, two dominant reptilian lineages were emerging: one led to modern crocodiles, while the other gave rise to dinosaurs and, eventually, birds.”

In contrast to dinosaurs of that time, which were slender and agile, resembling herons, ancient crocodiles were robust four-legged predators, sharing physical traits with jackals and large foxes.

The holotype specimen of Eosphorosuchus lacrimosa comprises its skull, lower jaw, spine, limbs, and sections of its armor.

Discovered in 1948 at Ghost Ranch, New Mexico, this fossil remained largely unexamined for more than 75 years until now.

Phylogenetic analysis positions Eosphorosuchus lacrimosa near the base of Crocodylomorpha, outside a clade that also includes the small crocodilian, Hesperosuchus agilis.

This positioning suggests that its distinct traits evolved early in crocodilian history.

The fossilized remains indicate that Eosphorosuchus lacrimosa coexisted with Hesperosuchus agilis, hinting at early ecological niche differentiation among similarly sized terrestrial predators.

“Eosphorosuchus lacrimosa is one of the few well-preserved relatives of early crocodilians, representing the ‘dawn’ of functional diversification within the lineage leading to modern crocodiles,” noted Miranda Margulis Onuma, a doctoral student at Yale University.

“Beyond its unique anatomy and preservation history, this specimen underscores the potential of existing museum collections to unveil new insights into life’s history.”

Notably, the discovery provides a rare look into an ancient ecosystem where biodiversity flourished, and species exhibited specialized feeding structures to fulfill distinct ecological roles.

The research team’s study appears this month in Proceedings of the Royal Society B.

_____

Miranda Margulis Onuma et al. 2026. Short-snouted sphenosuchids with unusual feeding anatomy indicate that ecological specialization occurred early in crocodilian evolution. Proc Biol Sci 293 (2069): 20260130; doi: 10.1098/rspb.2026.0130

Source: www.sci.news

Can We ‘Vaccinate’ Ourselves Against Stress? Exploring Effective Stress Management Techniques


While it might sound unusual, you can actually inoculate yourself against stress.

Just as vaccines help the immune system fend off invaders, research suggests stress inoculation can prepare individuals for future stressors.

This concept is particularly noted among military personnel. By allowing soldiers to undergo simulated stressful situations and equipping them with coping mechanisms, they can reduce the impact of stress over time. For instance, a study found that cadets with resilience training showed lower cortisol levels following intense military drills compared to those without such training. Similarly, emergency personnel also experience lower risks of post-traumatic stress disorder (PTSD) and depression due to their resilience training strategies.

Fortunately, you don’t need military training to reap the benefits. Regular, manageable exposure to stress can enhance resilience, as observed by Julie Vashuk from Masaryk University, Czech Republic.

Recent studies indicate that navigating stressful experiences can actually reshape the brain, including changes in key areas such as the prefrontal cortex, involved in emotion regulation, the hippocampus, crucial for memory, and the amygdala, responsible for threat perception. Facing mild stressors can help individuals adapt to challenges in two ways: by enhancing resilience and by accelerating recovery to baseline.

It’s essential to keep stress levels manageable. As Vashuk advises, mild stress should induce just enough discomfort to be tolerated without becoming overwhelming. “Once you’re overwhelmed, it becomes traumatic,” she explains. Activities like visiting unfamiliar places or engaging with new people can be beneficial. She also recommends surrounding yourself with supportive individuals.

This exposure therapy can be useful for adults, but what about children? Although severe early-childhood adversity is known to elevate health risks, numerous studies suggest that a small amount of controlled adversity may actually be advantageous. In rodent studies, prolonged separation from their mothers heightens adult stress responses, while brief separations can foster more resilient adult responses. A similar phenomenon has also been observed in primates concerning short-term mother-infant separation.

Extrapolating such studies to humans poses ethical challenges, yet researchers like Carmine Pariante from King’s College London argue that we may not be as resilient as a society as we think. This doesn’t imply inflicting trauma intentionally, but rather suggesting that facing manageable challenges can benefit both adults and children.

Simulated stress exposure helps soldiers build real-life resilience.

Daniel Ceng/Anadolu via Getty Images

Vashuk also highlights a cultural phenomenon in the Czech Republic, where children are introduced to classical music early on. “Five-year-olds perform with their teachers, gradually performing solo as they mature. Although the stress remains, their early exposure equips them to effectively handle stress and rebound quickly,” she notes.

Exposure isn’t the sole method for building resilience. Techniques such as breathing exercises, mindfulness, altering your mindset regarding stress, and recognizing your strengths are proven to boost resilience and transform negative stress into positive energy.

Research is ongoing into the concept of a literal stress vaccine. Studies on rodents indicate that exposure to a heat-killed bacterium, Mycobacterium vaccae, calms stress responses via anti-inflammatory effects. Additionally, experimental drugs like “Alexigent” aim to enhance stress tolerance in individuals predisposed to PTSD and depression, although significant advancements remain limited. A notable 2017 study showed that a single ketamine dose can mitigate stress impacts on mice.

For most of us, however, the solution lies in the simplicity of understanding that stress is not inherently detrimental (see “Why the right kind of stress is crucial for health and well-being”). “Stress is beneficial for growth,” Vashuk states. “Experiencing stress is vital for our responses. What’s equally important is the ability to recover swiftly. Building resilience is crucial for regulating stress hormones effectively.”


Source: www.newscientist.com

Revolutionary New Method Shows Promise in Preventing Sepsis Deaths

Blood Plasma Treatment for Sepsis

Extracting a patient’s plasma and removing certain proteins may enhance sepsis treatment outcomes.

Vital Hill/Shutterstock

Patients suffering from severe sepsis may soon benefit from innovative treatments that filter their blood to remove critical proteins underlying life-threatening responses. Promising results in animal studies set the stage for human trials scheduled for next year.

Sepsis occurs when the immune system overreacts to an infection, causing severe damage to tissues and organs. It can escalate to septic shock, which leads to dangerously low blood pressure and further complications. In 2017, there were 49 million cases of sepsis worldwide. According to a meta-analysis of patients in Europe, North America and Australia, 32% of sepsis patients died within 90 days despite treatment for the initial infection and organ damage, while mortality reached 39% among those with septic shock.

Emerging therapies that target the root causes of this condition could halt the progression of sepsis. Isaac Elias from the Amitabha Medical Clinic and Healing Center in Santa Rosa, California, has dedicated decades to studying a protein called galectin-3. This protein has numerous functions in healthy individuals, including regulating cell life cycles and activating immune responses. Galectin-3 is believed to be implicated in various health conditions, with Elias stating, "Our research spans multiple areas, from autoimmunity to cancer."

Curious about galectin-3’s potential role in sepsis, Elias noted that high levels of this protein correlate with an increased risk of death in sepsis patients. To explore this, Elias and his team developed a device that filters galectin-3 from the blood. The process involves withdrawing a sizable blood sample, separating the plasma in a centrifuge, and using a filter with antibodies to target and remove galectin-3. The purified plasma is then combined with the blood cells and reintroduced to the patient.

This innovative apheresis device is currently being tested by teams including Peng Zhiyong from Zhongnan Hospital of Wuhan University in China, applying a multifaceted approach.

Initially, they monitored 87 septic patients versus 27 healthy individuals, discovering elevated galectin-3 levels in the sepsis group. Subsequent assessments showed a decrease in galectin-3 levels among survivors.

The research team also utilized the hemofiltration device in two animal models of sepsis, starting with 48 rats that developed sepsis due to a large intestine puncture. Of these, 28 underwent galectin-3 hemofiltration, while the rest received a sham procedure. Remarkably, 57% of the treatment group survived, compared to just 25% in the control group.

Furthermore, the team applied galectin-3 apheresis to minipigs subjected to lipopolysaccharide, a bacterial component that induces a robust immune response and sepsis. All pigs received intensive care, with 16 undergoing galectin-3 apheresis and 15 getting sham apheresis. The treatment group demonstrated higher survival rates: 69% versus 27%.
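Taken together, the two animal experiments point in the same direction. As a quick sanity check on the survival figures quoted above (plain arithmetic only; the function name is our own illustration):

```python
def survival_gain(treated_pct, control_pct):
    """Absolute difference and relative ratio between the survival
    percentages of the treatment and sham arms."""
    return treated_pct - control_pct, round(treated_pct / control_pct, 2)

# Rats: 57% vs 25% survival; minipigs: 69% vs 27%
print(survival_gain(57, 25))  # (32, 2.28)
print(survival_gain(69, 27))  # (42, 2.56)
```

In both models, galectin-3 apheresis was associated with roughly a 2.3- to 2.6-fold relative improvement in survival, which is what makes the planned human trial notable.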

“This is certainly innovative,” remarks Jirali Anand of Raymond Poincaré Hospital in Garches, France. “The results remain consistent across both animal models.” Nevertheless, he emphasizes the need for further research to uncover how galectin-3 contributes to sepsis before establishing a standardized treatment. Anand also anticipates replicating these results in independent studies and different animal species, including primates.

Elias’ company, Elias Therapeutics, is actively seeking funding to launch a randomized clinical trial of galectin-3 apheresis in humans, aimed for initiation in 2027.

Source: www.newscientist.com

Experience the Stunning Earthset Video Captured by Artemis II Astronauts Using Their iPhones

Witness a rare spectacle: the distant Earth vanishing behind the massive moon, a moment experienced by only a few.



NASA astronaut Reed Wiseman, commander of the lunar orbiting Artemis II mission, shared a breathtaking video of Earth fading away from the far side of the moon.

“It’s like experiencing a beach sunset from the most exotic seat in the universe,” Wiseman commented on the video, captured through the Orion spacecraft’s window. He described this moment as a “once in a lifetime opportunity.”

On April 1, Wiseman and fellow NASA astronauts Christina Koch, Victor Glover, and Canadian astronaut Jeremy Hansen embarked on their lunar journey. After orbiting both Earth and the moon for 10 days, they returned home on April 10, landing in the Pacific Ocean near San Diego.

During their mission, the Artemis II crew became the first humans in more than 50 years to see the moon’s far side, the hemisphere that remains hidden from Earth.

Wiseman couldn’t resist filming the Earthset using his cellphone while orbiting the moon on April 6, capturing intricate details of the cratered lunar surface.

“The docking hatch window barely revealed the moon,” Wiseman noted. “But an iPhone perfectly captured the view, with an uncropped, uncut 8x zoom, akin to the human eye’s perspective.”

While Wiseman recorded the Earthset, his crew members diligently photographed and documented the moon’s varied terrain and impact craters.

“Listen to the Nikon shutter as @Astro_Christina takes that stunning Earthset photo through the 400mm lens,” Wiseman shared on X about Koch’s work.

This stunning image, captured by the Artemis II crew from the Orion spacecraft on April 6, shows Earth dipping behind the moon’s edge.
NASA

The astronauts dedicated around seven hours to take photographs and collect data during this historic lunar flight. Upcoming releases will showcase more breathtaking images of the moon’s landscapes with Earth in the background.

Wiseman’s Earthset video pays tribute to the iconic Earthrise photo from the 1968 Apollo 8 mission. Whereas Apollo 8 showcased the Earth emerging, Wiseman’s video depicts it vanishing.

On December 24, 1968, the Apollo 8 crew captured the moment when Earth appeared above the moon’s horizon.
William Anders / NASA

Artemis II marked NASA’s first moon mission in over 50 years. Wiseman, Koch, Glover, and Hansen were the pioneers traveling aboard the Space Launch System rocket and Orion capsule.

Looking forward, NASA’s Artemis III mission is set for mid-2027. The mission aims to remain in low-Earth orbit, executing technology tests with either a SpaceX or Blue Origin lunar lander before the upcoming lunar landing scheduled for Artemis IV in 2028. The agency intends to have one of the landers rendezvous with the Orion capsule in lunar orbit for a crewed lunar descent.

Source: www.nbcnews.com

How to Calculate Your Stress Score: Assess Your Stress Levels Effectively

Understanding your stress levels can often feel subjective, but advancements in technology are making it more measurable.

Many smartwatches are now equipped to assess your heart rate, offering a basic indicator of stress. The normal resting heart rate for adults ranges from 60 to 100 beats per minute. When stress occurs, the body releases cortisol and adrenaline, which can elevate this rate. A diminished capacity to recover from stress may lead to prolonged increases in heart rate.

Additionally, various smartwatches measure heart rate variability (HRV), which captures the natural fluctuations between successive heartbeats. Under stress, both cortisol and adrenaline cause your heart rate to quicken, leading to reduced variability. Conversely, when the parasympathetic nervous system activates to regain balance, heart rate fluctuations increase. Since average HRV varies from individual to individual, it’s advisable to track deviations as markers of stress.

Over time, monitoring your heart rate and HRV can yield a stress “score,” pinpointing activities or individuals that may contribute to excessive stress (refer to Why the right kind of stress is important for your health and well-being). However, these scores can be imprecise; recent research indicates that they may fail to differentiate between positive excitement and harmful stress.
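To make the idea concrete, here is a minimal sketch of how beat-to-beat (RR) intervals can be turned into an HRV number and compared against a personal baseline. This is a simplified illustration, not any watch vendor’s actual algorithm; RMSSD is a standard HRV metric, but the threshold and sample values here are invented for the example.

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive differences between heartbeats (ms),
    a common time-domain HRV metric: higher means more variability."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_flag(current, baseline, threshold=0.75):
    """Flag possible stress when HRV drops well below the personal baseline."""
    return current < threshold * baseline

relaxed = [820, 845, 810, 860, 830, 855]   # RR intervals at rest (ms)
stressed = [640, 642, 638, 641, 639, 640]  # faster, more uniform beats

print(round(rmssd(relaxed), 1))   # 34.3 ms: high variability
print(round(rmssd(stressed), 1))  # 2.6 ms: variability collapses under load
print(stress_flag(rmssd(stressed), rmssd(relaxed)))  # True
```

Because baseline HRV differs from person to person, what matters is the comparison against one’s own resting values rather than a universal cutoff, mirroring the advice above to track deviations.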

Cortisol is another critical biomarker for stress researchers. However, its delayed rise, peaking roughly 20 minutes after a stressor, makes it less practical for immediate assessment. Research conducted by Julie Vashuk at Masaryk University in the Czech Republic relies on saliva, urine, or blood samples for comprehensive analysis. A biosensor designed for continuous cortisol monitoring is under development, with the aim of future commercial availability; the ability to track cortisol continuously would deepen our understanding of stress.


In the near future, Vashuk predicts, potential biomarker innovations might stem from bone cells. Under stress, these cells release glutamate, which blocks the breakdown of the hormone osteocalcin.

This leads to an influx of osteocalcin in the bloodstream, decreasing parasympathetic activity and triggering a fight-or-flight response.

Understanding heart rate variability is essential for assessing stress levels

Nastasic/Getty Images

“We believe that under stress, the skeleton rapidly produces molecules that serve as better biomarkers for real-time conditions,” Vashuk mentions.

“These bone-derived substances play a significant role in directing energy to necessary areas,” she continues. “In the future, one of these molecules could emerge as a valuable biomarker for stress.”

Topics:

Source: www.newscientist.com

How Parrots Use Broken Beaks to Establish Dominance Among Males

Caring for a Broken Beak

Bruce is a kea with only half a beak.

Photo by: Ximena Nelson

In 2013, a small, malnourished parrot faced dire circumstances in the Arthur’s Pass wilderness of New Zealand’s South Island, missing half of its beak.

Ximena Nelson, a researcher at the University of Canterbury, discovered the bird (a kea, Nestor notabilis) suffering from a beak injury, likely caused by trauma. Recognizing the kea’s endangered status, Nelson’s student opted to rescue him.

This decision would change Bruce’s life forever, setting him on a path to unexpected prominence.

Initially, zookeepers at the Willowbank Wildlife Sanctuary in Christchurch assumed the parrot was female and named her Kati due to the absence of the upper beak. Male keas possess large upper beaks for digging, and it was unclear how a bird with a half-beak could thrive. “I felt he could bite my finger off,” Nelson remarked.

However, DNA tests later confirmed that Kati was a male, and he was renamed Bruce, a title they considered humorously unfit for a parrot.

To everyone’s surprise, Bruce excelled among nine males and three females at Willowbank, swiftly establishing himself as the alpha male of the group, called a “Circus” (the collective term for a group of keas).

Bruce’s success stemmed from his unique adaptation; the absence of his upper beak allowed him to use his lower beak as a weapon, enhancing his competitive edge.

According to Nelson, Bruce’s straight and sharp lower beak proved instrumental in his jousting tactics against rival birds.

Other males typically weigh over 1 kg and outweigh Bruce, but their upper beaks obscure their lower ones, limiting their effectiveness in confrontations.

“Should they attempt to headbutt another bird, the impact would be blunted,” Nelson noted. “Conversely, Bruce charges at his competitors, often almost falling over in his enthusiasm.”

Nelson added, “His jabs are intense; the other birds despise it. When Bruce engages, they quickly take flight.”

Of the 162 aggressive interactions noted over four weeks, Bruce dominated, winning all 36 encounters he participated in.

He also maintained control over four feeders in the enclosure, sometimes enlisting lower-status birds to preen and groom him, a behavior unseen in other captive birds.

The research team aimed to investigate how Bruce’s dominance impacted the social hierarchy, discovering that his stress hormone levels were significantly lower than those of his competitors. His alpha status enabled him to engage in aggression far less frequently than required by others.

Researchers assert that besides humans, Bruce represents the first documented case of an injured animal achieving and sustaining alpha male status solely through behavioral innovation.

His story embodies the message that differences need not be disadvantages, and notably, he did not require any beak repairs.

“I genuinely admire Bruce,” Nelson commented. “When it’s time to fight, he puts in his all, fiercely and energetically. Nevertheless, he isn’t a bully.”

Topics:

Source: www.newscientist.com

Exploring the Existence of ‘Cosmic Fossils’: Black Holes from Before the Big Bang Still Present Today

New research by Professor Enrique Gaztanaga of the University of Portsmouth and the Institute of Space Sciences in Barcelona proposes a groundbreaking theory that some black holes might have formed before the Big Bang and survived a cosmic ‘bounce’. This intriguing idea could shed light on dark matter, the gravitational wave background, and the formative years of supermassive black holes and galaxies.



Gaztanaga proposes a new dark matter mechanism involving relic black holes stemming from a pre-big-bounce collapse.

“For almost a century, cosmologists have traced the universe’s history back to a singular event known as the Big Bang,” Professor Gaztanaga remarked.

“The conventional theory suggests that space and time originated from an extremely hot and dense state approximately 13.8 billion years ago, leading to billions of years of cosmic expansion and galaxy formation.”

“This prevailing model has been remarkably successful, accounting for the cosmic microwave background (CMB) radiation—an echo from the early universe—and accurately predicting the distribution of galaxies across the cosmos.”

“Nevertheless, several profound mysteries in physics remain unresolved. We still lack understanding about the Big Bang’s cause, the universe’s initial special conditions, the rapid expansion known as inflation, and the nature of dark matter, which outweighs ordinary matter by a factor of five.”

“Our research investigates the possibility that the universe didn’t originate from a singularity but may have emerged from a cosmic bounce that mimicked inflation, with some of the universe’s oldest objects potentially surviving as relics from an earlier epoch.”

Some black holes may have emerged during the universe’s early stages and survived this cosmic bounce, leaving behind relics that could still influence galaxy structures billions of years later.

Others may have formed immediately after density fluctuations were amplified, resulting in a more uneven distribution of matter during the early universe.

These concentrated clumps of matter collapse more readily under their own gravity, increasing the likelihood of forming large cosmic structures and black holes early on.

Within Einstein’s theory of general relativity, the Big Bang represents a singularity, a point where density becomes infinite and known physical laws cease to function.

Many physicists view this as indicative of an incomplete understanding of the universe’s earliest moments.

Another concept to consider is bounce cosmology. This theory posits that our universe originated from a colossal cloud that first contracted and then expanded.

Rather than collapsing into an infinite singularity, the universe reaches a very high but finite density before reversing its motion.

“Singularities often signal that a theoretical framework has hit its limitations,” Professor Gaztanaga asserts.

“Bounces offer an avenue for the universe to transition from contraction to expansion without necessitating new and exotic physics.”

Scientists posit that this bounce might emerge naturally from quantum physics. Under extreme densities, quantum effects generate powerful pressures that prevent matter from compressing infinitely. This phenomenon stabilizes dense objects like white dwarfs and neutron stars, potentially replicating the inflationary phase.

New models suggest that similar effects could manifest on a cosmic scale. As the universe contracts, this quantum pressure can halt the collapse and trigger a rebound into expansion.
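A standard toy model for this kind of quantum-pressure bounce, borrowed here from loop quantum cosmology purely for illustration (it is not necessarily the specific mechanism in Gaztanaga’s paper), modifies the Friedmann equation for the expansion rate H:

```latex
H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right)
```

Here ρ is the density and ρ_c is a critical density set by quantum effects. In a contracting universe, as ρ approaches ρ_c, the correction term drives H to zero and then flips its sign, turning collapse into expansion without the density ever becoming infinite.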

This cosmic bounce could address two pressing mysteries in cosmology.

First, it could elucidate why the early universe expanded so rapidly and uniformly in all directions.

Second, it may help explain why the universe appears to be expanding at an accelerating rate today—an effect currently attributed to a poorly understood force referred to as dark energy.

A notable hypothesis is that certain structures formed during the collapse phase may have persisted after the bounce.

New calculations indicate that compact objects exceeding about 90 meters in size might traverse the transition and reemerge as remnants in the expanding universe.

Potential artifacts include gravitational waves, density fluctuations, and ancient black holes.

These relic black holes could serve to explain dark matter, the unseen material that shapes large-scale structures of galaxies and the universe.

If substantial numbers were created during the bounce, they could constitute a significant portion, or even all, of dark matter.

This notion may also provide insight into recent observations by the NASA/ESA/CSA James Webb Space Telescope of unexpectedly massive objects, often referred to as ‘little red dots’, in the early universe.

Many astronomers speculate these sources are related to rapidly growing black holes that emerged shortly after the Big Bang.

“If a supermassive black hole existed right after the bounce, we wouldn’t have to start from square one when forming the initial galaxies in the early universe,” Gaztanaga explained.

This theory also presents predictions that could be tested through future observations.

Scientists may seek to detect relic gravitational waves from previous cosmic stages or subtle patterns in the CMB that preserve traces of a pre-Big Bang universe.

“Much research is still required to validate these concepts,” Professor Gaztanaga states.

“However, if the universe did experience a bounce, the dark structures that shape today’s galaxies might be remnants from an earlier cosmic age that preceded the Big Bang.”

This paper is published in Physical Review D.

_____

Enrique Gaztanaga. 2026. Cosmological Bounce Relics: Black Holes, Gravitational Waves, and Dark Matter. Phys. Rev. D 113, 043544; doi: 10.1103/pr4p-6m49

Source: www.sci.news

Chernobyl’s New Reality: Why Radiation is No Longer the Top Threat

When you mention a work trip to New York, envy is likely the reaction you’ll receive. A summit in Paris? Instant jealousy. But say you’re heading to Chernobyl for the 40th anniversary of the world’s worst nuclear catastrophe, and you’ll likely see concern instead.

Many will caution you about cancer risks while others will recall sensational headlines and dramatic documentaries, suggesting radioactive contamination is unavoidable. To uncover the truth, we ventured into the no-go zone. Has pollution improved or worsened? Is nature suffering or thriving? Will the region ever see repopulation? Could the ongoing conflict with Russia reopen radiation concerns?

Four decades on, Chernobyl offers a range of insights, from engineering advancements aimed at radiation containment to environmental transformations as large cooling ponds give way to flourishing forests, and the increasing populations of rare species such as wolves and moose. However, the narrative is complicated by the war, which has sparked widespread devastation, military involvement, and a tumultuous geopolitical landscape.


The one-dimensional view of Chernobyl as a contaminated wasteland is far from accurate.

Presently, Chernobyl is a heavily restricted military zone, situated on Ukraine’s northern border and a potential route for further invasions. With the help of the few scientists still working in the area, New Scientist obtained rare access. Documenting our visit reveals how the simplistic view of Chernobyl as a barren wasteland misses its complex reality: nature is resilient, pollution is largely under control, and the Exclusion Zone has become an intriguing and beautiful locale.

The future of Chernobyl—and indeed all of Ukraine—is uncertain. The ongoing conflict complicates management efforts and hinders scientific research. With the threat of drone attacks looming, the most pressing danger to Chernobyl’s stability may not be radiation (which can be monitored with appropriate funding) but rather the actions of Russia.

Source: www.newscientist.com

Unlocking Quantum Computing: The Key to Revolutionizing AI Development

Quantum Computing and AI: A Future Collaboration

Nespix/Shutterstock

Quantum computers are on the brink of revolutionizing AI applications that currently rely on extensive traditional computing resources. This groundbreaking technology could substantially accelerate advancements in machine learning and various artificial intelligence algorithms.

These advanced quantum systems promise capabilities to perform certain calculations unattainable by classical computers. However, researchers continue to explore whether these advantages extend to data-intensive tasks, like those involving machine learning—an essential component of modern AI.

Now, Hsin-Yuan Huang, along with other research teams, argues that the answer is yes. Their mathematical studies point toward a future where quantum computing significantly enhances AI.

“Machine learning permeates not only science and technology but also our daily lives. In an optimized quantum ecosystem, I believe this architecture will be applicable whenever large datasets are deployed,” he states.

The research from Huang and his team addresses the pivotal concern of how non-quantum data (like restaurant reviews or RNA sequencing results) can efficiently integrate with quantum systems, allowing these computers to utilize their unique properties for superior data processing and learning.

This integration requires putting data into “superposition”, a quantum combination of states that classical machines cannot create. Previously, this was deemed impractical, since all the data in the superposition was thought to require immense storage in dedicated quantum memory devices. However, as Zhao Haimeng at the California Institute of Technology points out, that assumption has been challenged.

Huang’s team has explored a novel method that allows data input in smaller batches without the need for extensive memory, akin to streaming a movie rather than downloading it entirely before viewing.

This method not only demonstrates efficacy but also showcases that quantum computers can manage larger data sets with a reduced memory footprint compared to traditional systems.
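The streaming analogy has a familiar classical counterpart: many quantities can be computed over data that arrives in small batches while keeping only a constant-size summary in memory. The sketch below is a purely classical illustration of that principle (names and data are invented for the example), not the quantum algorithm itself:

```python
def streaming_mean(batches):
    """Compute a mean over data that arrives in batches, keeping only a
    running total and count rather than the whole dataset, like watching
    a movie as it streams instead of downloading it first."""
    total, count = 0.0, 0
    for batch in batches:
        total += sum(batch)
        count += len(batch)
    return total / count

# Data arrives two values at a time instead of as one big array
batches = ([i, i + 1] for i in range(0, 10, 2))
print(streaming_mean(batches))  # 4.5, without ever holding all ten values
```

The quantum result is analogous in spirit: data can be fed into the machine incrementally, so the memory footprint no longer scales with the full dataset.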

Remarkably, the memory efficiency is so pronounced that a quantum computer with approximately 300 error-corrected qubits could outperform a classical computer built from every atom in the observable universe, according to Zhao.

While it may take years to build a quantum computer with 300 logical qubits, Huang anticipates that a 60-qubit model could be feasible by decade’s end. Their analysis indicates significant quantum advantages over classical computers for tasks involving large data sets already in AI applications.

“Quantum machines are indeed formidable, but they require innovative feeding methods,” notes Adrian Perez Salinas from ETH Zurich, Switzerland, emphasizing the importance of gradual data integration.

Nevertheless, challenges remain in applying this new research to tangible devices and real-world datasets. Past quantum machine learning algorithms often proved amenable to “dequantization”, a technique that allows the algorithms to run effectively without quantum hardware. Furthermore, how essential quantum properties are to the new algorithm warrants further investigation, according to Perez-Salinas.

Researchers like Vedran Dunjko from Leiden University in the Netherlands believe the findings are applicable to large-scale scientific endeavors, such as the Large Hadron Collider, where immense volumes of data are continually generated yet often discarded because of memory limitations.

While quantum computers are expected to handle only specific AI applications and similar data-processing tasks, Dunjko suggests, “This may not significantly disrupt today’s GPU-driven data centers, but its implications could still be substantial.”

The research teams continue to explore expanding the range of algorithms suitable for this methodology and devising innovative configurations for quantum computers to process data efficiently, with minimal memory, within practical time limits.

Topics:

  • Artificial Intelligence/
  • Quantum Computing

Source: www.newscientist.com

Rising Temperatures Challenge Ants in Protecting Host Plants – Sciworthy Insights

According to climate models, global temperatures are anticipated to increase by 2-4 degrees Celsius (approximately 4-7°F) by the end of this century. Cold-blooded animals, or ectothermic species, are particularly sensitive to environmental fluctuations, as they depend on ambient temperatures for thermoregulation. In tropical ecosystems, where temperatures remain stable year-round, these cold-blooded organisms experience limited thermal variability. Consequently, they could exhibit lower resilience to temperature shifts, heightening their susceptibility to heat stress.

Social insects, including ants and bees, exemplify cold-blooded species that adapt their behavior in response to temperature changes at both individual and colony levels, complicating predictions about their responses to climate change. For instance, arboreal ants frequently engage in “service exchanges” with host plants through mutualistic relationships. These intricate ant-plant interactions extend their impact, influencing other species. A notable example is certain bird species that prefer nesting in acacia trees defended by ants. Disruptions to this mutualism due to rising temperatures could trigger significant ecological ramifications.

To investigate how increasing temperatures influence symbiotic relationships, researchers analyzed the impacts of direct sunlight and experimentally elevated temperatures on tropical ants residing in trees. This study, conducted in Panama’s Metropolitano Natural Park from February to April 2024, focused on a specific ant species that engages in a mutually beneficial relationship with giant acacia plants. These ants provide protection against herbivores and eliminate competing vegetation in exchange for nourishment and shelter.

The researchers set up open-topped plastic enclosures around 33 acacia trees, ensuring that each ant colony was evenly distributed between shaded and sunlit areas. Sixteen control enclosures were well-ventilated through plastic holes, while seventeen heated enclosures were sealed at the base and contained black paper to enhance heat absorption. The temperature within the heated enclosures was approximately 1.3°C (2.3°F) higher than the control enclosures.

After a week, the researchers assessed ant activity on the branches twice daily—once in the morning (from 7 a.m. to 9:30 a.m.) and again in the afternoon (from 12 p.m. to 3:30 p.m.). Each branch was marked, and researchers counted the number of ants crossing it within a three-minute span, simultaneously recording branch and spine temperatures and noting their sun or shade exposure. They found that ant colonies in heated environments exhibited reduced activity compared to control colonies, particularly on sun-exposed leaves in the afternoon. The ants tended to navigate through the spines, avoiding direct surfaces. Although the spines were approximately 2°C (3.6°F) warmer than the branches, they provided shelter from direct sunlight, indicating that the ants adjusted their behavior to manage heat.

To determine the effect of elevated temperatures on ant defense mechanisms, the researchers pinned a leaf from another plant to the base of the acacia trunk and monitored interactions. Findings revealed that ant colonies in heated enclosures demonstrated diminished defensive behavior toward the foreign foliage compared to control colonies.

Researchers then measured the maximum temperature threshold, labeled Tmax, above which the ants can no longer function. They collected three worker ants from each colony before, and three weeks after, the enclosures were set up. Each ant was placed in a tube at 36°C (97°F), with the temperature increased by 2°C (3.6°F) every 10 minutes. Researchers tapped the tubes gently to check whether the ants could still respond, recording the highest temperature at which each ant remained functional.

The average Tmax for the 33 ant colonies was 46.5°C (115.7°F), showing no significant difference between control and heated groups. Similar Tmax values (around 48°C or 118°F) have been recorded for the same ant species in hotter, drier environments, suggesting these ants possess a naturally limited tolerance for high temperatures. Branch temperatures in the experiments reached 48°C (118°F), indicating that the ants already operate close to their thermal threshold.
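The assay’s stepped ramp can be laid out explicitly. The starting temperature, step size, and interval below come from the study; the helper function itself is our own illustration:

```python
def ramp_temperature(minutes, start_c=36.0, step_c=2.0, interval_min=10):
    """Tube temperature after a given number of minutes of the stepped ramp:
    36 °C at the start, raised by 2 °C every 10 minutes."""
    return start_c + step_c * (minutes // interval_min)

print(ramp_temperature(0))   # 36.0 °C at the start
print(ramp_temperature(50))  # 46.0 °C, just below the average Tmax of 46.5 °C
print(ramp_temperature(60))  # 48.0 °C, matching the hottest branch temperature
```

After about an hour the tube reaches 48°C, the same temperature recorded on sun-exposed branches, underscoring how little thermal margin the ants have.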

The research team concluded that ants reduced their activity levels in response to heat, consequently weakening their protective role for the acacia plants. The researchers speculated that such behavioral changes may render the plants more vulnerable to herbivores and disrupt interactions with other species, including pathogens and birds. They emphasized the need for future studies examining how climate stressors affect these complex interdependencies and their broader ecological implications.



Source: sciworthy.com

How Daily Tooth Brushing Lowers the Risk of Nosocomial Pneumonia

The overlooked advantages of tooth brushing in hospitals

Drazen Zigic/Getty Images

Brushing your teeth while receiving hospital treatment can significantly decrease your risk of developing pneumonia.

Despite its benefits, many patients in hospitals neglect to brush their teeth. Reasons may vary, including forgetting a toothbrush, lack of motivation, or physical limitations. Additionally, healthcare providers often fail to prioritize routine oral hygiene for patients.

The largest randomized controlled trial in this area revealed that patients who received toothbrushes, toothpaste, and educational materials on dental care were 60% less likely to acquire pneumonia during their hospital stay, according to Brett Mitchell from Avondale University, Australia.

“This underscores the necessity of discussing pneumonia risks and the critical role of oral care and tooth brushing during hospitalization,” he states.

Pneumonia, especially ventilator-associated pneumonia, often arises because medical devices disrupt the airways’ normal defenses. However, many hospitalized patients not on ventilators also develop pneumonia more than 48 hours after admission. Ongoing research aims to uncover why this occurs and how to prevent it. Nosocomial pneumonia is notably linked to longer hospital stays, higher costs, and elevated mortality rates, as noted in the study.

“This is a crucial inquiry,” says Michael Klompas from Harvard University, who was not affiliated with the study. “Nosocomial pneumonia is among the most prevalent and lethal hospital-acquired infections, yet we lack concrete data on effective preventative strategies.”

Mitchell hypothesized a connection between the disease and oral bacteria. The oral microbiome can influence respiratory health as bacteria-laden droplets may be inhaled into the lungs. When hospitalized, a patient’s oral microbiome can shift, highlighting a pressing need for intervention, he explains.

Consequently, he and his team initiated a year-long randomized controlled trial involving 8,870 patients across three Australian hospitals to assess the impact of oral care on pneumonia risk. Mitchell presented the findings from this segment of the Nosocomial Pneumonia Prevention (‘HAPPEN’) study at the European Society for Clinical Microbiology and Infectious Diseases (ESCMID Global) conference in Munich, Germany.

In the study, each hospital divided participants into three groups, with no interventions in the first three months. After this period, one group received toothbrushes and toothpaste featuring motivational messages like “Brushing your teeth helps prevent pneumonia” and “Blow away pneumonia!” These brushes were designed with special handles for ease of use. Patients were also given QR codes linking to educational resources on the HAPPEN website.

After six months, a second group received brushes, followed by the third group after nine months, allowing all participants to practice tooth brushing for the study’s last three months.

To support medical staff, the research team provided oral care training for ward nurses and linked professional advice on their website. They encouraged nurses to remind patients about oral care and assist those who struggled with brushing.

During the non-intervention phase, only 15.9% of patients brushed their teeth daily. However, during the intervention phase, 61.5% of patients engaged in daily oral care, averaging 1.5 brushes per day. Web analytics showed that both patients and nurses frequently accessed information on the HAPPEN portal during this period, noted Mitchell.

Simultaneously, the incidence of hospital-acquired pneumonia unrelated to ventilators saw a significant decline, dropping from 1 case per 100 hospital days in the control group to 0.41 cases in the intervention group.
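The roughly 60% figure quoted earlier follows directly from these two incidence rates (simple arithmetic; the variable names are our own):

```python
control_rate = 1.0        # pneumonia cases per 100 hospital days, no intervention
intervention_rate = 0.41  # cases per 100 hospital days with daily brushing

reduction = 1 - intervention_rate / control_rate
print(f"Relative reduction: {reduction:.0%}")  # Relative reduction: 59%
```

A drop from 1 to 0.41 cases per 100 hospital days is a 59% relative reduction, which rounds to the roughly 60% lower risk reported.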

“This study is groundbreaking,” Klompas remarks, emphasizing the substantial sample size and randomized methodology. “Brushing your teeth while hospitalized not only promotes oral health but can also save lives.”

Piry Sipila from the University of Helsinki appreciates the profound risk reduction achieved through such simple interventions. “Patients were essentially provided with a toothbrush, toothpaste, and basic advice,” he observes. Nonetheless, outcomes may differ based on hospitalization reasons and patients’ usual oral hygiene practices.

Source: www.newscientist.com

Understanding Self-Sabotage: Insights from Psychologists and Simple Strategies to Overcome It

Self-sabotage, often referred to by psychologists as “self-handicapping”, involves engaging in behaviors that undermine your own path to success, often without fully realizing it. This can manifest in various domains, such as academic performance, sports, or personal relationships.

For instance, you might skip rehearsing for an important work presentation, or neglect training for an upcoming race. In a romantic setting, even if things are going well, you may start ignoring your partner’s messages.

While such behaviors may seem puzzling and counterproductive, research indicates that self-sabotage serves a purpose. It’s often a subconscious strategy to safeguard self-esteem and mitigate the fears of failure or rejection.

Consider a scenario where you deliberately underprepare for a work presentation, resulting in a poor performance. The failure can be justified by your lack of preparation, rather than reflecting negatively on your abilities.

Similarly, if you finish last in a race due to insufficient training, you can attribute your loss to that lack of effort rather than a lack of talent.

In relationships, if you choose to ignore your partner’s texts and they decide to break up with you, you can attribute the rejection to your behavior instead of feeling that you weren’t good enough for them.

Essentially, self-sabotage provides a convenient excuse to protect your ego in the face of setbacks.

Individuals with a fear of failure or low self-esteem are particularly prone to this pattern. In the short term, it may offer temporary relief, but ultimately, it can increase the risk of long-term failure or rejection.

Anticipating negative outcomes can trigger self-sabotage – Photo credit: Getty

How to Overcome Self-Sabotage

If you genuinely want to excel in your endeavors, such as delivering a great presentation, training for a race, or nurturing a healthy relationship, proactive steps are essential. Avoiding self-sabotage involves addressing these habits one step at a time.

A helpful strategy is adopting a “mastery mindset”. This involves viewing challenges as opportunities for growth, rather than as definitive assessments of your self-worth. If your presentation or race doesn’t go as planned, focus on what you can improve next time.

Another effective technique is practicing self-compassion. Treat yourself with the same kindness as you would a close friend and recognize that your value isn’t dependent on any single event or relationship outcome.

As you grow closer to a romantic partner, embrace any feelings of vulnerability. Remember that even if the relationship ends, it does not diminish your worth or lovability.

This article addresses the inquiry posed by Samantha Osborne via email: “Why do I keep self-sabotaging, and how can I stop it?”

Have more questions? Reach out to us at: questions@sciencefocus.com or connect with us on Facebook, Twitter, or Instagram (be sure to include your name and location).

Source: www.sciencefocus.com

Newly Discovered Triassic Dinosaur Species Unearthed in New Mexico

A newly identified genus and species of carnivorous herrerasaurid dinosaur has been revealed from a well-preserved skull unearthed in northern New Mexico.

Artistic rendition of Ptychoterates buculentus. Image credit: Megan Sodano / Virginia Tech.

This newly discovered dinosaur species lived approximately 201 million years ago, during the Rhaetian age at the end of the Late Triassic epoch.

Identified as Ptychoterates buculentus, this species offers a rare glimpse into a lesser-known chapter of dinosaur evolution.

“Dinosaurs emerged during the Carnian age (237 to 227 million years ago) in the early Late Triassic and eventually branched into three lineages that thrived into the Jurassic: ornithischians, theropods, and sauropods,” explained Virginia Tech paleontologists Simba Srivastava and Sterling Nesbitt.

“While most of the earliest dinosaur fossils have been found in high-latitude regions of Pangea (present-day Brazil, Argentina, Zimbabwe, and India), comparable dinosaur remains from lower latitudes (like Late Triassic deposits in the southwestern United States and Morocco) are rare.”

The fossil remains of Ptychoterates buculentus include a nearly complete skull, which features an intact braincase and the majority of the skull roof, discovered in 1982 at the Coelophysis Quarry in northern New Mexico.

The skull measures about 22 cm (9 inches) in length, indicating it was a relatively tall and narrow-headed dinosaur.

“The skull reveals this species had prominent cheekbones, a broad braincase, and likely a short, deep snout,” added the paleontologists.

“These characteristics are the first of their kind seen in early dinosaurs, highlighting the ongoing evolution of these magnificent creatures.”

Ptychoterates buculentus belongs to one of the earliest-known families of carnivorous dinosaurs, the Herrerasauria.

This species is closely related to two other Triassic dinosaurs, Tawa hallae and Chindesaurus bryansmalli.

These species form part of a newly defined clade, Morphoraptora, characterized by a mix of anatomical traits found in both primitive dinosaurs and later theropods.

“Our anatomical comparisons with other Triassic archosaurs support the identification of Ptychoterates buculentus as a new taxon within the saurischian dinosaurs, closely linked to Tawa hallae,” explained the researchers.

“More broadly, our findings position Ptychoterates buculentus as a member of Morphoraptora, a clade known primarily from Late Triassic deposits in the southwestern United States.”

Previously, scientists believed that by the end of the Triassic period, the earliest lineages of carnivorous dinosaurs had already vanished, replaced by more advanced theropods.

However, the discovery of Ptychoterates buculentus indicates that some of these lineages survived much longer than anticipated, at least in the lower latitudes of the ancient supercontinent Pangea.

“Ptychoterates buculentus was found in strata that appear to date just before the Great Extinction at the end of the Triassic, and members of this family never appeared again, suggesting that this group perished due to the mass extinction,” the scientists noted.

“This finding necessitates a reevaluation of the end-Triassic extinction’s impact, showing that it not only eliminated competing dinosaur species but also long-established dinosaur lineages,” Srivastava added.

“Furthermore, since herrerasaurids have not been discovered elsewhere in the Late Triassic, it is likely that what is now the American Southwest served as the final refuge for these dinosaurs.”

The discovery of Ptychoterates buculentus is detailed in a research paper published in this week’s edition of the journal Papers in Palaeontology.

_____

Simba Srivastava & Sterling J. Nesbitt. 2026. A new taxon of saurischian dinosaur (Triassic: latest Norian or Rhaetian) from Coelophysis Quarry in New Mexico, USA, highlighting the diversity of Herrerasauria in the Late Triassic. Papers in Palaeontology 12 (2): e70069; doi: 10.1002/spp2.70069

Source: www.sci.news

How Electric Car Owners Can Make Thousands by Supporting the Power Grid

Electric cars can generate income while parked

Maskot Bildbyrå

Currently, over 90% of new power-generation capacity comes from renewable sources. However, solar and wind can produce electricity only intermittently, leading to fluctuations in supply. A pilot project in Delaware has demonstrated that electric vehicle (EV) owners can earn considerable income, amounting to thousands of dollars annually, by using their idle vehicles as a distributed storage system. The system stores excess electricity generated during peak production and discharges it during high-demand periods.

Recent findings indicate that the average electric vehicle is parked for 95% of the day. This highlights the potential for power utilities to tap into the energy stored in these vehicles during peak hours and recharge them when demand is lower. Willett Kempton from the University of Delaware estimated that EV owners could profit by selling their stored energy back to the grid.

“Electric vehicles can act as a cheaper energy storage solution compared to traditional battery installations, provided they’re plugged in most of the time,” says Kempton. This innovation could bolster power system reliability and enhance the integration of renewable energy sources.

The Delaware project showcased adaptations of four Ford electric vehicles contributed by the utility Delmarva Power. Throughout 2025, their vehicle-to-grid (V2G) charging was monitored, revealing that each EV could generate up to $3,359 annually if the stored energy were sold at market rates.
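To put the headline number in context, here is a small illustrative calculation. The daily and monthly splits are simple arithmetic on the study’s quoted annual figure, and the plug-in fraction is a hypothetical parameter, not data from the project:

```python
# Illustrative arithmetic on the quoted V2G revenue figure.
annual_revenue = 3359  # USD per vehicle per year, as reported above

daily = annual_revenue / 365   # average earnings per day
monthly = annual_revenue / 12  # average earnings per month

# Hypothetical sensitivity: revenue scales with how often the car is
# actually plugged in and available to the grid (assumed value, not
# taken from the study).
plug_in_fraction = 0.5
prorated_annual = annual_revenue * plug_in_fraction

print(f"Daily: ${daily:.2f}, monthly: ${monthly:.2f}, "
      f"at 50% availability: ${prorated_annual:.2f}/year")
```

The point of the last two lines is that the quoted figure assumes a vehicle that is plugged in most of the time, as Kempton notes above; less availability scales the revenue down proportionally.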

Despite initial optimism about V2G technology when it was first studied in 1997, nearly three decades later, it remains mostly experimental in select regions across the U.S., Europe, Japan, and China.

The complexity of reversing energy flow from grid to vehicles has posed significant challenges, necessitating adjustments from automakers, utility companies, and regulatory bodies alike.

The core issue is that the power grid runs on AC (alternating current), while EV batteries, like most battery-powered devices, store energy as DC (direct current), so charging converts AC to DC. For vehicles to supply power back to the grid, that energy must be converted back to AC.

Implementing this safely requires V2G components that comply with stringent safety regulations. Currently, the simplest V2G setup is a wall-mounted charger that converts the vehicle’s DC output to AC, much like the inverter in a solar installation. Various manufacturers, including Volkswagen and Nissan, now offer compatible wall chargers in select areas.

However, these wall chargers can reach high costs. To combat this, companies like Tesla, BYD, and Renault are innovating EVs equipped with built-in converters for DC to AC inside the vehicles. Additionally, experts like Kempton are working on new safety standards for AC chargers. With broader adoption, the cost of implementing V2G technologies could be substantially less, adding just a few hundred dollars to the price of a vehicle.

Presently, a rivalry exists between manufacturers adopting DC V2G, such as Volkswagen, and those focused on AC V2G, like Tesla. This scenario has been likened to the VHS versus Betamax format war of the 1980s by Alex Schoch, an executive at Octopus Energy. “While Betamax had superior quality, VHS emerged as more affordable, ultimately dominating the market,” he adds.

“There’s potential for both standards to coexist for a time, but long-term scalability demands a dominant standard,” Schoch states. “We unequivocally back AC.”

For consumers considering investing in V2G, a feed-in tariff structure is vital, allowing them to profit from supplying energy back to the grid. In 2024, Octopus launched the UK’s inaugural V2G tariff, though access remains limited for many EV owners. The partnership with BYD allows customers to lease V2G-enabled chargers and electric vehicles.

“Today’s EVs and the next generation rolling out are increasingly V2G-compatible,” Schoch notes, indicating a future with immense distributed energy capacity across the nation.

The advent of V2G technology could help achieve real-time balance in grid supply and demand. However, the rising number of V2G-equipped EVs may strain existing power systems, potentially necessitating grid upgrades.

Recent research indicates that a holistic approach to grid upgrades would be more economical than incremental improvements as V2G technology expands. The study’s lead researcher, Xu Liangcai from the National University of Singapore, emphasizes the need to prepare power systems proactively for the emerging V2G landscape.

“Initially, I thought V2G would be a panacea,” remarked co-author Ziyou Song, also from the National University of Singapore. “However, it’s clear that significant upgrades to power systems are essential to accommodate increased demand for charging.”

Source: www.newscientist.com

Urgent Warning: The Internet Faces Possible Collapse—Act Now to Prevent It!

A significant wave of cyber threats is sweeping across the internet, and it shows no signs of slowing down. According to the World Economic Forum, global cyberattacks surged by 58% over two years and were projected to reach alarming heights by 2025.

Much of this escalation is attributed to AI, with 89% of attacks leveraging artificial intelligence in 2025 alone.

While phishing attacks—where criminals disguise themselves in emails, calls, or texts to extract sensitive information—are predominantly responsible for the rise, a fundamental shift is underway. The announcement of the Claude Mythos Preview by Anthropic reverberated through the tech space, indicating significant advancements in AI capabilities.

This revolutionary model has raised concerns, as it can identify vulnerabilities in software that even seasoned analysts may overlook. As a result, Anthropic launched Project Glasswing, uniting over 40 leading software companies to utilize the Claude Mythos Preview in order to detect and rectify these flaws before malicious actors can exploit similar AI functionalities.

Reportedly, the model has already uncovered thousands of critical vulnerabilities across key operating systems and web browsers. Anthropic warned that a time when such capabilities proliferate among AI models is “not too distant”, posing severe risks to the economy, public safety, and national security.

In essence, the Mythos Preview and similar models reveal that many widely trusted systems on which the Internet is built harbor longstanding vulnerabilities that AI can exploit faster than any hacker.

The pressing question remains: Can we address these security flaws and fortify the Internet in time?

The Open Source Gap

Irrespective of your stance on the tech giants leading the AI charge, one encouraging note is that the most advanced tools in safeguarding the Internet are currently in the hands of “good people.” However, this situation may not hold indefinitely.

The industry’s top AI systems, known as “frontier models,” include closely monitored entities like Mythos Preview.

Nevertheless, a new category known as “open source models” is rising, offering more transparency and innovation opportunities, albeit with accompanying risks. Decentralization could allow malicious actors to modify AI systems for illicit purposes if these models operate on independent servers.

“A few years ago, it wasn’t so accessible, but now anyone can access tools to create AI agents,” says Professor Peter Bentley from University College London, in a discussion with BBC Science Focus.

“While it requires powerful hardware, criminals will undoubtedly invest to reap rewards. They’ll acquire robust systems and local models, making malicious deployment plausible. Pandora’s box is indeed open.”

Anthropic’s Project Glasswing includes Amazon Web Services, Apple, Google, and more to enhance software security – Photo courtesy of Getty

Traditionally, open source models have been less advanced than state-of-the-art systems, but this gap is narrowing quickly. A recent report by the AI Security Research Institute indicates that the disparity is now about six months.

With this pace, it could be just under a year before models like Mythos Preview fall into malicious hands, further endangering fundamental web software. Is urgency starting to sink in?

Filtering the Noise

Before giving in to hysteria, it’s worth acknowledging that the AI sector is often prone to sensationalism.

Firms like Anthropic, OpenAI, and Google may exaggerate their models’ potential and dangers.

This tendency is especially prevalent in claims about the workplace. Despite years of predictions that AI would revolutionize industries, many roles have seen minimal change.

“Significant investments have been made in AI,” noted Bentley, “Yet the landscape has shifted primarily toward efficiency rather than transformation.”

While Anthropic hails the Mythos Preview as a “quantum leap,” others exhibit skepticism.

For instance, noted scientist and author Gary Marcus highlighted in a Substack post after the Project Glasswing announcement that the model is an incremental improvement rather than a groundbreaking leap forward.

An analysis from the AI cybersecurity firm Aisle indicated that a smaller, less expensive model could deliver performance nearly equivalent to that of Claude Mythos Preview.

Despite rising fears regarding malicious use of future AI models, the intent behind such misuse varies widely. “Criminals typically engage for financial gain,” Bentley explains. However, political adversaries might be more inclined to sow chaos than to profit.

“Once any nation acquires this technology, it’s likely they’ll employ it against others,” Bentley warns. “We are inadvertently weaponizing AI.”

AI is driving an increase in phishing scams, where hackers impersonate trusted figures to infiltrate systems – Photo credit: Getty

The Race is On

Clearly, the race is underway to reinforce the Internet before this new generation of models gains traction.

But is simply patching every vulnerability the right strategy? And can we feasibly do so?

Using AI for code correction presents its challenges. “AI-generated code is often convoluted and subpar,” notes Bentley. “Attempting to patch existing code with AI can lead to further complications and new vulnerabilities.”

Perhaps the solution lies in gaining an upper hand while defenders remain ahead of the curve.

A recent post from the UK’s National Cyber Security Centre highlighted that defenders can “shape the battlefield”, leveraging their environment to their advantage while raising the risks for adversaries.

AI can also be effectively employed to monitor for malicious AI activities. In the near term, AI is clumsy in penetrating systems, producing noticeable alerts that are easier to track, as explained in the NCSC post.

For Bentley, the situation resembles an arms race: “It’s akin to providing smart scientists with comprehensive blueprints for creating explosives and letting them loose,” he asserts.

The underlying concern remains: What vulnerabilities may go up in smoke first?

Source: www.sciencefocus.com

How Urban Living Affects Estrogen Levels: Understanding the Impact of City Life

How the Gut Microbiome Influences Hormonal Levels

Nopparit/Getty Images

Recent studies reveal that bacteria in our gut can recycle discarded sex hormones back into the bloodstream. Researchers found that individuals in industrialized societies host significantly more bacteria that perform this recycling than those in hunter-gatherer populations or non-industrialized farmers. This phenomenon may lead to elevated blood levels of certain sex hormones, presenting potential health risks.

“We don’t yet know how the body reacts to this increased input,” explains Rebecca Britten from Jagiellonian University School of Medicine in Poland. “However, the implications could be substantial.”

Sex hormones, including estrogen, travel in the bloodstream. Elevated hormone levels trigger a chemical signal in the liver, causing the hormone to be excreted via the intestines. Bacteria feed on a sugar molecule attached to the hormone, utilizing an enzyme named β-glucuronidase to remove this tag.

Once the tag is cleaved, hormones can be reabsorbed by the body and re-enter the bloodstream. Research indicates that a notable portion of excreted sex hormones undergoes this recycling process due to gut bacteria.

The term “oestrobolome”, introduced in 2011, refers to the collection of intestinal bacteria that influence estrogen levels. More recently, the term “testobolome” has been proposed for the gut bacteria that similarly alter testosterone levels.

The latest research, conducted by a British team, analyzed gut microbiome data from various populations, including hunter-gatherers in Botswana, rural farmers in Venezuela, and urban residents in Philadelphia and Colorado. The findings show that the estrogen recycling ability of gut microbes in industrialized populations is up to seven times greater and twice as diverse compared to hunter-gatherers or rural communities.

Interestingly, the study also highlights that formula-fed infants exhibit up to three times more recycling capacity and eleven times more diversity than breastfed infants. However, factors such as age, gender, and BMI did not significantly affect the oestrobolome composition.

Researchers are now investigating if the enhanced recycling capabilities linked to gene sequences translate to actual increases in estrogen levels in the bloodstream. It remains to be seen whether the body compensates for heightened recycling by adjusting hormone levels.

If certain individuals maintain high estrogen levels due to their microbiome, it could significantly impact fertility and overall health, potentially raising the risk for conditions like certain cancers. Conversely, increased recycling might be beneficial for those with low estrogen levels. “We shouldn’t automatically assume that higher estrogen recycling is detrimental,” Britten notes. “In some cases, it can be advantageous.”

Katherine Cook, a professor at Wake Forest University School of Medicine who studies the microbiome’s connection to breast cancer risk, emphasizes the growing evidence for the gut microbiome’s role in human health. However, she cautions that the current study’s cohort is primarily US-based, suggesting that including a European group could strengthen the findings.

Britten expresses her intention to explore the lifestyle factors contributing to these observed differences. “We want to gather more precise data for further research,” she remarks.

Source: www.newscientist.com

Unexpected Evidence Reveals Fake News Is Not Just a 21st Century Issue

“The Largest Ear of Corn Ever Cultivated”, photographed by W.H. Martin and published by The North American Post Card Co. in 1908; acquired in 2018

Rijksmuseum Amsterdam

Do you remember the viral image of Pope Francis in a striking white down jacket from 2023? It was later found to be generated by the AI tool Midjourney. With fake images and videos saturating the internet, a new exhibit at the Rijksmuseum delves into the historical manipulation of photographs since the advent of the medium.

Featuring prominently in this exhibit is the extraordinary image of a giant ear of corn (above), captured by W.H. Martin in 1908 as part of a fascinating series of postcards showcasing oversized crops and livestock. Martin would cut and paste his scenes before reshooting new images, showcasing innovative photographic techniques for the time.

This incredible work is part of the exhibition fake! Early Photo Collages and Photo Montages, on display at the Rijksmuseum in Amsterdam until May 25th. Below is a pre-1908 photomontage postcard depicting a futuristic New York where cars soar above the skyline. The color was added later, and the contours were slightly altered, giving the photograph a painterly effect.

“Cars Flying Over Mulberry Bend Park, New York” by Theodor Eismann, published before 1908

Rijksmuseum Amsterdam

The Rijksmuseum notes that photographers began utilizing cut-and-paste techniques as early as 1860. This exhibition showcases the evolution of image manipulation leading up to World War II.

Next, we see a peculiar image of a wheelbarrow with an oversized head, crafted between 1900 and 1910.

Photomontage by Unknown Artist, 1900-1910

Rijksmuseum Amsterdam

The fascination with oversized crops culminates once more in a 1908 postcard featuring geese, dwarfed by their human companions, congregating at a market.

Bringing Our Geese to Market, published by Martin Post Card Company, 1908

Rijksmuseum Amsterdam

Source: www.newscientist.com

Exploring the Rise, Fall, and Recovery of Cyclic Cosmology: A Comprehensive Analysis

The largest 3D map of the universe, with Earth at the center and every dot representing a galaxy

DESI Collaboration/KPNO/NOIRLab/NSF/AURA/R. Proctor

The universe is in a state of transformation. While not yet at its conclusion, one day all we know will fade away.

Everything we know—the cities, lakes, planets, solar systems, and the stars—are on a path to an ultimate finale.

What lies ahead? Some experts speculate that the universe’s expansion will eventually reverse, gathering everything tightly until it culminates in a big crunch, only to start anew in a big bounce. This idea, known as cyclic cosmology, has resurfaced, partly fueled by groundbreaking data from the Dark Energy Spectroscopic Instrument (DESI) and its comprehensive 3D map of the universe.

Proponents of cyclic cosmology often advocate for its aesthetic simplicity. If the universe follows this cycle, we may not need to grapple with what caused the Big Bang or what existed before it; those questions may already be resolved. Astronomer Catherine Heymans, the Astronomer Royal for Scotland, summarized it eloquently during a recent lecture hosted by New Scientist: “The universe undergoes a big bang, expands, slows down, and gravity pulls it back, culminating in another big bang.”

Nobel prizewinner Adam Riess, who contributed significantly to the discovery of dark energy, highlights why many cosmologists favor this concept. He says it suggests we are not in a unique universe, and that a cyclic cosmos would make our existence seem less of a coincidence. However, this perspective may be seen as anthropocentric rather than purely physics-based.

For decades, cyclic cosmology lost momentum, especially after Riess’s findings indicated that the universe is expanding at an accelerating rate. If dark energy outweighs gravitational forces, the likelihood of the universe collapsing decreases. Heymans noted, “Current evidence points towards a desolate, cold demise for our universe,” referring to heat death, currently the prevailing theory of the universe’s fate.

This notion isn’t without challenges, particularly when exploring how energy, matter, and entropy behave between cosmic cycles.

The second law of thermodynamics complicates the scenario. It posits that disorder, or entropy, never declines in a closed system like the universe. While entropy rises overall as the universe expands, it would seemingly decrease if contraction occurs—an apparent contradiction lies therein. Although some theoretical work has aimed to circumvent this, the ultimate cycle still reverts to a Big Bang followed by heat death, albeit through a convoluted path.

Prominent theoretical physicist Roger Penrose introduced a model called conformal cyclic cosmology to navigate these complexities. His theory posits that the universe keeps expanding until the end, when matter disintegrates entirely into photons. Here’s the novel aspect: the uniformity at the new cycle’s start mirrors the emptiness at the previous cycle’s conclusion, potentially allowing a new universe to emerge.

While intriguing, this paradigm remains hard to test empirically, though Penrose has suggested potentially measurable signatures. Skepticism persists in the cosmological community, but because the model sidesteps the entropy quandary, it shouldn’t be dismissed outright.

Mayall 4-Meter Telescope at Kitt Peak National Observatory

DESI Collaboration/DOE/KPNO/NOIR

DESI’s expansive cosmic map indicates that dark energy—a previously unstoppable force—may be losing strength. This suggests that while the universe’s expansion continues, its acceleration might be slowing down. As Heymans pointed out, this doesn’t imply a cosmic contraction but marks a significant shift in our understanding of dark energy.

The possibility that dark energy could weaken over the next ten billion years may usher in a new phase for cyclic cosmology. “The transformation of dark energy may pave the way for a universe that can reverse its expansion one day,” noted Heymans.

Understanding the universe’s fate hinges on understanding dark energy, which constitutes nearly 70% of the universe’s total energy content. The nature of dark energy remains elusive, complicating efforts to theorize about the universe’s long-term trajectory. As Riess contended, “Extrapolating into the future without knowing more about dark energy renders predictions difficult.” While a cold heat death may still seem the most probable outcome, the prospect of a big bounce is more conceivable than it has been in decades.

Source: www.newscientist.com

Understanding the Challenges of Changing Your Mind: Why It’s So Difficult

When was the last time you changed your mind?

Peter Kavanagh/Alamy

Novelist Leo Tolstoy famously observed: “The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.”

Until recently, I would have agreed with this sentiment. Various psychological studies reveal that many individuals are remarkably resistant to changing their opinions. This obstinacy, when combined with the rise of social media, has contributed to increased political polarization over the past two decades.

However, I was pleasantly surprised by a recent study indicating grounds for optimism. According to Stephanie Dolvia and psychologists from the University of California, Los Angeles, various techniques can help open our minds, particularly by enhancing our tolerance for emotional discomfort.

Open-mindedness varies among populations and can be measured via a series of statements that gauge agreement, such as:

  • People should consider evidence that contradicts their preferred conclusion.
  • When faced with puzzling questions, multiple possible answers should be considered before reaching a conclusion.

Conversely, individuals who endorse the statement:

  • Changing your mind is a sign of weakness.

are likely to be less open-minded. Those who agree with the first two statements and disagree with the third demonstrate a greater willingness to embrace new perspectives, unlike those who settle on one opinion without evaluating alternative viewpoints or updating their beliefs with new evidence.

The benefits of cultivating an open-minded attitude are many. As illustrated in research by Philip Tetlock at the University of Pennsylvania, open-mindedness enhances individuals’ performance in predicting geopolitical events. After a two-year competition involving over 700 participants, Tetlock discovered that top performers—dubbed “super forecasters”—were significantly more willing to revise their opinions in light of new evidence compared to the average person. This mental flexibility safeguards us against irrational beliefs rooted in hasty conclusions.

Despite the advantages of open-mindedness, practicing it can be challenging. Fear of embarrassment can prevent us from acknowledging past mistakes, while our beliefs often intertwine with key aspects of our identities—like religion or political affiliation—making change feel daunting.

To guard our egos, our brains often engage in “motivated reasoning,” seeking justifications for solidifying our core beliefs, which may involve logical fallacies or misinformation. Thus, maintaining an open mind demands considerable strength to withstand mental discomfort.

Greater emotional awareness is crucial in this pursuit. Dr. Dolbier and colleagues highlight a 2019 study on “wise reasoning”. It revealed that individuals who can describe their emotions in finer-grained terms are better able to consider different perspectives than those who simply label feelings as “good” or “bad”.

If I were more emotionally aware, I might recognize that my anger towards someone else’s ignorance stems from my discomfort in articulating my viewpoint. This insight could lead me to evaluate my arguments more critically, prompting a shift in perspective.

This connection between emotional awareness and open-mindedness may explain why mindfulness often aids individuals in reasoning more rationally. By tuning into our internal emotions, we become more adept at recognizing and overcoming instinctive reactions to opposing viewpoints, resulting in balanced opinions.

Mindfulness helps people avoid knee-jerk reactions

Frank Bienewald/Light Rocket via Getty Images

If meditation isn’t your style, consider role-playing exercises. One study revealed that participants who approached upsetting events with the objectivity of a scientist were noticeably more tolerant of polarizing issues, such as the Israeli-Palestinian conflict. Remarkably, a follow-up experiment found that these benefits lasted for at least five months.

Additionally, contextualizing our disagreements can provide perspective. Intense debates often obscure our multifaceted identities, leading us to mistakenly equate our self-worth with being “right.” Simply reminding ourselves of our attributes—like loyalty, creativity, or humor—can alleviate perceived threats during disputes. This approach is most effective for those already aware of their biases, reinforcing the importance of self-awareness.

Lastly, reframe challenging emotions as growth opportunities. Evidence shows that recalling our potential for cognitive development enables us to respond constructively to opposing viewpoints. This perspective encourages us to view mistakes as learning moments, making it easier to accept that our previous opinions may not have been entirely correct.

Dolbier and her colleagues emphasize that many of these strategies require further testing in diverse contexts, with potential for new methods to emerge. However, existing research offers a solid starting point, and I plan to apply some of these techniques when confronted with challenges to my beliefs.

David Robson’s latest book is The Laws of Connection: 13 Social Strategies That Will Transform Your Life. If you have a question for his column, reach out at: davidrobson.me/Contact.

Source: www.newscientist.com