Pleistocene Fossils Reveal That Hopping Was Common Among Giant Kangaroos, Not Just Smaller Species

A groundbreaking study by paleontologists from the University of Bristol, the University of Manchester, and the University of Melbourne has found that the giant ancestors of modern kangaroos possessed robust hindlimb bone and tendon structures, enabling them to endure the stresses of hopping. This challenges the previous assumption that body size strictly limited this iconic form of locomotion.

Simosthenurus occidentalis. Image credit: Nellie Pease / ARC CoE CABAH / CC BY-SA 4.0.

Currently, red kangaroos are the largest living hopping animals, weighing up to approximately 90 kg.

However, during the Ice Age, some kangaroo species reached weights exceeding 250 kg—more than double the size of today’s largest kangaroos.

Historically, researchers speculated that these giant kangaroos must have ceased hopping, as early studies indicated that jumping became mechanically impractical beyond 150 kg.

“Earlier estimates relied on simplistic models of modern kangaroos, overlooking critical anatomical variations,” explained Dr. Megan Jones, a postgraduate researcher at the University of Manchester and the University of Melbourne.

“Our research indicates that these ancient animals weren’t simply larger versions of today’s kangaroos; their anatomy was specifically adapted to support their massive size.”

In this new study, Dr. Jones and her team examined the hind limbs of 94 modern and 40 fossil specimens from 63 species, including members of the extinct giant kangaroo group, Protemnodon, which thrived during the Pleistocene epoch, approximately 2.6 million to 11,700 years ago.

The researchers assessed body weight estimates and analyzed the length and diameter of the fourth metatarsal (the elongated foot bone crucial for hopping in modern kangaroos) to evaluate its capacity to withstand jumping stresses.

Comparisons were drawn between the heel bone structures of giant kangaroos and their modern counterparts.

The team estimated the strength of tendons necessary for the jumping force of a giant kangaroo and determined whether the heel bones could accommodate such tendons.

The findings suggest that the metatarsals of all giant kangaroos were strong enough to withstand the stresses of hopping, and that the heel bones were large enough to accommodate the required jumping tendons.

These results imply that all giant kangaroo species had the physical capability to jump.
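As a rough illustration of the kind of check involved, tendon stress can be compared against a failure threshold. The sketch below is not the study's model: the body-weight multiple, tendon diameters, and failure stress are all assumed placeholder values.

```python
import math

# Illustrative only: compares peak tendon stress during a hop with a nominal
# failure stress. All numbers are assumptions, not the study's data.

def tendon_stress(peak_force_n: float, tendon_diameter_m: float) -> float:
    """Stress = force / cross-sectional area, assuming a circular tendon."""
    area = math.pi * (tendon_diameter_m / 2) ** 2
    return peak_force_n / area

body_mass_kg = 250.0                  # upper giant-kangaroo estimate from the article
peak_force = 8 * body_mass_kg * 9.81  # assumed hopping load of ~8x body weight
FAILURE_STRESS_PA = 100e6             # ~100 MPa, a commonly cited tendon failure stress

for d_mm in (20, 30, 40):             # hypothetical tendon diameters
    stress = tendon_stress(peak_force, d_mm / 1000)
    print(f"{d_mm} mm tendon: {stress / 1e6:.0f} MPa, "
          f"safety factor {FAILURE_STRESS_PA / stress:.1f}")
```

Because stress falls with the square of tendon diameter, a modestly thicker tendon buys a much larger safety margin, which is why the heel bones needed room for bulkier tendon attachments.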

Nevertheless, the researchers caution that giant kangaroos likely did not rely solely on hopping for locomotion, given their large body sizes, which would hinder long-distance movement.

They highlight that sporadic hopping is observed in many smaller species today, such as hopping rodents and smaller marsupials.

Some giant kangaroo species may have used short, quick hops to evade predators such as the marsupial lion, Thylacoleo.

“Thicker tendons offer increased safety but store less elastic energy,” said Dr. Katrina Jones, a researcher at the University of Bristol.

“This trait may have rendered giant kangaroo hoppers slower and less efficient, making them more suited for short distances rather than extensive travel.”

“Even so, hopping doesn’t need to be maximally energy-efficient to be advantageous. These animals likely leveraged their hopping ability to rapidly navigate uneven terrain or evade threats.”

University of Manchester researcher Dr. Robert Nudds remarks: “Our findings enhance the understanding that prehistoric Australian kangaroos exhibited greater ecological diversity than seen today, with some large species functioning as grazers, akin to modern kangaroos, while others filled ecological niches as browsers, a category absent among today’s large kangaroos.”

For more details, refer to the study results published in the journal Scientific Reports.

_____

M.E. Jones et al. 2026. Biomechanical Limits of Hindlimb Hopping in Extinct Giant Kangaroos. Scientific Reports 16: 1309; doi: 10.1038/s41598-025-29939-7

Source: www.sci.news

Understanding Probability: Common Misconceptions Explained


The Language of Probability: Clarity is Key.

Makhbubakorn Ismatova/Getty Images

When someone states they are “probably” having pasta for dinner but later opts for pizza, do you find it surprising or consider them dishonest? On a more critical note, what does it imply when the United Nations asserts it is “very likely” that global temperatures will rise by over 1.5 degrees Celsius in the next decade, as reported last year? The translation between the nuances of language and the intricacies of mathematical probability can often seem challenging, yet we can discover scientific clarity through careful analysis.

Two fundamental points about probability are widely accepted: something labeled “impossible” has a 0% chance of occurring, while a “certain” event carries a 100% likelihood. The confusion arises in between these extremes. The ancient Greeks, including Aristotle, distinguished between eikos, the likely, and pithanon, the plausible or persuasive. Therein lies a challenge: persuasive rhetoric does not always align with likelihood. Cicero later rendered both terms with the Latin probabilis, the root of the modern word “probability.”

The concept of a measurable, mathematical approach to probability emerged much later, primarily in the mid-17th century. Mathematicians began to address gambling dilemmas, such as how to divide the stakes fairly when a game is interrupted. Concurrently, philosophers probed whether it was feasible to quantify varying degrees of belief.

For instance, in 1690, John Locke categorized degrees of probability on a spectrum from complete certainty, through confidence grounded in personal experience, down to testimony that weakens as it passes through more hands. This classification remains relevant in legal contexts, both historically and presently.

The interplay between law and probability persisted among philosophers. In his writings of the early 19th century, Jeremy Bentham criticized the inadequacy of common language for expressing the strength of evidence. He proposed a numerical scale for ranking strength of belief, but ultimately deemed it too subjective to be practical for justice.

A century later, economist John Maynard Keynes rejected Bentham’s certainty measure in favor of relational approaches. He argued that it was more effective to discuss how one probability might exceed another, focusing on the knowledge base for these estimations, thus establishing a hierarchy without offering systematic communication methods for terms such as “may” or “likely.”

Interestingly, the first systematic resolution to this challenge did not arise from mathematicians or philosophers but from a CIA intelligence analyst named Sherman Kent. In 1964, he introduced the idea of expressing estimative probability with specific terminology in National Intelligence Estimates designed to guide policymakers. He articulated the dilemma faced by “poets,” who convey meaning through words, versus “mathematicians,” who advocate for exact figures. Kent proposed that specific words should correspond to specific probabilities, designating “almost certain,” for example, as a 93% probability, while allowing some leeway to accommodate differing interpretations.
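As an illustration of how such a scheme works in practice, estimative words can be encoded as probability intervals. The exact bounds below only loosely follow Kent's published table and should be read as assumptions:

```python
# Illustrative mapping of estimative words to probability ranges, loosely
# based on Kent's scheme; the exact bounds here are an assumption.
ESTIMATIVE_WORDS = {
    "almost certain":       (0.87, 0.99),
    "probable":             (0.63, 0.87),
    "chances about even":   (0.40, 0.60),
    "probably not":         (0.20, 0.40),
    "almost certainly not": (0.02, 0.13),
}

def word_for(p: float) -> str:
    """Return the first estimative word whose range contains p."""
    for word, (lo, hi) in ESTIMATIVE_WORDS.items():
        if lo <= p <= hi:
            return word
    return "no standard term"

print(word_for(0.93))  # -> "almost certain"
print(word_for(0.5))   # -> "chances about even"
```

Note that the ranges deliberately leave gaps, mirroring Kent's point that some probabilities have no agreed verbal label at all.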

This framework for understanding probability transitioned from the intelligence sector to scientific applications. Research stretching back to 1989 has explored how both patients and medical professionals interpret terms like “may” in medical scenarios. The findings showed some alignment with Kent’s framework, although with notable differences.

Returning to the original question about the meaning of “very likely” regarding climate change, the Intergovernmental Panel on Climate Change (IPCC) offers clarity with explicit definitions. According to their guidance, “very likely” signifies a 90% to 100% probability of an event’s occurrence. Alarmingly, many climate scientists now assert that temperatures have already surpassed the critical threshold of 1.5 degrees Celsius.

However, situations are rarely straightforward. Logically, the statements “Event A is likely to occur” and “Event A is unlikely to be avoided” should be interchangeable, yet research published last year reveals that labeling a climate forecast as “unlikely” diminishes the perceived strength of evidence and scientific consensus compared with stating it is “likely.” This cognitive bias might stem from a preference for positive framing over negative alternatives. A classic example involves a community of 600 individuals facing a health crisis: when presented with two descriptions of the same treatment, most people favor the one that “saves 200 lives” over the one under which “400 people die,” even though the outcomes are statistically identical.

So, what lessons can we draw from this exploration? Firstly, numbers communicate uncertainty more effectively than words, although announcing that “there is a 75% chance I’m having pasta for dinner” may raise eyebrows. Where numbers aren’t practical, ensure a shared understanding of terminology, even in the absence of a formalized framework like Kent’s. Lastly, accentuating the positive tends to foster acceptance of predictions. How likely is that? Well, that’s hard to quantify.


Source: www.newscientist.com

Ultra-Low-Density Worlds Offer Clues to How Common Planetary Systems Form

Comparison of a low-density planet of V1298 Tau with Earth. Image credit: NASA.

Newly discovered planets orbiting V1298 Tau are unusually lightweight, possessing a density comparable to polystyrene. This discovery may bridge critical gaps in our understanding of planetary system formation.

Most planets cataloged in our Milky Way galaxy are larger than Earth yet smaller than Neptune, but nearly all belong to systems that formed billions of years ago, which complicates efforts to understand how such systems arise.

The research team, led by John Livingston from the Astrobiology Center in Tokyo and Erik Petigura from UCLA, has identified four remarkably low-density planets that formed recently around a young star, V1298 Tau, which is around 20 million years old.

“We are examining younger models of the types of planetary systems commonly found across our galaxy,” Petigura remarked.

Initially discovered in 2017, V1298 Tau and its accompanying planets remained largely unstudied until now. Over five years, researchers used both ground-based and space telescopes to track tiny variations in the planets’ orbital periods, revealing intricate gravitational interactions among the four planets. These measurements enabled more precise calculations of each planet’s radius and mass.

To employ this observational method, researchers required initial estimates of each planet’s orbital period in the absence of gravitational interference. Lacking that data for the outermost planet, they relied on educated guesses, risking inaccuracies in their calculations.

“I initially had my doubts,” Petigura admitted. “There were numerous potential pitfalls… When we first acquired data from the outermost planet, it felt as exhilarating as making a hole-in-one in golf.”

By accurately measuring the orbital durations and subsequently estimating the radii and masses, the team determined the densities of the planets. They discovered these are the lowest-density exoplanets known, with radii spanning five to ten times that of Earth, yet only a few times its mass.
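The arithmetic behind that conclusion is simple: mean density scales as mass divided by radius cubed, so a planet of a few Earth masses but several Earth radii comes out far lighter than water. A quick check, using illustrative values inside the quoted ranges:

```python
import math

# Back-of-the-envelope density check; the 3 Earth-mass, 7 Earth-radius planet
# below is a hypothetical example within the ranges quoted in the article.
EARTH_MASS_KG = 5.97e24
EARTH_RADIUS_M = 6.371e6

def density_g_cm3(mass_earths: float, radius_earths: float) -> float:
    """Mean density in g/cm^3 for a planet specified in Earth units."""
    volume_m3 = (4 / 3) * math.pi * (radius_earths * EARTH_RADIUS_M) ** 3
    return mass_earths * EARTH_MASS_KG / volume_m3 / 1000  # kg/m^3 -> g/cm^3

print(f"Earth: {density_g_cm3(1, 1):.2f} g/cm^3")                # ~5.51
print(f"3 M_E, 7 R_E planet: {density_g_cm3(3, 7):.3f} g/cm^3")  # ~0.048
```

The hypothetical planet lands near 0.05 g/cm^3, about the density of polystyrene foam, which is exactly the comparison the researchers draw.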

“These planets exhibit a density akin to Styrofoam, which is remarkably low,” Pettigura explained.

This low density can be attributed to the planets’ ongoing gravitational contraction; as they shrink, they are expected to become super-Earths or sub-Neptunes, the types of planets that typically emerge as systems evolve.

The planets of V1298 Tau are locked in a so-called orbital resonance, meaning their orbital periods form simple ratios. This observation aligns with astronomers’ theories on the formation of most planetary systems, including our own solar system, which start out in tightly packed configurations that eventually evolve into looser, less orderly arrangements, according to Sean Raymond from the University of Bordeaux in France.

“This newly identified system of close, low-mass planets revolving around a relatively young star could provide insights into typical sub-Neptunian systems,” Raymond pointed out. “This discovery is remarkable due to the inherent challenges in characterizing such youthful systems.”


Source: www.newscientist.com

Common Types of Inflammatory Bowel Disease Linked to Harmful Bacteria

Ulcerative colitis is characterized by inflammation of the colon and rectum lining.

BSIP SA/Alamy

Toxins produced by bacteria found in contaminated water can destroy immune cells in the colon’s lining, and people whose intestines harbor these bacteria appear significantly more likely to develop ulcerative colitis.

This conclusion is derived from a series of studies in humans and animals by Zhang and colleagues at Nanjing University, China. If validated, these findings could pave the way for new treatment options.

Ulcerative colitis is one of the primary types of inflammatory bowel disease (IBD), marked by inflammation of the colon and rectum lining. Symptoms typically fluctuate between periods of remission and flare-ups, sometimes necessitating the removal of the colon in severe cases.

The exact cause of ulcerative colitis remains unclear, although it is often regarded as an autoimmune disorder influenced by both genetic and environmental factors. Zhang’s team theorized that immune cells called macrophages might be integral to the condition.

Macrophages are found throughout various body tissues, performing the dual roles of clearing debris and bacteria while regulating local immune responses. They can signal additional immune cell recruitment and initiate inflammation but are equally important in mitigating it.

Researchers discovered that the density of resident macrophage cells was notably reduced in colon tissue from patients with ulcerative colitis compared to those without the condition. Further experimentation demonstrated that depleting macrophages in mice increased their susceptibility to colitis, suggesting that losing macrophage protection leads to colon damage and inflammation.

But what accounts for the lower macrophage levels in ulcerative colitis patients? By analyzing fecal samples, the research team identified a toxin named aerolysin, which significantly harms macrophages while sparing other intestinal cells.

Aerolysin is secreted by several strains of bacteria belonging to the genus Aeromonas, frequently found in freshwater and brackish environments. The strains responsible for producing aerolysin are referred to as MTB (macrophage-toxic bacteria).

In experiments where mice were deliberately infected with MTB, they exhibited greater vulnerability to colitis. Conversely, when the aerolysin gene was removed from the bacteria, or the toxin was neutralized with antibodies, the mice showed no increased susceptibility to the condition.

Ultimately, the research team tested for Aeromonas in stool samples, discovering its presence in 72% of the 79 patients with ulcerative colitis, versus only 12% among 480 individuals without the condition. This test, however, could not confirm if these bacteria were indeed MTB or if they produced aerolysin.
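For a sense of scale, those prevalences correspond to a very large odds ratio, a standard measure of association. The quick calculation below uses only the figures quoted above:

```python
# Odds ratio from the reported Aeromonas prevalences.
cases_pos = round(0.72 * 79)      # ~57 of 79 ulcerative colitis patients
controls_pos = round(0.12 * 480)  # ~58 of 480 people without the condition

odds_cases = cases_pos / (79 - cases_pos)
odds_controls = controls_pos / (480 - controls_pos)
print(f"odds ratio: {odds_cases / odds_controls:.1f}")  # ~19
```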

The findings offer a nuanced perspective. Not every case of ulcerative colitis is linked to MTB, and some individuals can carry MTB without developing the disease.

“We cannot assert that MTB is the exclusive cause of ulcerative colitis,” Zhang states. “Ongoing MTB infection can create a hypersensitive environment in the colon, yet not everyone infected will develop colitis.”

“Environmental and genetic factors certainly influence the emergence of colitis,” she adds.

According to Zhang, there are at least three potential approaches for new treatment development. One involves creating drugs to neutralize the toxin; another would focus on vaccines targeting the toxin or the bacteria producing it; while a third approach seeks to eradicate toxin-producing bacteria via phage therapy, which utilizes viruses that selectively kill specific bacteria.

“The leading theory posits that MTB toxin depletes specialized macrophages in the intestinal lining, undermining intestinal immunity,” explains Dr. Martin Kriegel from the University Hospital of Münster, Germany.

He notes that when the team eradicated all intestinal bacteria in mice and then infected them with MTB alone, the animals’ susceptibility to colitis diminished. This observation indicates that other, yet-to-be-identified bacterial species could also play a role.

“Nonetheless, this may represent a crucial, overlooked factor in the multi-step development of ulcerative colitis, especially in China,” Kriegel suggests.

Zhang and her research group intend to conduct more extensive epidemiological studies to substantiate the association between MTB and ulcerative colitis. If MTB infection is confirmed and becomes increasingly prevalent, it may elucidate the rising incidence of IBD.


Source: www.newscientist.com

Kissing Likely Evolved in Our Common Ancestor with Great Apes 21 Million Years Ago

Kissing is common among most living great apes and likely was practiced by Neanderthals, having evolved in the ancestors of these groups between 21.5 million and 16.9 million years ago, according to a study led by researchers from Oxford University.

Neanderthal. Image credit: Gemini AI.

Kissing can be observed in various animal species, yet it poses an evolutionary enigma. While it carries significant risks, such as disease transmission, it lacks clear reproductive or survival advantages.

Until now, the evolutionary background of kissing has received limited attention, despite its cultural and emotional importance across numerous human societies.

In this recent study, Dr. Matilda Brindle and her team from the University of Oxford undertook the first investigation into the evolutionary history of kissing, utilizing a cross-species perspective based on primate family trees.

The findings indicated that kissing is an ancient characteristic of great apes, having developed in their ancestors between 21.5 and 16.9 million years ago.

This behavior has persisted through evolution and is still evident in most great apes.

The researchers also concluded that Neanderthals, distant relatives of modern humans, likely engaged in kissing as well.

This evidence, alongside earlier studies showing that humans and Neanderthals exchanged oral microbes (through saliva) and genetic material (via interbreeding), strongly implies that kissing occurred between the two species.

Dr. Brindle stated: “This marks the first exploration of kissing from an evolutionary standpoint.”

“Our results contribute to an expanding body of research that illuminates the incredible variety of sexual behaviors found among our primate relatives.”

To carry out the analysis, scientists needed to define what constitutes a kiss.

This task was challenging due to the numerous mouth-to-mouth interactions resembling kisses.

Given their investigation spanned a diversity of species, the definition had to be suitable for a wide range of animals.

Consequently, they defined kissing as non-aggressive mouth-to-mouth contact that does not involve food transfer.

After establishing this definition, the researchers concentrated on groups of monkeys and apes that evolved in Africa, Europe, and Asia, gathering data from the literature where kissing has been documented in modern primates.

Among these are chimpanzees, bonobos, and orangutans, all of which have displayed kissing behavior.

Following that, they conducted a phylogenetic analysis, treating kissing as a “trait” to map onto the primate family tree.

Using a statistical method known as Bayesian modeling, they simulated various evolutionary scenarios along the tree’s branches and calculated the chances that different ancestors also kissed.

The model ran 10 million simulations, producing robust statistical estimates.
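A toy version of the approach, not the study's actual model, is to treat kissing as a binary trait that can flip along each branch, then use simulation to ask which root state best explains the observed tips. In the sketch below, the species set, branch length, flip rate, and tip states are all hypothetical, and the tree is simplified to a star shape:

```python
import random

FLIP_RATE = 0.02   # assumed probability of the trait flipping per million years
# Hypothetical tip data: 1 = kissing documented, 0 = not documented.
TIPS = {"chimpanzee": 1, "bonobo": 1, "orangutan": 1, "gibbon": 0}
BRANCH_MYR = 21    # simplified: every tip sits 21 Myr from the root

def evolve(state: int, myr: int) -> int:
    """Walk the trait down one branch, flipping with FLIP_RATE per Myr."""
    for _ in range(myr):
        if random.random() < FLIP_RATE:
            state = 1 - state
    return state

def posterior_root_kissed(n_sims: int = 100_000) -> float:
    """P(root kissed | tip data) by rejection sampling, with a 50:50 root prior."""
    matches = {0: 0, 1: 0}
    for _ in range(n_sims):
        root = random.randint(0, 1)
        if all(evolve(root, BRANCH_MYR) == v for v in TIPS.values()):
            matches[root] += 1
    total = matches[0] + matches[1]
    return matches[1] / total if total else float("nan")

print(f"P(root ancestor kissed) ~ {posterior_root_kissed():.2f}")
```

With these made-up inputs the simulation favors a kissing root ancestor; the real analysis works on the full primate tree with measured branch lengths and far more sophisticated Bayesian machinery.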

Professor Stuart West from the University of Oxford noted: “Integrating evolutionary biology with behavioral data enables us to draw informed conclusions about non-fossilized traits like kissing.”

“This paves the way for studying the social behaviors of both extant and extinct species.”

While the researchers caution that current data is limited, particularly beyond great apes, this study sets a framework for future inquiries and offers primatologists a consistent method for documenting kissing behaviors in non-human animals.

“Though kissing may seem like a universal act, it’s only documented in 46% of human cultures,” remarked Dr. Katherine Talbot from the Florida Institute of Technology.

“Social customs and situations differ vastly among societies, prompting the question of whether kissing is an evolved behavior or a cultural construct.”

“This research represents a first step in addressing that question.”

The study was published this week in the journal Evolution and Human Behavior.

_____

Matilda Brindle et al. 2025. A comparative approach to the evolution of kissing. Evolution and Human Behavior, in press; doi: 10.1016/j.evolhumbehav.2025.106788

Source: www.sci.news

Remarkable Images Reveal the Effects of Common Antibiotics on E. coli

The above image displays untreated E. coli bacteria, with the lower image showing the effects of polymyxin B after 90 minutes.

Carolina Borrelli, Edward Douglas et al./Nature Microbiology

High-resolution microscopy unveils how polymyxins, a class of antibiotics, penetrate bacterial defenses, offering insights for developing treatments against drug-resistant infections.

Polymyxins serve as a last-resort option for treating Gram-negative bacteria responsible for serious infections like pneumonia, meningitis, and typhoid fever. “Of the priority pathogens identified by the top three health agencies globally, most are Gram-negative bacteria, in large part because of their complex cell envelopes,” states Andrew Edwards from Imperial College London.

These bacteria possess an outer layer of lipopolysaccharides that functions as armor. While it was known that polymyxins target this layer, the mechanisms of their action and the reasons for inconsistent effectiveness remained unclear.

In a pivotal study, Edwards and his team combined biochemical experiments with atomic force microscopy, which captures details at the nanoscale, to examine how polymyxin B attacks E. coli cells.

Shortly after treatment commenced, the bacteria rapidly began releasing lipopolysaccharides.

Researchers observed that the presence of the antibiotic prompted the bacteria to insert more lipopolysaccharide “bricks” into their protective walls. However, this effort left gaps, allowing the antibiotic to penetrate and destroy the bacteria.

“Antibiotics are likened to tools that aid in the removal of these ‘bricks’,” Edwards explains. “While the outer membrane doesn’t entirely collapse, gaps appear, providing an entryway for antibiotics to access the internal membrane.”

The findings also elucidate why the antibiotics occasionally fail: they predominantly affect active, growing bacteria. Dormant bacteria are not building new armor, leaving polymyxin B with nothing to disrupt and rendering it ineffective.

E. coli images exposed to polymyxin B illustrate changes to the outer membrane over time: untreated, 15 mins, 30 mins, 60 mins, and 90 mins.

Carolina Borrelli, Edward Douglas et al./Nature Microbiology

Interestingly, researchers found that introducing sugar to E. coli could awaken dormant cells, prompting armor production to resume within 15 minutes, leading to cell destruction. This phenomenon is thought to be applicable to other polymyxins, such as polymyxin E, used therapeutically.

Edwards proposes that targeting dormant bacteria with sugar might be feasible, though it poses the risk of hastening their growth. “We don’t want bacteria at infection sites rapidly proliferating due to this stimulation,” he cautions. Instead, he advocates for the potential to combine various drugs to bypass dormancy without reactivating the bacteria.


Source: www.newscientist.com

Common Vitamin D Supplements May Actually Decrease Your Vitamin D3 Levels

A recent study reveals that taking vitamin D2 supplements can actually lower vitamin D3 levels in the body, according to research published in Nutrition Reviews.

This finding comes at a time when health experts advise individuals to start replenishing vitamin D, especially as the Northern Hemisphere transitions into autumn.

Vitamin D plays a critical role in regulating calcium and phosphate levels, which are essential for maintaining healthy bones, teeth, and muscles. Deficiency of this vitamin is prevalent globally; in the US, for instance, rates reach 31% among non-Hispanic Black adults.

Not all vitamin D is alike; it comes in two primary forms: Vitamin D2, which is derived from plants and mushrooms, and Vitamin D3, which is synthesized in the skin when exposed to sunlight and can also be found in animal products like oily fish.

During summer months, individuals living in higher latitudes can typically produce sufficient amounts of vitamin D through sun exposure. However, as autumn and winter approach, the sun’s angle is often insufficient for this to occur.

“People often assume they are producing vitamin D on sunny days in October, November, and December,” Professor Susan Lanham-New, the Director of Nutrition Science at the University of Surrey and co-author of the study, told BBC Science Focus. “But they aren’t actually making any.”

A simple rule of thumb: If your shadow is not shorter than your height, the sun is not strong enough for vitamin D3 production.
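The rule works because a shadow's length equals your height divided by the tangent of the sun's elevation, so a shadow shorter than you means the sun sits above 45 degrees; treating 45 degrees as the cutoff for useful UVB is the rule of thumb's own assumption. A quick check:

```python
import math

# Shadow length relative to height for a given solar elevation angle.
# shadow = height / tan(elevation); shadow < height  <=>  elevation > 45 deg.
def shadow_ratio(elevation_deg: float) -> float:
    """Shadow length as a multiple of the object's height."""
    return 1 / math.tan(math.radians(elevation_deg))

for elev in (30, 45, 60):
    print(f"sun at {elev} deg: shadow = {shadow_ratio(elev):.2f} x height")
# sun at 30 deg: shadow = 1.73 x height  (too low for vitamin D3, by the rule)
# sun at 45 deg: shadow = 1.00 x height  (borderline)
# sun at 60 deg: shadow = 0.58 x height  (strong enough, by the rule)
```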

The team analyzed 20 randomized controlled trials to assess the impact of vitamin D2 supplementation, noted Emily Brown, a doctoral researcher in the Lanham-New group. In 18 of these trials, participants receiving vitamin D2 exhibited lower levels of vitamin D3 than those in placebo or control groups.

In northern latitudes during winter, our bodies cannot produce vitamin D even on sunny days – Credit: Getty

“We don’t want people to think that vitamin D2 is somehow harmful. That’s not the case,” Brown commented to BBC Science Focus. “While vitamin D2 does elevate overall vitamin D levels, vitamin D3 should be prioritized.”

A prior study indicated that vitamin D3 is converted more efficiently into its active form compared to vitamin D2, making it a preferable option for supplementation, particularly during the darker months.

Brown is currently planning to explore the reverse effect, specifically what happens to D2 levels when individuals take D3.

The findings are especially pertinent for vegans who cannot obtain D3 from conventional dietary sources. While vegan-friendly D3 is derived from lichens, it is not as widely accessible as D2.

In 2022, scientists also developed a GMO tomato capable of producing D3, and the Lanham-New team is currently investigating whether this can effectively boost human vitamin D levels.

“I was genuinely surprised to find that when we administered D2, D3 levels fell even when compared to placebo,” Lanham-New said.

“There is a lot we need to investigate now, as the long-term implications might hinder our ability to meet necessary vitamin D levels, potentially worsening the situation,” she added.


Source: www.sciencefocus.com

Study Suggests Common Nasal Antihistamine Sprays Could Help Prevent Community Infections

Nasal sprays available over-the-counter, historically noted for their safety and efficacy in treating seasonal allergies, could be perceived in a new light following clinical trial results released on Tuesday.

The antihistamine azelastine has been observed to have antiviral properties against various respiratory pathogens, including influenza, RSV, and the coronavirus responsible for COVID, according to a growing body of research.

Researchers from Saarland University Hospital in Germany conducted a study involving 450 adults, predominantly in their early 30s. One group of 227 participants used a nasal spray three times daily, while the other 223 received a placebo spray under the same regimen.

Throughout nearly two months, all participants underwent COVID rapid testing twice weekly. The results indicated that the incidence of symptomatic infections was 2.2% in the azelastine group, markedly lower than the 6.7% infection rate in the placebo group.
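Those headline rates correspond to roughly a two-thirds relative risk reduction, as a quick calculation with the reported figures shows:

```python
# Relative risk from the reported symptomatic COVID infection rates.
azelastine_rate = 0.022   # 2.2% of 227 participants (~5 infections)
placebo_rate = 0.067      # 6.7% of 223 participants (~15 infections)

relative_risk = azelastine_rate / placebo_rate
print(f"relative risk: {relative_risk:.2f}")       # ~0.33
print(f"risk reduction: {1 - relative_risk:.0%}")  # ~67%
```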

Furthermore, azelastine seemed to diminish the incidence of other symptomatic respiratory infections, as highlighted in a study published in JAMA Internal Medicine.

While researchers remain uncertain about the exact mechanism by which azelastine limits infections, they hypothesize that it may bind to the virus in the nasal mucosa.

Another possibility is that azelastine interacts with the ACE2 receptor, the primary entry point the COVID virus uses to infect human cells, thereby preventing attachment.

“Our findings imply that azelastine could serve as a scalable and commercially viable preventive measure against COVID, especially in high-risk scenarios such as crowded indoor events and travel,” the researchers noted.

However, the study had limitations, including that all participants were relatively young and healthy, according to the researchers.

The researchers emphasized that azelastine should not replace vaccination, and that further research is essential before it can be considered a standard precautionary measure for the public, particularly for vulnerable groups.

Dr. William Messer, associate professor at Oregon Health & Science University, found the results “rationally convincing” in terms of risk reduction, but pointed out the intensive regimen of daily sprays in the trial.

He questioned whether wearing a mask might be a simpler approach to preventing COVID infection.

“Masks can be inconvenient and bothersome, yet may be easier to adhere to than remembering to use three nasal sprays daily,” Messer remarked.

Nevertheless, he added, “I do not discourage anyone who wishes to try it.”

Other researchers are seeking more data to ascertain the effectiveness of nasal sprays in high-risk populations, such as the elderly and immunocompromised individuals who require additional preventive measures.

Dr. Peter Chin-Hong, a professor at UCSF Health specializing in Infectious Diseases, speculated that azelastine could serve as an additional COVID-blocking tool for individuals already using nasal sprays for seasonal allergies, although he believes the evidence is insufficient for broader recommendations.

“While the potential is promising, I believe now is not the appropriate time to recommend it as a COVID preventative,” he stated in an email. “For those over 65, I continue to advocate for vaccination as the primary defense against COVID.”

Nonetheless, Chin-Hong highlighted that the trial results underscore the importance of targeting the nasal mucosa in developing future vaccines against COVID and other respiratory viruses as a more effective means of infection prevention.

“Current COVID vaccines have not proven to be highly effective in preventing infection,” he remarked. “There is a need for more mucosal vaccines for respiratory viruses. While flu vaccines are widely used, ongoing efforts are being made to create mucosal vaccines for coronaviruses, necessitating continued advocacy for federal support and prioritization for these initiatives.”

Source: www.nbcnews.com

Common Artificial Sweeteners May Disrupt Cancer Treatment


Some artificial sweeteners can alter the gut microbiota composition, influencing overall health.

Ian Allenden/Alamy

Individuals who consume the artificial sweetener sucralose may have reduced responsiveness to cancer immunotherapy, indicating that sweeteners could diminish treatment efficacy.

Immunotherapy enhances the immune system’s ability to identify and eliminate cancer cells, proving vital for many cancers. “When successful, it is highly effective. Patients can feel better, enjoy their lives, and survive for years,” states Abigail Overacre-Delgoffe from the University of Pittsburgh, Pennsylvania. “Regrettably, not all patients respond well, and in many cancer types only a limited number of individuals benefit.”

The reasons behind this are unclear, but numerous studies indicate that the gut microbiota plays a critical role in regulating immune responses; prior research has also demonstrated that artificial sweeteners can modify human gut microorganisms.

Consequently, Overacre-Delgoffe and colleagues investigated the potential effects of artificial sweeteners on immunotherapy outcomes. They tracked the treatment results of 157 patients who underwent cancer immunotherapy for a minimum of three months. Among these, 91 had advanced melanoma, 41 had non-advanced non-small cell lung cancer, and 25 had melanoma that had been surgically excised but was at risk of recurrence.

Prior to treatment commencement, participants filled out a dietary questionnaire covering the previous month, enabling researchers to estimate their artificial sweetener intake.

Consuming more than 0.16 milligrams of sucralose per kilogram of body weight daily correlated with poorer treatment outcomes. Participants with advanced melanoma who ingested less sucralose than this survived longer, gaining approximately five extra months without cancer progression.

In the case of non-small cell lung cancer participants, the survival advantage was about 11 months. For those at higher risk of melanoma recurrence, reducing sucralose intake allowed them to remain cancer-free an additional six months compared to heavier consumers.

Similar outcomes were noted for participants who consumed more than 0.1 milligrams per kilogram daily of Acesulfame K, another artificial sweetener.

The US Food and Drug Administration (FDA) advises limiting sucralose intake to below 5 milligrams per kilogram daily. “Thus, the threshold which seems to reduce the effectiveness of immunotherapy is not half, or even 25%, but rather about 5% of the recommended daily amount,” states Diwakar Davar from the University of Pittsburgh. “This suggests that even a small amount could have a detrimental effect.”
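A quick calculation for a hypothetical 70 kg adult puts those thresholds in everyday terms (note that 0.16/5 works out to about 3%, in the same ballpark as the quoted figure):

```python
# Putting the thresholds in context for a hypothetical 70 kg adult.
body_mass_kg = 70
study_threshold_mg_per_kg = 0.16  # intake linked to poorer outcomes
fda_limit_mg_per_kg = 5.0         # FDA acceptable daily intake for sucralose

daily_threshold_mg = study_threshold_mg_per_kg * body_mass_kg
print(f"threshold for a 70 kg adult: {daily_threshold_mg:.1f} mg/day")  # 11.2 mg
print(f"fraction of FDA limit: "
      f"{study_threshold_mg_per_kg / fda_limit_mg_per_kg:.1%}")          # 3.2%
```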

Additional experiments with mice bearing various types of tumors demonstrated that adding sucralose to their water during immunotherapy expedited tumor growth and decreased survival rates.

Genetic analysis revealed that the immune cells activated by immunotherapy were less effective at combating cancer in mice given sucralose. Fecal analyses also indicated significant alterations in the rodents’ gut microbiota, notably increased activity in metabolic pathways that break down arginine, an amino acid crucial for T-cell function.

The findings imply that sucralose may hinder immunotherapy by reducing arginine levels and modifying gut microbiota in ways that impair T-cell efficacy. Furthermore, experiments demonstrated that arginine supplementation improved survival rates in mice consuming sucralose, bringing them in line with those not consuming artificial sweeteners.

However, it remains uncertain if sucralose exerts similar effects on human gut microbiota and T-cell function. Jotham Suez from Johns Hopkins University in Maryland notes, “It is incredibly challenging to isolate the specific impact of a single non-nutritive sweetener like sucralose on clinical outcomes from human data alone, particularly when that data comes from nutrition and food-frequency surveys.”

“We invest considerable resources in the development of new medications, which is costly, challenging, and time-consuming,” remarks Davar. Discovering ways to enhance existing treatments, such as avoiding artificial sweeteners or using arginine supplements, presents a more straightforward and economical approach.

Nonetheless, further investigation is essential to determine if it genuinely enhances patient outcomes. “Hence, it is crucial to maintain support for these research priorities in a challenging funding landscape,” concludes Davar.


Source: www.newscientist.com

Structured Lifestyle Changes Outperform Self-Guided Efforts in Combating Cognitive Decline

Regular exercise aids in maintaining cognitive sharpness

Yoshikazu Tsuno/AFP via Getty Images

Engaging in structured programs of exercise, dietary change, cognitive activity, and social interaction has proven more effective at combating cognitive decline than casual, self-initiated efforts.

The brain’s capabilities for memory, language use, and problem-solving typically diminish with age, often resulting in dementia. Nevertheless, studies indicate that up to 45% of global dementia cases are preventable by addressing 14 risk factors, which include inadequate education, social isolation, and brain injuries.

To explore strategies for preventing cognitive decline, Laura Baker from Wake Forest University School of Medicine in North Carolina and her team ran the US POINTER study.

They enrolled more than 2,100 people aged 60 to 79 who were deemed at high risk of cognitive decline: all led sedentary lifestyles, had suboptimal diets, and met at least two further dementia-related criteria, such as a family history of memory issues.

Participants were randomly placed into one of two groups. Both were designed to promote physical and cognitive activity, healthy eating habits, and social interaction, although their methods varied.

One group followed a highly structured format, with 38 small group sessions across two years, led by trained facilitators who devised plans. This regimen also incorporated regular exercise at a community center along with weekly online brain training exercises.

The other group was less structured, participating in only six group meetings over the same two-year period. They were provided with public education materials and $75 gift cards aimed at encouraging behavioral changes, like attending gym classes.

After two years, both groups demonstrated enhancements in cognitive assessments measuring memory, executive function, and processing speed. The structured group saw an improvement of 0.24 standard deviations per year compared to their initial scores, while the self-guided group improved by 0.21 standard deviations per year.

“It’s remarkable that the structured care group has shown improvement,” remarks Gill Livingston from University College London. However, she points out that the absence of a control group receiving no intervention makes it hard to judge how much of the improvement in either group was caused by the interventions themselves.

Baker estimates that cognitive scores would have declined markedly without either regimen, and argues the benefits are therefore substantial. “A two-year structured intervention can effectively delay cognitive aging by nearly one to two years,” she states.

Baker acknowledges that the improvement seen in both groups is consistent with a placebo effect: participants might have expected positive outcomes regardless of their group assignment.

Claudia Suemoto from the University of São Paulo in Brazil suggests the small differences in cognitive scores between the groups are likely imperceptible to participants and their families, given that dementia progresses gradually; clear effects may take more than two years to manifest.

Baker notes the team will continue monitoring participants for a total of six years, as the US POINTER study has a four-year extension. “We’re observing subtle changes because they are cognitively normal individuals, and we are effectively slowing the rate of decline over time. We’re genuinely excited about empowering individuals at risk of dementia to take control of their health,” she remarks.

She believes that a structured approach is practical beyond the study context, emphasizing the need for caregivers and health professionals to motivate individuals rather than assuming high public expenditure is necessary to instill healthy habits.

“Overall, dementia care can be highly costly, and mitigating the burden can save expenses,” Livingston adds. “This study is crucial because lifestyle enhancements have shown benefits, and while guided support aids improvement, it’s not the only approach.”


Source: www.newscientist.com

Study: Common Sweetener Erythritol May Impact Brain Cells and Elevate Stroke Risk

A recent study from the University of Colorado Boulder indicates that erythritol, a widely used non-nutritive sweetener, may be linked to a higher risk of cardiovascular and cerebrovascular events.



Berry et al. found that erythritol, at concentrations commonly found in standard-size sugar-free beverages, adversely alters oxidative stress, eNOS activation, NO production, ET-1 expression, and tPA release in cerebral microvascular endothelial cells in vitro. Image credit: Tafilah Yusof.

Erythritol is a popular non-nutritive alternative to sugar due to its minimal effects on blood glucose and insulin levels.

This four-carbon sugar alcohol is 60-80% as sweet as sucrose while containing almost no calories, and it commonly replaces sugar in baked goods, confections, and beverages.

Authorized by the FDA in 2001, erythritol is recommended for individuals with obesity, metabolic syndrome, and diabetes, as it helps limit calorie and sugar intake and minimize hyperglycemia.

Found naturally in small amounts in certain fruits, vegetables, and fermented foods, erythritol is quickly absorbed in the small intestine through passive diffusion.

In humans, erythritol is produced endogenously from glucose and fructose by erythrocytes, liver, and kidneys via the pentose phosphate pathway, making its levels dependent on both endogenous production and external intake.

“Our findings contribute to the growing evidence that non-nutritive sweeteners, often considered safe, could pose health risks,” stated Professor Christopher Desouza from the University of Colorado.

A recent study involving 4,000 participants from the US and Europe revealed that individuals with elevated erythritol levels are at a significantly increased risk of experiencing a heart attack or stroke within three years.

Professor Desouza and his team sought to determine what factors were contributing to this heightened risk.

They exposed human cells lining blood vessels in the brain to erythritol for three hours, using concentrations similar to those found in standard sugar-free beverages.

The treated cells exhibited several alterations.

Notably, they produced significantly less nitric oxide, a molecule critical for dilating blood vessels, while increasing the expression of endothelin-1, which constricts blood vessels.

Furthermore, when challenged with thrombin, a clot-promoting compound, the treated cells were significantly slower to produce tPA, a naturally occurring compound that breaks down blood clots.

Cells treated with erythritol also generated more reactive oxygen species, or free radicals, which can lead to cellular damage and inflammation.

“We’ve been diligently working to share our findings with the broader community,” noted Auburn Berry, a graduate student at the University of Colorado in Boulder.

“Our research indicates that erythritol may indeed heighten the risk of stroke.”

“Our study exposed cells to the amount of erythritol found in just a single serving,” emphasized Professor Desouza.

“For individuals consuming multiple servings daily, the potential impact could be even more pronounced.”

The researchers caution that their findings are based on lab research conducted on cells, necessitating larger-scale studies involving human subjects.

Nonetheless, they advise consumers to check product labels for erythritol or “sugar alcohol.”

“Considering the epidemiological evidence informing our research, along with our cellular discoveries, monitoring the intake of such non-nutritive sweeteners seems wise,” Professor Desouza remarked.

The study was published today in the Journal of Applied Physiology.

____

Auburn R. Berry et al. 2025. The non-nutritive sweetener erythritol negatively affects brain microvascular endothelial cell function. Journal of Applied Physiology 138(6):1571-1577; doi:10.1152/japplphysiol.00276.2025

Source: www.sci.news

Low Iron Levels Are Common, But They Can Be Improved: Here’s How to Naturally Boost Yours

Recent reviews published in The Lancet Haematology by Dr. Ashley Benson and Dr. Jamie Law at Oregon Health and Science University reveal that iron deficiency affects nearly one in three women, making it the most prevalent nutritional deficiency globally.

Iron is crucial for energy production, brain development, and maintaining a robust immune system.

According to the World Health Organization, anemia affects 31% of women of reproductive age, 36% of pregnant women, and 40% of children under 5.

Inflammation, whether from acute illness or chronic conditions such as obesity, can interfere with iron absorption. With rising global obesity and chronic disease rates, this creates additional challenges in tackling iron deficiency worldwide.

Iron Deficiency

Iron deficiency can lead to anemia, as iron is vital for red blood cell production. Anemia is characterized by low hemoglobin levels, the protein that gives blood its red color and transports oxygen.

Approximately half of all global anemia cases result from iron deficiency. Common symptoms include pale skin, fatigue, shortness of breath, and an irregular heartbeat (known as palpitations).

Iron deficiency poses serious health risks, especially when it causes anemia, including a weakened immune system, complications during pregnancy and childbirth, maternal and infant mortality, and delayed growth and brain development in children.

Diet can influence iron absorption. – Photo credit: Getty

The repercussions of iron deficiency are particularly severe for women and children, who are the most susceptible.

Menstruating women have a heightened need for iron due to monthly blood loss. Pregnant women require extra iron for the placenta, fetus, and increased blood volume. Children need iron for rapid growth and brain development, making adolescent girls—who are both growing and menstruating—especially vulnerable.

In their study, Benson and Law convened a panel of 26 experts alongside four patient representatives. Their collective recommendations advocate for a more positive and inclusive strategy for managing iron deficiency, particularly for at-risk populations.

The panel stressed the importance of regular screening during pregnancy and early childhood. They emphasized utilizing ferritin, a blood protein that reflects the body’s iron stores, as a reliable marker for diagnosing iron deficiency and determining intervention timing.

If treatment is necessary, oral iron supplements are the first recommendation. They are effective, widely accessible, and cost-effective. For those experiencing side effects like nausea and constipation, the panel suggested taking supplements on alternate days to enhance tolerability. In more severe instances, or if oral iron proves ineffective, intravenous iron may be needed.

Lastly, the panel asserted that iron deficiency should not be viewed as an isolated issue, but rather as part of routine care for mothers and children, including antenatal check-ups, child health visits, and nutrition programs.

Iron Advice

While some individuals may need treatments for iron deficiency, many cases can be prevented through daily dietary choices.

Begin by adding more iron-rich foods to your meals, such as pulses, legumes, green leafy vegetables, nuts, and iron-fortified cereals (opt for lower sugar options for kids and adolescents).

For those consuming animal products, limit intake to moderate amounts of lean meat—about 70g (2.5oz) per day, as recommended by the UK Eatwell Guide—which can provide easily absorbable iron.

If you primarily follow a plant-based diet, consider pairing iron-rich foods with vitamin C sources like lemon juice, tomatoes, and strawberries to enhance iron absorption.

Avoid drinking tea or coffee during meals as polyphenols can hinder iron absorption; this applies to taking iron supplements as well. Consuming them with a vitamin C source, such as orange juice, can significantly improve absorption.

If you belong to a higher-risk group—such as menstruating individuals or caregivers of young children—or if you experience excessive fatigue, consult your doctor. A simple blood test can evaluate your iron levels. In children, iron deficiency may also manifest as unusual cravings, such as for ice or non-food items.

Iron deficiency is prevalent but manageable and often preventable. With awareness and mindful choices, maintaining healthy iron levels can be as straightforward as selecting what goes on your plate.



Source: www.sciencefocus.com

Common “Natural Beauty” Ingredients That Harm the Planet

The beauty industry rarely resists a trend. From anti-aging campaigns to at-home LED masks, consumers have encountered a steady stream of innovations. One particularly enduring trend over the last decade, however, is the shift towards “natural” or “organic” beauty products.

At first glance, this sounds appealing: pure plant ingredients, minimal processing, and no synthetic pesticides. What could be wrong with that? The reality is more complex.

Choosing “natural” beauty products may feel like a wise choice when considering our planet.

Yet, as the beauty industry comes under scrutiny for its environmental impact, we must move beyond greenwashing and evaluate whether relying on naturally grown resources is truly sustainable within a billion-dollar industry.

Growth Market

The global natural and organic beauty sector is currently seeing robust growth driven by heightened consumer interest, with projections estimating gross revenues of approximately £11.3 billion ($14.9 billion) by 2025.

In the UK alone, the natural cosmetics market is expected to reach around £210 million ($278 million) in 2025, with annual growth rates of about 2.74% over the next five years.

From ingredient-light serums to zero-waste shampoo bars, the diversity and volume of products available have never been greater. While this thriving market is exciting, it also presents challenges.

More products lead to increased material extraction, mining, and synthesis, as well as greater packaging and emissions throughout the supply chain.

This intricate situation can easily confuse well-meaning consumers, who may get caught up in labels like “natural” or “organic” without fully understanding their implications.

Steam distillation is a traditional method of extracting oil from flowers used to make rose water – Photo credit: Getty Images

There’s a common belief that if something is labeled “natural,” it must be beneficial for the environment. However, whether it’s Moroccan argan oil or Mexican aloe vera, obtaining natural ingredients often comes at a high price.

Crops require extensive land, water, and energy for cultivation.

Many high-demand crops are susceptible to climate change and, regrettably, are often linked to unethical labor practices. While we aspire for organic farming to represent a more sustainable approach, it can also lead to unintended negative outcomes.

For instance, many organic agricultural practices yield lower crop outputs while occupying more land. This can result in deforestation as farmers seek additional land to maximize production of slow-growing crops.

Naturally derived pesticides used in organic agriculture can also harm the soil.

Copper sulfate, commonly used in the wine industry’s “Bordeaux mixture,” has long been approved for use in organic farming but has recently faced regulation due to its negative effects on soil microbiomes and potential threats to local insect populations.


Lab-grown Materials

This is where biotechnology enters the conversation. While it may not have the allure of “Wild Harvest Lavender,” biotechnology could ultimately prove to be one of the planet’s most eco-friendly resources.

In simple terms, biotechnology utilizes scientific methods (often involving fermentation with yeast, plant sugars, or bacteria) to cultivate ingredients in laboratories, as opposed to sourcing them from nature. Think of it like brewing beer, but instead of a refreshing pint, you yield powerful active ingredients for moisturizers and shampoos.

These lab-generated components are molecularly identical to their natural counterparts and can be produced without disturbing ecosystems, using significantly less water, land, and energy.

This highly controlled process can also be scaled efficiently while maintaining consistent quality.

For example, swapping “wild harvested lavender” for biotechnologically produced lavender essential oils can lead to substantial reductions in energy and water usage.

Producing 1g (0.04oz) of natural lavender oil requires about 20L (approximately 5 gallons) of water and about 4 megajoules of energy—roughly equivalent to watching TV for 20 hours.

In contrast, if biotechnologically produced, the same 1g can potentially require just 2-5L (0.5-1.3 gallons) of water and 1 megajoule of energy (the equivalent needed to boil a kettle).
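Taken at face value, the figures quoted above imply savings of roughly three quarters or more on both water and energy per gram; a quick comparison using the midpoint of the 2-5 L range:

```python
# Resource comparison per gram of lavender oil, using the figures quoted above.
natural = {"water_l": 20, "energy_mj": 4}
biotech = {"water_l": 3.5, "energy_mj": 1}  # 3.5 L = midpoint of the 2-5 L range

for key in natural:
    saving = 1 - biotech[key] / natural[key]
    print(f"{key}: {natural[key]} -> {biotech[key]}  ({saving:.0%} saving)")
# water_l: 20 -> 3.5  (82% saving)
# energy_mj: 4 -> 1  (75% saving)
```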

Biotechnology has advanced significantly in recent years, although companies have yet to replicate every component of these unique essential oils.

Laboratory-grown cosmetic ingredients are molecularly identical to natural ingredients and could become a more sustainable alternative – Photo credit: Alamy

One ingredient successfully replicated is bisabolol, known for its soothing properties in the cosmetics field. It’s utilized in a diverse range of products, from hormone-related creams to sun care and baby products.

To obtain natural bisabolol, it must be extracted from candeia trees native to Brazil. This harvesting can lead to deforestation, biodiversity loss, and ecosystem strain, and the quality of the natural harvest varies with weather conditions.

To obtain 1kg (2.2 pounds) of natural bisabolol, cutting down around 1-3 trees is necessary, with each tree taking 10-15 years to mature.

To create one ton (2,204 pounds) of bisabolol, approximately 3,000 to 5,000 trees are needed—a staggering statistic given the global demand is around 16 tons (35,000 pounds) annually.

Each tree consumes about 36,000 liters (9,500 gallons) of water over its lifetime (equivalent to 72,000 500ml bottles) and 75 megajoules of energy (approximately analogous to charging a smartphone 2,500 times).
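Multiplying the article's own per-ton and per-tree figures by the quoted 16-ton annual demand gives a rough sense of the yearly footprint:

```python
# Rough annual footprint of natural bisabolol, from the article's figures.
tons_per_year = 16
trees_per_ton = (3_000, 5_000)
water_l_per_tree = 36_000
energy_mj_per_tree = 75

for trees in trees_per_ton:
    total_trees = trees * tons_per_year
    print(f"{trees} trees/ton -> {total_trees:,} trees/year, "
          f"{total_trees * water_l_per_tree / 1e9:.1f} billion litres of water, "
          f"{total_trees * energy_mj_per_tree / 1e6:.1f} TJ of energy")
# 3000 trees/ton -> 48,000 trees/year, 1.7 billion litres of water, 3.6 TJ of energy
# 5000 trees/ton -> 80,000 trees/year, 2.9 billion litres of water, 6.0 TJ of energy
```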

Givaudan, a Swiss ingredient manufacturer, has already developed bisabolol through biotechnological means, achieving a far higher specification than natural harvesting can deliver.

Comparatively, biotechnological yields of bisabolol can utilize 90-95% less water and 50-60% less energy than natural Candeia tree yields, not to mention the hectares saved from potential deforestation.

Brands like Boots and Estée Lauder are investing in biotechnology.

Even smaller indie brands are beginning to highlight fermented or lab-grown ingredients. Eco brand Biossance uses squalane, a moisturizing ingredient closely related to squalene, but instead of sourcing it from shark liver oil, the company derives it from sugarcane, claiming to save an estimated 20-30 million sharks each year.

Moreover, biotechnology ingredients tend to be purer, more stable, and often more effective than their natural counterparts, meaning your product will last longer, perform better, and evoke less guilt regarding the environment.

What Should I Look For?

For consumers, all this information can feel daunting, especially with packaging filled with misleading marketing buzzwords. However, here are a few straightforward tips for choosing cosmetic products that align with your values.

  • Seek out biotechnology or lab-grown ingredients, often labeled as “fermented origin,” “biodesign,” or “bioidentical” on ingredient lists.
  • Be cautious of common marketing greenwash terms like “eco-friendly,” “clean beauty,” “sustainable,” and “biodegradable.” Look for tangible values, timelines, or explanations backing these claims.
  • Avoid brands that shift the focus away from sustainability to other concerns, such as being “against animal testing,” a practice that has in any case been banned for cosmetics in the UK since 1998 and across the EU since 2013.

While the notion that beauty should be “natural” is comforting, this approach isn’t necessarily the most sustainable choice, especially as the UK lacks a legal definition of what “natural” cosmetics entail.

If you genuinely want to protect the planet for future generations, it’s essential to move past the notion of nature as an infinite resource and start supporting smarter scientific innovations that collaborate with nature rather than oppose it.


Source: www.sciencefocus.com

Common Gut Bacteria Can Transform Everyday Plastic Waste into Paracetamol

Paracetamol, also known as acetaminophen, is a pain reliever traditionally produced from dwindling fossil-fuel feedstocks such as crude oil. Every year, thousands of tonnes of fossil fuels are consumed to manufacture the painkiller, along with numerous other drugs and chemicals. Professor Stephen Wallace from the University of Edinburgh and his team discovered that E. coli bacteria can transform molecules derived from waste plastic bottles into paracetamol.

Johnson et al. report a biocompatible, phosphate-catalyzed Lossen rearrangement in E. coli, in which an activated acylhydroxamate is transformed into a primary amine-containing metabolite inside living cells. Image credit: Johnson et al., doi: 10.1038/s41557-025-01845-5.

The issue of plastic waste is increasingly pressing, making the quest for sustainable plastic upcycling solutions a priority.

Metabolic engineering combines organic chemistry with the exploitation of biological cell chemical reaction networks to create new small molecules.

However, it remains uncertain whether these reactions can be effectively combined to convert plastics into useful products.

“Our research indicates that polyethylene terephthalate (PET) plastic is not merely waste, but can be converted by microorganisms into valuable new products with potential applications in disease treatment,” stated Professor Wallace.

In their study, Professor Wallace and co-authors found that a specific type of chemical reaction, known as the Lossen rearrangement, occurs within living cells and is catalyzed by phosphate naturally present in E. coli.

This reaction produces nitrogen-containing organic compounds that are vital for cellular metabolism.

The researchers demonstrated that chemical processes can break PET plastic down into starting molecules for this reaction, and that cellular metabolism can then convert these plastic-derived molecules into useful products.

Additionally, they discovered that this plastic-derived compound can serve as a precursor for paracetamol production in E. coli, achieving a yield of 92%.

This finding may mark the first instance of paracetamol being synthesized from plastic waste using E. coli.

Future research will focus on exploring how other bacteria and types of plastics can yield beneficial products.

“Biocompatible chemistry should therefore be viewed as a complement to enzyme engineering and to non-biological chemistry, a tool that integrates with the chemistry of living cells to expand the synthetic chemistry possible within biological systems,” the scientists noted.

The team’s study was published in the journal Nature Chemistry on June 23, 2025.

____

N.W. Johnson et al. A biocompatible Lossen rearrangement in Escherichia coli. Nat. Chem. Published online June 23, 2025; doi: 10.1038/s41557-025-01845-5

Source: www.sci.news

Studies Suggest Common Vitamin Supplements May Help Slow Aging

Recent studies indicate that daily vitamin D intake can assist in managing the effects of aging.

Research has shown that supplementing with vitamin D for four years could potentially offset the aging process by about three years.

Prior studies have suggested that vitamin D supplements may help mitigate some prominent aging signs linked to various age-related diseases, such as cancer, heart disease, and dementia.

To explore this hypothesis, researchers from Mass General Brigham and the Medical College of Georgia examined findings from a previous large trial in which women over the age of 55 and men over the age of 50 took either vitamin D, omega-3, or a placebo daily for five years.

The recent study assessed telomere length, concentrating on 1,054 participants who underwent specific tests at the beginning of the trial, as well as in their second and fourth years.

Telomeres are repetitive DNA sequences that protect chromosomes, Professor Morten Scheibye-Knudsen from the University of Copenhagen, who was not involved in the study, explained in BBC Science Focus Magazine.

Telomeres safeguard chromosome ends and prevent fusion or degradation – Credit: Getty Images/Knopprit

“Consider them like the plastic tips on shoelaces. They prevent chromosomes from fraying and sticking to each other, which helps maintain genetic stability during cell division,” he explained.

With each cell division, telomeres shorten slightly. If they become too short, the cell loses its ability to divide, leading to cell dysfunction.

The study found that participants taking vitamin D exhibited significantly reduced telomere shortening, effectively preventing nearly three years of aging.

This finding could offer valuable insights into promoting longer health spans, as telomere shortening is linked to various age-related diseases.

“I often refer to these cells as angry old men. They lose functionality, become inactive, and worsen over time, negatively impacting their environment,” Scheibye-Knudsen remarked.

“Telomere shortening may lead to older, more dysfunctional cells, resulting in increased inflammation in our bodies, particularly in rapidly dividing cells, like those in bone marrow, skin, and hair.”

About our experts

Morten Scheibye-Knudsen is an associate professor in the Department of Cellular and Molecular Medicine at the University of Copenhagen.


Source: www.sciencefocus.com

Giant Ground Sloths Evolved Three Distinct Times for a Common Reason

The ancient sloths exhibited a variety of sizes

Diego Barletta

Cool, arid climates repeatedly shaped sloths into giants, before humans potentially drove these large animals to extinction.

Today’s sloths are small, well-known herbivores that navigate through the lush canopy of tropical rainforests. However, for tens of millions of years, South America was home to an astonishing variety of sloths, many of which were massive ground dwellers, with some giants weighing close to five tonnes.

This remarkable range of sizes is of particular interest to Alberto Boscaini from the University of Buenos Aires, Argentina, and his colleagues.

“Body size is correlated with all biological characteristics of an animal,” states Boscaini. “This provides a promising avenue for studying sloth evolution.”

Boscaini and his team have synthesized data on physical attributes, DNA, and proteins from 67 extinct and extant sloth genera (groups of closely related species) to construct a family tree that illustrates their evolutionary relationships.

They then analyzed this evolutionary timeline, spanning 35 million years, incorporating insights on habitat, diet, and lifestyle for each sloth. They also examined evolutionary patterns in body size and made weight estimates for 49 ancient and modern sloth groups.

The findings indicate that the evolution of sloth body sizes was significantly influenced by climate change and shifts in habitat. For instance, certain sloth genera began adapting to arboreal living, much like today’s sloths, resulting in a reduction in body size.

Simultaneously, three separate lineages of sloths independently evolved elephant-like proportions. This adaptation appears to have occurred in the last few million years, as global cooling and the uplift of the Andes transformed South America into a drier environment.

“The giants are more closely associated with colder and drier climates,” remarks team member Daniel Casari from the University of São Paulo, Brazil.

A significant number of these various sloths went extinct in two catastrophic phases: one around 12,000 years ago and another approximately 6,000 years ago, according to Boscaini.

“This aligns with the expansion of Homo sapiens across the Americas and subsequently into the Caribbean,” he explains, noting that many giant sloths lived in these regions. The surviving sloth species live primarily in trees, which made them less accessible to humans than their larger relatives.

The hypothesis that humans played a significant role in the extinction of ancient megafauna is strongly supported, states Thaís Rabito Pansani from the University of New Mexico, who was not part of the research.

“However, solid evidence is necessary to substantiate this theory, especially concerning unresolved and highly debatable issues such as megafauna extinction,” she emphasizes. Recent evidence adds context to this narrative.

“Sloths flourished for much of their history,” says Casari. “[The findings] indicate how a once-successful group can quickly become vulnerable.”


Source: www.newscientist.com

Chimpanzee Medical Care and Hygiene Are More Common Than You Might Think

Primatologists have recorded and examined both previously noted and newly observed instances of self-administered and socially directed wound care, snare removal, and potentially medicinal hygiene behaviors within the Sonso and Waibira chimpanzee communities of the Budongo Forest in Uganda. They documented self-directed wound care actions, such as licking wounds, dabbing leaves on wounds, pressing fingers against wounds, applying plant material to injuries, and successfully removing snares. The researchers also noted self-directed hygiene behaviors, including cleaning the genitals with leaves after mating and wiping with leaves after defecation.

Social grooming between two chimpanzees in the Budongo Forest, Uganda. Image credit: Elodie Freymann.

“Our research sheds light on the evolutionary origins of human medicine and healthcare systems,” stated first author Dr. Elodie Freymann, a researcher at the University of Oxford.

“By observing how chimpanzees identify and utilize medicinal plants to care for others, we can gain valuable insights into the cognitive and social foundations of human medical practices.”

Dr. Freymann and her team focused their study on the Sonso and Waibira chimpanzee communities in the Budongo Forest.

Like all chimpanzees, individuals in these communities face injuries from various causes, including fights, accidents, and human-laid snares.

Approximately 40% of all Sonso individuals are observed with snare injuries.

The researchers dedicated four months of observation to each community, drawing on video evidence from the Great Ape Dictionary database, a logbook filled with decades of observational data, and reports from other scientists who have witnessed chimpanzees treating injuries and illnesses.

Chimpanzees have been noted to apply specific plants externally as treatment; some of these plants possess chemical properties that improve wound healing and have traditional medicinal applications.

During their field observations, scientists noted 12 injuries at Sonso, all likely resulting from group conflicts.

At Waibira, five chimpanzees were documented as injured: one female from a snare and four males from fighting.

https://www.youtube.com/watch?v=Amnbsz6uvfq

The researchers also found that care was documented more often in Sonso than in Waibira.

“This may be influenced by factors like variations in social hierarchy stability and greater observation opportunities in the well-habituated Sonso community,” noted Dr. Freymann.

The scientists recorded a total of 41 care instances: seven instances of prosocial care and 34 instances of self-care.

These instances frequently involved various care behaviors, whether addressing different aspects of a wound or indicating the chimpanzee’s personal preferences.

“Chimpanzee wound care involves several techniques, which can remove debris and apply potentially antibacterial substances, possibly including antimicrobial compounds from their saliva.”

“All chimpanzees documented in our study exhibited recovery from their wounds, yet we are unable to determine the outcome had they chosen not to address their injuries.”

“We also recorded hygienic behaviors such as using leaves to clean the genitals post-mating and wiping the anus with leaves after defecation—practices that serve to prevent infections.”

Among the seven instances of prosocial care, the researchers noted four instances of wound treatment, two instances of assistance in snare removal, and one instance involving hygiene help for another chimpanzee.

Care was administered without preference towards a specific gender or age group. Attention was given to genetically unrelated individuals in four cases.

“These behaviors add to the evidence from other sites that chimpanzees appear to recognize the needs and suffering of others and take deliberate actions to alleviate them, even when there is no direct genetic advantage,” Dr. Freymann stated.

The research team intends to delve deeper into the social and ecological contexts in which care is provided and which individuals are recipients of such care.

“There are some methodological limitations in our study,” Dr. Freymann added.

“The disparity in familiarity between the Sonso and Waibira communities introduces observational bias, particularly regarding rare behaviors like prosocial healthcare.”

“We have documented the plants used in healthcare contexts, but further pharmacological exploration is necessary to confirm their specific medicinal characteristics and efficacy.”

“The relative rarity of prosocial healthcare also complicates the process of identifying patterns related to when and why such care is provided, or when it is withheld.”

“These challenges underscore future research avenues in this burgeoning field.”

Study published in the journal Frontiers in Ecology and Evolution.

____

Elodie Freymann et al. 2025. Self-directed and prosocial wound care, snare removal, and hygienic behaviors among Budongo chimpanzees. Front. Ecol. Evol. 13; doi: 10.3389/fevo.2025.154092

Source: www.sci.news

Shingles Vaccines Linked to Reduced Risk of Various Common Heart Issues


The shingles vaccine appears to offer additional benefits

Cavan images / Alamy

Vaccination against shingles, also known as herpes zoster, not only prevents this painful infection but also lowers the chance of cardiovascular issues.

A recent observational study involving over 1 million participants has revealed that individuals who receive the shingles vaccine Zostavax have a 26% reduced risk of developing heart disease, heart attacks, or heart failure compared to those who are unvaccinated.

“Shingles is known to cause inflammation in blood vessels,” notes a researcher. “Thus, by preventing the infection, vaccines could potentially reduce the risk of cardiovascular diseases.”

Shingles occurs when the varicella-zoster virus, which causes chickenpox, reactivates after lying dormant in the body. Reactivation can be triggered by factors like stress or a weakened immune system, for instance from prolonged chemotherapy, and leads to painful rashes.

While cardiovascular complications are not commonly highlighted, research has shown a link between shingles and increased risks for conditions such as stroke and heart attack, especially within the first year post-infection, with stroke risk rising by approximately 30% and heart attack risk by 10%.

To investigate whether vaccination mitigates these risks, Lee and colleagues analyzed data on 1,271,922 individuals over 50, gathered in South Korea's national health records from 2012 to 2024. They assessed who received the live vaccine and compared that with the later onset of 18 cardiovascular conditions, including heart failure, stroke, and arrhythmias, while accounting for health-related factors like age, gender, and lifestyle.

Throughout a six-year average follow-up period, the study found that the risk of cardiovascular events post-vaccination was 23% lower than in unvaccinated individuals.

The reduction was more pronounced in men, with a 27% lower risk compared to a 20% decrease in women. Among those under 60, there was a 27% reduction in risk, while in older populations, it was 16%. Rural residents showed a 25% risk reduction versus 20% in urban settings, and low-income groups had a 26% decrease, while higher earners experienced a 20% reduction. The data also indicated that risk reduction decreased as BMI increased.

For specific cardiovascular incidents, vaccinated people were found to be 26% less likely to experience a stroke, heart attack, or heart failure, and 26% less likely to die from heart disease. Additionally, the risk of coronary artery disease was reduced by 22%.

The benefits were most significant in the two to three years following vaccination, gradually tapering off over the subsequent five years.

The findings support the notion that shingles vaccination “enhances our confidence” in its capability to lower cardiovascular risk by decreasing vascular inflammation potentially triggered by the shingles virus, states Galen Faulke from Pennsylvania State University.

“Zoster itself has a notably high incidence of pain and postherpetic neuralgia, which can be extremely distressing,” he adds. “However, healthcare systems globally can significantly reduce cardiovascular ailments by advocating the use of cost-effective shingles vaccines.”

While further research is necessary, scientists theorize that the vaccine may indirectly contribute to lowering cardiovascular risks associated with shingles.

The study focused on Zostavax, a live attenuated vaccine, but attention is now shifting to Shingrix, a recombinant vaccine made from viral proteins.

“That’s why it is more effective at preventing shingles. I believe recombinant vaccines could offer even stronger cardiovascular protection,” Lee explains.

Although the observational design cannot establish causality the way randomized trials can, it lets researchers identify risk correlations across a very large population. Such extensive data can reveal patterns that clinical trials might overlook, Lee explains.


Source: www.newscientist.com

Common Painkillers During Pregnancy Linked to Increased Risk of ADHD in Children

Microscopic view of paracetamol crystals

Henri Koskinen/Shutterstock

Children whose mothers took paracetamol, also known as acetaminophen, during pregnancy are more likely to develop ADHD than those whose mothers did not, a small study suggests. Although inconclusive, the finding lends weight to the contested idea that the widely used painkiller can affect fetal brain development.

Previous studies on paracetamol and neurodevelopmental conditions have produced conflicting findings. For example, a 2019 study of more than 4,700 children linked their mothers' use of the painkiller during pregnancy to a 20% higher risk of the children developing ADHD. However, an analysis of nearly 2.5 million children published last year found no such connection when comparing siblings who were or were not exposed to paracetamol before birth.

One problem is that most of these studies rely on self-reported medication use. This is a serious limitation, as people may not remember, or even realise, that they took paracetamol during pregnancy. For example, only 7% of participants in the 2019 study reported using paracetamol while pregnant, well below the roughly 50% seen in other studies. “A lot of people take it [paracetamol] without knowing it,” says Brennan Baker at the University of Washington in Seattle. “It could be the active ingredient in some of the cold medicines you're using, and you don't necessarily know.”

So Baker and his colleagues used a more objective measure instead. They looked for markers of the medication in blood samples collected during late pregnancy from 307 women, all of whom were Black and lived in Tennessee. None of them were taking medication for chronic illnesses or had known pregnancy complications. The researchers then followed up when each child was 8 to 10 years old. In the US, approximately 8% of children aged 5 to 11 have ADHD.

On average, children whose mothers had a marker of paracetamol in their blood were three times more likely to be diagnosed with ADHD than children whose mothers did not, even after adjusting for factors such as the mother's age, pre-pregnancy body mass index (BMI), socioeconomic status, and mental health conditions among close relatives. This suggests that using paracetamol during pregnancy may increase the risk of ADHD in children.

However, it is also possible that the real factor raising ADHD risk is not the drug itself, but whatever prompted the mother to take paracetamol in the first place. “They couldn't account for why the mothers took it [paracetamol]. Things like headaches, fever, pain and infections are themselves risk factors for adverse development in children,” says Viktor Ahlqvist at the Karolinska Institute in Sweden.

But Baker believes the drug itself is responsible. A follow-up analysis of placental tissue samples from 174 of the participants showed that those who used paracetamol had distinct metabolic and immune changes. These changes are similar to those seen in studies testing the effects of paracetamol in pregnant animals with no infection or underlying health conditions.

“I think the fact that we see immune upregulation in animal models also really strengthens the causal case,” Baker says. “There are many previous studies showing that elevated immune activation during pregnancy is linked to adverse neurodevelopment.”

Yet these findings are far from conclusive. For one thing, the study included a small number of participants, all of them Black and living in the same city, which limits how far the findings can be generalised. For another, it measured blood markers of paracetamol at only one point in time. These markers persist for about three days, so the study probably captured the more frequent users, and any effect may be dose-dependent, says Baker.

“[Paracetamol] is currently the first-line treatment option for pain and fever during pregnancy,” says Baker. “But I think agencies [such as the US Food and Drug Administration] and the various obstetric and gynaecological associations should continuously review all available research and update their guidance.”

In the meantime, anyone unsure about taking paracetamol during pregnancy should talk to their doctor, Baker says.


Source: www.newscientist.com

Saline nasal drops and sprays showing promise in treating the common cold

Saline nasal sprays may stop children's sneezing faster

ONFOKUS.COM, Sebastian Court/Getty Images

Saline nasal sprays appear to help speed up cold recovery: In a new study, children who were given the homemade nasal spray recovered from cold symptoms like sneezing and stuffy nose two days faster than those who weren't.

More than 200 different viruses can cause cold-like symptoms, so it is difficult to develop general, effective treatments that target them all. As a result, most cold therapies only relieve symptoms and do not shorten their duration.

But research increasingly suggests that saline may be the exception. Studies have shown that adults who use saline nasal drops or sprays to relieve cold symptoms recover faster and are less likely to spread the infection.

Now, Steve Cunningham and his colleagues at the University of Edinburgh in the UK have tested this method in children. They asked the parents of 150 children with cold symptoms to place three drops of saline solution into their children's nasal passages at least four times a day, starting within 48 hours of the onset of symptoms and continuing until the symptoms subsided. The water-based solution, which the parents mixed themselves, contained 2.6 per cent salt.
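
For scale, 2.6 per cent salt is nearly three times the strength of standard 0.9 per cent physiological saline. Assuming the figure is weight-per-volume (an assumption; the article does not spell out the recipe), the mix works out as follows:

```python
# Grams of salt for a 2.6% saline solution, assuming weight-per-volume
# percent; the study's exact recipe is not specified in the article.

def salt_grams(volume_ml: float, percent_wv: float = 2.6) -> float:
    """Return grams of salt for a given volume of water."""
    return volume_ml * percent_wv / 100

print(salt_grams(1000))  # 26.0 g per litre
print(salt_grams(250))   # 6.5 g per 250 ml
```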

Another group of 151 children received standard cold care from their parents, such as prescribing over-the-counter medicines and encouraging rest. All of the children were under the age of seven, and their symptoms were recorded by their parents.

The researchers found that children who started using the drops within 24 hours of the onset of symptoms recovered two days faster than those who never used the drops, and their families were also less likely to develop cold symptoms. Children who started using the drops later did not recover faster, but they were still less likely to spread the cold than those who never used the drops.

Cunningham, who will present the findings at the European Respiratory Society meeting in Vienna, Austria, on September 8, says the chloride ions in saline could prompt cells to produce an antiviral substance called hypochlorous acid, though this may need to start early in the infection, before the virus takes hold.

But William Schaffner at Vanderbilt University Medical Center in Tennessee is skeptical that this method actually helps clear the viral infection. “I would want [evidence] to believe that this is an antiviral effect, not just symptom relief,” he says.

Schaffner says the researchers could also have given another group of children plain water drops or a low-concentration saline solution, which would show whether the salty spray targets the virus and speeds recovery or simply keeps mucous membranes moist to ease symptoms.


Source: www.newscientist.com

New study uncovers common, mysterious i-motif structure in human genome DNA

The so-called i-motif is a knot-like DNA structure that forms in the nuclei of human cells and is thought to play an important role in genome regulation. Researchers at the Garvan Institute of Medical Research have now used immunoprecipitation and next-generation sequencing to map i-motif structures across human DNA.

Peña Martinez et al. observed around 53,000 i-motifs in total across three human cell lines (MCF7, U2OS, and HEK293T). Image credit: Peña Martinez et al., doi: 10.1038/s44318-024-00210-5.

The i-motif is a DNA structure that differs from the iconic double helix shape.

These form when runs of cytosine letters on the same DNA strand pair up with each other to form a four-stranded twisted structure that juts out from the double helix.

In 2018, scientists at the Garvan Institute of Medical Research were the first to successfully directly visualize i-motifs inside living human cells, using new antibody tools they developed to recognise and bind to the i-motifs.

The new study expands on these findings by using the antibody to identify the location of i-motifs throughout the genome.

“In this study, we have mapped more than 50,000 i-motif sites in the human genome that are found in all three cell types we looked at,” said Professor Daniel Christ from the Garvan Institute of Medical Research, senior author of the study.

“This is a surprisingly high number for a DNA structure whose presence in cells was once a matter of debate.”

“Our findings confirm that the i-motif is not just an object of laboratory study, but is widespread and likely plays an important role in genome function.”

The researchers found that i-motifs are not scattered randomly, but are concentrated in important functional regions of the genome, including those that control gene activity.

“We found that the i-motif is associated with genes that are highly active at specific times in the cell cycle,” said lead author Cristian David Peña Martinez, PhD, also of the Garvan Institute of Medical Research.

“This suggests that it plays a dynamic role in regulating gene activity.”

“We also discovered that i-motifs form in the promoter regions of cancer genes, for example the MYC oncogene, which encodes one of cancer's most notoriously ‘untreatable’ targets.”

“This opens up exciting opportunities to target disease-related genes through i-motif structures.”

“The widespread presence of the i-motif near these 'holy grail' sequences implicated in hard-to-treat cancers opens up new possibilities for novel diagnostic and therapeutic approaches,” said study co-author Sarah Kummerfeld, PhD, a researcher at the Garvan Institute of Medical Research.

“It may be possible to design drugs that target the i-motif to affect gene expression, potentially expanding current treatment options.”

The team's results were published in The EMBO Journal.

_____

Cristian David Peña Martinez et al. i-motif structures are widely distributed in human genomic DNA. EMBO J. Published online August 29, 2024; doi: 10.1038/s44318-024-00210-5

Source: www.sci.news

New Study Shows Common Kitchen Worktop Material Can Lead to Irreversible Lung Disease

Doctors are calling for a ban on artificial stone, a popular material used for kitchen worktops, following the confirmation of eight cases of artificial stone silicosis in the UK for the first time.

Also known as engineered or reconstituted stone, artificial stone has gained popularity for its aesthetics and durability over the last two decades. However, a new report published in the journal Thorax highlights the serious health risks posed by its high silica content, which exceeds 90%, compared with about 3% in marble and 30% in granite.

“Silicosis is a progressive lung disease caused by inhaling crystalline silica dust,” Dr. Patrick Howlett told BBC Science Focus. “The risk of developing the disease is significantly higher for workers in the artificial stone industry.”


“Various industries expose individuals to silica dust, including mining, pottery, cement work, and now artificial stone fabrication. Prolonged exposure to even low levels of the dust can lead to silicosis over time,” added Dr. Howlett.

All eight affected individuals were male, with an average age of 34, and most worked for small businesses with fewer than 10 employees. Poor safety practices, such as inadequate respiratory protection and ventilation systems, were reported by workers during cutting and grinding operations.

The report’s authors emphasized the need for national guidelines and better enforcement to protect workers from artificial stone silicosis. They highlighted the urgent need for early detection of cases and preventative measures to avoid a potential epidemic.

Since 2010, cases of artificial stone silicosis have been reported worldwide, but the UK confirmed its first cases in mid-2023. California has identified nearly 100 cases of silicosis among countertop workers, prompting the adoption of new regulations to safeguard workers.

Australia has already banned the use of artificial stone as of July 2024, aiming to eliminate the health risks associated with its production and installation.

In related editorials, Dr. Christopher Barber and researchers from Sheffield Teaching Hospitals NHS Foundation Trust drew parallels between artificial stone silicosis and historical occupational health crises, urging stricter regulations and enforcement to protect workers.

Experts are currently reviewing exposure limits for crystalline silica dust in the UK, with a focus on mitigating the risks associated with artificial stone worktops. Silicosis remains a significant concern for clinicians and researchers in the occupational health field.

About our experts

Patrick Howlett: An MRC Clinical Research Fellow at the National Heart and Lung Institute, Imperial College London, focusing on silicosis and tuberculosis among small-scale miners in Tanzania.

Christopher Barber: A leading expert in occupational and environmental lung disease, serving as a medical advisor to the UK Health and Safety Executive and conducting extensive research in the field.



Source: www.sciencefocus.com

The common misconception that moderate alcohol consumption is beneficial for your health

Drinking alcohol is bad for you, but it is often a social activity.

Violeta Stoymenova/Getty Images

Rigorous research suggests that drinking even small amounts of alcohol can shorten your lifespan, and that the apparent benefits of moderate drinking arise largely because comparison groups include people who quit drinking due to serious health problems. That's the conclusion of a review of 107 studies that looked at how drinking alcohol at specific ages affects the risk of dying from all causes.

“People need to be skeptical of the claims that the industry has been peddling for years,” says Tim Stockwell, a researcher at the University of Victoria in Canada. “They clearly have a strong interest in promoting their products as not cancer-causing but as life-prolonging.”

Stockwell says people should be told that while the risks of moderate drinking are small, it's not beneficial. “It may not be as dangerous as a lot of other things, but it's important that consumers are aware,” he says. “I also think it's important that manufacturers inform consumers of the risks through warning labels.”

The best way to assess the effects of alcohol would be to randomly assign people to drink or abstain from childhood, then monitor their health and drinking for the rest of their lives. Because such studies are not possible, researchers instead have to ask people about their drinking habits and follow them over a much shorter period of time.

By the 2000s, a number of studies of this kind had been done, suggesting that the relationship between alcohol consumption and risk of death at a given age follows a J-shaped curve: drinking a little alcohol slightly reduces your risk of dying from any cause compared with a non-drinker, but as you drink more alcohol, your risk increases sharply.

Stockwell says he was convinced the science was well established at the time, but he and other researchers have since shown that such studies have serious flaws.

The main problem is that they often don't compare people who have never drunk alcohol with people who drink. Many studies instead compare people who no longer drink with people who still drink. People who stop drinking, especially later in life, often have health problems, so moderate drinkers seem healthy in comparison, Stockwell says.

Although some studies claim to compare current drinkers with “never drinkers,” the definition of the latter group often actually includes occasional drinkers, Stockwell says. For example, one study defined people who had up to 11 drinks a year as lifetime abstainers.

“In our opinion, the majority of research has not addressed this potential source of bias,” Stockwell says. “To be clear, people have tried to address this, but we don't think they've done so adequately.”

In fact, his team found that of 107 studies they reviewed, only six adequately addressed these sources of bias, and none of those six found any risk reduction with moderate drinking.

“The [high-quality] research suggests a linear relationship,” Stockwell says. “The more you drink, the higher your risk. Our study looks at total mortality, and it's clear that heart disease is the main issue.”

It is very clear from the review that lower-quality studies are more likely to suggest a beneficial effect, says Duane Mellor at the British Dietetic Association.

But he points out that this doesn't take into account the social aspects of moderate drinking. “While it's healthier to socialize without drinking alcohol, the benefits of spending time with other people are likely to outweigh the risks of consuming one or two units of alcohol,” he says. “Perhaps the challenge is to limit alcohol intake in this way.”


Source: www.newscientist.com

New understanding suggests LUCA, the last common ancestor of all life, emerged earlier than previously believed

Illustration showing LUCA possibly being attacked by a virus

Scientific Graphic Design

The organisms that gave rise to all life on Earth evolved much earlier than previously thought – just a few hundred million years after Earth formed – and may have been more sophisticated than previous assessments had suggested.

The DNA of all living organisms today, from E. coli to blue whales, shares many similarities, suggesting that we can trace our origins back to a last universal common ancestor, LUCA, billions of years ago. While many efforts have been made to understand LUCA, a study taking a broader approach has revealed surprising results.

“What we're trying to do is bring together representatives from different disciplines to develop a comprehensive understanding of when LUCA existed and what its biological characteristics were,” says Philip Donoghue at the University of Bristol, UK.

Genes that are currently present in all major lineages of life may have been passed down uninterrupted from LUCA, which could help us understand what genes our ancient ancestors had. By studying how these genes changed over time, we should be able to estimate when LUCA lived.

In reality, this is a lot more complicated than it sounds, as genes are lost, gained, and swapped between branches. Donoghue says the team created a complex model that took this into account, to work out which genes were present in LUCA. “We've found a much more sophisticated organism than many have previously claimed,” he says.

The researchers estimate that 2,600 protein-coding genes come from LUCA, up from previous estimates of as few as 80. The team also concludes that LUCA lived around 4.2 billion years ago, much older than other estimates and surprisingly close to the formation of Earth 4.5 billion years ago. “This suggests that the evolution of life may have been simpler than previously claimed, because evolution happened so quickly,” Donoghue says.

The earlier date is largely due to the team's improved methodology, but also because, unlike others, they don't assume that LUCA could have existed only after the Late Heavy Bombardment, when Earth was hit so hard by space debris that any new life that emerged could have been wiped out. Based on rocks returned from the Moon, the period has been put at 3.8 billion years ago, but there's a lot of uncertainty around that number, Donoghue says.

Their reconstruction suggests that LUCA had genes that protected it from ultraviolet damage, which leads them to believe that it likely lived on the ocean's surface. Other genes suggest that LUCA fed on hydrogen, which is consistent with previous findings. The team speculates that LUCA may have been part of an ecosystem with other types of primitive cells that are now extinct. “I think it's extremely naive to think that LUCA existed on its own,” Donoghue says.

“I think this is compelling from an evolutionary perspective,” says Greg Fournier at the Massachusetts Institute of Technology. “LUCA is not the beginning of the story of life, but merely the state of the last common ancestor that we can trace back to using genomic data.”

The results also suggest that LUCA had a primitive version of the bacterial defense system known as CRISPR to fight viruses. “Even 4.2 billion years ago, our earliest ancestors were fighting viruses,” says team member Edmund Moody, also at the University of Bristol.

Peering into the distant past is fraught with uncertainty, and Donoghue is the first to admit that his team may have missed the mark. “We've almost certainly got it all wrong,” he says. “What we're trying to do is push the envelope and create the first attempt to synthesize all of the relevant evidence.”

“This won't be the last word,” he said, “and it won't be our last word on this subject, but we think it's a good start.”

Patrick Forterre, a researcher at the Institut Pasteur in Paris, France, who coined the term LUCA, also believes that the organism did not live in isolation. “But the claim that LUCA lived before the Late Heavy Bombardment 3.9 billion years ago seems to me completely unrealistic,” says Forterre. “I'm convinced that their strategy for determining the age and gene content of LUCA has several flaws.”


Source: www.newscientist.com

Study reveals last common ancestor lived 4.2 billion years ago

The Last Universal Common Ancestor (LUCA) is a hypothetical common ancestor of all modern cellular life, from single-celled organisms such as bacteria to giant sequoia trees and even to us humans. Our understanding of LUCA therefore has implications for our understanding of the early evolution of life on Earth.

Probabilistic inference of LUCA's metabolic network from modern organisms. Image credit: Moody et al., doi: 10.1038/s41559-024-02461-1.

LUCA is a node on the tree of life from which the basic prokaryotic domains (Archaea and Bacteria) branch off.

The evidence that modern life evolved from LUCA comes from a variety of shared features: the same amino acids are used to build proteins in all cellular organisms, a common energy currency (ATP) is shared, cellular machinery such as ribosomes creates proteins from information stored in DNA, and all cellular organisms use DNA itself as a way to store information.

In the new study, University of Bristol scientist Edmund Moody and his colleagues compared all the genes in the genomes of modern species and counted the mutations that had occurred in the sequences over time since a common ancestor called LUCA.

The time when some species split off is known from the fossil record, and the team used a genetic equivalent of a familiar equation used in physics to calculate speed to determine when LUCA existed, arriving at 4.2 billion years ago – just 400 million years after Earth and the solar system formed.
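
That familiar equation is speed = distance / time, rearranged: if mutations accumulate at a roughly steady rate, the number of substitutions separating two lineages divided by that rate gives the time since they split, with fossil-dated splits calibrating the rate. A minimal sketch of the idea (the numbers below are illustrative placeholders, not values from the study):

```python
# Molecular-clock dating: time = genetic distance / substitution rate,
# the genetic analogue of time = distance / speed.
# All numbers below are illustrative placeholders, not study values.

def divergence_time(substitutions_per_site: float,
                    rate_per_site_per_year: float) -> float:
    """Years since two lineages split, assuming a constant clock.
    Distance between two lineages accrues along both branches,
    hence the factor of 2."""
    return substitutions_per_site / (2 * rate_per_site_per_year)

# e.g. 0.8 substitutions/site at 1e-10 substitutions/site/year
print(f"{divergence_time(0.8, 1e-10):.2e} years")  # 4.00e+09 years
```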

“The evolutionary history of genes is complicated by the exchange of genes between lineages,” Dr Moody said.

“Reconciling the evolutionary history of genes with species lineages requires the use of complex evolutionary models.”

“We didn't expect LUCA to be so old, within just a few hundred million years of Earth's formation,” said Dr Sandra Alvarez-Carretero, also from the University of Bristol.

“But our findings are consistent with modern views of the habitability of early Earth.”

The study authors also traced the lineage of life back to LUCA and modeled the physiological traits of modern species to elucidate LUCA's biology.

“One of the real advantages here is that we applied the gene tree and species tree reconciliation approach to a highly diverse dataset representing the major domains of life: Archaea and Bacteria,” said Dr Tom Williams from the University of Bristol.

“This allows us to make statements with some confidence about how LUCA lived and to assess that level of confidence.”

“Our study shows that LUCA was a complex organism not too different from modern prokaryotes, but what's really interesting is that LUCA clearly had an early immune system, indicating that by 4.2 billion years ago our ancestors were in an arms race with viruses,” said Professor Davide Pisani, from the University of Bristol.

“LUCA clearly used and transformed its environment, but it is unlikely to have lived alone,” said researcher Dr Tim Lenton, from the University of Exeter.

“That waste would then serve as food for other microorganisms, such as methanogens, helping to create a recycling ecosystem.”

“The insights and methods provided by this study will also inform future studies looking in more detail at the subsequent evolution of prokaryotes in the context of Earth's history, including the less-studied archaea and their methanogens,” said Professor Anja Spang, researcher at the Royal Netherlands Institute for Marine Research.

“Our study brings together data and methods from multiple disciplines, revealing insights into the early Earth and life that could not be achieved by any single discipline alone,” said Professor Philip Donoghue, from the University of Bristol.

“It also shows how quickly ecosystems were established on the early Earth.”

“This suggests that life may thrive in an Earth-like biosphere somewhere in the universe.”

The study was published today in the journal Nature Ecology & Evolution.

_____

E.R.R. Moody et al. The nature of the last universal common ancestor and its impact on the early Earth system. Nat. Ecol. Evol. Published online July 12, 2024; doi: 10.1038/s41559-024-02461-1

This article is a version of a press release provided by the University of Bristol.

Source: www.sci.news

Science debunks 7 common myths about your reality

Our perception of reality is quite limited because we evolved on the African plains 3 million years ago. Our senses were shaped to help us survive in that environment, with eyes that can detect approaching predators and ears that can hear the rustling of grass.

Although our senses have given us a basic understanding of the world, they also deceive us at times. The majority of nature remains hidden from us, and things are not always as they appear.

Here are a few examples of things that seem obvious but are not necessarily true:

1. The Earth is flat

Many ancient peoples believed the Earth was a disk. – Photo credit: Alamy

While the Earth may appear flat, evidence such as ships disappearing over the horizon and the curved shadow of the Earth on the Moon during a lunar eclipse point to its spherical nature. Observations like the first circumnavigation of the globe also support the round Earth theory.

Proving the Earth’s size involved measurements and calculations, with early estimates by Eratosthenes aligning closely with modern figures.
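Eratosthenes' method boils down to one proportion: with the Sun's rays effectively parallel, the shadow angle measured at Alexandria while the Sun was directly overhead at Syene equals the arc of Earth between the two cities, so circumference = 360° divided by that angle, times the distance between them. A quick check with the traditional figures (about 7.2° and roughly 800 km):

```python
# Eratosthenes' estimate of Earth's circumference:
# circumference = 360 / shadow_angle * distance between the two cities.
# Traditional values: ~7.2 degrees and ~800 km (about 5,000 stadia).

shadow_angle_deg = 7.2   # Sun's angle from vertical at Alexandria
distance_km = 800        # Alexandria-to-Syene distance

circumference_km = 360 / shadow_angle_deg * distance_km
print(circumference_km)  # 40000.0 -- close to the modern ~40,075 km
```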

2. The stars revolve around the Earth

It may seem logical that the stars move around a stationary Earth, but evidence such as the deflection of artillery shells and the Foucault pendulum disproves this. The invention of the Foucault pendulum provided direct physical proof of the Earth's rotation.

3. Living things are designed to suit their habitats

The apparent design in nature arises from random mutation filtered by natural selection rather than from intentional design. DNA underpins how organisms come to fit their environments over generations.

4. Your time is the same as everyone else’s

Speeds close to the speed of light and strong gravitational fields (such as near a black hole) distort time. – Photo credit: Science Photo Library

The concept of time is influenced by speed and gravity, as demonstrated by Einstein’s theories. Time dilation occurs in different gravitational fields, impacting the flow of time.

5. The moon won’t fall

Newton's insights about gravity and orbital mechanics explain why the Moon stays in orbit rather than plummeting to the Earth: it is perpetually falling toward the planet while moving sideways fast enough to keep missing it. Objects in free fall experience weightlessness even though gravity still acts on them.
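
Newton's own sanity check makes the point concrete: the Moon orbits at about 60 Earth radii, so an inverse-square law predicts it should fall toward Earth at g/60², and that is exactly the centripetal acceleration its orbit requires. A quick verification with standard textbook values:

```python
import math

# Newton's Moon test: the Moon "falls" toward Earth with exactly the
# acceleration an inverse-square law of gravity predicts at its distance.

g = 9.81            # surface gravity, m/s^2
R_earth = 6.371e6   # Earth's radius, m
r_moon = 3.844e8    # Earth-Moon distance, m (~60 Earth radii)
T = 27.32 * 86400   # sidereal month, s

predicted = g * (R_earth / r_moon) ** 2       # inverse-square prediction
centripetal = 4 * math.pi**2 * r_moon / T**2  # a = (2*pi/T)^2 * r

print(f"predicted   {predicted:.5f} m/s^2")   # ~0.00269
print(f"centripetal {centripetal:.5f} m/s^2") # ~0.00272
```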

6. Stars are tiny dots on the celestial sphere

The Milky Way galaxy contains over 100 billion stars. – Photo credit: Getty

The apparent size of stars is deceiving, with parallax observations revealing their true distance and magnitude. Spectral analysis further confirms the nature of stars as distant suns.

7. We can know what the universe is like “now”

The concept of “now” is complex in a universe where light travels slowly through vast distances. Observations of distant objects reflect their past states, allowing us to study the history of the universe but not its current state.

Source: www.sciencefocus.com

The Role of a Common Bacterium in the Sudden Deaths of 200,000 Saiga Antelopes

A saiga walks into a bar. The bartender asks, “Why the long face?” The saiga replies, “A long nose helps me filter out dust in the summer and warm the cold air in winter. Plus, female saigas love big noses.”

Despite its unusual appearance, the saiga antelope has even stranger qualities. In May 2015, during breeding season in central Kazakhstan, a mysterious tragedy struck. Over 200,000 saigas, around 60% of the global population, died from unknown causes.

Conservation efforts had been ongoing to protect the saigas, which had been hunted for their horns in the past centuries, leading to a decline in their numbers. The sudden mass die-off in 2015 shocked experts and led to extensive testing and analysis.

After thorough investigations, it was determined that a strain of bacteria, Pasteurella multocida, had caused the fatal infection in the saigas. This outbreak was possibly triggered by unusual weather conditions, sparking concerns about future die-offs.

Despite these challenges, conservation efforts have been successful in stabilizing the saiga population, with estimates now around 1.5 million. Strict measures like anti-poaching initiatives, habitat protection, and community engagement have contributed to this recovery.

The International Union for Conservation of Nature recently reclassified the saiga from “endangered” to “near threatened,” signaling progress in their conservation. However, researchers remain cautious about the species’ future due to ongoing threats.


Source: www.sciencefocus.com

Twin cicada broods emerge together – a common occurrence

They come again with their beady little eyes

George Baird/Shutterstock

Every spring, billions of cicadas emerge from their underground burrows. This happens so often that we don't usually mention it. But this year, two broods, both of which have been underground for more than a decade, will emerge simultaneously, blanketing parts of the United States with trillions of bugs. They'll create a racket, an all-encompassing buzz stretching from the Atlantic Ocean toward the Canadian border. So have the news articles announcing their arrival, billing it as a historic event unlike anything since 1803. But that depends on how you look at it.

So is this rare? Has it really been more than a century since multiple broods of cicadas emerged together?
Hmm, no. This happens fairly often. In the United States, 3 cicada broods emerge every 13 years and 12 broods emerge every 17 years. Two broods lined up in 2014, and another two lined up in 2015.

This year, Brood XIX, also known as the Great Southern Brood, will spread across more than a dozen states in the southern United States. At the same time, Brood XIII will appear in several states around the Great Lakes in the north. The last time these two broods emerged together was in 1803, and they will not sync up again until 2245.
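
The 1803 and 2245 dates follow from simple arithmetic: 13 and 17 share no common factor, so a 13-year brood and a 17-year brood can only coincide every 13 × 17 = 221 years. A quick check:

```python
import math

# Co-emergence interval of a 13-year and a 17-year brood:
# 13 and 17 are coprime, so they align every lcm(13, 17) = 221 years.
interval = math.lcm(13, 17)
print(interval)              # 221
print(1803 + interval)       # 2024 -- this year's double emergence
print(1803 + 2 * interval)   # 2245 -- the next one for this pair
```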

But the insects don't arrive with little stamps on their wings identifying which brood they belong to. The most noticeable thing to the average person might be the loud soundtrack of a summer night, that classic drone of cicadas, emanating from a much larger chorus than usual.

But has there been any news about cicadas lately?
Indeed, there is. For example, in 2021, the Great Eastern Brood emerged with a bang after 17 years underground, hitting densely populated areas such as Philadelphia, Pennsylvania, and Washington, DC. The hype was enough to draw insect-seeking tourists and spur culinary creations such as cicada scampi. (Note to foodies: cicadas are related to shrimp and lobsters, and the US Food and Drug Administration warns that people with shellfish allergies should avoid eating them.)

So why is cicada emergence in 2024 attracting so much attention? Will more cicadas arrive than ever before?
The issue is not so much the number of cicadas as the geographic extent these red-eyed insects will cover. “Double brood emergence is not unprecedented, but this one is notable for its wide geographic range,” says Jonathan Larson at the University of Kentucky. “It will be a great force of nature.”

This is especially true because the two broods' combined ranges mean far more people than usual may witness the phenomenon.

Is there any reason to worry about cicadas?
Cicadas do not bite or sting, so while they may be a temporary nuisance to some people, they are not dangerous. “These may be the only [broods] you experience in a lifetime,” says Jessica Ware at the American Museum of Natural History in New York. “So instead of getting annoyed by the sound, just enjoy it.”

Why do periodical cicadas live this way?
Cicadas burrow underground as nymphs, feeding on sap from tree roots for years at a time. When the soil warms up in May and June, they crawl out of their burrows and immediately look for vertical surfaces (trees, houses, cars) to scale. “You'll wake up one morning and all of a sudden there'll be cicadas everywhere,” says Chris Simon at the University of Connecticut.

After about a week, they shed their exoskeletons and reach their final adult form. The males then woo females by vibrating membranes on their bodies, producing a cacophonous song. After mating, the females lay eggs and all the adults die; within a month, the cicadas will be gone. This strange waiting game is part of the periodical cicada's survival tactics: birds and other predators quickly eat their fill at the buggy buffet, but with thousands of insects flooding the area at once, plenty of cicadas survive. And the 13- and 17-year cycles are so infrequent and unpredictable that predators and diseases struggle to keep up.

It is not clear exactly how cicadas time their prime-numbered cycles, but most scientists agree that the insects measure the passage of the years through environmental signals from the trees they feed on.

Will something like this happen again soon?
The next double emergence won't occur until 2037, so cicada researchers want to sample and study as many insects as possible now. They are also interested in seeing how many cicadas actually emerge, since many years have passed since these broods last went underground. Nutrients in the soil around the insects may have changed due to fertilizer use, plants growing in the area, or even climate change, and it is possible that something has been built on top of a cicada's bed. That's the gamble these insects are making.

After 2037, the next double emergence will come in 2041, followed by more in 2050, 2053, and 2054. As the name periodical cicada suggests, it will continue like this on a regular schedule for as long as cicadas exist.


Source: www.newscientist.com

PFAS – The Persistent Chemicals – Are Becoming Common in Food Packaging

Potentially dangerous chemicals may be present in food packaging

Pirin Petunia/Getty Images

Food packaging and utensils commonly contain up to 68 “forever chemicals” that can pose health risks, many of which regulators are potentially unaware of.

Perfluoroalkyl and polyfluoroalkyl substances (PFAS) are a type of synthetic chemical used to make products such as nonstick cookware and waterproof clothing. The bonds between carbon and fluorine atoms in PFAS are so strong that it can take hundreds to thousands of years for the bonds to break down.

Many of these chemicals are associated with adverse health outcomes, including cancer and problems with reproduction and immunity.

“There are thousands of these chemicals,” says Birgit Geueke at the Food Packaging Forum, a Swiss research organization. “We wanted to understand what information is known about the presence of PFAS in food packaging.”

Geueke and her colleagues analyzed 1,312 studies conducted around the world that looked in detail at chemicals that come into contact with food and can transfer into it during manufacturing, packaging, and cooking. They then cross-referenced these chemicals against a list of known PFAS.

The research team found that 68 types of PFAS are commonly present in food-contact materials such as packaging and utensils. Of these, 61 were absent from regulatory lists of PFAS intended for use in such materials, because they were not previously known to be present in them.

Of the 68 PFAS, only 39 have been tested for toxicity. One of the substances analyzed was perfluorooctanoic acid (PFOA), which is classified as carcinogenic to humans based in part on evidence that it can cause testicular and kidney cancer, Geueke said.

“I think it’s the manufacturer’s responsibility to minimize the use of PFAS,” she says. Regulators around the world are working in the right direction, she says. For example, the European Union recently proposed banning most PFAS.


Source: www.newscientist.com

Common household products and cosmetics found to impact cell epigenetics

New research has found that formaldehyde poses serious risks to epigenetics, interfering with gene activity and potentially causing cancer and other diseases. The study emphasizes the need for stricter policies to limit exposure to formaldehyde, given its prevalence in household products, cosmetics, polluted air, building materials, and various industries.

The research, conducted by Dr. Manel Esteller and Dr. Lucas Pontel from the Josep Carreras Leukemia Research Institute and Dr. Christopher J. Chang from the University of California, Berkeley, focused on the effects of high formaldehyde concentrations in the body. The study revealed formaldehyde's harmful impact on normal epigenetic patterns and its association with cancer, liver degeneration, and increased asthma risk.

Formaldehyde is commonly found in products used in construction, furniture manufacturing, textiles, and hair products, as well as in polluted air and in the metabolism of certain foods. It can also be produced within the body, and it has the potential to alter the epigenetic landscape of cells.

The study concluded that formaldehyde inhibits the production of S-adenosyl-L-methionine (SAM), a universal donor of the methyl chemical group that regulates genetic activity. This decrease in SAM content leads to a loss of methylation of histones, proteins that package DNA and control gene function, contributing to formaldehyde’s carcinogenic properties.

As such, the researchers stressed the need for environmental and health policies aimed at reducing exposure to formaldehyde in various industries and environmental sources. Despite restrictions by international health authorities, there are still areas of work where formaldehyde is used at high levels, necessitating further regulations to minimize exposure to this hazardous substance.

Reference: Vanha N. Pham, Kevin J. Bruemmer, Joel D. W. Toh, Eva J. Ge, Logan Tenney, Carl C. Ward, Felix A. Dingler, Christopher L. Millington, Carlos A. Garcia Prieto, Mia C. Pross Holmes, Nicholas T. Ingolia, Lucas B. Pontel, Manel Esteller, Ketan J. Patel, Daniel K. Nomura, and Christopher J. Chang, November 3, 2023, Science. DOI: 10.1126/science.abp9201

Source: scitechdaily.com

Artificial Intelligence Will Not Eliminate Jobs, Despite Common Misconceptions

New research reveals that work experience has a significant impact on how employees interact with AI. Employees with more experience of a particular task benefit more from AI, but senior employees are less likely to trust AI because of concerns about its imperfections. The findings highlight the need for customized strategies when integrating AI into the workplace to enhance human-AI teamwork.

New research sheds light on the complex aspects of human-AI interaction and reveals some surprising trends. Artificial intelligence systems tend to benefit younger employees, but not for the reasons you might expect.

New research published in the INFORMS journal Management Science provides valuable insights for business leaders about the impact of work experience on employees’ interactions with artificial intelligence.

The study examined two main forms of human work experience to understand their impact on the dynamics within human-AI teams: narrow experience, defined by the volume of a specific task performed, and broad experience, characterized by overall seniority.

Surprising findings from medical record coding research

“We developed an AI solution for medical record coding at a publicly traded company and conducted field research with knowledge workers,” says Weiguang Wang of the University of Rochester. “We were surprised by what we found in our research: Different dimensions of work experience clearly interact with AI and play a unique role in human-AI teaming.”

“While some might think that less experienced workers should benefit more from the help of AI, we find the opposite: AI benefits workers with more task-based experience. At the same time, even though senior employees have more experience, they gain less from AI than junior employees,” says Guodong (Gordon) Gao of the Johns Hopkins Carey School of Business.

Seniority and AI trust dilemma

Further analysis revealed that the relatively low productivity gains among senior workers were not the result of seniority per se, but rather of a heightened sensitivity to AI’s imperfections, which led to a decline in their trust in AI.

“This finding presents a dilemma: experienced employees are well positioned to leverage AI for productivity, but senior employees, who take on greater responsibility and care about their organization, tend to avoid AI because they are aware of the risks of relying on it. As a result, they are not using AI effectively,” said study co-author Ritu Agarwal of the Johns Hopkins Carey School of Business.

The researchers urge employers to carefully consider workers’ types and levels of experience when introducing AI into jobs. New employees with little work experience are at a disadvantage when it comes to utilizing AI, while senior employees with more experience in an organization may be wary of the potential risks posed by AI. Addressing these distinct challenges is key to productive human-AI teaming.

Reference: “Friend or Foe? Teaming Between Artificial Intelligence and Workers with Variation in Experience” by Weiguang Wang, Guodong (Gordon) Gao, and Ritu Agarwal, October 11, 2023, Management Science.
DOI: 10.1287/mnsc.2021.00588

Source: scitechdaily.com

Ineffective Common Shoulder Treatments Identified

A new study concludes that ultrasound-guided irrigation with saline injection, combined with a steroid injection, is no more effective than placebo in treating shoulder calcific tendinopathy, calling current treatment into question and emphasizing the need for further research and alternative approaches.

Results from the recent trial suggest that the use of this therapy should be reevaluated.

A clinical trial recently published in BMJ finds that a saline irrigation treatment commonly employed for calcific tendinopathy, a painful condition caused by calcium buildup in the rotator cuff tendons of the shoulder, offers no significant advantage.

The study demonstrated that the benefits of ultrasound-guided irrigation (a procedure in which saline is injected into the calcium deposits to dissolve and flush them out), even when combined with steroid injections, are no greater than those gained from sham (placebo) treatment.

Researchers say the findings call into question the use of ultrasound-guided irrigation for this condition and should lead to a “significant reconsideration” of existing treatment guidelines.

Research background and methodology

Despite its widespread use, ultrasound-guided irrigation has never been compared with sham treatment, so it remains unclear whether the reported improvements are due to the treatment itself, to natural recovery over time, or to a placebo effect.

To fill this important evidence gap, researchers from Norway and Sweden conducted the first sham-controlled study testing the true effectiveness of ultrasound-guided irrigation with steroid injection in patients with shoulder calcific tendinopathy.

Their findings are based on 218 adults (average age 50, approximately 65% female) recruited between April 2015 and March 2020.

At the beginning of the trial, patients provided information about various health and lifestyle factors, and X-rays were taken to assess the size of their calcium deposits.

Patients were then randomly assigned to one of three treatment groups: irrigation plus steroid injection (73 participants), sham irrigation plus steroid injection (74 participants), or sham treatment only (71 participants). After treatment, all patients were asked to complete a home exercise program.
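
Purely as an illustration, a minimal sketch in Python of a simple random allocation at the reported group sizes might look like the following; the patient identifiers, arm labels, and shuffling procedure are assumptions for the sketch, since the trial's actual randomization method is not detailed here:

    import random

    # Hypothetical sketch of a three-arm allocation like the one described above.
    # Group sizes match the reported counts; patient IDs are invented placeholders.
    arms = (["irrigation + steroid"] * 73
            + ["sham irrigation + steroid"] * 74
            + ["sham only"] * 71)

    patients = [f"patient_{i:03d}" for i in range(1, 219)]  # 218 participants

    rng = random.Random(42)  # fixed seed so the sketch is reproducible
    rng.shuffle(arms)

    allocation = dict(zip(patients, arms))
    print(allocation["patient_001"])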

Evaluation and results

The primary outcomes were patient-reported pain intensity and functional disability on the Oxford Shoulder Score (a 0-48 point scale), assessed at 2 weeks, 6 weeks, and 4, 8, 12, and 24 months.

At 4 months, there were no significant differences in pain and functional limitation between the three groups. At subsequent evaluations, scores remained similar, even in patients whose calcium deposits had disappeared, which the researchers say casts doubt on the notion that dissolving the periarticular calcium is what resolves symptoms.

The groups that received steroid injections reported better pain relief than the sham-only group at 2 and 6 weeks after treatment, but notably, from 4 months onward their improvement was no different from that of the sham group.

Findings and recommendations

Although the researchers acknowledge some limitations, including the lack of an untreated group to assess the natural course of symptoms, they state that the double-blind, three-group design, including a sham arm, allowed them to evaluate the true clinical efficacy of the active treatments.

They conclude: “Our results question existing recommendations for the treatment of calcific tendinopathy and may require a critical reexamination of established treatment concepts for these patients.”

Future studies should investigate alternative treatments, such as defined physical therapy programs, and should include no-treatment groups to assess the impact of the natural history of calcific tendinopathy on outcomes, the researchers add.

In a linked editorial, US researchers say that irrigation appears to be overused and may not be as effective as once thought. However, they caution that it would be premature to conclude that ultrasound-guided irrigation or subacromial corticosteroid injections no longer have a role in the treatment of shoulder calcific tendinopathy.

The new findings should inform discussions with patients and may offer some reassurance that symptoms in this population tend to resolve with time and that corticosteroids can provide short-term pain relief, the editorial’s authors add.

They also suggest that future studies should include sham control groups, assess treatment response earlier in the course of symptoms, and investigate whether ultrasound classification systems can better predict treatment response.

Reference: “Ultrasound-guided lavage with corticosteroid injection versus sham lavage with corticosteroid injection for calcific tendinopathy of the shoulder: a randomized, double-blind, multi-arm study” by Stefan Moosmayer, Ole Marius Ekeberg, Hanna Björnsson Hallgren, Ingar Heier, Synnøve Kvalheim, Niels Gunnar Juel, Jesper Blomquist, Hugo Ripp, and Jens Ivar Brox, October 11, 2023, BMJ.
DOI: 10.1136/bmj-2023-076447

This study was funded by the Bergersen Foundation, the Aase Bye and Trygve J.B. Hoffs Foundation, Smith and Nephew, and the Medical Research Council of South East Sweden.

Source: scitechdaily.com