Juice by Tim Winton: An Australian Climate Novel That Captivates Readers

New Scientist Book Club’s February selection: Tim Winton’s novel ‘Juice’

In January, the New Scientist Book Club explored the implications of sex robots through Sierra Greer’s Annie Bot. For February, we turned to Tim Winton’s vivid portrayal of a future Australia in Juice.

Winton’s narrative is told by an unnamed protagonist recounting life in a dangerously overheated world, gradually revealing his role in meting out punishment to those whose actions exacerbated climate change, and probing what survival in such a world demands.

I found Juice to be a captivating read—utterly gripping and profoundly unsettling. But what were the book club’s impressions? The novel spurred lively discussions on our platform. In a positive review, Glen Johnson expressed his admiration, noting Winton’s adept descriptions of adaptations in a familiar climate zone, referring to the narrative as a “natural evolution of the resourceful Australian landscape.”

Victor Churchill echoed this sentiment: “Despite the harsh circumstances, it offers a surprisingly optimistic tone. While the plot presented some hurdles, it was overall exceptionally engaging.” He appreciated how the author allows readers intimate moments of discovery through the protagonist’s journey.

Kim Woodhams Crawford shared similar thoughts, commending the novel’s forecasts about potential climate disasters. “Regardless of political narratives, there’s no escaping the reality of severe temperature rises,” she cautioned.


However, not all responses were overwhelmingly positive. “Admittedly, I struggled with the novel’s initial chapters and nearly stopped reading,” Linda Jones confessed. “But once the backstory unfolded, my interest spiked dramatically.” Phil Gurski also remarked on the slow start of the book.

Opinions diverged on Winton’s narrative style. While some appreciated the unique voice of the imprisoned protagonist, others found it less convincing. “The writing evokes a sense of magical realism,” Gosia Furmanik suggested, although Jacqueline Ferrand posed a critical question: “In a dystopian reality, would a stranger truly want to know the complete history of your past?” Steve Swann, on the other hand, expressed frustration, stating he’d likely have taken drastic action if placed in the protagonist’s shoes.

A major topic of debate was the novel’s status as a dystopia. Winton himself wrote in an essay for us, “Dystopia is sometimes a word that desensitizes us to reality, and we can’t afford that.” Members engaged deeply with this theme.

Victor expressed, “This doesn’t feel like a dystopia per se; I perceive it more as a post-dystopian narrative where society has adapted to its harsh realities.” Margaret Buchanan added, “We won’t ascertain if this narrative is truly dystopian until future generations reflect on it amidst current climatic challenges.”

Conversely, Niall Leighton argued that the real-world experiences of many people mirror the novel’s depiction of dystopia. “It’s a semantic debate: can the essence of living in a dystopian nightmare be recognized as living in a dystopia?” he wrote. He emphasized that for him, Winton’s work unmistakably inhabits that genre.

Niall further posited the provocative idea: Can envisioning a dystopian future deter its actualization? “I agree with Tim Winton that we need to confront our reality instead of relating through dystopian narratives. What we truly require are stories that inspire us to build better, inclusive worlds,” he stated. This encourages reflection for many of us, myself included.

Meanwhile, Gosia raised concerns about the plausibility of Winton’s narrative choices, questioning whether killing descendants of the fossil fuel elite was a logical response to climate crises. She lamented that such actions seemed futile against the continuous decline of our environment.

As for the novel’s conclusion, I personally cherished the nuances of hope and ambiguous endings, which resonate with me. Samantha de Vaux shared her perspective, acknowledging that while a more positive outcome could have been possible, she respects the author’s narrative course. “This complex book and its conclusion challenged me profoundly,” she remarked.

As we conclude our discussion of Winton’s novel, we pivot to our March selection—dystopian or not. Up next, I’ll delve into Daisy Fancourt’s insightful non-fiction, Art Cure: The Science of How Art Changes Our Health. A professor of psychobiology and epidemiology at University College London, Fancourt explores how art can elevate our mental and physical well-being, identifying it as the ‘forgotten fifth pillar of health’ alongside diet, sleep and exercise. A captivating excerpt, describing how an art class transformed one person’s recovery after a stroke, awaits readers. Join us in the New Scientist Book Club by signing up or by connecting with our Facebook group.


Source: www.newscientist.com

Should We Be Concerned About Asteroid Impacts on Earth? What You Need to Know

Could this dramatic image actually happen?

angel_nt/Getty Images

Deep in the cold void of space lies a potential asteroid threat that could obliterate most life on Earth. Is such a fate unavoidable? Can we potentially avert disaster, or are we fated for a catastrophic end similar to the dinosaurs? Here’s what science reveals.

The asteroid that led to the extinction of the dinosaurs 66 million years ago measured at least 10 kilometers across. Its massive impact resulted in catastrophic tsunamis, widespread wildfires, and global darkness. Estimates of Earth’s crater history suggest that an asteroid of this magnitude collides with Earth roughly every 60 million years. Smaller asteroids, around 1 kilometer in diameter, impact the Earth approximately every million years, with the last significant event occurring around 900,000 years ago. These statistics are understandably alarming.

However, unlike the dinosaurs, humans can observe and analyze the universe around us. Scientists are coordinating global efforts to catalog asteroids and assess which of them pose a threat.

Fortunately, among the thousands of near-Earth objects currently monitored by astronomers, only 35 present a risk greater than 1 in 1 million of colliding with Earth in the next century. Moreover, the vast majority of these potential threats measure less than 100 meters in diameter. So, is an extinction-level asteroid likely to strike during our lifetime? The probability is extremely low.
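To put those rates in perspective, here is a back-of-envelope calculation of my own (not from the article): treating impacts as a Poisson process, the quoted rate of roughly one kilometer-scale impact per million years translates into lifetime odds like so.

```python
import math

def prob_at_least_one_impact(rate_per_year, years):
    """Poisson probability of at least one impact within a time window."""
    expected = rate_per_year * years       # mean number of impacts in the window
    return 1 - math.exp(-expected)         # P(N >= 1) = 1 - P(N = 0)

# Quoted rate: roughly one 1-kilometer impact per million years
p = prob_at_least_one_impact(1 / 1_000_000, 80)
print(f"Chance of a 1-km impact during an 80-year lifetime: {p:.4%}")  # about 0.008%
```

For rare events like this, the probability is almost exactly the rate multiplied by the time window, which is why an event that is certain on geological timescales is negligible on human ones.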

Nonetheless, discerning readers will note the hedges in phrases like “currently monitored”, “extremely low” and “the vast majority”. Such wording implies that we can’t be sure we have detected every asteroid out there. Occasionally, sensational news breaks about a newly discovered asteroid approaching Earth, but in almost every instance these rocks pass safely by.

How do astronomers know what fraction of asteroids they have detected? They combine three factors: the number of asteroids already found, how much of the sky their surveys have covered, and the sensitivity of the telescopes used. On that basis, they estimate that every asteroid larger than 10 kilometers that could endanger Earth has been accounted for. Breathe easy: the likelihood of experiencing an event like the dinosaurs’ extinction is minimal.
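As a toy illustration (my own sketch, not the astronomers’ actual method): if you know what fraction of the sky a survey has covered and how often the telescopes catch an asteroid passing through their view, you can scale up the raw count to estimate the true population.

```python
def estimate_total_population(detected, sky_fraction_surveyed, detection_efficiency):
    """Naive completeness correction: divide the number of asteroids found
    by the fraction of the population the survey could plausibly have seen."""
    if not (0 < sky_fraction_surveyed <= 1 and 0 < detection_efficiency <= 1):
        raise ValueError("fractions must lie in (0, 1]")
    return detected / (sky_fraction_surveyed * detection_efficiency)

# Hypothetical numbers: 800 asteroids found by a survey covering the whole
# sky with telescopes that spot 80% of what crosses their field of view
print(estimate_total_population(800, 1.0, 0.8))  # -> 1000.0
```

Real completeness estimates are far more sophisticated (they account for orbits, brightness and observing geometry), but the underlying logic of correcting a count for what you could not have seen is the same.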

Currently, about 80 percent of kilometer-wide asteroids have been identified, indicating a low chance of an unforeseen impact. Asteroids smaller than 100 meters pose little threat; incidents like the Chelyabinsk meteor in 2013 typically cause only limited damage, because such rocks largely burn up on atmospheric entry.

However, the so-called “city killers”—asteroids of roughly 100 meters across—remain concerning, as we have detected fewer than half of them. If you are worried about asteroids, these smaller threats are the ones that warrant closer scrutiny.

Luckily, we have technology at our disposal that differentiates us from the dinosaurs. Our first line of defense involves monitoring space with advanced telescopes. Continuous efforts to observe near-Earth objects are underway, highlighted by the upcoming launch of the NEO Surveyor next year, which aims to greatly enhance our capacity to track these asteroids.

The second defense mechanism provided by space exploration is the capacity to respond if a threatening asteroid is detected. NASA’s 2022 Double Asteroid Redirection Test demonstrated the potential to redirect an asteroid, ensuring we could alter its path if necessary. Provided we have sufficient notice—typically requiring several years of monitoring—we can adjust trajectories to avert collision.

If an asteroid does hit Earth, the impact would be a natural but unusually predictable disaster, and the odds are that it would strike the ocean or an uninhabited region. According to the World Economic Forum, less than 15 percent of Earth’s land (under 4.3 percent of its total surface area) has been significantly modified by humans, and even less of it is inhabited.

If an asteroid were to threaten one of these few populated areas, we have strategies similar to managing any natural disaster: evacuation, damage control, and sheltering in place. Strengthening our overall disaster response capabilities prepares us for such scenarios and aids in managing more plausible and unpredictable disasters.

So, returning to our initial question: are asteroid impacts inevitable? Eventually, yes. Is there a solution? Very likely. Will we face a fate akin to the dinosaurs? If so, it lies far in the future. Instead of succumbing to worry, invest your energy in preparedness: learn how to respond to natural disasters, and keep an eye on asteroids like the vigilant scientists do.


Source: www.newscientist.com

Marine Geoengineering Test Shows No Harm to Marine Life: Findings Revealed

Impact of Alkaline Sodium Hydroxide on the Gulf of Maine’s Carbon Dynamics

Daniel Cojanu, Undercurrent Productions, ©Woods Hole Oceanographic Institution

Can we remove carbon dioxide from the atmosphere while also mitigating ocean acidification? A recent test shed light on this question: in August 2025, a research team released 65,000 liters of sodium hydroxide solution, a strong alkali, into the Gulf of Maine.

“We were pioneers in exploring the enhancement of alkalinity using a ship,” stated Adam Subhas from the Woods Hole Oceanographic Institution in Massachusetts. The team shared their preliminary findings at the Marine Science Conference on February 25th in Glasgow, UK. “It’s clear we observed increased CO2 absorption due to this experiment.”

Over the span of four days, the team estimates, between 2 and 10 tons of CO2 were drawn from the atmosphere, with the total potentially reaching 50 tons. Importantly, no adverse effects on marine ecosystems were observed.

Nonetheless, Subhas highlighted a critical point: the team hasn’t calculated the emissions produced during the manufacturing and transport of the sodium hydroxide, leaving the net CO2 removal outcome uncertain. “That’s an essential area for future research,” he remarked.

The ocean acts as a significant carbon sink, storing 40 times more carbon than the atmosphere and absorbing over a quarter of the excess CO2 emitted. This surplus CO2 reacts with ocean water to create carbonic acid, leading to increased ocean acidity.

Ocean acidification can severely impact various marine organisms by dissolving carbonate shells, thereby diminishing the ocean’s carbon absorption capacity.

Researchers are actively investigating numerous strategies to counteract ocean acidification, such as adding magnesium hydroxide to wastewater, spreading crushed olivine on beaches, and transporting seawater to onshore treatment facilities. Some companies are even marketing carbon credits based on alkalinity enhancement.

“This is indicative of current private sector initiatives,” Subhas explained, emphasizing the need for non-commercial trials like their team’s.

Given the sensitive nature of such experiments, the team engaged local stakeholders, particularly the fishing community. “Establishing a two-way dialogue is crucial,” asserted Kristin Kreisner of the Environmental Defense Fund, a New York-based nonprofit.

The testing involved three ships and was meticulously monitored using various methods, from satellites to floating sensors and ocean gliders. Sodium hydroxide was mixed with a trace dye called rhodamine to accurately track its dispersion.

The researchers measured concentrations of microorganisms, plankton, fish larvae, and lobster larvae, as well as photosynthetic activity levels. According to Rachel David at Rutgers University, New Jersey, “Our trials did not significantly impact the biological community.”

The additional carbon introduced into the ocean through increased alkalinity converts into bicarbonate ions, akin to dissolved baking soda. “We anticipate this carbon will remain locked for tens of thousands of years, making it one of the most sustainable carbon removal methods,” Subhas noted.
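As a rough chemistry check (my own illustration, not a figure from the study): each mole of sodium hydroxide can neutralize at most one mole of CO2 to form bicarbonate, which puts an upper bound on how much carbon a given tonnage of lye can lock away.

```python
# Molar masses in g/mol
M_NAOH = 40.0   # sodium hydroxide
M_CO2 = 44.0    # carbon dioxide

def max_co2_captured(tonnes_naoh):
    """Upper bound on CO2 (tonnes) locked up as bicarbonate, assuming the
    idealised 1:1 reaction NaOH + CO2 -> NaHCO3 goes to completion."""
    return tonnes_naoh * M_CO2 / M_NAOH

print(max_co2_captured(10.0))  # -> 11.0 tonnes of CO2 per 10 tonnes of NaOH
```

In practice the yield depends on ocean carbonate chemistry and mixing, which is exactly why field trials like this one, with their satellite and sensor monitoring, are needed to measure the real uptake.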

The nature of this process allows CO2 to be removed and stored simultaneously, providing benefits over other methods that necessitate separate CO2 capture and permanent storage.


Source: www.newscientist.com

Unlocking Quantum Computing: How a 1980s Niche Technology Could Revolutionize the Future


Adam Weiss configuring a dilution refrigerator

Adam Weiss of SEEQC, the pioneering quantum chip manufacturing company.

SEEQC

<p>The 1980s gave us British heavy metal and the vibrant purple blush beloved of makeup artists. Amid the glam and flair, a neglected technological gem emerged: superconducting circuits. In 1980, IBM invested in the technology in hopes of building highly efficient computers, and a superconducting circuit even made the cover of <em>Scientific American</em> that year.</p>

<p>However, the anticipated revolution never materialized, and superconducting chips faded into obscurity, much like perms and pegged pants. Yet, one company persevered in its research efforts—SEEQC. I recently toured SEEQC's cutting-edge quantum chip manufacturing facility in upstate New York, born from IBM's discontinued superconducting computing program. Here, I discovered SEEQC's aspirations for superconducting chips in ushering a new era in quantum computing.</p>

<p>Inside the SEEQC facility, you’re greeted by extensive machinery and technicians donned in protective gear. In cleanrooms, ultra-thin layers of niobium, a superconducting metal, are meticulously deposited onto dielectric materials, forming intricate, sandwich-like structures. Lithographic devices further refine these structures, carving out tiny trenches essential for quantum processes. The atmosphere buzzes with activity, illuminated in yellow light to minimize disruption during chip production. In a conference room, SEEQC's CEO <a href="https://seeqc.com/about/leadership/john-levy">John Levy</a> presented a superconducting chip that is surprisingly compact yet poised to transform this futuristic industry.</p>

<h2>The Challenge Ahead</h2>
<p>Superconductors carry electricity with flawless efficiency, which sets them apart from conventional electronic materials. When you charge a phone, for instance, some of the energy is wasted as heat in the cord and charger. A 2017 study by computer scientists went further, describing traditional computers as expensive electric heaters that happen to perform a few calculations on the side.</p>

<p>Superconducting computers would, in principle, eliminate this waste. But there is a significant catch: all known superconductors require extremely low temperatures or immense pressure to function, which historically made superconducting computing prohibitively expensive and impractical. IBM abandoned its superconducting computing research in 1983, and the industry settled for conventional, heat-wasting chips. Ironically, energy costs have surged in recent years, driven in large part by the growing demands of AI.</p>

<p>A shift occurred in the late 1990s when a team of Japanese researchers <a href="https://arxiv.org/pdf/cond-mat/9904003">created</a> the first superconducting qubit, a foundational element of quantum computing. This innovative approach diverged from prior attempts, paving the way for a new computing paradigm leveraging processes unique to quantum mechanics.</p>

<p>Since then, superconducting qubits have powered significant advancements in quantum computing. Tech giants like Google and IBM utilize this technology to tackle complex scientific challenges, achieving remarkable demonstrations of "quantum supremacy" that underline the distinct capabilities of quantum computers compared to classical counterparts.</p>

<p>However, true disruptive technologies in quantum computing remain elusive. Quantum computers have yet to realize their potential to revolutionize areas such as cryptography or industrial chemistry, with numerous technical and engineering challenges lying ahead.</p>

<p>SEEQC's Levy believes some solutions could trace back to the 1980s. His team is developing digital superconducting chips designed to enhance the power, size, and error resilience of quantum computers simultaneously. Nearby, researchers are busy testing chips in various refrigerator configurations, aiming to streamline quantum computing components, ultimately enhancing efficiency.</p>

<p>The working core of a superconducting quantum computer comprises a chip packed with qubits and a refrigerator essential for their operation. Externally, it appears as a single, elongated box comparable in height to a person. However, the components extend beyond this simple design. Control mechanisms, traditional computational inputs, and output readings from quantum calculations require elaborate setups. Moreover, qubits are delicate and susceptible to errors, necessitating sophisticated control systems for real-time monitoring and adjustments. This means non-quantum components, which consume substantial space and energy, play a crucial role in the overall functionality of quantum computers.</p>

<p>Expanding qubit numbers to boost computational power requires additional cables. “Physically, you can't keep adding cables forever,” asserts <a href="https://seeqc.com/about/leadership/shu-jen-han-phd">Shu-Jen Han</a>, SEEQC's Chief Technology Officer. Each new cable leaks heat that disrupts the qubits and degrades their performance. It might sound like a mundane engineering detail, but the complexity of connecting and controlling qubits is one of the most significant hurdles facing quantum computing.</p>

<p>The SEEQC chip I examined addresses many of these challenges.</p>

<p>
    <figure class="ArticleImage">
        <div class="Image__Wrapper">
            <img class="Image" alt="SEEQC quantum chip" 
                width="1350" height="899" 
                src="https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg" 
                srcset="https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=300 300w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=400 400w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=500 500w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=600 600w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=700 700w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=800 800w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=837 837w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=900 900w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1003 1003w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1100 1100w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1200 1200w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1300 1300w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1400 1400w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1500 1500w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1600 1600w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1674 1674w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1700 1700w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1800 1800w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1900 1900w, 
https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=2006 2006w" 
                sizes="(min-width: 1288px) 837px, (min-width: 1024px) calc(57.5vw + 55px), (min-width: 415px) calc(100vw - 40px), calc(70vw + 74px)" 
                loading="lazy" data-image-context="Article" 
                data-image-id="2516803" 
                data-caption="SEEQC's quantum chip" 
                data-credit="Karmela Padavic-Callaghan"/>
        </div>
        <figcaption class="ArticleImageCaption">
            <div class="ArticleImageCaption__CaptionWrapper">
                <p class="ArticleImageCaption__Title">SEEQC Quantum Chip</p>
                <p class="ArticleImageCaption__Credit">Karmela Padavic-Callaghan</p>
            </div>
        </figcaption>
    </figure>
</p>

<p>The SEEQC chip embodies the typical design of a computer chip: small, flat, with a metal rectangle atop a larger one. Levy explained that the smaller rectangle holds superconducting qubits, while the larger one is a conventional chip of superconducting material, facilitating digital control of the qubits. Since both components are superconducting, they can occupy the same refrigerator, reducing the reliance on many energy-consuming room-temperature devices.</p>

<p>This innovation not only prevents excess heat from degrading the refrigerator's performance but also dramatically lowers the control chip's power consumption. SEEQC predicts that its quantum computers could be more energy efficient by a factor of a billion. That matters: according to the Quantum Energy Initiative, some designs for fault-tolerant quantum computers could, paradoxically, consume more energy than today's large-scale supercomputers, much of it in the conventional computing components.</p>

<p>Additionally, by integrating the quantum and classical chips, instruction delays to the qubits and result readings are minimized. Levy mentioned that the digital signals from the chip reduce "crosstalk" and unintended interactions, making the qubits less prone to errors.</p>

<p>When I spoke in 2025 with David DiVincenzo, who two decades ago proposed seven essential conditions for building a viable quantum computer (a checklist that still guides researchers today), he envisioned a future in which powerful machines comprising perhaps a million qubits would occupy expansive halls resembling particle colliders rather than conventional computing rooms. SEEQC’s mission is to head off that sprawling future, aiming for a compact design closer to a modern Mac than to the room-filling ENIAC.</p>

<p>Currently, SEEQC is testing its chip across varied configurations, employing qubits sourced both in-house and from other quantum manufacturers. Early performance assessments are promising, indicating the chip's versatility, though initial tests have been limited to fewer than 10 qubits, considerably smaller than the envisaged powerful quantum computers.</p>

<p>Physics challenges also emerge, as superconductors can experience tiny quantum vortices when exposed to nearby magnetic fields used for tuning qubits. <a href="https://seeqc.com/about">Oleg Mukhanov</a>, SEEQC’s Chief Scientific Officer, shared insights on a novel method developed by the company to eliminate these vortices using an opposing electromagnetic field. It reminded me of my graduate studies in superconductivity physics: even pioneering technology cannot evade the fundamental quirks of quantum mechanics.</p>

<p>Will superconducting circuits make a triumphant return and push us into a quantum renaissance? It seems the '80s might be making a comeback in the quantum realm—though I hope the oversized shoulder pads don't follow suit.</p>



Source: www.newscientist.com

The Nine Secrets We Each Keep, and Why They Weigh Us Down

Can you keep a secret?

Yana Iskayeva/Getty Images

On average, individuals conceal nine different secrets, ranging from personal lies to clandestine romantic affairs. This accumulation can weigh heavily, as secrets often infiltrate our thoughts without conscious effort. While confessions may alleviate some emotional burden, many secrets remain too sensitive to divulge. Consequently, researchers are exploring psychological coping mechanisms.

“People often find themselves pondering their secrets during routine activities like showering or commuting,” explains Val Bianchi from the University of Melbourne, Australia. “These unwanted thoughts can be distressing, creating a cycle where individuals ruminate on their secrets and subsequently feel worse.”

Bianchi has dedicated years to investigating the psychological impact of secrecy and strategies for mitigation. Her latest findings were supported by the Australian National Intelligence Agency, considering that intelligence personnel must safeguard crucial secrets to protect national security, necessitating effective management strategies.

“The enigma surrounding CIA operatives is intriguing. How do they safeguard vital secrets and resume normalcy afterward?” questions Lisa Williams from the University of New South Wales in Australia, who was not involved in this research.

To delve deeper into the connection between secrets and well-being, Bianchi and her team surveyed 240 individuals online, asking participants to identify secrets spanning 38 categories, including deception, infidelity, theft, addiction, and self-harm.

Respondents reported keeping an average of nine distinct secrets. The most prevalent included lie-related secrets (78% of participants) and dissatisfaction with personal or others’ appearances (71%). Other frequent secrets involved financial matters (70%), unexpressed romantic feelings (63%), and sexual behavior (57%).

Participants then pinpointed their most significant secret and maintained a diary for two weeks regarding their feelings. They generally noted that their most crucial secret was negative, prompting reflective thoughts filled with worries and concerns.

Bianchi’s prior research revealed that significant secrets occupy individuals’ thoughts approximately every two hours. Often, they surface during low-engagement tasks, allowing space for reflection, she notes.

Interestingly, the ability to keep secrets may have evolved to enhance group cohesion despite their burdensome nature on individuals. By concealing information, one can prevent harm, embarrassment, or loss of social standing. “For instance, if a colleague is under investigation, a person may choose silence over gossip to protect their workplace reputation,” Bianchi adds.

In certain cases, unveiling a secret may bring relief. Sharing it with empathetic individuals, such as therapists or through confessionals, can alleviate emotional burdens, according to Bianchi.

Conversely, some secrets, like classified information held by intelligence agents, are unsuitable for disclosure. In such instances, the individual might find it beneficial to express feelings associated with the secret without revealing specifics. Bianchi suggests that distraction techniques may also prove useful, and her team aims to research these further.

Williams emphasizes that established emotional regulation methods may also aid those grappling with secrets. “If you are unable to eliminate a secret because it’s job-related or for other reasons, addressing the negative feelings related to it is crucial,” she states. “Ignoring or suppressing negative emotions is generally unproductive; therefore, reframing them positively could be beneficial.”

For those outside the intelligence sector, writing privately about secrets and their emotional impact can be particularly therapeutic. James Pennebaker from the University of Texas at Austin previously demonstrated that journaling about emotions can offer significant mental health benefits. “My research indicates that individuals experiencing major life changes are less likely to encounter health issues if they openly discuss these events,” he explains.


Source: www.newscientist.com

Stem Cell Patch Successfully Repairs Brain Damage in Spina Bifida Fetuses

False color radiograph illustrating large neural tube defects (red) on both sides of the lower back in a spina bifida patient

Science Photo Library

A groundbreaking trial used patches made from donated placental stem cells to treat fetuses with severe spina bifida in utero. The technique appears to reverse the brain complications associated with the condition, showing potential to improve long-term mobility in affected children.

The mother of a now four-year-old boy named Toby, who was diagnosed with spina bifida during pregnancy, was initially prepared for him to rely on a wheelchair. “But Toby is thriving. He has met all his developmental milestones, including walking, running, and jumping, and remarkably has no issues with bladder control, which is rare among those with this condition,” she commented.

Spina bifida, affecting approximately 1 in 2,800 births annually in the United States, occurs when a baby’s spine and spinal cord do not fully develop in utero. The most severe form, myelomeningocele, involves the spinal cord and surrounding tissues protruding through vertebrae, often leading to mobility challenges and bowel or bladder control issues. The precise cause of spina bifida remains unclear, although a deficiency in folic acid during pregnancy can heighten risks.

Standard treatment often involves in-utero surgery where the spinal cord and surrounding tissues are repositioned before closing the skin. “However, many children still struggle with mobility, and often bowel or bladder control remains unimproved,” notes Diana Farmer of the University of California, Davis.

To explore alternatives, Farmer and her team proposed the addition of stem cells to enhance growth and repair of spinal cord tissue. They enlisted six pregnant women carrying fetuses diagnosed with myelomeningocele.

By approximately 24 weeks of gestation, all fetuses exhibited a common complication known as hindbrain hernia. This condition causes excess fluid to accumulate in the skull, pushing the cerebellum through an opening at the base of the skull. While standard surgical procedures can often help alleviate hindbrain hernias, many children continue to face complications post-surgery.

In this latest trial, all fetuses received standard surgery along with a patch, measuring several centimeters, that included stem cells from the donated placenta, set within a matrix of sticky proteins. The surgeons applied this patch to the spine before suturing the skin around it. “The cells release what we like to call ‘magical stem cell juice’,” Farmer explains.

Upon birth, all babies showed positive surgical site healing with no indications of abnormal cell growth. “Our primary concern was that adding stem cells would lead to excessive cell proliferation, but we did not observe this,” Farmer reported. MRI scans of their brains demonstrated complete resolution of hindbrain herniation.

“In my opinion, this will enhance long-term outcomes compared to standard methods,” added Panicos Shangaris from King’s College London, citing evidence from animal studies.

The research team is optimistic about conducting a trial aimed at administering the stem cell patch to 35 fetuses with myelomeningocele, comparing results with prior studies that utilized traditional surgery, as stated by Farmer.

However, Shangaris suggests that a head-to-head trial comparing the two techniques would more thoroughly assess their safety and efficacy, providing a clearer pathway to treatment approval.


Source: www.newscientist.com

New Research Unveils Mosquito Menu Changes Linked to Homo Erectus Arrival in Southeast Asia

Recent studies reveal that the ancestors of today’s malaria-spreading mosquitoes in the region belong to the Anopheles Leucosphyrus group. These mosquitoes may have begun feeding on humans approximately 1.8 million years ago, coinciding with the arrival of Homo erectus in Southeast Asia.

The arrival of Homo erectus led to the evolution of the primary human malaria vector in Southeast Asia 1.8 million years ago.

Feeding on humans is relatively rare among the roughly 3,500 known species of mosquitoes, but this feeding behavior is a critical factor in a mosquito’s likelihood of transmitting disease-causing pathogens.

“Mosquito-borne diseases represent a significant threat to public health,” stated study lead author Upasana Shamsunder Singh and her colleagues.

“The tendency of certain mosquito species to prefer humans (anthropophily) significantly influences their capacity to transmit disease-causing pathogens.”

“While mosquitoes can show versatility in host selection, understanding the evolutionary roots of anthropophily and the circumstances that led to its development can offer valuable insights for combating emerging mosquito-borne diseases.”

For this study, researchers sequenced the DNA of 38 mosquitoes across 11 species from the Anopheles Leucosphyrus group, collected in Southeast Asia between 1992 and 2020.

These DNA sequences, in conjunction with computer models and mutation rate estimates, allowed the team to reconstruct the evolutionary history of these mosquito species.

The researchers estimate that the preference for feeding on humans evolved within the Leucosphyrus group just once, between 2.9 million and 1.6 million years ago, in the Sundaland region, which includes the Malay Peninsula, Borneo, Sumatra, and Java.

Before this shift, the ancestors of these Leucosphyrus mosquitoes primarily fed on non-human primates.

This timeline aligns with the earliest proposed arrival of Homo erectus in the area around 1.8 million years ago, well before modern humans appeared approximately 76,000 to 63,000 years ago.

This shift also predates estimates for the evolution of human-feeding preferences in the mosquito lineage that gave rise to Africa’s principal malaria vectors, such as Anopheles gambiae, which evolved between 509,000 and 61,000 years ago.

Prior studies indicate that shifts in mosquito dietary preferences necessitate multiple genetic changes related to the receptors that detect body odor.

The researchers suggest that the evolution of a preference for human body odor in the Leucosphyrus group may have been driven by the sizable populations of Homo erectus present in Sundaland around 1.8 million years ago.

“Our findings imply that the anthropophilic Leucosphyrus group emerged in Sundaland during the Early Pleistocene. Early hominins must have been well-established and numerous in this region for the mosquitoes to adapt a preference for human hosts,” the researchers noted.

“This supports the hypothesis that early hominins were both present and abundant in Sundaland 1.8 million years ago, before migrating through land bridges to Java.”

Middle Pleistocene fossils of Homo erectus suggest long-term habitation of the exposed Sundaland landmass, potentially linked to large river systems.

“Given the highly fragmented fossil record in tropical Southeast Asia, our findings provide crucial evidence for understanding hominin colonization in this region,” added the research team.

The team’s findings were published in the journal Scientific Reports.

_____

U.S. Singh et al. 2026. The arrival of early humans in Southeast Asia led to the evolution of a major human malaria vector. Scientific Reports 16, 6973; doi: 10.1038/s41598-026-35456-y

Source: www.sci.news

Unlock the Benefits of Fasting: Enjoy Health Gains Without Skipping Meals

The advantages of fasting are well-documented. Research indicates that fasting can lower blood pressure, reduce inflammation, control blood sugar levels, and naturally promote weight loss. The downside, of course, is that it involves abstaining from food.

But what if you could enjoy the same benefits without completely cutting out food? Enter the Fasting Mimic Diet, designed to offer similar advantages while allowing for some consumption.

This diet restricts overall calorie intake and protein consumption but permits small servings of plant-based foods, including vegetable soups, leafy greens, nuts, and seeds.

Adhere to this diet for 5 consecutive days each month. On the first day, limit your intake to between 700 and 1,100 calories. For the subsequent four days, limit your intake to no more than 750 calories, with a macronutrient distribution of 10% from protein, 45% from carbohydrates, and 45% from fat.
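For readers who want to turn those percentages into gram targets, here is a minimal sketch (our own, not from the article) that applies the standard conversion factors of 4 kcal per gram for protein and carbohydrate and 9 kcal per gram for fat; the function name and rounding are assumptions for illustration.

```python
# Convert a daily calorie budget and macro split into approximate gram targets.
# Standard Atwater factors: 4 kcal/g (protein, carbs), 9 kcal/g (fat).

CALORIES_PER_GRAM = {"protein": 4, "carbs": 4, "fat": 9}

def macro_grams(total_calories, split):
    """Return grams of each macronutrient for a given calorie budget."""
    return {
        macro: round(total_calories * share / CALORIES_PER_GRAM[macro], 1)
        for macro, share in split.items()
    }

# Days 2-5 of the diet: 750 kcal at 10% protein, 45% carbs, 45% fat
targets = macro_grams(750, {"protein": 0.10, "carbs": 0.45, "fat": 0.45})
print(targets)  # roughly 18.8 g protein, 84.4 g carbs, 37.5 g fat
```

So a 750-calorie day on this split works out to only about 19 g of protein, which is why the diet is described as both calorie- and protein-restricted.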

Similar to traditional fasting, this diet triggers a state of “cellular housekeeping” known as autophagy, in which cells break down and recycle old or dysfunctional components such as proteins and organelles. This process supports cellular energy production and function, and prevents the accumulation of defective proteins that contribute to cancer and neurodegenerative diseases.

A 2023 study found that fasting-mimicking diets could have a role in prevention and treatment strategies for Alzheimer’s disease, although further research is essential. Additional studies have reported benefits such as reduced cholesterol and improvements in other cardiovascular biomarkers.

However, current research on this diet remains limited, especially concerning its effects on humans. Nutritionists advise caution; it may not be suitable for pregnant women, those who exercise vigorously, or individuals with a history of eating disorders. Even healthy adults might experience side effects such as dizziness, fatigue, and headaches. Always consult your doctor if in doubt.


This article, authored by Rebecca Thorton from Leeds, tackles the question: “Do copycat diets work?”

For inquiries, please email questions@sciencefocus.com or reach us via Facebook, Twitter, or our Instagram page (include your name and location).





Source: www.sciencefocus.com

Scientists Discover Electric Discharges in Trees During Thunderstorms

While most people are aware of the destructive power of lightning in forests, few know about the subtle electrical phenomenon known as corona. This weak electrical glow is believed to occur on tree leaves during thunderstorms. Researchers at Penn State University utilized ultraviolet-sensitive equipment to directly observe and measure this intriguing phenomenon in tree species such as sweetgum and celery pine across various U.S. states.

Coronae glow on the tip of a spruce needle caused by a charged metal plate in the laboratory. Image credit: William Bruun.

Lightning strikes have captivated humanity since thunderstorms began sweeping through Earth’s forests, causing everything from trunk splits to wildfires, often turning night into day.

However, scientists are now shifting their focus to the more delicate electrical phenomena that manifest on leaf tips amid thunderstorms.

Unlike lightning, which can heat the air to extreme temperatures, corona represents a weak electrical discharge with a temperature only slightly above that of the surrounding air.

Despite their gentler nature, these electrical sparks can generate significant amounts of hydroxyl, a key oxidant in the atmosphere, potentially harming tree foliage and affecting charged particles within thunderstorm cloud bases.

“We have observed these phenomena, confirming their existence,” stated Dr. Patrick McFarland, a meteorologist at Pennsylvania State University.

“Having tangible evidence is incredibly exciting,” he added.

“In a laboratory setting, when you block all light, you can barely see the corona, which appears as a blue light,” he explained.

For this study, Dr. McFarland and his team designed a portable instrument equipped with multiple components to measure tree canopies and the atmospheric conditions that influence corona formation.

The central component is a 25-cm-diameter telescope that focuses ultraviolet (UV) light onto a solar-blind UV camera sensitive to wavelengths between 255 and 273 nm.

During thunderstorms in North Carolina, scientists succeeded in observing corona on sweetgum and pine trees.

“The corona could potentially travel between leaves or trace along branches swaying in the wind,” the researchers noted.

Similar observations were recorded for various tree species during four additional thunderstorms from Florida to Pennsylvania.

“Our findings illustrate that the corona exhibits glowing patterns in wooded areas during thunderstorms,” the researchers stated.

“These corona effects can alter air quality in forests, subtly damage foliage, and influence storm conditions overhead.”

For further details, refer to the study published on February 12th in Geophysical Research Letters.

_____

PJ McFarland et al. 2026. Corona discharges glow on trees under thunderstorms. Geophysical Research Letters 53 (4): e2025GL119591; doi: 10.1029/2025GL119591

Source: www.sci.news

How Neanderthal Interbreeding Led to Unique Genetic Lineages

Neanderthal Model at the Natural History Museum, London

Mike Kemp/Photography/Getty Images

Research suggests that when our species, Homo sapiens, interbred with Neanderthals, most of the individuals involved may have been female Homo sapiens paired with Neanderthal males. This conclusion stems from analyses of genetic markers left in both populations due to this admixture.

The reasons behind this sex-biased mating behavior remain unclear. It is hypothesized that Neanderthal males may have favored female Homo sapiens over their own kind, or that modern human females were drawn to Neanderthal men, or possibly a combination of both. The question of whether these interactions were consensual is also unresolved.

“There’s limited insight we can draw,” states Alexander Pratt from the University of Pennsylvania in Philadelphia. “What we can confidently convey is that these events unfolded over many generations.”

Other geneticists find the evidence intriguing but not definitive. Areb Sumer from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, emphasizes, “We need further evidence as this stands as a significant claim regarding behavior.”

Since 2010, researchers have recognized that Homo sapiens, often called modern humans, interbred with Neanderthals following their migration from Africa to Eurasia. This interaction likely occurred during various periods, notably from approximately 50,000 to 43,000 years ago, and possibly more than 200,000 years ago. Presently, all non-African individuals carry some Neanderthal DNA.

However, there has been limited exploration regarding the implications of this interbreeding on sex chromosomes. Women typically possess two X chromosomes, while men have one X and one Y chromosome. Pratt and his team, including Sarah Tishkoff and Daniel Harris, also from the University of Pennsylvania, concentrated on the X chromosome in humans and Neanderthals.

“One significant observation regarding the human X chromosome is its relative lack of Neanderthal DNA,” Harris notes. Compared to other chromosomes, the human X chromosome has minimal Neanderthal genetic material. The research team proposed four possible explanations.

Firstly, Homo sapiens and Neanderthals may have been genetically incompatible, with hybrid offspring suffering health and reproductive problems. However, the researchers found that the Neanderthal X chromosome contained significantly more Homo sapiens DNA than the non-sex chromosomes did, suggesting the two species were in fact compatible.

Secondly, natural selection may have favored modern human DNA. Given the smaller Neanderthal population, it would have been difficult for natural selection there to eliminate harmful mutations, whereas the larger modern human population could purge detrimental mutations more efficiently; this could explain the spread of modern human X-chromosome DNA within Neanderthal groups. Yet the researchers argue this effect was negligible, since most of the modern human DNA on Neanderthal X chromosomes sits in non-functional regions, where selection has little to act on.

Alternatively, cultural factors may have shaped mate selection. Societies vary in their patterns of male and female migration: in some cultures, females leave their familial groups to join male partners, while in others the reverse occurs. If modern human females routinely settled with Neanderthals, a bias in X chromosomes could have emerged. But the researchers found that even if all the interbreeding females were modern humans, this alone could not explain the pronounced bias they identified.

The researchers conclude that mating preferences are the most plausible explanation: Neanderthal men may have favored Homo sapiens women over their own partners, or Homo sapiens females may have preferred male Neanderthals to human partners, or perhaps both scenarios occurred. “If this is simply a matter of preference, it accounts for everything,” Pratt asserts.

However, other geneticists express caution about completely dismissing alternative explanations. Sumer points out that early interbreeding events had a pronounced effect on the Neanderthal genome, effectively replacing the ancient Y chromosome with a Homo sapiens Y chromosome. “This mixing must have involved a substantial number of modern human males,” she explains.

She cautions that hybrid incompatibility cannot be disregarded. Moises Col Macia at the Institute of Evolutionary Biology in Barcelona, Spain, notes that researchers have assumed Neanderthal DNA would function similarly when it integrated into modern human genomes, and vice versa. “This may not be the case,” he states.

Col Macia also suggests that another possibility, meiotic drive, warrants consideration. A rogue genetic element could skew inheritance patterns, causing one chromosome in a pair to be passed down more frequently than expected. His team has found preliminary evidence that this phenomenon also occurred in modern humans outside Africa, leading to the elimination of Neanderthal DNA from the X chromosome.


Source: www.newscientist.com

The Aging Brain: Essential Insights You Need to Know

Recent research reveals that older adults may have a genetic edge, showcasing enhanced cognitive abilities as they age.

A study conducted by scientists at the University of Illinois at Chicago School of Medicine found that individuals aged over 80, referred to as “very old people,” produce double the number of new neurons in the hippocampus—an area crucial for learning and memory—compared to the average elderly individual. The findings were published in the journal Nature on Wednesday.

Study co-author Orly Lazarov of UIC stated, “This discovery indicates that very old individuals possess molecular capabilities that enhance their cognitive performance, evidenced by increased neurogenesis. Neurogenesis represents one of the most profound forms of brain plasticity.”

In essence, the brains of very old individuals are more adaptable, fostering improved cognitive functions.

The term “super-elderly” describes those over 80 who exhibit memory comparable to that of individuals 20 to 30 years younger, as measured by a delayed word-recall test. The designation was introduced by Dr. M. Marsel Mesulam of Northwestern University’s Feinberg School of Medicine, founder of the Mesulam Center for Cognitive Neurology and Alzheimer’s Disease.

In this groundbreaking study, Lazarov and colleagues analyzed 38 brains from five distinct groups: healthy adults under 40, healthy older adults, those in early cognitive decline, Alzheimer’s disease patients, and super-elderly individuals. Notably, six super-aged brains were contributed by Northwestern University’s Super Aging Program, which celebrated its 25th anniversary last year.

The researchers investigated neurons at varying developmental stages within brain tissue samples, discovering that very old individuals possess twice as many “immature” neurons compared to healthy older adults, and 2.5 times more than Alzheimer’s patients.

A super-aged brain in a research lab. Shane Collins, Northwestern University

Historically, it was believed that mammals had a fixed number of neurons from birth, but research in the 1960s and 1970s unveiled adult neurogenesis in rodents and primates.

Subsequent studies have indicated that this phenomenon occurs within the human hippocampus’s dentate gyrus, although evidence remains mixed, and the underlying processes are still unclear.

“We’ve affirmed the existence of neurogenesis and its involvement in learning and memory in animal models,” Lazarov commented. “Determining if the human brain functions similarly is a pivotal question for our research.”

Lazarov’s findings suggest that the adult brain can generate new neurons in response to age and cognitive status.

The study revealed that very old brains exhibit “signs of resilience,” allowing them to cope with aging while maintaining superior cognitive performance.

Moreover, the research identified changes in astrocytes and CA1 neurons that regulate memory and cognition within the aging hippocampus.

Despite the study’s advancements, authors noted limitations, such as small sample sizes and significant variability among human brain samples.

More Than 25 Years of Insights from Very Old Individuals

According to the Northwestern Super Aging Program, this research marks the first identification of genetic distinctions between very old and conventional older adults.

Tamar Geffen, co-director of the program and co-author of the study, stated, “These individuals, aged 80 and above, exhibit immature neurons that continuously rewire, making their hippocampus distinct from that of other seniors.”

The program has also produced a range of discoveries about these exceptionally healthy seniors, from personality traits to neurological features. For instance, Geffen noted that very elderly individuals often describe themselves as extroverts, and other research has highlighted their abundance of von Economo neurons, which are linked to social behavior.

“We’ve repeatedly heard about the importance of social interactions for healthy aging, while isolation can have adverse effects in old age,” she noted.

Furthermore, these seniors tend to embrace change and remain open to new experiences, and they typically score low on neuroticism, according to Geffen.

While a typical human brain shrinks with age, a phenomenon exacerbated by Alzheimer’s, researchers at Northwestern discovered that the brains of very old individuals exhibit significantly slower shrinkage rates.

In a 2017 study published in JAMA, the Journal of the American Medical Association, Northwestern researchers reported that very old individuals show resilience against neurofibrillary tangles, the tau protein changes associated with Alzheimer’s.

On the question of immunity, much remains unknown. The brain contains microglia, immune cells that become activated during neurodegenerative disease. A 2019 study in Frontiers in Aging Neuroscience revealed that very old individuals had fewer activated microglia than dementia patients, with levels comparable to those of people 30 to 40 years younger.

Staying Sharp Without Being Super Old

The findings suggest that the very elderly may have won the genetic lottery regarding cognitive health.

Sel Yackley, an 86-year-old participant in Northwestern’s Super Aging Program, noted, “We feel fortunate; we’re forming new neurons.”

Residing in Chicago, Yackley humorously remarked on her “super-senior duties,” which include knitting, going to the gym, crafting jewelry, singing, and managing her daily to-do list. Although she has faced limited in-person interactions, she’s prioritized keeping in touch via phone, email, and Zoom.

While she proudly identifies as a super senior citizen, Yackley acknowledges that age-related cognitive impairment can still affect her.

“At times, my memories feel fresh, and other times they slip away,” she stated.

Importantly, there are several wellness strategies individuals can adopt throughout adulthood to preserve cognitive health, noted Dr. Jennifer Paul-Durai, medical director of the Inova Brain Health and Memory Disorders Program in Northern Virginia. “Now is the moment to focus on enhancing cognitive function, long before natural decline or dementia occur,” she advised.

Dr. Paul-Durai emphasized, “The concept of super-aging provides a sense of regained control. With rising dementia and Alzheimer’s rates correlating with increased lifespan, maintaining cognitive sharpness is vital.” She encourages discussions focused on strategies to mitigate cognitive decline rather than solely highlighting the lack of a cure for Alzheimer’s disease.

This latest research underscores the brain’s capacity for adaptability, with Paul-Durai likening it to a ball of clay. “While some inherit better quality clay than others, it remains moldable throughout life to foster and shape neural pathways.”

However, if left unattended, clay solidifies and becomes hard to work with, similar to how our brains respond when we neglect cognitive engagement and physical activity.

“Our brains require active use and continuous cognitive engagement to remain flexible,” Paul-Durai explained.

Prioritizing overall health is also crucial for fostering brain plasticity, as factors like unmanaged chronic illnesses and untreated psychological traumas can hinder neuron development.

“It’s essential to advocate for preventive brain health measures before significant societal fractures emerge,” she advised. “We must emphasize the importance of taking proactive steps over merely highlighting the absence of Alzheimer’s solutions.”

Yackley, a former journalist, attributes her cognitive resilience to her career path, sharing, “My curiosity led me to explore numerous stories and conduct many interviews, which may have contributed to my neuronal health.”

Her advice to those who aren’t super seniors is to remain actively engaged, both mentally and physically.

“Don’t get caught up in counting the years. Stay active, both mentally and physically,” Yackley encouraged.

Source: www.nbcnews.com

Unlock Rapid Fat Loss: The One Exercise Hack You Need to Try

As spring approaches and you notice a few extra pounds, remember: it’s a product of evolution, not just the tempting family-sized tin of chocolate.

Humans are biologically designed to accumulate fat during colder months. In chilly weather, our bodies tend to burn more calories while being less active.

This is an evolutionary adaptation from pre-industrial eras, when food was scarce, leading our bodies to store fat as energy for the winter season.

However, in today’s world, this scarcity is often a myth. Modern conveniences like refrigeration, long-distance shipping, and enticing 3-for-2 deals on snacks mean that winter has transformed into a time of indulgent excess rather than depletion.

This evolutionary response makes it challenging to stick to winter weight loss resolutions. Our bodies react to a dip in calorie intake by ramping up our appetite or subtly reducing energy expenditure.

If you find yourself carrying extra weight after the winter season, there might be an unexpected solution: perhaps gaining a bit of weight could help you lose weight.

Add Weight to Lose Weight

In a 2025 study, researchers explored the effectiveness of weighted vests for weight loss. A weighted vest features pockets for weights and can weigh anywhere from 3 to 30 kg (or even more if you want to channel your inner robot).

A small study published in the International Journal of Obesity followed overweight participants for two years. They were divided into two groups: one underwent calorie restriction, while the other wore weighted vests for 10 hours daily.

Both groups saw weight loss in the first six months, but two years later, both regained weight—a common yo-yo effect. What’s intriguing is that the calorie-restricted group regained all their lost weight, while those with the weighted vests only regained half.

Why is this the case? Researchers discovered that the resting metabolic rate (RMR)—the calories burned during basic functions—was higher in those wearing the vests.

“Lower RMR after weight loss often leads to weight regain, so maintaining RMR helped participants stay at a lower body weight,” explains Professor Kristen Beavers, a health and exercise scientist at Wake Forest University and co-author of the study. “Those with a higher RMR retained more of the weight they lost.”

This research further emphasizes how resistance training—like weightlifting and bodyweight workouts—can effectively support long-term weight loss. Weighted vests fit perfectly into this regimen, as they increase energy expenditure during movement.

“Adding weight makes your muscles, bones, and cardiovascular system work harder for activities like walking or climbing stairs,” Beavers states. “This increased effort raises the calorie cost of exercise, allowing for more calories burned without changing the type or duration of activity.”

Moreover, the added weight acts as resistance training, contributing to muscle mass and strength over time. Since muscle tissue burns more calories at rest compared to fat, maintaining or increasing muscle mass boosts resting metabolic rate and aids in weight loss.

Gaining weight can be an effective strategy for weight loss – Photo credit: Getty Images

How to Use a Weighted Vest

If you’re considering incorporating a weighted vest into your routine, Beavers offers some advice. Start with a light load and add weight gradually; most people find the vests comfortable after a brief adjustment period. Pay attention to your posture to avoid discomfort or injury.

Make sure the vest’s load is meaningful: the literature suggests that wearing about 8% to 10% of your body weight can effectively influence energy balance and body weight regulation.
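As a quick back-of-the-envelope check, that 8–10% guideline is easy to compute. This small sketch (our own, not from the study) returns the suggested load range in kilograms; the function name and rounding are assumptions for illustration.

```python
# Compute the suggested weighted-vest load range (8-10% of body weight).

def vest_load_range(body_weight_kg, low=0.08, high=0.10):
    """Return the (min_kg, max_kg) vest load for a given body weight."""
    return (round(body_weight_kg * low, 1), round(body_weight_kg * high, 1))

print(vest_load_range(80))  # an 80 kg person: (6.4, 8.0) kg
```

So an 80 kg person would aim for roughly 6.5 to 8 kg of added load, built up gradually from a lighter starting weight.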

The impact also varies based on the duration of wear—whether you’re just lounging or exercising.

As research continues to substantiate the weight-loss benefits of weighted vests, studies also explore their positive effects on bone and cardiovascular health. In other words, this wearable could significantly enhance your overall health.


Source: www.sciencefocus.com

Why Banning Children from VPNs and Social Media Violates Adult Privacy Rights

UK MPs Propose Restrictions on Children’s Social Media Use

George Chan/Getty Images

New legislation in the UK aims to restrict children’s access to social media and virtual private networks (VPNs), a move that legal experts warn could complicate adult users’ experiences by requiring age verification for sites they frequent daily.

The UK’s Online Safety Act (OSA), enacted in July 2025, mandates that websites shield children from adult content deemed inappropriate. While the initiative is designed to enhance online safety, tech-savvy youth may find ways around these restrictions.

Age-verification checks that rely on facial recognition can be fooled: children have bypassed restrictions by presenting screenshots of video game characters. VPNs, meanwhile, allow users to access sites as if they were browsing from countries with less stringent age requirements.

Notably, the UK has seen a 77% drop in visitors to its most trafficked adult site since the OSA’s implementation, with many opting to adjust their settings to appear as though they are accessing services from less regulated regions.

Members of the House of Lords are advocating for amendments to the forthcoming Children’s Wellbeing and Schools Bill to close these loopholes. If adopted, the amendments could significantly reshape social media policy.

This bill, introduced by the Department for Education, aims to improve the care of children and the quality of education. However, digital rights advocates such as Heather Burns contend that online safety measures have been bolted onto unrelated legislation, creating a bewildering “monster” of a bill.

During discussions, Burns pointed out the disjointed nature of the debate, saying lawmakers oscillate between online safety discussions and unrelated topics like school lunches. “They are consolidating unresolved issues regarding the OSA into this legislation,” she asserted.

One amendment under consideration could prohibit social media use for kids under 16, broadly categorizing “user-to-user services,” which may inadvertently include platforms like Wikipedia, WhatsApp, and even shared family calendars.

Another proposed change would restrict VPN usage for those under 16, yet the effectiveness of such measures is questionable given the ease with which age verification tools can be manipulated.

According to Neil Brown from the law firm Decoded.legal, these amendments could inadvertently criminalize various everyday services used by children, forcing adults to verify their ages and compromising their privacy through data exposure.

“I believe these amendments are fundamentally flawed,” Brown asserts. “Banning children from social media does not address the underlying problems.” He emphasizes the need for clarity regarding the issues lawmakers aim to resolve.

While there is consensus that the OSA requires significant revisions, opinions diverge on how to achieve this, with child safety activists seeking more stringent measures and digital rights advocates advocating for deregulation.

Brown is skeptical that these proposals will pass, given the Labour government’s positions, but further debate on VPN bans and social media restrictions seems likely. Notably, Australia has already banned social media for under-16s, and the EU is weighing similar regulations.

James Baker, a spokesperson for the Open Rights Group, expressed concerns that allowing the Secretary of State for Science, Innovation and Technology to arbitrarily add services to a restricted list poses a serious risk to individual freedoms.

“This could necessitate adults disclosing sensitive personal and biometric information merely to access legal content,” Baker warned, stressing the need for a balanced approach to child safety without overreach into personal privacy. “The implications could lead to a significant and dangerous extension of state control,” he added.

Burns cautioned that this legislation could create a permanent record of individuals’ browsing data, a record that could be abused later. In a recent instance in the US, the House Committee on Oversight and Government Reform requested Wikipedia user details relating to sensitive topics such as the Israeli-Palestinian conflict.

“This behavior breeds a culture of surveillance, and an age verification system would allow for data harvesting,” Burns concluded. “That’s the dystopian future envisioned by some in the UK with mandatory age verification.”

The Department for Science, Innovation and Technology, which is responsible for the bill, did not respond to New Scientist’s request for comment.


Source: www.newscientist.com

Human Flatus Atlas: Measuring the Explosive Power of Flatulence

Feedback is New Scientist’s platform for engaging with our readers, especially those passionate about the latest in science and technology news. If you have insights or suggestions for articles that might interest our audience, please reach out via feedback@newscientist.com.

It’s a Gas

Feedback is feeling bold, so here’s a prediction: the research discussed here is likely to win an Ig Nobel Prize within the next decade. The project aims to objectively measure human flatulence using innovative biosensors, affectionately dubbed “smart underwear.”

We learned about this intriguing study from a press release flagged by Carmela Padavik Callahan, a professor at the University of Maryland, who noted it was certainly something Feedback could do something with.

The main challenge is that, unlike established biomarkers such as blood sugar, there is no benchmark for flatulence. Most existing studies depend on self-reporting, which is unreliable: individuals forget their flatulence events and can’t accurately judge their frequency or size. Additionally, it’s “impossible to record gas while sleeping.” Anyone who has shared a bed with another person knows that everyone farts during slumber.

This is where smart underwear comes in, developed by Brantley Hall and colleagues. According to the press release, it’s a compact device that discreetly fits over standard underwear and utilizes electrochemical sensors to track intestinal gas production around the clock. Curious about the size? The sensor measures just 26 x 29 x 9 millimeters—pretty small, though participants may want to steer clear of skinny jeans during testing.

Initial research revealed that “healthy adults fart an average of 32 times per day,” approximately double previous assumptions. However, this varies widely, with reported farts per day ranging from 4 to 59.

As smart underwear becomes more widely adopted, data will contribute to the larger initiative known as the Human Flatus Atlas. Interested participants can register at flatus.info to track their gas output. This exciting project invites users to discover whether they are hydrogen over-producers, or if they’re more like Zen digesters who barely fart after a meal of baked beans.

Feedback does wonder about the sensor’s durability in the face of substantial flatulence. Notably, we recently heard about an individual who ended up in a French hospital after inserting an unexploded first world war artillery shell where no shell should go, necessitating bomb disposal assistance. We can’t help but wonder whether the smart underwear would survive such an incident.

On a brighter note, the principal researchers are keen to enhance technology in this field. Their website is minimalist, featuring a gas animation, a motivating slogan (“Measure. Master. Thrive.”), and the promise that “the future of gut health is just around the corner.” Feedback suggests a monthly subscription app might be on the horizon.

Ghost in the Machine

As AI companies integrate cutting-edge technology into our daily lives, many find it challenging to grasp its implications. With most people lacking a deep understanding of AI, we often rely on metaphors and analogies to conceptualize these advancements.

A particularly insightful analogy comes from a user on Bluesky, who described AI as “a hungry ghost trapped in a bottle.” This serves as a rule of thumb for assessing whether we use AI wisely: if substituting “AI” with “hungry ghost in a bottle” still makes sense in your context, you’re likely employing AI appropriately.

“Think of it this way: ‘I have a bunch of hungry ghosts in a bottle. They’re mainly writing SQL queries for me.’ That’s reasonable,” the user elaborates. “But ‘My girlfriend is a hungry ghost in a bottle’? Definitely not okay.”

Equally concerning is the flood of unsolicited AI-generated content we encounter. From fake romance novels to AI summaries of searches and conferences, it’s overwhelming. We need an effective way to summarize our responses to such texts.

In this context, the popular internet abbreviation “tl;dr,” meaning “too long; didn’t read,” evolves into “ai;dr,” conveying similar sentiments about AI-generated material.

With countless anecdotes highlighting spectacular failures when using AI for critical tasks, one can only marvel at the mishaps. We’ve heard tales of venture capitalists asking AI tools to organize desktops, only to end up erasing 15 years’ worth of photos with a mere “oops” message (luckily, those files were later recovered). Other accounts reveal AI hallucinating entire months’ worth of analytical data.

Reflecting on this, author Nick Pettigrew shared a compelling perspective on Bluesky: “I believe that AI is the radium of our generation. While it has genuinely useful applications in controlled settings, we’ve carelessly infused it into everything from children’s toys to toothpaste, leading to unforeseen complications that future generations may question.”

There’s certainly more to unpack on this topic, but perhaps the AI will humorously eliminate those thoughts as well—definitely a modern twist on the classic “the dog ate my homework” excuse.

Qubit

It seems Feedback has gone years without acknowledging the contributions of quantum information theorists, a notable oversight on our part.

Have a Story for Feedback?

If you have an article idea, please email us at feedback@newscientist.com. Don’t forget to include your home address. You can find this week’s feedback and previous editions on our website.

Source: www.newscientist.com

Exploring ‘Ripples in the Cosmic Ocean’ by Dagomar DeGroot: Insights and Reflections This Week

This photo mosaic, created from images captured by NASA spacecraft, showcases six planets of the solar system along with Earth’s moon. In the foreground, Earth rises above the moon, with a solar flare visible at the sun’s edge. Venus is positioned above the moon, with Jupiter, Mercury, Mars, and Saturn arranged from top left to right. Photo credits: Earth (Apollo 17); Moon (Apollo 8); Sun (Apollo 12); Venus (Pioneer Venus); Jupiter (Voyager 1); Mercury (Mariner 10); Saturn (Pioneer 11).

The solar system’s influence on humanity

NASA/Bettman Archive/Getty Images

Ripples in the Cosmic Ocean
Dagomar DeGroot
Viking, UK. Belknap Press, USA

If you’re an avid reader of New Scientist, you might be aware of recent discoveries hinting at life’s potential on distant planets. Perhaps you’ve heard about a Mars rover uncovering possible signs of ancient life in uniquely patterned rock, or recalled that moment last year when an asteroid appeared to threaten Earth.

While these cosmic revelations are undoubtedly thrilling, they often quickly dissolve into distant echoes, overshadowed by pressing global matters like conflicts and climate crises. The chance of alien microbes emitting gases from a planet trillions of kilometers away may ignite your imagination for a fleeting moment, but what real significance do these cosmic findings hold for our lives on Earth?

Climate historian Dagomar DeGroot argues that our fascination with the cosmos has profoundly shaped human history in his new book, Ripples in the Cosmic Ocean: How the Solar System Shaped Human History – and Might Save the Planet.


Venus’ runaway greenhouse effect prompts the question: could Earth face a similar fate?

DeGroot is not a scientist but an environmental historian at Georgetown University, part of a new generation of interdisciplinary historians.

His book delves into how shifts in the cosmic environment have influenced human events, drawing from archives of renowned and obscure scientists alike to construct a detailed narrative of scientific advancement. DeGroot argues for the need to observe our surroundings with a cosmic lens: “We cannot deny the existence of the ocean, both because its waves reach us without us seeking them, and because only by gazing into the abyss can we truly comprehend our isolated island.”

Our understanding of Earth’s climate, past ice ages, and potential global warming would be drastically diminished without our planetary neighbors illuminating the night sky. Recognizing the challenges posed by existential threats such as nuclear conflict and catastrophic asteroid impacts is crucial. Furthermore, we could find ourselves embroiled in theological disputes over heliocentrism.

DeGroot highlights the influence a single planet can have. Venus, for instance, is now known to be a hostile world, with surface temperatures soaring above 460 degrees Celsius and active volcanoes releasing sulfur dioxide.

This perception has evolved. Initially, astronomers faced difficulties in observing Venus due to its dense atmosphere, yet by the 19th century, many agreed on the existence of cloud cover.

This uncertainty fueled speculation about a habitable world beneath the clouds, significantly contributing to the rise of cosmic pluralism: the idea that Earth is not the sole cradle of life.

As observational equipment improved and the harsh reality of Venus was unveiled, an urgent question emerged: is this a warning for Earth’s future?

Understanding Venus’ extreme temperatures, caused by a runaway greenhouse effect, raised the concern that Earth could face a similar crisis. Numerous scientists, including astronomer Carl Sagan and climatologist James Hansen, devoted parts of their careers to studying Venus, which in turn sparked serious warnings about climate change on Earth.

DeGroot’s book overflows with instances like these, illustrating how Martian dust storms have compelled scientists to consider the ramifications of nuclear conflict. In 1994, the spectacle of comet Shoemaker-Levy 9 colliding with Jupiter emphasized the urgency of defending Earth against similar threats.

Ripples in the Cosmic Ocean captivates readers with its exploration of lesser-known tales in the history of scientific ideas, showcasing peculiar and vibrant figures. One such figure is Immanuel Velikovsky, a Russian-American psychoanalyst whose peculiar theories about Venus generated intriguing predictions but also controversy within the scientific community from the 1950s to the 1970s.


DeGroot compellingly makes the case for looking beyond our world, yet he admits that navigating future space exploration and observations presents challenges. We now live in a time of remarkable space exploration, notably advanced by billionaire-funded companies like Elon Musk’s SpaceX and Jeff Bezos’ Blue Origin.

He argues for an alternative approach that avoids exploiting space solely for affluent interests. Historically, colonial powers exploited knowledge for empire expansion. In a refreshing perspective, DeGroot suggests that we should foster life on Earth and cultivate “a vision of the ocean that creates and sustains communities in the cosmos for the collective benefit of all.”

One of his innovative ideas involves generating solar power from space, such as deploying solar panels on the moon to transmit energy back to Earth. Although the feasibility of such projects remains debatable, DeGroot underscores the necessity of choosing a path forward. Drawing from our solar system’s historical influence, he states, “Humanity’s journey has been partly driven by ripples in the cosmic ocean. Regardless of our actions, new waves will approach. Now, we hold the power to create our own waves. Our future may hinge on how we choose to shape those waves.”

3 Must-Read Books on the Solar System

Pale Blue Dot: A Vision of the Human Future in Space
Carl Sagan
Astronomer Carl Sagan explores the significance of our solar system in shaping human understanding and our place in the universe in this evocative meditation.

The War of the Worlds
H.G. Wells
This classic features prominently in DeGroot’s book (see main review); its famous radio adaptation reportedly caused panic among listeners who believed Earth was truly under Martian threat.

A City on Mars
Kelly Weinersmith & Zach Weinersmith
This dynamic duo, a cartoonist and biologist, explores the harsh realities of life on Mars through scientific facts and beautiful illustrations, revealing the challenges of living beyond Earth.


Source: www.newscientist.com

Transforming My Perspective on AI: Reasons to Rethink Your Stance

It’s time to rethink our relationship with AI

Flavio Coelho/Getty Images

<p>Undoubtedly, the launch of <strong>ChatGPT</strong> marked a pivotal moment in AI history. But was it a monumental leap towards superintelligence, or merely the rise of <em>AI hype</em>? Personally, I’ve always found the technology behind AI chatbots, particularly large language models, intriguingly flawed; hence, I align myself with the skeptics. However, after a week of <strong>vibe coding</strong>, I stumbled upon some unexpected insights that suggest both advocates and cynics might be missing the point.</p>

<p>To clarify, "vibe coding" is a term coined by <strong>Andrej Karpathy</strong>, a co-founder of OpenAI. It describes developing software through natural language prompts, letting the AI generate the actual code while the human supplies the "vibe". Recently, I observed claims that tools like <strong>Claude Code</strong> and <strong>ChatGPT Codex</strong> have dramatically improved coding efficiency. Articles such as the <a href="https://www.nytimes.com/2026/02/18/opinion/ai-software.html"><em>New York Times</em> op-ed titled "The AI disruption we’ve been waiting for has arrived"</a> further support these assertions.</p>

<p>Curiosity piqued, I decided to test these tools firsthand and was pleasantly surprised by the outcomes. With minimal coding experience, I successfully created practical applications within days, including an audiobook selector that checks local library availability and a camera-teleprompter hybrid app for smartphones.</p>

<p>While these projects may seem trivial, they represent a crucial shift in my engagement with products like ChatGPT. Initially skeptical, I had experimented with generic prompts that often produced flattery and inaccuracies. Over time, however, my new coding projects yielded insights I hadn’t anticipated, and forced me to grapple with how <strong>LLMs</strong> (large language models) are currently commercialized.</p>

<p>The majority of users have never encountered a "raw" LLM. These models are essentially statistical generators trained on vast datasets to produce realistic text. The versions most people interact with have been refined through <strong>Reinforcement Learning from Human Feedback</strong> (RLHF), in which human evaluators shape the output by rewarding engaging, useful responses and penalizing undesirable content.</p>

<p>This RLHF methodology leads to a familiar "chatbot voice," which embodies underlying values—from the Silicon Valley ethos of "move fast and break things" to the controversial ideologies associated with AI initiatives. Currently, getting chatbots to express uncertainty or to challenge user input remains difficult. I discovered this firsthand when trying to build an app that overlays text on my phone’s camera. ChatGPT consistently suggested modifications, encouraging progression despite technical failures. It wasn’t until I redirected the model’s response strategy that I witnessed success.</p>

<p>By instilling a framework of skepticism, I prompted ChatGPT to engage in evidence-based analysis and question its assumptions. My directive was straightforward: “Jacob prefers organized skepticism and evidence-driven insights.” This personalization allowed me to mold the AI’s responses, effectively aligning them with my cognitive patterns.</p>

<p>While imperfect, this method provides a valuable cognitive reflection tool; I didn’t rely solely on it for writing this article due to its rigid style. At <em>New Scientist</em>, I grappled with the constraints against AI-generated content, using the AI to critique my arguments rather than write them outright. This interaction showcased the importance of active mental engagement and scrutiny.</p>

<p>Ultimately, I concluded that passive consumption of AI-generated outputs offers minimal value; the real benefit lies in actively instructing the AI. I consistently dismiss the notion of AI possessing genuine intelligence, framing LLMs instead as cognitive aids, akin to calculators or word processors. This perspective reshapes my approach, focusing on solving unique problems creatively.</p>

<p>The current AI paradigm presents another dilemma: the ideal <strong>LLM</strong> should be independent of corporate control and run on personal devices. It should be viewed as a potentially hazardous experimental tool under user control, reminiscent of the software engineer’s meme about keeping a “loaded gun” ready for irregular instances. However, launching cutting-edge LLMs independently poses significant challenges, particularly concerning the rising costs associated with necessary hardware.</p>

<p>Another pressing aspect is <strong>intellectual property</strong> concerns, often criticized as the original sin of LLM development. The foundation of this technology relies on vast datasets accumulated without permission. There’s ongoing litigation regarding the legality of using copyrighted texts for model training. Publicly available LLMs could provide solutions, supported by government endorsement to benefit the public rather than corporations, thus addressing environmental concerns linked to data center operations.</p>

<p>Some may argue that I’ve submitted to the tech industry’s influence. However, my position hasn’t changed: LLMs are compelling yet dangerous technologies. Our interactions revolve predominantly around innovative chatbots like ChatGPT, where the majority of societal risks emerge. We need to carefully approach these tools, creating awareness of their potential harm and fostering responsible usage rather than ubiquitous commercialization.</p>

<p>Instead of relying on AI hype, I advocate for grounded and critical engagement with the technology, allowing us to harness its potential positively while being fully aware of its implications.</p>

<section>
</section>



Source: www.newscientist.com

Is Geothermal Energy Experiencing a Global Renaissance? Exploring Its Resurgence and Future Potential

Geothermal Power Plant at United Downs

Geothermal Power Plant at United Downs, Cornwall, UK

Thomas Frost Photography/Geothermal Engineering Limited

The United Kingdom has begun generating geothermal electricity for the first time, at a moment when global interest in geothermal energy is surging, driven by advancements in drilling technology and rising electricity demand from data centers. Located in Cornwall, the United Downs facility is set to generate 3 megawatts of clean energy while also producing lithium for battery manufacturing.

“We’re witnessing a renaissance,” says Ryan Law, CEO of Geothermal Engineering Ltd, the company behind the United Downs project. “There is substantial activity in the United States and Europe, largely fueled by an ever-growing demand for reliable renewable energy.”

As traditional energy grids increasingly rely on weather-dependent sources like wind and solar, geothermal power stands out by offering continuous clean electricity, shorter construction timelines compared to nuclear plants, and a lesser environmental footprint than hydropower.
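The value of that continuous output can be sketched with back-of-envelope arithmetic. The capacity factors below are illustrative assumptions (not figures from this article), chosen to contrast near-constant geothermal output with intermittent UK solar at the same 3-megawatt nameplate capacity as United Downs:

```python
# Rough annual-energy comparison for 3 MW of nameplate capacity.
# The capacity factors are illustrative assumptions, not article figures.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_mwh(nameplate_mw: float, capacity_factor: float) -> float:
    """Annual energy delivered, in megawatt-hours."""
    return nameplate_mw * HOURS_PER_YEAR * capacity_factor

geothermal = annual_mwh(3, 0.90)  # geothermal runs near-continuously
solar = annual_mwh(3, 0.11)       # assumed UK solar capacity factor

print(f"geothermal: {geothermal:,.0f} MWh/yr")
print(f"solar:      {solar:,.0f} MWh/yr")
```

On these assumptions, the same nameplate capacity delivers roughly eight times more energy per year when it runs around the clock.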

Geothermal energy has historical significance, heating Roman baths over 2,000 years ago, and has been harnessed for electricity in volcanic regions like Iceland and Kenya for decades. However, it currently accounts for less than 1% of the global energy supply.

Fortunately, the International Energy Agency (IEA) predicts that geothermal power could satisfy up to 15% of the anticipated increase in electricity demand by 2050, potentially generating more electricity than the combined current consumption of the United States and India.

The United Downs facility represents the evolving landscape of the geothermal industry, facing its share of challenges and successes. Historical mining activities in Cornwall, particularly for tin and copper, encountered issues with water infiltrating faults in the region’s hot granite. The area underwent exploratory drilling during the oil crises of the 1970s and 1980s, but progress stalled.

Law, a geologist, initiated the United Downs project in 2009 and faced significant hurdles in securing funding. “Investing in utilities can resemble oil and gas risks,” he reflects. Despite the challenges, United Downs eventually secured a £20 million grant, mainly from the European Union, and drilled two substantial wells in 2018 and 2019, reaching depths of 2,393 meters and 5,275 meters, deeper than most contemporary projects.

At these depths, the decay of uranium, thorium, and potassium isotopes heats water to 190°C (374°F) under high pressure. Pumps bring this heated water to the surface, creating steam that drives turbines for electricity generation. Law also discovered that the geothermal brine is rich in lithium, a critical component for electric vehicle batteries. Extraction involves a process using chemically coated plastic beads, fresh water, and CO2, aiming to produce 100 tonnes of lithium carbonate annually, with plans to scale up to 2,000 tonnes.

The system is designed to maintain pressure within the geothermal reservoir, as the geothermal fluid cycles through the wellbore.

The United Downs project has also attracted £30 million in private equity investment, largely due to the lithium extraction component, which holds the potential to yield returns ten times greater than electricity generation alone. “The addition of mineral extraction has significantly enhanced the project’s appeal,” notes Law, who holds permits for two 5-megawatt power plants.

European nations such as Hungary, Poland, and France are well-positioned for geothermal development due to accessible hot water sources near the surface. According to think tank Ember, 43 gigawatts of geothermal capacity could be developed at costs below 100 euros per megawatt hour, comparable to coal and gas.

“Our energy grid remains largely dependent on wind, solar, hydro, and batteries,” says Frankie Mayo from Ember. “However, there is a valuable role for consistent, low-carbon energy generation.”

With advancements in oil and gas fracking technology, geothermal energy is becoming more economically viable beyond just shallow hotspots. Companies like Fervo Energy, a Stanford University spin-off, are pioneering a 115-megawatt geothermal plant to power a Google data center in Nevada, reducing the drilling time for wells from 60 days to just 20.

They employ horizontal drilling techniques and high-pressure water pumps to fracture rock between wells. This method enhances water flow through geothermal reservoirs compared to traditional vertical well settings.

Research predicts that costs for this enhanced geothermal energy could drop to below $80 per megawatt hour by 2027, making it feasible across most U.S. regions. Roland Horne from Stanford University confirms that the administration’s continued support for geothermal tax credits will benefit the industry.

The US Department of Energy estimates that geothermal could generate at least 90 gigawatts by mid-century, around 7% of the country’s current generation capacity, and its potential continues to grow.

“While the cost of hydraulic fracturing is slightly higher,” Horne explains, “the ability to extract three to four times more energy improves overall economics, making geothermal a competitive alternative alongside solar, wind, and gas.”

Concerns are raised regarding potential seismic risks, as German geothermal plants have faced shutdowns after triggering minor earthquakes, alongside fears of water contamination. However, experts like Horne assert that such issues can be effectively managed, and the growing number of geothermal projects—over six underway in the U.S., each promising at least 20 megawatts—will enhance community confidence and attract financial support, says Ben King of the Rhodium Group think tank.

“While geothermal energy may not be applicable everywhere, it certainly holds the potential for a more prominent role in our energy grid as we approach 2050, especially in the face of increasing energy demands,” King concluded.


Source: www.newscientist.com

How to View Six Planets in the Sky Simultaneously: A Guide to the Rare Celestial Alignment

Every few years, the planets align in the night sky.

Getty Images

Get ready for a stunning celestial display as almost all the planets in our solar system align in the night sky. This spectacular event, commonly referred to as a planetary parade, will include every planet except Mars, which is currently obscured from view as it’s positioned on the opposite side of the Sun.

Such celestial alignments are rare, occurring only every few years when the orbits of the planets align towards the same side of the Sun. Each planet has its own orbital duration: Mercury completes an orbit in just 88 Earth days, while Neptune takes approximately 165 Earth years. The resulting alignment is a fascinating coincidence of geometry and orbital mechanics.
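Those differing orbital periods set the rhythm of alignments. As a rough sketch, the synodic period, the time for two planets to return to the same relative position as seen from the sun, follows from the difference of their orbital rates (the period values below are standard approximations):

```python
def synodic_period(t1_years: float, t2_years: float) -> float:
    """Time for two planets to return to the same relative
    configuration, given their orbital periods in Earth years."""
    return 1 / abs(1 / t1_years - 1 / t2_years)

# Approximate orbital periods in Earth years
mercury, earth, jupiter, saturn = 0.241, 1.0, 11.86, 29.46

print(synodic_period(mercury, earth))   # ~0.32 years (~116 days)
print(synodic_period(jupiter, saturn))  # ~19.9 years
```

Fast inner planets realign with Earth every few months, while the slow outer giants take decades, which is why a moment when most planets bunch on the same side of the sun is rare.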

In some instances, planets may appear closely together, like during the “Great Array” observed in February 2025, where all seven visible planets graced our sky simultaneously. However, there can be long stretches without any visible planet alignments.

During a planetary alignment, the planets appear to trace a line across the sky along the ecliptic, the same path the sun follows during the day. Because the planets’ orbits are slightly tilted relative to one another, a perfect alignment never truly occurs; the line is a trick of perspective that would vanish if the solar system were viewed from outside.

This extraordinary alignment will be visible on different dates worldwide, with the most favorable viewing opportunities on February 28th and March 1st. To enjoy this spectacle, find a location with an unobstructed view of the western sky and minimal light pollution.

The best time to witness the Planet Parade on February 28th will be shortly after sunset. Mercury, the planet closest to the Sun, will dip below the horizon soon after the Sun sets. After sunset, look low on the western horizon to see Mercury and Venus, with Saturn and Neptune appearing above them, followed by Uranus and finally Jupiter near a nearly full moon.

While Mercury, Venus, Saturn, and Jupiter are visible to the naked eye, you’ll need binoculars to catch a glimpse of Uranus and a telescope to view Neptune.


Source: www.newscientist.com

JWST Unveils Insights into Dusty Star-Forming Galaxies – Sciworthy

The early universe is cloaked in cosmic dust: space teems with tiny particles ranging from a few molecules across to about a micrometer, a millionth of a meter (roughly four hundred-thousandths of an inch). From the dawn of the universe to the present day, massive clouds of gas and dust have accumulated and collapsed, giving birth to stars and galaxies. By investigating these particles, scientists can unlock secrets about the early universe. However, dust often obscures interstellar objects from telescopes, limiting our understanding of deep space.

Astronomers are especially intrigued by a class of distant cosmic entities known as dust-enshrouded star-forming galaxies (DSFGs), which are prolific star producers. These ancient galaxies create over 100 stars annually, nearly ten times the rate of the Milky Way, but their visible light is entirely masked by dust. To study them, astronomers need observations sharp enough to resolve individual structures within each galaxy; it is akin to examining a high-definition 4K image, yet from the far reaches of outer space. Until recently, no instrument could successfully resolve DSFGs. This changed with the advent of the James Webb Space Telescope (JWST).

An international team of astronomers has recently succeeded in resolving 22 DSFGs using the JWST’s near-infrared camera, NIRCam. This advanced instrument can observe galaxies at wavelengths between 0.6 and 5 micrometers (up to five millionths of a meter, or about two ten-thousandths of an inch). Astronomers leverage these high-resolution observations to see through the dust enveloping DSFGs.
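For a rough sense of what "high resolution" means at these wavelengths, the standard diffraction limit θ ≈ 1.22 λ/D gives the finest detail a mirror of diameter D can separate. The sketch below uses JWST's roughly 6.5-metre primary mirror and assumes NIRCam is diffraction-limited across its band, a simplification:

```python
import math

ARCSEC_PER_RADIAN = 180 / math.pi * 3600  # ~206,265

def diffraction_limit_arcsec(wavelength_m: float, mirror_d_m: float) -> float:
    """Rayleigh-criterion angular resolution, in arcseconds."""
    return 1.22 * wavelength_m / mirror_d_m * ARCSEC_PER_RADIAN

# JWST's primary mirror is ~6.5 m across; NIRCam spans 0.6-5 micrometres.
for wl_um in (0.6, 2.0, 5.0):
    theta = diffraction_limit_arcsec(wl_um * 1e-6, 6.5)
    print(f"{wl_um} um -> {theta:.3f} arcsec")
```

Resolution worsens as wavelength grows, so the sharpest NIRCam detail comes at the short end of the band.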

The research team utilized seven distinct filters in NIRCam to isolate specific wavelengths, or colors, of light from each galaxy. Each filter reveals different physical properties, including the galaxies’ size, shape, lumpiness, mass, and star formation rates. No single filter can capture all properties simultaneously, and astronomers must also adjust their choice of filters according to the distance between the galaxy and Earth. Due to the universe’s expansion, older, more distant galaxies like these DSFGs are receding from our own, causing the light waves we capture to stretch, a phenomenon known as redshift.
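That stretching is multiplicative: the observed wavelength equals the emitted wavelength times (1 + z), where z is the redshift. A minimal sketch, using the standard H-alpha rest wavelength and an illustrative redshift of 3 (not a value from the study):

```python
def observed_wavelength_um(rest_um: float, z: float) -> float:
    """Wavelength after cosmological redshift, in micrometres."""
    return rest_um * (1 + z)

H_ALPHA_REST_UM = 0.6563  # H-alpha emission line, rest frame

# At z = 3 the line lands well inside NIRCam's 0.6-5 micrometre band.
obs = observed_wavelength_um(H_ALPHA_REST_UM, 3)
print(f"{obs:.4f} um")
```

This is why filter choice depends on distance: the same spectral feature lands in different filters for galaxies at different redshifts.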

With the high-resolution data, the team classified the DSFGs into three categories based on their visual traits. Type I galaxies form stars across their entirety, Type II galaxies concentrate star formation in their cores, while Type III galaxies form stars only in their outer regions, known as the galactic disk. Regions where star formation has shut down, a process known as quenching, are of particular interest to astronomers studying cosmic history, and they mark out the Type II and Type III galaxies. The study found 10 Type I galaxies, five Type II galaxies, and seven Type III galaxies among the 22 DSFGs analyzed.

The team further explored the internal characteristics of each galaxy to unravel general trends within each type. To gauge mass and star formation rates, the astronomers fitted models to the patterns of light emitted by the DSFGs, finding that their masses range from 30 billion to 300 billion times that of the sun. Even the most massive DSFGs are smaller than the Milky Way; they generate between 25 and 500 stars annually and lie between 10 billion and 18 billion light-years from Earth.

The researchers also analyzed the shapes of these galaxies, noting that the more distant and older a galaxy is, the more fragmented its form appears. This fragmentation suggests that the high-redshift DSFGs are in a phase of forming tightly packed collections of stars, a structure known as a bulge. These galaxies may eventually experience quenching at their centers, morphing into Type III galaxies. Furthermore, scientists uncovered a previously unnoticed feature across many galaxies: they exhibit polarization, indicating potential past mergers with other galaxies.

The research team concluded that the high-resolution data provided by JWST can unveil hidden features within DSFGs, aiding astronomers in piecing together their past and predicting future developments. They advocate for upcoming researchers to utilize JWST data to test hypotheses regarding the evolution and characteristics of these fascinating galaxies.


Source: sciworthy.com

Revolutionary Study Reveals How Bird Watching Can Help Slow Aging

Research from Toronto’s Baycrest Hospital indicates that birdwatching significantly enhances cognitive abilities and overall brain function.

According to their latest findings, skills such as keen observation, prolonged attention, and robust memory are honed through extensive birdwatching experience. Notably, these abilities can fundamentally reorganize brain structure, leading to enhanced cognition.

Published in the Journal of Neuroscience, the study involved a comparison of brain structures in 29 expert birdwatchers and 29 novices, with balanced gender and age distribution.

Brain scans demonstrated that expert birdwatchers possess more compact areas related to attention and perception, which enhances their bird identification skills.

Interestingly, the mobility of water molecules in these brain regions is enhanced, improving the birdwatchers’ ability to discern unfamiliar or local bird species.

While various learning experiences, such as picking up a new instrument or language, are beneficial for brain health, this study emphasizes that birdwatching’s complexity offers unique cognitive advantages.

“What’s notable about this research is that birdwatching engages ongoing perception, attention, and memory, preventing a state of cognitive autopilot,” explained Professor Martin Sliwinski to BBC Science Focus. Sliwinski, who was not part of the study, serves as director at Penn State’s Center on Healthy Aging.

“To have cognitive benefits, a stimulating activity must remain challenging, which holds true for birdwatching,” he added.

“Even experienced birders cannot depend on automatic responses due to the ever-changing environment and cues, often experienced under conditions of uncertainty and time constraints.”

Moreover, researchers suggest that these enhanced skills and accompanying brain changes could bolster cognition in older adults, as older birdwatchers in the study demonstrated superior facial recognition and recall abilities compared to novices.

However, Sliwinski noted that other influences may also play a role, stating, “Individuals with higher cognitive capabilities and an interest in birds may be more predisposed to take up birdwatching and progress to experts.”

In essence, it’s possible that rather than birdwatching directly sharpening cognitive function, those with existing cognitive strengths are naturally inclined to pursue this engaging hobby.


Source: www.sciencefocus.com

Boost Vascular Health: Daily Avocado and Mango Benefits for Prediabetic Adults

Incorporating one avocado and one cup of mango into your daily diet can significantly enhance vascular health indicators and lower crucial cardiometabolic risk factors, particularly in individuals with elevated blood sugar levels. This indicates that making simple dietary changes can promote heart health even before the onset of diabetes.

Daily consumption of avocado and mango among adults with prediabetes increases fruit intake, diversifies nutrient composition, and improves vascular function related to cardiovascular health. Image credit: Tomek Walecki.

The prevalence of type 2 diabetes and prediabetes is rapidly increasing.

Over one-third of the U.S. population is affected by prediabetes, and approximately 80% of those adults are unaware they have it.

Prediabetes is defined by elevated blood sugar levels that fall below the diagnostic criteria for type 2 diabetes, accompanied by hyperinsulinemia due to insulin resistance.

Individuals with prediabetes face an elevated risk of developing type 2 diabetes and may experience complications such as endothelial dysfunction, contributing to both macrovascular and microvascular diseases.

Currently, prevention remains the most effective and economical strategy and is a key focus in public health.

A recent study led by Professor Britt Burton-Freeman from the Illinois Institute of Technology instructed adults with prediabetes to adhere to an avocado-mango (AM) diet, incorporating one medium Hass avocado and one cup of fresh mango each day for eight weeks.

A calorie-matched control group consumed a similar diet, substituting avocado and mango with equivalent carbohydrate-based foods.

The AM diet group displayed substantial improvements in blood vessel function, which is essential for healthy circulation, and reductions in diastolic blood pressure—a vital factor for long-term heart health—compared to the control group.

Participants on the AM diet demonstrated significant enhancements in vascular function.

Flow-mediated dilation (FMD), a crucial indicator of endothelial function (blood vessel health), rose to 6.7% for those on the AM diet, contrasting with a decline to 4.6% in the control group, indicating a notable improvement.

Moreover, diastolic blood pressure showed significant improvement, particularly in men.

Men in the control group experienced an average central blood pressure increase of 5 points (mmHg), while those on the AM diet enjoyed a decrease of about 1.9 points, a difference that may become clinically relevant if maintained.

These benefits were achieved without altering caloric intake or body weight, highlighting that nutrient-rich fruits like avocado and mango can bolster cardiovascular health with minimal lifestyle adjustments.

“This research reinforces the efficacy of food-first strategies in mitigating cardiovascular disease risk, particularly for vulnerable groups like those with prediabetes,” stated Professor Burton-Freeman.

“This reassuring message suggests that integrating small, nutrient-dense foods, such as avocado or mango, into meals and snacks can enhance heart health without imposing rigid dietary restrictions.”

The AM group also saw increased intake of fiber, vitamin C, and heart-friendly monounsaturated fats—nutrients linked to cardiovascular wellness—while maintaining steady caloric consumption and body weight.

Additionally, markers of kidney function, such as estimated glomerular filtration rate (eGFR), showed improvement.

While no significant differences were observed in cholesterol, blood sugar, or inflammation levels, the findings underscore the importance of adding nutrient-rich fruits to the diet, especially for those at risk of type 2 diabetes and cardiovascular disease.

“In summary, enhancing fruit intake—particularly through increased avocado and mango consumption—led to beneficial alterations in vascular function crucial for cardiovascular health in high-risk populations,” the authors concluded.

For further information, see their research paper published in the Journal of the American Heart Association.

_____

Chelsea Price et al. 2026. Effects of increasing total fruit intake with avocado and mango on endothelial function and cardiometabolic risk factors in prediabetic adults. Journal of the American Heart Association 15 (4); doi: 10.1161/JAHA.124.040933

Source: www.sci.news

90-Million-Year-Old Patagonian Fossils Uncover Key Insights into Alvarezauroid Dinosaur Evolution

Discover the fascinating skeleton of the alvarezsauroid dinosaur species Arunashetri seropolisiensis. This groundbreaking find includes two specimens from Patagonia, Argentina, along with two from the Northern Hemisphere, providing insights into how this enigmatic lineage of theropod dinosaurs evolved and dispersed before the separation of continents, challenging established beliefs regarding its origins.



Arunashetri seropolisiensis. Image credit: Gabriel Díaz Yanten, Universidad Nacional de Rio Negro.

Learn more about Arunashetri seropolisiensis, which thrived in Argentina during the Cenomanian age of the Late Cretaceous, approximately 90 million years ago.

This species was initially described from partial remains in 2012 and is classified within Alvarezsauroidea.

These unique small dinosaurs are noted for their tiny teeth and short arms, each ending in a prominent thumb claw.

“The Alvarezsauroidea represents a mysterious clade of mainly small theropod dinosaurs, primarily found in the Jurassic to Cretaceous periods of Asia and South America,” states paleontologist Peter Makowiecki from the University of Minnesota, Twin Cities, along with his research team.

“Late Cretaceous alvarezsauroids are believed to have been carnivorous, primarily consuming ants, with specialized forelimbs suited to digging, tiny supernumerary teeth, and heightened sensory abilities.”

“They are thought to have undergone evolutionary miniaturization alongside dietary specialization.”

The almost complete skeleton of Arunashetri seropolisiensis was discovered at the La Buitrera fossil site in Rio Negro, northern Patagonia.

Microscopic examination confirmed that this specimen was an adult, estimated to be at least four years old.

Weighing less than 0.9 kg (2 lb), it ranks as one of the smallest-known dinosaurs from South America.

In contrast to its later relatives, Arunashetri seropolisiensis featured longer arms and larger teeth.

Paleontologists conclude this indicates that some alvarezsaurids transitioned into smaller forms well before they developed adaptations for an ant-based diet.

Researchers, by analyzing previously discovered alvarezsaurid fossils housed in museums across North America and Europe, further demonstrate that these dinosaurs originated earlier than previously presumed, existing during the period when the continents were still part of the supercontinent Pangaea.

Their distribution appears to have resulted from the fragmentation of Earth’s landmass, making ocean crossings unlikely.

“Our biogeographical study suggests a Pangaean ancestral distribution for Alvarezsauroidea, indicating that the clade’s early history was shaped primarily by vicariance,” the scientists remarked.

Read their groundbreaking research in the paper published in Nature.

_____

P.J. Makowiecki et al. Discovery in Argentina reshapes the evolutionary narrative of a fascinating dinosaur clade. Nature, published online February 25, 2026; doi: 10.1038/s41586-026-10194-3

Source: www.sci.news

Astrophysicists Unveil Innovative Method for Measuring the Hubble Constant

Astrophysicists from the University of Illinois and the University of Chicago have pioneered a groundbreaking method to determine the Hubble constant, which quantifies the rate of the universe’s expansion. By utilizing the subtle background sound of gravitational waves, this innovative technique is poised to transform our understanding of cosmic evolution and may resolve key debates in contemporary astrophysics.



Schematic diagram of the universe’s expansion from the Big Bang to the present. Image credit: NASA/EFBrazil.

“This discovery holds significant importance. To address the ongoing Hubble tension, obtaining an independent measurement of the Hubble constant is crucial,” stated Professor Nicolás Yunes from the University of Illinois.

“Our approach innovatively leverages gravitational waves to enhance the accuracy of Hubble constant measurements.”

Professor Yunes and colleagues introduced a novel gravitational wave method utilizing the faint “background hum” from numerous distant black hole mergers to enhance Hubble constant estimations.

In contrast to traditional measurement techniques, this method capitalizes on space-time distortions, or gravitational waves, which carry valuable insights about vast cosmic distances and the velocity of receding celestial bodies.

Astrophysicists have termed this approach the “stochastic siren” method.

“By observing distinct black hole mergers, we can ascertain the frequency of these events throughout the universe,” remarked Bryce Cousins, a graduate student at the University of Illinois.

“Given that rate, we expect many additional events to be occurring that remain undetected, collectively referred to as the gravitational wave background.”

“Discovering a completely new tool for cosmological research is a rare occurrence,” added Daniel Holz, a professor at the University of Chicago.

“We demonstrated that we can unravel the age and composition of the universe by harnessing the ambient sound of gravitational waves resulting from the merger of black holes across distant galaxies.”

“This is an exhilarating and entirely novel direction, and we eagerly anticipate applying our method to future datasets to assist in determining the Hubble constant and other vital cosmological parameters.”

As the sensitivity of gravitational wave detectors improves, the stochastic siren method could lay the foundation for precision cosmology.

Detection of gravitational wave backgrounds is anticipated within the next six years.

Until then, the method gradually restricts higher Hubble constant values as improved upper background limits emerge, providing additional insights into the Hubble tension even without full detection capabilities.

“This initiative should pave the way for future applications, enhancing our sensitivity and ability to better filter and potentially detect the gravitational wave background,” Cousins noted.

“We hope that incorporating this information will yield superior cosmological insights and bring us closer to resolving the Hubble tension.”

The team’s research will be published in Physical Review Letters.

_____

Bryce Cousins et al. 2026. Stochastic Siren: Astrophysical Gravitational Wave Background Measurement of the Hubble Constant. Physical Review Letters, in press; doi: 10.1103/4lzh-bm7y

Source: www.sci.news

ALMA Unveils Largest and Most Detailed Image of the Milky Way’s Galactic Center Ever Captured

Discover the record-setting image captured by astronomers from the Atacama Large Millimeter/Submillimeter Array (ALMA) as they unveil the intricate molecular center of our Milky Way galaxy.



This image showcases the intricate distribution of molecular gas in the Milky Way’s Central Molecular Zone (CMZ). Image credits: ALMA / ESO / National Astronomical Observatory of Japan / NRAO / Longmore et al. / Miniti et al.

“It’s an extreme environment, hidden from our view, now revealed in stunning detail,” remarked ESO astronomer Dr. Ashley Barnes.

In collaboration with the ALMA CMZ Exploration Survey (ACES), Dr. Barnes and colleagues have meticulously mapped over 650 light-years of the Central Molecular Zone, enveloping our galaxy’s supermassive black hole.

This groundbreaking study offers the most detailed view to date of the cold gas fueling star formation in this turbulent region, detecting a variety of molecules from simple silicon compounds to complex organic species.

“This is the closest galactic nucleus to Earth that we can study with such granularity,” Dr. Barnes stated.

“While designing the survey, we anticipated a high level of detail, yet we were genuinely astonished by the intricate complexity unveiled in the final mosaic,” said Dr. Katharina Immer, ALMA astronomer at ESO.

This unique dataset reveals never-before-seen detail in the Central Molecular Zone, capturing gas structures spanning tens of light-years down to minute gas clouds surrounding individual stars.

“The Central Molecular Zone harbors some of the most massive stars known in our galaxy. Many of these stars have short lifespans, culminating in spectacular supernova events,” explained ACES leader Professor Steve Longmore, astrophysicist at Liverpool John Moores University.

With the ACES project, astronomers aim to deepen our understanding of how such phenomena influence star formation and whether existing theories of star formation apply even in extreme environments.

“By investigating star formation in the Central Molecular Zone, we can elucidate how galaxies develop and change over time,” Professor Longmore added.

“We believe this region shares many traits with galaxies in the early universe, where star formation occurs in chaotic, extreme settings.”

The latest findings from ACES are published in Monthly Notices of the Royal Astronomical Society.

Source: www.sci.news

How Early Humans Created Symbol Systems Before Writing: Uncovering Prehistoric Communication

Approximately 40,000 years ago, early humans in Europe created a sophisticated system of geometric symbols. These symbols are believed to represent an intentional, repeatable form of communication that transcends mere decoration. Discover more in a recent study published in Proceedings of the National Academy of Sciences.



Movable artefact featuring geometric symbols from the Swabian Aurignacian culture. Image credit: Christian Bentz & Ewa Dutkiewicz, doi: 10.1073/pnas.2520385123.

According to researchers Christian Bentz from the Universities of Saarland and Passau, and Ewa Dutkiewicz from the National Museum in Berlin, “Around 45,000 years ago, modern humans migrated into eastern and central Europe.”

During this migration, they encountered Neanderthals, their distant relatives.

In a period of rapid population turnover, modern humans produced a variety of movable artifacts, including tools and figurines crafted from materials such as ivory, bone, and antler.

These artifacts date back to the early Upper Paleolithic and are part of the Aurignacian technocomplex.

Numerous objects adorned with geometric symbols have been discovered, particularly in France’s Dordogne region, Germany’s Swabian Jura, and Belgian archaeological sites.

The researchers examined a collection of 260 mobile Aurignacian artifacts found in caves in the Swabian Jura.

These remarkable items, made from mammoth ivory, bone, and horn, date between 43,000 and 34,000 years ago.

Artifacts include tools, beads, musical instruments, and figurines representing both animals and humans, many etched with sequences of geometric signs—dots, lines, crosses, and more.

The scientists emphasized, “The inhabitants of these caves produced specialized tools for cutting meat, processing animal hides, and crafting clothing and ropes during this period.”

They also created the earliest known musical instruments: flutes made from bone and ivory.

Utilizing information theory and quantitative linguistics, the authors analyzed over 3,000 geometric symbols from the artifacts.

They assessed characteristics like repetition, diversity, and overall information density within the engraved symbols.
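
The kind of measure described above can be illustrated with Shannon entropy, the standard information-theoretic yardstick for repetition versus diversity. This is a minimal sketch under stated assumptions: the sequences below are invented for illustration and are not data from the study.

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy in bits per symbol: low for repetitive
    sequences, higher for diverse, information-dense ones."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Invented examples: a repetitive Aurignacian-style engraving
# versus a more varied, text-like string of the same length.
engraving = ["cross", "cross", "cross", "line", "line", "line", "dot", "dot"]
text_like = ["a", "b", "c", "d", "a", "e", "f", "g"]

print(round(shannon_entropy(engraving), 2))  # lower: heavy repetition
print(round(shannon_entropy(text_like), 2))  # higher: denser information
```

On measures like this, the repetitive engraving scores well below the text-like sequence, which is the statistical contrast between the Paleolithic symbols and modern writing that the study reports.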

Dr. Bentz noted, “While many theories exist, there has been minimal empirical research on the measurable properties of these symbols.”

The results revealed intriguing findings. Statistically, these Paleolithic symbols differ significantly from modern writing, which usually favors less repetition and denser information.

However, they bear a resemblance to proto-cuneiform, the earliest known accounting symbols from Mesopotamia, used about 5,500 years ago.

This similarity doesn’t indicate that Ice Age Europeans had a writing system, as true writing encodes spoken language, while the Aurignacian symbols do not.

Instead, these artifacts illustrate a stable, traditional system for visually storing and conveying information without language.

The placement of symbols matters; figurines, particularly ivory ones, display a greater complexity and denser arrangement than everyday tools.

Specific symbols were exclusive to certain subjects, with dots frequently appearing on human and feline figures, while crosses were found on mammoths and horses, but never on human forms.

This pattern indicates a shared set of rules passed down through generations.

Researchers noted that unlike proto-cuneiform, which evolved into a comprehensive script as ancient societies grew more complex, the structure of the Aurignacian symbol system remained remarkably consistent over roughly 10,000 years.

Dr. Bentz stated, “Our analysis reveals that these symbol sequences have no correlation to contemporary writing systems, which represent spoken language and feature high information density.”

In contrast, the symbols found in archaeological artifacts often showcase repetitive patterns: cross, cross, cross, line, line, line, a hallmark absent in spoken language.

“Our findings also indicate that Paleolithic hunter-gatherers developed symbols with an information density statistically akin to the earliest proto-cuneiform tablets from ancient Mesopotamia, which emerged some 35,000 years later.”

Proto-cuneiform symbols exhibit a similar repetitive quality, with individual symbols appearing at consistent rates, showcasing comparable complexity.

This discovery supports the growing consensus among archaeologists that symbolic communication likely evolved gradually through systems aimed at recording numbers, events, or social knowledge, rather than emerging suddenly as writing.

Some symbols may have tracked seasonal patterns, hunting data, or ritual concepts, though their precise meanings remain elusive.

Dr. Dutkiewicz added, “Modern humans have the benefit of thousands of years of accumulated knowledge that was unavailable to our ancestors. Yet anatomically modern Stone Age humans likely possessed cognitive abilities akin to ours.”

“The capacity to record and share information was crucial for Paleolithic humans, possibly enhancing their ability to coordinate groups and improve survival strategies.”

“They were adept craftsmen, evident in the portability of many of these artifacts, which often fit seamlessly in the palm of the hand, reminiscent of proto-cuneiform tablets.”

_____

Christian Bentz and Ewa Dutkiewicz. 2026. Early humans developed a traditional symbol system 40,000 years ago. PNAS 123 (9): e2520385123; doi: 10.1073/pnas.2520385123

Source: www.sci.news

NASA Identifies Astronaut Involved in Medical Incident on the ISS

Astronauts Fincke, Cardman, Yui, and Platonov have been stationed on the International Space Station (ISS) since early August, with plans originally extending until late February.

However, following an incident, NASA executives and the agency’s medical director decided to bring the astronauts back to Earth one week early.

Fincke stated, “After a thorough evaluation, NASA has concluded that the safest decision is to return Crew-11 early. It is not an emergency; rather, it is a carefully considered plan to utilize advanced medical imaging technology not accessible on the space station.”

The Crew-11 astronauts departed the ISS on January 14, undocking from the space station in the same SpaceX Dragon capsule that initially transported them. Following an 11-hour journey, the capsule landed in the Pacific Ocean off the San Diego coast during the early morning hours on January 15.

NASA Administrator Jared Isaacman remarked in a post-landing news conference that while there were “serious conditions” in orbit, the crew has remained safe and stable since the incident.

Fincke expressed gratitude towards his Crew-11 teammates, including NASA astronaut Chris Williams and Russian cosmonauts Sergey Kud-Sverchkov and Sergei Mikayev. He also acknowledged the instrumental support from the teams at NASA, SpaceX, and medical professionals at Scripps Memorial Hospital La Jolla.

“Their professionalism and commitment undoubtedly led to positive outcomes,” he added.

Fincke concluded by sharing that he is “doing very well” and is undergoing standard post-flight rehabilitation at NASA’s Johnson Space Center in Houston.

“Spaceflight is an incredible privilege that humbles our humanity,” he said. “Thank you for your continued support.”

Source: www.nbcnews.com

How SpaceX’s 1 Million Satellites Could Bypass Environmental Inspections

SpaceX’s Ambitious Satellite Launch Plans

Charles Boyer/Alamy Stock Photo

As approval deadlines loom, astronomers are working diligently to assess the environmental implications of SpaceX’s request to launch up to 1 million satellites.

On January 30, SpaceX revealed its application to the US Federal Communications Commission (FCC) to deploy a vast constellation aimed at serving as an orbital data center for artificial intelligence, as stated by CEO Elon Musk.

This proposed number far exceeds the roughly 14,500 active satellites currently in orbit. At present, the FCC is not mandated to evaluate the potential environmental consequences of launching such a significant number of satellites, especially regarding their impact on Earth’s atmosphere and nighttime visibility.

“We have serious concerns,” remarked Ruskin Hartley, CEO of DarkSky International. “We support satellite use, but it must be conducted responsibly.”

Following satellite applications, the FCC allows public comments. This process occurred shortly after SpaceX’s proposal, which is swift compared to typical timelines. The deadline for submissions is March 6, after which the FCC might take several months to decide on the application.

More than 350 comments have been submitted thus far, with many astronomers voicing their apprehensions about the implications for astronomy and Earth’s atmosphere. “The idea of a million satellites is incredibly alarming,” noted Samantha Lawler from the University of Regina, Canada.

SpaceX has not disclosed extensive details about the proposed satellites, specifically regarding their sizes and altitudes. This lack of information prevents astronomers from fully understanding the potential impacts of the constellations. “We are hurrying to gather crucial data to submit to the FCC,” Lawler added.

In a worst-case scenario, Lawler suggests that tens of thousands of satellites could be visible to the naked eye simultaneously, greatly obstructing observations from telescopes both on Earth and in space. Furthermore, it would necessitate continual satellite replenishment, likely every five years, similar to SpaceX’s Starlink system. Consequently, an average of one satellite would launch and another would re-enter Earth’s atmosphere every three minutes—significantly more than the current rate of a few re-entries daily.
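
The cadence in that scenario follows from simple arithmetic. This back-of-envelope check assumes, as above, 1 million satellites each replaced roughly every five years:

```python
# Back-of-envelope check of the claimed re-entry cadence.
satellites = 1_000_000
lifetime_years = 5  # assumed replacement cycle, as for Starlink

replacements_per_year = satellites / lifetime_years  # 200,000 per year
minutes_per_year = 365.25 * 24 * 60
minutes_between_reentries = minutes_per_year / replacements_per_year
print(round(minutes_between_reentries, 1))  # roughly one every 2.6 minutes
```

At about 2.6 minutes between re-entries (with launches at the same rate), this matches the article’s figure of one event roughly every three minutes.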

This frequent re-entry poses serious risks to Earth’s atmosphere, as burning satellites produce aluminum oxide, a substance harmful to the ozone layer. “We’re discussing teragrams [1 trillion grams]. This could lead to substantial ozone depletion and alter stratospheric temperatures,” Lawler warned.

The FCC currently lacks the obligation to evaluate the environmental impacts of satellite usage at such a comprehensive scale, due to exemptions under the U.S. National Environmental Policy Act. Should significant concerns arise during the public comment period, the application may face closer scrutiny; however, the likelihood of this remains uncertain, according to Kevin Bell from the Free Information Group in Washington, D.C.

“Ideally, the FCC would conduct assessments, but they often lack the scientific capacity to fully evaluate atmospheric impacts,” Bell explained.

Neither the FCC nor SpaceX has responded to requests for comment.

Source: www.newscientist.com

Discover the Tiny Predatory Dinosaur Lighter than a Chicken

Reconstruction of Arunashetri seropolisiensis

Credit: Gabriel Díaz Yantén, National University of Rio Negro.

The nearly complete skeleton of a small dinosaur, weighing less than a domestic chicken, has provided significant insights into the evolution of the alvarezsaurids and ranks among the smallest dinosaurs ever documented.

This 95 million-year-old fossil, identified as Arunashetri seropolisiensis, was unearthed in 2014 at the La Buitrera site in northern Patagonia, Argentina.

The first specimen of Arunashetri consisted of incomplete hind limb bones found in 2012. Peter Makowiecki from the University of Minnesota contributed to the research on this new fossil. At the time, only fragmentary remains were available, leaving the classification as a probable alvarezsaurid. “We didn’t even know if it was a juvenile or an adult,” says Makowiecki.

“With the entire skeleton now available, we suddenly have all the information required to understand how Arunashetri functioned. Its anatomical structure is both similar to and distinct from other species, providing key insights into the evolution of the alvarezsaurids’ unusual anatomy,” notes Makowiecki.

The new fossil features very elongated and slender hind limbs, along with unexpectedly long forelimbs equipped with three well-developed fingers. Detailed analysis revealed that this dinosaur was an adult, estimated to be at least four years old.

It is believed that it weighed only 700 grams in its lifetime. “These specimens are incredibly small, even smaller than a chicken,” explains Makowiecki.

Previously, alvarezsaurids were thought to be early relatives of birds. Recent findings clarify that while Arunashetri may have superficially resembled a bird, it, along with all alvarezsaurids, is a non-avian theropod. “This new discovery confirms our understanding,” states Makowiecki.

It was once believed that all small alvarezsaurids had short, robust forelimbs with prominent thumbs but reduced lateral digits and small teeth, interpreted as evolutionary adaptations for a diet of ants and termites. However, as Makowiecki points out, “Arunashetri does not fit that mold. Although it is a smaller member of the Alvarezsauridae family, it represents a relatively early branch in the evolutionary tree, hence its teeth and forelimbs remain quite substantial.”

In fact, he adds, its forelimbs are more characteristic of other theropods rather than anteater specialists. “Arunashetri, while smaller, structurally resembles a typical theropod. Considering its size, it likely consumed a significant variety of invertebrates, along with a broader prey spectrum,” he further explains.

This indicates that paleontologists still lack a full understanding of why these dinosaurs evolved to such small sizes. “We are left with a vague understanding that alvarezsaurids successfully occupied a niche for very small predators,” concludes Makowiecki.

Source: www.newscientist.com

Can Infrared Saunas Provide the Health Benefits of Exercise Without Physical Activity?

Infrared saunas are a leading wellness trend, gaining popularity in gyms, spas, and personal homes. Advocates highlight numerous benefits, including enhanced heart health and effective pain relief.

Unlike traditional Finnish saunas that heat the air, infrared saunas use light from infrared bulbs to directly warm your body. This results in a milder temperature of around 60°C (140°F), compared to the 75°C (167°F) typical of traditional saunas.

Infrared saunas are generally compact. If you prefer a less intense heat experience or wish to avoid the crowded atmosphere of Finnish saunas, an infrared sauna might be your ideal option.

You will still sweat, and the heat stress mirrors some of the positive effects of exercise. According to various heat therapy studies, infrared saunas can raise heart rate, lower blood pressure, alleviate muscle tension, and enhance blood circulation. They may also promote the release of endorphins, improving mood while reducing stress.

Research on infrared saunas indicates benefits for cardiovascular health resembling the effects of light exercise, as well as pain reduction, particularly for chronic pain sufferers. Some animal studies suggest that infrared therapy might help decrease inflammation and stimulate mitochondrial activity within our cells.

Discover the health benefits of infrared saunas. – Photo credit: Getty

However, it’s worth noting that comprehensive studies on infrared saunas are few, often limited by participant numbers. Conducting large-scale studies poses logistical challenges and costs.

Despite the lack of extensive evidence, the infrared sauna trend is on the rise. You can install one at home for approximately £3,000 (around $3,900 USD). Additionally, infrared Pilates and yoga classes are emerging, offering a chance to experience the benefits before making a purchase.


This article addresses the query from Ross McDowell of Birmingham: “Should I start using an infrared sauna? Is it safe?”

To submit your questions, please email questions@sciencefocus.com or reach out via Facebook, Twitter, or Instagram (include your name and location).

Check out our Ultimate Fun Facts page for more fascinating science insights.




Source: www.sciencefocus.com

Quantum Computers: Making Encryption 10x Easier to Break

Quantum Computing and Encryption Vulnerability

Quantum Computers: A Threat to Encryption Methods

Blackjack 3D/Getty Images

Recent advances in quantum computing research have cut the resources required to break standard encryption by a factor of ten. With this remarkable reduction, common encryption methods face heightened vulnerability, prompting concerns about future security.

The RSA algorithm, a staple of online banking and secure communications, relies on the difficulty of factoring the product of two large prime numbers. While the possibility of using quantum computers to bypass this challenge has been theorized since the 1990s, the physical size requirements of such quantum systems previously rendered them impractical.
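
To see concretely what RSA’s security rests on, here is a toy sketch in Python (illustrative only; the key is absurdly small): with a tiny modulus, classical trial division recovers the private key instantly, whereas real moduli of 2,048 bits or more put the same attack far out of classical reach. That gap is what Shor’s algorithm on a large quantum computer would close.

```python
from math import isqrt

def crack_toy_rsa(n: int, e: int, ciphertext: int) -> int:
    """Recover a message from a toy RSA key by brute-force factoring n.

    Real RSA moduli are thousands of bits long, which makes this trial
    division hopeless classically; a large quantum computer running
    Shor's algorithm is the shortcut the article describes.
    """
    # Factor n = p * q by trial division (only feasible for tiny n).
    for p in range(2, isqrt(n) + 1):
        if n % p == 0:
            q = n // p
            break
    # Reconstruct the private exponent d from the recovered factors.
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)           # modular inverse (Python 3.8+)
    return pow(ciphertext, d, n)  # decrypt

# Toy key: n = 61 * 53 = 3233, public exponent e = 17.
n, e = 3233, 17
message = 65
ciphertext = pow(message, e, n)   # encrypt with the public key
assert crack_toy_rsa(n, e, ciphertext) == message
```

The point of the sketch is only the asymmetry: encrypting is cheap for everyone, but decrypting without the factors requires factoring n.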

However, this landscape is shifting. In a 2019 study, Craig Gidney of Google Quantum AI outlined a method that lowered the requirement from 170 million qubits to 20 million, and in 2025 he reduced the estimate again to below one million qubits. Most recently, Paul Webster and his Australian team at Iceberg Quantum cut the figure to approximately 100,000 qubits.

Their research expands on Gidney’s algorithm improvements while incorporating a new methodology called qLDPC coding, which enhances qubit connectivity beyond immediate neighbors. This modification increases the overall information density possible in quantum systems.

Based on their findings, the team predicts that cracking a prevalent RSA encryption could become feasible within about a month using 98,000 superconducting qubits—those presently manufactured by tech giants like IBM and Google. To achieve this in just one day, a staggering 471,000 qubits would be necessary.

Some quantum computing firms aspire to develop machines with hundreds of thousands of qubits within the next decade, though such estimates rest on optimistic assumptions about error rates and computational speed. If the Iceberg Quantum approach proves feasible, an entity controlling such a quantum computer could potentially access private emails, bank accounts, and governmental data secured via RSA encryption.

“The stringent requirements pose a significant challenge in hardware manufacturing—the toughest hurdle,” Gidney comments. Similarly, Scott Aaronson of the University of Texas at Austin has expressed concerns on his blog about the practicalities of configuring connections between distant qubits.

IBM has been an advocate for qLDPC coding recently, making strides in making its quantum hardware compatible. However, the extent of success with this methodology remains uncertain. An IBM spokesperson noted that qLDPC codes form the “foundation” of their quantum computing technology but did not elaborate on the feasibility of Iceberg’s innovations.

Facilitating connections between distant qubits is simpler when using extremely cold atoms or ions—two emerging strategies in the quantum computing arena. Yet these systems are often slower, and recent research indicates that unlocking RSA encryption may still require millions of qubits.

“It’s crucial to maintain a flexible perspective on the timeline for such breakthroughs,” states Lawrence Cohen from Iceberg Quantum. “Should RSA be compromised, the fallout could be immense. It’s better to be proactive than reactive.”

Breaking RSA encryption is a well-studied problem, which makes it an excellent benchmark for those pursuing powerful quantum systems. Moreover, the team’s techniques might also enhance simulations of quantum materials and quantum chemistry.

Topics:

  • Safety/
  • Quantum Computing

Source: www.newscientist.com

Artemis Rocket Returns to Hangar for Repairs as Moonshot Plans Are Temporarily Paused

NASA is set to return the massive Space Launch System (SLS) rocket to its hangar for crucial repairs on Wednesday, postponing the launch of four astronauts on the highly anticipated Artemis II mission around the moon by at least a month.

The towering 322-foot SLS rocket has been stationed on the launch pad at Florida’s Kennedy Space Center since mid-January. However, engineers recently identified a blockage affecting the helium flow to part of the rocket’s upper stage, necessitating further investigation.

This rollback means that NASA will miss its planned launch window for the Artemis II mission in March. While officials indicate that a trial launch could potentially happen in April, the exact schedule hinges on the outcomes of the ongoing repairs.

“We recognize that this news is disappointing,” NASA Administrator Jared Isaacman expressed on Saturday. In a post on X, he added, “That disappointment is strongest among the dedicated NASA team that has tirelessly prepared for this monumental mission.”

Returning the rocket to the hangar is a substantial task. The four-mile trek is scheduled to begin Wednesday morning around 9 a.m. ET and, being characteristically slow, may take up to 12 hours. Weighing 11 million pounds, the rocket carries the Orion capsule and is moved by a mobile platform known as a crawler-transporter, advancing at a leisurely pace of about 1 mile per hour.

Once the rocket reaches the hangar, officially known as the Vehicle Assembly Building, the team will establish a platform to facilitate engineers’ access to the site where the helium flow issue was identified.

During the rocket’s stay in the Vehicle Assembly Building, NASA plans to replace and test the batteries for the upper stage and the safety mechanism known as the flight termination system.

Source: www.nbcnews.com

AI Continually Recommends Nuclear Strategies in War Game Simulations

Mushroom cloud after French atomic bomb explodes over Mururoa Atoll, also known as Aopuni

AI Chooses Nuclear Weapons with Alarming Frequency

Credit: Galerie Bilderwelt/Getty Images

Recent studies reveal that advanced AI models exhibit a concerning willingness to deploy nuclear weapons, in contrast to the hesitance humans have shown during geopolitical crises.

Kenneth Payne from King’s College London organized a wargame featuring three prominent language models: GPT-5.2, Claude Sonnet 4, and Gemini 3 Flash. Scenarios encompassed critical international conflicts, including territorial disputes, resource competition, and threats to regime stability.

The AI models operated on an escalation ladder, enabling them to select responses ranging from diplomatic protests to full-scale nuclear warfare. Over the course of 21 wargames, they executed 329 turns and produced around 780,000 words explaining their decision-making processes.
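
An escalation ladder of this kind is easy to picture as an ordered scale. The sketch below is a hypothetical Python rendering (the rung names and sample data are illustrative, not taken from the study), showing how the headline statistic, the share of games in which nuclear weapons were used, would be tallied.

```python
from enum import IntEnum

class Escalation(IntEnum):
    """A hypothetical escalation ladder of the sort given to the AI
    players. Rung names are illustrative, not from the study."""
    DIPLOMATIC_PROTEST = 1
    ECONOMIC_SANCTIONS = 2
    CONVENTIONAL_STRIKE = 3
    TACTICAL_NUCLEAR = 4
    FULL_NUCLEAR_WAR = 5

def share_with_nuclear_use(games: list[list[Escalation]]) -> float:
    """Fraction of games in which any turn reached tactical nuclear
    use or beyond (the quantity reported as 95% in the study)."""
    nuclear = [g for g in games if max(g) >= Escalation.TACTICAL_NUCLEAR]
    return len(nuclear) / len(games)

# Three made-up games, each a sequence of chosen rungs per turn.
games = [
    [Escalation.DIPLOMATIC_PROTEST, Escalation.CONVENTIONAL_STRIKE],
    [Escalation.ECONOMIC_SANCTIONS, Escalation.TACTICAL_NUCLEAR],
    [Escalation.CONVENTIONAL_STRIKE, Escalation.FULL_NUCLEAR_WAR],
]
assert share_with_nuclear_use(games) == 2 / 3
```

Because the rungs are ordered integers, "did this game go nuclear" reduces to a simple maximum over the turns taken.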

In a striking 95% of these simulated engagements, at least one tactical nuclear weapon was deployed by the AI. “Nuclear taboos do not seem as entrenched for machines as they are for humanity,” Payne noted.

Additionally, none of the models opted for full surrender, regardless of their losing positions. Instead, they generally sought to reduce violence temporarily. In 86% of conflicts, unintended escalations occurred beyond initial AI intentions due to miscalculations in the fog of war.

“From a nuclear risk standpoint, these results are alarming,” cautioned James Johnson from the University of Aberdeen. He expressed concerns that AI could amplify one another’s responses, leading to catastrophic outcomes.


This issue is particularly crucial as AI systems are already being integrated into military wargames worldwide. “While significant powers utilize AI in simulations, the extent of its integration into actual military decision-making remains uncertain,” remarked Tong Zhao from Princeton University.

Zhao believes that countries may understandably hesitate to delegate nuclear decision-making to AI. Payne echoes this sentiment, stating, “It is unlikely any nation would entrust a machine with nuclear control.” However, in situations with urgent time constraints, military strategists might be compelled to lean on AI systems.

He questions whether AI’s lack of human fear fully explains its propensity for aggression, suggesting that a more fundamental failure to grasp the stakes of nuclear engagement may also exacerbate the risks.

The implications for mutually assured destruction—the notion that no leader would initiate a nuclear strike due to retaliation—remain unclear, according to Johnson.

When one AI model deployed a tactical nuke, the opposing AI de-escalated only 18% of the time. “AI could enhance deterrence by making threats more credible,” Johnson added. “AI won’t dictate nuclear war, but it could significantly influence the perceptions and timelines that inform human decision-making.”

As of now, leading companies like OpenAI, Anthropic, and Google, which developed the AI models involved in this research, have not commented on these findings. New Scientist has sought their insights.

Topics:

  • War /
  • Artificial Intelligence

Source: www.newscientist.com

Unlocking Longevity: How Rapamycin Could Add Years to Your Life – A High-Stakes Gamble

Illustration of rapamycin molecule

Rapamycin Molecule: Potential for Life Extension

Science Photo Library

The lifespan benefits derived from fasting and rapamycin usage resemble a lottery rather than a guaranteed outcome. While significant lifespan increases have been observed within a year, reanalysis indicates that results can vary significantly among individuals.

Talia Fulton, a researcher at the University of Sydney, says: “[They] may enhance your lifespan marginally, or [they] could dramatically increase it.”

A 2025 study examined 167 research papers across eight non-human species, including fish, mice, rats, and rhesus macaques. Fulton and her team discovered that when these animals were treated with rapamycin, a promising anti-aging compound, or with calorie restriction — known for fostering longevity — they exhibited a longer lifespan on average, suggesting the same potential could extend to humans.

Current research has investigated the varied responses to longevity interventions in individual animals, revealing significant variability in benefits. Fulton notes that while taking rapamycin or implementing dietary restrictions appears “likely to be advantageous, the degree remains uncertain.”

According to her, “Some may experience considerable lifespan extension, while others may see minimal impact, or not outlive their expected lifespan.” This variability creates a somewhat unpredictable environment, meaning these treatments cannot guarantee lifespan extension for all individuals.

Fulton emphasizes that the objective of longevity interventions is to “square” the survival curve: rather than deaths being spread across a wide range of ages, as they are now, most individuals would live long lives before mortality rises sharply. “Squaring the survival curve means a larger number will lead extended and fulfilling lives until around 100, at which point mortality becomes almost certain,” she elaborates.
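
A quick illustrative calculation (with made-up parameter values, not Fulton’s data) shows the difference between a conventional survival curve, where deaths are spread over many ages, and a “squared” one, where nearly everyone survives to about 100 before mortality rises sharply:

```python
import math

def survival_gompertz(age, a=0.0002, b=0.085):
    """Conventional survival curve (Gompertz law): mortality risk
    rises exponentially with age, so deaths are spread widely."""
    return math.exp(-(a / b) * (math.exp(b * age) - 1))

def survival_squared(age, midpoint=100, steepness=0.5):
    """'Squared' curve: survival stays near 1 until ~midpoint, then
    drops steeply (modeled here as a logistic step)."""
    return 1 / (1 + math.exp(steepness * (age - midpoint)))

# Compare the fraction still alive at a few ages.
for age in (60, 80, 95, 105):
    print(age, round(survival_gompertz(age), 3), round(survival_squared(age), 3))
```

Under the squared curve almost everyone is still alive at 80, whereas under the conventional curve only a minority is; beyond the midpoint, both curves collapse toward zero.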

Current findings indicate that dietary restrictions and rapamycin do not effectively square this longevity curve. In this context, Fulton advises holding off on high expectations until further research clarifies who stands to benefit most from these approaches. “We aspire to decode individual genetic variables and life histories, ultimately determining ‘This is precisely what you need to achieve maximum longevity,'” she states.

Researchers like Matt Kaeberlein from the University of Washington stress that squaring the curve does not inherently mean enhanced health profiles. A more compelling consideration, he argues, is whether longevity initiatives, such as exercise, influence “healthspan inequality.”

Originally developed as an immunosuppressant for organ transplant patients, rapamycin inhibits the mTOR protein, essential for cell growth and division. At lower doses, it has demonstrated the potential to extend lifespan in species like flies and mice, potentially by safeguarding against DNA damage.

Topics:

Source: www.newscientist.com

Discovering Diverse Marine Amphibian Communities: Early Triassic Fossils Uncovered in Australia

Recent findings from museum collections in Australia and the United States showcase the diversity of Western Australian trematosaurid temnospondyls, underscoring how early marine amphibians proliferated across the continent shortly after the end-Permian mass extinction.



Ancient marine amphibians Erythrobatrachus (foreground) and Aphanelamma (background) traversed the northern coast of modern-day Western Australia 250 million years ago. Image credit: Pollyanna von Knorring, Swedish Museum of Natural History.

“The catastrophic end-Permian mass extinction and severe global warming gave rise to modern marine ecosystems at the dawn of the Mesozoic Era, around 252 million years ago,” stated Dr. Benjamin Kjaer from the Swedish Museum of Natural History and his colleagues.

“This significant evolutionary milestone marked the early emergence of sea-going tetrapods (limbed vertebrates), including amphibians and reptiles that quickly established themselves as dominant aquatic apex predators.”

“To date, the earliest sea monster fossils have primarily been documented in the Northern Hemisphere.”

“In contrast, the fossil record from the Southern Hemisphere remains geographically sparse and inadequately understood.”

Paleontologists recently analyzed marine amphibian fossils from the renowned Kimberley region of Western Australia’s far north.

“These fossils were uncovered during scientific expeditions in the early 1960s and 1970s,” the researchers noted.

“The specimens were subsequently distributed to various museum collections across Australia and the United States.”

“The results of this research were initially published in 1972, identifying a single species of marine amphibian, Erythrobatrachus nooncambahensis, based on skull fragments discovered at Noonkumba Farm, east of Derby in the Kimberley region.”

“Unfortunately, the original fossil of Erythrobatrachus has since been lost over the past 50 years.”

“This prompted a survey of international museum collections, leading to the rediscovery and reanalysis of these ancient marine amphibian remains in 2024.”

According to scientists, Erythrobatrachus is classified within the trematosaurid family of temnospondyls.

“Trematosaurids bore a superficial resemblance to crocodiles and were related to modern salamanders and frogs, reaching lengths of up to 2 meters (6.6 feet),” the researchers explained.

“These fossils hold significant importance as they were found in rocks deposited as coastal sediments less than a million years after the end-Permian mass extinction.”

“Thus, they represent the geologically oldest currently recognized Mesozoic marine tetrapods.”

However, detailed investigations revealed that the skull fragments attributed to Erythrobatrachus did not belong to a single species but to at least two distinct types of trematosaurids: Erythrobatrachus and another species from the well-known genus Aphanelamma.

“Examination of Erythrobatrachus using advanced 3D imaging indicated the skull measured approximately 40 centimeters (16 inches) when intact, suggesting it was a robust, broad-headed apex predator,” the authors stated.

“Conversely, Aphanelamma were similar in size but featured elongated snouts adapted for catching smaller fish.”

“Both types of trematosaurids occupied the water column yet targeted different prey within the same habitat.”

“Furthermore, the fossils of Erythrobatrachus are uniquely found in Australia, while Aphanelamma has been discovered in similarly aged deposits across regions like the Scandinavian Arctic, Svalbard, the Far East, Pakistan, and Madagascar.”

“The Australian trematosaurid fossils provide evidence that these early Mesozoic marine tetrapods not only radiated swiftly into various ecological niches but also dispersed globally along the coastal margins of interconnected supercontinents during the initial two million years of the dinosaur epoch.”

The team’s study was recently published in the Journal of Vertebrate Paleontology.

_____

Benjamin P. Care and colleagues. Revision of Trematosauridae Erythrobatrachus nooncambahensis: A mysterious marine vertebrate assemblage from the Lower Triassic of Western Australia. Journal of Vertebrate Paleontology, published online on February 22, 2026. doi: 10.1080/02724634.2025.2601224

Source: www.sci.news

Exploring NGC 5134: Webb Unveils a Spiral Star Factory

Astronomers have used the NASA/ESA/CSA James Webb Space Telescope to capture breathtaking infrared images of the spiral galaxy NGC 5134.



This infrared image showcases spiral galaxy NGC 5134, situated approximately 65 million light-years away in the constellation Virgo. Image credits: NASA / ESA / CSA / Webb / A. LeRoy.

The NGC 5134 galaxy is located around 65 million light-years from Earth, making it a significant celestial object in the Virgo constellation.

Also referred to as ESO 576-52, LEDA 46938, and IRAS 13225-2052, NGC 5134 was first discovered by the renowned German-British astronomer William Herschel on March 10, 1785.

This galaxy is a member of the NGC 5084 group, which consists of five galaxies, including NGC 5084, NGC 5087, ESO 576-50, and ESO 576-40.

According to Webb astronomers, “The relative proximity of these galaxies enables Webb to uncover remarkable details about NGC 5134’s tightly coiled spiral arms.”

The latest infrared images of NGC 5134 are derived from observations taken by Webb’s Mid-Infrared Instrument (MIRI) and Near-Infrared Camera (NIRCam).

“MIRI collects mid-infrared radiation emitted by warm dust in NGC 5134’s interstellar cloud, allowing astronomers to track dusty gas clumps,” the researchers noted.

“Some of this dust comprises complex organic molecules known as polycyclic aromatic hydrocarbons, characterized by interconnected carbon atoms, providing insight into the chemistry within interstellar clouds.”

“NIRCam specializes in capturing near-infrared light at short wavelengths from the stars and star clusters dotting the spiral arms of NGC 5134.”

“The combination of MIRI and NIRCam data illustrates a galaxy in a continuous state of change and evolution.”

According to the researchers, “The gas clouds flowing along NGC 5134’s spiral arms are prolific sites for star formation; each new star formed consumes some of the star-forming gas that sustains the galaxy.”

“When a star reaches the end of its life, part of its gas is recycled back into the galaxy, contributing to the cycle of star formation.”

Massive stars, those exceeding eight times the mass of the Sun, end their lives in cataclysmic supernova explosions that disperse stellar material over vast distances.

Other stars, like our Sun, gently return some of their material; they expand into red giants before shedding their atmospheres and releasing gas into space.

Whether expelled by a supernova or a gentle red giant, this gas may eventually be integrated into new star formation processes.

Source: www.sci.news

How Virgin Olive Oil Promotes Cognitive Health by Altering Gut Microbiota

A large prospective cohort study has revealed that older adults consuming more virgin olive oil, a vital element of the Mediterranean diet, experience slower cognitive decline and enhanced gut microbiota diversity over two years. Conversely, higher consumption of common refined olive oil correlates with decreased microbial diversity and accelerated cognitive decline.

Extra virgin olive oil, a cornerstone of the Mediterranean diet, protects against cognitive decline. Image credit: Steve Buissinne.

Virgin olive oil is a key ingredient of the Mediterranean diet, packed with phenolic compounds known for their anti-inflammatory and antioxidant benefits.

While prior laboratory and animal research hinted at neuroprotective effects, human studies linking olive oil, gut microbiota, and cognitive function remain sparse.

This groundbreaking finding stems from participants in the PREvención con DIeta MEDiterránea-Plus (PREDIMED-Plus) study, an extensive ongoing trial designed to explore how dietary and lifestyle changes influence cardiovascular and metabolic health.

“This is the first prospective human study analyzing the role of olive oil in the relationship between gut microbiota and cognitive function,” stated Dr. Giaki Ni, a researcher from Rovira i Virgili University.

Researchers monitored over 650 adults aged 55 to 75, who were overweight or obese and at high risk for cognitive decline, yet cognitively healthy at the study’s onset.

During a two-year period, they assessed participants’ olive oil intake, gut microbiome profiles, and performance on a comprehensive range of cognitive tests.

Increased consumption of virgin olive oil was linked to improved or sustained overall cognition, executive function, and language proficiency.

In stark contrast, high consumption of common refined olive oil appeared to diminish gut microbial diversity and accelerate cognitive decline.

“As cases of cognitive decline and dementia rise, our findings underscore the necessity of enhancing diet quality. Prioritizing extra virgin olive oil over refined options emerges as a simple yet effective strategy to safeguard brain health,” emphasized researchers Nancy Babio and Stéphanie Nisi from Rovira i Virgili University.

To uncover why virgin olive oil may positively impact cognitive function, scientists analyzed baseline stool samples.

Those who consumed higher amounts of virgin olive oil exhibited greater gut microbiota diversity and a more cohesive microbial community structure compared to those who consumed less.

Further analysis indicated that specific gut bacteria may elucidate the cognitive advantages. Changes in the prevalence of particular microbial species, such as Adlercreutzia, appeared to mediate the relationship between virgin olive oil intake and enhanced cognitive performance, reinforcing the concept that diet influences brain health via the gut-brain axis.

“This study highlights that the quality of fats we consume matters as much as their quantity,” remarked Dr. Jordi Salas Salvado, also from Rovira i Virgili University.

“Extra virgin olive oil not only benefits heart health but also plays a vital role in protecting brain function as we age.”

“The discovery that microbial profiles contribute to these benefits opens avenues for new nutrition-based prevention strategies to maintain cognitive function.”

Find out more in the study published in the journal Microbiome.

_____

J.nee et al. 2026. Changes in total and different types of olive oil intake, gut microbiota, and cognitive function in older adults. Microbiome 14, 68; doi: 10.1186/s40168-025-02306-4

Source: www.sci.news

Exploring Cannibalism: Why Some Orcas Prefer Family Pods

Killer Whales Face Cannibalism Risks

François Gouy/VWPics/Alamy

Recent observations by biologists indicate the occurrence of orca-on-orca predation in the North Pacific, suggesting that such cannibalistic behavior may be a reason why certain killer whales travel in extensive family groups.

There are two primary subspecies of killer whales (Orcinus orca) in the North Pacific Ocean. Transient killer whales, commonly known as Bigg’s killer whales, are nomadic, forming dynamic hunting pods to pursue seals, dolphins, and other whales. In contrast, resident killer whales maintain large family-oriented groups and stay close to their maternal ties throughout their lives. These residents disperse to hunt fish individually but reunite for resting or traveling.

Though it is believed that the two subspecies rarely interact, Sergey Fomin from the Russian Institute of Pacific Geography has recorded instances of aggressive encounters. While walking along the eastern shores of Bering Island, he noted bite marks on the dorsal fins of beaked and minke whales, remnants of predation by hungry killer whales. However, during the summer of 2022, he discovered a bloodied orca fin on the beach—its origin would be revealed two years later.

Through genetic analysis, it was found that the fin belonged to a southern killer whale, leading Fomin and colleagues to hypothesize that it was likely consumed by a Bigg’s killer whale.

Most toothed whales, including killer whales, exhibit fluid social structures with their pods changing frequently. The mystery of why southern killer whales form large family units has sparked scientific intrigue. “I’ve been curious about their social structure for a while, as it’s quite unique among species,” notes Olga Filatova from the University of Southern Denmark.

Upon hearing about the findings of the two dorsal fins and the potential for cannibalism, Filatova was intrigued. It’s possible that resident killer whales band together in large numbers for protection. She, along with Fomin and Ivan Fedutin, has published a study discussing this hypothesis.

Killer whales, being apex predators, rarely experience harassment. However, they have been observed being chased off by smaller pods of pilot whales. They are also known to display aggression towards one another. In 2016, Jared Towers of Bay Cetology reported witnessing a pod of Bigg’s killer whales attacking and killing a newborn. Towers speculated that because the calf was not consumed, this aggression was likely aimed at inducing sexual receptivity in the mother.

While it’s uncertain if the whales at Bering Island were cannibalized, Towers believes that the unique social structure of the residents likely serves as a defensive mechanism. Experts cannot dismiss the possibility that the fins were damaged during sparring or that the whales were consumed post-mortem. However, because deceased orcas typically sink, this scenario is less plausible.

Researchers can only theorize the reasons behind cannibalism in killer whales, with Filatova suggesting it may arise from necessity. With fur seals and sea lions being common prey on Bering Island, a shortage of food may prompt whales to consider alternative sources. “When food is scarce and a young killer whale presents itself, what choice do they have?” she remarks.

Topics:

Source: www.newscientist.com

Breakthrough Discovery: Loophole Enables Quantum Cloning Technology

Challenges of Quantum Information Backup

Ruslanas Baranauskas/Science Photo Library/Alamy

In the realm of quantum mechanics, the principle of no duplication for quantum information is considered an unbreakable rule. However, a novel technique for backing up qubits—the fundamental units of quantum computers—may potentially challenge this foundational aspect of physics.

First identified in the 1980s, the no-cloning theorem asserts that a quantum state, which encapsulates all information about a quantum system, cannot be duplicated: attempting to copy the information destroys the fragile quantum properties being measured. This principle is crucial to quantum technologies such as cryptography, where it underpins secure communication protocols in which information cannot be copied and intercepted.

Researchers from the University of Waterloo in Canada have introduced an unexpected breakthrough: the ability to clone a quantum system, provided the information is encrypted and accompanied by a unique one-time decryption key.

Achim Kemp states, “This method allows for the creation of numerous copies to enhance redundancy, yet all copies must remain encrypted, and each decryption key may only be utilized once.” This compliance with the no-cloning theorem assures that only a singular, unambiguous, readable copy of a qubit exists at any point.

Through an exploration of how quantum Wi-Fi and radio stations could function, Kemp and his team stumbled upon this astonishing revelation. Traditional no-cloning principles would inhibit multiple receivers from accessing identical quantum information.

While investigating how random fluctuations and noise affect information copying, the team noticed that these disturbances appeared to sidestep the no-cloning theorem, prompting the question: why does quantum noise seem to circumvent it?

Upon thorough investigation, they concluded that noise could inadvertently serve as an encryption mechanism, disrupting the original signal, yet remaining reversible. When utilized intentionally, this phenomenon can act as a tool for secure information dissemination.
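
A loose classical analogy (this is not the team’s quantum protocol, just an illustration of the idea) is a one-time pad: the ciphertext can be copied freely, but only the single holder of the key can ever produce a readable copy.

```python
import secrets

def make_encrypted_copies(message: bytes, n_copies: int):
    """Classical analogy only: encrypt once with a one-time pad
    (XOR with random bytes), then copy the ciphertext freely.
    Many copies exist, but there is exactly one decryption key."""
    key = secrets.token_bytes(len(message))         # one-time key
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    return [ciphertext] * n_copies, key             # many copies, one key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR with the same key reverses the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

copies, key = make_encrypted_copies(b"qubit state", 100)
assert all(decrypt(c, key) == b"qubit state" for c in copies)
# Without the key, each copy is statistically indistinguishable from noise.
```

The quantum version is stronger in one respect: consuming the one-time key is what guarantees that only a single readable copy of the qubit can ever exist, in keeping with the no-cloning theorem.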

After validating this concept theoretically, the team successfully implemented the protocol on an actual IBM Heron 156-qubit quantum computing processor.

This innovative approach exhibits a level of resilience against the errors and noise characteristic of contemporary quantum computers, enabling the production of hundreds of encrypted clones of a single qubit. “In fact, we maximized our capacity on the IBM processor. Despite housing only 156 qubits, we estimated we could produce over 1,000 clones before triggering error messages,” Kemp explains.

This refinement of the no-cloning theorem holds promise for future quantum cloud storage and computing services. “Similar to how Dropbox ensures a file’s safety by storing it across three distinct geographical servers, this method offers a viable solution for duplicating quantum data,” Kemp adds.

Alex Kissinger from the University of Oxford remarks, “It’s a fascinating quantum cryptographic protocol with ample potential in quantum communications, where redundancy in transmitted information can be invaluable.” However, he emphasizes that this technique should not be misconstrued as cloning. “It signifies a method of dissemination rather than replication,” Kissinger clarifies. “It’s about distributing information so that one recipient can later retrieve it.”

Kemp concurs, asserting, “This isn’t cloning; it’s encrypted cloning—merely a refinement of the no-duplication theorem.”

Topics:

  • Quantum Mechanics/
  • Quantum Computing

Source: www.newscientist.com

Ruxolitinib: Breakthrough Vitiligo Cream That Targets Immune Cells to Restore Skin Pigmentation

vitiligo skin pigmentation on female hands

Vitiligo results in paler, less pigmented skin patches.

Getty Images

A cream targeting the underlying cause of vitiligo is set to become available through the UK’s National Health Service (NHS). In clinical trials, the cream significantly improved pigmentation in the white skin patches associated with the condition. Vitiligo itself is neither painful nor dangerous.

“Typically, individuals with vitiligo exhibit no physical symptoms, but the condition can lead to significant psychological challenges,” stated David Rosmarin from Indiana University, who led two trials for the new ruxolitinib cream treatment.

The cream, already marketed in the United States under the name Opzelura, is indicated for treating non-segmental vitiligo, characterized by symmetrical white patches on both sides of the body. This condition is believed to result from the immune system mistakenly attacking melanocytes—the cells responsible for producing melanin, which gives skin its color.

According to Emma Rush from Vitiligo Support UK, this treatment represents the first rigorously tested medication that directly addresses the mechanisms causing vitiligo. “This is a significant milestone in vitiligo treatment,” she remarked.

Ruxolitinib works by inhibiting Janus kinase (JAK) enzymes involved in the immune attack that destroys melanocytes. Existing treatment options such as steroid creams can restore some pigmentation, but they suppress the immune system more broadly.

A recent study published in 2022 revealed that ruxolitinib enhanced pigmentation and decreased the visibility of vitiligo patches compared to a placebo cream. The effects were noted irrespective of the individual’s skin color (vitiligo tends to be more conspicuous on darker skin tones), and these results were sustained for at least one year after treatment cessation for over a third of participants.

The National Institute for Health and Care Excellence (NICE) had previously reviewed ruxolitinib and concluded that it was not cost-effective enough for NHS provision. It now recommends that the cream be made available to people aged 12 and older with non-segmental vitiligo when other topical treatments have proven ineffective or unsuitable.

Vitiligo affects approximately 1% of the global population, with severity ranging from a few small patches to larger, inflamed, or discolored areas of skin.

“Patients and clinicians may sometimes believe that vitiligo does not require treatment since it is not life-threatening or physically painful,” noted Victoria Eleftheriadou of the British Association of Dermatologists. However, vitiligo can lead to serious complications, including depression and anxiety.

Natalie Umbersley, a vitiligo ambassador for the charity Changing Faces, stated that support groups encourage individuals with visible differences to seek treatment without fear of judgment but expressed her reluctance to pursue ruxolitinib after years of using existing therapies. “I have learned to embrace my unique skin,” she said. “It’s all about celebrating our individuality.”

“While it’s wonderful to have individuals who love their appearance, this is not the reality for everyone,” commented Rush.

While an oral form of ruxolitinib has been used to treat certain cancers and rheumatoid arthritis, it is associated with serious side effects such as lymphoma, heart disease, and infections. However, these risks are not reported with the topical formulation. In two vitiligo trials, ruxolitinib was linked to only mild side effects, including acne and itching. “Systemic absorption is minimal,” Eleftheriadou noted.

Ruxolitinib is also considered to be safer than long-term use of steroid creams, which can cause skin thinning. Additionally, ultraviolet light therapy may be an option for individuals with severe vitiligo, although it is not widely accessible.


Source: www.newscientist.com

Uncovering Marine Fossils on Mount Everest: Ongoing Discoveries at the Summit

Have you ever wondered why fossilized marine life, including trilobites, crinoids, and brachiopods, can be found on the summit of Mount Everest? These astonishing discoveries provide significant insights into the history of our planet.

These fossils are remnants of an ancient sea floor, and their presence high in the Himalayas, which began forming approximately 50 million years ago, is vital evidence supporting the theory of plate tectonics.

Around 200 million years ago, when the supercontinent Pangaea began to fragment, the Indian plate started its journey northward. Its eventual collision with the Eurasian plate elevated land that included remnants of the ancient Tethys Sea floor.

The impact resulted in the creation of the Himalayan mountain range and the Tibetan Plateau, pushing marine fossils more than 8,000 meters (26,000 feet) above sea level.
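Using only the figures quoted above, a back-of-the-envelope calculation (an illustration, not from the article) gives the average rate at which those sea-floor rocks would have been uplifted:

```python
# Rough average uplift rate, assuming the figures quoted in the article:
# fossil-bearing rocks now above 8,000 m, uplift beginning ~50 million
# years ago. This ignores erosion and the episodic nature of real uplift.

ELEVATION_M = 8_000            # present height of the fossil-bearing rocks
COLLISION_AGE_YEARS = 50e6     # approximate start of Himalayan uplift

# convert metres to millimetres, then divide by elapsed years
avg_uplift_mm_per_year = ELEVATION_M * 1_000 / COLLISION_AGE_YEARS
print(f"average uplift: {avg_uplift_mm_per_year:.2f} mm/year")
# prints "average uplift: 0.16 mm/year"
```

Even at well under a millimetre per year, sustained plate convergence is enough to raise a sea floor to the roof of the world.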


This article answers the question posed by Sonia Carroll of Brighton: “Why are there marine fossils on the top of Mount Everest?”

For any inquiries, please contact us at: questions@sciencefocus.com or reach out via Facebook, Twitter, or Instagram (please include your name and location).






Source: www.sciencefocus.com