How Ants Capture Carbon Dioxide to Create Natural Armor

Electron Microscopy Reveals Ants Transforming CO2 into Dolomite

Credit: Li Hongjie

The remarkable ability of certain ants to convert carbon dioxide from the atmosphere into dolomite stones within their exoskeletons offers potential insights into innovative methods for humans to sequester greenhouse gases and mitigate climate change.

Fungus-farming ants, such as Acromyrmex echinatior, forage for vegetation to nourish the fungi cultivated in their colonies, which serve as their primary food source. High densities of ants and fungi can lead to elevated levels of CO2 within their nests.

In 2020, research led by Cameron Currie at the University of Wisconsin-Madison found that this species incorporates carbonate biominerals into its exoskeleton through a unique symbiotic relationship with specific bacteria. These bacteria facilitate the conversion of CO2 into rock via a still poorly understood chemical process.

Recently, the research team identified another species, Sericomyrmex amabilis, residing in Central and South America, capable of achieving this remarkable feat without the assistance of symbiotic bacteria. This makes it the first known animal to evolve such an ability.

Interestingly, the mineral produced by these ants is dolomite, which is notoriously challenging for chemists to synthesize in laboratory conditions. The formation of dolomite rocks, such as those in the Italian Dolomites, requires millions of years and intricate geological processes for the calcium and magnesium atoms to align properly. In stark contrast, ants can accomplish this swiftly and effortlessly, according to Li Hongjie from Zhejiang University in China.

Dolomite is a composite of calcium, magnesium, and carbonate. Its laboratory formation is difficult due to magnesium’s strong bonding with surrounding water molecules, which hinders the integration of magnesium into the calcium carbonate structure, as indicated by Currie. Typically, scientists employ high temperatures and pressures to facilitate this process. The next step for researchers is to unravel how these ants master this extraordinary capability.

For fungus-farming ants, the transformation of CO2 into stone not only strengthens their exoskeletons but also neutralizes harmful CO2 buildup within their nests.

“We uncovered a natural system that has evolved over millions of years to mitigate the buildup of harmful atmospheric carbon dioxide in ant colonies,” Currie remarked.

In their quest to combat global warming, scientists are investigating techniques to convert atmospheric CO2 into carbonate minerals, essentially solidifying carbon into stone. “These ants represent the first known animals to partake in such processes, providing exciting potential as models for human applications,” asserts Currie.

Cody Freas, a professor at the University of Toulouse in France not involved in the study, hailed the ants’ capability to transform CO2 into dolomite as an “extraordinary adaptation.” “These ants function as living carbon scrubbers, converting atmospheric CO2 into a protective mineral armor. This dual strategy aids them in regulating the nest atmosphere and crafting bioengineered physical defenses,” Freas elaborated.


Source: www.newscientist.com

Could These Gases Indicate Extraterrestrial Intelligence?

For over a century, humanity has been searching for signs of intelligent life beyond Earth. This endeavor, best exemplified by the search for extraterrestrial intelligence (SETI), was popularized by Carl Sagan’s 1985 novel Contact, later adapted into a film. Like Sagan’s protagonist, many SETI researchers use telescopes to listen for radio signals from distant civilizations. However, radio waves are only one of the tools scientists employ in the ongoing search for extraterrestrial life.

Astronomers look for measurable indicators of advanced technologies, known as technosignatures. In 1906, astronomer Percival Lowell mapped what he believed were artificial structures on Mars: its famous “canals.” Then, in 1960, physicist Freeman Dyson suggested that advanced civilizations might construct massive structures around stars to harvest their energy, now referred to as Dyson spheres. Although Lowell’s canals turned out to be an optical illusion and Dyson’s idea remains a hypothesis, the quest for technosignatures persists.

Currently, astronomers analyze the chemical signatures of distant planetary atmospheres for indicators of life or advanced technology. Researchers have advocated measuring industrial gases such as CFCs or hydrofluorocarbons to help detect extraterrestrial civilizations on exoplanets. However, given their low atmospheric concentrations on Earth, detecting these gases on other worlds poses a challenge: even under optimal conditions, it could require up to 500 hours of observation time with the James Webb Space Telescope (JWST), the largest space telescope ever constructed.

The team led by Sara Seager at MIT proposed nitrogen trifluoride (NF3) and sulfur hexafluoride (SF6) as potential technosignature gases. Both substances are industrially produced on Earth; NF3 is used for cleaning semiconductors and solar panels, while SF6 insulates transformers and high-voltage equipment, with its atmospheric concentration increasing significantly in recent decades.

Interestingly, the research team initially ruled out biological sources for these gases, as living organisms can produce false positives for technosignatures. Their investigation into Earth’s biogenic chemical database revealed no known organisms that generate NF3 or SF6. In fact, no life forms are recognized to create molecules with nitrogen-fluorine or sulfur-fluorine bonds.

The researchers proposed that Earth’s life forms may avoid fluorine-based molecules because fluorine tends to be locked up in minerals, making it hard to extract. Moreover, these molecules have chemical properties that complicate their use by biological systems: fluorine’s strong electron affinity drives violent reactions with other molecules and produces bonds that are hard to break. This, they argued, suggests that fluorine chemistry may be equally unsuitable for extraterrestrial life.

Next, they examined potential non-biological, or abiotic, sources for these gases, such as tectonic and other geological processes. While NF3 has no known abiotic sources on Earth, volcanic activity does generate minute quantities of SF6. They reasoned that volcanic eruptions releasing SF6 would also emit silicon tetrafluoride (SiF4), a far more abundant volcanic gas; detecting SF6 without accompanying SiF4 would therefore argue against a volcanic origin and strengthen the case for a technosignature.

Finally, the scientists evaluated the feasibility of distinguishing these gases from other atmospheric components on exoplanets. To achieve this, astronomers monitor the exoplanet’s transit in front of its star, measuring the light’s wavelengths that pass through its atmosphere, generating patterns known as a transmission spectrum. Ideally, each peak in the spectrum corresponds to a unique atmospheric gas; however, overlapping or obscured gases can complicate detection.
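To give a feel for the scale of these signals, the sketch below estimates a super-Earth’s transit depth and the extra absorption a thin atmosphere adds on top of it. All parameter values (stellar radius, planet mass and radius, temperature, N2 composition, number of opaque scale heights) are illustrative assumptions, not numbers from the study.

```python
# Toy transmission-spectroscopy estimate. Every parameter value below
# is an illustrative assumption, not a figure from the study.
R_SUN = 6.957e8      # m
R_EARTH = 6.371e6    # m
M_EARTH = 5.972e24   # kg
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23      # Boltzmann constant, J/K
M_H = 1.661e-27      # atomic mass unit, kg

r_star = 0.2 * R_SUN        # assumed M-dwarf radius
r_planet = 1.7 * R_EARTH    # assumed radius of a 5-Earth-mass super-Earth
m_planet = 5.0 * M_EARTH
temp = 300.0                # K, assumed atmospheric temperature
mu = 28.0                   # mean molecular weight of an N2 atmosphere

# Baseline transit depth: fraction of starlight blocked by the opaque disc.
depth = (r_planet / r_star) ** 2

# Atmospheric scale height H = kT / (mu * m_H * g).
g_surface = G * m_planet / r_planet**2
scale_height = K_B * temp / (mu * M_H * g_surface)

# Extra absorption from an atmospheric annulus a few scale heights thick.
n_heights = 5
extra = 2 * r_planet * n_heights * scale_height / r_star**2

print(f"transit depth:      {depth * 1e6:.0f} ppm")
print(f"scale height:       {scale_height / 1000:.1f} km")
print(f"atmospheric signal: {extra * 1e6:.0f} ppm")
```

With these assumptions the atmosphere contributes only a few tens of parts per million on top of a roughly 6,000 ppm transit, which helps explain why hundreds of hours of telescope time can be needed to pull individual gases out of the spectrum.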

Utilizing a computer model called Simulated Exoplanet Atmospheric Spectra, the research team generated a transmission spectrum for a rocky exoplanet approximately five times the mass of Earth, termed a super-Earth, orbiting an M-dwarf star. They simulated three atmospheric compositions, dominated by H2, N2, and CO2 respectively. Their findings revealed that both NF3 and SF6 display spectral signatures distinct from those of the predominant atmospheric gases and could theoretically be detected by the James Webb Space Telescope, albeit at concentrations much higher than those found in Earth’s atmosphere. Next-generation telescopes, such as the Habitable Worlds Observatory and the Large Interferometer for Exoplanets, may be better suited to detecting such signatures.

While Seager and her team view NF3 and SF6 as promising technosignature gases, many uncertainties remain. Our understanding of how these gases behave in Earth’s atmosphere is limited. Additionally, the potential overlap of their transmission spectra with chlorofluorocarbon gases necessitates further studies for signal separation. Scientists also noted the unpredictability of byproducts from extraterrestrial biology. If astronomers were to observe a steady increase in technosignature gases on an exoplanet over a century, it could indicate the presence of an industrialized alien civilization. Astronomers hope to be fortunate enough to witness this evidence.



Source: sciworthy.com

Discover If You’re Truly Cool: Insights from Science

At some point, many of us yearn to be perceived as cool. This pursuit significantly influences our purchases, fashion choices, hobbies, social circles, and even our vocabulary.

Being accepted by a group has its advantages. Research indicates that those deemed cool are often more admired, likable, and viewed as friendly and competent. But what truly defines “cool”?

The idea of coolness has historical roots, with parallels in cultures worldwide, including West Africa and China. In Europe, the concept traces back to the 16th century Italian term sprezzatura, embodying a refined and effortless style (think of the Mona Lisa—her enigmatic smile and poise exemplify this ideal). This form of coolness hinges on nonchalance and mastering the art of making challenges appear effortless.

Perhaps one key to being cool is to be effortlessly yourself. We all recognize the discomfort of trying too hard to impress others (just recall former British Prime Minister Theresa May’s infamous dance video).

Being cool often stems from confidence and a sense of adventure.

Cool slang evolves over time, from rad and hip to swell, dope, fresh, and lit, but “cool” remains timeless.

This term, signifying “fashionable,” originates from African American culture in the 1930s and 1940s, particularly in the jazz music scene.

Jazz musicians with a relaxed playing style were labeled as cool, a term later embraced by bohemian groups like beatniks and hippies in the 1950s and 1960s. Subsequently, the concept of coolness became commercialized, with businesses exploiting it to market everything from apparel to cars.

So, what might define coolness in 2026? A recent study involving around 6,000 participants from six continents outlined the characteristics of individuals considered cool. Findings revealed six core traits: power, hedonism, adventure, autonomy, openness, and extroversion. However, balance is crucial; excessive hedonism or a desperate pursuit of power can disrupt credibility.

The consistency of these traits across cultures suggests that coolness fulfills a universal social function.

Individuals embodying these traits are more likely to challenge the status quo, innovate, and inspire others to embrace new perspectives.

Moreover, simply being perceived as cool can elevate a person’s social status by showcasing their creativity and promoting cultural evolution.

But what if you don’t identify as cool or prefer not to chase that label? The same study identified personality traits regarded as “good” rather than cool, such as kindness, sincerity, friendliness, and warmth. Pursuing these qualities can also leave a lasting impression.


This article responds to the query (posed by Jonathan Schaefer of Wakefield): “What truly makes someone cool?”

For questions, please email questions@sciencefocus.com or connect with us on Facebook, Twitter, or Instagram (include your name and location).

For more exciting scientific discussions, check out our Ultimate Fun Facts page.




Source: www.sciencefocus.com

The Strange Weapons Being Launched into Space: What’s Next?

In February, Germany announced an investment of billions of euros in new military space capabilities, drawing attention not just for its scale but for officials’ unusually candid discussion of its implications.

This initiative includes plans for a network of encrypted communications satellites and the “Inspector” spacecraft, capable of maneuvering close to other satellites. Additional features encompass sensors, tracking devices, and even lasers designed to interfere with adversarial satellites.

Historically, space was perceived as a tranquil environment detached from terrestrial conflicts, serving mainly to support operations on the ground. However, that perception is rapidly changing. Germany now joins an expanding coalition of nations viewing space not merely as infrastructure but as a vital territory requiring active defense and control.

“Adopting a militaristic mindset about orbit can be perilous,” warns Dr. Michael Mulvihill, Vice-Chancellor Research Fellow in Astropolitics at Teesside University.

“Traditionally, space has been viewed as a collaborative domain; however, even in military contexts, its usages were typically confined to communications and reconnaissance.”

This landscape is evolving. From the United States and China to the United Kingdom, France, India, and Japan, multiple governments are investing heavily in military space systems. But what exactly are these nations planning to deploy, and what are the implications of this militarization?

A Misconception of Peace in Space

The notion that space was once a calm arena devoid of political conflict is largely a myth that serves governmental narratives.

The U.S., for example, has long applied a broad interpretation of what constitutes “peaceful” uses of space.

“The overly simplified depiction of the space system as a ‘silent sentinel maintaining peace between superpowers’ has misled many,” states Aaron Bateman, Assistant Professor of History and International Affairs at George Washington University, and author of Weapons in Space.

In reality, both the United States and the Soviet Union have been testing weapons in orbit since the dawn of the space age. For instance, the U.S. operated Program 437, a nuclear-capable anti-satellite system, until 1975, and the Soviet Union is believed to have equipped the Salyut 3 space station with a machine gun that was test-fired in space.

One of the most notorious demonstrations of space weaponry occurred on July 9, 1962, when the U.S. detonated a nuclear warhead 400 km above the Pacific Ocean in the Starfish Prime test, creating an electromagnetic pulse that disabled several satellites and helping to prompt the 1967 Outer Space Treaty, which bans weapons of mass destruction in orbit.

Bateman emphasizes that the changes are more about scale, sophistication, and transparency rather than intent. “Currently, the U.S. government is signaling its military capabilities openly,” he notes.

A prime example is the X-37B, a military spacecraft that recently completed a covert multi-year mission in orbit, with the U.S. Air Force now publicly commemorating its launch—showing a marked shift from previous secrecy.


The End of Conventional Space Warfare?

According to the Secure World Foundation’s 2025 Global Counterspace Capabilities report, four nations—China, the U.S., India, and Russia—have demonstrated the ability to physically destroy satellites.

However, the era of explosive demonstrations may be ending, as destroying satellites is becoming less viable.

“Using kinetic anti-satellite technologies, especially in low Earth orbit, could create significant debris,” warns Stuart Eves, a space consultant with nearly 40 years of experience, including work for the UK Ministry of Defence.
The U.S. Space Force’s X-37B Orbital Test Vehicle remains shrouded in mystery, with few public updates on its missions – Photo credit: VELOZ ALEXANDER/US Space Force
Space debris poses a significant challenge for nations active in space. According to NASA, there are about 500,000 debris objects in orbit ranging from 1 to 10 cm in size; the European Space Agency estimates the number exceeds 1 million.

At the speeds typical of low Earth orbit, a mere 1 cm object carries the kinetic energy of a grenade, Eves notes, which is why countries are reluctant to escalate actions that create further debris.
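To put that comparison in rough numbers, the sketch below estimates the impact energy of a 1 cm aluminum fragment at a typical low-Earth-orbit closing speed. The material, size, and 10 km/s relative velocity are illustrative assumptions, not figures from the report.

```python
import math

# Rough kinetic-energy estimate for a 1 cm debris fragment in LEO.
# Assumed values (not from the article): aluminum density and a
# typical head-on relative impact speed of ~10 km/s.
DENSITY_AL = 2700.0   # kg/m^3
RADIUS = 0.005        # m (a 1 cm diameter sphere)
REL_SPEED = 10_000.0  # m/s, assumed relative impact speed

# Mass of the sphere, then kinetic energy E = (1/2) m v^2.
mass = DENSITY_AL * (4.0 / 3.0) * math.pi * RADIUS**3
energy_j = 0.5 * mass * REL_SPEED**2

# For scale: 1 g of TNT releases roughly 4,184 J.
tnt_grams = energy_j / 4184.0

print(f"fragment mass:  {mass * 1000:.2f} g")
print(f"kinetic energy: {energy_j / 1000:.1f} kJ (~{tnt_grams:.0f} g TNT)")
```

Even a fingertip-sized fragment thus carries tens of kilojoules, on the order of the explosive yield of a few tens of grams of TNT, which is why centimeter-scale debris is treated as lethal to satellites.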
As satellite constellations grow, traditional kinetic attacks make less and less sense. Roughly 16,000 active satellites can be tracked in orbit, of which approximately 10,000 belong to Elon Musk’s Starlink constellation.

“Utilizing classic anti-satellite weapons, like missiles, is unfeasible,” states Bateman. “The high number of satellites allows for quick replenishment, which creates a financial disincentive.”
Evolving Tactics in Space Warfare

So, if direct destruction is off the table, what alternatives are there? Greater sophistication is key. Jamming, or flooding a satellite’s signal with noise, has been a longstanding tactic, but more refined techniques are emerging.

One example is the cyberattack on the Viasat network at the start of Russia’s invasion of Ukraine in February 2022, which knocked out modems across Europe by sending malicious commands, disrupting government and military communications.

Then there are lasers, a technology more nuanced than Hollywood portrayals suggest. Typically, these are directed at the optical sensors of reconnaissance satellites to dazzle or blind them. “A laser aimed at the optics of observation satellites makes them effectively invisible,” explains Mulvihill.

According to the SWF report, Russia’s mobile ground-based laser system, Peresvet, is deployed to protect mobile nuclear missile systems, while China is believed to operate at least five “directed energy” testing facilities.

Ground infrastructure is also a target. Bateman cites a 2022 incident in which a fiber-optic cable connecting mainland Norway to a satellite ground station in Svalbard was severed.

“There’s no need to destructively interfere with a satellite or ground station; interrupting the data link can be just as effective in a conflict scenario,” he states.
Only four nations have demonstrated the capability to physically destroy satellites – Photo credit: Getty

China’s Ascendance and Implications for Middle Powers

If the Cold War was defined by a race between two dominant powers, today’s landscape is far more intricate, though two clear leaders remain: the United States and a rising China.

China has launched over 1,000 satellites in the past decade, more than 510 of which are reportedly equipped for intelligence, surveillance, and reconnaissance, according to the U.S.-China Economic and Security Review Commission.

This backdrop raises questions about the role of middle powers such as Germany. While its investments may seem late, they are strategically sound, Mulvihill asserts.

“NATO’s reliance on the U.S. for space capabilities has become transactional, where states can opt out of services they’ve previously relied on,” he explains.

Germany’s investments in inspection satellites and electronic warfare capabilities are poised to operate within the NATO framework, potentially providing a counterbalance to U.S. constraints.

Cooperation among middle powers could yield significant benefits. Past collaborations, such as the Franco-German partnership on surveillance satellite access, show how these nations can contribute.

Nevertheless, Bateman is skeptical that the adjustment will be smooth. “Historical patterns suggest this transition will be challenging,” he counters.

So, does a world with more space powers make things safer or more dangerous? Likely both, with growing unpredictability. “The situation is more chaotic,” Mulvihill says. “Cooperative zones are fragmenting, with self-interest and transactional politics becoming paramount.”

The reality is that space was never as peaceful as we believed. The difference today is that, as more nations join in, no one is pretending otherwise.

Source: www.sciencefocus.com

Scientists Explore Giant Fire Tornadoes as a Revolutionary Method for Ocean Cleanup

An oil spill at sea is one of the worst kinds of man-made disaster. Surprisingly, a fire whirl may offer an innovative remedy: a recent study suggests it could be an effective way to deal with the aftermath.

When responding to significant oil spills, emergency teams often ignite oil slicks on the ocean surface, burning the oil “in situ” to curb its further spread.

While this approach helps protect marine ecosystems, it simultaneously releases substantial amounts of smoke and toxic soot into the atmosphere.

The inspiration for this method traces back to an unusual incident in Kentucky in 2003, when a spill of around 800,000 gallons of bourbon ignited, creating a 30-meter (100-foot) fire whirl over a lake. Professor Elaine Oran and her team began exploring whether this process could be harnessed deliberately.

“We were joking about what it would smell like,” she shared with BBC Science Focus. “Then we examined the event closely. The larger fire vortex was effectively consuming smaller fire vortices, drawing them in and absorbing them.”

The team constructed a 4.8-meter (16-foot) triple-walled triangular structure at a fire training facility in Texas, featuring a pool of crude oil at its center. When ignited, this setup created a roaring fire vortex approximately 5.2 meters (17 feet) high.

Initial large-scale experiments demonstrate that fire vortices burn spilled oil faster and cleaner than traditional fire pools, showcasing innovative potential for ocean cleanup. – Photo credit: Texas A&M University College of Engineering

Compared with conventional pool fires, the oil burned 40% faster, soot emissions fell by 40%, and up to 95% of the fuel was consumed.

The secret to this efficiency lies in the fire’s spin. Instead of spreading outward, the vortex pulls in oxygen from all angles, allowing for hotter and more complete combustion, akin to a giant incinerator rather than a simple bonfire.

However, harnessing a fire whirl’s power is no easy task. The vortex is unstable; too much wind can cause it to collapse, while insufficient airflow control may revert it to an ordinary pool fire.

Nonetheless, achieving this “Goldilocks zone” in the field is “very realistic,” according to Oran, who envisions deploying a movable barrier structure directly over oil spills at sea.

“This research is more than just an experiment; it offers a glimpse into a future where fire is not merely a destructive force, but a tool to safeguard our oceans and our planet,” she stated.

The findings were published in the journal Fuel.


Source: www.sciencefocus.com

JUICE Spies the Interstellar Comet 3I/ATLAS

ESA’s Jupiter Icy Moons Explorer (JUICE) has unveiled new images of the interstellar object 3I/ATLAS captured by its JANUS scientific camera.



This striking image of interstellar comet 3I/ATLAS was taken by the JANUS camera aboard ESA’s JUICE spacecraft on November 6, 2025, just seven days after the comet’s closest approach to the Sun. At that point, JUICE was approximately 66 million kilometers (41 million miles) from the comet. The inset image enhances the coma structure, with arrows indicating the comet’s direction of motion (blue) and its orientation relative to the Sun (yellow). Image credit: ESA / Juice / JANUS.

The interstellar comet 3I/ATLAS was first identified on July 1, 2025, by the NASA-funded ATLAS survey telescope located in Rio Hurtado, Chile.

This remarkable comet, also known as C/2025 N1 (ATLAS) and A11pl3Z, appears to have entered our solar system from the direction of the constellation Sagittarius.

3I/ATLAS boasts the most dynamically extreme orbit ever recorded in the solar system, underscoring its interstellar origin and exceptional speed.

On October 30, 2025, the comet reached perihelion, its closest point to the Sun, passing within 1.4 astronomical units (210 million kilometers, or 130.5 million miles)—just inside Mars’ orbital path.

Throughout November 2025, the JUICE spacecraft meticulously observed 3I/ATLAS utilizing five scientific instruments: JANUS, MAJIS, SWI, PEP, and UVS.

These instruments collectively gathered crucial information on the comet’s behavior and composition.

“For several months after the observations, JUICE was positioned on the opposite side of the Sun from Earth,” noted members of the JUICE team.

“We utilized the main high-gain antenna as a heat shield, while the smaller medium-gain antenna transmitted data back to Earth at a reduced rate.”

“Consequently, we had to wait until last week to receive the data,” they elaborated.

“Currently, we are diligently analyzing these findings.”

The JANUS camera successfully captured over 120 images of 3I/ATLAS across a broad range of wavelengths.

Researchers are actively studying these images to enhance their understanding of the comet.

Additionally, they are examining spectroscopic data as well as information regarding the comet’s composition and particle characteristics.

“[The latest JANUS image] reveals a jet emerging from the core of 3I/ATLAS, directed away from the Sun,” stated Professor Avi Loeb of Harvard University in his analysis.

“This observation is intriguing because jets typically form from pockets of ice on the surface that are heated by sunlight on the day side, creating jets that initially point toward the Sun.”

“It’s comparable to images captured by amateur astronomers globally during the same period.”

Source: www.sci.news

Uncovering the Mysterious Phenomena Beneath Greenland’s Ice

The ice deep beneath Greenland’s surface is beginning to show intriguing signs of movement, manifesting as unusual plume-like swirls. According to recent studies, understanding this phenomenon is crucial for scientists aiming to predict the behavior of Greenland’s ice as it rapidly melts into the ocean.

The initial discovery of this formation was made in 2014 through radar imaging, although the underlying mechanism remained unclear.

Recent research indicates that thermal convection, a process driving movements within Earth’s molten mantle, may explain these unique formations.

“People often consider ice as a rigid, cold substance,” stated Professor Andreas Born from the University of Bergen, Norway. “Finding that certain areas of the Greenland ice sheet experience heat convection—similar to a pot of boiling pasta—is remarkable and intriguing.”

Convection is a gradual, cyclical movement in which warmer parts of a material rise while cooler parts sink.

In this instance, researchers believe the plume has formed from solid ice over millennia due to heat emanating from deep within the Earth.

“It’s counterintuitive to think that thermal convection could happen within ice sheets,” remarked Dr. Robert Law, a glaciologist at ETH Zurich in Switzerland. “But since ice is significantly softer than Earth’s mantle, these physical principles actually hold up.”
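One way to see why this is at least plausible is through the Rayleigh number, the standard measure of whether buoyancy can overcome viscous and diffusive damping and drive convection. The sketch below plugs in order-of-magnitude values for warm, deep ice; every parameter is an assumed round number for illustration, not a value from the study.

```python
# Order-of-magnitude Rayleigh number for warm basal ice, illustrating
# why convection inside an ice sheet is plausible. All parameter values
# are rough assumptions, not figures from the study.
RHO = 917.0      # kg/m^3, density of ice
G = 9.81         # m/s^2, gravitational acceleration
ALPHA = 5e-5     # 1/K, approx. thermal expansion coefficient of ice
DELTA_T = 30.0   # K, assumed temperature contrast across the layer
D = 2000.0       # m, assumed thickness of the convecting layer
KAPPA = 1e-6     # m^2/s, approx. thermal diffusivity of ice
MU = 1e14        # Pa*s, assumed effective viscosity of warm ice

# Ra = rho * g * alpha * dT * d^3 / (kappa * mu)
rayleigh = RHO * G * ALPHA * DELTA_T * D**3 / (KAPPA * MU)
print(f"Ra ~ {rayleigh:.0f}")
```

With these round numbers the Rayleigh number lands near the critical value of roughly 1,000 to 2,000 at which convection can begin, consistent with the idea that convection in ice is marginal but possible where the ice is warm and soft.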

To explore whether convection could lead to the creation of these enigmatic plumes, Dr. Law and his research team constructed a digital model of the Greenland ice sheet, employing a simulation typically used for Earth’s mantle convection.

After adjusting parameters like ice thickness, softness, and movement, the model successfully generated rising ice columns that mirrored the shapes observed in Greenland.

Law elaborated to BBC Science Focus that the relatively stable, low-snow environment in northern Greenland likely provides the perfect insulation, fostering the creation of these structures over thousands of years.

Greenland’s ice is melting at an alarming rate. Research from the University of Barcelona indicates water production has surged more than sixfold since 1990, escalating from 12.7 gigatons per decade to 82.4 gigatons per decade – Credit: Getty

This study enhances scientists’ understanding of ice properties that are challenging to measure directly.

“Acquiring data on ice properties, especially within deep ice sheets, is exceptionally difficult,” Dr. Law explained.

“This innovative approach yields invaluable insights that are not accessible through other means. Our findings suggest that ice is softer and more sensitive to stress than previously assumed. However, further exploration is necessary to confirm these conclusions.”

This discovery is critical because Greenland’s ice sheet, spanning over 1.7 million square kilometers (approximately 650,000 square miles), holds significant implications for global sea levels. If it were to melt entirely, sea levels could rise by as much as 7.4 meters (24 feet), according to estimates from the U.S. National Snow and Ice Data Center.
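The 7.4-meter figure is consistent with simple back-of-the-envelope arithmetic: spread the ice sheet’s melted volume over the global ocean. The ice volume and ocean area used below are rough literature values assumed for illustration, and the calculation ignores effects such as ocean-area change and thermal expansion.

```python
# Back-of-the-envelope check of Greenland's sea-level-rise potential.
# Ice volume and ocean area are rough assumed values, not study figures.
ICE_VOLUME_KM3 = 2.9e6    # km^3, approximate Greenland ice sheet volume
RHO_ICE = 917.0           # kg/m^3
RHO_WATER = 1000.0        # kg/m^3
OCEAN_AREA_KM2 = 3.61e8   # km^2, global ocean surface area

# Convert ice volume to liquid-water volume, then spread it over the ocean.
water_volume_km3 = ICE_VOLUME_KM3 * RHO_ICE / RHO_WATER
rise_m = water_volume_km3 / OCEAN_AREA_KM2 * 1000.0  # km -> m

print(f"estimated sea-level rise: {rise_m:.1f} m")
```

This simple estimate lands close to the cited 7.4 meters, showing that the headline number follows directly from the ice sheet’s sheer volume.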

In another recent study, the University of Barcelona revealed that the ice is melting at an unprecedented pace.

Dr. Josep Bonsams, a geography researcher at the University of Barcelona, told BBC Science Focus: “The Greenland Ice Sheet is experiencing more frequent, larger, and more intense extreme melt events than in previous decades. Most of the top 10 extreme melt years have occurred since 2000. Melting in Greenland, one of the largest reservoirs of frozen water on Earth, significantly contributes to global sea level rise, making urgent international climate action essential.”

Dr. Law mentioned to BBC Science Focus that his research insights will influence the future outlook for both Greenland and global climate patterns.

“The plume itself does not indicate that we should expect the ice sheet to collapse sooner than current predictions suggest,” he clarified.

“These formations resemble ancient artifacts: thicker, colder, and more stable ice sheets that originated from the last ice age. Nonetheless, the physics of ice remains poorly understood. With every advancement in physical comprehension, we can better forecast the rate of ice sheet melting and the implications for sea level rise.”

Dr. Law expressed his hope that those who engage with his research will share the same wonder for nature and the Greenland ice sheet that inspired his team during their studies.


Source: www.sciencefocus.com

Unraveling the Mystery of Underwater ‘Panda Skeletons’: What Experts Can’t Explain

A few years ago, scuba divers exploring the coral reefs near Kume Island in Japan’s Ryukyu Islands made an astonishing discovery: what looked like a graveyard of tiny panda skeletons. Unlike typical skeletons, though, each appeared to have a living panda head still attached.

Each of these fascinating creatures measures no more than 2 cm (about 0.8 inches) long, roughly the length of a fingernail. One end sports a white “head” complete with a black nose spot and two panda-esque eye patches.

Their transparent bodies reveal stacks of white horizontal lines resembling bony ribs, and a distinct black dot at one end that appears to serve as a tail. It’s a truly bizarre sight.

The divers identified these peculiar creatures as a species of ascidian, commonly known as sea squirts. Shortly after the photos circulated on social media, they garnered nicknames like skeleton panda squirt—and in Japanese, they’re called “Panda Skeleton Hoya.”

The intriguing online buzz attracted the attention of sea squirt expert Dr. Naohiro Hasegawa from Hokkaido University in Japan. Upon examining the photos, Dr. Hasegawa quickly realized this sea squirt was distinct from previously known species and initiated research on this rare find.

A dedicated fan of the Panda Skeleton Squirt even contributed to a crowdfunding campaign to fund a diving trip to Kume Island.

With assistance from local fishermen, the divers successfully collected four groups of these sea squirts from depths of 10 to 20 meters (approximately 30 to 65 feet).

Back in the lab, Dr. Hasegawa confirmed that this panda skeleton squirt was unique enough to warrant its own species designation: Clavelina ossipandae.

The genus name Clavelina, first used over 200 years ago, means “little bottle,” which aptly describes the creature’s transparent, bottle-shaped body, known as a zooid.

A related species, the light-bulb sea squirt (Clavelina lepadiformis), can be found along rocky coastlines throughout Europe and resembles a small light bulb.

The new species name, ossipandae, combines “panda” with the Latin word for bone, os (genitive ossis).

Distinguishing features of Clavelina ossipandae include its unique white “ribs,” which are actually blood vessels, and intriguing black “eye” markings whose function remains unidentified.

Despite their eerie black and white markings, panda skeleton squirts are not related to fluffy pandas – Credit: Getty

Like other sea squirts, the skeleton panda sea squirt is a colonial animal that feeds by filtering water through siphon tubes, extracting food particles as the water passes over its mucus-covered gills.

This process results in the expulsion of water through another siphon, hence their common name. Interestingly, some sea squirts eject jets of water when removed from their aquatic habitat.

However, sea squirts do not remain attached to rocks for their entire lives. They begin life as tadpole-like larvae, swimming freely before anchoring themselves to the ocean floor.

In their larval stage, sea squirts belong to the chordate group, which includes mammals and other vertebrates. Ascidian larvae possess a nerve cord along a rod-like structure, known as a notochord, which resembles the development in vertebrate embryos.

So, while C. ossipandae may be small and lacking fur, it bears some intriguing similarities to its namesake black-and-white pandas.


If you have any questions, please email us at: questions@sciencefocus.com or reach out via Facebook, Twitter, or Instagram (please include your name and location).

Explore our ultimate fun facts and discover more amazing science content!




Source: www.sciencefocus.com

Adorable Seal Pups Mimic Human Speech and Accents: Discover Their Unique Sounds!

Recent studies reveal that seal pups produce more human-like sounds than previously believed, often taking turns “communicating” by adjusting their calls to match their neighboring pups. This fascinating behavior sheds light on the evolution of complex communication, including human language.

Harbor seals, also known as common seals, are among the few animal species capable of learning and altering their vocalizations.

“They can learn to create new sounds or modify existing ones,” explains Dr. Koen de Reus of Radboud University and the Vrije Universiteit Brussel, whose research forms part of his Ph.D. dissertation, speaking to BBC Science Focus.

Every talkative harbor seal has its own distinct calls, which mothers utilize to locate their pups on busy beaches. This study examines how seals modify their calls based on social contexts.

https://c02.purpledshub.com/uploads/sites/41/2026/02/Seal-pup-conversation.mp4
During testing, Jenny the seal’s responses were monitored as recordings of other pups were played.

Dr. de Reus found that the calls of pups sitting together became increasingly similar over time. “This phenomenon resembles regional accents in humans,” he stated. “Despite their visual similarities, each pup can be recognized individually, just as in humans.”

Additionally, akin to polite human conversation, the pups engage in turn-taking without overlapping in communication.

To conduct his research, Dr. de Reus analyzed thousands of hours of audio from numerous harbor seal pups at Sealcentre Pieterburen in the Netherlands.

“After spending extensive time with the pups, I could identify at least half of their calls,” he shared.

This study aims to uncover the subtleties of communication shared across species and those unique to humans, potentially revealing the intricate history of human language development.

“Language is often regarded as a unique trait that sets us apart from other species, yet our findings indicate the existence of advanced communication systems in various animals,” Dr. de Reus continued. “Consider this research a foundational step for future comparisons.”

This seal was recorded at a rehabilitation center that cares for orphaned and injured seals until their release back into the wild – Credit: Getty



Source: www.sciencefocus.com

Webb’s Infrared Vision Uncovers Planetary Nebula That Looks Like a Celestial Brain

The remarkable sensitivity of the NASA/ESA/CSA James Webb Space Telescope in near- and mid-infrared light offers new insights into PMR 1, a little-explored nebula in the constellation Vela.



These Webb images depict PMR 1, a planetary nebula located about 5,000 light-years away in the Vela constellation. Image credit: NASA/ESA/CSA/STScI/Joseph DePasquale, STScI.

PMR 1 is a fascinating planetary nebula situated approximately 5,000 light-years from Earth in the constellation Vela.

Also known as IRAS 09269-4923, this nebula was previously captured in infrared light by the now-retired Spitzer Space Telescope in 2013.

The advanced technology of the Webb Telescope reveals striking details that enhance the nebula’s brain-like appearance.

According to Webb astronomers, “The nebula exhibits distinct regions that illustrate various stages of its evolution; the outer shell, largely composed of hydrogen, is blown out first, while the inner cloud is more refined, containing a mix of gases.”

“Webb’s NIRCam (Near-Infrared Camera) and MIRI (Mid-Infrared Instrument) identify unique dark lanes traversing vertically through the center of the nebula, accentuating the brain-like shape of its left and right hemispheres.”

“These dark lanes may be linked to an explosive event or outflow from the central star, often triggered by twin jets moving in opposite directions.”

“This phenomenon is notably apparent at the top of the nebula in Webb’s MIRI images, where gas seems to be jetting outward.”

Despite the mysteries still surrounding this nebula, “it is evident that it was formed by a star nearing the end of its fuel-burning phase,” the astronomers added.

“During this final phase, the star sheds its outer layers, a dynamic process that occurs relatively quickly from a cosmic viewpoint. Webb captured this crucial moment in stellar evolution.”

“The ultimate fate of the star hinges on its mass, which is still undetermined.”

“If the star is massive enough, it will eventually go supernova.”

“Conversely, a less massive, Sun-like star will continue shedding layers and cooling until only a dense white dwarf remains.”

Source: www.sci.news

How Aardvarks are Adapted to Consume 50,000 Ants Each Night

As the conversation around eating insects gains traction, we can learn from aardvarks (Orycteropus afer), gourmet consumers of African ants. This fascinating mammal can devour up to 50,000 crunchy ants in a single night.

Aardvarks primarily feast on ants and termites, with the occasional “aardvark cucumber” adding some variety (more on that later).

Why focus on ants and termites? Their collective biomass outweighs all wild mammals by a factor of 10, making them an abundant food source.

Known as the African ant bear, the aardvark excels in locating these protein-packed snacks by invading ant nests and termite mounds.

Equipped with sturdy claws, aardvarks dig through resilient structures, using their strong leg bones to support the strain of excavation.

While ants may respond with aggression, swarming and biting, aardvarks have thick skin that withstands these defenses. Their long, pig-like snouts dive into nests, allowing them to sip their treats like a milkshake.

Aardvarks cleverly close their nostrils to prevent inhaling dust. Additionally, specialized salivary glands release a generous amount of sticky saliva, coating their 30-cm (12-inch) long tongues, making it easy to collect ants.

The ants cling to their tongues as if caught on flypaper. Once swallowed, the food moves to the gizzard-like stomach, where muscular walls crush it.

Chewing is minimal, but aardvarks possess unique teeth. Adults have approximately 20 nail-like teeth that grow continuously and wear down over time.

These teeth consist of hundreds of small hexagonal tubes made of a tissue called vascular dentine.

Relatively soft due to the absence of enamel, these teeth are ill-suited for crushing but perfect for lightly mashing the aforementioned aardvark cucumber.

The aardvark cucumber is an edible fruit, growing from a low vine. Its life cycle relies on aardvarks for seed maturation and dispersal through feces. In return, aardvarks enjoy a juicy, hydrating snack.

Aardvarks can consume up to 50,000 ants in one night – Credit: Getty

It’s intriguing that ant-eating mammals, including aardvarks, anteaters, and pangolins, have independently evolved this trait at least 12 times since the dinosaurs went extinct 66 million years ago.

This phenomenon, known as convergent evolution, shows how different species can develop similar characteristics in response to the same challenges.

Faced with the question “How can I eat all these ants?”, they have all adapted with sticky tongues, strong forelimbs, and fewer teeth.

These ant-eating mammals are akin to a recurring trend, much like mullets, showcasing evolution’s penchant for clever adaptations.






Source: www.sciencefocus.com

Fossil Amber Unveils Ancient Ant Ecological Interactions with Other Organisms

Fossils preserved in amber are not only exquisite but also provide insights into ancient ecological interactions, including potential parasitism and symbiotic relationships between ants and mites. This revelation comes from a groundbreaking morphological study analyzing six amber specimens: Baltic, Dominican, and Burmese.



Fossils of an ant colony preserved in Baltic Sea amber from Lithuania. Image credit: José de la Fuente & Agustín Estrada-Peña, doi: 10.3389/fevo.2026.1724595.

“Inclusions in amber reveal potential interactions between various organisms that shaped prehistoric environments,” stated paleontologist Dr. Jose de la Fuente from the Game and Wildlife Research Institute.

“The identification and morphological analysis of fossil ants and other insects in amber offer a glimpse into life on Earth millions of years ago.”

In this pioneering study, de la Fuente and colleagues examined four pieces of Cretaceous amber (dating back 99 million years), one Eocene amber (approximately 56 to 34 million years ago), and one Oligocene amber (roughly 34 to 23 million years ago).

The specimens contained ancient ants alongside other organisms, a rare phenomenon known as syninclusion, in which multiple organisms are preserved together in the same piece of amber.

“The earliest ants, known from the late Cretaceous period, were stem ants, which left no modern descendants; all living ants belong to the crown group,” the researchers emphasized.

“Both ant types are present in the six amber specimens we investigated, including hell ants, a lineage of stem ants.”

The researchers utilized advanced microscopy to identify various species and document the distances between ants and other organisms in the specimens.

In three of the six amber pieces, ants were discovered in close proximity to mites.

The first specimen revealed crown ants, a wasp, and two ticks in close association, suggesting the ticks may have been traveling on the ants.

The second piece showcased stem ants alongside spiders, while the third contained hell ants, snails, millipedes, and numerous unidentified insects.

The fourth specimen featured a stem ant and a mite approximately 4 mm apart.

The fifth amber fragment included three distinct types of ants related to mites and termites, as well as poorly preserved mosquitoes and winged insects.

In the sixth sample, stem ants were found alongside wasps and spiders believed to be parasitic. The ants appeared to be feeding, resting against another inclusion that might be a worm or larva, but no interaction was evident, so the proximity could be coincidental.

“The closest co-inclusions of ants likely reflect behaviors and interactions between these organisms,” Dr. de la Fuente noted.

“The ant-mite interaction observed in the fourth specimen may indicate two potential scenarios.”

“First, a symbiotic relationship in which the mite hitches a ride on the ant to disperse to new habitats; second, parasitism, in which the mite feeds on the ant host during transport.”

While amber fragments featuring ants are scarce, those with multiple species are even rarer. Existing evidence suggests interactions between ants and mites may sometimes be mutually beneficial.

Future studies could clarify these interactions using micro-CT scans to explore attachment structures that may facilitate the mites’ travel on ants.

“Advanced imaging techniques are essential for enhancing the analysis of interactions among diverse organisms in fossil amber inclusions,” concluded Dr. de la Fuente.

For more details, read the research team’s paper published today in Frontiers in Ecology and Evolution.

_____

Jose de la Fuente and Agustín Estrada-Peña. 2026. Description of fossil amber containing ant co-inclusions. Front. Ecol. Evol. 14; doi: 10.3389/fevo.2026.1724595.

Source: www.sci.news

NASA Unveils Comprehensive Revamp of Artemis Moon Program: Key Updates & Future Plans

NASA officially announced a significant transformation of its Artemis moon program on Friday. This “course correction” aims to enhance mission frequency and include additional launches in preparation for the anticipated 2028 lunar landing.

According to NASA Administrator Jared Isaacman, these adjustments will bolster safety, minimize delays, and ultimately facilitate President Donald Trump’s vision of returning astronauts to the moon while establishing a sustained presence there.

“Consensus indicates this is the only viable path forward,” Isaacman stated during a press conference on Friday. “I have had similar discussions with all Congressional stakeholders, and they are fully aligned with NASA’s approach. This is how NASA has historically transformed the world, and it’s how we’ll do it again.”

Mobile Launcher 1, equipped with the Artemis II Space Launch System (SLS) and Orion spacecraft, rolls back to the Vehicle Assembly Building from Launch Pad 39B at Kennedy Space Center at dusk on February 25, 2026, in Cape Canaveral, Florida.
Greg Newton/AFP – Getty Images

Isaacman revealed that the Artemis III mission, which was initially planned for a lunar landing in 2028, will now focus on technology demonstrations in low Earth orbit instead. The aim is to launch Artemis III by mid-2027 for essential rendezvous and docking tests with commercial lunar landers from both SpaceX and Blue Origin.

Subsequently, Artemis IV is slated for a moon landing in 2028.

This new direction could rejuvenate the nearly decade-old Artemis program, which has faced numerous challenges, including significant cost overruns and delays—most recently, a one-month postponement of the Artemis II mission intended to send astronauts on a 10-day lunar orbit.

Isaacman noted that insights gained from Artemis II led to the recognition that the progression from lunar orbit to landing in Artemis III was “too vast,” particularly given the SLS rocket and Orion spacecraft’s infrequent launches, currently no more than once every three years.

NASA’s Artemis II SLS rocket.
NASA

“As crucial as rocket launches are, conducting them every three years is not a recipe for success,” he noted. “Frequent launches are essential, as extended intervals result in skill degradation and lost operational experience.”

Administrators highlighted similar issues with hydrogen and helium encountered during both Artemis I (an unmanned test flight launched around the moon in 2022) and Artemis II, stressing the difficulty of identifying root causes, likely exacerbated by extended mission gaps.

Two commercial space firms, SpaceX, led by Elon Musk, and Blue Origin, founded by Jeff Bezos, are competing to build lunar landers for the Artemis program. In a recent statement on X, SpaceX affirmed its shared goal with NASA: to return to the Moon safely and efficiently.

“Regular human exploration flights are key for establishing a sustainable human presence in space,” the company stated.

Blue Origin also expressed enthusiastic support for the revisions. “Let’s move forward! Everyone plays a role!” the company posted on X.

Among its mission revisions, NASA indicated it would standardize the manufacturing of Space Launch System rockets and strive for booster launches every 10 months, instead of the previous three-year interval.

While other rocket configurations were planned for later Artemis missions, NASA Deputy Administrator Amit Kshatriya noted that those configurations were deemed “unnecessarily complex.”

“Too much learning and testing potential has been left unexplored, leading to excessive risks in both development and production,” Kshatriya stated in a press release. “Our focus now is to continue testing as though we are in production.”

Isaacman concluded that while these changes represent a significant shift for NASA, they should not be unexpected to contractors or stakeholders within Congress and the Trump administration.

“President Trump is passionate about space and played a pivotal role in the creation of the Artemis program,” he remarked. “This initiative is a priority for his administration.”

This overhaul follows additional delays to the Artemis II mission. A hydrogen leak discovered during a critical refueling test prompted NASA to forfeit all possible launch opportunities this month. Though a subsequent refueling test proceeded smoothly, engineers later identified a blockage affecting helium flow to the booster’s upper stage, thwarting plans for a March launch.

NASA has since transported the rocket from its launch pad at Kennedy Space Center in Florida back to its hangar for necessary repairs. Officials anticipate that if the repairs proceed as planned, Artemis II could launch as early as April.

Source: www.nbcnews.com

Exclusive Sneak Peek: NASA’s Spacesuit Testing for Upcoming Moon Mission

Astronauts practice tasks in simulated lunar environments wearing advanced spacesuits

Credit: NASA

As NASA prepares for a groundbreaking mission to return humans to the moon, astronaut safety remains a paramount focus. The image above showcases a NASA crew testing cutting-edge spacesuits developed by Axiom Space, a Texas-based aerospace company.

The Axiom Extravehicular Mobility Unit is engineered to enhance astronauts’ mobility and flexibility, enabling them to efficiently navigate the lunar terrain and gather geological samples.

Axiom Space completed an internal review of these innovative spacesuits, and NASA is now evaluating readiness for the upcoming Artemis III mission, set to launch in 2028. This historic mission aims to land humans at the moon’s south pole, the first crewed lunar landing in over 55 years.

“This achievement reflects our unwavering commitment to providing a safe and efficient lunar spacesuit, empowering astronauts to explore the moon’s surface,” stated Lara Carney, NASA’s manager of extravehicular activities and human surface mobility programs at the Johnson Space Center in Houston, Texas.

To date, the Axiom suit has undergone over 850 hours of rigorous pressure testing, simulating moon conditions with astronauts inside. In the training process, crew members practice emergency rescue scenarios in a 40-foot-deep pool, with the suit’s weight tailored to match the moon’s gravity, which is about one-sixth of Earth’s gravitational pull.
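The “one-sixth of Earth’s gravity” figure translates directly into the target weight used when ballasting the suit for pool training. A minimal sketch of that arithmetic, assuming an illustrative combined mass (not an Axiom or NASA specification):

```python
# Sketch of the lunar-weight target for pool training.
# The 150 kg suit-plus-astronaut mass is an assumption for illustration,
# not a published Axiom Space figure.
G_EARTH = 9.81          # m/s^2
G_MOON = G_EARTH / 6.0  # the article's "about one-sixth" approximation

def weight_newtons(mass_kg: float, g: float) -> float:
    """Gravitational force on a mass under acceleration g."""
    return mass_kg * g

suited_mass_kg = 150.0
earth_weight = weight_newtons(suited_mass_kg, G_EARTH)
lunar_weight = weight_newtons(suited_mass_kg, G_MOON)

# In the pool, buoyancy and added ballast are trimmed so the net downward
# force on the trainee matches the lunar value, not the terrestrial one.
print(f"Earth weight: {earth_weight:.0f} N, lunar target: {lunar_weight:.0f} N")
```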

Exploring the History and Future of Space Exploration in the United States

Embark on an extraordinary journey through America’s significant space and astronomy landmarks, crafted for inquisitive minds and lifelong learners.

Topics:

Source: www.newscientist.com

Revolutionary Brain Cells on a Chip Master Doom in Just One Week


Human Neurons Playing Doom on a Chip

Credit: Cortical Labs

A cluster of human brain cells has been demonstrated to play the classic game Doom. While the performance doesn’t yet match human ability, experts believe this breakthrough gets us closer to practical applications for biological computers, such as controlling robotic arms.

In 2021, researchers at Cortical Labs built a computer chip containing living neurons and taught it to play the video game Pong. The chip, comprising over 800,000 living brain cells on a microelectrode array, was capable of both sending and receiving electrical signals, and the researchers meticulously trained it to move the paddle at the screen’s edge.

Recently, Cortical Labs introduced an easier interface for programming these chips using the widely used programming language Python. Independent developer Sean Cole used this interface to teach the chip to play Doom in about a week.

“Unlike the Pong project, which involved years of rigorous scientific labor, this new demonstration was achieved in mere days by people with limited prior experience in biology,” said Brett Kagan of Cortical Labs. “This accessibility and flexibility is incredibly exciting.”

The chips used for the Doom demonstration contained roughly a quarter as many neurons as the original Pong chip. They played Doom better than random input would, though performance still lagged behind that of top human gamers. However, the chips can learn significantly faster than conventional silicon-based machine-learning systems, and new learning algorithms are expected to enhance their performance, according to Kagan.

Comparing these biological chips to the human brain can be misleading, he suggests. “While it is indeed living tissue, the mechanisms it employs for information processing are dissimilar to those of silicon,” he explains.

Doom poses a substantial challenge compared with prior example games, and the ability to engage with it successfully marks a significant advance in controlling and training living neural systems, says Andrew Adamatzky from the University of the West of England, Bristol, UK.

Researchers such as Steve Furber from the University of Manchester concur, noting that the ability to play Doom represents significant progress. He also pointed out that many unanswered questions remain regarding how the neurons comprehend gameplay expectations and how they interface with a screen without visual organs.

Regardless, the leap in capabilities is promising. Yoshikatsu Hayashi from the University of Reading is working towards practical applications such as using biological computers to control robotic arms. His team is experimenting with a similar computer made of jelly-like hydrogel. “[Playing Doom] serves as a simpler analogy for controlling an entire arm,” Hayashi says.

“The significance here goes beyond just biological systems playing Doom,” adds Adamatzky. “It demonstrates the potential to navigate complexities, uncertainties, and real-time decision-making – skills essential for future biological or hybrid computing solutions.”

Topics:


Source: www.newscientist.com

NASA’s Artemis Moon Exploration Program: Major Reforms and Enhancements Unveiled

NASA’s Space Launch System

Credit: NASA/Cory Houston

NASA is re-evaluating its Artemis moon exploration program. During a press conference on February 27, NASA Administrator Jared Isaacman revealed significant adjustments to the plans for sending humans to the moon for the first time since the Apollo program concluded in 1972.

The upcoming Artemis II mission, set to launch soon, has run into trouble in two recent tests. The Space Launch System (SLS) rocket suffered fuel leaks during tanking, necessitating a return from the launch pad for thorough analysis and repairs. The SLS last launched in 2022.

Artemis II aims to orbit astronauts around the moon in preparation for a crewed landing in the Artemis III mission, though that goal has now shifted. Artemis III will focus on testing the Orion crew capsule’s docking capabilities with the lander in lunar orbit, along with evaluating the spacesuit for eventual moon landings.

Despite these seemingly negative developments, NASA has laid out plans to increase launch frequency. The revised approach aims for Artemis IV and potentially Artemis V to achieve lunar landings by 2028.

“The entire series of Artemis flights should represent a gradual build-up of capability, with each step advancing our readiness for landing missions,” stated NASA official Amit Kshatriya in a recent statement. “Each phase should be substantial enough for progress, yet measured to avoid unnecessary risks based on our experiences thus far.”

Initially, there were plans to upgrade the SLS rocket’s upper stage for future endeavors. However, Isaacman highlighted a shift towards a “standardized” version, minimizing significant changes for every few missions. “We don’t aim for each rocket to be a work of art,” he said in the press briefing.


These changes mark a shift in the Artemis program’s philosophy, prioritizing thorough testing of every component of the rocket and mission strategy. The aim is swift, small steps rather than large leaps every few years. Isaacman expressed optimism that this approach will reduce the delays that have historically burdened the Artemis program, ultimately making lunar exploration safer and more efficient.

Topics:

Source: www.newscientist.com

How Stem Cell Infusions from Young Donors Can Combat Frailty

Slow walking speed is a key indicator of frailty

Image Credit: Gordon Scammell/Loop Images/Universal Images/Getty Images

Experimental stem cell therapies offer groundbreaking potential in treating frailty by targeting its biological roots. This condition, which heightens the risk of falls and infections, is traditionally managed through lifestyle changes like strength training and balance exercises. However, recent studies suggest that injecting stem cells from young, healthy donors into older adults can substantially enhance mobility.

Dr. Joshua Hare from Longeveron, a biotechnology firm based in Miami, Florida, states, “Frailty is a leading cause of disability and diminished quality of life in older adults. There exists a significant unmet need for biological treatments.”

Dr. Hare and his team are innovating therapies that target the essential mechanisms of aging, including inflammation and the metabolic disturbances that cause muscle wasting. Their treatment, Laromestrocell, is derived from mesenchymal stem cells harvested from healthy bone marrow donors aged 18 to 45.

Having achieved success in early-stage testing, they recently conducted a study assessing different dosages of Laromestrocell compared to a placebo in a cohort of 148 individuals affected by frailty, which impacts approximately one in four people over the age of 65.

The researchers measured the walking distance of participants aged 74 to 76 with mild to moderate frailty in a six-minute walk test both before and after receiving Laromestrocell. Remarkably, a single injection resulted in a dose-dependent enhancement in performance without significant safety concerns. For instance, patients receiving the highest dose walked an additional 41 meters compared to those treated with a placebo six months post-infusion, increasing to 63 meters after nine months.
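The distance gains above can be restated as average-speed differences over the six-minute test; the metre figures come from the article, and the conversion is plain arithmetic:

```python
# Converting the reported six-minute-walk gains into average-speed terms.
# The metre figures are the article's; the test lasts 6 minutes.
TEST_SECONDS = 6 * 60
gains_m = {"6 months": 41, "9 months": 63}  # extra distance vs placebo

for timepoint, extra_m in gains_m.items():
    extra_speed = extra_m / TEST_SECONDS  # extra metres per second, on average
    print(f"{timepoint}: +{extra_m} m, about +{extra_speed:.2f} m/s")
```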

According to the researchers, Laromestrocell inhibits enzymes known as matrix metalloproteinases, which degrade structural proteins in blood vessels and related tissues. The cell therapy has the potential to regenerate vasculature and improve the muscle fibers essential for endurance, as noted by Dr. Hare.

Nevertheless, improvements in walking speed or grip strength were not observed. “The most clinically significant measurement is the six-minute walk distance, which is well correlated with health status and longevity,” Dr. Hare explained.

Dr. Daisy Wilson from the University of Birmingham, UK, commented, “This trial appears very promising. I was quite impressed by the substantial changes observed in just six minutes of walking.”

Additionally, this trial may help identify biomarkers of frailty, aiding in pinpointing individuals who stand to gain the most from this treatment, potentially before symptoms manifest. During blood analyses, the researchers discovered that levels of a fragment known as sTIE2, indicative of vascular dysfunction, decreased progressively with escalating doses of Laromestrocell.

This indicates that individuals exhibiting high sTIE2 levels may derive the most benefit from this treatment, according to Dr. Wilson. “Frailty is highly heterogeneous,” she remarked. “The critical aspect of geroprotective interventions is to slow the aging process. Moving forward, aligning the right treatment with the appropriate patient will be essential.”

However, she expressed concerns regarding both the cost and feasibility of stem cell treatments. “Considering the high expenses, justifying their use becomes challenging, especially when walking programs that enhance six-minute walk test performance are under investigation,” she added. Worryingly, the acquisition of stem cells from volunteers could pose a considerable challenge, necessitating a large pool of donors to treat all frail patients.

Dr. Hare countered this argument, asserting that various companies are making technological advancements to scale up stem cell treatments for broader access. “Substantial research is underway to increase the production of these types of stem cells in larger quantities, and I am optimistic that this requirement will be met,” he stated.


Source: www.newscientist.com

New Scientist Recommends Liminal: Explore Pierre Huyghe’s Revolutionary Quantum Soundscapes

Pierre Huyghe's Artwork

Artist Pierre Huyghe

Photo by Ola Lindal

A century ago, the advent of quantum mechanics left physicists gazing into the unknown. Long-held beliefs about reality were called into question. Today, we delve into the enigmatic realm of quantum probability clouds and their peculiar behaviors, even at a distance.

Liminal is a profound installation by artist Pierre Huyghe (featured above) that captures many of these ideas. Set in Halle am Berghain—formerly an East Berlin power station and now a renowned techno club—the exhibition features immersive video projections and soundscapes that resonate deeply within the gritty remnants of the concrete structure.

Huyghe’s soundscapes emerge from atoms collapsing between quantum states, as if channeling the universe’s fundamental language. On some interpretations, reality is not built from quantum fields at all; the quantum state represents only our knowledge, and the external world as we picture it may not truly exist. Huyghe’s faceless figures, intertwined with the landscape, powerfully evoke this idea, resisting any simplistic explanation.

Thomas Luton
Features Editor, London


Source: www.newscientist.com

Discover an Excerpt from “Art Cure” by Daisy Fancourt: New Scientist Book Club Picks

Spending Time Painting in Cornwall, UK

Ashley Cooper/Alamy

Russell hesitated at the door, unsure whether to enter or not. This wasn’t his usual environment; he only came at the doctor’s suggestion.

His journey began with a stroke that disrupted blood flow to his brain, leading to significant challenges. He faced months of recovery, relearning skills he once took for granted. As time passed, he encountered severe back pain, lost his job, and struggled to maintain relationships, becoming depressed and overwhelmed by his situation.

When his doctor suggested eight weeks of art classes, Russell doubted the effectiveness of art as therapy. Still, feeling like he had nothing to lose, he stepped inside.

To his surprise, the first class was less intimidating than expected; he didn’t draw but observed fellow students. The calming ambiance and vibrant colors somewhat eased his anxiety. On his way home, he noticed a shift—his breathing was slower and more peaceful. The next week, he recognized familiar faces and started doodling in the garden shed during sleepless nights. By the third class, he had picked up a paintbrush. In the following weeks, he proposed a collaborative project: to paint portraits of his classmates.

I first met Russell early in the morning at a Manchester hotel. We were both preparing to appear on BBC Breakfast, where he would discuss the pioneering initiative of “prescription-based art” within the National Health Service. His experiences left me in awe of the transformation he underwent.

During his subsequent checkup, doctors were impressed with his progress; both his mood and pain levels had significantly improved. Art classes provided him with a sense of structure, something he had started to look forward to. His doctor reduced his medication, noting the improvement in his overall health and sleep quality.

As he neared the completion of his portrait series, he approached Gloucester Art Museum to host an exhibition titled “We’re All Mad Here.” The event drew fellow students and healthcare professionals, leading to requests for more commissioned works, including paintings of a nurse’s children.

Over the past decade, Russell Haynes has showcased his art throughout the UK—from Gloucester Cathedral to the Tower of London. His pieces are now highly sought after, often selling for thousands. He not only continues to create art but also teaches classes, receiving referrals from doctors. Remarkably, Russell has not taken any medication nor visited a doctor in over a year.

When I asked him about the impact of those initial art classes, he stated simply:

“They saved my life.”

This excerpt is from Daisy Fancourt’s Art Cure: The Science of How Art Changes Our Health (Cornerstone Press), the New Scientist Book Club’s March selection. Join us for a shared reading experience here.


Source: www.newscientist.com

Discover Daisy Fancourt’s Insights on Art as Medicine: ‘If Art Had the Healing Power of Medicine, We’d Embrace It Daily’

Regular Engagement with Arts: Transformative Physiological Changes

Mascot/Getty Images

Reflecting on my journey into research on the health benefits of art, a pivotal moment stands out. After completing my education, I began working at the NHS, overseeing performing arts programs at Chelsea and Westminster Hospitals in London. One patient’s relative approached me post-performance in a dementia ward and said, “What a wonderful entertainment program you are running.”

This comment, albeit well-intentioned, overlooked the profound impact of our Hospital Arts Program. I personally witnessed transformative effects: a patient, despite memory loss, sang along to White Cliffs of Dover, evoking childhood memories. I observed a child with severe burns who required no morphine during a theater performance, a premature baby who calmed and began eating while his mother sang, and a stroke survivor who walked more steadily upon wearing headphones. While our arts programs offered enjoyable distractions, I recognized their deeper significance in enhancing patients’ health. My curiosity led me to seek a deeper understanding of these effects.

Over the past decade, I have dedicated my research as a psychobiologist and epidemiologist to understanding the health benefits of engaging with art. Findings from numerous global studies reveal that activities like reading, listening to music, dancing, or crafting activate essential biological processes that support our health. Participating in the arts stimulates the brain’s reward system, elevating dopamine levels tied to mood and pleasure. It also regulates autonomic nervous system activity, contributing to lower heart rates and decreased blood pressure over time. Notably, stress hormones diminish, as do inflammatory responses within the immune system. Arts engagement can even modify gene expression, downregulating stress-related genes while enhancing those that promote neurogenesis.

Regularly engaging in arts over extended periods fosters significant physiological changes. It’s shown to increase gray matter in brain regions vital for memory, auditory processing, and motor skills. Furthermore, we produce unique protein patterns associated with improved cognitive function and a reduced risk of depression and infections. A recent study employing various biological metrics, including brain clocks and epigenetic evaluations, indicates that consistent engagement with the arts correlates with a younger biological age.

These profound changes significantly influence our overall well-being. Individuals who actively participate in the arts tend to report greater happiness, enhanced life satisfaction, purpose, and a reduced risk of developing conditions such as depression, chronic pain, frailty, and even dementia. These beneficial relationships hold even when accounting for factors like wealth, demographics, medical history, or lifestyle choices.

These promising results are drawn from randomized controlled trials, laboratory experiments, and large-scale epidemiological studies of the arts’ population-level impacts. Numerous specific artistic interventions in medical settings for designated patient groups—like singing programs for stroke survivors or dance classes for people with Parkinson’s disease—underscore art’s potential benefits. Some trials suggest that art may even be more effective at managing pre-operative anxiety than traditional anti-anxiety medications like benzodiazepines, and with fewer side effects.

Nevertheless, while engaging in the arts is a promising avenue of exploration, it is not a cure-all. Various instances of art-related harm exist due to misuse or inadequate project design. I have actively countered misconceptions, such as the idea of art as a cure for boosting intelligence or combating serious health issues like cancer. Although the field remains ripe with potential and ongoing research, we eagerly anticipate larger-scale trials.

If a medication boasted this array of health benefits, we would enthusiastically promote it, invest resources in its development, and ensure its accessibility. It is exhilarating to watch the recommendations I promote materialize—not as prescriptions or medical interventions, but as enjoyable experiences like attending a concert, participating in dance lessons, or simply reading a book, potentially including my own.

Daisy Fancourt is the author of Art Cure: The Science of How Art Changes Our Health (Cornerstone Press) and featured in the March reading list of the New Scientist Book Club. Join us here to participate in the discussion!

Topics:

  • Health/
  • New Scientist Book Club

Source: www.newscientist.com

Juice by Tim Winton: An Australian Climate Novel That Captivates Readers

New Scientist Book Club’s February selection: Tim Winton’s novel ‘Juice’

The New Scientist Book Club moved from January’s exploration of the implications of sex robots in Sierra Greer’s Annie Bot to February’s selection: Tim Winton’s vivid portrayal of an Australian future, Juice.

Winton’s narrative is conveyed through an anonymous protagonist detailing life in a dangerously heated world, gradually revealing his role in administering punishment to those whose actions exacerbated climate change and exploring the depths of survival.

I found Juice to be a captivating read—utterly gripping and profoundly unsettling. But what were the book club’s impressions? The novel spurred lively discussions on our platform. In a positive review, Glen Johnson expressed his admiration, noting Winton’s adept descriptions of adaptations in a familiar climate zone, referring to the narrative as a “natural evolution of the resourceful Australian landscape.”

Victor Churchill echoed this sentiment: “Despite the harsh circumstances, it offers a surprisingly optimistic tone. While the plot presented some hurdles, it was overall exceptionally engaging.” He appreciated how the author allows readers intimate moments of discovery through the protagonist’s journey.

Kim Woodhams Crawford shared similar thoughts, commending the novel’s forecasts about potential climate disasters. “Regardless of political narratives, there’s no escaping the reality of severe temperature rises,” she cautioned.


However, not all responses were overwhelmingly positive. “Admittedly, I struggled with the novel’s initial chapters and nearly stopped reading,” Linda Jones confessed. “But once the backstory unfolded, my interest spiked dramatically.” Phil Gurski also remarked on the slow start of the book.

Opinions diverged on Winton’s narrative style. While some appreciated the unique voice of the imprisoned protagonist, others found it less convincing. “The writing evokes a sense of magical realism,” Gosia Furmanik suggested, although Jacqueline Ferrand posed a critical question: “In a dystopian reality, would a stranger truly want to know the complete history of your past?” Steve Swann, on the other hand, expressed frustration, stating he’d likely have taken drastic action if placed in the protagonist’s shoes.

A major topic of debate was the novel’s status as a dystopia. Winton himself wrote in an essay for us, “Dystopia is sometimes a word that desensitizes us to reality, and we can’t afford that.” Members engaged deeply with this theme.

Victor expressed, “This doesn’t feel like a dystopia per se; I perceive it more as a post-dystopian narrative where society has adapted to its harsh realities.” Margaret Buchanan added, “We won’t ascertain if this narrative is truly dystopian until future generations reflect on it amidst current climatic challenges.”

Conversely, Niall Leighton argued that the real-world experiences of many people mirror the novel’s depiction of dystopia. “It’s a semantic debate: can the essence of living in a dystopian nightmare be recognized as living in a dystopia?” he wrote. He emphasized that for him, Winton’s work unmistakably inhabits that genre.

Niall further posited the provocative idea: Can envisioning a dystopian future deter its actualization? “I agree with Tim Winton that we need to confront our reality instead of relating through dystopian narratives. What we truly require are stories that inspire us to build better, inclusive worlds,” he stated. This encourages reflection for many of us, myself included.

Meanwhile, Gosia raised concerns about the plausibility of Winton’s narrative choices, questioning whether killing descendants of the fossil fuel elite was a logical response to climate crises. She lamented that such actions seemed futile against the continuous decline of our environment.

As for the novel’s conclusion, I personally cherished the nuances of hope and ambiguous endings, which resonate with me. Samantha de Vaux shared her perspective, acknowledging that while a more positive outcome could have been possible, she respects the author’s narrative course. “This complex book and its conclusion challenged me profoundly,” she remarked.

As we conclude our discussion of Winton’s profound works, we pivot to our March selections—whether dystopian or not. Up next, I’ll delve into Daisy Fancourt’s insightful non-fiction, Art Cure: The Science of How Art Changes Our Health. As a Professor of Psychobiology and Epidemiology at University College London, she explores how art can elevate our mental and physical well-being, identifying it as the ‘forgotten fifth pillar of health’ alongside diet, sleep, and exercise. A captivating excerpt detailing how an art class transformed someone’s recovery post-stroke awaits readers. Join us in the New Scientist Book Club by signing up or connecting on our Facebook group here.


Source: www.newscientist.com

Should We Be Concerned About Asteroid Impacts on Earth? What You Need to Know

Could this dramatic image actually happen?

angel_nt/Getty Images

Deep in the cold void of space lies a potential asteroid threat that could obliterate most life on Earth. Is such a fate unavoidable? Can we potentially avert disaster, or are we fated for a catastrophic end similar to the dinosaurs? Here’s what science reveals.

The asteroid that led to the extinction of the dinosaurs 66 million years ago measured at least 10 kilometers across. Its massive impact resulted in catastrophic tsunamis, widespread wildfires, and global darkness. Estimates of Earth’s crater history suggest that an asteroid of this magnitude collides with Earth roughly every 60 million years. Smaller asteroids, around 1 kilometer in diameter, impact the Earth approximately every million years, with the last significant event occurring around 900,000 years ago. These statistics are understandably alarming.
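To put those rates in perspective, here is a quick back-of-the-envelope calculation, assuming (as a simplification not stated in the article) that impacts arrive as a Poisson process with the quoted average rates:

```python
import math

def impact_probability(rate_per_year, window_years):
    """Probability of at least one impact in a time window,
    modeling impacts as a Poisson process."""
    expected = rate_per_year * window_years
    return 1 - math.exp(-expected)

# ~1 dinosaur-killer (10 km class) per 60 million years
p_big = impact_probability(1 / 60e6, 100)  # over a human century
# ~1 kilometer-scale impact per million years
p_km = impact_probability(1 / 1e6, 100)

print(f"10 km impact in a century: {p_big:.2e}")  # roughly 2 in a million
print(f"1 km impact in a century:  {p_km:.2e}")   # roughly 1 in 10,000
```

Even over a full century, the chance of a kilometer-scale strike works out to about one in ten thousand, which is why astronomers describe the extinction-level risk as extremely low.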

However, unlike the dinosaurs, humans possess the unique ability to observe and analyze our universe. Consequently, scientists are coordinating global efforts to catalog asteroids and assess which pose a threat.

Fortunately, among the thousands of near-Earth objects currently monitored by astronomers, only 35 present a risk greater than 1 in 1 million of colliding with Earth in the next century. Moreover, the vast majority of these potential threats measure less than 100 meters in diameter. So, is an extinction-level asteroid likely to strike during our lifetime? The probability is extremely low.

Nonetheless, discerning readers will note hedges like “of the asteroids we are tracking,” “a small possibility,” and “almost.” Such wording implies that we can’t confirm we’ve detected every asteroid out there. Occasionally, sensational headlines announce newly discovered asteroids nearing Earth, but in most cases these rocks pass safely by.

To estimate how many asteroids remain undetected, astronomers combine three factors: the number of asteroids already known, the volume of sky surveyed, and the sensitivity of the telescopes used. They estimate that all asteroids larger than 10 kilometers posing a danger to Earth have been accounted for. Breathe easy; the likelihood of experiencing an event similar to the dinosaurs’ extinction is minimal.

Currently, about 80 percent of kilometer-wide asteroids have been identified, indicating a low chance of unforeseen impacts. Asteroids smaller than 100 meters pose little threat; incidents like the Chelyabinsk meteor in 2013 typically cause only minor damage, as such rocks largely burn up on atmospheric entry.

However, the so-called “city killers”—asteroids in the 100-meter range—remain concerning, as fewer than half of them have been detected. If you’re worried about asteroids, these smaller threats warrant the closest scrutiny.

Luckily, we have technology at our disposal that differentiates us from the dinosaurs. Our first line of defense involves monitoring space with advanced telescopes. Continuous efforts to observe near-Earth objects are underway, highlighted by the upcoming launch of the NEO Surveyor next year, which aims to greatly enhance our capacity to track these asteroids.

The second defense mechanism provided by space exploration is the capacity to respond if a threatening asteroid is detected. NASA’s 2022 Double Asteroid Redirection Test demonstrated the potential to redirect an asteroid, ensuring we could alter its path if necessary. Provided we have sufficient notice—typically requiring several years of monitoring—we can adjust trajectories to avert collision.
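The lead-time requirement comes down to simple kinematics: a tiny velocity change, applied early, compounds into a large miss distance. A rough linear illustration (the nudge size here is a hypothetical round number, not DART’s measured result, and real deflections involve full orbital dynamics):

```python
def miss_distance_km(delta_v_cm_s, lead_time_years):
    """Lateral displacement accumulated from a constant velocity
    change, using a crude linear model that ignores orbital mechanics."""
    seconds = lead_time_years * 365.25 * 24 * 3600
    return delta_v_cm_s / 100 / 1000 * seconds  # cm/s -> km/s, then x time

# A 1 cm/s nudge -- a deflection-scale change of speed:
print(f"{miss_distance_km(1, 1):.0f} km after 1 year")    # ~316 km
print(f"{miss_distance_km(1, 10):.0f} km after 10 years") # ~3,156 km
```

Since Earth’s radius is about 6,371 km, even this toy model shows why a decade of warning turns a centimeter-per-second push into the difference between a hit and a clean miss.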

In the event that an asteroid does hit Earth, the impact would be a natural disaster—but a largely predictable one. A strike could well land in the ocean or an uninhabited region. According to the World Economic Forum, less than 15 percent of the Earth’s land (under 4.3 percent of the total surface area) has been significantly modified by humans, with even fewer areas inhabited.

If an asteroid were to threaten one of these few populated areas, we have strategies similar to managing any natural disaster: evacuation, damage control, and sheltering in place. Strengthening our overall disaster response capabilities prepares us for such scenarios and aids in managing more plausible and unpredictable disasters.

So, returning to our initial question: are asteroid impacts inevitable? Eventually, yes. Is there a solution? Very likely. Will we face a fate akin to the dinosaurs? If so, it lies far in the future. Instead of succumbing to worry, invest your energy in preparedness—learn about natural disaster responses and keep an eye on asteroids, as the vigilant scientists do.


Source: www.newscientist.com

Marine Geoengineering Test Shows No Harm to Marine Life: Findings Revealed

Impact of Alkaline Sodium Hydroxide on the Gulf of Maine’s Carbon Dynamics

Daniel Cojanu, Undercurrent Productions, ©Woods Hole Oceanographic Institution

Can we effectively remove carbon dioxide from the atmosphere to mitigate ocean acidification? A recent test shed light on this as a research team injected 65,000 liters of alkaline sodium hydroxide into the Gulf of Maine in August 2025.

“We were pioneers in exploring the enhancement of alkalinity using a ship,” stated Adam Subhas from the Woods Hole Oceanographic Institution in Massachusetts. The team shared their preliminary findings at the Marine Science Conference on February 25th in Glasgow, UK. “It’s clear we observed increased CO2 absorption due to this experiment.”

Over the span of four days, the team estimates that between 2 and 10 tons of CO2 were extracted from the atmosphere, with a potential total of up to 50 tons. Importantly, no adverse effects on marine ecosystems were noted.

Nonetheless, Subhas highlighted a critical point: the team hasn’t calculated the emissions produced during the manufacturing and transport of the sodium hydroxide, leaving the net CO2 removal outcome uncertain. “That’s an essential area for future research,” he remarked.

The ocean acts as a significant carbon sink, storing 40 times more carbon than the atmosphere and absorbing over a quarter of the excess CO2 emitted. This surplus CO2 reacts with ocean water to create carbonic acid, leading to increased ocean acidity.

Ocean acidification can severely impact various marine organisms by dissolving carbonate shells, thereby diminishing the ocean’s carbon absorption capacity.

Researchers are actively investigating numerous strategies to counteract ocean acidification, such as adding magnesium hydroxide to wastewater, spreading crushed olivine on beaches, and transporting seawater to onshore treatment facilities. Some companies are even marketing carbon credits based on alkalinity enhancement.

“This is indicative of current private sector initiatives,” Subhas explained, emphasizing the need for non-commercial trials like their team’s.

Given the sensitive nature of such experiments, the team engaged local stakeholders, particularly the fishing community. “Establishing a two-way dialogue is crucial,” asserted Kristin Kreisner of the Environmental Defense Fund, a New York-based nonprofit.

The testing involved three ships and was meticulously monitored using various methods, from satellites to floating sensors and ocean gliders. Sodium hydroxide was mixed with a trace dye called rhodamine to accurately track its dispersion.

The researchers measured concentrations of microorganisms, plankton, fish larvae, and lobster larvae, as well as photosynthetic activity levels. According to Rachel David at Rutgers University, New Jersey, “Our trials did not significantly impact the biological community.”

The additional carbon introduced into the ocean through increased alkalinity converts into bicarbonate ions, akin to dissolved baking soda. “We anticipate this carbon will remain locked for tens of thousands of years, making it one of the most sustainable carbon removal methods,” Subhas noted.
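The carbonate chemistry described above can be sketched in two standard textbook reactions (this is the general mechanism of ocean acidification and alkalinity enhancement, not the team's published reaction scheme):

```latex
% Acidification: dissolved CO2 forms carbonic acid, which releases protons
\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}

% Alkalinity enhancement: added hydroxide neutralizes CO2 into bicarbonate
\mathrm{NaOH + CO_2 \rightarrow Na^+ + HCO_3^-}
```

Locking the carbon up as dissolved bicarbonate, rather than leaving it as free CO2, is what allows the removal and the storage to happen in a single step.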

The nature of this process allows CO2 to be removed and stored simultaneously, providing benefits over other methods that necessitate separate CO2 capture and permanent storage.


Source: www.newscientist.com

Unlocking Quantum Computing: How a 1980s Niche Technology Could Revolutionize the Future


Adam Weiss configuring a dilution refrigerator

Adam Weiss of SEEQC, the pioneering quantum chip manufacturing company.

SEEQC

<p>The 1980s gave us British heavy metal and the vibrant purple blush beloved by makeup artists. Amid the glam, a now-neglected technological gem emerged: superconducting circuits. In 1980, IBM invested in this technology in pursuit of highly efficient computers, and a superconducting circuit graced the cover of <em>Scientific American</em> that same year.</p>

<p>However, the anticipated revolution never materialized, and superconducting chips faded into obscurity, much like perms and pegged pants. Yet, one company persevered in its research efforts—SEEQC. I recently toured SEEQC's cutting-edge quantum chip manufacturing facility in upstate New York, born from IBM's discontinued superconducting computing program. Here, I discovered SEEQC's aspirations for superconducting chips in ushering a new era in quantum computing.</p>

<p>Inside the SEEQC facility, you’re greeted by extensive machinery and technicians clad in protective gear. In cleanrooms, ultra-thin layers of niobium, a superconducting metal, are meticulously deposited onto dielectric materials, forming intricate, sandwich-like structures. Lithographic devices further refine these structures, carving out the tiny trenches essential for quantum processes. The rooms buzz with activity under yellow light, which keeps stray ultraviolet from exposing the photosensitive coatings used in chip production. In a conference room, SEEQC's CEO <a href="https://seeqc.com/about/leadership/john-levy">John Levy</a> presented a superconducting chip that is surprisingly compact yet poised to transform this futuristic industry.</p>

<h2>The Challenge Ahead</h2>
<p>Superconductors deliver electricity with flawless efficiency, distinguishing them from conventional electronic materials. When you charge a phone, for instance, heat lost in cords and chargers wastes energy. A 2017 study by computer scientists likened traditional computers to costly electric heaters that happen to perform a few calculations on the side.</p>

<p>Comparatively, superconducting computers eliminate this efficiency problem. However, a significant limitation exists: all known superconductors require extremely low temperatures or immense pressure to function. This necessity has historically rendered superconducting computing prohibitively expensive and impractical. IBM abandoned its superconducting computing research in 1983, leading to a preference for traditional overheating computers. Ironically, energy costs have surged recently, especially due to the growing demand from AI technologies.</p>

<p>A shift occurred in the late 1990s when a team of Japanese researchers <a href="https://arxiv.org/pdf/cond-mat/9904003">created</a> the first superconducting qubit, a foundational element of quantum computing. This innovative approach diverged from prior attempts, paving the way for a new computing paradigm leveraging processes unique to quantum mechanics.</p>

<p>Since then, superconducting qubits have powered significant advancements in quantum computing. Tech giants like Google and IBM utilize this technology to tackle complex scientific challenges, achieving remarkable demonstrations of "quantum supremacy" that underline the distinct capabilities of quantum computers compared to classical counterparts.</p>

<p>However, true disruptive technologies in quantum computing remain elusive. Quantum computers have yet to realize their potential to revolutionize areas such as cryptography or industrial chemistry, with numerous technical and engineering challenges lying ahead.</p>

<p>SEEQC's Levy believes some solutions could trace back to the 1980s. His team is developing digital superconducting chips designed to enhance the power, size, and error resilience of quantum computers simultaneously. Nearby, researchers are busy testing chips in various refrigerator configurations, aiming to streamline quantum computing components, ultimately enhancing efficiency.</p>

<p>The working core of a superconducting quantum computer comprises a chip packed with qubits and a refrigerator essential for their operation. Externally, it appears as a single, elongated box comparable in height to a person. However, the components extend beyond this simple design. Control mechanisms, traditional computational inputs, and output readings from quantum calculations require elaborate setups. Moreover, qubits are delicate and susceptible to errors, necessitating sophisticated control systems for real-time monitoring and adjustments. This means non-quantum components, which consume substantial space and energy, play a crucial role in the overall functionality of quantum computers.</p>

<p>Expanding qubit numbers to enhance computational power necessitates additional cables. “Physically, you can't keep adding cables forever,” asserts <a href="https://seeqc.com/about/leadership/shu-jen-han-phd">Shu-Jen Han</a>, SEEQC's Chief Technology Officer. Each new cable leaks heat that disrupts qubits and degrades their performance. This might seem purely technical, but the complexity of connecting and controlling qubits is one of the most significant hurdles to quantum computing's advancement.</p>
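<p>To see why cabling becomes a bottleneck, consider a naive scaling estimate. The per-qubit figures below are illustrative placeholders, not SEEQC's numbers; the point is only that lines and heat load grow linearly with qubit count:</p>

```python
def wiring_estimate(n_qubits, lines_per_qubit=3, heat_per_line_nw=10):
    """Naive linear scaling: each qubit needs dedicated control/readout
    lines, and each line leaks some heat into the refrigerator.
    Both per-qubit figures are illustrative assumptions."""
    lines = n_qubits * lines_per_qubit
    heat_uw = lines * heat_per_line_nw / 1000  # nW -> microwatts
    return lines, heat_uw

for n in (100, 10_000, 1_000_000):
    lines, heat = wiring_estimate(n)
    print(f"{n:>9} qubits -> {lines:>9} lines, ~{heat:,.0f} uW of heat load")
```

<p>Under these toy assumptions, a million-qubit machine would need millions of cables, each adding heat to a refrigerator whose cooling power at its coldest stage is severely limited. That is the scaling problem an on-chip superconducting controller is meant to sidestep.</p>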

<p>The SEEQC chip I examined addresses many of these challenges.</p>

<p>
    <figure class="ArticleImage">
        <div class="Image__Wrapper">
            <img class="Image" alt="SEEQC quantum chip" 
                width="1350" height="899" 
                src="https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg" 
                srcset="https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=300 300w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=400 400w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=500 500w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=600 600w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=700 700w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=800 800w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=837 837w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=900 900w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1003 1003w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1100 1100w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1200 1200w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1300 1300w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1400 1400w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1500 1500w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1600 1600w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1674 1674w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1700 1700w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1800 1800w, https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=1900 1900w, 
https://images.newscientist.com/wp-content/uploads/2026/02/24002912/SEI_283782774.jpg?width=2006 2006w" 
                sizes="(min-width: 1288px) 837px, (min-width: 1024px) calc(57.5vw + 55px), (min-width: 415px) calc(100vw - 40px), calc(70vw + 74px)" 
                loading="lazy" data-image-context="Article" 
                data-image-id="2516803" 
                data-caption="SEEQC's quantum chip" 
                data-credit="Karmela Padavic-Callaghan"/>
        </div>
        <figcaption class="ArticleImageCaption">
            <div class="ArticleImageCaption__CaptionWrapper">
                <p class="ArticleImageCaption__Title">SEEQC Quantum Chip</p>
                <p class="ArticleImageCaption__Credit">Karmela Padavic-Callaghan</p>
            </div>
        </figcaption>
    </figure>
</p>

<p>The SEEQC chip embodies the typical design of a computer chip: small, flat, with a metal rectangle atop a larger one. Levy explained that the smaller rectangle holds superconducting qubits, while the larger one is a conventional chip of superconducting material, facilitating digital control of the qubits. Since both components are superconducting, they can occupy the same refrigerator, reducing the reliance on many energy-consuming room-temperature devices.</p>

<p>This innovation not only prevents excess heat from degrading the refrigerator's performance but also sharply lowers the power consumption of the control chip. SEEQC predicts that its quantum computers could be up to a billion times more energy efficient. That matters because, according to the Quantum Energy Initiative, some designs for fault-tolerant quantum computers could, paradoxically, consume more energy than today's large-scale supercomputers, with much of that demand coming from conventional computing components.</p>

<p>Additionally, by integrating the quantum and classical chips, instruction delays to the qubits and result readings are minimized. Levy mentioned that the digital signals from the chip reduce "crosstalk" and unintended interactions, making the qubits less prone to errors.</p>

<p>In 2025 I spoke with David DiVincenzo, who two decades ago proposed seven essential conditions for building a viable quantum computer, a checklist that still guides researchers today. He envisioned a future in which powerful quantum computers, potentially comprising a million qubits, would occupy expansive spaces resembling particle colliders rather than traditional computing setups. SEEQC’s mission is to head off that expansive future, striving for a compact design reminiscent of a modern Mac rather than the bulky ENIAC.</p>

<p>Currently, SEEQC is testing its chip across varied configurations, employing qubits sourced both in-house and from other quantum manufacturers. Early performance assessments are promising, indicating the chip's versatility, though initial tests have been limited to fewer than 10 qubits, considerably smaller than the envisaged powerful quantum computers.</p>

<p>Physics challenges also emerge, as superconductors can experience tiny quantum vortices when exposed to nearby magnetic fields used for tuning qubits. <a href="https://seeqc.com/about">Oleg Mukhanov</a>, SEEQC’s Chief Scientific Officer, shared insights on a novel method developed by the company to eliminate these vortices using an opposing electromagnetic field. It reminded me of my graduate studies in superconductivity physics: even pioneering technology cannot evade the fundamental quirks of quantum mechanics.</p>

<p>Will superconducting circuits make a triumphant return and push us into a quantum renaissance? It seems the '80s might be making a comeback in the quantum realm—though I hope the oversized shoulder pads don't follow suit.</p>



Source: www.newscientist.com

Unlocking the Nine Hidden Secrets That Weigh Us Down Inside

Can you keep a secret?

Yana Iskayeva/Getty Images

On average, individuals conceal nine different secrets, ranging from personal lies to clandestine romantic affairs. This accumulation can weigh heavily, as secrets often infiltrate our thoughts without conscious effort. While confessions may alleviate some emotional burden, many secrets remain too sensitive to divulge. Consequently, researchers are exploring psychological coping mechanisms.

“People often find themselves pondering their secrets during routine activities like showering or commuting,” explains Val Bianchi from the University of Melbourne, Australia. “These unwanted thoughts can be distressing, creating a cycle where individuals ruminate on their secrets and subsequently feel worse.”

Bianchi has dedicated years to investigating the psychological impact of secrecy and strategies for mitigation. Her latest findings were supported by the Australian National Intelligence Agency, considering that intelligence personnel must safeguard crucial secrets to protect national security, necessitating effective management strategies.

“The enigma surrounding CIA operatives is intriguing. How do they safeguard vital secrets and resume normalcy afterward?” questions Lisa Williams from the University of New South Wales in Australia, who was not involved in this research.

To delve deeper into the connection between secrets and well-being, Bianchi and her team surveyed 240 individuals online, asking participants to identify secrets spanning 38 categories, including deception, infidelity, theft, addiction, and self-harm.

Respondents reported keeping an average of nine distinct secrets. The most prevalent included lie-related secrets (78% of participants) and dissatisfaction with personal or others’ appearances (71%). Other frequent secrets involved financial matters (70%), unexpressed romantic feelings (63%), and sexual behavior (57%).

Participants then pinpointed their most significant secret and maintained a diary for two weeks regarding their feelings. They generally noted that their most crucial secret was negative, prompting reflective thoughts filled with worries and concerns.

Bianchi’s prior research revealed that significant secrets occupy individuals’ thoughts approximately every two hours. Often, they surface during low-engagement tasks, allowing space for reflection, she notes.

Interestingly, the ability to keep secrets may have evolved to enhance group cohesion despite their burdensome nature on individuals. By concealing information, one can prevent harm, embarrassment, or loss of social standing. “For instance, if a colleague is under investigation, a person may choose silence over gossip to protect their workplace reputation,” Bianchi adds.

In certain cases, unveiling a secret may bring relief. Sharing it with empathetic individuals, such as therapists or through confessionals, can alleviate emotional burdens, according to Bianchi.

Conversely, some secrets, like classified information held by intelligence agents, are unsuitable for disclosure. In such instances, the individual might find it beneficial to express feelings associated with the secret without revealing specifics. Bianchi suggests that distraction techniques may also prove useful, and her team aims to research these further.

Williams emphasizes that established emotional regulation methods may also aid those grappling with secrets. “If you are unable to eliminate a secret because it’s job-related or for other reasons, addressing the negative feelings related to it is crucial,” she states. “Ignoring or suppressing negative emotions is generally unproductive; therefore, reframing them positively could be beneficial.”

For those outside the intelligence sector, writing privately about secrets and their emotional impact can be particularly therapeutic. James Pennebaker from the University of Texas at Austin previously demonstrated that journaling about emotions can offer significant mental health benefits. “My research indicates that individuals experiencing major life changes are less likely to encounter health issues if they openly discuss these events,” he explains.


Source: www.newscientist.com

Stem Cell Patch Successfully Repairs Brain Damage in Spina Bifida Fetuses

False color radiograph illustrating large neural tube defects (red) on both sides of the lower back in a spina bifida patient

Science Photo Library

A groundbreaking trial used patches made from donor placenta stem cells to treat fetuses with severe spina bifida in utero. The technique appears to reverse a brain complication associated with the condition and shows potential to improve long-term mobility in affected children.

The mother of a now four-year-old boy named Toby, who was diagnosed with spina bifida during pregnancy, was initially prepared for him to rely on a wheelchair. “But Toby is thriving. He has met all his developmental milestones, including walking, running, and jumping, and remarkably has no issues with bladder control, which is rare among those with this condition,” she commented.

Spina bifida, affecting approximately 1 in 2,800 births annually in the United States, occurs when a baby’s spine and spinal cord do not fully develop in utero. The most severe form, myelomeningocele, involves the spinal cord and surrounding tissues protruding through vertebrae, often leading to mobility challenges and bowel or bladder control issues. The precise cause of spina bifida remains unclear, although a deficiency in folic acid during pregnancy can heighten risks.

Standard treatment often involves in-utero surgery where the spinal cord and surrounding tissues are repositioned before closing the skin. “However, many children still struggle with mobility, and often bowel or bladder control remains unimproved,” notes Diana Farmer of the University of California, Davis.

To explore alternatives, Farmer and her team proposed the addition of stem cells to enhance growth and repair of spinal cord tissue. They enlisted six pregnant women carrying fetuses diagnosed with myelomeningocele.

By approximately 24 weeks of gestation, all fetuses exhibited a common complication known as hindbrain hernia. This condition causes excess fluid to accumulate in the skull, pushing the cerebellum through an opening at the base of the skull. While standard surgical procedures can often help alleviate hindbrain hernias, many children continue to face complications post-surgery.

In this latest trial, all fetuses received standard surgery along with a patch, measuring several centimeters, that included stem cells from the donated placenta, set within a matrix of sticky proteins. The surgeons applied this patch to the spine before suturing the skin around it. “The cells release what we like to call ‘magical stem cell juice’,” Farmer explains.

Upon birth, all babies showed positive surgical site healing with no indications of abnormal cell growth. “Our primary concern was that adding stem cells would lead to excessive cell proliferation, but we did not observe this,” Farmer reported. MRI scans of their brains demonstrated complete resolution of hindbrain herniation.

“In my opinion, this will enhance long-term outcomes compared to standard methods,” added Panicos Shangaris from King’s College London, citing evidence from animal studies.

The research team is optimistic about conducting a trial aimed at administering the stem cell patch to 35 fetuses with myelomeningocele, comparing results with prior studies that utilized traditional surgery, as stated by Farmer.

However, Professor Shangaris suggests that a more suitable approach would involve head-to-head trials to thoroughly assess safety and efficacy between the two techniques, providing clear pathways for treatment approvals.


Source: www.newscientist.com

New Research Unveils Mosquito Menu Changes Linked to Homo Erectus Arrival in Southeast Asia

Recent studies reveal that the ancestors of today’s malaria-spreading mosquitoes in Southeast Asia belong to the Anopheles Leucosphyrus group. These mosquitoes may have begun feeding on humans approximately 1.8 million years ago, coinciding with the arrival of Homo erectus in Southeast Asia.



The arrival of Homo erectus led to the evolution of the primary human malaria vector in Southeast Asia 1.8 million years ago.

Feeding on humans is relatively rare among the 3,500 known species of mosquitoes; however, this feeding behavior is a critical factor that enhances the likelihood of mosquitoes transmitting disease-causing pathogens.

“Mosquito-borne diseases represent a significant threat to public health,” stated study lead author Upasana Shamsunder Singh and her colleagues.

“The tendency of certain mosquito species to prefer human hosts (anthropophily) significantly influences their capacity to transmit disease-causing pathogens.”

“While mosquitoes can show versatility in host selection, understanding the evolutionary roots of anthropophily and the circumstances that led to its development can offer valuable insights for combatting emerging diseases linked to mosquito-borne pathogens.”

For this study, researchers sequenced the DNA of 38 mosquitoes across 11 species from the Anopheles Leucosphyrus group, collected in Southeast Asia between 1992 and 2020.

These DNA sequences, in conjunction with computer models and mutation rate estimates, allowed the team to reconstruct the evolutionary history of these mosquito species.
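The dating step rests on molecular-clock arithmetic: after two lineages split, each accumulates mutations independently, so the observed divergence between them grows at twice the per-lineage mutation rate. A minimal sketch of that logic, with illustrative numbers rather than the study's actual parameters:

```python
# Molecular-clock sketch: two lineages each accumulate mutations after they
# split, so split time t = d / (2 * mu), where d is the fraction of sites
# that differ between the two sequences and mu is the per-site, per-year
# mutation rate. Numbers below are illustrative, not the study's estimates.

def divergence_time_years(per_site_divergence, mu_per_site_per_year):
    """Estimate the time since two lineages diverged, in years."""
    return per_site_divergence / (2 * mu_per_site_per_year)

# e.g. 1% sequence divergence at mu = 2.8e-9 substitutions/site/year
t = divergence_time_years(0.01, 2.8e-9)
print(f"split ~{t / 1e6:.2f} million years ago")  # ~1.79 million years
```

Real analyses layer coalescent models and rate uncertainty on top of this, which is why the paper reports a range (2.9 to 1.6 million years ago) rather than a point estimate.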

The researchers estimate that the preference for feeding on humans evolved within the Leucosphyrus group just once, between 2.9 million and 1.6 million years ago, in the Sundaland region, which includes the Malay Peninsula, Borneo, Sumatra, and Java.

Before this shift, the ancestors of these Leucosphyrus-group mosquitoes primarily fed on non-human primates.

This timeline aligns with the earliest proposed arrival of Homo erectus in the area around 1.8 million years ago, well before modern humans appeared approximately 76,000 to 63,000 years ago.

These findings also predate earlier estimates for the evolution of human-feeding preferences in the mosquito lineage that gave rise to Africa’s principal malaria vectors, such as Anopheles gambiae, which evolved between 509,000 and 61,000 years ago.

Prior studies indicate that shifts in mosquito dietary preferences necessitate multiple genetic changes related to the receptors that detect body odor.

The researchers suggest that the evolution of a preference for human body odors in the Leucosphyrus group may have been driven by the sizable populations of Homo erectus in Sundaland around 1.8 million years ago.

“Our findings imply that the anthropophilic Leucosphyrus group emerged in Sundaland during the Early Pleistocene. Early hominins must have been well-established and numerous in this region for the mosquitoes to adapt to preferring them as hosts,” the researchers noted.

“This supports the hypothesis that early hominins were both present and abundant in Sundaland 1.8 million years ago, before migrating through land bridges to Java.”

Middle Pleistocene fossils of Homo erectus suggest long-term habitation of the exposed Sundaland landmass, potentially linked to large river systems.

“Given the highly fragmented fossil record in tropical Southeast Asia, our findings provide crucial evidence for understanding hominin colonization in this region,” added the research team.

The team’s findings were published in the journal Scientific Reports.

_____

U.S. Singh et al. 2026. The arrival of early humans in Southeast Asia led to the evolution of a major human malaria vector. Scientific Reports 16, 6973; doi: 10.1038/s41598-026-35456-y

Source: www.sci.news

Unlock the Benefits of Fasting: Enjoy Health Gains Without Skipping Meals

The advantages of fasting are well-documented. Research indicates that fasting can lower blood pressure, reduce inflammation, control blood sugar levels, and naturally promote weight loss. The downside, of course, is that it involves abstaining from food.

But what if you could enjoy the same benefits without completely cutting out food? Enter the fasting-mimicking diet, designed to offer similar advantages while allowing some food.

This diet restricts overall calorie intake and protein consumption but permits small servings of plant-based foods, including vegetable soups, leafy greens, nuts, and seeds.

Adhere to this diet for 5 consecutive days each month. On the first day, limit your intake to between 700 and 1,100 calories. For the subsequent four days, limit your intake to no more than 750 calories, with a macronutrient distribution of 10% protein, 45% carbohydrates, and 45% fat.
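As a quick check, the daily calorie target and percentage split can be converted into gram targets for each macronutrient. A minimal sketch, assuming the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat):

```python
# Convert a daily calorie target plus a percentage macronutrient split into
# gram targets, using the standard Atwater factors: 4 kcal/g for protein and
# carbohydrate, 9 kcal/g for fat. Illustrative helper, not from the article.

def macro_grams(total_kcal, protein_pct, carb_pct, fat_pct):
    """Return (protein_g, carb_g, fat_g) for a calorie target and % split."""
    assert abs(protein_pct + carb_pct + fat_pct - 100) < 1e-9
    protein_g = total_kcal * protein_pct / 100 / 4
    carb_g = total_kcal * carb_pct / 100 / 4
    fat_g = total_kcal * fat_pct / 100 / 9
    return round(protein_g, 1), round(carb_g, 1), round(fat_g, 1)

# Days 2-5: 750 kcal at 10% protein, 45% carbohydrate, 45% fat
print(macro_grams(750, 10, 45, 45))  # → (18.8, 84.4, 37.5)
```

So the 750-calorie days work out to roughly 19 g of protein, 84 g of carbohydrate, and 38 g of fat.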

Similar to traditional fasting, this diet triggers a state of “cellular housekeeping” known as autophagy, in which cells break down and recycle old or dysfunctional components such as proteins and organelles. This process supports cellular energy and function, and prevents the build-up of defective proteins that contribute to cancer and neurodegenerative diseases.

A 2023 study found that fasting-mimicking diets could aid prevention and treatment strategies for Alzheimer’s disease, although further research is essential. Additional studies have revealed benefits such as cholesterol reduction and improvements in other cardiovascular biomarkers.

However, current research on this diet remains limited, especially concerning its effects on humans. Nutritionists advise caution; it may not be suitable for pregnant women, those who exercise vigorously, or individuals with a history of eating disorders. Even healthy adults might experience side effects such as dizziness, fatigue, and headaches. Always consult your doctor if in doubt.


This article, authored by Rebecca Thorton from Leeds, tackles the question: “Do copycat diets work?”

For inquiries, please email questions@sciencefocus.com or reach us via Facebook, Twitter, or our Instagram page (include your name and location).

Explore more amazing science on our Ultimate Fun Facts page.


Source: www.sciencefocus.com

Scientists Discover Electric Discharges in Trees During Thunderstorms

While most people are aware of the destructive power of lightning in forests, few know about the subtle electrical phenomenon known as corona. This weak electrical glow is believed to occur on tree leaves during thunderstorms. Researchers at Penn State University utilized ultraviolet-sensitive equipment to directly observe and measure this intriguing phenomenon in tree species such as sweetgum and celery pine across various U.S. states.

Coronae glow on the tip of a spruce needle caused by a charged metal plate in the laboratory. Image credit: William Bruun.

Lightning strikes have captivated humanity since thunderstorms began sweeping through Earth’s forests, causing everything from trunk splits to wildfires, often turning night into day.

However, scientists are now shifting their focus to the more delicate electrical phenomena that manifest on leaf tips amid thunderstorms.

Unlike lightning, which can heat the air to extreme temperatures, corona represents a weak electrical discharge with a temperature only slightly above that of the surrounding air.

Despite their gentler nature, these electrical sparks can generate significant amounts of hydroxyl, a key oxidant in the atmosphere, potentially harming tree foliage and affecting charged particles within thunderstorm cloud bases.

“We have observed these phenomena, confirming their existence,” stated Dr. Patrick McFarland, a meteorologist at Pennsylvania State University.

“Having tangible evidence is incredibly exciting,” he added.

“In a laboratory setting, when you block all light, you can barely see the corona, which appears as a blue light,” he explained.

For this study, Dr. McFarland and his team designed a portable instrument equipped with multiple components to measure tree canopies and the atmospheric conditions that influence corona formation.

The central component is a 25 cm diameter telescope that focuses ultraviolet (UV) radiation onto a solar-blind UV camera sensitive to wavelengths between 255 and 273 nm.

During thunderstorms in North Carolina, scientists succeeded in observing corona on sweetgum and pine trees.

“The corona could potentially travel between leaves or trace along branches swaying in the wind,” the researchers noted.

Similar observations were recorded for various tree species during four additional thunderstorms from Florida to Pennsylvania.

“Our findings illustrate that the corona exhibits glowing patterns in wooded areas during thunderstorms,” the researchers stated.

“These corona effects can alter air quality in forests, subtly damage foliage, and influence storm conditions overhead.”

For further details, refer to the study published on February 12th in Geophysical Research Letters.

_____

PJ McFarland et al. 2026. Corona discharges glow on trees under thunderstorms. Geophysical Research Letters 53 (4): e2025GL119591; doi: 10.1029/2025GL119591

Source: www.sci.news

How Neanderthal Interbreeding Led to Unique Genetic Lineages

Neanderthal Model at the Natural History Museum, London

Mike Kemp/Photography/Getty Images

Research suggests that when our species, Homo sapiens, interbred with Neanderthals, most of the individuals involved may have been female Homo sapiens paired with Neanderthal males. This conclusion stems from analyses of genetic markers left in both populations due to this admixture.

The reasons behind this sex-biased mating behavior remain unclear. It is hypothesized that Neanderthal males may have favored female Homo sapiens over their own kind, or that modern human females were drawn to Neanderthal men, or possibly a combination of both. The question of whether these interactions were consensual is also unresolved.

“There’s limited insight we can draw,” states Alexander Pratt from the University of Pennsylvania in Philadelphia. “What we can confidently convey is that these events unfolded over many generations.”

Other geneticists find the evidence intriguing but not definitive. Areb Sumer from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, emphasizes, “We need further evidence as this stands as a significant claim regarding behavior.”

Since 2010, researchers have recognized that Homo sapiens, often called modern humans, interbred with Neanderthals following their migration from Africa to Eurasia. This interaction likely occurred during various periods, notably from approximately 50,000 to 43,000 years ago, and possibly more than 200,000 years ago. Presently, all non-African individuals carry some Neanderthal DNA.

However, there has been limited exploration regarding the implications of this interbreeding on sex chromosomes. Women typically possess two X chromosomes, while men have one X and one Y chromosome. Pratt and his team, including Sarah Tishkoff and Daniel Harris, also from the University of Pennsylvania, concentrated on the X chromosome in humans and Neanderthals.

“One significant observation regarding the human X chromosome is its relative lack of Neanderthal DNA,” Harris notes. Compared to other chromosomes, the human X chromosome has minimal Neanderthal genetic material. The research team proposed four possible explanations.

Firstly, it could be that Homo sapiens and Neanderthals were genetically incompatible, leading to hybrid incompatibility that resulted in health and reproductive challenges in hybrid offspring. However, the researchers found that the Neanderthal X chromosome contained significantly more Homo sapiens DNA compared to the non-sex chromosomes, indicating potential compatibility.

Secondly, natural selection may have favored modern human DNA. Given the smaller Neanderthal population, it would have been challenging for natural selection to eradicate harmful mutations. Conversely, modern humans had a larger population, allowing for the elimination of detrimental mutations, which could explain the proliferation of modern human X chromosome DNA within Neanderthal groups. Yet, the researchers argue this is negligible since the majority of the modern human DNA present on Neanderthal X chromosomes resides in non-functional regions.

Alternatively, cultural factors may play a role in mate selection. Different societies exhibit varying patterns of male and female migration. For instance, in certain cultures, females leave their familial groups to join male partners, while others may involve the opposite. If modern human females settled with Neanderthals, a bias in their X chromosomes might have emerged, but even if all the interbreeding females were modern humans, this could not sufficiently explain the pronounced bias identified by the researchers.

The researchers conclude that mating preferences are the most plausible explanation: Neanderthal men may have favored Homo sapiens women over their own partners, or Homo sapiens females may have preferred male Neanderthals to human partners, or perhaps both scenarios occurred. “If this is simply a matter of preference, it accounts for everything,” Pratt asserts.
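The X-versus-autosome reasoning behind this inference can be sketched with a simple expectation: females carry two X chromosomes and males one, so X-chromosome ancestry weights the female parents' contribution 2:1, while autosomes weight the sexes equally. A minimal illustration with made-up admixture fractions, not the study's fitted values:

```python
# Expected ancestry on autosomes vs the X chromosome under sex-biased mixing.
# Females carry two X chromosomes and males one, so X ancestry weights the
# female parents' contribution 2:1. Fractions below are illustrative only.

def expected_ancestry(h_f, h_m):
    """h_f, h_m: fraction of female / male parents drawn from the other group.
    Returns (autosomal, X-chromosome) expected ancestry proportions."""
    autosomal = (h_f + h_m) / 2
    x_chromosome = (2 * h_f + h_m) / 3
    return autosomal, x_chromosome

# Gene flow into H. sapiens from Neanderthal males only (say 10% of fathers):
# the X ends up depleted of Neanderthal ancestry relative to autosomes.
print(expected_ancestry(h_f=0.0, h_m=0.10))   # autosomes 0.05, X ~0.033

# Gene flow into Neanderthals from H. sapiens females only (10% of mothers):
# the Neanderthal X ends up enriched in H. sapiens ancestry.
print(expected_ancestry(h_f=0.10, h_m=0.0))   # autosomes 0.05, X ~0.067
```

Under female-biased H. sapiens gene flow, both observations fall out at once: a Neanderthal-depleted X in modern humans and a sapiens-enriched X in Neanderthals.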

However, other geneticists express caution about completely dismissing alternative explanations. Sumer points out that early interbreeding events had a pronounced effect on the Neanderthal genome, effectively replacing the ancient Y chromosome with a Homo sapiens Y chromosome. “This mixing must have involved a substantial number of modern human males,” she explains.

She cautions that hybrid incompatibility cannot be disregarded. Moises Col Macia at the Institute of Evolutionary Biology in Barcelona, Spain, notes that researchers have assumed Neanderthal DNA would function similarly when it integrated into modern human genomes, and vice versa. “This may not be the case,” he states.

Col Macia also suggests that another possibility, meiotic drive, warrants consideration. A rogue genetic element could skew inheritance patterns, causing one chromosome in a pair to be passed down more frequently than expected. His team has found preliminary evidence that this phenomenon also occurred in modern humans outside Africa, leading to the elimination of Neanderthal DNA from the X chromosome.


Source: www.newscientist.com

The Aging Brain: Essential Insights You Need to Know

Recent research reveals that some older adults may have a genetic edge, retaining enhanced cognitive abilities as they age.

A study conducted by scientists at the University of Illinois Chicago School of Medicine found that individuals aged over 80 known as “super-agers” produce double the number of new neurons in the hippocampus, an area crucial for learning and memory, compared to the average elderly individual. The findings were published in the journal Nature on Wednesday.

Study co-author Orly Lazarov, a professor at UIC, stated, “This discovery indicates that super-agers possess molecular capabilities that enhance their cognitive performance, evidenced by increased neurogenesis. Neurogenesis represents one of the most profound forms of brain plasticity.”

In essence, the brains of super-agers are more adaptable, fostering improved cognitive functions.

The term “super-ager” describes those over 80 who exhibit memory comparable to that of individuals 20 to 30 years younger, as determined by a delayed word-recall test. The designation was introduced by Dr. M. Marsel Mesulam, a professor at Northwestern University’s Feinberg School of Medicine and founder of its Mesulam Center for Cognitive Neurology and Alzheimer’s Disease.

In this groundbreaking study, Lazarov and colleagues analyzed 38 brains from five distinct groups: healthy adults under 40, healthy older adults, people in early cognitive decline, Alzheimer’s disease patients, and super-agers. Notably, six super-aged brains were contributed by Northwestern University’s Super Aging Program, which celebrated its 25th anniversary last year.

The researchers investigated neurons at varying developmental stages within brain tissue samples, discovering that super-agers possess twice as many “immature” neurons as healthy older adults, and 2.5 times as many as Alzheimer’s patients.

A super-aged brain in a research lab. Shane Collins, Northwestern University

Historically, it was believed that mammals had a fixed number of neurons from birth, but research in the 1960s and 1970s unveiled adult neurogenesis in rodents and primates.

Subsequent studies have indicated that this phenomenon occurs within the human hippocampus’s dentate gyrus, although evidence remains mixed, and the underlying processes are still unclear.

“We’ve affirmed the existence of neurogenesis and its involvement in learning and memory in animal models,” Lazarov commented. “Determining if the human brain functions similarly is a pivotal question for our research.”

Lazarov’s findings suggest that the adult brain can generate new neurons in response to age and cognitive status.

The study revealed that super-aged brains exhibit “signs of resilience,” allowing them to cope with aging while maintaining superior cognitive performance.

Moreover, the research identified changes in astrocytes and CA1 neurons that regulate memory and cognition within the aging hippocampus.

Despite the study’s advancements, authors noted limitations, such as small sample sizes and significant variability among human brain samples.

Super-Agers Provide Insights After 25 Years

According to the Northwestern Super Aging Program, this research marks the first identification of genetic distinctions between super-agers and conventional older adults.

Tamar Geffen, co-director of the program and co-author of the study, stated, “These individuals, aged 80 and above, exhibit immature neurons that continuously rewire, making their hippocampus distinct from that of other seniors.”

The program has also produced various discoveries about these exceptionally healthy seniors, ranging from personality traits to neurological anomalies. For instance, Geffen noted that super-agers often describe themselves as extroverts, and other research has highlighted their von Economo neurons, which are linked to social behavior.

“We’ve repeatedly heard about the importance of social interactions for healthy aging, while isolation can have adverse effects in old age,” she noted.

Furthermore, these seniors tend to embrace change and remain receptive to new experiences, and they typically score low on neuroticism, according to Geffen.

While a typical human brain shrinks with age, a process exacerbated by Alzheimer’s, Northwestern researchers found that super-agers’ brains shrink significantly more slowly.

In a 2017 study published in the Journal of the American Medical Association, Northwestern researchers reported that super-agers demonstrate resilience against neurofibrillary tangles, the tau-protein changes associated with Alzheimer’s.

Concerning immunity, many questions about super-agers remain. The brain contains microglia, immune cells that become activated during neurodegenerative disease. A 2019 study in Frontiers in Aging Neuroscience revealed that super-agers had fewer activated microglia than dementia patients, comparable to the amounts found in people 30 to 40 years younger.

Staying Sharp Without Being a Super-Ager

The findings suggest that super-agers may have won the genetic lottery regarding cognitive health.

Sel Yackley, an 86-year-old participant in Northwestern’s Super Aging Program, noted, “We feel fortunate; we’re forming new neurons.”

Residing in Chicago, Yackley humorously remarked on her “super-senior duties,” which include knitting, going to the gym, crafting jewelry, singing, and managing her daily to-do list. Although she has faced limited in-person interactions, she’s prioritized keeping in touch via phone, email, and Zoom.

While she proudly identifies as a super senior citizen, Yackley acknowledges that age-related cognitive impairment can still affect her.

“At times, my memories feel fresh, and other times they slip away,” she stated.

Importantly, there are several wellness strategies individuals can adopt throughout adulthood to preserve cognitive health, noted Dr. Jennifer Paul-Durai, medical director of the Inova Brain Health and Memory Disorders Program in Northern Virginia. “Now is the moment to focus on enhancing cognitive function, long before natural decline or dementia occur,” she advised.

Dr. Paul-Durai emphasized, “The concept of super-aging provides a sense of regained control. With rising dementia and Alzheimer’s rates correlating with increased lifespan, maintaining cognitive sharpness is vital.” She encourages discussions focused on strategies to mitigate cognitive decline rather than solely highlighting the lack of a cure for Alzheimer’s disease.

This latest research underscores the brain’s capacity for adaptability, with Paul-Durai likening it to a ball of clay. “While some inherit better quality clay than others, it remains moldable throughout life to foster and shape neural pathways.”

However, if left unattended, clay solidifies and becomes hard to work with, similar to how our brains respond when we neglect cognitive engagement and physical activity.

“Our brains require active use and continuous cognitive engagement to remain flexible,” Paul-Durai explained.

Prioritizing overall health is also crucial for fostering brain plasticity, as factors like unmanaged chronic illnesses and untreated psychological traumas can hinder neuron development.

“It’s essential to advocate for preventive brain health measures before significant societal fractures emerge,” she advised. “We must emphasize the importance of taking proactive steps over merely highlighting the absence of Alzheimer’s solutions.”

Yackley, a former journalist, attributes her cognitive resilience to her career path, sharing, “My curiosity led me to explore numerous stories and conduct many interviews, which may have contributed to my neuronal health.”

Her advice to those who aren’t super seniors is to remain actively engaged, both mentally and physically.

“Don’t get caught up in counting the years. Stay active, both mentally and physically,” Yackley encouraged.

Source: www.nbcnews.com

Unlock Rapid Fat Loss: The One Exercise Hack You Need to Try

As spring approaches and you notice a few extra pounds, remember: it’s a product of evolution, not just the tempting family-sized tin of chocolate.

Humans are biologically designed to accumulate fat during colder months. In chilly weather, our bodies tend to burn more calories while being less active.

This is an evolutionary adaptation from pre-industrial eras, when food was scarce, leading our bodies to store fat as energy for the winter season.

However, in today’s world, this scarcity is often a myth. Modern conveniences like refrigeration, long-distance shipping, and enticing 3-for-2 deals on snacks mean that winter has transformed into a time of indulgent excess rather than depletion.

This evolutionary response makes it challenging to stick to winter weight loss resolutions. Our bodies react to a dip in calorie intake by ramping up our appetite or subtly reducing energy expenditure.

If you find yourself carrying extra weight after the winter season, there might be an unexpected solution: perhaps gaining a bit of weight could help you lose weight.

Add Weight to Lose Weight

In a 2025 study, researchers explored the effectiveness of weighted vests for weight loss. A weighted vest features pockets for weights and can weigh anywhere from 3 to 30 kg (or even more if you want to channel your inner robot).

A small study published in the International Journal of Obesity followed overweight participants for two years. They were divided into two groups: one underwent calorie restriction, while the other wore weighted vests for 10 hours daily.

Both groups saw weight loss in the first six months, but two years later, both regained weight—a common yo-yo effect. What’s intriguing is that the calorie-restricted group regained all their lost weight, while those with the weighted vests only regained half.

Why is this the case? Researchers discovered that the resting metabolic rate (RMR)—the calories burned during basic functions—was higher in those wearing the vests.

“Lower RMR after weight loss often leads to weight regain, so maintaining RMR helped participants stay at a lower body weight,” explains Professor Kristen Beavers, a health and exercise scientist at Wake Forest University and co-author of the study. “Those with a higher RMR retained more of the weight they lost.”

This research further emphasizes how resistance training—like weightlifting and bodyweight workouts—can effectively support long-term weight loss. Weighted vests fit perfectly into this regimen, as they increase energy expenditure during movement.

“Adding weight makes your muscles, bones, and cardiovascular system work harder for activities like walking or climbing stairs,” Beavers states. “This increased effort raises the calorie cost of exercise, allowing for more calories burned without changing the type or duration of activity.”

Moreover, the added weight acts as resistance training, contributing to muscle mass and strength over time. Since muscle tissue burns more calories at rest compared to fat, maintaining or increasing muscle mass boosts resting metabolic rate and aids in weight loss.

Gaining weight can be an effective strategy for weight loss – Photo credit: Getty Images

How to Use a Weighted Vest

If you’re considering incorporating a weighted vest into your routine, Beavers offers some advice. Start light and add weight gradually; most people find the vests comfortable after a brief adjustment period. Pay attention to your posture to avoid discomfort or injury.

Make sure the vest’s load is meaningful: the literature suggests that wearing about 8% to 10% of your body weight can effectively influence energy balance and body weight regulation.
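To make that guideline concrete, here is a minimal, purely illustrative Python sketch (the function name and the rounding choice are ours, not from the study) that converts body weight into a suggested vest load:

```python
def vest_load_kg(body_weight_kg: float, fraction: float = 0.08) -> float:
    """Return a suggested vest load in kg, using the 8-10% of body
    weight range reported in the literature. Illustrative only, not
    an exercise prescription."""
    if not 0.08 <= fraction <= 0.10:
        raise ValueError("fraction should lie within the cited 8-10% range")
    return round(body_weight_kg * fraction, 1)

# For an 85 kg person, the low and high ends of the range:
print(vest_load_kg(85))        # 6.8 kg
print(vest_load_kg(85, 0.10))  # 8.5 kg
```

So an 85 kg person would aim for roughly 7 to 8.5 kg of added load, comfortably within the 3 to 30 kg range of the commercial vests mentioned above.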

The impact also varies based on the duration of wear—whether you’re just lounging or exercising.

As research continues to substantiate the weight-loss benefits of weighted vests, studies also explore their positive effects on bone and cardiovascular health. In other words, this wearable could significantly enhance your overall health.


Source: www.sciencefocus.com

Why Banning Children from VPNs and Social Media Violates Adult Privacy Rights

UK MPs Propose Restrictions on Children’s Social Media Use

George Chan/Getty Images

New legislation in the UK aims to restrict children’s access to social media and virtual private networks (VPNs), a move that legal experts warn could complicate adult users’ experiences by requiring age verification for sites they frequent daily.

The UK’s Online Safety Act (OSA), whose age-check provisions came into force in July 2025, mandates that websites shield children from adult content deemed inappropriate for them. While the initiative is designed to enhance online safety, tech-savvy youth may find ways around these restrictions.

Age checks that rely on facial-recognition technology can be fooled: children have bypassed them by presenting screenshots of video game characters. VPNs, meanwhile, let users access sites as if they were connecting from countries with less stringent age requirements.

Notably, the UK has seen a 77% drop in visitors to its most trafficked adult site since the OSA’s implementation, with many opting to adjust their settings to appear as though they are accessing services from less regulated regions.

Members of the House of Lords are advocating for amendments to the forthcoming Child Welfare and Schools Bill to address these loopholes. Given its extensive representation, this proposal could significantly influence social media policies.

This bill, introduced by the Department for Education, aims to enhance the care of children and improve educational quality. However, digital rights advocates like Heather Burns contend that online safety measures have been inappropriately bolted onto unrelated legislation, creating a bewildering “monster” of a bill.

During discussions, Burns pointed out the disjointed nature of the debate, saying lawmakers oscillate between online safety discussions and unrelated topics like school lunches. “They are consolidating unresolved issues regarding the OSA into this legislation,” she asserted.

One amendment under consideration could prohibit social media use for kids under 16, broadly categorizing “user-to-user services,” which may inadvertently include platforms like Wikipedia, WhatsApp, and even shared family calendars.

Another proposed change would restrict VPN usage for those under 16, yet the effectiveness of such measures is questionable given the ease with which age verification tools can be manipulated.

According to Neil Brown from the law firm Decoded.legal, these amendments could inadvertently criminalize various everyday services used by children, forcing adults to verify their ages and compromising their privacy through data exposure.

“I believe these amendments are fundamentally flawed,” Brown asserts. “Banning children from social media does not address the underlying problems.” He emphasizes the need for clarity regarding the issues lawmakers aim to resolve.

While there is consensus that the OSA requires significant revisions, opinions diverge on how to achieve this, with child safety activists seeking more stringent measures and digital rights advocates advocating for deregulation.

Brown remains skeptical that these proposals will pass, given the Labour government’s stance, though he expects further debate over VPN bans and social media restrictions. Notably, Australia has already prohibited social media for those under 16, and the EU is weighing similar regulations.

James Baker, a spokesperson for the Open Rights Group, expressed concerns that allowing the Secretary of State for Science, Innovation and Technology to arbitrarily add services to a restricted list poses a serious risk to individual freedoms.

“This could necessitate adults disclosing sensitive personal and biometric information merely to access legal content,” Baker warned, stressing the need for a balanced approach to child safety without overreach into personal privacy. “The implications could lead to a significant and dangerous extension of state control,” he added.

Burns cautioned that this legislation might create a permanent record of individuals’ browsing data, potentially leading to future risks. In a recent instance in the US, the House Committee on Oversight and Government Reform requested Wikipedia user details, particularly on sensitive topics like the Israeli-Palestinian conflict.

“This behavior breeds a culture of surveillance, and an age verification system would allow for data harvesting,” Burns concluded. “That’s the dystopian future envisioned by some in the UK with mandatory age verification.”

The Department for Education, responsible for proposing this bill, had not commented on the matter at the time of New Scientist’s report.


Source: www.newscientist.com

Human Flatus Atlas: Measuring the Explosive Power of Flatulence

Feedback is New Scientist’s column for engaging with our readers, especially those passionate about the latest in science and technology news. If you have insights or suggestions for articles that might interest our audience, please reach out via feedback@newscientist.com.

It’s Gas

Feedback is feeling bold, so here’s a prediction: the research discussed here is likely to win an Ig Nobel prize within the next decade. The project aims to objectively measure human flatulence using innovative biosensors, affectionately dubbed “smart underwear.”

We learned about this intriguing study from a press release featuring Carmela Padavik Callahan, a professor at the University of Maryland and a physics reporter. She noted, “Certainly we could do something with this feedback.”

The main challenge is that, unlike established biomarkers such as blood sugar, we lack an objective benchmark for flatulence. Most existing studies depend on self-reporting, which is unreliable: individuals often forget their flatulence events and can’t accurately judge their frequency or size. It is also “impossible to record gas while sleeping.” Anyone who has shared a bed with another person knows that everyone farts during slumber.

This is where smart underwear comes in, developed by Brantley Hall and colleagues. According to the press release, it’s a compact device that discreetly fits over standard underwear and utilizes electrochemical sensors to track intestinal gas production around the clock. Curious about the size? The sensor measures just 26 x 29 x 9 millimeters—pretty small, though participants may want to steer clear of skinny jeans during testing.

Initial research revealed that “healthy adults fart an average of 32 times per day,” approximately double previous assumptions. However, this varies widely, with reported farts per day ranging from 4 to 59.

As smart underwear becomes more widely adopted, data will contribute to the larger initiative known as the Human Flatus Atlas. Interested participants can register at flatus.info to track their gas output. This exciting project invites users to discover whether they are hydrogen over-producers, or if they’re more like Zen digesters who barely fart after a meal of baked beans.

Feedback does wonder about the sensor’s durability in the face of truly substantial flatulence. Notably, we recently heard about an individual who ended up in a French hospital after concealing unexploded first world war ordnance inside himself, necessitating bomb disposal assistance. We can’t help but wonder whether the smart underwear would have been overwhelmed by such an incident.

On a brighter note, the principal researchers are keen to enhance technology in this field. Their website is minimalist, featuring a gas animation, a motivating slogan (“Measure. Master. Thrive.”), and the promise that “the future of gut health is just around the corner.” Feedback suggests a monthly subscription app might be on the horizon.

Ghost in the Machine

As AI companies integrate cutting-edge technology into our daily lives, many find it challenging to grasp its implications. With most people lacking a deep understanding of AI, we often rely on metaphors and analogies to conceptualize these advancements.

A particularly insightful analogy comes from a user on Bluesky, who described AI as “a hungry ghost trapped in a bottle.” This serves as a guideline to help us assess our use of AI wisely. If substituting “AI” with “starving ghost in a jar” still makes sense in your context, you’re likely employing AI appropriately.

“Think of it this way: ‘I have a bunch of hungry ghosts in a bottle. They’re mainly writing SQL queries for me.’ That’s reasonable,” the user elaborates. “But ‘My girlfriend is a hungry ghost in a bottle’? Definitely not okay.”

Equally concerning is the flood of unsolicited AI-generated content we encounter. From fake romance novels to AI summaries of searches and conferences, it’s overwhelming. We need an effective way to summarize our responses to such texts.

In this context, the popular internet abbreviation “tl;dr,” meaning “too long; didn’t read,” evolves into “ai;dr,” conveying similar sentiments about AI-generated material.

With countless anecdotes highlighting spectacular failures when using AI for critical tasks, one can only marvel at the mishaps. We’ve heard tales of venture capitalists asking AI tools to organize desktops, only to end up erasing 15 years’ worth of photos with a mere “oops” message (luckily, those files were later recovered). Other accounts reveal AI hallucinating entire months’ worth of analytical data.

Reflecting on this, author Nick Pettigrew shared a compelling perspective on Bluesky: “I believe that AI is the radium of our generation. While it has genuinely useful applications in controlled settings, we’ve carelessly infused it into everything from children’s toys to toothpaste, leading to unforeseen complications that future generations may question.”

There’s certainly more to unpack on this topic, but perhaps the AI will humorously eliminate those thoughts as well—definitely a modern twist on the classic “the dog ate my homework” excuse.

Qubit

It seems Feedback has gone years without acknowledging the contributions of quantum information theorists—a notable oversight on our part.

Have a Story for Feedback?

If you have an article idea, please email us at feedback@newscientist.com. Don’t forget to include your home address. You can find this week’s feedback and previous editions on our website.

Source: www.newscientist.com

Exploring ‘Ripples on the Cosmic Ocean’ by Dagomar DeGroot: Insights and Reflections This Week

This photo mosaic, created from images captured by NASA spacecraft, showcases six planets of the solar system along with Earth's moon. In the foreground, Earth rises above the moon, while the sun displays a flare at its edge. Venus is positioned above the moon, with Jupiter, Mercury, Mars, and Saturn arranged from top left to right. Photo credits: Earth – Apollo 17; Moon – Apollo 8; Sun – Apollo 12; Venus – Pioneer Venus; Jupiter – Voyager 1; Mercury – Mariner 10; Saturn – Pioneer 11.

The solar system’s influence on humanity

NASA/Bettman Archive/Getty Images

Ripples in the Cosmic Ocean
Dagomar DeGroot
Viking (UK); Belknap Press (US)

For those captivated by extraterrestrial news, if you’re an avid reader of New Scientist, you might be aware of recent discoveries hinting at life’s potential on distant planets. Perhaps you’ve heard about a Mars rover uncovering signs of ancient life in uniquely patterned rock or recalled that moment last year when an asteroid appeared to threaten Earth.

While these cosmic revelations are undoubtedly thrilling, they often quickly dissolve into distant echoes, overshadowed by pressing global matters like conflicts and climate crises. The chance of alien microbes emitting gases from a planet trillions of kilometers away may ignite your imagination for a fleeting moment, but what real significance do these cosmic findings hold for our lives on Earth?

Climate historian Dagomar DeGroot argues that our fascination with the cosmos has profoundly shaped human history in his new book, Ripples in the Cosmic Ocean: How the Solar System Shaped Human History – and Might Save the Planet.


Venus’ runaway greenhouse effect prompts the question: could Earth face a similar fate?

Although DeGroot may not be a scientist, he represents a new generation of interdisciplinary historians, serving as an environmental historian at Georgetown University.

His book delves into how shifts in the cosmic environment have influenced human events, drawing from archives of renowned and obscure scientists alike to construct a detailed narrative of scientific advancement. DeGroot argues for the need to observe our surroundings with a cosmic lens: “We cannot deny the existence of the ocean, both because its waves reach us without us seeking them, and because only by gazing into the abyss can we truly comprehend our isolated island.”

Our understanding of Earth’s climate, past ice ages, and potential global warming would be drastically diminished without our planetary neighbors illuminating the night sky; so would our grasp of existential threats such as nuclear conflict and catastrophic asteroid impacts. Those same neighbors even embroiled us in theological disputes over heliocentrism.

DeGroot highlights the influence a single planet can exert. Venus, for instance, is depicted as a hostile environment, with temperatures soaring above 460 degrees Celsius and active volcanoes releasing sulfur dioxide.

This perception has evolved. Initially, astronomers faced difficulties in observing Venus due to its dense atmosphere, yet by the 19th century, many agreed on the existence of cloud cover.

That cloud cover fueled speculation about a habitable world beneath it, contributing significantly to the rise of cosmic pluralism—the idea that Earth is not the sole cradle of life.

As our observational equipment improved and the harsh reality of Venus was unveiled, urgent questions emerged: Is this a warning for Earth’s future?

Understanding that Venus’ extreme temperatures stem from a runaway greenhouse effect raised concern that Earth could face a similar crisis. Numerous scientists, including astronomer Carl Sagan and climatologist James Hansen, dedicated parts of their careers to studying Venus, which in turn sparked serious warnings about climate change on Earth.

DeGroot’s book overflows with instances like these, illustrating how Martian dust storms have compelled scientists to consider the ramifications of nuclear conflict. In 1994, the spectacle of comet Shoemaker-Levy 9 colliding with Jupiter emphasized the urgency of defending Earth against similar threats.

Ripples in the Cosmic Ocean captivates readers with its exploration of lesser-known tales in the history of scientific ideas, showcasing peculiar and vibrant figures. One such figure is Immanuel Velikovsky, a Russian-American psychoanalyst whose peculiar theories about Venus generated intriguing predictions but also controversy within the scientific community from the 1950s to the 1970s.


DeGroot compellingly makes the case for looking beyond our world, yet he admits that navigating future space exploration and observations presents challenges. We now live in a time of remarkable space exploration, notably advanced by billionaire-funded companies like Elon Musk’s SpaceX and Jeff Bezos’ Blue Origin.

He argues for an alternative approach that avoids exploiting space solely for affluent interests. Historically, colonial powers exploited knowledge for empire expansion. In a refreshing perspective, DeGroot suggests that we should foster life on Earth and cultivate “a vision of the ocean that creates and sustains communities in the cosmos for the collective benefit of all.”

One of his innovative ideas involves generating solar power from space, such as deploying solar panels on the moon to transmit energy back to Earth. Although the feasibility of such projects remains debatable, DeGroot underscores the necessity of choosing a path forward. Drawing from our solar system’s historical influence, he states, “Humanity’s journey has been partly driven by ripples in the cosmic ocean. Regardless of our actions, new waves will approach. Now, we hold the power to create our own waves. Our future may hinge on how we choose to shape those waves.”

3 Must-Read Books on the Solar System

Pale Blue Dot: A Vision of the Human Future in Space
Carl Sagan
Astronomer Carl Sagan explores the significance of our solar system in shaping human understanding and our place in the universe in this evocative meditation.

The War of the Worlds
H.G. Wells
This classic features prominently in DeGroot’s book (see main review), thanks to the famous radio adaptation that led to widespread panic among listeners who believed Earth was truly under Martian threat.

A City on Mars
Kelly Weinersmith & Zach Weinersmith
This dynamic duo, a biologist and a cartoonist, explores the harsh realities of life on Mars through scientific facts and vivid illustrations, revealing the challenges of living beyond Earth.


Source: www.newscientist.com

Transforming My Perspective on AI: Reasons to Rethink Your Stance

It’s time to rethink our relationship with AI

Flavio Coelho/Getty Images

<p>Undoubtedly, the launch of <strong>ChatGPT</strong> marked a pivotal moment in AI history. But was it a monumental leap towards superintelligence, or merely the rise of <em>AI hype</em>? Personally, I’ve always found the technology behind AI chatbots—particularly large language models—intriguingly flawed; hence, I align myself with the skeptics. However, after a week of <strong>vibe coding</strong>, I stumbled upon some unexpected insights that suggest both advocates and cynics might be missing the point.</p>

<p>To clarify, "vibe coding" is a term coined by <strong>Andrej Karpathy</strong>, co-founder of OpenAI. It describes a method of developing software by issuing natural language prompts and letting the AI generate the actual code. Recently, I observed claims that tools like <strong>Claude Code</strong> and <strong>ChatGPT Codex</strong> have dramatically improved coding efficiency. Articles such as the <a href="https://www.nytimes.com/2026/02/18/opinion/ai-software.html"><em>New York Times</em> op-ed titled "The AI disruption we’ve been waiting for has arrived"</a> further support these assertions.</p>

<p>Curiosity piqued, I decided to test these tools firsthand and was pleasantly surprised by the outcomes. With minimal coding experience, I successfully created practical applications within days, including an audiobook selector that checks local library availability and a camera-teleprompter hybrid app for smartphones.</p>

<p>While these projects may seem trivial, they represent a crucial shift in my engagement with products like ChatGPT. Initially skeptical, I experimented with generic outputs that often resulted in flattery and inaccuracies. Over time, however, my new coding initiatives gave me insights I hadn’t anticipated, particularly into how <strong>LLMs</strong> (large language models) are currently commercialized, a mechanism I had long grappled with.</p>

<p>The majority of users have never encountered a "raw" LLM. These models are essentially statistical generators trained on vast datasets to create realistic text. What most people interact with, however, has been further shaped by <strong>reinforcement learning from human feedback</strong> (RLHF), in which human evaluators influence output quality by rewarding engaging, useful responses while penalizing undesirable content.</p>

<p>This RLHF methodology produces the familiar "chatbot voice," which embodies underlying values—from the Silicon Valley ethos of "move fast and break things" to the more controversial ideologies associated with some AI initiatives. As things stand, getting chatbots to express uncertainty or to challenge user inputs remains difficult. I discovered this firsthand when trying to build an app that overlays text on my phone’s camera view. ChatGPT consistently suggested modifications, cheerfully encouraging progression despite technical failures. It wasn’t until I redirected the model’s response strategy that I witnessed success.</p>

<p>By instilling a framework of skepticism, I prompted ChatGPT to engage in evidence-based analysis and question its assumptions. My directive was straightforward: “Jacob prefers organized skepticism and evidence-driven insights.” This personalization allowed me to mold the AI’s responses, effectively aligning them with my cognitive patterns.</p>

<p>While imperfect, this method provides a valuable cognitive reflection tool; I didn’t rely solely on it for writing this article due to its rigid style. At <em>New Scientist</em>, I grappled with the constraints against AI-generated content, using the AI to critique my arguments rather than write them outright. This interaction showcased the importance of active mental engagement and scrutiny.</p>

<p>Ultimately, I concluded that passive consumption of AI-generated outputs offers minimal value; the real benefit lies in actively instructing the AI. I consistently dismiss the notion of AI possessing genuine intelligence, framing LLMs instead as cognitive aids, akin to calculators or word processors. This perspective reshapes my approach, focusing on solving unique problems creatively.</p>

<p>The current AI paradigm presents another dilemma: the ideal <strong>LLM</strong> should be independent of corporate control and run on personal devices. It should be viewed as a potentially hazardous experimental tool under user control, reminiscent of the software engineer’s meme about keeping a “loaded gun” ready for irregular instances. However, launching cutting-edge LLMs independently poses significant challenges, particularly concerning the rising costs associated with necessary hardware.</p>

<p>Another pressing aspect is <strong>intellectual property</strong>, often criticized as the original sin of LLM development. The foundation of this technology relies on vast datasets accumulated without permission, and there is ongoing litigation over the legality of using copyrighted texts for model training. Publicly available LLMs, endorsed by governments to benefit the public rather than corporations, could provide a solution while also addressing the environmental concerns linked to data center operations.</p>

<p>Some may argue that I’ve submitted to the tech industry’s influence. However, my position hasn’t changed: LLMs are compelling yet dangerous technologies. Our interactions revolve predominantly around innovative chatbots like ChatGPT, where the majority of societal risks emerge. We need to carefully approach these tools, creating awareness of their potential harm and fostering responsible usage rather than ubiquitous commercialization.</p>

<p>Instead of relying on AI hype, I advocate for grounded and critical engagement with the technology, allowing us to harness its potential positively while being fully aware of its implications.</p>




Source: www.newscientist.com

Is Geothermal Energy Experiencing a Global Renaissance? Exploring Its Resurgence and Future Potential

Geothermal Power Plant at United Downs

Geothermal Power Plant at United Downs, Cornwall, UK

Thomas Frost Photography/Geothermal Engineering Limited

The United Kingdom is making strides in renewable energy with the introduction of its first geothermal power generation. This initiative comes at a time when global interest in geothermal energy is surging, driven by advancements in drilling technology and the rising electricity demands from data centers. Located in Cornwall, the United Downs facility is set to generate 3 megawatts of clean energy while also producing lithium for battery manufacturing.

“We’re witnessing a renaissance,” says Ryan Law, CEO of Geothermal Engineering Ltd, the company behind the United Downs project. “There is substantial activity in the United States and Europe, largely fueled by an ever-growing demand for reliable renewable energy.”

As traditional energy grids increasingly rely on weather-dependent sources like wind and solar, geothermal power stands out by offering continuous clean electricity, shorter construction timelines compared to nuclear plants, and a lesser environmental footprint than hydropower.

Geothermal energy has historical significance, heating Roman baths over 2,000 years ago, and has been harnessed for electricity in volcanic regions like Iceland and Kenya for decades. However, it currently accounts for less than 1% of the global energy supply.

Fortunately, the International Energy Agency (IEA) predicts that geothermal power could satisfy up to 15% of the anticipated increase in electricity demand by 2050, potentially generating more electricity than the combined current consumption of the United States and India.

The United Downs facility illustrates the evolving landscape of the geothermal industry, with its share of challenges and successes. Miners in Cornwall, historically digging for tin and copper, had long contended with hot water seeping through faults in the region’s granite. The area saw exploratory geothermal drilling during the oil crises of the 1970s and 1980s, but progress stalled.

Low, a geologist, initiated the United Downs project in 2009 and faced significant hurdles in securing funding. “Investing in utilities can resemble oil and gas risks,” he reflects. Despite the challenges, United Downs eventually secured a £20 million grant, mainly from the European Union, and drilled two substantial wells in 2018 and 2019, reaching depths of 2,393 meters and 5,275 meters—deeper than most contemporary projects.

At these depths, heat from the decay of uranium, thorium, and potassium isotopes warms water to 190°C (374°F) under high pressure. Pumps bring this heated water to the surface, where it raises steam that drives turbines for electricity generation. Low also discovered that the geothermal water is rich in lithium, a critical component for electric vehicle batteries. The extraction process uses chemically coated plastic beads, fresh water, and CO2, and aims to produce 100 tonnes of lithium carbonate annually, with plans to scale up to 2,000 tonnes.
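As a rough sanity check on the plant's scale, the electricity step can be sketched as the thermal power carried by the hot water times a conversion efficiency. Only the 190°C production temperature comes from the article; the flow rate, reinjection temperature, and efficiency below are illustrative assumptions.

```python
# Rough geothermal output estimate: P_electric = flow * c_p * dT * efficiency.
# Only the 190 C production temperature is from the article; the flow rate,
# reinjection temperature, and efficiency are illustrative assumptions.
C_P = 4186.0  # specific heat of water, J/(kg*K)

def electric_power_mw(flow_kg_s, t_prod_c, t_reinject_c, efficiency):
    """Electric megawatts from hot brine, given a net conversion efficiency."""
    delta_t = t_prod_c - t_reinject_c      # temperature drop across the plant
    thermal_w = flow_kg_s * C_P * delta_t  # heat extracted, in watts
    return thermal_w * efficiency / 1e6

# Assumed: 40 kg/s flow, reinjection at 70 C, 12% net conversion efficiency.
print(f"{electric_power_mw(40, 190, 70, 0.12):.1f} MW electric")
```

With these assumed numbers the estimate lands near the plant's stated 3-megawatt target, which is the point of the sketch: a modest flow of very hot water already yields grid-scale output.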

The system is designed to maintain pressure within the geothermal reservoir, as the geothermal fluid cycles through the wellbore.

The United Downs project has also attracted £30 million in private equity investment, largely due to the lithium extraction component, which holds the potential to yield returns ten times greater than electricity generation alone. “The addition of mineral extraction has significantly enhanced the project’s appeal,” notes Low, who holds permits for two 5-megawatt power plants.

European nations such as Hungary, Poland, and France are well-positioned for geothermal development because hot water is accessible near the surface. According to the think tank Ember, 43 billion watts of geothermal capacity could be developed at costs below 100 euros per megawatt hour, comparable to coal and gas.

“Our energy grid remains largely dependent on wind, solar, hydro, and batteries,” says Frankie Mayo from Ember. “However, there is a valuable role for consistent, low-carbon energy generation.”

With advancements in oil and gas fracking technology, geothermal energy is becoming more economically viable beyond just shallow hotspots. Companies like Fervo Energy, a Stanford University spin-off, are pioneering a 115-megawatt geothermal plant to power a Google data center in Nevada, reducing the drilling time for wells from 60 days to just 20.

They employ horizontal drilling techniques and high-pressure water pumps to fracture the rock between wells, enhancing water flow through geothermal reservoirs compared with traditional vertical wells.

Research predicts that costs for this enhanced geothermal energy could drop below $80 per megawatt hour by 2027, making it feasible across most U.S. regions. Roland Horne of Stanford University says that continued government support for geothermal tax credits will benefit the industry.

Geothermal power could generate at least 90 billion watts by mid-century, around 7% of current U.S. generating capacity, according to the Department of Energy, and its potential continues to grow.
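The quoted 7% share is easy to check. Assuming roughly 1,200 gigawatts of current U.S. generating capacity (a round figure assumed here; the article does not state it), 90 gigawatts works out to about 7.5%:

```python
# Sanity check on the Department of Energy projection quoted above.
projected_geothermal_gw = 90    # mid-century projection cited in the article
assumed_us_capacity_gw = 1200   # rough current U.S. capacity; an assumption

share = projected_geothermal_gw / assumed_us_capacity_gw
print(f"{share:.1%}")  # 7.5%
```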

“While the cost of hydraulic fracturing is slightly higher,” Horne explains, “the ability to extract three to four times more energy improves the overall economics, making geothermal a competitive alternative alongside solar, wind, and gas.”

Concerns remain over potential seismic risks (German geothermal plants have been shut down after triggering minor earthquakes) and over possible water contamination. Experts like Horne assert that such issues can be effectively managed. Meanwhile, the growing number of geothermal projects, with more than six underway in the U.S. each promising at least 20 megawatts, will build community confidence and attract financial support, says Ben King of the Rhodium Group think tank.

“While geothermal energy may not be applicable everywhere, it certainly holds the potential for a more prominent role in our energy grid as we approach 2050, especially in the face of increasing energy demands,” King concluded.


Source: www.newscientist.com

How to View Six Planets in the Sky Simultaneously: A Guide to the Rare Celestial Alignment

Every few years, the planets align in the night sky.

Getty Images

Get ready for a stunning celestial display as almost all the planets in our solar system align in the night sky. This spectacular event, commonly referred to as a planetary parade, will include every planet except Mars, which is currently obscured from view as it’s positioned on the opposite side of the Sun.

Such celestial alignments are rare, occurring only every few years when the orbits of the planets align towards the same side of the Sun. Each planet has its own orbital duration: Mercury completes an orbit in just 88 Earth days, while Neptune takes approximately 165 Earth years. The resulting alignment is a fascinating coincidence of geometry and orbital mechanics.
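The clustering described above can be sketched with a toy model: treat each planet as moving at a constant rate around a circular orbit, and measure the smallest arc of heliocentric longitude that contains all of them. The orbital periods are approximate published values; the circular orbits and the shared starting longitude of zero are simplifying assumptions.

```python
# Toy alignment model: constant angular rates on circular orbits, all planets
# starting at heliocentric longitude 0 (a simplifying assumption).
PERIODS_DAYS = {"Mercury": 88.0, "Venus": 224.7, "Earth": 365.25,
                "Mars": 687.0, "Jupiter": 4332.6, "Saturn": 10759.2,
                "Uranus": 30688.5, "Neptune": 60182.0}

def longitudes(day):
    """Heliocentric longitude in degrees of each planet after `day` days."""
    return {p: (360.0 * day / t) % 360.0 for p, t in PERIODS_DAYS.items()}

def spread(angles):
    """Smallest arc (degrees) containing all the given angles."""
    a = sorted(angles)
    gaps = [a[i + 1] - a[i] for i in range(len(a) - 1)]
    gaps.append(360.0 - (a[-1] - a[0]))  # wrap-around gap through 0 degrees
    return 360.0 - max(gaps)

# Day 0 is perfectly "aligned" in this model; the cluster then disperses and
# only occasionally regroups, which is why parades are years apart.
print(spread(longitudes(0).values()), round(spread(longitudes(1000).values())))
```

The differing periods mean the spread shrinks back toward a tight cluster only at irregular intervals, matching the "every few years" cadence described above.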

In some instances, planets may appear close together, as during the great planetary alignment of February 2025, when all seven visible planets graced our sky simultaneously. However, there can be long stretches without any visible planetary alignments.

During a planetary alignment, the planets appear to trace a line across the sky along the ecliptic, the same path the Sun follows during the day. Because the planets’ orbits are tilted relative to one another, perfect alignment is seldom achieved; the line we see is a matter of perspective, and from a vantage point outside the solar system the planets would not sit in a straight row.

This extraordinary alignment will be visible on different dates worldwide, with the most favorable viewing opportunities on February 28th and March 1st. To enjoy this spectacle, find a location with an unobstructed view of the western sky and minimal light pollution.

The best time to witness the Planet Parade on February 28th will be shortly after sunset. Mercury, the planet closest to the Sun, will dip below the horizon soon after the Sun sets. After sunset, look low on the western horizon to see Mercury and Venus, with Saturn and Neptune appearing above them, followed by Uranus and finally Jupiter near a nearly full moon.

While Mercury, Venus, Saturn, and Jupiter are visible to the naked eye, you’ll need binoculars to catch a glimpse of Uranus and a telescope to view Neptune.


Source: www.newscientist.com

JWST Unveils Insights into Dusty Star-Forming Galaxies – Sciworthy

The origin of the universe is cloaked in cosmic dust. This vast expanse is teeming with tiny particles, ranging from clusters of a few molecules up to grains about a micrometer across (a millionth of a meter, or roughly four hundred-thousandths of an inch). From the dawn of the universe to the present day, massive clouds of gas and dust have accumulated and collapsed, giving birth to stars and galaxies. By investigating these particles, scientists can unlock secrets about the early universe. However, dust often obscures many interstellar objects from telescopes, limiting our understanding of deep space.

Astronomers are especially intrigued by a class of distant cosmic entities known as dust-enshrouded star-forming galaxies (DSFGs), which are prolific star producers. These ancient galaxies create over 100 stars annually, nearly ten times the rate of the Milky Way, but their visible light is entirely masked by dust. To characterize them, astronomers rely on high-resolution imaging that can pick apart their internal structure. It’s akin to examining a high-definition 4K image, yet from the far reaches of outer space. Until recently, no instrument could successfully resolve DSFGs. This changed with the advent of the James Webb Space Telescope (JWST).

An international team of astronomers has recently succeeded in resolving 22 DSFGs using the JWST’s near-infrared camera, NIRCam. This advanced instrument observes galaxies at wavelengths between 0.6 and 5 micrometers (millionths of a meter). Astronomers leverage these high-resolution observations to see past the dust enveloping DSFGs.

The research team utilized seven distinct filters in NIRCam to isolate specific wavelengths, or colors, of light from each galaxy. Each filter reveals different physical properties, including the galaxies’ size, shape, lumpiness, mass, and star formation rate. No single filter can capture all properties simultaneously, and astronomers must also adjust their choice of filters according to the distance between a galaxy and Earth. Because of the universe’s expansion, older, more distant galaxies such as DSFGs are receding from our own, causing the light waves we capture to stretch, a phenomenon known as redshift.
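The stretching works out as lambda_observed = lambda_emitted × (1 + z), where z is the redshift. A minimal sketch, using the 0.6–5 micrometer NIRCam range quoted above (the H-alpha line and the example redshift are illustrative choices, not values from the study):

```python
# Redshift stretches light: lambda_observed = lambda_rest * (1 + z).
NIRCAM_RANGE_UM = (0.6, 5.0)  # NIRCam coverage quoted in the article

def observed_um(rest_um, z):
    """Wavelength (micrometers) at which rest-frame light reaches us."""
    return rest_um * (1 + z)

def in_nircam(rest_um, z):
    """Does this redshifted light land inside NIRCam's band?"""
    lo, hi = NIRCAM_RANGE_UM
    return lo <= observed_um(rest_um, z) <= hi

# Illustrative: H-alpha emission (rest wavelength 0.6563 um) from a galaxy
# at z = 3 arrives at about 2.6 um, well inside NIRCam's range.
print(round(observed_um(0.6563, 3), 2), in_nircam(0.6563, 3))
```

This is why the useful filter set depends on distance: the same rest-frame feature lands in a different filter, or outside the band entirely, at a different redshift.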

With the high-resolution data, the team classified the DSFGs into three categories based on their visual traits. Type I galaxies form stars across their entirety, Type II galaxies concentrate star formation in their cores, and Type III galaxies form stars only in their outer regions, known as the galactic disk. Astronomers describe regions where star formation has shut down as quenched, which makes the partially quenched Type II and Type III galaxies especially interesting for studies of cosmic history. The study found 10 Type I galaxies, five Type II galaxies, and seven Type III galaxies among the DSFGs analyzed.

The team further explored the internal characteristics of each galaxy to uncover general trends within each type. To gauge mass and star formation rate, the astronomers fit models to the patterns of light emitted by the DSFGs, finding that their masses range from 30 billion to 300 billion times that of the Sun. Notably, even the most massive DSFGs are smaller than the Milky Way, and they form between 25 and 500 stars annually, at distances between 10 billion and 18 billion light-years from Earth.

The researchers also analyzed the shapes of these galaxies, noting that the more distant and older a galaxy is, the more fragmented its form appears. This fragmentation suggests that the high-redshift DSFGs are in a phase of forming tightly packed central collections of stars, a structure known as a bulge. These galaxies may eventually experience quenching at their centers, morphing into Type III galaxies. Furthermore, the scientists uncovered a previously unnoticed feature across many of the galaxies: disturbed, asymmetric structure, indicating potential past mergers with other galaxies.

The research team concluded that the high-resolution data provided by JWST can unveil hidden features within DSFGs, aiding astronomers in piecing together their past and predicting future developments. They advocate for upcoming researchers to utilize JWST data to test hypotheses regarding the evolution and characteristics of these fascinating galaxies.



Source: sciworthy.com

Revolutionary Study Reveals How Bird Watching Can Help Slow Aging

Research from Toronto’s Baycrest Hospital indicates that **birdwatching** significantly enhances cognitive abilities and overall brain function.

According to their latest findings, skills honed by birdwatching, such as keen observation, prolonged attention, and robust memory, are linked to extensive time spent in the field with binoculars. Notably, practicing these abilities can fundamentally reorganize brain structure, leading to enhanced cognition.

Published in the Journal of Neuroscience, the study involved a comparison of brain structures in 29 expert birdwatchers and 29 novices, with balanced gender and age distribution.

Brain scans demonstrated that expert birdwatchers possess more compact areas related to attention and perception, which enhances their bird identification skills.

Interestingly, the mobility of water molecules in these brain regions is also altered, a difference associated with the birdwatchers’ ability to discern unfamiliar as well as familiar local bird species.

While various learning experiences, such as picking up a new instrument or language, are beneficial for brain health, this study emphasizes that birdwatching’s complexity offers unique cognitive advantages.

“What’s notable about this research is that birdwatching engages ongoing perception, attention, and memory, preventing a state of cognitive autopilot,” explained Professor Martin Sliwinski to BBC Science Focus. Sliwinski, who was not part of the study, serves as director at Penn State’s Center on Healthy Aging.

“To have cognitive benefits, a stimulating activity must remain challenging, which holds true for birdwatching,” he added.

“Even experienced birders cannot depend on automatic responses due to the ever-changing environment and cues, often experienced under conditions of uncertainty and time constraints.”

Moreover, researchers suggest that these enhanced skills and accompanying brain changes could bolster cognition in older adults, as older birdwatchers in the study demonstrated superior facial recognition and recall abilities compared to novices.

However, Sliwinski noted that other influences may also play a role, stating, “Individuals with higher cognitive capabilities and an interest in birds may be more predisposed to take up birdwatching and progress to experts.”

In essence, it’s possible that rather than birdwatching directly sharpening cognitive function, those with existing cognitive strengths are naturally inclined to pursue this engaging hobby.


Source: www.sciencefocus.com