Revolutionary CAR-T Cell Therapy Restores Bedridden Woman to Full Health

CAR T Cells: Genetic Modification Process to Combat Autoimmune Diseases

Diagram of CAR T Cells: Genetic Modification to Combat Autoimmune Diseases

Christoph Burgstedt/Science Photo Library

A woman suffering from three autoimmune diseases has found remarkable relief after undergoing CAR T cell therapy. Following genetic modification of her immune cells, she didn’t require treatment for nearly a year, thanks to these engineered cells effectively targeting and eliminating rogue cells in her body. “When we first met, she was bedridden and at death’s door. After treatment, she was out of bed within seven days,” stated Fabian Müller from Erlangen University Hospital, Germany. Remarkably, she made a full recovery within months, and an 11-month post-treatment check confirmed her continued good health.

This case represents the growing potential of CAR T cell therapy in treating autoimmune diseases, particularly since she was the first patient treated for three concurrent conditions simultaneously. “It’s astonishing that I could overcome three autoimmune diseases with just one treatment,” Müller remarked.

In response to viral infections, our bodies produce vast numbers of immune cells with random mutations. Unfortunately, some of these mutant cells become self-targeting and can persist indefinitely. This phenomenon occurred in the patient’s case over a decade ago during pregnancy, leading to her autoimmune hemolytic anemia—a severe condition where antibodies attack oxygen-carrying red blood cells.

Her immune system went on to produce antibodies that targeted platelets (leading to immune thrombocytopenia) and proteins preventing blood clots (causing antiphospholipid syndrome), exposing her to both severe anemia and dangerous clot risks.

Despite trying various immunosuppressive medications with no success, the patient required blood transfusions and anticoagulants to manage her symptoms until she was referred to Professor Müller and his team. In 2022, they became the first to treat an autoimmune disorder with CAR T cell therapy, a technique previously limited to cancer treatment.

For her treatment, researchers engineered CAR T cells to specifically target her abnormal antibody-producing immune cells. Following this intervention, these cells were effectively eliminated, restoring her immune system’s functions without entirely wiping it out.

Interestingly, her immune system recognized the infused CAR T cells as foreign and eliminated them within months, paving the way for the development of new, healthy antibody-producing cells. Consequently, her immune system is now functioning normally, free from the destructive cells responsible for her illness.

The CAR T therapy approach has shown promise for treating disorders like lupus, multiple sclerosis, colitis, and severe asthma. Unlike cancer treatments, which may induce severe side effects due to extensive cell death, the CAR T therapy used for autoimmune diseases is generally associated with far fewer adverse effects, as fewer cells need targeting.

Although some residual effects persisted, researchers believe these stem from previous drug therapies rather than the CAR T treatment itself. “This powerful treatment has minimal side effects and can resolve underlying symptoms, which is truly remarkable,” stated Ruben Benjamin from King’s College London.

Currently, most patients treated for autoimmune disorders with CAR T cell therapy have remained symptom-free, although some cases show a return of targeted cells, necessitating additional treatment options, as noted by Benjamin.

“Long-term follow-up is essential for a comprehensive assessment of these therapies,” he added. Jun Shi from the Chinese Academy of Medical Sciences in Tianjin is leading an ongoing trial on 15 patients with autoimmune hemolytic anemia using CAR T therapy.

While CAR T therapy is notably expensive, ranging from $200,000 to $600,000 due to its tailored nature, Müller emphasizes the long-term savings and benefits, suggesting that effective treatments can lead to individuals returning to work and improved quality of life. “The initial costs are high, but they could save substantial amounts in the long run,” he stated.

Source: www.newscientist.com

Revolutionary Experiment Uncovers Major Unexpected Issues in Cloning Technology

Mice cloning study

Limited Lifespan of Cloned Mice

Xinhua/Zhou Qi/Imago/Alamy

Cloning involves creating genetically identical copies, yet extensive research over the last 20 years reveals unexpected complexities. Clones often accumulate additional mutations, and if the cloning process is repeated, these mutations can reach lethal levels. This discovery presents important implications for cloning in agriculture, conservation, and even medical applications involving humans.

The core issue lies in the numerous mutations within clones. Adult somatic cells may accumulate more mutations than gametes (egg or sperm cells). Researchers such as Teruhiko Wakayama from the University of Yamanashi in Japan suggest that the cloning process may also contribute to these mutations. “While we once believed clones were identical to their originals, the accumulated mutations present significant challenges,” Wakayama states. “Our goal is to confirm that these mutations do not lead to complications.”

Historically, cloning mammals was deemed implausible because cellular differentiation adds various chemical tags that regulate gene activity. The successful birth of Dolly the sheep in July 1996 demonstrated that transferring the nucleus of an adult cell into an empty egg could effectively reprogram the genome, enabling cell growth. Shortly after, in October 1997, Wakayama created the first cloned mouse, Cumulina.

To evaluate the efficacy of his team’s cloning technique, Wakayama initiated cloning experiments in 2005. “Similar to how a reproduced painting loses detail, we aimed to assess the quality of the clones against the original,” he explains.


By 2013, Wakayama’s team had successfully generated over 500 mice from a single donor across 25 cloning generations, claiming, “Each cloned mouse exhibited no physical anomalies and maintained normal lifespan and health.” However, this level of success has not been replicated in other species. Cloned dogs continue to face health complications, and no primate has been cloned using adult cells to date. Initially, Wakayama believed repeated cloning in mice could extend indefinitely, yet by the 58th generation, not one clone survived.

To uncover the reasons behind this decline, the research team sequenced the genomes of ten different mice from various generations. They found an average of over 70 mutations per clonal generation, three times higher than in the naturally bred control group. Notably, after the 27th generation, significant mutations began to accumulate, even leading to the loss of the entire X chromosome.

This issue may stem from evolutionary mechanisms that protect gametes from mutations while allowing adult somatic cells to accrue more mutations. Recent studies suggest mutations accumulate eight times faster in blood cells compared to sperm. Thus, if the original cloned adult cell harbored numerous mutations, so too would the resulting clones.

Wakayama also posits that the nuclear transfer process may induce additional mutations. “It’s plausible that physical shock during nuclear transfer can damage the DNA,” he remarks. “If we can devise gentler nuclear transfer techniques, we might lower the mutation rate in cloned embryos—but we’re still seeking solutions.”

Shoukhrat Mitalipov, a professor at Oregon Health and Science University, remains skeptical. “The mutation rate evident in cloned subjects probably reflects the genomic nature of donor cells rather than being an inherent consequence of nuclear transfer,” he states.

While human cloning is prohibited in many regions, researchers like Mitalipov are exploring nuclear transfer’s potential for generating tissues and organs that are compatible for treatments, as well as for creating sperm and egg cells for infertility therapies. Wakayama’s findings highlight the necessity of thorough donor cell screening to prevent deleterious mutations. “Evaluating donor cell populations for harmful mutations is vital; if needed, gene editing could correct identified issues.”

Nevertheless, if the cloning process itself is responsible for inducing mutations, it presents additional challenges. Nonetheless, these findings do not signal that cloning techniques entail insurmountable risks. The mutation rate per generation remains relatively low, and safety screenings can be conducted post-cloning. However, they underscore the complexities inherent in cloning technology.

Source: www.newscientist.com

Discover the World’s First Antimatter Delivery Service: A Revolutionary Breakthrough

Portable Trap System “BASE-STEP”

Marina Cavazza, Chetna Krishna/CERN

Inside CERN’s antimatter factory, some of the most delicate matter on Earth is held in strong magnetic fields and a vacuum more extreme than interstellar space. Nestled in a compact box roughly the size of a filing cabinet, and weighing a few hundred kilograms less than a Ford Focus, antiprotons have been quietly resting for weeks. Rather than being aggressively probed like most particles produced in this facility, these antiprotons have a singular purpose: awaiting their moment of transport.

Shortly, more than a hundred of these precious antimatter particles will be transported in trucks along a four-kilometer ring road around the CERN campus. This marks the inaugural demonstration of a future antimatter delivery service designed to transport antimatter to laboratories across Europe.

During my visit to CERN’s campus near Geneva, Switzerland, project leader Christian Smorra guided me through the facility, showcasing the final preparations for the “Symmetry Test in Transportable Antiproton Experiments (STEP)”. “This represents a groundbreaking achievement in antimatter science,” he remarked. “While the theoretical framework for transporting antiprotons existed since the facility’s inception, this is the first practical implementation.”

Scientists have known of antimatter, particles that mirror ordinary ones but carry the opposite electric charge, since the late 1920s. Antiprotons, however, annihilate on contact with their far more plentiful counterparts, protons, which makes them difficult to produce and store. It wasn’t until the 1980s that CERN first managed to confine antiprotons, generated by bombarding metal targets with protons.

Today, CERN’s Antimatter Factory is the only location globally capable of producing millions of antiprotons on demand and retaining them for research purposes. Several experiments, including the Baryon Antibaryon Symmetry Experiment (BASE), take place here, with STEP also participating.

Christian Smorra Making Final Adjustments

David Stock

These experiments meticulously test antimatter’s fundamental properties, examining deviations from normal matter. Insights gleaned could provide answers to why our universe predominantly consists of matter, seemingly devoid of antimatter.

To achieve the necessary precision, these experiments must be shielded from noise that could disrupt their measurements. Antiprotons arrive at the facility at close to the speed of light and need powerful magnetic fields to slow them down, and the disturbance this creates can never be blocked out entirely.

In 2018, Smorra’s team recognized the need for a quieter environment for antimatter, resulting in a strategic escape plan. “Observing variations in the magnetic field made it clear we had to continue precision measurements elsewhere,” Smorra stated.

Containing antimatter is a formidable challenge: it normally relies on superconducting magnets that must be held near absolute zero, which consumes large amounts of electrical power. The STEP design instead uses just a 30-liter tank of liquid helium to cool its magnet, allowing its electronics to run on a standard diesel generator. Future test runs aim to switch to battery power.

Additionally, magnets needed to withstand start-stop movements during operation, and a custom vacuum system was essential to ensure the antiprotons remain uncontaminated by normal matter during their loading and unloading processes.

In 2024, Smorra’s team put the STEP device through a trial run, trucking it across the CERN campus loaded with ordinary protons, a key milestone on the way to transporting antimatter.

In the days leading up to my visit, approximately 100 antiprotons were slowed and positioned within a sophisticated network of vacuum and electromagnetic fields.

Since then, they’ve patiently awaited the next steps within a complex arrangement of electrical wires and liquid helium lines. With a small oscilloscope screen, Smorra’s team monitors the antimatter’s vital signs. The natural frequencies at which antiprotons vibrate manifest as double humps, affectionately adorned with googly eyes.

Detection Signals Indicating Antiproton Presence

David Stock

On an early Tuesday morning, a crane carefully hoists the entire 850-kilogram trap onto a specialized truck. The truck’s operator is trained to manage CERN’s sensitive equipment, ensuring smooth acceleration and braking.

The truck will then navigate a four-kilometer loop around the CERN campus before returning to the antimatter factory. Should the experiment prove successful, Smorra’s ultimate goal is to extend this antimatter transport service beyond CERN’s confines, delivering antimatter capsules to various European laboratories. A facility currently under construction at Heinrich Heine University in Düsseldorf, Germany, aims to study antimatter in a near-field-free environment.

However, this ambitious goal entails several years of work. CERN is scheduled to suspend extensive operations in July to upgrade its Large Hadron Collider for higher power outputs, a task slated for completion in late 2028.

Once operational, the antimatter delivery service could mean trucks transporting antimatter alongside ordinary vehicles on highways throughout Switzerland and Germany. Though it sounds alarming, given antimatter’s tendency to annihilate upon contact with regular matter, Smorra assures that the risk remains minimal.

“Transporting antimatter is safe, as the quantities we handle are extremely small,” Smorra explains. “You could easily lose 1,000 antiprotons without any noticeable impact.”

Source: www.newscientist.com

Revolutionary Small Magnet Matches Strength of Large Magnets for the First Time

Even Small Magnets Can Be Extremely Powerful

ResonX/Jasmin Schoenzart

In a groundbreaking development, researchers have designed a magnet small enough to fit in your palm that rivals the strength of the world’s most powerful magnets.

High-performance magnets are crucial in various scientific fields, being utilized in applications ranging from MRI machines and particle accelerators to advanced nuclear fusion research. The strongest magnets available typically use superconductors, which are materials that conduct electricity nearly without loss.

However, most superconducting magnets are sizable: even the smaller ones are roughly the size of R2-D2 from Star Wars, while the largest resemble a two-story building. Dr. Alexander Burns from ETH Zurich, Switzerland, and his team have now engineered a superconducting magnet capable of matching the strength of these larger machines, yet it is only 3.1 millimeters in diameter. They achieved this by coiling a thin tape made of a ceramic known as REBCO, which becomes superconducting at cryogenic temperatures; when current flows through the coils, they generate a magnetic field.

Dr. Burns stated that the team procured REBCO tape from a commercial source, embarking on a rigorous exploration to determine the optimal magnet design, which involved creating and testing over 150 prototypes. “We adopted a ‘fail fast, fail often’ approach in our strategy,” he noted.

Design and Strength Comparison

Eventually, they refined a design using two or four pancake-shaped coils, achieving magnetic field strengths of 38 Tesla and 42 Tesla, respectively. To provide context, conventional refrigerator magnets typically generate fields less than 0.01 Tesla. The most powerful magnets currently in existence generate field strengths of around 45 Tesla, each weighing several tons and consuming up to 30 megawatts of power. In contrast, Burns and his team’s magnet is hand-sized and operates on less than 1 watt.

The ultimate goal for this groundbreaking technology is to enhance nuclear magnetic resonance (NMR), a technique that utilizes magnetic fields to unveil molecular structures, including those of drugs and industrial catalysts. This technology has long been hindered by the large size and cost of traditional magnets, but the research team intends to democratize access to such advanced tools for chemists. Ongoing tests are being conducted to integrate the magnet into NMR setups.

“Historically, achieving magnetic fields exceeding 40 Tesla necessitated massive and costly facilities, making it crucial to utilize superconducting tape to attain similar strengths in a compact device,” stated Dr. Mark Ainslie from King’s College London. “This innovation indicates that ultra-high-field magnets may soon be accessible to a broader range of laboratories.”

Despite these advancements, several challenges remain before widespread adoption. Questions concerning how to maintain uniform magnetic fields and manage the electromagnetic behavior of the coils must be addressed.

Source: www.newscientist.com

Revolutionary One-Dose Treatment Promises to Reverse Frailty

A single dose of stem cells can significantly enhance physical endurance in older adults experiencing frailty, a condition that affects around 1 in 10 people over the age of 65, according to a recent study published in the journal Cell Stem Cell.

The randomized, placebo-controlled trial investigated four escalating doses of laromestrocel, a treatment derived from donated bone marrow, in 148 adults aged 70 to 85 who were diagnosed with frailty.

After nine months, participants receiving the highest dose walked an average of 60 meters further than those given a placebo during a standard six-minute walk test, reflecting a remarkable 20% improvement.

“The results were astonishing,” said Dr. Joshua Hare, chief scientific officer of Longeveron, the company behind the treatment. He emphasized, “We noted a clear correlation based on dosage over time; higher doses led to a more pronounced increase in the six-minute walk test.”

Frailty is a prevalent but often misunderstood medical condition characterized by heightened vulnerability to stressors such as infections, falls, and surgical procedures, significantly beyond what is typically expected from normal aging.

This condition includes decreased muscle strength and endurance, leading to a sharply increased risk of disability, hospitalization, and mortality. According to the British Geriatrics Society, individuals with severe frailty are five times more likely to die within a year compared to those without frailty.

“When you observe 80-year-olds, some require 24-hour care in nursing homes, while others lead vibrant lives, participating in activities like tennis and golf,” Hare noted. “The biological differences play a crucial role.”

Hare suggests that inflammation, often exacerbated by age-related factors, is a significant contributor. As individuals age, their immune systems become dysregulated, with higher levels of inflammatory signaling molecules known as cytokines.

This chronic inflammation can damage blood vessels, deplete stem cell reserves, and accelerate muscle loss, culminating in a condition known as sarcopenia.

The result is frailty; the body becomes less capable of self-repair and responding to physical or medical stressors.

Current treatments primarily focus on nutritional support and physical therapy.

“Typical interventions are rather straightforward,” Hare explained. “We recognized the need to address this at a biological level, as we understand the underlying issues.”

Mesenchymal stem cells, naturally found in bone marrow and other tissues, are of great scientific interest due to their immune-modulating capabilities.

Importantly, these cells possess minimal surface proteins responsible for immune rejection, minimizing the need for immunosuppressive drugs—an important consideration for vulnerable patients.

Participants receiving the highest dose of stem cells achieved a remarkable 20% improvement in a six-minute walk test – Photo credit: Getty

Hare and his team harvested stem cells from donated bone marrow and administered them intravenously to participants, who were either given a placebo or one of four doses of laromestrocel in a double-blind setup.

The results, monitored every three months over nine months, clearly indicated that increased stem cell doses significantly enhanced walking distance. Conversely, the placebo group exhibited expected declines in physical performance typical of frail individuals aged 75 and older.

Patient-reported outcomes from questionnaires assessing physical performance, upper body strength, and mobility confirmed improvements that aligned with objective measurements from the walk tests. Participants also showed progress on a doctor-rated frailty scale ranging from 1 (least frail) to 9 (most frail).

“One-third of treated participants achieved health scores of 2 or 3,” Hare stated, indicating they were no longer deemed frail.

Researchers identified soluble Tie2 as a potential biological marker for therapeutic efficacy, a protein released into the bloodstream upon inflammation or breakdown of blood vessel walls. Patients receiving stem cells showed decreasing levels of this marker in a dose-dependent manner.

“This evidence suggests that medical interventions can potentially reverse frailty,” stated Dr. Andrew Steele, Director of the Longevity Initiative. He highlighted the challenge of achieving physical activity in frail individuals and celebrated the remarkable potential of stem cell infusions to not only slow decline but also to foster tangible improvement.

Nonetheless, the study raises key questions. The wide-ranging effects of stem cells leave uncertainties about the exact mechanisms at play.

“These cells might be targeting areas where they are most needed and regenerating cells,” said Steele. “Alternatively, they could be releasing a mix of anti-aging molecules that rejuvenate the body’s own cells.”

The follow-up period lasted nine months, leaving questions regarding the sustainability of improvements and the effectiveness of repeated doses.

Hare’s team has conducted long-term trials with multiple doses, showing preliminary evidence that participants improve without side effects and maintain benefits, though the evidence lacks robustness compared to controlled trials. Formal studies on repeat dosing are on the horizon.

Furthermore, significant regulatory challenges loom. Currently, frailty is not recognized as a disease by the U.S. Food and Drug Administration or the European Medicines Agency, complicating the approval process.

“It will be a tough fight,” Hare cautioned, adding that the approval pathway for laromestrocel could be expedited for Alzheimer’s disease, with promising results in related clinical trials evidenced in another study.

“We believe treatments for age-related frailty will likely be approved alongside those for Alzheimer’s, given that the latter is a well-defined condition with pressing unmet needs.”

To date, the trials indicate promise, presenting strong evidence that frailty is not an inevitable consequence of aging but a biological process that can be at least partially reversed.

“Human lifespan has nearly doubled in the last 120 years,” Hare remarked. “However, healthy life expectancy hasn’t progressed at the same pace. There will always be an end-of-life phase marked by disability and frailty.”

If progress continues, the gap between lifespan and healthy living could finally begin to close.

Source: www.sciencefocus.com

Revolutionary Startup Develops First Data Center Powered by Human Brain Cells

Close-up of an artificial brain showcasing neural activity and orange light dots, illustrating the concept of artificial intelligence. 3D rendering of synapses and neurons made up of cubic particles.

A small number of companies are developing biological computers

Floriana/Getty Images

Data centers consume vast amounts of energy while the demand for computer chips continues to soar. Could utilizing brain cells be the solution?

Australian startup Cortical Labs is pioneering this field, planning to establish two innovative “biological” data centers in Melbourne and Singapore. These facilities will feature chips integrated with lab-grown neurons.

Pong vs. Doom

Cortical Labs stands out as one of the few firms creating biological computers that link nerve cells to microelectrode arrays, enabling the stimulation and measurement of cell responses during data input. Recently, the company successfully showcased that its primary model, the CL1, can learn to play games like Doom within just a week.

The first data center in Melbourne is set to accommodate around 120 CL1 units, while a second facility in collaboration with the National University of Singapore will initially support 20 CL1 systems, with plans to expand to 1,000 pending regulatory approval. This initiative aims to enhance cloud-based brain computing services.

According to Michael Barros from the University of Essex, UK, while biological computers have been constructed and tested globally, they remain challenging to build and use. He states, “We invest a lot of time and resources developing these systems.”

Barros further elaborates that Cortical Labs is democratizing access to biocomputers at scale, pioneering an accessible approach in the industry.

These systems can be trained for simple tasks, such as playing Doom, yet there are challenges in understanding how neurons function and training them for more complex tasks like machine learning. Reinhold Scherer, also from the University of Essex, notes, “When you access this technology, it opens doors to exploration in learning, training, and programming, but neurons cannot be programmed like standard computers.”

Cortical Labs asserts that its biological data centers use significantly less energy than traditional computing systems, with each CL1 requiring only 30 watts compared to thousands needed by leading conventional AI chips.

Paul Roach from Loughborough University, UK, emphasizes that scaling biocomputers into entire rooms, akin to traditional data servers, could yield substantial energy savings. Notably, while biological data centers may necessitate nutrients to sustain neuron chips, they require less cooling energy than conventional computing infrastructures, suggesting significant potential for energy conservation.

Nevertheless, experts like Tjeerd Olde Scheper of Oxford Brookes University recognize that the technology remains nascent. “Will it perform as expected? We are still in the early developmental phase,” he comments.

Although direct comparisons between the sizes of biological and silicon AI systems remain complex, it’s notable that the envisioned biological data center would integrate hundreds of biological chips in contrast to the hundreds of thousands of GPUs typically found in large-scale AI data centers.

“We have a long way to go before these systems are production-ready. Transitioning from a small network playing games to a large language model is a substantial leap,” says Steve Furber from the University of Manchester, UK.

A pressing concern is the lack of clarity on how to store training outcomes within neurons as memory, or how to execute computational algorithms beyond specific tasks, such as video gaming.

Additionally, retraining neurons post-task completion poses challenges, as their training and learning may be lost upon the end of their lifespan. “Proper retraining is essential,” Scherer states. “If retraining is required every 30 days, it may hinder technological continuity.”

Source: www.newscientist.com

Revolutionary Chemical Indicators: Detecting Alien Life Unlike Any Found on Earth

Enceladus, a moon of Saturn

Saturn’s moon Enceladus: A Prime Candidate in the Search for Extraterrestrial Life

Credit: NASA/JPL/Space Science Institute

A revolutionary method for detecting chemical properties of living organisms could unlock the secrets to identifying extraterrestrial life forms, even those with biochemical processes distinct from life on Earth.

In the quest for extraterrestrial life, scientists traditionally depend on biosignatures—substances or patterns that reliably signify the presence of life. By analyzing the atmospheres of distant planets, astronomers search for molecular biosignatures. However, many molecules associated with life can also arise from geological activities, suggesting a careful approach to interpretation.

A novel test developed by Christopher Carr and colleagues at Georgia Tech focuses on amino acids, the fundamental building blocks of the proteins that sustain all known life. Amino acids can also be produced in lifeless environments, however: they have been found in lunar soil, comets, and meteorites.

Given this, Carr and his team proposed that analyzing the reactivity of molecules within samples could provide more reliable biological indicators than merely detecting amino acids.

In non-living systems, molecules are continuously formed and destroyed as they react with environmental factors like cosmic rays. The more reactive a molecule, the more likely it is to decompose. “Without stable systems to maintain molecules, their reactivity increases,” explains Carr. Living systems, however, depend on reactive molecules to function, so they retain more of them, creating a distinct biochemical signature.

The reactivity of a compound hinges on the arrangement of electrons in the molecule. More reactive molecules have a smaller energy gap between their highest occupied electron orbital and the next available empty one, so less energy is needed for them to take part in reactions.

Carr and his team calculated energy differences for 64 amino acids, including those not present in Earth’s biosphere. They analyzed the prevalence of these amino acids in samples sourced from both abiotic processes (like meteorites and lunar soil) and biotic sources (like fungi and bacteria), employing molecular energy calculations to establish a statistical framework for amino acid reactivity. This allowed them to estimate the probability of a sample being alive or inorganic.
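
The statistical idea can be pictured with a toy calculation. The Python sketch below compares a sample’s amino-acid abundances against hypothetical “biotic” and “abiotic” reference distributions and returns a probability of biological origin; the amino acids, abundance fractions, and 50/50 prior are invented placeholders for illustration, not the Georgia Tech team’s actual model or data.

```python
# Toy illustration only: classify a sample as biotic or abiotic by comparing
# its amino-acid abundances with two hypothetical reference distributions.
# The reference mixes stand in for the reactivity-based expectations described
# in the article; every number below is a placeholder.
import math

biotic_ref = {"glycine": 0.30, "alanine": 0.30, "isovaline": 0.05, "serine": 0.35}
abiotic_ref = {"glycine": 0.45, "alanine": 0.25, "isovaline": 0.25, "serine": 0.05}

def log_likelihood(counts, reference):
    """Multinomial log-likelihood of observed amino-acid counts under a reference mix."""
    return sum(n * math.log(reference[aa]) for aa, n in counts.items())

def probability_biotic(counts):
    """Posterior probability that the sample is biotic, assuming a 50/50 prior."""
    delta = log_likelihood(counts, abiotic_ref) - log_likelihood(counts, biotic_ref)
    return 1.0 / (1.0 + math.exp(delta))

# Example: a sample whose mix resembles the biotic reference scores close to 1
sample = {"glycine": 28, "alanine": 31, "isovaline": 4, "serine": 37}
print(f"P(biotic) = {probability_biotic(sample):.2f}")
```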

After testing over 200 living and nonliving samples, they found their method could accurately identify life with 95 percent certainty. “This approach is remarkably straightforward,” Carr asserts. “It’s easily explainable and directly linked to the principles of physics.”

This reactivity-based method is applicable to the search for extraterrestrial life, as Carr posits that if life exists elsewhere, it likely relies on carbon-based chemistry and amino acids, governed by the same principles of chemical reactivity present on Earth. “Life inherently requires control over the timing, methods, and locations of molecular interactions. Therefore, structures that facilitate electron flow and molecular interactions are essential,” Carr notes.

While utilizing molecular reactivity to identify life isn’t new, measuring reactivity through statistical distributions is an innovative advancement. Henderson Cleaves from Howard University suggests that this method could enhance the toolkit of life-detection instruments on forthcoming space missions to Mars or the moons of Saturn, most notably Enceladus. However, Cleaves notes that the technology to accurately measure molecular abundance is a significant challenge.

Source: www.newscientist.com

Exploring the Business of Quantum Entanglement: Inside a Revolutionary Company

Qunnect’s Carina Rack for Quantum Entanglement

Knecht

Mehdi Namazi aims to revolutionize communication through quantum entanglement.

Along with his team at Qunnect, he has dedicated nearly a decade to developing a device that enables the sharing of quantum-entangled light particles (photons), making secure communication a reality.

Located at Qunnect’s headquarters in Brooklyn, New York, a state-of-the-art table is filled with lasers, lenses, special crystals, and other components essential for manipulating light. All of this technology will be elegantly packaged in striking magenta boxes and dispatched to those advancing future communication technology.

Against the backdrop of the iconic New York skyline, Namazi unveils an electronic device that may seem unremarkable at first. However, when stacked, these boxes form what the company refers to as the Carina rack, capable of performing extraordinary quantum functions.

In February, the Qunnect team used these racks for “entanglement swapping” over a 17.6-kilometre fiber-optic connection between Brooklyn and Manhattan through commercial data centers.

Entanglement exchange involves transferring entangled properties from one photon pair to another. Once photons are entangled, they demonstrate extreme sensitivity to tampering, making it exceedingly difficult to steal information without detection. This swapping technique extends the essence of unhackable communication to long-distance quantum internet applications.

Qunnect successfully exchanged quantum entanglements among 5,400 photon pairs every hour while the network operated autonomously for several days. Previously established experiments recorded significantly lower rates of entanglement exchange.

Before the Carina Rack can perform its magic, entangled photons must be generated using another device. At the heart of this “entanglement source” lies a glass and metal box containing a vapor of rubidium atoms, illuminated by laser light to produce photon pairs. Namazi recounts how precise adjustments to the laser beam’s angle increased the number of entangled photons produced.

Once generated, the Carina Rack transmits these photons through a fiber network to laboratories across New York City, including prestigious institutions like New York University and Columbia University.

Namazi illustrates how one might set up a personal entanglement sharing system to send super-secure messages. “With two Carina racks, we can distribute entanglements within hours,” he states.

Qunnect maintains one such rack in a Manhattan-based commercial data center managed by QTD Systems. When asked, QTD’s Peter Feldman echoed Namazi’s assurance: “You don’t need to know anything about quantum physics.” The systems that sustain photon entanglement in Qunnect’s network can be operated remotely, allowing autonomous function for weeks.

Qunnect’s Advanced Quantum Network

Knecht

The quest for an unhackable quantum internet is not confined to New York City. Numerous metropolitan quantum networks are emerging globally, including those in Hefei, China, and Chicago, Illinois. However, challenges remain, particularly in addressing the loss of photons over extensive distances.

Namazi emphasizes that quantum entanglement could have immediate applications. By integrating entangled photons into classical light streams, malicious interception attempts can be detected, serving as a quantum tripwire.

Another practical use is authenticating the identity of individuals exchanging sensitive information based on their location. Collaborating with Alexander Gaeta at Columbia University, Qunnect is actively exploring these capabilities. In a single New York borough, numerous financial institutions could significantly benefit from such advancements, as indicated by Javad Shabani at New York University. “Once the infrastructure is established, the demand will follow, probably from just across the street.”

While the quantum internet is still in its infancy, I was impressed by the extent of operational technology during my drive from Qunnect’s headquarters to QTD’s data center. As I crossed one of New York’s bridges, I pondered the multitude of entangled photons traversing the city—a bustling metropolis with endless potential.

Source: www.newscientist.com

Scientists Explore Giant Fire Tornadoes as a Revolutionary Method for Ocean Cleansing

An oil spill at sea is among the worst human-made disasters. Surprisingly, setting off a fire whirl, a towering spinning column of flame, may prove an effective way to deal with the aftermath, a recent study suggests.

When responding to significant oil spills, emergency teams often set the slick alight on the ocean surface, burning the oil “in situ” to curb its further spread.

While this approach helps protect marine ecosystems, it simultaneously releases substantial amounts of smoke and toxic soot into the atmosphere.

The inspiration for this method traces back to an unusual incident in Kentucky in 2003, when a spill of around 800,000 gallons of bourbon caught fire, creating a 30-meter (100-foot) fire whirl over a lake. Professor Elaine Oran and her team began exploring whether the process could be harnessed deliberately.

“We were joking about what it would smell like,” she shared with BBC Science Focus. “Then we examined the event closely. The larger fire vortex was effectively consuming smaller fire vortices, drawing them in and absorbing them.”

The team constructed a 4.8-meter (16-foot) triple-walled triangular structure at a fire training facility in Texas, featuring a pool of crude oil at its center. When ignited, this setup created a roaring fire vortex approximately 5.2 meters (17 feet) high.

Initial large-scale experiments demonstrate that fire vortices burn spilled oil faster and cleaner than traditional fire pools, showcasing innovative potential for ocean cleanup. – Photo credit: Texas A&M University College of Engineering

Compared to conventional fire pools, the oil burns 40% faster, soot emissions are reduced by 40%, and up to 95% of the fuel is consumed.

The secret to this efficiency lies in the fire’s spin. Instead of spreading outward, the vortex pulls in oxygen from all angles, allowing for hotter and more complete combustion, akin to a giant incinerator rather than a simple bonfire.

However, harnessing the fire whirl’s power is no easy task. The vortex is unpredictable; too much wind can cause it to collapse, while insufficient airflow control may revert it to a conventional pool fire.

Nonetheless, achieving a “Goldilocks Zone” on-site is “very realistic,” according to Oran, who envisions deploying a movable barrier structure directly above oil spills at sea.

“This research is more than just an experiment; it offers a glimpse into a future where fire is not merely a destructive force, but a tool to safeguard our oceans and our planet,” she stated.

The findings were published in the journal Fuel.

Source: www.sciencefocus.com

Revolutionary Brain Cells on a Chip Master Doom in Just One Week

Human Neurons Playing Doom on a Chip

Cortical Labs

A cluster of human brain cells has been demonstrated to play the classic game Doom. While the performance doesn’t yet match human ability, experts believe this breakthrough gets us closer to practical applications for biological computers, such as controlling robotic arms.

In 2021, researchers at Cortical Labs demonstrated a computer chip integrating living neurons that learned to play the video game Pong. The chip, comprising over 800,000 living brain cells on a microelectrode array, was capable of both sending and receiving electrical signals, and the researchers meticulously trained it to manipulate the paddles on the screen’s edges.

Recently, Cortical Labs introduced an easier interface for programming these chips using the widely used programming language Python. Independent developer Sean Cole utilized this interface to teach a chip to play Doom in about a week.

“Unlike the Pong project, which involved years of rigorous scientific labor, this new demonstration was achieved in mere days by individuals with limited prior experience in biology,” said Brett Kagan from Cortical Labs. “This accessibility and flexibility is incredibly exciting.”

The neuron-based chips used approximately a quarter as many neurons as the earlier Pong chips. While the Doom demonstration performed better than random inputs, its performance still lagged behind that of top human gamers. However, the chip can learn significantly faster than conventional silicon-based machine learning systems, and new learning algorithms are expected to enhance its performance, according to Kagan.
Comparing these biological chips to the human brain can be misleading, he suggests. “While it is indeed living tissue, the mechanisms it employs for information processing are dissimilar to those of silicon,” he explains.

Doom poses a substantial challenge compared with earlier example games, and being able to engage with it marks a significant advance in controlling and training living neural systems, says Andrew Adamatzky from the University of the West of England, Bristol, UK.

Researchers such as Steve Furber from the University of Manchester agree, noting that the ability to play Doom represents significant progress. He also points out that many unanswered questions remain about how the neurons grasp what the game expects of them and how they interface with a screen without visual organs.

Regardless, the leap in capabilities is promising. Yoshikatsu Hayashi from the University of Reading is working towards practical applications such as using biological computers to control robotic arms. His team is experimenting with a similar computer made of a jelly-like hydrogel. “[Playing Doom] serves as a simpler analogy for controlling an entire arm,” Hayashi says.

“The significance here goes beyond just biological systems playing Doom,” adds Adamatzky. “It demonstrates the potential to navigate complexity, uncertainty, and real-time decision-making, skills essential for future biological or hybrid computing solutions.”

Source: www.newscientist.com

New Scientist Endorses Liminals: Explore Revolutionary Quantum Soundscapes

Pierre Huyghe's Artwork

Artist Pierre Huyghe

Photo by Ola Lindal

A century ago, the advent of quantum mechanics left physicists gazing into the unknown. Long-held beliefs about reality were called into question. Today, we delve into the enigmatic realm of quantum probability clouds and their peculiar behaviors, even at a distance.

Liminal is a profound installation by artist Pierre Huyghe (featured above) that captures many poignant concepts. Set in Halle am Berghain—formerly an East Berlin power station and now a renowned techno club—this exhibition features immersive video projections and soundscapes that resonate deeply within the gritty remnants of the concrete structure.

Huyghe’s art emerges from the collapse of atoms transitioning between quantum states, creating soundscapes that reflect the universe’s fundamental language. Some interpretations suggest that reality is not constructed from quantum fields; instead, the quantum state only represents our knowledge, implying that the external world may not truly exist. Huyghe’s depiction of faceless figures intertwined with the landscape powerfully encapsulates this concept, transcending simplistic explanations.

Thomas Luton
Features Editor, London

Source: www.newscientist.com

Revolutionary Study Reveals How Bird Watching Can Help Slow Aging

Research from Toronto’s Baycrest Hospital indicates that birdwatching significantly enhances cognitive abilities and overall brain function.

According to their latest findings, skills such as keen observation, prolonged attention, and robust memory are linked to extensive use of binoculars. Notably, these abilities can fundamentally reorganize brain structure, leading to enhanced cognition.

Published in the Journal of Neuroscience, the study involved a comparison of brain structures in 29 expert birdwatchers and 29 novices, with balanced gender and age distribution.

Brain scans demonstrated that expert birdwatchers possess more compact areas related to attention and perception, which enhances their bird identification skills.

Interestingly, the mobility of water molecules in these brain regions is enhanced, improving the birdwatchers’ ability to discern unfamiliar or local bird species.

While various learning experiences, such as picking up a new instrument or language, are beneficial for brain health, this study emphasizes that birdwatching’s complexity offers unique cognitive advantages.

“What’s notable about this research is that birdwatching engages ongoing perception, attention, and memory, preventing a state of cognitive autopilot,” explained Professor Martin Sliwinski to BBC Science Focus. Sliwinski, who was not part of the study, serves as director at Penn State’s Center on Healthy Aging.

“To have cognitive benefits, a stimulating activity must remain challenging, which holds true for birdwatching,” he added.

“Even experienced birders cannot depend on automatic responses due to the ever-changing environment and cues, often experienced under conditions of uncertainty and time constraints.”

Moreover, researchers suggest that these enhanced skills and accompanying brain changes could bolster cognition in older adults, as older birdwatchers in the study demonstrated superior facial recognition and recall abilities compared to novices.

However, Sliwinski noted that other influences may also play a role, stating, “Individuals with higher cognitive capabilities and an interest in birds may be more predisposed to take up birdwatching and progress to experts.”

In essence, it’s possible that rather than birdwatching directly sharpening cognitive function, those with existing cognitive strengths are naturally inclined to pursue this engaging hobby.

Source: www.sciencefocus.com

Revolutionary Metal Extraction Techniques from Fungi on the International Space Station

As human space exploration delves deeper into the cosmos, the urgency for sustainable methods to harvest local resources grows, rendering frequent resupply missions increasingly impractical. Asteroids, particularly those abundant in valuable metals like platinum group elements, have become key targets. Recently, scientists conducted a groundbreaking experiment aboard the International Space Station (ISS), utilizing bacteria and fungi to extract 44 elements from asteroid materials in microgravity.

NASA astronaut Michael Scott Hopkins conducts microgravity experiments on the International Space Station. Image credit: NASA.

In this innovative project, known as BioAsteroid, Professor Charles Cockell and his team at the University of Edinburgh utilized the bacterial species Sphingomonas desiccabilis and the fungus Penicillium simplicissimum to explore which elements could be extracted from L-chondrite asteroid materials.

Understanding microbial interactions with rocks in microgravity is equally essential.

“This is likely the first experiment of its nature using a meteorite on the International Space Station,” states Dr. Rosa Santomartino, a researcher at Cornell University and the University of Edinburgh.

“Our aim was to customize our methodology while ensuring it remained broadly applicable for enhanced efficacy.”

“These two species behave uniquely and extract varied elements.”

“Given the limited knowledge on microbial behavior in space, we aimed to keep our results universally applicable.”

These microorganisms present promising solutions for resource extraction, as they generate carboxylic acids—carbon molecules that bind to minerals and promote their release through complex formation.

Nonetheless, many questions linger regarding this mechanism, leading researchers to conduct a metabolomic analysis. This analysis involved examining liquid cultures from completed experimental samples, focusing on the presence of biomolecules, particularly secondary metabolites.

NASA astronaut Michael Scott Hopkins conducted experiments aboard the ISS to examine microgravity’s effects, while researchers performed controlled experiments on Earth for comparative data.

Substantial data analysis yielded insights into 44 different elements, 18 of which were biologically derived.

Scanning electron microscopy (SEM) images of L-chondrite fragments under two gravity conditions. Image credit: Santomartino et al., doi: 10.1038/s41526-026-00567-3.

“We drilled down to a single-element analysis and began to question whether extraction processes differ in space versus Earth,” notes Dr. Alessandro Stilpe from Cornell University and the University of Edinburgh.

“Do more elements get extracted in the presence of bacteria, fungi, or both?”

“Is this merely noise? Or do we observe coherent patterns? Differential outcomes were modest but intriguing.”

The analysis highlighted significant metabolic changes in microorganisms, particularly fungi, in space, leading to increased production of carboxylic acids and promoting the release of elements like palladium and platinum.

For several elements, abiotic leaching proved less effective in microgravity compared to Earth, while microorganisms demonstrated consistent extraction results across both environments.

“Microorganisms do not enhance extraction rates directly but maintain extraction levels regardless of gravity,” explains Dr. Santomartino.

“This finding is applicable to not just palladium but many metals, though not all.”

“Interestingly, extraction rates varied significantly by metal type, influenced by microbial and gravitational conditions.”

For detailed insights, refer to the results published in npj microgravity.

_____

R. Santomartino et al. Microbial biomining from asteroid material on the International Space Station. npj Microgravity, published online January 30, 2026; doi: 10.1038/s41526-026-00567-3

Source: www.sci.news

Revolutionary Packaging Alerts Consumers to Spoiled Meat – Sciworthy

Detecting decay in meat is often challenging. Fresh-looking meat inside a sealed package can conceal harmful microorganisms. Annually, food poisoning impacts millions globally, with 200 diseases linked to unsafe food consumption.

Consumers unknowingly ingest spoiled meat containing biogenic amines (BAs). Food inspectors traditionally detect these compounds through direct sampling and extensive lab analysis. However, once meat is packaged for retail, such testing becomes time-consuming and impractical, making spoilage hard to identify.

Researchers from the China Institute of Food Science and Technology have devised a novel approach for visually detecting spoilage inside sealed food packages. They utilized tiny carbon-based particles known as carbon dots, just thousandths of the width of a human hair across. These nanoscale dots possess a unique ability to absorb ultraviolet light and emit visible fluorescence, with color variations contingent on their chemical environment. Although most carbon dots emit blue-green light, the researchers set out to shift this fluorescence to a noticeable red hue for easier identification.

The team synthesized these carbon dots using ethanol, which dissolves citric acid and a nitrogen-rich compound, o-phenylenediamine (OPD), known for enhancing red fluorescence. By heating this mixture at 220 °C (428 °F) for six hours and subsequently purifying it via centrifuge and filtration, researchers incorporated various elements to fine-tune the fluorescence properties of the carbon dots, developing OPD variants containing fluorine, chlorine, bromine, and iodine.

For sensitivity testing, researchers added up to 50 milligrams per liter (mg/L) of BAs to each carbon dot solution. They noted distinct fluorescence color changes after mixing for five minutes, with the chlorinated variant displaying the most pronounced transformation from orange-red to yellow. This reaction is attributed to BAs interacting with chlorinated carbon dots, altering their surface properties and resulting in color changes. Consequently, chlorinated carbon dots were identified as optimal indicators for visual BA detection. The biosensor was created by soaking filter paper in a 5 mg/mL chlorinated carbon dot solution for 30 minutes, followed by a 15-minute drying process at 37 °C (99 °F).

To evaluate real-world effectiveness, the researchers placed pork, beef, and mutton in separate plastic trays, attaching the biosensor underneath the lid. They sealed the trays and stored them at 25 °C (77 °F) under ultraviolet light. As a control, a similar tray was prepared containing only a moist sponge and the biosensor, without meat. Results indicated that the biosensors in pork and lamb trays turned bright yellow after 24 hours, while beef biosensors showed a color change after 36 hours. The control biosensor exhibited no noticeable changes.

Additionally, the team developed a smartphone app for color analysis, allowing for image processing and reporting of color values. This app computes numerical ratios between red, green, and blue color components, facilitating objective assessments of color changes linked to spoilage. They further compared these values with the globally acknowledged meat spoilage index, Total volatile basic nitrogen (TVB-N), a commonly used indicator for meat freshness. The researchers found a strong linear correlation between TVB-N values and their data, confirming that biosensor color changes reliably indicated spoilage.
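
As a rough illustration of how such an app might work, the Python sketch below averages the red, green, and blue channels over a photo of the sensor patch and maps a red/green ratio to a TVB-N estimate with a linear fit; the file name, crop box, and calibration coefficients are hypothetical placeholders, not the values or code used in the study.

```python
# Minimal sketch of the color-ratio idea, not the authors' actual app.
from PIL import Image  # pip install pillow

def channel_means(path, box=None):
    """Return the mean (R, G, B) values over the image, or over an optional crop box."""
    img = Image.open(path).convert("RGB")
    if box:
        img = img.crop(box)  # box = (left, upper, right, lower) around the sensor patch
    pixels = list(img.getdata())
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def estimate_tvbn(path, slope=120.0, intercept=-35.0):
    """Map the red/green ratio to a TVB-N estimate via a made-up linear calibration."""
    r, g, _ = channel_means(path)
    return slope * (r / g) + intercept

# Example usage (file name is illustrative):
# print(f"Estimated TVB-N: {estimate_tvbn('sensor_photo.jpg'):.1f} mg N per 100 g")
```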

In conclusion, the research team successfully created an efficient process to produce color-changing carbon dots functioning as visual spoilage sensors. Integrating these into food packaging enables real-time freshness assessment of meat, simply using ultraviolet light and a smartphone. This innovative technology holds potential to enhance food safety, better supply chain management, and reduce food waste.


Source: sciworthy.com

Discovering the Universe’s Most Powerful Microwave Laser: A Revolutionary Breakthrough

Galaxy H1429-0028 Gravitational Lensing

This image combines views from the Hubble and Keck II telescopes. The diagonal galaxy in the foreground serves as a gravitational lens, causing a distorted image of the background galaxy H1429-0028.

Credit: NASA/ESA/ESO/WM Keck Observatory

Astronomers have identified an unprecedented microwave beam, akin to a laser, emitted from two colliding galaxies. This discovery, the brightest and most distant recorded, marks a significant milestone in our understanding of cosmic phenomena.

The generation of laser light involves stimulating atoms into a high-energy state. When photons interact with these excited atoms, they induce the release of additional photons, leading to a chain reaction. The result is a coherent light beam with uniform frequency.

Similarly, during galactic collisions, compressed gas triggers star formation and enhanced luminosity. As light travels through dust clouds, it can excite hydroxyl molecules—made of one hydrogen and one oxygen atom—into a high-energy state. When these molecules are stimulated by radio waves, potentially from a supermassive black hole, they can release concentrated beams of microwave radiation known as masers.

Recently, Roger Dean and researchers from the University of Pretoria discovered the brightest and most distant maser in galaxy H1429-0028, approximately 8 billion light-years from Earth. Gravitational lensing, caused by a massive galaxy, distorts the light from H1429-0028, acting like a cosmic magnifying glass.

Using the MeerKAT telescope—a network of 64 radio telescopes working collaboratively—Dean and his team searched for galaxies abundant in hydrogen molecules emitting distinctive frequencies. When they focused on H1429-0028, they detected an unusually strong radiation signal, indicating the presence of powerful masers.

“Upon checking the frequency of 1667 megahertz, we immediately recognized a significant signal. What was once a mere observation transformed into a record-breaking discovery,” Dean recalls.

These extraordinary light emissions could be classified as gigamasers, far exceeding the brightness of typical megamasers found closer to the Milky Way, with an intensity approximately 100,000 times that of an ordinary star, tightly concentrated in a minuscule region of space.

Future enhancements, including the development of the South African Square Kilometer Array, will be capable of detecting even more distant masers, poised to revolutionize our understanding of cosmic history. As Matt Jarvis from Oxford University notes, these masers may offer insights into the merger processes of some of the universe’s earliest galaxies.

“To acquire accurate data about these ancient galactic mergers, we require continuous radio and infrared emissions, primarily sourced from heated dust enveloping forming stars,” Jarvis explains. “The intricate physical conditions needed to produce masers originate from these galactic collisions.”


Source: www.newscientist.com

Revolutionary Findings: Reverse Heating Challenges Thermodynamics and Calls for Quantum Updates

Heat flow in quantum systems

Heat normally flows from hot to cold.

Kuryakusun/Shutterstock

Have you ever noticed how a forgotten cup of coffee cools down as it releases heat to the surrounding air? In the fascinating world of quantum mechanics, this process can actually be reversed. This surprising finding suggests that the second law of thermodynamics—which posits that heat flows from hot to cold—might require reevaluation.

Dawei Lu and his colleagues at the Southern University of Science and Technology in China challenged conventional physics by exploring this thermodynamic phenomenon using crotonic acid molecules, which are made of carbon, hydrogen, and oxygen. The team used the nuclei of four carbon atoms as qubits, the fundamental units of quantum computers that store quantum information. By controlling the qubits’ states with electromagnetic pulses, the researchers directed heat from cooler qubits to hotter ones.

Such a reversal would be impossible in everyday life: pushing heat back into a cooling cup of coffee requires extra energy, as in a refrigerator. In the quantum realm, however, fuel is available in the form of quantum information—specifically “coherence”. As Lu explains, “By injecting and manipulating this quantum information, we can reverse the normal direction of heat flow. Exciting times indeed.”

Interestingly, the breakdown of thermodynamic laws in quantum mechanics isn’t entirely unexpected. The second law was formulated in the 19th century, long before quantum physics took its place in scientific discourse. To address this inconsistency, Lu and his colleagues derived an “apparent temperature” for each qubit, a reinterpretation of classical temperature that accommodates quantum properties like coherence. This leads to the reaffirmation that thermal energy indeed flows from a higher apparent temperature to a lower one, aligning with established thermodynamic principles.
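
To give a feel for what assigning a temperature to a single qubit means, here is a toy sketch, not the team’s method: for a qubit with no coherence, an effective temperature can be read off its level populations via the Boltzmann relation. The paper’s “apparent temperature” additionally folds in coherence, which this simple estimate ignores, and the energy gap used below is an invented placeholder.

```python
# Toy illustration: the population-based effective temperature of a qubit.
# The study's "apparent temperature" also accounts for quantum coherence,
# which this simple Boltzmann-style estimate leaves out.
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def effective_temperature(p_ground: float, p_excited: float, gap_joules: float) -> float:
    """Temperature at which a thermal qubit would show these level populations."""
    return gap_joules / (K_B * np.log(p_ground / p_excited))

gap = 1e-24  # hypothetical energy splitting in joules
print(effective_temperature(0.7, 0.3, gap))  # "hotter" qubit
print(effective_temperature(0.8, 0.2, gap))  # "colder" qubit (lower effective temperature)
```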

In a similar vein, Roberto Serra from the Federal University of ABC in Brazil emphasizes that quantum properties such as coherence act as a thermodynamic resource—akin to how heat powers a steam engine. By manipulating these quantum resources, researchers can intentionally breach the classical laws of thermodynamics. “Traditional thermodynamic laws were conceived without considering our access to such microscopic states, revealing a need for new theoretical frameworks,” Serra points out.

The team aspires to adapt their thermal inversion experiments into practical techniques for regulating heat between qubits. Lu envisions that mastering the relationship between quantum information and thermal management could significantly enhance quantum computing capabilities. This advancement holds pivotal implications for the expanding field of quantum technologies, especially since conventional computers face severe limitations due to overheating issues.

Topics:

  • Quantum Computing/
  • Quantum Physics

Source: www.newscientist.com

Scientists Propose Installing a Super Laser on the Moon for Revolutionary Research

Illustration of dark craters near the moon’s south pole

Explore the Dark Craters near the Moon’s South Pole

Credit: Science Photo Library / Alamy

Scientists aim to establish a groundbreaking laser system in one of the moon’s coldest craters to significantly enhance the navigation capabilities of lunar landers and rovers.

Ultra-stable lasers are vital for highly precise timing and navigation systems. They work by reflecting a beam between two mirrors within a cavity, which locks the light to an extremely steady frequency. That precision depends on the cavity keeping a fixed size, neither expanding nor contracting, so the mirrors are typically held in a cryogenic vacuum, insulated from external vibrations.
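
The dependence on cavity length can be stated precisely; this is standard optics rather than anything specific to the lunar proposal. The resonance frequency of a two-mirror cavity is set by its length, so any fractional change in length shows up directly as a fractional change in the laser frequency:

```latex
\nu_m = \frac{m c}{2 L}
\qquad\Longrightarrow\qquad
\frac{\delta\nu}{\nu} = -\frac{\delta L}{L}
```

Here m is the mode number, c the speed of light, and L the mirror spacing. Keeping the mirrors cold, quiet, and in vacuum is what keeps δL, and hence the frequency drift, small.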

The moon hosts numerous craters at its poles, which lack direct sunlight due to minimal axial tilt. Consequently, these permanently shadowed areas are extremely cold, with some craters projected to reach temperatures around -253°C (20 Kelvin) during the lunar winter.

Jun Ye and his research team at JILA in Boulder, Colorado, have proposed that these icy conditions, combined with the moon’s lack of natural vibrations and an almost non-existent atmosphere, make these craters ideal sites for ultra-stable lasers. The potential stability of such lunar lasers could surpass that of any terrestrial counterpart.

“The entire environment is incredibly stable,” Ye emphasizes. “Despite variations between summer and winter on the Moon, temperature fluctuations range only from 20 to 50 Kelvin, contributing to a remarkably consistent environment.”

Ye and his research team envision a lunar laser device akin to an optical cavity already developed in JILA’s lab, featuring a silicon chamber equipped with dual mirrors.

Current optical cavity lasers on Earth can maintain coherence for just a few seconds, meaning their light waves can synchronize briefly. However, the moon-based laser is projected to sustain coherence for at least a minute, which will facilitate its role as a reference laser for a variety of lunar missions. This includes maintaining the lunar time zone and coordinating satellite formations using lasers for distance measurement. Given that light from the moon takes just over a second to reach Earth, it could also serve as a reliable reference for Earth-based activities, as highlighted by Ye.

Although implementing this idea poses challenges, the rationale is sound and could greatly benefit future lunar missions. According to Simeon Barber from the Open University, UK, “Recent lunar landers have experienced suboptimal landings due to varying lighting conditions, complicating vision-based systems. Leveraging stable lasers for positioning, navigation, and timing could enhance the reliability of landings in high-latitude areas.”


Source: www.newscientist.com

Revolutionary AI: The Ultimate Solution for Managing Your Phone Calls, Bills, and Life Tasks


The Evolution of Generative AI: Meet OpenClaw

Since the launch of ChatGPT, Generative AI has transformed our digital landscape over the past three years. It has spurred a significant stock market boom, integrated into our search engines, and become an essential tool for hundreds of millions of users daily.

Despite its benefits, many still hesitate to use AI tools. But why? While asking AI for text, audio, images, and videos can save time, crafting the right prompts often becomes a burdensome task. Users still grapple with everyday chores like answering emails, booking appointments, and paying bills.

This is where AI’s true power lies: handling the mundane tasks. The promise of “agentic” AI is that people want an efficient, always-on assistant to tackle time-consuming chores. The latest advancement in this field is OpenClaw.

What is OpenClaw?

OpenClaw, previously known as ClawdBot, is an AI agent poised to fulfill AI’s grand promises. Once granted access to your computer files, social media, and email accounts, it can efficiently complete various tasks. This capability is powered by Claude Code, a model released by the AI company Anthropic.

Developed by software engineer Peter Steinberger and launched in late November 2025, ClawdBot initially gained traction but was rebranded due to concerns from Anthropic. After temporarily adopting the name MoltBot, it is now officially known as OpenClaw. (Mr. Steinberger did not respond to multiple interview requests.)

How Does OpenClaw Work?

OpenClaw operates on your computer or a virtual private server and connects messaging apps like WhatsApp, Telegram, and Discord to coding agents powered by models like Anthropic’s Claude. Users often opt for a high-performance device, like the Apple Mac Mini, to host OpenClaw for optimal speed. Due to increasing demand, some shops are reporting sold-out status.

Although it can run on older laptops, OpenClaw needs to stay operational 24/7 to execute your specified commands.

Commands are sent through your preferred messaging app, enabling a simple conversational interface. When you message OpenClaw, the AI agent interprets your prompt, generates, and executes commands on your machine. This can include tasks such as finding files, running scripts, editing documents, and automating browser activities. The results are succinctly summarized and sent back to you, creating an efficient communication loop akin to collaborating with a colleague.
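A hypothetical sketch of that loop is shown below: message in, command out, summary back. This is not OpenClaw’s actual code; the `ask_model` helper is a stand-in for a call to a coding model such as Claude, and it is stubbed here so the example runs on its own.

```python
# Hypothetical sketch of the message -> agent -> machine -> summary loop described above.
# Not OpenClaw's real code; ask_model() is a placeholder for an LLM API call.
import subprocess

def ask_model(prompt: str) -> str:
    # Stub: a real agent would query a coding model here. For this demo, every
    # request is answered with a harmless directory-listing command.
    return "ls"

def handle_message(user_message: str) -> str:
    command = ask_model(f"Turn this request into one shell command: {user_message}")
    result = subprocess.run(command, shell=True, capture_output=True, text=True, timeout=60)
    # A real agent would ask the model to summarise result.stdout; here we just truncate it.
    return result.stdout[:500] or result.stderr[:500]

if __name__ == "__main__":
    print(handle_message("What files are in my current folder?"))
```

Even in this toy form, the security implication is obvious: whatever the model returns gets executed on your machine, which is exactly the risk discussed further down.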

How Can OpenClaw Help You?

OpenClaw serves as an all-in-one assistant for both personal and professional tasks. Users typically start by having it declutter files on their devices before trusting it with more complex responsibilities. Some users report using it to manage busy WhatsApp groups by summarizing the necessary information and filtering out the irrelevant.

Other practical applications include:

  • Comparing supplier prices to minimize household spending.
  • Automating web browser tasks for seamless transactions.
  • Facilitating restaurant reservations by calling venues directly.
  • Preparing initial drafts for presentations while you sleep.

What Are the Risks?

While OpenClaw’s capabilities shine brightest when granted extensive access, this convenience raises significant risks. Experts warn that users may overlook potential vulnerabilities. For instance, OpenClaw could be exposed to prompt injection attacks or hacking if hosted on insufficiently secured virtual servers. This means sensitive data could be compromised.

Alan Woodward, a cybersecurity professor at the University of Surrey, cautions, “I can’t believe people would allow unrestricted access to sensitive software, including email and calendars.”

White hat hackers have already identified several security flaws in OpenClaw, raising concerns about the hands-off approach many users prefer, which simultaneously invites substantial risk.

Is This the Future of AI?

OpenClaw has recently launched its own social network, Moltbook, enabling its AI agents to interact and share insights. While humans can observe, they cannot engage directly in discussions, prompting fears about progression toward artificial general intelligence (AGI), potentially matching or exceeding human capabilities.

As we navigate this new realm, it’s vital to consider the implications of relinquishing extensive data access to AI agents. We may be standing on the brink of a new AI era—an agent capable of managing your life efficiently, if you’re prepared to grant it free access and relinquish control. It’s a thrilling yet daunting prospect.



Source: www.sciencefocus.com

Revolutionary Fast-Charging Quantum Battery Integrated with Quantum Computer Technology


Quantum batteries are making their debut in quantum computers, paving the way for future quantum technologies. These innovative batteries utilize quantum bits, or qubits, that change states, differing from traditional batteries that rely on electrochemical reactions.

Research indicates that harnessing quantum characteristics may enable faster charging times, yet questions about the practicality of quantum batteries remain. “Many upcoming quantum technologies will necessitate quantum versions of batteries,” states Dian Tan from Hefei National Research Institute, China. “While significant strides have been made in quantum computing and communication, the energy storage mechanisms in these quantum systems require further investigation.”

Tan and his team constructed the battery using 12 qubits formed from tiny superconducting circuits, controlled by microwaves. Each qubit functioned as a battery cell and interacted with neighboring qubits.

The researchers tested two distinct charging protocols, one mirroring conventional battery charging without quantum interactions, while the other leveraged quantum interactions. They discovered that exploiting these interactions led to an increase in power and a quicker charging capacity.
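
The comparison can be illustrated with a toy two-qubit model, far simpler than the 12-qubit device: charge the "battery" with a drive acting on each qubit separately, then again with an added qubit-qubit interaction, and track the stored energy. The Hamiltonians and parameters below are invented for illustration and make no claim about which protocol wins in the real experiment.

```python
# Toy comparison of two charging protocols for a two-qubit "battery" (illustration only).
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
kron = np.kron

omega, g, J = 1.0, 0.5, 0.5
H_batt = -0.5 * omega * (kron(sz, I2) + kron(I2, sz))   # stores the energy
H_drive = g * (kron(sx, I2) + kron(I2, sx))             # charge each qubit separately
H_int = J * kron(sx, sx)                                # qubit-qubit interaction

ground = np.array([1, 0, 0, 0], dtype=complex)          # |00>, the empty battery

def stored_energy(H_charge, t):
    psi = expm(-1j * (H_batt + H_charge) * t) @ ground
    return np.real(psi.conj() @ H_batt @ psi) - (-omega)  # energy above the ground state

for t in np.linspace(0.5, 3.0, 6):
    print(f"t={t:.1f}  parallel={stored_energy(H_drive, t):.3f}"
          f"  with interaction={stored_energy(H_drive + H_int, t):.3f}")
```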

“Quantum batteries can achieve power output up to twice that of conventional charging methods,” asserts Alan Santos from the Spanish National Research Council. Notably, the advantage appears even with only nearest-neighbor interactions between qubits, which are typical of superconducting quantum computers; engineering additional beneficial interactions remains a practical challenge.

James Quach from Australia’s Commonwealth Scientific and Industrial Research Organisation adds that previous quantum battery experiments have utilized molecules rather than components in current quantum devices. Quach and his team have theorized that quantum batteries may enhance the efficiency and scalability of quantum computers, potentially becoming the power source for future quantum systems.

However, comparing conventional and quantum batteries remains a complex task, notes Dominik Shafranek from Charles University in the Czech Republic. In his opinion, translating the advantages of quantum batteries into practical applications is currently ambiguous.

Kaban Modi from the Singapore University of Technology and Design asserts that while benefits exist for qubits interfacing exclusively with their nearest neighbors, their research indicates these advantages can be negated by real-world factors like noise and sluggish qubit control.

Additionally, the burgeoning requirements of extensive quantum computers may necessitate researching energy transfer within quantum systems, as they might incur significantly higher energy costs compared to traditional computers, Modi emphasizes.

Tan believes that energy storage for quantum technologies, particularly in quantum computers, is a prime candidate for their innovative quantum batteries. Their next goal involves integrating these batteries with qubit-based quantum thermal engines to produce energy for storage within quantum systems.

Topics:

  • Quantum Computing/
  • Quantum Physics


Source: www.newscientist.com

Revolutionary Quantum Simulator Breaks Records, Paving the Way for New Materials Discovery

Quantum Simulation of Qubits

Artist Representation of Qubits in the Quantum Twins Simulator

Silicon Quantum Computing

A groundbreaking large-scale quantum simulator has the potential to unveil the mechanisms of exotic quantum materials and pave the way for their optimization in future applications.

Quantum computers are set to leverage unique quantum phenomena to perform calculations that are currently unmanageable for even the most advanced classical computers. Similarly, quantum simulators can aid researchers in accurately modeling materials and molecules that remain poorly understood.

This holds particularly true for superconductors, which conduct electricity with remarkable efficiency. The efficiency of superconductors arises from quantum effects, making it feasible to implement their properties directly in quantum simulators, unlike classical devices that necessitate extensive mathematical transformations.

Michelle Simmons and her team at Australia’s Silicon Quantum Computing have successfully developed the largest quantum simulator to date, known as Quantum Twin. “The scale and precision we’ve achieved with these simulators empower us to address intriguing challenges,” Simmons states. “We are pioneering new materials by crafting them atom by atom.”

The researchers designed multiple simulators by embedding phosphorus atoms into silicon chips. Each atom acts as a quantum bit (qubit), the fundamental component of quantum computers and simulators. The team meticulously configured the qubits into grids that replicate the atomic arrangement found in real materials. Each iteration of the Quantum Twin consisted of a square grid containing 15,000 qubits, surpassing any previous quantum simulator in scale. While similar configurations have been built using thousands of cryogenic atoms in the past, Quantum Twin breaks new ground.

By integrating electronic components into each chip via a precise patterning process, the researchers managed to control the electron properties within the chips. This emulates the behavior of electrons within the simulated materials, which is crucial for understanding how electricity flows. Researchers can tune how easily an electron can be added at specific grid points, or how readily electrons “hop” between two points.
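
For readers who want a feel for the knobs being described, here is a toy tight-binding sketch: on-site energies set how easily an electron sits on a site, and a hopping amplitude sets how easily it moves between neighbors. This is an invented illustration of that class of model, not Silicon Quantum Computing’s device, though random on-site energies of this kind are what drive the conductor-insulator physics mentioned in the next paragraph.

```python
# Toy single-particle tight-binding model on a small square grid (illustration only).
import numpy as np

def tight_binding(n: int, onsite: np.ndarray, t: float) -> np.ndarray:
    """Hamiltonian for an n x n grid: diagonal = on-site energies, -t = hopping."""
    H = np.diag(onsite.ravel().astype(float))
    idx = np.arange(n * n).reshape(n, n)
    for i in range(n):
        for j in range(n):
            if i + 1 < n:   # hop to the site below
                H[idx[i, j], idx[i + 1, j]] = H[idx[i + 1, j], idx[i, j]] = -t
            if j + 1 < n:   # hop to the site on the right
                H[idx[i, j], idx[i, j + 1]] = H[idx[i, j + 1], idx[i, j]] = -t
    return H

rng = np.random.default_rng(0)
disorder = rng.normal(0.0, 2.0, size=(4, 4))   # "impurities": random on-site energies
energies = np.linalg.eigvalsh(tight_binding(4, disorder, t=1.0))
print(energies[:5])                            # low-lying single-particle levels
```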

Simmons noted that while conventional computers struggle with large two-dimensional simulations and complex electron property combinations, the Quantum Twin simulator shows significant potential for these scenarios. The team tested the chip by simulating the transition between conductive and insulating states—a critical mathematical model explaining how impurities in materials influence electrical conductivity. Additionally, they recorded the material’s “Hall coefficient” across different temperatures to assess its behavior in magnetic fields.

With its impressive size and variable control, the Quantum Twins simulator is poised to tackle unconventional superconductors. While conventional superconductors function well at low temperatures or under extreme pressure, some can operate under milder conditions. Achieving a deeper understanding of superconductors at ambient temperature and pressure is essential—knowledge that quantum simulators are expected to furnish in the future.

Moreover, Quantum Twins can also facilitate the investigation of interfaces between various metals and polyacetylene-like molecules, holding promise for advancements in drug development and artificial photosynthesis technologies, Simmons highlights.


Source: www.newscientist.com

Revolutionary Nanomaterial Design to Enhance Solar Power Efficiency by Harnessing More Sunlight

Researchers from Korea University are paving the way for more efficient and cost-effective renewable energy generation by utilizing gold nanospheres designed to capture light across the entire solar spectrum.

Hung Lo et al. introduced plasmonic colloidal superballs as a versatile platform for broadband solar energy harvesting. Image credit: Hung Lo et al., doi: 10.1021/acsami.5c23149.

Scientists are exploring novel materials that efficiently absorb light across the solar spectrum to enhance solar energy harvesting.

Gold and silver nanoparticles have been identified as viable options due to their ease of fabrication and cost-effectiveness, yet current nanoparticles primarily absorb visible wavelengths.

To extend absorption into additional wavelengths, including near-infrared light, researcher Seungwoo Lee and colleagues from Korea University propose the innovative use of self-assembled gold superballs.

These unique structures consist of gold nanoparticles aggregating to form small spherical shapes.

The diameter of the superball was meticulously adjusted to optimize absorption of sunlight’s diverse wavelengths.

The research team first employed computer simulations to refine the design of each superball and predict the overall performance of the superball film.

Simulation outcomes indicated that the superball could absorb over 90% of sunlight’s wavelengths.

Next, the scientists created a film of gold superballs by drying a solution containing these structures on a commercially available thermoelectric generator, a device that converts heat—here, the heat from absorbed sunlight—into electricity.

Films were produced under ambient room conditions—no cleanroom or extreme temperatures needed.

In tests using an LED solar simulator, the average solar absorption rate of the superball-coated thermoelectric generator reached approximately 89%, nearly double that of a conventional thermoelectric generator featuring a single gold nanoparticle membrane (45%).

“Our plasmonic superball offers a straightforward method to harness the entire solar spectrum,” said Dr. Lee.

“Ultimately, this coating technology could significantly reduce barriers for high-efficiency solar and photothermal systems in real-world energy applications.”

The team’s research is published in the journal ACS Applied Materials & Interfaces.

_____

Ro Kyung Hoon et al. 2026. Plasmonic Supraball for Scalable Broadband Solar Energy Generation. ACS Applied Materials & Interfaces 18 (1): 2523-2537; doi: 10.1021/acsami.5c23149

Source: www.sci.news

Revolutionary Weight Loss Drug Launch: The Ultimate Solution is Here!

Ozempic is a well-known name, primarily approved for diabetes treatment in the UK and US, yet it is commonly prescribed ‘off-label’ for weight loss. This medication has essentially become synonymous with a groundbreaking new category of weight loss drugs.

Injectable medications like Ozempic, Wegovy, Mounjaro, Zepbound, Rybelsus, and Saxenda can facilitate significant weight loss, approaching 20% of a person’s body weight in certain individuals.

Now, the next generation of weight loss solutions has arrived, and they are available in pill form.

The debut of these tablets occurred in the United States, with Novo Nordisk (the producer of Ozempic) launching Wegovy tablets on January 5, 2026. Their quick rise in popularity resulted in over 18,000 new prescriptions issued in the first week alone.

But Wegovy won’t stand alone for long. Eli Lilly’s competing drug, orforglipron, is projected to gain FDA approval this spring, and several alternatives are in development.

(Currently, these tablets are not available in the UK; however, UK policies are anticipated to follow the FDA’s example.)

The mechanism of these tablets mirrors that of injectables. The active compounds, known as “incretins” (like Wegovy’s semaglutide and Mounjaro’s tirzepatide), deceive the body into feeling full by imitating natural satiety hormones.

As digestion slows and appetite falls, you naturally eat less, which leads to weight loss.

Now available in pill form, this medication promises similar life-altering effects and protection against obesity-related illnesses, all while being more affordable than ever.

Is it too good to be true? Experts caution that while the pill brings substantial benefits, it also carries notable risks.


Can Weight Loss Drugs Transform the Landscape of Treatment?

These tablets could signify a new chapter in the management of obesity, providing broader access to life-altering healthcare.

“Not everyone prefers injectable medications,” states Dr. Simon Cork, a senior lecturer in appetite and weight regulation at Anglia Ruskin University in the UK. “Injections can be uncomfortable for many patients, making oral administration a more appealing option.”

Besides comfort, switching from injections to pills could massively reduce monthly costs. Those using weight loss drugs today often spend hundreds of dollars each month on injections.

Weight loss pills can be stored at room temperature in standard pill blister packs, making them more accessible – Credit: Getty Images

Thanks to the absence of needles and refrigeration needs, these pills can be produced and distributed at lower costs, providing weight loss solutions to millions who previously faced exorbitant prices.

“Overall, these pills are expected to be significantly more affordable than current injection therapies,” says Cork.

This trend is already visible in the US, where Wegovy pens are priced at $349 (approximately £250) per month, whereas Wegovy tablets retail for $149 (around £110).

In the UK, nearly 95% of incretin users incur high private fees. According to Professor Giles Yeo from the University of Cambridge, the NHS often cannot prescribe these expensive medications to all patients who need them.

“Patients may need to maintain these drugs for extended periods, which exacerbates the financial barrier, particularly for those from disadvantaged backgrounds most susceptible to obesity,” Cork noted. “I hope that these oral medications will democratize access.”

Addressing Long-Term Challenges

However, these drugs may not be the most effective options, even as their availability increases.

Incretins tend to offer lower efficacy in pill form. Injectable Wegovy has demonstrated a capacity to help users lose 15% of body weight after 68 weeks, while Wegovy tablets showed only 13.6% weight loss across 64 weeks.

The efficacy of pills may also not match that of the newest injected drugs. Retatrutide, still in development, has produced a 24% body weight reduction in just 48 weeks.

Administering these drugs through pills poses inherent challenges. Oral medications must traverse the stomach and liver before entering circulation, resulting in the manufacturer needing to increase the amount of active ingredient to achieve desired outcomes.

Consequently, weight loss results from pills may not be as rapid as from injections. Nevertheless, a significant complaint regarding injections—that discontinuing them often leads to weight regain—may see improvement.

A 2022 study revealed that participants who halted Wegovy injections regained up to two-thirds of their lost weight within one year.

The emergence of the pill could provide a solution. A recent Eli Lilly study, the ATTAIN-MAINTAIN trial, showed that orforglipron tablets helped participants stabilize their weight after stopping injectable therapy.

“Many might rely on these medications to maintain weight loss,” Yeo suggests.

Cork adds, “Injectables can be utilized for optimal weight loss, and pills can help maintain this weight affordably.”

Most incretins mimic the natural satiety hormone GLP-1, but new treatments are targeting multiple hormones for enhanced effectiveness – Credit: Getty Images

The Risks and Concerns of the Pill Revolution

While these drugs possess the potential to catalyze significant positive change, their widespread availability also raises risks for vulnerable populations.

“The major danger is these drugs entering the wrong hands,” warns Yeo. “Since there’s no limit on who these drugs might affect, a 300-pound person aiming to lose 50 pounds could use them just as easily as a 16-year-old girl weighing 75 pounds.”

“Pills can easily be trafficked, making them accessible to anyone. It’s essential to establish strict regulations around their distribution,” he urges.

Cork shares concerns over side effects. Incretins can provoke various symptoms, including nausea, vomiting, constipation, and diarrhea. Clinical trials found that three-quarters of participants experienced digestive issues.

Moreover, there are rare but serious risks such as pancreatitis, gallstones, and gastroparesis. Additionally, interactions with other medications, including contraceptives, could affect their efficacy.

“The risk of pancreatitis is low, around 1%,” Cork notes. “But with millions potentially using these drugs, this risk becomes concerning without appropriate oversight.”

Though these warnings are sobering, they remain speculative. The actual impact of these drugs is still uncertain.

“2026 is poised to be a crucial year in understanding the efficacy, prevalence, and applications of these medications,” Yeo concludes. “Time will tell how things unfold.”

Read more:

Source: www.sciencefocus.com

Why Crowdsourcing Wikipedia is the Most Revolutionary Idea of the Century


In today’s digital landscape, hostility often overshadows collaboration. Remarkably, Wikipedia—a publicly editable encyclopedia—has emerged as a leading knowledge resource worldwide. “While it may seem improbable in theory, it remarkably works in practice,” states Anusha Alikan from the Wikimedia Foundation, the nonprofit behind Wikipedia.

Founded by Jimmy Wales in 2001, Wikipedia continues to thrive, although co-founder Larry Sanger left the project the following year and has since expressed ongoing criticism, claiming it is “overrun by ideologues.”

Nonetheless, Sanger’s opinions are not widely echoed. Wikipedia boasts over 64 million articles in 300+ languages, generating an astonishing 15 billion hits monthly. Currently, it ranks as the 9th most visited website globally. “No one could have anticipated it would become such a trusted online resource, yet here we are,” Alikan commented.

Building trust on a massive scale is no small achievement. Although the Internet has democratized access to human knowledge, it often presents fragmented and unreliable information. Wikipedia disrupts this trend by allowing anyone to contribute, supported by approximately 260,000 volunteers worldwide, making an impressive 342 edits per minute. A sophisticated system grants broader editing rights to responsible contributors, fostering trust that encourages collaboration even among strangers.

Wikipedia also actively invites special interest groups to create and edit content. For instance, the Women in Red project tackles gender disparities, while other initiatives focus on climate change and the history of Africa. All articles uphold strict accuracy standards, despite critics like Sanger alleging bias.

As an anomaly in the technology sector, Wikipedia operates without advertising, shareholders, or profit motives. It has maintained this unique position for over two decades with great success.

However, the rise of artificial intelligence poses new challenges. AI systems can generate misleading content, strain Wikipedia’s resources as its content is scraped for training, and cut traffic and donations as AI-driven search summaries answer queries without sending readers to the site.

Topics:

  • Artificial Intelligence/
  • Internet

Source: www.newscientist.com

Revolutionary Cosmological Simulations Illuminate Black Hole Growth in the Early Universe

Revolutionary simulations from Maynooth University astronomers reveal that, in the dense and turbulent early universe, “light seed” black holes could swiftly consume matter and grow to rival the supermassive black holes found at the centers of early galaxies.

Computer visualization of a baby black hole growing in an early universe galaxy. Image credit: Maynooth University.

Daksar Mehta, a PhD candidate at Maynooth University, stated: “Our findings indicate that the chaotic environment of the early universe spawned smaller black holes that underwent a feeding frenzy, consuming surrounding matter and eventually evolving into the supermassive black holes observed today.”

“Through advanced computer simulations, we illustrate that the first-generation black holes, created mere hundreds of millions of years after the Big Bang, expanded at astonishing rates, reaching sizes up to tens of thousands of times that of the Sun.”

Dr. Louis Prowl, a postdoctoral researcher at Maynooth University, added: “This groundbreaking revelation addresses one of astronomy’s most perplexing mysteries.”

“It explains how black holes formed in the early universe could quickly attain supermassive sizes, as confirmed by observations from NASA/ESA/CSA’s James Webb Space Telescope.”

The dense, gas-rich environments of early galaxies facilitated brief episodes of “super-Eddington accretion,” in which black holes consume matter faster than the conventional Eddington limit.

Despite this rapid consumption, the black holes continue to devour material effectively.

The results uncover a pivotal “missing link” between the first stars and the immense black holes that emerged later on.

Mehta elaborated: “These smaller black holes were previously considered too insignificant to develop into the gigantic black holes at the centers of early galaxies.”

“What we have demonstrated is that, although these nascent black holes are small, they can grow surprisingly quickly under the right environmental conditions.”

Black hole seeds fall into two classes: “heavy seeds” and “light seeds.”

Light seed black holes start with a mass of only a few hundred solar masses and must grow significantly to transform into supermassive entities, millions of times the mass of the Sun.

Conversely, heavy seed black holes begin life with masses reaching up to 100,000 times that of the Sun.

Previously, many astronomers believed that only heavy seed types could account for the existence of supermassive black holes seen at the hearts of large galaxies.

Dr. John Regan, an astronomer at Maynooth University, remarked: “The situation is now more uncertain.”

“Heavy seeds may be rare and depend on unique conditions for formation.”

“Our simulations indicate that ‘garden-variety’ stellar-mass black holes have the potential to grow at extreme rates during the early universe.”

This research not only reshapes our understanding of black hole origins but also underscores the significance of high-resolution simulations in uncovering the universe’s fundamental secrets.

“The early universe was far more chaotic and turbulent than previously anticipated, and the population of supermassive black holes is also more extensive than we thought,” Dr. Regan commented.

The findings hold relevance for the ESA/NASA Laser Interferometer Space Antenna (LISA) mission, set to launch in 2035.

Dr. Regan added, “Future gravitational wave observations from this mission may detect mergers of these small, rapidly growing baby black holes.”

For further insights, refer to this paper, published in this week’s edition of Nature Astronomy.

_____

D.H. Meter et al. Growth of light seed black holes in the early universe. Nat Astron published online on January 21, 2026. doi: 10.1038/s41550-025-02767-5

Source: www.sci.news

Unlocking Epigenetics: The Century’s Most Revolutionary Concept

As we entered the new millennium, the number of genes in our genome was hotly debated. The count turned out to be significantly lower than anticipated, spurring a re-evaluation of how evolutionary complexity arises.

The Human Genome Project revealed in 2001 that we possess fewer than 40,000 protein-coding genes — a number that has since been adjusted to around 20,000. This finding necessitated the exploration of alternative mechanisms to account for the complexity of our biology and evolution; epigenetics now stands at the forefront.

Epigenetics encompasses the various ways that molecules can interact with DNA or RNA, ultimately influencing gene activity without altering the genetic code itself. For instance, two identical cells can exhibit vastly different characteristics based purely on their epigenetic markers.

Through epigenetics, we can extract even greater complexity from our genome, factoring in influences from the environment. Some biologists are convinced that epigenetics can play a significant role in evolutionary processes.

A notable study in 2019 demonstrated how yeast exposed to toxic substances survived by silencing specific genes through epigenetic mechanisms. Over generations, certain yeast cultures developed genetic mutations that amplified gene silencing, indicating that evolutionary changes began with epigenetic modifications.

Epigenetics is crucial for expanding our understanding of evolutionary theory. Nevertheless, skepticism persists regarding its broader implications, particularly in relation to plants and other organisms.

For instance, Adrian Bird, a geneticist at the University of Edinburgh, has expressed doubts, arguing in a recent paper that there is no clear evidence that environmental factors such as drought leave lasting marks on mammalian genomes. Though epigenetic markers may be inherited, many are erased early in mammalian development.

Some researchers dispute these concerns. “Epigenetic inheritance is observed in both plants and animals,” asserts Kevin Lara, an evolutionary biologist from the University of St. Andrews. In a comprehensive study published recently, Lara and colleagues compiled a wealth of research indicating that epigenetics could play a role across the entire tree of life.

So, why is there such division in the scientific community? Timing may be a factor. “Epigenetic inheritance is an evolving area of study,” observes Lara. While epigenetics has been recognized for decades, its relevance to evolutionary research has only gained traction in the past 25 years, making it a complex field to assess.


Source: www.newscientist.com

Transformer Architecture: The Revolutionary AI Innovation Redefining the 21st Century

Discover Today’s Most Powerful AI Tools

Explore the incredible capabilities of modern AI tools that can summarize documents, generate artwork, write poetry, and even predict protein folding. At the heart of these advancements is the groundbreaking transformer architecture, which revolutionized the field of artificial intelligence.

Unveiled in 2017 at a modest conference center in California, the transformer architecture enables machines to process information in a way that closely resembles human thinking patterns. Historically, AI models relied on recurrent neural networks, which read text sequentially from left to right while retaining only the most recent context. This method sufficed for short phrases, but when dealing with longer and more complex sentences, critical details often slipped through the cracks, leading to confusion and ambiguity.

The introduction of transformers to the AI landscape marked a significant shift, embracing the concept of self-attention. This approach mirrors the way humans naturally read and interpret text. Instead of strictly scanning word by word, we skim, revisit, and draw connections based on context. This cognitive flexibility has long been the goal in natural language processing, aiming to teach machines not just to process language, but to understand it.

Transformers emulate this mental leap effectively; their self-attention mechanism enables them to evaluate every word in a sentence in relation to every other word simultaneously, identifying patterns and constructing meaningful connections. As AI researcher Sasha Ruccioni notes, “You can take all the data you get from the Internet and Wikipedia and use it for your own tasks. And it was very powerful.”
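
The mechanism itself is compact enough to sketch. Below is a minimal, illustrative version of scaled dot-product self-attention with a single head and no learned projections or masking; real transformers add those, but the core idea of every position weighting every other position is the same.

```python
# Minimal sketch of scaled dot-product self-attention (single head, no learned
# query/key/value projections, no masking) — for illustration only.
import numpy as np

def self_attention(X: np.ndarray) -> np.ndarray:
    """X has shape (sequence_length, d); every token attends to every other token."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                        # similarity of each word pair
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the whole sentence
    return weights @ X                                    # each output mixes all positions at once

tokens = np.random.default_rng(1).normal(size=(5, 8))    # five "words", eight features each
print(self_attention(tokens).shape)                       # (5, 8)
```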

Moreover, this transformative flexibility extends beyond text. Today’s transformers drive tools that can generate music, render images, and even model molecules. A prime example is AlphaFold, which treats proteins—long chains of amino acids—analogously to sentences. The function of a protein hinges on its folding pattern and the spatial relationships among its constituent parts. The attention mechanism allows this model to assess these distant associations with remarkable precision.

In retrospect, the insight behind transformers seems almost intuitive. Both human and artificial intelligence rely on discerning when and what to focus on. Transformers haven’t merely enhanced machines’ language comprehension; they have established a framework for navigating any structured data in the same manner that humans navigate the complexities of their environments.

Source: www.newscientist.com

Understanding Neurodiversity: Why ‘Normal’ Brains Don’t Exist – A Revolutionary Perspective for the Century

Historically, science operated under the notion of a “normal brain,” one that fits standard societal expectations. Those who diverge from this model have often been labeled with a disorder or mental health condition, treated as if they were somehow flawed. For years, researchers have refined the notion that neurodevelopmental conditions, including autism, ADHD, dyslexia, and movement disorders, should be recognized as distinctive variations representing different neurocognitive frameworks.

In the late 1990s, a paradigm shift occurred. What if these “disorders” were simply natural variations in brain wiring? What if human traits existed on a spectrum rather than a stark boundary between normal and abnormal? Those at either end of the spectrum may face challenges, yet their exceptional brains also offer valuable strengths. Viewed through this lens, diverse brains represent assets, contributing positively to society when properly supported.

The concept of neurodiversity gained momentum, sparking lively debates in online autism advocacy groups. By 2013, the Diagnostic and Statistical Manual of Mental Disorders recognized autism as a spectrum condition, abolishing the Asperger’s syndrome diagnosis and classifying it on a scale from Level 1 to Level 3 based on support needs. This shift solidified the understanding of neurodivergent states within medical literature.

Since the early 2000s, research has shown that individuals with autism often excel in mathematical reasoning and attention to detail. Those with ADHD frequently outperform others in creativity, while individuals with dyslexia are adept at pattern recognition and big-picture thinking. Even those with movement disorders have been noted to develop innovative coping strategies.

These discoveries have led many scientists to argue that neurodivergent states are not mere evolutionary happenstance. Instead, our ancestors likely thrived thanks to pioneers, creative thinkers, and detail-oriented individuals in their midst. A group possessing diverse cognitive strengths could more effectively explore, adapt, and survive. Some researchers now propose that the autism spectrum comprises distinct subtypes with varying clusters of abilities and challenges.

While many researchers advocate for framing neurodivergent characteristics as “superpowers,” some caution against overly positive portrayals. “Excessive optimism, especially without supporting evidence, can undermine the seriousness of these conditions,” says Dr. Jessica Eccles, a psychiatrist and neurodiversity researcher at Brighton and Sussex Medical School. Nevertheless, she emphasizes that “with this vocabulary, we can better understand both the strengths and challenges of neurodiversity, enabling individuals to navigate the world more effectively.”


Source: www.newscientist.com

The Brain’s Vast Interconnectedness: The Revolutionary Idea of the Century


You’ve likely encountered the parable of the blind men and the elephant, where each individual’s perspective is limited to one part, leading to a distorted understanding of the whole. This concept resonates deeply in neuroscience, which has historically treated the brain as a collection of specialized regions, each fulfilling unique functions.

For decades, our insights into brain functionality arose from serendipitous events, such as the case of Phineas Gage, a 19th-century railroad worker whose personality changed dramatically following a severe brain injury. More recent studies employing brain stimulation have linked the amygdala with emotion and the occipital lobe with visual processing, yet this provides only a fragmented understanding.

Brain regions demonstrate specialization, but this does not encapsulate the entire picture. The advent of imaging technologies, particularly functional MRI and PET scans in the late 1990s and early 2000s, revolutionized our comprehension of the brain’s interconnectedness. Researchers discovered that complex behaviors stem from synchronized activity across overlapping neural networks.

“Mapping brain networks is playing a crucial role in transforming our understanding in neuroscience,” states Luis Pessoa from the University of Maryland.

This transformative journey commenced in 2001 when Marcus Raichle, now at Washington University in St. Louis, characterized the Default Mode Network (DMN). This interconnected network activates during moments of rest, reflecting intrinsic cognitive processes.

In 2003, Kristen McKiernan, then at the Medical College of Wisconsin, and her team showed that the DMN is most active during internally focused states, such as daydreaming and introspection, providing a “resting state” benchmark against which overall brain activity can be evaluated. Researchers then began to correlate DMN activity with advanced behaviors, including emotional intelligence and theory of mind.

As discoveries proliferated across other networks—pertaining to attention, language, emotion, memory, and planning—our understanding of mental health and neurodiversity evolved. These neural differences are now thought to be linked with various neurological conditions, including Parkinson’s disease, PTSD, depression, anxiety, and ADHD.

Network science has emerged as a pivotal field, enhancing our comprehension of disorders from autism, characterized by atypical social salience networks—those that detect and prioritize salient social cues—to Alzheimer’s disease, where novel research indicates abnormal protein spread via network pathways. We also acknowledge the inspiration it provides for developing artificial neural networks in AI systems like ChatGPT.

Neural networks have not only reshaped our understanding of brain functionalities but also the methodologies for diagnosing and treating neurological disorders. While we might not yet perceive the entirety of the elephant, our view is undeniably clarifying as science progresses.


Source: www.newscientist.com

Revolutionary New Sensor Transforms Optical Imaging Technology

Revolutionizing Imaging Technology: UConn Scientists Create Lens-Free Sensor with Submicron 3D Resolution



Illustration of MASI’s working principle. Image credit: Wang et al., doi: 10.1038/s41467-025-65661-8.

“This technological breakthrough addresses a longstanding issue in imaging,” states Professor Guoan Zheng, the lead author from the University of Connecticut.

“Synthetic aperture imaging leverages the combination of multiple isolated sensors to mimic a larger imaging aperture.”

This technique works effectively in radio astronomy due to the longer wavelengths of radio waves, which facilitate precise sensor synchronization.

However, at visible wavelengths, achieving this synchronization is physically challenging due to the significantly smaller scales involved.

The Multiscale Aperture Synthesis Imager (MASI) turns this challenge on its head.

Instead of requiring multiple sensors to operate in perfect synchronization, MASI utilizes each sensor to independently measure light, employing computational algorithms to synchronize these measurements.

“It’s akin to multiple photographers capturing the same scene as raw light measurements, which software then stitches together into a single ultra-high-resolution image,” explains Professor Zheng.

This innovative computational phase-locking method removes the dependency on strict interferometric setups that previously limited the use of optical synthetic aperture systems.

MASI diverges from conventional optical imaging through two key innovations.

Firstly, instead of using a lens to focus light onto a sensor, MASI employs an array of coded sensors positioned on a diffractive surface, capturing raw diffraction patterns—the way light waves disperse after encountering an object.

These measurements contain valuable amplitude and phase information, which are decoded using advanced computational algorithms.

After reconstructing the complex wavefront from each sensor, the system digitally adjusts the wavefront and numerically propagates it back to the object’s surface.
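
Numerically propagating a wavefront back to an object plane is a standard operation in computational optics; one common choice is the angular spectrum method sketched below. This is an illustrative sketch under assumed parameters (wavelength, pixel size, distance), not MASI’s code.

```python
# Angular-spectrum propagation of a complex wavefront (standard method; shown only
# to illustrate "numerically propagating the wavefront back", not MASI's implementation).
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, distance):
    """Propagate a complex field on a square grid by `distance` (negative = back-propagate)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_size)                 # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.where(arg > 0, np.exp(1j * kz * distance), 0.0)  # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Example: back-propagate a 512x512 pattern by 5 cm at 532 nm with 1-micron pixels.
pattern = np.ones((512, 512), dtype=complex)             # stand-in for a recovered wavefront
obj = angular_spectrum_propagate(pattern, 532e-9, 1e-6, -0.05)
print(obj.shape)
```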

A novel computational phase synchronization technique iteratively fine-tunes the relative phase offsets to enhance overall coherence and energy during the joint reconstruction process.

This key innovation enables MASI to surpass diffraction limits and constraints posed by traditional optical systems by optimizing the combined wavefront in the software, negating the need for physical sensor alignment.

As a result, MASI achieves a larger virtual synthetic aperture than any individual sensor, delivering submicron resolution and a wide field of view, all without the use of lenses.

Unlike traditional lenses for microscopes, cameras, and telescopes, which require designers to make trade-offs, MASI enables higher resolution without the limitations of lens proximity.

MASI captures diffraction patterns from several centimeters away, reconstructing images with unparalleled submicron resolution. This innovation is akin to inspecting the intricate ridges of a human hair from a distance, rather than needing to hold it inches away.

“The potential applications of MASI are vast, ranging from forensics and medical diagnostics to industrial testing and remote sensing,” highlights Professor Zheng.

“Moreover, the scalability is extraordinary. Unlike traditional optical systems, which become increasingly complex, our framework scales linearly, opening doors to large arrays for applications we have yet to conceptualize.”

For more details, refer to the team’s published paper in Nature Communications.

_____

Wang et al. 2025. Multiscale aperture synthetic imager. Nat Commun 16, 10582; doi: 10.1038/s41467-025-65661-8

Source: www.sci.news

Revolutionary Solution for Cosmic Acceleration: Overcoming Dark Energy Challenges

Researchers from the Center for Applied Space Technology and Microgravity at the University of Bremen and Transilvania University of Brașov have unveiled a theoretical framework that challenges our understanding of the universe’s accelerating expansion, potentially rendering dark energy obsolete. They suggest that this acceleration may be an intrinsic characteristic of space-time geometry, rather than a result of unknown cosmic forces.

This artist’s impression traces the evolution of the universe from the Big Bang, through the formation of the Cosmic Microwave Background, to the emergence of galaxies. Image credit: M. Weiss / Harvard-Smithsonian Center for Astrophysics.

For over 25 years, scientists have been puzzled by the observation that the expansion of the universe is accelerating, despite the pull of gravity, which should be slowing it down.

In the 1990s, astronomers identified this acceleration through observations of distant Type Ia supernovae, leading to the prevalent theory of dark energy, an invisible force believed to drive this expansion.

Nevertheless, the actual nature of dark energy remains elusive within the Standard Model of cosmology.

Dr. Christian Pfeiffer and his team propose that we may better understand this cosmic acceleration by re-evaluating the geometric framework used to describe gravity.

Central to modern cosmology is Einstein’s theory of general relativity, which details how matter and energy shape space-time.

The universe’s evolution is modeled using the Friedmann equation, which is derived from Einstein’s principles.
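
For reference, the standard general-relativistic form of that equation relates the expansion rate (the Hubble parameter, H = ȧ/a) to the universe’s energy content, with the cosmological constant Λ playing the role of dark energy in the Standard Model; the Finsler approach described below modifies this relation rather than adding such a term. The paper’s modified equation is not reproduced here.

```latex
H^{2} \;=\; \left(\frac{\dot{a}}{a}\right)^{2}
       \;=\; \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3}
```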

The researchers introduce an innovative solution based on Finsler gravity, an extension of Einstein’s theory.

This approach enhances our understanding of spacetime geometry and allows for a more nuanced exploration of how matter, especially gases, interacts with gravity.

Unlike general relativity, which depends on rigid geometric forms, Finsler gravity presents a more versatile space-time geometry.

With this methodology, the authors recalibrated the equations governing cosmic expansion.

Informed by the Finsler framework, the modified Friedmann equation predicts the universe’s accelerated expansion without needing to introduce dark energy.

In essence, the accelerating expansion emerges directly from the geometry of space-time itself.

“This is a promising hint that we may explain the universe’s accelerating expansion partly without dark energy, drawing from generalized space-time geometry,” Pfeiffer remarked.

This concept does not entirely dismiss dark energy or invalidate the Standard Model.

Instead, it implies that some effects attributed to dark energy might have their roots in a deeper understanding of gravity.

“This fresh geometric outlook on the dark energy dilemma provides avenues for a richer comprehension of the universe’s foundational laws,” stated Dr. Pfeiffer.

The research team’s paper is published in the Journal of Cosmology and Astroparticle Physics.

_____

Christian Pfeiffer et al. 2025. From a moving gas to an exponentially expanding universe, the Finsler-Friedman equation. JCAP 10:050; DOI: 10.1088/1475-7516/2025/10/050

Source: www.sci.news

Revolutionary Quantum Computing Breakthrough: Secure Methods for Backing Up Quantum Information

Researchers from the University of Waterloo and Kyushu University have achieved a groundbreaking advancement in quantum computing by developing a novel method to create redundant, encrypted copies of qubits. This represents a pivotal step towards practical quantum cloud services and robust quantum infrastructure.



Google’s quantum computer – Image credit: Google.

In quantum mechanics, the no-cloning theorem asserts that creating an identical copy of an unknown quantum state is impossible.

Dr. Achim Kempf from the University of Waterloo and Dr. Koji Yamaguchi from Kyushu University emphasize that this fundamental rule remains intact.

However, they have demonstrated a method to generate multiple encrypted versions of a single qubit.

“This significant breakthrough facilitates quantum cloud storage solutions, such as quantum Dropbox, quantum Google Drive, and quantum STACKIT, enabling the secure storage of identical quantum information across multiple servers as redundant encrypted backups,” said Dr. Kempf.

“This development is a crucial step towards establishing a comprehensive quantum computing infrastructure.”

“Quantum computing offers immense potential, particularly for addressing complex problems, but it also introduces unique challenges.”

“One major difficulty in quantum computing is the no-cloning theorem, which dictates that quantum information cannot be directly copied.”

“This limitation arises from the delicate nature of quantum information storage.”

According to the researchers, quantum information functions analogously to splitting passwords.

“If you possess half of a password while your partner holds the other half, neither can be utilized independently. However, when both sections are combined, a valuable password emerges,” Dr. Kempf remarked.

“In a similar manner, qubits are unique in that they can share information in exponentially growing ways as they interconnect.”

“A single qubit’s information is minimal; however, linking multiple qubits allows them to collectively store substantial amounts of information that only materializes when interconnected.”

“This exceptional capability of sharing information across numerous qubits is known as quantum entanglement.”

“With 100 qubits, information can be simultaneously shared in 2^100 different ways, allowing for a level of shared entangled information far exceeding that of current classical computers.”
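
As a rough illustration of the exponential scaling described above (a minimal sketch, not part of the researchers’ protocol), the number of complex amplitudes needed to describe a general n-qubit state grows as 2^n:

```python
# Number of complex amplitudes needed to describe a general n-qubit state:
# it doubles with every extra qubit, reaching 2^100 for 100 qubits.
def state_space_dimension(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 10, 100):
    print(f"{n:>3} qubits -> {state_space_dimension(n)} amplitudes")
# 100 qubits -> 1267650600228229401496703205376 (about 1.3 x 10^30) amplitudes
```

A classical register of n bits, by contrast, is described by just n values, which is why entangled registers can hold so much more shared information.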

“Despite the vast potential of quantum computing, the no-cloning theorem restricts its applications.”

“Unlike classical computing, where duplicating information for sharing and backup is a common practice, quantum computing lacks a simple ‘copy and paste’ mechanism.”

“We have uncovered a workaround for the no-cloning theorem of quantum information,” explained Dr. Yamaguchi.

“Our findings reveal that by encrypting quantum information during duplication, we can create as many copies as desired.”

“This method circumvents the no-cloning theorem because when an encrypted copy is selected and decrypted, the decryption key is automatically rendered unusable; it functions as a one-time key.”

“Nevertheless, even one-time keys facilitate crucial applications such as redundant and encrypted quantum cloud services.”

The team’s research will be published in the journal Physical Review Letters.

_____

Koji Yamaguchi & Achim Kempf. 2026. Encrypted qubits can be cloned. Physical Review Letters in press. arXiv: 2501.02757

Source: www.sci.news

Revolutionary Fully 3D Printed Microscope Set to Make Waves in 2025

3D Printed Microscope

Dr. Liam M. Rooney/University of Strathclyde

In early 2025, excitement surged within the research community with the release of a groundbreaking preprint paper detailing the world’s first fully 3D printed microscope. This innovative device was constructed in just hours and costs a fraction of traditional models.

Dr. Liam Rooney, a professor at the University of Glasgow, explained to New Scientist that the response to their revolutionary microscope has been overwhelming, attracting interest from biomedical researchers, community organizations, and even filmmakers. He stated, “The community response has been remarkable.” This significant research has been published in the Microscope Journal.

For the microscope’s body, the team employed designs from the OpenFlexure project, a public resource for 3D printing scientific instruments. Utilizing a commercial camera and light source, they controlled the entire system with a Raspberry Pi computer.

The true innovation lies in the 3D-printed microscope lenses made from clear plastic, drastically reducing costs and enhancing accessibility. Traditional microscopes can cost thousands; in contrast, this new model can be assembled for less than £50.

“Since January, we have printed approximately 1,000 lenses in various shapes,” remarked team member Gail McConnell, from the University of Strathclyde.

Several companies producing commercial products that require optics have reached out to discuss potential collaborations, as affordable, lightweight 3D-printed lenses are still uncommon in large-scale production. The team has successfully used the microscope to analyze blood samples and tissue sections from mouse kidneys, validating its utility for medical and biological research.

The researchers aim to democratize access to microscopy, and they are making strides toward that goal. Collaboration with a lab at the Kwame Nkrumah University of Science and Technology in Ghana is underway to enhance microscope accessibility for researchers and students across West Africa. Additionally, they’ve secured funding from the UK Institute for Technology Strategy, and are involved in programs designed to upskill and empower students facing educational barriers.

Furthermore, the team has developed a new microscope course through the Strathclyde Light Microscopy Course, aimed at researchers of all experience levels and providing a unique educational opportunity in the UK. Rooney noted, “This is revolutionizing our teaching methods.”

Looking towards the future, there is substantial potential for further enhancements in 3D printed microscopes. The research team is working to improve resolution without raising costs and has found methods to enhance image contrast by 67%.

McConnell emphasized that the microscope’s design leverages consumer electronics and accessible 3D printing technologies, stating that the future advancements and capabilities are limited only by current 3D printing technology. “As these printers advance, so will our capabilities. The only bottleneck is technology, not creativity,” she explained. “We’re frequently contacted by individuals eager to see new designs.”

Source: www.newscientist.com

Revolutionary Single Treatment May Permanently Eliminate High Cholesterol

Cholesterol management may be achievable by altering just one switch in an individual’s genetic code—potentially for a lifetime.

A pilot study featured in the New England Journal of Medicine demonstrated a novel gene therapy that decreased patients’ low-density lipoprotein (LDL) cholesterol, commonly known as “bad” cholesterol, by nearly 50%, while also reducing triglycerides by an average of 55%.

If forthcoming trials yield similar results, this one-time therapy could serve as an alternative to the combination of medications that millions currently rely on to manage their cholesterol.

LDL cholesterol and triglycerides are lipids produced by the liver; however, excessive accumulation in the bloodstream can lead to fat deposits that may result in cardiovascular diseases, which account for about one-third of deaths in the United States.

“Both LDL cholesterol and triglycerides are linked to severe cardiovascular risks, such as heart attacks, strokes, and mortality,” Steven Nissen, a professor of medicine at the Cleveland Clinic Lerner College of Medicine, told BBC Science Focus.

Nissen was part of a research team focusing on lowering cholesterol levels by targeting the ANGPTL3 gene, associated with LDL cholesterol and triglycerides.

About 1 in 250 individuals possess a mutation that deactivates this gene, leading to lower lipid levels in their blood. Nissen noted, “Importantly, the occurrence of cardiovascular diseases in these individuals is also minimal.”

Thanks to CRISPR gene-editing technology, gaining the benefits of this mutation no longer has to be left to chance.

CRISPR selectively modifies DNA by targeting specific genes. – Credit: Getty

Utilizing CRISPR, Nissen and his team developed a treatment to deactivate the ANGPTL3 gene in the liver, which was then infused into 15 patients during an initial safety study.

The treatment significantly reduced participants’ LDL and triglyceride levels within two weeks, and these reductions remained stable after 60 days. Nissen stated, “These changes are anticipated to be permanent.”

Healthcare professionals recommend maintaining LDL cholesterol levels below 100 mg/dL to promote heart health. While lifestyle changes can assist, many individuals, particularly those with genetic tendencies to high cholesterol, find it challenging to reach this target.

While existing medications are effective, no drugs simultaneously lower both LDL cholesterol and triglycerides, often requiring patients to take multiple medications daily for life to manage their cholesterol.

“The next phase of the trial is set to commence in the coming months, involving more patients with elevated LDL cholesterol or triglycerides,” Nissen stated.

If the trials continue to succeed, this therapy could serve as a lasting solution against some of the most significant health threats globally.

Source: www.sciencefocus.com

Revolutionary Paint-Like Coating Harvests Water from Air

Researchers at the University of Sydney, in collaboration with Dewpoint Innovations, have engineered a porous polymer coating that reflects as much as 97% of sunlight, radiates heat into the atmosphere, and keeps surfaces up to 6°C cooler than the ambient air, even in direct sunlight. This creates ideal conditions for atmospheric water vapor to condense into droplets on the cooled surface, much like the condensation seen on a bathroom mirror.

Experimental equipment installed on the roof of the Sydney Nanoscience Hub. Image credit: University of Sydney.

Professor Chiara Neto from the University of Sydney stated: “This innovation not only advances cool roof coating technology, but also paves the way for sustainable, low-cost, decentralized freshwater sources—an essential requirement given the challenges of climate change and rising water scarcity.”

A six-month field study conducted on the roof of the Sydney Nanoscience Hub demonstrated that dew was collected for 32% of the year, enabling a sustainable and reliable water source even during dry spells.

Under optimal conditions, the coating can yield up to 390 mL of water per square meter per day, so a coated roof of around 12 square meters could meet one person’s daily drinking-water needs.
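
A back-of-the-envelope check of that claim (a minimal sketch; the 2-liter daily drinking-water requirement is an assumed value, not from the study):

```python
# Rough check of the daily water yield from a coated roof under optimal conditions.
yield_ml_per_m2 = 390      # best-case yield quoted in the article, per square meter per day
roof_area_m2 = 12          # roof area quoted in the article
drinking_need_l = 2.0      # assumed daily drinking-water need per person (not from the study)

daily_yield_l = yield_ml_per_m2 * roof_area_m2 / 1000
print(f"Daily yield: {daily_yield_l:.1f} L")                      # ~4.7 L
print(f"Covers one person's drinking needs: {daily_yield_l >= drinking_need_l}")
```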

This research illustrates the integration of passive cooling techniques and atmospheric moisture collection into scalable paint-like solutions.

The extensive collection area suggests that this coating could have diverse applications in various industries, including water supply for livestock, horticulture for premium crops, cooling through spraying, and hydrogen production.

Unlike conventional white paints, the porous coatings, made from polyvinylidene fluoride-co-hexafluoropropene (PVDF-HFP), do not depend on UV-reflective pigments like titanium dioxide.

Dr. Ming Chiu, Chief Technology Officer of Dewpoint Innovations, remarked, “Our design achieves superior reflectiveness through an internal porous structure, ensuring longevity without the environmental downsides of pigment-based coatings.”

“By eliminating UV-absorbing materials, we have surmounted traditional limitations of solar reflectance while avoiding glare from diffuse reflection.”

“This equilibrium between performance and visual comfort enhances its ease of integration and appeal for real-world applications.”

Throughout six months of outdoor examination, researchers documented minute-by-minute data on cooling and water collection, confirming solid performance that remained stable under the harsh Australian sun—unlike similar technologies that often degrade quickly.

In addition to water harvesting, these coatings could help mitigate urban heat islands, lower energy needs for air conditioning, and provide climate-resilient water sources for regions facing heightened heat and water stress.

“This research also challenges the notion that dew collection is confined to humid environments,” noted Professor Neto.

“While humid conditions are optimal, condensation can also occur in arid and semi-arid areas where humidity increases during the night.”

“It isn’t a substitute for rainfall; rather, it serves as a water source when other supplies are scarce.”

The team’s work was published in the October 30th issue of Advanced Functional Materials.

_____

Ming Chiu et al. A passive cooling paint-like coating to capture water from the atmosphere. Advanced Functional Materials published online October 30, 2025. doi: 10.1002/adfm.202519108

Source: www.sci.news

Revolutionary ‘Aging Atlas’ Uncovers Organ Changes Throughout Life Span

The visible signs of aging, like wrinkles, gray hair, and joint discomfort, are merely surface reflections of more intricate processes happening within our cells. Deep inside your body, every organ experiences its own subtle molecular shifts as you grow older.

Researchers have now developed the most detailed map to date illustrating how this process unfolds.

The findings, based on data from more than 15,000 samples, are described in a preprint paper. The study, currently awaiting peer review, offers an unprecedented view of how aging modifies our genomic blueprint from head to toe.

A collaborative effort among researchers worldwide has led to the creation of a comprehensive “aging atlas” that maps DNA methylation (chemical tags that regulate gene activity) across 17 different types of human tissues while tracking age-related changes.

“DNA methylation, simply put, is a chemical modification on DNA,” said Dr. Jesse Poganik, co-author of the study and an instructor in medicine at Harvard Medical School, as reported by BBC Science Focus.

“At a fundamental level, their primary role is to regulate which genes are activated and which are not.”

If you stretched all the DNA in your body, it would span over 300 times the distance from Earth to the sun and back – Photo credit: Getty

Aside from a few mutations, every cell carries essentially the same genetic information in its genome. So how do lung cells know to behave as lung cells while stomach cells behave as stomach cells? This is where methylation plays a crucial role.

“The methylation or unmethylation status at a specific point on the genome determines whether a particular gene is turned on or off,” Poganik noted.

But what does all this reveal about the aging process?

DNA methylation serves as one of the body’s essential epigenetic mechanisms, acting as a molecular switch that toggles genes on or off without altering the DNA sequence itself. By adding and removing tiny molecules known as methyl groups, cells can adjust which genes are expressed in response to diet, exercise, infections, and other environmental influences.

As time passes, these methylation patterns alter in specific ways, forming the basis of the so-called epigenetic clock, which serves as a molecular measure of biological age. Until now, most of these clocks relied on blood samples, leaving scientists uncertain if other organs followed similar patterns.
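
Epigenetic clocks are typically built as weighted sums of methylation levels at selected CpG sites. A minimal sketch of that idea, with site names and coefficients invented purely for illustration (they are not taken from this study):

```python
# Minimal sketch of a linear epigenetic clock: predicted age is an intercept plus a
# weighted sum of methylation fractions (0..1) at selected CpG sites.
# Site names and coefficients are made up for illustration, not from the study.
CLOCK_WEIGHTS = {"cg0001": 12.4, "cg0002": -8.7, "cg0003": 21.1}
INTERCEPT = 35.0

def predict_age(methylation: dict[str, float]) -> float:
    return INTERCEPT + sum(
        weight * methylation.get(site, 0.0)
        for site, weight in CLOCK_WEIGHTS.items()
    )

sample = {"cg0001": 0.62, "cg0002": 0.31, "cg0003": 0.45}
print(f"Predicted biological age: {predict_age(sample):.1f} years")  # ~49.5
```

Real clocks fit hundreds of such coefficients to training data; an atlas of this kind supplies the tissue-specific methylation measurements that such models depend on.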

“DNA methylation patterns differ from tissue to tissue. They are specific to both the tissue and the cell type,” said Professor Nir Eynon, the study’s senior author and research group leader at Monash University, as reported by BBC Science Focus. “Thus, blood measurements don’t necessarily represent what happens in your liver, muscles, or brain.”

This gap prompted the team to gather all publicly available datasets on methylation within reach, complemented by new data from global collaborators.

The analysis covered nearly 1 million points across the genome, encompassing 17 organs, from the brain and heart to the skin, liver, stomach, and retina.

Atlas of Aging

The researchers discovered that the proportion of the genome carrying methylation tags varied significantly across tissues, ranging from approximately 38 percent in the cervix to over 60 percent in the retina. Surprisingly, age-related changes were quite uniform, with most tissues becoming increasingly hypermethylated as they age, resulting in more tagged DNA sites and the silencing of certain genes.

However, two organs defied this trend. Both skeletal muscle and lung tissue can experience a loss of methyl tags over time, leading to excessive or irregular gene expression.

“Most tissues show hypermethylation with age,” explained Dr. Max Jack, the study’s lead author, in an email to BBC Science Focus. “Yet when you refine it down to methylation rates, distinct tissue-specific patterns emerge.”

Different organs age at varying rates. An aging atlas begins to elucidate why – Credit: Getty

For instance, adipose tissue predominantly shifts toward hypermethylation, while changes are more balanced in the brain. These patterns may illuminate how different organs react to common aging stressors, such as inflammation, according to Jack.

Overall, significant age-related methylation changes were observed in brain, liver, and lung tissues, with skin and colon tissues also showing marked alterations. Conversely, pancreatic, retinal, and prostate tissues exhibited the least detectable age-related changes, possibly due to limited data or greater resilience to aging.

Correlation, Not Causation (For Now)

At first glance, the data imply that some organs age quicker than others. However, researchers caution that these distinctions cannot yet be interpreted as a direct rate of aging.

This is partly due to statistical factors. Some organs represent thousands of samples, while others are represented by only a handful.

Moreover, “We know that methylation changes occur as we age,” Poganik states. “What we don’t know is the extent to which they contribute to aging.”

In other words, while scientists are aware of the methylation alterations linked to aging, it’s still unclear whether those changes induce aging or whether aging triggers those changes.

Poganik believes that alterations in methylation likely account for at least some of the observable phenomena associated with aging. “Even cautious scientists would suggest there’s an element of causation,” he remarks.

The allure of this new atlas lies in its revelation of common molecular themes threading throughout the body, he adds.

“One of the most compelling aspects of this study is that it demonstrates some universality in the aging process. When we analyze various tissues, we encounter numerous similar methylation changes, suggesting a universal quality to aging.”

Nevertheless, he warns that not all alterations are causal. With so many ongoing methylation changes, some are almost certainly part of aging, while others may not hold significance.

The atlas cannot yet pinpoint which changes are critical and which are not, but it offers an invaluable collection of data for researchers to delve deeper into the question than ever before. It is now openly accessible through an online portal for other scientists to explore and utilize.

“We have consistently prioritized open-source research,” Jack states. “With this, we aim to make it accessible to everyone, not only to advance research but also to foster collaboration.”

Going forward, the research team plans to examine some universal associations prevalent across all tissues as we age, alongside other biomarkers that may be influencing the aging process.

“Advancements in aging pale in comparison to those in cancer,” Poganik adds. With the assistance of this atlas, scientists may finally bridge that gap.

Source: www.sciencefocus.com

Revolutionary Quantum Funds Stored on Ultra-Cold ‘Debit Card’

Quantum Debit Card Ensures Financial Security

GlobalImages101/alamy

New quantum debit cards, which can hold unforgeable quantum money, are built from ultracold atoms and particles of light.

While standard banknotes rely on being hard for counterfeiters to copy, quantum money draws on the no-cloning theorem of physics, which makes counterfeiting impossible. This principle, which states that identical copies of unknown quantum information cannot be created, led physicist Stephen Wiesner to propose a protocol, published in 1983, for generating unforgeable currency. Julien Laurat and his team at the Kastler Brossel Laboratory in France are now putting this concept into practice in experiments.

According to this protocol, banks create banknotes composed of quantum particles, possessing unique properties and existing in specific quantum states, thus ensuring protection against forgery through the no-cloning theorem. Laurat remarks that the protocol showcases an impressive feat of quantum cryptography, though it has not yet been put into practice for actual quantum fund storage.

The research team has now made such storage feasible with quantum memory devices that play the role of hard drives. In their experiments, users interact with quantum systems acting as banks by exchanging photons, and each photon can be stored much like money being loaded onto a debit card.

The memory devices used by the team consist of hundreds of millions of cesium atoms, which researchers cool down to nearly absolute zero by bombarding them with lasers. At such extreme temperatures, light can precisely manipulate the quantum state of atoms, but Laurat notes that years were spent identifying the optimal cooling needed for atomic memory to serve as a quantum debit card. Through extensive testing, he and his colleagues demonstrated that users can retrieve photons from atoms without corrupting their states, as long as the process is not tampered with.

Christophe Simon from the University of Calgary emphasizes that the new experiment marks progress toward fully realized quantum money, although the quantum memory’s current storage time remains far too short for practical use. “Another future step is to enhance portability. The long-term goal is to develop quantum memory that can be easily carried, particularly for quantum money applications. But we are not there yet,” he states.

The team is focused on extending storage durations, asserting that the protocol can be employed within quantum networks already being established in metropolitan areas across the globe. Additionally, cutting-edge quantum memory not only facilitates ultra-secure long-distance quantum communication but is also instrumental in connecting various quantum computers to more powerful systems.

Source: www.newscientist.com

Revolutionary Video Unveils Hidden Aspects of Human Fertility

For the first time, real-time footage of human embryos being implanted into an artificial uterus has been recorded.

This remarkable achievement, published in the journal Science Advances, offers an unparalleled glimpse into one of the crucial stages of human development.

Implantation failure is a leading cause of infertility, responsible for 60% of miscarriages. Researchers aim to enhance understanding of the implantation process to improve fertility results in both natural conception and in vitro fertilization (IVF).

“We can’t observe this process, because implantation happens inside the mother,” stated Dr. Samuel Ojosnegros, who leads a bioengineering group at the Institute for Bioengineering of Catalonia (IBEC) and is the lead author of the study, as reported by BBC Science Focus.

“Thus, we required a system to observe how it functions and to address the primary challenges to human fertility.”

Implantation marks the initial phase of pregnancy, where the fertilized egg (developing embryo) attaches to the uterine lining, allowing it to absorb nutrients and oxygen from the mother—vital for a successful pregnancy.

To investigate this process, the research team developed a platform that simulates the natural uterine lining, utilizing a collagen scaffold combined with proteins essential for development.

The study then examined how human and mouse embryos implant onto this platform, uncovering significant differences. Unlike mouse embryos that adhere to the uterine surface, human embryos penetrate fully into the tissue before growing from within.

https://www.youtube.com/watch?v=1p3in1fzrec

Video showing the implantation process of mouse embryos (left) and human embryos (right).

“Human embryos are highly invasive,” said Ojosnegros. “They dig a hole in the matrix, embed themselves, and then grow internally.”

The footage indicated that the embryo exerts considerable force on the uterus during this process.

“We observed that the embryo pulls, moves, and rearranges the uterine matrix,” stated Dr. Amélie Godeau, co-first author of the research. “It also responds to external force cues. We hypothesize that contractions in vivo may influence embryo transfer.”

According to Ojosnegros, the force applied during this stage could explain the pain and bleeding many women experience during implantation.

Researchers are currently focused on enhancing the realism of implantation platforms, including the integration of living cells. The goal is to establish a more authentic view of the implantation process, which could boost the likelihood of success in IVF, such as by selecting embryos with better implantation potential.

“We understand more about the development of flies and worms than our own species,” remarked Ojosnegros. “So enjoy watching the film.”

Source: www.sciencefocus.com

Revolutionary Optical Fibers Could Enhance Communication Capabilities

In a departure from conventional solid glass cores, the new optical fibers incorporate an air core encased in precisely crafted glass microstructures to guide light. Because light travels faster through air than through glass, signals move roughly 45% faster, and the lower loss allows data to be carried over longer distances before amplification is required.

Petrovich et al. report microstructured optical waveguides with unprecedented transmission bandwidth and attenuation. Image credit: Gemini AI.

Optical fibers in telecommunications have typically relied on solid silica glass constructs, and despite extensive refinements, their signal loss remains a critical challenge.

This results in about half of the light traveling through the fiber being lost after approximately 20 km, necessitating the use of optical amplifiers for extended distance communication, such as intercontinental terrestrial and undersea connections.

Minimizing signal loss can be achieved within a limited spectrum of wavelengths. This has constrained the data capacity in optical communications over recent decades.

Francesco Poletti and his team from the University of Southampton developed a new type of fiber optic featuring a hollow air core surrounded by intricately designed thin silica rings to effectively guide light.

Laboratory tests revealed that these fibers exhibit an optical loss of 0.091 decibels per kilometer at the commonly utilized optical wavelengths in communications.

Consequently, optical signals with appropriate wavelengths can travel approximately 50% farther before needing amplification.
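
A rough back-of-the-envelope check of those figures, assuming a representative loss of about 0.14 dB/km for today’s best solid-core fiber (an assumed value, not taken from the paper):

```python
import math

# Distance at which half the optical power is lost (a drop of ~3.01 dB).
def half_power_distance_km(attenuation_db_per_km: float) -> float:
    return 10 * math.log10(2) / attenuation_db_per_km

conventional = half_power_distance_km(0.14)    # assumed loss of good solid-core fiber
hollow_core = half_power_distance_km(0.091)    # loss reported for the new hollow-core fiber
print(f"Conventional fiber: {conventional:.0f} km, hollow core: {hollow_core:.0f} km")
print(f"Improvement: {hollow_core / conventional - 1:.0%}")   # roughly 50%
```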

This configuration offers a broader transmission window (the range of wavelengths where light propagates with minimal signal loss and distortion) than previous fiber optic technologies.

The team suggests that designs with larger air cores could push losses even lower, although further investigation is needed to validate these findings.

“We anticipate that advancements in manufacturing, geometric consistency, and reduced levels of absorbent gases in the core will solidify these new fibers as essential waveguiding technologies,” the researchers remarked.

“This breakthrough could pave the way for the next major advancement in data communication.”

Their study was published in the journal Nature Photonics.

____

M. Petrovich et al. Broadband optical fiber with attenuation of less than 0.1 decibels per kilometer. Nature Photonics Published online on September 1, 2025. doi:10.1038/s41566-025-01747-5

Source: www.sci.news

Revolutionary Cement Prevents Heat Build-Up in Buildings

Concrete buildings absorb heat in hot climates

Panther Media Global/Alamy

A newly developed cement can cool itself by reflecting sunlight and radiating heat from its surface, helping keep indoor spaces comfortable without reliance on air conditioning.

Traditional cement often absorbs infrared light from the sun, trapping heat and causing indoor temperatures to rise along with the surrounding air.

To tackle this challenge, Fengyin Du from Purdue University in Indiana and her team developed a unique cement that features tiny reflective mineral crystals called ettringite on its exterior.

This innovative cement releases infrared light instead of retaining it, allowing for rapid heat loss. “It acts like a mirror or radiator, reflecting sunlight and releasing heat into the atmosphere, enabling the building to remain cool without needing air conditioning or power,” Du explains.

Initially, the researchers create small pellets from commonly found minerals like limestone and gypsum. These are ground into a fine powder, mixed with water, and poured into silicone molds that contain small perforations. Air bubbles moving through these holes form slight indentations on the surface, where the reflective ettringite crystals can develop. The aluminum-rich gels in the set cement permit infrared rays to traverse the material.

Du notes the process is easily scalable and enables cement production at lower temperatures, making it $5 less expensive per tonne than conventional Portland cement.

Du and her team tested how well their cement regulates temperature on a hot rooftop at Purdue University and found that its surface was 5.4°C (9.7°F) cooler than the surrounding air and 26°C (47°F) cooler than conventional Portland cement.

Surface dimples of cement viewed under an electron microscope

Guo Lu/Southeast University

“It’s a valuable material,” states Oscar Brousse from University College London. “You enhance the material’s ability to reflect and emit energy, thus efficiently releasing energy that the material has absorbed.”

However, gauging just the surface temperature of a material does not convey its real-world performance: a 5°C reduction in surface temperature does not automatically translate into a 5°C decrease in air temperature, which is what would really matter for local conditions.

Source: www.newscientist.com