Chia, marketed as an eco-friendly alternative to Bitcoin, is facing scrutiny for its energy consumption, which is reported to be 18 times higher than what its developers initially claimed. Chia Network has admitted that this figure is “not far off.”
Unlike Bitcoin, which depends on energy-intensive mining through a proof-of-work system, Chia employs a proof-of-space-and-time mechanism that doesn’t require massive computations but instead relies on unused hard disk space. Bitcoin currently consumes around 157 terawatt-hours annually, roughly equivalent to the energy usage of all of Poland. Chia farmers, by contrast, earn new coins based on the amount of space they allocate for data and how long they maintain that allocation.
Getting started with Chia involves two main processes: plotting and farming. Plotting is a CPU and memory-intensive task that generates data for storage, while farming involves simply storing data and confirming its existence to the network. High-speed solid-state drives (SSDs) are typically used for plotting, whereas economical, slower hard drives are used for farming.
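The division of labour between the two steps can be made concrete with a toy proof-of-space sketch. This is a drastic simplification for illustration only: Chia’s real protocol uses structured plot files and verifiable challenge proofs, and the function names and XOR-distance lookup here are invented for the example.

```python
import hashlib

def plot(num_entries: int, seed: bytes) -> list[bytes]:
    """Plotting: the one-off, compute-heavy step that fills disk
    space with hashes derived from a farmer-specific seed."""
    return [hashlib.sha256(seed + i.to_bytes(8, "big")).digest()
            for i in range(num_entries)]

def farm(plot_data: list[bytes], challenge: bytes) -> tuple[int, bytes]:
    """Farming: the cheap, ongoing step. On each network challenge,
    return the stored hash closest to the challenge (smallest XOR
    distance); larger plots win more often, so reward scales with space."""
    target = int.from_bytes(challenge, "big")
    best = min(range(len(plot_data)),
               key=lambda i: int.from_bytes(plot_data[i], "big") ^ target)
    return best, plot_data[best]

seed = b"farmer-1"
table = plot(1024, seed)                          # expensive, done once
challenge = hashlib.sha256(b"block-42").digest()  # broadcast by the network
idx, proof = farm(table, challenge)               # cheap lookup per challenge
```

The asymmetry is the whole point: `plot` hammers the CPU and wears the drive once, while `farm` is a near-free lookup, which is why fast SSDs plot and cheap hard drives farm.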
A study by Soraya Jerab and her team at Algeria’s Higher School of Computer Science and Digital Technology revealed that the plotting phase burns through a significant number of SSDs, contributing to Chia’s carbon footprint. Their research involved precise power measurements of devices performing various Chia tasks, and it indicates that Chia’s annual CO2 emissions are likely between 0.584 million and 1.402 million tonnes. This vastly exceeds the claimed 50,000 tonnes and implies emissions two orders of magnitude larger than those of other mainstream blockchains such as Ethereum.
“The primary emissions are from hardware production,” notes Jerab. “To use Chia, new hardware must be purchased, and this production process consumes energy.”
The researchers found that running the plotting process just 160 times could lead to the rapid degradation of a new SSD, and Chia’s alternative plotting methods might further increase emissions by requiring more RAM and GPUs, which are also energy-intensive.
As co-author Clementine Gritty stated, “While it’s still better than Bitcoin, it won’t save the planet.”
Gene Hoffman, CEO of Chia Network, argues that the study’s findings overstate the network’s environmental impact. “We are being charged for the carbon footprint of building drives that would’ve gone to landfill after four years of use. Chia farming has created a market for otherwise discarded hardware,” he said.
In any case, Hoffman is optimistic about upcoming changes to the Chia network. The anticipated Proof of Space 2.0 is expected to significantly reduce emissions and improve energy efficiency. “We believe we’re doing a good job, and Proof of Space 2.0 will be even more effective,” he asserted.
Over the past century, the cranial structure of Japanese individuals has evolved significantly, resulting in rounder heads, narrower cheekbones, broader upper jaws, and thinner, more prominent noses.
Although variations outside Japan may exist, global trends suggest similar morphological changes are likely happening worldwide. Shiori Usui from the Chiba Prefectural Science Police Research Institute emphasizes that this is a natural progression as lifestyles modernize globally.
Scientists traditionally use measurements from human remains from the 19th and early 20th centuries to establish baseline comparisons for “modern” humans. Usui explains that today’s populations are generally taller and larger due to advancements in health, diet, and environmental factors, which may also provide insights into head shape changes.
To investigate these changes, researchers utilized CT scans to analyze skulls from 34 men and 22 women who died of natural causes between 1900 and 1920, with their remains donated to Kyoto University School of Medicine for research purposes. They also examined 29 men and 27 women who passed away between 2022 and 2024. Autopsy imaging is increasingly common in Japan, contributing to extensive “virtual skeleton collections,” notes Usui.
Using 3D skull images, researchers identified subtle yet consistent shifts over time. Notably, contemporary individuals are becoming more brachycephalic, as the oval-shaped skulls of the early 20th century are being replaced by rounder forms. While earlier hypotheses suggested these trends, CT scans revealed unexpected differences, including changes in cheekbone structure, nose shape, and forehead contour, which has become shorter and slightly concave over time.
Additionally, the mastoid process, located behind the ear, has increased in size and prominence. Usui emphasizes that these changes are too recent to be attributed to genetic evolution; rather, they likely result from lifestyle factors, such as improved childhood nutrition and the consumption of softer foods requiring less chewing.
Interestingly, the disparities between male and female skulls have intensified compared to a century ago, with male skulls exhibiting stronger brow ridges, larger mastoid areas, and more pronounced facial features than female skulls. “This finding was unexpected,” Usui admits. The team assumed that similar lifestyles between genders would diminish physical differences, leading them to anticipate more “androgynous” facial features, yet their analysis revealed increasing sexual dimorphism.
A recent 2024 US survey indicates that both men’s and women’s facial structures evolve similarly over time. However, a 2000 US study noted a contrasting trend, with head shapes becoming more oval than round, potentially due to earlier studies’ technical limitations, as well as significant changes in the ethnic makeup of the U.S. population due to immigration.
“We aspire to conduct more global research to comprehend how different populations uniquely adapt to rapid environmental modernization,” Usui adds.
Francesco Capello from the University of Palermo highlights that even relatively recent human populations are not fixed; they continue to evolve. “This invites crucial questions regarding the interplay between genetics and environment, especially for traits like bone structure that were once considered stable,” he notes.
The findings underscore the need for scientists to reassess the criteria used for identifying human remains, says Kimberly Plomp from the University of the Philippines Diliman. “The significant changes in modern human skull morphology suggest that existing identification methods may no longer be as reliable as previously thought,” she warns. “This has vital implications for biology and forensic anthropology.”
The years 2023 and 2024 are on track to be the warmest on record, coinciding with a significant Pacific climate event known as El Niño. This phenomenon raises surface temperatures in the eastern Pacific Ocean, driving severe heatwaves in the Amazon and heavy rainfall across the southern United States. Its counterpart, La Niña, cools the eastern Pacific and brings wetter conditions to the northern United States.
Typically, during an El Niño, the warm water in the eastern Pacific weakens the trade winds, creating a self-reinforcing cycle that amplifies the warming. However, the El Niño of 2023 is distinct; despite rapid ocean warming, the trade winds have remained strong. Researchers from the Scripps Institution of Oceanography, led by Qihua Peng and Shang-Ping Xie, have explored this unique occurrence.
To understand the changes, the team monitored pressure variations across the Pacific using the Southern Oscillation Index (SOI) established by NOAA. Typically, as the eastern Pacific warms during an El Niño, the pressure difference across the Pacific decreases. In 2023, however, while eastern Pacific temperatures soared more than 3.6°F (2°C) above average, the pressure drop was only about 31% as strong as anticipated. Likewise, changes in wind speed and direction accounted for only about 30% of the warming. What, then, explains the robust El Niño of 2023?
To answer this question, the researchers expanded their analysis beyond the Pacific, examining NOAA satellite data for sea surface temperatures. They discovered that the North Atlantic and Indian Oceans also recorded unprecedented heat in 2023, with North Atlantic temperatures exceeding 1.8°F (1°C) above normal, a highly unusual occurrence. This indicates that El Niño events can be influenced by oceanic conditions globally, not simply confined to the Pacific Ocean.
The team employed a community atmosphere model, a computer program that simulates how the atmosphere responds to ocean temperatures, to assess how heat from the North Atlantic and Indian Oceans affects the Pacific. The results indicated that the heat generated large columns of rising hot air over these regions, which cooled at high altitude before descending over the central Pacific. This enhanced loop of updrafts and downdrafts strengthened the easterly trade winds by about 30% beyond what Pacific warming alone would produce. If the trade winds remained strong, why was the eastern Pacific so warm in 2023?
To uncover this, researchers scrutinized NOAA’s ocean temperature and sea level data over three consecutive years of La Niña from 2020 to 2023 using the Global Ocean Data System. During this period, the strengthening trade winds transported heat into the western Pacific, leading to thermal expansion of the warming waters, creating a “mountain” of warm water in the western Pacific — the highest level of heat storage recorded since 1982. When the weakening La Niña diminished the trade winds, this accumulated warm water surged eastward, paving the way for the El Niño event.
To ascertain whether the stored heat alone could trigger El Niño, the researchers used a coupled general circulation model, a computer simulation of ocean-atmosphere interactions. They input observed sea temperatures from April 1, 2023, when La Niña ended, omitting all subsequent wind alterations. The model replicated 87% of the observed warming from June to December 2023, indicating that only 13% of the warming resulted from trade wind influences. The stored heat migrated eastward via massive underwater waves along the equator, suppressing the upwelling of deeper cold water and thereby warming the surface layers. These ocean dynamics enabled the 2023 El Niño to emerge without the typical wind feedback.
The research team posits that in an increasingly warmer world, substantial heat reservoirs in the western Pacific may become more prevalent, potentially leading to a rise in the frequency of strong El Niño events. However, since their analysis focused on this singular phenomenon, the frequency of El Niño occurrences driven purely by oceanic processes remains uncertain. Ultimately, their findings reveal that the ocean is not merely a passive player in El Niño events but can actively influence their development.
J. Craig Venter, a groundbreaking scientist renowned for his pivotal role in decoding the human genome and a trailblazer in modern genomics, passed away on Wednesday, as announced by his institute.
He was 79 years old.
The J. Craig Venter Institute shared a statement on Wednesday confirming that he died in San Diego after being hospitalized due to cancer complications.
Venter was a pioneering scientist who significantly influenced the field of genomics. His institute asserted that he championed the idea that scientific advancements should provide “real-world impact.” He also played a crucial role in establishing synthetic biology.
Venter served as a naval officer in Vietnam from 1967 to 1968. He later earned his BS in biochemistry and PhD in physiology and pharmacology from the University of California, San Diego.
His influential research primarily focused on genomics. Venter stated that his institute “moved genome science from a slow, gene-by-gene discovery process to scalable, data-driven science, paving the way for demonstrated genome design and construction.”
Venter led efforts to create one of the first draft sequences of the human genome. His team published the first “high-quality” diploid human genome, emphasizing the importance of measuring genetic variations inherited from parents.
The human genome is a comprehensive set of genetic information, stored as DNA within nearly every cell nucleus in the body, as described by the J. Craig Venter Institute.
In the 1990s, Venter and his team at the National Institutes of Health developed expressed sequence tags (ESTs), which facilitated the rapid discovery of new genes.
In 1995, Venter and colleagues utilized “whole-genome shotgun sequencing” to sequence the DNA of Haemophilus influenzae, the first free-living organism to have its entire genome sequenced.
In 1998, Venter co-founded Celera Genomics.
His team at Celera competed against the National Institutes of Health-supported Human Genome Project, which received funding from the US government and UK research partners.
In 2000, Venter, then president of Celera, and the public consortium jointly announced they had compiled the first draft of the human genome, a landmark achievement in biological science.
Beyond genomics, Venter directed the Global Ocean Sampling Expedition, a metagenomics survey that revealed remarkable microbial diversity.
Scientists globally paid tribute to Venter, acknowledging his remarkable contributions to the field.
“Craig Venter was a force of nature, a controversial yet profoundly impactful figure,” said Sir John Hardy, a neuroscience professor at University College London, in a press release. “The race to finish sequencing the human genome was marked by competition between American and British consortia. This rivalry undeniably accelerated the process, but ultimately, both teams published their findings simultaneously in Science and Nature.”
Dr. Roger Highfield, scientific director at the Science Museum Group, commented that Venter was an “adventurous and tireless pioneer” in genome sequencing and synthetic biology.
“I was in correspondence with him only weeks ago about a new writing project,” Highfield shared. “He mentioned a recent diagnosis, but the news still came as a shock. While Craig was a polarizing figure, he was undeniably passionate and driven by science.”
Throughout his illustrious career, Venter received numerous accolades, including the 2008 National Medal of Science, the 2002 Gairdner Foundation International Award, the 2001 Paul Ehrlich and Ludwig Darmstaedter Prize, and the King Faisal International Prize for Science.
Feedback is New Scientist’s sideways look at the latest science and technology news. Share your insights with us at feedback@newscientist.com.
ZuckGPT: The Future of Feedback
Feedback has seen its fair share of bosses—some commendable, others not. It’s a diverse history, yet none were artificial intelligence until now.
Meta, the parent company of Facebook, Instagram, and WhatsApp, is reportedly crafting an AI version of CEO Mark Zuckerberg. This AI, known as ZuckGPT, aims to interact with employees by emulating Zuckerberg’s public persona, policies, and even his mannerisms. This initiative is spearheaded by Meta’s Superintelligence Labs, dedicated to developing lifelike AI interactions.
The intent is to foster a deeper connection between employees and Zuckerberg. However, opinions may vary on the desirability of such a digital interaction.
From Feedback’s perspective, this concept raises concerns. Historically, my bosses had a tendency to be elusive, attending meetings or being out of the office. Their absence allowed for a degree of flexibility—sometimes extending beyond mere work.
Conversely, the digital ZuckGPT has the potential to be omnipresent, maintaining an online presence at all hours. Yet, this approach could lead to a lack of genuine human connection.
Interestingly, there’s a chance this AI initiative may not succeed. Long-time readers might remember Meta’s ambitious yet ultimately disappointing foray into the Metaverse, where they struggled with basic avatar interactions.
Should the venture into an AI Zuckerberg fall short, it may be due to technical challenges, such as animating facial movements correctly.
The Mystery of Theoretical Chocolate
Another thought-provoking topic has emerged under the banner of “Unexpected Questions”. Unlike lighter inquiries into trivia, this one could potentially transform the chocolate industry.
Reader Toby Pereira, inspired by Tom Gauld’s comic on confectionery structures, asked whether a fourth type of chocolate could exist, one devoid of cocoa powder and milk. Such a chocolate would consist only of cocoa butter and sugar, which could prove problematic for digestion. Whether this hypothetical concoction exists remains a mystery.
The quest for this elusive type of chocolate continues, and we invite readers to share their insights!
Seeking Words for Unseen Concepts
This section invites reader contributions to a linguistic conundrum: finding words for phenomena that seem to lack proper nomenclature. One such request rounds off the column. Consider the Ship of Theseus paradox: if every component of an object is replaced over time, is it still the same object? Interestingly, the Wikipedia page discussing this paradox has itself undergone numerous substantial edits, prompting a fascinating question: is it still the Ship of Theseus page?
In essence, the Wikipedia page illustrates the paradox itself. If you have a suitable term to encapsulate this phenomenon, we urge you to share!
Have a story for Feedback?
You can send stories to Feedback at feedback@newscientist.com. Don’t forget to include your home address. Explore this week’s highlights and past Feedback columns on our website.
Renowned biologist Craig Venter, instrumental in decoding the human genome and advancing synthetic biology, has passed away.
According to the J. Craig Venter Institute, Venter died “after a brief hospitalization due to unexpected side effects from treatment for a recently diagnosed cancer.” He was 79 years old.
Venter’s legacy is vast and impactful, marked by significant advancements in genomics and biodiversity. His career also highlighted the commercialization of biological research and the competitive nature of modern science.
Venter’s journey into research was unconventional; after high school, he was an uninterested student drawn to sailing and surfing. His experience in the US Navy during the Vietnam War inspired him to turn his life around. Upon returning home, he pursued higher education, eventually becoming a biomedical researcher at the National Institutes of Health (NIH) in the 1980s.
Venter’s fascination with the human genome led him to utilize automated sequencing machines, significantly accelerating research. He began with sequencing short DNA fragments called expressed sequence tags, igniting controversy when he claimed NIH would patent these sequences, leading to heated debates within the scientific community.
The publicly funded Human Genome Project (HGP) launched in 1990, but Venter deemed its methods too slow. In 1998, he founded the for-profit company Celera Genomics to expedite the sequencing, setting up a race against the HGP.
Both efforts relied on Sanger sequencing, but the HGP followed a methodical clone-by-clone approach, mapping the genome before piecing it together. Venter instead championed whole-genome shotgun sequencing: breaking the genome into random fragments, sequencing them, and reassembling the results by computer. He had used this method in 1995 to sequence a whole bacterial genome, laying the groundwork for targeting the far more complex human genome.
The race culminated in a draw, with both teams publishing draft sequences in 2000, followed by their finalized results the next year. The HGP released all of its data publicly, while Venter’s Celera initially withheld some for commercial benefit.
Despite backlash from the genetic community, Venter moved forward with his innovative research. From 2004 to 2006, he sailed aboard his yacht, the Sorcerer II, collecting seawater samples and sequencing vast amounts of DNA, resulting in the identification of over 1000 new protein families.
Venter’s ambition extended to creating synthetic life forms, asserting that manipulating organisms could yield significant advantages in fields ranging from medicine to agriculture. In 2010, his team synthesized a novel cell.
Starting with the bacterium Mycoplasma mycoides, they assembled an artificial genome from lab-synthesized DNA strands and used it to replace a cell’s original genome; rather than dying, the cell thrived and multiplied.
Venter was clear that he had not created life from scratch; rather, he had generated a new form of life whose genome was entirely computer-designed, with no biological ancestor. His team playfully encoded their names into the genome as watermarks, proof that the synthetic genetic data had been transferred successfully.
Venter faced skepticism from fellow synthetic biologists who questioned the purpose of his flashy experiments, suggesting that alternative approaches may yield more practical outcomes. However, he persisted in refining his work, stripping away non-essential genes to develop organisms with “minimal genomes,” revealing many unknown essential gene functions and underscoring the complexity of life.
It will take extensive analysis for historians to evaluate Venter’s full impact on science. Nevertheless, his contributions are undeniably profound and transformative.
Researchers in South Korea have developed a magnetically controlled switch designed to activate genes within cells, an approach that, if it works as described, would be a monumental advance in genetic science and could pave the way for new medical therapies. However, skepticism remains about the validity of the findings, particularly because of inconsistencies such as apparently altered images in the published research.
The pivotal concern is whether these results can be replicated by independent groups. Critic Andrew York, a physicist, stresses the necessity for verification prior to publication. “This assertion is so unprecedented that it warrants external validation to confirm reproducibility,” York noted, emphasizing a proactive approach to scientific scrutiny. He highlighted that the study has been in review for three years, ample time to confirm its findings with external research institutes.
The lead researcher, Kim Jong Pil, a professor at Dongguk University in Seoul, has indicated ongoing collaborations with various biotech firms and research organizations, with a commitment to releasing joint datasets in upcoming publications.
Existing methods, such as optogenetics, utilize light to influence biological processes by engineering cells to produce light-responsive proteins. This technology facilitates the firing of nerve cells when exposed to specific light frequencies, proving especially useful in research targeting certain visual impairments.
However, a critical limitation of optogenetics is its inability to penetrate deeply into body tissues. To address this, global researchers are exploring magnetic fields as alternative control signals for biological processes, with significant implications for both scientific research and clinical practices. Magnetic-based gene control may enable the manipulation of cellular functions to produce therapeutic proteins in a targeted manner.
In a study published in the esteemed journal Cell, Kim’s team asserts they have achieved “magnetogenetics,” utilizing a switch that activates genes in genetically modified cells, responsive to specific magnetic signals that can permeate the human body. Kim further clarified that the switch’s effect is negligible on non-modified genes, which supports its potential safety for medical applications.
Kim’s team employed a 4-kilohertz electromagnetic square wave with an intensity of 2 millitesla, pulsed 60 times per second. Their findings indicate that this signal, through interactions with cytochrome B5, prompted calcium ions inside cells to oscillate with a period of roughly 50 seconds.
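To picture the stimulus, the following sketch generates one pulse window of the reported waveform from the parameters quoted above. The gating scheme, a 4 kHz carrier switched on 60 times per second with a 50% duty cycle, is our interpretation for illustration; the study’s actual coil drive electronics are not described here.

```python
SAMPLE_RATE = 1_000_000  # 1 MHz sampling resolves the 4 kHz carrier
CARRIER_HZ = 4_000       # square-wave frequency reported in the study
PULSE_HZ = 60            # pulses per second reported in the study
AMPLITUDE_MT = 2.0       # field strength in millitesla
DUTY = 0.5               # assumed: carrier on for half of each pulse window

def field_at(t: float) -> float:
    """Magnetic field (in mT) at time t (in s): a 4 kHz square wave,
    gated so it is only on for the first half of each 1/60 s window."""
    if (t * PULSE_HZ) % 1.0 >= DUTY:   # gate is off
        return 0.0
    # Square-wave carrier: +amplitude for the first half of each
    # carrier period, -amplitude for the second half.
    return AMPLITUDE_MT if (t * CARRIER_HZ) % 1.0 < 0.5 else -AMPLITUDE_MT

# Sample one full 60 Hz pulse window (1/60 of a second).
samples = [field_at(i / SAMPLE_RATE) for i in range(SAMPLE_RATE // PULSE_HZ)]
```

Note the separation of timescales the researchers describe: the field itself flips thousands of times per second, while the calcium response cycles only once per minute or so.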
The exact mechanism by which these electromagnetic signals influence cytochrome B5 and induce this oscillation is still under investigation. “The biophysical processes involved are actively being researched,” Kim stated.
This biochemical oscillation purportedly activates the gene’s promoter sequence, LGR4, initiating the expression of subsequent genes. This sequence effectively creates a magnetically-controlled gene activation mechanism. The paper details the switch’s functionality across different cell types in both murine and human models.
York emphasizes the transformative potential of the findings, contingent upon validation. “It could fundamentally reshape our understanding of mammalian responses to electromagnetic fields.” However, he expressed concern over the seeming incongruity of a 60 Hz signal triggering such prolonged biological oscillations. “The physiological implications are extraordinary,” York concluded.
In response, Kim asserts that the oscillation duration is dictated by internal cellular signaling mechanisms, rather than external frequency inputs. “The subsequent oscillatory behaviors arise from intrinsic cellular processes,” he explained.
York also highlighted the substantial amplitude of these calcium oscillations, comparing their significance to a notable temperature fluctuation. “Such a profound physiological response should influence numerous cellular activities,” he remarked, even as the paper claims its effects are isolated to a single gene.
Kim rejects this assertion, maintaining that the magnitude of the signal remains within a reasonable physiological range. In one notable experiment, the researchers linked the electromagnetic switch to a gene encoding a luminescent protein. Harvard University investigator Adam Cohen pointed out discrepancies in the data, suggesting the engineered cells began luminescing before the switch was activated. Kim attributes this to “computational artifacts resulting from curve smoothing.”
On the PubPeer platform, commenter Yong‐Chang Zhou noted an image duplication concern in Figure S5P, where one image appeared to be a mirrored version of another. “Such mirroring is atypical when capturing images of the same sample,” he remarked. Similar concerns were raised by Elisabeth Bik, a specialist in scientific integrity.
Kim acknowledged a clerical error that led to duplicated control images. “We’re correcting this oversight and will provide the accurate raw data to Cell.”
Cell’s publisher did not respond to inquiries about these discrepancies.
This self-portrait represents Daniel Regan’s exploration of his ADHD experience through art.
Daniel Regan
These surreal images provide a captivating insight into the journey of an individual living with Attention Deficit Hyperactivity Disorder (ADHD).
Just before his 40th birthday, visual artist Daniel Regan was diagnosed with ADHD. After starting treatment with the medication lisdexamfetamine, he noted significant changes in his perception, leading to a reduction in distractions. Regan describes his former mental state as “watching five movies in my head, each with its own soundtrack and subtitles.”
“Taking medication is like turning down the volume; it enables me to focus on one or two movies instead,” he continues. “This has helped me feel calmer and more present in the moment.”
During this transformative period, Regan photographed himself and his surroundings while hiking in Australia. By submerging these Polaroids in various ratios of ADHD medication and water for up to three months, he crafted distorted interpretations of his original images, saying, “It felt instinctive to process my diagnosis and medication artistically.”
In the main self-portrait, his form appears enveloped in a delicate silk shroud, a fragile beauty that runs through all of these artistic interpretations.
Regan’s innovative process converts Polaroid images of the Australian bush.
Daniel Regan
Another piece reveals the vibrant chaos of Australian bushland, embodied in a bubble-like aesthetic. Regan states, “This image encapsulates the heightened sensory experience often associated with ADHD symptoms.”
Originally a self-portrait, this image transformed after immersion in medication.
Daniel Regan
The stunning blue image above began as a simple self-portrait but evolved dramatically through the infusion process. Regan explains, “This process reflects how medication influences neurotransmitters within the brain, as lisdexamfetamine elevates dopamine levels.”
This artwork still retains elements of nature despite Regan’s modifications.
Daniel Regan
Highlighted elements of leaves and branches exhibit vivid yellows and greens. The penultimate image reminds Regan of his late mother, reflecting on how she would perceive his recent diagnosis and its implications for his past challenges.
The vivid greens become even more pronounced following Regan’s artistic process.
Daniel Regan
These striking visuals form a collection titled C15H25N3O, after the molecular formula of Regan’s medication. His work, showcased in the Belonging exhibition, arises from a deepening understanding of ADHD, a condition characterized by persistent symptoms such as impulsivity, forgetfulness, and difficulty managing time, which often manifests in childhood, as described by the National Institute of Mental Health.
“Describing the internal experience can be challenging; my images aim to convey some of that inner turmoil and complexity,” Regan shares.
What if you could communicate with the past? Surprisingly, the laws of physics don’t rule this out. In some cases, sending messages backward in time could be more feasible than we imagine.
The potential for sending messages into the past stems from specific solutions to the equations of general relativity. This foundational theory of physics explains how the fabric of spacetime operates, showing that all objects traverse paths through time and space. Among these paths is the concept of a closed time-like curve (CTC), where an object journeys into the future before looping back to the past.
However, creating a CTC on a cosmic scale poses a significant challenge, as it necessitates bending spacetime—a feat that demands an enormous amount of energy. This complication seems to render backward messaging impossible, but quantum entanglement might offer a potential workaround.
In quantum mechanics, entangled particles exhibit a unique property: the state of one particle directly influences the state of another, regardless of distance. Some physicists theorize that this connection implies one particle could send messages back in time to inform its counterpart of future events.
While this theory is debated, in 2010, Seth Lloyd and colleagues at the Massachusetts Institute of Technology demonstrated this concept using entangled photons to simulate a quantum CTC. “It’s akin to sending a photon back in time for mere nanoseconds with the intention of eliminating your previous self,” Lloyd explained.
Lloyd’s team then envisioned a scenario where the CTC experienced interference, similar to a faulty phone line. Analyzing the communication capacity of these “noisy” channels is a common challenge in information theory. To their surprise, they discovered that communicating with the past could actually outperform traditional communication methods, even with noise present.
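For context on what the “communication capacity” of a noisy channel means, here is the textbook baseline, a standard information-theory example rather than the closed time-like curve channel from Lloyd’s paper: a binary symmetric channel that flips each bit with probability p carries at most 1 − H(p) bits per use.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a biased coin flip, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity in bits per channel use of a binary symmetric
    channel that flips each transmitted bit with probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries one full bit per use; a channel that
# flips bits half the time carries no information at all.
print(bsc_capacity(0.0))             # 1.0
print(bsc_capacity(0.5))             # 0.0
print(round(bsc_capacity(0.1), 3))   # 0.531
```

Lloyd’s surprising result is that a backwards-in-time channel can beat limits like this one, because the sender effectively has feedback about how the message will be received.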
Team member Kaiyuan Ji noted that their inspiration came from the film Interstellar, in which Matthew McConaughey’s character sends a message back in time to his daughter by manipulating a clock through a CTC. Treating this as a noisy quantum channel, the team found that messages sent backwards in time could be understood better because the sender can draw on memories of the past, improving how messages are decoded. “Fathers recall how their daughters interpret future messages, allowing them to optimize their message encoding,” Ji added.
While practical time travel remains hypothetical, improved communication strategies for noisy devices are valuable, according to Lloyd. “Creating a physical closed time-like curve poses immense challenges. However, all channels have noise,” he remarked. The findings from Lloyd’s research could be adapted to similar experiments using photons, potentially uncovering new avenues for effective communication, even in conventional contexts.
Andreas Winter from the University of Cologne highlighted that the study illustrates how feedback types can enhance communication protocols, allowing future senders to draw on their memories. However, he notes that the chances for practical applications are minimal. “As far as we know, time travel or sending signals back in time isn’t feasible in our universe. We are unaware of any mechanisms that could facilitate such phenomena,” he concluded.
A major ocean current system, crucial for regulating the climate across the Northern Hemisphere, is expected to weaken far more severely by the end of this century than previously estimated, according to a new study published in Scientific Progress.
The Atlantic Meridional Overturning Circulation (AMOC) is a vast ocean current system that transports warm water north from the tropics, releasing heat into the atmosphere, then sinking and returning south.
Dr. Valentin Portman, lead author from Bordeaux Southwest Research Center in France, explains, “This loop transports heat from the equator to the North Atlantic Ocean,” as reported by BBC Science Focus.
The warm, salty water moves north, releases heat, thickens, sinks, and subsequently flows south through deep ocean currents.
Research predicts a 51% slowdown of AMOC by 2100, approximately 60% higher than average projections from standard climate models and with considerably lower uncertainty.
The implications of a weakened AMOC could be dire. Sea levels along the Northeast Coast of the United States are already rising faster than the global average, partly due to AMOC’s decline.
Globally, the tropical rain belt is anticipated to weaken and shift south, endangering the monsoon systems vital for agriculture in West Africa and South Asia.
In Europe, these changes could result in colder, harsher winters as the warm water conveyor belt slows down.
Every further weakening brings the AMOC closer to a tipping point, increasing the chances of complete collapse with potentially catastrophic outcomes.
The AMOC stretches the length of the Atlantic Ocean, forming part of a vast network of ocean currents – Photo credit: Getty
The Importance of AMOC
Predicting the future of AMOC as global temperatures rise is notoriously challenging. Its vast, complex nature is influenced by both local and global factors.
Previous assessments of AMOC’s future varied widely between climate models. While most agree on its weakening, estimates of its collapse range from minimal to catastrophic.
The latest study identified systematic errors in some of the best existing models: underestimating salinity in the South Atlantic and overestimating temperature in the North Atlantic.
These biases lead to an underestimation of the critical process that allows dense, saline water to sink, maintaining current flow within the system.
After correcting these discrepancies using ridge regression, a regularized form of linear regression rarely applied in climate science, the researchers found that the expected weakening of the AMOC increased to 51%, while the uncertainty of the result fell considerably.
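The study's actual data and code are not given here, but the core idea of ridge regression can be sketched in a few lines: penalizing large weights stabilizes a fit across several correlated, noisy predictors. Everything below is illustrative; the predictor choices and all numbers are hypothetical, not the study's.

```python
# Illustrative sketch (not the study's code): ridge regression finds weights w
# minimizing ||Xw - y||^2 + alpha*||w||^2. The penalty term alpha*||w||^2
# shrinks the weights, damping the influence of noisy predictors.
# Closed form: w = (X^T X + alpha*I)^(-1) X^T y, solved by hand for 2 features.

def ridge_2feature(X, y, alpha):
    # Normal matrix A = X^T X + alpha*I (2x2) and right-hand side b = X^T y.
    a11 = sum(x[0] * x[0] for x in X) + alpha
    a12 = sum(x[0] * x[1] for x in X)
    a22 = sum(x[1] * x[1] for x in X) + alpha
    b1 = sum(x[0] * yi for x, yi in zip(X, y))
    b2 = sum(x[1] * yi for x, yi in zip(X, y))
    # Solve the 2x2 system A w = b directly (det > 0 whenever alpha > 0).
    det = a11 * a22 - a12 * a12
    w1 = (a22 * b1 - a12 * b2) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2

# Hypothetical inputs: per-model South Atlantic salinity bias and North
# Atlantic temperature bias (X) against that model's projected AMOC decline (y).
X = [(0.1, 0.5), (0.3, 0.2), (0.2, 0.4), (0.4, 0.1)]
y = [35.0, 45.0, 40.0, 50.0]

w1, w2 = ridge_2feature(X, y, alpha=0.1)
corrected = w1 * 0.25 + w2 * 0.3  # evaluate the fit at (made-up) observed biases
```

The appeal of the method for a case like this is that several biased model variables can be used at once, with the regularization keeping the multi-variable fit from chasing noise.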
“Typically, models use one variable as input, like past AMOC strength,” Portman noted.
“Our goal was to utilize more comprehensive data by analyzing multiple variables concurrently, considering the complexity of AMOC.”
The current AMOC is already showing signs of weakness, as evidenced by observational data revealing a 10% to 20% intensity decline since the mid-2000s — equivalent to significant volumes of water no longer flowing north each second.
According to a 2025 study, recent AMOC weakening has contributed up to 50% of flooding along the U.S. Northeast coast since 2005.
However, researchers caution that linking this decline directly to anthropogenic climate change, rather than natural fluctuations, remains uncertain until at least 2033, when sufficient data will be available.
Understanding the Risks
While the findings of this study are concerning, researchers clarify what they do and don’t imply.
The Sixth Assessment Report from the Intergovernmental Panel on Climate Change (IPCC) expressed confidence that the AMOC will weaken throughout this century, but only “medium confidence” that it would avoid total collapse by the year 2100.
However, these reassurances may offer little comfort given the impact of such a collapse, whether it occurs before or after 2100.
Moreover, a 2025 study published in Geophysical Research Letters indicated that under serious collapse scenarios, severe cold temperatures could drop to -20°C (-4°F) in London and -48°C (-54°F) in Oslo, despite global warming trends.
As human-induced climate change melts polar ice, ocean salinity decreases, hindering processes driving the AMOC.
A weakening AMOC also raises the risk of breaching an unknown tipping point. According to a study, AMOC may exist in two stable states, and once reversed, it could take thousands of years to revert.
The exact location of this threshold is uncertain. A 2025 study in Environmental Research Letters found that AMOC shutdown occurred in 67% of model simulations under high emissions, and in 30% under moderate emissions.
“The threshold remains elusive,” Portman stated, “but this accelerated decline we observe may be approaching a tipping point.”
Future Projections
Portman’s team assessed four different emissions scenarios, three of which (from moderate to very high) indicate similar 50% weakening results, suggesting that beyond a certain emissions level, many consequences of climate change become inevitable.
“We’ve introduced significant heat into the ocean, and its effects will last for centuries,” Portman warned.
The most optimistic scenario, emphasizing strong and sustained emissions reductions, resulted in only a 20% weakening of AMOC.
“We can frame it two ways: it’s late, given our high CO2 emissions and their long-term impacts,” Portman said, “but we can also assert that significant reductions before reaching a tipping point can avert a serious decline.”
Currently, Portman believes his research offers a clearer view of the AMOC’s future, though he acknowledges ongoing uncertainties and the potential for additional undiscovered processes.
“That’s why it’s critical to approach these findings cautiously,” he emphasized. “Addressing uncertainty in climate models is essential for understanding AMOC’s fate.”
You may have encountered the idea that the increase in the number of boys born after wars is a kind of divine intervention or karmic compensation for those who lost their lives in battle.
However, this phenomenon isn’t restricted to wartime. Significant stressors in a nation’s history, such as natural disasters, famine, or collective mourning periods, can also impact male birth rates.
For instance, a study led by Maltese pediatrician Professor Victor Grech in 2015 revealed that the birth rate of boys in the UK temporarily dipped following the death of Princess Diana.
These fluctuations might be connected to the established link between stress and miscarriage rates. Recent research indicates that miscarriages affect female fetuses slightly more than male ones.
But why exactly is this the case? It remains unclear.
Yet, female embryos appear to be particularly vulnerable during the first trimester, leading to an increased risk of repeated miscarriages.
Therefore, during times of heightened stress—like wartime—the increased frequency of miscarriages might contribute to a skewed sex ratio favoring boys.
Additionally, another factor influencing the rise in male births post-war is that overall birth rates tend to surge when soldiers return home. This is often attributed to increased intimate activity among couples.
But why does this result in more boys? The theory suggests that male births occur slightly more often when conception happens at the onset or end of the menstrual cycle, while female births are more likely to occur when conception happens mid-cycle.
As couples engage in sexual activity more frequently, they may conceive during the “male” days of the cycle. This leads to a slight but noticeable increase in male births when many couples are intimate.
While this difference isn’t significant enough for those trying to conceive a specific sex, in the context of hundreds of thousands of births, it could help adjust the overall sex ratio.
This article answers the question posed by Nicole Porter via email: “What is the veteran effect? Is it true?”
NASA’s Chandra X-ray Observatory has uncovered a remarkable object that could be the crucial link between a concealed “black hole star” and a fully visible supermassive black hole, shedding light on the growth of the universe’s earliest giants.
Optical and infrared images from Hubble depict the region around the X-ray dot, while Chandra X-ray images illustrate its surroundings. Image credits: NASA / CXC / Max Planck Inst. / Hviding et al. / ESA / STScI / HST / CXC / SAO / N. Wolk.
Shortly after the NASA/ESA/CSA James Webb Space Telescope initiated its scientific observations, new reports surfaced regarding a novel class of enigmatic astronomical entities.
Astronomers have identified numerous small red objects situated approximately 12 billion light-years away from Earth, dubbed “little red dots” (LRDs).
Many researchers suspect that LRDs are supermassive black holes encapsulated in dense gas clouds, obscuring features that typically help astronomers detect these celestial objects, including X-rays.
Unlike conventional growing supermassive black holes, which are not surrounded by dense gas, LRDs’ light emissions are hindered, preventing the escape of bright ultraviolet and X-rays from the material orbiting the black hole.
A recently identified “X-ray dot,” located about 11.8 billion light-years from Earth, might serve as a pivotal connection between black hole stars and typical growing supermassive black holes.
This object, known as 3DHST-AEGIS-12014, exhibits characteristics of an LRD—small, red, and distant—but uniquely emits X-ray light.
“Astronomers have been trying to decipher the nature of the little red dots for years,” commented Dr. Raphael Hviding of the Max Planck Institute for Astronomy.
“This singular X-ray phenomenon could be the key to connecting all the dots, so to speak.”
This exceptional object was discovered through a comparison of new Webb data against comprehensive previous surveys conducted by Chandra.
“If this little red dot is a rapidly growing supermassive black hole, why does it not emit X-rays like its counterparts?” asked Dr. Anna de Graaff of the Center for Astrophysics | Harvard & Smithsonian.
“Identifying small red dots that exhibit differing properties from other dots offers crucial new insights into the mechanisms behind their power.”
The researchers propose that this X-ray dot signifies a transitional phase from an LRD to a typical growing supermassive black hole.
As a black hole star consumes gas from its surroundings, gaps in the gas cloud form.
This allows X-rays from the matter descending into the black hole to penetrate and be detected by Chandra.
Ultimately, as the gas is fully consumed, the black hole star will cease to exist.
Chandra’s X-ray dot data also hints at fluctuations in the brightness of the X-rays, supporting the notion of a partially obscured black hole.
As the gas cloud rotates, varying densities of gas encircle the black hole, affecting X-ray brightness.
“If we confirm that the X-ray dot is indeed a small red dot in transition, it could be unprecedented and may reveal the core of a small red dot for the first time,” stated Princeton University’s Hanpu Liu.
“This would also provide strong evidence that a growing supermassive black hole resides at the center of some, if not all, of the tiny red dot population.”
Another hypothesis about the X-ray dot is that it could be a common type of growing supermassive black hole, albeit shrouded in an unusual type of dust yet to be documented.
Future observations are planned to uncover the truth behind this discovery.
“The X-ray dot has been part of our Chandra survey data for over a decade, but we only recognized its significance after Webb observed the region,” remarked Dr. Andy Golding of Princeton University.
“This exemplifies the strength of collaboration between two remarkable observatories.”
The discovery is reported in a paper in the Astrophysical Journal Letters.
_____
Raphael E. Hviding et al. 2026. X-ray dots: exotic dust or late-stage tiny red dots? ApJL 1000, L18; doi: 10.3847/2041-8213/ae4c88
Extensive genome-wide analysis of the Amazonian two-toed sloth (genus Choloepus) reveals that these remarkable creatures possess greater genetic diversity than previously understood, suggesting the potential existence of undiscovered species.
The two-toed sloth is a slow-moving, tree-dwelling mammal that inhabits the lush rainforests of Central and South America.
Interestingly, although named for the two clawed digits on their forelimbs, these animals have three toes on their hind feet; it is the two-fingered front limbs that distinguish them from their three-toed counterparts.
The two-toed sloths of the genus Choloepus are classified within the monophyletic family Choloepodidae.
Dr. Camila Mazzoni of the Leibniz Institute for Zoo and Wildlife Research highlights that Choloepus didactylus is monotypic with a wide distribution throughout the Amazon, while Choloepus hoffmanni comprises five recognized subspecies in Central America, as well as northwestern parts of Venezuela, Colombia, and Ecuador.
Both species are believed to coexist in the western Amazon, a region known for its rich concentration of terrestrial mammals, exhibiting distinct morphological traits such as fur coloration and skeletal characteristics.
However, significant overlaps in body size and coat color present challenges for accurate taxonomic classification.
To investigate the phylogenetic and biogeographical history of this genus in the Amazon, the researchers compiled existing mitochondrial data and generated new genomic datasets, including mitochondrial and whole-genome sequences from Choloepus individuals sampled across three remote Amazonian regions.
This comprehensive analysis allowed them to assess population structure, lineage relationships, demographic history, and genomic diversity patterns among sampled lineages.
The findings revealed that Choloepus hoffmanni is not a singular, cohesive lineage as previously thought.
Instead, populations east of the Andes share closer genetic relationships with Choloepus didactylus than with their western counterparts, indicating a “paraphyletic” classification that fails to capture the species’ true evolutionary history.
Even more remarkably, the research uncovered at least three deeply divergent genetic lineages among Amazonian sloths, suggesting that many others may yet be identified.
This hidden diversity dates back millions of years. By comparing nuclear and mitochondrial DNA, they reconstructed the sloth evolutionary timeline, revealing significant divergences linked to historical environmental changes.
The separation of sloths on opposite sides of the Andes likely occurred around 4.6 million years ago during the final uplift of the mountains that reshaped South America’s landscape.
Then, approximately 2.6 million years ago, the Quaternary Ice Age began, fragmenting the Amazon rainforest into isolated habitats, which would have introduced barriers to migration for these arboreal creatures, setting them on separate evolutionary paths.
Genetic analyses also indicated that sloth populations have experienced expansions and contractions in response to glacial cycles throughout history.
Dr. Mazzoni emphasized, “The Amazon sloth represents both an ancient evolutionary relic and a critical indicator of current deforestation trends.” She asserted, “The discovery of cryptic lineages and potentially new species underscores the urgency of advancing sloth research, which our team actively pursues.”
“This collaborative research lays a crucial foundation for sloth conservation efforts.”
“Our findings demonstrate the pivotal role of genomic research in revealing the hidden biodiversity of the Amazon, directly informing conservation strategies to protect unique evolutionary units before they are threatened by human activities.”
For further details, refer to the published study in Molecular Phylogenetics and Evolution.
_____
Larissa S. Arantes et al. 2026. Genomic insights into the evolutionary history and puzzling diversity of two-toed sloths (Choloepus) in the Amazon. Molecular Phylogenetics and Evolution 221: 108620; doi: 10.1016/j.ympev.2026.108620
Paleontologists have unveiled a remarkable discovery: a new species of multituberculate mammal of the genus Cimolodon, based on fossils found in Baja California, Mexico.
Illustration of Cimolodon desosai holding fruit in a tree. Image credit: Andrei Atutin.
The newly identified mammal, Cimolodon desosai, roamed the region that is now Mexico approximately 75 million years ago, during the Late Cretaceous period.
This ancient creature was about the size of a golden hamster, weighing around 100 grams.
It likely foraged on the ground and in trees, primarily consuming fruits and insects.
According to Professor Gregory Wilson Mantilla, a paleontologist at the University of Washington and curator of vertebrate paleontology at the Burke Museum, “The genus Cimolodon was quite prevalent during the Late Cretaceous, the final chapter of the dinosaur era.”
Fossils of Cimolodon have been uncovered across western North America, stretching from western Canada to Mexico.
This new species, Cimolodon desosai, represents an ancestor of those that survived the mass extinction event.
“Its relatively small size and omnivorous diet likely contributed to its survival advantages,” said Professor Mantilla.
Fossilized remains of Cimolodon desosai, including parts of its skeleton such as teeth, skull, jaw, femur, and ulna, were discovered in the El Gallo Formation of Baja California in 2009.
This specimen is considered the most complete mammal known from the Mesozoic era in Mexico and one of the best-studied representatives of the cimolodontans, a significant group of North American multituberculates.
“Fossil discoveries here are rare when compared to other regions,” explained Professor Wilson Mantilla.
The discovery of more than just teeth for Cimolodon desosai enhances understanding of its anatomical features and locomotion.
This contributes to a broader comprehension of the genus and its ecological niche, providing insights into the multituberculate lineage.
Research indicates that the local mammalian fauna of El Gallo is currently represented by 16 specimens, including three multituberculates, one metatherian, and one eutherian.
“While additional sampling is necessary, the existing mammalian fauna appears to exhibit the highest biogeographic similarity to the native fauna of Terlingua in West Texas,” the researchers noted.
The findings were published in a recent paper in the Journal of Vertebrate Paleontology.
_____
Gregory P. Wilson Mantilla et al. New skull and postcranial remains of Cimolodon (Mammalia, Multituberculata, Cimolodontidae) from the Upper Cretaceous (Campanian) El Gallo Formation of Baja California, Mexico. Journal of Vertebrate Paleontology, published online on April 22, 2026. doi: 10.1080/02724634.2026.2641109
A groundbreaking study conducted by astronomers from the University of California, Riverside, Sam Houston State University, and the University of Oklahoma indicates that the decay of dark matter may have significantly accelerated the collapse of early gas clouds, facilitating the rapid formation of supermassive black holes, contrary to existing theories.
Agarwal et al. revealed that the energy released by decaying dark matter significantly changed the chemistry of early galaxies, allowing for direct black hole formation. Image credit: Agarwal et al., doi: 10.1088/1475-7516/2026/04/034.
“Our findings suggest that dark matter collapse could drastically influence the evolution of the universe’s first stars and galaxies,” stated Yash Agarwal, a graduate student from the University of California, Riverside.
“As the James Webb Space Telescope uncovers more supermassive black holes from the early universe, this mechanism may help reconcile theory with observation.”
In their research, Agarwal and colleagues demonstrated that as dark matter, which makes up approximately 85% of the matter in the universe, decays, it releases a small amount of energy that accelerates the collapse of gas clouds.
Notably, each decaying dark matter particle needs to inject only “energy equivalent to one billionth that of a standard AA battery.”
“The primordial galaxies were essentially massive hydrogen gas balls, and their chemistry was extremely sensitive to atomic-scale energy fluctuations,” explained Dr. Flip Tanedo from the University of California, Riverside.
“These are characteristics we search for in dark matter detectors. The ‘detector’ properties could potentially explain the existence of supermassive black holes observed today.”
The team modeled the thermochemical dynamics of gas influenced by a decaying axion, uncovering a specific dark matter mass range between 24 and 27 electron volts, which creates conditions suitable for black hole formation.
“This research emerged from a series of fortunate events that united the right experts—including particle physicists, cosmologists, and astrophysicists—in workshops to address pivotal questions in the field,” Dr. Tanedo remarked.
“We’ve demonstrated that an optimal dark matter environment makes the direct collapse of black holes considerably more probable.”
“Additionally, support for interdisciplinary research allowed for the ‘serendipity’ that fueled this investigation.”
Read more about the study published on April 14, 2026, in the Journal of Cosmology and Astroparticle Physics.
_____
Yash Agarwal et al. 2026. A black hole candidate that collapses directly from decaying dark matter. JCAP 04:034; doi: 10.1088/1475-7516/2026/04/034
New York State Health Commissioner James McDonald addressed the measles outbreak last year.
Jim Franco/Albany Times Union via Getty Images
Increasingly, gamblers are placing bets on the projected number of measles cases in the United States. In January alone, wagers on platforms such as Kalshi and Polymarket exceeded $9 million, and evidence suggests their forecasts can model the spread of infection surprisingly well.
Prediction markets operate by allowing users to buy and sell shares tied to specific outcomes. Each market poses a question about a future event, with participants betting on “yes” or “no” outcomes. The price of each share tracks the market’s collective opinion.
For instance, if the market puts an 86 per cent probability on a “yes” outcome, each “yes” share costs 86 cents. If the event occurs, each winning share pays out $1, while unsuccessful bettors lose their stake.
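The pricing and payout arithmetic described above can be sketched in a few lines. This is an illustration of the mechanism, not any platform's actual code, and the 86-cent figure follows the example in the text.

```python
# Sketch of prediction-market accounting: the "yes" share price equals the
# market's implied probability of "yes", and each winning share pays out $1.

PAYOUT_PER_SHARE = 1.0  # dollars paid per share if the bet resolves "yes"

def share_cost(prob_yes, n_shares):
    # An 86% market prices each "yes" share at $0.86.
    return prob_yes * PAYOUT_PER_SHARE * n_shares

def settle(n_shares, cost, event_happened):
    # Profit (positive) or loss (negative) once the market resolves.
    winnings = n_shares * PAYOUT_PER_SHARE if event_happened else 0.0
    return winnings - cost

cost = share_cost(0.86, 100)              # 100 "yes" shares at 86 cents: $86
profit_if_yes = settle(100, cost, True)   # $100 payout minus $86 stake
loss_if_no = settle(100, cost, False)     # the whole $86 stake is lost
```

Note the design consequence: because the price is the crowd's implied probability, a share is only profitable in expectation if the buyer believes the true probability is higher than the market's, which is what makes the prices themselves usable as forecasts.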
The concept of prediction markets originated in scientific research. In 1988, economists Robert Forsythe, George Neumann and Forrest Nelson at the University of Iowa set out to forecast the US presidential election, creating markets in which small stakes could be wagered on the outcome.
Their predictions proved remarkably accurate. In 2003, infectious disease researcher Philip Polgreen urged economists to expand these markets to include disease forecasting, emphasizing an ethos centered on education and public welfare.
In recent years, companies like Kalshi and Polymarket have commercialized prediction markets, operating legally in the U.S. under Commodity Futures Trading Commission regulations, albeit facing growing scrutiny from government entities.
These markets have faced criticism for enabling bets on sensitive subjects like the Iran and Ukraine conflicts. Some observers deem it morally questionable; for instance, a trader known as Magamiman profited $553,000 by accurately predicting the removal of Ayatollah Khamenei from power on February 28, 2026. This success raised concerns among U.S. lawmakers about potential insider trading.
As measles cases rise in the U.S., a similar betting market has emerged for the disease. Although the ethical dilemma surrounding these bets is complex, it may offer valuable data insights. According to Spencer J. Fox, a professor at Northern Arizona University who specializes in predictions for COVID-19 and other respiratory viruses, the measles prediction market could serve as an innovative data source.
The June 2025 prediction market projected around 2,000 measles cases by year-end, a figure closely matching the actual data of 2,288 cases. Fox noted, “Our model anticipated far worse scenarios.”
To forecast disease, epidemiologists utilize various data types, including vaccination rates, genomic information, and climate considerations. “Everyone is searching for an edge in infectious disease prediction, constantly exploring new data streams,” Fox remarks. However, measles poses a challenge for predictions due to its “highly stochastic” nature.
Emile Servan-Schreiber, CEO of the prediction market firm Hypermind, suggests that the accuracy of measles forecasts may stem from harnessing the “wisdom of the crowd”, in which non-experts contribute diverse perspectives that balance out gaps in specialized knowledge.
Nevertheless, Fox argues that prediction markets cannot fully replace scientific models employed by epidemiologists. These markets often lack the comprehensive predictions and granularity that scientific approaches provide. He emphasizes, “We would need to make thousands of bets each week on all possible predictions.”
Moreover, he emphasizes that only seasoned experts are adept at predicting rare events. “If we neglect to cultivate expertise in infectious disease prediction now, we risk being overwhelmed by the next pandemic,” he warns.
Neither Kalshi nor Polymarket responded to a comment request from New Scientist.
Graphene sheets are 2D, but some thin materials may not fit neatly into that category.
Alfred Pasieka/Science Photo Library
Researchers have identified a groundbreaking quantum state of matter that operates beyond traditional two- or three-dimensional characteristics, unveiling novel mechanics in electron movement.
Physicists categorize matter states by analyzing electron mobility within various materials, driven by factors like atomic structure and arrangement.
In a magnetic field, electrons in a thin material trace small circular paths, deflecting the current toward one edge of the material, a phenomenon known as the Hall effect. In magnetic substances, electron trajectories are more intricate, giving rise to anomalous versions of this effect.
Wang Lei and his research team at Nanjing University have made an unexpected finding: they’ve discovered a new variant, termed the Transdimensional Anomalous Hall Effect (TDAHE).
While examining electrons in a thin material structured from carbon atoms arranged in a diamond-like pattern to create highly efficient electrical currents, they observed unusual electron behaviors upon magnetization.
“The TDAHE discovery was astonishing; it’s a phenomenon not previously documented in other materials nor predicted by existing theories,” Wang states. Measuring the raw data took nearly a year, indicating the complexity of this new effect.
The unexpected finding was that their material exhibited Hall effect characteristics under two mutually perpendicular magnetic fields. This discovery was significant because it demonstrated electrons performing loop motions both horizontally and vertically, despite the material’s minimal thickness.
Initially, Wang’s team suspected experimental errors, yet repeated tests verified their measurements. Additional samples yielded consistent results, leading to a groundbreaking conclusion: in carbon materials only 2 to 5 nanometers thick, the electrons were exhibiting unprecedented behaviors.
Due to the material’s ambiguous dimensionality, the researchers coined the term “hyperdimensional” to describe the new electronic states that do not adhere to the traditional 2D or 3D frameworks. “We aim to express that this finding introduces a novel regime beyond the well-explored dimensions,” explains Wang.
Andrea Young from the University of California, Santa Barbara, notes that this new state is distinctive because its mathematical portrayal of electron states lacks symmetry in three different aspects, setting it apart from analogous states. He emphasizes that the material’s thickness is secondary to its unique characteristics.
Young likens the newly identified state to a “quarter metal,” indicating that its asymmetry constrains electron behavior compared to conventional metals.
Wang’s team is now eager to delve further into hyperdimensional physics across various materials, utilizing advanced tools, including diamond-based magnetic field sensors, to explore this unprecedented state.
Red blood cells can be enhanced to improve wound healing
3D Media Sphere/Science Photo Library
Recent studies show that simple modifications to red blood cells, which transport oxygen in our bodies, can dramatically halt severe bleeding almost instantly. In trials on rats with severe liver injuries, modified blood led to the formation of clots within just five seconds, significantly reducing blood loss. This breakthrough raises hopes for its application in both scheduled and emergency surgical procedures.
Approximately 2 million people die from blood loss globally each year, and the longer bleeding continues, the greater the risk of death. Minor wounds clot quickly on their own, but severe bleeding often requires costly blood transfusions, which are difficult to administer urgently, while hemostatic dressings can provoke immune responses that hinder healing.
Red blood cells not only carry oxygen but also aggregate with platelets, tiny cell fragments responsible for halting bleeding, to create a sticky mesh that seals injuries. Red blood cells make up the bulk of this plug, but they are inherently fragile, so Jianyu Li and his colleagues at McGill University in Montreal set out to strengthen them. “We recognized the elephant in the room,” he noted.
The researchers initially collected blood from rats, separating various cellular components. They then introduced chemicals that function like handles: one side binds to proteins on the red blood cell surface, while the other interacts with long-chain molecules that help bind the cells together.
The modified cells were then reintegrated into the liquid component of blood, known as plasma, and injected into the rats’ severe liver wounds. In stark contrast to untreated rats, which took 265 seconds to clot, treated rats began clotting in under 5 seconds, with only a minimal loss of 24 milligrams of blood compared to nearly 2,000 milligrams in the untreated group.
Unlike natural blood clots, which dissolve within a few days, the modified clots remain intact for one to two months, giving wounds extended time to heal, Li notes. Furthermore, the study raised no safety concerns at this stage.
“This is a ground-breaking study demonstrating a novel approach to designing cell-based biomaterials for surgical and regenerative purposes,” commented Hyunwoo Yuk, founder of SanaHeal, a Boston-based firm focused on bioadhesive technology.
Researchers hope that in the future, a small sample of a patient’s blood could be collected before a planned surgical procedure and processed within just 30 minutes. In emergencies, the treatment could be prepared from blood-bank samples and stored at low temperatures for up to a month. However, Jayachandran Kizhakkedathu, a researcher at the University of British Columbia, notes a considerable challenge: unlike long-lasting synthetic alternatives, cellular materials like these have a limited shelf life.
Lee mentioned that his team has applied for a patent and is planning further research to explore these promising findings.
Apple is increasingly utilizing defective chips to produce budget-friendly laptops. Although this may sound counterintuitive, it highlights a widely accepted technique known as “binning,” which minimizes production costs and environmental waste in smartphones and laptops.
The term “binning” originates from agriculture, where high-quality fruits and vegetables are sold to premium markets, while those in poor condition are allocated to other customers, and the least desirable items may be recycled as animal feed. This segregation ensures minimal waste, a practice echoed in semiconductor manufacturing.
For instance, take the new MacBook Neo, which incorporates the A18 Pro chip with five GPU cores, giving users a more affordable Apple laptop option. The A18 Pro previously appeared in the iPhone 16 Pro with six functional GPU cores. Reports suggest that Apple is drawing on A18 Pro chips binned with one defective core, putting parts that would otherwise be discarded to use. Although Apple hasn’t commented on this, industry experts told New Scientist that manufacturers across various sectors, from electronics to automotive, routinely adopt this practice.
According to Owen Guy, a researcher at Swansea University in the UK, semiconductor chips are produced in large quantities on 300-mm silicon wafers housing trillions of individual transistors. Sophisticated machinery performs countless operations on these wafers, building up layers of circuits, insulators, and chemicals just nanometers thick. Given the complexity of this procedure, the surprise is not that defects occur but that chips work at all.
“At each stage of the process, there is a small chance that something may go awry,” says Guy.
The proportion of errors on a single wafer dictates the yield: the share of chips that meet the required specifications. Yields can reach 99 percent for standard silicon chips, a material employed since the 1960s, but are often lower for advanced chip designs and newer substrate materials like silicon carbide and gallium nitride.
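The compounding effect behind those yield numbers can be made concrete with a short, purely illustrative calculation. The step counts and per-step failure probability below are hypothetical examples, not figures from the article:

```python
# Illustrative sketch: how small per-step defect rates compound into
# overall wafer yield. Numbers are hypothetical, not from the article.

def yield_estimate(steps: int, p_fail_per_step: float) -> float:
    """Overall yield if each of `steps` process steps independently
    succeeds with probability (1 - p_fail_per_step)."""
    return (1.0 - p_fail_per_step) ** steps

# Even a 0.01% failure chance per step erodes yield over many steps:
for steps in (100, 500, 1000):
    print(steps, round(yield_estimate(steps, 0.0001), 4))
```

This is why, as Guy says, a small chance of something going awry at each stage matters so much: the probabilities multiply across hundreds of stages.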
“The critical question is the number and severity of defects. Unless there’s a so-called fatal flaw, functioning chips can still exist even when they have imperfections,” Guy explains.
Imagine a yield of 90 percent, where nine out of ten chips perform as expected. The tenth is destined for a bin. If one core is defective, the chip may be classified as a different product, featuring five cores instead of six. Alternatively, it might be restricted to lower voltages or frequencies, or rated for a narrower power or temperature range. There will always be customers eager to use these chips.
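A post-test binning rule of this kind can be sketched in a few lines of code. The bin names, core counts, and clock threshold here are invented for illustration and are not Apple’s actual criteria:

```python
# Minimal sketch of "binning" logic for a hypothetical six-core part.
# Bin names and thresholds are illustrative assumptions only.

def bin_chip(working_cores: int, max_stable_ghz: float) -> str:
    if working_cores >= 6 and max_stable_ghz >= 3.0:
        return "premium"     # fully working part, sold at top spec
    if working_cores >= 5:
        return "five-core"   # one core fused off, sold as a cheaper SKU
    if working_cores >= 4:
        return "low-power"   # restricted to lower clocks or voltages
    return "scrap"           # fatal flaw: material is recycled

print(bin_chip(6, 3.2))  # premium
print(bin_chip(5, 3.2))  # five-core
print(bin_chip(3, 2.0))  # scrap
```

The point of the sketch is that a die never simply passes or fails: test results route it to whichever product tier its defects still allow.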
Researcher Tony Kenyon at University College London says users often remain unaware of any issues. In memory chips, error-correction software isolates faulty transistors to avoid data loss; in processors, computations are rerouted around flawed cores to prevent software crashes.
“A deeper look under the hood reveals that certain components of the chip may not function. This is widely prevalent. Many believe all chips are uniform, but the reality is far more complex,” Kenyon asserts.
Scientists have developed a rapid, practical test for assessing the quality of black coffee, allowing baristas and researchers to better understand flavor profiles without the need for complex laboratory analysis.
Bamboo et al. demonstrate that cyclic voltammetry can directly measure the strength of coffee beverages and the degree of coffee roasting, without needing additional sample preparation. Image credit: Sci.News.
Traditional methods for evaluating coffee often rely on subjective tasting panels or estimations of dissolved substances in a sample.
Unfortunately, these methods may not accurately reflect chemical differences arising from varying roasting levels or brewing techniques.
While existing laboratory methods can identify individual molecules, they tend to be time-intensive, costly, and impractical for everyday use.
“Since the 1950s, the coffee industry has sought quantitative methods to evaluate beverage quality that extend beyond sensory panel insights,” noted Christopher Hendon, a researcher at the University of Oregon, along with his team.
“Research has shown that the strength of the coffee and the roasted bean color are two key, independent factors that influence coffee flavor.”
“While bean color can be easily analyzed through spectrophotometry, the most common method for measuring coffee concentration relates the beverage’s refractive index to effective concentration using an empirically derived polynomial.”
Researchers are now proposing a novel method to efficiently assess black coffee strength using an electrochemical test called cyclic voltammetry.
This technique differentiates strength and roast variations by applying voltage and monitoring the current flowing through the coffee in response to the electric field.
The scientists found a linear correlation between beverage strength and total electric charge, with signals decreasing as the roast of the sample darkened.
This reduction is linked to roast-dependent molecules, like caffeine, sticking to the platinum electrode during measurements.
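The linear charge-to-strength relationship described above is, in effect, a calibration line, which can be fit by ordinary least squares. The data points below are invented for the sketch; they are not the study’s measurements:

```python
# Illustrative only: fitting a linear calibration between total electric
# charge (from cyclic voltammetry) and coffee strength.
# The data points are made up; they are not from the study.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

charge = [0.5, 1.0, 1.5, 2.0]     # hypothetical total charge (mC)
strength = [0.6, 1.1, 1.6, 2.1]   # hypothetical % total dissolved solids
a, b = fit_line(charge, strength)
print(round(a, 3), round(b, 3))   # prints: 1.0 0.1
```

Once such a line is calibrated against reference brews, a single voltammetry reading could be converted directly into a strength estimate.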
The method’s validity was confirmed by comparing it with color and flavor descriptions from a UK-based roastery’s quality control process.
“This electrochemical test could effectively differentiate between batches of brewed coffee that appear similar and have comparable dissolved solids measurements but differ significantly in flavor,” the authors stated.
“The findings suggest that this approach may become a sensitive and reliable method for assessing coffee composition, complementing current tools in the industry.”
The team’s study is published in the journal Nature Communications.
_____
Bamboo et al. 2026. Direct electrochemical assessment of black coffee quality using cyclic voltammetry. Nat Commun 17, 3618; doi: 10.1038/s41467-026-71526-5
Firefighters Combat Wildfires in Fundão, Portugal, August 2025
Da Silva/EPA/Shutterstock
In 2025, Europe faced unprecedented wildfires and heatwaves, further straining the world’s fastest-warming continent.
According to the European Centre for Medium-Range Weather Forecasts (ECMWF), last year was the hottest on record in the UK, Iceland, and Norway, and one of the three hottest across Europe. Over 95% of the continent experienced above-average annual temperatures. Scandinavia, Finland, and northwestern Russia endured their most severe heatwave on record, with temperatures soaring to 30°C (86°F) in the Arctic for a staggering 21 consecutive days.
This extreme heat threatens to stunt the growth of flora and fauna while fostering the spread of invasive species and pests. Celeste Saulo of the World Meteorological Organization emphasized that such prolonged heat stress has devastating effects on ecosystems: “We’re used to seeing between 0 and 2 days of strong heat stress, and now we’re talking about 21 days affecting the ecosystem’s health.” Since 1980, Europe has warmed at twice the global average rate, with heatwaves becoming more frequent and severe.
The warming climate contributed to record wildfires in Portugal and Spain last August; the extreme heat, dryness, and wind that drove them are now deemed at least 40 times more likely because of climate change. Over 10,000 square kilometers were scorched, killing at least three people. As wildfires approached Madrid, parts of the Camino de Santiago had to be closed, with smoke wafting as far as Britain.
Across Europe, wildfires emitted a staggering 47 million tonnes of carbon dioxide, with Spain, the UK, the Netherlands, Germany, and Cyprus exceeding their previous emission records. Soils are at their driest in at least 33 years, with extreme agricultural drought afflicting over one-third of Europe, including the UK, Turkey, and Ukraine. These drought conditions may have fueled the wildfires, particularly in Spain and Portugal. As Samantha Burgess of ECMWF put it, “Following a wet spring that stimulated plant growth, the record summer heat has left dry plants and shrubs, creating tinderbox conditions.”
“When you combine wildfire weather with very high fuel loads and especially hot, dry winds, you create catastrophic conditions for wildfires to spread rapidly,” Burgess continued. “National parks must implement firebreaks to limit the spread in case of a wildfire.”
The waters surrounding Europe have also been abnormally warm, breaking the record for annual sea surface temperature for the fourth consecutive year. Some 86 percent of these waters experienced strong, intense, or extreme marine heatwaves, with the hottest spots off the west of Ireland, south of Iceland, and southeast of Spain.
In each of the last three years, the entire Mediterranean Sea has experienced marine heatwave conditions, and it is warming faster than the global average. Water temperatures off Italy and Spain reached a sweltering 30 degrees Celsius, warmer than the average swimming pool, raising the likelihood of fish deaths and the proliferation of harmful bacteria and algae. Marine heatwaves in the Mediterranean have previously devastated coral reefs, seaweed beds, and shellfish populations.
To mitigate future impacts, Europe must spearhead efforts to combat climate change. Dusan Krenek, a European Commission official, noted: “In 2025, 12.5% of the continent’s electricity was generated from solar power, with renewables accounting for a total of 46%.” European nations are among the pioneers participating in the Summit on the Transition Away from Fossil Fuels, held in Colombia, following the inadequacies of the COP30 climate change summit in Brazil.
Additionally, officials stress the need for Europe to adapt to forthcoming climate risks, including potential multi-year droughts akin to those currently affecting the western United States. “We must address these risks,” Krenek stated. “The cost of inaction far exceeds the investments needed to tackle these adverse effects.”
Scorpions enhance their claws and stingers with metals, akin to wearing steel-capped boots. This adaptation serves to amplify the strength of their primary weapons.
Metal reinforcement is already known in vertebrates, such as in the teeth of Komodo dragons (Varanus komodoensis), where metal-rich regions are visible as staining on the surface.
Sam Campbell and researchers from the University of Queensland, Australia, investigated the claws and stingers of 18 diverse scorpion species to analyze the extent and composition of metallic reinforcement.
The team employed advanced X-ray imaging techniques and an electron microscope, mapping the presence of three primary metals: iron, zinc, and manganese. Additional elements, including copper, nickel, silicon, chlorine, titanium, and bromine, were also detected.
Metals accumulate mainly at the tips of stingers and the cutting edges of claws, as well as in the mouthparts and tarsal claws, which Campbell likens to “a boot with a steel cap on the toe.” The rest of the exoskeleton is hard too, but much softer than these metal-enriched zones.
All scorpions emit a green or blue glow under ultraviolet light, yet the researchers discovered that the metal-enriched areas do not fluoresce under UV exposure.
While the method by which scorpions acquire these metals remains unclear, their diet is likely a significant source.
The research indicates that different scorpion species accumulate metals variably according to their behaviors. “Our findings suggest that the more zinc in the claw, the less zinc in the stinger, and vice versa,” explains Campbell. “Since scorpions utilize their weapons distinctively, metal enrichment likely evolved to enhance the biomechanical properties of the most critical weapons.”
Metal incorporation in animal tissues appears to be more prevalent than previously recognized, notes Aaron LeBlanc from King’s College London. “Emerging research indicates similar traits in vertebrate teeth,” he adds. “The next logical phase is to explore the evolutionary pathways of these adaptations across major lineages, and this research is groundbreaking in that context.”
Colorectal cancer is on the rise, particularly among younger individuals.
Getty Images North America Copyright: Paul Morigi/Getty Images for Fight Colorectal Cancer
Research into the rising incidence of cancer among young individuals has generated more questions than definitive answers. While one study indicates that increasing obesity rates may account for a fraction of this trend, it doesn’t provide a comprehensive explanation.
According to Montserrat Garcia-Crosas from the Institute of Cancer Research (ICR) in London, the main takeaway is that although Body Mass Index (BMI) serves as a significant indicator, much of the increase remains unexplained.
Numerous global studies have documented a rise in cancer cases among adults under 50. Notably, the incidence of colorectal cancer has surged by about 50% in countries including the United States, Australia, and Canada since the 1990s.
To investigate the reasons behind this trend, Garcia-Crosas and colleagues analyzed cancer data in the UK alongside population trends related to risk factors such as obesity. Their findings indicated that 11 types of cancer are rising among individuals aged 20 to 49, with breast and colorectal cancers being the most prevalent. Other malignancies include liver, kidney, and pancreatic cancers, exhibiting growth rates between 1% and 6% annually.
The researchers discovered that the incidence of nine out of these 11 cancers was also increasing in individuals over 50, suggesting some common underlying factors. However, ovarian cancer and colorectal cancer were exceptions to this pattern, as noted by Garcia-Crosas.
The team also examined behavioral risk factors linked to these 11 cancers by the International Agency for Research on Cancer, including alcohol consumption, smoking, physical inactivity, body mass index, and dietary habits related to fiber and processed meats. “These are the factors with the strongest evidence linking them to these cancers,” Garcia-Crosas stated.
Most of these risk factors have held steady or improved over time; the exception is BMI, which has climbed alongside rising obesity rates. Even so, obesity explains only part of the increase in cancer among young people. For instance, only about 20% of the rise in colorectal cancer among young women can be attributed to increasing BMI, according to Garcia-Crosas.
According to team member Mark Gunter at Imperial College London, extensive research is currently ongoing to identify the causes of this troubling trend. Potential factors being examined include a higher consumption of ultra-processed foods, substances known as PFAS (forever chemicals), and antibiotics affecting the gut microbiome.
Their analysis suggests that the increase in cancer cases among young people likely stems from a combination of factors rather than a single cause, and the team could not exclude the possibility that changes in diagnostic practices are also influencing these statistics.
This rise should also be considered in context, as highlighted by Amy Berrington at ICR. In the UK, only about 3,000 bowel cancer cases are reported annually among individuals aged 20 to 49. Consequently, a 3% increase signifies approximately 100 more cases each year. “These trends are relative, and the overall increase in cases remains modest,” Berrington elaborated.
The study did not include cervical cancer due to the significant decrease in cases among women who received the HPV vaccine during childhood.
Looking ahead, Berrington draws attention to data through 2023, expressing optimism as the upward trend seems to be stabilizing. Furthermore, if obesity is a contributing factor to the rise in cancer diagnoses, emerging GLP-1 weight loss medications, such as semaglutide, may offer a potential solution. “Should obesity rates decline due to the adoption of these medications, we could witness a reduction in some obesity-related cancers in the future,” Professor Gunter concluded.
Everyone has a favorite movie villain, whether it’s Darth Vader or Cruella de Vil. These iconic characters captivate us, sparking a fascination that goes beyond their nefarious deeds.
Logically, this admiration doesn’t make sense. These characters are undeniably bad, dangerous, and often downright evil… yet we find ourselves rooting for them!
According to recent research, while most people acknowledge the villainous actions of these characters, they often believe there’s a “good” side lurking beneath. But why do we think this?
One reason might be that compelling plots depend on conflict, which is frequently brought to life by the antagonist. In essence, the excitement kicks in when the villain steps onto the scene.
The more diabolical the villain, the greater our entertainment, which suggests a movie’s success may hinge in part on how compelling its villain is.
Just like how many people enjoy horror movies or thrilling activities like bungee jumping—despite their inherent dangers—our brains connect these experiences with positive emotions. Objectively “bad” experiences can elicit joy.
To reconcile this cognitive dissonance, we may instinctively convince ourselves that the villain can’t be *that* bad.
If we see ourselves as good people, we often believe villains have some goodness too. – Image credit: Getty Images
Our perceptions are shaped by our personal experiences, beliefs, and moral compass, leading us to view ourselves as inherently good.
When we emotionally connect with a character, such as an intriguing villain, we use our moral framework to understand their motives. If we can empathize with them, we subconsciously assume they possess some goodness too.
Right?
While there may be many psychological factors influencing our attraction to villains, the key takeaway is that, while we may cheer against them, we secretly appreciate their presence.
This article addresses the intriguing question from Lancaster’s Luke Rees: “Why do we root for the villain in a movie?”
For inquiries, reach out to us at questions@sciencefocus.com or follow us on Facebook, Twitter, or Instagram. (Don’t forget to include your name and location.)
Explore our ultimate collection of fun facts and discover more amazing science pages.
A newly identified genus and species of titanosaurian sauropod dinosaur, related to South American forms, has been unveiled by a research team led by the University of Bath, including Dr. Nick Longrich.
Reconstruction of Phosphatotitan khouribgaensis. Image credit: Conor Ashbridge.
Phosphatotitan khouribgaensis thrived in present-day Morocco during the “late but not latest” Maastrichtian age of the Cretaceous period, around 70 million years ago.
“The closing Cretaceous epoch witnessed the last diversification of dinosaurs, leading up to the end-Cretaceous extinction,” noted Longrich and team in their research.
“Discussions about Cretaceous dinosaur diversity have largely concentrated on the well-documented fauna of Laurasia.”
“However, there’s limited information about the dinosaur species from the Southern Hemisphere, particularly Africa.”
Fossils of Phosphatotitan khouribgaensis were excavated from the Sidi Chennane phosphate deposit in the Oulad Abdoun Basin, in the Khouribga region of Morocco.
The fossil material includes a sacrum, parts of the pelvis, and dorsal and caudal vertebrae.
“The phosphates of the Oulad Abdoun Basin consist of a mix of phosphate sandstone, marl, and limestone,” the paleontologists explained.
“These deposits formed in warm, shallow continental seas and are part of a phosphate belt that developed along the Atlantic and Tethys margins during the Late Cretaceous and early Paleogene.”
Despite its North African origins, Phosphatotitan khouribgaensis appears closely linked to the Lognkosauria, a group of titanosaurs previously identified only in South America, known for harboring some of the largest land animals.
This connection implies that these dinosaurs roamed the ancient supercontinent Gondwana before the split of Africa and South America over 100 million years ago.
Alternatively, these dinosaurs could have crossed the narrow oceanic barrier between the two continents later on.
“This novel species differs from the titanosaurs described from Cretaceous Africa and Europe but bears resemblance to South American lognkosaurs, particularly Patagotitan,” the researchers remarked. “The dorsal and caudal centra are notably short, the neural spines are expanded, and the pubis is broad.”
In contrast to its giant South American cousins, Phosphatotitan khouribgaensis was relatively small, weighing an estimated 3.5 to 4 tons, significantly less than other titans like Patagotitan. This size reduction may be attributed to environmental factors or geographical isolation.
It is suggested that parts of North Africa might have acted as an island during the Late Cretaceous, a setting that typically favored smaller species.
“Phosphatotitan khouribgaensis, like the previously identified hadrosaurids, indicates that Morocco likely hosted a unique endemic fauna during the latest Cretaceous, distinct from other African faunas,” the authors noted.
“Increased sea levels during the Late Cretaceous could have led to isolated landmasses, resulting in distinctive faunas shaped by geographic isolation and local extinctions.”
“The substantial endemism observed among latest-Cretaceous dinosaurs suggests a potentially incomplete understanding of dinosaur diversity, complicating efforts to discern global patterns prior to the end-Cretaceous extinction.”
The discovery of Phosphatotitan khouribgaensis is documented in a study published in the journal Diversity.
_____
Nicholas R. Longrich et al. 2026. Titanosaurian sauropods (Lognkosaurinae: Argentinosauridae) with South American affinities from the Late Maastrichtian of Morocco, and evidence of specific African dinosaur fauna. Diversity 18 (5): 241; doi: 10.3390/d18050241
Astronomers have amassed compelling evidence indicating that 80 percent of all matter in the universe is composed of dark matter, an invisible substance that holds galaxies together and impacts their rotation.
The large-scale structure of the universe and measurements of the cosmic microwave background (CMB) further support the presence of an undetermined entity saturating the cosmos.
While there is substantial evidence that dark matter forms extensive halos around galaxies and galaxy clusters, and is relatively sparse in expansive “voids”, there is no basis to dismiss the existence of dark matter in proximity to Earth.
In fact, one study indicates that approximately 24 trillion tons of dark matter exist between the Earth and the Moon. The validity of this claim is still under investigation.
This article addresses a question from Charles Adcock: “Is it possible that dark matter exists around the Earth, but remains undetectable?”
If you have any questions, feel free to reach out to us at:questions@sciencefocus.com or message us onFacebook,Twitter, orInstagram(please include your name and location).
Explore our ultimate collection offun facts and dive into more amazing science content.
Water consists of two hydrogen atoms and one oxygen atom: H2O. In ordinary water, each hydrogen nucleus is a single proton. Cometary water, by contrast, carries a significant proportion of deuterium (D), a form of hydrogen whose nucleus pairs a proton with a neutron. Such semi-heavy water (HDO) has been identified in comets within our solar system, and its presence in the interstellar comet 3I/ATLAS preserves evidence of remarkably different environmental conditions from billions of years ago.
This image from the Subaru Telescope shows the interstellar comet 3I/ATLAS. Image provided by: National Astronomical Observatory of Japan
Water is an essential molecule pivotal to life and various astrophysical processes.
From an astrobiological standpoint, water serves as a crucial solvent facilitating the emergence of life on Earth, and it is tracked across the universe as a potential indicator of habitable environments beyond our solar system.
During the formation of stars and planets, water in its gaseous phase acts as an efficient coolant, assisting the collapse of molecular clouds and the birth of stars.
In frozen conditions, water coats dust particles, enhancing their adhesion and accelerating planetary core growth.
Water has been identified in both gaseous and icy forms throughout our galaxy and beyond, even in high-redshift galaxies.
These detections span every stage of star and planet formation: molecular clouds, protostellar systems, prestellar cores, protoplanetary disks, comets, meteorites, active asteroids, planets, and moons.
Current research endeavors aim to link water pathways across these varied environments, aiming to unravel the origins and evolution of water in planetary system formation.
The deuterium-to-hydrogen (D/H) ratio in water acts as a potent chemical tracer, informing where the water was formed, the physical conditions during its creation, and its subsequent treatment.
“Recent observations from the Atacama Large Millimeter/submillimeter Array (ALMA) indicate that the conditions that led to the formation of our solar system differ significantly from those that shaped planetary systems in other regions of the galaxy,” explained Luis E. Salazar Manzano, a doctoral student at the University of Michigan.
“Most instruments can’t observe close to the Sun, but radio telescopes like ALMA can,” Dr. Teresa Paneque Carreño of the University of Michigan added. “We successfully observed the comet just as it passed behind the Sun, shortly after its perihelion.”
“This provides us with constraints on these molecules that other instruments cannot match.”
ALMA’s measurements of the D/H ratio in water from 3I/ATLAS indicate values over 30 times higher than those observed in comets formed within our solar system, and more than 40 times the levels found in Earth’s oceans.
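A back-of-envelope check translates those multipliers into absolute numbers, assuming the standard VSMOW reference value for Earth-ocean D/H (about 1.5576 × 10⁻⁴, a well-established constant). The multipliers come from the reported results; everything else here is arithmetic:

```python
# Back-of-envelope check on the reported enrichment factors.
# VSMOW (Vienna Standard Mean Ocean Water) D/H is a standard reference
# value; the 40x and 30x multipliers are the reported results.

VSMOW_D_H = 1.5576e-4            # Earth-ocean D/H reference ratio

atlas_d_h = 40 * VSMOW_D_H       # "more than 40 times Earth's oceans"
print(f"implied 3I/ATLAS D/H: > {atlas_d_h:.2e}")

# If that same value is also >30x the typical solar-system comet value,
# the implied cometary baseline is:
comet_d_h = atlas_d_h / 30
print(f"implied solar-system comet D/H: ~ {comet_d_h:.2e}")
```

The implied comet baseline of roughly 2 × 10⁻⁴ is consistent with the range measured in solar-system comets, which is a useful sanity check on the two multipliers.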
“We’ve established that the gas cloud that birthed the star associated with 3I/ATLAS and the other planets in its system originated under distinct, frigid conditions, contrasting sharply with the environment that formed our solar system and local comets,” Salazar Manzano revealed.
This finding offers an unusually fundamental insight compared with studies of more complex molecules in interstellar comets, because the baseline deuterium-to-hydrogen ratio was set during the Big Bang.
“The chemical processes leading to deuterium water enrichment are highly temperature-sensitive, typically requiring environments below about 30 K (equivalent to -243 degrees Celsius or -406 degrees Fahrenheit),” Salazar Manzano stated.
The D/H ratio of the comet’s water was enriched above that Big Bang baseline as it formed, and the enrichment was preserved during its interstellar journey.
This interstellar comet likely formed under specific radiation conditions and in a far colder environment than the one that produced our solar system, before being ejected into interstellar space.
“Each interstellar comet carries fragments of its history and ‘fossils’ from diverse locations,” expressed Dr. Paneque Carreño.
“While the exact formation site remains elusive, instruments like ALMA enable us to begin comprehending the conditions there and drawing comparisons to our solar system.”
The research team’s results were published on April 23rd in the journal Nature Astronomy.
_____
LE Salazar Manzano et al. 3I/ATLAS water D/H as a probe of another planetary system’s formation status. Nat Astron published online on April 23, 2026. doi: 10.1038/s41550-026-02850-5
James McDonald, New York State Department of Health Commissioner, addresses the measles outbreak.
Jim Franco/Albany Times Union via Getty Images
An increasing number of gamblers are placing bets on measles cases in the United States. In January alone, approximately $9 million was wagered on projected case counts via the Kalshi and Polymarket prediction markets, and the resulting predictions appear to model infection spread surprisingly well.
Prediction markets operate by allowing participants to buy and sell shares related to the outcomes of specific future events. Each market poses a question regarding upcoming events, enabling bets on “yes” or “no” outcomes, with the share prices determined by collective betting behavior.
For instance, if 86% of bets forecast a “yes” outcome, a “yes” share costs 86 cents. Should the event occur, the successful bettor would receive $1 for each share bought, while the unsuccessful bettor loses their stake.
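The payoff arithmetic just described can be sketched in a few lines. This is an illustrative model of a generic binary prediction market, not the actual mechanics (or fees) of Kalshi or Polymarket:

```python
# Sketch of binary prediction-market payoffs. Prices are in dollars;
# 0.86 corresponds to the 86-cent "yes" share in the example above.

def payoff(shares: int, price: float, event_occurred: bool) -> float:
    """Net profit (or loss) from buying `shares` 'yes' shares at `price`,
    where each share pays $1 if the event occurs and $0 otherwise."""
    cost = shares * price
    payout = shares * 1.0 if event_occurred else 0.0
    return payout - cost

print(payoff(100, 0.86, True))    # event happens: $100 payout - $86 cost
print(payoff(100, 0.86, False))   # event doesn't: lose the $86 stake
```

Because the share price floats with collective betting, it doubles as the crowd’s implied probability of the event: an 86-cent share means bettors collectively rate the outcome at about 86 percent.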
The concept of prediction markets originated in scientific research. In 1988, University of Iowa economists Robert Forsythe, George Neumann, and Forrest Nelson sought a method to forecast federal elections, ultimately developing a betting market model. This model enabled researchers and students to place modest bets predicting election outcomes.
Market predictions proved to be quite accurate. In 2003, Philip Polgreen, an infectious disease researcher at the University of Iowa, encouraged economists to integrate disease prediction into these markets. Polgreen stated these markets were established “on an ethos of education and public benefit.”
Recently, however, prediction markets have commercialized, driven by companies like Kalshi and Polymarket. While these entities comply with U.S. regulations set by the Commodity Futures Trading Commission, they face mounting criticism from federal and state authorities.
For example, these markets have been criticized for allowing bets on conflicts such as the wars in Iran and Ukraine, a practice critics deem immoral. In February, a trader known as Magamiman made $553,000 by accurately predicting the timing of Ayatollah Khamenei’s removal from power.
Following the prediction, Khamenei was reported dead on February 28, 2026. This event raised ethical concerns among some U.S. Congress members regarding the potential monetization of state secrets.
Alarmingly, measles cases are reportedly on the rise across the United States, prompting the emergence of a betting market centered on the illness. While the ethics of such wagers are murky, there may be an upside. Spencer J. Fox, a professor at Northern Arizona University who forecasts diseases like COVID-19, views the measles prediction markets as a potentially rich data source.
For instance, the June 2025 prediction market anticipated roughly 2,000 measles cases for the year, a number very close to the actual reported total of 2,087. “Our model generated numerous worse predictions,” explained Fox.
Epidemiologists employ multiple data streams (vaccination rates, genomic data, and climate data) to forecast disease outbreaks. “Everyone seeks an advantage in predicting infectious diseases, and we continually explore new data streams,” noted Fox, adding that measles forecasts are rare due to the disease’s “highly stochastic” nature.
Cognitive scientist Emile Servan-Schreiber, CEO of prediction market firm Hypermind, believes he understands why measles predictions maintain such accuracy. He suggests these markets leverage “the wisdom of the crowd,” with “amateurs providing cognitive diversity to offset their lack of expertise.”
Nevertheless, Fox emphasizes that prediction markets cannot simply replace epidemiologists’ scientific models. For one thing, the markets cover far fewer explicit predictions and offer less granularity about the probabilities of future outcomes. “We would need to make thousands of bets each week on all the different predictions we’re formulating,” he remarked.
Furthermore, Fox asserts that only specialists can accurately predict rare events. “If we don’t invest in developing expertise for infectious disease predictions now, we will be overwhelmed by the next coronavirus.”
Kalshi and Polymarket have yet to respond to requests for comments from New Scientist.
Recent fossil discoveries from the Namba Formation in South Australia have revealed that 25 million years ago, Obdurodon insignis — an ancient, larger, toothed ancestor of the modern platypus (Ornithorhynchus anatinus) — thrived alongside freshwater dolphins and other now-extinct species in verdant inland lakes.
An artist’s impression of the approximately 25-million-year-old fossil platypus and its surroundings. Image credit: Gen Conway, Flinders University Institute of Paleontology
“The platypus is extremely rare in the fossil record, mostly limited to tooth remains, making the discovery of new fossils significant for understanding this unique mammal,” stated Flinders University palaeontologist Dr. Aaron Camens.
First described in 1975, Obdurodon insignis inhabited the vast permanent lakes, slow-flowing rivers, and forested lowlands of central Australia during the late Oligocene, approximately 25 million years ago.
This species notably differs from today’s platypuses, possessing fully formed molars and premolars, unlike modern platypuses, which lose their vestigial teeth shortly after hatching.
Previously, Obdurodon insignis was known only from limited remains, including one and a half molars and fragments of the jaw and pelvis. However, this recent find includes one of the few well-preserved fossils of a related younger species, Obdurodon dicksoni, identified in 1992.
While Obdurodon dicksoni resembled modern platypuses, it had a slightly larger skull and a stronger bite.
“The new material of Obdurodon insignis includes the first premolars, the key teeth located in front of the molars,” said Dr. Camens. “This species had large, pointed front teeth and formidable molars capable of crushing shelled animals, such as yabbies.”
Dr. Trevor Worthy, also from Flinders University, highlighted an intriguing discovery: the scapulocoracoid, a bone that supports the arms and forelimbs. “This finding indicates that the limb structure closely resembles that of modern platypuses, suggesting ancient platypuses were adept swimmers like their modern relatives,” he noted.
“These fossils, dating back 25 million years, provide a glimpse of an ancient platypus that was larger than its modern counterpart and, unlike it, possessed teeth.”
Research indicates that during this period, dense forests nurtured diverse communities of arboreal mammals, including koalas and various possum species.
On the forest floor, a sheep-sized marsupial coexisted with numerous other species, including familiar lizards, frogs, and small carnivorous marsupials.
These ancient trees also hosted a variety of birds, including the impressive eagle Archaehierax.
The ancient lakes teemed with lungfish and other small fish, while several species of waterfowl, cormorants, and flamingos thrived along the shores, feeding on fish, plants, and small invertebrates.
Interestingly, these freshwater ecosystems were also home to small dolphins, with their teeth and bones discovered at several fossil locations, revealing signs of this diverse ancient community.
“This rich environment was where the ancient toothed platypus lived 25 million years ago, before its remains settled into the lake’s depths,” explained Dr. Jen Conway, also from Flinders University.
This remarkable discovery is detailed in the latest issue of Australian Zoologist.
_____
Trevor H. Worthy et al. 2026. New material for the toothed platypus Obdurodon insignis (Monotremata: Ornithorhynchidae) from the Late Oligocene fauna of Pimpa, Billeroo Creek, South Australia. Australian Zoologist 45 (1): AZ26011; doi: 10.1071/AZ26011
Honor’s Humanoid Robot Triumphs at the 2026 Beijing E-Town Humanoid Robot Half Marathon
Lintao Zhang/Getty Images
Recently, Sebastian Thor set a new world record with a sub-2 hour marathon. However, he is not the only athlete pushing boundaries. On April 19th, a humanoid robot from Chinese tech giant Honor shattered the human half marathon record. Additionally, Unitree’s robot is remarkably close to matching human speed in the 100-meter sprint. These advancements prompt two critical inquiries: How fast can humanoid robots run, and what purpose do they serve at such high speeds?
The inaugural Beijing E-Town Half Marathon and Humanoid Robot Half Marathon took place in 2025. This year’s edition witnessed a near fivefold increase in participating robot teams—over 100 teams fielded more than 300 humanoid robots. In 2025, the fastest half marathon time recorded for an autonomous robot was 2 hours and 40 minutes; this year, it was significantly reduced to just over 50 minutes.
Meanwhile, robot manufacturer Unitree has announced that its bipedal H1 model reached an astonishing speed of 10.1 meters per second. For comparison, Usain Bolt’s record-breaking 100-meter dash requires an average speed of 10.44 m/s, indicating that the human record is clearly within reach.
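The gap between robot and human can be checked directly from the figures above. A minimal sketch, using the reported 10.1 m/s robot speed and the 9.58-second 100-meter world record:

```python
# Comparing the Unitree H1's reported top speed with the average speed
# implied by Usain Bolt's 100 m world record of 9.58 seconds.
robot_speed = 10.1       # m/s, as reported by Unitree
bolt_avg = 100 / 9.58    # ~10.44 m/s, Bolt's average over the full race

gap = bolt_avg - robot_speed
print(f"Bolt average: {bolt_avg:.2f} m/s, gap: {gap:.2f} m/s")
```

The robot trails Bolt's race average by about a third of a meter per second, which is why the human record looks within reach. (Bolt's peak speed mid-race was higher than his average, so matching the average is not quite the same as matching Bolt.)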
Several factors contribute to the swift improvements in mobile robotics, according to Petar Kormushev at Imperial College London. Reduced component prices, alongside advances in more powerful and efficient motors, have bolstered robot responsiveness and agility. Furthermore, faster computer chips utilize less power while enabling more sophisticated control algorithms. Enhanced communication and more compact, accurate sensors have also played a role in this evolution.
However, if speed is the primary objective, mimicking human design may not be the optimal approach. “Humans are not efficiently designed for running, as it’s not a survival necessity,” explains Behnam Dadashzadeh from Bournemouth University, UK. In fact, research indicates that robots emulating an emu’s running style can be up to 300% more efficient than those designed with human features.
Dadashzadeh remains uncertain whether advances in mobile robotics will directly benefit the environments where humanoid robots are expected to operate. If speed is essential, “you can always simply equip it with wheels,” he suggests.
While the demand for mobile robots may not be evident, competitions serve as excellent testing grounds. Kormushev notes, “These events act as stress tests for hardware, requiring high torque over extended periods, which can lead to overheating.” He compares it to car manufacturers participating in rigorous rally competitions, which demonstrate their ability to produce durable products. Both Unitree and Honor declined to comment to New Scientist on their motivations.
However, such competitions can lead to designs that may not be well-suited for practical applications. Kormushev points out that robots showcased in running events often lack functioning limbs or even heads, with large hip joints optimized purely for straight-line speed. “If lateral movement is required, these robots will struggle, as their design favors forward motion at the expense of versatility,” he explains.
That doesn’t mean humanoid robots, as they become more capable and affordable, will lack utility. Human-like robots hold several advantages in spaces designed for humans, including the ability to operate door handles, ascend stairs, navigate furniture, and use tools.
So, how fast can humanoid robots ultimately go? Dadashzadeh suspects that the upper limits for human-like robots are likely already established. He predicts that while they may surpass human records, the difference will be slight. “They’ll be close, but robots will likely be just marginally faster,” he states.
Diagram of T Cells Attacking Proliferating Cancer Cells
Location South/Alamy
Our immune system constantly targets cells considered threats. However, when rogue immune cells attack our own body, it leads to autoimmune disorders. Current treatments can suppress these attacks but don’t prevent them. Fortunately, innovative therapies that eliminate rogue cells are proving to be highly effective.
According to Reuben Benjamin from King’s College London, “All major pharmaceutical companies are now investing in this approach.” He notes that there are numerous clinical trials in progress, with potential approvals as early as next year, as these new treatments show improved efficacy over existing options.
The cornerstone of these advances is the use of genetically engineered CAR T cells. These specialized T cells typically destroy cells infected by bacteria or viruses. For this innovative therapy, T cells are extracted from patients, engineered to target specific cells, and reintroduced into their bodies.
Initially developed for cancer treatment, CAR T cells have shown remarkable success in curing individuals when all other therapies have failed. However, they primarily work against blood cancers like leukemia and come with significant side effects, including brain inflammation that can impact language and behavior.
Cancer results from mutated cells that evade normal growth controls; autoimmune conditions, analogously, occur when rogue immune cells mistakenly attack healthy tissues. For instance, type 1 diabetes arises from immune assaults on insulin-producing pancreatic cells, and multiple sclerosis results from attacks on the myelin sheath surrounding nerves. “The list of autoimmune diseases is extensive,” states Benjamin.
After exposure to infection, the body generates new immune cells, which undergo screening to eliminate self-reacting cells. However, in some cases, this process fails, allowing rogue cells to persist indefinitely.
Recent research has confirmed that mutations in critical genes may hamper this screening process, preventing self-destructive mechanisms. Essentially, autoimmune disorders resemble cancer more closely than previously understood.
Despite this similarity, it remains ambiguous whether CAR T cell therapies effective against cancer would also work for autoimmune diseases. A significant challenge lies in differentiating rogue immune cells from normal ones, necessitating the destruction of most antibody-producing cells—not solely the rogue ones.
In cancer patients, CAR T cells can persist for years; however, for autoimmune conditions, they might compromise immunity. Yet, using CAR T cells for autoimmune disorders has shown promising results, with less risk of severe side effects compared to cancer treatments.
“Miraculous” Results
Five years ago, Fabian Müller and his team at Erlangen University Hospital in Germany began treating lupus patients with CAR T cells. “The initial patients were critically ill and would have perished without treatment,” Müller noted. To their astonishment, CAR T cells effectively diminished symptoms, allowing the immune system to recover.
Müller attributes this success to the intact immune systems of autoimmune patients, which recognize CAR T cells as foreign and eliminate them after a few months, unlike the immune systems of patients weakened by cancer.
Despite potential risks, treatments show promise in managing autoimmune disorders. While many patients with cancer experience severe side effects, lupus patients treated with CAR T cells often do not. “It’s genuinely a miracle,” Müller remarked, anticipating this therapy might extend beyond the most severe cases.
He presents three reasons for these unexpected outcomes: CAR T cells destroy fewer cells in autoimmune patients than in advanced cancer patients; the quality of CAR T cells in autoimmune cases is generally higher; and overactive immune responses in cancer patients may lead to excessive CAR T cell reactions.
Globally, hundreds of patients are currently being treated with CAR T cells for autoimmune diseases. Although definitive trial results are pending, initial reports indicate significant effectiveness against disorders like lupus, myasthenia gravis, and ulcerative colitis. While doctors remain cautious not to label it a “cure,” successful elimination of rogue cells holds the possibility of long-lasting relief.
According to Benjamin, “In the world of cancer, we often wait five years to discuss potential cures. In the realm of autoimmunity, the answer is still unclear.” Yet, if rogue cells emerge again, re-treatment could be an option.
Nonetheless, caution is advised. The damage inflicted by self-targeting immune cells may not be reversible, and not all rogue cells are easily identified or targeted. Furthermore, the cost of CAR T cell therapy poses a significant hurdle for widespread adoption—harvesting, modifying, and reinfusing personal cells can be prohibitively expensive.
On a positive note, researchers are developing so-called off-the-shelf CAR T-cell therapies for broader application. These use donor T cells to treat multiple patients, which could address some logistical challenges posed by current methods. While commercially available CAR T cells have had limited success in cancer treatment, their stability in autoimmune patients could offer advantages.
Additionally, innovations in “in-vivo CAR T cells” create CAR T cells within a patient’s body rather than in a lab, streamlining production and potentially reducing costs. “We’re genuinely excited about this,” Benjamin concludes.
While challenges may emerge down the line, the progress made in using CAR T cells for autoimmune disorders surpasses expectations from just five years ago. This development is promising news for the approximately one in ten individuals affected by such conditions.
Australia is experiencing a rapid increase in home battery storage installations as households rush to capitalize on government subsidies. My home is among those making the switch. Recently, we joined the 300,000 households that have already installed subsidized batteries, enabling us to power our house and electric vehicle at minimal cost.
My fascination with household batteries began when I first wrote about them for New Scientist. They store energy produced by rooftop solar panels, making it possible to use clean energy for daily needs or to charge an electric vehicle. By generating and retaining energy on-site, we significantly lower our carbon footprint and safeguard against volatile energy prices, especially those heightened by current global events, such as the disputes surrounding the Strait of Hormuz.
Initially, household batteries seemed unattainable due to their steep price. However, recent reductions in cost have made them increasingly affordable, even without government support.
To afford home ownership in the first place, my family and I relocated from high-cost Sydney to a regional city about 400 kilometers north, where we purchased a modest but sturdy 1970s brick home. We installed 13 solar panels on our roof, totaling 6.6 kilowatts (the most common system size in Australia) to maximize our solar energy collection, and acquired a reasonably priced electric vehicle.
Thanks to our solar panels, we now enjoy free electricity for appliances like dishwashers, washing machines, refrigerators, and induction cooktops on sunny days. However, when the sun sets or during gloomy weather, we revert to grid power. Charging our car overnight or running air conditioning during heatwaves incurs costs.
We contemplated purchasing a standard 15-kilowatt-hour domestic battery to store solar energy for night-time use but were daunted by the AU$20,000 (£10,000) price tag. With the government’s subsidy program launched in July 2025, costs dropped to AU$13,000 (£6,500), making it more feasible, especially since repayment could be done interest-free over five years.
The battery system we opted for comprises three lithium iron phosphate (LiFePO₄) packs housed in an aesthetically pleasing white box, located on a shaded wall outside our home to prevent overheating.
Since installation, our electricity bills have plummeted to approximately AU$25 (£13) monthly, even with continual overnight air conditioning use. This is a stark contrast to the average AU$300 (£150) monthly expenditure prior to the installation of our solar panels and batteries, although we still pay a minor fixed supply charge covering meter readings and grid maintenance.
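Those figures imply a rough payback period for the system. A minimal sketch, assuming the bill drop from AU$300 to AU$25 a month is attributable to the whole solar-plus-battery setup (so this is an optimistic estimate for the battery alone):

```python
# Rough payback estimate using the article's figures: a subsidised
# AU$13,000 battery, and monthly bills that fell from ~AU$300 to ~AU$25.
battery_cost = 13_000        # AU$, after the government subsidy
monthly_saving = 300 - 25    # AU$, bill before minus bill after

payback_months = battery_cost / monthly_saving
print(f"Payback: {payback_months:.0f} months (~{payback_months / 12:.1f} years)")
```

At roughly 47 months, the outlay is recouped within the five-year interest-free repayment window, which is part of why the subsidized price felt feasible.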
To further curtail expenses, we recently joined a virtual power plant initiative that connects thousands of households with home batteries, allowing participants to share excess energy. Those who contribute can earn up to AU$300 (£150) annually for sharing this energy, potentially eliminating electricity costs.
Our batteries also shielded us from the fuel price spikes that followed the Strait of Hormuz closure, when Australian petrol costs surged by 40 percent; our electric vehicle remained nearly free to run throughout. Additionally, our batteries ensured we had power during a recent hour-long outage in our neighborhood.
While not everyone can afford a home battery or even homeownership, there are initiatives for making this green technology accessible. For instance, in South Australia, residents of public housing managed by the government can apply for free solar panel and battery installation to lessen their energy expenses.
Concerns regarding household battery safety are common, but the risk appears low. A study from Germany found that household batteries are less likely to catch fire than homes are to suffer an ordinary residential fire, with a risk comparable to that of clothes-dryer fires. Some battery chemistries, particularly lithium iron phosphate, are considered safer than others.
The Australian government’s home battery storage initiative has proved so popular that an additional AU$5 billion (approx. £2.65 billion) is being sought to expand the program, although individual subsidies may decrease next month to spread the funding more broadly. The goal is 2 million storage batteries installed in Australia by 2030.
In contrast, the UK’s adoption of this technology has been sluggish, with only 20,000 household batteries installed annually. However, recent government announcements hint at the impending rollout of new subsidies to boost uptake.
While some technologies I’ve discussed in the past have seemed outlandish—like massage-giving robots or tiny cars driven by rats—home batteries exemplify tangible, groundbreaking tech. It’s gratifying to witness their rise in popularity and to benefit from this innovation ourselves. With the considerable savings we’ve gained, we could even afford a robot massage therapist someday!
The colossal soft-bodied cephalopod, reaching lengths of up to 19 meters (62 feet), once rivaled, and likely preyed upon, the most ferocious reptiles of the Cretaceous seas, according to a groundbreaking study led by paleontologists at Hokkaido University.
Artist’s impression of an ancient giant octopus. Image provided by: Yohei Utsugi, Hokkaido University
Marine ecosystems were long thought to have been dominated, for hundreds of millions of years, by large vertebrate apex predators, with invertebrates relegated to minor prey roles.
However, unlike their shelled counterparts, octopuses have carved out a unique evolutionary path.
These fascinating creatures have evolved soft bodies, which allow for remarkable mobility, vision, and intelligence.
Some octopus species have grown to enormous sizes, serving as apex predators, yet their precise ecological roles have remained unclear due to limited fossil records.
“Our discoveries suggest that the earliest octopuses were giant predators at the apex of the marine food chain during the Cretaceous period,” stated paleontologist Professor Yasuhiro Iba from Hokkaido University.
“Based on exceptionally preserved jaw fossils, we determined that these animals may have reached nearly 19 meters in total length, surpassing the size of modern large marine reptiles.”
“The most astonishing finding was the extent of wear on the jaws.”
This wear, indicative of biting into hard prey, leaves distinctive marks similar to those found in contemporary shell-crushing cephalopods. Measurements of octopus jaws can also be used to estimate overall body size.
In the study, Professor Iba and colleagues documented evident signs of wear on 15 large jaw fossils of ancient octopus relatives previously collected from Cretaceous deposits in Japan and Vancouver Island.
Moreover, through digital fossil mining techniques, they uncovered 12 additional finned octopus jaws entombed in Cretaceous rocks in Japan.
The analysis identified two species: Nanaimoteuthis jeletzkyi and Nanaimoteuthis hagarti.
One of these, the finned octopus Nanaimoteuthis hagarti, reached exceptional sizes of 7 to 19 meters (23 to 62 feet), comparable to the giant marine reptiles of its day, and may represent the largest described invertebrate to date.
Additionally, the jaws of the largest specimens exhibit considerable wear, with the once sharp features of smaller juveniles dulled and rounded over time.
The wear patterns indicate that these creatures were active carnivores, routinely crushing hard shells and bones with powerful bites.
They used their long, flexible arms to capture prey while skillfully dissecting it with their strong beaks—behaviors associated with advanced intelligence.
“This study presents the first direct evidence that invertebrates can evolve into large, intelligent apex predators in an ecosystem largely dominated by vertebrates for approximately 400 million years,” Professor Iba noted.
“Our findings indicate that robust jaws and the absence of an external skeleton, a characteristic common to both octopuses and marine vertebrates, were crucial for their evolution into large, intelligent marine predators.”
These findings were published online on April 23, 2026, in the journal Science.
_____
Arata Ikegami et al. 2026. The earliest octopuses were giant top predators of the Cretaceous oceans. Science 392 (6796): 406-410; doi: 10.1126/science.aea6285
Have you ever considered how geologists uncover the mysteries of ancient beaches and shallow oceans? From identifying paleoenvironments and ancient fauna to understanding seasonal weather patterns and the preservation of ancient landscapes, the key lies in microbial mats—structures formed by sand particles bound together by microorganisms.
Planar microbial mats with ripple marks in Cambrian (left) and modern (right) tidal flats. Image credit: Nora Noffke.
Often underestimated as mere “pond scum,” this concoction of microorganisms and sand is crucial for preserving sedimentological evidence from the ancient world.
At the notable Cambrian Blackberry Hill Site in Wisconsin, USA, these microbial mats have transformed ancient tidal flats preserved in quartz sandstone into remarkable microfossil hubs.
By binding loose sand particles, these microbial mats create stable surfaces that capture delicate imprints of soft-bodied organisms, including jellyfish, the ancient grazing trails of mollusks, the enigmatic footprints of long-extinct arthropods, and other signs of animal activity.
Without this microbial adhesive, the shifting tides and storms of the Cambrian inland sea would have erased trails and tracks as swiftly as modern-day sandcastles vanish on beaches.
The findings at Blackberry Hill offer crucial insights into why animals inhabited tidal flats around 500 million years ago.
Among these was the euthycarcinoid known as Mosineia, one of the first animals to walk on land, whose trackways (Protichnites) have fascinated scientists for over 150 years.
This research has also revealed that arthropods likely possessed increased agility compared to sluggish mollusks, taking advantage of the upper intertidal zone to seek alternative food sources and possibly scavenging.
Large, slug-like mollusks, approximately the size of an adult’s foot and occasionally growing to a meter in length, grazed on these microbial mats, creating trails termed Climactichnites. This behavior supports the theory that the “scum” served as a primary food source that attracted marine life to the intertidal zone (the area between high and low tides along coastlines).
Microbial mats played a pivotal role in the rapid preservation of trace fossils, shielding them from currents and storm surges and thus preserving records of animal activity, including the “death traces” (mortichnia) of animals struggling to survive in dynamic environments.
Moreover, microbial mats preserved in geological formations provide vital clues regarding environmental conditions and events, such as seasonal variations and sudden storms. Ancient storm activities are evidenced by meter-scale fragments of thick microbial mats that have been torn and flipped. Only thinner, more adaptable mats can successfully record these traces.
Large overturned fragments of microbial mats deposited on Cambrian (left) and modern (right) beaches. Note the Climactichnites trails near the scale in the photo on the left. Image credit: Nora Noffke.
Fragments of these mats were also preserved as chips, particularly toward the end of the growing season, when the mats began to disintegrate.
Microbial mat chips scattered in Cambrian (left) and modern (right) tidal flats. Image credit: Nora Noffke.
“Today, extensive microbial mat systems flourish in tidal flats and lagoons along the coastlines of Earth’s oceans,” stated lead author Nora Noffke, a professor at Old Dominion University. This study was published in the journal Palaios.
“These modern mats thrive under the same conditions seen on Blackberry Hill.”
“Without these essential microbial mats, our understanding of life and earth events through the ages would remain largely hidden from the relentless forces of ancient currents, waves, and time.”
_____
N. Noffke & KC Gass. 2026. Microbial mat fauna of a Cambrian tidal flat and its implications for the trace fossil record (Elk Mound Group, Wisconsin, USA). Palaios pp. 74-90; doi: 10.2110/palo.2025.042
A recent study reveals that infrasound (very low frequency sounds below 20 Hz) can elevate cortisol levels and increase irritability, providing a scientific rationale for why certain “haunted” locations may evoke feelings of discomfort.
Research by Scatterati et al. has shown that infrasound can evoke irritation and aversion in humans through a combination of self-reports and biological measures, also suggesting a link to increased negative emotional assessments.
Infrasound is defined as sound waves with frequencies below 20 Hz, which can arise naturally from sources such as tectonic shifts, volcanic eruptions, convective storms, and interactions between air and water.
Additionally, infrasound is commonly found in urban settings, particularly near ventilation systems, air conditioning units, low-noise piping, traffic, and various mechanical systems.
Exploratory field recordings have detected infrasound energy linked to urban sound environments and live music events.
Professor Rodney Schmaltz from MacEwan University states, “Infrasound is widespread in everyday surroundings, frequently emanating from ventilation apparatus, transit systems, and industrial machinery.”
“Many individuals are unknowingly subjected to these sounds. Our research indicates that even brief exposure can modify mood and elevate cortisol levels. Understanding the effects of infrasound in real-world contexts is crucial.”
The study involved 36 participants who sat alone in a room while either soothing or anxiety-inducing music played.
For half the participants, an inconspicuous subwoofer emitted infrasound at 18 Hz. Afterward, they were asked to report their feelings, emotional evaluations of the music, and whether they suspected infrasound was present. Saliva samples were collected before and after the listening session.
Results showed that cortisol levels in saliva were elevated in participants exposed to infrasound.
These individuals also reported increased irritability, reduced interest, and a perception of the music as sadder, despite not being aware of the infrasound presence.
“This study indicates that the body can react to infrasound even when it goes consciously unheard,” Schmaltz explains.
“Participants struggled to accurately identify the presence of infrasound, and their beliefs regarding it had no discernible impact on cortisol levels or mood.”
“When feeling irritable or stressed, cortisol levels naturally rise as part of the body’s stress response, establishing a connection between increased irritability and elevated cortisol,” notes Kale Scatterati, a graduate student at the University of Alberta.
“However, exposure to infrasound influenced both outcomes beyond their normal correlation.”
These findings suggest that the human body responds to infrasound even when people fail to consciously recognize it, though the underlying mechanism remains unclear.
Furthermore, this research points to the need for further exploration into whether long-term infrasound exposure could affect health due to consistently heightened cortisol levels and mood disturbances associated with increased irritability.
Professor Trevor Hamilton from MacEwan University commented, “Increased cortisol levels trigger alertness and help the body respond to potential stressors.”
“This is an evolutionarily beneficial response but chronic cortisol release can lead to various physiological issues and adversely impact mental health.”
The findings of this research are published in the journal Frontiers in Behavioral Neuroscience.
_____
Kale R. Scatterati et al. 2026. Exposure to infrasound is associated with aversion, negative evaluation, and elevated salivary cortisol in humans. Frontiers in Behavioral Neuroscience 20; doi: 10.3389/fnbeh.2026.1729876
In a well-known anecdote, Henry Ford regarded it as economically inefficient for car parts to outlast the vehicles themselves. He reportedly dispatched employees to junkyards to identify components that were overly durable, paving the way for cheaper material alternatives in future models.
Notably, as of this month, the average price of a new EV in the UK is less than that of a new petrol car, which is encouraging news for consumers. And in a twist that might unsettle Ford’s philosophy, recent findings suggest that EV batteries are far more durable than once assumed: they can outlast the vehicles themselves, meaning used EVs may represent better value than previously perceived.
“The average new EV sold in the UK costs less than the average new petrol car”
This shift is making EVs, both new and used, increasingly attractive on the market. Most EVs sit parked, and often plugged in, for nearly 23 hours a day. Plans are underway to leverage these batteries to temporarily store surplus power for the grid, with grid operators compensating EV owners when that stored power is fed back. Although the concept isn’t new, recent pilot programs in the U.S. demonstrate its potential profitability: average EV drivers could earn thousands each year.
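The "thousands each year" claim can be sanity-checked with a simple model. Every parameter below is an illustrative assumption for the sketch, not a figure from the article:

```python
# Hypothetical vehicle-to-grid earnings sketch. All parameters are
# illustrative assumptions chosen for the example, not reported figures.
exported_kwh_per_day = 20   # kWh fed back to the grid on a typical day
price_spread = 0.35         # $ earned per kWh (peak sell price minus off-peak charge cost)
days_per_year = 300         # days the car is plugged in and participating

annual_earnings = exported_kwh_per_day * price_spread * days_per_year
print(f"Annual earnings: ${annual_earnings:,.0f}")  # → $2,100
```

Under these assumptions, an owner clears about $2,000 a year; the result scales linearly with each parameter, so a smaller price spread or fewer participation days quickly drops it into the hundreds.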
It seems that economic incentives, rather than climate concern alone, may spur the green transition. The current fuel crisis driven by the Iran war could also accelerate EV adoption by making fuel-burning cars costlier to run.
The EV industry still faces challenges, but even with growth slowing, the path forward looks clearer than before.
George MacKay (Nick, left) and Callum Turner (Liam) embark on a haunting journey home.
Credit: Ian Kingsnorth, BFI
The Rose of Nevada, directed by Mark Jenkin. Now showing in UK cinemas; US release on June 19th.
Time functions oddly in desolate locations—this is the central theme of The Rose of Nevada, directed by Mark Jenkin. The film unfolds in a once-thriving fishing village in Cornwall, England, now bereft of its inhabitants and industry. Only a few souls linger amongst the deserted pubs and derelict moorings, remnants of a once-prosperous past. The sea itself feels lifeless.
What more fitting backdrop for a ghost story than a ghost town? After all, aren’t ghosts merely echoes of time? This village exists outside conventional chronology. Three decades ago, a fishing boat and its ill-fated crew vanished at sea, and their absence still resonates throughout the village. The tragedy remains unresolved until the vividly colored Rose of Nevada reappears in the harbor.
For Nick (George MacKay), the boat’s return comes at a crucial juncture. A husband and father struggling to stay afloat, he sees the arrival of the Rose of Nevada as fortuitous. Liam (Callum Turner), an itinerant worker who sleeps on the docks, also finds hope when he is recruited by Mike (Edward Lowe), the boat’s owner. They are guided by the seasoned captain Margie (Frances McGee), who has a mysterious connection to the once-missing vessel.
Together, the trio sets sail, hoping to catch enough fish to repair Nick’s leaking roof, pad Liam’s pockets, and perhaps save their village. But when the Rose of Nevada returns to shore, something feels amiss. Time has looped back 30 years, and the two men are mistaken for locals: Liam for Alan, an absentee father who vanished with the Rose of Nevada, and Nick for a fisherman haunted by guilt at having missed work on that tragic day.
“What better place to set a ghost story than a ghost town? After all, what is a ghost if not a coincidence of time?”
The implications of this twist, and the narrative that follows, are open to interpretation. The emotional weight of The Rose of Nevada is palpable, particularly in MacKay’s compelling portrayal of Nick. Navigating his past, Nick finds comfort in the empty house of long-lost neighbors who mistake him for their son, and uncovers a loving note from his wife written on the day he left.
The film’s mesmerizing sequences at sea offer tension and tranquility. For Nick, fishing transcends mere survival—it’s a recurring rhythm that brings him clarity amidst chaos. The quest is not just for Nick but for the entire village waiting for his return.
The Rose of Nevada marks the third installment of Jenkin’s Cornwall trilogy. The first, Bait, explores the negative impact of tourism on a coastal community, while the second, Enys Men, follows a solitary wildlife volunteer grappling with hallucinations on a remote island. This film encapsulates themes from both predecessors with its stunning visuals and haunting setting. The hand-cranked Bolex camera used in the earlier films gave out just as this latest installment reached completion, a poignant end to Jenkin’s trilogy.
If this marks the conclusion of Jenkin’s journey, it will be a bittersweet farewell. His unique storytelling deserves further exploration within this picturesque Cornish setting. The Rose of Nevada stands as a significant achievement—an unforgettable tale of lost time, borrowed moments, and the quest for redemption.
George MacKay’s recent performances, especially in Femme, demonstrate his versatility. In that gripping thriller, he plays Preston, a man embroiled in a violent act that comes back to haunt him when he unexpectedly reunites with Jules (Nathan Stewart-Jarrett) at a gay sauna.
Bethan Ackerley is the associate editor at New Scientist. Her interests span science fiction, comedy, and the supernatural. Follow her on Twitter @inkerley
The coral reefs of the Houtman Abrolhos Islands, located off the coast of Western Australia, have shown remarkable resilience against the severe heatwave that impacted coral ecosystems globally in early 2025. Researchers are eager to unveil the secrets behind the extraordinary heat tolerance of these corals, hoping to aid in the preservation of coral reefs worldwide, which face extinction due to climate change.
Under the guidance of Dr. Kate Quigley, a team from the University of Western Australia ventured to 11 dive sites in the Houtman Abrolhos Islands during July 2025.
In contrast, up to 60% of the corals on Ningaloo Reef succumbed to the same heatwave. This trend reflects a pattern observed in coral reefs globally, as the marine heatwave of 2025 resulted in disastrous coral mortality rates.
However, at Houtman Abrolhos, aside from a few minor patches, the corals showed no signs of distress, not even the fluorescent coloring that typically accompanies stress. “We anticipated a massive bleaching event following the prolonged marine heatwave. Surprisingly, the corals thrived,” stated Quigley.
Coral bleaching typically occurs due to prolonged thermal stress, where corals expel the symbiotic algae living within them, which are crucial for their sustenance.
Researchers quantify the heat stress experienced by corals using the Degree Heating Week (DHW) metric, which captures both the intensity and the duration of a heatwave.
Significant bleaching is generally observed above 4 °C-weeks, with catastrophic conditions arising beyond 8 °C-weeks. “Around 8 °C-weeks is deemed disastrous and is often linked to widespread bleaching and coral mortality,” explained Quigley.
The waters around the Houtman Abrolhos Islands reached 4 °C-weeks in early February 2025 and 8 °C-weeks by early March. By mid-April, the corals had been subjected to heat stress equivalent to 22 °C-weeks.
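For concreteness, a NOAA-style DHW calculation accumulates daily sea-surface-temperature anomalies of at least 1 °C above a site’s climatological maximum monthly mean (MMM) over a rolling 12-week window, dividing by 7 to convert daily degrees into °C-weeks. A minimal sketch, assuming a daily SST series and a known MMM (both illustrative inputs, not the study’s data):

```python
# Rough sketch of a NOAA-style Degree Heating Weeks (DHW) calculation,
# expressed in °C-weeks. Assumed inputs: a daily sea-surface-temperature
# (SST) series and the site's maximum monthly mean (MMM) climatology.

def degree_heating_weeks(daily_sst, mmm, window_days=84):
    """DHW for the final day of the series, over a 12-week rolling window."""
    recent = daily_sst[-window_days:]
    # Only anomalies of at least 1 °C above the MMM count as thermal
    # stress ("HotSpots"); milder days contribute nothing.
    hotspots = [t - mmm for t in recent if t - mmm >= 1.0]
    # Daily °C values accumulate; dividing by 7 converts the sum to °C-weeks.
    return sum(hotspots) / 7.0

# Example: 28 days at 2 °C above the MMM gives 28 * 2 / 7 = 8 °C-weeks,
# the level the article associates with widespread bleaching.
sst = [27.0] * 56 + [29.0] * 28   # assumed MMM of 27 °C
print(degree_heating_weeks(sst, mmm=27.0))  # 8.0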
Quigley and her team were particularly astonished to observe that corals of various species at the reef remained unharmed despite the devastating conditions affecting other regions.
To further investigate the heat resistance of Houtman Abrolhos corals, scientists collected several coral colonies and subjected them to elevated temperatures in laboratory settings.
At 8 °C-weeks, survival rates at Houtman Abrolhos were double, and bleaching resistance nearly quadruple, what established thresholds would predict. Nearly 100% survival was recorded even at approximately 16 °C-weeks.
The maximum tolerance level of these corals remains to be fully determined, but Quigley asserts it is “remarkably substantial and exceeds the thresholds recorded at other coral reef locations studied globally.”
The next phase for researchers is to discern how these corals manage to thrive in such extreme conditions.
Quigley posits that the presence of symbiotic algae could be key to the heat resilience seen in Houtman Abrolhos corals. “There are likely unique environmental conditions in this area that promote heat tolerance evolution among local species,” she stated. For this reason, protecting these reefs should be a top priority, along with identifying other resilient reefs.
Petra Lundgren from the Great Barrier Reef Foundation mentions that such reefs serve as “natural laboratories for understanding heat tolerance.”
“They also promise insights into enhancing selective breeding and interventions aimed at bolstering thermal resilience in coral restoration and conservation aquaculture,” Lundgren noted.
While curbing global carbon emissions is crucial for safeguarding these vital ecosystems, “providing adaptive support, such as seeding reefs with heat-tolerant corals, will significantly improve their chances of surviving future heat stress events,” she concluded.
The Dark Energy Camera, an advanced 570-megapixel imaging device mounted on NSF’s Victor M. Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory in Chile, has captured the stunning Sombrero Galaxy in unparalleled detail. This image reveals a faint stream of stars and a radiant halo, hinting at a dynamic history shaped by galaxy mergers.
The DECam image of the Sombrero Galaxy, a target of interest for both amateur astronomers and scientific research. Image credit: CTIO / NOIRLab / DOE / NSF / AURA / D. de Martin and M. Zamani, NSF’s NOIRLab.
The Sombrero Galaxy, located about 28 million light-years away in the constellation Virgo, is a striking astronomical object.
Also known as Messier 104, M104, or NGC 4594, the galaxy was first discovered by French astronomer Pierre Méchain on May 11, 1781.
With a diameter of approximately 49,000 light-years—about half that of the Milky Way—the Sombrero Galaxy possesses a distinct structure.
This galaxy uniquely combines characteristics of both spiral and elliptical galaxies, featuring prominent disks and spiral arms alongside a large, luminous central bulge that gives it a hybrid appearance.
The view shows the Sombrero Galaxy nearly edge-on, from just six degrees off the plane of its disk, highlighting its dark, dusty lanes.
“The Sombrero Galaxy is a galactic masterpiece that captivates both scientists and astronomy enthusiasts,” stated NOIRLab astronomers.
“Its complex globular cluster system offers insights into star populations, and astronomers are particularly interested in the supermassive black hole at its core.”
“The galaxy’s unique visual features and relative brightness make it a favorite among amateur stargazers.”
“The rich discovery history, involving three renowned astronomers, has cemented its place among the most significant deep-sky objects.”
“Today, the Sombrero Galaxy stands as one of the most iconic celestial bodies visible in the night sky.”
The latest high-resolution image of the Sombrero Galaxy was captured using the advanced DECam instrument.
“DECam’s outstanding resolution highlights the remarkable features of the Sombrero Galaxy,” the astronomers noted.
“At its center lies a brilliant core, encircled by approximately 2,000 globular clusters.”
A dark band of cold dust and hydrogen gas outlines the disk where star formation predominantly occurs.
This image also accentuates the galaxy’s vast glowing halo, which appears to extend over three times the width of the sombrero itself.
“This could be the first time such a halo has been captured in such detail and scale.”
DECam’s exceptional sensitivity has also revealed a vast stream of stars extending from the southern region of the galaxy.
These halos and stellar streams are composed of stars torn from their original galaxies, suggesting a history of galactic mergers involving the Sombrero and smaller partner galaxies.
A recent study on bee vision reveals that their capacity to differentiate quantities goes beyond simple visual patterns, indicating authentic numerical cognition influenced by their distinct brain perceptions.
To understand the mechanisms behind animal cognition, it is essential to adopt experimental designs that respect the biological and perceptual limitations of the species being studied. Zanon et al. addressed the ongoing debate around spatial frequency confounds in numerical cognition research using honeybees (Apis mellifera) as a model system. Image credit: PollyDot.
In this study, researchers from Monash University, including Scarlett Howard, revisited previous critiques of bee intelligence, considering the unique sensory and perceptual limitations bees possess.
By evaluating experimental stimuli from a biologically relevant standpoint, the researchers demonstrated that previous critiques suggesting bees are merely sensitive to visual cues like spatial frequency are unfounded.
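The confound at issue can be made concrete: in many numerosity experiments, arrays with more dots also carry more total area, edge, or spatial detail, so an animal could pass the task without any sense of number. A minimal illustrative sketch (not the study’s stimulus code) of one standard control, matching cumulative dot area across numerosities:

```python
import math

# Illustrative numerosity-stimulus control (not the study's code): give
# every array the same cumulative dot area, so total surface cannot stand
# in for number. total_area is an arbitrary assumed value (pixels squared).

def dot_radii(n_dots, total_area=1000.0):
    """Radii for n equal dots whose summed area is exactly total_area."""
    area_per_dot = total_area / n_dots
    return [math.sqrt(area_per_dot / math.pi)] * n_dots

# A 4-dot and an 8-dot array end up with identical cumulative area, so an
# animal telling them apart must rely on something beyond total surface.
for n in (4, 8):
    total = sum(math.pi * r ** 2 for r in dot_radii(n))
    print(n, round(total, 6))
```

Real stimulus sets also balance other low-level cues, such as convex hull and spatial frequency, one at a time across conditions; this sketch shows only the area-matching step.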
“These findings emphasize the necessity to eliminate anthropocentric biases in animal research,” stated Dr. Howard.
“In evaluating an animal’s cognitive capabilities, it is crucial to prioritize the animal’s perspective; otherwise, we may miscalculate their abilities.”
“Given that humans perceive the world quite differently from animals, we must refrain from centering human perspectives when researching animal intelligence.”
As the researchers conclude, properly assessing cognitive abilities demands experimental designs that align with the natural sensory capabilities of the target species.
“Neglecting how animals perceive their environment can lead scientists to erroneous conclusions,” remarked Dr. Mirko Zanon from the University of Trento.
“There is an ongoing debate regarding whether bees are genuinely ‘counting’ or merely responding to visual patterns.”
“Our findings indicate that this critique is invalid when considered within the ecological context of the animals.”
“Analyzing stimuli through the lens of how bees perceive their world reveals a genuine sensitivity to numerical concepts.”
“While it may be challenging to envision the world from a bee’s perspective, understanding animal perception is vital for our research,” Dr. Howard stated.
“Bees consistently astonish us with their navigation, their responses to our inquiries, and their decision-making processes.”
For further details, refer to the study published in the April 22 issue of Proceedings of the Royal Society B: Biological Sciences.
_____
Zanon et al. 2026. Matching stimuli: A biology-aligned approach to numerical cognition research. Proc Biol Sci 293 (2069): 20253057; doi: 10.1098/rspb.2025.3057