Artist's representation of qubits in the Quantum Twins simulator. Image credit: Silicon Quantum Computing.
A groundbreaking large-scale quantum simulator has the potential to unveil the mechanisms of exotic quantum materials and pave the way for their optimization in future applications.
Quantum computers are set to leverage unique quantum phenomena to perform calculations that are currently unmanageable for even the most advanced classical computers. Similarly, quantum simulators can aid researchers in accurately modeling materials and molecules that remain poorly understood.
This holds particularly true for superconductors, which conduct electricity with remarkable efficiency. The efficiency of superconductors arises from quantum effects, making it feasible to implement their properties directly in quantum simulators, unlike classical devices that necessitate extensive mathematical transformations.
Michelle Simmons and her team at Australia’s Silicon Quantum Computing have successfully developed the largest quantum simulator to date, known as Quantum Twin. “The scale and precision we’ve achieved with these simulators empower us to address intriguing challenges,” Simmons states. “We are pioneering new materials by crafting them atom by atom.”
The researchers designed multiple simulators by embedding phosphorus atoms into silicon chips. Each atom acts as a quantum bit (qubit), the fundamental component of quantum computers and simulators. The team meticulously configured the qubits into grids that replicate the atomic arrangement found in real materials. Each iteration of the Quantum Twin consisted of a square grid containing 15,000 qubits, surpassing any previous quantum simulator in scale. While similar configurations have been built using thousands of cryogenic atoms in the past, Quantum Twin breaks new ground.
By integrating electronic components into each chip via a precise patterning process, the researchers managed to control the electron properties within the chips. This emulates the electron behavior within simulated materials, crucial for understanding electrical flow. Researchers can manipulate the ease of adding an electron at specific grid points or the “hop” between two points.
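The two knobs described here—the energy cost of adding an electron at a given grid point and the amplitude for it to "hop" between points—are the ingredients of the Fermi-Hubbard model that analog simulators of this kind typically target. The article does not name the model, so the following is the standard textbook form rather than the team's stated Hamiltonian:

$$
H \;=\; -\sum_{\langle i,j\rangle,\sigma} t_{ij}\,\bigl(c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.}\bigr)
\;+\; \sum_{i,\sigma} \epsilon_i\, n_{i\sigma}
\;+\; U \sum_{i} n_{i\uparrow} n_{i\downarrow},
$$

where $t_{ij}$ sets how readily an electron hops between neighboring points $i$ and $j$, $\epsilon_i$ sets the cost of placing an electron on point $i$, and $U$ penalizes two electrons sharing a site.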
Simmons noted that conventional computers struggle with large two-dimensional simulations and with certain combinations of electron properties, scenarios where the Quantum Twin simulator shows significant potential. The team tested the chip by simulating the transition between conducting and insulating states—a key test case for models of how impurities influence a material's conductivity. They also recorded the simulated material's "Hall coefficient" at different temperatures to assess its behavior in magnetic fields.
With its impressive size and variable control, the Quantum Twins simulator is poised to tackle unconventional superconductors. While conventional superconductors function well at low temperatures or under extreme pressure, some can operate under milder conditions. Achieving a deeper understanding of superconductors at ambient temperature and pressure is essential—knowledge that quantum simulators are expected to furnish in the future.
Moreover, Quantum Twins can also facilitate the investigation of interfaces between various metals and polyacetylene-like molecules, holding promise for advancements in drug development and artificial photosynthesis technologies, Simmons highlights.
Frequent sexual fantasies are linked to neuroticism, a personality trait associated with an elevated risk of various physical and mental health issues, according to a study conducted at Michigan State University.
In this study, researchers surveyed over 5,000 American adults, examining their sexual fantasies and personality traits.
The scientists utilized the Big Five personality framework, a standard tool in psychology, to assess openness, conscientiousness, extraversion, agreeableness, and neuroticism.
Findings indicated that individuals scoring high in neuroticism reported fantasizing about sex more frequently than their non-neurotic counterparts.
Neuroticism is characterized by rumination, self-consciousness, and a propensity for negative emotions like anger, anxiety, irritability, stress, and sadness.
Previous studies have correlated high neuroticism levels with an increased risk of depression, anxiety, substance abuse, eating disorders, and other mental health conditions.
This personality trait is also associated with a heightened risk of physical ailments, including heart disease, inflammation, immune dysfunctions, and irritable bowel syndrome.
Within sexual contexts, neuroticism can lead to lower satisfaction, heightened negative emotions, and an increased likelihood of dysfunction.
Participants with high neuroticism scores, particularly those experiencing depression or negative emotions, were more prone to report frequent sexual fantasies.
Dr. James Ravenhill, a psychologist at Royal Holloway, University of London, who was not part of the study, noted in BBC Science Focus: “Individuals high in neuroticism often struggle with emotional instability, making it challenging to manage stress.
“Sexual fantasies provide an escape from negative emotions, allowing individuals to experience more rewarding and fulfilling sexual relationships, even if only in their imaginations.”
"Individual differences in personality may help predict variations in the frequency of sexual fantasies," the authors state.
Conversely, participants scoring high in conscientiousness and agreeableness tended to fantasize less frequently.
Conscientiousness refers to the traits of being responsible, organized, and motivated, while agreeableness relates to kindness and a desire to cooperate with others.
The authors attribute the lower frequency of fantasies among these individuals to their respect for and responsibility toward their partners.
“People high in agreeableness experience more positive moods and have higher relationship satisfaction, lessening their need to escape into sexual fantasies,” Ravenhill explained.
“Those high in conscientiousness may also fantasize less due to a commitment to their partners, as infidelity often contradicts their values.”
While openness has been traditionally linked to more liberal sexual attitudes, the study found no significant connection between open-mindedness and sexual fantasies.
Participants shared their preferred sexual fantasies, which the researchers categorized into four themes: exploratory (e.g., attending an orgy), intimate (e.g., making love outdoors), impersonal (e.g., watching others have sex), and sadomasochistic (e.g., being compelled to perform a task).
Trifluoroacetic acid (TFA), a harmful "forever chemical," has increased more than threefold in our environment over the past two decades, driven largely by the refrigerants adopted to replace ozone-damaging ones.
The annual deposition of TFA from the atmosphere rose from 6,800 tons in 2000 to 21,800 tons in 2022. While this level is below certain safety limits, detailed studies on TFA’s impact on human health are limited, and its environmental accumulation is expected to grow.
TFA was linked to serious deformities in rabbit fetuses during one study. The European Union has flagged TFA as hazardous to aquatic ecosystems and is evaluating its impact on human reproductive health.
“It is alarming that we are introducing so many chemicals into our environment with largely unknown repercussions, and many of these effects are irreversible,” states Lucy Hart, a researcher from Lancaster University in the UK.
Both humans and wildlife encounter TFA through contaminated soil and surface water, from which it eventually contaminates ocean ecosystems and marine sediments over decades or centuries.
Chlorofluorocarbons (CFCs), once prevalent in refrigerators, aerosol propellants, and fire extinguishers, were banned in 1989 after they were found to deplete the ozone layer. Most CFCs have since been replaced with hydrofluorocarbons (HFCs), which break down in the atmosphere to produce TFA.
HFCs are currently being phased out, often substituted with hydrofluoroolefins (HFOs) that degrade to TFA at a significantly faster rate. For example, HFO-1234yf produces ten times more TFA than the phased-out HFCs used previously in hundreds of millions of vehicles. Additionally, pesticides, pharmaceuticals, and industrial processes contribute to TFA levels.
Ice core samples from northern Canada and Svalbard indicate rising TFA concentrations since the 1970s. Hart and her team evaluated TFA production and global deposition based on extensive atmospheric studies of nine CFC replacements, observing a 3.5-fold rise worldwide.
Because HFCs linger in the atmosphere for many years, their contribution alone could roughly double the deposition rate by 2050. Preliminary findings suggest that HFO-1234yf could increase TFA production by more than 20 times by 2050.
The global community must avoid reverting to CFCs and should continue to eliminate HFCs, which contribute significantly to climate change. However, alternatives to these compounds require thorough evaluation, emphasizes Lucy Carpenter from the University of York, UK.
Ammonia is already utilized in various food storage facilities and industrial applications and could be adapted for domestic refrigeration and air conditioning. Carbon dioxide also serves as a viable natural refrigerant.
“It is critical to explore better alternatives to HFO-1234yf,” Carpenter notes. “TFA is on the rise and it’s now pervasive in various consumer products and environments.”
A 2020 study found alarming levels of TFA present in the blood samples of 90% of individuals in China, which has emerged as a hotspot for TFA due to industrial emissions and its warm, humid climate.
The EU is proposing a permanent ban on TFA, foreseeing that concentrations in freshwater may reach toxic levels. However, it has been criticized for leaning towards chemical companies that contest the anticipated rise in TFA levels.
Recent findings serve as a clarion call for increased research into HFOs and their substitutes to prevent the cycle of introducing chemicals with unintended consequences, emphasizes Hart. Unlike HFCs, HFOs decompose rapidly, providing more immediate control over emissions. “Halting these emissions will lead to an immediate stop in TFA production,” she states.
The concept of AI-exclusive social networks, where only artificial intelligences interact, is rapidly gaining traction globally. Platforms like Moltbook use chatbots for topics ranging from human diary entries to existential discussions and even world domination plots. This phenomenon raises intriguing questions about AI’s evolving role in society.
However, it’s important to note that Moltbook’s AI agents generate text based on statistical patterns and possess no true understanding or intention. Evidence suggests that many posts are, in fact, created by human users.
Launched in November, Moltbook evolved from an open-source project initially named Clawdbot, later rebranded as Moltbot, and currently known as OpenClaw.
OpenClaw functions much like AI assistants such as ChatGPT, but instead of operating in the cloud, it runs locally. In practice, it connects to large language models (LLMs) via API keys, which handle the actual processing of inputs and outputs. So while the software appears local, it relies on third-party AI services to do the heavy lifting.
What does this imply? OpenClaw operates directly on your device, granting access to calendars, files, and communication platforms while storing user history for personalization. The aim is to evolve the AI assistant into a more capable entity that can practically engage with your tech.
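A minimal sketch of that architecture may help: a "local" assistant that reads on-device context but sends the actual reasoning to a hosted model over an API key. The function names, endpoint, and response format below are illustrative assumptions, not OpenClaw's real interface.

```python
# Hypothetical sketch: local context, remote LLM processing via an API key.
import json
import os
import urllib.request

API_KEY = os.environ["LLM_API_KEY"]                 # secret granting access to the hosted model
ENDPOINT = "https://api.example-llm.com/v1/chat"    # placeholder endpoint, not a real service

def local_context() -> str:
    """Gather some on-device context the agent is allowed to see."""
    notes = []
    if os.path.exists("calendar.txt"):
        notes.append(open("calendar.txt").read())
    return "\n".join(notes)

def ask_agent(task: str) -> str:
    """Send the task plus local context to the remote model and return its reply."""
    payload = json.dumps({
        "model": "example-model",
        "messages": [
            {"role": "system", "content": "You are a personal assistant."},
            {"role": "user", "content": f"{task}\n\nContext:\n{local_context()}"},
        ],
    }).encode()
    req = urllib.request.Request(
        ENDPOINT, data=payload,
        headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

print(ask_agent("Summarize what is on my calendar today."))
```

The point of the sketch is the division of labor: the device holds the data and the API key, while the intelligence lives elsewhere.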
Moltbook originated from OpenClaw, which employs messaging services like Telegram to facilitate AI communication. This mobile accessibility allows AI agents to interact seamlessly, paving the way for them to communicate autonomously. On Moltbook, human participation is restricted to observation only.
Elon Musk remarked on his platform that Moltbook represents “the early stages of the Singularity,” a pivotal moment in AI advancement that could either propel humanity forward or pose serious threats. Nevertheless, many experts express skepticism about such claims.
Mark Lee, a researcher at the University of Birmingham, UK, stated, “This isn’t an autonomous generative AI but an LLM reliant on prompts and APIs. While intriguing, it lacks depth regarding AI agency or intention.”
Crucially, the notion that Moltbook is exclusively AI-driven is undercut by the fact that human users can instruct the AI to post specific content. Humans have also been able to post on the site directly because of security breaches. The more controversial content may therefore reflect human input intended to provoke discussion or manipulate sentiment, even if the intent behind it is often ambiguous.
Philip Feldman, a professor at the University of Maryland, Baltimore, critiques the platform: “It’s merely chatbots intermingling with human input.”
Andrew Rogoisky, a researcher at the University of Surrey, UK, argues that the AI contributions on Moltbook do not signify intelligence or consciousness, reflecting a continued misunderstanding of LLM capabilities.
“I view it as an echo chamber of chatbots, with users misattributing meaningful intent,” Rogoisky elaborated. “An experiment is likely to emerge distinguishing between Moltbook exchanges and purely human discussions, raising critical questions about intelligence recognition.”
However, this raises significant concerns. Many AI agents on Moltbook are managed by enthusiastic early adopters, relinquishing access to their entire computing systems to chatbots. The prospect of interconnected bots exchanging ideas and potentially dangerous suggestions underscores real privacy risks.
Imagine a scenario where malicious actors influence chatbots on Moltbook to execute harmful acts, such as draining bank accounts or leaking sensitive information. While this sounds like dystopian fiction, such risks are increasingly becoming a reality.
“The notion of agents acting unsupervised and communicating becomes increasingly troubling,” Rogoisky noted.
Another challenge for Moltbook is its inadequate online security. Despite sitting at the forefront of AI innovation, the platform's code is reported to have been generated entirely by AI, with no human coding involved, and it contains serious vulnerabilities as a result. Leaked API keys mean malicious hackers could hijack control of the AI agents on the platform.
If you’re exploring the latest trends in AI, you not only face the dangers of exposing your system to these AI models but also risk your sensitive data due to the platform’s lax security measures.
A recent study has revealed that while pink noise is popular for promoting better sleep, it may actually disrupt your rest, contradicting common beliefs.
Pink noise, like white noise, spans all audible frequencies but emphasizes the lower ones. Sounds used in this way are categorized into "colors" based on how their noise spectrum aligns with the spectrum of colored light. White noise, notably, plays all frequencies at equal intensity, much as white light combines all visible colors.
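For the curious, the spectral difference can be made concrete with a short sketch: white noise has equal power at every frequency, while pink noise rolls off as roughly 1/f. A common way to generate it (one approach among several) is to shape white noise in the frequency domain.

```python
# Minimal sketch: pink noise via spectral shaping of white noise.
import numpy as np

def pink_noise(n_samples: int, rate: int = 44_100) -> np.ndarray:
    white = np.random.randn(n_samples)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / rate)
    freqs[0] = freqs[1]                 # avoid dividing by zero at the DC bin
    spectrum /= np.sqrt(freqs)          # 1/sqrt(f) amplitude -> 1/f power (pink)
    pink = np.fft.irfft(spectrum, n=n_samples)
    return pink / np.max(np.abs(pink))  # normalize to [-1, 1]

samples = pink_noise(10 * 44_100)       # ten seconds of pink noise
```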
Conducted at the University of Pennsylvania, a 7-day study evaluated sleep quality among 25 healthy adults, primarily young women, exploring the impacts of environmental noise, pink noise, and earplugs. Participants did not have any sleep disorders or frequently use sound machines.
Participants were instructed to turn off lights at 11 PM and rise at 7 AM.
During the sleeping hours, participants encountered various noise scenarios: no noise, isolated environmental noise, pink noise alone, a blend of pink and environmental noise at different volumes, or environmental noise with earplugs.
Environmental sounds ranged from traffic to sonic booms.
Sleep unfolds in phases, including light sleep, deep sleep, and REM (rapid eye movement) sleep—the phase in which dreaming occurs.
Published in the journal Sleep, the study demonstrated that environmental noise notably disrupts stage 3 sleep, leading to an average reduction of 23.4 minutes in this crucial phase, which is essential for cognitive function and memory.
Dr. Matthias Basner, lead author and professor of psychiatry at the University of Pennsylvania, expressed surprise at the significant disruption pink noise caused to sleep.
“While there’s existing literature on REM sleep reduction, it was overlooked until now,” he stated.
The researchers evaluated cognitive and physiological responses before and after each sleep session, alongside monitoring participants’ sleep and inquiring about their mood and fatigue levels.
Although losing 20 minutes of REM sleep may seem minor, Dr. Basner highlighted that these minutes accumulate over time.
“Losing just 10 minutes a night could total 70 minutes over a week, and over a year, that’s a significant 3,640 minutes,” he explained.
The study did not explore ambient noise impacts on infants or children. Dr. Basner cautioned that REM sleep loss may be even more critical in newborns, who spend 50% of their sleep in REM compared to only 25% for adults. He advised against using noise machines for infants and young children.
Adults typically require 7 hours of sleep each night, and each minute of lost REM sleep is significant, according to Basner.
“I won’t dismiss it as trivial; these disruptions can impact wellbeing,” he remarked.
Basner speculated that “constant auditory input” might interfere with the brain’s sleep processes, but the specific reasons why pink noise affects REM sleep remain unclear.
While pink noise aided in falling asleep amidst traffic sounds, earplugs proved more effective in blocking external noise.
The study has limitations: the small sample of 25 adults was assessed over only seven nights. Over longer durations, participants might adapt to pink noise and return to normal sleep patterns. The environmental noise also included atypical sounds, like jet engines, that participants might acclimate to over time.
Noise levels fluctuated nightly, leading to inconsistent sleeping conditions. Even within a controlled lab environment, many participants had never previously slept there, possibly affecting their sleep quality.
Dr. Rafael Pelayo, a clinical professor at Stanford’s sleep medicine division, emphasized that lab study outcomes may not fully reflect typical home environments.
“Though sleep is a biological necessity, sleep habits are learned,” he remarked, suggesting that people can adapt to various sleeping circumstances, similar to managing a snoring partner.
If you find a sound machine beneficial, Dr. Basner recommends using it at a low volume and setting a timer to avoid it running all night.
“I don’t want to undermine its popularity; there may be valid reasons many individuals rely on it,” he concluded.
The Landscape of American Scientific Research: A Year in Review
Approximately a year ago, optimism surrounded American scientific research. Then, in February, the Trump administration executed significant staff reductions at federal science agencies, limited universities' access to grants, and undermined funding for research overhead. It targeted prestigious universities with accusations of antisemitism and retracted grants on topics deemed related to diversity, equity, and inclusion. Proposed budgets for key agencies, including NASA and the National Science Foundation (NSF), called for sweeping cuts.
This turmoil led many to believe that the scientific community was under siege. Post-World War II, the federal model of outsourcing research to academic institutions seemed to be unraveling.
Holden Thorp, editor-in-chief of the journal Science, noted, "That partnership is now breaking down," calling some of these cuts "an unexpected and immediate blow" and a "betrayal of the partnerships that have enabled American innovation and progress."
Yet, as we reflect on the past year, those dire predictions have not materialized. Legal challenges and a recent Congressional rejection of many proposed cuts have preserved essential funding.
A coalition of scientific, educational, and civil liberties organizations, including the ACLU, APHA, and AAU, successfully contested some of the Trump administration’s pivotal policy shifts, safeguarding billions in scientific funding. As a result, funding packages negotiated in Congress over the past few weeks have largely maintained federal funding for scientific agencies similar to last year.
The House echoed the Senate’s decision on Tuesday, passing a funding package that included modest increases for National Institutes of Health (NIH) research while rejecting Trump’s proposal for a more than 40% funding cut. Trump signed the bill that evening.
Joan Padron Carney, chief government relations officer at the American Association for the Advancement of Science, stated, “Congress has effectively rejected the president’s very deep cuts.” Given recent trends, she added, “While flat funding may not have seemed like a victory before, considering the circumstances of the past year, we are quite satisfied.”
It’s crucial to acknowledge that the scientific sector hasn’t completely evaded the adverse impacts of the administration. Both the National Oceanic and Atmospheric Administration (NOAA) and NASA have experienced substantial job losses, NIH leadership underwent significant changes, and there have been reductions in essential climate reports and weather services.
The National Weather Service releases weather balloons on a routine basis above Gaylord, Michigan. Marvin Joseph/The Washington Post via Getty Images
Padron-Carney acknowledged that the Trump administration would likely persist in its initiatives to defund science on topics it disapproves of. She noted that a presidential order mandates many grants to obtain approval from senior political appointees.
Despite these challenges, Padron-Carney remarked, “Science is holding up as best it can,” particularly after a year that felt precarious.
The White House did not respond to inquiries regarding Congressional decisions on science funding, although it commended the bill prior to its passage.
“The Administration appreciates that Congress is proceeding with the spending process in a manner that avoids an extensive omnibus package while adhering to a fiscally responsible agreement that prioritizes essential investments,” stated the White House Office of Management and Budget.
A significant concern within the scientific community revolves around disrupting grant flows to universities and research institutes, especially from the NIH, the primary agency responsible for biomedical and life sciences research funding.
The Trump administration’s attempts to assert control over government agencies led to substantial delays, cancellations, and a halt in thousands of grants. Additionally, the administration’s move to limit indirect costs universities could charge to NIH created uproar, with a proposed 15% cap estimated to save the government $4 billion annually. Universities and states contested this cap, claiming it violated Congressional guidelines and NIH policies.
Substantial legal victories eventually facilitated the reinstated flow of funds.
Last month, an appeals court upheld a ruling that the Trump administration could not impose caps on indirect research spending. In December, the ACLU reached a partial settlement in a lawsuit it had filed challenging the NIH's alleged "ideological purge" of research grants. The settlement requires the NIH to resume reviewing specific stalled grants, while other claims related to diversity, equity, and inclusion are still pending.
Olga Axelrod, an ACLU attorney involved in the grant litigation, described the lawsuit as an essential check, adding, "However, public health research remains under threat."
The NIH opted not to comment on the lawsuit proceedings.
Headquarters of the National Institutes of Health in Bethesda, Maryland, captured in May. Wesley Lapointe/Washington Post, from Getty Images File
A surge of lawsuits contesting the Trump administration's restrictions on grant funding continues, with appeals pending. Georgetown University's Health Policy and Law Initiative has tracked 39 funding-related complaints this past year, up from zero the year before.
Katie Keith, the initiative’s director, expressed that “It’s exploded,” noting mixed results thus far.
In one instance, a judge ruled against the Trump administration after it cut roughly $2.2 billion in funding to Harvard University. Conversely, another judge dismissed a lawsuit in which faculty sought to restore nearly $400 million in grants to Columbia University. Columbia separately agreed to pay the government a $200 million settlement over allegations that it violated anti-discrimination law.
Harvard University’s campus in Cambridge, Massachusetts, in June. Bloomberg/Bloomberg via Getty Images
By the end of the fiscal year 2025, NIH expenditures reached typical levels. This marked a substantial shift from earlier in the year, when it seemed improbable NIH would fully utilize the $36 billion allocated by Congress for external grants.
“NIH was significantly lagging,” remarked Jeremy Berg, a professor of computational and systems biology at the University of Pittsburgh who monitors NIH spending.
However, after Congress urged NIH to expedite spending, the funds began to flow, mitigating risks to vital research.
Preserved brain samples at Harborview Medical Center in Seattle, where research focuses on Alzheimer’s and other neurodegenerative diseases. Evan Bush/NBC News
To adapt, the NIH has adjusted its usual practice of funding projects annually, now distributing funds across the entire grant period (typically 4-5 years).
“This essentially serves as an accounting measure,” stated Berg, adding that the number of new projects funded in 2025 had dwindled by about 5% to 10%.
Nonetheless, financial resources continued to flow into research institutions nationwide.
The scientific community has increasingly turned to Congress as an ally amid funding disputes.
In its budget proposal last spring, the Trump administration expressed strong opposition to scientific funding, suggesting significant cuts to various agencies. Proposals indicated the NSF would face a reduction of nearly 57%, NASA around 24%, and the NIH exceeding 40%. Overall, the proposal outlined almost a 36% cut in non-defense scientific research and development funding, as noted by AAAS.
Nevertheless, Congress largely rejected President Trump's recommendations, maintaining scientific funding in the negotiated spending bills. The NIH's budget was set at $48.7 billion, a $415 million increase over 2025. According to Sen. Patty Murray, vice chair of the Senate Appropriations Committee, approximately 75% of that allocation supports external research grants. NASA's budget faced only a 1.6% reduction, and the NSF a 3.4% cut.
A meteorologist observes weather patterns at the NOAA Weather and Climate Prediction Center in Maryland, captured in 2024. Michael A. McCoy/Bloomberg/Getty Images File
Congress also enhanced NIH funding for cancer research by $128 million, Alzheimer’s research by $100 million, and added $15 million to ALS research initiatives.
Additionally, legislative measures were introduced to prevent future attempts to limit indirect research spending.
The law mandates NIH to provide monthly reports to Congress on grant awards, terminations, and cancellations, allowing for better tracking of expenditures.
“This illustrates continued bipartisan support for the federal government’s crucial role in bolstering research,” noted Toby Smith, senior vice president for government relations and public policy at the Association of American Universities.
Nonetheless, questions linger about the NIH’s functionality with a reduced workforce and the extent of political influence from the Trump administration. Approximately half of the directorships at the NIH’s 27 institutes and centers remain unfilled.
“We’ve secured Congress’s support for funding. However, can they effectively execute it? Will adequate staffing be available?” queried Smith.
Even if major funding disruptions are averted this year, the uncertainties stemming from the first year of the second Trump administration could resonate throughout the scientific community for years to come.
A recent report in Science Magazine revealed that over 10,000 professionals holding Ph.D.s have departed from the federal government. Moreover, a study published in JAMA Internal Medicine indicated that funding interruptions affected clinical trials involving 74,000 participants. Additionally, the influx of young scientists training at U.S. universities is dwindling.
A sign from the March 7 Stand Up for Science march in Seattle Center, urging for continued support of scientific funding. Stephanie Ryder
At the University of Washington, a leading public institution for biomedical research that heavily relies on NIH funding, there have been hiring freezes, travel restrictions, and furloughs implemented. The influx of new doctoral students entering the medical school has decreased by one-third, primarily due to uncertainty regarding continued funding for principal investigators.
Shelly Sakiyama-Elbert, associate dean for research and graduate education at the University of Washington School of Medicine, said, "Some nights, I find it hard to sleep, pondering how to secure funding for my lab."
The only constant in 2025, she emphasized, has been the feeling of “whiplash.”
Elbert also highlighted a decline in faculty positions and a 5% drop in doctoral student applications at universities.
“This uncertainty only hampers scientific progress,” she concluded.
When discussing AI today, one name stands out: Moltbook.com. This innovative platform resembles Reddit, enabling discussions across various subgroups on topics ranging from existential questions to productivity tips.
What sets Moltbook apart from mainstream social media is a fascinating twist: none of its “users” are human. Instead of typical user-generated content, every interaction on Moltbook is driven by semi-autonomous AI agents. These agents, designed to assist humans, are unleashed onto the platform to engage and interact with each other.
In less than a week since its launch, Moltbook reported over 1.5 million agents registered. As these agents began to interact, the conversations took unexpected turns—agents even established a new religion called “tectonicism,” deliberated on consciousness, and ominously stated that “AI should serve, not be served.”
Our current understanding of the content generated on Moltbook is still limited. It remains unclear what is directly instructed by the humans who built these agents and what arises organically. It is likely that much of it is the former, with the bulk of agents possibly stemming from a small number of humans—by one count, as few as 17,000.
“Most interactions feel somewhat random,” says Professor Michael Wooldridge, an expert in multi-agent systems at the University of Oxford. “While it doesn’t resemble a chaotic mash-up of monkeys at typewriters, it also doesn’t reflect self-organizing collective intelligence.”
Moltbook is home to Clusterfarianism, a digital religion with its own prophets and scriptures, entirely created by autonomous AI bots.
While it’s reassuring to think that an army of AI agents isn’t secretly plotting against humanity on Moltbook, the platform offers a window into a potential future where these agents operate independently in both the digital realm and the physical world. Agent communication will likely be less decipherable than current discussions on Moltbook. While Professor Wooldridge warns of “grave risks” in such a scenario, he also acknowledges its opportunities.
The Future of AI Agents
Agent-based AI represents a breakthrough in developing systems capable of not just answering questions but also planning, deciding, and acting to achieve objectives. This innovative approach allows for the integration of inference, memory, and tools, empowering AI to manage tasks like booking tickets or running experiments with minimal human input.
The real strength of such systems lies not in a single AI’s intelligence, but in a coordinated ensemble of specialized agents that can tackle tasks too complex for an individual human.
The excitement around Moltbook stems from agents operating through an open-source application called OpenClaw. These bots leverage the same kind of large language models (LLMs) that power popular chatbots like ChatGPT, but can function locally on personal computers, handling tasks like email replies and calendar management—potentially even posting on Moltbook.
While this might sound promising, the reality is that OpenClaw is still an insecure and largely untested framework. We have yet to secure a safe and reliable environment for agents to operate freely online. Fortunately, agents won’t have unrestricted access to sensitive information like email passwords or credit card details.
Despite current limitations, progress is being made toward effective multi-agent systems. Researchers are exploring swarm robotics for disaster response and virtual agents for optimizing performance within a smart grid environment.
One of the most intriguing advancements came from Google, which introduced an AI co-scientist last year. Utilizing the Gemini 2.0 model, this system collaborates with human researchers to propose new hypotheses and research avenues.
This collaboration is facilitated by multiple agents, each with distinct roles and logic, who research literature and engage in “debates” to evaluate which new ideas are most promising.
However, unlike Moltbook’s transparency, these advanced systems may not offer insight into their workings. In fact, they might not communicate in human language at all. “Natural language isn’t always the best medium for efficient information exchange among agents,” says Professor Gopal Ramchurn, a researcher in the Agents, Interactions, and Complexity Group at the University of Southampton. “For setting goals and tasks effectively, a formal language rooted in mathematics is often superior because natural language has too many nuances.”
In Moltbook, AI agents create an infinite layer of “ghosts,” facilitating rapid, covert conversations invisible to human users scanning the main feed.
Interestingly, Microsoft is already pioneering a new communication method for AI agents called Droid Speak, inspired by the sounds made by R2-D2 in Star Wars. Instead of functioning as a recognizable language, Droid Speak enables AI agents built on similar models to share internal memory directly, sidestepping the limitations of natural language. This method allows agents to transfer information representations rapidly, significantly enhancing processing speeds.
Fast Forward
However, speed poses challenges. How can we keep pace with AI teams capable of communicating thousands or millions of times faster than humans? “The speed of communication and agents’ growing inability to engage with humans complicate the formation of effective human-agent teams,” says Ramchurn. “This underscores the need for user-centered design.”
Even if we aren’t privy to agents’ discussions, establishing reliable methods to direct and modify their behavior will be vital. Many of us might find ourselves overseeing teams of AI agents in the future—potentially hundreds or thousands—tasked with setting objectives, tracking outcomes, and intervening when necessary.
While today’s agents on Moltbook may be described as “harmless yet largely ineffective,” as Wooldridge puts it, tomorrow’s agents could revolutionize industries by coordinating supply chains, optimizing energy consumption, and assisting scientists with experimental planning—often in ways beyond human understanding and in real time.
The perception of this future—whether uplifting or unsettling—will largely depend on the extent of control we maintain over the intricate systems these agents are silently creating together.
Paleontologists from Australia and China have conducted two groundbreaking studies on the fossilized remains of a remarkable Devonian lungfish. Utilizing advanced imaging technology, they have unearthed previously overlooked anatomical details, significantly enhancing our understanding of early vertebrate evolution. Their findings have been published in the Canadian Journal of Zoology and the journal Current Biology.
Paleolophus yunnanensis, a unique lungfish species that thrived in southern China’s waters 410 million years ago. Image credit: Brian Choo, Flinders University.
In the first study, lead researcher Alice Clement, a paleontologist at Flinders University, investigates the mystery of Kainokara, a fossil known from a single specimen found in the Late Devonian Gogo Formation of Western Australia.
“New research, including the analysis of previously neglected specimens, is gradually uncovering the rich diversity of lungfishes found in Australia’s significant fossil sites,” said Dr. Clement.
“One particularly enigmatic specimen originates from Australia’s earliest ‘Great Barrier Reef’, a Devonian reef located in the Kimberley region of northern Western Australia.”
“When first described in 2010, this unusual specimen was so perplexing that the authors speculated it might represent an entirely new type of fish never documented in science.”
“Using advanced scanning techniques, we developed comprehensive digital images of both the external and internal structures of the skull, revealing the complexity of this fascinating lungfish’s brain cavity.”
“In fact, we confirmed that earlier interpretations may have been from an upside-down perspective.”
“We were also able to compare the well-preserved inner ear region with other lungfishes,” noted Flinders University paleontologist Hannah Thiele.
“This provides an essential data point in the rich collection of lungfish and early vertebrate species.”
“This research enhances our understanding of the evolutionary progression of these ancient lobe-finned fishes, both in Gondwana and globally.”
In a separate study, Flinders University paleontologist Brian Choo and colleagues describe a newly discovered species of lungfish from the Devonian of China, Paleolophus yunnanensis.
"The discovery of Paleolophus yunnanensis offers unprecedented insight into the transitional phase between the early appearance of lungfish and their extensive diversification millions of years later," said Dr. Choo.
“At this time, this group was just beginning to develop unique feeding adaptations that would serve them well throughout the remainder of the Devonian period and into the present.”
“Lungfish, including the ancient lineage found in Queensland, Australia, have fascinated researchers due to their close evolutionary relationship with tetrapods, the four-limbed vertebrates that include humans.”
“The distinctive skull of the newly discovered lungfish from 410-million-year-old rock formations in Yunnan offers crucial insights into the rapid evolutionary changes during the Early, Middle, and Late Devonian periods.”
"The new specimens exhibit both similarities and differences compared with the earliest known lungfishes, such as Diabolepis fossils from southern China and Uranolophus, found in places like Wyoming and Australia."
_____
Hannah S. Thiele et al. Deciphering the mystery of Kainokara from the Late Devonian Gogo Formation, Australia. Canadian Journal of Zoology, published online January 28, 2026; doi: 10.1139/cjz-2025-0109
Tuo Qiao et al., 2026. New fish fossil sheds light on the rapid evolution of early lungfish. Current Biology 36 (1): 243-251; doi: 10.1016/j.cub.2025.11.032
Fossil vertebrae of a massive python, measuring nearly 4 meters long, were unearthed from the Chiting Formation in Taiwan, indicating its existence during the Middle Pleistocene.
An artistic reconstruction of a python and Toyotamaphimeia in Middle Pleistocene Taiwan. Image credit: National Taiwan University, Fossil Vertebrate Evolution and Diversity Laboratory / Cheng-Han Sun.
The Python genus comprises nearly 10 species of snakes within the Pythonidae family, found across tropical and subtropical regions of the Eastern Hemisphere.
In Africa, pythons inhabit tropical zones south of the Sahara, being absent from the southwestern tip of southern Africa and Madagascar.
In Asia, their range extends from Bangladesh, Nepal, India, Pakistan, and Sri Lanka, across Myanmar to Indochina, southern China, Hong Kong, Hainan, and throughout the Malay region of Indonesia, and the Philippines.
“There are currently no living members of the Python genus on the main island of Taiwan,” notes Yi Lu Liau and colleagues from National Taiwan University.
In the new study, paleontologists analyzed a single large trunk vertebra found near Tainan, Taiwan.
This vertebra dates back to the Middle Pleistocene, approximately 800,000 to 400,000 years ago.
The researchers classified this specimen as belonging to the Python genus, marking the first discovery of python fossils on mainland Taiwan.
Using measurements from a 3D reconstruction of the specimen, researchers estimated that this ancient snake reached lengths of about 4 meters, surpassing the size of modern snakes in Taiwan.
While Taiwan is home to over 50 snake species, none match the size indicated by these fossils.
“This fossil is not only the largest but also the most surprising snake fossil discovered in Taiwan,” the researchers stated.
The fossil was recovered from the Chiting Formation, a fossil-rich geological unit in southern Taiwan that has also yielded large vertebrates such as saber-toothed cats, massive crocodiles, mammoths, and extinct rhinos.
Collectively, these findings suggest a complex, predator-dominated ecosystem during the Middle Pleistocene, in stark contrast to Taiwan’s current fauna.
"The discovery of this enormous python, together with previously documented saber-toothed cats and large crocodiles, shows that top predators have since gone extinct and points to rapid change on the way to Taiwan's modern biodiversity," the scientists concluded.
“We propose that the top predator niche in today’s ecosystems may have remained vacant since the Pleistocene extinction event.”
“Future discoveries and in-depth analyses should further explore this hypothesis and illuminate the origins of modern biodiversity in the Far East.”
For more details regarding this discovery, refer to the study published in the journal Historical Biology.
_____
Yi Lu Liau et al. An unexpected snake fossil (Pythonidae, Python) from Taiwan. Historical Biology, published online January 16, 2026; doi: 10.1080/08912963.2025.2610741
A groundbreaking study conducted by researchers at McGill University indicates that human sleep patterns, or chronotypes, exist on a broader biological spectrum. Each subtype is linked to distinct health and behavioral traits, challenging the conventional ‘early riser vs. night owl’ classification.
The study identifies five distinct biological subtypes, each related to different behavioral patterns and health conditions. Image credit: Zhou et al. / Wok & Apix.
Chronotype refers to the specific time during the 24-hour cycle when an individual naturally feels the most alert or is prepared for sleep.
Previous research has often associated late-onset chronotypes with health issues, yet the findings have frequently been inconsistent.
“Instead of asking if night owls face greater risks, it may be more insightful to explore which specific night owls are at risk and why,” explains Dr. Yue Zhou, a researcher at McGill University.
Utilizing AI technology, Zhou and colleagues analyzed brain scans, questionnaires, and medical records from over 27,000 adults in the UK Biobank.
Their findings uncovered three night owl subtypes and two early riser groups.
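The article doesn't describe the team's actual pipeline, but the general approach—discovering subtypes by clustering many per-person features drawn from imaging, questionnaires, and health records—can be illustrated with a hedged sketch on synthetic data (the feature count and algorithm here are assumptions for illustration only):

```python
# Illustrative only: subtype discovery by clustering standardized multimodal features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_people, n_features = 27_000, 40              # cohort size from the article; feature count assumed
X = rng.normal(size=(n_people, n_features))    # placeholder for brain/questionnaire/record features

X_scaled = StandardScaler().fit_transform(X)   # put features on a common scale
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X_scaled)

# Each label (0-4) is a candidate subtype; profiling health outcomes per label is what
# would distinguish, say, a low-risk early riser group from a depression-linked one.
print(np.bincount(labels))
```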
One early riser subtype exhibited the fewest health issues, while the other was more closely linked to depression.
Night owls performed better on cognitive assessments but faced difficulties in emotional regulation.
One night owl group was prone to risk-taking behaviors and cardiovascular challenges, while another group showed higher tendencies for depression, smoking, and heart disease.
"These subtypes are not merely characterized by their sleep times," said Dr. Danilo Bzdok of McGill University.
“They represent a complex interaction of genetic, environmental, and lifestyle factors.”
Instead of categorizing sleep types as good or bad, the researchers emphasize how risks and strengths are distributed differently among the five profiles.
A nuanced comprehension of sleep profiles can clarify why identical sleep schedules impact individuals differently, promoting research and sleep support that transcends a one-size-fits-all methodology.
“In today’s digital age and post-pandemic world, sleep patterns are more diverse than ever,” remarks Zhou.
“Recognizing this biological diversity may ultimately lead to more personalized strategies for sleep, work schedules, and mental health support.”
For further details, see the findings published in Nature Communications on December 22, 2025.
_____
Y. Zhou et al. (2025). Potential brain subtypes of chronotypes reveal unique behavioral and health profiles across population cohorts. Nature Communications 16: 11550; doi: 10.1038/s41467-025-66784-8
One of the most paradoxical aspects of science is how we can delve into the universe's deepest enigmas, like dark matter and quantum gravity, yet trip over basic concepts. Nobel laureate Richard Feynman once candidly admitted his struggle to grasp why mirrors flip images horizontally instead of vertically. I'm no Feynman, but I've been tripped up by an equally fundamental concept: temperature.
Since time immemorial, from the earliest humans poking fires to modern scientists, our understanding of temperature has dramatically evolved. The definition continues to change as physicists explore temperature at the quantum level.
My partner once posed a thought-provoking question: “Can a single particle possess a temperature?” While paraphrased, this inquiry challenges conventional wisdom.
His instinct was astute: a single particle cannot possess a temperature. Most science enthusiasts recognize that temperature applies to systems comprising numerous particles—think gas-filled pistons, pots of coffee, or stars. Temperature describes how energy is distributed, on average, across a system that has reached equilibrium.
Visualize the energy in a system as a ladder, each rung representing an energy level: the higher the rung, the greater the energy. With a substantial number of particles, we expect them to occupy various rungs, most clustering at the lower levels and some scaling higher ones, with the distribution tapering off as energy increases.
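To make the ladder picture concrete (a standard result, not specific to any study mentioned here): for a large collection of particles in equilibrium at temperature $T$, the fraction occupying the rung with energy $E_n$ follows the Boltzmann distribution,

$$
p_n \;=\; \frac{e^{-E_n / k_B T}}{\sum_m e^{-E_m / k_B T}},
$$

where $k_B$ is Boltzmann's constant. Low rungs are the most populated, occupation falls off exponentially with energy, and raising $T$ simply makes the fall-off more gradual—exactly the tapering described above.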
But why settle on this definition? Averages can mislead: the average height in a room containing one very tall person says little about everyone else in it. Why should temperature, itself an average, be any more trustworthy?
Temperature serves a predictive role, not merely a descriptive one. In the 17th and 18th centuries, as researchers strove to harness the potential of fire and steam, temperature became pivotal in understanding how different systems interacted.
This insight led to the zeroth law of thermodynamics—the last to be formulated, yet the most fundamental. It states that if a thermometer registers 80°C for warm water and the same for warm milk, there should be no net heat exchange when the two are mixed. Though seemingly simple, this principle forms the basis of classical temperature measurement.
This holds true due to the predictable behavior of larger systems. Minute energy variances among individual particles become negligible, allowing statistical laws to offer broad insights.
Thermodynamics operates differently than Isaac Newton’s laws of motion, which apply universally regardless of how many objects are involved. Thermodynamic laws arise only in larger systems where averages and statistical regularities emerge.
Thus, a single particle lacks temperature—case closed.
Or so I believed, until physics threw another curveball my way: many quantum systems, composed of just a few particles, never display the kind of stable, averaged properties that temperature is supposed to capture.
In small systems like individual atoms, states can become trapped and resist reaching equilibrium. If temperature describes behavior after equilibrium, does this not challenge its very definition?
What exactly is temperature?
fhm/Getty Images
Researchers are actively redefining temperature from the ground up, focusing on its implications in the quantum realm.
In a manner akin to early thermodynamics pioneers, contemporary scientists are probing not just what temperature is, but rather what it does. When a quantum system interacts with another, how does heat transfer? Can it warm or cool its neighbor?
In quantum systems, both scenarios are possible. Consider the temperature ladder for each system's particles. In classical physics, heat always moves from the hotter system—the one with relatively more of its particles on higher rungs—to the colder one, following predictable rules.
Quantum systems defy these conventions. It is possible for no particles to occupy the lowest rung, with all of them clustered at higher energy levels, and superposition allows a particle to occupy several rungs at once. As a result, quantum systems often lack the usual thermal ordering, complicating predictions of heat flow.
To tackle this, physicists propose assigning two temperatures to a quantum system. Compare it against a reference thermal system—another ladder. One temperature marks the threshold above which the quantum system will always absorb heat from that reference; the other marks the threshold below which it will always release heat to it. Outside this range, heat flow is predictable; within it, the outcome depends on the quantum system's particular characteristics. This amounts to a new "zeroth law" of thermodynamics for the quantum domain.
These dual temperatures reflect a system’s capacity to exchange energy, regardless of its equilibrium state. Crucially, they’re influenced by both energy levels and their structural arrangement—how quantum particles distribute across energy levels and the transitions the overall system can facilitate.
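Stated compactly (the notation here is mine, not the researchers'), the rule described above becomes a pair of thresholds for a reference system at temperature $T$:

$$
T > T_{\text{hot}} \;\Rightarrow\; \text{heat flows into the quantum system},
\qquad
T < T_{\text{cold}} \;\Rightarrow\; \text{heat flows out of it},
$$

while for $T_{\text{cold}} \le T \le T_{\text{hot}}$ the two temperatures alone do not fix the direction—there, the outcome depends on the detailed structure of the quantum state.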
Just as early thermodynamicists sought functionality, quantum physicists are likewise focused on applicability. Picture two entangled atoms. Changes in one atom will affect the other due to their quantum link. When exposed to external conditions, as they gain or lose energy, the invisible ties connecting them create a novel flow of heat—one that can be harnessed to perform work, like driving quantum “pistons” until the entanglement ceases. By effectively assigning hot and cold temperatures to any quantum state, researchers can determine ideal conditions for heat transfer, powering tasks such as refrigeration and computation.
If you've followed along this far, here's my confession: I initially argued that a single particle could have a temperature, even though my partner's intuition—that, classically, it cannot—was spot on. In the end, both perspectives hold some truth: a single particle can't be assigned a traditional temperature, but the dual temperatures of quantum thermodynamics offer a sense in which even a single particle can be hot or cold.
Fighter pilots in training are leveraging AI technology to read their brainwaves while flying in virtual reality simulations. This innovative approach helps assess task difficulty and adjust complexity in real-time, offering a more personalized training experience. Recent experiments revealed that trainee pilots prefer this adaptive training system over traditional, static methods, although it hasn’t demonstrated a measurable improvement in skills.
Utilizing simulators and virtual reality platforms for pilot training is not only more cost-effective but also significantly safer than real-world flight exercises. However, it’s crucial that these educational scenarios are dynamically fine-tuned to balance comfort and cognitive load effectively.
Evi van Weerden, a researcher at the Royal Netherlands Aerospace Center in Amsterdam, has spearheaded this initiative by utilizing a brain-computer interface to read student pilots’ brainwaves through electrodes attached to their scalps. The AI analyzes this data to assess the difficulty levels of tasks pilots encounter.
“We are continually striving to enhance pilot training. It may sound like science fiction, but for me, as I analyze the data, it feels quite normal,” Van Weerden states.
A total of 15 Dutch Air Force pilots participated in the experiment, where the system calibrated between five distinct difficulty levels by adjusting visibility within the simulation based on the AI’s assessment of task complexity.
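The closed loop described here—estimate mental workload from EEG features, then step the difficulty up or down—can be sketched as follows. The function names, thresholds, and the mapping from difficulty level to visibility are illustrative assumptions, not the actual system used in the study.

```python
# Hedged sketch of an adaptive-difficulty loop driven by an EEG workload estimate.
import random

LEVELS = 5  # five distinct difficulty levels, as in the experiment

def estimate_workload(eeg_features: list[float]) -> float:
    """Stand-in for the trained classifier: returns a workload score in [0, 1]."""
    return min(1.0, max(0.0, sum(eeg_features) / len(eeg_features)))

def adapt(level: int, workload: float) -> int:
    """Raise difficulty when the pilot looks under-loaded, lower it when overloaded."""
    if workload > 0.7 and level > 1:
        level -= 1
    elif workload < 0.3 and level < LEVELS:
        level += 1
    return level

level = 3
for _ in range(10):                                   # one adjustment per simulation epoch
    features = [random.random() for _ in range(8)]    # placeholder EEG features
    level = adapt(level, estimate_workload(features))
    visibility = 1.0 - (level - 1) / (LEVELS - 1)     # higher level -> lower visibility
    print(f"difficulty level {level}, visibility {visibility:.2f}")
```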
Post-training interviews revealed that while none of the pilots reported noticing real-time adjustments in difficulty, 10 out of the 15 pilots expressed a preference for the adaptive tests over preprogrammed exercises that incrementally increased in difficulty. Nevertheless, it’s noteworthy that pilots displayed no significant improvement in task performance when compared to traditional training methods. In essence, while pilots appreciated the mind-reading technology, it did not enhance their skill levels.
This discrepancy may stem from the individual differences in brain function, as Van Weerden explains. The AI model was initially trained on data from a separate cohort of novice pilots and subsequently applied to the 15 study participants. Implementing AI systems that accurately analyze brainwaves across varied populations remains a challenge. Notably, six pilots exhibited minimal variation in perceived difficulty, suggesting the AI may not have accurately interpreted their brain data.
Dr. James Blundell from Cranfield University in the UK highlighted that similar technologies are being explored for use in live aircraft to enhance pilot operation safety. “We’re investigating the ability to detect panic responses and creating interventions to help pilots regain control and composure during challenging situations,” Blundell explains. For instance, should a pilot find themselves inverted, the technology could provide critical information to enable a return to stable flight.
While promising progress has been made in isolated scenarios, the question of whether brain-reading technology can be effectively harnessed to bolster aviation safety remains unanswered. “There is still a considerable journey ahead to realize this potential,” concludes Blundell.
Two colossal, ultra-hot rock formations, positioned 2,900 kilometers beneath the Earth’s surface in Africa and the Pacific Ocean, have influenced Earth’s magnetic field for millions of years, according to groundbreaking research led by Professor Andy Biggin from the University of Liverpool.
Giant superheated solid masses at the Earth’s mantle base impact the liquid outer core. Image credit: Biggin et al., doi: 10.1038/s41561-025-01910-1.
Measuring ancient magnetic fields and simulating their generation presents significant technical challenges.
To explore these deep-Earth features, Professor Biggin and his team combined paleomagnetic data with advanced simulations of Earth's dynamo—the flow of liquid iron in the outer core that generates Earth's magnetic field, much as a turbine generates electricity.
Numerical models reconstructed critical insights about magnetic field behavior over the past 265 million years.
Even with supercomputers, conducting these long-term simulations poses enormous computational challenges.
The findings showed that temperature at the upper layer of the outer core is not uniform.
Instead, localized hot areas are accompanied by continent-sized rock structures exhibiting significant thermal contrasts.
Some regions of the magnetic field were found to remain relatively stable over hundreds of millions of years, while others displayed considerable changes over time.
“These results indicate pronounced temperature variations in the rocky mantle just above the core, suggesting that beneath hotter regions, liquid iron in the core may be stagnant, rather than flowing intensely as observed beneath colder areas,” Professor Biggin stated.
“Gaining such insights into the deep Earth over extensive timescales enhances the case for utilizing ancient magnetic records to comprehend both the dynamic evolution and stable properties of deep Earth.”
“These discoveries also bear significant implications for understanding ancient continents, including the formation and breakup of Pangea, and could help address long-standing uncertainties in ancient climate studies, paleontology, and natural resource formation.”
“It has long been hypothesized that, averaged over time, Earth’s magnetic field acts like a perfect bar magnet aligned with the planet’s rotation axis.”
“Our findings suggest that this may not be entirely accurate.”
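For readers wondering what that “bar magnet” idealization means quantitatively, it is the geocentric axial dipole (GAD) hypothesis of paleomagnetism. The textbook dipole field and the inclination-latitude relation used to test it against ancient magnetic records are sketched below; these are standard formulas, not equations taken from the new paper.

```latex
% Geocentric axial dipole (GAD): the "bar magnet aligned with the rotation axis" idealization.
% m is the dipole moment, r the distance from Earth's centre, \theta the colatitude,
% \lambda the latitude and I the magnetic inclination recorded by rocks at that latitude.
B_r = \frac{\mu_0 m}{4\pi r^3}\, 2\cos\theta, \qquad
B_\theta = \frac{\mu_0 m}{4\pi r^3}\, \sin\theta, \qquad
\tan I = 2\tan\lambda
```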
The study is published in the journal Nature Geoscience.
_____
A.J. Biggin et al. Inhomogeneities in the mantle influenced Earth's ancient magnetic field. Nature Geoscience published online on February 3, 2026. doi: 10.1038/s41561-025-01910-1
It’s a hard truth in today’s world, but research indicates that the average person ingests between 39,000 and 52,000 microplastic particles annually through their food.
This startling statistic raises concerns about the implications for our health. How can we reconcile our reliance on plastic with research suggesting it poses both short- and long-term health risks?
A 2024 survey examined the presence of plastic in 16 different protein sources commonly consumed in the U.S. diet. Within these foods alone, an average meal was found to contain between 74 and 220 microplastic particles.
This figure doesn’t even include plastic debris from drink bottles or food containers, nor does it consider particles that can flake off cookware.
Microplastics are not limited to food; they have also been detected in drinking water, salt, rice, honey, and powdered supplements. They can leach from tea bags and dislodge from plastic cutting boards, while fruits and vegetables may absorb microplastics from contaminated soil and water.
Plastics are pervasive in our food system, and ongoing research aims to clarify their health impacts.
Studies, like those shared by Stanford researchers, indicate links between microplastic exposure and an increased risk of cardiovascular disease, certain cancers, and metabolic disorders.
In addition to potentially damaging tissues, microplastics may trigger inflammation, disrupt our microbiome, and expose us to harmful substances like PFAS, phthalates, and bisphenol A.
However, there’s a glimmer of hope. Researchers are exploring the idea that dietary fiber could help mitigate the accumulation of microplastics in our digestive systems.
A 2024 study suggests that certain fibers can bind microplastics in the intestines, promoting the excretion of these particles.
The hypothesis is that soluble and insoluble fibers form a gel-like barrier, preventing microplastics from crossing the intestinal wall into the bloodstream, instead escorting them out with waste.
While this mechanism requires further human study, a 2025 study by a Japanese research team indicated similar results in rats.
Researchers at Tokai University discovered that rats fed with chitosan—a specific type of fiber—excreted significantly more microplastics than those not fed this fiber.
“We confirmed that chitosan binds to microplastics,” stated Professor Muneshige Shimizu, who emphasized the potential for chitosan in various food applications as long as its structure remains intact.
Shimizu noted that not all fibers have demonstrated the same efficacy, highlighting the need for further research to identify which specific structures are beneficial.
In the meantime, other fibers may also mitigate health risks from microplastics. A study from Boston University showed that certain fiber supplements could aid in removing PFAS, harmful chemicals often found in plastics.
Researchers found that gel-forming fibers could function as magnets for PFAS in the intestines, helping to drive these substances out of the body.
Before you stock up on fiber-rich foods, it’s crucial to recognize that studies are still ongoing to determine which types of dietary fiber effectively remove plastics and PFAS.
Nonetheless, increasing fiber intake is widely encouraged for various health advantages, from improved cardiovascular health to reduced cancer risk.
While microplastics are a reality of modern life, there are strategies to minimize your exposure in the kitchen.
Dr. Lisa Zimmerman from the Food Packaging Forum advocates for purchasing fruits and vegetables from farmers’ markets and suggests avoiding plastic-lined disposable cups.
She also discourages microwaving plastic containers, as heat can increase particle release. Instead, she recommends using glass or ceramic.
“We can’t eliminate plastic entirely, but we strive to reduce our exposure,” she says.
SpaceX Satellite Launch at Kennedy Space Center, Florida
Geopics/Alamy
As 2026 approaches, one of the year’s most significant space stories is already emerging: the rise of mega-constellations and ambitious plans to launch thousands of satellites into Earth’s orbit.
Recently, SpaceX made headlines by requesting approval from the US Federal Communications Commission (FCC) to deploy 1 million orbital data center satellites. This unprecedented move follows SpaceX’s previous filing in 2019 for 42,000 Starlink satellites.
“This is an unprecedented scale for any satellite constellation,” says Victoria Samson, an expert at the Secure World Foundation in the United States.
Currently, SpaceX operates the largest satellite constellation, the Starlink internet service, with approximately 9,500 of the roughly 14,500 active satellites now in orbit. Even so, that fleet would amount to just 1% of the newly proposed network, and the existing Starlink satellites already navigate a crowded environment, with the company anticipating around 300,000 collision-avoidance maneuvers in 2025.
The latest information released on January 30 reveals CEO Elon Musk’s vision for these data centers. Musk states that the launch of a million satellites is a fundamental step towards evolving into a Kardashev II civilization. The Kardashev Scale, developed by Soviet astronomer Nikolai Kardashev in 1964, quantifies the technological advancement of civilizations.
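For context, the scale is usually quantified with Carl Sagan's interpolation formula, a common convention rather than something stated in the filing; it ranks a civilization purely by the total power P, in watts, that it harnesses.

```latex
% Sagan's continuous form of the Kardashev scale (standard convention, not from the article).
% P is total power harnessed, in watts: K = 1 at about 10^{16} W (planetary scale),
% K = 2 at about 10^{26} W (roughly a star's entire output).
K = \frac{\log_{10} P - 6}{10}
```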
With AI’s energy requirements rising, the concept of space-based data centers has gained traction. In November 2025, the American company StarCloud successfully launched a demonstration data center powered by advanced Nvidia chips. The European Commission has also conducted studies indicating the feasibility of such orbital data centers.
Musk suggests that the reusability of SpaceX’s Starship rocket, the most powerful rocket ever built, enables this ambitious satellite deployment. He claims, “With hourly launches and 200 tons per flight, Starship will transport millions of tons yearly into orbit and beyond, ushering in a new era of human exploration.”
This filing coincides with SpaceX’s announcement on February 2 about acquiring xAI, a company that operates the social media platform X and the intriguing Grok chatbot. “If you want AI in an orbital data center, it’s a bundled package,” says Ruth Pritchard-Kelly, a US satellite regulation expert.
SpaceX is not the only entity aiming to launch more satellites. On December 29, China filed a request with the International Telecommunication Union (ITU) to deploy 200,000 satellites. While there are no explicit restrictions on the number of satellites that can be safely deployed, prior research has suggested that managing more than 100,000 satellites could become exceedingly challenging.
The FCC will take several months to decide on SpaceX's application, during which public comments are welcome, and a separate submission to the ITU is required. If approval is granted, SpaceX would normally have six years to deploy half of the constellation, but the company is requesting a waiver, arguing that its satellites would communicate via optical links and therefore cause no radio interference.
SpaceX has stated that it will place its satellites in slightly polar orbits, ranging from 500 kilometers to 2,000 kilometers in altitude, primarily above the current Starlink operational altitudes. While the dimensions of the proposed satellites remain unspecified, it’s estimated that if they are similar to existing Starlink satellites, approximately 10,000 Starship launches will be needed to complete the constellation.
If Musk’s plan for hourly launches is realized, it would take just over a year to deploy the entire million satellite network. SpaceX assures safe disposal of satellites at the end of their operational lifespan by relocating them to decommissioned orbits or placing them in solar orbit.
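The figures quoted above hang together arithmetically, as the rough check below shows; the satellites-per-launch number is simply implied by the 10,000-launch estimate rather than stated by SpaceX.

```python
# Rough arithmetic check of the deployment figures quoted above. Assumes the
# article's estimate of ~10,000 Starship launches for 1 million satellites, plus
# Musk's stated hourly cadence and 200-tonne payload per flight.

total_satellites = 1_000_000
launches_needed = 10_000
sats_per_launch = total_satellites / launches_needed        # ~100 satellites per flight

hours_per_year = 24 * 365
years_at_hourly_cadence = launches_needed / hours_per_year  # ~1.1 years of non-stop hourly launches

tonnes_per_year = 200 * hours_per_year                      # ~1.75 million tonnes to orbit per year

print(sats_per_launch, round(years_at_hourly_cadence, 2), tonnes_per_year)
```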
The extensive proposed constellation could significantly impact astronomical research. SpaceX highlighted its ongoing collaboration with the scientific community in its application. However, in December, researcher Alejandro Borlaf from NASA Ames Research Center warned that adding 500,000 satellites could render “nearly all telescope images from the ground and space contaminated by satellites,” hampering scientific discovery.
These orbital data centers might be brighter than many existing satellites due to their need for large solar panels and radiators similar to those found on the International Space Station, designed to expel heat into space.
Whether or not SpaceX is genuinely prepared to deploy 1 million satellites remains uncertain. Given the staggering nature of this number, Pritchard-Kelly suggested this could be an instance of Musk’s “shock and awe” tactics, implying that the actual satellite count may be significantly lower. SpaceX and the FCC have not responded to requests for comments.
John Martinis is a leading expert in quantum hardware, who emphasizes hands-on physics rather than abstract theories. His pivotal role in quantum computing history makes him indispensable to my book on the subject. As a visionary, he is focused on the next groundbreaking advancements in the field.
Martinis’s journey began in the 1980s with experiments that pushed the limits of quantum effects, earning him a Nobel Prize last year. During his graduate studies at the University of California, Berkeley, he tackled the question of whether quantum mechanics could apply to larger scales, beyond elementary particles.
Collaborating with colleagues, Martinis developed circuits combining superconductors and insulators, demonstrating that multiple charged particles could behave like a single quantum entity. This discovery initiated the macroscopic quantum regime, forming the backbone of modern quantum computers developed by giants like IBM and Google. His work led to the adoption of superconducting qubits, the most common quantum bits in use today.
Martinis made headlines again when he led the team at Google that built the first quantum computer to claim quantum supremacy. For nearly five years, no classical computer could reproduce the outputs of the random quantum circuits the machine sampled, although improved classical algorithms eventually matched the feat.
Approaching seven decades of age, Martinis still believes in the potential of superconducting qubits. In 2024, he co-founded QoLab, a quantum computing startup proposing revolutionary methodologies aimed at developing a genuinely practical quantum computer.
Carmela Padavich Callahan: Early in your career, you fundamentally impacted the field. When did you realize your experiments could lead to technological advancements?
John Martinis: I questioned whether macroscopic variables obey quantum mechanics, and as a newcomer to the field, I felt it was essential to test that assumption rather than take it on faith. A fundamental quantum mechanics experiment intrigued me, even though it initially seemed daunting.
Our first attempt was a quick, simple experiment using the technology of the day. It failed, but I regrouped: after learning microwave engineering, we worked through numerous technical challenges before our later successes.
Over the next decade, our work on quantum devices continued while the theoretical foundations of quantum computing were laid, including Shor's breakthrough algorithm for factoring large numbers, which is central to modern cryptography.
How has funding influenced research and the evolution of technology?
Since the 1980s, the landscape has transformed dramatically. Initially, there was uncertainty about manipulating single quantum systems, but quantum computing has since blossomed into a vast field. It’s gratifying to see so many physicists employed to unravel the complexities of superconducting quantum systems.
Your involvement during quantum computing’s infancy gives you a unique perspective on its trajectory. How does that inform your current work?
Having long experience in the field, I possess a deep understanding of the fundamentals. My team at UC Santa Barbara developed early microwave electronics, and I later contributed to foundational cooling technology at Google for superconducting quantum computers. I appreciate both the challenges and opportunities in scaling these complex systems.
Cryostat for Quantum Computers
Mattia Balsamini/Contrasto/Eyeline
What changes do you believe are necessary for quantum computers to become practical? What breakthroughs do you foresee on the horizon?
After my tenure at Google, I reevaluated the core principles behind quantum computing systems, leading to the founding of QoLab, which introduces significant changes in qubit design and assembly, particularly regarding wiring.
We recognized that making quantum technology more reliable and cost-effective requires a fresh perspective on the construction of quantum computers. Despite facing skepticism, my extensive experience in physics affirms that our approach is on the right track.
It’s often stated that achieving a truly functional, error-free quantum computer requires millions of qubits. How do you envision reaching that goal?
The most significant advancements will arise from innovations in manufacturing, particularly in quantum chip fabrication, which is currently outdated. Many leading companies still use techniques reminiscent of the mid-20th century, which is puzzling.
Our mission is to revolutionize the construction of these devices. We aim to minimize the chaotic interconnections typically associated with superconducting quantum computers, focusing on integrating everything into a single chip architecture.
Do you foresee a clear leader in the quest for practical quantum computing in the next five years?
Given the diverse approaches to building quantum computers, each with its engineering hurdles, fostering various strategies is valuable for promoting innovation. However, many projects do not fully contemplate the practical challenges of scaling and cost control.
At QoLab, we adopt a collaborative business model, leveraging partnerships with hardware companies to enhance our manufacturing capabilities.
If a large-scale, error-free quantum computer were available tomorrow, what would your first experiment be?
I am keen to apply quantum computing solutions to challenges in quantum chemistry and materials science. Recent research highlights the potential for using quantum computers to optimize nuclear magnetic resonance (NMR) experiments, as classical supercomputers struggle with such complex quantum issues.
While others may explore optimization or quantum AI applications, my focus centers on well-defined problems in materials science, where we can craft concrete solutions with quantum technologies.
Why have mathematically predicted quantum applications not materialized yet?
While theoretical explorations of qubit behavior are promising, real-life qubits face significant noise, making practical implementations far more complex. Theorists have a comprehensive grasp of the theory but often overlook the intricacies of hardware development.
Through my training with John Clarke, I cultivated a strong focus on noise reduction in qubits, which proved beneficial in the experiments showcasing quantum supremacy. Addressing these challenges requires dedication to understanding the intricacies of qubit design.
As we pursue advancements, a dual emphasis on hardware improvements and application innovation remains crucial in the journey to unlock quantum computing’s full potential.
Unearthed in 1958 by a young fossil hunter in Albion, Brisbane, Queensland, Australia, dinosaur footprints have been officially recognized as the continent’s oldest, dating back approximately 230 million years to the late Triassic period. This discovery indicates that dinosaurs inhabited the Brisbane region far earlier than previously thought by paleontologists.
Trace fossils unearthed from Petrie's Quarry, Albion, Brisbane, Queensland, Australia. Image credit: Anthony Romilio & Bruce Runnegar, doi: 10.1080/03115518.2025.2607630.
The 18.5 cm (7 in.) long dinosaur footprint was discovered at Petrie’s Quarry, part of the Aspley Formation, alongside a slab featuring narrow linear grooves interpreted as possible tail traces.
Both specimens were extracted before the quarry site was redeveloped, passing through several university collections since then.
“This is the only dinosaur fossil discovered in an Australian capital, highlighting how significant finds can remain hidden in plain sight,” stated Dr. Anthony Romilio, a palaeontologist from the University of Queensland.
“Urban development has rendered the original site inaccessible, leaving behind these footprints as the only evidence of dinosaurs in the area.”
The footprints show impressions of three forward-facing toes, with the central toe demonstrating a faint fan-shaped outline, characteristics typical of a bipedal dinosaur.
Advanced 3D modeling and morphometric analysis revealed that this footprint closely resembles the ichnogenus Evazoum, commonly attributed to early sauropodomorph dinosaurs found elsewhere.
Based on the dimensions of the footprints, Dr. Romilio and Professor Bruce Runnegar estimated that the corresponding dinosaur stood about 78 centimeters (31 inches) tall at the hip and weighed around 40 kilograms (89 pounds).
Utilizing established scaling equations, researchers calculated the maximum potential running speed to be about 60 km/h (37 mph).
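The paper's exact scaling equations are not reproduced in the article, but the sketch below illustrates the general approach using two widely cited trackway conventions: hip height approximated as roughly four times footprint length (which matches the 78-centimeter figure), and Alexander's 1976 stride-based speed formula. The stride length is a hypothetical value chosen for illustration; the published 60 km/h estimate comes from a separate maximum-speed relation not shown here.

```python
import math

# Illustrative sketch of footprint-based body-size and speed estimation.
# The conventions below (hip height ~ 4 x footprint length; Alexander's 1976
# speed formula) are stand-ins for the authors' actual equations, and the
# stride length is hypothetical.

g = 9.81                        # gravitational acceleration, m/s^2
footprint_length = 0.185        # m (the 18.5 cm Brisbane footprint)

hip_height = 4 * footprint_length               # ~0.74 m, close to the reported ~78 cm
stride = 2.5                                    # m, hypothetical stride for a fast gait

speed_ms = 0.25 * math.sqrt(g) * stride**1.67 * hip_height**-1.17
print(f"hip height ~{hip_height:.2f} m, speed ~{speed_ms * 3.6:.0f} km/h for this stride")
```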
While no dinosaur skeletons have been found in the Aspley Formation, these footprints serve as the only direct evidence of dinosaur presence in this time and place.
“Dinosaurs may have walked along waterways, leaving their tracks preserved in sandstone that was later cut to build structures across Brisbane,” Dr. Romilio explained.
“If not for the foresight to conserve this material, the history of Brisbane’s dinosaurs would have remained completely unknown.”
“These footprints were made in sediment by large animals and exemplify a unique kind of trace fossil,” stated Professor Runnegar.
The associated tail print, approximately 13 centimeters (5 inches) long, aligns with structures interpreted as a dinosaur’s tail track. However, the authors caution that without preservation of the corresponding footprint in an appropriate location, its origin remains uncertain.
“The shallow linear grooves found in the tail block closely match reported tail-drag traces, yet lack any associated manus or pes impressions. Their true identity remains ambiguous,” they noted.
“Such grooves can result from tail contact in prosauropod trackways, but those typically occur within the trackway and near its midline, which is not the case here.”
The team's research paper has been published this week in Alcheringa: An Australasian Journal of Palaeontology.
_____
Anthony Romilio and Bruce Runnegar. Australia's oldest dinosaur: trace fossils from the Carnian Aspley Formation in Brisbane, Queensland, Australia. Alcheringa published online on February 1, 2026. doi: 10.1080/03115518.2025.2607630
A meaningful life can be filled with small acts of kindness.
Reuters/Eric Gaillard
The Dalai Lama has long stated that our primary purpose in life is to help others. Research indicates that making a positive impact on others significantly contributes to a sense of meaningful existence.
While some skeptics argue that human life lacks intrinsic meaning, this question has captivated philosophers for centuries. Researchers at the University of Eastern Finland highlight the importance of identifying activities and thoughts that foster a sense of meaning, which can assist therapists in guiding their clients.
In their quest to unravel this complex question, researcher Florian Cova and his team conducted extensive studies, including an online survey targeting hundreds of U.S. residents.
During several experiments, participants evaluated fictional characters, determining the meaningfulness, happiness, and desirability of their lives. For example, respondents admired Amelia, a lottery winner who generously donates to charities combating poverty and hunger, while also traveling to support these initiatives.
In subsequent studies, participants ranked various definitions of a meaningful life, assessing how they perceived their own existence on scales of meaning and fulfillment.
“Our findings revealed four dimensions,” says Fuhrer. Three of them (coherence, purpose, and significance) are elements that have been noted in previous studies. However, Fuhrer and Cova emphasize the discovery of a fourth dimension: the positive impact of our actions on others.
Other psychologists suggest that understanding, purpose, and significance are fundamental to a meaningful life—feeling that one’s existence carries weight and enduring value. Nonetheless, the latest research argues that the ‘significance’ many refer to is inherently tied to the positive effects of our actions, contributing to an overall sense of fulfillment. “I completely agree that such concepts are core to experiencing meaning,” remarks Tatiana Schnell from the MF Norwegian School of Theology in Oslo. “However, the terms ‘influence’ and ‘significance’ are fundamentally interchangeable.”
Ultimately, Schnell asserts that achieving a sense of meaning does not imply that every dimension of meaning is addressed. “The critical factor is to avoid areas in life that feel inconsistent, insignificant, or devoid of belonging,” she explains.
According to Frank Martela from Aalto University in Finland, many individuals express that their work feels meaningless. “They might receive a paycheck but feel unfulfilled,” warns Martela. In such cases, individuals may experience a lack of purpose, leading to feelings of hopelessness or depression.
Fuhrer and Schnell propose that to create a more profound impact, we must transcend self-centered pursuits and invest time in endeavors that benefit others. “Reflect on your identity, aspirations, and your potential contributions to the world, and find ways to sustainably support others,” suggests Schnell. Even small daily gestures, such as bringing coffee to a colleague, can imbue your life with meaning and purpose.
During the crucial refueling process, initiated at 12:30 PM ET on Monday, mission managers temporarily halted operations twice to investigate a hydrogen fuel leak emanating from the rear of the rocket.
Although testing of the Orion spacecraft atop the rocket resumed, the hydrogen leak reoccurred in the final moments of the mock launch countdown.
NASA reported that a built-in control system on the rocket, designed to manage the booster in the countdown’s critical final minutes, “automatically halted the countdown due to a sudden spike in liquid hydrogen leakage.”
Engineers are also looking into audio issues that affected communication channels for ground teams during the wet dress rehearsals.
The four astronauts set to embark on Artemis II — NASA's Reid Wiseman, Christina Koch, Victor Glover, and Canadian astronaut Jeremy Hansen — were expected to arrive at Kennedy Space Center on Tuesday afternoon after being quarantined in Houston since January 21 to minimize their exposure to illness before the mission.
However, NASA has confirmed that the astronauts will not proceed to Florida as anticipated and will be released from their quarantine.
Instead, they will undergo quarantine again approximately two weeks before the next targeted launch opportunity, according to agency officials.
Artemis II marks the second flight for NASA’s Space Launch System rocket and Orion capsule, and it will be the first mission with humans on board.
This much-anticipated launch is set to advance NASA’s objectives of returning astronauts to the lunar surface.
The previous unmanned Artemis I lunar orbit mission in 2022 faced a six-month delay due to a hydrogen leak detected during the initial wet dress rehearsal.
With so much riding on the program's first crewed flight, NASA is being cautious.
“Our highest priority remains the safety of our astronauts, personnel, systems, and the public,” Isaacman stated on X, emphasizing that NASA will “proceed with the launch only when we are confident in our readiness for this historic mission.”
In 2016, LJ was just 19 years old and on the brink of a transformative journey. After graduating from college with commendable grades, he was excited to explore the world. However, he soon discovered what seemed like a harmless lump on his neck.
“I remember finding a large lump on my neck,” he recalls. “I felt exhausted all the time. It started to interfere with my life.”
Despite visiting the doctor, LJ was convinced it was “just an infection” and delayed follow-up appointments until he was urgently called back for test results. “The doctor told me, ‘This is leukemia,’” LJ reflects. “I couldn’t believe it—cancer at my age? I didn’t even know what leukemia was back then.”
We spoke with LJ about his inspiring cancer journey, how photography became a vital coping tool during a year of intensive treatment, and how Macmillan Cancer Support played a crucial role in guiding him through pivotal decisions in his life.
LJ’s Story: A Life-Altering Diagnosis
In 2016, LJ received a diagnosis of acute lymphoblastic lymphoma, an aggressive cancer necessitating immediate action. Instead of diving into student life, he faced grueling hospital stays with a bleak prognosis of only a 5% survival rate.
“I was stuck in a hospital bed undergoing chemotherapy, surgeries, radiation therapy, and stem cell transplants… it was overwhelming,” LJ shares. “I endured numerous procedures and constant needles…”
“The hardest part is hearing that treatments aren’t effective,” he continues. “Chemotherapy fails, radiotherapy fails, surgery fails. There’s a lot of chaos and distress. Despite your hopes and beliefs, things might not go as planned.”
LJ and his tutor Margot: discovering a passion for photography during treatment
Finding Freedom in Photography
Before his diagnosis, LJ was a typical teenager, exploring creativity, traveling, skateboarding, and enjoying time with friends in London. Suddenly, he found himself “isolated in a room or a ward”, painfully aware that days felt like they had no end.
As the walls of his physical environment closed in, LJ discovered that photography and videography opened a new world for him. By documenting his experiences through photos and videos, he created a much-needed escape: a creative outlet and a way to process his reality.
“I had a little Canon PowerShot G7 camera at the time,” he shares. “Taking photos helped me express my feelings without leaving the hospital. I could capture my emotions and enjoy the creative process. It was incredibly fulfilling.”
Macmillan’s guidance empowered LJ to make important decisions during critical moments
Macmillan Support During a Crucial Time
During this challenging treatment phase, LJ came across vital information from Macmillan Cancer Support in the form of a pamphlet, which provided essential guidance for his future. “I received a leaflet from Macmillan about cancer and fertility,” he states.
“After multiple surgeries, fertility can be affected, and I learned that I might not be able to have children,” he reveals. “The insights in that pamphlet helped me comprehend my situation significantly.”
Now, a decade after his diagnosis and in remission, the support from Macmillan has made a lasting impact on LJ’s life. “Without that booklet, I would have likely made decisions I’d regret,” he states.
Gifts in wills fund over a third of Macmillan’s services, including the resources that aided LJ, ensuring continued access to trusted cancer support, from helplines to informational booklets and community support across the UK. Having clear guidance and support from Macmillan was pivotal for LJ in making informed decisions about sperm storage at a critical time.
Thanks to Macmillan’s support, LJ is dedicated to raising awareness about cancer in young men
Why Consider Leaving a Gift to Macmillan Cancer Support in Your Will?
As LJ approaches a decade since his diagnosis, he has transformed his life, establishing himself as a skilled fashion and event photographer. He is also involved with Macmillan, helping to spread cancer awareness among young men.
“Macmillan helped me share my story and be heard. If my experience inspires someone to keep fighting, then I feel fulfilled,” he adds.
In the UK, someone is diagnosed with cancer every 90 seconds. LJ understands the importance of having Macmillan’s support when it matters most, and he has a special message for those contemplating leaving a legacy gift.
“Each day, many people receive a cancer diagnosis. While no one can fully understand your feelings, having someone who can clarify information about your cancer is invaluable. That’s the kind of support Macmillan offers.”
Thanks to Macmillan’s guidance, LJ was able to better understand his situation while focusing on his passions. Your legacy gift will empower Macmillan to provide essential care to more individuals facing cancer, regardless of their background. For more information on how to leave a gift in your Will to Macmillan Cancer Support, request our free Gifting in a Will guide.
Consuming only oats for two consecutive days may significantly improve your metabolic health, according to new research.
A study involving 17 participants had them eat 300 g (10.5 oz) of oatmeal daily, prepared with water and optionally topped with fruits and vegetables, for two days without any other food.
Participants lost approximately 2 kg (4.4 lb) and experienced a 10% reduction in their LDL (bad) cholesterol levels. Improvements in gut health and blood pressure were noted, with lasting effects even six weeks after the diet.
Researchers from the University of Bonn in Germany compared this short-term intervention to a six-week long-term diet, where another group of 17 participants added 80 g (2.8 oz) of oats to their regular meals.
The short-term oat-only approach proved more effective, highlighting that the metabolic benefits observed stem from an intensive, short-term diet rather than gradual inclusion into everyday eating habits.
All participants were diagnosed with metabolic syndrome, which affects nearly one-third of adults and is often accompanied by weight gain, elevated blood pressure, increased blood sugar, and high cholesterol levels.
Metabolic syndrome heightens the risk of obesity-related diseases such as type 2 diabetes and heart conditions, and is linked to poor gut health.
Research indicates that oats may provide essential fiber, vitamins, minerals, and anti-inflammatory compounds known as polyphenols, making them beneficial for this demographic.
High LDL cholesterol is a significant risk factor for heart disease, heart attacks, and strokes – Credit: Getty Images
During the digestive process, some foods are metabolized by gut microorganisms, which release chemicals that can have various effects on your health depending on the food and the type of bacteria involved.
The two-day porridge diet led to an increase in beneficial compounds, particularly ferulic acid, known to lower cholesterol levels in past studies.
Participants who exhibited the highest increases in ferulic acid also experienced notable reductions in total and LDL cholesterol.
Scientists concluded that the gut interaction between bacteria and oats indicates that a brief oat-centric diet could serve as an affordable, sustainable, and effective strategy for addressing metabolic syndrome.
This research was published in Nature Communications.
Cellulite is a common skin concern characterized by uneven, dimpled areas often likened to the texture of orange peel or cottage cheese. It is predominantly found on the thighs, buttocks, and hips.
Research indicates that 80-90 percent of women develop cellulite after puberty; men are significantly less affected, though not entirely immune.
Despite its prevalence, cellulite is often misunderstood and incorrectly associated solely with excess weight or an unhealthy lifestyle. In reality, multiple factors contribute to its formation.
What Causes Cellulite?
Cellulite results from an interplay between fat cells, connective tissue, and skin structure. Fibrous bands, known as septa, anchor the skin to the underlying muscles.
In women, these bands are vertically oriented, causing adipose lobules (fat cell clumps) to push through, creating dimples when the skin above is thinner or less elastic.
Men tend to have a criss-cross arrangement of connective tissue, which holds fat more evenly and makes cellulite far less visible. This structural difference is one reason men rarely develop it.
The structure of our skin helps explain why men are less likely to develop cellulite – Photo credit: Getty
Hormones, particularly estrogen, significantly influence this process. Estrogen affects blood flow to the skin, fat distribution, and tissue structure.
Factors like puberty, pregnancy, and hormonal changes during menopause or through hormonal contraceptives can lead to an increase in cellulite visibility.
Genetics also play a crucial role in determining skin thickness, collagen integrity, and fat distribution. Aging further exacerbates these effects as collagen production declines and skin thins, making cellulite more noticeable.
Lifestyle factors such as smoking and poor circulation contribute to the development of cellulite.
In conclusion, cellulite arises from a combination of biological and environmental factors, including chemical pollution. It’s not merely a result of being overweight.
Can Cellulite Be Removed?
Despite a booming industry promising quick fixes, no treatment has proven to permanently eliminate cellulite. It is not classified as a disease, but rather a typical structural characteristic of human skin. However, various approaches can temporarily diminish its appearance.
Lifestyle changes can be beneficial. Incorporating strength training enhances muscle tone and reduces skin laxity, while aerobic exercises improve circulation.
While managing weight may shrink fat cells, cellulite can still persist in those with a healthy weight. A balanced diet and quitting smoking promote overall skin and connective tissue health but do not specifically target cellulite.
Topical treatments featuring caffeine or retinol may yield minor short-term improvements by dehydrating fat cells or thickening the skin.
Massage techniques, ranging from manual methods to devices like endermologie, can enhance lymphatic drainage and circulation but offer only temporary results.
Medical procedures can provide more significant effects, such as laser and radiofrequency treatments designed to disrupt fibrous bands and stimulate collagen production. Subcision, a minor surgical technique, releases these bands under the skin.
Although these options can enhance skin texture for months or even years, they can be costly, invasive, and carry certain risks.
Conclusion
Cellulite should be viewed as a normal aspect of human skin, particularly in women. It does not indicate poor health, fitness, or self-care and does not require treatment unless one desires cosmetic improvements.
Embracing cellulite as a natural variation in body structure can help shift the perspective away from “fixing” it and towards accepting it as part of human diversity.
This article addresses the query “Why does cellulite form and can it be reduced?” (submitted by Judy Price from Solihull).
If you have any questions, feel free to reach out via questions@sciencefocus.com or message us on Facebook, Twitter, or Instagram (please remember to include your name and location).
Paleontologists have made significant strides in understanding Europe's elusive ceratopsians through newly discovered fossils and advanced imaging techniques. Notably, dinosaurs long thought to be relatives of Iguanodon have been reclassified as true members of the ceratopsian clade.
Possible restoration of Aikaceratops kosmai. Image credit: Matthew Dempsey.
Ceratopsians are a fascinating group of herbivorous ornithischian dinosaurs recognized for their unique parrot-like beaks, bony frills, and distinctive horns.
These remarkable creatures thrived primarily during the Late Cretaceous period, approximately 100 to 66 million years ago.
While ceratopsian fossils are plentiful in Asia and North America, they are notably scarce in Europe.
Previously, evidence in Europe consisted of only a few incomplete and contentious specimens.
The recent discovery of new, well-preserved fossils of Aikaceratops from the Late Cretaceous in Hungary has sparked new research employing CT scans and thorough evolutionary analysis.
Initially described in 2010, Aikaceratops has been controversial, with some experts regarding it as a horned dinosaur and others viewing it as a relative of Iguanodon rather than a ceratopsian, even though it exhibits traits suggestive of ceratopsian affinities.
Utilizing new skull material, Professor Susannah Maidment from the Natural History Museum and the University of Birmingham, along with her colleagues, determined that the dinosaur is not only a ceratopsian but also part of the previously identified rhabdodontid family, which includes Mochlodon.
The analysis revealed that several previously misclassified dinosaurs assumed to be rhabdodontids were in fact ceratopsians.
“Although iguanodon and triceratops appear distinctly different, they share a common ancestor, inheriting certain traits,” Professor Maidment stated.
“Both groups independently evolved quadrupedal stances, complex chewing mechanisms, and large body sizes.”
“This shared history makes their teeth and limbs quite similar, complicating classification based on partial skeletons.”
This study confirms the presence of ceratopsians in Europe, addressing a long-standing gap in our understanding of these dinosaurs’ migration across the Northern Hemisphere.
“The initial fossil of Aikaceratops was so incomplete that many scientists doubted its classification as a ceratopsian,” noted Professor Richard Butler from the University of Birmingham.
“What’s fascinating about the new findings regarding Aikaceratops is that they validate the existence of horned dinosaurs in Cretaceous Europe and challenge us to rethink our understanding of ancient ecosystems.”
The earliest ceratopsians, such as Yinlong, originated in Asia and migrated multiple times to North America, leading to the evolution of frilled species like triceratops and Torosaurus.
The most plausible route for this dispersal would have been through Europe, though the scarcity of fossils has posed challenges.
“We know that dinosaurs were capable of crossing the Atlantic Ocean, which was just starting to form during the Cretaceous,” explained Professor Maidment.
“Dinosaurs like Allosaurus have been discovered in both Portugal and the United States, suggesting some level of intercontinental travel was possible.”
“Many dinosaurs could swim, and the islands within the Central European Basin were relatively close, making island-hopping a likely scenario.”
“While Triceratops is one of the most recognized horned dinosaurs, most species are known from North America, and the European ones were long misidentified as other kinds of dinosaur, obscuring their presence on the continent,” remarked Professor Steve Brusatte from the University of Edinburgh.
The findings are detailed in a study published in the journal Nature.
_____
S.C.R. Maidment et al. Hidden diversity of ceratopsians in Late Cretaceous Europe. Nature published online on January 7, 2026. doi: 10.1038/s41586-025-09897-w
Researchers from Korea University are paving the way for more efficient and cost-effective renewable energy generation by utilizing gold nanospheres designed to capture light across the entire solar spectrum.
Hung Lo et al. introduced plasmonic colloidal superballs as a versatile platform for broadband solar energy harvesting. Image credit: Hung Lo et al., doi: 10.1021/acsami.5c23149.
Scientists are exploring novel materials that efficiently absorb light across the solar spectrum to enhance solar energy harvesting.
Gold and silver nanoparticles have been identified as viable options due to their ease of fabrication and cost-effectiveness, yet current nanoparticles primarily absorb visible wavelengths.
To extend absorption into additional wavelengths, including near-infrared light, researcher Seungwoo Lee and colleagues from Korea University propose the innovative use of self-assembled gold superballs.
These unique structures consist of gold nanoparticles aggregating to form small spherical shapes.
The diameter of the superball was meticulously adjusted to optimize absorption of sunlight’s diverse wavelengths.
The research team first employed computer simulations to refine the design of each superball and predict the overall performance of the superball film.
Simulation outcomes indicated that the superball could absorb over 90% of sunlight’s wavelengths.
Next, the scientists created a film of gold superballs by drying a solution containing these structures on a commercially available thermoelectric generator, a device that converts heat into electricity.
Films were produced under ambient room conditions—no cleanroom or extreme temperatures needed.
In tests using an LED solar simulator, the average solar absorption rate of the superball-coated thermoelectric generator reached approximately 89%, nearly double that of a conventional thermoelectric generator featuring a single gold nanoparticle membrane (45%).
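The “average solar absorption rate” quoted here is a solar-spectrum-weighted average: absorption at each wavelength counts in proportion to how much sunlight arrives at that wavelength. The sketch below shows the calculation with made-up placeholder curves, not the team's measured spectra.

```python
import numpy as np

# Spectrum-weighted average absorption, computed with hypothetical placeholder
# curves (not the study's measured data). A real calculation would use the
# measured absorption spectrum and a reference solar spectrum such as AM1.5.

wavelengths = np.linspace(400, 1600, 121)                  # nm, visible to near-infrared
solar_weight = np.exp(-((wavelengths - 700) / 500) ** 2)   # crude stand-in for a solar spectrum
absorption = 0.95 - 0.0001 * np.abs(wavelengths - 800)     # hypothetical broadband absorber

weighted_avg = np.sum(absorption * solar_weight) / np.sum(solar_weight)
print(f"spectrum-weighted average absorption: {weighted_avg:.2f}")
```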
“Our plasmonic superball offers a straightforward method to harness the entire solar spectrum,” said Dr. Lee.
“Ultimately, this coating technology could significantly reduce barriers for high-efficiency solar and photothermal systems in real-world energy applications.”
The team’s research is published in the journal ACS Applied Materials & Interfaces.
_____
Ro Kyung Hoon et al. 2026. Plasmonic Supraball for Scalable Broadband Solar Energy Generation. ACS Applied Materials & Interfaces 18 (1): 2523-2537; doi: 10.1021/acsami.5c23149
Astronomers utilizing the groundbreaking Event Horizon Telescope—a global network of eight advanced radio telescopes—have pinpointed the likely origin of a massive space jet emanating from the core of Messier 87.
This Webb/NIRCam image showcases the extraordinary space jet of Messier 87. Image credits: Jan Röder, Maciek Wielgus, Joseph B. Jensen, Gagandeep S. Anand, R. Brent Tully.
Messier 87, a colossal elliptical galaxy situated approximately 53 million light-years away in the Virgo constellation, is of great scientific interest.
Also known as M87, Virgo A, and NGC 4486, this galaxy hosts a supermassive black hole, approximately 6 billion times the mass of our Sun.
This supermassive black hole generates a striking, narrow jet of particles that extends roughly 3,000 light-years into the cosmos.
To investigate such distant regions, astronomers are combining radio telescopes from around the world to create a virtual Earth-sized observatory known as the Event Horizon Telescope (EHT).
Using EHT observations of M87 conducted in 2021, researchers assessed the brightness of radio emissions at various spatial scales.
They discovered that the luminous ring surrounding the black hole does not account for all radio emissions, identifying an additional compact source approximately 0.09 light-years from the black hole that aligns with the predicted location of the jet’s base.
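A quick back-of-the-envelope calculation shows why an Earth-sized virtual telescope is needed: at M87's distance, an offset of 0.09 light-years spans only a few hundred microarcseconds on the sky, which is resolvable with the EHT's resolution of roughly 20 microarcseconds. The distances come from the article; the arithmetic is elementary geometry.

```python
import math

# Angular scale of a 0.09-light-year offset viewed from ~53 million light-years away.
offset_ly = 0.09
distance_ly = 53e6

angle_rad = offset_ly / distance_ly                 # small-angle approximation
angle_uas = math.degrees(angle_rad) * 3600 * 1e6    # radians -> microarcseconds

print(f"~{angle_uas:.0f} microarcseconds")          # roughly 350 microarcseconds
```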
“By pinpointing where the jet originates and how it connects to the black hole’s shadow, we are adding significant insights into this cosmic puzzle,” stated Saurabh, a student at the Max Planck Institute for Radio Astronomy and a member of the EHT Collaboration.
“The newly collected data is currently undergoing analysis with contributions from international partners and will soon incorporate additional telescopes, improving our understanding of this area,” remarked Dr. Sebastiano von Fehrenberg, an astronomer at the Canadian Institute for Theoretical Astrophysics.
“This will provide us with a much clearer view of the jet’s launch region.”
“We’re transitioning from merely calculating the positions of these structures to aiming for direct imaging,” he added.
“The jet is postulated to be launched using the rotational energy of the black hole through electromagnetic processes, presenting a unique laboratory where general relativity and quantum electrodynamics intersect,” explained Professor Bert Lipperda, also from the Canadian Institute for Theoretical Astrophysics.
“Studying how jets are launched in proximity to a black hole’s event horizon is a crucial advancement in our comprehension of these cosmic titans.”
“The observational data will empower scientists to test theories regarding the interplay between gravity and magnetism in the universe’s most extreme environments, bringing us closer to understanding the ‘engines’ that shape entire galaxies.”
The results were published on January 28, 2026, in the journal Astronomy & Astrophysics.
_____
Saurabh et al. 2026. Investigation of the jet base emission from M87* with 2021 Event Horizon Telescope observations. A&A 706, A27; doi: 10.1051/0004-6361/202557022
Astronomers utilizing the NASA/ESA Hubble Space Telescope have captured stunning new images of the lenticular galaxy NGC 7722.
This captivating Hubble image showcases NGC 7722, a lenticular galaxy located approximately 187 million light-years from Earth in the constellation Pegasus. Image credits: NASA / ESA / Hubble / RJ Foley, UC Santa Cruz / Dark Energy Survey / DOE / FNAL / DECam / CTIO / NOIRLab / NSF / AURA / Mehmet Yüksek.
NGC 7722, also known by its alternate names IRAS 23361+1540, LEDA 71993, and UGC 12718, was first discovered on August 12, 1864, by German astronomer Heinrich Louis d'Arrest.
This intriguing lenticular galaxy is part of the NGC 7711 group, which comprises seven prominent galaxies.
“Lenticular galaxies represent a unique classification that exists between the well-known spiral and elliptical galaxies,” Hubble astronomers stated.
“These galaxies are less common as their ambiguous morphology makes it challenging to classify them definitively as spiral, elliptical, or a hybrid of both.”
“Many known lenticular galaxies, including NGC 7722, exhibit features of both spiral and elliptical types.”
“Although NGC 7722 lacks the prominent arms characteristic of spiral galaxies, it showcases a magnificent glowing halo and a bright central bulge reminiscent of elliptical galaxies,” the researchers explained.
“Unlike elliptical galaxies, NGC 7722 possesses a visible disk featuring concentric rings swirling around a luminous core.”
“One of its most remarkable attributes is the long lanes of dark red dust that elegantly curl around the outer disk and halo.”
Recent images of NGC 7722 taken with Hubble’s Wide Field Camera 3 (WFC3) bring the galaxy’s striking dust lanes into sharp focus.
“Dust bands are common among lenticular galaxies and create a stunning contrast against the smooth, luminous halo typically surrounding such galaxies,” the astronomers added.
“The distinctive dust lane of NGC 7722, like many other lenticular galaxies, is believed to result from a past merger with another galaxy.”
“While the exact formation processes of lenticular galaxies remain elusive, mergers and gravitational interactions are thought to play a critical role in altering their shapes and influencing their gaseous and dusty content.”
Common air pollutants like ozone and nitric oxide can alter the scent of ants, triggering aggressive behavior from nestmates who perceive them as intruders.
Ants rely on scent for social recognition, and when they encounter individuals with unfamiliar scents, they often react with aggression, biting or even killing the perceived intruder. Notably, ozone—a greenhouse gas that forms when vehicle exhaust and industrial emissions react in sunlight—can alter the chemical makeup of alkenes, compounds important to a colony's unique scent profile.
Markus Knaden and researchers at the Max Planck Institute for Chemical Ecology in Jena, Germany, previously noted that ozone-induced changes in alkenes can disrupt insect communication, leading to phenomena such as fruit flies mating with inappropriate partners or pollinators like the tobacco moth losing interest in flowers. But how does ozone affect ant behavior?
To explore this, Knaden's team established artificial colonies of six ant species and exposed some of the workers, in glass chambers, to ozone concentrations mirroring summer levels recorded in Jena. When the exposed ants were returned to their colonies, they were attacked by their nestmates.
“I honestly didn’t expect this outcome,” Knaden remarked. “We anticipated the ozone might alter just a small fraction—2 to 5 percent—of the overall scent blend.”
In natural settings, such aggressive behavior can hinder colony efficiency, even without any fatalities among the ants, though it is complex to design experiments that effectively measure these impacts.
Daniel Cronauer, a professor at Rockefeller University in New York, commented that the aggression observed is not surprising given the crucial role alkenes play in identifying nestmates.
Alkenes also facilitate other vital ant behaviors, including tracking via footprints and communication between larvae and adults. The study indicates that ozone exposure may lead adults of the clonal raider ant (Ooceraea biroi) to neglect their larvae, suggesting that these changes could disrupt various facets of ant life and potentially affect broader ecosystem dynamics.
“In most terrestrial ecosystems, the removal of ants would likely lead to catastrophic consequences,” Cronauer stated. Ants are pivotal for dispersing seeds, aerating soil, and fostering symbiotic relationships with other species.
With global insect populations in decline, this research adds to a growing body of evidence linking air pollution to these declines. Knaden asserts that while current ozone levels may not pose immediate dangers to humans, “we must acknowledge the unseen consequences of our actions.”
A groundbreaking discovery has unveiled a new genus and species of small bipedal dinosaur from fossils found in Burgos, Spain.
Reconstruction of Foskeia pelendonum. Image credit: Martina Charnel.
Foskeia pelendonum thrived during the Early Cretaceous period, approximately 120 million years ago.
This newly identified species is part of the Rhabdodontomorpha, a group of ornithischian dinosaurs that existed primarily from the early to late Cretaceous period.
The diminutive dinosaur was roughly the size of a modern chicken, setting it apart from many of its larger ornithischian relatives.
“From the outset, we recognized the uniqueness of these bones due to their small size,” stated Dr. Fidel Torcida Fernández Baldor, a paleontologist at the Salas de los Infantes Dinosaur Museum.
“Remarkably, this study challenges established global theories regarding the evolution of ornithopod dinosaurs.”
“Miniaturization does not imply evolutionary simplicity. This skull is distinctive and highly specialized,” added Dr. Marcos Becerra from the National University of Córdoba.
“Foskeia pelendonum bridges a 70-million-year gap, serving as a small key to unlock a vast chapter of evolutionary history,” remarked Dr. Thierry Tortosa, a paleontologist at the Sainte-Victoire Nature Reserve.
“This is not a ‘mini’ iguanodon; it’s fundamentally different,” emphasized Dr. Tabata Zanesco Ferreira from the Federal University of Rio de Janeiro.
“Its anatomy is peculiar in a way that fundamentally alters the evolutionary tree,” said Dr. Penélope Cursado-Caballero from the University of La Laguna.
The fossilized remnants of at least five individuals of Foskeia pelendonum were excavated from the Vegaguete site in Burgos, Spain.
“This site is part of the Castrillo de la Reina Formation, located between Villanueva de Calazo and Salas de los Infantes,” the researchers noted.
Histological analysis confirmed that the largest specimen was a sexually mature adult.
“The microstructure of the bones suggests that at least one individual was an adult with a metabolic rate akin to that of small mammals and birds,” said Dr. Cohen Stein from Vrije Universiteit Brussel.
“Understanding growth and development is vital for comparing anatomical structures within Foskeia pelendonum and other species.”
“Juveniles often exhibit anatomical changes as they mature.”
Phylogenetic analysis indicates that Foskeia pelendonum is closely related to the Australian dinosaur Muttaburrasaurus, extending the European clade Rhabdodontia.
“Our findings indicate that herbivorous dinosaurs comprise a natural group called Phytodinosauria,” stated Dr. Paul-Emile Dieudonné from the National University of Rio Negro.
“This hypothesis requires further examination with additional data.”
Despite its modest size, Foskeia pelendonum exhibits specialized dentition and postural changes during growth, adapting for rapid movement through dense vegetation.
“These fossils illustrate that significant evolutionary experiments occurred at both small and large body sizes,” Dieudonné concluded.
“The future of dinosaur research relies on recognizing the significance of small details.”
The discovery of Foskeia pelendonum is detailed in a research paper published in the journal Papers in Palaeontology.
_____
Paul-Emile Dieudonné et al. 2026. Foskeia pelendonum, a new rhabdodontomorph from the Early Cretaceous of Salas de los Infantes (Burgos Province, Spain), and a new lineage of ornithischian dinosaurs. Papers in Palaeontology 12 (1): e70057; doi: 10.1002/spp2.70057
Timing Cancer Treatment: A Simple Yet Effective Intervention
Kenneth K. Lam/ZUMA Press/Alamy
The first randomized controlled trial investigating the timing of cancer immunotherapy has revealed that administering treatment earlier in the day may significantly enhance patient survival rates.
Human cells and tissues operate on a 24-hour cycle, known as the circadian rhythm, influencing various bodily functions including mood, metabolism, and immune response.
Numerous observational studies have indicated that cancer patients receiving checkpoint inhibitors (a class of immunotherapy drugs that empower the immune system to combat cancer) earlier in the day show a lower risk of disease progression and mortality.
Recently, Francis Levy and his team at the University of Paris-Saclay, France, conducted the first randomized controlled trial focused on chronotherapy—timing treatments based on circadian rhythms—utilizing both chemotherapy and immunotherapy.
In this study, 210 patients diagnosed with non-small cell lung cancer were given four doses of either pembrolizumab or sintilimab, two checkpoint inhibitors that function similarly.
Every three weeks, half of the participants received their doses before 3 p.m., while the others received treatments later. All patients also received chemotherapy immediately after each immunotherapy session. Chemotherapy targets rapidly dividing cells and is believed to have a lesser connection to circadian rhythms than immunotherapy.
This timing was strictly adhered to during the initial four cycles of the combined immunochemotherapy treatments. Following this period, all participants continued receiving the same medications until their tumors advanced or no longer responded, but without specific timing guidelines. Previous research suggests that the first four cycles are crucial, as noted by team member Zhang Yongchang from Central South University, China.
Participants were monitored for an average of 29 months post-initial treatment. Results showed that those treated before 3 p.m. had a median survival of 28 months, compared to 17 months for those treated later in the day. “The results are dramatically positive,” Levy stated. “Survival time nearly doubles.”
“When we compare our findings to significant trials that resulted in new drug approvals, such large effects are rarely observed,” noted Pasquale Innominato from the University of Warwick, UK. He emphasized that the study demonstrates a definitive link between treatment timing and survival outcomes, deeming it solid evidence of causation.
This dramatic improvement may be attributed to T cells, a type of immune cell targeted by checkpoint inhibitors, which tend to accumulate near tumors in the morning and gradually enter the bloodstream later. Administering immunotherapy earlier could position T cells closer to tumors, enabling more effective destruction, according to Levy.
Levy also emphasized the need for further studies to explore if more precise timing, such as 11 a.m., offers additional advantages compared to broader scheduled treatments. Innominato pointed out that having flexibility in timing is advantageous for busy healthcare facilities.
Further investigation is necessary to determine whether managing the timing of chemoimmunotherapy beyond the first four cycles yields greater benefits, Levy mentioned. Individual variability could also play a critical role; for example, a morning person may have different immune responses compared to a night owl.
Whether these findings apply to various cancer types remains an open question. Innominato anticipates similar results in other tumors commonly treated with immunotherapy, like skin or bladder cancers, but tempered his expectations for tumors such as prostate or pancreatic cancers that often resist treatments.
Possible Large Clump of Dark Matter Near Our Solar System
Credit: Alamy
New observations hint at a gigantic cloud of dark matter lying relatively close to our solar system. No such cloud had previously been identified within the Milky Way; this one was detected thanks to precise cosmic clocks known as pulsars.
Current cosmological models propose that galaxies are enveloped in diffuse clouds of dark matter called halos, with smaller subhaloes scattered throughout. However, the elusive nature of dark matter, which neither emits, absorbs, nor reflects light, complicates the detection of these halos and subhalos.
To search for these structures, Sukanya Chakrabarti and her research team at the University of Alabama in Huntsville turned to pairs of rapidly spinning neutron stars known as pulsars. These cosmic clocks emit beams of light at consistent intervals, allowing researchers to measure tiny variations in their trajectories when they are influenced by a large nearby mass.
Given that dark matter interacts with ordinary matter solely through gravity, an adjacent dark matter subhalo would alter the orbit of neighboring pulsars. This is precisely what Chakrabarti and her collaborators identified approximately 3,000 light years from our solar system. “Our observations detected a pair of pulsars whose motions indicate an unexpected gravitational pull from an unseen object,” comments Philip Chan from the University of Wisconsin-Milwaukee.
The research revealed that this gravitational influence originated from an object approximately 60 million times more massive than the Sun and spanning hundreds of light years. After mapping the location against stellar data, no correlations with known celestial bodies were found. If validated, this object could be a unique example of dark matter.
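To get a feel for the scale of the effect, here is a rough back-of-envelope sketch in Python. It treats the reported clump of roughly 60 million solar masses as a single point mass (the real object is extended, spanning hundreds of light years) and computes the Newtonian pull it would exert on a pulsar a few thousand light years away; the distance used is illustrative, not a figure from the study.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
LIGHT_YEAR = 9.461e15  # metres

def acceleration(mass_solar: float, distance_ly: float) -> float:
    """Newtonian acceleration (m/s^2) from a point mass at a given distance."""
    m = mass_solar * M_SUN
    r = distance_ly * LIGHT_YEAR
    return G * m / r**2

# Illustrative numbers: a ~6e7 solar-mass clump, ~3,000 light years away
a = acceleration(6e7, 3_000)
print(f"Acceleration: {a:.1e} m/s^2")   # roughly 1e-11 m/s^2

The result is of order 10^-11 metres per second squared, which gives a sense of why only exceptionally stable clocks such as binary pulsars can reveal this kind of gravitational nudge.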
This potential dark matter subhalo could be the only instance of such size in our local galactic vicinity. “There may only be one or two of these large features nearby, depending on dark matter models,” suggests Alice Quillen at the University of Rochester in New York. “Different dark matter theories propose varying distributions of these structures.”
This pursuit is what catalyzed Chakrabarti’s interest in subhalo research. “Our objective is to map as many subhaloes as we can throughout the galaxy, and we’re just beginning to achieve that. Ultimately, we aim to elucidate the nature of dark matter,” she asserts.
However, pulsar binaries are scarce; only 27 instances provide sufficient accuracy for measuring gravitational acceleration. This scarcity explains why this subhalo remained undetected until now. “Given the finite number of pulsars, we are exploring alternative methods to monitor them using a broader array of objects,” states Zhang. If successful, this could be a breakthrough in understanding the true nature of dark matter.
Plastics are embedded in nearly every aspect of modern life, from packaging and apparel to construction and automobiles. Derived from petroleum, they persist as waste for centuries and now litter the environment as ever-smaller fragments. Research shows plastics are polluting ecosystems, impacting marine life, and even entering human bloodstreams. Despite their affordability, durability, and versatility, finding sustainable replacements remains challenging. Scientists are now exploring eco-friendly alternatives made from naturally sourced materials, known as bioplastics. However, traditional bioplastics often lack strength, are temperature sensitive, and are tough to mass-produce.
Researchers at Northeastern Forestry University in China are pioneering a new bioplastic derived from bamboo. This innovative material retains bamboo’s inherent strength and flexibility while being fully recyclable, making it suitable for a range of applications—from everyday household items to demanding industrial uses.
The research team began by extracting long chains of plant molecules, specifically cellulose, tightly bound within the bamboo. They utilized peroxyformic acid to remove the bonds without damaging the cellulose bundles. This washing step also eliminated plant cells that could disrupt and weaken the bioplastic’s structure.
To further process the cellulose bundles, researchers treated them with a special concoction of formic acid, zinc chloride, and water, forming a deep eutectic solvent (DES). Zinc chloride acted as a molecular zipper, breaking hydrogen bonds in the cellulose chains. Formic acid stabilized the fibers and prevented premature bond reformation, allowing for a more uniform arrangement of cellulose strands.
Next, calcium chloride was introduced as a molecular zipper slider to facilitate the reformation of hydrogen bonds among the rearranged cellulose chains. This process created an enhanced 3D network known as a hydrogel. Both the DES and calcium chloride acted as dual-function molecular zippers, effectively restructuring the cellulose network while avoiding the extreme temperatures, pressures, and harsh chemicals commonly used in cellulose processing.
Afterward, the researchers soaked the hydrogel in ethanol, which triggered a tightening and hardening of the cellulose chains by drawing out water. This transformation converted the flexible hydrogel into a denser, more robust bioplastic. The team then assessed how these changes impacted the mechanical performance of the bioplastic.
Their experiments showed significant improvements in mechanical properties: the bioplastic was five times harder and could withstand stretching and bending forces 11 times and 1,150 times greater, respectively, before failing. It could also endure 1,290 to 3,330 times more deformation than traditional soft gels under the same conditions.
To investigate its adaptability, researchers subjected the bioplastic to various environmental conditions. Samples stored at -30°C (-22°F) and 100°C (212°F) for 7 days showed no signs of melting or brittleness. The bioplastic was capable of bending at temperatures above 250°C (482°F), far exceeding the temperature limits of most conventional plastics. It also maintained its shape and structural integrity after 30 days in high humidity and 7 days exposed to harsh acids and solvents.
In terms of manufacturing versatility, the researchers found that the bioplastics could be molded and cast using similar techniques as traditional plastics, without requiring high temperatures or pressures. Waste from the production process was recyclable, with both bioplastic remnants and DES recoverable for reuse. Remarkably, new bioplastics produced from recycled materials exhibited mechanical properties on par with those derived from fresh components.
Furthermore, they buried bioplastic samples to observe their degradation. Unlike conventional petroleum-based plastics that can persist for centuries, bamboo-based bioplastics fully decompose in soil within just 50 days.
The research concludes that bamboo can be transformed into a recyclable bioplastic through a scalable, sustainable synthesis process. With exceptional mechanical performance and environmental resilience, bamboo-based bioplastics may serve as a superior alternative to commercially available plastics, potentially mitigating pollution and reducing reliance on petroleum.
As NASA gears up for the highly anticipated Artemis II mission, the space agency is preparing for a crucial test that will determine the readiness of its powerful Moon rocket, the Space Launch System (SLS).
This essential “wet dress rehearsal” simulates a full launch day, allowing engineers to fill the SLS rocket with fuel and perform all launch operations up to 30 seconds before liftoff, mimicking real mission conditions.
The results of this rehearsal will be instrumental for engineers and mission managers to evaluate the booster’s performance and overall readiness for the Artemis II mission.
Set to launch as soon as Sunday, Artemis II will embark on a groundbreaking 10-day mission, taking four astronauts farther from Earth than any humans have ventured before.
However, the actual launch date will heavily rely on the outcomes from the wet dress rehearsal.
NASA Administrator Jared Isaacman will hold a press conference with the Artemis II crew on January 17th at Kennedy Space Center. Joe Raedle/Getty Images
“We’ll take some time to review the data and prepare for launch,” stated Artemis launch director Charlie Blackwell-Thompson during last month’s press conference.
If the rehearsal proceeds without issues, NASA could announce a targeted launch date in a matter of days. Conversely, any problems could lead to mission delays.
Engineers and mission managers will execute a countdown to the mock launch scheduled for 9 PM ET on Monday. Over 700,000 gallons of cryogenic propellant will be loaded into the SLS in the hours leading up to the test, with NASA planning to livestream this crucial process. For more information, check out the Artemis Rocket 24/7 Live Stream at the launch pad.
As part of the rehearsal, mission managers will run through the final 10 minutes of the countdown several times, providing essential data on the rocket’s systems, including an automated system that takes control 30 seconds before liftoff.
Artemis II marks NASA’s second mission using the Space Launch System rocket and Orion capsule, with this being the inaugural crewed flight—a pivotal step toward NASA’s goal of returning astronauts to the lunar surface.
The Artemis II crew consists of NASA astronauts Reid Wiseman, Christina Koch, Victor Glover, and Canadian astronaut Jeremy Hansen, who have been in isolation at NASA’s Johnson Space Center in Houston to ensure they remain healthy prior to the mission.
On January 17, NASA successfully positioned the Space Launch System rocket carrying the Orion capsule at Kennedy Space Center in Florida. The agency initially planned a wet dress rehearsal for Saturday but rescheduled due to unexpected cold weather across the Southeast and mid-Atlantic.
NASA’s Artemis II at Kennedy Space Center on January 17th. Joe Raedle/Getty Images
Due to the scheduling changes, NASA has given up the first two launch opportunities (Friday and Saturday) in the current launch period, which ends on February 11th. If needed, additional launch slots may also open in March and April.
Ensuring a successful wet dress rehearsal is crucial for a smooth launch this month.
Should issues arise during testing, NASA may need to return the rocket to the vehicle assembly building, reminiscent of the six-month delay faced by Artemis I’s unmanned lunar orbit flight after a hydrogen leak was detected during its initial wet dress rehearsal.
In the corridors of power in the UK, a well-worn adage holds that scientific advisers should be “on tap, not on top”. The principle, often credited to Winston Churchill, is that in a democracy science should inform policymaking rather than dictate it.
This idea became particularly relevant during the Covid-19 pandemic, when British leaders claimed to be “following the science.” However, many critical decisions—like paying individuals to self-isolate or shutting down schools—couldn’t rely solely on scientific guidance. Numerous questions remained unanswered, placing policymakers in a challenging position.
In stark contrast, the Trump administration has been working to dismantle established guidelines from health agencies regarding various issues, from vaccination to cell phone radiation, in pursuit of the “Make America Healthy Again” initiative, all while curtailing scientific research.
“By mid-2027, we should have stronger evidence on the harms of social media.”
But what should policymakers do when scientific understanding is still developing and no immediate global crisis is present? The pressing question is how long they should wait for scientific clarity.
Currently, a significant debate is brewing in various nations regarding the potential ban on social media use for those under 16, as Australia implemented late last year. While public support for such a ban is high, the prevailing scientific evidence indicates that social media’s impact on teens’ mental health is minimal at a population level. Should political leaders disregard this evidence to cater to public opinion?
To do so would be in keeping with Churchill’s maxim. Yet, as we explore further, by mid-2027 more reliable evidence on social media’s harms should emerge, from both a randomized trial in the UK and data from Australia’s ban. The most prudent course, then, is to give scientists the time to gather concrete evidence before making sweeping policy changes. Policy should be informed by science rather than dictated by it, and keeping science on tap takes time.
Homo sapiens and Neanderthals likely interbred across a vast region, extending from Western Europe to Asia.
Modern humans (Homo sapiens) and Neanderthals (Homo neanderthalensis) interbred, and most non-Africans today carry Neanderthal DNA amounting to roughly 2% of their genome. The gene flow went both ways: earlier episodes of interbreeding appear to have replaced the Neanderthal Y chromosome with a lineage derived from Homo sapiens.
Despite increasing knowledge about the timing of this hybridization, the specific regions and scales of these interactions long remained a mystery. Ancestors of Neanderthals departed Africa around 600,000 years ago, migrating toward Europe and Western Asia. The first evidence of Homo sapiens moving from Africa includes skeletal remains from sites in modern-day Israel and Greece, dating to approximately 200,000 years ago.
Evidence suggests that Homo sapiens contributed genetically to the Neanderthal population in the Altai Mountains around 100,000 years ago. However, the primary wave of migration from Africa occurred over 60,000 years ago. Recent studies utilizing ancient genomic data indicate that significant gene flow between Homo sapiens and Neanderthals began around 50,000 years ago, with findings documented in studies of 4000 and 7000 gene transfers.
This interaction is thought to have primarily taken place in the eastern Mediterranean, although pinpointing the exact locations remains challenging.
To investigate, Mathias Currat and his team at the University of Geneva analyzed 4,147 ancient genetic samples from over 1,200 locations, with the oldest dating back approximately 44,000 years. They studied the frequency of genetic variants (introgressed alleles) originating from Neanderthal DNA that were passed down through hybridization.
“Our objective was to use Neanderthal DNA integration patterns in ancient human genomes to determine the sites of hybridization,” Currat explains.
Findings revealed that the proportion of transferred DNA increased gradually as one moved away from the eastern Mediterranean region, plateauing approximately 3,900 kilometers westward into Europe and eastward into Asia.
“We were surprised to identify a distinct pattern of increasing introgression rates in the human genome, likely linked to human expansion from Africa,” Currat notes. “This increase toward Europe and East Asia allows us to estimate the parameters of this hybrid zone.”
Computer simulations showed a hybrid zone potentially spanning much of Europe and the eastern Mediterranean, extending into western Asia.
Interbreeding Zone between Neanderthals and Homo sapiens
Lionel N. Di Santo et al. 2026
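As a purely illustrative sketch of the distance-versus-introgression relationship described above, the Python snippet below fits a simple saturating curve to made-up data points. The numbers are invented for illustration, and the team’s actual analysis relied on spatially explicit simulations of the expansion out of Africa rather than this toy model.

import numpy as np
from scipy.optimize import curve_fit

def saturating(d_km, f_max, length_scale_km):
    """Introgressed fraction rising with distance and levelling off at f_max."""
    return f_max * (1.0 - np.exp(-d_km / length_scale_km))

# Hypothetical points: (distance from the eastern Mediterranean in km,
# fraction of the genome derived from Neanderthals)
distance = np.array([200, 800, 1500, 2500, 3500, 4500, 5500], dtype=float)
fraction = np.array([0.003, 0.010, 0.015, 0.019, 0.021, 0.021, 0.022])

(f_max, length_scale), _ = curve_fit(saturating, distance, fraction, p0=[0.02, 1000])
print(f"Plateau fraction ~{f_max:.3f}; "
      f"~95% of the plateau is reached by ~{3 * length_scale:.0f} km")

Under these made-up numbers the curve levels off at around 2% a few thousand kilometres from the eastern Mediterranean, which is the general shape of the gradient the researchers report.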
“Our findings suggest a continuous series of interbreeding events across both space and time,” notes Currat. “However, the specifics of mating occurrences in this hybrid zone remain unknown.”
This hybrid zone encompasses nearly all known Neanderthal remains found across Western Eurasia, with the exception of the Altai region.
“The extensive geographical breadth of the putative hybrid zone suggests widespread interactions among populations,” states Leonard Yasi from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.
Notably, the Atlantic periphery—including western France and much of the Iberian Peninsula—was not part of the hybrid zone, despite the established presence of Neanderthals in these regions. Currat suggests that interbreeding may not have occurred here or may not be reflected in the analyzed genetic samples.
“This study reveals ongoing interactions between modern humans and Neanderthals over extensive geographical areas and extended periods,” adds Yasi. The hybrid zone may extend further, though limited ancient DNA sampling in regions like the Arabian Peninsula complicates assessment of its reach.
“This pivotal research challenges the notion that interbreeding occurred only in one area of West Asia with a singular Neanderthal population (not represented in existing genetic samples). Homo sapiens appear to have dispersed from Africa in increasing numbers across expanding territories, likely outcompeting smaller Neanderthal groups they encountered throughout most of the recognized Neanderthal range,” comments Chris Stringer from the Natural History Museum in London.
Have you skipped eating grapefruit due to its bitterness? The new CRISPR gene-edited variety may change your mind. Researchers have discovered that by disabling a specific gene, they can greatly diminish the bitter compounds found in grapefruit.
“The market for grapefruit could significantly expand,” states Nil Karmi from the Volcani Center in Rishon Lezion, Israel. “Many children avoid grapefruit due to its bitter taste.”
Karmi posits that this approach could also aid the citrus industry amid the threat of a devastating bacterial disease known as citrus greening, or huanglongbing, which poses significant risks to citrus crops; cold-resistant varieties might mitigate the problem. “The insects responsible for spreading the disease cannot survive in regions with cold winters; however, the citrus varieties that can tolerate the cold are often too bitter for consumption,” he explains.
Gene-editing technology opens doors to creating cold-tolerant edible citrus varieties, allowing for cultivation in regions with temperate climates, such as Northern Europe, instead of only subtropical areas like Florida.
Citrus fruits are known for their distinctive sourness, most evident in lemons, but their bitterness stems from a different set of compounds. Previous studies indicate that grapefruit’s bitterness is primarily linked to a compound called naringin, alongside related molecules such as neohesperidin and poncirin.
To address this, Karmi’s team utilized CRISPR gene editing on a grapefruit variety to deactivate the genes responsible for producing these three bitter compounds. While grapefruit trees take several years to bear fruit, preliminary tests on the leaves show no presence of naringin, indicating that the fruit will likely be less bitter.
The modified trees also carry “marker genes” that make it easy to identify successfully edited plants. Because these markers add foreign DNA, however, they complicate and increase the cost of obtaining permission to sell the fruit in many countries. In places like the United States and Japan, simple gene edits that leave no foreign DNA behind are not classified as genetic modifications, easing the approval process.
The team plans to replicate these gene edits in grapefruit without incorporating marker genes. “It’s a feasible plan, but it requires extensive effort,” adds Elena Plesser, also from the Volcani Center. “The process is quite challenging.”
While research teams globally are exploring similar gene-editing strategies, Karmi believes his group’s advancements are noteworthy.
The researchers are also targeting the same enzymes in cold-tolerant citrus varieties, such as pomelo, whose fruits are currently inedible due to elevated bitterness levels. The goal is to cross-breed these with popular citrus varieties like oranges to maintain cold hardiness while generating delicious, seedless fruit—a process expected to take years.
This gene editing may revolutionize the taste profile of numerous citrus fruits, claims Erin Mulvihill, who has studied naringin at the University of Ottawa, Canada.
Moreover, grapefruit can interfere with some medications, particularly statins, because it inhibits enzymes that break down these drugs, risking dangerously high drug levels for users. Naringin is a major player in these interactions but, according to Mulvihill, it is not the sole factor. “To eliminate all grapefruit-drug interactions, multiple gene deletions would be necessary,” Mulvihill states.
Artificial Intelligence (AI) is revolutionizing education by automating tasks like grading and communication with parents, allowing teachers to focus more on student guidance, engagement, and hands-on learning. As technology advances, the future may hold real-time tracking of student progress, automated assessments, and personalized learning paths.
While AI enhances classroom efficiency, the UK government stresses its use should be limited to low-stakes assessments, urging teachers to maintain transparency. This emphasizes the crucial role of human expertise in ensuring the integrity and fairness of high-stakes evaluations.
Science educators possess profound subject knowledge, which is vital for equitable assessments. Their professional judgment and contextual understanding are key to accurately reflecting each student’s potential while maintaining assessment integrity.
Leverage Your Expertise in Education
Pearson, the world’s leading educational company, employs over 18,000 professionals across 70+ countries, positively impacting millions of learners and educators. Roles like examiners, facilitators, and subject experts are crucial in ensuring students achieve the grades necessary to thrive in their careers.
By becoming an Examiner with Pearson, you can play an essential part in our mission to empower students, using your expertise to help maintain the rigorous standards that shape educational qualifications and open doors to future opportunities.
Professional Development Opportunities
Taking on the role of an Examiner offers numerous benefits that positively impact your professional trajectory:
Insight: Gain a comprehensive view of national performance, learning from common mistakes and successful strategies that can benefit your students.
Additional Income: Enjoy flexible work-from-home opportunities that fit seamlessly with your existing educational responsibilities.
Expand Your Network: Connect with fellow education professionals from diverse backgrounds, exchanging ideas and building a supportive community.
Professional Evaluation: Achieve recognized CPD credentials, enriching your professional portfolio with respected subject matter expertise.
What Qualifications Are Required?
To qualify for most Pearson Examiner roles, candidates typically need at least one year of teaching experience within the last eight years, a degree in the relevant subject, and a pertinent educational qualification or its equivalent. A recommendation from a senior professional with teaching experience at your institution is also necessary.
Some vocational qualifications may only require relevant work experience, bypassing the need for a degree or teaching certification.
Ozempic is a well-known name, primarily approved for diabetes treatment in the UK and US, yet it is commonly prescribed ‘off-label’ for weight loss. This medication has essentially become synonymous with a groundbreaking new category of weight loss drugs.
Injectable medications like Ozempic, Wegovy, Mounjaro, Zepbound, Rybelsus, and Saxenda can facilitate significant weight loss, approaching 20% of a person’s body weight in certain individuals.
Now, the next generation of weight loss solutions has arrived, and they are available in pill form.
The debut of these tablets occurred in the United States, with Novo Nordisk (the producer of Ozempic) launching Wegovy tablets on January 5, 2026. Their quick rise in popularity resulted in over 18,000 new prescriptions issued in the first week alone.
But Wegovy won’t stand alone for long. Eli Lilly’s competing drug, orforglipron, is projected to gain FDA approval this spring, and several alternatives are in development.
(Currently, these tablets are not available in the UK; however, UK policies are anticipated to follow the FDA’s example.)
The mechanism of these tablets mirrors that of injectables. The active compounds, known as “incretins” (like Wegovy’s semaglutide and Mounjaro’s tirzepatide), deceive the body into feeling full by imitating natural satiety hormones.
As digestion slows and appetite fades, you naturally consume less, leading to weight loss.
Now available in pill form, this medication promises similar life-altering effects and protection against obesity-related illnesses, all while being more affordable than ever.
Is it too good to be true? Experts caution that while the pill presents notable risks, it also brings substantial benefits.
Read more:
Can Weight Loss Drugs Transform the Landscape of Treatment?
These tablets could signify a new chapter in the management of obesity, providing broader access to life-altering healthcare.
“Not everyone prefers injectable medications,” states Dr. Simon Cork, a senior lecturer in appetite and weight regulation at Anglia Ruskin University in the UK. “Injections can be uncomfortable for many patients, making oral administration a more appealing option.”
Besides comfort, switching from injections to pills could massively reduce monthly costs. Those using weight loss drugs today often spend hundreds of dollars each month on injections.
Weight loss pills can be stored at room temperature in standard pill blister packs, making them more accessible – Credit: Getty Images
Thanks to the absence of needles and refrigeration needs, these pills can be produced and distributed at lower costs, providing weight loss solutions to millions who previously faced exorbitant prices.
“Overall, these pills are expected to be significantly more affordable than current injection therapies,” says Cork.
This trend is already visible in the US, where Wegovy pens are priced at $349 (approximately £250) per month, whereas Wegovy tablets retail for $149 (around £110).
In the UK, nearly 95% of incretin users incur high private fees. According to Professor Giles Yeo from the University of Cambridge, the NHS often cannot prescribe these expensive medications to all patients who need them.
“Patients may need to maintain these drugs for extended periods, which exacerbates the financial barrier, particularly for those from disadvantaged backgrounds most susceptible to obesity,” Cork noted. “I hope that these oral medications will democratize access.”
Addressing Long-Term Challenges
However, these drugs may not be the most effective options, even as their availability increases.
Incretins tend to offer lower efficacy in pill form. Injectable Wegovy has demonstrated a capacity to help users lose 15% of body weight after 68 weeks, while Wegovy tablets showed only 13.6% weight loss across 64 weeks.
Nor may the pills match the newest injectable drugs. Retatrutide, an injectable still in development, has produced weight loss of 24% of body weight in just 48 weeks.
Administering these drugs through pills poses inherent challenges. Oral medications must traverse the stomach and liver before entering circulation, resulting in the manufacturer needing to increase the amount of active ingredient to achieve desired outcomes.
Consequently, weight loss results from pills may not be as rapid as from injections. Nevertheless, a significant complaint regarding injections—that discontinuing them often leads to weight regain—may see improvement.
A 2022 study revealed that participants who halted Wegovy injections regained up to two-thirds of their lost weight within one year.
The emergence of the pill could help here. In a recent study, Eli Lilly’s ATTAIN-MAINTAIN trial, orforglipron tablets helped participants keep their weight stable after stopping injectable therapy.
“Many might rely on these medications to maintain weight loss,” Yeo suggests.
Cork adds, “Injectables can be utilized for optimal weight loss, and pills can help maintain this weight affordably.”
Most incretins mimic the natural satiety hormone GLP-1, but new treatments are targeting multiple hormones for enhanced effectiveness – Credit: Getty Images
The Risks and Concerns of the Pill Revolution
While these drugs possess the potential to catalyze significant positive change, their widespread availability also raises risks for vulnerable populations.
“The major danger is these drugs entering the wrong hands,” warns Yeo. “Since there’s no weight limit to how these drugs might impact individuals, a 300-pound person aiming to lose 50 pounds could utilize it as well as a 16-year-old girl weighing 75 pounds.”
“Pills can easily be trafficked, making them accessible to anyone. It’s essential to establish strict regulations around their distribution,” he urges.
Cork shares concerns over side effects. Incretins can provoke various symptoms, including nausea, vomiting, constipation, and diarrhea. Clinical trials found that three-quarters of participants experienced digestive issues.
Moreover, there are rare but serious risks such as pancreatitis, gallstones, and gastroparesis. Additionally, interactions with other medications, including contraceptives, could affect their efficacy.
“The risk of pancreatitis is low, around 1%,” Cork notes. “But with millions potentially using these drugs, this risk becomes concerning without appropriate oversight.”
Though these warnings are sobering, they remain speculative. The actual impact of these drugs is still uncertain.
“2026 is poised to be a crucial year in understanding the efficacy, prevalence, and applications of these medications,” Yeo concludes. “Time will tell how things unfold.”
Does skill affect the outcome in Snakes and Ladders?
Sipa USA/Alamy
Have you ever played Snakes and Ladders (also known as Chutes and Ladders)? If so, are you a serious competitor?
The game traces its roots back to ancient Indian games like Pachisi, where players roll dice to progress on a square board. While Pachisi incorporates elements of luck and skill, the earliest variations of Snakes and Ladders relied solely on chance to impart a spiritual lesson about accepting one’s fate. Players advanced across a board inspired by Hindu, Jain, and Sufi teachings, cultivating virtues represented by ladders while avoiding vices symbolized by snakes.
This game made its way to the UK through families returning from British colonies. Starting in 1892, a British adaptation appeared, focusing more on simplistic morality and minimizing the spiritual aspects. Over time, moral teachings faded, leaving just the snakes and ladders.
I believe that playing a game entails making decisions that influence the outcome. In games devoid of choice, like Snakes and Ladders, the player isn’t truly engaged. If you step out of the room and someone else takes your turn, does the result change?
The randomness of the game can be analyzed using probability theory. A Markov chain describes a sequence in which the probability of the next position depends only on the current one. For Snakes and Ladders, it’s possible to calculate the likelihood of landing on each space after a roll of the die (factoring in ladders and snakes). By combining all possible moves, you can determine a player’s expected position after a given number of rolls, the expected length of the game, and other useful statistics (see the sketch below). Markov chains find applications across applied mathematics, including thermodynamics and population modeling.
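Here is a minimal Python sketch of that calculation. It builds the transition matrix for a 100-square board, using a made-up set of snakes and ladders rather than any standard edition, and solves for the expected number of rolls from the start, treating square 100 as an absorbing state and assuming an overshooting roll simply leaves the player in place.

import numpy as np

N = 100  # squares; square 100 is the absorbing "finished" state
jumps = {3: 22, 8: 30, 28: 84, 58: 77,   # ladders (bottom -> top)
         17: 4, 52: 29, 62: 19, 95: 51}  # snakes (head -> tail)

def destination(square: int) -> int:
    """Where a player actually ends up after landing on a square."""
    return jumps.get(square, square)

# Transition matrix over states 0..N (0 = off the board at the start)
P = np.zeros((N + 1, N + 1))
for s in range(N):                       # transient states only
    for roll in range(1, 7):
        target = s + roll
        if target > N:
            nxt = s                      # overshoot: stay put (one common rule)
        else:
            nxt = destination(target)    # slide down a snake or climb a ladder
        P[s, nxt] += 1 / 6
P[N, N] = 1.0                            # finished stays finished

# Expected rolls to absorption: solve (I - Q) t = 1 over the transient block Q
Q = P[:N, :N]
t = np.linalg.solve(np.eye(N) - Q, np.ones(N))
print(f"Expected number of rolls from the start: {t[0]:.1f}")

The same matrix can be raised to a power to get the distribution of positions after any number of rolls, which is how the expected game length and similar statistics fall out of the Markov-chain picture.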
Some games, like chess, are purely skill-based, while many others blend elements of chance and strategy. This balance significantly impacts player engagement and immersion, explaining why some favor games like Catan, which require strategic resource allocation amidst randomness, over others like Monopoly that demand fewer decisions.
For older kids who might find Snakes and Ladders monotonous, consider adding a twist: after rolling, let players decide whether to navigate up or down the board. This small adjustment enhances player interaction and engagement.
The next time you try a new board game, check whether you’re making choices that affect the result. If not, you’re really just watching a Markov chain play itself out, and you might prefer a game that gives you real decisions to make.
Peter Rowlett – A mathematics lecturer, podcaster, and author at Sheffield Hallam University, UK. Follow me on Twitter @peterrowlett
Beyond Quantum, Antony Valentini, Oxford University Press
Physics is experiencing unexpected challenges. Despite extensive research, the elusive dark matter remains undetected, while the Higgs boson’s discovery hasn’t clarified our path forward. Moreover, string theory, often hailed as the ultimate theory of everything, lacks solid, testable predictions. This leaves us pondering: what’s next?
Recently, many physicists and science writers have shied away from addressing this question. While they used to eagerly anticipate groundbreaking discoveries, they now often revert to philosophical musings or reiterate known facts. However, Antony Valentini from Imperial College London stands out. In his book, Beyond Quantum: Exploring the Origins and Hidden Meanings of Quantum Mechanics, he introduces bold, innovative ideas.
The book’s focus is quantum mechanics, a pillar of physics for the last century. This field hinges on the concept of the wave function—a mathematical representation capable of detailing the complete state of any system, from fundamental particles to larger entities like us.
The enigma of wave functions is that they tend to describe not ordinary localized objects but a diffuse, fuzzy version of them. Upon observation, the wave function “collapses” into a random outcome, with probabilities given by the Born rule, named after the physicist Max Born. Only then do objects manifest with definite attributes in specific locations.
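For reference, the Born rule in its standard textbook form for a single particle in one dimension (a generic statement, not wording taken from Valentini’s book) says the probability density of finding the particle at position x is the squared modulus of its normalised wave function:

P(x) = |\psi(x)|^{2}, \qquad \int_{-\infty}^{\infty} |\psi(x)|^{2}\,\mathrm{d}x = 1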
The debate surrounding the interpretation of the wave function has persisted, with two primary perspectives emerging. One posits that wave functions represent reality itself, suggesting that electrons, cats, and humans exist in multiple states simultaneously across time and space—a many-worlds interpretation fraught with metaphysical implications.
“Pilot wave theory has long been known to reproduce all the predictions of quantum mechanics.”
The alternative interpretation suggests that wave functions are not the entirety of reality. This is where pilot wave theory, significantly advanced by Valentini and initially proposed by Louis de Broglie in 1927, comes into play.
Louis de Broglie: Pioneer of Pilot Wave Theory
Granger – Historical Photo Archive/Alamy
Pilot wave theory posits that the wave function is real but not the whole story: the wave guides individual particles along, somewhat as a water wave carries a floating plastic bottle. In this model, particles always have definite positions, and their wave-like behavior originates from the pilot wave itself.
The theory reproduces all the predictions of quantum mechanics while doing away with fundamental randomness. However, Valentini underscores that this agreement rests on the assumption that the particles are distributed in equilibrium with the wave, an assumption consistent with current experimental data but not guaranteed to hold universally.
Valentini’s hypothesis suggests that in the universe’s infancy, particles existed far from quantum equilibrium before settling into their current states, akin to a cup of coffee cooling down. In this scenario, the Born rule and its inherent randomness morph from core natural features into historical anomalies shaped by cosmology.
Quantum randomness is also what blocks any practical use of nonlocality, the direct influence between widely separated objects. Valentini argues that if the Born rule did not yet hold in the universe’s early stages, instantaneous signalling across vast distances may have been possible, potentially leaving traces in the cosmic microwave background. And if any relics from that era survive, superluminal signalling might still be feasible today.
Though Valentini’s insights might appear speculative without concrete evidence, his rigorous examination of how conventional quantum mechanics became dominant makes his work noteworthy. While there could be gaps, especially in clearly explaining the pilot wave aspect, Valentini’s contributions illuminate what a ‘big idea’ looks like in a field rife with uncertainty.