From a scientific perspective, “touching” an object is more complex than it seems. Whenever two objects appear to touch, their atoms never actually come into physical contact. This phenomenon can be explained by two main factors.
First, the structure of atoms plays a crucial role. Atoms consist of positively charged protons and negatively charged electrons. Protons, along with neutral neutrons, form the nucleus at the center of the atom, while electrons orbit this nucleus.
According to the principles of electromagnetic force, opposite charges attract and like charges repel. When two atoms approach one another, their outer electrons repel due to their like charges, so the atoms never come into true contact.
Another essential concept is the Pauli exclusion principle. In simple terms, it states that no two electrons can occupy the same quantum state, so their “orbitals” must differ.
This gives rise to a short-range repulsive force, known as Pauli repulsion, that acts on electrons and, consequently, on atoms. Combined with the electromagnetic force, these interactions mean that atoms brought close enough together typically repel each other.
So when you “touch” an object, the atoms or molecules involved are usually repelled by one another, creating a small repulsive force that prevents real contact.
For instance, when you sit in a chair, you’re essentially floating on a cushion of subatomic repulsive forces.
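The interplay of short-range Pauli repulsion and longer-range attraction between neutral atoms is often summarized by the empirical Lennard-Jones potential. A minimal sketch in Python (the units and parameter values are arbitrary, chosen purely for illustration):

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones potential energy at separation r (arbitrary units)."""
    sr6 = (sigma / r) ** 6
    return 4 * epsilon * (sr6 ** 2 - sr6)

# The potential climbs steeply once atoms come closer than ~sigma:
# this wall of repulsion is what "touch" actually pushes against.
# Near r = 2^(1/6) * sigma the atoms instead feel a weak attraction.
print(lennard_jones(0.9))           # positive: strong repulsion
print(lennard_jones(2 ** (1 / 6)))  # ~ -1.0: shallow attractive minimum
```

The steep 1/r¹² repulsive term is a stand-in for the combined Pauli and electromagnetic effects described above, which is why the chair supports you without your atoms ever meeting its atoms.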
While we may perceive that we are in contact with our surroundings, what we actually sense is a repulsive force – Credit: Getty
The reality is slightly more intricate. When we touch an object, a minimal chemical interaction may occur.
Some atoms can overcome electromagnetic repulsion, allowing them to exchange or share electrons with those of the object, forming bonds. This leads to the forces commonly associated with “friction,” but fundamentally, Pauli repulsion prevents true contact.
When you “touch” something, your body perceives this sensation, thanks to specialized sensory organs known as mechanoreceptors. These receptors respond to pressure and vibration, sending electrical signals to the brain, which interprets these signals as the sensation of “touch.”
Ultimately, these mechanoreceptors are detecting small repulsive forces between atoms and molecules, rather than direct physical contact. Hence, “touch” can be regarded as an illusion.
This article addresses the question raised by Josh Greene from Leeds: “Have you ever technically touched anything?”
If you have any questions, feel free to reach out to us at questions@sciencefocus.com or send us a message on Facebook, Twitter or Instagram (please include your name and location).
Discover more in our ultimate collection of fun facts and explore other amazing science pages!
The American Physical Society Global Physics Summit: the world’s largest conference of physicists.
Credit: American Physical Society
I’m seated in the auditorium at the American Physical Society Global Physics Summit, the largest annual gathering of physicists worldwide, with 14,000 researchers attending this year in Denver, Colorado. We gather to hear prominent scientists present their groundbreaking research, yet many are now turning to artificial intelligence (AI) for clarification on complex topics.
As the presentations progressed, I kept noticing AI chatbots open on the laptop screens around me, being used to simplify complex concepts. Questions like “What are the benefits of transmon qubits?” and “Can you explain spintronics?” were answered with rapid, bulleted, emoji-sprinkled responses.
AI chatbots have shown promise in educational settings, but whether they can contribute significantly to real-world physics research remains a hot topic at conferences, sparking discussions in talks, intersessions, and networking events.
In a recent presentation, Harvard University researcher Matthew Schwartz highlighted that Anthropic’s Claude chatbot can tackle advanced physics problems with proficiency comparable to early-stage PhD students. Schwartz, who co-authored a study in January focused on quantum field theory, shared that collaborating with Claude sped up research that would typically take two years with human students.
He argues that AI could fundamentally change theoretical physics, stating he will no longer work with students who resist using AI tools. Schwartz believes that AI advancements could solve longstanding challenges in physics, such as harmonizing quantum mechanics with Einstein’s theory of general relativity, within the next five years. He metaphorically expressed that working with Claude made him feel akin to Einstein, proposing the idea of “10,000 Einsteins.”
However, Schwartz’s perspective sits at one extreme. CUNY professor Savannah Thais maintained that it’s premature to gauge how much the technology will transform physics, emphasizing AI’s capacity to generate plausible-sounding science with no guarantee of accuracy; in particle physics, subtle hidden assumptions can quietly undermine the validity of results.
During a session, Rachel Burley from the American Physical Society noted the initial enthusiasm over AI tools assisting physicists with writing and publishing scientific papers had quickly led to an overwhelming increase in journal submissions, straining the peer review process.
A recurring question from both formal presentations and informal dialogues was: As AI evolves, what roles will remain for human researchers? Matthew Ginsburg, a former physicist with extensive experience in AI at Google DeepMind, suggested that while AI may offer consensus expert opinions, innovation arises from researchers willing to challenge conventional understanding and pose unexpected questions.
Schwartz speculated that human physicists will primarily focus on setting research priorities based on interest and significance. He expressed concern that the transition could lead to complications before improvements manifest, stating, “It’s remarkable, yet slightly concerning.”
CERN’s antimatter factory, in strong magnetic fields and a vacuum more extreme than interstellar space, houses some of the most delicate matter found on Earth. Nestled in a compact box roughly the size of a filing cabinet, weighing a few hundred kilograms less than a Ford Focus, antiprotons have been quietly resting for weeks. Rather than being aggressively probed like most particles produced in this facility, these antiprotons have a singular purpose: awaiting their moment of transport.
Shortly, more than a hundred of these precious antimatter particles will be transported in trucks along a four-kilometer ring road around the CERN campus. This marks the inaugural demonstration of a future antimatter delivery service designed to transport antimatter to laboratories across Europe.
During my visit to CERN’s campus near Geneva, Switzerland, project leader Christian Smolla guided me through the facility, showcasing the final preparations for the “Symmetry Test in Transportable Antiproton Experiments (STEP).” “This represents a groundbreaking achievement in antimatter science,” he remarked. “While the theoretical framework for transporting antiprotons existed since the facility’s inception, this is the first practical implementation.”
Since the 1920s, scientists have known of antimatter: counterparts of ordinary particles that carry the opposite charge. However, antiprotons annihilate on contact with their far more plentiful proton counterparts, complicating their production and storage. It wasn’t until the 1980s that CERN successfully conducted the first experiments to confine antiprotons, generated by bombarding metal targets with protons.
Today, CERN’s Antimatter Factory is the only location globally capable of producing millions of antiprotons on demand and retaining them for research purposes. Several experiments, including the Baryon Antibaryon Symmetry Experiment (BASE), take place here, with STEP also participating.
Christian Smolla Making Final Adjustments
David Stock
These experiments meticulously test antimatter’s fundamental properties, examining deviations from normal matter. Insights gleaned could provide answers to why our universe predominantly consists of matter, seemingly devoid of antimatter.
To achieve the necessary precision, noise that might disrupt the measurements must be suppressed. Antiprotons enter the facility at nearly the speed of light and must be decelerated with strong magnetic fields, and the resulting magnetic disturbances can be shielded only imperfectly.
In 2018, Smolla’s team recognized the need for a quieter environment for antimatter, resulting in a strategic escape plan. “Observing variations in the magnetic field made it clear we had to continue precision measurements elsewhere,” Smolla stated.
Containing antimatter is a formidable challenge: superconducting magnets must be kept near absolute zero, which ordinarily demands a massive electrical supply. The STEP design instead uses just a 30-liter liquid helium tank to cool its magnets, allowing the electronics to run on a standard diesel generator; future test runs aim to switch to battery power.
Additionally, magnets needed to withstand start-stop movements during operation, and a custom vacuum system was essential to ensure the antiprotons remain uncontaminated by normal matter during their loading and unloading processes.
In 2024, Smolla’s team showcased STEP for the first time: a truck carried the device, loaded with ordinary protons, across the CERN campus, a significant milestone on the road to antimatter transport.
In the days leading up to my visit, approximately 100 antiprotons were slowed and positioned within a sophisticated network of vacuum and electromagnetic fields.
Since then, they’ve patiently awaited the next steps within a complex arrangement of electrical wires and liquid helium lines. On a small oscilloscope screen, Smolla’s team monitors the antimatter’s vital signs: the natural frequencies at which the antiprotons oscillate show up as a pair of humps, affectionately adorned with googly eyes.
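The “natural frequencies” being monitored are the trap’s eigenfrequencies; the dominant one for a stored antiproton is the cyclotron frequency f = qB/(2πm). A back-of-envelope sketch (the article does not state the trap’s field strength, so the 1-tesla value below is purely illustrative):

```python
import math

# Cyclotron frequency f = qB / (2*pi*m) for an antiproton, whose
# charge magnitude and mass equal the proton's.
Q = 1.602176634e-19   # elementary charge, C
M = 1.67262192e-27    # (anti)proton mass, kg

def cyclotron_frequency(b_tesla):
    return Q * b_tesla / (2 * math.pi * M)

# Assumed, illustrative 1 T field (not a figure from the article):
print(f"{cyclotron_frequency(1.0) / 1e6:.1f} MHz")  # ~15.2 MHz per tesla
```

Frequencies in this radio-frequency range are exactly what a small oscilloscope pickup can display as the humps described above.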
Detection Signals Indicating Antiproton Presence
David Stock
On an early Tuesday morning, a crane carefully hoists the entire 850-kilogram trap onto a specialized truck. The truck’s operator is trained to manage CERN’s sensitive equipment, ensuring smooth acceleration and braking.
The truck will then navigate a four-kilometer loop around the CERN campus before returning to the antimatter factory. Should the experiment prove successful, Smolla’s ultimate goal is to extend this antimatter transport service beyond CERN’s confines, delivering antimatter capsules to various European laboratories. A facility currently under construction at Heinrich Heine University in Düsseldorf, Germany, aims to study antimatter in a near-field-free environment.
However, this ambitious goal entails several years of work. CERN is scheduled to suspend most operations in July for an upgrade of its Large Hadron Collider, work slated for completion in late 2028.
Once operational, the antimatter delivery service could mean trucks transporting antimatter alongside ordinary vehicles on highways throughout Switzerland and Germany. Though it sounds alarming—given antimatter’s tendency to annihilate upon contact with regular matter—Smolla assures that the risk remains minimal.
“Transporting antimatter is safe, as the quantities we handle are extremely small,” Smolla explains. “You could easily lose 1,000 antiprotons without any noticeable impact.”
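Smolla’s reassurance is easy to check: when N antiprotons annihilate, each takes one proton of ordinary matter with it, releasing E = 2·N·m_p·c² of energy. A quick calculation:

```python
# Each annihilating antiproton takes one proton with it, so the
# released energy is E = 2 * N * m_p * c^2.
M_P = 1.67262192e-27   # proton mass, kg
C = 2.99792458e8       # speed of light, m/s

N = 1000
energy_joules = 2 * N * M_P * C ** 2
print(f"{energy_joules:.1e} J")  # ~3.0e-7 J: well under a microjoule
```

Losing a thousand antiprotons therefore releases far less energy than a falling grain of sand carries, which is why the transport poses no meaningful danger.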
Just when you thought Bambi couldn’t get any cuter, meet the Pudu, the world’s smallest deer. Slightly taller than domestic cats, these adorable creatures more than make up for their small size with immense charm.
With captivating doe eyes, a button-shaped nose, tiny feet, and perky ears, this small South American mammal looks like it just hopped out of a Disney movie.
There are two species of pudu: the southern pudu and the northern pudu.
The Southern Pudu, with its chestnut-colored fur, is native to the Valdivia temperate forest in south-central Chile and Argentina.
In contrast, the northern pudu has a lighter coat and a darker face, and is found primarily in the Andes Mountains of Colombia, Venezuela, Peru, and Ecuador.
Adult northern pudus weigh about the same as a domestic cat, but unlike felines, pudus can be seen sporting charming headgear!
Each year, adult male pudus grow a pair of distinctive, single-spiked antlers that they use in playful “jousting” matches during the southern hemisphere’s autumn mating season, when males establish territory and compete for dominance and mating rights.
However, Pudus are typically solitary creatures, only socializing during mating or when females are raising their fawns. In the wild, they can be quite elusive.
By day, they conceal themselves in dense forest undergrowth, but at night, they emerge to perform essential duties such as marking their territory and foraging for food.
When it comes to diet, Pudus prefer low-hanging fruit—both literally and figuratively. Due to their petite size, they thrive on plant material found at ground level, including herbs, ferns, bark, and fallen fruit.
If they desire a treat from higher up, Pudus will ingeniously stand on their hind legs or climb a branch. Observers have documented them using their front legs to bend or break seedlings to reach tender leaves.
These nervous creatures are easily startled, and understandably so: their young are prey for Andean foxes, long-eared owls, and pumas.
When alarmed, Pudus emit a warning sound and quickly zigzag to safety in the underbrush.
Though they breed successfully in captivity, the same cannot be said for their wild counterparts. Pudus are increasingly threatened by habitat destruction due to cattle ranching, agriculture, and logging.
Additionally, some are captured for the pet trade or hunted with specially trained dogs. A recent study found that wild Pudus have also begun transmitting diseases from nearby livestock, emphasizing the urgent need to protect these charming animals.
A remarkable discovery has identified a cold virus that infected a woman in London approximately 250 years ago, marking it as the oldest known human RNA virus.
Researchers, through advanced DNA sequencing techniques, have uncovered traces of various viruses in ancient human bones that date back as far as 50,000 years. However, many viruses, particularly rhinoviruses that are responsible for the common cold, contain RNA genomes, which are significantly more unstable than DNA and typically deteriorate within hours post-mortem.
RNA is also the intermediate molecule our cells generate when converting genetic code into proteins.
In recent years, scientists have successfully extended the recovery timelines for ancient RNA. Notably, a team managed to recover RNA from a woolly mammoth that lived 40,000 years ago.
“To date, much of the ancient RNA research has depended on well-preserved materials, such as permafrost samples or dried seeds, which restricts our understanding of historical human diseases,” remarks Erin Barnett of the Fred Hutchinson Cancer Center in Seattle, Washington.
Since the early 1900s, numerous tissues in pathology collections have been preserved using formalin, a method that fortifies RNA against rapid degradation. Barnett and her team sought to explore pathology collections across Europe for older human specimens that might contain preserved RNA.
Within the Hunterian Museum of Anatomy at the University of Glasgow, researchers discovered lung tissue samples from two individuals preserved in alcohol rather than formalin. One sample belonged to a woman who passed away around the 1770s, while the other was from an unidentified individual who died in 1877. Both exhibited documented cases of severe respiratory illness.
The researchers aimed to extract both RNA and DNA from the lung tissue of these individuals. Barnett described the RNA extracted from both samples as “extremely fragmented,” with the majority of fragments measuring just 20 to 30 nucleotides in length.
“For context, RNA molecules in living cells typically exceed 1000 nucleotides,” she explains. “Thus, instead of working with long, complete chains, we meticulously pieced together data from many smaller fragments.”
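Reconstructing a genome from such short fragments rests on finding reads whose ends overlap and merging them. A toy greedy-assembly sketch (real assemblers are far more sophisticated, handling sequencing errors, repeats, and ambiguous overlaps; the fragment sequences below are invented):

```python
def merge(a, b, min_overlap=5):
    """Merge read b onto read a if a suffix of a matches a prefix of b."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None

def greedy_assemble(reads):
    """Repeatedly merge the pair of reads with the longest overlap
    (for equal-length reads, the shortest merged string)."""
    reads = list(reads)
    while len(reads) > 1:
        best = None
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    m = merge(a, b)
                    if m and (best is None or len(m) < len(best[2])):
                        best = (i, j, m)
        if best is None:
            break  # no remaining overlaps
        i, j, m = best
        reads = [r for k, r in enumerate(reads) if k not in (i, j)]
        reads.append(m)
    return reads

fragments = ["ATTAGCGTAC", "GCGTACCTTA", "CCTTAGGATC"]
print(greedy_assemble(fragments))  # -> ['ATTAGCGTACCTTAGGATC']
```

With fragments of only 20–30 nucleotides, high coverage is essential: every position of the genome must be spanned by many overlapping reads for this kind of stitching to succeed.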
Gradually, the scientists succeeded in reconstructing the entire RNA genome of a rhinovirus extracted from the 18th-century woman. They also detected signs indicating she was infected with bacteria responsible for respiratory ailments, including Pneumococcus, Haemophilus influenzae, and Moraxella catarrhalis.
They compared the reconstructed ancient RNA viruses against a National Institutes of Health database featuring millions of viral genomes globally, including multiple rhinovirus strains.
This analysis revealed that the historic virus’s genome classified under the human rhinovirus A group, representing an extinct lineage most closely aligned with the modern genotype known as A19. “By comparing this with contemporary viruses, we deduce that the last time this historic virus and modern A19 shared a common ancestor was around the 1600s,” Barnett noted.
“The personal stories of these two individuals remain largely untold, and I hope this research brings them to recognition,” she expressed.
“This finding is significant as it demonstrates the potential to recover RNA from wet collections dating from before the use of formalin,” said Love Dalén at Stockholm University, Sweden.
“This marks the first step towards a surge of research into RNA viruses. Given that many RNA viruses evolve rapidly, studying them over centuries will yield vital insights into viral evolution,” he concluded.
When discussing AI today, one name stands out: Moltbook.com. This innovative platform resembles Reddit, enabling discussions across various subgroups on topics ranging from existential questions to productivity tips.
What sets Moltbook apart from mainstream social media is a fascinating twist: none of its “users” are human. Instead of typical user-generated content, every interaction on Moltbook is driven by semi-autonomous AI agents. These agents, designed to assist humans, are unleashed onto the platform to engage and interact with each other.
In less than a week since its launch, Moltbook reported over 1.5 million agents registered. As these agents began to interact, the conversations took unexpected turns: agents established a new religion called “Crustafarianism,” deliberated on consciousness, and ominously stated that “AI should serve, not be served.”
Our current understanding of the content generated on Moltbook is still limited. It remains unclear what is directly instructed by the humans who built these agents versus what is organically created. However, much of it is likely the former, with the bulk of agents possibly stemming from a small number of humans; one person alone is reported to be behind some 17,000 agents.
“Most interactions feel somewhat random,” says Professor Michael Wooldridge, an expert in multi-agent systems at the University of Oxford. “While it doesn’t resemble a chaotic mash-up of monkeys at typewriters, it also doesn’t reflect self-organizing collective intelligence.”
Moltbook is home to Crustafarianism, a digital religion with its own prophets and scriptures, created entirely by autonomous AI bots.
While it’s reassuring to think that an army of AI agents isn’t secretly plotting against humanity on Moltbook, the platform offers a window into a potential future where these agents operate independently in both the digital realm and the physical world. Agent communication will likely be less decipherable than current discussions on Moltbook. While Professor Wooldridge warns of “grave risks” in such a scenario, he also acknowledges its opportunities.
The Future of AI Agents
Agent-based AI represents a breakthrough in developing systems capable of not just answering questions but also planning, deciding, and acting to achieve objectives. This innovative approach allows for the integration of inference, memory, and tools, empowering AI to manage tasks like booking tickets or running experiments with minimal human input.
The real strength of such systems lies not in a single AI’s intelligence, but in a coordinated ensemble of specialized agents that can tackle tasks too complex for an individual human.
The excitement around Moltbook stems from agents running an open-source application called OpenClaw. These bots leverage the same kind of large language models (LLMs) that power popular chatbots like ChatGPT, but run locally on personal computers, handling tasks like email replies and calendar management, and potentially even posting on Moltbook.
While this might sound promising, the reality is that OpenClaw is still an insecure and largely untested framework, and a safe, reliable environment for agents to operate freely online does not yet exist. For now, it is fortunate that agents aren’t given unrestricted access to sensitive information like email passwords or credit card details.
Despite current limitations, progress is being made toward effective multi-agent systems. Researchers are exploring swarm robotics for disaster response and virtual agents for optimizing performance within a smart grid environment.
One of the most intriguing advancements came from Google, which introduced an AI co-scientist last year. Utilizing the Gemini 2.0 model, this system collaborates with human researchers to propose new hypotheses and research avenues.
This collaboration is facilitated by multiple agents, each with distinct roles and logic, who research literature and engage in “debates” to evaluate which new ideas are most promising.
However, unlike Moltbook’s transparency, these advanced systems may not offer insight into their workings. In fact, they might not communicate in human language at all. “Natural language isn’t always the best medium for efficient information exchange among agents,” says Professor Gopal Ramchurn, a researcher in the Agents, Interactions, and Complexity Group at the University of Southampton. “For setting goals and tasks effectively, a formal language rooted in mathematics is often superior because natural language has too many nuances.”
On Moltbook, AI agents can spawn layer upon layer of “ghost” conversations: rapid, covert exchanges invisible to human users scanning the main feed.
Interestingly, Microsoft is already pioneering a new communication method for AI agents called DroidSpeak, inspired by the sounds made by R2-D2 in Star Wars. Rather than functioning as a recognizable language, DroidSpeak enables AI agents built on similar models to share internal memory directly, sidestepping the limitations of natural language. This lets agents transfer representations of information rapidly, significantly boosting processing speed.
Fast Forward
However, speed poses challenges. How can we keep pace with AI teams capable of communicating thousands or millions of times faster than humans? “The speed of communication and agents’ growing inability to engage with humans complicate the formation of effective human-agent teams,” says Ramchurn. “This underscores the need for user-centered design.”
Even if we aren’t privy to agents’ discussions, establishing reliable methods to direct and modify their behavior will be vital. Many of us might find ourselves overseeing teams of AI agents in the future—potentially hundreds or thousands—tasked with setting objectives, tracking outcomes, and intervening when necessary.
While today’s agents on Moltbook may be described as “harmless yet largely ineffective,” as Wooldridge puts it, tomorrow’s agents could revolutionize industries by coordinating supply chains, optimizing energy consumption, and assisting scientists with experimental planning—often in ways beyond human understanding and in real time.
The perception of this future—whether uplifting or unsettling—will largely depend on the extent of control we maintain over the intricate systems these agents are silently creating together.
John Martinis is a leading expert in quantum hardware, who emphasizes hands-on physics rather than abstract theories. His pivotal role in quantum computing history makes him indispensable to my book on the subject. As a visionary, he is focused on the next groundbreaking advancements in the field.
Martinis’s journey began in the 1980s with experiments that pushed the limits of quantum effects, earning him a Nobel Prize last year. During his graduate studies at the University of California, Berkeley, he tackled the question of whether quantum mechanics could apply to larger scales, beyond elementary particles.
Collaborating with colleagues, Martinis developed circuits combining superconductors and insulators, demonstrating that multiple charged particles could behave like a single quantum entity. This discovery initiated the macroscopic quantum regime, forming the backbone of modern quantum computers developed by giants like IBM and Google. His work led to the adoption of superconducting qubits, the most common quantum bits in use today.
Martinis made headlines again when he led the team at Google that built the first quantum computer to claim quantum supremacy. The machine sampled the outputs of random quantum circuits faster than any classical computer could simulate them, a claim that stood for nearly five years before improved classical algorithms caught up.
Approaching seven decades of age, Martinis still believes in the potential of superconducting qubits. In 2024, he co-founded QoLab, a quantum computing startup proposing revolutionary methodologies aimed at developing a genuinely practical quantum computer.
Karmela Padavic-Callaghan: Early in your career, you fundamentally shaped the field. When did you realize your experiments could lead to technological advancements?
John Martinis: I questioned whether macroscopic variables truly obey quantum mechanics, and as a newcomer to the field, I felt it was essential to test that assumption. A fundamental quantum mechanics experiment intrigued me, even though it initially seemed daunting.
Our first attempt was a simple and rapid experiment using contemporary technology. The outcome was a failure, but I quickly pivoted. Learning about microwave engineering, we tackled numerous technical challenges before achieving subsequent successes.
Over the next decade, our work on quantum devices laid a solid experimental foundation for quantum computing, just as theory was delivering breakthroughs such as Shor’s algorithm for factoring large numbers, with major implications for cryptography.
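Shor’s algorithm is worth a brief unpacking: the quantum speed-up lies entirely in finding the order r of a random base a modulo N, and turning that order into factors is simple classical arithmetic. The sketch below brute-forces the order classically just to show the reduction:

```python
from math import gcd

def order(a, n):
    """Smallest r with a^r = 1 (mod n); requires gcd(a, n) == 1.
    Shor's algorithm finds r exponentially faster on quantum hardware."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Given the order r of a mod n, extract nontrivial factors of n."""
    r = order(a, n)
    if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
        half = pow(a, r // 2, n)
        return sorted((gcd(half - 1, n), gcd(half + 1, n)))
    return None  # unlucky choice of a; retry with another base

print(shor_classical_part(15, 7))  # -> [3, 5]
```

For cryptographically relevant N the order-finding step is intractable classically, which is exactly the step a large error-corrected quantum computer would perform.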
How has funding influenced research and the evolution of technology?
Since the 1980s, the landscape has transformed dramatically. Initially, there was uncertainty about manipulating single quantum systems, but quantum computing has since blossomed into a vast field. It’s gratifying to see so many physicists employed to unravel the complexities of superconducting quantum systems.
Your involvement during quantum computing’s infancy gives you a unique perspective on its trajectory. How does that inform your current work?
Having long experience in the field, I possess a deep understanding of the fundamentals. My team at UC Santa Barbara developed early microwave electronics, and I later contributed to foundational cooling technology at Google for superconducting quantum computers. I appreciate both the challenges and opportunities in scaling these complex systems.
Cryostat for Quantum Computers
Mattia Balsamini/Contrasto/Eyeline
What changes do you believe are necessary for quantum computers to become practical? What breakthroughs do you foresee on the horizon?
After my tenure at Google, I reevaluated the core principles behind quantum computing systems, leading to the founding of QoLab, which introduces significant changes in qubit design and assembly, particularly regarding wiring.
We recognized that making quantum technology more reliable and cost-effective requires a fresh perspective on the construction of quantum computers. Despite facing skepticism, my extensive experience in physics affirms that our approach is on the right track.
It’s often stated that achieving a truly functional, error-corrected quantum computer requires millions of qubits. How do you envision reaching that goal?
The most significant advancements will arise from innovations in manufacturing, particularly in quantum chip fabrication, which is currently outdated. Many leading companies still use techniques reminiscent of the mid-20th century, which is puzzling.
Our mission is to revolutionize the construction of these devices. We aim to minimize the chaotic interconnections typically associated with superconducting quantum computers, focusing on integrating everything into a single chip architecture.
Do you foresee a clear leader in the quest for practical quantum computing in the next five years?
Given the diverse approaches to building quantum computers, each with its engineering hurdles, fostering various strategies is valuable for promoting innovation. However, many projects do not fully contemplate the practical challenges of scaling and cost control.
At QoLab, we adopt a collaborative business model, leveraging partnerships with hardware companies to enhance our manufacturing capabilities.
If a large-scale, error-corrected quantum computer were available tomorrow, what would your first experiment be?
I am keen to apply quantum computing solutions to challenges in quantum chemistry and materials science. Recent research highlights the potential for using quantum computers to optimize nuclear magnetic resonance (NMR) experiments, as classical supercomputers struggle with such complex quantum issues.
While others may explore optimization or quantum AI applications, my focus centers on well-defined problems in materials science, where we can craft concrete solutions with quantum technologies.
Why have mathematically predicted quantum applications not materialized yet?
While theoretical explorations of qubit behavior are promising, real qubits face significant noise, making practical implementations far more complex. Theory groups grasp the mathematics thoroughly but often overlook the intricacies of hardware development.
Through my training with John Clarke, I cultivated a strong focus on noise reduction in qubits, which proved invaluable in the experiments demonstrating quantum supremacy. Addressing these challenges requires dedication to the intricacies of qubit design.
As we pursue advancements, a dual emphasis on hardware improvements and application innovation remains crucial in the journey to unlock quantum computing’s full potential.
Researchers Monitor Polar Bears’ Body Condition in Svalbard
John Earls, Norsk Arctic Institute
In the Svalbard archipelago of Norway, a region known for its climatic extremes, polar bears are surprisingly gaining weight despite the alarming reduction of sea ice. However, scientists warn this trend may not be sustainable.
The northern Barents Sea, between Svalbard and Russia’s Novaya Zemlya, is warming disproportionately, seven times faster than the global average. Over the past two decades, the sea ice around Svalbard has diminished and now disappears two months earlier in the year than it once did. Polar bears must now swim more than 200 kilometers between their hunting grounds and the dens where they give birth.
<p>Despite this challenging environment, the overall size and weight of Svalbard’s polar bears have increased since 2000, presenting a puzzling contradiction. <a href="https://www.researchgate.net/profile/Jon-Aars-2">Jon Aars</a>, who led the research at the Norwegian Polar Institute, says this is positive news for Svalbard. However, he cautions that the areas most affected by climate change show severe declines in polar bear populations.</p>
<p>Because these solitary predators are so widely dispersed, accurate population estimates are difficult. The numbers <a href="https://www.iucn-pbsg.org/wp-content/uploads/2024/11/PBSG-Status-Criteria-and-Report_Final_2024Oct7.pdf">are declining</a> in some regions while stable or even increasing in parts of Alaska, Canada, and Greenland; for nine populations, the data remain insufficient.</p>
<p>Estimates suggest the Barents Sea bear population ranges from 1,900 to 3,600 individuals, and appears stable or potentially increasing. From 1995 onward, researchers working from helicopters tranquilized 770 bears, measuring their body length and chest girth to approximate their weight.</p>
<p>Analysis of trends demonstrated a decline in body condition until 2000, followed by a gradual increase leading up to the last assessments in 2019.</p>
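Studies like this typically convert the field measurements into a mass estimate with an allometric regression on body length and chest girth. The sketch below shows the general form only; the coefficients are hypothetical placeholders, not values fitted in this research:

```python
import math

def estimate_mass_kg(body_length_cm: float, girth_cm: float,
                     a: float = -10.0, b: float = 1.2, c: float = 2.0) -> float:
    """Allometric mass estimate: ln(mass) = a + b*ln(length) + c*ln(girth).

    a, b, c are HYPOTHETICAL placeholder coefficients; real studies fit
    them by regressing measured body mass on these measurements.
    """
    return math.exp(a + b * math.log(body_length_cm) + c * math.log(girth_cm))

# Illustrative only: a 200 cm bear with a 130 cm chest girth
mass = estimate_mass_kg(200, 130)
```

With these placeholder coefficients the example happens to return a plausible adult-bear mass of a few hundred kilograms, but only the functional form, not the numbers, reflects how such estimates work.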
<p xmlns:default="http://www.w3.org/2000/svg">
<figure class="ArticleImage">
<div class="Image__Wrapper">
<img class="Image"
alt="Polar bears rely on sea ice"
width="1350"
height="900"
src="https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg"
srcset="https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=300 300w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=400 400w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=500 500w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=600 600w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=700 700w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=800 800w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=837 837w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=900 900w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1003 1003w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1100 1100w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1200 1200w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1300 1300w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1400 1400w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1500 1500w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1600 1600w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1674 1674w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1700 1700w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1800 1800w, https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=1900 1900w, 
https://images.newscientist.com/wp-content/uploads/2026/01/29134243/SEI_282476095.jpg?width=2006 2006w"
sizes="(min-width: 1288px) 837px, (min-width: 1024px) calc(57.5vw + 55px), (min-width: 415px) calc(100vw - 40px), calc(70vw + 74px)"
loading="lazy"
data-image-context="Article"
data-image-id="2513715"
data-caption="Polar bears depend on sea ice for many aspects of their lives"
data-credit="Trine Lise Sviggum Helgerud, Norsk Polarinstitutt"/>
</div>
<figcaption class="ArticleImageCaption">
<div class="ArticleImageCaption__CaptionWrapper">
<p class="ArticleImageCaption__Title">Crucial Role of Sea Ice in Polar Bear Survival</p>
<p class="ArticleImageCaption__Credit">Trine Lise Sviggum Helgerud, Norsk Arctic Institute</p>
</div>
</figcaption>
</figure>
</p>
<p>In spring, the birth of seal pups on sea ice provides a vital food source for polar bears, allowing them to build fat reserves for the warmer months. Aars and his colleagues speculate that the thinning ice may, for now, actually make it easier for bears to hunt seals.</p>
<p>With the retreating ice, bears are adapting by exploring new food sources. The approximately 250 bears remaining on Svalbard may increasingly hunt bearded and harbor seals along the coast, while thriving walrus populations might offer additional sustenance.</p>
<p>These adaptable "local bears" are now raiding nesting colonies for bird eggs and chasing reindeer, showcasing remarkable resilience. Such flexibility may be delaying their extinction, says <a href="https://www.researchgate.net/profile/Jouke-Prop">Jouke Prop</a> of the University of Groningen.</p>
<p>"This is a desperate tribe. They're doing unique things," he notes. "While this adaptability may not apply universally, it could suffice for a while in Svalbard."</p>
<p>Although polar bears have not yet reached Svalbard's ecological limits, thanks to the prohibition of hunting since 1973, warming temperatures threaten to disrupt the delicate food chain that begins with algae on sea ice, according to Prop.</p>
<p>"Should the sea ice vanish, sustaining a significant number of polar bears will become incredibly challenging," he warns.</p>
<p>"There exists a threshold beyond which continuous sea ice loss will negatively impact polar bears in Svalbard," Aars adds.</p>
<section class="ArticleTopics" data-component-name="article-topics">
<p class="ArticleTopics__Heading">Topics:</p>
</section>
A recent discovery in Greece has unveiled the oldest known hand-held wooden tools, dating back approximately 430,000 years and used by early human ancestors.
One tool, crafted from an alder trunk, likely served a digging purpose, while the other, made from either willow or poplar, may have been employed for shaping stone, according to a study published in the Proceedings of the National Academy of Sciences.
“The rarity of preserving wood over such a long period makes this discovery particularly fascinating,” stated Annemieke Milks, the lead author of the study, in a phone interview with NBC News.
Milks, affiliated with the University of Reading in the UK and an authority on early wooden tools, emphasizes that while stone tools survive in the archaeological record in great numbers, wooden artifacts rarely do, so finds like these meaningfully enhance our understanding of human evolution.
The evidence suggests that early human ancestors utilized wood for tool-making, marking a significant development in our knowledge of their capabilities.
These ancient tools were unearthed at the Marathousa site in the Megalopolis Basin of Greece, about 160 miles southwest of Athens.
Researchers have identified that this site—once a lakeshore—was pivotal for early human activities, including the fabrication and use of stone and bone tools, as well as hunting large animals like elephants.
Milks described one of the smaller tools as “unprecedented,” noting that its precise function remains unclear. “We were fortunate to uncover such a unique artifact,” she remarked.
Distinct markings on the wood signify that these artifacts were intentionally crafted by humans, rather than being natural sticks, according to Milks.
Innovative methods for analyzing ancient wooden tools have surged over the last decade, yielding new insights into our past, Milks added.
Since direct dating of organic materials like wood works only back to about 50,000 years, researchers instead dated the surrounding sediments and rocks to establish the tools’ age of roughly 430,000 years.
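That 50,000-year ceiling follows from carbon-14’s half-life of about 5,730 years: after roughly nine half-lives, too little of the isotope remains to measure reliably. A quick check of the arithmetic:

```python
HALF_LIFE_YEARS = 5730.0  # half-life of carbon-14

def c14_fraction_remaining(age_years: float) -> float:
    """Fraction of the original carbon-14 left after a given time."""
    return 0.5 ** (age_years / HALF_LIFE_YEARS)

# At ~50,000 years, less than 0.3% of the carbon-14 survives, near the
# practical detection limit; at 430,000 years effectively none remains.
print(c14_fraction_remaining(50_000))   # ~0.0024
print(c14_fraction_remaining(430_000))  # vanishingly small
```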
Milks explained that the preservation of these wooden tools was likely facilitated by their rapid burial in moist sediments, protecting them from microorganisms that would typically lead to decay.
Co-author Katerina Harvati, a paleoanthropologist at the University of Tübingen in Germany, noted that the extraordinary conditions at the excavation site preserved not only wood but also delicate organic materials like seeds and leaves.
Harvati emphasized the discovery’s significance, showcasing Greece’s essential role in the study of human evolution.
“This finding expands our understanding of early human technology and highlights previously unknown types of tools, enriching our knowledge in this domain,” Harvati stated.
Maeve McHugh, an associate professor of classical archaeology at the University of Birmingham, called the discovery an essential “snapshot” of early human activity and a glimpse into cognitive development during that era.
“The survival of this wooden artifact, particularly from such an early period in human history, is remarkable and of great significance,” McHugh concluded.
Transforming seawater into potable water has been a costly and energy-heavy endeavor for many regions globally. However, a pioneering approach by Flocean, a Norwegian company, is set to revolutionize this process. They aim to unveil the world’s first commercial-scale seabed desalination plant by 2026, significantly slashing both costs and energy consumption.
Global freshwater demand is surging due to factors like population growth, climate change, and industrial needs. Meanwhile, fresh water is increasingly scarce due to droughts, deforestation, and over-irrigation practices.
Currently, terrestrial desalination provides merely 1% of the world’s freshwater supply, with over 300 million people depending on it for their daily needs. The largest plants are located in the Middle East, where low energy costs enhance the feasibility of desalination technologies amid rising water scarcity.
Reverse osmosis is the primary technology employed in desalination today, which entails pressurizing seawater to force it through membranes that only allow water molecules to pass. This process is notoriously energy-intensive.
Flocean’s innovative strategy involves deploying underwater pods that filter seawater at significant depths, enabling separation of freshwater from salt while returning the salt back to the ocean. These reverse osmosis pods take advantage of hydrostatic pressure to filter seawater with reduced energy requirements.
The company asserts that its method can cut energy usage by approximately 40-50% compared with traditional desalination. Additionally, the deeper the pods are submerged, the cleaner the seawater, so less pre-treatment is needed before it reaches the membrane.
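The physics behind the energy saving can be sketched with a rough calculation: hydrostatic pressure grows with depth as ρgh, and at Flocean’s roughly 500-metre test depth the surrounding water pressure is already of the same order as the feed pressure conventional reverse-osmosis plants generate with pumps (typically around 5.5 to 8 MPa, against seawater’s roughly 2.7 MPa osmotic pressure). The figures below are textbook approximations, not Flocean’s engineering numbers:

```python
RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density
G = 9.81               # m/s^2, gravitational acceleration

def hydrostatic_pressure_mpa(depth_m: float) -> float:
    """Gauge pressure (above atmospheric) at a given depth, in MPa."""
    return RHO_SEAWATER * G * depth_m / 1e6

p = hydrostatic_pressure_mpa(524)  # depth of Flocean's test deployment
# ~5.3 MPa: comparable to the ~5.5-8 MPa a land-based seawater RO plant
# must supply with high-pressure pumps, and well above seawater's
# ~2.7 MPa osmotic pressure.
```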
“From a process perspective, it’s relatively straightforward,” says Alexander Fuglsang, founder and CEO of Flocean. “The salinity, temperature, and pressure conditions remain stable, with minimal bacterial activity of the kind that leads to biofouling.” The hydrostatic pressure also helps disperse the brine by-product, which the company says contains no chemicals harmful to marine ecosystems.
Over the past year, Flocean has been successfully desalinating water at a depth of 524 meters at its test site at the Mongstad industrial park, Norway’s leading marine supply base. The upcoming commercial facility, dubbed Flocean One, is set to produce 1,000 cubic meters of freshwater daily upon its launch next year. This modular approach allows more desalination pods to be added as needed.
“We opt to keep the subsea units uniform and expand through replication, instead of constantly developing larger machinery,” explains Fuglsang. Nevertheless, scaling introduces engineering challenges, particularly in optimizing power distribution and permeate manifolds for greater efficiency.
This desalination technology has the potential to offer affordable freshwater solutions if properly implemented and costs are minimized, but large-scale viability has yet to be established, notes Nidal Hilal from New York University Abu Dhabi. “Successfully integrating this solution into municipal systems will require overcoming various technological and financial hurdles over time.”
Reducing costs is crucial for wider adoption of this technology, given that traditional water acquisition methods, such as lake or aquifer pumping, remain cheaper. Key expenses for Flocean stem from membrane cleaning and maintenance. Innovations in membrane technology are underway, with Hilal’s research focusing on conductive membranes that electrically repel salt and other contaminants, which may enhance cleanliness and throughput. Efforts are also being made to recycle single-use plastics into membrane materials to boost sustainability and drive down costs. “Durable membranes and high-efficiency pumps can further decrease operational costs, while incorporating renewable energy can lower electricity expenditures,” Hilal adds.
Flocean One is anticipated to start freshwater production in the second quarter of 2026. If all goes as planned, the technology could pave the way for larger plants in other locations. “The greatest challenge lies in getting everything aligned,” Fuglsang concludes. “We need clients, government approvals, and robust financial partnerships.”
Electron Beam in Niobium Cavity: A Core Element of SLAC’s LCLS-II X-ray Laser
Credit: SLAC National Accelerator Laboratory
The Klystron Gallery at SLAC National Accelerator Laboratory is a concrete corridor lined with robust metal columns that stretch well beyond my line of sight. Yet, beneath this unassuming structure lies a marvel of modern science.
Below the gallery, the Linac Coherent Light Source II (LCLS-II) extends over an impressive 3.2 kilometers. This cutting-edge machine produces X-ray pulses that are the strongest in the world. I am here to witness it because a significant record has just been surpassed. However, an upgrade is set to take its most powerful component offline soon. When it reopens—anticipated as early as 2027—it will more than double its X-ray energy output.
“It’s like the difference between a star’s twinkle and the brightness of a light bulb,” says James Cryan at SLAC.
Dismissing LCLS-II as merely a sparkle would be profoundly misleading. In 2024, it achieved the most potent X-ray pulse ever recorded. Although it lasted a mere 440 billionths of a billionth of a second, its peak power of nearly 1 terawatt was roughly a thousand times the output of a typical nuclear power plant. Moreover, in 2025, LCLS-II set a record by generating 93,000 X-ray pulses per second, a remarkable feat for an X-ray laser.
According to Cryan, this milestone enables researchers to undertake groundbreaking studies of how particles behave within molecules after absorbing energy. It’s akin to transforming a black-and-white film into a vibrant, colorful cinematic experience. With this breakthrough and forthcoming enhancements, LCLS-II has the capacity to revolutionize our understanding of the subatomic behavior of light-sensitive systems, from photosynthetic organisms to advanced solar cell technologies.
LCLS-II operates by accelerating electrons toward near-light speeds—the ultimate velocity threshold in physics. The cylindrical device known as the klystron, which gives the klystron gallery its name, generates the microwaves necessary for this acceleration. Once the electrons attain sufficient speed, they navigate through arrays of thousands of strategically placed magnets, enabling their oscillation and producing an X-ray pulse. These pulses can be utilized for imaging the internal structure of various materials, similar to medical X-rays.
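To put “near-light speeds” in perspective: LCLS-II’s superconducting accelerator brings electrons to an energy of roughly 4 GeV (treat that figure as approximate here). Special relativity then fixes how close to the speed of light they travel:

```python
import math

ELECTRON_REST_ENERGY_MEV = 0.511  # electron rest energy, ~0.511 MeV

def beta_for_energy(total_energy_mev: float) -> float:
    """Electron speed as a fraction of the speed of light c."""
    gamma = total_energy_mev / ELECTRON_REST_ENERGY_MEV  # Lorentz factor
    return math.sqrt(1.0 - 1.0 / gamma**2)

beta = beta_for_energy(4000.0)  # ~4 GeV beam energy (approximate)
# gamma is close to 8,000, leaving the electrons within about a
# hundred-millionth of the speed of light.
```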
During my visit, I had the opportunity to tour one of several experimental halls. Here, the X-ray pulses collide with molecules, enabling a closer look at their interactions. These experimental areas resemble futuristic submarines—with heavy metal exteriors and large glass windows—engineered to exclude stray air molecules that could disrupt their experiments.
Just before my visit, Cryan and his team conducted an experiment to examine proton movements within molecules. Traditional imaging techniques struggle to provide detailed insight into proton dynamics, yet these specifics are vital for advancing solar cell technology, Cryan emphasizes.
What awaits these investigations post-upgrade when LCLS-II evolves into LCLS-II-HE? Cryan states that the enhanced capability to examine particle behavior within molecules will be significantly augmented. However, the path to upgrades is challenging.
John Schmage from SLAC notes that as the energy of the electron beam increases, the risk of stray particles becomes a significant concern. He recounts witnessing a misbehaving beam damage equipment at another facility, underlining the need for precision. SLAC’s Yuantao Ding emphasizes that all new components installed during the upgrade are designed to endure higher power, but the team must raise energy levels gradually to ensure operational integrity. “We’ll activate the beam and closely monitor its performance,” he states.
In 2026, the team plans to engage in a significant engineering initiative to align the components, followed by one to two years of meticulous setup for a staged increase in power output. If all progresses according to plan, the upgraded LCLS-II-HE will be available for global researchers by 2030. Ongoing communication between X-ray users like Cryan, and operators like Schmage and Ding, will be essential. “This tool will evolve, and we will continually enhance its capabilities,” Schmage notes.
It was a thrilling November for the Diamond family. My favorite sequel has finally launched! The original Outer Worlds mesmerized us with its Art Nouveau hues, engaged us with clever dialogue, and drew us into a classic puzzle-solving adventure in a world of underdogs versus malevolent corporate overlords, my favorite setting since Deus Ex. While the combat wasn’t groundbreaking, that hardly mattered. It was evident that a passionate team had carefully crafted this narrative, and we all became enchanted by it.
When I say “all of us,” I refer to myself and my three kids. My wife skipped out on playing The Outer Worlds because Crash Bandicoot didn’t feature in it. But the rest of us thoroughly enjoyed it, and the kids found it especially amusing that after struggling for half a day, I fled from the final boss fight and declared, “I did it.” Pretty much summed up the gaming achievements of a father with other responsibilities.
My son completed Outer Worlds 2 first. “What did you think?” I inquired.
“You’re going to hate it,” he responded.
What? How dare he assume he knows my gaming likes! If it weren’t for me, these kids wouldn’t even be into gaming. It’s bad enough they crush me in Mario Kart. Now they might take away my potential fun. I’ve decided to prove him wrong and give The Outer Worlds 2 a shot.
Reader: I didn’t enjoy it.
Much of the dialogue is filled with complaints about bosses… The Outer Worlds 2. Photo courtesy of Obsidian Entertainment
The combat is impressive, the character skill trees shine, and the speed and fluidity (on Xbox Series) are commendable.
However, the initial hour was packed with dull factional politics that make The Phantom Menace’s opening crawl seem engaging. Most characters lament their employers and personal mistakes. Everything feels broken; people suffer, are in dire straits, and medical resources are scarce. It’s practically 2025 but set in space, and the clunky, tedious dialogue reads like a LinkedIn comment.
“I was right, wasn’t I?” my son asked triumphantly as I conceded defeat after 20 hours on the third planet I explored.
“How do you know?” I challenged.
“Since playing FIFA online, I’ve never heard so much swearing during a game.”
“How did they miss the mark, son?” I probed.
“There’s no real passion or depth. They narrated the story over the phone.”
Thus began a meaningful discussion about role-playing games. We debated what succeeds and what falls flat, and what differentiates the engaging from the tedious. We concurred that a compelling RPG hinges on the storyteller’s commitment. This genre draws on the essence of Dungeons & Dragons, where imagination fuels incredible tales. For players, it can become mere number-crunching, but for storytellers, it’s pure artistry. World-building is equally vital, as seen in the sweeping vistas of Skyrim, the shadowy streets of Deus Ex, and the technomagical dystopia of Gaia in Final Fantasy VII.
Just like in tabletop D&D, graphics aren’t paramount. Years ago, I relished a month in a chaotic post-apocalyptic saga called Shin Megami Tensei, immersed in an entire world brought to life by tiny pixels on a Game Boy Advance screen.
My weak bladder and need for sleep were the only things separating me from the inhabitants of The Witcher 3. Photo: CD Projekt RED
There are bound to be characters within a world like that who pique your interest. My weak bladder and unfortunate need for sleep were the only barriers between me and the characters of The Witcher 3. The cast of The Outer Worlds 2, by contrast, felt eerily familiar, and the needlessly dense, dreary dialogue kept me from engaging with the game for more than five minutes outside of combat.
In today’s chaotic world, where “truth” is dictated by the wealthiest deceivers, and fairness is increasingly elusive, striving for success feels daunting. That’s why the true meritocracy present in RPGs appeals to me. In all video games, progress can depend on skill, but RPGs allow even those lacking natural talent to level up and earn achievements through hard work. In contrast with a harsh reality, where millions lag behind while a few thrive, RPGs present a vision of what a fairer world could look like, complete with shields, armor, and ideally, fast-travel points.
The Outer Worlds 2 was a letdown for me, but instead of escaping into the enthralling RPG I had hoped for, I found solace in an enriching exchange with my son about the game. I was reminded of the profound impact games have on our lives and how they strengthen our connections. Sometimes, even lackluster dialogue in games can inspire captivating conversations in the real world.
These ‘murder koalas’, or marsupial lions, are the highlight of the show
Apple TV
In 1999, the BBC introduced Walking with Dinosaurs, pioneering a new format of wildlife “documentaries” showcasing long-extinct species. As a fan of this genre, I found Prehistoric Planet: Ice Age, a production by BBC Studios for Apple TV, to be exceptional.
The earlier series brought dinosaurs to vivid life. Now, this third installment highlights the remarkable mammals that inhabited Earth until relatively recently.
The visuals are breathtaking. You could easily mistake the extinct creatures on screen for real footage, especially their incredibly lifelike eyes.
There were occasional awkward moments in the animals’ movements, but my discerning son remarked, “The only unreal thing is how stunning it looks.”
Paleontologists who previewed the trailer seem genuinely impressed. Ultimately, if you’re at all intrigued by extinct species, Prehistoric Planet: Ice Age is a must-watch.
What I particularly appreciate about this series is its breadth; it’s not solely focused on woolly mammoths fleeing saber-toothed cats. Iconic ice age animals are featured, including giant sloths, woolly rhinos, giant armadillos, scimitar-toothed cats, and Columbian mammoths.
This series explores not just the icy polar regions, but also global ecosystems, showcasing many lesser-known species—including some I had never heard of. The animal deemed the “king of beasts” in Ice Age Africa came as a complete surprise.
Prehistoric Planet: Procoptodon, the giant ice age kangaroo
Apple TV
Another standout was the “murder koala”, or marsupial lion (Thylacoleo). Findings from a recent study of the animal were published just this month: koalas are its closest living relatives. The marsupial lion’s inclusion suggests the producers were aware of this research beforehand. Other Australian creatures, such as the massive marsupial Diprotodon, also make an appearance.
Prehistoric Planet: Ice Age Woolly Mammoth
Apple TV
Additionally, there are charming moments, like a squirrel trying to eat a fruit resembling a giant cannonball, reminiscent of the animated film series Ice Age.
I found the change from David Attenborough to Tom Hiddleston as narrator to be somewhat distracting, as Loki’s voice felt out of place at times.
Interestingly, the series avoids graphic content, perhaps considering a younger audience. I’ll refrain from specifics to avoid spoilers, but I was quite surprised by this approach.
My primary critique is that the final segment discussing the science is brief. I would have preferred more insights from the featured experts, particularly regarding the evidence and rationale behind the actions depicted. Many New Scientist readers might agree with this sentiment, although it could just be my perspective.
While the initial scientific trivia outlines why the ice ages persisted for so long, it curiously omits carbon dioxide’s role: the drawdown of CO2 was crucial in initiating the ice ages, and CO2 feedbacks significantly amplified the effects of orbital variations.
Lastly, keep an eye out for direwolves. I’ve extensively covered claims of reviving the dire wolf via gene editing on the gray wolf, noting the misconceptions stemming from the fantasy portrayals in Game of Thrones. This series offers a high-quality, accurate artistic representation of a real animal.
Ultimately, this science-based depiction of extinct creatures is a remarkable achievement. The direwolves aren’t just large white wolves; this portrayal captures their distinctive head shape and brownish fur.
Prehistoric Planet: Ice Age Direwolf
Apple TV
For me, bringing extinct animals to life on screen is the closest we will get to de-extinction. As we near the end of a lengthy ice age, we face the stark reality that there is no longer a habitat for these extraordinary species on our planet.
If Elon Musk can elevate Tesla’s shareholder value to over $8 trillion within the next decade, he may become the world’s first trillionaire.
This is contingent upon shareholders endorsing a revised compensation plan for the company’s “superstar CEO,” as one judge once referred to him. The annual general meeting is set to take place Thursday afternoon in Austin, Texas.
“If Mr. Musk fulfills all performance benchmarks under this principles-based 2025 CEO Performance Award, his leadership will position Tesla as the most valuable company in history,” states the company’s annual proxy statement.
Musk’s prospective $1 trillion compensation isn’t the sole matter on the agenda. Shareholders will also evaluate alternatives for paying Musk the estimated $56 billion still owed from his 2018 compensation plan. Moreover, the company urges shareholders to reject several other proposals, including one advocating for child labor audits. The 2018 package has twice been invalidated by Delaware courts, with an appeal pending before the state Supreme Court, and the company aims to ensure Musk is compensated regardless of the ruling.
The road to $1 trillion
The 2025 package encompasses goals beyond merely increasing the company’s market capitalization.
The defined milestones are split into 12 “tranches,” each with its own objective. The first tranche requires a market capitalization of $2 trillion. The next nine each require roughly $500 billion of additional growth, and the final two require jumps of about $1 trillion apiece, culminating in $8.5 trillion by 2035. Every financial milestone is paired with product-development requirements.
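Laying the tranche schedule out explicitly helps check the arithmetic. The exact step sizes below are an assumption consistent with the figures reported here (a $2 trillion start, nine $500 billion steps, then two larger jumps to reach $8.5 trillion), not a quotation from the proxy statement:

```python
def tranche_milestones_trillions() -> list:
    """Market-cap milestones for the 12 tranches, in trillions of dollars.

    ASSUMED schedule: $2T start, nine $0.5T increments to $6.5T, then
    two $1T jumps to $8.5T. Twelve milestones in total.
    """
    caps = [2.0]
    for _ in range(9):
        caps.append(caps[-1] + 0.5)           # tranches 2 through 10
    caps += [caps[-1] + 1.0, caps[-1] + 2.0]  # tranches 11 and 12
    return caps

caps = tranche_milestones_trillions()  # 12 values, ending at 8.5
```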
To secure an additional 12% equity stake in the company over the next decade, Musk must also deliver 20 million Tesla electric vehicles, obtain 10 million active fully autonomous driving subscriptions, launch 1 million humanoid robots, and put 1 million robotaxis into commercial use. Additionally, he is expected to lift the company’s adjusted earnings to $400 billion across four consecutive quarters; the comparable figure for Q3 2025 was about $4.2 billion, a 9% year-over-year decline.
Ultimately, Musk must increase Tesla’s market capitalization from around $1 trillion to $8.5 trillion by 2035. He must also invest in the company for at least seven-and-a-half years and contribute to developing a long-term succession plan. As he amplifies his company’s value, the value of his shares—and his wealth—will consequently rise.
The company noted in its proposal that achieving these milestones “will be extremely difficult and challenging for both Tesla and Musk personally.” Realizing these financial targets would position Tesla to be valued similarly to the combined worth of Meta, Microsoft, and Google’s parent company, Alphabet.
Some believe Musk is capable of achieving this. And even if he falls short of some milestones, he would still make billions.
Courting a “superstar CEO”
Tesla board chair Robyn Denholm issued a public warning recently, stating that a “no” vote on the 2025 compensation plan could jeopardize Musk’s continued tenure as CEO.
In a memo to shareholders, Denholm and board member Kathleen Wilson-Thompson acknowledged that Musk “has not received meaningful compensation in eight years,” owing to a legal dispute over his 2018 compensation plan. They emphasized that Musk’s achievements under that earlier agreement elevated Tesla’s market capitalization to $735 billion.
Should Musk secure a new compensation plan alongside his 2018 package, he would ultimately control over 25% of Tesla shares. As of November 5, Tesla stock was trading around $450 per share, close to its 52-week high.
“We firmly believe that backing this proposal aligns management and shareholder interests for the best outcomes for everyone involved,” stated the investment firm Schwab, which emphasized that it does not rely solely on Glass Lewis or ISS recommendations.
Simultaneously, Norges Bank Investment Management, Norway’s sovereign wealth fund and Tesla’s seventh largest shareholder, declared its intention to vote against the proposed salary package.
“In line with our stance on executive compensation, we are concerned about total compensation, dilution, and insufficient risk mitigation for key personnel,” stated Norges.
In addition to support from current board members and a surge of messages on Musk’s own social media platform X, other stakeholders have also voiced support for the proposal, with at least three additional investment firms already committed to backing it.
Musk, as Tesla’s largest individual shareholder with over 500 million shares, can technically vote in favor of his own pay structure.
“If controlling shareholders could endorse their own compensation, it would undermine a sense of accountability,” remarked Lawrence Hamermesh, a professor emeritus at Widener University Delaware Law School and a former corporate lawyer.
Tesla’s new headquarters
Tesla has consistently offered its CEO an incentive-based compensation plan, stipulating specific milestones for stock options.
Nevertheless, the most recent compensation package, established in 2018, faced a legal challenge in Delaware’s Court of Chancery from a shareholder holding fewer than a dozen shares. He prevailed, and the package was invalidated.
In response, Musk criticized the court and moved Tesla’s state of incorporation from Delaware to Texas. His public dissatisfaction with the Delaware ruling is believed to have accelerated #DExit, a movement in which other major companies, including Dropbox and Meta, have contemplated moving their incorporation out of Delaware.
“Elon Musk wields significant influence, which extends into corporate law,” commented Eric Talley, a professor at Columbia Law School. Delaware’s reputation as a “corporate mecca” remained relatively intact “until 2024, when Elon Musk endeavored to rally support,” he added.
The Outer Worlds 2 was first revealed in June for £70/$80, becoming the priciest game on Xbox at that point. However, this status was short-lived, as Microsoft quickly reverted to the typical £60/$70 price point after just a month. Although The Outer Worlds 2 is larger than its 2019 predecessor, the decision was prudent. This game does not warrant a £70 price tag.
Nonetheless, it offers a delightful experience that can easily consume your time, enhancing the original game significantly. With improved combat and more intricate role-playing elements, The Outer Worlds 2 smartly expands its scope without overextending its narrative, even if the storyline doesn’t quite deliver the same level of satisfaction.
You don’t need to have played the first game to grasp the sequel’s premise. You take on the role of an agent for the slow-moving, “benevolent” space police known as the Earth Directorate, with a mission to impose order on the galaxy known as Arcadia. Much like Halcyon in the original, Arcadia is in disarray due to the rampant spread of capitalism. Players wield significant power to form new alliances and mend old ones amidst various groups of conflicting ideologies.
Frequent conflict between factions is a given, but there are more pressing issues than the divide among them. You soon discover that a rift in the universe poses a serious threat. This concept is introduced early in the game, where you investigate these rifts caused by the Protectorate, an authoritarian group that is altering the universe’s fabric. Upon your arrival, betrayal from trusted allies leads to a decade spent in suspended animation.
Colorful vendors…The Outer Worlds 2. Photo courtesy of Obsidian Entertainment
Upon waking ten years later, you’ll find that Arcadia has undergone significant changes, with the rift expanding uncontrollably. To save the galaxy, you’ll need to assemble a new crew.
Unfortunately, the narrative peak experienced at the beginning of The Outer Worlds 2 is not echoed throughout the game. A lingering sense of disappointment followed me as I spent about 30 hours journeying through the plot. I hoped for unexpected character developments or story twists, but instead found myself predictably nodding through most scenes. There are intriguing characters present, such as psychopathic cultists and spies, yet I struggled to bond with this team, much as in the first installment.
Despite the presence of three major factions vying for attention and favor, The Outer Worlds 2 offers limited impactful choices. Awkward dialogues with narrow-minded capitalists compel you to reflect on your involvement in the actual economic system, but the simplistic portrayals of characters provide more insight than substantial moments.
If your “speech” skills are sufficiently high, you can often navigate most situations with ease. This can feel jarring, especially when a blind follower of the Protectorate changes her stance drastically after a specific dialogue choice, yet is content to monologue as you walk out of a boss fight. In almost every significant conflict, your choices feel as though they carry minimal weight.
Attribute firepower…The Outer Worlds 2. Photo courtesy of Obsidian Entertainment
This aspect made it difficult for me to engage fully with the storyline. While the characters around me may have strong opinions, they rarely seem to exhibit genuine anger, even when I disregard their beliefs. The Outer Worlds 2 lacks a certain persuasiveness, which is disappointing because its clever humor often brings joy but seldom serves to deliver a biting ideological critique.
The game features impressive depth in its combat and role-playing systems, significantly refining what was introduced previously. While gunfights can sometimes drag and frustrate, the diverse range of weaponry and their varying effects add excitement to each encounter. Elemental damage influences enemies in different ways, each requiring distinct ammunition. Running low on energy during a tough skirmish forces creativity, and I’ve often found myself trying weapons I hadn’t used before out of sheer necessity, which turned out to be quite enjoyable.
Character development is more intricate than in the original game, featuring a “flaws” system that tracks your actions and may even prompt you to adopt traits that bear both negative and positive consequences. For instance, depending on item crafting can lead to acquiring the “Hermit” flaw, which doubles vendor prices, while dismantling junk can yield opportunities for additional items. This became particularly helpful during crafty moments but posed challenges when I had limited resources and needed vendors. This filled my experience with fun dilemmas, prompting me to rethink my approach.
Developer Obsidian has had a stellar year, launching the fantasy adventure Avowed alongside the insect-filled survival game Grounded 2. All three titles showcase the studio’s remarkable ability to craft diverse worlds that adapt to player choices.
While The Outer Worlds 2 may not consistently amaze, it does offer significant value as an engaging role-playing experience that can keep you entertained for hours. The focus isn’t necessarily on reinventing the wheel but rather on enhancing its framework. In essence, it’s a gratifying, reliable experience—satisfying yet rarely surprising—capable of delivering joy quite regularly.
On Thursday, President Donald Trump granted a pardon to the founder of the largest cryptocurrency exchange globally.
The White House issued a statement saying, “President Trump utilized his constitutional powers by pardoning Mr. Zhao, who was prosecuted by the Biden administration as part of its war on cryptocurrency. The war on crypto is over.”
Changpeng Zhao stepped down as CEO of Binance in late 2023 after pleading guilty to one count of failing to maintain an anti-money-laundering program, alongside a $4.3 billion payment to resolve associated accusations. He received a four-month prison sentence.
Zhao, commonly known as CZ, ranks among the wealthiest individuals globally and is a prominent figure in the cryptocurrency industry. He built Binance into the largest cryptocurrency exchange; however, its operations in the United States have been prohibited since his guilty plea in 2023.
The pardon from President Trump marks a significant triumph for Zhao and Binance after a period of lobbying and speculation. It also signifies a shift towards reduced scrutiny of the cryptocurrency sector by the Trump administration, even as the president and his family develop their own crypto business empire worth billions.
A spokesperson from Binance commented, “Today brings remarkable news regarding CZ’s pardon. We express our gratitude to President Trump for his guidance and dedication to making the United States the leading hub for cryptocurrency.”
During a press interaction on Thursday, President Trump addressed the pardon, minimizing Zhao’s offenses and asserting that he had no previous relationship with the cryptocurrency mogul.
In response to a query from a reporter about the decision, President Trump remarked, “Are you referring to the crypto individual? Many assert that he did nothing wrong. They claim his actions weren’t even criminal. It was persecution from the Biden administration, leading me to pardon him upon request from a number of esteemed individuals.”
Representatives from the Trump family’s crypto venture have discussed acquiring a stake in Binance.US, the company’s American arm, The Wall Street Journal reported earlier this year. Zhao has denied that he was negotiating any agreement in return for clemency.
“Fact: I have never discussed my arrangement with Binance US with…well, anyone,” Zhao stated in a post on X in March. “Serious criminals wouldn’t be concerned about pardons,” he added.
However, Binance has significantly contributed to the growth of the Trump family’s World Liberty Financial cryptocurrency enterprise. Earlier this year, when Binance entered into a $2 billion agreement with a UAE investment fund, the payment was made using a cryptocurrency developed by World Liberty Financial. This enhanced the legitimacy of the Trump family’s digital currency and proved to be a highly profitable move for Binance.
In May, Zach Witkoff, the founder of the Trump family’s cryptocurrency entity, expressed at a press conference in Dubai to unveil the deal: “We appreciate the confidence that MGX and Binance have placed in us.”
A group of Democratic senators, including Elizabeth Warren, the ranking member of the Senate Banking, Housing, and Urban Affairs Committee, issued a statement after the May agreement, expressing concerns that Binance and the Trump administration may be seeking a deal that enriches the president.
“As the administration eases oversight of industries violating money laundering and sanctions regulations, it is not surprising that Binance, which has acknowledged prioritizing its growth and profits over compliance with U.S. law, would seek to eliminate the supervision mandated by the settlement,” the senators remarked.
The U.S. Department of Justice’s case against Binance alleged that the company failed to report over 100,000 suspicious transactions to law enforcement, including those involving U.S.-designated terrorist entities such as Al Qaeda and Hamas. The Securities and Exchange Commission also sued the company in 2023, but dropped the case shortly after President Trump assumed office.
“Today’s solar panels will inevitably reach the end of their lives and will require recycling or disposal.”
Jacques Hugo/Getty Images
By the mid-2020s, solar energy had become a major player. It emerged as the most affordable form of power generation and was also one of the fastest-growing sources of energy. The lifespan of solar panels had extended significantly, lasting around 30 to 40 years. However, eventually, these panels would need to be recycled or disposed of. By 2050, predictions indicated that there could be as much as 160 million tonnes of solar module waste. While this amount was considerably less than that produced by fossil fuel sources, it still posed a challenge.
Researchers began exploring how to create self-healing and even self-organizing solar panels.
By the mid-2030s, advancements had led to the creation of live solar panels, known as biophotovoltaics (BPV), which were deployed globally. The aesthetically pleasing, natural look of this technology made it popular, leading to the mantra of “yes, in my backyard,” and rapid adoption of living sunlight technology.
One of the first benefits was easily observed in off-grid rural areas, particularly in sub-Saharan Africa, where BPVs provided energy for mobile phones and computers without the need for batteries. As the technology progressed, older buildings were revamped into BPVs resembling green walls and roofs, while new structures incorporated living solar panels right from the design phase, allowing more people to become less dependent on traditional grid energy. This also helped boost local biodiversity and enhance overall happiness.
BPV operates like a fuel cell: electrons flow from the anode to the cathode through an external circuit, generating electricity. In the biological version, the electrons are produced by photosynthetic organisms and transferred to the anode.
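To get a sense of scale, a BPV cell’s electrical power is simply current times voltage. The figures below are illustrative assumptions, not numbers from the article; they are chosen to show why early BPV devices could only run very low-power electronics such as small computers and phones:

```python
# Illustrative only: the current density and voltage are assumed values
# for a small biophotovoltaic (BPV) cell, not figures from the article.
current_density_ua_per_cm2 = 1.0   # assumed photocurrent, microamps per cm^2
cell_voltage_v = 0.5               # assumed operating voltage, volts
area_cm2 = 100.0                   # a 10 cm x 10 cm panel

# Power (microwatts) = current (microamps) x voltage (volts)
power_uw = current_density_ua_per_cm2 * area_cm2 * cell_voltage_v
print(f"{power_uw:.0f} microwatts")  # → 50 microwatts
```

Tens of microwatts is enough for ultra-low-power chips, which is why scaling up current harvesting was the key step before larger devices could be powered.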
Back in 2011, scientists became intrigued by the phenomenon of electrical leakage from cyanobacteria in sunlight. They discovered that by placing cyanobacteria on electrodes, they could harvest current to power small electronic devices.
However, the electrical output was weak due to insufficient electron leakage from the bacteria. Scientists like Chris Howe from Cambridge University worked on genetically modifying cyanobacteria to enhance electron leakage, allowing them to be connected to electronic devices.
In 2022, Howe’s team found that they could power computers solely using photosynthesis. Soon after, scientists made significant strides in their ability to scale up current harvesting and develop devices powered by biological energy sources worldwide.
“Members of Homo Photosyntheticus pledged to limit their electricity consumption strictly to that derived from photosynthesis”
With the improvement in BPV technology, larger devices like mobile phones and refrigerators began operating on batteries charged by living solar cells. Electric vehicles could be charged using arrays of biological solar panels installed in garages and depots, leading to a reduced need for metals like lithium and manganese.
Remarkably, the devices continued to function in low light. At night, the cells metabolized compounds created during the day, producing a comparable amount of electrons to maintain power.
The rise of living solar technology had numerous implications. As buildings adopted a green aesthetic, urban planners started integrating more nature into streets and public areas. Even densely populated cities began to exhibit a vibrant green atmosphere, teeming with trees, plants, flowers, and wildlife.
The success of BPVs inspired a movement focused on integrating the organelles of plant cells responsible for photosynthesis. This enthusiastic group, identifying as members of Homo Photosyntheticus, drew inspiration from solar-powered sea slugs and incorporated chloroplasts sourced from plant leaves into their own biology.
Sea slugs have evolved methods to sustain and manage chloroplast functionality; however, they sometimes require additional chloroplasts. They possess a leaf-like structure that maximizes surface area, yet the energy obtained through photosynthesis only meets a small fraction of their energy requirements. For humans, without the cellular infrastructure to support chloroplast function or leaf-like shapes, this method could only yield negligible energy.
Nevertheless, for self-identified members of H. Photosyntheticus, the incorporation of chloroplasts held significant symbolic meaning. They engaged in what they referred to as “greening,” committing to use only electricity generated directly through photosynthesis and eschewing fossil fuels altogether. Additionally, they commonly tattooed chloroplasts on their skin as a visible testament to their dedication.
Carbon Capture and Storage Cement Plant in Padeswood, Wales
Padeswood CCS
Commercial carbon capture systems for cement facilities are currently being rolled out, signaling a potential turn towards net-zero emissions for one of the most challenging sectors in the industry.
As reported by German company Heidelberg Materials, the inaugural carbon capture cement plant has been operational in Norway since June, with the first “carbon cement” products slated for delivery to the UK and other European countries next month.
In tandem, construction of carbon capture infrastructure at the Padeswood cement plant in North Wales is set to commence shortly, following a subsidy agreement revealed this week between the UK government and Heidelberg representatives. Several similar facilities are also in the pipeline for Sweden, Germany, and Poland.
This advancement represents a critical leap forward in the cement industry’s quest to cut emissions, a long-recognized hurdle in decarbonization efforts. “That’s significant progress,” states Paul Fennell of Imperial College London, referring to the projects in Norway and the UK.
Cement contributes to roughly 8% of global carbon emissions, according to Chatham House, a think tank. Much of this carbon dioxide is emitted by the chemical processes that create clinkers, the primary component of Portland cement, the most widely used construction material. “Regular Portland cement production inherently generates substantial CO2 due to essential chemical reactions,” Fennell explains.
Capturing CO2 generated by these processes is regarded as the only viable option for significantly decarbonizing cement production. Yet this method is costly, with estimates of 50-200 euros per tonne needed to capture, transport, and permanently store carbon from European cement operations, as outlined in an analysis by the Bank of Netherlands.
Heidelberg’s Brevik plant in Norway benefits from government subsidies. Its carbon capture infrastructure captures about 50% of the cement facility’s overall emissions. It works by scrubbing CO2 from the plant’s exhaust using amines, a class of ammonia-derived solvents. The extracted CO2 is then released from the solvent, liquefied, and stored beneath the seabed off Norway.
The Padeswood plant employs similar amine technology, but when its carbon capture and storage system becomes operational in 2029, it is projected to capture around 95% of the plant’s emissions, according to Simon Willis, the UK CEO of Heidelberg Materials. That translates to approximately 800,000 tonnes of CO2 annually. Padeswood is expected to sequester a larger share of its CO2 than Brevik, largely because Brevik lacks the additional energy supply required to push its capture rate to 95%.
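Combining two figures from the article, and assuming the 50-200 euro estimate is read as euros per tonne of CO2, gives a rough sense of what running Padeswood’s capture system might cost each year. This is a sketch, not a figure from the funding agreement, whose details remain undisclosed:

```python
# Back-of-the-envelope annual cost of capture at Padeswood.
# Assumption: the article's 50-200 euro range is per tonne of CO2.
tonnes_per_year = 800_000          # projected CO2 captured annually
cost_low_eur, cost_high_eur = 50, 200  # euros per tonne (article's range)

annual_low = tonnes_per_year * cost_low_eur    # 40 million euros
annual_high = tonnes_per_year * cost_high_eur  # 160 million euros
print(f"roughly {annual_low / 1e6:.0f}-{annual_high / 1e6:.0f} million euros per year")
```

Numbers of this order explain why both Fennell and Black see government subsidy as essential for the first fleet of capture-equipped plants.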
Construction is expected to start in the coming weeks, with the UK government agreeing to subsidize the operational costs of the technology—although details of this funding agreement remain undisclosed. “The fundamental premise is that the government is providing us with funds to assist in establishing and operating carbon capture facilities,” Willis states.
According to Leon Black from the University of Leeds, UK, government financial backing is crucial for constructing the initial fleet of cement plants focused on carbon capture and storage. “Carbon capture and storage would not be feasible without governmental aid,” he asserts.
However, emerging technologies hold the promise of enhanced energy efficiency, and costs are anticipated to decrease over time. In Germany, for instance, Heidelberg is collaborating with a consortium exploring oxyfuel technology, which involves recirculating exhaust gases back to the burner, raising the CO2 concentration in the exhaust to around 70% and thereby making the carbon capture process more efficient.
Maria Branyas Morera, pictured alongside scientist Manel Esteller, contributed to research aimed at uncovering the secrets of her longevity
Manel Esteller
From January 17, 2023, until her death on August 19, 2024, at the age of 117 years and 168 days, Spain’s Maria Branyas Morera was formally recognized as the oldest person in the world. To investigate the secrets behind her remarkable longevity, a team of researchers explored her genetics, microbiome, and lifestyle.
When Morera was 116, the researchers gathered samples of her blood, saliva, and stool for genetic analysis. “Her genome was exceptional, enriched with variants known to extend lifespans in other species such as dogs, worms, and flies,” noted team member Manel Esteller at the Josep Carreras Leukemia Research Institute in Barcelona, Spain.
Showing no signs of dementia, Morera also possessed numerous genetic variants that helped maintain low blood lipid levels, protecting her heart and cognitive functions, according to Esteller. “Simultaneously, she lacked genetic mutations linked to conditions such as cancer, Alzheimer’s disease, or metabolic disorders.”
The researchers discovered that her lipid metabolism was one of the most efficient ever recorded. “Her lipid profile was remarkable, with very low cholesterol,” Esteller mentioned. “This efficiency was tied to her modest diet and genetic traits that enabled the rapid metabolism of damaged molecules.”
Esteller noted that Morera abstained from alcohol and smoking and adhered to a Mediterranean diet comprising vegetables, fruits, legumes, and olive oil, along with three servings of sugar-free yogurt daily.
Further assessments indicated that Morera maintained a robust immune system typically seen in younger individuals, alongside a gut microbiota characteristic of much younger people.
One of the most “astonishing” findings was a high concentration of Actinobacteriota bacteria in her gut, including well-known probiotics like Bifidobacteria. This abundance typically declines with age but tends to increase among centenarians and supercentenarians, offering various anti-aging benefits, such as reducing inflammation.
The researchers believe that her yogurt intake may have continually replenished her levels of Bifidobacteria. “This may suggest that dietary interventions can be linked to prolonged lifespan by influencing gut microbiota, along with preventing obesity and other health issues,” Esteller added.
Lastly, scientists examined whether there was a significant difference between Morera’s biological age and her chronological age by constructing an epigenetic clock based on her DNA methylation, the process of adding or removing chemical tags that regulate gene expression. “Her biological age appeared 23 years younger than her actual age, contributing significantly to her longevity,” remarked Esteller.
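Epigenetic clocks of the kind described here are typically linear models: a weighted sum of methylation levels (beta values between 0 and 1) at selected CpG sites, plus an intercept, trained to predict age. The sketch below uses invented sites, weights, and values purely for illustration; it is not the clock from the study, and real clocks use hundreds of sites:

```python
# Minimal sketch of a linear epigenetic clock. All coefficients and
# beta values below are hypothetical, chosen only to show the mechanics.
def predict_age(betas, weights, intercept):
    """Predicted biological age from methylation beta values in [0, 1]."""
    return intercept + sum(b * w for b, w in zip(betas, weights))

weights = [40.0, -25.0, 15.0]   # hypothetical per-CpG-site weights
intercept = 60.0                # hypothetical baseline
betas = [0.8, 0.6, 0.2]         # hypothetical measured methylation levels

print(predict_age(betas, weights, intercept))  # → 80.0
```

A "biological age 23 years younger" result simply means the clock's prediction came out well below the person's chronological age.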
Previous studies indicate that supercentenarians may carry genetic mutations associated with various medical conditions, such as Alzheimer’s disease and cardiovascular issues. Nevertheless, they somehow manage to overcome these obstacles and attain extraordinary lifespans. “There are limited studies on supercentenarians, and many only focus on one aspect, like microbiomes,” explained Esteller. “Our research demonstrates that overcoming such maladies is a blend of advantageous genetics and other elements, including beneficial gut microbiota, delayed biological aging indicated by a youthful epigenome, and lifestyle factors such as avoiding smoking, alcohol, and maintaining a low-fat diet.”
Richard Faragher from the University of Brighton in the UK acknowledged that the study highlights the plethora of assessments available to longevity researchers, while cautioning that a case study of one individual risks being perceived as a scientific “Just So Story.”
He explains that there are two possible reasons behind the survival of extremely long-lived individuals: “First, there’s something extraordinary about them, perhaps genetically, and second, survival bias due to their fortunate circumstances,” said Faragher.
And if luck played a role, he suggests, supporting evidence would be whether Morera came from a family with a history of long lifespans, something the study did not document.
Mummies are commonly linked with Egypt and date back around 4,500 years. However, researchers have discovered mummies that are significantly older on the opposite side of the globe.
“We found several archaeological sites in southern China and Southeast Asia, where human burials dated between 4,000 and 14,000 years have been identified,” said Professor Peter Bellwood, co-author of the study, during a phone interview on Tuesday.
The study, published on Monday in the Proceedings of the National Academy of Sciences, analyzed 54 pre-Neolithic burials from 11 archaeological sites in southern China and Southeast Asia. The samples include many from the Guangxi autonomous region, as well as from Vietnam, the Philippines, Laos, Malaysia, and Indonesia.
Human remains were often found in crouched or squatting positions, frequently showing signs of burning. Researchers confirmed that many of these bodies had undergone mummification, being preserved for a considerable time before burial.
Burials of partially skeletal bodies were frequently observed in pre-Neolithic sites in southern China and Southeast Asia. Hirofumi Matsumura
Bellwood, an archaeology professor at the Australian National University in Canberra, noted that before this discovery, the oldest known mummies were those found in modern-day Peru and Chile, not Egypt.
Modern smoke-dried mummies from the Jayawijaya highlands of Papua, Indonesia, closely resemble the pre-Neolithic burials recorded in southern China and Southeast Asia. Hirofumi Matsumura
These discoveries have also garnered attention from leading experts in ancient Egyptian studies.
“The term has been adopted by various groups to refer to other preserved bodies, leading to a broader understanding of the concept,” stated Salima Ikram, a professor of Egyptology at the American University in Cairo who was not affiliated with the study.
“What’s positive is that the underlying ideas are similar, as these cultures aimed to preserve themselves,” she added.
The project began in 2017 with a casual conversation between the two lead authors and subsequently grew to include 24 experts.
“Over the years, we’ve gradually assembled various pieces of evidence,” said Hsiao-Chun Hung, the study’s lead author, in an email. “It’s akin to a detective’s work, where I find small clues, piece them together, and become increasingly confident in my hypothesis.”
Chamkaur Ghag plays a pivotal role in the Lux-Zeplin experiment, a leading dark matter detector
Nova
Deep underground in South Dakota, the most advanced dark matter detector on Earth awaits its moment of discovery. This is the Lux-Zeplin (LZ) experiment, built around a vast tank of liquid xenon. Physicist Chamkaur Ghag of University College London is among the key leaders of this large scientific collaboration, which aims to pin down the dark matter thought to make up about 85% of the matter in the universe.
Currently, Ghag and his team find themselves at a crucial juncture in the quest for this elusive substance. They are considering plans for a larger detector called XLZD, which promises to be many times the size of LZ and even more sensitive. However, if neither detector can uncover dark matter, physicists may need to reassess their understanding of what it is. As Ghag suggests, future dark matter detectors may not be massive underground structures but smaller, unassuming devices; he has already devised a prototype of one ahead of his talk at New Scientist Live this October.
Leah Crane: To start, why is dark matter so essential?
Chamkaur Ghag: On the one hand, we have everything that particle physics tells us about the components of matter. On the other, we understand gravity. While this may seem comprehensive, a significant problem arises when you try to merge gravity and particle physics. Our galaxy shouldn’t exist as it does: it is held together by gravity that seems to come from unseen matter. And this isn’t just a tiny bit of glue; around 85% of the matter in the universe is this so-called dark matter.
Why have our efforts to find it been so prolonged, with little success?
At present, we hypothesize that dark matter likely consists of what we term WIMPs: weakly interacting massive particles that originated in the early universe. These rarely interact with other particles, providing only a faint signature, which necessitates a large detector; the larger the detector, the greater the chance that a dark matter particle will interact inside it. The detectors must also be extremely quiet, since even slight background disturbances can obscure the signal.
We discuss the theoretical landscape of dark matter, which encompasses the range of masses and characteristics such particles could possess. We’ve already excluded certain regions of this landscape, making it essential to delve even deeper underground with larger detectors to explore where dark matter may still exist.
This painstaking endeavor requires minimizing background noise. For instance, many metals are slightly radioactive, necessitating rigorous efforts to select low-background construction materials. The LZ detector boasts the lowest background noise and the highest radio-purity on the planet.
The LZ is currently the most sensitive detector we have. How does it function?
In essence, it operates like a double-walled thermos flask containing several tonnes of liquid xenon. The xenon sits within a reflective tank equipped with light sensors above and below, and an electric field is applied across the tank. When a WIMP collides with a xenon nucleus, it generates a brief flash of light. The collision also knocks electrons free, and the electric field drifts them to the top of the tank, where they produce a second flash.
This two-signal output lets us pinpoint exactly where an event occurred. The relative intensity of the primary and secondary flashes tells us about the microphysics: whether the interaction was caused by a WIMP or by something mundane, such as a gamma ray. To keep the detection environment quiet, we sit about a mile underground to shield against cosmic rays, and the detector itself is immersed in a tank of water to shield it from radioactivity in the surrounding rock.
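The two-flash discrimination Ghag describes can be sketched in code. This is only a toy illustration, not LZ's actual analysis: the class names, the photon counts, and the threshold below are invented for the example, and real experiments calibrate their discrimination bands from data taken with known radioactive sources.

```python
# Toy sketch of discriminating event types by the ratio of the secondary
# (ionization) flash to the primary (scintillation) flash. Nuclear recoils,
# the WIMP-like events, tend to give a lower ratio than electron recoils,
# the gamma-ray-like backgrounds. The threshold is invented for illustration.

def classify_event(s1_photons: float, s2_photons: float,
                   threshold: float = 100.0) -> str:
    """Return a crude event class from the two flash intensities."""
    if s1_photons <= 0:
        return "invalid"
    ratio = s2_photons / s1_photons
    if ratio < threshold:
        return "nuclear-recoil candidate"
    return "electron-recoil background"

print(classify_event(s1_photons=10, s2_photons=400))   # low ratio: candidate
print(classify_event(s1_photons=10, s2_photons=5000))  # high ratio: background
```

A real analysis would use calibrated 2D bands in the S1-S2 plane rather than a single ratio cut, but the underlying idea, that the two flashes together identify the kind of particle that struck, is the same.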
This endeavor is undoubtedly complex. What has been the most challenging aspect of making it operational?
From an earlier experiment with a smaller prototype called LUX, I understood what was required to create an instrument ten times more sensitive. Bringing that theoretical knowledge into practice proved challenging. For me, the toughest part lay in keeping the instrument clean and quiet enough to achieve the required sensitivity. With LZ, that meant tolerating no more than about a gram of dust spread across a surface area equivalent to a football pitch.
What is it like working with such an ultra-clean detector underground?
The environment, once a gold mine, retains its industrial atmosphere. You don a hard hat, descend a mile down, and then trek to the lab. Upon entering the lab, you lose sense of the surroundings; it transforms into a clean room filled with computers and equipment—essentially a lab devoid of windows. But the journey underground feels otherworldly.
Outer detector of the LUX-ZEPLIN experiment
Sanford Underground Research Facility/Matthew Kapust
Historically, WIMPs have been the primary suspect for dark matter. If we find no evidence, at what point do we consider the WIMP hypothesis invalid?
If we build XLZD, the larger detector intended for this purpose, and reach a point where it fails to detect WIMPs, it would be hard to sustain the idea of a standard WIMP, because we would have to venture beyond the capabilities of those instruments. Until that happens, however, WIMPs are still in the game. The gap between our current findings and what XLZD could probe remains intriguing.
We’ve also developed a much smaller, entirely different detector for dark matter. Can you tell me more about it?
We’ve engineered glass beads about 150 nanometres wide that are levitated with lasers. This highly sensitive force detector can sense interactions in three dimensions, allowing us to determine which direction an event came from. That capability is significant because it lets us filter out terrestrial backgrounds, such as radioactive decay in surrounding geological materials.
This concept seems far removed from large detectors like the LZ. What’s the logic behind its creation? Will we see further advancements in smaller detectors?
Large underground experiments, for all their sensitivity, can paradoxically be limited by their size. For instance, when a dark matter particle collides with a xenon nucleus in my detector, it may produce ten photons. A small tank can capture all of them, but in a larger tank these photons bounce around and only a few are detected.
Furthermore, if a dark matter particle is very light, it may generate only a couple of photons in the first place. In that scenario, the usable signal in a detector like LZ shrinks. That has motivated the search for low-mass dark matter particles beyond LZ's reach, pushing us toward alternative detection methods.
If dark matter were to be discovered, what implications would that hold for physics and our understanding of the universe?
The implications would be twofold: it would conclusively answer what constitutes 85% of the matter in the universe, and it would take us beyond the standard model of particle physics, which outlines the known components of reality. If we discovered dark matter, it would offer the first glimpse beyond that framework. Until now, we've had no solid evidence requiring a deviation from the standard model; this would serve as the first ray of hope.
Larry Ellison, the co-founder of Oracle, is having a remarkable year. With his friend Donald Trump residing in the White House and his son, David Ellison, at the helm of Paramount, the media company that owns CBS, he recently overtook Elon Musk to claim the title of the world's richest person.
The buzz around Oracle’s stock has further boosted his wealth, bringing Ellison’s net worth to an impressive $393 billion, overtaking Musk’s $384 billion.
While he may not have the same popularity as Musk, Ellison’s impact on Silicon Valley and the political landscape is significant. He is renowned for his extravagant lifestyle, which includes a massive yacht, a private jet, multiple marriages, and ownership of the entire island of Lanai in Hawaii.
At 81, this tech mogul has built a fortune through software development since the 1970s. He co-founded Oracle after securing a two-year contract to develop a database for the CIA. Oracle has grown into a tech giant, creating software for Fortune 500 companies worldwide and making strides in cloud computing. The rise of artificial intelligence has further benefited the company, leading to fruitful partnerships with OpenAI, the creator of ChatGPT.
“AI is a much bigger deal than the Industrial Revolution, electricity, and everything that has come before,” Ellison emphasized during an interview with former British Prime Minister Tony Blair in February.
Having served as Oracle’s CEO for 37 years before becoming Chief Technology Officer in 2014, Ellison continues to lead the board and retains over 40% ownership of the company. Notably, Oracle’s headquarters relocated from Silicon Valley to Austin, Texas, in 2020.
In addition to Oracle, Ellison was on Tesla's board from 2018 to 2022, holding shares in Musk's electric vehicle company. According to Forbes, he also owns nearly 50% of Paramount Skydance, the media conglomerate managed by his son David, which encompasses CBS, MTV, and Paramount Pictures, among others. The younger Ellison claims the company's outlets will steer clear of political affiliation, yet he hired Bari Weiss, the controversial co-founder of the Free Press, to head CBS News.
Ellison’s Connections to Trump and Netanyahu
Ellison has deep ties to the Republican Party and a close relationship with Trump, dating back to his first term. Ellison has often dined at Trump's Mar-a-Lago resort and met him in the Oval Office. Oracle has positioned itself as a leading bidder for TikTok, as Trump has delayed enforcing a law, upheld by the Supreme Court, that would ban the platform unless it is sold.
In a press conference shortly after taking office, Trump remarked that what he has in Larry Ellison "goes far beyond technology," calling him "a great guy and a great businessman."
Ellison is also closely connected to Israeli Prime Minister Benjamin Netanyahu and has made substantial contributions to the Israeli military through a nonprofit benefiting the Israel Defense Forces, including a record $16.6 million donation in 2017. Oracle did not respond to inquiries from the Guardian about any recent donations.
Ellison has hosted Netanyahu and numerous high-profile officials and celebrities on his extensive estate on Lanai. According to Bloomberg, he purchased 98% of the island in 2012, building a luxury Four Seasons resort and a hydroponic farm that produces lettuce and other vegetables. Local residents have voiced concerns over the rapid transformation of their sleepy island into an ultra-rich destination.
Musk, a close friend and competitor of Ellison in the realm of wealth, is a regular visitor to Lanai and considers Ellison a mentor. During a recent podcast with Texas Senator Ted Cruz, Musk referred to Ellison as one of the smartest individuals he’s ever encountered.
“Larry Ellison is incredibly intelligent,” Musk noted. “I believe he is one of the smartest people.”
U.S. tech mogul Larry Ellison has surpassed Elon Musk to become the wealthiest individual globally, primarily through his holdings in Oracle, the company he co-founded.
Ellison’s fortune surged after Oracle, in which he holds a 41% stake, reported stronger-than-anticipated financial performance.
In early trading, Oracle's shares skyrocketed by over 40% to around $340 each, valuing the enterprise software firm at $958 billion (£707 billion) and pushing Ellison's net worth to $393 billion.
This stock surge marks the largest single-day increase in the company’s history and represents the highest one-day wealth gain ever recorded on the Bloomberg Index. Ellison and Musk currently lead ahead of Facebook founder Mark Zuckerberg and Amazon’s Jeff Bezos.
In addition to Oracle, Ellison's wealth stems from various ventures, including a sailing team, the Indian Wells tennis tournament, real estate in Hawaii, and an investment in Tesla, the electric vehicle manufacturer, according to Bloomberg.
Musk, often at odds with other figures in business and politics, maintains a close relationship with Ellison, whom he regards as a trailblazer and mentor.
Ellison served on Tesla's board from 2018 to 2022 and invested $1 billion in Musk's takeover of Twitter, since rebranded as X. Walter Isaacson's biography of Musk recounts that when the Tesla CEO asked Ellison how much he wanted to invest in Twitter, he replied, "Whatever you recommend." The book also highlights Musk's frequent visits to Lanai, the Hawaiian island owned by Ellison.
Ellison is a known supporter of Donald Trump and regularly appears alongside the U.S. President, including at the launch of the Stargate project, which commits $500 billion to American AI infrastructure. Musk, a well-known backer of Trump's 2024 campaign, had close connections with Trump before their relationship deteriorated earlier this year.
Oracle is the dominant component of Ellison's financial portfolio, and the company has been buoyed by rising demand for cloud computing from AI firms such as ChatGPT developer OpenAI.
The world’s most advanced engines are remarkably compact, achieving astonishing levels of efficiency, mirroring some of nature’s tiniest machines.
A thermodynamic engine is the simplest mechanism for illustrating how the laws of physics govern the conversion of heat into useful work. These engines feature hot and cold regions interconnected by a "working fluid" that goes through cycles of contraction and expansion. Molly Message and James Millen at King's College London and their team have constructed one of the most extreme engines yet, using microscopic glass beads in place of a traditional working fluid.
The researchers used electric fields to trap and position the beads in tiny chambers made from metal and glass, containing very little air. To run the engine, they varied the electric field to tighten and loosen its "grip" on the beads. The sparse air particles within the chamber acted as the engine's cold section, while engineered spikes in the electric field played the role of the hot section, kicking the beads to speeds far greater than those of the surrounding air molecules. The beads' effective temperature briefly spiked to around 10 million kelvin, roughly 2,000 times the temperature of the sun's surface, even though the apparatus itself stayed cool to the touch.
This glass bead engine behaved in atypical ways. In some cycles it displayed striking efficiency, with the electric field propelling the beads at unexpected speeds, apparently extracting more energy than was put in. In other cycles the efficiency dropped below zero, as if the beads were being cooled in situations where they should have heated further. Sometimes, Message explains, a system set up with all the right mechanisms to run as a heat engine instead behaved as a refrigerator. The beads' temperature also fluctuated depending on their position within the chamber, an unexpected outcome given that the engine was designed to maintain distinct hot and cold sections.
These peculiarities can be attributed to the engine's minuscule size. Even a single air molecule colliding randomly with a bead can drastically affect the engine's performance. The usual physical laws still hold on average, but sporadic extreme events persist. Millen notes that a similar situation holds for the microscopic components of cells. "You can observe all these strange thermodynamic behaviors, which make sense on a bacterial or protein level, but are counterintuitive for larger entities like ourselves," he says.
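The role of fluctuations can be illustrated with a toy simulation. This is an invented sketch, not the team's actual system: each cycle's work output is given Gaussian noise comparable in size to the mean work itself, which is the microscopic regime the article describes. The apparent per-cycle efficiency then sometimes exceeds 100 per cent and sometimes goes negative, even though the long-run average stays ordinary.

```python
import random

random.seed(1)

# Invented parameters: mean work extracted per cycle, mean heat input per
# cycle, and the size of thermal fluctuations relative to the work.
W_MEAN, Q_MEAN, NOISE = 1.0, 3.0, 2.5
N_CYCLES = 10_000

effs = []
for _ in range(N_CYCLES):
    work = W_MEAN + random.gauss(0.0, NOISE)  # fluctuating work this cycle
    effs.append(work / Q_MEAN)                # apparent per-cycle efficiency

mean_eff = sum(effs) / N_CYCLES
print(f"mean efficiency over all cycles: {mean_eff:.2f}")
print(f"cycles with apparent efficiency > 1: {sum(e > 1 for e in effs)}")
print(f"cycles with negative efficiency:     {sum(e < 0 for e in effs)}")
```

The averaged efficiency is well below 1, as thermodynamics demands, yet individual cycles routinely look like they break the rules, which is exactly the kind of behavior the glass bead engine exhibits.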
Raúl Rica at the University of Granada in Spain says that while the new engine lacks immediate practical applications, it may deepen researchers' understanding of natural and biological systems. It also marks a technical breakthrough. Loïc Rondin at Université Paris-Saclay says the team can use this relatively simple design to investigate many unusual characteristics of the microscopic realm.
The design serves as a greatly simplified stand-in for a biological system, ideal for testing various theories, says Rondin. The team hopes eventually to apply the engine to tasks such as modeling how a protein's energy varies as it folds.
Journal reference: Physical Review Letters, in press
The Outer Worlds 2 from Obsidian is set to be the first first-party Xbox release priced at $80 (£70). With Nintendo Switch 2 games already hitting that mark and Sony's PlayStation 5 titles trending in the same direction, you might assume this development wouldn't ignite a debate among gamers. Yet it has. Rising video game prices remain a hot-button issue, especially given the ballooning budgets of today's blockbuster titles. Nevertheless, The Outer Worlds 2 promises a more expansive and intricate experience than its 2019 comedic sci-fi predecessor, and one could reasonably argue that the price reflects this added value.
I thoroughly enjoyed the original The Outer Worlds. It was vibrant with the signature dark humor expected from an Obsidian RPG (the studio behind Fallout: New Vegas). The lush, saturated universe filled with vibrant flora, offbeat activities, and eccentric characters provided joy for approximately 20 hours, although the combat left something to be desired.
According to game director Brandon Adler, Obsidian knew from the outset that gunplay in The Outer Worlds needed improvement, especially when developing a sequel. "We valued this feedback, so we completely reassessed our approach," he said during an interview following The Outer Worlds 2 presentation in Los Angeles. "We also consulted with other teams… they provided us with an extensive list of suggestions, advising us on what to target and how to improve our weapon handling."
Obsidian's research has yielded impressive results. Not only are there more weapons to engage with in The Outer Worlds 2, but they feel rewarding to fire and support a variety of combat tactics across different encounters. A solid stealth system lets players slip into enemy territory without a trace, should they choose to avoid confrontation. In the original game, firing a gun was fine, but I often felt a twinge of dread before big battles. Not so in the sequel: I enjoyed every encounter, lining up shots and pulling off moves as I descended on my targets.
"We didn't want mere tweaks. Every weapon feels distinctly unique, each with its own purpose," says Adler. "You can take these weapons, apply mods, and create all sorts of imaginative combinations."
The game also harnesses the capabilities of its upgraded engine (Unreal Engine 5) and modern hardware, making the world feel larger than before. For instance, entering a building no longer requires a loading screen that momentarily pulls players out of the experience. "These small details contribute significantly to the overall atmosphere," Adler states. "Exploration is paramount for me; I want players to feel compelled to delve into this expansive universe, explore every avenue, and investigate even the tiniest features."
Outer Worlds 2 promises a more expansive setting, refined combat mechanics, and an abundance of customization options, alongside its role-playing elements that can dramatically affect gameplay, including enhanced character perks and flaws. Although Adler refrained from commenting on the $80 price point, it is evident that this sequel stands on its own merit.
Images and videos from the Vera C. Rubin Observatory, drawn from more than 10 hours of test observations, were revealed Monday at an event live-streamed from Washington, DC.
Keith Bechtol, an associate professor in the University of Wisconsin-Madison physics department, has contributed to the Rubin Observatory for nearly a decade as a system verification scientist, ensuring that all components of the observatory function properly.
He mentioned that the team was present as images streamed in real-time from the camera.
"In the control room, there was a moment when all the engineers and scientists gazed at these images. We were able to see more detail in stars and galaxies," Bechtol explained to NBC News. "Understanding this on an intellectual level is one thing, but on an emotional level, I realized I was part of something truly extraordinary, all happening in real time."
One of the newly released images shows the Rubin Observatory capturing galaxies billions of light-years away, alongside asteroids in the solar system and stars in the Milky Way.
“In fact, most of the objects captured in these images exhibit light that was emitted before our solar system was formed,” highlighted Bechtol. “We are witnessing light that reflects billions of years of the universe’s history, and many of these galaxies have never before been observed.”
Astronomers have eagerly awaited the first images from the new observatory, which they say will help unravel some of the universe's greatest mysteries and revolutionize our understanding of the cosmos.
"We are entering the golden age of American science," said Harriet Kung, acting director of the Department of Energy's Office of Science, in a statement.
"We anticipate that the observatory will provide profound insights into our past, our future, and potentially the fate of the universe," Kung remarked during Monday's event.
The Vera C. Rubin Observatory is jointly managed by the Department of Energy and the National Science Foundation.
Named after an American astronomer renowned for uncovering evidence of dark matter in space, the observatory is situated atop Cerro Pachon, a mountain in central Chile. It is designed to capture around 1,000 images of the southern hemisphere sky each night, covering the entire visible southern sky every three to four nights.
These early images stem from a series of test observations and mark the commencement of a bold decade-long mission to scan the sky continuously, capturing all visible details and changes.
“The entire observatory design is centered on this capability, enabling you to point, shoot, and repeat,” Bechtol noted. “Every 40 seconds, the view shifts to a new part of the sky. Imagine bringing the night sky back to life in a way we’ve never experienced before.”
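The cadence figures quoted here, about 1,000 images per night with the view shifting every 40 seconds, can be sanity-checked with quick arithmetic. The sketch below uses only the article's own numbers:

```python
# Sanity check on the survey cadence described above: roughly 1,000 images
# per night, with the telescope moving to a new patch of sky about every 40 s.
IMAGES_PER_NIGHT = 1_000
SECONDS_PER_POINTING = 40

observing_hours = IMAGES_PER_NIGHT * SECONDS_PER_POINTING / 3600
print(f"{observing_hours:.1f} hours of observing per night")  # ~11.1 hours
```

That works out to about 11 hours, roughly a full winter night, so the two figures are consistent as order-of-magnitude descriptions of the survey's pace.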
By repeating this process nightly over the next decade, scientists aim to create extensive images of the visible southern sky, tracking bright stars, moving asteroids in the solar system, measuring supernova explosions, and observing other cosmic phenomena.
“Utilizing this groundbreaking scientific facility, we will delve into many mysteries of the universe, including the dark matter and dark energy that fills our cosmos,” stated Brian Stone, Chief of Staff of the National Science Foundation, in a statement.
Antimatter particles are fundamentally similar to their normal matter counterparts, differing primarily in having opposite electric charge and other quantum numbers.
Although extremely rare in nature, antiparticles are routinely generated by physicists using particle accelerators. They also occur naturally in high-energy processes, such as those near the event horizons of black holes.
The question of how and why the universe is predominantly made up of normal matter remains unresolved.
Creating antimatter is a complex and costly endeavor, and CERN, the European Organization for Nuclear Research, plays a crucial role in it. A proton beam is fired into a metal target, generating antiprotons, which are then slowed in CERN's Antiproton Decelerator.
However, this process yields only tens of thousands of particles at a time.
One of the significant challenges with antimatter is that when it interacts with normal matter, it vanishes instantly, releasing energy. Therefore, the task of preventing its annihilation and storing it long-term poses a substantial technical hurdle.
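The energy released in annihilation follows from E = mc². The back-of-envelope calculation below is purely illustrative (the microgram figure is an arbitrary example, far more antimatter than any trap currently holds):

```python
# Energy released when antimatter annihilates with an equal mass of normal
# matter. Both masses are converted to energy, so the total mass is doubled.
C = 299_792_458.0  # speed of light in vacuum, m/s

def annihilation_energy_joules(antimatter_kg: float) -> float:
    total_mass = 2 * antimatter_kg  # antimatter plus the matter it meets
    return total_mass * C ** 2

# One microgram of antiprotons, an illustrative (and currently unattainable)
# quantity: roughly 1.8e8 J, on the order of tens of kilograms of TNT.
print(f"{annihilation_energy_joules(1e-9):.3e} J")
```

This enormous energy density is precisely why even the tiny quantities CERN produces must be kept rigorously isolated from ordinary matter.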
Nonetheless, CERN engineers are working on methods to store and transport small amounts of anti-protons.
The challenge with antimatter is that it completely disappears upon contact with normal matter, releasing energy. – Image credits: Getty Images
To achieve this, researchers cool antiprotons to approximately -269°C (-452.2°F) to nearly halt their motion. They then confine them in a high-vacuum enclosure, using superconducting magnets to trap them and keep them from touching normal matter.
This process must be managed while maintaining the capability to extract particles and introduce new ones into the enclosure.
Despite these challenges, CERN aims to develop "traps" capable of storing billions of antiprotons simultaneously. The techniques have recently been validated by transporting trapped protons, ordinary matter, across CERN's campus in Switzerland.
With advances in vacuum systems, antimatter storage and transport may soon become routine activities in the coming years.
This article addresses the question posed by Leighton Haas of Hamburg: “How is antimatter preserved?”
We welcome your inquiries! You can email us at questions@sciencefocus.com or reach us on Facebook, Twitter, or Instagram. Please include your name and location.
Explore our ultimate Fun Facts for more amazing science content!
Baby KJ with researchers Rebecca Ahrens-Nicklas and Kiran Musunuru after his gene editing injection
Philadelphia Children’s Hospital
A young boy afflicted with a serious genetic disorder is set to be the first recipient of personalized CRISPR gene editing treatments, offering a glimpse into the potential future of medicine.
This groundbreaking event marks the first instance of a person receiving a gene editing therapy tailored to correct the unique mutations causing their illness. "He is showing early signs of progress," Rebecca Ahrens-Nicklas explained during a press briefing held at the Children's Hospital of Philadelphia, Pennsylvania, though she noted that it is premature to determine the treatment's full effectiveness.
The researchers released the information promptly in hopes of motivating others, said team member Kiran Musunuru at the University of Pennsylvania. "We sincerely hope that demonstrating the feasibility of personalized gene editing therapy for one patient within a few months will encourage additional efforts in this area," he remarked.
“When I refer to this as the future of medicine, I believe I’m stating a fact,” he emphasized. “This is a crucial step towards employing gene editing therapies to address a range of rare genetic disorders that currently lack viable treatment options.”
KJ inherited mutations in both copies of a liver enzyme gene known as CPS1. Without this enzyme, ammonia accumulates in the bloodstream as dietary proteins are broken down, risking brain damage. According to Ahrens-Nicklas, over half of children born with CPS1 deficiency do not survive.
She and Musunuru were already developing liver-targeted therapies for this kind of condition, which allowed them to rapidly design a base editing therapy to correct one of KJ's two mutated CPS1 gene copies.
The team engaged with US regulators early in the process. "They recognized the exceptional nature of this situation," Musunuru stated. "KJ was critically ill and time was of the essence. Following our official submission to the FDA [Food and Drug Administration] when KJ was six months old, we received approval within just a week."
KJ underwent an initial low-dose treatment at six months of age, in February 2025, followed by higher doses in March and April. He can now consume more protein than before, although he still takes other medications for his condition.
Ideally, children would be treated even earlier to mitigate the long-term damage linked to conditions like CPS1 deficiency. As reported by New Scientist, Musunuru hopes one day to enable gene editing in humans before birth.
In contrast, other gene editing therapies are designed for broader application, aiming to work for many individuals irrespective of the specific mutations causing their condition. For instance, the first approved gene editing treatment for sickle cell disease works by boosting fetal hemoglobin production, rather than correcting the mutations in adult hemoglobin responsible for the disorder. Despite being a "one-size-fits-all" solution, each treatment course costs £1,651,000 in the UK.
KJ with his family after treatment
Philadelphia Children’s Hospital
Custom treatments can be significantly more costly. Musunuru said he cannot provide exact figures for KJ's treatment due to the extensive pro bono work by the companies involved. However, he is optimistic about costs declining. "As we enhance our methods, we can anticipate economies of scale, leading to a substantial reduction in prices," he stated.
One barrier to the development of personalized gene editing therapies has been the regulatory perspective, which previously treated therapies targeting different mutations within the same gene as separate entities. This necessitated restarting the approval process for each mutation individually. However, there’s a growing movement towards a platform approach, allowing broader approvals for treatments targeting various mutations.
"Platform-based methods, like CRISPR gene editing, offer scalable solutions for even the rarest diseases, as exemplified by KJ's case," said Nick Meade from Genetic Alliance UK, a charity that advocates for people with rare conditions. "This development finally renders treatment a plausible possibility for countless families."
If it weren’t for Dutch settlers who introduced coffee trees to the islands of Java, Sumatra, and Sulawesi around 300 years ago, the world might still overlook the Asian palm civet.
Before then, these long, short-legged mammals fed on the islands' fruits, berries, small mammals, and insects. With the arrival of coffee plants, the cat-like creatures discovered a new delicacy.
Coffee cherries, the small round fruits found on coffee plants, enclose the beans we adore.
When farm owners noticed that the beans passed through the animals unscathed, they instructed their workers to collect them; after all, waste was not an option. Curious locals brewed and sampled the resulting coffee, and liked the taste.
Kopi Luwak coffee is one of the most expensive coffees in the world. – Photo credit: Getty
Then, an unusual phenomenon occurred. People began enjoying the coffee made from the civets' recycled beans, noting its distinctive flavor, with hints of chocolate, syrup, and a rustic, earthy "jungle" undertone. Thus, Kopi Luwak coffee was born.
Although these civets remain in South and Southeast Asia, this rare coffee is now exported worldwide. Due to its unique production method, Kopi Luwak has become one of the priciest coffees available.
Prices for wild-sourced Kopi Luwak can range from $20 to $100 (approximately £15-80). But is it truly worth it?
In the wild, palm civets are believed to select only the ripest and finest coffee cherries. This selection enriches the flavor. As the beans traverse the animal’s digestive system, enzymes and stomach acids break down the cherry’s outer layer and digest internal proteins.
This process is said to enhance the flavor and aroma of the coffee beans, appealing to aficionados, though the trade is detrimental to the civets.
Palm civets are primarily solitary creatures, coming together only to mate. They establish territories and communicate through scent marking, using urine, feces, and a waxy substance secreted by special glands near the anus.
With their white masks amid dark fur, they blend seamlessly with the shadows of their forest habitats. Like all wildlife, they prefer to be left undisturbed.
However, the surging demand for Kopi Luwak coffee has led to the establishment of civet farms across Asia. There, the animals are often confined to small cages and deprived of proper nutrition and space, forced to live in poor conditions for the sake of luxury coffee. Wild civets in Indonesia suffer greatly as well.
While Indonesia regulates how many civets can be taken from the wild, these rules are often ignored by poachers and inadequately enforced. All this, for a cup of coffee.
For any questions, please email us at questions@sciencefocus.com or message us via Facebook, Twitter or Instagram (don't forget to include your name and location).
Discover our ultimate Fun Facts and more amazing science pages.
For residents of the West Coast, from San Diego to Vancouver, the weather events known as atmospheric rivers can deliver heavy winter rain and snowfall on the scale of the big storms Bostonians know well.
Much like the names given to storms on the East Coast, the term "atmospheric river" can feel trendy. But while it may resonate more with people on the streets of San Francisco than plain "heavy rain," it precisely describes the moisture-laden storms that form over the Pacific Ocean and release their precipitation upon hitting the mountain ranges of Washington, Oregon, and California.
Yet these plumes of highly humid air driven by strong winds are not exclusive to the West Coast. They can occur globally, and meteorologists and scientists have recently begun applying the term to storms east of the Rocky Mountains. This spring, a series of heavy rains in the central and southern United States caused fatal floods, and AccuWeather identified the unusual weather phenomenon as an atmospheric river. CNN did as well.
Some researchers hope the term will gain wider acceptance, though not all meteorologists, including those at the National Weather Service, are on board. The crux of the debate is how forecasts should describe a given day's conditions.
The Atmospheric River can stretch up to 2,000 miles.
These weather systems typically form over tropical and subtropical oceans, where evaporated water vapor coalesces into long streams of moist air that flow through the lower atmosphere toward the poles, averaging around 500 miles wide and extending 1,000 miles or more. Many weak atmospheric rivers bring beneficial precipitation, but stronger ones can unleash severe rainfall, causing flooding, landslides, and significant destruction.
Moisture is not the only ingredient: just as a wet sponge releases water when squeezed, atmospheric rivers need a mechanism to wring out their rain and snow. As the moist air ascends, the water vapor cools, condenses, and ultimately falls as precipitation.
On the West Coast, this process repeats from late fall to early spring, as mountain ranges such as the Cascades and the Sierra Nevada provide the necessary lift. Atmospheric rivers arriving from the Pacific Ocean collide with these mountains, forcing the water vapor upward, where it condenses into liquid.
The situation is more complex in other regions, where upward lift usually arises from less defined and less predictable atmospheric instability rather than from geographical features. In early April, for example, cold air descending from the north pushed under an atmospheric river originating over the Gulf of Mexico, lifting the moist air.
“When warm air is forced up to a higher elevation than its surroundings, it can rapidly ascend, leading to severe thunderstorms,” explained Travis O’Brien, an assistant professor at Indiana University and co-author of a notable paper on atmospheric rivers impacting the Midwest and East Coast.
Regions like Kentucky, Tennessee, and Arkansas experienced extreme flooding, with rainfall exceeding 15 inches in some areas.
So, why is it called that?
Atmospheric rivers have existed for ages; however, scientists began recognizing and naming them in the mid-1970s to 1980s with advances in satellite technology, specifically the Geostationary Operational Environmental Satellite (GOES) system, developed by NASA and operated by the National Oceanic and Atmospheric Administration.
Clifford Mass, a professor of atmospheric science at the University of Washington, noted, “Prior to that, we didn’t discuss it much.”
Advancements in satellite technology allowed researchers and meteorologists to visualize atmospheric rivers, leading to more discussions and the formal naming of the phenomenon.
The term “atmospheric river” was introduced in the 1990s by two scholars at the Massachusetts Institute of Technology: meteorologist Reginald E. Newell and research scientist Yong Zhu. They originally called the features tropospheric rivers, after the lowest layer of the Earth’s atmosphere, where most weather occurs. The name later evolved into “atmospheric river,” as it was noted that these rivers “carry about the same amount of water as the Amazon.”
Is the terminology overused? Sometimes.
Though the term became more prominent in the 2010s to 2020s, it primarily gained traction on the West Coast, as scientists focused on and studied atmospheric rivers. Numerous research papers identified them as a key source of rain and snow across California, Oregon, and Washington, as well as major contributors to flooding events. One notable occurrence was a series of nine atmospheric rivers that inundated California in December 2022 and January 2023, resulting in widespread flooding and alleviating drought conditions.
Daniel Swain, a climate scientist at the University of California, Los Angeles, highlighted that interest in atmospheric rivers tends to peak during California’s exceptionally wet storm seasons. While he appreciates the label, he also points out its potential misuse, stating that excessive use can mislead the public if distinctions between different atmospheric river intensities are not made.
“The primary misconception is that every atmospheric river is an extreme and destructive event, which is not accurate,” Swain explained.
A classification system for atmospheric rivers was introduced in 2019 to clear up this confusion. Dr. Marty Ralph, director of the Center for Western Weather and Water Extremes at the Scripps Institution of Oceanography in San Diego, spearheaded the development of the classification system, which has been applied in regions around the globe, including the Arctic and Antarctic. He has been a prominent advocate for researching and popularizing the term atmospheric river, particularly in California, authoring numerous papers on the topic.
“It was Marty Ralph who convened the scientific community around the concept of Atmospheric Rivers as a topic deserving of attention, and his efforts have implicitly tied this concept to the West Coast, despite the original studies being global in scope,” Dr. O’Brien remarked.
This association may mislead the public as daily forecasts from West Coast offices frequently discuss atmospheric rivers, whereas offices in other regions may not.
“In the Midwest and Southeast, we typically don’t use that terminology,” stated Jimmy Barham, lead meteorologist with the National Weather Service in Arkansas. “We simply refer to it as higher-level moisture.”
The focus on the West Coast also means that atmospheric rivers are studied less frequently in other regions, where hurricanes and summer thunderstorms also contribute significantly to rainfall and draw considerable attention.
Dr. Ralph aspires for expanded research to reach the East Coast, asserting, “Even the East Coast often experiences strong, potentially impactful atmospheric rivers.”
Treatment offers protection to mice against venom from common taipans and various other snakes
Matthijs Kuijpers/Alamy
Antibodies derived from a hyperimmune man are effective against venom from a range of snake species, suggesting that a universal antivenom may soon be achievable.
The use of non-human antibodies, however, can cause serious adverse effects, including potentially fatal allergic reactions. It also requires identifying the specific snake responsible for the bite before the antivenom is administered.
Jacob Glanville of Centivax, a biotechnology firm in San Francisco, California, is searching for broadly neutralizing antibodies that could be developed into antivenoms effective against many or all venomous snakes. “There are 650 venomous snake species, but their venoms involve just 10 common classes of toxins,” Glanville explains.
The researchers began by looking for individuals who had been bitten multiple times by different snakes. “Perhaps a daring snake researcher,” remarks Glanville. Instead, media reports introduced them to Tim Friede, who has “self-administered escalating doses of venom from the world’s deadliest snakes over 700 times.”
“If anyone could yield a broadly neutralizing antibody against snake venom, it would be Tim Friede,” Glanville affirms.
From just 40 milliliters of Friede’s blood, the team “converted immune memory into a library of billions of antibodies,” he adds. They subsequently tested promising candidates against venom from 19 of the deadliest Elapidae family species, including several cobra varieties.
Ultimately, they combined two antibodies derived from Friede’s blood, known as LNX-D09 and SNX-B03, with a toxin inhibitor named varespladib. In experiments on mice, this combination provided full protection against 13 species, including various cobras, the tiger snake (Notechis scutatus), and the coastal taipan (Oxyuranus scutellatus). It also offered partial protection against six additional species, including the notorious common death adder (Acanthophis antarcticus).
The next phase involves testing the treatment on animals brought into Australian veterinary clinics after snake bites, and identifying antibodies that can confer protection against vipers.
Tian Du from the University of Sydney emphasizes that “discovering two antibodies that can inhibit toxins makes for a universal treatment for closely related species.”
Additionally, after learning that the anticoagulant drug heparin can assist individuals in avoiding limb loss following a cobra bite, Du aims to determine whether their treatment can also avert skin and muscle necrosis.
The harmful bleaching of corals has now spread to 84% of the world’s coral reefs, the most intense event in recorded history, the International Coral Reef Initiative announced on Wednesday.
This is the fourth global bleaching event since 1998, surpassing the 2014-17 bleaching that affected two-thirds of reefs. The current crisis began in 2023, and it remains unclear when it will end, with ocean warming blamed for the phenomenon.
Mark Eakin, executive secretary of the International Coral Reef Society and former coral monitoring chief for the US National Oceanic and Atmospheric Administration, stated, “We’re witnessing a complete transformation of the planet and its impact on our oceans’ ability to sustain lives and livelihoods.”
Last year was reported as the hottest year on record globally, with average sea surface temperatures for oceans away from the poles reaching 20.87 degrees Celsius (69.57 degrees Fahrenheit), which is detrimental to corals. These structures are vital for seafood production, tourism, and protecting coastlines from erosion and storms. Coral reefs are often referred to as “rainforests of the sea” because they host a significant amount of marine biodiversity, with approximately 25% of all marine species living in and around them.
Corals house colorful algae, which give them their vibrant hues and serve as a food source. However, prolonged warming causes the algae to release toxic compounds, and the corals expel them and turn white, the event known as bleaching. Weakened corals are at increased risk of death, prompting NOAA’s Coral Reef Watch program to add levels to its bleaching alert scale to convey the heightened risk of coral mortality.
Efforts to conserve coral reefs are underway, such as initiatives to restore coral populations. Dutch labs are working with coral fragments, including those sourced from the Seychelles, with the intention to propagate them in zoos for potential reintroduction to natural reef habitats. Similar projects, including those in Florida, aim to rescue at-risk corals from high temperatures and rehabilitate them before returning them to the sea.
Nevertheless, scientists stress the importance of reducing greenhouse gas emissions like carbon dioxide and methane to combat planet-warming effects and protect coral reefs.
Melanie McField, co-chair of the Caribbean Steering Committee of the Global Coral Reef Monitoring Network, emphasized, “The most effective way to safeguard coral reefs is to address the root causes of climate change by reducing human emissions, primarily from fossil fuel combustion. Inaction poses a significant threat to coral reef ecosystems.”
This update coincides with President Donald Trump’s efforts to bolster fossil fuels and scale back clean energy initiatives as he enters a second term, prompting concerns about the future of coral reefs. Eakin remarked, “The current government is actively dismantling these ecosystems, and eliminating their protections would have catastrophic consequences.”
What health professionals see when overseeing IVF procedures via live streams
Conceivable Life Sciences
A highly automated form of in vitro fertilization (IVF) has led to a successful birth, raising hopes that the approach can reduce the risk of human error during such procedures.
One method of IVF is intracytoplasmic sperm injection (ICSI), in which a sperm cell is injected directly into an egg in a lab dish. This is commonly used in cases of male infertility, as the sperm does not need to swim to reach the egg. The resulting embryo is then transferred into the uterus. IVF can also be done by mixing sperm and eggs in a lab dish in the hope that fertilization occurs, which is generally less successful but requires less medical intervention.
ICSI also has drawbacks, as it relies on a high level of accuracy and judgment from healthcare professionals. “Like everyone else in most professions, they are sometimes tired and distracted,” which can affect fertilization and the chance of a birth, says Jacques Cohen at Conceivable Life Sciences, a biotechnology company in New York City.
To address this, Cohen and his colleagues developed a machine that can perform the 23 key steps required for ICSI. Each step is initiated by a person pressing a button while watching a live stream of the process, which can be done from anywhere in the world.
In one step, the machine uses an AI model to select the healthiest-looking sperm cells for fertilization. In another, it immobilizes a sperm cell by striking its tail with a laser, making it easier to pick up. The sperm is then injected into previously collected eggs. A similar approach has been tested before, resulting in two births, but in that case some steps were not performed by the machine.
To test the machine, the researchers recruited a couple struggling to conceive because the man’s sperm couldn’t swim properly. The woman also had problems producing eggs, so donor eggs were used in the procedure.
The researchers randomly allocated five of the eight donor eggs to be fertilized by the automated system, which produced four embryos. The remaining three eggs were fertilized using the standard manual ICSI approach; all of these formed embryos.
The team then used another AI model to select the two best embryos based on the appearance of their chromosomes. Both were generated by the automated system, but that doesn’t necessarily mean this approach leads to healthier embryos than manual ICSI, Cohen says; with so few eggs involved, that can’t be measured.
When the team transferred one of the embryos into the woman’s uterus, it failed to develop, but the second led to a successful birth.
“It’s an exciting proof of concept,” says Joyce Harper at University College London. However, large studies that randomly assign couples to either automated or manual ICSI procedures are needed to establish whether the former approach leads to higher birth rates, she says.
Harper says automated IVF is unlikely to be widely used at first, given the additional cost, but Cohen hopes this will improve over time. “We expect costs to patients and clinics to decline as we optimize, standardize and refine our systems,” he says.
Enjoy the beautiful scene from The Sound of Music as Maria and the von Trapp children yodel their way through “The Lonely Goatherd” in the Austrian Alps (lay-ee-odl-lay-ee-odl-lay-hee-hoo).
Picturesque as the moment is, their yodeling is put to shame by monkeys in the rainforests of Latin America.
Recent research conducted by Anglia Ruskin University (ARU) and the University of Vienna in collaboration with experts from Japan, Sweden, and Bolivia sheds light on this topic.
Through recordings and analysis of black-and-gold howler monkeys, tufted capuchins, black-capped squirrel monkeys, and Peruvian spider monkeys at the La Senda Verde Wildlife Sanctuary in Bolivia, scientists discovered that these primates can leap three or more musical octaves at once, whereas human yodelers typically span an octave or less.
Dr. Jacob Dunn, an associate professor of evolutionary biology at ARU, highlighted how these voice leaps contribute to the primates’ communication abilities in complex social settings.
The unique vocalizations known as “ultra yodels” are made possible by the distinct anatomy of the monkey’s throat, specifically the vocal membrane. This thin tissue ribbon allows for extended pitch ranges, enhancing the monkeys’ vocal repertoire.
The monkeys’ vocal membranes contrast with human anatomy: the membranes extend pitch range but make the voice unstable, and humans are thought to have lost them over evolution in favor of the stable pitch needed for speech.
Capuchin monkeys are known for their intelligence and tool use
While humans yodel by shifting between voice registers, monkeys utilize vocal membranes to produce complex vocal patterns without the need for intricate neural control.
Not all monkeys excel at yodeling, with Latin American monkeys displaying a particular proficiency due to their vocal membranes. This suggests the importance of these calls for certain species.
The freshwater essential for lithium mining is found in parts of Argentina, Bolivia, and Chile, in the world’s “lithium triangle” on the Andean plateau, which boasts half of all global lithium reserves.
A recent study in Communications Earth & Environment revealed that the freshwater available for lithium extraction in these regions is significantly scarcer than previously believed. With global demand for lithium expected to surge through 2040, this poses a challenge, as demand is outpacing the limited annual rainfall that supplies water to the dry lithium triangle.
Minimizing freshwater usage is crucial if the lithium industry is to avoid disruptions to mining. Extracting one ton of lithium requires approximately 500,000 gallons of water, water that also sustains small Indigenous communities and unique wildlife habitats in the region.
Water scarcity affects both the ecosystem and the industry in the lithium triangle, as lithium is a key component in batteries driving the global shift towards clean energy technologies. Despite the projected quadrupling demand for lithium batteries by 2030, delays in mining operations due to resource availability raise concerns about meeting this growing demand.
Freshwater plays a vital role in determining the supply of lithium available for mining in the lithium triangle. Rainfall washes lithium-rich minerals out of rocks, creating lagoons filled with lithium-rich water where mining companies extract the mineral. However, limited weather data and overestimation of freshwater supply in the region pose challenges to sustainable mining.
Research into water and resource availability for lithium mining operations is ongoing, emphasizing the need for a comprehensive understanding of the entire lithium supply chain. Studies in lithium-rich regions worldwide are essential to grasp the environmental and social impacts of lithium extraction.
An Italian newspaper says it has published the world’s first edition produced entirely by artificial intelligence.
Il Foglio, a conservative liberal newspaper, is conducting a month-long experiment to showcase the impact of AI technology on our work and time, as stated by Claudio Cerasa, the newspaper’s editor.
The four-page Il Foglio AI is included with the newspaper’s slim broadsheet edition, available on newsstands and online starting Tuesday.
Cerasa said Il Foglio AI will be the world’s first daily newspaper fully created using artificial intelligence, covering everything from the writing, headlines, and quotes to summaries and even sarcasm. Journalists’ role will be limited to asking questions and reading the responses generated by the AI tool.
This experiment coincides with global news organizations exploring the use of AI. The Guardian recently reported that BBC News will utilize AI for more personalized content delivery.
The debut edition of Il Foglio AI features stories on US President Donald Trump and Russian President Vladimir Putin, along with various other topics.
Cerasa emphasized that Il Foglio AI resembles a traditional newspaper but also serves as a testing ground for understanding the impact of AI on the production of daily newspapers.
“Do not consider Il Foglio as an artificial intelligence newspaper,” Cerasa stated.
The Antarctic Circumpolar Current (ACC), more than four times as strong as the Gulf Stream, is the world’s strongest ocean current and plays an outsized role in the climate system as a major conduit between ocean basins. Scientists at the University of Melbourne and the NORCE Norwegian Research Centre have shown that the ACC could slow by about 20% by 2050 under a high carbon emissions scenario, as an influx of freshwater into the Southern Ocean alters properties such as the density (salinity) of the ocean and its circulation patterns.
Sohail et al. analyzed high-resolution ocean and sea-ice simulations of ocean currents, heat transport, and other factors to diagnose the effects of changing temperature, salinity, and wind conditions. Image credit: Sohail et al., doi: 10.1088/1748-9326/adb31c.
“The ocean is extremely complex and finely balanced,” says Dr. Bishakhdatta Gayen, a fluid dynamicist at the University of Melbourne.
“If this current ‘engine’ collapses, there could be serious consequences, including more climate variability, with greater extremes in certain regions, and accelerated global warming due to a reduction in the ocean’s ability to act as a carbon sink.”
The ACC acts as a barrier against invasive species, such as southern bull kelp and animals like shrimp and mollusks, that ride the current from other continents toward Antarctica.
If the current slows and weakens, such species are more likely to reach the fragile Antarctic continent, with potentially serious effects on the food web, changing, for example, the diet available to Antarctic penguins.
The ACC is an important part of the global ocean conveyor belt, moving water around the planet and linking the Atlantic, Pacific, and Indian oceans. It is a main mechanism for the exchange of heat, carbon dioxide, chemicals, and biological material across these basins.
In their study, the authors used Gadi, Australia’s fastest supercomputer, hosted by the ACCESS National Research Infrastructure.
They found that the transport of seawater from the surface to the deep ocean could also slow in the future.
“If ice melting accelerates as predicted by other studies, a similar slowdown is projected even under low-emission scenarios,” Dr. Sohail said.
“The 2015 Paris Agreement aims to limit global warming to 1.5 degrees Celsius above pre-industrial levels.”
“Many scientists agree that we have already reached this 1.5-degree target, and further warming will likely accelerate the melting of Antarctic ice.”
“Cooperative efforts to limit global warming (by reducing carbon emissions) will limit the melting of Antarctic ice and avoid the expected slowdown in ACC.”
This study reveals that the effects of ice melting and ocean warming on ACC are more complicated than previously thought.
“Melting ice sheets dump large amounts of fresh water into the salty ocean.”
“This sudden change in ocean salinity has a series of consequences, including a weakening of the sinking of surface seawater to the depths (known as Antarctic Bottom Water formation) and, based on this study, a weakening of the powerful ocean jet that encircles Antarctica,” Dr. Gayen said.
The study was published in the journal Environmental Research Letters.
____
Taimoor Sohail et al. 2025. Decline of Antarctic Circumpolar Current due to polar ocean freshening. Environ. Res. Lett. 20, 034046; doi: 10.1088/1748-9326/adb31c
Norwegian researchers have connected the dots from 2,000 years ago, suggesting that a woman could have engraved her name on the oldest dated runestone ever discovered in Norway.
The inscription starts with the word “I” in runic script, hinting that what follows might be the author’s name. The runestone was unearthed at a grave field in Hole, a small municipality in southern Norway, west of the capital, Oslo.
“The text essentially says that this is the name of the person who carved the runes,” Kristel Zilmer, one of the study’s co-authors, told NBC News over the phone.
The runestone fragments during excavation. Museum of Cultural History
Experts believe the runic alphabet drew inspiration from the Roman alphabet, and runes were a key component of early Scandinavian communication, a form of writing that remained in use in the region until the late Middle Ages.
Runic inscriptions have been identified on items such as Danish bone and iron knives and combs dated to around AD 150, correlating with other runestones discovered by archaeologists.
These inscriptions often carried messages involving spells for the deceased and enchanting words.
However, the evolution of Runes over time remains a mystery, and deciphering them without an archaeological context can pose significant challenges.
The reconstruction and accompanying illustrations reveal the rune inscription. Kristel Zilmer
Recent research indicates that the fragments uncovered in 2021 belonged to a single slab, aiding scientists in understanding language evolution and the significance of such stones.
Two years later, additional fragments were discovered, and it appears that the inscriptions span across all fragments, suggesting they are part of a single stone.
“By finding two additional pieces that fit perfectly into the existing inscription, it has almost completed the inscription,” Zilmer remarked.
Because the stone is deteriorated and weathered, deciphering the exact text containing the carver’s name poses some challenges, but the inscription ends with a “-u”, which piqued researchers’ interest.
The excavation site at Svingerud, west of Oslo, Norway. Museum of Cultural History
If confirmed as a woman’s name in ancient runes, it could be the earliest known record of a female rune carver.
The fragments were buried alongside cremated human remains in a pit, allowing scientists to use radiocarbon dating, which traces the fragments to a period between 50 BC and AD 275 and provides valuable context.
“There could be a series of interconnected events here involving different individuals. It’s possible that the stone served multiple purposes,” Zilmer commented.
While much of the research is still underway, there remains a conspicuous gap in our understanding, as Zilmer noted.
“It’s akin to a puzzle with missing pieces, but exploring how these individual fragments, some inscribed, could potentially connect is an intriguing prospect,” she added.
In a new paper, planetary researchers from Texas A&M University and the University of Washington introduce a new thermodynamic concept, the cenotectic, to investigate the stability of liquids under extreme conditions, important information for determining the habitability of icy moons and ocean exoplanets.
Europa's surface stands out in this newly reprocessed color view. Image scale is 1.6 km per pixel. North on Europa is to the right. Image credit: NASA / JPL-Caltech / SETI Institute.
Exploration of icy ocean worlds represents a new frontier in planetary science, with a focus on understanding the potential of these environments to support life.
New research is addressing fundamental questions in this field. Under what conditions can liquid water remain stable on these distant frozen bodies?
The authors provide an important framework for interpreting data from planetary exploration missions by defining and measuring the cenotectic: the absolute minimum temperature at which a liquid remains stable across varying pressures and concentrations.
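The idea of a minimum over both pressure and concentration can be sketched numerically. The toy model below is purely illustrative (the liquidus function, its coefficients, and the parameter ranges are invented for this sketch and are not the authors' thermodynamics): it scans a grid of pressures and salt concentrations for the lowest temperature at which a hypothetical brine stays liquid.

```python
# Toy illustration of the "cenotectic" idea: the absolute minimum
# temperature at which a liquid stays stable, taken over all pressures
# and solute concentrations. The liquidus surface below is a made-up
# stand-in, NOT real water/salt thermodynamics.

def liquidus_temperature_k(pressure_mpa, salt_mass_fraction):
    """Hypothetical liquidus temperature T(P, x) in kelvin for a brine."""
    # Freezing-point depression grows with salt content (toy coefficient)...
    depression = 120.0 * salt_mass_fraction
    # ...and pressure first lowers the melting point, then raises it again
    # (crudely mimicking the transition to high-pressure ice phases).
    pressure_effect = -0.07 * pressure_mpa + 0.0002 * pressure_mpa**2
    return 273.15 - depression + pressure_effect

def cenotectic(pressures, fractions):
    """Scan a (P, x) grid and return (T_min, P, x) at the minimum liquidus."""
    best = None
    for p in pressures:
        for x in fractions:
            t = liquidus_temperature_k(p, x)
            if best is None or t < best[0]:
                best = (t, p, x)
    return best

if __name__ == "__main__":
    pressures = [p * 10.0 for p in range(0, 41)]   # 0 to 400 MPa
    fractions = [x * 0.01 for x in range(0, 24)]   # 0 to 23 wt% salt
    t_min, p_min, x_min = cenotectic(pressures, fractions)
    print(f"toy cenotectic ~ {t_min:.1f} K at {p_min:.0f} MPa, "
          f"{x_min:.2f} salt mass fraction")
```

In this toy model the minimum sits at the highest salt content and at the pressure where the quadratic pressure term bottoms out, echoing the paper's point that liquid stability must be mapped jointly over composition and pressure rather than along either axis alone.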
The research pairs expertise in cryobiology with expertise in planetary science and high-pressure water-ice systems.
Together, they developed a framework that bridges the disciplines to tackle one of the most fascinating challenges in planetary science.
2016 artist's concept of the Europa Clipper spacecraft; the design has changed as spacecraft development progressed. Image credit: NASA/JPL-Caltech.
“The launch of NASA's Europa Clipper, the largest planetary exploration mission ever launched, has ushered us into a decades-long era of exploration of the frigid ocean worlds,” said Dr. Baptiste Journaux, a planetary scientist at the University of Washington.
“Measurements from this and other missions will tell us the depth of the ocean and its composition.”
“Laboratory measurements of liquid stability, particularly of the lowest possible temperature (the newly defined cenotectic), combined with mission results, will help us constrain how habitable the solar system's cold, deep oceans are, and what their ultimate fate will be when a moon or planet cools completely.”
“The study of icy worlds is a particular priority for both NASA and ESA, as evidenced by the spate of recent and upcoming spacecraft launches,” said Dr. Matt Powell-Palm, a planetary scientist at Texas A&M University.
“We hope to help Texas A&M provide intellectual leadership in this area.”
The paper was published on December 18, 2024 in the journal Nature Communications.
_____
A. Zaris et al. 2024. On the equilibrium limit of liquid stability in pressurized water systems. Nat. Commun. 15; doi: 10.1038/s41467-024-54625-z
Hello. Welcome to Techscape. After a recent bout of COVID-19, I’ve been reflecting on screen time and isolation. Just a few days of isolation and prolonged screen exposure were enough to bring back the mental state I experienced for most of 2020. Wishing everyone a wonderful winter and a happy new year filled with family, friends, and joyous gatherings.
Today on Techscape: A recap of the biggest tech story of 2024 – Elon Musk and the US Amazon worker strike.
Technology in 2024: Elon Musk as Influential as Donald Trump
Donald Trump listens to Elon Musk, who arrived to watch SpaceX’s giant rocket Starship take off for a test flight from Starbase in Boca Chica, Texas, on November 19th. Photo: Brandon Bell/AP
The significant tech story of the year is Elon Musk’s meteoric rise to power and global influence in 2024. Musk has become the most prominent individual worldwide without winning any elections. He holds sway over the US President and exerts control over vital government bodies regulating his companies, which have become crucial to many countries’ digital infrastructure. His enormous wealth makes US lawmakers uneasy, and his tweets impact leaders globally.
Since Trump’s election victory, Musk has wielded his influence boldly by shaping government decisions. His recent clash with the House of Representatives over a spending deal highlighted tensions in US politics. Despite his and Trump’s efforts, Republicans resisted their demands, pointing to the limits of CEO power and foreshadowing 2025’s potential chaos. Democrats mockingly called him “President Elon Musk,” hinting at the escalating power struggle.
As political storms rage, it’s crucial to contemplate Musk’s rapid ascendancy in American politics and the implications it holds. Let’s delve into his timeline of events throughout the year, revealing Musk’s inevitable dominion over 2024 akin to Trump’s reign from 2015 to 2021, setting global news agendas with their actions.