Revolutionary Fast-Charging Quantum Battery Integrated with Quantum Computer Technology

Quantum batteries are making their debut in quantum computers, paving the way for future quantum technologies. Rather than relying on the electrochemical reactions inside traditional batteries, these devices store energy in the changing states of quantum bits, or qubits.

Research indicates that harnessing quantum characteristics may enable faster charging times, yet questions about the practicality of quantum batteries remain. “Many upcoming quantum technologies will necessitate quantum versions of batteries,” states Dian Tan from Hefei National Research Institute, China. “While significant strides have been made in quantum computing and communication, the energy storage mechanisms in these quantum systems require further investigation.”

Tan and his team constructed the battery using 12 qubits formed from tiny superconducting circuits, controlled by microwaves. Each qubit functioned as a battery cell and interacted with neighboring qubits.

The researchers tested two distinct charging protocols: one mirrored conventional battery charging, with no quantum interactions, while the other exploited interactions between the qubits. They found that harnessing these interactions yielded higher charging power and faster charging.

“Quantum batteries can achieve power output up to twice that of conventional charging methods,” says Alan Santos from the Spanish National Research Council. Notably, the scheme relies only on nearest-neighbor interactions between qubits, the kind that superconducting quantum computers already provide, though engineering additional beneficial interactions remains a practical challenge.
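
To make the comparison concrete, here is a minimal numerical sketch of the idea, not the Hefei team's 12-qubit protocol: a two-qubit toy battery is charged by a classical driving field, once without and once with a qubit-qubit coupling, and the peak charging power of each run is compared. All operators, couplings and time ranges are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices and a helper for embedding single-qubit operators in a chain.
I2 = np.eye(2)
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.diag([1.0, -1.0])

def site(op, j, n):
    mats = [op if i == j else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n = 2                            # a toy "battery" of two qubit cells
omega, g, J = 1.0, 0.5, 0.5      # level splitting, drive strength, qubit coupling (arbitrary units)

H_batt = 0.5 * omega * sum(site(sz, j, n) for j in range(n))   # energy stored in the cells
H_drive = g * sum(site(sx, j, n) for j in range(n))            # classical charging field
H_int = J * site(sx, 0, n) @ site(sx, 1, n)                    # nearest-neighbor interaction

psi0 = np.zeros(2 ** n)
psi0[-1] = 1.0                   # start fully discharged (both qubits in their ground state)
E0 = psi0 @ H_batt @ psi0

def peak_power(H_charge, times):
    """Largest value of stored energy divided by elapsed charging time."""
    best = 0.0
    for t in times[1:]:
        psi_t = expm(-1j * H_charge * t) @ psi0
        energy = np.real(psi_t.conj() @ H_batt @ psi_t) - E0
        best = max(best, energy / t)
    return best

times = np.linspace(0.0, 10.0, 400)
print("peak power, non-interacting charging:", round(peak_power(H_batt + H_drive, times), 3))
print("peak power, interacting charging:    ", round(peak_power(H_batt + H_drive + H_int, times), 3))
```

Whether the interacting run wins in this toy depends on the chosen couplings; the experiment's point is that with the right interactions a collective protocol can roughly double the charging power.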

James Quach from Australia’s Commonwealth Scientific and Industrial Research Organisation adds that previous quantum battery experiments have utilized molecules rather than components in current quantum devices. Quach and his team have theorized that quantum batteries may enhance the efficiency and scalability of quantum computers, potentially becoming the power source for future quantum systems.

However, comparing conventional and quantum batteries remains a complex task, notes Dominik Shafranek from Charles University in the Czech Republic. In his view, it is not yet clear how the advantages of quantum batteries would translate into practical applications.

Kavan Modi from the Singapore University of Technology and Design says that while benefits exist for qubits that interact only with their nearest neighbors, his research indicates these advantages can be wiped out by real-world factors such as noise and slow qubit control.

Additionally, because large-scale quantum computers may incur significantly higher energy costs than traditional computers, their growing demands make energy transfer within quantum systems an important subject of research, Modi emphasizes.

Tan believes that energy storage for quantum technologies, particularly in quantum computers, is a prime candidate for their innovative quantum batteries. Their next goal involves integrating these batteries with qubit-based quantum thermal engines to produce energy for storage within quantum systems.

Topics:

  • Quantum Computing/
  • Quantum Physics

Source: www.newscientist.com

Revolutionary Quantum Simulator Breaks Records, Paving the Way for New Materials Discovery

Quantum Simulation of Qubits

Artist Representation of Qubits in the Quantum Twins Simulator

Silicon Quantum Computing

A groundbreaking large-scale quantum simulator has the potential to unveil the mechanisms of exotic quantum materials and pave the way for their optimization in future applications.

Quantum computers are set to leverage unique quantum phenomena to perform calculations that are currently unmanageable for even the most advanced classical computers. Similarly, quantum simulators can aid researchers in accurately modeling materials and molecules that remain poorly understood.

This holds particularly true for superconductors, which conduct electricity with remarkable efficiency. The efficiency of superconductors arises from quantum effects, making it feasible to implement their properties directly in quantum simulators, unlike classical devices that necessitate extensive mathematical transformations.

Michelle Simmons and her team at Australia’s Silicon Quantum Computing have successfully developed the largest quantum simulator to date, known as Quantum Twin. “The scale and precision we’ve achieved with these simulators empower us to address intriguing challenges,” Simmons states. “We are pioneering new materials by crafting them atom by atom.”

The researchers designed multiple simulators by embedding phosphorus atoms into silicon chips. Each atom acts as a quantum bit (qubit), the fundamental component of quantum computers and simulators. The team meticulously configured the qubits into grids that replicate the atomic arrangement found in real materials. Each iteration of the Quantum Twin consisted of a square grid containing 15,000 qubits, surpassing any previous quantum simulator in scale. While similar configurations have been built using thousands of cryogenic atoms in the past, Quantum Twin breaks new ground.

By integrating electronic components into each chip via a precise patterning process, the researchers managed to control the electron properties within the chips. This emulates the electron behavior within simulated materials, crucial for understanding electrical flow. Researchers can manipulate the ease of adding an electron at specific grid points or the “hop” between two points.

Simmons noted that while conventional computers struggle with large two-dimensional simulations and complex electron property combinations, the Quantum Twin simulator shows significant potential for these scenarios. The team tested the chip by simulating the transition between conductive and insulating states—a critical mathematical model explaining how impurities in materials influence electrical conductivity. Additionally, they recorded the material’s “Hall coefficient” across different temperatures to assess its behavior in magnetic fields.
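
The kind of question such a simulator addresses can be caricatured on a classical computer for very small grids. The sketch below is only an illustrative toy, not the Quantum Twin hardware or its model: it builds a single-particle tight-binding Hamiltonian on a small square grid, with hopping between neighboring sites and random on-site "impurity" energies, and uses the inverse participation ratio of the eigenstates as a crude proxy for how conducting or insulating the lattice behaves.

```python
import numpy as np

rng = np.random.default_rng(0)

def lattice_hamiltonian(L, t_hop, disorder):
    """Single-particle tight-binding Hamiltonian on an L x L square grid.

    Diagonal entries are random on-site energies ("impurities"); off-diagonal
    entries couple nearest-neighbor sites with hopping amplitude t_hop.
    """
    N = L * L
    H = np.diag(disorder * rng.uniform(-1.0, 1.0, N))
    for x in range(L):
        for y in range(L):
            i = x * L + y
            if x + 1 < L:
                H[i, (x + 1) * L + y] = H[(x + 1) * L + y, i] = -t_hop
            if y + 1 < L:
                H[i, x * L + (y + 1)] = H[x * L + (y + 1), i] = -t_hop
    return H

def mean_ipr(H):
    """Inverse participation ratio, averaged over eigenstates: close to 1/N for
    states spread over the whole grid (conducting-like), close to 1 for states
    pinned to a few sites (insulating-like)."""
    _, vecs = np.linalg.eigh(H)
    return float(np.mean(np.sum(np.abs(vecs) ** 4, axis=0)))

L, t_hop = 10, 1.0
for W in (0.0, 2.0, 10.0):          # increasing impurity strength
    print(f"impurity strength {W:4.1f}: mean IPR = {mean_ipr(lattice_hamiltonian(L, t_hop, W)):.3f}")
```

Classical toys like this break down once electron-electron interactions and much larger grids enter, which is exactly the regime the hardware simulator targets.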

With its impressive size and variable control, the Quantum Twins simulator is poised to tackle unconventional superconductors. While conventional superconductors function well at low temperatures or under extreme pressure, some can operate under milder conditions. Achieving a deeper understanding of superconductors at ambient temperature and pressure is essential—knowledge that quantum simulators are expected to furnish in the future.

Moreover, Quantum Twins can also facilitate the investigation of interfaces between various metals and polyacetylene-like molecules, holding promise for advancements in drug development and artificial photosynthesis technologies, Simmons highlights.

Topic:

Source: www.newscientist.com

Unusual Temperature Rules: Exploring the Bizarre Phenomena of the Quantum Realm

One of the most paradoxical aspects of science is how we can delve into the universe’s deepest enigmas, like dark matter and quantum gravity, yet trip over basic concepts. Nobel laureate Richard Feynman once candidly admitted his struggle to grasp why mirrors seem to flip images horizontally but not vertically. I can’t claim Feynman’s stature, but I have been tripped up by something just as basic: the concept of temperature.

Since time immemorial, from the earliest humans poking fires to modern scientists, our understanding of temperature has dramatically evolved. The definition continues to change as physicists explore temperature at the quantum level.

My partner once posed a thought-provoking question: “Can a single particle possess a temperature?” While paraphrased, this inquiry challenges conventional wisdom.

His instinct was astute: a single particle cannot possess a temperature. Most science enthusiasts recognize that temperature applies to systems comprising numerous particles—think gas-filled pistons, coffee pots, or stars. Temperature essentially describes how energy is shared out, on average, across a system that has settled into equilibrium.

Visualize temperature as a ladder, each rung representing energy levels. The more rungs, the greater the energy. For a substantial number of particles, we expect them to occupy various rungs, with most clustering at lower levels and some scaling higher ones. The distribution gradually tapers off as energy increases.
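
In equilibrium, that ladder picture is just the Boltzmann distribution. A quick sketch in made-up units shows how the occupation of evenly spaced rungs tapers off, and how raising the temperature pushes more weight onto the higher rungs:

```python
import numpy as np

k_B = 1.0                     # work in units where Boltzmann's constant is 1
levels = np.arange(10)        # energy "rungs" 0, 1, 2, ... in arbitrary units

def occupation(T):
    """Boltzmann probability of finding a particle on each rung at temperature T."""
    weights = np.exp(-levels / (k_B * T))
    return weights / weights.sum()

for T in (0.5, 1.0, 3.0):
    p = occupation(T)
    print(f"T = {T}: rung 0 -> {p[0]:.2f}, rung 3 -> {p[3]:.3f}, rung 9 -> {p[9]:.5f}")
```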

But why define it that way? Averages can mislead: the average height in a room that happens to contain one very tall person tells you little about anyone actually standing in it. Why should temperature be any different?

Temperature serves a predictive role, not merely a descriptive one. In the 17th and 18th centuries, as researchers strove to harness the potential of fire and steam, temperature became pivotal in understanding how different systems interacted.

This insight led to the establishment of the zeroth law of thermodynamics, formulated last yet the most fundamental of the thermodynamic laws. It states that if a thermometer registers 80°C for warm water and the same for warm milk, there should be no net heat exchange when the two are mixed. Though seemingly simple, this principle forms the basis for classical temperature measurements.

This holds true due to the predictable behavior of larger systems. Minute energy variances among individual particles become negligible, allowing statistical laws to offer broad insights.

Thermodynamics operates differently than Isaac Newton’s laws of motion, which apply universally regardless of how many objects are involved. Thermodynamic laws arise only in larger systems where averages and statistical regularities emerge.

Thus, a single particle lacks temperature—case closed.

Or so I believed until physics threw another curveball my way. In many quantum systems composed of only a few particles, the stable, settled behavior that temperature relies on often never emerges.

In small systems like individual atoms, states can become trapped and resist reaching equilibrium. If temperature describes behavior after equilibrium, does this not challenge its very definition?

What exactly is temperature?

fhm/Getty Images

Researchers are actively redefining temperature from the ground up, focusing on its implications in the quantum realm.

In a manner akin to early thermodynamics pioneers, contemporary scientists are probing not just what temperature is, but rather what it does. When a quantum system interacts with another, how does heat transfer? Can it warm or cool its neighbor?

In quantum systems, both scenarios are possible. Consider the temperature ladder again. In classical physics, heat always flows in a predictable direction: from the hotter system, whose particles sit higher on the ladder on average, to the colder one.

Quantum systems defy these conventions. It’s common for no particles to occupy the lowest rung, with all clustered around higher energy levels. Superposition allows particles to exist in between. This shift means quantum systems often do not exhibit traditional thermal order, complicating heat flow predictions.

To tackle this, physicists propose assigning two temperatures to a quantum system. Imagine a reference ladder representing a thermal system. One temperature marks the highest rung from which the quantum system can still absorb heat, while the other marks the lowest rung to which it can still release heat. Outside this range, heat flow is predictable; within it, the outcome depends on the quantum system’s characteristics. This quantum version of the zeroth law of thermodynamics helps clarify how heat moves in quantum domains.

These dual temperatures reflect a system’s capacity to exchange energy, regardless of its equilibrium state. Crucially, they’re influenced by both energy levels and their structural arrangement—how quantum particles distribute across energy levels and the transitions the overall system can facilitate.
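
The researchers' actual construction is more subtle, but a toy calculation conveys the flavor: give a few rungs a non-thermal population and ask what temperature each adjacent pair of rungs, taken on its own, would imply. The populations and energies below are invented for illustration; the spread (and sign) of the implied values is the point.

```python
import numpy as np

# Invented, non-thermal populations over four evenly spaced rungs: top-heavy,
# like the inverted distributions described above.
energies = np.array([0.0, 1.0, 2.0, 3.0])
populations = np.array([0.05, 0.15, 0.45, 0.35])

def pairwise_temperatures(E, p):
    """Temperature implied by each adjacent pair of rungs via the Boltzmann ratio.

    A thermal state gives the same value for every pair; a non-thermal quantum
    state gives a spread of values, and a pair whose upper rung is more populated
    (a population inversion) even yields a negative value.
    """
    temps = []
    for lower in range(len(E) - 1):
        upper = lower + 1
        temps.append((E[upper] - E[lower]) / np.log(p[lower] / p[upper]))
    return np.array(temps)

T_pairs = pairwise_temperatures(energies, populations)
print("pairwise implied temperatures:", np.round(T_pairs, 2))
print("extremes (a crude 'cold' and 'hot' label):", T_pairs.min().round(2), T_pairs.max().round(2))
```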

Just as early thermodynamicists sought functionality, quantum physicists are likewise focused on applicability. Picture two entangled atoms. Changes in one atom will affect the other due to their quantum link. When exposed to external conditions, as they gain or lose energy, the invisible ties connecting them create a novel flow of heat—one that can be harnessed to perform work, like driving quantum “pistons” until the entanglement ceases. By effectively assigning hot and cold temperatures to any quantum state, researchers can determine ideal conditions for heat transfer, powering tasks such as refrigeration and computation.

If you’ve followed along up to this point, here’s my confession: the firm “no” I gave earlier was not the whole story. In the end, both answers hold some truth: a single particle can’t be assigned a traditional temperature, but the dual temperatures of quantum systems show the question was richer than either of us guessed.

Topics:

  • Quantum Physics/
  • Lost in Space-Time

Source: www.newscientist.com

Nobel Prize Winner Plans to Develop World’s Most Powerful Quantum Computer

Ryan Wills, New Scientist. Alamy

John Martinis is a leading expert in quantum hardware, who emphasizes hands-on physics rather than abstract theories. His pivotal role in quantum computing history makes him indispensable to my book on the subject. As a visionary, he is focused on the next groundbreaking advancements in the field.

Martinis’s journey began in the 1980s with experiments that pushed the limits of quantum effects, earning him a Nobel Prize last year. During his graduate studies at the University of California, Berkeley, he tackled the question of whether quantum mechanics could apply to larger scales, beyond elementary particles.

Collaborating with colleagues, Martinis developed circuits combining superconductors and insulators, demonstrating that multiple charged particles could behave like a single quantum entity. This discovery initiated the macroscopic quantum regime, forming the backbone of modern quantum computers developed by giants like IBM and Google. His work led to the adoption of superconducting qubits, the most common quantum bits in use today.

Martinis made headlines again when he spearheaded the team at Google that built the first quantum computer to claim quantum supremacy. For nearly five years, the machine’s sampling of random quantum circuits remained out of reach of classical simulation, until improved classical algorithms and hardware eventually caught up.

Now approaching 70, Martinis still believes in the potential of superconducting qubits. In 2024, he co-founded QoLab, a quantum computing startup proposing new approaches aimed at developing a genuinely practical quantum computer.

Karmela Padavic-Callaghan: Early in your career, you fundamentally impacted the field. When did you realize your experiments could lead to technological advancements?

John Martinis: I questioned whether macroscopic variables could bypass quantum mechanics, and as a novice in the field, I felt it was essential to test this assumption. A fundamental quantum mechanics experiment intrigued me, even though it initially seemed daunting.

Our first attempt was a simple and rapid experiment using contemporary technology. The outcome was a failure, but I quickly pivoted. Learning about microwave engineering, we tackled numerous technical challenges before achieving subsequent successes.

Over the next decade, as our work on quantum devices matured, the theory of quantum computing also took shape, including Shor’s breakthrough algorithm for factorizing large numbers, which has major implications for cryptography.

How has funding influenced research and the evolution of technology?

Since the 1980s, the landscape has transformed dramatically. Initially, there was uncertainty about manipulating single quantum systems, but quantum computing has since blossomed into a vast field. It’s gratifying to see so many physicists employed to unravel the complexities of superconducting quantum systems.

Your involvement during quantum computing’s infancy gives you a unique perspective on its trajectory. How does that inform your current work?

Having long experience in the field, I possess a deep understanding of the fundamentals. My team at UC Santa Barbara developed early microwave electronics, and I later contributed to foundational cooling technology at Google for superconducting quantum computers. I appreciate both the challenges and opportunities in scaling these complex systems.

Cryostat for Quantum Computers

Mattia Balsamini/Contrasto/Eyeline

What changes do you believe are necessary for quantum computers to become practical? What breakthroughs do you foresee on the horizon?

After my tenure at Google, I reevaluated the core principles behind quantum computing systems, leading to the founding of QoLab, which introduces significant changes in qubit design and assembly, particularly regarding wiring.

We recognized that making quantum technology more reliable and cost-effective requires a fresh perspective on the construction of quantum computers. Despite facing skepticism, my extensive experience in physics affirms that our approach is on the right track.

It’s often stated that achieving a truly functional, error-free quantum computer requires millions of qubits. How do you envision reaching that goal?

The most significant advancements will arise from innovations in manufacturing, particularly in quantum chip fabrication, which is currently outdated. Many leading companies still use techniques reminiscent of the mid-20th century, which is puzzling.

Our mission is to revolutionize the construction of these devices. We aim to minimize the chaotic interconnections typically associated with superconducting quantum computers, focusing on integrating everything into a single chip architecture.

Do you foresee a clear leader in the quest for practical quantum computing in the next five years?

Given the diverse approaches to building quantum computers, each with its engineering hurdles, fostering various strategies is valuable for promoting innovation. However, many projects do not fully contemplate the practical challenges of scaling and cost control.

At QoLab, we adopt a collaborative business model, leveraging partnerships with hardware companies to enhance our manufacturing capabilities.

If a large-scale, error-free quantum computer were available tomorrow, what would your first experiment be?

I am keen to apply quantum computing solutions to challenges in quantum chemistry and materials science. Recent research highlights the potential for using quantum computers to optimize nuclear magnetic resonance (NMR) experiments, as classical supercomputers struggle with such complex quantum issues.

While others may explore optimization or quantum AI applications, my focus centers on well-defined problems in materials science, where we can craft concrete solutions with quantum technologies.

Why have mathematically predicted quantum applications not materialized yet?

While theoretical explorations of qubit behavior are promising, real-life qubits face significant noise, making practical implementations far more complex. Theorists have a comprehensive grasp of the theory but often overlook the intricacies of hardware development.

Through my training with John Clarke, I cultivated a strong focus on noise reduction in qubits, which proved beneficial in the experiments demonstrating quantum supremacy. Addressing these challenges requires dedication to understanding the intricacies of qubit design.

As we pursue advancements, a dual emphasis on hardware improvements and application innovation remains crucial in the journey to unlock quantum computing’s full potential.

Topics:

Source: www.newscientist.com

Beyond Quantum: An In-Depth Review of Must-Read Books on Quantum Mechanics and Big Ideas

Plastic bottle in crashing waves

Pilot Wave Theory: Steering a Bottle at Sea

Philip Thurston/Getty Images

Beyond Quantum
Antony Valentini, Oxford University Press

Physics is experiencing unexpected challenges. Despite extensive research, the elusive dark matter remains undetected, while the Higgs boson’s discovery hasn’t clarified our path forward. Moreover, string theory, often hailed as the ultimate theory of everything, lacks solid, testable predictions. This leaves us pondering: what’s next?

Recently, many physicists and science writers have shied away from addressing this question. While they used to eagerly anticipate groundbreaking discoveries, they now often revert to philosophical musings or reiterate known facts. However, Antony Valentini from Imperial College London stands out. In his book, Beyond Quantum: Exploring the Origins and Hidden Meanings of Quantum Mechanics, he introduces bold, innovative ideas.

The book’s focus is quantum mechanics, a pillar of physics for the last century. This field hinges on the concept of the wave function—a mathematical representation capable of detailing the complete state of any system, from fundamental particles to larger entities like us.

The enigma of wave functions is that they tend not to describe ordinary localized objects but rather a diffuse, fuzzy version of them. Upon observation, the wave function “collapses” into a random outcome, with probabilities given by the Born rule, named after physicist Max Born. This is how objects come to manifest with definite attributes in specific locations.
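
The Born rule itself is simple enough to demonstrate in a few lines: square the magnitude of each amplitude to get a probability, then sample outcomes. The state below is an arbitrary example.

```python
import numpy as np

rng = np.random.default_rng(1)

# A qubit state written in the measurement basis: amplitudes for "up" and "down".
amplitudes = np.array([np.sqrt(0.2), np.sqrt(0.8) * np.exp(1j * 0.7)])
probabilities = np.abs(amplitudes) ** 2          # Born's rule: P(outcome) = |amplitude|^2

outcomes = rng.choice(["up", "down"], size=10_000, p=probabilities)
print("Born-rule probabilities:", dict(zip(["up", "down"], np.round(probabilities, 3))))
print("observed frequencies:   ",
      {o: round(float(np.mean(outcomes == o)), 3) for o in ("up", "down")})
```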

The debate surrounding the interpretation of the wave function has persisted, with two primary perspectives emerging. One posits that wave functions represent reality itself, suggesting that electrons, cats, and humans exist in multiple states simultaneously across time and space—a many-worlds interpretation fraught with metaphysical implications.


Pilot wave theory has long been known to reproduce all the predictions of quantum mechanics.

The alternative interpretation suggests that wave functions are not the entirety of reality. This is where pilot wave theory, significantly advanced by Valentini and initially proposed by Louis de Broglie in 1927, comes into play.

Louis de Broglie: Pioneer of Pilot Wave Theory

Granger – Historical Photo Archive/Alamy

Pilot wave theory posits that the wave function is real yet incomplete: it acts as a pilot wave that steers individual particles, much as an ocean wave steers a floating plastic bottle. In this model, particles always have definite positions, and their apparent wave-like behavior originates from the pilot wave guiding them.
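
The guiding equation is concrete enough to integrate numerically. The toy below, a free Gaussian wave packet with hbar = m = 1 and a handful of assumed starting positions, evolves the wave exactly in Fourier space and moves each particle with the de Broglie-Bohm velocity v = (hbar/m) Im(psi'/psi); for this particular packet the exact trajectories are known, so the result can be checked.

```python
import numpy as np

hbar = m = 1.0
x = np.linspace(-60, 60, 4096)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)

# A free Gaussian wave packet at rest; it simply spreads with time.
psi0 = np.exp(-x ** 2 / 4.0)
psi0 /= np.sqrt(np.sum(np.abs(psi0) ** 2) * dx)
psi0_k = np.fft.fft(psi0)

particles = np.array([0.5, 1.0, 2.0])   # definite starting positions "ridden" by the wave
dt, steps = 0.01, 1000

for n in range(1, steps + 1):
    t = n * dt
    psi = np.fft.ifft(psi0_k * np.exp(-1j * hbar * k ** 2 * t / (2 * m)))  # exact free evolution
    # de Broglie-Bohm guidance equation: v(x) = (hbar/m) * Im( psi'(x) / psi(x) )
    v = (hbar / m) * np.imag(np.gradient(psi, dx) / psi)
    particles = particles + dt * np.interp(particles, x, v)

# For this packet the trajectories are known exactly: x(t) = x(0) * sqrt(1 + (t/2)^2)
print("numerical:", np.round(particles, 2))
print("analytic: ", np.round(np.array([0.5, 1.0, 2.0]) * np.sqrt(1 + (steps * dt / 2) ** 2), 2))
```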

This theory has consistently validated all quantum mechanics predictions, eschewing fundamental randomness. However, Valentini underscores that this agreement rests on the assumption that particles maintain equilibrium with waves, which aligns with current experimental data but isn’t universally applicable.

Valentini’s hypothesis suggests that in the universe’s infancy, particles existed far from quantum equilibrium before settling into their current states, akin to a cup of coffee cooling down. In this scenario, the Born rule and its inherent randomness morph from core natural features into historical anomalies shaped by cosmology.

Moreover, quantum randomness also blocks any practical use of nonlocality, the apparent direct influence between widely separated objects. Valentini argues that if the Born rule had not yet taken hold in the universe’s early stages, instantaneous communication across vast distances may have been possible, potentially leaving traces on the cosmic microwave background. And if any relics from that era survive, superluminal signaling might still be feasible.

Though Valentini’s insights might appear speculative without concrete evidence, his rigorous examination of how conventional quantum mechanics became dominant makes his work noteworthy. While there could be gaps, especially in clearly explaining the pilot wave aspect, Valentini’s contributions illuminate what a ‘big idea’ looks like in a field rife with uncertainty.

John Cartwright – A writer based in Bristol, UK.

Topics:

Source: www.newscientist.com

Exploring the Universe: Unlocking Fundamental Quantum Secrets Yet to be Discovered

Conceptual diagram of quantum fluctuations

We May Never Know the Universal Wave Function

Victor de Schwanberg/Science Photo Library/Getty Images

From the perspective of quantum physics, some facts about the universe may be fundamentally unknowable.

In quantum physics, every object, such as an electron, corresponds to a mathematical entity known as a wave function. This wave function encodes all details regarding an object’s quantum state. By combining the wave function with other equations, physicists can effectively predict the behavior of objects in experiments.

If we accept that the entire universe operates on quantum principles, then even larger entities, including the cosmos itself, must possess a wave function. This perspective has been supported by iconic physicists like Stephen Hawking.

However, researchers Eddy Keming Chen at the University of California, San Diego, and Roderich Tumulka at the University of Tübingen in Germany have demonstrated that complete knowledge of the universal wave function may be fundamentally unattainable.

“The cosmic wave function is like a cosmic secret that physics itself conspires to protect. We can predict a lot about how the universe behaves, yet we remain fundamentally unsure of its precise quantum state,” states Chen.

Previous studies assumed specific forms for the universal wave function based on theoretical models of the universe, overlooking the implications of experimental observations. Chen and Tumulka began with a more practical inquiry: Can observations help in identifying the correct wave function among those that reasonably describe our universe?

The researchers utilized mathematical outcomes from quantum statistical mechanics, which examines the properties of collections of quantum states. A significant factor in their calculations was the realization that the universal wave function depends on numerous parameters and exists in a high-dimensional abstract state.

Remarkably, upon completing their calculations, they found that the universal quantum state is essentially unknowable.

“The measurements permissible by the rules of quantum mechanics provide very limited insight into the universe’s wave function. Determining the wave function of the universe with significant precision is impossible,” explains Tumulka.

JB Manchak at the University of California, Irvine, says this research sharpens our understanding of the limits of our best empirical methods, noting that quantum physics now has an analogue of the limits on knowledge already studied within general relativity. He adds that this should not come as a surprise, since quantum theory was not originally designed as a comprehensive theory of the universe.

“The wave function of a small system or the entire universe is a highly theoretical construct. Wave functions are meaningful not because they are observable, but because we employ them,” remarks Sheldon Goldstein from Rutgers University. He further explains that the inability to pinpoint a unique, accurate universal wave function from a limited range of candidates may not be problematic, as any of these functions could yield similar effects in future calculations.

Chen expresses hope to connect his and Tumulka’s research with the exploration of large-scale systems smaller than the universe itself, especially through techniques like shadow tomography, which aim to determine the quantum state of such systems. However, the philosophical consequences of their work are equally crucial. Tumulka emphasizes the need for caution against over-relying on positivist views that deem non-experimental statements as meaningless or unscientific. “Some truths are real, but cannot be measured,” he asserts.

This reasoning might influence ongoing debates over the interpretation of quantum mechanics. According to Emily Adlam from Chapman University in California, the new findings lend weight to interpretations that go beyond the wave function alone, emphasizing the relationship between quantum objects and individual observers’ perspectives rather than assuming a single objective reality dictated by one mathematical construct.

Topic:

Source: www.newscientist.com

Breakthrough: The Most Complex Time Crystal Created Inside a Quantum Computer

IBM Quantum System 2

IBM Quantum System Two: The Machine Behind the New Time Crystal Discovery

Credit: IBM Research

Recent advancements in quantum computing have led to the creation of a highly complex time crystal, marking a significant breakthrough in the field. This innovative discovery demonstrates that quantum computers excel in facilitating scientific exploration and novel discoveries.

Unlike conventional crystals, which feature atoms arranged in repeating spatial patterns, time crystals possess configurations that repeat over time. These unique structures maintain their cyclic behavior indefinitely, barring any environmental influences.

Initially perceived as a challenge to established physics, time crystals have been successfully synthesized in laboratory settings over the past decade. Recently, Nicholas Lorente and his team from the Donostia International Physics Center in Spain utilized an IBM superconducting quantum computer to fabricate a time crystal exhibiting unprecedented complexity.

While previous work predominantly focused on one-dimensional time crystals, this research aimed to develop a two-dimensional variant. The team employed 144 superconducting qubits configured in an interlocking, honeycomb-like arrangement, enabling precise control over qubit interactions.

By manipulating these interactions over time, the researchers not only created complex time crystals but also programmed the interactions to exhibit advanced intensity patterns, surpassing the complexity of prior quantum computing experiments.
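
A cartoon of the underlying effect, far simpler than the 144-qubit experiment, can be run on a laptop: repeatedly apply a slightly imperfect global spin flip plus an Ising interaction to a few simulated qubits and watch an observable repeat every two drive periods rather than every one. The chain length, pulse error and coupling below are arbitrary choices, not the experiment's parameters.

```python
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def site_op(op, j, n):
    mats = [op if i == j else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n = 4
eps = 0.02                      # deliberate error in the flip pulse
J = 0.12                        # Ising coupling between neighboring qubits

# One driving period: an (imperfect) global spin flip followed by interactions.
H_flip = sum(site_op(X, j, n) for j in range(n))
H_int = sum(site_op(Z, j, n) @ site_op(Z, j + 1, n) for j in range(n - 1))
U_period = expm(-1j * J * H_int) @ expm(-1j * (np.pi / 2) * (1 - eps) * H_flip)

psi = np.zeros(2 ** n, dtype=complex)
psi[0] = 1.0                    # all qubits "up"
Z0 = site_op(Z, 0, n)

signal = []
for period in range(16):
    psi = U_period @ psi
    signal.append(np.real(psi.conj() @ Z0 @ psi))

# A time-crystal-like response repeats every *two* drive periods: the sign of
# <Z> on the first qubit alternates even though the drive itself has period one.
print(np.round(signal, 2))
```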

This new level of complexity allowed the researchers to map the entire qubit system, resulting in the creation of its “state diagram,” analogous to a phase diagram for water that indicates whether it exists as a liquid, solid, or gas at varying temperatures and pressures.

According to Jamie Garcia at IBM, who was not directly involved in the study, this experiment could pave the way for future quantum computers capable of designing new materials based on a holistic understanding of quantum system properties, including extraordinary phenomena like time crystals.

The model emulated in this research represents such complexity that traditional computers can only simulate it with approximations. Since all current quantum computers are vulnerable to errors, researchers will need to alternate between classical estimation methods and precise quantum techniques to enhance their understanding of complex quantum models. Garcia emphasizes that “large-scale quantum simulations, involving more than 100 qubits, will be crucial for future inquiries, given the practical challenges of simulating two-dimensional systems.”

Biao Huang from the University of the Chinese Academy of Sciences notes that this research signifies an exciting advancement across multiple quantum materials fields, potentially connecting time crystals, which can be simulated with quantum computers, with other states achievable through certain quantum sensors.

Topics:

  • Quantum Computing/
  • Quantum Physics

Source: www.newscientist.com

Unveiling Quantum Creepiness: The Top Innovative Concept of the Century

In the 1920s, renowned physicist Albert Einstein believed he had identified a fundamental flaw within quantum physics. This led to extensive investigations revealing a pivotal aspect of quantum theory, one of its most perplexing features.

This intriguing property, known as Bell nonlocality, describes how quantum objects exhibit coordinated behavior over vast distances, defying our intuitions. To my mind, it ranks among the most remarkable insights of the 21st century so far.

To illustrate this phenomenon, consider two hypothetical experimenters, Alice and Bob, each possessing a pair of “entangled” particles. Entanglement enables particles to correlate, even when separated by distances that prevent any signal from transmitting between them. Yet, these correlations become apparent only through the interaction of each experimenter with their respective particles. Do these particles “know” about their correlation beforehand, or is some mysterious connection at play?

Einstein, alongside Nathan Rosen and Boris Podolsky, sought to refute this eerie connection. They proposed that certain “local hidden variables” could explain how particles understand their correlated state, making quantum physics more relatable to everyday experiences, where interactions happen at close range.

In the 1960s, physicist John Stewart Bell devised a way to put these ideas to an empirical test. After decades of attempts, loophole-free experiments in 2015 provided rigorous verification, and Bell-test experiments earned three physicists the 2022 Nobel Prize. “This was the final nail in the coffin for these ideas,” says Marek Żukowski from the University of Gdańsk. Researchers concluded that local hidden variables could not rescue the locality of quantum physics. Jacob Valandez at Harvard University adds, “We cannot escape from non-locality.”
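
What these experiments quantify is a correlation score, the CHSH value, that no local-hidden-variable model can push above 2. A short sketch with a maximally entangled pair and the standard measurement settings reproduces the quantum prediction of about 2.83:

```python
import numpy as np

# Pauli matrices and the singlet (maximally entangled) two-qubit state.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def correlation(a_op, b_op):
    """Expectation value of a joint +/-1 measurement on the two particles."""
    return np.real(singlet.conj() @ np.kron(a_op, b_op) @ singlet)

# Alice measures along Z or X; Bob along the diagonals between them.
A1, A2 = Z, X
B1, B2 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

S = (correlation(A1, B1) + correlation(A1, B2)
     + correlation(A2, B1) - correlation(A2, B2))

print(f"CHSH value |S| = {abs(S):.3f}")   # ~2.83; any local-hidden-variable model obeys |S| <= 2
```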

Embracing nonlocality offers substantial advantages, as noted by Ronald Hanson from Delft University of Technology, who led one of the groundbreaking experiments. For him, the focus was never on the oddities of quantum mechanics; rather, he viewed the results as a demonstration of “quantum supremacy” beyond conventional computational capabilities. This intuition proved accurate. The technology developed for the Bell test has become a foundation for highly secure quantum cryptography.

Currently, Hanson is pioneering quantum communication networks, utilizing entangled particles to forge a near-unhackable internet of the future. Similarly, quantum computing researchers exploit entangled particles to optimize calculations. Although the implications of entanglement remain partially understood, the practical application of entangling quantum objects has transformed into a valuable technological asset, marking a significant evolution for a leading figure in discussions about the quantum nature of reality.

Topics:

Source: www.newscientist.com

Mastering Quantum Computing: A Beginner’s Guide to Understanding the Basics

IBM's Quantum System Two showcased in Ehningen, Germany on October 1, 2024, featuring advanced quantum chips at IBM's inaugural quantum data center.

IBM’s Quantum System Two Unveiled at a Data Center in Germany

Quantum computing has been making headlines lately. You might have noticed quantum chips and their intriguing cooling systems dominating your news feed. From politicians to business leaders, the term “quantum” is everywhere. If you find yourself perplexed, consider setting a New Year’s resolution to grasp the fundamentals of quantum computing this year.

This goal may seem daunting, but the timing is perfect. The quantum computing sector has achieved significant breakthroughs lately, making it a hotbed of innovation and investment, with the market expected to exceed $1 billion, likely doubling in the coming years. Yet, high interest often leads to disproportionate hype.

There remain numerous questions about when quantum computers might outpace classical ones. While mathematicians and theorists ponder these queries, the practical route may be to improve quantum computers through experimentation. However, consensus on the best methodologies for building these systems is still elusive.

Compounding the complexity, quantum mechanics itself is notoriously challenging to comprehend. Physicists debate interpretations of bizarre phenomena like superposition and entanglement, which are pivotal for quantum computing’s potential.

Feeling overwhelmed? You’re not alone. But don’t be discouraged; these challenges can be overcome with curiosity.

As a former high school teacher, I often encountered curious students who would linger after class, eager to discuss intricate aspects of quantum computing. Many were novice learners in math or physics, yet they posed thought-provoking questions. One summer, a group who took an online quantum programming course approached me, surpassing my own coding knowledge in quantum applications. The following year, we delved into advanced topics typically reserved for college-level classes.

Recently, I discovered a young talent in quantum inquiry. A 9-year-old YouTuber, Kai, co-hosts a podcast named Quantum Kid, where he interviews leading quantum computing experts for over 88,000 subscribers to enjoy.

Kai’s co-host, Katya Moskvich, is not only his mother but also a physicist with extensive experience in science writing. She works at Quantum Machines, a firm developing classical devices that enhance the functionality of quantum computers. Kai brings an infectious enthusiasm to the podcast, engaging with pivotal figures who have influenced modern quantum theory.

In a recent episode, renowned quantum algorithm creator Peter Shor discussed the intersection of quantum computing, sustainability, and climate action. Nobel laureate Steven Chu and distinguished computer scientist Scott Aaronson also joined, exploring concepts like time travel and its theoretical connections to quantum mechanics. Additionally, physicist John Preskill collaborated with roboticist Ken Goldberg to examine the interplay of quantum computing and robotics.

Kai and Co-Host (Mother) Katya Moskvich

While The Quantum Kid may not delve deep into rigorous math, it offers a fun entry point and insight from leading experts in quantum technology. Most episodes introduce fundamental concepts like superposition and Heisenberg’s uncertainty principle, which you can explore further in reputable publications such as New Scientist.

The true strength of The Quantum Kid lies in Kai’s ability to ask the very questions that an inquisitive mind might have regarding quantum computers—those which seek to unpack the complex yet fascinating nature of this technology. If you’ve been curious about quantum computing but have felt overwhelmed, Kai encourages you to remain inquisitive and seek clarity. (We’re here to guide you on your quantum journey.)

Could quantum computers revolutionize space exploration or even facilitate time travel? Might they help develop advanced robotics or combat climate issues? The answers are not straightforward, laden with nuances. Kai’s engaging dialogues make complex theories accessible, ensuring clarity resonates with both young listeners and adults. Hearing Peter Shor reiterate that current quantum systems lack the clout to change the world doesn’t dampen Kai’s enthusiasm but rather fuels it.

In the pilot episode, physicist Lennart Renner expresses optimism, stating, “We’re evolving alongside new machines that can potentially revolutionize tasks, hence we must deliberate on their applications,” setting a forward-thinking tone that reverberates throughout the series.

Adopting a blend of Kai’s wonder and imagination, coupled with the seasoned expertise of guests, will enhance any quantum learning project you embark on this year. Quantum computing, while intricate and multifaceted, remains incredibly compelling. If your child is captivated, why not explore it together?

Topics:

  • Quantum Computing/
  • Quantum Physics

Source: www.newscientist.com

Rethinking Quantum Computing: Are They Necessary for Key Applications?

Can Quantum Computers Revolutionize Agriculture?

As quantum computing technology evolves, it becomes crucial to pinpoint challenges that can be tackled more efficiently than with classical computers. Interestingly, many significant tasks that quantum advocates are pursuing may not necessitate quantum computing at all.

The focal point of this discussion is a molecule called FeMoco, essential for life on Earth due to its role in nitrogen fixation. This process enables microorganisms to convert atmospheric nitrogen into ammonia, making it biologically available for other organisms. The mechanisms of FeMoco are intricate and not completely understood, but unraveling this could greatly diminish energy usage in fertilizer production and enhance crop yields.

Understanding FeMoco involves determining its lowest energy state, or “ground state” energy, which requires accounting for the behavior of its many electrons. Electrons, being quantum particles, exhibit wave-like properties and occupy distinct regions known as orbitals. This complexity has historically made it challenging for classical computers to calculate the properties of FeMoco accurately.

While approximation methods have shown some success, their energy estimates have been constrained in accuracy. Conversely, rigorous mathematical analyses have demonstrated that quantum computers, utilizing a fundamentally different encoding of complexity, can resolve problems without relying on approximations, exemplifying what is known as ‘quantum advantage.’

Now, researchers led by Garnet Kin-Lic Chan from the California Institute of Technology have unveiled a conventional calculation method capable of achieving accuracy comparable to quantum calculations. A pivotal benchmark in this discussion is “chemical accuracy,” the minimum precision required to yield reliable predictions about chemical processes. Based on their findings, Chan and colleagues assert that standard supercomputers can compute FeMoco’s ground state energy to the necessary precision.


FeMoco embodies various quantum states, each with distinct energy levels, forming a structure similar to a ladder with the ground state at the base. To streamline the process for classical algorithms to reach this lowest level, researchers concentrated on the states located on adjacent rungs and inferred their implications for what may exist one or two steps below. Insights into the symmetries of the electrons’ quantum states offered valuable context.

This simplification allowed researchers to use classical algorithms to establish an upper limit on FeMoco’s ground state energy and subsequently extrapolate it to a value with an uncertainty consistent with chemical accuracy. Essentially, the computed lowest energy state must be precise enough for future research applications.
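
The logic of bounding from above and then extrapolating can be illustrated on a stand-in problem. The sketch below is only a cartoon of that strategy, not the actual FeMoco calculation: it takes a random symmetric matrix as a fake Hamiltonian, obtains a sequence of variational upper bounds on its lowest eigenvalue from ever larger subspaces, and finishes with a naive extrapolation of the last few bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "Hamiltonian": a random symmetric matrix (the real FeMoco problem has
# vastly more degrees of freedom and a very different structure).
N = 400
A = rng.normal(size=(N, N))
H = (A + A.T) / np.sqrt(N)
exact_E0 = np.linalg.eigvalsh(H)[0]

v0 = rng.normal(size=N)          # fixed starting vector, so the subspaces nest

def upper_bound(k):
    """Rayleigh-Ritz bound: diagonalize H inside a k-dimensional Krylov subspace.

    By the variational principle, the lowest eigenvalue of the projected matrix
    can only lie at or above the true lowest eigenvalue of H.
    """
    K = np.empty((N, k))
    K[:, 0] = v0
    for j in range(1, k):
        K[:, j] = H @ K[:, j - 1]
    Q, _ = np.linalg.qr(K)       # orthonormal basis for the subspace
    return np.linalg.eigvalsh(Q.T @ H @ Q)[0]

ks = [5, 10, 20, 40, 80]
bounds = [upper_bound(k) for k in ks]
for k, E in zip(ks, bounds):
    print(f"subspace size {k:3d}: upper bound {E:+.4f}   (exact {exact_E0:+.4f})")

# Naive extrapolation of the last few bounds against 1/k; the functional form is
# purely illustrative, not the one used in the FeMoco study.
slope, intercept = np.polyfit(1.0 / np.array(ks[-3:]), bounds[-3:], 1)
print(f"extrapolated estimate: {intercept:+.4f}")
```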

Furthermore, researchers estimate that supercomputing methods could outperform quantum techniques, allowing classical calculations that would typically take eight hours to be completed in under a minute. This assumption relies on ideal supercomputer performance.

However, does this discovery mean you’ll instantly understand FeMoco and enhance agricultural practices? Not entirely. Numerous questions remain unanswered, such as which molecular components interact most effectively with nitrogen and what intermediate molecules are produced in the nitrogen fixation process.

“While this study does not extensively detail the FeMoco system’s capabilities, it further elevates the benchmark for quantum methodologies as a model to illustrate quantum benefits,” explains David Reichman from Columbia University in New York.

Dominic Berry, a professor at Macquarie University in Sydney, Australia, points out that although the new research demonstrates that classical computers can tackle the FeMoco problem, they do so only through approximations, while quantum methods promise to solve the problem exactly.

“This raises questions about the rationale for utilizing quantum computers for such challenges; however, for more intricate systems, we anticipate that the computational time for classical approaches will escalate much faster than quantum algorithms,” he states.

Another hurdle is that quantum computing technology is still evolving. Existing quantum devices are currently too limited and error-prone for tackling problems like determining FeMoco’s ground state energy. Yet, a new generation of fault-tolerant quantum computers, capable of self-correction, is on the horizon. From a practical standpoint, Berry suggests that quantum computing may still represent the optimal approach to deciphering FeMoco and related molecules. “Quantum computing will eventually facilitate more general solutions to these systems and enable routine computations once fault-tolerant quantum devices become widely available.”

Topic:

Source: www.newscientist.com

Stunning Photos That Reveal the Fascinating World of Quantum Physics

Marco Schioppo and Adam Park monitor ultra-stable lasers at the National Physical Laboratory in Teddington, UK.

David Severn, part of Quantum Untangled (2025), Science Gallery, King’s College London

In a striking portrayal, two physicists observe Britain’s revolutionary quantum technology involving ultra-stable lasers at the National Physical Laboratory in London. Captured by photographer David Severn for the Quantum Untangled exhibition at King’s College London, this fascinating image was shortlisted for the Portrait of Britain Award.

Severn states, “This portrait offers a rare peek into a domain typically hidden from view, like opening a door to a normally restricted lab.” While the photographs are contemporary, he notes that the scientists’ engagements with technology evoke imagery reminiscent of earlier eras, such as a 1940s submarine pilot or operators of a cotton spinning machine from the turn of the 20th century.

Having no background in quantum mechanics before this venture, Severn was briefed on current quantum physics projects in the UK. He observed that the bewildering aspects of quantum science closely align with artistic perspectives. “Although many scientific concepts eluded my detailed understanding, ideas like superposition and quantum entanglement resonated with me intuitively, akin to artistic realization,” he shared.

3D Printed Helmet Prototype

David Severn, part of Quantum Untangled (2025), Science Gallery, King’s College London

Severn’s captivating photographs highlight a range of innovations in quantum physics, showcasing a 3D-printed helmet (above) designed to house a quantum sensor that images the brain using magnetic fields. He also features a complex laser table (below) monitored by Hartmut Grote from Cardiff University, ensuring that the vacuum pumps sustaining the system remain operational.

Hartmut Grote at the Laser Table

David Severn, part of Quantum Untangled (2025), Science Gallery, King’s College London

Severn’s photography embraces a mystical quality, showcasing the 3D-printed imaging helmet used by researchers from the University of Nottingham’s Sir Peter Mansfield Imaging Centre (as shown above), along with the intricate network of pumps and mirrors essential for maintaining cleanliness in Grote’s experiments (as depicted below). Severn asserts that this ethereal essence is intentional.

Joe Gibson Wearing a 3D Printed Imaging Helmet at the University of Nottingham

David Severn, part of Quantum Untangled (2025), Science Gallery, King’s College London

Complex Vacuum System from King’s College London’s Photonics and Nanotechnology Group

David Severn, part of Quantum Untangled (2025), Science Gallery, King’s College London

Severn references a favorite quote from photographer Diane Arbus: “Photographs are secrets about secrets. The more they tell you, the less you understand.” He finds a parallel in quantum physics, where just when one thinks they’ve grasped how light behaves, the quantum realm subverts those expectations and exposes the elusive truths underpinning our understanding of reality.

The Quantum Untangled exhibition is on display at the Science Gallery at King’s College London until February 28, 2025. This event is a reimagining of the traveling exhibition Cosmic Titans: Art, Science and the Quantum Universe organized by Lakeside Arts and ARTlab at the University of Nottingham.

Topics:

Source: www.newscientist.com

How Quantum Computers Could Enhance Exoplanet Imaging for Clearer Views

Artist’s Impression of an Exoplanet

Credit: ESA/Hubble (M. Kornmesser)

Innovative quantum computers may enhance our ability to detect exoplanets and analyze their characteristics in unprecedented detail.

Astronomers have identified thousands of planets beyond our solar system, but they believe billions of exoplanets remain to be uncovered. This exploration is crucial for the search for extraterrestrial life, though the distance from Earth complicates direct observations.

Johannes Borregard and his team at Harvard University propose that quantum computing technology could dramatically streamline this endeavor.

Capturing images of exoplanets involves detecting their faint light signals, which diminish as they traverse vast cosmic distances. Additionally, these signals can be obscured by the light of nearby stars, creating additional challenges.

According to Borregard, his NASA colleagues illustrated the difficulty of this task, likening it to locating a single photon amidst a sea of light during telescope observations.

Traditional processing methods struggle with such weak signals. However, quantum computers can harness the quantum states of incoming photons, utilizing their unique properties to gather crucial data about exoplanets. This approach could transform what typically produces indistinct images or singular blurred points into clear visuals of distant worlds, revealing light-based markers of molecules present on these exoplanets.
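
A back-of-envelope shot-noise estimate shows why weak signals are so punishing for conventional processing; it is not the quantum protocol proposed here, and the contrast and photon counts are invented. If a planet is a billion times fainter than its star, the signal-to-noise ratio only creeps up as enormous numbers of photons are accumulated.

```python
import numpy as np

contrast = 1e-9                     # assumed planet-to-star brightness ratio

for n_star in (1e12, 1e14, 1e16):   # total stellar photons collected
    n_planet = contrast * n_star
    snr = n_planet / np.sqrt(n_star + n_planet)   # photon shot-noise limit
    print(f"stellar photons {n_star:.0e}: planet photons {n_planet:10.1f}, SNR ~ {snr:.3f}")
```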

The central concept of the team’s proposal suggests that light from an exoplanet interacts with a quantum computing device crafted from specially engineered diamond. This technology has already shown success in storing quantum states of photons. These states would then be transmitted to an advanced quantum computer designed to process and generate images of exoplanets. In their model, Borregard and his colleagues envision the second device utilizing ultracold atoms, which have demonstrated significant potential in recent experiments.

Research indicates that employing quantum devices in this manner could produce images using only one-hundredth, or even one-thousandth, of the photons needed in conventional methods. Essentially, in scenarios of extremely weak light, quantum systems could surpass existing technology.

“Since photons adhere to quantum mechanics principles, it is intuitive to explore quantum approaches for detecting and processing light from exoplanets,” notes Cosmo Lupo from the Polytechnic University of Bari, Italy. However, he acknowledges that realizing this proposal poses significant challenges, necessitating precise control over both quantum computers and effective coordination between them.

Borregard concurs, recognizing promising experimental advancements in employing diamond-based and cryogenic quantum computers. He highlights that establishing a connection between these systems is currently a focus for several research teams, including his own.

Lupo introduces another innovative strategy leveraging quantum light properties. Current initiatives utilizing quantum devices have already begun to observe stars in the Canis Minor constellation. “I am eager to witness the influence of quantum computing on imaging and astronomy in the future,” he states. “This new research represents a pivotal step in that direction.”

Topics:

  • Exoplanet/
  • Quantum Computing

Source: www.newscientist.com

Can Quantum Neural Networks Bypass the Uncertainty Principle?

Quantum chips at IBM’s first quantum data center

Quantum Computers and Heisenberg’s Uncertainty Principle

Marijan Murat/DPA/Alamy

The Heisenberg Uncertainty Principle imposes limits on the precision of measuring specific properties of quantum entities. However, recent research suggests that utilizing quantum neural networks may allow scientists to circumvent this barrier.

For instance, when analyzing a chemically relevant molecule, predicting its properties over time can prove challenging. Researchers must first assess its current characteristics, but measuring quantum properties often leads to interference between measurements, complicating the process. The uncertainty principle asserts that certain quantum attributes cannot be accurately measured at the same time; for example, gaining precise momentum data can distort positional information.
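
The incompatibility is easy to see in simulation. In the sketch below, a qubit starts with a definite value of one property (Z); once an incompatible property (X) is measured, the original value is randomized, which is the obstruction the uncertainty principle describes.

```python
import numpy as np

rng = np.random.default_rng(2)

X = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def measure(state, observable):
    """Projective measurement: returns a +/-1 outcome and the collapsed state."""
    vals, vecs = np.linalg.eigh(observable)
    probs = np.abs(vecs.conj().T @ state) ** 2
    idx = rng.choice(len(vals), p=probs / probs.sum())
    return vals[idx], vecs[:, idx]

up = np.array([1.0, 0.0], dtype=complex)     # starts with a definite Z value of +1

z_after_x = []
for _ in range(5_000):
    _, collapsed = measure(up, X)            # pin down X first ...
    z, _ = measure(collapsed, Z)             # ... and Z becomes a coin flip
    z_after_x.append(z)

print("Z before measuring X: always +1")
print("Z after measuring X : average ->", round(float(np.mean(z_after_x)), 3))
```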

According to Zhou Duanlu from the Chinese Academy of Sciences, recent mathematical insights indicate that quantum neural networks may address these measurement challenges more effectively.

Zhou’s team approached this issue from a practical standpoint. For optimal performance of quantum computers, understanding the properties of qubits—quantum computing’s fundamental components—is crucial. Typical operations, akin to dividing by 2, are employed to yield information about qubits. Yet, the uncertainty principle presents challenges akin to the incompatibility encountered when attempting to execute several conflicting arithmetic operations simultaneously.

Their findings propose that leveraging quantum machine learning algorithms, or Quantum Neural Networks (QNNs), could effectively resolve the compatibility issues inherent to quantum measurements.

Notably, these algorithms rely on randomly selected steps from a predefined set, as shown in previous studies. Zhou et al. demonstrated that introducing randomness into QNNs can enhance the accuracy of measuring a quantum object’s properties. They further extended this approach to simultaneously measure various properties typically constrained by the uncertainty principle, using advanced statistical techniques to aggregate results from multiple random operations for improved precision.
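
The paper's quantum-neural-network machinery is beyond a few lines of code, but a related and well-known randomized-measurement trick, single-qubit classical shadows, shows the basic point: by picking the measurement basis at random for each shot and averaging simple reconstructions, a single dataset yields estimates of several mutually incompatible expectation values at once. The state being learned below is an arbitrary example.

```python
import numpy as np

rng = np.random.default_rng(3)

I2 = np.eye(2, dtype=complex)
paulis = {
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]]),
    "Z": np.diag([1.0, -1.0]).astype(complex),
}
eigvecs = {name: np.linalg.eigh(P)[1] for name, P in paulis.items()}

# State to learn: a qubit pointing in some oblique direction (arbitrary example).
theta, phi = 1.0, 0.4
psi = np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])
rho = np.outer(psi, psi.conj())

shadows = []
for _ in range(20_000):
    name = rng.choice(list(paulis))               # random measurement basis for this shot
    V = eigvecs[name]
    probs = np.real([V[:, b].conj() @ rho @ V[:, b] for b in range(2)])
    b = rng.choice(2, p=probs / probs.sum())
    proj = np.outer(V[:, b], V[:, b].conj())
    shadows.append(3 * proj - I2)                 # single-qubit "classical shadow" snapshot

rho_est = np.mean(shadows, axis=0)
for name, P in paulis.items():
    exact = np.real(np.trace(rho @ P))
    estimate = np.real(np.trace(rho_est @ P))
    print(f"<{name}>  exact {exact:+.3f}   estimated {estimate:+.3f}")
```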

As noted by Robert Huang, the ability to measure multiple incompatible properties quickly could accelerate scientific understanding of specific quantum systems, with significant implications for quantum computing applications in chemistry and materials science, as well as for research on large-scale quantum computers.

The practicality of this innovative approach appears promising, though its effectiveness will hinge on how it compares against other methodologies employing randomness to facilitate reliable quantum measurements, Huang asserts.

Topic:

Source: www.newscientist.com

Why Some Quantum Computers Demand More Power Than Traditional Supercomputers

El Capitan, the National Nuclear Security Administration's leading exascale computer

El Capitan Supercomputer: Power Play in Quantum Computing

Credit: LLNL/Garry McLeod

The advancement of large quantum computers offers the potential to solve complex problems beyond the reach of today’s most powerful classical supercomputers. However, this leap in capability may come with increased energy demands.

Currently, most existing quantum computers are limited in size, with fewer than 1,000 qubits. These fragile qubits are susceptible to errors, hindering their ability to tackle significant problems, such as aiding drug discovery. Experts agree that practical utility will require a fault-tolerant quantum computer (FTQC) with a much higher qubit count and robust error correction. The engineering hurdles involved in this pursuit are substantial, compounded by multiple competing designs.

Olivier Ezratty, from the Quantum Energy Initiative (QEI), warns that the energy consumption of utility-scale FTQCs has been largely overlooked. During the Q2B Silicon Valley Conference in Santa Clara, California, on December 9, he presented his preliminary estimates. Notably, some FTQC designs could eclipse the energy requirements of the world’s top supercomputers.

For context, El Capitan, the fastest supercomputer globally, located at Lawrence Livermore National Laboratory, draws approximately 20 megawatts of electricity—three times that of the nearby city of Livermore, which has a population of 88,000. Ezratty forecasts that FTQC designs scaling up to 4,000 logical qubits may demand even more energy. Some of the power-hungry designs could require upwards of 200 megawatts.

Ezratty’s estimates derive from accessible data, proprietary insights from quantum tech firms, and theoretical models. He outlines a wide energy consumption range for future FTQCs, from 100 kilowatts to 200 megawatts. Interestingly, he believes that three forthcoming FTQC designs could ultimately operate below 1 megawatt, aligning with conventional supercomputers utilized in research labs. This variance could significantly steer industry trends, particularly as low-power models become more mainstream.
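To put those figures in perspective, here is a minimal back-of-the-envelope sketch (the conversion is illustrative only and is not taken from Ezratty’s models) that turns the quoted power draws into energy consumed over a year of continuous operation:

```python
# Rough conversion of the power figures quoted above into annual energy use.
# The wattages are those reported in the article; the arithmetic is illustrative only.
hours_per_year = 24 * 365

figures_mw = {
    "El Capitan supercomputer": 20,              # megawatts
    "FTQC, low end of Ezratty's range": 0.1,     # 100 kilowatts
    "FTQC, high end of Ezratty's range": 200,
}

for label, mw in figures_mw.items():
    gwh_per_year = mw * hours_per_year / 1000    # MW x hours -> MWh, then /1000 -> GWh
    print(f"{label}: {mw} MW, roughly {gwh_per_year:,.0f} GWh per year if run continuously")
```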

The discrepancies in projected energy use stem from the various strategies that quantum computing companies employ to build and maintain their qubits. Certain qubit technologies, for instance, require extensive cooling to function: light-based qubits rely on light sources and detectors that cannot run warm, which drives up energy consumption, while superconducting circuits require entire chips to be housed in large refrigeration systems. Designs based on trapped ions or ultracold atoms instead demand substantial energy input from the lasers or microwaves used to precisely control the qubits.

Oliver Dial from IBM, known for superconducting quantum computers, anticipates that his company’s large-scale FTQC will need approximately 2 to 3 megawatts of power, a fraction of what a hyperscale AI data center could consume. This demand could be lessened through integration with existing supercomputers. Meanwhile, a team from QuEra, specializing in ultracold atomic quantum computing, estimates their FTQC will require around 100 kilowatts, landing on the lower end of Ezratty’s spectrum.

Other companies, including Xanadu, which focuses on light-based quantum technologies, and Google Quantum AI, which works on superconducting qubits, declined to comment. PsiQuantum, another developer of light-based qubits, did not respond to New Scientist’s repeated requests for comment.

Ezratty also pointed out that the conventional electronics responsible for directing and monitoring qubit operations could add further energy costs, particularly for FTQC systems in which qubits need extra instructions to correct their own errors. Understanding how these error-correction routines contribute to the energy footprint is therefore essential. The length of time a quantum computer must run adds another layer, as energy savings from using fewer qubits might be negated if longer operation times are needed.

To effectively measure and report the energy consumption of machines, the industry must establish robust standards and benchmarks. Ezratty emphasizes that this is an integral element of QEI’s mission, with projects actively progressing in both the United States and the European Union.

As the field of quantum computing continues to mature, Ezratty anticipates that his research will pave the way for insights into FTQC energy consumption. This understanding could be vital for optimizing designs to minimize energy use. “Countless technological options could facilitate reduced energy consumption,” he asserts.

Topics:

Source: www.newscientist.com

Revolutionary Quantum Computing Breakthrough: Secure Methods for Backing Up Quantum Information

Researchers from the University of Waterloo and Kyushu University have achieved a groundbreaking advancement in quantum computing by developing a novel method to create redundant, encrypted copies of qubits. This represents a pivotal step towards practical quantum cloud services and robust quantum infrastructure.



Google’s quantum computer – Image credit: Google.

In quantum mechanics, the no-cloning theorem asserts that creating an identical copy of an unknown quantum state is impossible.

Dr. Achim Kempf from the University of Waterloo and Dr. Koji Yamaguchi from Kyushu University emphasize that this fundamental rule remains intact.

However, they have demonstrated a method to generate multiple encrypted versions of a single qubit.

“This significant breakthrough facilitates quantum cloud storage solutions, such as quantum Dropbox, quantum Google Drive, and quantum STACKIT, enabling the secure storage of identical quantum information across multiple servers as redundant encrypted backups,” said Dr. Kempf.

“This development is a crucial step towards establishing a comprehensive quantum computing infrastructure.”

“Quantum computing offers immense potential, particularly for addressing complex problems, but it also introduces unique challenges.”

“One major difficulty in quantum computing is the no-cloning theorem, which dictates that quantum information cannot be directly copied.”

“This limitation arises from the delicate nature of quantum information storage.”

According to the researchers, quantum information functions analogously to splitting passwords.

“If you possess half of a password while your partner holds the other half, neither can be utilized independently. However, when both sections are combined, a valuable password emerges,” Dr. Kempf remarked.

“In a similar manner, qubits are unique in that they can share information in exponentially growing ways as they interconnect.”

“A single qubit’s information is minimal; however, linking multiple qubits allows them to collectively store substantial amounts of information that only materializes when interconnected.”

“This exceptional capability of sharing information across numerous qubits is known as quantum entanglement.”

“With 100 qubits, information can be simultaneously shared in 2^100 different ways, allowing for a level of shared entangled information far exceeding that of current classical computers.”

“Despite the vast potential of quantum computing, the no-cloning theorem restricts its applications.”

“Unlike classical computing, where duplicating information for sharing and backup is a common practice, quantum computing lacks a simple ‘copy and paste’ mechanism.”

“We have uncovered a workaround for the no-cloning theorem of quantum information,” explained Dr. Yamaguchi.

“Our findings reveal that by encrypting quantum information during duplication, we can create as many copies as desired.”

“This method circumvents the no-cloning theorem because when an encrypted copy is selected and decrypted, the decryption key is automatically rendered unusable; it functions as a one-time key.”

“Nevertheless, even one-time keys facilitate crucial applications such as redundant and encrypted quantum cloud services.”
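The password analogy and the one-time key can both be illustrated with a purely classical sketch (an analogy only, not the researchers’ quantum protocol): split a secret into two shares using a one-time pad, so that either share alone is random noise, the two together reveal the secret, and the pad can never be safely reused.

```python
import secrets

secret = b"quantum backup"                            # the "password" to protect
pad = secrets.token_bytes(len(secret))                # share 1: a random one-time key
cipher = bytes(a ^ b for a, b in zip(secret, pad))    # share 2: the encrypted copy

# Either share alone looks like random bytes; combining them recovers the secret.
recovered = bytes(a ^ b for a, b in zip(cipher, pad))
assert recovered == secret

print("share 1 (key):   ", pad.hex())
print("share 2 (cipher):", cipher.hex())
print("combined:        ", recovered.decode())
```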

The team’s research will be published in the journal Physical Review Letters.

_____

Koji Yamaguchi & Achim Kempf. 2026. Encrypted qubits can be cloned. Physical Review Letters in press. arXiv: 2501.02757

Source: www.sci.news

How Quantum Fluctuations Ignite the Universe’s Greatest Mysteries

Small Vibrations Marking the Universe’s Formation

Joseph Kuropaka / Alamy

Discover more insights in the Lost in Space-Time newsletter. Register for the latest updates from the universe.

Introduction

Since at least the 5th century BC, the phrase “In the beginning” has sparked intrigue, originating in the writings of an Israelite priest known to scholars as “P”. This profound beginning resonates with our modern understanding of the cosmos. Here’s a glimpse into the universe’s birth:

Words falter when describing the universe’s origins, transcending mere physics and human experience. By retracing our steps, we assert that the universe emerged from a hot Big Bang approximately 13.8 billion years ago. The early universe, characterized by rapid expansion, underwent quantum fluctuations, which left enduring marks.

These fluctuations allowed some regions to expand more rapidly, forming overdensities of hot matter, while others lagged, resulting in varying densities. About 100 seconds post-Big Bang, baryonic matter took shape: hydrogen nuclei, helium nuclei, and free electrons. Alongside, dark matter emerged as its elusive counterpart.

Initially, the universe existed as a hot plasma, fluid-like and dominated by intense radiation, expanding on the momentum of the Big Bang. After roughly 9 billion years of gradually slowing expansion, dark energy began to accelerate the expansion rate.

This early universe’s excess density was predominantly dark matter, with small contributions from baryonic matter. Gravity pulled these together, while radiation pressure pushed back. This tug of war created acoustic vibrations, or sound waves, within the plasma.

Although these waves were not audible, they traveled faster than half the speed of light, with wavelengths spanning millions of light-years. This era signifies the genesis of our universe.

As the pressure waves from radiation expanded outward, they dragged negatively charged electrons and their heavier baryon counterparts. Dark matter, indifferent to radiation interactions, remained behind, resulting in a spherical wave of dense baryonic material expanding outward.

The propagation speed of these sound waves reflected the baryonic material and radiation’s density. Early waves had smaller amplitudes and higher frequencies, readily damped after minimal cycles, akin to ultrahigh-frequency sound waves.

As the universe continued its expansion and cooldown, roughly 380,000 years later, electrons merged with hydrogen and helium nuclei, giving rise to neutral atoms in a process known as recombination. This event, spanning about 100,000 years, produced cosmic background radiation—an elusive imprint awaiting discovery.

Map of Cosmic Microwave Background Radiation Exhibiting Density Fluctuations

ESA/Planck Collaboration

The radiation pressure and sound speed decreased significantly, creating a frozen spherical shell of baryonic material, similar to debris washed ashore by a storm. The largest compressional wave left behind a concentrated sphere of visible matter, termed the sonic horizon, roughly 480 million light-years from the original overdensity.

Early compressional waves left minor imprints on the universe’s matter distribution, while later waves, generated right before recombination, exhibited greater amplitude and lower frequency, observable in today’s cosmic background radiation.

Consequently, regions of high density yield slightly warmer background radiation, while lower density areas produce cooler radiation. This frozen state incorporates traces of matter distribution just after the Big Bang, known as a “feature of the universe.”

The apparent size of these final sound waves on the sky is closely tied to the curvature of space and to the Hubble constant, knitting together our measurements of the cosmos across more than 13 billion years.

Both quantum fluctuations and acoustic vibrations provide distinct signatures, akin to cosmic fingerprints. The first evidence emerged on April 23, 1992, revealing temperature variations in a cosmic background radiation map produced by the COBE satellite. George Smoot, the lead researcher, highlighted its monumental significance, describing it as a divine encounter for believers.

Observing two distinct directions in the cosmos defines a triangle projected onto the sky, with the vertex angle referred to as the angular scale. Because of the sonic horizon, there is a slightly higher probability of finding a hot spot in the cosmic background radiation roughly 480 million light-years from another hot spot, corresponding to an angular scale of around 1°.
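The standard textbook relation behind that number (using round values rather than figures quoted in the article) divides the sound horizon at recombination by the distance to the surface of last scattering:

$$\theta_s \;\simeq\; \frac{r_s}{D_A(z_*)} \;\approx\; \frac{1.5\times 10^{2}\ \text{Mpc}}{1.4\times 10^{4}\ \text{Mpc}} \;\approx\; 0.01\ \text{rad} \;\approx\; 0.6^\circ,$$

which is of the same order as the roughly 1° angular scale described above.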

This measurement surpasses the resolution of earlier instruments, with the WMAP and Planck satellite missions unveiling additional acoustic vibrations down to angular scales under 0.1°.

The origins of baryonic matter contributed to cosmic structures, with small overdensities serving as seeds for star and galaxy formation, while underdensities created voids within the universe’s large-scale structure, known as the cosmic web. Thus, the probability of finding galaxy chains roughly 480 million light-years from each other slightly increases.

By analyzing acoustic vibrations, astrophysicists have accurately assessed cosmological parameters, including baryonic matter density, dark matter, dark energy, and the Hubble constant, among others. However, contentment is elusive, as the standard cosmological model (Lambda-CDM) implies we only observe 4.9% of the universe, with dark matter comprising 26.1% and dark energy making up 69%.

The enigma remains: we have yet to uncover the true nature of dark matter and dark energy.

Jim Baggott’s upcoming book, Disharmony: A History of the Hubble Constant Problem, is scheduled for release in the US by Oxford University Press in January 2026.

Topics:

Source: www.newscientist.com

Unlocking Quantum Computer Success: The Role of Unique Quantum Nature

Google’s Willow Quantum Computer

Credit: Google Quantum AI

What sets quantum computers apart from classical machines? Recent experiments suggest that “quantum contextuality” may be a critical factor.

Quantum computers fundamentally differ from traditional systems by leveraging unique quantum phenomena absent in classical electronics. Their building blocks, known as qubits, can exist in a superposition, effectively embodying two normally incompatible states at once, or they can be interconnected through a phenomenon called quantum entanglement.

Researchers at Google Quantum AI have conducted several groundbreaking demonstrations using the Willow quantum computer, revealing that quantum contextuality is also significant.

Quantum contextuality highlights an unusual aspect of measuring quantum properties. Unlike classical objects, where attributes are stable regardless of measurement order, quantum measurements are interdependent.

This phenomenon has previously been explored in special experiments with quantum light, and in 2018, researchers mathematically proved its potential application in quantum computing algorithms.

This algorithm enables quantum computers to uncover hidden patterns within larger mathematical structures in a consistent number of operations, regardless of size. In essence, quantum contextuality makes it feasible to locate a needle in a haystack, irrespective of the haystack’s dimensions.

In the experiments, the researchers scaled the number of qubits from a few up to 105, analogous to increasing the size of the haystack. The number of steps rose as qubits were added, because Willow is noisier and more error-prone than an idealised quantum computer running the algorithm would be. Notably, it still required fewer steps than traditional computers would need.

Thus, quantum contextuality appears to confer a quantum advantage, allowing these computers to utilize their unique characteristics to outperform classical devices. The research team also executed various quantum protocols reliant on contextuality, yielding stronger effects than previous findings.

“Initially, I couldn’t believe it. It’s genuinely astonishing,” says Adan Cabello from the University of Seville, Spain.

“These findings definitively showcase how modern quantum computers are redefining the limits of experimental quantum physics,” states Vir Burkandani at Rice University, Texas, suggesting that a quantum computer, as a candidate for practical advantages, should accomplish these tasks to confirm its quantum capabilities.

However, this demonstration does not yet confirm the superiority of quantum technology for practical applications. The 2018 research established that quantum computers are more effective than classical ones only when using more qubits than those in Willow, as well as employing qubits with lower error rates, asserts Daniel Lidar at the University of Southern California. The next crucial step may involve integrating this new study with quantum error correction algorithms.

This experiment signifies a new benchmark for quantum computers and underscores the importance of fundamental quantum physics principles. Cabello emphasizes that researchers still lack a complete theory explaining the origins of quantum superiority, but unlike entanglement—which often requires creation—contextuality is inherently present in quantum objects. Quantum systems like Willow are now advanced enough to compel us to seriously consider the peculiarities of quantum physics.

Topics:

Source: www.newscientist.com

Will 2026 Mark the Breakthrough of Quantum Computers in Chemistry?

Quantum Computers: Solutions for Chemistry Challenges

Marijan Murat/DPA/Alamy

One of the critical questions in the quantum computing sector is whether these advanced machines can solve practical problems in fields like chemistry. Researchers in industrial and medical chemistry are poised to provide insights by 2026.

The complexity of determining the structure, reactivity, and other properties of molecules is inherently a quantum problem, primarily involving electrons. As molecular structures grow increasingly complex, these calculations become challenging, sometimes even surpassing the capabilities of traditional supercomputers.

Quantum computers, being inherently quantum, have a potential advantage in tackling these complex chemical calculations. As these computers develop and become more seamlessly integrated with conventional systems, they are gaining traction in the chemistry sector.

For instance, in 2025, IBM and Japan’s RIKEN research institute collaborated, employing quantum computers alongside supercomputers to model various molecules. Google researchers have also been developing algorithms that unveil molecular structures. Additionally, RIKEN researchers are teaming up with Quantinuum to create efficient workflows, allowing quantum computers to calculate molecular energies with remarkable precision. Notably, the quantum computing software firm Qunova Computing introduced an algorithm that reportedly operates ten times more efficiently than traditional methods for energy calculations.

Progress is expected to expedite by 2026 as quantum computers become more advanced. “Future larger machines will allow us to create enhanced workflows, ultimately solving prevalent quantum chemistry problems,” states David Muñoz Ramo from Quantinuum. While his team currently focuses on hydrogen molecules, they foresee stepping into more intricate structures, such as catalysts for industrial reactions.

Other research entities are making strides in similar areas. In December, Microsoft announced a partnership with Algorithmiq, a quantum software startup, aimed at accelerating the development of quantum algorithms for chemistry. Furthermore, a study by Hyperion Research highlights chemistry as a focal area for advancement and investment in quantum computing, ranking it as one of the most promising applications in annual surveys.

However, meaningful progress in quantum chemical calculations depends on achieving error-free or fault-tolerant quantum computers, which will also unlock other potential applications for these devices. As Philip Schleich and Alan Aspuru-Guzik emphasized in a commentary for Science magazine, the ability of quantum computers to outperform classical computers hinges on the development of fault-tolerant algorithms. Thankfully, achieving fault tolerance is a widely accepted goal among quantum computer manufacturers worldwide.

Source: www.newscientist.com

Microsoft’s Controversial Quantum Computer Set to Make Headlines in 2025

Press photo: Microsoft's Majorana 1 chip - the first quantum chip featuring a topological core based on groundbreaking materials developed by Microsoft. Image by John Brecher from Microsoft.

Microsoft’s Majorana 1 Quantum Chip

John Brecher/Microsoft

In February, Microsoft unveiled the Majorana 1 quantum computer, igniting debates in the quantum computing community.

The Majorana 1 is noteworthy for its use of topological qubits, which promise enhanced error resistance compared to traditional qubit designs. Microsoft has pursued the development of topological qubits grounded in the elusive Majorana zero mode (MZM), facing mixed results throughout its journey.

In 2021, a significant paper from Microsoft researchers was retracted by Nature due to identified analytical flaws in their research on topological qubits. Furthermore, evaluations of experiments leading up to Majorana 1 received heavy criticism in 2023.

Consequently, the 2025 paper from Nature announcing Majorana 1 faced heightened scrutiny. Notably, the editorial team claimed, “The results in this manuscript do not represent evidence of the presence of Majorana zero mode in the reported devices.” In contrast, Microsoft’s press release asserted the opposite.

Chetan Nayak from Microsoft addressed concerns during a packed presentation at the American Physical Society Global Summit in Anaheim, California, in March. Despite presenting new data, skepticism remained prevalent among critics.

“The data presented does not demonstrate a functional topological qubit, let alone the basic components of one,” stated Henry Legg, a professor at the University of St Andrews, expressing his reservations.

In response, Nayak contended that the community’s feedback has been enthusiastic and engaged. “We’re observing thoughtful discussions and intriguing responses regarding our recent findings and ongoing efforts,” he noted.

In July, additional data emerged, with researchers like Eun-Ah Kim from Cornell University asserting that these results exhibit characteristics more indicative of a topological qubit than previously shown. “It’s encouraging to witness the progress,” she emphasized.

Nayak and his team remain optimistic about future advancements, aiming to escalate their quantum computing capabilities beyond Majorana 1. This initiative was selected for the final phase of the Quantum Benchmarking Initiative led by the U.S. Defense Advanced Research Projects Agency, focusing on practical approaches toward building viable quantum computers.

“This past year has been transformative for our quantum program, and the introduction of the Majorana 1 chip marks a crucial milestone for both Microsoft and the quantum computing sector,” stated Nayak.

Looking ahead to 2026, will Microsoft’s endeavors finally quell the critics? Legg remains doubtful: “Fundamental physics doesn’t adhere to schedules dictated by major tech corporations,” he remarked.

Topics:

Source: www.newscientist.com

Remarkable Advances in Developing Practical Quantum Computers

Quantum Computing Advancements

Practical Quantum Computers Approaching Reality

Alexander Yakimov / Alamy

The quantum computing industry is concluding the year with renewed hope, despite the absence of fully operational quantum systems. At December’s Q2B Silicon Valley Conference, industry leaders and scientists expressed optimism regarding the future of quantum computing.

“We believe that it’s highly likely that someone, or perhaps several entities, will develop a genuinely industrially viable quantum computer, but we didn’t anticipate this outcome until the end of 2025,” stated Joe Altepeter, program manager for the Defense Advanced Research Projects Agency’s Quantum Benchmarking Initiative (QBI). The QBI aims to evaluate which of the competing quantum computing approaches can yield practical devices capable of self-correction or fault tolerance.

This initiative will extend over several years, involving hundreds of professional evaluators. Reflecting on the program’s initial six months, Altepeter noted that while “major roadblocks” were identified in each approach, none disqualified any team from the pursuit of practical quantum devices.

“By late 2025, I sense we will have all major hardware components in place with adequate fidelity; the remaining challenges will be primarily engineering-focused,” asserted Scott Aaronson of the University of Texas at Austin during his presentation. He acknowledged the ongoing challenge of discovering algorithms for practical quantum applications, but highlighted significant progress in hardware development.

Though quantum computing hardware advancements are encouraging, application development is lagging, according to Ryan Babbush from Google. During the conference, Google Quantum AI alongside partners unveiled the finalists for the XPRIZE competition, aiming to accelerate application development.

The research by the seven finalists spans simulations of biomolecules crucial for human health, algorithms enhancing classical simulations for clean energy materials, and calculations that could impact the diagnosis and treatment of complex health issues.

“A few years back, I was skeptical about running applications on quantum computers, but now my interest has significantly increased,” remarked John Preskill, a pivotal voice in quantum computing at Caltech, advocating for the near-term application of quantum systems in scientific discovery.

Over the past year, numerous quantum computers have been employed for calculations, including the physics of materials and high-energy particles, potentially rivaling or surpassing traditional computational methods.

While certain applications are deemed particularly suitable for quantum systems, challenges remain. For instance, Pranav Gokhale at Infleqtion, a company building quantum devices from ultracold atoms, is implementing Shor’s algorithm, a quantum method capable in principle of breaking much of the encryption used by banks today. However, this initial implementation still lacks the computational power needed to decrypt real-world encrypted information, illustrating that significant enhancements in both hardware and software are essential.
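For readers curious about what “breaking encryption” involves, here is a minimal classical sketch of the number-theoretic core of Shor’s algorithm, using the textbook example N = 15 (the quantum speed-up comes entirely from finding the period r efficiently, which is done here by brute force):

```python
from math import gcd

# Toy factoring via order finding, the core of Shor's algorithm.
# N = 15 and a = 7 are the standard textbook example.
N, a = 15, 7

# Find the order r: the smallest r > 0 with a**r = 1 (mod N).
# Shor's algorithm finds r with a quantum Fourier transform; classically,
# this brute-force search is exponentially slow for large N.
r = 1
while pow(a, r, N) != 1:
    r += 1

# If r is even and a**(r/2) is not -1 mod N, factors follow from greatest common divisors.
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1
x = pow(a, r // 2, N)
factors = sorted({gcd(x - 1, N), gcd(x + 1, N)})
print(f"order r = {r}, factors of {N}: {factors}")   # order r = 4, factors of 15: [3, 5]
```

Real RSA moduli are thousands of bits long, which is why the brute-force period search above is hopeless classically and why breaking such encryption would demand a large, fault-tolerant quantum computer.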

Dutch startup QuantWare has proposed a solution to the industry’s major hardware challenge, asserting that increasing quantum computer size can enhance computational capacity while maintaining reliability. Its quantum processor unit design aims to accommodate 10,000 qubits, roughly 100 times the capacity of most current superconducting quantum computers. According to Matthijs Rijlaarsdam, QuantWare anticipates having its first such device operational within two and a half years. Other firms, such as IBM and Quantinuum, are working toward similarly large-scale quantum systems, while QuEra aims to fabricate 10,000 qubits from ultracold atoms within a year, intensifying the competitive landscape.

Moreover, the quantum computing industry is projected to expand significantly, with global investments expected to rise from $1.07 billion in 2024 to approximately $2.2 billion by 2027, as noted in a Quantum Computing Industry Survey by Hyperion Research.

“More individuals than ever can now access quantum computers, and I believe they will accomplish things we can scarcely imagine,” said Jamie Garcia from IBM.

Topics:

Source: www.newscientist.com

Quantum Computers Prove More Valuable Than Anticipated by 2025

Quantum Computers Could Shed Light on Quantum Behavior

Galina Nelyubova/Unsplash

Over the past year, I consistently shared the same narrative with my editor: Quantum computers are increasingly pivotal for scientific breakthroughs.

This was the primary intent from the start. The ambition to leverage quantum computers for deeper insights into our universe has been part of the field since its conception, referenced even in Richard Feynman’s 1981 address. In his discussion about effectively simulating nature, he suggested: “Let’s construct the computer itself using quantum mechanical components that adhere to quantum laws.”

Currently, this vision is being brought to life by Google, IBM, and a multitude of academic teams. Their devices are now employed to simulate reality on a quantum scale. Below are some key highlights.

This year’s advancements in quantum technology began for me with two studies in high-energy particle physics that crossed my desk in June. Separate research teams utilized two unique quantum computers to mimic the behavior of particle pairs within quantum fields. One utilized Google’s Sycamore chip, crafted from tiny superconducting circuits, while the other, developed by QuEra, employed a chip based on cryogenic atoms regulated by lasers and electromagnetic forces.

Quantum fields encapsulate how forces like electromagnetism influence particles across the universe. Additionally, there’s a local structure that defines the behaviors observable when zooming in on a particle. Simulating these fields, especially regarding particle dynamics—where particles exhibit time-dependent behavior—poses challenges akin to producing a motion picture of such interactions. These two quantum computers addressed this issue for simplified versions of quantum fields found in the Standard Model of particle physics.

Jad Halimeh, a researcher at the University of Munich who was not involved in either study, remarked that enhanced versions of these experiments, simulating more intricate fields on larger quantum computers, could ultimately clarify particle behaviors within colliders.

In September, teams from Harvard University and the Technical University of Munich applied quantum computers to simulate two theoretical exotic states of matter that had previously eluded traditional experiments. Quantum computers adeptly predicted the properties of these unusual materials, a feat impossible by solely growing and analyzing lab crystals.

Google’s new superconducting quantum computer, “Willow”, took its turn in October. Researchers from the company and their partners used Willow to run algorithms aimed at interpreting data obtained from nuclear magnetic resonance (NMR) spectroscopy, a technique frequently applied in molecular and biochemical studies.

While the team’s demonstration using actual NMR data did not achieve results beyond what conventional computers can handle, the mathematics underlying the algorithm holds the promise of one day exceeding classical machines’ capabilities, providing unprecedented insights into molecular structures. The speed of this development hinges on advancements in quantum hardware technology.

Later, a third category of quantum computer made headlines. Quantinuum’s Helios-1, designed with trapped ions, successfully executed simulations of mathematical models relating to perfect electrical conductivity, or superconductivity. Superconductors facilitate electricity transfer without loss, promising highly efficient electronics and potentially enhancing sustainable energy grids. However, currently known superconductors operate solely under extreme conditions, rendering them impractical. Mathematical models elucidating the reasons behind certain materials’ superconducting properties are crucial for developing functional superconductors.

What did Helios-1 successfully simulate? Henrik Dreyer from Quantinuum provided insights, stating that it is likely the most pivotal model in this domain, capturing physicists’ interests since the 1960s. Although this simulation didn’t unveil new insights into superconductivity, it established quantum computers as essential players in physicists’ ongoing quest for understanding.

A week later, I was on another call, discussing metamaterials with Sabrina Maniscalco of the quantum algorithm firm Algorithmiq. These materials can be finely tuned to possess unique attributes absent in naturally occurring substances. They hold potential for applications ranging from basic invisibility cloaks to catalysts that accelerate chemical reactions.

Maniscalco’s team worked on metamaterials, a topic I delved into during my graduate studies. Their simulation utilized an IBM quantum computer built with superconducting circuits, enabling the tracking of how metamaterials manipulate information—even under conditions that challenge classical computing capabilities. Although this may seem abstract, Maniscalco mentioned that it could propel advancements in chemical catalysts, solid-state batteries, and devices converting light to electricity.

As if particle physics, new states of matter, molecular analysis, superconductors, and metamaterials weren’t enough, a recent tip led me to a study from the University of Maryland and the University of Waterloo in Canada. They utilized a trapped ion quantum computer to explore how particles bound by strong nuclear forces behave under varying temperatures and densities. Some of these behaviors are believed to occur within neutron stars—poorly understood cosmic entities—and are thought to have characterized the early universe.

While the researchers’ quantum computations involved approximations that diverged from the most sophisticated models of strong forces, the study offers evidence of yet another domain where quantum computers are emerging as powerful discovery tools.

Nevertheless, this wealth of examples comes with important caveats. Most mathematical models simulated on quantum systems require simplifications compared to the most complex models; many quantum computers are still prone to errors, necessitating post-processing of computational outputs to mitigate those inaccuracies; and benchmarking quantum results against top-performing classical computers remains an intricate challenge.

In simpler terms, conventional computing and simulation techniques continue to advance rapidly, with classical and quantum computing researchers engaging in a dynamic exchange where yesterday’s cutting-edge calculations may soon become routine. Last month, IBM joined forces with several other companies to launch a publicly accessible quantum advantage tracker. This initiative ultimately aims to provide a leaderboard showcasing where quantum computers excel or lag in comparison to classical ones.

Even if quantum systems don’t ascend to the forefront of that list anytime soon, the revelations from this past year have transformed my prior knowledge into palpable excitement and eagerness for the future. These experiments have effectively transitioned quantum computers from mere subjects of scientific exploration to invaluable instruments for scientific inquiry, fulfilling tasks previously deemed impossible just a few years prior.

At the start of this year, I anticipated primarily focusing on benchmark experiments. In benchmark experiments, quantum computers execute protocols showcasing their unique properties rather than solving practical problems. Such endeavors can illuminate the distinctions between quantum and classical computers while underscoring their revolutionary potential. However, transitioning from this stage to producing computations useful for active physicists appeared lengthy and undefined. Now, I sense this path may be shorter than previously envisioned, albeit with reasonable caution. I remain optimistic about uncovering more quantum surprises in 2026.

Topics:

Source: www.newscientist.com

Qubits Surpass Quantum Boundaries, Enabling Extended Information Encoding

Quantum particles now have an extended capacity to carry useful information.

koto_feja/Getty Images

The intriguing phenomenon of quantum superposition has enabled scientists to surpass a limit previously thought to be fundamental to quantum mechanics, equipping quantum objects with properties advantageous for storing quantum information over longer times.

For over a century, physicists have wrestled with the challenge of distinguishing between the minuscule quantum world and the larger macroscopic universe. In 1985, physicists Anthony Leggett and Anupam Garg introduced a mathematical test for determining the threshold at which an object stops displaying quantum characteristics. Quantum objects are recognized by remarkably strong correlations between their properties at different times, akin to surprising connections between the actions of yesterday and tomorrow.

Objects that achieve a sufficiently high score in this test are classified as quantum, with the scores thought to be capped by a value known as the temporal Tsirelson bound (TTB). Theorists believed that even distinctly quantum objects could not surpass this threshold. However, Arijit Chatterjee and his colleagues at the Indian Institute of Science Education and Research in Pune have discovered a method to significantly exceed the TTB using one of the most basic quantum elements.
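For reference, the quantity usually studied in this setting is the three-time Leggett-Garg combination of two-time correlators (given here in its standard textbook form; the article does not spell out which variant the Pune team tested):

$$K_3 = C_{12} + C_{23} - C_{13}, \qquad K_3 \le 1 \ \text{(macrorealism)}, \qquad K_3 \le \tfrac{3}{2} \ \text{(temporal Tsirelson bound)},$$

where $C_{ij}$ is the correlation between outcomes of measurements of the same two-valued observable made at times $t_i$ and $t_j$.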

They centered their research on qubits, the essential building blocks of quantum computers and other quantum information systems. While qubits can be produced through various methods, the team utilized a carbon-based molecule incorporating three qubits. The first qubit was employed to control the behavior of the second “target” qubit over time, with the third qubit employed to extract properties from the target.

Though three-qubit configurations were generally believed to be constrained by the TTB, Chatterjee and his team found a way to push the target qubit far beyond this limit. In fact, their technique produced one of the largest violations of the bound yet seen. The key was for the first qubit to steer the target qubit while itself in a state of quantum superposition, in which it can effectively embody two states or actions that seem mutually exclusive. For instance, in their experiment, the first qubit directed the target qubit to rotate both clockwise and counterclockwise simultaneously.

Qubits usually succumb to decoherence over time, diminishing their capacity to store quantum information. After the target qubit surpassed the TTB, decoherence still set in, yet its ability to encode information persisted five times longer, thanks to the way its behaviour was steered over time by the superposition.

According to Chatterjee, this resilience is advantageous in any context requiring precise qubit control, such as in computational applications. Team member H. S. Karthik, from Poland’s University of Gdańsk, adds that procedures in quantum metrology, including the precise sensing of electromagnetic fields, could benefit significantly from this level of qubit control.

Rakura and their colleagues from China’s Sun Yat-sen University indicate that this research not only has clear potential for enhancing quantum computing practices but also fundamentally broadens our comprehension of how quantum objects behave over time. This is significant because immensely surpassing the TTB indicates that the properties of the qubit are highly interconnected at two divergent time points, a phenomenon absent in non-quantum entities.

The substantial breach of the TTB strongly demonstrates the extent of quantum characteristics present throughout the three-qubit configuration and exemplifies how researchers are advancing the frontiers of the quantum domain, says Karthik.

Topics:

  • quantum computing/
  • quantum physics

Source: www.newscientist.com

Quantum Experiment Resolves Century-Long Debate Between Einstein and Bohr


Double-slit experiment showcases the quantum nature of reality

Russell Kightley/Science Photo Library

A thought experiment that sparked a famous debate between physicists Albert Einstein and Niels Bohr in 1927 has now been realized. This breakthrough addresses one of quantum physics’ fundamental mysteries: is light truly a wave, a particle, or an intricate mix of both?

The debate centers on the double-slit experiment, tracing back another century to 1801, when Thomas Young used it to argue for the wave nature of light, while Einstein contended it is a particle. Bohr’s contributions to quantum physics suggested that both perspectives could hold true. Einstein, critical of this notion, designed a modified version of Young’s experiment to counter it.

<p>Recently, <a href="https://quantum.ustc.edu.cn/web/en/node/137">Chao-Yang Lu</a> and his team at the University of Science and Technology of China used cutting-edge experimental techniques to realise Einstein's proposed experiment, demonstrating the unique dual wave-particle character of quantum objects theorised in the 1920s. "Witnessing quantum mechanics 'in action' at such a foundational level is awe-inspiring," remarks Lu.</p>
<p>In the classic double-slit experiment, light is directed at two narrow parallel slits in front of a screen. If light were entirely made of particles, the screen would display a distinct blob of light behind each slit. Instead, researchers observe an "interference pattern" of alternating dark and bright bands. This shows that light behaves like waves passing through the slits, creating ripples that overlap on the screen. Notably, this interference pattern remains even when the light intensity is reduced to a single photon at a time. Does this imply that photons, which exhibit particle-like behavior, also interfere like waves?</p>
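<p>For an idealised pair of slits separated by a distance d and lit with light of wavelength λ, the textbook intensity on the screen at angle θ follows (this is the generic two-slit relation, not a detail of Lu's single-atom setup):</p>

$$I(\theta) \;\propto\; \cos^2\!\left(\frac{\pi d \sin\theta}{\lambda}\right),$$

<p>with bright fringes wherever the path difference d sin θ is a whole number of wavelengths and dark bands where it is a half-integer number.</p>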
<p>Bohr proposed the idea of "complementarity", stating that one cannot observe the particle nature of a photon while it is showing wave-like behavior, and vice versa. Amid discussions on this matter, Einstein envisioned a spring-mounted slit that would recoil as a photon passed through it. By analyzing the movement of the spring, physicists could determine which slit the photon had traversed. Einstein believed this approach allowed a simultaneous description of both particle and wave behavior, with an interference pattern surviving in contradiction of complementarity.</p>
<p>Lu's team aimed to recreate this setup at the "ultimate quantum limit", firing a single photon not at a slit but at a single atom that could recoil in a similar way. Upon striking the atom, the photon scattered into a quantum state in which it propagated both left and right, producing an interference pattern when it reached the detector. To achieve this, the researchers used lasers and electromagnetic forces to cool the atoms dramatically, enabling precise control over their quantum properties. This was vital for testing Bohr's claims against Einstein's. Bohr had argued that Heisenberg's uncertainty principle protects complementarity: if the momentum kick imparted to the slit by the recoil is known precisely enough to reveal the photon's path, the slit's position becomes so uncertain that the interference pattern is washed out, and vice versa.</p>
<p>"Bohr's response was brilliant, but such thought experiments remained theoretical for almost a century," notes Lu.</p>

<p>By adjusting the laser, Lu's team could control the momentum uncertainty of the atom acting as the slit. They found that Bohr was indeed correct: tuning these momentum uncertainties finely enough made the interference pattern disappear. Remarkably, the team could also access intermediate regimes, extracting partial recoil information and observing blurred versions of the interference pattern. In essence, the photon displayed both wave and particle characteristics at once, according to Lu.</p>
<p>"The real intrigue lies in [this] intermediate realm," states <a href="https://physics.mit.edu/faculty/wolfgang-ketterle/">Wolfgang Ketterle</a> from the Massachusetts Institute of Technology. Earlier this year, he and his team conducted a variation of Einstein's experiment, using ultracold atoms controlled by lasers to play the role of the two slits. Where Lu's group used a single atom to scatter light in two directions, Ketterle's atoms each scattered light in the same direction, and a change in an atom's quantum state revealed which atom a photon had struck. Ketterle emphasizes that this approach provides a distinct means to explore wave-particle duality, offering clearer insights into photon behavior because the "which direction" information is recorded in one of two separate atoms, albeit deviating slightly from Einstein's original premise.</p>
<p>Furthermore, he and his colleagues performed experiments in which they abruptly switched off the laser (akin to removing the spring from a movable slit) and then directed photons at the atoms. Bohr's conclusions held: the uncertainty principle governed the momentum exchange between atoms and photons, "washing out" the interference fringes. This spring-free iteration of Einstein's concept had remained untested until now, according to Ketterle. "Atomic physics presents an excellent opportunity to apply cold atoms and lasers for a clearer illustration of quantum mechanics, a possibility not achievable before."</p>

<p><a href="https://physik.unibas.ch/en/persons/philipp-treutlein/">Philipp Treutlein</a> and his colleagues at the University of Basel in Switzerland assert that both experiments strongly reinforce fundamental aspects of quantum mechanics. "From our modern perspective, we understand how quantum mechanics operates on a microscopic level. Yet witnessing the empirical realization of these principles is always impactful." The experiments led by Lu align conceptually with historical records of the debates between Bohr and Einstein, affirming that quantum mechanics behaves as predicted.</p>
<p>For Lu, there remains more work on categorizing the quantum state of the slit and increasing its mass. However, the experiment carries significant educational importance. "Above all, I hope to illustrate the sheer beauty of quantum mechanics," he shares. "If more young individuals witness the real-time emergence and disappearance of interference patterns and think, 'Wow, this is how nature functions,' then the experiment will already be a success."</p>

<section class="ArticleTopics" data-component-name="article-topics">
    <p class="ArticleTopics__Heading">Topics:</p>
</section>

Source: www.newscientist.com

Why Quantum Mechanics Suggests the Past Isn’t Real

Einstein’s ring, termed the blue horseshoe, an effect observed through gravitational lensing of far-off galaxies

NASA, ESA

This is an excerpt from the Lost in Space-Time newsletter. Each month, we showcase intriguing concepts from around the globe. You can click here to subscribe to Lost in Space-Time.

Adolf Hitler’s death is recorded as April 30, 1945. At least, that’s the official narrative. However, some historians contest this, suggesting he escaped war-torn Berlin and lived in secrecy. Today, this alternate theory is largely viewed as a conspiracy, yet no rational historian can deny that, regardless of the available evidence, the “facts in question” existed. Hitler was either deceased that day or he was not. It’s nonsensical to suggest that he was both alive and dead on May 2, 1945. But if we replace Adolf Hitler with Schrödinger’s renowned cat, the historical “facts” become quite muddled.

Schrödinger is recognized as a foundational figure in quantum mechanics, the most successful scientific framework to date. It serves as the backbone for many fields, including chemistry, particle physics, materials science, molecular biology, and astronomy, yielding remarkable technological advancements, from lasers to smartphones. Yet, despite its successes, the essence of quantum mechanics appears perplexing at its core.

In our daily lives, we operate under the assumption that an “external” real world exists where objects like tables and chairs possess clearly defined traits, such as position and orientation, independent of observation. In the macroscopic realm, our observations merely uncover a pre-existing reality. Conversely, quantum mechanics governs the microscopic domain of atoms and subatomic particles, where certainty and clarity dissolve into ambiguity.

Quantum uncertainty implies that the future is not entirely dictated by the present. For example, if an electron is directed toward a thin barrier with a known speed, it can either bounce back or tunnel through, emerging on the opposite side. Similarly, if an atom becomes excited, it might remain excited or decay and emit a photon a few microseconds later. In both scenarios, predicting outcomes with certainty is impossible—only probabilistic estimates can be offered.

Most individuals are comfortable with the idea that the future holds uncertainties. However, quantum indeterminacy applies equally to the past: it, too, is not yet fully settled. When scrutinized at a minute scale, history dissolves into a blend of alternate possibilities, a state known as superposition.

The hazy picture of the quantum microcosm sharpens during measurements. For instance, a measurement may locate an electron at a specific spot; however, quantum mechanics asserts that this doesn’t mean the electron was already there, with the observation merely disclosing its pre-existing location. Rather, measurement transforms the electron from a state without a defined location into one with a defined position.

So, how should we conceptualize electrons prior to observation? Picture an abundance of semi-real “ghost electrons” dispersed in space, each denoting a distinct possibility. Reality dwells in an indeterminate state. This notion is sometimes described by saying that an electron occupies multiple locations simultaneously. Measurement then serves to convert one of these “ghosts” into tangible reality while eliminating its counterparts.

Does the experimenter have control over the outcome? Not over which ghost prevails; that hinges on pure randomness. Yet a layer of choice is present, and it is vital for grasping quantum reality. If, instead of measuring position, the experimenter decides to assess the electron’s speed, the fuzzy initial state again resolves into a distinct result, but this time the measurement yields an electron with a definite velocity rather than a definite location. Interestingly, electrons with definite speed exhibit wave-like properties, distinct from their particle nature. Thus, electrons embody both wave and particle characteristics, contingent on the measurement approach.

In summary: whether electrons behave as waves or particles is dictated by the type of measurement the experimenter chooses. While this may seem bizarre, the situation grows even stranger. What happened to the electron before the measurement also relies on the experimenter’s selections. In essence, the properties of electrons, wave or particle, are contingent upon one’s choices, suggesting that those choices may retroactively shape the “external” world prior to measurement.

Is this time travel? Retroactive causality? Telepathy? These terms are often overused in popular quantum physics discussions, but the clearest explanation comes from John Wheeler, who coined the term black hole: “The past exists solely as recorded in the present,” he asserted.

While Mr. Wheeler’s assertion is thought-provoking, is there an actual experiment that validates it? Over breakfast at the Hilton Hotel in Baltimore in 1980, Wheeler put a curious question to me: “How do you hold on to the ghost of a photon?” Recognizing my bewilderment, he proceeded to elaborate on a unique twist he had devised for a classic quantum experiment, applicable to light, electrons, or even entire atoms.

This experiment traces back to the British polymath Thomas Young, who in 1801 aimed to demonstrate the wave properties of light. Young established a screen with two closely placed slits and illuminated it with a pinprick of light. What transpired? Instead of the anticipated two blurred light bands, Young observed a series of bright and dark stripes known as interference fringes. This phenomenon arises because light waves passing through each slit disperse, where they amplify and create brighter sections through constructive interference while canceling out in others, resulting in dark patches through destructive interference.

Light passing through two slits in a screen during a double-slit experiment

Russell Kightley/Science Photo Library

The conversation surrounding quantum mechanics began with scientists debating whether light consists of waves or of particles called photons. The resolution is that it is both. Thanks to modern advancements, we can conduct Young’s experiment one photon at a time. Each photon produces a minuscule dot on the second screen, and over time, multiple dots accumulate, forming the characteristic striped pattern Young observed. This raises a question: if a photon is a minuscule particle, it should clearly pass through one slit or the other. Yet both slits are necessary to create the interference pattern.
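A minimal simulation (illustrative only, with made-up parameters) shows how individually random single-photon detections build up Young’s fringes, each hit drawn from the ideal two-slit probability distribution:

```python
import numpy as np

rng = np.random.default_rng(1)
d_over_wavelength = 20.0                      # slit separation in units of the wavelength
theta = np.linspace(-0.1, 0.1, 2001)          # angles toward the screen, in radians

# Ideal two-slit detection probability at each angle: ~ cos^2(pi * d * sin(theta) / lambda).
p = np.cos(np.pi * d_over_wavelength * np.sin(theta)) ** 2
p /= p.sum()

hits = rng.choice(theta, size=50_000, p=p)    # 50,000 single-photon detection events
counts, edges = np.histogram(hits, bins=60)

# Crude text rendering: each row is a strip of the screen, '#' marks accumulated hits.
for count, left_edge in zip(counts, edges):
    print(f"{left_edge:+.3f} {'#' * int(count // 40)}")
```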

What occurs if an astute experimenter wants to determine the slit a particular photon travels through? A detector can be placed near a slit to achieve this. Once that occurs, the interference pattern vanishes. The act of detecting effectively causes the photons to assume a particle-like behavior, obscuring their wave characteristics. The same principle applies to electrons; one can either pinpoint which slit the electrons traverse, resulting in the absence of interference stripes, or obscure their pathways and observe stripes manifest after numerous electrons have produced the pattern. Thus, experimenters can dictate whether photons, or electrons for that matter, act like waves or particles when they hit the detection screen.

Now, let’s discuss Wheeler’s twist. The decision to observe or not need not be made in advance. Photons (or electrons) can pass through the slit system first, with the choice deferred until just before they reach the imaging screen. The experimenter can, in effect, opt to look back in time to see which slit a photon came through. Known as a delayed-choice experiment, this setup has been carried out and yielded the anticipated outcomes: when the experimenter decides to observe the paths, the photons fail to build up a striped pattern. The essence of the phenomenon is that what the light was earlier, a wave traversing both slits or a particle going through one, is contingent on the later choice of the experimenter. For clarity, in real studies the “choices” are automated and randomized to prevent bias, occurring faster than human response times.

In delayed-choice experiments, the past is not rewritten. Rather, until the measurement is made there is no single past but multiple pasts, intertwining distinct realities, and your measurement choice narrows down this history. While a unique past remains elusive, the number of possibilities can be reduced. Thus, this kind of experiment is frequently referred to as a quantum eraser experiment.

Although the delay in actual experiments is merely nanoseconds, in principle it could reach back to the dawn of the universe. This is what lay behind Wheeler’s intriguing query about retaining the ghost of a photon. He envisaged a distant cosmic light source being gravitationally lensed from our view by an intervening black hole, with two light paths bending around opposite sides of the black hole before converging on Earth. This scenario resembles a two-slit experiment on a cosmic scale, where one ghost of a photon may arrive via one path while a second ghost takes the other, possibly longer, route. To execute such a cosmic interference experiment, as in Young’s original experiment, the first ghost must be preserved, or “held”, so that the waves can overlap simultaneously, awaiting the arrival of the second ghost before they merge.

Einstein claimed that past, present, and future are mere illusions. If he erred, it was in the word “the”. While the past is recorded in today’s history, it comprises myriad interwoven “ghost pasts”, which collectively create a unique narrative at the macroscopic level. At the quantum level, however, it transforms into a mosaic of blurred partial realities that exceeds everyday comprehension.

Paul Davies is a theoretical physicist, cosmologist, astrobiologist, and bestselling author. His book, Quantum 2.0, will be published by Penguin in November 2025.

Topics:

Source: www.newscientist.com

Quantum Computers Require Classical Computing for Real-World Applications

Jonathan Cohen of Quantum Machines presenting at the AQC25 conference

Quantum Machines

Classical computers are emerging as a critical component in maximizing the functionality of quantum computers. This was a key takeaway from this month’s assembly of researchers who emphasized that classical systems are vital for managing quantum computers, interpreting their outputs, and enhancing future quantum computing methodologies.

Quantum computers operate on qubits—quantum entities manifesting as extremely cold atoms or miniature superconducting circuits. The computational capability of a quantum computer scales with the number of qubits it possesses.

Yet, qubits are delicate and necessitate meticulous tuning, oversight, and governance. Should these conditions not be met, the computations conducted may yield inaccuracies, rendering the devices less efficient. To manage qubits effectively, researchers utilize classical computing methods. The AQC25 conference held on November 14th in Boston, Massachusetts, addressed these challenges.

Sponsored by Quantum Machines, a company specializing in controllers for various qubit types, the AQC25 conference gathered over 150 experts, including quantum computing scholars and CEOs from AI startups. Through numerous presentations, attendees elaborated on the enabling technologies vital for the future of quantum computing and how classical computing sometimes acts as a constraint.

According to Shane Caldwell, fault-tolerant quantum computers capable of tackling practical problems are only expected to materialize alongside a robust classical computing framework operating at petascale, comparable to today’s leading supercomputers. Nvidia, although it does not produce quantum hardware, has recently introduced a system that links quantum processors (QPUs) to the GPUs commonly employed in machine learning and high-performance scientific computing.

Even in optimal operations, the results from a quantum computer reflect a series of quantum properties of the qubits. To utilize this data effectively, it requires translation into conventional formats, a process that again relies on classical computing resources.

Pooya Lonar from Vancouver-based startup 1Qbit discussed this translation process and its implications, noting that the performance speed of fault-tolerant quantum computers can often hinge on the operational efficiency of classical components such as controllers and decoders. This means that whether a sophisticated quantum machine operates for hours or days to solve a problem might depend significantly on its classical components.

In another presentation, Benjamin Lienhardt from the Walther Meissner Institute for low-temperature research in Germany presented findings on how traditional machine learning algorithms can help interpret the quantum states of superconducting qubits. Similarly, Mark Saffman from the University of Wisconsin-Madison highlighted the use of classical neural networks to enhance the readout of qubits made from ultra-cold atoms. Researchers agreed that non-quantum devices are instrumental in unlocking the potential of various qubit types.

IBM’s Blake Johnson shared insights into a classical decoder his team is developing as part of an ambitious plan to create a quantum supercomputer by 2029. This endeavor will employ unconventional error correction strategies, making the efficient decoding process a significant hurdle.

“As we progress, the trend will shift increasingly towards classical [computing]. The closer one approaches the QPU, the more you can optimize your system’s overall performance,” stated Jonathan Cohen from Quantum Machines.

Classical computing is also instrumental in assessing the design and functionality of future quantum systems. For instance, Izhar Medalcy, co-founder of the startup Quantum Elements, discussed how an AI-powered virtual model of a quantum computer, often referred to as a “digital twin,” can inform actual hardware design decisions.

Representatives from the Quantum Scaling Alliance, co-led by 2025 Nobel Laureate John Martinis, were also present at the conference. This reflects the importance of collaboration between quantum and classical computing realms, bringing together qubit developers, traditional computing giants like Hewlett Packard Enterprise, and computational materials specialists such as the software company Synopsys.

The collective sentiment at the conference was unmistakable. The future of quantum computing is on the horizon, bolstered significantly by experts who have excelled in classical computing environments.

Topics:

  • Computing
  • Quantum Computing

Source: www.newscientist.com

Quantum 2.0 Review: An Ambitious and Entertaining Exploration of Quantum Physics, Though Slightly Exaggerated

Quantum 2.0 explores the boundaries of our understanding of the quantum realm

Richard Keil/Science Photo Library

Quantum 2.0
Paul Davies, Penguin (UK, 27 November 2025); University of Chicago Press (US, February 2026)

In his book Quantum 2.0: The Past, Present, and Future of Quantum Physics, physicist Paul Davies concludes with a beautiful reflection: “To grasp the quantum world is to catch a glimpse of the grandeur and elegance of the physical universe and our role within it.”

This enchanting and romantic viewpoint resonates throughout the text. Quantum 2.0 is a bold attempt to elucidate the fringes of the quantum universe, with Davies as an informed and passionate storyteller. However, his enthusiasm occasionally edges toward exaggeration, and his remarkable writing often carries passages where more caveats might have been fitting.

Davies’ book is quite accessible, despite its ambitious aim of covering nearly every facet of quantum physics. He addresses quantum technologies in computing, communications, and sensing, touches on quantum biology and cosmology, and manages to explore various competing interpretations of quantum theory.

There are no equations in Quantum 2.0, and while some technical diagrams and schematics are included, they do not detract from the reading experience.

As a writer on quantum physics myself, I appreciate how clearly Davies articulates the experiments and protocols involved in quantum information processing and encryption—a challenging task to convey.

As a navigator through the quantum realm, Davies serves as a delightful and amiable companion. His genuine curiosity and excitement are palpable. Yet, this exuberance doesn’t always align with the rigor that contemporary quantum physics research demands. In my view, most quantum-related excitement should come with cautionary notes.


Readers unfamiliar with quantum research might confuse speculative claims with the truth.

For instance, within the first 100 pages, Davies asserts that quantum computers could enhance climate modeling—an assertion not widely accepted among computer scientists and mathematicians, especially concerning near-future machines.

In another section regarding quantum sensors, he mentions manufacturers proposing their utility in evaluating conditions like epilepsy, schizophrenia, and autism. I anticipated a justification or insights from experts outside the sensor industry, but the ensuing discussion was lacking in depth and critical analysis.

Additionally, the example Davies provides to demonstrate quantum computers’ advantages over classical ones dates back several years.

Less experienced readers in quantum research may find some of Davies’s speculative statements misleading, although the book remains an engaging read. This is underscored by bold assertions such as, “Whoever masters Quantum 2.0 will certainly control the world.”

To clarify, I don’t dispute Davies’ sentiments. Many gadgets that influence our lives currently depend on quantum physics, and the future may usher in even more quantized technology. I support this notion.

Emerging fields, such as quantum biology and better integration of quantum and cosmological theories, also seem poised for significant breakthroughs. Just ask the numerous researchers diligently working toward a theory of quantum gravity.

However, conveying this future to newcomers necessitates a blend of precision and subtlety in storytelling and writing.

Otherwise, the outcome may lead to disappointment.


Source: www.newscientist.com

Quantum Computers with Recyclable Qubits: A Solution for Reducing Errors

Internal optics of Atom Computing’s AC1000 system

Atom Computing

Quantum computers, utilizing qubits formed from extremely cold atoms, are rapidly increasing in size and may soon surpass classical computers in computational power. However, the frequency of errors poses a significant challenge to their practicality. Researchers have now found a way to replenish and recycle these qubits, enhancing computation reliability.

All existing quantum systems are susceptible to errors and are currently unable to perform calculations that would give them an edge over traditional computers. Nonetheless, researchers are making notable advancements in the creation of error correction methods to address this issue.

One approach involves dividing the components of quantum computers, known as qubits, into two primary categories: operational qubits that manipulate data and auxiliary qubits that monitor errors.

Developing large quantities of high-quality qubits for either function remains a significant technical hurdle. Matt Norcia and his team at Atom Computing have discovered a method to lessen the qubit requirement by recycling or substituting auxiliary qubits. They demonstrated that an error-tracking qubit can be effectively reused for up to 41 consecutive runs.

“The calculation’s duration is likely to necessitate numerous rounds of measurement. Ideally, we want to reuse qubits across these rounds, minimizing the need for a continuous influx of new qubits,” Norcia explains.

The team utilized qubits derived from electrically neutral ytterbium atoms that were chilled close to absolute zero using lasers and electromagnetic pulses. By employing “optical tweezers,” they can manipulate each atom’s quantum state, which encodes information. This method allowed them to categorize the quantum computer into three distinct zones.

In the first zone, 128 optical tweezers directed the qubits that carried out the calculations. The second zone comprised 80 tweezers holding qubits used for error tracking, or that could be swapped in for faulty qubits. The third zone functioned as a storage area, holding an additional 75 qubits until they were needed. These last two areas enabled the researchers to reset or exchange auxiliary qubits as needed.
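To make the role of these reusable error-tracking qubits concrete, here is a toy Python sketch (not Atom Computing’s control software) of a classical three-bit repetition code in which a single ancilla is reset and reused round after round instead of consuming a fresh qubit for every syndrome measurement; the helper name run_rounds, the round count and the error rate are arbitrary choices for illustration.

```python
# Toy illustration (arbitrary values, not Atom Computing's software): a
# 3-bit repetition code in which one ancilla is reset and reused each round,
# rather than a fresh ancilla being consumed per syndrome measurement.
import random

def run_rounds(num_rounds=41, flip_prob=0.02, seed=1):
    random.seed(seed)
    data = [0, 0, 0]          # logical 0 encoded as 000
    for _ in range(num_rounds):
        # Noise: each data bit may flip
        for i in range(3):
            if random.random() < flip_prob:
                data[i] ^= 1
        # Syndrome extraction with one ancilla, reset ("recycled") before each use
        ancilla = 0
        ancilla ^= data[0] ^ data[1]   # parity of the first pair
        s1 = ancilla
        ancilla = 0                    # reset and reuse
        ancilla ^= data[1] ^ data[2]   # parity of the second pair
        s2 = ancilla
        # Correct the single bit singled out by the two parities
        if s1 and not s2:
            data[0] ^= 1
        elif s1 and s2:
            data[1] ^= 1
        elif s2 and not s1:
            data[2] ^= 1
    return data

print(run_rounds())   # usually back to [0, 0, 0] when flips are rare
```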

Norcia noted that it was challenging to establish this setup due to stray laser light interfering with nearby qubits. Consequently, researchers had to develop a highly precise laser control and a method to adjust the state of data qubits, ensuring they remained “hidden” from specific harmful light types.

“The reuse of ancilla qubits is crucial for advancing quantum computing,” says Yuval Borger from QuEra, a U.S. quantum computing firm. Without this ability, even basic calculations would require millions, or even billions, of qubits, making them impractical on current or forthcoming quantum hardware, he adds.

This challenge is recognized widely across the atom-based qubit research community. “Everyone working with neutral atoms acknowledges the necessity of resetting and reloading qubits during calculations,” Norcia asserts.

For instance, Borger highlights that a team from Harvard and MIT employed similar techniques to maintain the operation of their quantum computer using 3000 ultra-cold rubidium atoms for several hours. Other quantum setups, like Quantinuum’s recently launched Helios machine, which uses ions controlled by light as qubits, also feature qubit reusability.


Source: www.newscientist.com

IBM Introduces Two Quantum Computers with Unmatched Complexity

IBM researchers hold components of the Loon quantum computer

IBM

In the competitive landscape of developing error-resistant quantum supercomputers, IBM is adopting a unique approach distinct from its primary rivals. The company has recently unveiled two new quantum computing models, dubbed Nighthawk and Loon, which may validate its methodology and deliver the advancements essential for transforming next-gen devices into practical tools.

IBM’s design for quantum supercomputers is modular, emphasizing the innovation of connecting superconducting qubits both within and across different quantum units. When this interconnectivity was first proposed, some researchers expressed skepticism about its feasibility. Jay Gambetta from IBM noted that critics implied to the team, “You exist in a theoretical realm; achieving this is impossible,” which they aim to refute.

Within Loon, every qubit interlinks with six others, allowing for unique connectivity that enables vertical movement in addition to lateral motion. This feature has not been previously observed in existing superconducting quantum systems. Conversely, Nighthawk implements four-way connections among qubits.

This enhanced connectivity may be pivotal in tackling some of the most pressing issues encountered by current quantum computers. The advancements could boost computational capabilities and reduce error rates. Gambetta indicated that initial tests with Nighthawk demonstrated the ability to execute quantum programs that are 30% more complex than those on most other quantum computers in use today. Such an increase in complexity is expected to facilitate further advancements in quantum computing applications, with IBM’s earlier models already finding utility in fields like chemistry.

The industry’s ultimate objective remains the ability to cluster qubits into error-free “logical qubits.” IBM is promoting strategies that necessitate smaller groupings than those pursued by competitors like Google. This could permit IBM to realize error-free computation while sidestepping some of the financial and engineering hurdles associated with creating millions of qubits. Nonetheless, this goal hinges on the connectivity standards achieved with Loon, as stated by Gambetta.

Stephen Bartlett, a researcher at the University of Sydney in Australia, expressed enthusiasm about the enhanced qubit connectivity but noted that further testing and benchmarking of the new systems are required. “While this is not a panacea for scaling superconducting devices to a size capable of supporting genuinely useful algorithms, it represents a significant advancement,” he remarked.

However, there remain several engineering and physical challenges on the horizon. One crucial task is to identify the most effective method for reading the output of a quantum computer after calculations, an area where Gambetta mentioned recent IBM progress. The team, led by Matthias Steffen, also aims to enhance the “coherence time” for each qubit. This measure indicates how long a quantum state remains valid for computational purposes, but the introduction of new connections can often degrade this quantum state. Additionally, they are developing techniques to reset certain qubits while computations are ongoing.

Plans are in place for IBM to launch a modular quantum computer in 2026 capable of both storing and processing information, with future tests on Loon and Nighthawk expected to provide deeper insights.


Source: www.newscientist.com

Helios 1: A Groundbreaking Quantum Computer Poised to Tackle Superconductivity Challenges

Helios-1 Quantum Computing Chip

Quantinuum

At Quantinuum, researchers have harnessed the capabilities of the Helios-1 quantum computer to simulate a mathematical model traditionally used to analyze superconductivity. While classical computers can perform these simulations, this breakthrough indicates that quantum technology may soon become invaluable in the realm of materials science.

Superconductors can transmit electricity flawlessly, yet they only operate at exceedingly low temperatures, which limits their practicality. For decades, physicists have sought to modify the structural characteristics of superconductors so that they work at room temperature, and many believe the key lies within a mathematical framework known as the Fermi-Hubbard model, which Quantinuum researcher Henrik Dreyer regards as a significant component of condensed matter physics.

While traditional computers excel at simulating the Fermi-Hubbard model, they struggle with large samples and fluctuating material properties. In comparison, quantum computers like Helios-1 are poised to excel in these areas. Dreyer and colleagues achieved a milestone by conducting the most extensive simulation of the Fermi-Hubbard model on a quantum platform.
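For a sense of what the Fermi-Hubbard model actually is, here is a minimal classical Python sketch (not the Helios-1 simulation) that exactly diagonalizes its smallest interesting case, two sites sharing one spin-up and one spin-down electron; the hopping strength t and on-site repulsion U are arbitrary illustrative values.

```python
# Minimal classical sketch (not the quantum simulation): exact diagonalization
# of the two-site Fermi-Hubbard model with one spin-up and one spin-down
# electron. Basis: |both on site 1>, |up1,down2>, |down1,up2>, |both on site 2>.
import numpy as np

t, U = 1.0, 4.0   # hopping and on-site repulsion (arbitrary illustrative values)

H = np.array([
    [U, -t, -t, 0],
    [-t, 0,  0, -t],
    [-t, 0,  0, -t],
    [0, -t, -t,  U],
])

energies = np.linalg.eigvalsh(H)
# Known closed-form ground-state energy for this tiny case
exact_ground = 0.5 * (U - np.sqrt(U**2 + 16 * t**2))
print(energies.min(), exact_ground)   # both ~ -0.828 for t=1, U=4
```

The same model on a large lattice, with many interacting fermions, is what quickly overwhelms classical methods and motivates simulations like the one on Helios-1.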

The team employed Helios-1, which operates with 98 qubits derived from barium ions. These qubits are manipulated using lasers and electromagnetic fields to execute the simulations. By stepping the qubits through various quantum states, the researchers collected data on their properties. Their simulation encompassed 36 fermions, the class of particles whose behavior in superconductors the Fermi-Hubbard model describes.

Past experiments show that fermions must form pairs for superconductors to function, an effect that can be induced by laser light. The Quantinuum team modeled this scenario, applying laser pulses to the qubits and measuring the resulting states to detect signs of particle pairing. Although the simulation didn’t replicate the experiment precisely, it captured key dynamic processes that are often challenging to model using traditional computational methods with larger particle numbers.

Dreyer mentioned that while the experiment does not definitively establish an advantage for Helios-1 over classical computing, it gives the team assurance in the competitiveness of quantum computers compared to traditional simulation techniques. “Utilizing our methods, we found it practically impossible to reproduce the results consistently on classical systems, whereas it only takes hours with a quantum computer,” he stated. Essentially, the time estimates for classical calculations were so extended that determining equivalence with Helios’ performance became challenging.

The Trapped Ions Function as Qubits in the Helios-1 Chip

Quantinuum

No other quantum computer has yet attempted to simulate fermion pairing for superconductivity, and the researchers attribute their achievement to Helios’ advanced hardware. David Hayes from Quantinuum remarked that Helios’ qubits are exceptionally reliable and perform well in industry-standard benchmarking tasks. In preliminary experiments the machine maintained error-free qubits, including entangling 94 of these specialized qubits, a record across all quantum platforms. Using such qubits in subsequent simulations could enhance their precision.

Eduardo Ibarra Garcia Padilla, a researcher at Harvey Mudd College in California, said that the new findings hold promise but require careful benchmarking against leading classical computer simulations. The Fermi-Hubbard model has intrigued physicists since the 1960s, so he is eager for advanced tools to further its study.

Uncertainty surrounds the timeline for approaches like Helios-1 to rival the leading conventional computers, according to Steve White from the University of California, Irvine. He noted that many essential details remain unresolved, particularly ensuring that quantum simulations commence with the appropriate qubit properties. Nevertheless, White posits that quantum simulations could complement classical methods, particularly in exploring the dynamic behaviors of materials.

“They are progressing toward being valuable simulation tools for condensed matter physics,” he stated, but added, “It remains early days, and computational challenges persist.”

Reference: arXiv DOI: 10.48550/arXiv.2511.02125


Source: www.newscientist.com

Next-Gen Quantum Networks: Paving the Way for a Quantum Internet Prototype

Quantum Internet could provide secure communications globally

Sakumstarke / Alamy

One of the most sophisticated quantum networks constructed to date will enable 18 individuals to communicate securely through the principles of quantum physics. The researchers affirm that this represents a feasible step towards realizing a global quantum internet, although some experts express doubt.

The eagerly awaited quantum internet aims to allow quantum computers to communicate over distances by exchanging light particles, known as photons, that are interconnected through quantum entanglement. Additionally, it will facilitate the linkage of quantum sensor networks, enabling communications impervious to classical computer hacking. However, connecting different segments of the quantum realm is not as straightforward as laying down cables due to the challenges in ensuring seamless interactions between network nodes.

Recently, Chen Shenfeng from Shanghai Jiao Tong University in China demonstrated a method to interconnect two quantum networks. Initially, they established two networks containing 10 nodes each, both sharing quantum entanglement and functioning as smaller iterations of a quantum internet. They then combined one node from each network, resulting in a larger, fully integrated network that enables communication across all pairs of the 18 remaining nodes.

Networking 18 classical computers is a straightforward endeavor involving inexpensive components, but in the quantum sphere, where specific timing is crucial for sharing individual photons among several users, advanced technology and specialized knowledge are required. Even establishing communication between pairs is intricate, yet facilitating communication among any pair of 18 users is unprecedented.

“Our method provides essential capabilities for quantum communication across disparate networks and is pivotal for creating a large-scale quantum internet that enables interactions among all participants,” the researchers stated in their paper. They did not respond to a request for comment.

As the researchers explain, this network integration hinges on a process termed entanglement swapping. Photons can be entangled by conducting a specific joint observation known as a Bell measurement. By jointly measuring one photon from each of two entangled pairs, the two outermost photons in the arrangement become linked, even though they never interacted. However, the act of measurement disrupts the delicate quantum balance and so destroys the states of the photons that were measured.
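As an illustration of the principle (a textbook calculation, not a model of the Shanghai group’s apparatus), here is a small numpy sketch: two Bell pairs are prepared, the two “middle” photons are projected onto a Bell outcome, and the two outer photons end up in a Bell state themselves.

```python
# Textbook sketch of entanglement swapping: two Bell pairs on qubits (0,1) and
# (2,3); a Bell measurement on qubits 1 and 2 leaves qubits 0 and 3 entangled
# even though they never interacted.
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # (|00> + |11>)/sqrt(2)

state = np.kron(bell, bell).reshape(2, 2, 2, 2)   # axes = qubits 0, 1, 2, 3

# Project qubits 1 and 2 onto the |Phi+> Bell outcome
bell12 = bell.reshape(2, 2)
post = np.einsum('ab,iabj->ij', bell12.conj(), state)

prob = np.sum(np.abs(post) ** 2)                   # probability of this outcome
post = post / np.sqrt(prob)                        # state of qubits 0 and 3

print(prob)                 # 0.25, one of four equally likely Bell outcomes
print(np.round(post, 3))    # proportional to (|00> + |11>)/sqrt(2)
```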

“This isn’t the first demonstration of entanglement swapping,” remarks Sidharth Joshi from the University of Bristol, UK. “What they have achieved is a framework that simplifies inter-network exchanges.”

Joshi notes that current quantum communication research is divided between extending the range of information transmission between two devices, occasionally utilizing satellites, and developing protocols and strategies for reliably networking numerous devices over shorter distances. This study pertains to the latter. “Both areas are critically important,” he asserts.

Conversely, Robert Young, a professor at Lancaster University in the UK, commented that while the results showcase a remarkable technical feat demanding expertise and extensive resources, he deems it improbable as a blueprint for future large-scale quantum networks, considering the expense and intricacy involved.

“This is far from practical and not something readily applicable in real-world scenarios,” Young states. “The paper’s claim is that this is the future of quantum network integration, but many formidable challenges remain to be addressed.”

One significant issue is the necessity for quantum repeaters to convey information across extensive distances. As distance increases, photons are frequently lost in fiber optic cables, and measurements can jeopardize the state of a photon, rendering the quantum information unreadable or untransmittable, thereby preventing signal amplification along its route. If quantum repeaters functioned effectively, they could transmit signals over longer distances, yet constructing such devices has been challenging.

“We understand that to build a viable quantum network, some method of quantum repeater is essential,” Young points out, emphasizing that this was absent in the current network demonstration.

Topics:

  • internet
  • quantum computing

Source: www.newscientist.com

Tony Blair Warns: “History Won’t Forgive Us” if Britain Lags in the Quantum Computing Race

Former prime minister Tony Blair has asserted that history “will not forgive” Britain if it lags behind in the quantum computing race. This advanced technology is anticipated to ignite a new era of innovations across various fields, from pharmaceutical development to climate analysis.

“The United Kingdom risks losing its edge in quantum research,” cautioned the former Labour prime minister at the Tony Blair Institute, a think tank supported by tech industry veterans such as Oracle founder Larry Ellison.

In a report advocating for a national quantum computing strategy, Mr. Blair and former Conservative leader William Hague drew parallels between the current situation and the evolution of artificial intelligence. While the UK made significant contributions to AI research, it has since surrendered its leadership to other nations, particularly the US, which has triggered a race to develop “sovereign” AI capabilities.

“As demonstrated with AI, a robust R&D foundation alone is insufficient; countries with the necessary infrastructure and capital will capture the economic and strategic advantages of such technologies,” they noted. “While the UK boasts the second-largest number of quantum start-ups globally, it lacks the high-risk investment and infrastructure essential for scaling these ventures.”

Quantum computing operates in unusual and fascinating ways that contrast sharply with classical computing. Traditional computers process information through transistors that switch on or off, representing 1s and 0s. In quantum mechanics, however, entities can exist in multiple states simultaneously thanks to a phenomenon called quantum superposition, which effectively allows a quantum bit to be on and off at the same time.

This leads to a dramatic boost in computational capability, potentially enabling a single quantum computer to perform tasks that would otherwise require billions of the most advanced supercomputers. Although the field is not yet mature enough for widespread application, the potential for simulating molecular structures to develop new materials and pharmaceuticals is vast. The true value of quantum computing, though, lies in its practical delivery: estimates suggest that it could unlock around $1.3 trillion in value across industries such as chemicals, life sciences, automotive and finance.

There are increasing fears that extraordinarily powerful quantum machines could decipher all encryption and pose serious risks to national security.

Blair and Hague remarked: “The quantum era is upon us, whether Britain chooses to lead or not.” They added, “History will not excuse us if we squander yet another opportunity to excel in groundbreaking technology.”

This alert follows the recent recognition of British, Cambridge-educated John Clarke, who received the 2025 Nobel Prize in Physics for his contributions to quantum computing, alongside the continued growth of UK quantum firms supported by US companies.

In June, the Oxford University spinout Oxford Ionics was acquired by US company IonQ for $1.1 billion. Meanwhile, Cyclantum, a spinout from the University of Bristol and Imperial College London, grew primarily in California, where it found its most enthusiastic investors, and its first large-scale quantum computer is being built in Brisbane, Australia.

A report from the Tony Blair Institute for Global Change critiques the UK’s current quantum approach, highlighting that both China and the US are “ahead of the game,” with countries like Germany, Australia, Finland, and the Netherlands also surpassing the UK.

A government representative stated: “Quantum technology has the potential to revolutionize sectors ranging from healthcare to affordable clean energy. The UK currently ranks second globally for quantum investment and possesses leading capabilities in supply chains such as photonics, yet we are resolute in pushing forward.”

They continued: “We have committed to a groundbreaking 10-year funding strategy for the National Quantum Computing Centre and will plan other aspects of the national program in due course.”

In June, the Labour government unveiled a £670 million initiative to expedite the application of quantum computing, as part of an industrial strategy aimed at developing new treatments for untreatable diseases and enhancing carbon capture technologies.

Source: www.theguardian.com

Quantum Computers Confirm the Reality of Wave Functions


The wave function of a quantum object might extend beyond mere mathematical representation

Povitov/Getty Images

Does quantum mechanics accurately depict reality, or is it merely our flawed method of interpreting the peculiar characteristics of minuscule entities? A notable experiment aimed at addressing this inquiry has been conducted using quantum computers, yielding unexpectedly solid results. Quantum mechanics genuinely represents reality, at least in the context of small quantum systems. These findings could lead to the development of more efficient and dependable quantum devices.

Since the discovery of quantum mechanics over a hundred years ago, its uncertain and probabilistic traits have confounded scientists. For instance, take superposition. Are particles truly existing in multiple locations simultaneously, or do the calculations of their positions merely provide varying probabilities of their actual whereabouts? If it’s the latter, then there are hidden aspects of reality within quantum mechanics that may be restricting our certainty. These elusive aspects are termed “hidden variables,” and theories based on this premise are classified as hidden variable theories.

In the 1960s, physicist John Bell devised an experiment intended to disprove such theories. The Bell test explores quantum mechanics by evaluating the connections, or entanglement, between distant quantum particles. If these particles exhibit quantum qualities surpassing a certain threshold, indicating that their entanglement is nonlocal and spans any distance, hidden variable theories can be dismissed. The Bell test has since been performed on various quantum systems, consistently affirming the intrinsic nonlocality of the quantum realm.
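As an illustration of the Bell-test threshold (a standard textbook calculation, not any of the experiments referenced here), the following Python sketch computes the CHSH combination of correlations for a maximally entangled pair: the quantum value is 2√2, above the limit of 2 that local hidden-variable theories can reach. The measurement angles are the standard optimal settings.

```python
# Textbook CHSH calculation: for a maximally entangled pair the CHSH value
# reaches 2*sqrt(2), exceeding the local hidden-variable bound of 2.
import numpy as np

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)     # (|00> + |11>)/sqrt(2)

def spin(angle):
    """Spin observable along an axis in the x-z plane (eigenvalues +1, -1)."""
    return np.array([[np.cos(angle),  np.sin(angle)],
                     [np.sin(angle), -np.cos(angle)]])

def correlation(a, b):
    op = np.kron(spin(a), spin(b))
    return phi_plus @ op @ phi_plus

a1, a2 = 0.0, np.pi / 2                 # Alice's two settings
b1, b2 = np.pi / 4, -np.pi / 4          # Bob's two settings

S = (correlation(a1, b1) + correlation(a1, b2)
     + correlation(a2, b1) - correlation(a2, b2))
print(S)   # ~2.828 = 2*sqrt(2), beyond the classical bound of 2
```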

In 2012, physicists Matthew Pusey, Jonathan Barrett, and Terry Rudolph developed a more comprehensive test (dubbed PBR in their honor) that enables researchers to differentiate between various interpretations of quantum systems. Among these are the ontic perspective, asserting that measurements of a quantum system and its wavefunction (a mathematical representation of a quantum state) correspond to reality. Conversely, the epistemological view suggests that this wavefunction is an illusion, concealing a richer reality beneath.

If we operate under the assumption that quantum systems possess no ulterior hidden features that impact the system beyond the wave function, the mathematics of PBR indicates we ought to comprehend phenomena ontically. This implies that quantum behavior is genuine, no matter how peculiar it appears. PBR tests function by comparing different quantum elements, such as qubits in a quantum computer, assessing how frequently they register consistent values for specific properties, like spin. If the epistemological perspective is accurate, the qubits will report identical values more often than quantum mechanics would suggest, implying that additional factors are at play.
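The heart of the PBR argument can be checked numerically. The sketch below uses the textbook PBR construction for the two preparations |0> and |+> (not the Cambridge team’s actual circuits): there is a four-outcome joint measurement in which each outcome has zero probability for one of the four joint preparations, something that would be impossible if the wavefunction merely reflected ignorance about a deeper reality.

```python
# Textbook core of the PBR argument for the states |0> and |+>: an entangled
# 4-outcome measurement exists in which each outcome never occurs for one of
# the four joint preparations, ruling out a purely epistemic wavefunction.
import numpy as np

s2 = np.sqrt(2)
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])
plus = (zero + one) / s2
minus = (zero - one) / s2

# The four PBR measurement vectors (they form an orthonormal basis)
xi = [
    (np.kron(zero, one) + np.kron(one, zero)) / s2,
    (np.kron(zero, minus) + np.kron(one, plus)) / s2,
    (np.kron(plus, one) + np.kron(minus, zero)) / s2,
    (np.kron(plus, minus) + np.kron(minus, plus)) / s2,
]

# The four joint preparations, ordered to match the outcomes above
preps = [np.kron(zero, zero), np.kron(zero, plus),
         np.kron(plus, zero), np.kron(plus, plus)]

for k in range(4):
    p = abs(xi[k] @ preps[k]) ** 2
    print(f"outcome {k} on preparation {k}: probability {p:.3f}")  # all 0.000
```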

Yang Songqinghao and his colleagues at the University of Cambridge have created a method to perform PBR tests on a functioning IBM Heron quantum computer. The findings reveal that if the number of qubits is minimal, it’s possible to assert that a quantum system is ontic. In essence, quantum mechanics appears to operate as anticipated, as consistently demonstrated by the Bell test.

Yang and his team executed this validation by evaluating the overall output from a pair or group of five qubits, such as a sequence of 1s and 0s, and determined the frequency at which this outcome aligned with predictions regarding the behavior of the quantum system, factoring in inherent errors.

“Currently, all quantum hardware is noisy and every operation introduces errors, so if we add this noise to the PBR threshold, what is the interpretation [of our system]?” remarks Yang. “We discovered that if we conduct the experiment on a small scale, we can fulfill the original PBR test and eliminate the epistemological interpretation.” In other words, the hidden variables vanish.

While they successfully demonstrated this for a limited number of qubits, they encountered difficulties replicating the same results for a larger set of qubits on a 156-qubit IBM machine. The error or noise present in the system becomes excessive, preventing researchers from distinguishing between the two scenarios in a PBR test.

This implies that the test cannot definitively determine whether the world is entirely quantum. At certain scales, the ontic view may dominate, yet at larger scales, the precise actions of quantum effects remain obscured.

Utilizing this test to validate the “quantum nature” of quantum computers could provide assurance that these machines not only function as intended but also enhance their potential for achieving quantum advantage: the capability to carry out tasks that would be impractically time-consuming for classical computers. “To obtain a quantum advantage, you must have quantum characteristics within your quantum computer. If not, you can discover a corresponding classical algorithm,” asserts team member Haom Yuan from Cambridge University.

“The concept of employing PBR as a benchmark for device efficacy is captivating,” notes Matthew Pusey at the University of York, UK, one of the original PBR authors. However, Pusey remains uncertain about its implications for reality. “The primary purpose of conducting experiments rather than relying solely on theory is to ascertain whether quantum theory can be erroneous. Yet, if quantum theory is indeed flawed, what questions does that raise? The entire framework of ontic and epistemic states presupposes quantum theory.”

Understanding Reality

To successfully conduct a PBR test, it is essential to devise a way of performing it without presuming that quantum theory is accurate. “A minority of individuals contend that quantum physics fundamentally fails at mesoscopic scales,” states Terry Rudolph, one of the PBR test’s originators, at Imperial College London. “This experiment might not be the one that rules out those particular proposals, but let me be straightforward: I am uncertain! Investigating fundamental aspects of quantum theory in progressively larger systems will always contribute to refining the search for alternative theories.”

Reference: arXiv DOI: arxiv.org/abs/2510.11213


Source: www.newscientist.com

Germanium Superconductors: A Key to Reliable Quantum Computing

Germanium is already utilized in standard computer chips

Matejimo/Getty Images

Superconductors made from germanium, a material traditionally used for computer chips, have the potential to revolutionize quantum computing by enhancing reliability and performance in the future.

Superconductors are materials that enable electricity to flow without resistance, making them ideal for various electrical applications, particularly in maintaining quantum coherence—essential for effective quantum computing.

Nonetheless, most superconductors have been specialized materials that are challenging to incorporate into computer chips. Peter Jacobson and his team at the University of Queensland, Australia, successfully developed a superconductor using germanium, a material already prevalent in the computing sector.

The researchers synthesized the superconductor by introducing gallium into a germanium film through a process called doping. Previous experiments in this area found instability in the resulting combination. To overcome this, the team utilized X-rays to infuse additional gallium into the material, achieving a stable and uniform structure.

However, similar to other known superconductors, this novel material requires cooling to a frigid 3.5 Kelvin (-270°C/-453°F) to function.

David Cardwell, a professor at the University of Cambridge, notes that while superconductors demand extremely low temperatures, making them less suitable for consumer devices, they could be ideally suited for quantum computing, which also necessitates supercooling.

“This could significantly impact quantum technology,” says Cardwell. “We’re already in a very cold environment, so this opens up a new level of functionality. I believe this is a clear starting point.”

Jacobson highlighted that previous attempts to stack superconductors atop semiconductors—critical components in computing—resulted in defects within their crystal structure, posing challenges for practical applications. “Disorder in quantum technology acts as a detrimental effect,” he states. “It absorbs the signal.”

In contrast, this innovative material enables the stacking of layers containing gallium-doped germanium and silicon while maintaining a uniform crystal structure, potentially paving the way for chips that combine the advantageous features of both semiconductors and superconductors.


Source: www.newscientist.com

Google Unveils Quantum Computers’ Ability to Unlock Molecular Structures


Google’s Quantum Computing Willow Chip

Google Quantum AI

Researchers at Google Quantum AI have leveraged Willow quantum computers to enhance the interpretation of data sourced from nuclear magnetic resonance (NMR) spectroscopy—an essential research method within chemistry and biology. This significant advancement may open new horizons for the application of quantum computing in various molecular technologies.

Quantum computers are best known for their potential in cryptography, but current devices are limited in scale and prone to errors, which makes them incapable of such decryption tasks. However, they show promise in expediting the discovery of new drugs and materials, problems that align with the fundamentally quantum nature of the processes involved. Hartmut Neven and colleagues at Google Quantum AI have now showcased one instance where quantum computers can mimic the complex interactions found in natural processes.

The investigation centered on a computational method known as quantum echo and its application to NMR, a technique utilized to extract detailed information regarding molecular structures.

At its core, the concept of quantum echoes is akin to the butterfly effect, the idea that minor perturbations, like the flap of a butterfly’s wings, can trigger substantial changes in a larger system. The researchers realized a quantum version of this effect in a system made up of 103 qubits on Willow.

During the experiment, the team executed a specific sequence of operations to alter the quantum state of the qubits in a controlled way. They then perturbed one chosen qubit, the “quantum butterfly,” and applied the same sequence of operations in reverse, effectively turning back time. Finally, the researchers measured the quantum characteristics of the qubits to extract insights about the entire system.
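As a rough illustration of that forward-perturb-reverse structure (a toy sketch, not Google’s actual protocol or circuits), the Python snippet below evolves a few simulated qubits with a random unitary standing in for the sequence of operations, flips one qubit as the “butterfly,” undoes the evolution and checks how far the state has moved from where it started; the qubit count and random seed are arbitrary.

```python
# Toy echo sketch (not Google's protocol): evolve forward, kick one qubit,
# evolve backward, and measure how much the small kick disturbed the echo.
import numpy as np

rng = np.random.default_rng(0)
n = 4                                    # number of simulated qubits (arbitrary)
dim = 2 ** n

# Random unitary standing in for the forward sequence of operations
a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(a)

# "Butterfly" perturbation: flip qubit 0
X = np.array([[0, 1], [1, 0]])
butterfly = np.kron(X, np.eye(dim // 2))

psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0                            # start in |00...0>

echoed = U.conj().T @ (butterfly @ (U @ psi0))   # forward, kick, backward
overlap = abs(np.vdot(psi0, echoed)) ** 2

print(overlap)   # well below 1: the single-qubit kick scrambles the echo
```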

In a basic sense, the NMR technique applied in the lab also hinges on minor disturbances; it nudges actual molecules using electromagnetic waves and examines the system’s reactions to ascertain atomic positions—similar to using a molecular ruler. If the operations on qubits can replicate this process, the mathematical scrutiny of the qubits can likewise be translated into molecular structural details. This series of quantum computations could potentially enable the examination of atoms that are relatively distant from one another, said team member Tom O’Brien. “We’re constructing longer molecular rulers.”

The researchers believe that a protocol akin to quantum echoes would require approximately 13,000 times longer on a conventional supercomputer. Their tests indicated that two distinct quantum systems could successfully perform a quantum echo and yield identical outcomes—a notable achievement given the inconsistencies faced in previous quantum algorithms supported by the team. O’Brien noted that enhancements in the quality of Willow’s hardware and reduced qubit error rates have contributed to this success.

Nonetheless, there remains ample opportunity for refinement. In their utilization of Willow and quantum echoes for two organic molecules, the researchers operated with a mere 15 qubits at most, yielding results comparable to traditional non-quantum methods. In essence, the team has not yet demonstrated a definitive practical edge for Willow over conventional systems. This current exhibition of quantum echo remains foundational and has not been subjected to formal peer review.

“Addressing molecular structure determination is crucial and pertinent,” states Keith Fratus from HQS Quantum Simulations, a German company focused on quantum algorithms. He emphasizes that bridging established techniques such as NMR with calculations executed by quantum computers represents a significant milestone, though the technology’s immediate utility might be confined to specialized research in biology.

Doris Sels, a professor at New York University, remarked that the Google team’s experiments involve larger quantum computers and more complex NMR protocols and molecules than prior work. “Quantum simulation is often highlighted as a promising application for quantum computers, yet there are surprisingly few examples with industrial relevance. I believe model inference of spectroscopic data like NMR could prove beneficial,” she added. “We’re not quite there, but initiatives like this inspire continued investigation into this issue.”

O’Brien expressed optimism that the application of quantum echo to NMR will become increasingly beneficial as they refine qubit performance. Fewer errors mean a greater capability to execute more operations simultaneously and accommodate larger molecular structures.

Meanwhile, the quest for optimal applications of quantum computers is ongoing. While the experimental implementation of quantum echoes on Willow is remarkable, the mathematical analysis it facilitates may not achieve widespread adoption, according to Kurt von Keyserlingk at King’s College London. Until NMR specialists pivot away from traditional methods cultivated over decades, he suggests that its primary allure will lie with theoretical physicists focused on fundamental quantum system research. Furthermore, this protocol may face competitive challenges from conventional computing methods, as von Keyserlingk has already pondered how traditional computing might rival this approach.



Source: www.newscientist.com

Google Celebrates Breakthrough: Quantum Computer Exceeds Supercomputer Performance

Google has announced a significant breakthrough in quantum computing, having developed an algorithm capable of performing tasks that traditional computers cannot achieve.

This algorithm, which serves as a set of instructions for guiding the operations of a quantum computer, has the ability to determine molecular structures, laying groundwork for potential breakthroughs in areas like medicine and materials science.

However, Google recognizes that the practical application of quantum computers is still several years away.

“This marks the first occasion in history when a quantum computer has successfully performed a verifiable algorithm that surpasses the power of a supercomputer,” Google stated in a blog post. “This repeatable, beyond-classical computation establishes the foundation for scalable verification and moves quantum computers closer to practical utilization.”

Michel Devoret, Google’s chief scientist for quantum AI, who recently received the Nobel Prize in Physics, remarked that this announcement represents yet another milestone in quantum developments. “This is a further advancement towards full-scale quantum computing,” he noted.

The algorithmic advancement, allowing quantum computers to function 13,000 times faster than classical counterparts, is documented in a peer-reviewed article published in the journal Nature.

One expert cautioned that while Google’s accomplishments are impressive, they revolve around a specific scientific challenge and may not translate to significant real-world benefits. Results for two molecules were validated using nuclear magnetic resonance (NMR), akin to MRI technology, yielding insights not typically provided by NMR.

Winfried Hensinger, a professor of quantum technology at the University of Sussex, mentioned that Google has achieved “quantum superiority”, indicating that researchers have utilized quantum computers for tasks unattainable by classical systems.

Nevertheless, fully fault-tolerant quantum computers—which could undertake some of the most exciting tasks in science—are still far from realization, as they would necessitate machines capable of hosting hundreds of thousands of qubits (the basic unit of information in quantum computing).

“It’s crucial to recognize that the task achieved by Google isn’t as groundbreaking as some world-changing applications anticipated from quantum computing,” Hensinger added. “However, it represents another compelling piece of evidence that quantum computers are steadily gaining power.”

A truly capable quantum computer able to address a variety of challenges would require millions of qubits, but current quantum hardware struggles to manage the inherent instability of qubits.

“Many of the most intriguing quantum computers being discussed necessitate millions or even billions of qubits,” Hensinger explained. “Achieving this is even more challenging with the type of hardware utilized by the authors of the Google paper, which demands cooling to extremely low temperatures.”

Hartmut Neven, Google’s vice president of engineering, stated that quantum computers may be five years away from practical application, pointing to advances in the algorithm referred to as Quantum Echo.


“We remain hopeful that within five years, Quantum Echo will enable real-world applications that are solely feasible with quantum computers,” he said.

As a leading AI company, Google also asserts that quantum computers can generate unique data capable of enhancing AI models, thereby increasing their effectiveness.

Traditional computers represent information in bits (denoted by 0 or 1) and send them as electrical signals. Text messages, emails, and even Netflix movies streamed on smartphones consist of these bits.

Contrarily, information in a quantum computer is represented by qubits. Found within compact chips, these qubits are particles like electrons or photons that can exist in multiple states simultaneously—a concept known as superposition in quantum physics.

This characteristic enables qubits to concurrently encode various combinations of 1s and 0s, allowing computation of vast numbers of different outcomes, an impossibility for classical computers. Nonetheless, maintaining this state requires a strictly controlled environment, free from electromagnetic interference, as disturbances can easily disrupt qubits.
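As a small illustration of that difference (not from the original article), the Python sketch below contrasts three classical bits, which hold exactly one of eight values, with three simulated qubits placed in an equal superposition, which carry an amplitude for every one of the eight values until a measurement picks one out.

```python
# Minimal illustration: 3 classical bits store one 3-bit value; 3 qubits in an
# equal superposition carry amplitudes for all eight values at once.
import numpy as np

classical_bits = (1, 0, 1)                      # exactly one 3-bit value

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
ket0 = np.array([1.0, 0.0])                     # |0>

state = np.array([1.0])
for _ in range(3):                              # each qubit -> (|0> + |1>)/sqrt(2)
    state = np.kron(state, H @ ket0)

print(state)                    # eight equal amplitudes of 1/sqrt(8)
print(np.abs(state) ** 2)       # measurement probabilities: 1/8 each
```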

Progress by companies like Google has led to calls for governments and industries to implement quantum-proof cryptography, as cybersecurity experts caution that these advancements have the potential to undermine sophisticated encryption.

Source: www.theguardian.com

Ultracold Atoms May Investigate Relativity in the Quantum Realm


Spinning ultracold atoms could uncover the limits of Einstein’s relativity

Shutterstock / Dmitriy Rybin

Small Ferris wheels made from light and extremely chilled particles could enable scientists to investigate elements of Albert Einstein’s theory of relativity on an extraordinary level.

Einstein’s special and general theories of relativity, established in the early 20th century, transformed our comprehension of time by showing that a moving clock can tick more slowly than a stationary one. The faster a clock moves, or the more strongly it accelerates, the larger this time dilation becomes. The same applies when an object moves in a circular path. While these effects have been observed with relatively large objects, Vassilis Rembesis and his team at King Saud University in Saudi Arabia have developed a method to test these principles on a far smaller scale.
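To give a sense of the sizes involved (the speeds below are arbitrary illustrative values, not numbers from the proposal), a clock circulating at speed v ticks slower by the Lorentz factor, which for small speeds amounts to a fractional frequency shift of roughly v squared over twice the speed of light squared.

```python
# Illustrative calculation (speeds are arbitrary, not from the paper): the
# leading-order fractional time-dilation shift for a clock moving at speed v
# is approximately v**2 / (2 * c**2).
c = 299_792_458.0                       # speed of light, m/s

for v in (0.13, 1.0, 100.0):            # illustrative rotation speeds, m/s
    shift = 0.5 * (v / c) ** 2          # leading-order value of gamma - 1
    print(f"v = {v:>6} m/s  ->  fractional shift ~ {shift:.1e}")
# A shift of order 1e-19, roughly the scale quoted below, corresponds to a
# rotation speed of only about 0.13 m/s.
```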

To examine rotation and time at the level of individual atoms and molecules, the team turned to the ultracold regime, just a few millionths of a degree above absolute zero. In this domain, the quantum behavior and motion of atoms and molecules can be meticulously controlled with laser beams and electromagnetic fields. In 2007, Rembesis and his colleagues devised a technique for tuning a laser beam to trap atoms in a circular arrangement, allowing them to spin. They refer to this as an “optical Ferris wheel,” and Rembesis says their new findings suggest it can be used to observe relativistic time dilation in ultracold particles.

Their predictions indicate that nitrogen molecules are optimal candidates for investigating rotational time delays at the quantum level. Treating the motion of the electrons within each molecule as the ticks of an internal clock, the researchers calculate that frequency shifts as small as one part in 10 quintillion could be detected.

Simultaneously, Rembesis noted that experiments utilizing optical Ferris wheels have been sparse up until now. This new proposal opens avenues for examining relativity theory in uncharted conditions where new or surprising phenomena may emerge. For instance, the quantum characteristics of ultracold particles may challenge the “clock hypothesis,” which states how a clock’s acceleration influences its ticking.

“It’s crucial to validate our interpretations of physical phenomena within nature. It’s often during unexpected occurrences that we need to reevaluate our understanding for a deeper insight into the universe. This research offers an alternative approach to examining relativistic systems, providing distinct advantages over traditional mechanical setups,” says Patrick Oberg from Heriot-Watt University, UK.

Relativistic phenomena, such as time dilation, generally necessitate exceedingly high velocities; however, optical Ferris wheels enable access to them without the need for impractically high speeds, he explains. Aidan Arnold from the University of Strathclyde, UK adds, “With the remarkable accuracy of atomic clocks, the time difference ‘experienced’ by the atoms in the Ferris wheel should be significant. Because the accelerated atoms remain in close proximity, there is ample opportunity to measure this difference,” he states.

By adjusting the focus of the laser beam, it may also become feasible to manipulate the dimensions of the Ferris wheel that confines the particles, allowing researchers to explore time-delay effects for various rotations, as noted by Rembesis. Nevertheless, technical challenges persist, including the need to ensure that atoms and molecules do not heat up and become uncontrollable during rotation.


Source: www.newscientist.com

Challenging Calculations: Quantum Computers May Struggle with ‘Nightmare’ Problems


Certain problems remain insurmountable for quantum computers.

Jaroslav Kushta/Getty Images

Researchers have uncovered a “nightmare scenario” computation tied to a rare form of quantum material that remains unsolvable, even with the most advanced quantum computers.

In contrast to the simpler task of determining the phase of standard matter, such as identifying whether water is in a solid or liquid state, the quantum equivalent can prove exceedingly challenging. Thomas Schuster and his team at the California Institute of Technology have demonstrated that identifying the quantum phase of matter can be notably difficult, even for quantum machines.

They mathematically examined a scenario in which a quantum computer receives a set of measurements of the quantum state of an object and must determine its phase. Schuster said this is not necessarily an impossible task, but his team showed that, for a considerable number of quantum phases of matter, distinctions far subtler than that between liquid water and ice, including unusual “topological” phases that exhibit strange electrical currents, a quantum computer might need to run its computations for extremely protracted periods. This mirrors a worst-case scenario in laboratory settings, where instruments might need to operate for billions or even trillions of years to discern the characteristics of a sample.

This doesn’t imply that quantum computers are rendered obsolete for this analysis. As Schuster noted, these phases are unlikely to manifest in actual experiments involving materials or quantum systems, serving more as an indicator of our current limitations in understanding quantum computers than posing an immediate practical concern. “They’re like nightmare scenarios. It would be quite unfortunate if such a case arose. It probably won’t happen, but we need to improve our comprehension,” he stated.

Bill Fefferman from the University of Chicago raised intriguing questions regarding the overall capabilities of computers. “This might illuminate the broader limits of computation: while substantial speed improvements have been realized for specific tasks, there will inevitably be challenges that remain too daunting, even for efficient quantum computers,” he asserted.

Mathematically, he explained, this new research merges concepts from quantum information science employed in quantum cryptography with foundational principles from materials physics, potentially aiding progress in both domains.

Looking ahead, the researchers aspire to broaden their analysis to encompass more energetic or excited quantum phases of matter, which are recognized as challenging for wider calculations.


Source: www.newscientist.com

What Makes Quantum Computers So Powerful?

3D rendering of a quantum computer’s chandelier-like structure

Shutterstock / Phong Lamai Photography

Eleven years ago, I began my PhD in theoretical physics and, honestly, had never given much thought to quantum computers, let alone written about them. Meanwhile, New Scientist, always ahead of its time, was busy crafting its first “Quantum Computer Buyer’s Guide.” A glance through it reveals how much has changed: John Martinis from UC Santa Barbara, who earned a Nobel Prize in Physics just last week, was featured for developing an array of merely nine qubits. Curiously, there was no mention of quantum computers built from neutral atoms, which have rapidly transformed the field in recent years. This sparked my curiosity: what would a quantum computer buyer’s guide look like today?

At present, around 80 companies globally are producing quantum computing hardware. My reporting on quantum computing has allowed me to witness firsthand how the industry evolves, complete with numerous sales pitches. If choosing between an iPhone and an Android is challenging, consider navigating the press lists of various quantum computing startups.

While there’s significant marketing hype, the difficulty in comparing these devices stems from the lack of a settled standard for how to build a quantum computer. Potential qubit options include superconducting circuits, cryogenic ions and light, among others. With such diverse components, how does one assess their differences? The answer is to home in on measures of each quantum computer’s performance.

This marks a shift from the early days, where success was measured by the number of qubits—the foundational elements of quantum information processing. Many research teams have surpassed the 1000-qubit threshold, and the trajectory for achieving even more qubits appears to be becoming clearer. Researchers are exploring standard manufacturing methods, such as creating silicon-based qubits, and leveraging AI to enhance the size and capabilities of quantum computers.

Ideally, more qubits should always translate to greater computational power, enabling quantum computers to tackle increasingly complex challenges. However, in reality, ensuring each additional qubit doesn’t impede the performance of existing ones presents significant technical hurdles. Thus, it’s not just the number of qubits that counts, but how much information they can retain and how effectively they can communicate without losing data accuracy. A quantum computer could boast millions of qubits, but if they’re susceptible to errors that disrupt computations, they become virtually ineffective.

The extent of this noise can be captured by metrics like “gate fidelity,” which reflects how accurately a qubit or pair of qubits can perform operations, and “coherence time,” which gauges how long a qubit can maintain a viable quantum state. Even with favorable numbers on both counts, we must also consider the intricacies of getting data into a quantum computer and retrieving the results. The growth of the quantum computing industry is partly attributable to the emergence of companies focused on qubit control and on interfacing quantum internals with non-quantum users. A thorough buyer’s guide for quantum computers in 2025 would have to cover these essential add-ons: choosing a qubit also means selecting a qubit control system and an error correction scheme. I recently spoke with a researcher developing an operating system for quantum computers, who suggested that such systems may become a necessity in the near future.

If I were drawing up a short-term wish list, I would want a machine capable of executing at least a million operations: a million-step quantum computing program run with low error rates and robust error correction. John Preskill at the California Institute of Technology calls this a “megaquop” machine. Last year, he expressed confidence that such machines would be fault-tolerant and powerful enough to yield scientifically significant discoveries. We aren’t there yet: the quantum computers at our disposal currently manage tens of thousands of operations, and error correction has only been convincingly demonstrated for smaller tasks.
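To put that requirement in perspective, here is a minimal back-of-the-envelope sketch in Python, assuming each operation fails independently with some fixed probability; the error rates used are illustrative assumptions, not figures from Preskill or from any real device.

```python
# Rough arithmetic for a "megaquop" machine: if each operation fails
# independently with probability p, the chance that an N-step program
# runs without a single error is (1 - p)**N. The error rates below are
# illustrative assumptions, not measured values from any real device.

def error_free_probability(p_error: float, n_ops: int) -> float:
    """Probability that n_ops operations all succeed, given a per-operation error rate."""
    return (1.0 - p_error) ** n_ops

N_OPS = 1_000_000  # a million-step program

for p in (1e-3, 1e-6, 1e-9):
    print(f"per-op error {p:.0e}: P(no errors in {N_OPS} ops) = "
          f"{error_free_probability(p, N_OPS):.3f}")

# Roughly: at 1e-3 the program essentially always fails, at 1e-6 it
# succeeds about a third of the time, at 1e-9 it almost always succeeds.
# This is why a million useful operations demands either hardware error
# rates far below one in a million, or error correction that pushes the
# effective (logical) error rate down to that level.
```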

Quantum computers today are akin to adolescents—growing toward utility but still faced with developmental challenges. As a result, the question I frequently pose to quantum computer vendors is, “What can this machine actually accomplish?”

In this regard, it’s vital to compare not only various types of quantum computers but also contrast them with classical counterparts. Quantum hardware is costly and complex to manufacture, so when is it genuinely the sole viable solution for a given issue?

One way to approach this question is to identify calculations that classical computers cannot complete in any reasonable amount of time. This is the idea behind “quantum supremacy”, and it keeps quantum engineers and mathematicians busy. Demonstrations of quantum supremacy do exist, but they come with caveats. To be meaningful, the chosen task must be useful, it must run on machines we can actually build, and it must be provably hard enough that mathematicians can be confident no classical computer could keep up.

In 1994, physicist Peter Shor devised a quantum algorithm for factoring large numbers, a technique that could compromise the encryption methods widely used by banks around the world. A sufficiently large quantum computer that could manage its own errors might run this algorithm, yet mathematicians have never convincingly demonstrated that classical computers can’t factor large numbers efficiently. The most prominent claims of quantum supremacy tend to fall into this gray area, and some have eventually been matched or outperformed by classical machines. For now, demonstrations of quantum supremacy serve primarily as confirmation that the computers performing them really are quantum.

Conversely, in the mathematical discipline of “query complexity”, the superiority of quantum approaches can be rigorously proved, but practical algorithms remain elusive. Recent experiments have also introduced the notion of “quantum information superiority”, in which quantum computers solve tasks using fewer qubits than the bits a classical computer would need, a comparison of physical resources rather than time. This sounds promising, since it suggests quantum computers could prove their worth without extensive scaling, but it is not by itself a reason to buy one: the tasks in question generally lack pivotal real-world applications.

It’s undeniable that several real-world challenges look well suited to quantum algorithms, such as understanding molecular properties relevant to agriculture or medicine, or solving logistics problems like flight scheduling. Yet researchers do not know this for certain, and often have to settle for saying “it seems”.

For instance, recent research on the prospective applications of quantum computing in genomics by Aurora Maurizio at the San Raffaele Scientific Institute in Italy and Guglielmo Mazzola at the University of Zurich suggests that traditional computing methods perform so well that “quantum computing may, in the near future, only yield speedups for a specific subset of sufficiently complex tasks”. Their findings indicate that while quantum computers could potentially aid research into combinatorial problems in genomics, any such application would need to be carefully targeted.

In reality, for many problems that were not designed specifically to demonstrate quantum supremacy, there is a spectrum of what counts as “fast”. Quantum computers may eventually run some algorithms faster than classical machines once noise and other technical challenges are overcome, but that speed may not always offset the hardware’s significant costs. For example, the second-best-known quantum algorithm, Grover’s search algorithm, offers only a quadratic speedup: it reduces the number of steps to roughly the square root of what a classical search requires, rather than an exponential saving. Ultimately, how fast is “fast enough” to justify the transition to quantum computing may depend on the individual buyer.
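To make the scale of a square-root speedup concrete, here is a minimal sketch comparing query counts for unstructured search; the database sizes are arbitrary illustrations, and the constant factors are the textbook approximations rather than measurements from any machine.

```python
# Grover's search offers a quadratic (square-root) speedup: finding one
# marked item among N requires about N/2 queries classically on average,
# but only on the order of (pi/4) * sqrt(N) queries on a quantum computer.
# The database sizes below are arbitrary, chosen only for illustration.
import math

def classical_queries(n: int) -> float:
    """Expected queries for unstructured classical search (about half the database)."""
    return n / 2

def grover_queries(n: int) -> float:
    """Approximate optimal number of Grover iterations, ~ (pi/4) * sqrt(n)."""
    return (math.pi / 4) * math.sqrt(n)

for n in (10**6, 10**9, 10**12):
    print(f"N = {n:>14,}: classical ~{classical_queries(n):>14,.0f} queries, "
          f"Grover ~{grover_queries(n):>10,.0f} queries")

# The gap grows with N, but it is polynomial rather than exponential --
# which is why a square-root speedup may or may not justify the hardware cost.
```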

While it’s frustrating to include this in a purported buyer’s guide, my discussions with experts indicate that there remains far more uncertainty about what quantum computers can achieve than established knowledge. Quantum computing is an intricate, costly future technology; however, its genuine added value to our lives remains vague beyond serving the financial interests of a select few companies. This might not be satisfying, but it reflects the unique, uncharted territory of quantum computing.

For those of you reading this out of the desire to invest in a powerful, reliable quantum computer, I encourage you to proceed and let your local quantum algorithm enthusiast experiment with it. They may offer better insights in the years to come.

Source: www.newscientist.com

Nobel Prize in Physics Awarded to Trio Pioneering Quantum Computing Chips

John Clarke, Michel Devoret and John Martinis awarded the 2025 Nobel Prize in Physics

Jonathan Nackstrand/AFP via Getty Images

The prestigious 2025 Nobel Prize in Physics was awarded to John Clarke, Michel Devoret and John Martinis. Their research revealed how quantum particles can tunnel through barriers in macroscopic electrical circuits, a process that underpins the superconducting quantum technology integral to modern quantum computers.

“I was completely caught off guard,” Clarke remarked upon hearing the news from the Nobel Committee. “This outcome was unimaginable; it felt like a dream to be considered for the Nobel Prize.”

Quantum particles exhibit numerous peculiar behaviors, including their probabilistic nature and their restriction to specific energy levels rather than a continuous range. These properties sometimes lead to startling effects, such as tunneling through barriers that should be impenetrable. Such unusual characteristics were first revealed by pioneers like Erwin Schrödinger in the early years of quantum mechanics.

The implications of these discoveries are profound, for instance in explaining phenomena such as nuclear decay; however, earlier research was limited to individual particles and simple systems. It remained uncertain whether more intricate systems such as electronic circuits, conventionally described by classical physics, also obeyed these principles. Quantum tunneling, for example, seemed to vanish in larger systems.

In 1985, the trio, then at the University of California, Berkeley, set out to change that. Clarke, Martinis and Devoret investigated how charge moves through a superconducting circuit element known as a Josephson junction, a device that earned British physicist Brian Josephson the 1973 Nobel Prize in Physics. These junctions consist of superconducting wires, which carry current with zero electrical resistance, separated by a thin insulating barrier.

The researchers demonstrated that charge moving through the junction behaved as a single quantum entity, occupying distinct energy levels and registering a voltage even though it lacked the energy to surmount the barrier classically, a clear sign of quantum tunneling at the scale of an electrical circuit.

This groundbreaking discovery significantly deepened our understanding of how to harness similar superconducting quantum systems, transforming the landscape of quantum science and enabling other scientists to conduct precise quantum physics experiments on silicon chips.

Moreover, superconducting quantum circuits became foundational to the essential components of quantum computers, known as qubits. Developed by companies like Google and IBM, the most advanced quantum computers today consist of hundreds of superconducting qubits, a result of the insights gained from Clarke, Martinis and Devoret’s research. “In many respects, our findings serve as the cornerstone of quantum computing,” stated Clarke.

Both Martinis and Devoret went on to work with Google Quantum AI, which in 2019 unveiled the first superconducting quantum computer to demonstrate quantum advantage over traditional machines. However, Clarke noted to the Nobel Committee that it was surprising to consider the extent of the impact their 1985 study has had. “Who could have imagined that this discovery would hold such immense significance?”

Topics:

  • Nobel Prize /
  • Quantum Computing

Source: www.newscientist.com

Ultracold Clock Sheds Light on Quantum Physics’ Impact on Time


What is the quantum nature of time? We may be on the verge of discovering it

Quality Stock / Alamy

How does time pass for a genuinely quantum object? The most advanced clocks may soon be able to answer this question, letting us test how manipulating the quantum realm alters timekeeping and probe unexplored territory in physics.

The notion that time can stretch comes from Albert Einstein’s special theory of relativity: as an object approaches the speed of light, it experiences time more slowly than a stationary observer does. Einstein later expanded on this with his general theory of relativity, which shows a similar temporal distortion in the presence of a gravitational field. Igor Pikovski at the Stevens Institute of Technology in New Jersey and his team aim to uncover whether a similar effect occurs in the microscopic quantum world, using ultracold clocks built from ions.
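To ground the relativistic effect being described, here is a minimal Python sketch of the standard special-relativistic time-dilation factor; the speeds are arbitrary illustrations, not values relevant to the ion-clock experiments discussed below.

```python
# Special-relativistic time dilation: a clock moving at speed v ticks
# slower by the Lorentz factor gamma = 1 / sqrt(1 - v**2 / c**2).
# The speeds below are arbitrary illustrations, not experimental values.
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_gamma(v: float) -> float:
    """Time-dilation factor for a clock moving at speed v (in m/s)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

for fraction in (0.01, 0.5, 0.99):
    v = fraction * C
    print(f"v = {fraction:.2f} c: moving clock runs slow by a factor of {lorentz_gamma(v):.4f}")
```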

“The experiments we’ve performed until now have always focused on classical time, disregarding quantum mechanics,” says Pikovski. “We’ve observed a regime where conventional explanations falter with an ion clock,” he continues.

These clocks consist of thousands of ions cooled to temperatures near absolute zero using lasers. At such low temperatures, the quantum state of an ion and of the electrons it contains can be precisely controlled with electromagnetic forces. The ticks of an ion clock are set by electrons oscillating between two distinct quantum states.

Because their behavior is dictated by quantum mechanics, these instruments provide an ideal platform for Pikovski and his colleagues to investigate how relativistic and quantum phenomena interact in timekeeping. Pikovski says they have identified several scenarios where this blending shows up.

One example arises from the fluctuations intrinsic to quantum physics. Even at ultra-low temperatures, quantum objects cannot be completely still; they must oscillate, randomly gaining or losing energy. The team’s calculations indicate that these fluctuations subtly shift the time the clock records. Although the effect is minute, it should be detectable in current ion clock experiments.

The researchers also analyzed mathematically how the ions in a clock behave when “compressed” into superpositions of multiple quantum states. They found that these states become closely linked to the motion of the ions, which is influenced by their internal electrons: the states of the ions and the electrons become interconnected at a quantum level. “Typically, experiments necessitate creative methods to establish entanglement. The intriguing aspect here is that it arises organically,” explains team member Christian Sanner at Colorado State University.

Pikovski says it is intuitive that a quantum object in a superposition cannot experience time in a single, well-defined way, though this effect has yet to be confirmed experimentally. He believes such a confirmation should be achievable in the near future.

Team member Gabriel Solch, also at the Stevens Institute of Technology, says the next step is to incorporate another crucial aspect of modern physics: gravity. Ultracold clocks can already detect the time dilation caused by tiny variations in Earth’s gravitational pull, such as when a clock is raised by just a few millimeters, but exactly how these effects combine with the intrinsic quantum characteristics of the clock remains an open question.
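To convey how small that millimeter-scale effect is, here is a minimal sketch using the standard weak-field estimate for the fractional frequency shift, g·Δh/c²; the height values are illustrative choices, not the specific configurations used in any of the experiments mentioned.

```python
# Gravitational time dilation in the weak-field limit: raising a clock by
# a height dh changes its tick rate by a fractional amount of roughly
# g * dh / c**2. The heights below are illustrative choices only.

G_EARTH = 9.81           # surface gravitational acceleration, m/s^2
C = 299_792_458.0        # speed of light, m/s

def fractional_shift(dh_meters: float) -> float:
    """Approximate fractional frequency shift for a clock raised by dh_meters."""
    return G_EARTH * dh_meters / C ** 2

for dh in (1e-3, 1.0, 1000.0):  # 1 mm, 1 m, 1 km
    print(f"height change {dh:>8} m -> fractional shift ~ {fractional_shift(dh):.2e}")

# A 1 mm lift corresponds to a shift of about 1e-19, close to the
# precision frontier of today's best optical and ion clocks.
```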

“I believe it is quite feasible with our existing technology,” adds David Hume at the US National Institute of Standards and Technology in Colorado. He says the primary challenge is to suppress ambient disturbances affecting the clock so that they don’t swamp the effects predicted by Pikovski’s team. Successful experiments could pave the way for exploring unprecedented physical phenomena.

“Such experiments are thrilling because they create a platform for theories to interact in a domain where they could yield fresh insights,” remarks Alexander Smith at St. Anselm College, New Hampshire.

Source: www.newscientist.com