Quantum Computers Require Classical Computing for Real-World Applications

Quantum Machines’ Jonathan Cohen presenting at the AQC25 conference

Quantum Machines

Classical computers are emerging as a critical component in maximizing the functionality of quantum computers. This was a key takeaway from this month’s assembly of researchers who emphasized that classical systems are vital for managing quantum computers, interpreting their outputs, and enhancing future quantum computing methodologies.

Quantum computers operate on qubits—quantum entities manifesting as extremely cold atoms or miniature superconducting circuits. The computational capability of a quantum computer scales with the number of qubits it possesses.

Yet, qubits are delicate and necessitate meticulous tuning, oversight, and governance. Should these conditions not be met, the computations conducted may yield inaccuracies, rendering the devices less efficient. To manage qubits effectively, researchers utilize classical computing methods. The AQC25 conference held on November 14th in Boston, Massachusetts, addressed these challenges.

Sponsored by Quantum Machines, a company specializing in controllers for various qubit types, the AQC25 conference gathered over 150 experts, including quantum computing scholars and CEOs from AI startups. Through numerous presentations, attendees elaborated on the enabling technologies vital for the future of quantum computing and how classical computing sometimes acts as a constraint.

According to Shane Caldwell, fault-tolerant quantum computers capable of tackling practical problems are only expected to materialize alongside a robust classical computing framework that operates at petascale, comparable to today’s leading supercomputers. Although Nvidia does not produce quantum hardware, it has recently introduced a system that links quantum processors (QPUs) to the traditional GPUs commonly employed in machine learning and high-performance scientific computing.

Even when everything works, the raw results from a quantum computer are a record of quantum properties of the qubits. To be useful, this data must be translated into conventional formats, a process that again relies on classical computing resources.

Pooya Lonar from Vancouver-based startup 1Qbit discussed this translation process and its implications, noting that the performance speed of fault-tolerant quantum computers can often hinge on the operational efficiency of classical components such as controllers and decoders. This means that whether a sophisticated quantum machine operates for hours or days to solve a problem might depend significantly on its classical components.

In another presentation, Benjamin Lienhardt from the Walther Meissner Institute for Low Temperature Research in Germany presented findings on how traditional machine learning algorithms can facilitate the interpretation of quantum states in superconducting qubits. Similarly, Mark Saffman from the University of Wisconsin-Madison highlighted the use of classical neural networks to enhance the readout of qubits derived from ultra-cold atoms. Speakers broadly agreed that non-quantum devices are instrumental in unlocking the potential of various qubit types.

IBM’s Blake Johnson shared insights into a classical decoder his team is developing as part of an ambitious plan to create a quantum supercomputer by 2029. This endeavor will employ unconventional error correction strategies, making the efficient decoding process a significant hurdle.

“As we progress, the trend will shift increasingly towards classical [computing]. The closer one approaches the QPU, the more you can optimize your system’s overall performance,” stated Jonathan Cohen from Quantum Machines.

Classical computing is also instrumental in assessing the design and functionality of future quantum systems. For instance, Izhar Medalcy, co-founder of the startup Quantum Elements, discussed how an AI-powered virtual model of a quantum computer, often referred to as a “digital twin,” can inform actual hardware design decisions.

Representatives from the Quantum Scaling Alliance, co-led by 2025 Nobel Laureate John Martinis, were also present at the conference. This reflects the importance of collaboration between quantum and classical computing realms, bringing together qubit developers, traditional computing giants like Hewlett Packard Enterprise, and computational materials specialists such as the software company Synopsys.

The collective sentiment at the conference was unmistakable. The future of quantum computing is on the horizon, bolstered significantly by experts who have excelled in classical computing environments.


Source: www.newscientist.com

Quantum 2.0 Review: An Ambitious and Entertaining Exploration of Quantum Physics, Though Slightly Exaggerated

Quantum 2.0 explores the boundaries of our understanding of the quantum realm

Richard Keil/Science Photo Library

Quantum 2.0
Paul Davies; Penguin (UK, 27 November); University of Chicago Press (US, February 2026)

In his book Quantum 2.0: The Past, Present, and Future of Quantum Physics, physicist Paul Davies concludes with a beautiful reflection: “To grasp the quantum world is to catch a glimpse of the grandeur and elegance of the physical universe and our role within it.”

This enchanting, romantic viewpoint resonates throughout the text. Quantum 2.0 is a bold attempt to elucidate the fringes of the quantum universe, with Davies as an informed and passionate storyteller. His enthusiasm occasionally edges into exaggeration, however, with polished prose sometimes compensating where more direct qualifications would have been fitting.

Davies’ book is quite accessible, despite its ambitious aim of covering nearly every facet of quantum physics. He addresses quantum technologies in computing, communications, and sensing, touches on quantum biology and cosmology, and manages to explore various competing interpretations of quantum theory.

There are no equations in Quantum 2.0, and while some technical diagrams and schematics are included, they do not detract from the reading experience.

As a writer on quantum physics myself, I appreciate how clearly Davies articulates the experiments and protocols involved in quantum information processing and encryption—a challenging task to convey.

As a navigator through the quantum realm, Davies serves as a delightful and amiable companion. His genuine curiosity and excitement are palpable. Yet, this exuberance doesn’t always align with the rigor that contemporary quantum physics research demands. In my view, most quantum-related excitement should come with cautionary notes.



For instance, within the first 100 pages, Davies asserts that quantum computers could enhance climate modeling—an assertion not widely accepted among computer scientists and mathematicians, especially concerning near-future machines.

In another section regarding quantum sensors, he mentions manufacturers proposing their utility in evaluating conditions like epilepsy, schizophrenia, and autism. I anticipated a justification or insights from experts outside the sensor industry, but the ensuing discussion was lacking in depth and critical analysis.

Additionally, the example Davies provides to demonstrate quantum computers’ advantages over classical ones dates back several years.

Less experienced readers in quantum research may find some of Davies’s speculative statements misleading, although the book remains an engaging read. This is underscored by bold assertions such as, “Whoever masters Quantum 2.0 will certainly control the world.”

To clarify, I don’t dispute Davies’ sentiments. Many gadgets that influence our lives currently depend on quantum physics, and the future may usher in even more quantized technology. I support this notion.

Emerging fields, such as quantum biology and better integration of quantum and cosmological theories, also seem poised for significant breakthroughs. Just ask the numerous researchers diligently working toward a theory of quantum gravity.

However, conveying this future to newcomers necessitates a blend of precision and subtlety in storytelling and writing.

Otherwise, the outcome may lead to disappointment.


Source: www.newscientist.com

Quantum Computers with Recyclable Qubits: A Solution for Reducing Errors

Internal optics of Atom Computing’s AC1000 system

Atom Computing

Quantum computers, utilizing qubits formed from extremely cold atoms, are rapidly increasing in size and may soon surpass classical computers in computational power. However, the frequency of errors poses a significant challenge to their practicality. Researchers have now found a way to replenish and recycle these qubits, enhancing computation reliability.

All existing quantum systems are susceptible to errors and are currently unable to perform calculations that would give them an edge over traditional computers. Nonetheless, researchers are making notable advancements in the creation of error correction methods to address this issue.

One approach involves dividing the components of quantum computers, known as qubits, into two primary categories: operational qubits that manipulate data and auxiliary qubits that monitor errors.

Developing large quantities of high-quality qubits for either function remains a significant technical hurdle. Matt Norcia and his team at Atom Computing have discovered a method to lessen the qubit requirement by recycling or substituting auxiliary qubits. They demonstrated that an error-tracking qubit can be effectively reused for up to 41 consecutive runs.

“The calculation’s duration is likely to necessitate numerous rounds of measurement. Ideally, we want to reuse qubits across these rounds, minimizing the need for a continuous influx of new qubits,” Norcia explains.
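To see why reuse matters, consider a toy model, my own sketch of a three-qubit bit-flip repetition code rather than Atom Computing’s actual protocol: two ancilla qubits are measured and reset every round, instead of two fresh atoms being loaded per round.

```python
import random

def measure_syndrome(data):
    """Toy parity checks for a 3-qubit bit-flip code: each ancilla reports
    whether a pair of data qubits disagrees, signalling a likely flip."""
    return data[0] ^ data[1], data[1] ^ data[2]

def run_rounds(n_rounds=41, p_flip=0.02, reuse_ancillas=True, seed=7):
    random.seed(seed)
    data = [0, 0, 0]                  # three data qubits encoding logical 0
    fresh_ancillas = 0
    for _ in range(n_rounds):
        data = [b ^ (random.random() < p_flip) for b in data]   # noise
        s1, s2 = measure_syndrome(data)                         # error tracking
        if s1 and not s2:
            data[0] ^= 1              # decode: correct the most likely flip
        elif s1 and s2:
            data[1] ^= 1
        elif s2:
            data[2] ^= 1
        # Measurement consumes the ancillas' quantum states. Resetting them
        # in place means 2 ancillas serve all rounds; otherwise 2 fresh
        # atoms must be loaded every round.
        fresh_ancillas += 0 if reuse_ancillas else 2
    return data, fresh_ancillas

print(run_rounds(reuse_ancillas=True))    # logical state, 0 fresh ancillas
print(run_rounds(reuse_ancillas=False))   # same physics, 82 fresh ancillas
```

The error-correction physics is identical in both runs; the only difference is how many freshly prepared ancillas the machine must supply, which is exactly the resource Norcia’s scheme economizes.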

The team utilized qubits derived from electrically neutral ytterbium atoms that were chilled close to absolute zero using lasers and electromagnetic pulses. By employing “optical tweezers,” they can manipulate each atom’s quantum state, which encodes information. This method also allowed them to divide the quantum computer into three distinct zones.

In the first zone, 128 optical tweezers directed the qubits conducting calculations. The second zone comprised 80 tweezers holding qubits for error tracking, or ready to be swapped in for faulty qubits. The third zone functioned as a storage area, keeping an additional 75 qubits in reserve. The latter two zones enabled researchers to reset or exchange auxiliary qubits as needed.

Norcia noted that this setup was challenging to establish because stray laser light interfered with nearby qubits. Researchers had to develop highly precise laser controls and a method of adjusting the state of data qubits so they remained “hidden” from specific harmful kinds of light.

“Ancilla reuse is crucial for advancing quantum computing,” says Yuval Boger from QuEra, a U.S. quantum computing firm. Without this ability, even basic calculations would necessitate millions, or even billions, of qubits, making them impractical for current or forthcoming quantum hardware, he adds.

This challenge is recognized widely across the atom-based qubit research community. “Everyone in neutral atoms understands the necessity to reset and reload during calculations,” Norcia asserts.

For instance, Boger highlights that a team from Harvard and MIT employed similar techniques to keep a quantum computer of 3000 ultra-cold rubidium atoms operating for several hours. Other quantum setups, such as Quantinuum’s recently launched Helios machine, which uses ions controlled by light as qubits, also feature qubit reuse.


Source: www.newscientist.com

IBM Introduces Two Quantum Computers with Unmatched Complexity

IBM researchers hold components of the Loon quantum computer

IBM

In the competitive landscape of developing error-resistant quantum supercomputers, IBM is adopting a unique approach distinct from its primary rivals. The company has recently unveiled two new quantum computing models, dubbed Nighthawk and Loon, which may validate its methodology and deliver the advancements essential for transforming next-gen devices into practical tools.

IBM’s design for quantum supercomputers is modular, emphasizing the innovation of connecting superconducting qubits both within and across different quantum units. When this interconnectivity was first proposed, some researchers expressed skepticism about its feasibility. Jay Gambetta from IBM noted that critics implied to the team, “You exist in a theoretical realm; achieving this is impossible,” which they aim to refute.

Within Loon, every qubit interlinks with six others, with connections that run vertically as well as laterally, a degree of connectivity not previously demonstrated in superconducting quantum systems. Nighthawk, by contrast, implements four-way connections among qubits.

This enhanced connectivity may be pivotal in tackling some of the most pressing issues encountered by current quantum computers. The advancements could boost computational capabilities and reduce error rates. Gambetta indicated that initial tests with Nighthawk demonstrated the ability to execute quantum programs that are 30% more complex than those on most other quantum computers in use today. Such an increase in complexity is expected to facilitate further advancements in quantum computing applications, with IBM’s earlier models already finding utility in fields like chemistry.

The industry’s ultimate objective remains the ability to cluster qubits into error-free “logical qubits.” IBM is promoting strategies that necessitate smaller groupings than those pursued by competitors like Google. This could permit IBM to realize error-free computation while sidestepping some of the financial and engineering hurdles associated with creating millions of qubits. Nonetheless, this goal hinges on the connectivity standards achieved with Loon, as stated by Gambetta.

Stephen Bartlett, a researcher at the University of Sydney in Australia, expressed enthusiasm about the enhanced qubit connectivity but noted that further testing and benchmarking of the new systems are required. “While this is not a panacea for scaling superconducting devices to a size capable of supporting genuinely useful algorithms, it represents a significant advancement,” he remarked.

However, there remain several engineering and physical challenges on the horizon. One crucial task is to identify the most effective method for reading the output of a quantum computer after calculations, an area where Gambetta mentioned recent IBM progress. The team, led by Matthias Steffen, also aims to enhance the “coherence time” for each qubit. This measure indicates how long a quantum state remains valid for computational purposes, but the introduction of new connections can often degrade this quantum state. Additionally, they are developing techniques to reset certain qubits while computations are ongoing.

Plans are in place for IBM to launch a modular quantum computer in 2026 capable of both storing and processing information, with future tests on Loon and Nighthawk expected to provide deeper insights.


Source: www.newscientist.com

Helios 1: A Groundbreaking Quantum Computer Poised to Tackle Superconductivity Challenges

Helios-1 Quantum Computing Chip

Quantinuum

At Quantinuum, researchers have harnessed the capabilities of the Helios-1 quantum computer to simulate a mathematical model traditionally used to analyze superconductivity. While classical computers can perform these simulations, this breakthrough indicates that quantum technology may soon become invaluable in the realm of materials science.

Superconductors can transmit electricity flawlessly, yet they operate only at exceedingly low temperatures, which limits their practicality. For decades, physicists have sought to modify the structural characteristics of superconductors to enable functionality at room temperature, and many believe the solution lies within a mathematical framework known as the Fermi-Hubbard model, which Quantinuum researchers such as Henrik Dreyer regard as a cornerstone of condensed matter physics.
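For readers who want the underlying mathematics, the model’s standard textbook form (the paper may use a variant) is

$$ H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow}, $$

where $c^{\dagger}_{i\sigma}$ creates a fermion with spin $\sigma$ on lattice site $i$, $n_{i\sigma}$ counts such fermions, $t$ sets how readily particles hop between neighbouring sites, and $U$ is the energy cost of two fermions sharing a site. The competition between $t$ and $U$ is what makes the model rich enough to capture superconductivity-like physics, and so hard to simulate classically.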

Traditional computers can simulate the Fermi-Hubbard model at modest sizes, but they struggle with large samples and with rapidly changing material properties. Quantum computers like Helios-1 are poised to excel in exactly those regimes. Dreyer and colleagues achieved a milestone by conducting the most extensive simulation of the Fermi-Hubbard model yet performed on a quantum platform.

The team employed Helios-1, which operates with 98 qubits derived from barium ions, manipulated using lasers and electromagnetic fields to execute the simulations. By steering the qubits through various quantum states, they collected data on their properties. Their simulation encompassed 36 fermions, the class of particles that the Fermi-Hubbard model describes and to which the electrons in superconductors belong.

Past experiments show that fermions must form pairs for superconductors to function, an effect that can be induced by laser light. The Quantinuum team modeled this scenario, applying laser pulses to the qubits and measuring the resulting states to detect signs of particle pairing. Although the simulation didn’t replicate the experiment precisely, it captured key dynamic processes that are often challenging to model using traditional computational methods with larger particle numbers.

Dreyer said that while the experiment does not definitively establish an advantage for Helios-1 over classical computing, it gives the team confidence that quantum computers are competitive with traditional simulation techniques. “Utilizing our methods, we found it practically impossible to reproduce the results consistently on classical systems, whereas it only takes hours with a quantum computer,” he stated. The time estimates for the classical calculations were so long that even establishing equivalence with Helios’ output became difficult.

Trapped ions function as qubits in the Helios-1 chip

Quantinuum

No other quantum computer has yet attempted to simulate fermion pairing for superconductivity, and the researchers attribute their achievement to Helios’ advanced hardware. David Hayes from Quantinuum remarked on Helios’ exceptionally reliable qubits and their proficiency in industry-standard benchmarking tasks. In preliminary experiments the team kept qubits essentially error-free, including entangling 94 of these specialized qubits, a record across all quantum platforms. Using such qubits in subsequent simulations could enhance their precision.

Eduardo Ibarra Garcia Padilla, a researcher at Harvey Mudd College in California, indicated that the new findings hold promise but require careful benchmarking against leading classical computer simulations. The Fermi-Hubbard model has intrigued physicists since the 1960s, so he is eager for advanced tools to further its study.

Uncertainty surrounds the timeline for approaches like Helios-1 to rival the leading conventional computers, according to Steve White from the University of California, Irvine. He noted that many essential details remain unresolved, particularly ensuring that quantum simulations commence with the appropriate qubit properties. Nevertheless, White posits that quantum simulations could complement classical methods, particularly in exploring the dynamic behaviors of materials.

“They are progressing toward being valuable simulation tools for condensed matter physics,” he stated, but added, “It remains early days, and computational challenges persist.”

Reference: arXiv DOI: 10.48550/arXiv.2511.02125


Source: www.newscientist.com

Next-Gen Quantum Networks: Paving the Way for a Quantum Internet Prototype

Quantum Internet could provide secure communications globally

Sakumstarke / Alamy

One of the most sophisticated quantum networks constructed to date will enable 18 individuals to communicate securely through the principles of quantum physics. The researchers affirm that this represents a feasible step towards realizing a global quantum internet, although some experts express doubt.

The eagerly awaited quantum internet aims to allow quantum computers to communicate over distances by exchanging light particles, known as photons, that are interconnected through quantum entanglement. Additionally, it will facilitate the linkage of quantum sensor networks, enabling communications impervious to classical computer hacking. However, connecting different segments of the quantum realm is not as straightforward as laying down cables due to the challenges in ensuring seamless interactions between network nodes.

Recently, Chen Shenfeng and colleagues at Shanghai Jiao Tong University in China demonstrated a method for interconnecting two quantum networks. They first established two networks of 10 nodes each, both sharing quantum entanglement and functioning as small-scale versions of a quantum internet. They then joined one node from each network, producing a larger, fully integrated network in which any pair of the 18 remaining nodes can communicate.

Networking 18 classical computers is a straightforward endeavor involving inexpensive components, but in the quantum sphere, where specific timing is crucial for sharing individual photons among several users, advanced technology and specialized knowledge are required. Even establishing communication between pairs is intricate, yet facilitating communication among any pair of 18 users is unprecedented.

“Our method provides essential capabilities for quantum communication across disparate networks and is pivotal for creating a large-scale quantum internet that enables interactions among all participants,” the researchers stated in their paper. They did not respond to a request for comment.

As the researchers explain, this network integration hinges on a process termed entanglement swapping. By performing a joint observation known as a Bell measurement on one photon from each of two entangled pairs, the two outermost photons in the arrangement become linked, even though they never interacted directly. The measurement, however, disrupts the delicate quantum balance of the two photons it touches, destroying their states.
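A minimal state-vector calculation shows the effect; this is a textbook idealisation in Python/numpy, not the Shanghai team’s hardware protocol:

```python
import numpy as np

# |Phi+> = (|00> + |11>)/sqrt(2), written as a 2x2 amplitude table
phi = (np.array([1, 0, 0, 1]) / np.sqrt(2)).reshape(2, 2)

# Four photons A, B, C, D; pairs (A,B) and (C,D) each start in |Phi+>.
state = np.einsum('ab,cd->abcd', phi, phi)

# Bell measurement on the middle photons B and C: project onto |Phi+>_{BC}
# and read off the amplitude left on the outer pair (A, D).
ad = np.einsum('bc,abcd->ad', phi.conj(), state)

prob = np.sum(np.abs(ad) ** 2)      # chance of this particular Bell outcome
ad_state = ad / np.sqrt(prob)       # post-measurement state of A and D

print("outcome probability:", prob)    # 0.25
print("A-D amplitudes:\n", ad_state)   # (|00> + |11>)/sqrt(2): entangled
```

The outer photons A and D end up in a Bell state despite never having met, which is precisely the resource the two networks use to link their nodes.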

“This isn’t the initial demonstration of entanglement exchange,” remarks Sidharth Joshi from the University of Bristol, UK. “What they have achieved is a framework that simplifies inter-network exchanges.”

Joshi notes that current quantum communication research is divided between extending the range of information transmission between two devices, occasionally utilizing satellites, and developing protocols and strategies for reliably networking numerous devices over shorter distances. This study pertains to the latter. “Both areas are critically important,” he asserts.

Conversely, Robert Young, a professor at Lancaster University in the UK, commented that while the results showcase a remarkable technical feat demanding expertise and extensive resources, he deems it improbable as a blueprint for future large-scale quantum networks, considering the expense and intricacy involved.

“This is far from practical and not something readily applicable in real-world scenarios,” Young states. “The paper’s claim is that this is the future of quantum network integration, but many formidable challenges remain to be addressed.”

One significant issue is the necessity for quantum repeaters to convey information across extensive distances. As distance increases, photons are frequently lost in fiber optic cables, and measurements can jeopardize the state of a photon, rendering the quantum information unreadable or untransmittable, thereby preventing signal amplification along its route. If quantum repeaters functioned effectively, they could transmit signals over longer distances, yet constructing such devices has been challenging.

“We understand that to build a viable quantum network, some method of quantum repeater is essential,” Young points out, emphasizing that this was absent in the current network demonstration.


Source: www.newscientist.com

Tony Blair Warns: “History Won’t Forgive Us” if Britain Lags in the Quantum Computing Race

Former prime minister Tony Blair has warned that “history will not forgive us” if Britain lags behind in the quantum computing race. The technology is anticipated to ignite a new era of innovation across fields ranging from pharmaceutical development to climate analysis.

“The United Kingdom risks losing its edge in quantum research,” cautioned the former Labour prime minister at the Tony Blair Institute, a think tank supported by tech industry veterans such as Oracle founder Larry Ellison.

In a report advocating for a national quantum computing strategy, Mr. Blair and former Conservative leader William Hague drew parallels between the current situation and the evolution of artificial intelligence. While the UK made significant contributions to AI research, it has since surrendered its leadership to other nations, particularly the US, which has triggered a race to develop “sovereign” AI capabilities.

“As demonstrated with AI, a robust R&D foundation alone is insufficient; countries with the necessary infrastructure and capital will capture the economic and strategic advantages of such technologies,” they noted. “While the UK boasts the second-largest number of quantum start-ups globally, it lacks the high-risk investment and infrastructure essential for scaling these ventures.”

Quantum computing operates in unusual and fascinating ways that contrast sharply with classical computing. Traditional computers process information through transistors that switch on or off, representing 1s and 0s. However, in quantum mechanics, entities can exist in multiple states simultaneously, thanks to a phenomenon called quantum superposition, which allows transistors to be in an on and off state concurrently.

This leads to a dramatic boost in computational capability, enabling a single quantum computer to perform tasks that would otherwise require billions of the most advanced supercomputers. Although the field is not yet mature enough for widespread application, the potential for simulating molecular structures to develop new materials and pharmaceuticals is vast, and the true value of quantum computing lies in such practical delivery. Estimates suggest the technology could unlock about $1.3 trillion in value across industries such as chemicals, life sciences, automotive, and finance.

There are increasing fears that extraordinarily powerful quantum machines could decipher all encryption and pose serious risks to national security.

Blair and Hague remarked: “The quantum era is upon us, whether Britain chooses to lead or not.” They added, “History will not excuse us if we squander yet another opportunity to excel in groundbreaking technology.”

This warning follows the recent recognition of the British, Cambridge-educated physicist John Clarke, who shared the 2025 Nobel Prize in Physics for contributions underpinning quantum computing, alongside the continued growth of UK quantum firms backed by US companies.

In June, the Oxford University spinout Oxford Ionics was acquired by US company IonQ for $1.1 billion. Meanwhile, PsiQuantum, a spinout from the University of Bristol and Imperial College London, found its most enthusiastic investors in California, where it is now based; it plans to build its first large-scale quantum computers in Brisbane, Australia.

A report from the Tony Blair Institute for Global Change critiques the UK’s current quantum approach, highlighting that both China and the US are “ahead of the game,” with countries like Germany, Australia, Finland, and the Netherlands also surpassing the UK.

A government representative stated: “Quantum technology has the potential to revolutionize sectors ranging from healthcare to affordable clean energy. The UK currently ranks second globally for quantum investment and possesses leading capabilities in supply chains such as photonics, yet we are resolute in pushing forward.”

They continued: “We have committed to a groundbreaking 10-year funding strategy for the National Quantum Computing Centre and will set out other aspects of the national programme in due course.”

In June, the Labour government unveiled a £670 million initiative to expedite the application of quantum computing, as part of an industrial strategy aimed at developing new treatments for currently untreatable diseases and enhancing carbon capture technologies.

Source: www.theguardian.com

Quantum Computers Confirm the Reality of Wave Functions


The wave function of a quantum object might extend beyond mere mathematical representation

Povitov/Getty Images

Does quantum mechanics accurately depict reality, or is it merely our flawed way of interpreting the peculiar characteristics of minuscule entities? A notable experiment aimed at addressing this question has now been run on quantum computers, yielding surprisingly firm results: quantum mechanics genuinely represents reality, at least for small quantum systems. The findings could also lead to more efficient and dependable quantum devices.

Since the discovery of quantum mechanics over a hundred years ago, its uncertain and probabilistic traits have confounded scientists. For instance, take superposition. Are particles truly existing in multiple locations simultaneously, or do the calculations of their positions merely provide varying probabilities of their actual whereabouts? If it’s the latter, then there are hidden aspects of reality within quantum mechanics that may be restricting our certainty. These elusive aspects are termed “hidden variables,” and theories based on this premise are classified as hidden variable theories.

In the 1960s, physicist John Bell devised an experiment intended to disprove such theories. The Bell test explores quantum mechanics by evaluating the connections, or entanglement, between distant quantum particles. If these particles exhibit quantum qualities surpassing a certain threshold, indicating that their entanglement is nonlocal and spans any distance, hidden variable theories can be dismissed. The Bell test has since been performed on various quantum systems, consistently affirming the intrinsic nonlocality of the quantum realm.
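In its most common form, the CHSH version, the threshold is explicit. A correlation score $S$, built from measurements at two detector settings $a, a'$ on one side and $b, b'$ on the other, must satisfy

$$ |S| = \left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \le 2 $$

in any local hidden variable theory, whereas quantum mechanics permits values up to $2\sqrt{2} \approx 2.83$. Measured values above 2 are what rule the hidden variable theories out.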

In 2012, physicists Matthew Pusey, Jonathan Barrett, and Terry Rudolph developed a more comprehensive test (dubbed PBR in their honor) that enables researchers to differentiate between various interpretations of quantum systems. Among these are the ontic perspective, asserting that measurements of a quantum system and its wavefunction (a mathematical representation of a quantum state) correspond to reality. Conversely, the epistemological view suggests that this wavefunction is an illusion, concealing a richer reality beneath.

Under PBR’s key assumption, that independently prepared quantum systems have independent underlying states and that nothing beyond the wave function influences a system, the mathematics indicates we ought to read quantum phenomena ontically. This implies that quantum behavior is genuine, no matter how peculiar it appears. PBR tests work by comparing different quantum elements, such as qubits in a quantum computer, and assessing how frequently they register consistent values for specific properties, like spin. If the epistemological perspective were accurate, the qubits would report identical values more often than quantum mechanics predicts, implying that additional factors are at play.
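The flavour of the original PBR argument can be reproduced in a few lines. For two qubits, each prepared in either $|0\rangle$ or $|+\rangle$, there is an entangled four-outcome measurement in which each outcome never occurs for one particular pair of preparations; if the two states’ underlying realities overlapped, as the epistemological view allows, no measurement could guarantee those zeros. The numpy sketch below is a textbook reconstruction, not the Cambridge team’s code:

```python
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus  = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# The four entangled measurement basis states from the PBR construction.
xi = [
    (np.kron(ket0, ket1) + np.kron(ket1, ket0)) / np.sqrt(2),
    (np.kron(ket0, minus) + np.kron(ket1, plus)) / np.sqrt(2),
    (np.kron(plus, ket1) + np.kron(minus, ket0)) / np.sqrt(2),
    (np.kron(plus, minus) + np.kron(minus, plus)) / np.sqrt(2),
]

# The four possible two-qubit preparations.
preps = [np.kron(ket0, ket0), np.kron(ket0, plus),
         np.kron(plus, ket0), np.kron(plus, plus)]

# Outcome i has probability exactly 0 on preparation i (the diagonal).
probs = np.array([[abs(x @ p) ** 2 for p in preps] for x in xi])
print(np.round(probs, 3))
```

On noisy hardware those zeros become small but nonzero rates, which is why the Cambridge test had to fold measured error levels into the PBR threshold.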

Yang Songqinghao and his colleagues at the University of Cambridge have created a method to perform PBR tests on a functioning IBM Heron quantum computer. The findings reveal that if the number of qubits is minimal, it’s possible to assert that a quantum system is ontic. In essence, quantum mechanics appears to operate as anticipated, as consistently demonstrated by the Bell test.

Yang and his team executed this validation by evaluating the overall output from a pair or group of five qubits, such as a sequence of 1s and 0s, and determined the frequency at which this outcome aligned with predictions regarding the behavior of the quantum system, factoring in inherent errors.

“Currently, all quantum hardware is noisy and every operation introduces errors, so if we add this noise to the PBR threshold, what is the interpretation [of our system]?” remarks Yang. “We discovered that if we conduct the experiment on a small scale, we can fulfill the original PBR test and eliminate the epistemological interpretation.” On that scale, the room for hidden variables vanishes.

While they successfully demonstrated this for a limited number of qubits, they encountered difficulties replicating the same results for a larger set of qubits on a 156-qubit IBM machine. The error or noise present in the system becomes excessive, preventing researchers from distinguishing between the two scenarios in a PBR test.

This implies that the test cannot definitively determine whether the world is entirely quantum. At certain scales, the ontic view may dominate, yet at larger scales, the precise actions of quantum effects remain obscured.

Utilizing this test to validate the “quantum nature” of quantum computers could provide assurance that these machines not only function as intended but also enhance their potential for achieving quantum advantage: the capability to carry out tasks that would be impractically time-consuming for classical computers. “To obtain a quantum advantage, you must have quantum characteristics within your quantum computer. If not, you can discover a corresponding classical algorithm,” asserts team member Haom Yuan from Cambridge University.

“The concept of employing PBR as a benchmark for device efficacy is captivating,” notes Matthew Pusey from the University of York, UK, one of the original PBR authors. However, Pusey remains uncertain about its implications for reality. “The primary purpose of conducting experiments rather than relying solely on theory is to ascertain whether quantum theory could be wrong. Yet, if quantum theory is indeed flawed, what questions does that raise? The entire framework of ontic and epistemic states presupposes quantum theory.”

To take the results at face value, in other words, one would need a way of performing the test without presuming that quantum theory is accurate. “A minority of individuals contend that quantum physics fundamentally fails at mesoscopic scales,” states Terry Rudolph, another of the PBR test’s originators, at Imperial College London. “This experiment may not be relevant to dismissing those particular proposals, but to be straightforward, I am uncertain! Investigating fundamental aspects of quantum theory in progressively larger systems will always help refine the search for alternative theories.”

Reference: arXiv, arxiv.org/abs/2510.11213


Source: www.newscientist.com

Germanium Superconductors: A Key to Reliable Quantum Computing

Germanium is already utilized in standard computer chips

Matejimo/Getty Images

Superconductors made from germanium, a material traditionally used for computer chips, have the potential to revolutionize quantum computing by enhancing reliability and performance in the future.

Superconductors are materials that enable electricity to flow without resistance, making them ideal for various electrical applications, particularly in maintaining quantum coherence—essential for effective quantum computing.

Nonetheless, most superconductors have been specialized materials that are challenging to incorporate into computer chips. Peter Jacobson and his team at the University of Queensland, Australia, successfully developed a superconductor using germanium, a material already prevalent in the computing sector.

The researchers synthesized the superconductor by introducing gallium into a germanium film through a process called doping. Previous experiments in this area found instability in the resulting combination. To overcome this, the team utilized X-rays to infuse additional gallium into the material, achieving a stable and uniform structure.

However, like other known superconductors, this novel material must be cooled to a frigid 3.5 kelvin (about -270°C, or -453°F) to function.

David Cardwell, a professor at the University of Cambridge, notes that while superconductors demand extremely low temperatures, making them less suitable for consumer devices, they could be ideally suited for quantum computing, which also necessitates supercooling.

“This could significantly impact quantum technology,” says Cardwell. “We’re already in a very cold environment, so this opens up a new level of functionality. I believe this is a clear starting point.”

Jacobson highlighted that previous attempts to stack superconductors atop semiconductors—critical components in computing—resulted in defects within their crystal structure, posing challenges for practical applications. “Disorder in quantum technology acts as a detrimental effect,” he states. “It absorbs the signal.”

In contrast, this innovative material enables the stacking of layers containing gallium-doped germanium and silicon while maintaining a uniform crystal structure, potentially paving the way for chips that combine the advantageous features of both semiconductors and superconductors.


Source: www.newscientist.com

Google Unveils Quantum Computers’ Ability to Unlock Molecular Structures


Google’s Quantum Computing Willow Chip

Google Quantum AI

Researchers at Google Quantum AI have leveraged Willow quantum computers to enhance the interpretation of data sourced from nuclear magnetic resonance (NMR) spectroscopy—an essential research method within chemistry and biology. This significant advancement may open new horizons for the application of quantum computing in various molecular technologies.

Quantum computers are most famous for their potential to break cryptography, although current devices are too limited in scale and error rates to be competent at decryption. Where they show nearer-term promise is in expediting the discovery of new drugs and materials, tasks that align with the fundamentally quantum nature of molecular processes. Hartmut Neven and colleagues at Google Quantum AI have now showcased one instance where quantum computers can mimic the complex interactions found in natural systems.

The investigation centered on a computational method known as quantum echoes and its application to NMR, a technique used to extract detailed information about molecular structures.

At its core, the concept of quantum echoes is akin to the butterfly effect. This phenomenon illustrates how minor perturbations—like the flap of a butterfly’s wings—can trigger substantial changes in broader systems. The researchers exploited a quantum approach within a system made up of 103 qubits in Willow.

During the experiment, the team executed a specific sequence of operations to alter the quantum state of the qubits in a controlled way. They then disrupted one chosen qubit, the “quantum butterfly,” and applied the same sequence of operations in reverse, effectively running time backwards. Finally, the researchers measured the quantum characteristics of the qubits to extract insights about the entire system.
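In outline this is an out-of-time-order echo: evolve forward, kick one qubit, evolve backward, and see how much of the initial state survives. Below is a toy state-vector version; it is illustrative only, with a random matrix standing in for Willow’s far more elaborate circuits:

```python
import numpy as np

n = 3                                    # toy system; Willow used 103 qubits
dim = 2 ** n
rng = np.random.default_rng(0)

# Random unitary (QR of a complex Gaussian matrix) as the forward sequence.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(A)

# The "butterfly" perturbation: flip only the first qubit.
X = np.array([[0, 1], [1, 0]])
butterfly = np.kron(X, np.eye(dim // 2))

psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0                            # start in |00...0>

psi = U.conj().T @ butterfly @ U @ psi0  # forward, kick, then reverse

echo = abs(np.vdot(psi0, psi)) ** 2      # overlap with where we started
print("echo signal:", echo)              # < 1: the kick spread through the system
```

How quickly this echo signal decays encodes how the perturbation spreads, which is the quantity the Google team maps onto NMR-style structural information.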

In a basic sense, the NMR technique applied in the lab also hinges on minor disturbances; it nudges actual molecules using electromagnetic waves and examines the system’s reactions to ascertain atomic positions—similar to using a molecular ruler. If the operations on qubits can replicate this process, the mathematical scrutiny of the qubits can likewise be translated into molecular structural details. This series of quantum computations could potentially enable the examination of atoms that are relatively distant from one another, said team member Tom O’Brien. “We’re constructing longer molecular rulers.”

The researchers believe that a protocol akin to quantum echoes would require approximately 13,000 times longer on a conventional supercomputer. Their tests indicated that two distinct quantum systems could successfully perform a quantum echo and yield identical outcomes—a notable achievement given the inconsistencies faced in previous quantum algorithms supported by the team. O’Brien noted that enhancements in the quality of Willow’s hardware and reduced qubit error rates have contributed to this success.

Nonetheless, there remains ample opportunity for refinement. In their utilization of Willow and quantum echoes for two organic molecules, the researchers operated with a mere 15 qubits at most, yielding results comparable to traditional non-quantum methods. In essence, the team has not yet demonstrated a definitive practical edge for Willow over conventional systems. This current exhibition of quantum echo remains foundational and has not been subjected to formal peer review.

“Addressing molecular structure determination is crucial and pertinent,” states Keith Fratus from HQS Quantum Simulations, a German company focused on quantum algorithms. He emphasizes that bridging established techniques such as NMR with calculations executed by quantum computers represents a significant milestone, though the technology’s immediate utility might be confined to specialized research in biology.

Dries Sels, a professor at New York University, remarked that the Google team’s experiments involve larger quantum computers and more complex NMR protocols and molecules than earlier efforts. “Quantum simulation is often highlighted as a promising application for quantum computers, yet there are surprisingly few examples with industrial relevance. I believe model inference of spectroscopic data like NMR could prove beneficial,” he added. “We’re not quite there, but initiatives like this inspire continued investigation into this issue.”

O’Brien expressed optimism that the application of quantum echo to NMR will become increasingly beneficial as they refine qubit performance. Fewer errors mean a greater capability to execute more operations simultaneously and accommodate larger molecular structures.

Meanwhile, the quest for optimal applications of quantum computers is ongoing. While the experimental implementation of quantum echoes on Willow is remarkable, the mathematical analysis it enables may not achieve widespread adoption, according to Curt von Keyserlingk at King’s College London. Until NMR specialists pivot away from traditional methods refined over decades, he suggests, its primary allure will lie with theoretical physicists studying fundamental quantum systems. The protocol may also face competition from conventional computing, as von Keyserlingk is already exploring how classical methods might rival this approach.



Source: www.newscientist.com

Google Celebrates Breakthrough: Quantum Computer Exceeds Supercomputer Performance

Google has announced a significant breakthrough in quantum computing, having developed an algorithm capable of performing tasks that traditional computers cannot achieve.

This algorithm, which serves as a set of instructions for guiding the operations of a quantum computer, has the ability to determine molecular structures, laying groundwork for potential breakthroughs in areas like medicine and materials science.

However, Google recognizes that the practical application of quantum computers is still several years away.

“This marks the first occasion in history when a quantum computer has successfully performed a verifiable algorithm that surpasses the power of a supercomputer,” Google stated in a blog post. “This repeatable, beyond-classical computation establishes the foundation for scalable verification and moves quantum computers closer to practical utilization.”

Michel Devoret, Google’s chief scientist for quantum AI, who recently received the Nobel Prize in Physics, remarked that the announcement represents another milestone in quantum development. “This is a further advancement towards full-scale quantum computing,” he noted.

The algorithmic advancement, allowing quantum computers to function 13,000 times faster than classical counterparts, is documented in a peer-reviewed article published in the journal Nature.

One expert cautioned that while Google’s accomplishments are impressive, they revolve around a specific scientific challenge and may not translate to significant real-world benefits. Results for two molecules were validated using nuclear magnetic resonance (NMR), akin to MRI technology, yielding insights not typically provided by NMR.

Winfried Hensinger, a professor of quantum technology at the University of Sussex, mentioned that Google has achieved “quantum superiority”, indicating that researchers have utilized quantum computers for tasks unattainable by classical systems.

Nevertheless, fully fault-tolerant quantum computers—which could undertake some of the most exciting tasks in science—are still far from realization, as they would necessitate machines capable of hosting hundreds of thousands of qubits (the basic unit of information in quantum computing).

“It’s crucial to recognize that the task achieved by Google isn’t as groundbreaking as some world-changing applications anticipated from quantum computing,” Hensinger added. “However, it represents another compelling piece of evidence that quantum computers are steadily gaining power.”

A truly capable quantum computer able to address a variety of challenges would require millions of qubits, but current quantum hardware struggles to manage the inherent instability of qubits.

“Many of the most intriguing quantum computers being discussed necessitate millions or even billions of qubits,” Hensinger explained. “Achieving this is even more challenging with the type of hardware utilized by the authors of the Google paper, which demands cooling to extremely low temperatures.”

Hartmut Neven, Google’s vice president of engineering, stated that quantum computers may be five years away from practical application, despite advances with the algorithm, referred to as Quantum Echoes.


“We remain hopeful that within five years, Quantum Echo will enable real-world applications that are solely feasible with quantum computers,” he said.

As a leading AI company, Google also asserts that quantum computers can generate unique data capable of enhancing AI models, thereby increasing their effectiveness.

Traditional computers represent information in bits (denoted by 0 or 1) and send them as electrical signals. Text messages, emails, and even Netflix movies streamed on smartphones consist of these bits.

By contrast, information in a quantum computer is represented by qubits. Housed within compact chips, these qubits are particles such as electrons or photons that can exist in multiple states simultaneously, a phenomenon known in quantum physics as superposition.

This characteristic enables qubits to concurrently encode various combinations of 1s and 0s, allowing computation of vast numbers of different outcomes, an impossibility for classical computers. Nonetheless, maintaining this state requires a strictly controlled environment, free from electromagnetic interference, as disturbances can easily disrupt qubits.
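One way to see the scale involved: classically simulating an n-qubit register means tracking 2^n complex amplitudes, one per combination of 1s and 0s. A small Python illustration, using back-of-envelope numbers that assume 16 bytes per amplitude:

```python
import numpy as np

def uniform_superposition(n):
    """State vector of n qubits all in equal superposition:
    every one of the 2**n bit strings has the same amplitude."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

print(uniform_superposition(2))   # 4 amplitudes, each 0.5

for n in (10, 30, 50):
    amps = 2 ** n
    print(f"{n} qubits -> {amps:.2e} amplitudes "
          f"(~{amps * 16 / 1e9:.2e} GB at 16 bytes each)")
```

Thirty qubits already demand around 17 gigabytes of memory to represent exactly, and fifty would need tens of millions of gigabytes, which is why classical simulation of large quantum registers breaks down.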

Progress by companies like Google has led to calls for governments and industries to implement quantum-proof cryptography, as cybersecurity experts caution that these advancements have the potential to undermine sophisticated encryption.

Source: www.theguardian.com

Ultracold Atoms May Investigate Relativity in the Quantum Realm


Spinning ultracold atoms could uncover the limits of Einstein’s relativity

Shutterstock / Dmitriy Rybin

Small Ferris wheels made from light and extremely chilled particles could enable scientists to investigate elements of Albert Einstein’s theory of relativity on an extraordinary level.

Einstein’s special and general theories of relativity, established in the early 20th century, transformed our comprehension of time by showing that a moving clock can tick more slowly than a stationary one. The faster a clock moves, or the harder it accelerates, the more pronounced the effect, and the same applies to an object moving in a circle. While these effects have been observed in relatively large systems, Vassilis Rembesis and his team at King Saud University in Saudi Arabia have developed a method to test these principles on a diminutive scale.
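The special-relativistic version of the effect is compactly stated: the time $\Delta\tau$ elapsed on a clock moving at speed $v$ is related to the time $\Delta t$ measured by a stationary observer by

$$ \Delta\tau = \frac{\Delta t}{\gamma}, \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, $$

where, for circular motion, $v$ is the rim speed of the “wheel”. At the low speeds reachable in an optical trap the fractional shift is approximately $v^{2}/2c^{2}$, which is why detecting it takes the extreme precision of ultracold-atom spectroscopy.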

To examine rotation and time at the level of atoms and molecules, they turned to the ultracold regime, just a few millionths of a degree above absolute zero, where the quantum behavior and movement of atoms and molecules can be meticulously controlled with laser beams and electromagnetic fields. In 2007, Rembesis and his colleagues devised a technique for tuning laser beams to trap atoms in a rotating ring-shaped pattern, which they call an “optical Ferris wheel,” and Rembesis says their new findings suggest it can be used to observe relativistic time dilation in ultracold particles.

Their predictions indicate that nitrogen molecules are optimal candidates for investigating rotational time dilation at the quantum level. Treating the movement of electrons within each molecule as the ticks of an internal clock, the researchers calculate that frequency shifts as small as one part in ten quintillion could be resolved.

At the same time, Rembesis noted that experiments utilizing optical Ferris wheels have been sparse up until now. The new proposal opens avenues for examining relativity in uncharted conditions where new or surprising phenomena may emerge. For instance, the quantum characteristics of ultracold particles may challenge the “clock hypothesis,” which holds that a clock’s rate depends on its speed but not directly on its acceleration.

“It’s crucial to validate our interpretations of physical phenomena in nature. It’s often when something unexpected occurs that we have to reevaluate our understanding and gain deeper insight into the universe. This research offers an alternative approach to examining relativistic systems, with distinct advantages over traditional mechanical setups,” says Patrik Öhberg from Heriot-Watt University, UK.

Relativistic phenomena, such as time dilation, generally necessitate exceedingly high velocities; however, optical Ferris wheels enable access to them without the need for impractically high speeds, he explains. Aidan Arnold from the University of Strathclyde, UK adds, “With the remarkable accuracy of atomic clocks, the time difference ‘experienced’ by the atoms in the Ferris wheel should be significant. Because the accelerated atoms remain in close proximity, there is ample opportunity to measure this difference,” he states.

By adjusting the focus of the laser beam, it may also become feasible to manipulate the dimensions of the Ferris wheel that confines the particles, allowing researchers to explore time-delay effects for various rotations, as noted by Rembesis. Nevertheless, technical challenges persist, including the need to ensure that atoms and molecules do not heat up and become uncontrollable during rotation.


Source: www.newscientist.com

Challenging Calculations: Quantum Computers May Struggle with ‘Nightmare’ Problems


Certain problems remain insurmountable for quantum computers.

Jaroslav Kushta/Getty Images

Researchers have uncovered a “nightmare scenario” computation tied to a rare form of quantum material that remains unsolvable, even with the most advanced quantum computers.

In contrast to the simpler task of determining the phase of standard matter, such as identifying whether water is in a solid or liquid state, the quantum equivalent can prove exceedingly challenging. Thomas Schuster and his team at the California Institute of Technology have demonstrated that identifying the quantum phase of matter can be notably difficult, even for quantum machines.

They mathematically examined a scenario in which a quantum computer receives a set of measurements of an object’s quantum state and must determine its phase. Schuster said this is not necessarily an impossible task, but his team showed that for a considerable number of quantum phases of matter, ones far subtler than the difference between liquid water and ice, including unusual “topological” phases that host strange electrical currents, a quantum computer might have to run for extremely protracted periods. The situation mirrors a worst-case scenario in the laboratory, where an instrument might need to operate for billions or even trillions of years to discern the character of a sample.

This doesn’t imply that quantum computers are rendered obsolete for this analysis. As Schuster noted, these phases are unlikely to manifest in actual experiments involving materials or quantum systems, serving more as an indicator of our current limitations in understanding quantum computers than posing an immediate practical concern. “They’re like nightmare scenarios. It would be quite unfortunate if such a case arose. It probably won’t happen, but we need to improve our comprehension,” he stated.

Bill Fefferman from the University of Chicago raised intriguing questions regarding the overall capabilities of computers. “This might illuminate the broader limits of computation: while substantial speed improvements have been realized for specific tasks, there will inevitably be challenges that remain too daunting, even for efficient quantum computers,” he asserted.

Mathematically, he explained, this new research merges concepts from quantum information science employed in quantum cryptography with foundational principles from materials physics, potentially aiding progress in both domains.

Looking ahead, the researchers hope to broaden their analysis to more energetic, excited quantum phases of matter, which are expected to be hard for an even wider range of calculations.


Source: www.newscientist.com

What Makes Quantum Computers So Powerful?

3D rendering of a quantum computer’s chandelier-like structure

Shutterstock / Phong Lamai Photography

Eleven years ago, I was beginning my PhD in theoretical physics and honestly had never thought or written about quantum computers. Meanwhile, New Scientist, always ahead of its time, was busy crafting its first “Quantum Computer Buyer’s Guide.” A glance through it reveals how much has changed: John Martinis of UC Santa Barbara was featured for developing an array of merely nine qubits, and he earned a Nobel Prize in Physics just last week. Curiously, there was no mention of quantum computers built from neutral atoms, which have rapidly transformed the field in recent years. This sparked my curiosity: what would a quantum computer buyer’s guide look like today?

At present, around 80 companies globally are producing quantum computing hardware. My reporting on quantum computing has allowed me to witness firsthand how the industry evolves, complete with numerous sales pitches. If choosing between an iPhone and an Android is challenging, consider navigating the press lists of various quantum computing startups.

While there is significant marketing hype, the real difficulty in comparing these devices stems from the lack of a settled standard for building quantum computers. Potential qubit options include superconducting circuits, cryogenic ions, and light, among others. With such diverse components, any meaningful comparison has to focus on each quantum computer’s performance.

This marks a shift from the early days, where success was measured by the number of qubits—the foundational elements of quantum information processing. Many research teams have surpassed the 1000-qubit threshold, and the trajectory for achieving even more qubits appears to be becoming clearer. Researchers are exploring standard manufacturing methods, such as creating silicon-based qubits, and leveraging AI to enhance the size and capabilities of quantum computers.

Ideally, more qubits should always translate to greater computational power, enabling quantum computers to tackle increasingly complex challenges. However, in reality, ensuring each additional qubit doesn’t impede the performance of existing ones presents significant technical hurdles. Thus, it’s not just the number of qubits that counts, but how much information they can retain and how effectively they can communicate without losing data accuracy. A quantum computer could boast millions of qubits, but if they’re susceptible to errors that disrupt computations, they become virtually ineffective.

The extent of this “glitch”, or noise, can be captured by metrics like “gate fidelity”, which reflects how accurately a qubit or pair of qubits can perform operations, and “coherence time”, which gauges how long a qubit can maintain a viable quantum state. But favorable metrics aren’t the whole story: we must also consider the intricacies of inputting data into a quantum computer and retrieving the outcomes. The growth of the quantum computing industry is partly attributable to the emergence of companies focused on qubit control and on interfacing quantum internals with non-quantum users. A thorough buyer’s guide for quantum computers in 2025 should encompass these essential add-ons: choosing a qubit means also selecting a qubit control system and an error correction mechanism. I recently spoke with a researcher developing an operating system for quantum computers—something that may become a necessity in the near future.

If I were to create a short-term wish list, I would favor a machine capable of executing at least a million operations: a million-step quantum computing program with minimal error rates and robust error correction. John Preskill from the California Institute of Technology calls this the “megaquop” machine. Last year, he expressed confidence that such machines would be fault-tolerant and powerful enough to yield scientifically significant discoveries. Yet we aren’t there yet: the quantum computers at our disposal currently manage tens of thousands of operations, and error correction has only been demonstrated effectively for smaller tasks.
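
To see why a million operations is such a demanding target, here is a back-of-envelope sketch in Python. It assumes errors are independent and uncorrected, and that a run succeeds only if every operation succeeds; the function name and thresholds are illustrative, not drawn from Preskill's analysis.

```python
# Rough error budget for a "megaquop" (million-operation) program,
# assuming independent, uncorrected errors: the run succeeds only if
# every single operation succeeds.

def required_error_rate(n_ops: int, target_success: float) -> float:
    """Solve (1 - p) ** n_ops = target_success for the per-op error p."""
    return 1 - target_success ** (1 / n_ops)

for target in (0.9, 0.5):
    p = required_error_rate(1_000_000, target)
    print(f"success {target:.0%} over 1e6 ops needs p < {p:.1e} per op")

# Prints roughly 1.1e-07 and 6.9e-07 -- orders of magnitude below the
# ~1e-3 error rates of today's physical gates, which is why error
# correction belongs on the wish list alongside raw operation counts.
```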

Quantum computers today are akin to adolescents—growing toward utility but still faced with developmental challenges. As a result, the question I frequently pose to quantum computer vendors is, “What can this machine actually accomplish?”

In this regard, it’s vital to compare not only various types of quantum computers but also contrast them with classical counterparts. Quantum hardware is costly and complex to manufacture, so when is it genuinely the sole viable solution for a given issue?

One way to tackle this question is to pinpoint calculations that traditional computers cannot complete in any reasonable amount of time. This concept is termed “quantum supremacy”, and it keeps quantum engineers and mathematicians consistently preoccupied. Instances of quantum supremacy do exist, but they come with caveats. To be meaningful, a supremacy task must be practical enough that a real machine can be built to execute it, yet mathematically well-enough understood that no conventional computer could ever compete.

In 1994, physicist Peter Shor devised a quantum computing algorithm for factoring large numbers, a technique that could potentially compromise the prevalent encryption methods utilized by banks worldwide. A sufficiently large quantum computer that could manage its own errors might execute this algorithm, yet mathematicians have yet to convincingly demonstrate that classical computers can’t efficiently factor large numbers. The most prominent claims of quantum supremacy often fall into this gray area, with some eventually being outperformed by classical machines. Ongoing demonstrations of quantum supremacy appear currently to serve primarily as confirmations of the quantum characteristics of the computers accomplishing them.

Conversely, in the mathematical discipline of “query complexity”, the superiority of quantum solutions is rigorously demonstrable, but practical algorithms remain elusive. Recent experiments have also introduced the notion of “quantum information superiority”, wherein quantum computers solve tasks using fewer qubits than the bits a traditional computer would require—a comparison of physical resources rather than time. Though this sounds promising, suggesting quantum computers may solve problems without extensive scaling, it is not yet a reason to buy, because the tasks in question lack pivotal real-world applications.

It’s undeniable that several real-world challenges are well-suited to quantum algorithms, like understanding molecular properties relevant to agriculture or medicine, or solving logistics problems like flight scheduling. Yet researchers lack full clarity on these applications, often hedging with phrases like “it seems”.

For instance, recent research on the prospective applications of quantum computing in genomics by Aurora Maurizio from the San Raffaele Scientific Institute in Italy and Guglielmo Mazzola at the University of Zurich suggests that traditional computing methods excel so significantly that “quantum computing may, in the near future, only yield speedups for a specific subset of sufficiently complex tasks.” Their findings indicate that while quantum computers could potentially enhance research in combinatorial problems within genomics, their application needs to be very precise and calculated.

In reality, for the numerous problems not specifically designed to demonstrate quantum supremacy, there is a spectrum in what counts as “fast”, particularly because quantum computers may ultimately run algorithms quicker than classical computers once noise and other technical challenges are overcome—yet that speed may not always offset the hardware’s significant costs. For example, the second-best-known quantum algorithm, Grover’s search algorithm, offers a non-exponential speedup, reducing computation time by a square-root factor instead. Ultimately, how fast is “fast enough” to justify the transition to quantum computing may depend on the individual buyer.
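
The scale of that square-root speedup is easy to make concrete; the numbers below are a generic illustration, not a benchmark of any particular machine. For an unstructured search over $N$ items,

$$ T_{\text{classical}} \sim N, \qquad T_{\text{quantum}} \sim \sqrt{N}, $$

so a search over $N = 10^{12}$ items shrinks from roughly $10^{12}$ classical queries to about $\sqrt{10^{12}} = 10^{6}$ quantum ones. That is a huge reduction, but unlike an exponential speedup it can be eaten up by slower gate times and higher per-operation costs—precisely the buyer’s dilemma described above.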

While it’s frustrating to include this in a purported buyer’s guide, my discussions with experts indicate that there remains far more uncertainty about what quantum computers can achieve than established knowledge. Quantum computing is an intricate, costly future technology; however, its genuine added value to our lives remains vague beyond serving the financial interests of a select few companies. This might not be satisfying, but it reflects the unique, uncharted territory of quantum computing.

For those of you reading this out of the desire to invest in a powerful, reliable quantum computer, I encourage you to proceed and let your local quantum algorithm enthusiast experiment with it. They may offer better insights in the years to come.


Source: www.newscientist.com

Nobel Prize in Physics Awarded to Trio Pioneering Quantum Computing Chips

John Clarke, Michel Devoret and John Martinis were awarded the 2025 Nobel Prize in Physics

Jonathan Nackstrand/AFP via Getty Images

The prestigious 2025 Nobel Prize in Physics was awarded to John Clarke, Michel Devoret, and John Martinis. Their research elucidated how quantum systems can tunnel through barriers, a critical process that underpins the superconducting technology integral to modern quantum computers.

“I was completely caught off guard,” Clarke remarked upon hearing the news from the Nobel Committee. “This outcome was unimaginable; it felt like a dream to be considered for the Nobel Prize.”

Quantum particles exhibit numerous peculiar behaviors, including their stochastic nature and their restriction to specific energy levels instead of a continuous range. These properties sometimes lead to surprising occurrences, such as tunneling through solid barriers. Such unusual characteristics were first described by pioneers like Erwin Schrödinger during the early years of quantum mechanics.

The implications of these discoveries are profound—quantum tunneling underpins theories of nuclear decay, for example—but earlier research was limited to individual particles and simple systems. It remained uncertain whether more intricate systems, such as electronic circuits conventionally described by classical physics, also adhered to these principles; the quantum tunneling effect seemed to vanish in larger systems.
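
A textbook estimate (not from the laureates' work itself) makes plain why tunneling was assumed to vanish at larger scales. In the WKB approximation, the probability of tunneling through a barrier $V(x)$ at energy $E$ falls exponentially with the particle's mass $m$ and the barrier's extent:

$$ T \approx \exp\!\left( -\frac{2}{\hbar} \int_{x_1}^{x_2} \sqrt{2m\,\bigl(V(x) - E\bigr)}\; dx \right). $$

For anything macroscopic, $m$ and the integral are so large that $T$ is effectively zero—which is what made the 1985 demonstration of tunneling by an entire circuit so striking.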

In 1985, the trio from the University of California, Berkeley—Clarke, Martinis, and Devoret—sought to change this narrative. They investigated the properties of charged particles traversing a superconducting circuit known as a Josephson junction, the device that earned British physicist Brian Josephson the Nobel Prize in Physics in 1973. These junctions comprise superconducting wires with zero electrical resistance, separated by a thin insulating barrier.

The researchers demonstrated that the circuit behaved as a single quantum entity: it occupied discrete energy levels and tunneled through the junction’s barrier, producing a measurable voltage where classical physics predicted none.

This groundbreaking discovery significantly deepened our understanding of how to harness similar superconducting quantum systems, transforming the landscape of quantum science and enabling other scientists to conduct precise quantum physics experiments on silicon chips.

Moreover, superconducting quantum circuits became foundational to the essential components of quantum computers, known as qubits. Developed by companies like Google and IBM, the most advanced quantum computers today contain hundreds of superconducting qubits, a direct descendant of the insights gained from Clarke, Martinis, and Devoret’s research. “In many respects, our findings serve as the cornerstone of quantum computing,” stated Clarke.

Both Martinis and Devoret went on to work with Google Quantum AI, whose superconducting quantum computer was the first to claim, in 2019, a quantum advantage over traditional machines. Clarke told the Nobel Committee that it was surprising to consider the extent of the impact their 1985 study has had: “Who could have imagined that this discovery would hold such immense significance?”

Topics:

  • Nobel Prize
  • Quantum Computing

Source: www.newscientist.com

Ultracold Clock Sheds Light on Quantum Physics’ Impact on Time


What is the quantum nature of time? We may be on the verge of discovering it

Quality Stock / Alamy

How does time manifest for a genuinely quantum entity? The most advanced clocks may soon address this question, enabling us to test the various ways the quantum realm can alter time and thereby delve into uncharted territories of physics.

The notion that time can stretch originates from Albert Einstein’s special theory of relativity: as an object approaches the speed of light, it experiences time more slowly compared with a stationary observer. Einstein extended this in his general theory of relativity, which demonstrates a similar temporal distortion in the presence of a gravitational field. Igor Pikovski from the Stevens Institute of Technology in New Jersey and his team aim to uncover whether a similar effect occurs within the microscopic quantum landscape, using ultracold clocks constructed from ions.

“The experiments we’ve performed until now have always focused on classical time, disregarding quantum mechanics,” says Pikovski. “With an ion clock, we’ve identified a regime where conventional explanations falter.”

These clocks consist of thousands of ions cooled to temperatures nearing absolute zero via laser manipulation. At such low temperatures, the quantum state of an ion and its embedded electrons can be precisely controlled through electromagnetic forces. Thus, the ticks of an ion clock are governed by the electrons oscillating between two distinct quantum states.

Since their behavior is dictated by quantum mechanics, these instruments provided an ideal platform for Pikovsky and his colleagues to investigate the interplay between relativistic and quantum phenomena on timekeeping. Pikovski mentions that they’ve identified several scenarios where this blending is evident.

One example arises from the intrinsic fluctuations inherent in quantum physics. Even at ultra-low temperatures, quantum objects cannot be completely static; they must oscillate, randomly gaining or losing energy. The team’s calculations indicated that these fluctuations stretch the time the clock measures. Although the effect is minute, it should be detectable in current ion clock experiments.
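
The mechanism can be glimpsed in the standard time-dilation shift used in ion-clock error budgets; treating the team's result only schematically, motion with mean squared velocity $\langle v^2 \rangle$ slows a clock's ticking by the fractional amount

$$ \frac{\Delta \nu}{\nu} \simeq -\frac{\langle v^2 \rangle}{2c^2}. $$

Because quantum zero-point fluctuations keep $\langle v^2 \rangle$ nonzero even in the coldest trap, a residual dilation survives near absolute zero—tiny, but within reach of clocks whose fractional precision now approaches one part in $10^{18}$.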

The researchers also mathematically analyzed the behavior of ions in a clock when “squeezed” into superpositions of multiple quantum states. They found that the motion of the ions and the states of their internal electrons become linked at a quantum level—that is, entangled. “Typically, experiments necessitate creative methods to establish entanglement. The intriguing aspect here is that it arises organically,” explains team member Christian Sanner from Colorado State University.

Pikovski says it is intuitive to think that quantum objects in superposition cannot experience time in a single, definite way, though this effect has yet to be experimentally confirmed. He believes it should be achievable in the near future.

Team member Gabriel Solch from the Stevens Institute of Technology says the next step is incorporating another crucial aspect of modern physics: gravity. Ultracold clocks can already detect the time dilation caused by small variations in Earth’s gravitational pull, such as when a clock is raised by a few millimeters, but exactly how these effects combine with the intrinsic quantum characteristics of the clock remains an unresolved question.

“I believe it is quite feasible with our existing technology,” adds David Hume from the U.S. National Institute of Standards and Technology in Colorado. He highlights that the primary challenge is to mitigate ambient disturbances so they don’t overshadow the effects predicted by Pikovski’s team. Successful experiments could pave the way to exploring unprecedented physical phenomena.

“Such experiments are thrilling because they create a platform for theories to interact in a domain where they could yield fresh insights,” remarks Alexander Smith at St. Anselm College, New Hampshire.


Source: www.newscientist.com

The 6100-Qubit Device: A Major Leap Towards Quantum Computing Advancement

Quantum computers can be developed using arrays of atoms

Alamy Stock Vector

A device boasting over 6000 qubits has set a new record and represents the initial phase of constructing the largest quantum computer ever.

At present, there isn’t a universally accepted design for quantum computers, but researchers agree that such machines will need at least tens of thousands of qubits to be truly functional. The previous record holder was a quantum computer with 1180 qubits; now Hannah Manetsch from the California Institute of Technology and her team have built a 6100-qubit system.

These qubits are made from neutral cesium atoms, chilled to near absolute zero, manipulated with laser beams, and arranged neatly on a grid. According to Manetsch, the team has fine-tuned the properties of these qubits to enhance their suitability for calculations, although they have yet to carry any out.

For instance, they adjust the laser’s frequency and power to help the fragile qubits maintain their quantum state, ensuring the grid’s stability for more precise calculations and extended runtimes. The team also assessed how efficiently the lasers could shift qubits around within the array, as noted by Elie Bataille at the California Institute of Technology.

“This is a remarkable demonstration of the straightforward scaling potential that neutral atoms present,” remarks Ben Bloom of Atom Computing, which also builds quantum technologies from neutral atoms.

Mark Saffman from the University of Wisconsin-Madison emphasizes that the new experiments are vital, providing evidence that neutral-atom quantum computers can reach significant sizes. However, further experimental validation is necessary before these setups can be considered fully developed quantum computers.

Research teams are currently investigating optimal methods for enabling qubits to perform calculations while employing error-reduction strategies, mentions Kon Leung at the California Institute of Technology. Ultimately, they envision scaling their systems to 1 million qubits over the next decade, he states.


Source: www.newscientist.com

Revolutionary Quantum Funds Stored on Ultra-Cold ‘Debit Card’

Quantum Debit Card Ensures Financial Security

GlobalImages101/Alamy

New quantum debit cards, which can hold unforgeable quantum money, are built from extremely cold atoms and particles of light.

While standard banknotes rely on the skill of inspectors to detect counterfeits, quantum money relies on the no-cloning theorem of physics, which renders counterfeiting impossible. This principle, which states that identical copies of arbitrary quantum information cannot be made, led physicist Stephen Wiesner to propose a protocol, published in 1983, for generating unforgeable currency. Julien Laurat and his team at the Kastler Brossel Laboratory in France are now implementing this concept in experiments.

According to this protocol, a bank creates banknotes composed of quantum particles prepared in specific quantum states, so the no-cloning theorem protects them against forgery. Laurat remarks that the protocol is an impressive feat of quantum cryptography, though it has never been put into practice, in part because quantum states are difficult to store.
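
The no-cloning theorem that secures the scheme has a proof short enough to quote. Suppose some unitary $U$ could copy every state, $U|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle$. Since a unitary preserves inner products, comparing two states $|\psi\rangle$ and $|\phi\rangle$ before and after copying gives

$$ \langle\phi|\psi\rangle \;=\; \bigl(\langle\phi|\langle 0|\bigr)\,U^\dagger U\,\bigl(|\psi\rangle|0\rangle\bigr) \;=\; \bigl(\langle\phi|\langle\phi|\bigr)\bigl(|\psi\rangle|\psi\rangle\bigr) \;=\; \langle\phi|\psi\rangle^{2}, $$

so $\langle\phi|\psi\rangle$ must equal 0 or 1: only identical or perfectly distinguishable states could ever be copied, and a universal counterfeiting machine is impossible.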

The research team has now made storage feasible with quantum memory devices that act as the scheme’s hard drive. In their experiments, users interact with a quantum system acting as the bank by exchanging photons, and each photon can be stored, much like loading money onto a debit card.

The memory devices used by the team consist of hundreds of millions of cesium atoms, which researchers cool down to nearly absolute zero by bombarding them with lasers. At such extreme temperatures, light can precisely manipulate the quantum state of atoms, but Laurat notes that years were spent identifying the optimal cooling needed for atomic memory to serve as a quantum debit card. Through extensive testing, he and his colleagues demonstrated that users can retrieve photons from atoms without corrupting their states, as long as the process is not tampered with.

Christophe Simon from the University of Calgary emphasizes that the new experiment marks progress toward fully realized quantum money. However, the current quantum memory storage time—around six millionths of a second—remains insufficient for practical application. “Another future step is to enhance portability. The long-term goal is to develop quantum memory that can be easily carried, particularly for quantum money applications. But we are not there yet,” he states.

The team is focused on extending storage durations, asserting that the protocol can be employed within quantum networks already being established in metropolitan areas across the globe. Additionally, cutting-edge quantum memory not only facilitates ultra-secure long-distance quantum communication but is also instrumental in connecting various quantum computers to more powerful systems.

Topics:

  • Quantum Computing
  • Encryption

Source: www.newscientist.com

Quantum Computers: Finally Attaining Unchallenged Dominance


Quantinuum’s Quantum Computer

Quantinuum

What unique capabilities do quantum computers possess that traditional computers cannot replicate? This question is central to a swiftly evolving industry, and recent findings aim to provide clarity on this topic.

Unlike classical bits, quantum computers use qubits, which can occupy states beyond just “0” or “1”, offering potential computational advantages. However, the debate over whether quantum computers can accomplish tasks beyond the reach of the most advanced traditional computers—the notion of quantum supremacy—remains complex and contentious. Genuine demonstrations of quantum advantage must involve practical computational tasks achievable with realistic quantum technology, while ruling out the possibility that mathematical or algorithmic improvements will ever allow classical computers to catch up.

William Kretschmer from the University of Texas at Austin and his colleagues have now conducted an experiment that satisfies both criteria. In contrast to earlier claims of quantum dominance, which were ultimately bridged by classical computing advances, the researchers assert: “Our results are clear and enduring: no future classical algorithm development will close this gap.”

The team executed a mathematical experiment addressing a communication challenge using 12 qubits made from laser-controlled ions, built by the quantum computing company Quantinuum. The experiment’s objective was for two virtual participants, referred to as Alice and Bob, to devise the most efficient method for exchanging messages and performing calculations.

One section of the quantum computer, acting as Alice, prepares a specific quantum state and transmits it to Bob, another segment of the machine. Bob must discern its properties and determine how to measure Alice’s state to produce an output. By iterating this process, the duo can establish a means to forecast Bob’s output before Alice discloses her state.

The researchers ran the procedure 10,000 times to refine the way Alice and Bob execute their tasks. From an analysis of these iterations and a rigorous mathematical examination of the protocol, they found that no classical algorithm using fewer than 62 bits could match the performance of the 12-qubit quantum computer on this task; matching it in full would require about 330 bits, nearly 30 times the quantum machine’s resources.

“This is an extraordinary scientific achievement that illustrates the extent of the ‘quantum advantage’ landscape, which may be broader than previously understood,” said Ashley Montanaro from the University of Bristol, UK. “Unlike most demonstrations of quantum superiority, there is virtually no prospect of a superior classical algorithm closing the gap.”

Ronald de Wolf from CWI, the Dutch national research institute for mathematics and computer science, highlights that the experiment leverages the recent rapid improvement of existing quantum hardware while drawing on theories of communication complexity that have been explored for years.

“The intricacies of communication are known to contribute to a verifiable and realistic distinction between quantum and classical systems. The difference is that advancements in hardware have made it feasible to implement the model for the first time,” he explains. “Moreover, they tackled a novel challenge in communication complexity, revealing a significant gap between classical and quantum capabilities even with just 12 qubits.”

These new findings differentiate themselves from earlier demonstrations of quantum superiority, but share a crucial element: their immediate practicality remains uncertain. Notable examples of quantum advantage that could produce substantial real-world benefits, such as Shor’s algorithm which could revolutionize encryption, still await confirmation regarding their applicability.

In the future, research teams might strengthen the result by separating Alice and Bob into physically distinct computers, ruling out the possibility that unmonitored interactions within a single machine affected the outcome. Even so, the practical utility of quantum dominance remains the critical issue, according to De Wolf.

“Progress beyond mere [quantum] dominance is essential for achieving [quantum] utility—quantum computers outperforming classical ones in specific areas of genuine interest, like some chemical computations and logistics optimization,” he suggests.


Source: www.newscientist.com

Quantum Computers Are Now Practical and Valuable

3D illustration of a quantum computer

AdventTr/Getty Images

Amidst the excitement surrounding quantum computing, the technology may appear as a catch-all solution for various challenges. While the science is impressive, real-world applications are still developing. However, the quest for viable uses is starting to yield fruitful results. Particularly, the search for exotic quantum materials is gaining traction, which could revolutionize electronics and enhance computational power.

The discovery and exploration of new phases—especially more exotic forms analogous to ice or liquid water—remain foundational to condensed matter physics. Insights gained here can enhance our understanding of semiconductor functionality and lead to practical superconductors.

Yet traditional experimental methods are increasingly inadequate for studying certain complex phases that theory suggests exist. For instance, the Kitaev honeycomb model predicts materials with a unique type of magnetism, but it has taken “decades of exploration to actually design this with real materials,” according to Simon Evered of Harvard University.

Evered and colleagues simulated this phenomenon using a quantum computer with 104 qubits made from ultra-cold atoms. They’re not alone in this endeavor: Frank Pollmann from the Technical University of Munich and his team used Google’s Sycamore and Willow quantum computers, which house 72 and 105 superconducting qubits respectively, to model conditions based on variations of the Kitaev honeycomb framework. Both teams have documented their findings.

“These two projects harness quantum computers to investigate new phases of problems that had been theoretically predicted but not observed experimentally,” notes Petr Zapletal from the University of Erlangen-Nuremberg, who was not involved in the studies. “The advancement of quantum simulations for complex condensed matter systems is particularly thrilling.”

Both research teams confirmed the presence of anyons in their simulations—a significant advance that illustrates the growing utility of quantum computers. Anyons are exotic quasiparticles, fundamentally different from the qubits used to simulate them, and notoriously challenging to emulate.

Particles in nature typically fall into two categories, fermions and bosons. While chemists and materials scientists often care most about fermions, qubits generally behave like bosons. The distinctions—in spin and collective behavior—complicate the simulation of fermions using bosons, but the cold-atom experiments used Kitaev models to bridge these gaps. Marcin Kalinowski of Harvard, who participated in the research, described the Kitaev model as a “canvas” for exploring new physics. Through this model, the team could tune quasiparticles in their simulations by adjusting interactions among the qubits. According to Kalinowski, some of these new particles might be employed to replicate novel materials.
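
The fermion-boson distinction the passage leans on can be stated compactly. Exchanging two identical particles multiplies the wave function by a phase, and anyons—possible only in two dimensions—are defined by that phase taking values other than $\pm 1$:

$$ \psi(x_2, x_1) = e^{i\theta}\, \psi(x_1, x_2), \qquad \theta = \begin{cases} 0 & \text{bosons} \\ \pi & \text{fermions} \\ \text{anything else} & \text{anyons (2D)}. \end{cases} $$

The factor $e^{i\pi} = -1$ for fermions is what produces the Pauli exclusion principle and the collective behaviors that make fermionic matter hard to mimic with boson-like qubits.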

Another critical aspect of the research was the use of Google’s quantum computer to examine materials outside equilibrium. Despite the significant exploration of equilibrium states in laboratories, the non-equilibrium realm remains largely uncharted. Pollmann notes that this aligns with laboratory trials where materials are repeatedly subjected to laser pulses. His team’s work reflects how condensed matter physicists study materials by exposing them to extreme temperatures or magnetic fields and then diagnosing changes in their phases. Such diagnostics are crucial for determining the conditions under which materials can be effectively utilized.

It’s important to clarify that these experiments don’t yield immediate real-world applications. To translate these findings into usable technologies, researchers will need to conduct further analysis on larger, less error-prone quantum computers. However, these preliminary studies carve out a niche for quantum computers in exploring physical phenomena, akin to the way traditional experimental tools have been employed for decades.

That material science might be the first field to showcase the value of quantum computing is not surprising. This aligns with how pioneers like Richard Feynman discussed quantum technology in the 1980s, envisioning its potential beyond mere devices. Moreover, this perspective diverges from the usual portrayal of quantum computing as technology primarily focused on outperforming classical computers in non-practical tasks.

“Viewing the advancement of quantum computing as a scientific approach, rather than simply through the lens of individual device performance, is undeniably supported by these experimental findings,” concludes Kalinowski.


Source: www.newscientist.com

Self-Magnifying Atoms Reveal Quantum Wave Functions

The wave functions of atoms can expand without altering their shape

ShutterStock / Bolbik

Extremely cold atoms can magnify their own quantum states, allowing them to be imaged with remarkable clarity. This capability helps researchers explore the behavior of quantum particles within unusual materials like superconductors and superfluids.

Mapping the quantum states of atoms—particularly the shapes of their wave functions—poses significant challenges, especially when the atoms are densely packed in solids and interact closely. To delve into the quantum behavior of such materials, scientists map their quantum properties onto extremely cold atoms, which can be manipulated with lasers and electromagnetic fields and arranged into closely packed patterns that mimic the atomic structures of solid materials.

Sandra Brandstetter from the University of Heidelberg and her team have developed methods to expand the wave functions of ultracold atoms by a factor of 50, enhancing their detectability.

Starting with around 30 lithium atoms cooled to just a few millionths of a degree above absolute zero, the researchers trapped the atoms in a flat configuration using lasers, allowing precise control of their quantum states. The team then manipulated the properties of the trapping light, effectively enlarging the atoms’ wave functions while carefully managing the trap to maintain stability, akin to fine-tuning a microscope’s lens, according to Brandstetter.

Following these adjustments, the researchers employed a reliable atomic detection technique to visualize wave functions in detail that were previously unattainable. “When imaging a system without prior magnification, the result is merely a singular blob, obscuring any structural insights,” Brandstetter explains.

Utilizing this innovative technique, the team examined various atomic configurations. For instance, they successfully imaged a pair of atoms interacting and forming molecules; the magnification permitted them to distinguish between each individual atom. The most complex setup involved 12 interacting atoms, each exhibiting different quantum spins that dictate the material’s magnetic properties.

Jonathan Mortlock at Durham University notes that although similar magnification methods have been explored there, this experiment is the first to use such an approach to identify the quantum characteristics of individual atoms in an array—details once deemed inaccessible.

The team aims to apply the method to study what happens when quantum particles known as fermions pair up and coalesce into liquids that flow with zero viscosity or conduct electricity with perfect efficiency. Understanding these states could pave the way for superior electronic devices, but researchers must first achieve a deeper comprehension of how fermions assemble and what pairing implies for the quantum state. Brandstetter says the new techniques now allow the creation of ultracold fermionic atoms and the imaging of their enlarged wave functions.

Topics:

  • Quantum Science
  • Atomic Physics

Source: www.newscientist.com

Quantum Routers Can Accelerate Quantum Computing

False-color image of quantum router circuits

MIT Squill Foundry

Quantum computers are poised to execute beneficial algorithms at an accelerated pace, thanks to advanced quantum routers that optimize data transmission efficiency.

Conventional computers mitigate slowdowns during complex program executions by temporarily storing information in random access memory (RAM). The essential component for developing QRAM, the quantum equivalent of RAM, is the router. This internal router manages information flow within the computer, distinct from a router that routes Internet queries to specific IP addresses.

Connie Miao at Stanford University and her team are actively creating these devices. “Our project originated from an algorithm that employs QRAM. Numerous papers have emerged, but few have tested it [experimentally],” she remarks.

The router is built from qubits—the core elements of quantum computers—and quantum memory composed of miniature superconducting circuits, regulated by electromagnetic pulses. Like a traditional router, the quantum version directs quantum information to a specific address. What makes the device unique is that the address itself can be placed in a superposition of two values. The research team tested this setup on three qubits and achieved approximately 95% fidelity in routing.

This implies that when integrated into QRAM, the device can embed address information into quantum states; once in such a state, it is impossible to determine which of the two locations holds the stored information.
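
A minimal way to see the routing idea is a controlled-SWAP (Fredkin) gate simulated with NumPy: an address qubit decides which of two output rails receives the data, and a superposed address entangles the data's location with the address. This is an illustrative sketch of the generic QRAM routing primitive, not the Stanford team's superconducting hardware.

```python
import numpy as np

def kron(*ops):
    """Tensor product of a sequence of vectors/matrices."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Basis ordering: |address, rail0, rail1>. A Fredkin gate swaps the two
# rails when the address qubit is 1: |1,b,c> -> |1,c,b>.
fredkin = np.eye(8)
fredkin[[5, 6]] = fredkin[[6, 5]]      # swap |101> <-> |110>

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)      # address in superposition

# Data loaded on rail0, rail1 empty, address = (|0> + |1>)/sqrt(2)
state_in = kron(plus, ket1, ket0).flatten()
state_out = fredkin @ state_in

# Expect (|010> + |101>)/sqrt(2): data on rail0 when the address is 0,
# on rail1 when it is 1 -- entangled with the address qubit.
for idx, amp in enumerate(state_out):
    if abs(amp) > 1e-12:
        print(f"|{idx:03b}>  amplitude {amp:.3f}")
```

Running the sketch prints equal amplitudes on |010> and |101>—the entangled output the article describes, in which you cannot say which location holds the data until you measure.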

Luming Duan from Tsinghua University in China notes that earlier quantum routers operated only intermittently; the new device represents a significant advance toward practical QRAM, which may enable the execution of quantum machine learning algorithms.

Team member David Schuster at Stanford states that while numerous unresolved questions remain about the practical impact of precise quantum routing, the potential applications are extensive, ranging from familiar algorithms and database searches to the creation of quantum IP addresses for future iterations of the Internet.

However, the current version of the router is still not reliable enough for all intended purposes; further work is needed to reduce errors and to incorporate additional qubits into future designs, says Sebastian Legger, who worked on the project at Stanford University.

Journal reference: PRX Quantum, in press


Source: www.newscientist.com

How a Skilled New Zealand Dog Took On a Quantum Computer

Feedback provides the latest insights into science and technology from New Scientist, showcasing recent developments. To share intriguing items you think our readers would enjoy, email us at Feedback@newscientist.com.

Computer vs Dog

Feedback often receives emails that start with striking statements. Elliot Baptist recently wrote, expressing curiosity about the comparison of well-trained New Zealand dogs to quantum computers.

Elliot referenced a preprint paper by cryptographers Peter Gutmann of the University of Auckland and Stephan Neuhaus of the Zurich University of Applied Sciences. The paper scrutinizes efforts to develop quantum computers capable of factoring very large numbers—that is, identifying two numbers that multiply to a given target.

This matters because many encryption systems depend on large numbers that are hard to factor: a quantum computer that could easily factor large numbers would compromise the security of countless servers and transactions. There have been notable milestones; IBM built a computer that factored 15 in 2001 (3 × 5, for reference) and reached 21 (3 × 7) by 2012. In 2019, the startup Zapata claimed it could factor 1,099,551,473,989.

However, Gutmann and Neuhaus remain optimistic about the future of encryption, noting that many of these quantum factorizations are carefully engineered. “Like stage magic, when a new quantum factorization is announced, the fascination lies not just in the trick, but in discerning how it was achieved,” they state.

The pair therefore set out to replicate the quantum factorization records using rather less advanced technology. The home-computer method involves a calculation I’ll leave to readers as an exercise. The abacus method is simpler, although the larger number requires an abacus arranged in 616 columns.
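
For a sense of scale, the records also fall to a few lines of Python trial division—an illustrative sketch in the spirit of the paper's comparison, not the authors' own code:

```python
# Replicate the "record" quantum factorizations by trial division.
def factor(n: int) -> tuple[int, int]:
    """Return a factor pair of n (smallest factor first)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n  # n is prime

for n in (15, 21, 1_099_551_473_989):
    p, q = factor(n)
    print(f"{n} = {p} x {q}")

# The 13-digit Zapata number factors as 1,048,589 x 1,048,601 after
# about a million divisions -- well under a second on a laptop, no
# cryogenics (or dog treats) required.
```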

Now, the dog method. To replicate the factorizations of 15 and 21—both of which have 3 as a factor—the researchers trained a dog to bark three times. “We took the recently proofed reference dog, depicted in Figure 6, and commanded it to bark thrice for both 15 and 21,” they wrote. “This task was more complicated than expected, as Scribble was exceptionally well behaved and hardly barked.”

Elliot admits he “is not qualified to judge the discussion’s validity,” and remarks that the Feedback team might be even less so. Readers with a deep understanding of quantum computing and encryption are encouraged to write in and explain what is going on. Feedback may not grasp the explanation, but we’ll try presenting it to one of the cats and noting the reaction.

Robot Response

Feedback has received correspondence about next year’s “inspirational” conference on love and interactions with robots, slated to take place in Zhejiang, China.

Tim Stevenson pointed out that I failed to mention a critical detail: the attendance fee. Feedback thrives on diligence, so I revisited the conference website and discovered it costs $105.98 to register. I suspect the actual tickets could hold higher prices, but I didn’t want to register just to find out.

Meanwhile, Pamela Manfield weighed in, disagreeing with Feedback’s stance. However, she acknowledged the controversy, especially given the Trump administration’s cuts to research funding.

Seasonal Injuries

Nicole Golowski wrote to spotlight research from 2023 that may have flown under our radar, remarking that it belongs in the category of “obvious findings”. The study, on penile fractures over the Christmas period, exemplifies the notion; as Nicole puts it, “It speaks for itself.”

Using data from Germany between 2005 and 2021, researchers examined whether “tears of the tunica albuginea surrounding the corpora cavernosa” were more frequent during certain times of the year, particularly around the holiday season. The Christmas period (December 24th-26th) and summertime exhibited a higher incidence of such injuries, while unexpectedly, the New Year (December 31st to January 2nd) did not follow this trend. The researchers proposed that “Christmas may be a risk factor for penile fractures due to the heightened intimacy and joy associated with the festive season.”

The study concludes: “Last Christmas, penile fractures rose in frequency. This year, let’s avoid doing anything that ends in tears.”

Apologies for any typos: Feedback noted that this section seemed to curl up defensively.

Have you shared your thoughts with Feedback?

Stories can be submitted to feedback@newscientist.com. Make sure to include your home address. Check our website for this week’s and past Feedback editions.

Source: www.newscientist.com

The US Military Aims to Enhance Internet Security Through Quantum Technology


Can we add quantum to the internet to enhance safety?

NicoElNino / Alamy

The U.S. military has initiated a program aimed at augmenting traditional communication infrastructure with quantum devices to improve the security of information shared over the Internet.

Quantum networks utilize the quantum states of particles for information sharing, thereby ensuring high security. For instance, the messages linked to these quantum states cannot be copied without detection due to inherent quantum properties. Consequently, numerous quantum communication networks have already been established globally.

However, the development of a fully functional quantum internet remains restricted due to various unresolved technological challenges. Instead of awaiting the resolution of these issues, the U.S. Defense Advanced Research Projects Agency (DARPA) has propelled a program focused on uncovering the immediate advantages of integrating quantum technologies into existing communication networks.

The agency emphasizes its goal of pinpointing practical, beneficial quantum enhancements available in the short term. “We can’t convert everything from classical to quantum,” remarks Allison O’Brien, DARPA program manager of the Quantum-Augmented Network (QuANET) initiative.

In August, the QuANET team held a hackathon culminating in a tangible demonstration: light prepared in a specific quantum state was used to transmit images, including the DARPA logo and simple cat graphics. This initial trial of the quantum-augmented network achieved a bitrate sufficient to stream high-resolution video.

O’Brien indicates that the quantum state demonstrated is just one of the many quantum properties the QuANET initiative is investigating. Researchers are also delving into “hyperentanglement,” in which multiple properties of light are linked simultaneously through quantum entanglement. Initial mathematical models suggest this could allow more secure data to be encoded in fewer optical signals, optimizing resource use within quantum networks.

Meanwhile, the team is exploring the prospect of generating light with certain quantum-like characteristics, but without fully altering the physical properties at a fundamental level.

Furthermore, Quanet researchers are designing quantum network interface cards that integrate with communication devices to facilitate the transmission and reception of quantum signals.

Numerous questions remain concerning the practical utility of these innovations, including where in a network they are best deployed. However, O’Brien notes that QuANET is uniting experts in quantum physics, electrical engineering, and networking to address these questions comprehensively.

“Quantum networks are not designed to be a universal solution,” states Joseph Lukens from Purdue University, Indiana. They excel at specific tasks, and performing those tasks effectively requires some conventional networking components. “The future lies in the automatic integration of quantum networks with traditional ones,” Lukens asserts. He believes initiatives like QuANET are valuable, despite the many open questions about how much quantum technology can enhance our well-established internet infrastructure.

If this program successfully devises a means for users to activate an ultra-secure “quantum mode” on their devices, it will mark a significant achievement. In that scenario, we could all benefit from these advancements without needing to understand the complexities of quantum physics, says Lukens.


Source: www.newscientist.com

Another Quantum Computer Achieves Quantum Advantage — Is It Significant?

Jiuzhang 4.0 early prototype, a quantum computer that has achieved quantum advantage

Chao-Yang Lu/University of Science and Technology of China

A quantum computer may have achieved “quantum advantage” by performing a task beyond the capabilities of the most powerful supercomputers: experts estimate that replicating its calculation on classical machines would take an incomprehensible amount of time, trillions of times the age of the universe. What does this development mean for creating truly functional quantum computers?

The latest record holder is a quantum computer known as Jiuzhang 4.0, which uses particles of light, or photons, to execute computations. Chao-Yang Lu and his team at the University of Science and Technology of China used it for Gaussian boson sampling (GBS), which involves measuring a sample of photons after they navigate a sophisticated arrangement of mirrors and beam splitters.

In earlier attempts at this task, the number of photons never exceeded 300. In contrast, Jiuzhang employed 3,090 particles, roughly a tenfold increase. Lu and his colleagues estimate that contemporary algorithms on the most powerful supercomputers would require a staggering 10⁴² years to replicate what Jiuzhang accomplished in just 25.6 microseconds.
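
Taking those reported figures at face value, the arithmetic behind the headline claim is straightforward (illustrative only; constants rounded):

```python
# Sanity-check the claimed gap: 10^42 years classically vs 25.6 us.
SECONDS_PER_YEAR = 3.156e7
AGE_OF_UNIVERSE_YEARS = 1.38e10

classical_s = 1e42 * SECONDS_PER_YEAR
quantum_s = 25.6e-6

print(f"speedup factor: {classical_s / quantum_s:.1e}")       # ~1.2e54
print(f"universe-ages:  {1e42 / AGE_OF_UNIVERSE_YEARS:.1e}")  # ~7.2e31
# 'Trillions of times the age of the universe' (1e12) is, if anything,
# a large understatement of the team's estimate.
```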

“These results are certainly an impressive technical achievement,” said Jonathan Lavoie of the Canadian quantum computing startup Xanadu, which previously held the GBS record with 219 photons. Chris Langer of Quantinuum noted that while other systems have demonstrated quantum advantages in different forms of quantum computing, this advancement is significant: “It’s essential to establish that quantum systems cannot be simulated by classical means,” he asserts.

That said, Jiuzhang’s previous versions also performed GBS with considerable numbers of photons, and each time a classical computer eventually replicated the results, sometimes within an hour.

Bill Fefferman from the University of Chicago mentions that he is working on classical algorithms to beat quantum systems, but notes that photonic devices pose particular challenges for his approach to exploit: many photons are lost during operation, and the systems tend to be noisy. “They’ve managed to reduce noise while simultaneously scaling up the experiment. So far, our algorithm has yet to find a breakthrough,” states Fefferman.

Lu points out that photon loss was the primary hurdle his team faced in the latest experiment. And Jiuzhang is still not free of noise, leaving room for new classical simulation strategies to challenge its claim to superiority.

“In my view, they haven’t achieved full power yet, but they are certainly in a position to prove that such classical strategies may not be feasible,” remarks Jelmer Renema from the University of Twente in the Netherlands.

This sets up a “virtuous cycle” in which competition between classical algorithms and quantum devices sharpens our understanding of the blurry line separating the classical and quantum realms, according to Fefferman. From a fundamental science perspective this is a triumph for everyone, but whether the approach can be harnessed in more powerful quantum machines remains an open question.

Langer describes GBS as an “entry-level benchmark” that highlights the distinction between quantum and classical computers, though the results do not necessarily indicate the practical utility of such machines. From a rigorous mathematical perspective, evaluating GBS as concrete evidence of quantum advantage is challenging, as Nicolas Quesada at Polytechnique Montréal in Canada points out, and a clear pathway from GBS to a more capable machine remains elusive.

This is primarily because Jiuzhang’s hardware is highly specialized and cannot yet be programmed for a variety of calculations. “It might demonstrate computational advantages for narrow tasks, but it fundamentally lacks the key components for practical quantum calculations that involve fault tolerance,” explains Lavoie. Fault tolerance refers to a quantum computer’s ability to recognize and correct its own errors—an essential capability that has yet to be realized in contemporary quantum systems.

Meanwhile, Lu and his team point to various applications stemming from Jiuzhang’s capabilities in GBS: the approach could aid computations tied to image recognition, chemistry, and specific mathematical challenges associated with machine learning. Fabio Sciarrino from Sapienza University of Rome suggests that though this quantum computing paradigm is still nascent, its realization could prove transformative.

Specifically, advancements like Jiuzhang’s device could pave the way for the creation of extraordinary light-based quantum computers, asserts Sciarrino. These computers would be programmed in entirely innovative manners and excel in machine learning-related tasks.


Source: www.newscientist.com

Quantum Device Simultaneously Defines All Electrical Units

A standardized unit is necessary for measuring electricity

Yuichiro Chino/Getty Images

A single quantum device can now define all three units critical for understanding electricity.

When measuring electricity, one must assess current in amperes, resistance in ohms, and voltage in volts. Researchers first need consensus on the definition of each unit, which has historically required separate quantum devices and often visits to different labs.

Recently, Jason Underwood and his team at the National Institute of Standards and Technology (NIST) in Maryland have showcased how to characterize these units using a single device. “Integrating these two quantum standards has always felt like a Holy Grail,” he remarks. “It was a prolonged endeavor. Much like Sisyphus, we’ve been pushing this boulder uphill.”

This integration posed challenges as both devices depend on delicate quantum effects observable only at extremely low temperatures. Additionally, certain devices historically required magnetic fields, which could disrupt the operation of others.

The innovative “one box” approach circumvents these issues by using new materials that exhibit the necessary quantum effects without magnetic fields, allowing the previously separated quantum systems to operate together within the same cryostat. The method successfully realizes amperes, ohms, and volts with an uncertainty of about one part in a million for each unit.
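
In outline, the quantum standards being combined are the Josephson voltage standard and the quantum Hall resistance standard, with the ampere obtained from Ohm's law; the relations below are the well-established ones (values rounded):

$$ V = \frac{n f}{K_J},\;\; K_J = \frac{2e}{h} \approx 483{,}597.8\ \text{GHz/V}; \qquad R = \frac{R_K}{i},\;\; R_K = \frac{h}{e^2} \approx 25{,}812.807\ \Omega; \qquad I = \frac{V}{R}. $$

Here $f$ is an applied microwave frequency and $n$, $i$ are integers, so both voltage and resistance trace back to the constants $e$ and $h$—which is why a single cryostat hosting both effects can anchor all three electrical units.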

However, before these combined devices can be used practically, researchers must further enhance their precision. Currently, accuracy is hampered by the heating generated when placing the two systems and their wiring too closely together. Moreover, development on the new quantum material, which facilitates the cooperation of both systems, is ongoing, according to Lindsey Rodenbach at Stanford University in California.

Rodenbach views the project as a significant achievement, yet Underwood notes that budget constraints at NIST, which is funded by the US government, have impeded the team’s push for even higher precision. He points to a report on the agency’s crumbling infrastructure, which revealed that several NIST facilities are in disrepair. NIST declined to comment on the matter.

Susmit Kumar from the Norwegian Metrology Service describes the new device as an “impressive engineering feat” that could make quantum electrical standards more accessible and affordable for researchers and technology developers worldwide. He is part of the QuAHMET consortium, which likewise aims to develop user-friendly devices for measuring ohms using novel materials.

“The International System of Units is a shared language for scientists and engineers everywhere. Our goal is to simplify their use as much as possible,” says Richard Davis, a retired member of the International Bureau of Weights and Measures. He adds that integrating existing devices will foster advancement moving forward.


Source: www.newscientist.com

The Challenges of Creating a Viable Quantum Broadcasting Station

Can I broadcast quantum information?

Weiquan Lin/Getty Images

Distributing quantum information akin to traditional broadcasting may not be feasible, even with mathematical models designed to work around quantum mechanics’ inherent limitations.

It is a well-established fact that quantum copy machines cannot exist due to the no-cloning theorem, which is a fundamental principle of quantum physics that prevents the duplication of quantum states. However, physicists have explored the possibility of transmitting or broadcasting copies of quantum information to multiple recipients without breaching this law.

To achieve this, researchers must permit the quantum copies to differ slightly and integrate additional information processing steps for the receivers. Recently, Zhenhuan Liu from Tsinghua University in China and his team demonstrated that these methods might be impractically complex.

“There’s no ‘Ctrl+C’ in the quantum realm,” Liu states. “If you aim to send quantum information to several receivers, there are no quick fixes. You must generate sufficient copies and transmit each one individually.”

The researchers honed in on a previously proposed “virtual quantum broadcast” protocol. In this model, information is adjusted so that the various transmitted states maintain correlations with one another without being identical physical replicas; the messages received are not precise duplicates but share enough characteristics to be valuable. This is analogous to a television network broadcasting slightly different episodes of a serialized drama to each household while generally maintaining the narrative flow. While the protocol certainly works, team member Xiangjing Liu at the National University of Singapore questioned its efficiency.

The team analyzed the effort required by recipients to ensure that the information they received, despite not being identical, remained useful. Their mathematical assessment indicated that viable quantum broadcasts may not be realistic.

Counterintuitively, this optimized approach to quantum broadcasting demands more resources than simply preparing and sending a separate copy to each recipient—the equivalent of drafting individual letters rather than firing off one group text—according to team member Yunlong Xiao, based at a research institute in Singapore.

“If your sole objective is to simply relay quantum states across various locations, it’s questionable whether exploring virtual quantum broadcasts is a viable method,” says Seok Hyun Lee at Ulsan National Institute of Science and Technology in Korea. He believes this protocol has never been considered a practical guideline for quantum communication but rather an investigation into the fundamental limits of quantum information theory.

Paolo Perinotti from Pavia University in Italy acknowledges the mathematical significance of the team’s efforts but also suggests it is unlikely to provide immediate benefits to quantum technology.

Looking forward, researchers are keen to explore the theoretical implications of this current analysis. It helps us comprehend the correlations permissible when manipulating quantum states, regardless of whether they are distributed over space or transmitted sequentially in time. Xiangjing Liu notes that this work could form the basis of a new framework for understanding quantum processes, emphasizing a clearer distinction between time and space compared to traditional methods.

Topics:

  • Quantum Computing
  • Quantum Physics

Source: www.newscientist.com

Why There’s No Consensus on the Implications of Quantum Physics

What does interpretation mean in quantum theory?

ShutterStock/Cyber Magic Man

If you were to poll a thousand physicists, you’d find no consensus. This holds for a multitude of subjects, from the nature of the universe and the composition of dark matter to the quest for perfectly efficient wiring. Recently, the team at Nature posed a question that sharply delineated the field’s divisions: they surveyed 1,100 physicists about their preferred interpretations of quantum mechanics. The outcome? “Significant disagreement.”

This does not surprise me. In my reporting, I frequently encounter physicists who interpret the results of quantum experiments in varied ways. They might all analyze the same equation or experimental outcome but arrive at different narratives about reality.

So, how significant is this discord, and what does the quest for interpretation really entail? To begin with, the situation is peculiar for a discipline we have explored and tested for over a century. There’s no denying the robust success of quantum mechanics, the remarkable framework governing the behavior of the extremely small and the extremely cold. The theory not only passes every evaluation with distinction but also underpins technological innovations like the transistors that power electronic devices and the fiber optics that carry the internet. “Quantum mechanics is remarkably successful, both theoretically and practically,” asserts Peter Lewis from Dartmouth College in New Hampshire.

However, while physicists can articulate equations and construct devices, if I may put it bluntly, they don’t always agree on what these equations signify. They have not reached consensus on how quantum mechanics describes the observable realities of our world. The research published in Nature indicates that many physicists favor the Copenhagen interpretation of quantum mechanics, which discourages contemplation of the nature of quantum entities and prompts physicists to focus merely on calculations. Others endorse the many-worlds interpretation, which necessitates belief in an endlessly branching universe, or hyper-deterministic theories. Notably, only 24% of physicists expressed complete confidence in their chosen interpretations.

Discrepancies also surfaced regarding fundamental aspects of quantum theory, such as wave functions, the enigmatic link between particles known as quantum entanglement, and the iconic double-slit experiment that confirmed all matter possesses hidden wave-like attributes. “Moreover, some scientists, even those in similar camps, exhibit varied understandings of their chosen interpretations,” Elizabeth Gibney highlighted in her analysis of the research.

Lewis observes that this scenario—a blend of extraordinary technical advancement and complete philosophical bewilderment—is unparalleled in the annals of science. Navigating this situation remains a challenge. Some physicists perceive it as a discredit to the field, while others argue it’s a positive aspect of scientific diversity. I found myself wrestling with the term “interpretation” to discern which viewpoint I align with the most. What does this term actually imply, and what criteria make an interpretation viable or competitive? Ultimately, I returned to the source material.

“For me, interpreting quantum mechanics transcends mere physics; it veers into philosophy or perhaps psychology,” noted Jeffrey Harvey from the University of Chicago. I recall his class as being a mathematical challenge, and I vividly remember the excitement of discovering that the waves in the abstract Hilbert space “exist.” However, I struggle to remember any clear arguments about how to interpret the complex mathematical outcomes we examined. Harvey is hesitant to teach the various interpretations, viewing them as competing “mental models” rather than experimentally distinguishable frameworks. When two interpretations stem from the same equation and yield identical experimental predictions, why favor one over the other? “This reflects an agnostic stance. I’d prefer to keep an open mind rather than feel compelled to choose,” Harvey explained.

Jonte Hance at Newcastle University in the UK contends that the term interpretation is often used too broadly: some “interpretations” effectively extend quantum mechanics by adding or modifying its core equations. “The challenge lies in the fact that interpretations are viewed differently, as are the specific issues faced by quantum mechanics,” Lewis states. The Nature survey polled respondents on eight interpretations, some of which augment the foundational rules of quantum mechanics, while others, like the Copenhagen interpretation, leave them as they are, so whether any additions are necessary remains open for debate.

To grasp this distinction, consider the famous Schrödinger equation, the equation physicists employ to predict outcomes related to quantum objects. Several interpretations of quantum mechanics (e.g., the many-worlds interpretation) rely on the Schrödinger equation exactly as it was originally formulated. Conversely, spontaneous collapse theories, which seek to explain why quantum effects are infrequently observed in our macroscopic world, incorporate additional terms into the Schrödinger equation that signify new physical processes. Hance asserts that this technically renders the latter an extension rather than merely an interpretation. In such cases, experimental tests could potentially reveal whether our reality requires modifying the Schrödinger equation.
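
To make the distinction concrete, here is the standard time-dependent Schrödinger equation, together with a schematic of what an extension looks like. The extra term below is a placeholder for the nonlinear, stochastic collapse terms such models add, not the equation of any specific theory:

    i\hbar \frac{\partial}{\partial t}\,\psi(x,t) = \hat{H}\,\psi(x,t)

    i\hbar \frac{\partial}{\partial t}\,\psi(x,t) = \hat{H}\,\psi(x,t) + (\text{nonlinear stochastic collapse terms})

An interpretation keeps the first equation untouched and argues about what the wave function ψ means; an extension commits to extra terms whose effects could, in principle, show up in experiments.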

This could provide evidence compelling researchers like Harvey to abandon agnosticism. Hance notes that any successful extension of quantum mechanics would have to reproduce the numerous experiments whose predictions are already highly accurate, while also yielding clearly distinct and testable new predictions.

At the same time, all three researchers acknowledged that many physicists manage to perform their daily tasks without delving into the complexities of quantum mechanics interpretations. This partly explains why my class with Harvey didn’t cover quantum mechanical interpretations; I was primarily taught how to apply the theory. “I don’t perceive it as a problem in terms of innovation and applications in most areas of quantum mechanics. [Interpretation] is mainly a philosophical concern,” Lewis remarks.

Nonetheless, it doesn’t mean that interpretations lack merit, even when competing interpretations don’t yield differing experimental predictions. “While physicists may find interpretations less integral to physics, they can significantly influence how innovative ideas emerge. In that regard, I believe the diversity of mental models fosters exploration of new concepts arising from quantum mechanics,” says Harvey.

Moreover, even philosophical perspectives hold weight, especially regarding the growth of quantum mechanics. For Lewis, this historically unprecedented divide between utility and meaning in quantum mechanics might offer insights into the limitations of science and the philosophical boundaries regarding what can or cannot be understood. The fact that quantum mechanics, a mathematical model explaining the world exceptionally well, still lacks consensus on its significance is telling.

Hance similarly argues that assigning meaning is a fundamental aspect of physics. When discussing this, he points to social media posts from people like Elon Musk; while I may not have seen them, I’m struck by the tremendous simplifications in such claims. “For me, if it’s all just about developing equations, it’s engineering. Some people are inclined to pursue engineering careers; I haven’t followed that path. This doesn’t imply engineers lack curiosity; rather, I feel the tension stems from existential concerns. It’s a question that has kept physicists awake for centuries, and it will likely persist into the future.”

Topics:

Source: www.newscientist.com

Imaging Molecules’ Minute Quantum Jitter with Unmatched Clarity

Accelerator tunnels at the European XFEL, where atomic motion is meticulously studied.

Xfel/Heiner Mueller-Elsner

In a groundbreaking achievement, a highly advanced X-ray laser has unveiled the slight quantum movements of atoms in molecules that would otherwise be expected to remain stationary.

Quantum physics thrives on uncertainty. Heisenberg’s uncertainty principle prevents scientists from simultaneously and accurately determining a particle’s position and momentum, which means quantum particles can never be fully at rest. Instead, atoms are perpetually in minuscule motion.
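
In symbols, the uncertainty principle bounds the product of the spreads in a particle’s position and momentum:

    \Delta x \, \Delta p \ge \frac{\hbar}{2}

A perfectly motionless atom would have zero momentum spread, which the inequality forbids, so some residual zero-point jitter always survives. For the textbook case of a harmonic oscillator of mass m and frequency ω, for example, the ground-state position spread works out to \Delta x = \sqrt{\hbar / 2 m \omega}.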

Nonetheless, measuring this subtle Heisenberg wiggle is challenging in complex molecules, where atoms exhibit various motion patterns. Recently, Till Jahnke of the European XFEL facility, along with his team, successfully captured this phenomenon using molecules composed of 11 atoms, including carbon, hydrogen, nitrogen, and iodine.

“This was my first experiment utilizing an extraordinary tool,” Jahnke remarked. The pivotal device was this “beast” of a laser, which bombarded the molecules with intense bursts of X-rays. Although each pulse lasted only around a quadrillionth of a second, it was a million times brighter than conventional medical X-rays.

Each X-ray pulse stripped electrons from the molecule, causing the atoms to become positively charged and repel each other explosively. By analyzing the aftermath of these explosions, the scientists were able to reconstruct in detail the quantum fluctuations of the atoms in their lowest energy states.

The team discovered that the Heisenberg wiggle follows a synchronized pattern in the movements of specific atoms. While this wasn’t unexpected given the molecular structure, the researchers were astonished by the precision of their measurements, as noted by team member Ludger Inhester at DESY, the German national electron synchrotron laboratory.

Next, the researchers aim to explore how quantum fluctuations influence molecular behavior during chemical reactions. They also intend to adapt their methodology to study electron movements.

“We are exploring ways to expand our findings to larger systems. There are numerous avenues for future research,” shared team member Rebecca Boll from European XFEL.

Topic:

Source: www.newscientist.com

Is It Possible to Capture Quantum Spookiness Without Entanglement?


Light particles seem to display quantum peculiarities even without entanglement

Wladimir Bulgar/Science Photo Library

Particles that appear to be unentangled have passed the renowned Bell test with flying colors. The experiment offers fresh insights into the peculiarities of the quantum realm.

Nearly sixty years ago, physicist John Stewart Bell devised a method to determine whether our universe is better explained by quantum mechanics or by traditional theories. The pivotal distinction lies in quantum theory’s incorporation of “non-locality,” or effects that can persist across vast distances. Remarkably, every experimental implementation of Bell’s tests to date supports the idea that our physical reality is non-local, indicating that we reside in a quantum world.

However, these experiments primarily focused on particles that are closely associated via quantum entanglement. Now, Xiao-Song Ma from Nanjing University in China, along with his team, claims they conducted the Bell Test without relying on entanglement. “Our research may offer a novel viewpoint on non-local correlations,” he states.

The experiment commenced with four specialized crystals, each generating two light particles, or photons, when exposed to a laser. These photons possess various properties measurable by researchers, such as polarization and phase, which describe their behavior as electromagnetic waves. The researchers guided the photons through an intricate arrangement of optical devices, including crystals and lenses, prior to detection.

A standard Bell test involves two fictional experimenters, Alice and Bob, evaluating the properties of correlated particles. By plugging their observations into Bell’s “inequality” equation, Alice and Bob can ascertain whether the particles are linked in a non-local manner.
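
For readers who like to see the arithmetic, here is a minimal sketch in Python of the CHSH form of the inequality. The choice of the CHSH variant is an assumption on our part, since the article does not name the exact inequality used. For entangled particles in a spin singlet, quantum mechanics predicts a correlation of -cos(a - b) between measurement angles a and b:

    import numpy as np

    # Quantum prediction for the correlation between settings a and b
    # when measuring a spin-singlet pair.
    def E(a, b):
        return -np.cos(a - b)

    # Measurement settings (radians) that maximize the quantum violation.
    a1, a2 = 0.0, np.pi / 2
    b1, b2 = np.pi / 4, 3 * np.pi / 4

    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(abs(S))  # ~2.828, exceeding the bound of 2 that local models obey

Any local model is bound by |S| ≤ 2, so a measured value near 2.83 signals correlations that no local mechanism can explain.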

In the new experiment, Alice and Bob were represented by sets of optical devices and detectors rather than by interlinked photons. In fact, the researchers incorporated devices into the setup specifically to prevent the photons from becoming entangled in properties such as frequency. Nonetheless, when Alice and Bob’s measurements were analyzed using the inequality equation, the results indicated a stronger correlation among photons than could be explained by local effects alone.

Mario Krenn from the Max Planck Institute for the Science of Light in Germany suggests that this might be linked to another peculiar property of photons: it is impossible to identify which crystal each photon was “born” in and what path it took, making the photons indistinguishable. Previously, Krenn and colleagues utilized this property, termed “indistinguishability by path identity,” to entangle photons. In this scenario, however, the setup ensured that indistinguishability was the only quantum peculiarity that remained in play.

The team has yet to formulate a definitive theory explaining how entanglement-like outcomes can manifest in the Bell test without entanglement actually being employed, but Ma proposes that indistinguishability could be the underlying condition for several quantum phenomena. Thus, even setups that lack entanglement might harness the fundamental ingredient necessary to create non-local correlations.

Krenn and Ma express hope that fellow physicists will propose alternative theories and identify experimental loopholes in their version of the Bell test. This mirrors the historical development of the standard Bell test, where nearly five decades elapsed between the initial experiments and the loophole-free tests that conclusively ruled out all alternative explanations.

One contentious aspect may be the “post-selection” technique utilized by the team. Stefano Paesani at the University of Copenhagen in Denmark argues that this raises questions about whether unentangled photons can be convincingly recognized as non-local within Bell’s tests. After the selection process, he contends that the experiments resemble more traditional scenarios where entanglement exists.

Jeff Lundeen from the University of Ottawa, Canada, asserts that while Bell-type tests can yield interesting experiments for examining light, this one “holds no profound significance concerning the nature of the universe or reality.”

In such circumstances, there is the potential for Alice and Bob to effectively share information, generating correlations that researchers might misinterpret as non-local effects. Lundeen maintains that the new experiment doesn’t completely eliminate the possibility that Alice and Bob were, in effect, colluding. “Thus, this experiment doesn’t quite carry the same weight as the renowned violation of Bell’s inequality,” he states.

“This represents one of the elegant extensions of a landmark finding from the ‘Glorious Age’ of the 1990s,” notes Aephraim Steinberg at the University of Toronto, Canada. Nevertheless, in his assessment, traces of entanglement remain in the new experiment—not at the photon level, but rather within the quantum field.

Looking forward, the team aims to enhance the apparatus to address some of these criticisms. For instance, by generating more photons from each crystal, the researchers could avoid relying on post-selection. “Our collaborative group has already pinpointed several critical potential shortcomings, which we are eager to tackle in the future,” states Ma.

topic:

Source: www.newscientist.com

You Could Potentially Share Near-Infinite Quantum Entanglement

Quantum entanglement can be treated as a shareable resource

Peter Julik/Alamy

Quantum entanglement, an enigmatic connection between particles, serves as a crucial asset for quantum computing and communication, and in some instances, can be shared almost limitlessly.

Numerous quantum operations, including the secure transfer of encrypted quantum data and computations on quantum systems, depend on multiple entangled particles. Ujjwal Sen and his team at the Harish-Chandra Research Institute in India have asked whether entanglement can be shared rather than created anew.

“We imagined a scenario where someone possesses an abundance, like money or treats, willing to distribute it among children, employees, or others,” he explains.

To explore this idea, his team formulated a mathematical model featuring two hypothetical researchers, Alice and Bob, who share entangled particles. When additional researchers, Charu and Debu, require entanglement but cannot generate their own, the first pair must assist.

Their calculations indicated that if Charu’s particles interacted with Alice’s, and Debu’s with Bob’s, the initial pair could transfer part of their entanglement to the latter pair. Kornikar Sen, another researcher at the Harish Chandra Research Institute, clarified that although Charu and Debu couldn’t interact with each other, they could utilize a shared “entanglement bank.”

In fact, the researchers concluded that this procedure for sharing entanglement could potentially accommodate an infinite number of successive pairs of researchers unable to create their own entangled states. Ujjwal Sen expressed that this revelation was surprising, as they had not anticipated the ability to share entanglement across so many pairs when they commenced their calculations.

Moreover, the team pinpointed how the experimenters would need to modify their operations on the particles to facilitate this sharing mechanism, although these specific methods have yet to undergo experimental validation.

Chirag Srivastava from the Harish-Chandra Research Institute added that each new experimenter obtaining entanglement from Alice and Bob would acquire a diminishing share, as some entanglement dissipates during interactions.

Consequently, while the sharing methodology could theoretically continue forever, in practice, it would sooner or later cease when some researchers receive insignificantly small portions of entanglement.
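
A back-of-the-envelope sketch in Python makes the point. The fixed extraction fraction below is our own simplifying assumption, purely for illustration, rather than anything from the team’s model:

    # Illustrative toy, not the team's actual protocol: assume each new
    # pair of experimenters draws a fixed fraction of whatever
    # entanglement remains in the Alice-Bob "entanglement bank".
    fraction = 0.5       # hypothetical extraction fraction per pair
    remaining = 1.0      # normalized entanglement of the initial pair
    threshold = 1e-9     # shares below this are practically useless

    pairs_served = 0
    while remaining * fraction > threshold:
        remaining -= remaining * fraction   # hand a share to the next pair
        pairs_served += 1

    # The process never hits exactly zero in principle, but the shares
    # shrink geometrically and fall below the threshold after a few
    # dozen pairs.
    print(pairs_served)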

How this situation unfolds—and how it measures against other methods by which researchers can obtain entanglement from a single source—remains to be explored through ongoing experiments.

topic:

Source: www.newscientist.com

Physicists Uncover Unusual Quantum Echoes in Niobium Superconductors

Researchers from Ames National Laboratory and Iowa State University have unveiled the emergence of Higgs echoes in niobium superconductors. These findings shed light on quantum behavior that could influence the development of next-generation quantum sensing and computing technologies.

Using Higgs echo spectroscopy, Huang et al. reveal unconventional echo formation due to non-uniform expansion and soft quasiparticle bands, dynamically evolving under THz drive. Image credit: Ames National Laboratory.

Superconductors are materials known for conducting electricity without resistance.

These superconducting materials exhibit collective oscillations referred to as the Higgs mode.

The Higgs mode is a quantum phenomenon in which the superconducting order parameter oscillates in a manner analogous to the Higgs boson.

Such modes manifest when the material experiences a superconducting phase transition.

Monitoring these vibrations has posed challenges for scientists for many years.

Additionally, they interact complexly with quasiparticles, which are electron-like excitations arising from superconducting dynamics.

By utilizing advanced terahertz (THz) spectroscopy, the researchers identified a new type of quantum echo, known as the Higgs echo, in superconducting niobium materials used in quantum computing circuits.

“Unlike traditional echoes seen in atoms and semiconductors, Higgs echoes result from intricate interactions between Higgs modes and quasiparticles, generating anomalous signals with unique properties.”

“Higgs echoes can uncover and reveal hidden quantum pathways within a material.”

By employing precisely-timed THZ radiation pulses, the authors were able to detect these echoes.

These THz radiation pulses can also facilitate the encoding, storage, and retrieval of quantum information embedded in the superconducting material via echoes.

This study illustrates the ability to manipulate and observe the quantum coherence of superconductors, paving the way for innovative methods of storing and processing quantum information.

“Grasping and controlling these distinctive quantum echoes brings us closer to practical quantum computing and advanced quantum sensing technologies,” stated Dr. Wang.

A paper detailing these findings was published on June 25th in the journal Science Advances.

____

Chuankun Huang et al. 2025. Discovery of unconventional quantum echoes due to Higgs coherence interference. Science Advances 11 (26); doi:10.1126/sciadv.ads8740

Source: www.sci.news

Quantum Physics Laws Might Erase the Universe That Preceded Ours

Did the cosmos originate from a massive bounce from a different universe?

Vadim Sadovski/Shutterstock

Is it possible that our universe will continuously expand, then contract back into a small point, repeating the Big Bang? According to recent mathematical analyses, the laws of physics suggest that such cyclical behavior is unlikely.

A pivotal element in the concept of a cyclical universe is the “big bounce,” which reimagines the beginning of our known universe as an event following such a bounce rather than the traditional Big Bang. The Big Bang is characterized by an incomprehensibly dense concentration of matter and energy, a singularity where gravity becomes intense enough to break down the laws of physics, from which everything expanded outward. Conversely, a universe beginning with a big bounce allows us to contemplate realities before what we perceive as the inception, potentially emerging from another universe that contracted into an extremely dense state, but not necessarily a singularity.

Thus, the essential inquiry about whether time began with a singularity becomes crucial for understanding our universe’s past and future. If the big bounce indeed marks the inception of our universe, it may also inform its prospective trajectory. The initial idea proposed by Oxford’s Roger Penrose in 1965 revolved around the inevitability of collapse under general relativity, the prevailing framework for understanding gravity, particularly related to black holes, which also represent scenarios where gravity can disrupt the fabric of space-time. Penrose concluded that if gravity intensifies sufficiently, singularities cannot be evaded.

Currently, Raphael Bousso of the University of California, Berkeley, has introduced critical insights enhancing these findings by elucidating the quantum properties of the universe.

While Penrose’s arguments didn’t incorporate quantum theory, Bousso notes that prior explorations by Aron Wall of Cambridge University considered scenarios of very minimal gravity. Bousso’s analysis, however, does not limit gravity’s intensity, and he asserts that it “decisively excludes” the possibility of a cyclical universe, reinforcing the singularity associated with the Big Bang as an unavoidable outcome.

Onkar Parrikar from the Tata Institute of Fundamental Research in India asserts, “This represents a significant generalization of Penrose’s original theorem, further extended by Wall.”

Chris Akers from the University of Colorado, Boulder points out that this marks substantial progress, as it is “far more effective in quantum physics” compared to earlier studies. He suggests that the new research will impose stricter constraints on big bounce models.

Bousso’s computations hinge upon a generalized second law of thermodynamics, which expands the conventional second law to describe how entropy behaves around black holes. This generalized law has yet to be rigorously proven, according to Surjeet Rajendran at Johns Hopkins University in Maryland.
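
In the standard formulation, the generalized second law trades ordinary entropy for a “generalized entropy” that adds the area of any horizon to the entropy of the matter outside it, and demands that the total never decrease (written here in natural units, with Boltzmann’s constant and the speed of light set to 1; A is the horizon area and G is Newton’s constant):

    S_{\mathrm{gen}} = S_{\mathrm{matter}} + \frac{A}{4 G \hbar}, \qquad \Delta S_{\mathrm{gen}} \ge 0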

In 2018, Rajendran and his team crafted a mathematical representation of the bouncing universe that circumvented constraints imposed by Bousso’s theorems. However, their model included more dimensions of space-time than have currently been observed, leaving some uncertainties unaddressed.

Akers emphasizes, “Understanding our universe’s history is undeniably one of the most crucial scientific endeavors, and alternative models like big bounces should be thoroughly evaluated.”

Jackson Fris from the University of Cambridge mentions that in bouncing scenarios, quantum effects might bolster the universe’s rebound from its dense states. Investigating these scenarios can further our understanding of how a theory of quantum gravity, which melds general relativity and quantum mechanics, may reshape our conception of the universe; quantum gravity may well be essential for a comprehensive explanation of a black hole’s interior or a big bang, he notes.

According to Rajendran, one of the most vital methods to ascertain whether our universe experienced a bounce is through gravitational wave observations. These space-time ripples could carry identifiable signatures of the bounce, but at frequencies outside the detection capabilities of existing gravitational wave observatories. Future generations of detectors may capture these frequencies, although the realization of several planned upgrades to U.S. detectors is uncertain due to proposed budget cuts from the previous administration.

“It is a matter of whether there exists a universe capable of generating a signal strong enough for detection, and if our current world permits scientists to perform those experimental constructions,” Rajendran states.

topic:

Source: www.newscientist.com

Quantum Computers Exhibit Unexpected Randomness—And That’s Beneficial!

Quantum object shuffling is more complex than classic shuffling

Andriy Onofriyenko/Getty Images

Quantum computers are capable of generating randomness far more efficiently than previously anticipated. This remarkable discovery reveals the ongoing complexities at the intersection of quantum physics and computation.

Randomness is essential for numerous computational tasks. For instance, weather simulations require multiple iterations with randomly chosen slightly varied initial conditions. In the realm of quantum computing, researchers have demonstrated quantum advantage by arranging qubits randomly to yield outcomes that classical machines struggle to achieve.

Creating these random configurations effectively entails shuffling qubits and connecting them repeatedly, akin to shuffling a deck of cards. Initially, it was believed that adding more qubits to the system would extend the time required for shuffling, analogous to how larger decks of cards are harder to shuffle. With increased shuffling potentially compromising the delicate quantum states of qubits, the prospect of significant applications relying on randomness was thought to be limited to smaller quantum systems.

Recently, Thomas Schuster from the California Institute of Technology and his team found that generating these random sequences requires fewer shuffles than previously believed.

To illustrate this, Schuster and his colleagues conceptualized dividing the qubit ensemble into smaller segments, thereby mathematically demonstrating that each segment could independently produce a random sequence. They further established that these smaller qubit segments could be “joined” to create a well-shuffled version of the original collection of qubits in a manner that defies expectations.
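
The flavor of the construction can be conveyed with a small numerical sketch, using numpy and scipy (both assumed available). This is a simplification of the actual result, which concerns how the required circuit depth scales: Haar-random two-qubit blocks are laid down brickwork-style and glued into a circuit whose output rapidly scrambles the initial state.

    import numpy as np
    from scipy.stats import unitary_group

    n_qubits = 6
    n_layers = 4
    dim = 2 ** n_qubits

    def apply_two_qubit(state, u, q):
        # Apply a 4x4 unitary u to qubits q and q+1 of an n-qubit state.
        psi = state.reshape([2] * n_qubits)
        psi = np.moveaxis(psi, [q, q + 1], [0, 1]).reshape(4, -1)
        psi = (u @ psi).reshape([2, 2] + [2] * (n_qubits - 2))
        psi = np.moveaxis(psi, [0, 1], [q, q + 1])
        return psi.reshape(dim)

    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0  # start in |000000>

    # Brickwork of independent Haar-random two-qubit blocks:
    # even bonds on even layers, odd bonds on odd layers.
    for layer in range(n_layers):
        for q in range(layer % 2, n_qubits - 1, 2):
            state = apply_two_qubit(state, unitary_group.rvs(4), q)

    probs = np.abs(state) ** 2
    # Effective number of occupied basis states; this approaches
    # dim / 2, the value typical of a Haar-random state, as layers
    # are added.
    print(1 / np.sum(probs ** 2))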

“It’s quite astonishing because it indicates that classical random number generators don’t exhibit anything comparable,” states Schuster. For instance, in the case of card shuffling within a block, the top cards tend to remain near the top. This is not applicable in quantum systems, where quantum shuffles generate a random superposition of all possible arrangements.

“This is a significantly more intricate phenomenon than classical shuffling. The order of the top card is not preserved: measuring the top card’s position after the shuffle yields a random output each time, revealing nothing about the shuffling process itself. It’s genuinely a new and fundamentally quantum phenomenon.”

“We anticipated that this sort of random quantum behavior would be exceptionally challenging to achieve. Yet, the authors demonstrated that it can be realized with remarkable efficiency,” remarks Pieter Claeys from the Max Planck Institute for the Physics of Complex Systems in Germany. “This discovery was quite unexpected.”

“Random quantum circuits hold numerous applications as elements of quantum algorithms and for showcasing what is termed quantum advantage,” notes Ashley Montanaro from the University of Bristol, UK. “The authors have already identified various applications in quantum information, and I hope that additional applications will emerge.” While the result could make it easier to run experiments demonstrating the kind of quantum advantage researchers have previously achieved, Montanaro cautions that this does not imply we are closer to reaping the practical benefits of such advantages.

Topics:

Source: www.newscientist.com

Quantum Superposition Challenges Us to Confront Profound Realities

Students often wear a baffled expression when first introduced to quantum superposition, notes Marcelo Gleiser. Having taught quantum mechanics for several decades, he sees the same surprise each time students grapple with the complexities of atomic and particle behavior.

This article is part of our special concept series, exploring how experts perceive some of the most astonishing ideas in science. Click here for additional details.

The concept is anything but clear, however. Since its inception, the true implications of superposition have been debated. What is universally acknowledged is that the concept challenges our understanding of what constitutes “reality.”

A foundational aspect to grasp is the Schrödinger equation. Formulated by Erwin Schrödinger in the 1920s, it serves as a cornerstone of quantum theory, outlining the probabilities of finding particles in specific states upon measurement. Notably, quantum mechanics focuses on predicting potential outcomes rather than clarifying the exact activities of particles pre-measurement.

The Schrödinger equation articulates all conceivable positions a particle may occupy before measurement, utilizing mathematical constructs known as wave functions. This establishes one mathematical interpretation of superposition, defined as the combination of various potential quantum states.
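
In the simplest notation, a superposition of two possible states A and B is written as a weighted sum, with the weights fixing the measurement probabilities:

    |\psi\rangle = \alpha\,|A\rangle + \beta\,|B\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

Measuring the particle yields state A with probability |α|² and state B with probability |β|².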

It is well-established that particles can indeed exist in superposition. For instance, in a double-slit experiment, a solitary photon (a light particle) is directed toward a barrier with two narrow openings. When a detector is active, the photon seems to “choose” one slit and strikes a specific point on the screen. In contrast, without the detector, an “interference pattern” is observed, indicating that the particles act like waves, traversing through both slits simultaneously and interacting with themselves.

However, the true significance of being “in a superposition” remains elusive. Generally, two perspectives exist. Some view wave functions merely as mathematical constructs rather than reflections of reality; this is the stance of Gleiser, who is based at Dartmouth College, New Hampshire. Insisting that wave functions must constitute a part of physical reality, he argues, equates mathematical constructs with truth in a way that has become almost cult-like.

Gleiser endorses an interpretation known as quantum Bayesianism (or QBism), which posits that the theory addresses our understanding rather than reality itself. Consequently, during quantum state measurements, what shifts is merely our information about reality, not reality itself.

Conversely, some scholars, like Simon Saunders, a philosopher from Oxford University, argue against this view, asserting that wave functions represent an authentic state of existence. He suggests that particles in superposition physically occupy multiple locations simultaneously. “It’s an extended object,” he clarifies. “It’s delocalized.” Within this framework, our experience of particle reality may deviate from actual reality. For example, electrons orbiting atoms appear as a cloud of probability until measured.

Critics of this interpretation often question the fate of alternate possibilities once measurement constrains a particle to a single location. Saunders concedes to the radical notion that this may suggest the existence of a branching infinite multiverse.

Ultimately, a resolution to this question isn’t imminent. Meanwhile, researchers have successfully extended superposition beyond individual particles to larger molecules and even 16-microgram crystals. This suggests that reality is much stranger than it appears.

Explore more articles in this series by using the links below:

Topics:

  • Quantum Mechanics
  • Amazing Concepts

Source: www.newscientist.com

Unveiling the Quantum Computers That Can Make a Difference

Zhang Bin/China News Service/VCG Getty Images

In the last decade, quantum computing has evolved into a multi-billion dollar sector, attracting investments from major tech firms like IBM and Google, along with the U.S. military.

However, Ignacio Cirac, a trailblazer in this field from Germany’s Max Planck Institute for Quantum Optics, provides a more measured assessment: “Quantum computers are not yet a reality,” he states, because creating a functional and practical version is exceedingly challenging.

This article is part of our special feature that delves into how experts perceive some of science’s most intriguing concepts. Click here for more information.

These quantum systems utilize qubits to encode data, in contrast to the traditional “bits” of conventional computers. Qubits can be generated through various methods, ranging from small superconducting circuits to ultra-cold atoms, yet each method presents its own complexities in construction.

The primary advantage lies in their ability to leverage quantum attributes for performing certain calculations at a speed unattainable by classical computers.

This acceleration holds promise for various challenges that traditional computers face, such as simulating complex physical systems and optimizing passenger flight schedules or grocery deliveries. Five years ago, quantum computers appeared poised to tackle these and numerous other computational hurdles.

Today, the situation is even more intricate. Certainly, the progress in creating larger quantum computers is remarkable, with numerous companies developing systems exceeding 1000 qubits. However, this progress also highlights the formidable challenges that remain.

A significant issue is that as these computers scale up, they tend to generate increased errors, and developing methods to mitigate or correct them has proven more challenging than anticipated. Last year, Google researchers made notable strides in addressing this problem, but as Cirac emphasizes, a fully functional useful quantum computer remains elusive.

Consequently, the list of viable applications for such machines may be shorter than many previously anticipated. Weighing the costs of construction against the potential savings reveals that, in many scenarios, the economics may not favor them. “The most significant misconception is that quantum computers can expedite all types of problems,” Cirac explains.

So, which issues might still benefit from quantum computing? Quantum computers could potentially compromise the encryption systems currently employed for secure communications, making them appealing to governments and institutions concerned with security, notes Scott Aaronson at the University of Texas at Austin.

Another promising area for quantum computers is in modeling materials and chemical reactions. Because quantum computers operate within a framework of quantum objects, they are ideally suited for simulating other quantum systems, such as electrons, atoms, and molecules.

“These are simplified models that don’t perfectly reflect real materials. But if you design your system appropriately, there are numerous properties of real materials you can learn about,” adds Daniel Gottesman from the University of Maryland.

While quantum chemical simulations might seem more specialized than flight scheduling, the potential outcomes (such as discovering room-temperature superconductors) could be groundbreaking.

The extent to which these ambitions can be realized heavily relies on the algorithms guiding quantum computations and methods for correcting those pesky errors. This is a complex new domain, as Vedran Dunjko of Leiden University in the Netherlands points out, prompting researchers like himself to confront fundamental questions about information and computation.

“This creates a significant incentive to investigate the complexity of the problem and the potential of computing devices,” Dunjko asserts. “For me, this alone justifies dedicating a substantial portion of my life to these inquiries.”

Explore more articles in this series by using the links below:

topics:

Source: www.newscientist.com

Helgoland: Exploring the Past and Future of Quantum Physics on a Tiny Island

Helgoland Island occupies a nearly mythical position in quantum mechanics history

Shutterstock/Markus Stappen

Having attended numerous scientific conferences, the recent one on Helgoland Island, marking a century of quantum mechanics, stands out as one of the most peculiar, in a positive sense.

This tiny German island in the North Sea, stretching less than a kilometer, exudes the ambiance of a coastal resort with a somewhat faded charm, its quaint streets filled with souvenir shops, fish eateries, and ice cream stalls. Picture cutting-edge experimenters in quantum technologies casually mingling after discussions at the town hall beside a miniature golf course; it’s quite an experience.

Our purpose here becomes evident as we stroll along the cliffside road, where a bronze plaque commemorates physicist Werner Heisenberg’s purported invention of quantum mechanics in 1925. While it sounds intriguing, it’s an embellishment; Heisenberg merely outlined some concepts here. The more recognized formulation came from Erwin Schrödinger in early 1926, who introduced wave functions to predict quantum system evolutions.

Nonetheless, this year clearly holds significance as we commemorate a century of quantum mechanics. Regardless of how much of Helgoland’s narrative stems from Heisenberg’s own embellishments (he recounted his breakthrough there only years later), this remote island serves as a unique venue for celebratory gatherings.

And what a celebration it is! It’s almost surreal to witness such a congregation of renowned quantum physicists. Among them are four Nobel laureates: Alain Aspect, David Wineland, Anton Zeilinger, and Serge Haroche. Collectively, they’ve validated the bizarre aspects of quantum mechanics, showcasing how the characteristics of one particle can instantaneously influence another, no matter the distance. They’ve also developed techniques to manipulate individual quantum particles, crucial for quantum computing.

In my view, these distinguished individuals would concur that the younger generation is poised to delve deeper into the implications of quantum mechanics, transforming its notoriously counterintuitive essence into new technologies and a better understanding of nature. Quantum mechanics is renowned for encompassing multiple interpretations of its mathematical framework concerning reality, with many seasoned experts firmly entrenched in their perspectives.

Helgoland’s plaque honors Werner Heisenberg’s role in quantum mechanics

Philip Ball

This divisive sentiment was noticeable during an evening panel discussion featuring Zeilinger, Aspect, and Gilles Brassard, who pioneered quantum cryptography at the University of Montreal.

In fairness to the veterans, their work emerged under considerable skepticism from their peers, particularly regarding the significance of examining such foundational concerns. They navigated the era of “shut up and calculate,” the phrase American physicist David Mermin used to describe how it was frowned upon to ponder the implications of quantum mechanics beyond merely solving the Schrödinger equation. It’s no wonder they developed thick skins.

In contrast, younger researchers seem more pragmatic in their approach to quantum theories, often adopting various interpretations as tools to address specific challenges. Elements of the Copenhagen interpretation and the multiverse theory are intertwined, not as definitive claims about reality, but as frameworks for analysis.

The new wave of researchers, such as Vedika Khemani from Stanford University, are actively bridging condensed matter physics and quantum information. I heard her highlight the evolution from storing information on magnetic tapes in the 1950s to the crucial error correction techniques in today’s quantum computing.

Quantum mechanics applications are on the rise, and theorists are also stepping up their game. For instance, Flaminia Giacomini at ETH Zurich spoke about her pursuit of reconciling the granular quantum realm with the smooth, continuous space-time that a theory of quantum gravity must ultimately describe, offering profound insights into the essence of quantum mechanics.

While some may consider this exploration to be veering into the realm of speculation, as seen in string theory attempts, Giacomini asserted, “There is no experimental evidence that gravity should be quantized.” Hence, empirical validation remains elusive, despite a wealth of theoretical discourse.

Excitingly, there are plans to test hypotheses in the not-so-distant future. For instance, examining whether two objects can entangle purely through gravitational interactions is a goal. The difficulty is ensuring the objects are substantial enough to exert meaningful gravitational pull while being sufficiently small to demonstrate quantum characteristics. Several speakers expressed confidence in overcoming this hurdle within the next decade.

The conference revealed the interconnectedness of quantum theories and experiments: perturbing one aspect inevitably influences others. Gaining a nuanced understanding of quantum gravity through delicate experiments involving trapped particles could shed light on black hole information paradoxes and inspire innovative ideas for quantum computing and the nature of quantum states.

Ultimately, progress in any of these areas promises to illuminate the enduring question that fascinated Heisenberg and his contemporaries: what occurs when we measure quantum particles? Rather than a repetitive struggle, it’s clear that quantum mechanics is far more sophisticated and intriguing than its founders ever envisaged.

Topics:

Source: www.newscientist.com

Why John Stewart Bell Has Challenged Quantum Mechanics for Decades

John Stewart Bell developed a method to measure the unique correlations permitted in the quantum world

CERN

Some people perceive a poltergeist in the attic, others spot a ghost on dark nights; the specter that haunts me is the enigmatic figure of John Stewart Bell, whose groundbreaking work and enduring legacy have intrigued me for years.

Consider this: how much of our reality can we claim to experience objectively? I ponder this frequently, especially when discussing the intricate nature of space, time, and quantum mechanics. Bell was deeply reflective about such matters, and his contributions have forever altered our comprehension of these concepts.

Born in Belfast in 1928, Bell was, by all accounts, a curious and cheerful child. He gravitated towards physics early and undertook his first role as a lab engineer at just 16. With training in both theoretical and experimental physics, he built a significant part of his career around particle accelerators. Yet, it was the inconsistencies he perceived within quantum theory that occupied his thoughts during late nights.

Today, this area has become a well-established branch of physics, featured prominently in New Scientist. But physics has not always welcomed those who probe its edges, where it meets mathematics and philosophy. In Bell’s time, scientists were still grappling with the legacies of quantum theory’s pioneers, including the heated debates between Niels Bohr and Albert Einstein.

My interest in Bell’s work began as a casual pursuit, though I have devoted many hours to it since. In 1963, he took a sabbatical with his physicist wife, using the time to craft a pair of original papers. Though initially published without much attention, their significance could not be overstated.

Bell transformed philosophical inquiries into testable experiments, particularly concentrating on the notion of “hidden variables” in quantum mechanics.

Quantum mechanics inherently resists certainty and determinism, as elucidated by Bohr and his contemporaries in the early 20th century. Notably, definitive statements about quantum entities remain elusive until we engage with them. Predictive ability exists only in probabilistic terms—an electron, for instance, might have a 98% likelihood of exhibiting one energy level while being 2% likely to reveal another, but the actual outcome is intrinsically random.
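
The electron example can be written out explicitly. In the standard formalism, the state is a weighted superposition of the two energy levels, and the probabilities are the squared magnitudes of the weights (the Born rule):

    |\psi\rangle = \sqrt{0.98}\,|E_1\rangle + \sqrt{0.02}\,|E_2\rangle

    P(E_1) = \left|\sqrt{0.98}\right|^2 = 0.98, \qquad P(E_2) = 0.02

Which level is actually observed in any single measurement remains, as far as the theory says, genuinely random.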

How does nature make these seemingly random decisions? One theory proposes that certain properties remain hidden from observers. If physicists could identify these hidden variables, they could inject absolute predictability into quantum theory.

Bell crafted a test aimed at ruling out a broad class of hidden variable theories that would either modify or complete quantum theory. This test typically involves two experimenters, Alice and Bob. A pair of entangled particles is produced repeatedly, with one particle sent to Alice and the corresponding one dispatched to Bob in a separate laboratory. Upon receipt, Alice and Bob each independently measure specific properties; for instance, Alice might analyze a particle’s spin.

Simultaneously, Bob conducts his measurements without any communication between the two experimenters. Once all data is collected, it is filtered into equations derived by Bell in 1964. This “inequality” framework evaluates the correlations between Alice and Bob’s observations. Even in scenarios devoid of quantum interactions, some correlations may occur by mere chance. However, Bell established a threshold of correlation indicating that something beyond randomness is happening. The particles demonstrate correlations unique to quantum physics, negating the presence of local hidden variables.
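
To see the threshold in action, here is a small Monte Carlo sketch of a local hidden variable model. It is an illustrative toy of our own devising, not one from Bell’s papers: each particle pair carries a hidden angle, and each outcome depends only on that angle and the local setting. However the settings are chosen, the CHSH combination of correlations never exceeds the local bound of 2.

    import numpy as np

    rng = np.random.default_rng(7)

    # Toy local hidden variable model: each pair carries a hidden angle
    # lam, and each measurement outcome (+1 or -1) is fixed
    # deterministically by lam and the local setting alone.
    def correlation(a, b, n=200_000):
        lam = rng.uniform(0, 2 * np.pi, n)
        A = np.where(np.cos(a - lam) >= 0, 1, -1)
        B = np.where(np.cos(b - lam) >= 0, 1, -1)
        return np.mean(A * B)

    a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    S = (correlation(a1, b1) - correlation(a1, b2)
         + correlation(a2, b1) + correlation(a2, b2))
    print(abs(S))  # stays at or below 2, the local bound Bell derived

Quantum mechanics predicts values up to 2√2 ≈ 2.83 for entangled pairs, and that excess over 2 is exactly what experiments have repeatedly observed.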

Thus, Bell’s test does more than affirm quantum theory as a superior explanation of our reality; it also underscores the peculiar nature of “non-locality,” revealing strange traits of our existence. This implies that quantum objects can maintain connections, with their behaviors remaining profoundly intertwined despite vast separations. Einstein critiqued this notion vigorously, as it contradicts the principles of his special theory of relativity by insinuating a form of instantaneous communication between entities.

Bell, initially a disciple of Einstein’s theories, found himself ultimately proving his idol wrong. His tests compellingly indicated that our reality is indeed quantum. This debate continues to engage researchers, particularly regarding the persistent discrepancies between quantum theory and our best understanding of gravity, framed by Einstein himself.

Bell’s experimental designs saw little realization during his lifetime, given the technical challenges they presented. The first experiment of this kind was conducted in 1972, and it wasn’t until 2015 that a test with minimal loopholes conclusively refuted the local hidden variable theories. In 2022, physicists Alain Aspect, John F. Clauser, and Anton Zeilinger received the Nobel Prize in Physics for their extensive work on these experiments.

So why does John Stewart Bell’s legacy resonate so strongly with me? Am I ensnared in some quantum malaise?

The answer lies in the fact that his work and the myriad experiments testing it have spawned as many questions about quantum physics and physical reality as they aim to resolve. For instance, numerous physicists concur that our universe is fundamentally non-local, yet they strive to uncover the underlying physical mechanisms at play. Others are busy formulating new hidden variable theories that evade the constraints set by Bell’s tests. Additionally, researchers are scrupulously reevaluating the mathematical assumptions Bell made in his original work, believing that fresh perspectives on Bell’s findings may be critical for advancing interpretations of quantum theory and developing cohesive theories.

The repercussions of Bell’s findings permeate the realm of quantum physics. We have been performing Bell tests for nearly five decades, with ever-better sources of entangled particles. But this is just the beginning. Recently, I collaborated with physicists designing a method to leverage Bell’s work in exploring whether free will might be partially constrained by cosmic factors. Afterwards, I received a call from another cohort of researchers keen to discuss Bell again, this time in relation to gravity and the foundational nature of space and time. They drew inspiration from his methodologies and sought to create a test that would examine genuinely gravitational properties rather than quantum ones.

It’s no wonder I feel inextricably linked to Bell. His capacity to convert philosophical inquiries into tangible tests encapsulates the essence of physics: to unravel the world’s most baffling mysteries through experimental means. Bell’s test vividly embodies that promise.

If I must ponder a haunting presence, I couldn’t ask for a more remarkable specter.

Topic:

Source: www.newscientist.com

IBM Plans to Develop a Functional Quantum Supercomputer by 2029

Rendering of IBM’s proposed quantum supercomputer

IBM

In less than five years, you’ll have access to an error-corrected quantum supercomputer, according to IBM. The company has unveiled a roadmap for a machine named Starling, set to be available to academic and industrial researchers by 2029.

“These are scientific dreams that have been transformed into engineering achievements,” says Jay Gambetta at IBM. He mentions that he and his team have developed all the required components to make Starling a reality, giving them confidence in their ambitious timeline. The new systems will be based in a New York data center and are expected to aid in manufacturing novel chemicals and materials.

IBM has already constructed a fleet of quantum computers, yet for IBM and its competitors alike, the path to truly useful devices remains challenging. Errors continue to thwart many efforts to utilize quantum effects for solving problems that typical supercomputers struggle with.

This underscores the necessity for a fault-tolerant quantum computer that can autonomously correct its mistakes. Such capabilities would enable larger, more powerful devices. There is no universal agreement on the optimal strategy for achieving this, prompting research teams to explore various approaches.

All quantum computers depend on qubits, yet different groups create these essential units from light particles, extremely cold atoms, and, in Starling’s case, superconducting circuits. IBM is banking on two innovations to make its machine robust against significant errors.

First, Starling establishes new connections among its qubits, including those that are quite distant from one another. Each qubit is embedded within a chip, and researchers have innovated new hardware to link these components within a single chip and connect multiple chips together. This advancement enables Starling to be larger than its forerunners while allowing it to execute more complex programs.

According to Gambetta, Starling will employ tens of thousands of physical qubits, grouped into roughly 200 “logical qubits,” and will be capable of 100 million quantum operations. Within each logical qubit, several physical qubits function together as a single computational unit resilient to errors. By contrast, the largest current quantum computers house around 1,000 physical qubits, and the record for logical qubits belongs to the quantum computing company Quantinuum, with a count of 50.
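
The idea of a logical qubit can be conveyed with the simplest possible code, a three-fold repetition code with majority voting. This is a deliberately crude sketch; IBM’s LDPC codes are far more sophisticated, but the principle of trading several noisy physical units for one sturdier logical unit is the same:

    import random

    def encode(bit):
        # One logical bit stored redundantly in three physical bits.
        return [bit, bit, bit]

    def add_noise(codeword, p=0.05):
        # Flip each physical bit independently with probability p.
        return [b ^ (random.random() < p) for b in codeword]

    def decode(codeword):
        # Majority vote recovers the logical bit unless 2+ bits flipped.
        return 1 if sum(codeword) >= 2 else 0

    trials = 100_000
    failures = sum(decode(add_noise(encode(0))) != 0 for _ in range(trials))
    print(failures / trials)  # ~0.007, versus the physical error rate 0.05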

IBM is implementing a novel method for merging physical qubits into logical qubits via low-density parity-check (LDPC) codes. This marks a significant shift from the methods employed in other superconducting quantum computers. Gambetta notes that utilizing LDPC codes was once seen as a “pipe dream,” but his team has now worked out the crucial details to make it feasible.

The benefit of this somewhat unconventional technique is that each logical qubit created with an LDPC approach requires fewer physical qubits than competing strategies. Consequently, devices can be smaller, and faster error correction becomes achievable.

“IBM has consistently set ambitious goals and accomplished significant milestones over the years,” states Stephen Bartlett from the University of Sydney. “They have achieved notable innovations and improvements in the last five years, and this represents a genuine breakthrough.” He points out that both the distant qubits and the new hardware for connecting the logical qubit codes deviate from the well-performing devices IBM previously developed, necessitating extensive testing. “It looks promising, but it also requires a leap of faith,” Bartlett adds.

Matthew Otten from the University of Wisconsin-Madison mentions that LDPC codes have only been seriously explored in recent years, and IBM’s roadmap clarifies how it functions. He emphasizes its importance as it helps researchers pinpoint potential bottlenecks and trade-offs. For example, he notes that Starling may operate slower than current superconducting quantum computers.

At its intended scale, the device could address challenges relevant to sectors such as pharmaceuticals. Here, simulations of small molecules or proteins on quantum computers like Starling could replace costly and cumbersome experimental steps in drug development, Otten explains.

IBM isn’t the only contender in the quantum computing sector planning significant advancements. For instance, Quantinuum and PsiQuantum have also announced their intentions to develop fault-tolerant utility-scale machines by 2029 and 2027, respectively.

Topics:

Source: www.newscientist.com

Physicists Investigate Light’s Interaction with Quantum Vacuums

Researchers have successfully conducted the first real-time 3D simulations demonstrating how powerful laser beams alter the quantum vacuum. Remarkably, these simulations reproduce an unusual phenomenon anticipated by quantum physics, known as vacuum four-wave mixing. This principle suggests that the combined electromagnetic fields of three laser pulses can polarize virtual electron-positron pairs within the vacuum, resulting in photons bouncing off one another as if they were billiard balls.

Illustration of photon-photon scattering in a laboratory: two green petawatt laser beams collide at focus with a third red beam to polarize the quantum vacuum, generating a fourth blue laser beam in a unique direction and color while conserving momentum and energy. Image credit: Zixin (Lily) Zhang.
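
The “unique direction and color” of the fourth beam follows from the usual four-wave-mixing bookkeeping: the generated photon’s frequency and wave vector are fixed by conservation of energy and momentum among the three input beams:

    \omega_4 = \omega_1 + \omega_2 - \omega_3, \qquad \mathbf{k}_4 = \mathbf{k}_1 + \mathbf{k}_2 - \mathbf{k}_3

This is why the signal photons emerge in a direction and at a wavelength where no input light exists, which is what makes them detectable at all.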

“This is not merely a matter of academic interest. It represents a significant advance toward experimental validation of quantum effects that have largely remained theoretical,” remarks Professor Peter Norreys from Oxford University.

The simulation was executed using an enhanced version of Osiris, a simulation software that models interactions between laser beams and various materials or plasmas.

The simulation work was carried out by doctoral students at the University of Oxford, among them Zixin (Lily) Zhang.

“By applying the model to a three-beam scattering experiment, we were able to capture a comprehensive spectrum of quantum signatures, along with detailed insights into the interaction region and the principal time scale.”

“We’ve rigorously benchmarked the simulation, enabling our focus to shift to more intricate, exploratory scenarios, like exotic laser beam structures and dynamic focus pulses.”

Crucially, these models furnish the specifics that experimentalists depend on to design accurate real-world tests, encompassing realistic laser configurations and pulse timing.

The simulations also uncover new insights into how these interactions develop in real-time and how subtle asymmetries in beam geometry can influence the outcomes.

According to the team, the tool not only aids in planning future high-energy laser experiments but also assists in the search for evidence of hypothetical particles, such as axions and millicharged particles, which are potential dark matter candidates.

“The broader planned experiments at state-of-the-art laser facilities will greatly benefit from the new computational methods implemented in Osiris,” noted Professor Luís Silva, a physicist at the Instituto Superior Técnico in Lisbon and the University of Oxford.

“The integration of ultra-intense lasers, advanced detection techniques, cutting-edge analysis, and numerical modeling lays the groundwork for a new era of laser-material interactions, opening new avenues for fundamental physics.”

The team’s paper was published today in the journal Communications Physics.

____

Z. Zhang et al. 2025. Computational modeling of semi-real-world quantum vacuums in 3D. Commun Phys 8, 224; doi:10.1038/s42005-025-02128-8

Source: www.sci.news