Over the past decade, quantum computing has grown into a multibillion-dollar industry, attracting investment from major tech firms such as IBM and Google, as well as the U.S. military.
However, Ignacio Cirac, a trailblazer in this field from Germany’s Max Planck Institute for Quantum Optics, provides a more measured assessment: “Quantum computers are not yet a reality,” he states, because creating a functional and practical version is exceedingly challenging.
These quantum systems encode data in qubits rather than the “bits” of conventional computers. Qubits can be made in various ways, from tiny superconducting circuits to ultracold atoms, but each approach brings its own difficulties of construction.
Their key advantage is that quantum properties allow them to perform certain calculations at speeds unattainable by classical computers.
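To see the difference in miniature, here is a rough sketch in plain Python with NumPy. It is not how real quantum hardware works, only a classical simulation of a single idealized qubit being put into superposition and then measured:

```python
import numpy as np

# A classical bit is definitely 0 or 1; a qubit's state is a pair of
# complex amplitudes, one for |0> and one for |1>.
qubit = np.array([1.0, 0.0], dtype=complex)  # starts out as |0>

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
hadamard = np.array([[1, 1],
                     [1, -1]], dtype=complex) / np.sqrt(2)
qubit = hadamard @ qubit

# On measurement, each outcome occurs with probability |amplitude|^2.
probs = np.abs(qubit) ** 2
print(probs)                              # [0.5 0.5]
print(np.random.choice([0, 1], p=probs))  # the qubit collapses to 0 or 1
```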
This acceleration holds promise for various challenges that traditional computers face, such as simulating complex physical systems and optimizing passenger flight schedules or grocery deliveries. Five years ago, quantum computers appeared poised to tackle these and numerous other computational hurdles.
Today, the picture is more complicated. The progress in building larger quantum computers is certainly remarkable, with several companies now running systems of more than 1000 qubits. But that progress has also thrown the remaining challenges into sharp relief.
A significant issue is that as these computers scale up, they accumulate more errors, and methods for mitigating or correcting those errors have proved harder to develop than expected. Last year, Google researchers made notable strides on this problem, but as Cirac stresses, a fully functional, useful quantum computer remains elusive.
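The basic idea behind error correction, classical or quantum, is redundancy: spread one logical bit across several physical ones so that occasional flips can be outvoted. Quantum codes are far subtler, because qubits cannot simply be copied, but a classical repetition code, sketched below in plain Python, gives the flavor of why redundancy helps:

```python
import random

def noisy_channel(bits, p):
    """Flip each physical bit independently with probability p."""
    return [bit ^ (random.random() < p) for bit in bits]

def send_logical_bit(value, p, copies=3):
    """Encode one logical bit in `copies` physical bits, then decode by majority vote."""
    received = noisy_channel([value] * copies, p)
    return int(sum(received) > copies // 2)

# With a 10% chance of each physical bit flipping, the logical bit is wrong
# only when 2 or more of the 3 copies flip: 3(0.1^2)(0.9) + 0.1^3 = 2.8%.
p, trials = 0.1, 100_000
failures = sum(send_logical_bit(0, p) != 0 for _ in range(trials))
print(failures / trials)  # ~0.028, well below the 10% physical error rate
```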
Consequently, the list of viable applications for such machines may be shorter than many previously anticipated. Weighing the costs of construction against the potential savings reveals that, in many scenarios, the economics may not favor them. “The most significant misconception is that quantum computers can expedite all types of problems,” Cirac explains.
So, which problems might still benefit from quantum computing? Quantum computers could compromise the encryption systems currently used to secure communications, which makes them appealing to governments and other security-conscious institutions, says Scott Aaronson at the University of Texas at Austin.
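The threat comes from Shor’s algorithm, which, run on a sufficiently large quantum computer, could factor the big numbers underpinning public-key schemes such as RSA. A toy example with deliberately tiny numbers (real keys use moduli hundreds of digits long) shows why finding the factors gives the game away:

```python
# Toy RSA with tiny numbers -- for illustration only, not real cryptography.
p, q = 61, 53                 # the secret primes
n = p * q                     # public modulus: 3233
e = 17                        # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, derived from p and q

message = 1234
ciphertext = pow(message, e, n)  # anyone can encrypt with the public key (n, e)
print(pow(ciphertext, d, n))     # only the key holder can decrypt: 1234

# An attacker who can factor n recovers p and q -- and with them, d.
# Brute force works at this scale but is hopeless for real key sizes;
# Shor's algorithm would make it tractable on a quantum computer.
for candidate in range(2, n):
    if n % candidate == 0:
        p_found, q_found = candidate, n // candidate
        break
d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
print(pow(ciphertext, d_cracked, n))  # the attacker also decrypts: 1234
```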
Another promising area for quantum computers is modeling materials and chemical reactions. Because quantum computers are themselves built from quantum objects, they are naturally suited to simulating other quantum systems, such as electrons, atoms and molecules.
“These are simplified models that don’t accurately reflect real materials. But if you design your system appropriately, there are numerous properties of real materials you can learn about,” says Daniel Gottesman at the University of Maryland.
While quantum chemical simulations might seem more specialized than flight scheduling, the potential outcomes (such as discovering room-temperature superconductors) could be groundbreaking.
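Part of the reason classical machines struggle here is bookkeeping: an exact description of n interacting quantum objects takes 2^n complex numbers, so memory demands double with every particle added. A quick back-of-the-envelope calculation in Python makes the blow-up concrete:

```python
# Describing n two-level quantum objects exactly takes 2**n complex amplitudes,
# which is why classical simulation of quantum systems hits a wall so quickly.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9  # 16 bytes per double-precision complex number
    print(f"{n} particles: {amplitudes:.2e} amplitudes, ~{gigabytes:.2e} GB")

# 10 particles: ~16 KB of memory -- trivial.
# 30 particles: ~17 GB -- a well-equipped workstation.
# 50 particles: ~18 million GB -- beyond any classical computer ever built.
```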
How far these ambitions can be realized depends heavily on the algorithms that guide quantum computations and on methods for correcting those pesky errors. This is a complex new domain, says Vedran Dunjko at Leiden University in the Netherlands, one that forces researchers like him to confront fundamental questions about information and computation.
“This creates a significant incentive to investigate the complexity of the problem and the potential of computing devices,” Dunjko asserts. “For me, this alone justifies dedicating a substantial portion of my life to these inquiries.”