Quantum Computers Require Classical Computing for Real-World Applications

Quantum Machines' Jonathan Cohen presenting at the AQC25 conference

Quantum Machines

Classical computers are emerging as a critical component in getting the most out of quantum computers. That was a key takeaway from a gathering of researchers this month, who emphasized that classical systems are vital for controlling quantum computers, interpreting their outputs, and improving future quantum computing methods.

Quantum computers operate on qubits: quantum objects typically realized as ultracold atoms or tiny superconducting circuits. A quantum computer's computational power generally grows with the number of qubits it contains.

Yet qubits are fragile and require meticulous tuning, monitoring, and control. When those conditions are not met, computations can produce errors, making the devices less useful. To manage qubits effectively, researchers rely on classical computing. These challenges were the focus of the AQC25 conference, held on 14 November in Boston, Massachusetts.

Sponsored by Quantum Machines, a company that makes controllers for various qubit types, the AQC25 conference gathered more than 150 experts, including quantum computing researchers and CEOs of AI startups. Across numerous presentations, speakers detailed the enabling technologies vital to the future of quantum computing, and how classical computing can sometimes act as a constraint.

According to Nvidia's Shane Caldwell, fault-tolerant quantum computers capable of tackling practical problems are only expected to materialize with a robust classical computing framework operating at petascale, comparable to today's leading supercomputers. Although Nvidia does not produce quantum hardware, it recently introduced a system that links quantum processors (QPUs) to the conventional GPUs commonly used in machine learning and high-performance scientific computing.

Even when everything runs smoothly, the raw output of a quantum computer is a record of the quantum states of its qubits. To make use of that data, it must be translated into conventional digital form, a step that again relies on classical computing resources.

Pooya Ronagh of the Vancouver-based startup 1QBit discussed this translation process and its implications, noting that the speed of fault-tolerant quantum computers can often hinge on the efficiency of classical components such as controllers and decoders. Whether a sophisticated quantum machine takes hours or days to solve a problem may therefore depend largely on its classical parts.

In another presentation, Benjamin Lienhardt of the Walther Meissner Institute for Low Temperature Research in Germany presented findings on how conventional machine-learning algorithms can help interpret the quantum states of superconducting qubits. Similarly, Mark Saffman of the University of Wisconsin-Madison highlighted the use of classical neural networks to improve the readout of qubits made from ultracold atoms. Across these talks, researchers agreed that non-quantum devices are instrumental in unlocking the potential of various qubit types.

IBM’s Blake Johnson shared insights into a classical decoder his team is developing as part of an ambitious plan to build a quantum supercomputer by 2029. That effort will employ unconventional error-correction strategies, making efficient decoding a significant hurdle.

“As we progress, the trend will shift increasingly towards classical [computing]. The closer one approaches the QPU, the more you can optimize your system’s overall performance,” stated Jonathan Cohen from Quantum Machines.

Classical computing is also instrumental in assessing the design and functionality of future quantum systems. For instance, Izhar Medalcy, co-founder of the startup Quantum Elements, discussed how an AI-powered virtual model of a quantum computer, often referred to as a “digital twin,” can inform actual hardware design decisions.

Representatives from the Quantum Scaling Alliance, co-led by 2025 Nobel laureate John Martinis, were also present at the conference. The alliance brings together qubit developers, traditional computing giants such as Hewlett Packard Enterprise, and computational materials specialists such as the software company Synopsys, reflecting how closely the quantum and classical computing worlds must collaborate.

The collective sentiment at the conference was unmistakable: the future of quantum computing is on the horizon, and it will be built in large part by experts who made their names in classical computing.


Source: www.newscientist.com

Am I an Endangered Composer? Exploring Classical Music’s Future in the Age of AI

This hacker mansion blends elements of a startup hub, a luxurious retreat, and a high-tech boutique. Scattered throughout Silicon Valley, these spaces serve as residences for tech founders and visionaries. The most opulent I’ve encountered is in Hillsborough, one of the Bay Area’s affluent neighborhoods just south of San Francisco. Inside, polished marble floors shine beneath portraits of tech royalty affixed with tape. The garden boasts gravel meticulously raked into Zen spirals, and a pond glistens behind well-maintained hedges.

On a sunny June afternoon, I accompanied producer Faye Lomas to record an interview for a BBC Radio 3 documentary about the intersection of generative AI and classical music in San Francisco and Silicon Valley.

We were cheerfully informed that professional creators, including us, would soon be relegated to hobbyists. This wasn’t meant as provocation or sarcasm, just a statement of fact. At that moment, captured in the documentary, Faye interjected, her voice tinged with agitation: “Does this mean AI is going to take my job?” It was a natural reaction, but it shifted the room’s energy.

When I embarked on making this documentary, I harbored the same curiosity as everyone else. “The cat is out of the bag,” I joked, believing this to be a wise observation. Technology has arrived, and facing it is better than ignoring it.

Silicon Valley composer Tariq O’Regan and BBC producer Faye Lomas. Photo: Joel Cabrita

When I recently spoke with Faye, she recounted the moment vividly. “We swiftly moved from talking about AI’s potential to aid the creative fields to casually mentioning how AI could easily replace every job in the company. The tone was friendly and encouraging, almost as if I should be excited,” she reflected.

This interaction feels pivotal to the narrative. Those small, human moments of awkwardness occur when discussions shift from the theoretical to the tangible.

They contemplated replacing us.

That was back in June. With October now upon us and Oasis on tour in the UK and US, I’ve been reflecting on a different kind of mansion. The band’s concert at Knebworth House in 1996 drew 250,000 attendees over two nights, where people waved lighters instead of phones—one of the last great communal singalongs before everything transformed. Before Napster and MP3s, before cell phones, and before our culture underwent invisible algorithmic reorganization.

Composer Ed Newton-Rex plays keyboard and piano while wearing a virtual reality headset at his home in Palo Alto, California. Photo: Marissa Leshnoff/The Guardian

What followed was a subtle yet profound transition from ownership to access. Playlists replaced albums, curated by algorithms rather than musicians, designed to blend seamlessly with our activities. Initially, I believed this was the future of music. Maybe it truly was.

So, long after finishing the documentary, one announcement gave me pause: RBO/Shift, an exciting initiative from the Royal Ballet and Opera exploring how art interacts with AI. It comes from an institution I deeply respect, run by people who have supported me and many others over the years. The initiative is presented as a bold, positive dialogue between technology and creativity, a potentially compelling partnership. But what catches my attention isn’t what’s included; it’s what is glaringly absent.

There is no reference to ethics, training data, consent, environmental impact, or job security, as if it were unimaginable that this technology threatens to undermine the entire ecosystem of artists, craft, and labor that the RBO has nurtured.

A driverless taxi navigating the streets of San Francisco. Photo: Anadolu Agency/Getty Images

The tone is reminiscent of what we heard at the Hillsborough mansion—always optimistic. Royal Opera Artistic Director Oliver Mears declared, “AI is here to stay” in a recent New York Times interview. “You can bury your head in the sand or embrace the waves.”

Yet no one I meet in San Francisco, where this technology is built and marketed, is simply embracing any waves. Embracing a wave means surrendering to its force. People here are focused on managing the tides, and on altering the moon if need be.

I don’t want to dismiss AI. But my earlier phrase, “the cat is out of the bag”, now feels like a form of moral indifference, as if ethics fall by the wayside the moment something novel appears. After spending a summer immersed in this machinery, it’s unsettling to watch major institutions handle AI as if it were the nuclear power of art: attractive, profitable, already causing harm, and yet carrying no warning label.

In this fast-paced environment, our documentary already feels like a piece of history, a snapshot of the last moment before the future stopped asking for permission. That sunlit afternoon in the hacker mansion, with its raked gravel and palpable silences, now feels suspended: an interlude before the surge.

Listening back, I can sense the atmosphere shift—the silence that followed Faye’s question and my nervous chuckle. It’s the sound of tension, the sound of humanity still grounded.

If Oasis at Knebworth was the last great singalong before the internet, perhaps this brief moment we chronicled represents the anxious inhalation before the machine begins to produce its own melody.

Tariq O’Regan is a composer based in San Francisco, originally from London. ‘The Artificial Composer’, a BBC Radio 3 Sunday Feature produced by Faye Lomas, is now available on BBC Sounds.

Source: www.theguardian.com