IBM’s Quantum System Two Unveiled at a Data Center in Germany
Quantum computing has been making headlines lately. You might have noticed quantum chips and their intriguing cooling systems dominating your news feed. From politicians to business leaders, the term “quantum” is everywhere. If you find yourself perplexed, consider setting a New Year’s resolution to grasp the fundamentals of quantum computing this year.
This goal may seem daunting, but the timing is perfect. The quantum computing sector has achieved significant breakthroughs lately, making it a hotbed of innovation and investment, with the market expected to exceed $1 billion, likely doubling in the coming years. Yet, high interest often leads to disproportionate hype.
There remain numerous questions about when quantum computers might outpace classical ones. While mathematicians and theorists ponder these queries, the practical route may be to improve quantum computers through experimentation. However, consensus on the best methodologies for building these systems is still elusive.
Compounding the complexity, quantum mechanics itself is notoriously challenging to comprehend. Physicists debate interpretations of bizarre phenomena like superposition and entanglement, which are pivotal for quantum computing’s potential.
Feeling overwhelmed? You’re not alone. But don’t be discouraged; these challenges can be overcome with curiosity.
As a former high school teacher, I often encountered curious students who would linger after class, eager to discuss intricate aspects of quantum computing. Many were novice learners in math or physics, yet they posed thought-provoking questions. One summer, a group who took an online quantum programming course approached me, surpassing my own coding knowledge in quantum applications. The following year, we delved into advanced topics typically reserved for college-level classes.
Recently, I discovered a young talent in quantum inquiry: a 9-year-old YouTuber named Kai, who co-hosts a podcast called The Quantum Kid, interviewing leading quantum computing experts for an audience of more than 88,000 subscribers.
Kai’s co-host, Katya Moskvich, is not only his mother but also a physicist with extensive experience in science writing. She works at Quantum Machines, a firm developing classical devices that enhance the functionality of quantum computers. Kai brings an infectious enthusiasm to the podcast, engaging with pivotal figures who have influenced modern quantum theory.
In a recent episode, renowned quantum algorithm creator Peter Shor discussed the intersection of quantum computing, sustainability, and climate action. Nobel laureate Steven Chu and distinguished computer scientist Scott Aaronson also joined, exploring concepts like time travel and its theoretical connections to quantum mechanics. Additionally, physicist John Preskill collaborated with roboticist Ken Goldberg to examine the interplay of quantum computing and robotics.
Kai and Co-Host (Mother) Katya Moskvich
While The Quantum Kid may not delve deep into rigorous math, it offers a fun entry point and insight from leading experts in quantum technology. Most episodes introduce fundamental concepts like superposition and Heisenberg’s uncertainty principle, which you can explore further in reputable publications such as New Scientist.
The true strength of The Quantum Kid lies in Kai’s ability to ask the very questions that an inquisitive mind might have regarding quantum computers—those which seek to unpack the complex yet fascinating nature of this technology. If you’ve been curious about quantum computing but have felt overwhelmed, Kai encourages you to remain inquisitive and seek clarity. (We’re here to guide you on your quantum journey.)
Could quantum computers revolutionize space exploration or even facilitate time travel? Might they help develop advanced robotics or combat climate issues? The answers are not straightforward and come laden with nuance. Kai’s engaging dialogues make complex theories accessible, ensuring clarity resonates with both young listeners and adults. Hearing Peter Shor reiterate that current quantum systems lack the clout to change the world doesn’t dampen Kai’s enthusiasm but rather fuels it.
In the pilot episode, physicist Lennart Renner expresses optimism, stating, “We’re evolving alongside new machines that can potentially revolutionize tasks, hence we must deliberate on their applications,” setting a forward-thinking tone that reverberates throughout the series.
Adopting a blend of Kai’s wonder and imagination, coupled with the seasoned expertise of guests, will enhance any quantum learning project you embark on this year. Quantum computing, while intricate and multifaceted, remains incredibly compelling. If your child is captivated, why not explore it together?
As quantum computing technology evolves, it becomes crucial to pinpoint challenges that can be tackled more efficiently than with classical computers. Interestingly, many significant tasks that quantum advocates are pursuing may not necessitate quantum computing at all.
The focal point of this discussion is a molecule called FeMoco, essential for life on Earth due to its role in nitrogen fixation. This process enables microorganisms to convert atmospheric nitrogen into ammonia, making it biologically available for other organisms. The mechanisms of FeMoco are intricate and not completely understood, but unraveling this could greatly diminish energy usage in fertilizer production and enhance crop yields.
Understanding FeMoco involves determining its lowest energy state, or “ground state” energy, which necessitates examining the behavior of many electrons. Electrons, being quantum particles, exhibit wave-like properties and occupy distinct regions known as orbitals. This complexity has historically made it challenging for classical computers to calculate the various properties of FeMoco accurately.
While approximation methods have shown some success, their energy estimates have been constrained in accuracy. Conversely, rigorous mathematical analyses have demonstrated that quantum computers, utilizing a fundamentally different encoding of complexity, can resolve problems without relying on approximations, exemplifying what is known as ‘quantum advantage.’
Now, researchers such as Garnet Kin-Lic Chan from the California Institute of Technology have unveiled a conventional calculation method capable of achieving accuracy comparable to quantum calculations. A pivotal metric in this discussion is “chemical accuracy,” which signifies the minimum accuracy required to yield reliable predictions about chemical processes. Based on their findings, Chan and colleagues assert that standard supercomputers can compute FeMoco’s ground state energy with the necessary precision.
FeMoco embodies various quantum states, each with distinct energy levels, forming a structure similar to a ladder with the ground state at the base. To streamline the process for classical algorithms to reach this lowest level, researchers concentrated on the states located on adjacent rungs and inferred their implications for what may exist one or two steps below. Insights into the symmetries of the electrons’ quantum states offered valuable context.
This simplification allowed researchers to use classical algorithms to establish an upper limit on FeMoco’s ground state energy and subsequently extrapolate it to a value with an uncertainty consistent with chemical accuracy. Essentially, the computed lowest energy state must be precise enough for future research applications.
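To see the logic of bounding a ground-state energy from above, here is a deliberately tiny sketch in Python. It is not the FeMoco calculation (whose electronic Hamiltonian is astronomically larger and whose algorithms are far more sophisticated); it simply uses a small random Hermitian matrix as a stand-in Hamiltonian, evaluates the energy of a trial state as a variational upper bound, and compares it with the exact lowest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "Hamiltonian": a small random Hermitian matrix standing in for the
# vastly larger electronic Hamiltonian of a molecule like FeMoco.
n = 64
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2

# Exact ground-state energy: the lowest eigenvalue of H.
exact_ground = np.linalg.eigvalsh(H)[0]

# Variational principle: the energy of *any* normalised trial state is an
# upper bound on the ground-state energy. Here the trial state is just a
# random vector; real calculations use carefully structured ansatzes.
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)
upper_bound = np.real(psi.conj() @ H @ psi)

print(f"exact ground-state energy : {exact_ground:.4f}")
print(f"variational upper bound   : {upper_bound:.4f}")
assert upper_bound >= exact_ground  # guaranteed by the variational principle
```

The classical methods described above refine bounds of this kind, then extrapolate until the remaining uncertainty sits within chemical accuracy.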
Furthermore, the researchers estimate that supercomputing methods could outperform quantum techniques: a calculation projected to take a quantum computer around eight hours could be completed classically in under a minute. This assumption relies on ideal supercomputer performance.
However, does this discovery mean you’ll instantly understand FeMoco and enhance agricultural practices? Not entirely. Numerous questions remain unanswered, such as which molecular components interact most effectively with nitrogen and what intermediate molecules are produced in the nitrogen fixation process.
“While this study does not extensively detail the FeMoco system’s capabilities, it further elevates the benchmark for quantum methodologies as a model to illustrate quantum benefits,” explains David Reichman from Columbia University in New York.
Dominic Berry, a professor at Macquarie University in Sydney, Australia, points out that although the new research demonstrates that classical computers can approach the FeMoco problem, it does so only through approximations, whereas quantum methods promise to solve the problem in full.
“This raises questions about the rationale for utilizing quantum computers for such challenges; however, for more intricate systems, we anticipate that the computational time for classical approaches will escalate much faster than quantum algorithms,” he states.
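Berry’s scaling argument can be illustrated with a toy comparison. The cost models below are assumptions chosen purely for illustration (exponential classical cost versus polynomial quantum cost), not measured figures for FeMoco-like systems.

```python
# Illustrative scaling comparison (assumed cost models, not measured data):
# classical cost ~ 2**n, quantum cost ~ n**4, both in arbitrary units.
for n in (10, 20, 30, 40, 50):
    classical = 2.0 ** n
    quantum = float(n ** 4)
    better = "classical" if classical < quantum else "quantum"
    print(f"system size {n:2d}: classical ~ {classical:.2e}, "
          f"quantum ~ {quantum:.2e} -> {better} cheaper")
```

Under these assumed exponents the classical approach wins for small systems but loses badly once the system grows, which is exactly the crossover Berry describes.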
Another hurdle is that quantum computing technology is still evolving. Existing quantum devices are currently too limited and error-prone for tackling problems like determining FeMoco’s ground state energy. Yet, a new generation of fault-tolerant quantum computers, capable of self-correction, is on the horizon. From a practical standpoint, Berry suggests that quantum computing may still represent the optimal approach to deciphering FeMoco and related molecules. “Quantum computing will eventually facilitate more general solutions to these systems and enable routine computations once fault-tolerant quantum devices become widely available.”
Researchers from the University of Waterloo and Kyushu University have achieved a groundbreaking advancement in quantum computing by developing a novel method to create redundant, encrypted copies of qubits. This represents a pivotal step towards practical quantum cloud services and robust quantum infrastructure.
Google’s quantum computer – Image credit: Google.
In quantum mechanics, the no-cloning theorem asserts that creating an identical copy of an unknown quantum state is impossible.
Dr. Achim Kempf from the University of Waterloo and Dr. Koji Yamaguchi from Kyushu University emphasize that this fundamental rule remains intact.
However, they have demonstrated a method to generate multiple encrypted versions of a single qubit.
“This significant breakthrough facilitates quantum cloud storage solutions, such as quantum Dropbox, quantum Google Drive, and quantum STACKIT, enabling the secure storage of identical quantum information across multiple servers as redundant encrypted backups,” said Dr. Kempf.
“This development is a crucial step towards establishing a comprehensive quantum computing infrastructure.”
“Quantum computing offers immense potential, particularly for addressing complex problems, but it also introduces unique challenges.”
“One major difficulty in quantum computing is the no-cloning theorem, which dictates that quantum information cannot be directly copied.”
“This limitation arises from the delicate nature of quantum information storage.”
According to the researchers, quantum information functions analogously to splitting passwords.
“If you possess half of a password while your partner holds the other half, neither can be utilized independently. However, when both sections are combined, a valuable password emerges,” Dr. Kempf remarked.
“In a similar manner, qubits are unique in that they can share information in exponentially growing ways as they interconnect.”
“A single qubit’s information is minimal; however, linking multiple qubits allows them to collectively store substantial amounts of information that only materializes when interconnected.”
“This exceptional capability of sharing information across numerous qubits is known as quantum entanglement.”
“With 100 qubits, information can be simultaneously shared in 2^100 different ways, allowing for a level of shared entangled information far exceeding that of current classical computers.”
“Despite the vast potential of quantum computing, the no-cloning theorem restricts its applications.”
“Unlike classical computing, where duplicating information for sharing and backup is a common practice, quantum computing lacks a simple ‘copy and paste’ mechanism.”
“We have uncovered a workaround for the no-cloning theorem of quantum information,” explained Dr. Yamaguchi.
“Our findings reveal that by encrypting quantum information during duplication, we can create as many copies as desired.”
“This method circumvents the no-cloning theorem because when an encrypted copy is selected and decrypted, the decryption key is automatically rendered unusable; it functions as a one-time key.”
“Nevertheless, even one-time keys facilitate crucial applications such as redundant and encrypted quantum cloud services.”
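A loose classical analogy for "encrypted copies that are useless without a one-time key" is the one-time pad, sketched below in Python. This is only an illustration of the idea in the quotes above, not the researchers' quantum protocol: each encrypted copy looks like random noise on its own, and a key that is consumed on use can unlock only one copy.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad style operation: XOR data with an equally long random key."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"qubit backup"

# Make several encrypted "copies", each under its own fresh key.
copies = []
for _ in range(3):
    key = secrets.token_bytes(len(message))
    copies.append((xor_cipher(message, key), key))

ciphertext, key = copies[0]
print(ciphertext)                   # looks like random bytes on its own
print(xor_cipher(ciphertext, key))  # XOR with the key recovers the message

# Classically the key could be reused; the quantum scheme described above
# differs in that decrypting one copy consumes the key, so only a single
# copy can ever be turned back into the original qubit.
```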
Quantum Machines co-founder Jonathan Cohen presenting at the AQC25 conference
Quantum Machines
Classical computers are emerging as a critical component in maximizing the functionality of quantum computers. This was a key takeaway from this month’s assembly of researchers who emphasized that classical systems are vital for managing quantum computers, interpreting their outputs, and enhancing future quantum computing methodologies.
Quantum computers operate on qubits—quantum entities manifesting as extremely cold atoms or miniature superconducting circuits. The computational capability of a quantum computer scales with the number of qubits it possesses.
Yet, qubits are delicate and require meticulous tuning, monitoring and control. If these conditions are not met, the computations they perform can yield errors, making the devices less effective. To manage qubits, researchers rely on classical computing methods. The AQC25 conference, held on November 14th in Boston, Massachusetts, addressed these challenges.
Sponsored by Quantum Machines, a company specializing in controllers for various qubit types, the AQC25 conference gathered over 150 experts, including quantum computing scholars and CEOs from AI startups. Through numerous presentations, attendees elaborated on the enabling technologies vital for the future of quantum computing and how classical computing sometimes acts as a constraint.
According to Shane Caldwell, fault-tolerant quantum computers able to tackle practical problems will only materialize when paired with a robust classical computing framework operating at petascale, comparable to today’s leading supercomputers. Although Nvidia does not produce quantum hardware, it has recently introduced a system that links quantum processors (QPUs) to traditional GPUs, which are commonly employed in machine learning and high-performance scientific computing.
Even in optimal operations, the results from a quantum computer reflect a series of quantum properties of the qubits. To utilize this data effectively, it requires translation into conventional formats, a process that again relies on classical computing resources.
Pooya Ronagh from Vancouver-based startup 1QBit discussed this translation process and its implications, noting that the speed of fault-tolerant quantum computers often hinges on the efficiency of classical components such as controllers and decoders. Whether a sophisticated quantum machine takes hours or days to solve a problem can therefore depend significantly on its classical parts.
In another presentation, Benjamin Lienhardt from the Walther Meissner Institute for Low Temperature Research in Germany presented findings on how traditional machine learning algorithms can aid the interpretation of quantum states in superconducting qubits. Similarly, Mark Saffman from the University of Wisconsin-Madison highlighted the use of classical neural networks to improve the readout of qubits made from ultra-cold atoms. Researchers broadly agreed that non-quantum devices are instrumental in unlocking the potential of various qubit types.
IBM’s Blake Johnson shared insights into a classical decoder his team is developing as part of an ambitious plan to create a quantum supercomputer by 2029. This endeavor will employ unconventional error correction strategies, making the efficient decoding process a significant hurdle.
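To get a feel for why decoding is a classical job, consider the simplest possible error-correcting code. The sketch below is a minimal illustration, not IBM's scheme: a three-bit repetition code in which qubit measurements yield ordinary bits and classical software infers the most likely error by majority vote.

```python
from collections import Counter
import random

def encode(bit: int) -> list[int]:
    """Encode one logical bit redundantly as three physical bits."""
    return [bit, bit, bit]

def noisy_channel(bits: list[int], p_flip: float = 0.1) -> list[int]:
    """Flip each bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote: the classical decoding step."""
    return Counter(bits).most_common(1)[0][0]

random.seed(1)
trials = 100_000
logical_errors = sum(
    decode(noisy_channel(encode(0))) != 0 for _ in range(trials)
)
# With p_flip = 0.1 the raw error rate is 10%, but the decoded logical error
# rate is roughly 2.8%, showing how redundancy plus fast classical decoding
# suppresses errors. Real quantum codes and decoders are far more complex.
print(logical_errors / trials)
```

Real decoders must make such inferences continuously and at hardware speed, which is why they are a bottleneck worth an entire engineering effort.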
“As we progress, the trend will shift increasingly towards classical [computing]. The closer one approaches the QPU, the more you can optimize your system’s overall performance,” stated Jonathan Cohen from Quantum Machines.
Classical computing is also instrumental in assessing the design and functionality of future quantum systems. For instance, Izhar Medalcy, co-founder of the startup Quantum Elements, discussed how an AI-powered virtual model of a quantum computer, often referred to as a “digital twin,” can inform actual hardware design decisions.
Representatives from the Quantum Scaling Alliance, co-led by 2025 Nobel Laureate John Martinis, were also present at the conference. This reflects the importance of collaboration between quantum and classical computing realms, bringing together qubit developers, traditional computing giants like Hewlett Packard Enterprise, and computational materials specialists such as the software company Synopsys.
The collective sentiment at the conference was unmistakable. The future of quantum computing is on the horizon, bolstered significantly by experts who have excelled in classical computing environments.
OpenAI has secured a $38 billion (£29 billion) agreement to leverage Amazon’s infrastructure for its artificial intelligence offerings, part of a broader initiative exceeding $1 trillion in investments in computing resources.
This partnership with Amazon Web Services provides OpenAI with immediate access to AWS data centers and the Nvidia chips utilized within them.
Last week, OpenAI CEO Sam Altman said the company had committed to investing $1.4 trillion in AI infrastructure, amid concerns over the sustainability of the rapidly expanding data center ecosystem that serves as the backbone of AI applications such as ChatGPT.
“To scale frontier AI, we need large-scale, dependable computing,” Altman remarked on Monday. “Our collaboration with AWS enhances the computing ecosystem that fuels this new era and makes sophisticated AI accessible to all.”
OpenAI indicated that this deal will provide access to hundreds of thousands of Nvidia graphics processors for training and deploying its AI models. Amazon plans to incorporate these chips into its data centers to enhance ChatGPT’s performance and develop OpenAI’s upcoming models.
AWS CEO Matt Garman reaffirmed that OpenAI is continuously pushing technological boundaries, with Amazon’s infrastructure forming the foundation of these ambitions.
OpenAI aims to develop 30 gigawatts of computing capacity, enough to supply power to approximately 25 million homes in the U.S.
Recently, OpenAI declared its transformation into a for-profit entity as part of a restructuring effort that values the startup at $500 billion. Microsoft, a long-time supporter, will hold roughly 27% of the new commercial organization.
The race for computing resources among AI firms has sparked worries among market analysts regarding financing methods. The Financial Times reported that OpenAI’s annual revenue is approximately $13 billion, a figure starkly contrasted by its $1.4 trillion infrastructure expenditures. Other data center deals OpenAI has entered include a massive $300 billion agreement with Oracle.
During a podcast with Microsoft CEO Satya Nadella, Altman addressed concerns regarding spending, stating “enough is enough” when prompted by host Brad Gerstner about the disparity between OpenAI’s revenue and its infrastructure costs.
Altman claimed that OpenAI generates revenue “well above” the reported $13 billion but did not disclose specific figures. He added: “Enough is enough…I believe there are many who wish to invest in OpenAI shares.”
Analysts at Morgan Stanley have forecast that global data center investment will approach $3 trillion from now until 2028, with half of this spending expected to come from major U.S. tech firms, while the remainder will be sourced from private credit and other avenues. The private credit market is an expanding segment of the shadow banking industry, raising concerns for regulators such as the Bank of England.
Former prime minister Sir Tony Blair has warned that “history will not excuse” Britain if it falls behind in the quantum computing race. This advanced technology is anticipated to ignite a new era of innovations across various fields, from pharmaceutical development to climate analysis.
“The United Kingdom risks losing its edge in quantum research,” cautioned the former Labour prime minister at the Tony Blair Institute, a think tank supported by tech industry veterans such as Oracle founder Larry Ellison.
In a report advocating for a national quantum computing strategy, Mr. Blair and former Conservative leader William Hague drew parallels between the current situation and the evolution of artificial intelligence. While the UK made significant contributions to AI research, it has since surrendered its leadership to other nations, particularly the US, which has triggered a race to develop “sovereign” AI capabilities.
“As demonstrated with AI, a robust R&D foundation alone is insufficient; countries with the necessary infrastructure and capital will capture the economic and strategic advantages of such technologies,” they noted. “While the UK boasts the second-largest number of quantum start-ups globally, it lacks the high-risk investment and infrastructure essential for scaling these ventures.”
Quantum computing operates in unusual and fascinating ways that contrast sharply with classical computing. Traditional computers process information through transistors that switch on or off, representing 1s and 0s. In quantum mechanics, however, entities can exist in multiple states simultaneously thanks to a phenomenon called quantum superposition, which allows a quantum bit to be effectively on and off at the same time.
This leads to a dramatic boost in computational capability, enabling a single quantum computer to perform tasks that would typically require billions of the most advanced supercomputers. Although the field is not yet mature enough for widespread application, the potential for simulating molecular structures to develop new materials and pharmaceuticals is vast, and the true value of quantum computing will lie in its practical delivery. Estimates suggest that industries such as chemicals, life sciences, automotive and finance could derive about $1.3 trillion in value from the technology.
There are increasing fears that extraordinarily powerful quantum machines could decipher all encryption and pose serious risks to national security.
Blair and Hague remarked: “The quantum era is upon us, whether Britain chooses to lead or not.” They added, “History will not excuse us if we squander yet another opportunity to excel in groundbreaking technology.”
This alert follows the recent recognition of British, Cambridge-educated John Clarke, who received the 2025 Nobel Prize in Physics for his contributions to quantum computing, alongside the continued growth of UK quantum firms supported by US companies.
In June, the Oxford University spinout Oxford Ionics was acquired by US company IonQ for $1.1 billion. Meanwhile, PsiQuantum, a spinout founded by researchers from the University of Bristol and Imperial College London, has thrived primarily in California, where it found its most enthusiastic investors, and is now building its first large-scale quantum computer in Brisbane, Australia.
A report from the Tony Blair Institute for Global Change critiques the UK’s current quantum approach, highlighting that both China and the US are “ahead of the game,” with countries like Germany, Australia, Finland, and the Netherlands also surpassing the UK.
A government representative stated: “Quantum technology has the potential to revolutionize sectors ranging from healthcare to affordable clean energy. The UK currently ranks second globally for quantum investment and possesses leading capabilities in supply chains such as photonics, yet we are resolute in pushing forward.”
They continued: “We have committed to a groundbreaking 10-year funding strategy for the National Quantum Computing Center and will plan other aspects of the national program in due course.”
In June, the Labour government unveiled a £670 million initiative to expedite the application of quantum computing, as part of an industrial strategy aimed at developing new treatments for untreatable diseases and enhancing carbon capture technologies.
Germanium is already utilized in standard computer chips
Matejimo/Getty Images
Superconductors made from germanium, a material traditionally used for computer chips, have the potential to revolutionize quantum computing by enhancing reliability and performance in the future.
Superconductors are materials that enable electricity to flow without resistance, making them ideal for various electrical applications, particularly in maintaining quantum coherence—essential for effective quantum computing.
Nonetheless, most superconductors have been specialized materials that are challenging to incorporate into computer chips. Peter Jacobson and his team at the University of Queensland, Australia, successfully developed a superconductor using germanium, a material already prevalent in the computing sector.
The researchers synthesized the superconductor by introducing gallium into a germanium film through a process called doping. Previous experiments in this area found instability in the resulting combination. To overcome this, the team utilized X-rays to infuse additional gallium into the material, achieving a stable and uniform structure.
However, similar to other known superconductors, this novel material requires cooling to a frigid 3.5 Kelvin (-270°C/-453°F) to function.
David Cardwell, a professor at the University of Cambridge, notes that while superconductors demand extremely low temperatures, making them less suitable for consumer devices, they could be ideally suited for quantum computing, which also necessitates supercooling.
“This could significantly impact quantum technology,” says Cardwell. “We’re already in a very cold environment, so this opens up a new level of functionality. I believe this is a clear starting point.”
Jacobson highlighted that previous attempts to stack superconductors atop semiconductors—critical components in computing—resulted in defects within their crystal structure, posing challenges for practical applications. “Disorder in quantum technology acts as a detrimental effect,” he states. “It absorbs the signal.”
In contrast, this innovative material enables the stacking of layers containing gallium-doped germanium and silicon while maintaining a uniform crystal structure, potentially paving the way for chips that combine the advantageous features of both semiconductors and superconductors.
Google has announced a significant breakthrough in quantum computing, having developed an algorithm capable of performing tasks that traditional computers cannot achieve.
This algorithm, which serves as a set of instructions for guiding the operations of a quantum computer, has the ability to determine molecular structures, laying groundwork for potential breakthroughs in areas like medicine and materials science.
However, Google recognizes that the practical application of quantum computers is still several years away.
“This marks the first occasion in history when a quantum computer has successfully performed a verifiable algorithm that surpasses the power of a supercomputer,” Google stated in a blog post. “This repeatable, beyond-classical computation establishes the foundation for scalable verification and moves quantum computers closer to practical utilization.”
Michel Devoret, Google’s chief scientist for quantum AI, who recently received the Nobel Prize in Physics, remarked that this announcement represents yet another milestone in quantum developments. “This is a further advancement towards full-scale quantum computing,” he noted.
The algorithmic advancement, allowing quantum computers to function 13,000 times faster than classical counterparts, is documented in a peer-reviewed article published in the journal Nature.
One expert cautioned that while Google’s accomplishments are impressive, they revolve around a specific scientific challenge and may not translate to significant real-world benefits. Results for two molecules were validated using nuclear magnetic resonance (NMR), akin to MRI technology, yielding insights not typically provided by NMR.
Winfried Hensinger, a professor of quantum technology at the University of Sussex, said that Google has achieved “quantum advantage”, meaning researchers have used a quantum computer for a task unattainable by classical systems.
Nevertheless, fully fault-tolerant quantum computers—which could undertake some of the most exciting tasks in science—are still far from realization, as they would necessitate machines capable of hosting hundreds of thousands of qubits (the basic unit of information in quantum computing).
“It’s crucial to recognize that the task achieved by Google isn’t as groundbreaking as some world-changing applications anticipated from quantum computing,” Hensinger added. “However, it represents another compelling piece of evidence that quantum computers are steadily gaining power.”
A truly capable quantum computer able to address a variety of challenges would require millions of qubits, but current quantum hardware struggles to manage the inherent instability of qubits.
“Many of the most intriguing quantum computers being discussed necessitate millions or even billions of qubits,” Hensinger explained. “Achieving this is even more challenging with the type of hardware utilized by the authors of the Google paper, which demands cooling to extremely low temperatures.”
Hartmut Neven, Google’s vice president of engineering, stated that quantum computers may be five years away from practical application, despite advances with an algorithm referred to as Quantum Echoes.
“We remain hopeful that within five years, Quantum Echoes will enable real-world applications that are solely feasible with quantum computers,” he said.
As a leading AI company, Google also asserts that quantum computers can generate unique data capable of enhancing AI models, thereby increasing their effectiveness.
Traditional computers represent information in bits (denoted by 0 or 1) and send them as electrical signals. Text messages, emails, and even Netflix movies streamed on smartphones consist of these bits.
By contrast, information in a quantum computer is represented by qubits. Found within compact chips, these qubits are particles such as electrons or photons that can exist in multiple states simultaneously – a concept known as superposition in quantum physics.
This characteristic enables qubits to concurrently encode various combinations of 1s and 0s, allowing computation of vast numbers of different outcomes, an impossibility for classical computers. Nonetheless, maintaining this state requires a strictly controlled environment, free from electromagnetic interference, as disturbances can easily disrupt qubits.
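One rough way to see why simulating such states overwhelms ordinary machines is to write a state vector down explicitly. The sketch below is a generic state-vector exercise, not any particular company's hardware: it builds an equal superposition of a few qubits and estimates how much memory the full list of amplitudes would need as the qubit count grows.

```python
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector of n qubits, each in an equal superposition of 0 and 1."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=np.complex128)

state = uniform_superposition(3)
print(state)                       # 8 amplitudes, one per combination of 1s and 0s
print(np.sum(np.abs(state) ** 2))  # probabilities sum to 1

# Memory needed just to *store* the amplitudes (16 bytes per complex number):
for n in (10, 30, 50, 100):
    print(f"{n:3d} qubits -> {2.0 ** n * 16:.3e} bytes")
# 100 qubits would need on the order of 2e31 bytes, far beyond any classical
# machine, which is why such states cannot simply be written out and copied.
```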
Progress by companies like Google has led to calls for governments and industries to implement quantum-proof cryptography, as cybersecurity experts caution that these advancements have the potential to undermine sophisticated encryption.
John Clarke, Michel Devoret and John Martinis awarded the 2025 Nobel Prize in Physics
Jonathan Nackstrand/AFP via Getty Images
The prestigious 2025 Nobel Prize in Physics was awarded to John Clarke, Michel Devoret, and John Martinis. Their research elucidates how quantum particles can tunnel through barriers, a critical process that underpins the superconducting quantum technology integral to modern quantum computers.
“I was completely caught off guard,” Clarke remarked upon hearing the news from the Nobel Committee. “This outcome was unimaginable; it felt like a dream to be considered for the Nobel Prize.”
Quantum particles exhibit numerous peculiar behaviors, including their stochastic nature and the restriction to specific energy levels instead of a continuous range. This phenomenon sometimes leads to unforeseen occurrences, such as tunneling through solid barriers. Such unusual characteristics were first revealed by pioneers like Erwin Schrödinger during the early years of quantum mechanics.
The implications of these discoveries are profound, particularly supporting theories like nuclear decay; however, earlier research was limited to individual particles and basic systems. It remained uncertain whether more intricate systems such as electronic circuits, conventionally described by classical physics, also adhered to these principles. For instance, the quantum tunneling effect seemed to vanish when observing larger systems.
In 1985, the University of California, Berkeley trio of Clarke, Martinis, and Devoret sought to change this narrative. They investigated the properties of charged particles traversing a superconducting circuit known as a Josephson junction, a device that earned British physicist Brian Josephson the Nobel Prize in Physics in 1973. These junctions comprise superconducting electrodes with zero electrical resistance, separated by a thin insulating barrier.
The researchers demonstrated that the current in the junction behaved as a single quantum entity, occupying discrete energy levels and tunnelling through an energy barrier to produce a measurable voltage, clear quantum behaviour in a macroscopic circuit.
This groundbreaking discovery significantly deepened our understanding of how to harness similar superconducting quantum systems, transforming the landscape of quantum science and enabling other scientists to conduct precise quantum physics experiments on silicon chips.
Moreover, superconducting quantum circuits became foundational to the essential components of quantum computers, known as qubits. Developed by companies like Google and IBM, the most advanced quantum computers today consist of hundreds of superconducting qubits, a result of the insights gained from Clarke, Martinis, and Devoret’s research. “In many respects, our findings serve as the cornerstone of quantum computing,” stated Clarke.
Both Martinis and Devoret have worked at Google Quantum AI, which in 2019 built the first superconducting quantum computer to demonstrate quantum advantage over traditional machines. However, Clarke noted to the Nobel Committee that it was surprising to consider the extent of the impact their 1985 study has had. “Who could have imagined that this discovery would hold such immense significance?”
Quantum computers can be developed using arrays of atoms
Alamy Stock Vector
Devices boasting over 6000 qubits are setting new records and represent the initial phase of constructing the largest quantum computer ever.
At present, there isn’t a universally accepted design for quantum computers, but researchers agree that these machines will need at least tens of thousands of qubits to be truly useful. The previous record holder was a quantum computer with 1180 qubits; now Hannah Manetsch at the California Institute of Technology and her team have assembled a 6100-qubit array.
These qubits are made from neutral cesium atoms that are chilled to near absolute zero and manipulated using a laser beam, all arranged neatly on a grid. According to Manetsch, they have fine-tuned the properties of these qubits to enhance their suitability for calculations, although they have yet to carry them out.
For instance, they modify the laser’s frequency and power to help the fragile qubits maintain their quantum state, thus ensuring the grid’s stability for more precise calculations and extended runtimes of the quantum machine. The research team also assessed how efficiently the lasers could shift qubits around within the array, as noted by Ellie Bataille at the California Institute of Technology.
“This is a remarkable demonstration of the straightforward scaling potential that neutral atoms present,” says Ben Bloom of Atom Computing, a company that also builds qubits from neutral atoms.
Mark Saffman from the University of Wisconsin-Madison emphasizes that the new experiments are vital, providing proof that neutral-atom quantum computers can reach significant sizes. However, further experimental validation is necessary before these setups can be considered fully developed quantum computers.
Research teams are currently investigating optimal methods for enabling qubits to perform calculations while employing error-reduction strategies, mentions Kon Leung at the California Institute of Technology. Ultimately, they envision scaling their systems to 1 million qubits over the next decade, he states.
False-colour images of quantum router circuits
MIT Squill Foundry
Quantum computers are poised to execute beneficial algorithms at an accelerated pace, thanks to advanced quantum routers that optimize data transmission efficiency.
Conventional computers mitigate slowdowns during complex program executions by temporarily storing information in random access memory (RAM). The essential component for developing QRAM, the quantum equivalent of RAM, is the router. This internal router manages information flow within the computer, distinct from a router that routes Internet queries to specific IP addresses.
Connie Miao at Stanford University and her team are actively building these devices. “Our project originated from an algorithm that employs QRAM. Numerous papers have emerged, but few have realised it experimentally,” she remarks.
This router is built from qubits, the core elements of quantum computers, and quantum memory composed of miniature superconducting circuits regulated by electromagnetic pulses. Like a traditional router, the quantum one directs quantum information to a specific quantum address. What makes these devices unique is that an address can be encoded not as a single location but as a superposition of two. The research team tested the setup on three qubits and achieved approximately 95% fidelity in routing.
This means that, once integrated into QRAM, the device can store information in superposed quantum states; in such a state, it is impossible to determine which of the two locations holds the stored information.
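The routing idea can be sketched as a three-qubit toy simulation: an address qubit, a data bit sitting in one port, and a second, initially empty port. This is a generic controlled-SWAP illustration of routing under a superposed address, a simplification for intuition rather than a model of the superconducting device described above.

```python
import itertools
import numpy as np

# Qubit order: (address, port0, port1); basis index = a*4 + p0*2 + p1.
# Controlled-SWAP: if the address qubit is 1, swap port0 and port1.
cswap = np.zeros((8, 8))
for a, p0, p1 in itertools.product((0, 1), repeat=3):
    out = (a, p1, p0) if a == 1 else (a, p0, p1)
    cswap[(out[0] << 2) | (out[1] << 1) | out[2],
          (a << 2) | (p0 << 1) | p1] = 1.0

# Address in an equal superposition, data bit = 1 sitting in port0, port1 empty.
address = np.array([1.0, 1.0]) / np.sqrt(2)
start = np.kron(address, np.kron([0.0, 1.0], [1.0, 0.0]))  # |+>|1>|0>

routed = cswap @ start
for idx, amp in enumerate(routed):
    if abs(amp) > 1e-12:
        a, p0, p1 = (idx >> 2) & 1, (idx >> 1) & 1, idx & 1
        print(f"address={a} port0={p0} port1={p1} amplitude={amp:+.3f}")
# The data ends up in port0 and port1 *at the same time*, entangled with the
# address, which is why "where the information went" cannot be read out.
```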
Duan Luming from Tsinghua University in China notes that previous quantum routers have operated only intermittently, and that the new device represents a significant advance towards practical QRAM, which may enable the execution of quantum machine learning algorithms.
Team member David Schuster at Stanford says that while numerous questions remain about the practical impact of precise quantum routing, the potential applications are extensive, ranging from familiar algorithms such as database search to the creation of quantum IP addresses for future iterations of the Internet.
However, the current version of the router is still not reliable enough for all of its intended purposes; further work is needed to reduce errors and to incorporate additional qubits in future designs, notes Sebastian Legger, another member of the project team at Stanford University.
Could a new approach lead to error-free quantum computers?
Nord Quantique
A Canadian quantum computing startup asserts that its new qubit technology will enable the development of smaller, more affordable and less error-prone quantum computers. However, reaching that goal presents a significant challenge.
Traditional computers mitigate errors by storing redundant copies of information in multiple locations. Replicating this kind of redundancy on a quantum computer typically requires many additional qubits, potentially hundreds of thousands.
Julien Camirand Lemyre and his team at Nord Quantique have engineered a qubit that promises to reduce this requirement to just a few hundred. “The fundamental principle of our hardware is to utilize qubits with inherent redundancy,” he notes.
Competing qubit technologies include small superconducting circuits and ultra-cold atoms. The Nord Quantique qubit employs a superconducting cavity filled with microwave radiation. Inside this cavity, photons are trapped and bounce back and forth, allowing information to be encoded within their quantum states.
This design is not entirely new; however, it’s the first instance of employing “multimode encoding.” Researchers utilize multiple properties of photons simultaneously to store information, thereby enhancing resilience against common quantum computing errors.
Victor Albert from the University of Maryland mentions that effective quantum error correction necessitates more qubits, meaning information is stored in interconnected groups rather than isolated qubits, safeguarding the system from individual failures.
The new qubit also incorporates a second technique that allows information to be stored effectively in a four-dimensional mathematical framework.
This is why Nord Quantique anticipates that its error-resistant quantum computers will be up to 50 times smaller than those using superconducting circuit qubits, currently the most advanced approach. The company also estimates that machines built with its qubits will consume about as much power as those using conventional methods.
Despite these advances, Nord Quantique has not yet released data on multiple qubits. Nor has it shown the multimode encoding working during computation, meaning the new qubit has yet to be applied to actual computational tasks. Significant technical hurdles remain before the team can achieve scalable quantum computing.
“It’s too early to conclude whether this fault-resistant approach will inherently outperform other methods,” remarks Barbara Terhal at Delft University of Technology in the Netherlands.
Michel Devoret from Yale University observes that while the new development is “not groundbreaking,” it enhances the science of quantum error correction and reflects the company’s grasp of technical difficulties.
Camirand Lemyre says the team is actively working on building additional qubits and refining existing designs. They aim to implement a “perfect mechanism” for manipulating information stored within the qubit, essential during quantum computation. The goal is to create a practical quantum computer featuring over 100 error-resilient qubits by 2029.
What is the difference between artificial intelligence and quantum computing? One is a sci-fi-sounding technology that has long promised to revolutionize our world, provided researchers can iron out a few technical wrinkles, such as its tendency to make errors. In fact, so is the other.
Still, AI is breathlessly promoted as all but inevitable, while the average person has no experience of quantum computing at all. Does that matter?
Practitioners in both fields certainly commit the crime of hyping their products, but part of the problem for quantum advocates is that the current generation of quantum computers is essentially useless. As our special report on the state of the industry explains (see “Quantum Computers Finally Arrived, Will They Be Useful?”), a race is now under way to build machines that can do genuinely useful calculations, ones that are not possible on a regular computer.
The lack of a clear use case hasn’t stopped high-tech giants from forcing AI into the software we use every day, but the delicate nature of quantum hardware makes it much harder to bring quantum computing to the masses in the same way. You probably won’t ever own a personal quantum computer. Instead, the industry is targeting businesses and governments.
Practitioners in both AI and quantum computing fields are guilty of hyping their products
Perhaps that’s why quantum computer builders seem to keep one foot firmly in science, drumming up business while still publishing peer-reviewed research. The major AI companies, by contrast, appear to have all but given up on publishing. Why bother, when you can simply charge a monthly fee for your technology, whether it actually works or not?
The quantum approach is the right one. When you are promising a technology that will transform research, industry and society, explaining how it works as openly as possible is the only way to persuade people to believe the hype.
It may not be flashy, but in the long run what counts is not style, it’s substance. So by all means aim to revolutionize the world, but please show your work.
Google has developed a computing chip, measuring just 4cm square, with unprecedented speed. In just five minutes, this chip can complete tasks that would take conventional computers 10 septillion years to finish – a mind-boggling number vastly surpassing the age of our universe.
The chip, named Willow, is the size of an After Eight Mint and could revolutionize drug development by accelerating the experimental phase. Recent advancements suggest that within five years, quantum computing will transform research and development across various industries.
Willow boasts fewer errors, enhancing the potential of artificial intelligence. Quantum computing leverages matter existing in multiple states simultaneously to make vast calculations beyond previous capabilities, expediting advancements in medicine and technology.
However, concerns remain about security vulnerabilities posed by quantum computing – the ability to breach even the most robust encryption systems.
Google Quantum AI, alongside other entities like Microsoft, Harvard University, and Quantinuum, is working on harnessing quantum mechanics for computing. Overcoming challenges in error correction has paved the way for significant speed enhancements and groundbreaking developments.
Quantum processors are evolving rapidly, surpassing traditional computers and unlocking new possibilities for quantum computation. The ability of quantum bits to exist in multiple states simultaneously promises remarkable capabilities across various fields.
Dr Peter Leek, Research Fellow at the University of Oxford’s Quantum Institute and founder of Oxford Quantum Circuits, acknowledges the rapid advancements in quantum computing technology. While applauding Google’s progress in error correction, he highlights the need for practical applicability in real-world scenarios.
As quantum computing approaches practical implementation, collaboration across various fields becomes crucial to navigate challenges and harness the full potential of this groundbreaking technology.
AI could help us predict the weather more accurately
LaniMiro Lotufo Neto/Alamy
Google researchers have developed an artificial intelligence that they say can predict weather and climate patterns as accurately as current physical models, but with less computing power.
Existing forecasts are based on mathematical models run by extremely powerful supercomputers that deterministically predict what will happen in the future. Since they were first used in the 1950s, these models have become increasingly detailed and require more and more computer power.
Several projects aim to replace these computationally intensive tasks with much less demanding AI, including a DeepMind tool that forecasts localized rainfall over short periods of time. But like most AI models, the problem is that they are “black boxes” whose inner workings are mysterious and whose methods can’t be explained or replicated. And meteorologists say that if these models are trained on historical data, they will have a hard time predicting unprecedented events now being caused by climate change.
Now, Dmitry Kochkov at Google Research in California and his colleagues have created a model called NeuralGCM that balances the two approaches.
Typical climate models divide the Earth's surface into a grid of cells up to 100 kilometers in size. Due to limitations in computing power, simulating at high resolution is impractical. Phenomena such as clouds, turbulence, and convection within these cells are only approximated by computer codes that are continually adjusted to more closely match observed data. This approach, called parameterization, aims to at least partially capture small-scale phenomena that are not captured by broader physical models.
NeuralGCM has been trained to take over this small-scale approximation, making the model less computationally intensive and more accurate. In the paper, the researchers say their model can process 70,000 days of simulation in 24 hours using a single chip called a Tensor Processing Unit (TPU). By comparison, a competing model called X-SHiELD requires a supercomputer with thousands of processing units to run a simulation covering just 19 days.
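The hybrid structure, a coarse physical step plus a learned correction for what the grid cannot resolve, can be sketched in a few lines. The toy below is a deliberate caricature: a one-dimensional advection step stands in for the dynamical core, and a fixed random linear map stands in for the trained neural network. It is not NeuralGCM itself, merely an illustration of how the two pieces combine.

```python
import numpy as np

rng = np.random.default_rng(0)

n_cells = 100                                        # coarse grid, like climate-model cells
state = np.sin(np.linspace(0, 2 * np.pi, n_cells))   # toy atmospheric field

def dynamical_core(u: np.ndarray, dt: float = 0.1) -> np.ndarray:
    """Resolved physics: a simple upwind advection step on the coarse grid."""
    return u - dt * (u - np.roll(u, 1))

# Stand-in for a trained network that corrects unresolved sub-grid processes
# (clouds, convection, turbulence). Here it is just a fixed random linear map;
# in the real hybrid model this component is learned from data.
W = rng.normal(scale=0.01, size=(n_cells, n_cells))
def learned_correction(u: np.ndarray) -> np.ndarray:
    return W @ np.tanh(u)

def hybrid_step(u: np.ndarray) -> np.ndarray:
    """One model step: resolved physics first, then the learned correction."""
    return dynamical_core(u) + learned_correction(u)

for _ in range(10):
    state = hybrid_step(state)
print(state[:5])
```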
The paper also claims that NeuralGCM’s predictions are comparable to or better than those of best-in-class models. Google did not respond to New Scientist’s request for an interview.
Tim Palmer at the University of Oxford says the work is an interesting attempt to find a third way between pure physics and opaque AI approximations: “I'm uncomfortable with the idea of completely abandoning the equations of motion and moving to AI systems that even experts say they don't fully understand,” he says.
This hybrid approach is likely to spur further discussion and research in the modeling community, but time will tell whether it will be adopted by modeling engineers around the world, he says. “It's a good step in the right direction and the type of research we should be doing. It's great to see different alternatives being explored.”
The number of girls studying computing GCSEs in England has more than halved in less than a decade, leading to warnings about “male dominance in shaping the modern world”.
The sharp fall in female participation comes as government changes to qualifications see the old Information and Communications Technology (ICT) GCSE abolished and replaced with a new Computer Science GCSE.
Government reforms aimed to create “more academically challenging and knowledge-based” qualifications, but the introduction of the new curriculum had the unintended consequence of reducing female enrolments, new research from King’s College London has found.
In 2015, 43% of ICT GCSE candidates were women, but in 2023, just 21% of those taking GCSE Computer Science were women.
To put the figures in perspective, 40,000 girls took ICT GCSEs and a further 5,000 took Computer Science in 2015. By 2023, with ICT no longer available, just 18,600 girls took Computer Science.
When asked why, girls who chose not to study computer science said they didn’t enjoy it and that it didn’t fit into their career plans, the survey found.
Critics of the old ICT qualification complained that they only taught students how to use Microsoft Office. In contrast, the new Computer Science GCSE, with its emphasis on computer theory, coding and programming, is perceived by many students as “harder” than other subjects.
The study recognised that computer science GCSEs are here to stay, with 88,000 students taking the subject in 2023, and a four-fold increase in the number of A-level candidates between 2013 and 2023.
“However, these successes coincide with a general decline in computer and digital skills education at secondary school level, particularly affecting girls, certain ethnic groups and students from disadvantaged socio-economic backgrounds,” the report said.
The report included a series of recommendations calling for urgent curriculum reform, more support for computing teachers and “expanding the current narrative about computing to focus on more than just male tech entrepreneurs.”
“The lack of women in the computing industry could lead to increased vulnerability and male dominance in shaping the modern world,” the authors warned.
“There is an urgent need for action to encourage more girls to study computing at school so they can gain the digital skills they need to participate in and shape the world,” said Dr Peter Kemp, lead researcher on the study and senior lecturer in computing education at King’s College London.
“Current GCSEs focus on developing computer science and programming skills and this appears to be preventing young people, particularly girls, from taking up the subject. We need to ensure that computing is attractive to all pupils and meets the needs of young people and society.”
“All students should leave with the digital skills they need to succeed in the workplace and society,” says Pete Dolling, head of computing at Fulford School in York. “The curriculum needs to be reformed to include a comprehensive computing GCSE that provides essential skills and knowledge, going beyond just computer science.”
Maggie Philbin, the technology broadcaster and director of TeenTech, which promotes digital skills, added: “At the moment many students consider the subject to be ‘difficult’ and will vote with their feet if they want to achieve the best results. It’s time to look at this subject with a fresh eye and work with teachers to design a curriculum that is more engaging and that teachers can be confident delivering.”
On my desk, next to my ultra-modern gaming PC, sits a strange device that resembles a spaceship control panel from a 1970s sci-fi movie. There’s no keyboard or monitor, just a few rows of colorful switches beneath a string of blinking lights. If you thought the recent proliferation of retro video game consoles, such as the Mini SNES and the Mega Drive Mini, was an amazing development in technology nostalgia, look no further than the PiDP-10. It’s a 2/3-scale replica of the PDP-10 mainframe computer, first introduced by Digital Equipment Corporation (DEC) in 1966, designed and built by an international group of computer enthusiasts known as Obsolescence Guaranteed.
It’s a beautiful thing.
The project’s genesis dates back to 2015, when Oscar Vermeulen, a Dutch economist and lifelong computer collector, wanted to build a single replica of the PDP-8 mainframe that had fascinated him since childhood. “I had a Commodore 64 and proudly showed it to a friend of my father’s,” Vermeulen says. “He scoffed and said the Commodore was a toy. The real computer was the PDP, specifically the PDP-8. So I started looking for discarded PDP-8 computers, but I couldn’t find a single one. Now they’re collector’s items, very expensive and most of the time broken. So I decided to build a replica for myself.”
Ever the perfectionist, Vermeulen decided he needed a professionally made front panel cover. “The company that could make them told me I’d have to pay for one four-square-metre sheet of Perspex, enough to cover 50 of these panels,” Vermeulen says. “So I made 49 extra ones, thinking I’d find 49 idiots to take them off my hands. Little did I know they’d end up as thousands of dollars’ worth of panels on my dinner table.”
At the same time, Vermeulen began posting in various vintage computing Google Groups, where enthusiasts worked on software emulators for pre-microprocessor computers. As word spread about his replica, it quickly became a group effort that now has over 100 members. While Vermeulen focuses on designing the hardware replica (a front panel with working switches and lights), others work on different aspects of the open-source software emulation, which has a complicated history. At its core is SIMH, a program created by ex-DEC employee and megastar hacker Bob Supnik that emulates a variety of classic computers; it was later improved by Richard Cornwell and Lars Brinkhoff to add support for the PDP-10. Many other people were involved with the operating system and other MIT software: some collected and preserved old backup tapes, some added improvements and debugging, and some provided documentation and schematics.
Happy hacking! …PiDP-10 replica computer in Keith Stewart’s game room Photo: Keith Stewart/The Guardian
The attention to detail is incredible. The lights on the front aren’t just decorative: they show the instructions being executed, CPU signals, and memory contents, just like the original machine. Vermeulen calls it watching the heartbeat of the computer. This element was taken very seriously. “Two people spent months on one particular problem,” Vermeulen says. “You know, LEDs blink, but incandescent bulbs glow. So we worked exhaustively on getting the LEDs to simulate the glow of the original bulbs. And we found that different bulbs from different years glow for different amounts of time. Measurements were made, calculations were applied, and the glow of the lamps was reproduced. More CPU time is spent simulating that glow than simulating the original computer.”
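The idea behind that glow simulation can be captured in a few lines. The sketch below is purely illustrative and not the emulator’s actual code: the time constants and the first-order filament model are assumptions, but they show how an instantly switching LED can be made to fade like a bulb.

```python
import math

# Minimal sketch (not the project's actual code) of the idea Vermeulen
# describes: an incandescent filament has thermal lag, so its brightness
# rises and fades gradually, while an LED switches instantly. Filtering
# the lamp's on/off signal through a first-order model and feeding the
# result to the LED's PWM duty cycle reproduces the glow.
# TAU_ON and TAU_OFF are illustrative values, not measured ones.

TAU_ON = 0.040   # seconds for the filament to reach full brightness (assumed)
TAU_OFF = 0.080  # seconds for the filament to fade out (assumed)
TICK = 0.001     # simulation step: 1 ms

def step(brightness: float, lamp_driven: bool) -> float:
    """Advance the modelled filament brightness by one tick."""
    target = 1.0 if lamp_driven else 0.0
    tau = TAU_ON if lamp_driven else TAU_OFF
    alpha = 1.0 - math.exp(-TICK / tau)   # first-order exponential approach
    return brightness + (target - brightness) * alpha

# Example: a lamp pulsed on for only 5 ms still glows visibly afterwards,
# which is what distinguishes a bulb from a bare LED.
b = 0.0
for t in range(20):
    b = step(b, lamp_driven=(t < 5))
    # in a real panel, b would set the LED's PWM duty cycle here
```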
Why? Why go to all this trouble? First, there’s the historical importance. The PDP machines, built between 1959 and the early 1970s, were revolutionary. Not only were they much cheaper than the giant mainframes used by the military and big corporations, but they were designed to be general-purpose, fully interactive machines. Instead of writing a program on punch cards, handing it to the IT department to run on the computer, and debugging it from a printout maybe a day later, a PDP let you type directly into the computer and test the results immediately.
A tedious task… In the 1950s, before the advent of PDP machines, mainframe computers took up entire rooms and used punch cards to input computer programs. Photo: Pictorial Parade/Getty Images
These factors led to an explosion of experimentation. Most modern programming languages, including C, were developed on DEC machines. The PDP-10 was the heart of the MIT AI Lab, one of the birthplaces of artificial intelligence research. “The PDP-10 computer dominated the Arpanet, the precursor to the Internet,” says Lars Brinkhoff. “Internet protocols were prototyped on the PDP-10, PDP-11, and other computers. The GNU Project was inspired by the free sharing of software and information on the PDP-10. Stephen Hawking’s artificial voice grew out of the DECtalk device, which grew out of Dennis Klatt’s speech synthesis research begun on the PDP-9.”
The PDP made its way into university labs around the world, where it was embraced by a new generation of engineers, scientists, and programmers — the original computer hackers. Steve Wozniak got his start programming on a PDP-8, a small, inexpensive machine that sold by the thousands to hobbyists. Its operating system, OS/8, was a precursor to MS-DOS. Bill Gates and Paul Allen were teenage students who would sneak into the University of Washington to program the PDP-10, and it was on a PDP-1 that MIT student Steve Russell and a group of friends designed the shoot-’em-up Spacewar!, one of the first video games to run on a computer.
Pioneers… Steve Russell at the Computer History Museum in California, 2011, standing in front of the DEC PDP-1 on which he developed his pioneering computer game in the early 1960s. Photo: MediaNews Group/The Mercury News/Getty Images
This legendary game wasn’t the only one. There were many others at the time, because making games was a fun way to explore possibilities. “There were Dazzle Dart, a four-player laser tennis game, and Lunar Lander,” Vermeulen says. “Maze War was the first networked video game. People connected two IMLAC minicomputer/graphics terminals to the Arpanet via a PDP-10 mainframe, and used that million-dollar pile of hardware to chase each other through a maze or shoot each other.” And the original text adventures like Colossal Cave and Zork, as well as the first multiplayer online games like MUDs and Star Trek, were also written on PDP computers.
These machines are an essential part of our digital culture, the furnace in which the modern gaming and tech industries were forged. But to be understood, they have to be used.
“The problem with computer history is that old computers sitting unused in a museum communicate very little,” says Vermeulen. “You need to experience these machines and how they worked. And the problem with computers before about 1975 is that they were huge, heavy and nearly impossible to keep running. Microsoft co-founder Paul Allen loved the PDP-10 dearly, and he had the funds to hire a team of skilled technicians to repair one and keep it running. But it was very expensive and, sadly, his family decided to discontinue this after he passed away.”
The answer is emulation. The PDP replica has all the look of the original terminal, including the lights and switches, but the calculations are done by a Raspberry Pi microcomputer connected to the back via a serial port. To get it running at home, just plug in the Raspberry Pi, connect a keyboard and monitor, boot it up and download the software. Then flip the switch on the front of the PDP-10, reboot the Raspberry Pi, and you’ll be in PDP mode, with a window on your monitor emulating the old Knight TV terminal display. A command line interface (remember those?) gives you access to a range of the original programs, including games.
This is what I’ve been waiting for. We all know the important role Spacewar! played in the birth of the modern games industry, but actually playing it, controlling a spaceship battling amid vector explosions against a flickering starry sky… it feels like living history.
In the years since Vermeulen began developing his personal PDP-8 replica, the Obsolescence Guaranteed group has sold hundreds of replicas and continues to develop more, including a replica of MIT’s experimental Project Whirlwind computer from the 1950s (which ran a simple version of tic-tac-toe). Today, a company in Panama called Chiriqui Electronic Design Studio manufactures the hardware. What started as a personal project has become something much bigger. “We had an ‘official’ launch of our PiDP-10 replica at MIT, where the original machine was kept,” Vermeulen says. “The demo session was attended by about 50 hackers from the 1970s. It was fun to see people playing the multi-user Maze War game 50 years later.”
Another reason the PiDP-10 is worth it is because it’s fun. I never imagined seeing something like this up close, much less plugging it into a monitor at home and playing with it. It was an exciting, nostalgic, and weirdly emotional experience. Navigating the ITS disk system, the glowing green dot-matrix font, the appealing list of programs and games, the “happy hacking!” message above the terminal command line – it’s very evocative.
Impressive…PiDP-10 screen. Photo: Keith Stewart/The Guardian
Meanwhile, programmers who bought PiDP machines are creating new programs and games. They range in age from 80-year-old PDP veterans to 20-year-olds who want to relive a bygone era of programming, when memory and processing power were scarce and code had to be elegant and super-efficient; there was no room for bloat. “Quite a few universities are using the PiDP-11 and -8 in their classes,” Vermeulen says. “Partly to show computer science students our origins, but also because the super-low-level programming still required for microcontrollers and hardware drivers is the type of coding you learn very well on these dinosaurs.”
Brinkhoff agrees that, beyond the nostalgia, these machines still have something to teach us, and that they remain functional. “I enjoy writing new software for the 10, like a program to display fractals or generate QR codes,” he says.
“I hope it becomes more widely accepted, because if you don’t do anything with PiDP, it just sits on a shelf and the lights flash. It looks pretty, but I don’t think the computer can be truly happy unless you program it.”
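To give a sense of the kind of compact program Brinkhoff describes, here is a fractal display reduced to its essentials. It is modern Python rather than PDP-10 code and purely illustrative, but the escape-time algorithm it uses is the same sort of thing a machine with a few kilobytes to spare could run.

```python
# Illustrative only: an escape-time Mandelbrot renderer in ASCII,
# the kind of small, resource-light fractal display program that
# suits a machine with very little memory. This is modern Python,
# not software for the PDP-10 or the PiDP-10 replica.

WIDTH, HEIGHT, MAX_ITER = 64, 24, 30
PALETTE = " .:-=+*#%@"   # later characters = slower escape

for row in range(HEIGHT):
    line = ""
    for col in range(WIDTH):
        # Map the character cell onto a window of the complex plane.
        c = complex(-2.0 + 3.0 * col / WIDTH, -1.2 + 2.4 * row / HEIGHT)
        z = 0j
        escaped_at = MAX_ITER
        for i in range(MAX_ITER):
            z = z * z + c
            if abs(z) > 2.0:   # point escapes: not in the Mandelbrot set
                escaped_at = i
                break
        if escaped_at == MAX_ITER:
            line += "@"        # treated as inside the set
        else:
            line += PALETTE[min(escaped_at, len(PALETTE) - 1)]
    print(line)
```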
The Australian government has announced it will invest nearly A$1 billion in developing quantum computers, staking its claim in a race currently dominated by the United States and China.
Headquartered in the US, PsiQuantum was co-founded by a team including two Australian researchers. It will receive A$470 million from each of the Australian federal and Queensland governments, for a total of A$940 million ($600 million). In return, the company will build and operate a next-generation quantum computer in Brisbane, Australia.
Stephen Bartlett, a researcher at the University of Sydney, said the announcement amounted to Australia asserting sovereign capabilities in quantum computing and building a quantum technology ecosystem.
“What I’m really excited about is that the size of the investment means we’re serious,” Bartlett says. Big technology companies such as IBM, Google and Microsoft are investing billions of dollars in quantum computing, but the Australian funding makes PsiQuantum one of the world’s largest dedicated quantum computing companies.
Quantum computers offer the possibility of completing some tasks much faster than regular computers. So far, such capabilities have only been demonstrated on problems with no practical use, but as research teams in the U.S., China and elsewhere race to build larger and less error-prone machines, the hope is that they will begin to prove useful.
Many teams have built quantum computers based on superconductors, but PsiQuantum’s approach involves particles of light called photons, which were thought to be difficult to scale up. However, ahead of the Australian announcement, PsiQuantum published a paper detailing how standard semiconductor manufacturing equipment, of the type used to make regular computer chips, could be used to build the photonic chips needed for its quantum machines.
Australia has exported generations of quantum researchers, including PsiQuantum’s co-founders Jeremy O'Brien and Terry Rudolph. Bartlett said the government investment could allow these scientists to return to Australia and continue building their careers at home. “Australia is saying we have a seat at the big table when it comes to quantum computing.”
ceτi AI, a leader in decentralized artificial intelligence infrastructure, is pleased to announce the acquisition of Big Energy Investments Inc., a Canadian company specializing in strategic investments in high-performance computing infrastructure. This acquisition is an important step in ceτi AI's strategy to advance the development and accessibility of AI technology.
Strategic acquisitions and enhancements
Following the acquisition, Big Energy Investments has reached agreements in principle to acquire an advanced high-performance computing (HPC) infrastructure that includes five HPC servers equipped with eight NVIDIA H100 Tensor Core GPUs and two NVIDIA Quantum-2 InfiniBand switches. These agreements are expected to be signed within the next week and underline our commitment to rapidly increasing our technological capabilities.
This strategic enhancement is critical to the initial deployment of the ceτi AI Infrastructure Network in North America, leveraging the ceτi AI Intelligent Computing Fabric to manage and provide computing resources to decentralized AI networks, decentralized physical infrastructure networks (DePIN), and a variety of other applications.
Strategic development and pilot implementation
The new HPC infrastructure will support the first North American deployment of the ceτi AI Intelligent Computing Fabric, which manages the ceτi AI Infrastructure Network. The network is designed to provide essential computing resources to a variety of decentralized client networks and is a key component of ceτi AI's broader mission to democratize AI technology through decentralization. The pilot implementation will not only demonstrate the capabilities of the ceτi AI solution, but will also begin revenue generation and accumulation for the CETI token ecosystem.
Roadmap and future plans
Successful integration and demonstration of this infrastructure will set the stage for immediate expansion to data center-scale implementations, significantly scaling up ceτi AI's operational capabilities. The development of the CETI token ecosystem continues and its introduction is the next major milestone in the ceτi AI roadmap.
Executive insights
“This acquisition is an important milestone in ceτi AI’s growth trajectory and is consistent with our strategic objectives to strengthen our infrastructure and accelerate the development of decentralized AI technology. By combining Big Energy Investments’ resources and capabilities with our own, we will be able to innovate and expand our reach across North America,” said Dennis Jarvis, CEO of ceτi AI.
Forward-looking statements
This press release contains forward-looking statements regarding expected future events and anticipated results that are subject to significant risks and uncertainties. These include, but are not limited to, the final procurement and integration of the HPC infrastructure, the deployment and performance of the ceτi AI Intelligent Computing Fabric, and the broader adoption and impact of the CETI token ecosystem. Actual results may differ materially from those expressed or anticipated in such forward-looking statements due to a variety of factors.
About ceτi AI
ceτi AI is at the forefront of revolutionizing artificial intelligence through decentralization. Committed to innovation and accessibility, ceτi AI is developing a globally distributed, high-performance, scalable AI infrastructure designed to empower developers and networks around the world. ceτi AI aims to accelerate the advancement of AI technology by democratizing access to cutting-edge resources, making it more diverse and accessible to everyone. Our mission is not limited to infrastructure development: we are building the foundation for the future of AI, allowing it to grow in ways that benefit all of humanity without sacrificing freedom of choice and expression.
Users can learn more about our mission, technology, and the future we're building, along with the latest updates and community discussions, by visiting:
At a press conference in Dubai, Aethir Edge debuted as a pioneering edge computing device and the first licensed mining machine from Aethir, one of the industry's leading distributed cloud computing infrastructure providers, in partnership with Qualcomm. The device allows users to mine from the 23% of Aethir's native token $ATH supply reserved for this purpose. Integrated with a decentralized cloud network to overcome the barriers of centralization, Aethir Edge combines unparalleled edge computing capabilities, decentralized access, and exclusive benefits.
The future of distributed edge computing is here. Aethir debuted Aethir Edge at TOKEN2049 in Dubai, supported by Qualcomm technology, at an official press conference. Aethir Edge spearheads the evolution toward decentralized edge computing as the first sanctioned mining device integrated with decentralized cloud infrastructure, delivering elite GPU performance, access to the 23% of Aethir's native token $ATH supply reserved for mining, and equitable access, all on one device.
Enter the multi-trillion-dollar computing market
The edge computing sector is rapidly evolving into a multi-trillion-dollar industry, but for too long edge capacity has been siloed in centralized data centers. Aethir Edge breaks through these barriers with an architecture that interconnects high-performance edge AI devices into a distributed cloud network. By pooling localized resources, Aethir Edge brings elite computing power home and makes it accessible to everyone.
Computing power holds immense potential as an energy source for the digital realm. Aethir Edge, with support from Aethir and Qualcomm, leverages this power and takes it to the next level. Its vision is to fundamentally transform how users access, contribute to, and own computing power, building a future that transcends the constraints of centralized networks and unleashes the full potential of edge AI technologies. Aethir Edge represents the beginning of this user-driven decentralized evolution.
The first and only certified mining device by Aethir
Aethir Edge, Aethir's only whitelisted mining product, allows users around the world to take advantage of exclusive benefits and earn income by sharing their spare bandwidth, IP addresses, and computing power. With its authorized status, Aethir Edge reserves up to 23% of the total supply of the native token $ATH for mining rewards.
“We are excited to support this innovative convergence of decentralized cloud, edge infrastructure, and fair incentives,” said Mark Rydon, co-founder of Aethir. “Aethir Edge is pioneering community-powered edge computing technology through rugged hardware, proprietary mining, and Aethir’s decentralized cloud network.”
When unparalleled edge computing power meets open accessibility
Powered by the Qualcomm® Snapdragon™ 865 chip, Aethir Edge delivers superior performance for data-intensive workloads. 12GB of LPDDR5 memory and 256GB of UFS 3.1 storage ensure ample resources for smooth parallel processing. Its distributed architecture ensures reliability and uptime by spreading capacity across peer nodes, overcoming the vulnerabilities of centralized networks.
“I am very pleased to congratulate the Aethir team on the launch of their next-generation products targeted at distributed edge computing use cases and, more importantly, powered by Qualcomm Technologies processors,” said Qualcomm's vice president and head of enterprise development and industrial automation. “We are very proud to work with partners like Aethir to advance our edge capabilities.”
Aethir Edge seamlessly interoperates with a variety of applications and delivers ultra-low latency through localized processing. Users around the world can access optimized experiences regardless of their location.
The backbone of innovation in the decentralized cloud ecosystem
As a core component of Aethir's decentralized cloud, Aethir Edge powers innovative new products such as the APhone, the first decentralized cloud smartphone. Localized edge capabilities enable implementation and operation across gaming, AI, VR/AR, real-time streaming, and many other applications.
“Aethir Edge perfectly complements APhone's mission to make Web3 available to everyone. APhone brings high-performance gaming, AI, graphics rendering, and more to every smartphone user around the world through a virtual OS. ” – William Peckham, APhone Chief Business Officer.
Democratize access to the future of edge computing
Aethir Edge spearheads a decentralized infrastructure that is owned and managed by users rather than a centralized organization. This makes high-performance computing available as an elegant, easy-to-use product with built-in earning potential. Featuring enterprise-grade hardware and distributed cloud infrastructure, Aethir Edge leads the transition from centralized data monopolies to the open edge environment of the future.
Aethir is actively building partnerships around the world with crypto mining companies, hardware vendors, and distributors. Interested teams can fill out the Aethir Edge sales agent application form to explore win-win opportunities to distribute the product together and shape tomorrow's landscape through community power.
Users can visit www.myedge.io to be among the first to unlock distributed edge computing power.
About Aethir Edge
Aethir Edge is an enterprise-grade edge computing device integrated with Aethir's distributed GPU cloud infrastructure, ushering in a new era of edge computing. As Aethir's first and only licensed mining device, it combines powerful computing, exclusive revenue, and decentralized access in one device, unlocking the true potential of DePIN.
Aethir is a cloud computing infrastructure platform that revolutionizes the ownership, distribution, and usage paradigm of enterprise-grade graphics processing units (GPUs). By moving away from traditional centralized models, Aethir has deployed a scalable and competitive framework for sharing distributed computing resources to serve enterprise applications and customers across various industries and geographies.
Aethir is revolutionizing DePIN with its highly distributed, enterprise-grade, GPU-based computing infrastructure customized for AI and gaming. Having raised over $130 million in funding for its ecosystem, backed by major Web3 investors including Framework Ventures, Merit Circle, Hashkey, Animoca Brands, Sanctor Capital, and Infinity Ventures Crypto (IVC), Aethir is paving the way for the future of Web3 distributed computing.
In the demanding technical field of semiconductor manufacturing, one processor the size of a hardcover book stands out: Nvidia’s H-100. On Friday, the Santa Clara, California, company was valued at more than $2 trillion, a rise built in large part on a chip named after U.S. Navy Rear Admiral Grace “Amazing Grace” Hopper, who was instrumental in the design and implementation of early programming languages.
Nvidia supplies about 80% of the global market for chips used in AI applications. The company’s H-100 chips (the “H” is for Hopper) are now so valuable that they are transported in armored vehicles, and demand is so great that some customers have to wait six months to receive them.
Hopper’s importance to Nvidia, and to AI computing more generally, was underlined last summer when Nvidia founder and CEO Jensen Huang announced the company’s next-generation accelerated computing and generative AI chip and named it the GH200 Grace Hopper Superchip.
Admiral Grace Hopper in 1985. Photo: Associated Press
Hopper was born in New York City in 1906, graduated from Vassar College in 1928 with degrees in mathematics and physics, and joined the Navy after the United States entered World War II following the attack on Pearl Harbor.
According to a biography from Yale University, she was initially rejected by the Navy because of her age and small stature, but was eventually commissioned and assigned to the Bureau of Ships Computation Project at Harvard University, where she worked on the Mark I, America’s first electromechanical computer, calculating rocket trajectories, creating range tables for anti-aircraft guns, and calibrating minesweepers.
After the war, Hopper joined the Eckert-Mauchly Computer Corporation (later part of Sperry Rand), where she pioneered the idea of automatic programming. In 1952, she developed the first compiler, a program that translated written instructions into computer code.
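As a toy illustration of what “translating written instructions into computer code” means (this is a hypothetical miniature, not a reconstruction of Hopper’s A-0 compiler or FLOW-MATIC), a few lines can turn an English-like statement into pseudo machine instructions:

```python
# A toy illustration of compilation: turning an English-like statement
# into pseudo machine instructions. This is a hypothetical miniature,
# not Hopper's A-0 compiler or FLOW-MATIC.

VERB_TO_OPCODE = {"add": "ADD", "subtract": "SUB", "multiply": "MUL"}

def compile_statement(statement: str) -> list[str]:
    """Compile e.g. 'add price to total' into three pseudo instructions."""
    words = statement.lower().split()
    opcode = VERB_TO_OPCODE[words[0]]          # verb determines the operation
    operand, destination = words[1], words[3]  # pattern: 'VERB operand to destination'
    return [f"LOAD {destination}", f"{opcode} {operand}", f"STORE {destination}"]

print(compile_statement("add price to total"))
# ['LOAD total', 'ADD price', 'STORE total']
```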
“What I was looking for when I brought in English [as a programming language] was to bring in another whole group who could easily use computers. I kept asking for a more user-friendly language. Most of what we have learned from academics and computer science people has never been adapted to humans,” Hopper explained in a 1980 interview.
Hopper retired as a rear admiral at age 79, making her the oldest active duty officer in the U.S. military. The year before her death in 1992, she was awarded the National Medal of Technology by President George H.W. Bush. She was posthumously awarded the Presidential Medal of Freedom, the highest civilian honor, in 2016.
In a 1983 interview on “60 Minutes”, Hopper was asked if the computer revolution was over. Hopper replied: “No, we’re just getting started. We’ve only got the Model T.”