Google Celebrates Breakthrough: Quantum Computer Exceeds Supercomputer Performance

Google has announced a significant breakthrough in quantum computing, having developed an algorithm capable of performing tasks that traditional computers cannot.

This algorithm, which serves as a set of instructions guiding the operations of a quantum computer, can determine molecular structures, laying the groundwork for potential breakthroughs in areas such as medicine and materials science.

However, Google recognizes that the practical application of quantum computers is still several years away.

“This marks the first occasion in history when a quantum computer has successfully performed a verifiable algorithm that surpasses the power of a supercomputer,” Google stated in a blog post. “This repeatable, beyond-classical computation establishes the foundation for scalable verification and moves quantum computers closer to practical utilization.”

Michel Devoret, Google’s chief scientist for quantum AI, who recently received the Nobel Prize in Physics, said the announcement represents another milestone in quantum development. “This is a further advancement towards full-scale quantum computing,” he noted.

The algorithmic advance, which allowed the quantum computer to run the task 13,000 times faster than the best classical algorithm on a supercomputer, is documented in a peer-reviewed article published in the journal Nature.

One expert cautioned that while Google’s accomplishment is impressive, it revolves around a specific scientific challenge and may not translate into significant real-world benefits. Results for two molecules were validated using nuclear magnetic resonance (NMR), a technique akin to MRI, yielding insights of a kind not normally obtainable from NMR alone.

Winfried Hensinger, a professor of quantum technology at the University of Sussex, said Google has achieved “quantum advantage”, the term used when researchers make a quantum computer carry out a task unattainable by classical systems.

Nevertheless, fully fault-tolerant quantum computers—which could undertake some of the most exciting tasks in science—are still far from realization, as they would necessitate machines capable of hosting hundreds of thousands of qubits (the basic unit of information in quantum computing).

“It’s crucial to recognize that the task achieved by Google isn’t as groundbreaking as some world-changing applications anticipated from quantum computing,” Hensinger added. “However, it represents another compelling piece of evidence that quantum computers are steadily gaining power.”

A truly capable quantum computer able to address a variety of challenges would require millions of qubits, but current quantum hardware struggles to manage the inherent instability of qubits.

“Many of the most intriguing quantum computers being discussed necessitate millions or even billions of qubits,” Hensinger explained. “Achieving this is even more challenging with the type of hardware utilized by the authors of the Google paper, which demands cooling to extremely low temperatures.”

Hartmut Neven, Google’s vice president of engineering, said quantum computers may be five years away from practical application, despite advances enabled by an algorithm called Quantum Echoes.


“We remain hopeful that within five years, Quantum Echoes will enable real-world applications that are solely feasible with quantum computers,” he said.

As a leading AI company, Google also asserts that quantum computers can generate unique data capable of enhancing AI models, thereby increasing their effectiveness.

Traditional computers represent information in bits (denoted by 0 or 1) and send them as electrical signals. Text messages, emails, and even Netflix movies streamed on smartphones consist of these bits.

By contrast, information in a quantum computer is represented by qubits. Housed within compact chips, these can be particles such as electrons or photons that exist in multiple states simultaneously, a concept known in quantum physics as superposition.

This characteristic enables qubits to concurrently encode various combinations of 1s and 0s, allowing computation of vast numbers of different outcomes, an impossibility for classical computers. Nonetheless, maintaining this state requires a strictly controlled environment, free from electromagnetic interference, as disturbances can easily disrupt qubits.
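To make superposition concrete, here is a minimal Python sketch (ours, not from the article) of the linear algebra behind a qubit: a state is a vector of complex amplitudes, and the number of amplitudes doubles with every qubit added, which is why classical machines cannot keep up.

```python
import numpy as np

# A qubit's state is a normalised vector of two complex amplitudes:
# |0> = [1, 0] and |1> = [0, 1]; superpositions mix the two.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
qubit = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(qubit) ** 2)  # [0.5 0.5] -- equal chance of reading 0 or 1

# n qubits require 2**n amplitudes, so simulation cost doubles per qubit.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```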

Progress by companies like Google has led to calls for governments and industries to implement quantum-proof cryptography, as cybersecurity experts caution that these advancements have the potential to undermine sophisticated encryption.

Source: www.theguardian.com

IBM Plans to Develop a Functional Quantum Supercomputer by 2029

Rendering of IBM’s proposed quantum supercomputer

IBM

In less than five years, researchers will have access to an error-corrected quantum supercomputer, according to IBM. The company has unveiled a roadmap for a machine named Starling, set to be available to academic and industrial researchers by 2029.

“These are scientific dreams that have been transformed into engineering achievements,” says Jay Gambetta at IBM. He mentions that he and his team have developed all the required components to make Starling a reality, giving them confidence in their ambitious timeline. The new systems will be based in a New York data center and are expected to aid in manufacturing novel chemicals and materials.

IBM has already constructed a fleet of quantum computers, yet the path to truly useful devices remains challenging, and the field is fiercely competitive. Errors continue to thwart many efforts to use quantum effects to solve problems that typical supercomputers struggle with.

This underscores the need for a fault-tolerant quantum computer that can autonomously correct its own mistakes; such error correction is what makes larger, more powerful devices possible. There is no universal agreement on the best strategy for achieving this, prompting research teams to explore various approaches.

All quantum computers depend on qubits, but different groups build these essential units from particles of light, extremely cold atoms or, in Starling’s case, superconducting circuits. IBM is banking on two innovations to make its machine robust against errors.

First, Starling establishes new connections among its qubits, including those that are quite distant from one another. Each qubit is embedded within a chip, and researchers have innovated new hardware to link these components within a single chip and connect multiple chips together. This advancement enables Starling to be larger than its forerunners while allowing it to execute more complex programs.

According to Gambetta, Starling will employ tens of thousands of physical qubits, grouped into roughly 200 logical qubits, and will be capable of running 100 million quantum operations. Within each logical qubit, many physical qubits function together as a single computational unit that is resilient to errors. Today’s largest quantum computers house around 1,000 physical qubits, and the current record for logical qubits, held by the quantum computing company Quantinuum, stands at 50.
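The idea of a logical qubit can be illustrated with the simplest classical analogue, a three-bit repetition code decoded by majority vote. The sketch below is our toy illustration only; real quantum error correction, including IBM’s, is far more involved, since quantum states cannot simply be copied and phase errors must also be handled.

```python
import random

def encode(bit):
    # One logical bit is stored redundantly in three physical bits.
    return [bit, bit, bit]

def noisy(bits, p=0.05):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit unless 2+ bits flipped.
    return int(sum(bits) >= 2)

trials = 100_000
errors = sum(decode(noisy(encode(0))) != 0 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f} vs 0.05 per physical bit")
# Expected failure rate ~3p^2 = 0.0075: redundancy suppresses errors.
```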

IBM is implementing a novel method for merging physical qubits into logical qubits based on low-density parity-check (LDPC) codes. This marks a significant shift from the methods employed in other superconducting quantum computers. Gambetta notes that using LDPC codes was once seen as a “pipe dream”, but his team has now worked out crucial details to make it feasible.

The benefit of this somewhat unconventional technique is that each logical qubit built with an LDPC approach requires fewer physical qubits than with competing strategies. Consequently, devices can be smaller, and faster error correction becomes achievable.
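To see roughly why, here is a back-of-the-envelope comparison (our arithmetic, with hedges: the qLDPC figures use the [[144,12,12]] “bivariate bicycle” code parameters IBM has published, and the surface-code figure is a common rule of thumb, not an exact requirement).

```python
# Physical qubits needed per logical qubit, order-of-magnitude only.
d = 12                               # code distance in both cases

# Surface code: roughly 2 * d^2 physical qubits (data + checks) per logical qubit.
surface_per_logical = 2 * d ** 2     # ~288

# IBM's published [[144,12,12]] qLDPC code: 144 data + 144 check qubits
# shared across 12 logical qubits.
ldpc_per_logical = (144 + 144) / 12  # 24

print(f"surface code: ~{surface_per_logical} physical per logical")
print(f"qLDPC code:   ~{ldpc_per_logical:.0f} physical per logical")
print(f"roughly {surface_per_logical / ldpc_per_logical:.0f}x fewer physical qubits")
```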

“IBM has consistently set ambitious goals and accomplished significant milestones over the years,” states Stephen Bartlett from the University of Sydney. “They have achieved notable innovations and improvements in the last five years, and this represents a genuine breakthrough.” He points out that both the long-distance qubit connections and the new hardware linking the logical-qubit codes deviate from the well-performing devices IBM previously developed, necessitating extensive testing. “It looks promising, but it also requires a leap of faith,” Bartlett adds.

Matthew Otten from the University of Wisconsin-Madison says that LDPC codes have only been seriously explored in recent years, and that IBM’s roadmap clarifies how a machine built on them would function. This matters, he says, because it helps researchers pinpoint potential bottlenecks and trade-offs; for example, Starling may run more slowly than current superconducting quantum computers.

At its intended scale, the device could address challenges relevant to sectors such as pharmaceuticals. Here, simulations of small molecules or proteins on quantum computers like Starling could replace costly and cumbersome experimental steps in drug development, Otten explains.

IBM isn’t the only contender in the quantum computing sector planning significant advances. Quantinuum and PsiQuantum, for instance, have also announced their intentions to build fault-tolerant, utility-scale machines by 2029 and 2027, respectively.


Source: www.newscientist.com

AI Predicts Weather Instantly Without a Supercomputer

Thunderstorms in Indonesia seen from the International Space Station

NASA EARTH OBSERVATORY / INTERNATIONAL SPACE STATION (ISS)

AI weather programs that run for about a second on a desktop computer can match the accuracy of traditional forecasts that take hours or days on a powerful supercomputer, their creators claim.

Weather forecasts rely on physics-based models that extrapolate from observations made using satellites, balloons and weather stations, an approach refined since the 1950s. However, these calculations, known as numerical weather prediction (NWP), are computationally intensive and rely on vast, expensive, energy-hungry supercomputers.

In recent years, researchers have tried to streamline this process by applying AI. Last year, Google scientists created an AI tool that could replace a small chunk of complex code in each cell of a weather model, dramatically reducing the computing power required. Google DeepMind later went further, using AI to replace the entire prediction step, an approach since adopted by the European Centre for Medium-Range Weather Forecasts (ECMWF), whose tool, the Artificial Intelligence Forecasting System (AIFS), was launched last month.

However, this gradual expansion of AI’s role in weather forecasting has still not replaced traditional number-crunching entirely, something the new model created by Richard Turner at the University of Cambridge and his colleagues aims to change.

Turner says that previous work was limited to the prediction step and skipped a stage called initialization, in which data from satellites, balloons and weather stations around the world is collated, cleaned and interpolated onto an organized grid from which predictions can begin. “It’s actually half the computational resource,” Turner says.

The researchers created a model called Aardvark Weather, which replaces both the initialization and prediction stages for the first time. It uses only 10% of the input data that existing systems require, yet achieves results comparable to the latest NWP forecasts, Turner and his colleagues report in a study assessing the method.

A forecast that would take hours or days on a powerful NWP supercomputer can be generated in about a second on a single desktop computer using Aardvark.
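The structural difference is easiest to see as two pipelines. The sketch below is our schematic with illustrative stand-in functions, not code from the Aardvark paper.

```python
def initialize(raw_observations):
    # Collate, clean and interpolate raw observations onto a regular
    # grid; in traditional NWP this is roughly half the compute cost.
    return sorted(raw_observations)                 # stand-in

def physics_model(gridded_state):
    # Step a physics-based model forward in time.
    return [x + 1.0 for x in gridded_state]         # stand-in

def traditional_forecast(raw_observations):
    # Two expensive stages, normally run on a supercomputer.
    return physics_model(initialize(raw_observations))

def aardvark_style_forecast(raw_observations):
    # One learned model maps raw observations straight to a forecast,
    # replacing both initialization and prediction in a single pass.
    return [x + 1.0 for x in sorted(raw_observations)]  # stand-in

print(traditional_forecast([3.0, 1.0, 2.0]))
print(aardvark_style_forecast([3.0, 1.0, 2.0]))
```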

However, Aardvark models Earth’s surface with a grid of square cells 1.5 degrees on a side, while ECMWF’s ERA5 model uses a much finer grid with 0.3-degree cells. This means Aardvark’s model is too coarse to pick up complex and unexpected weather patterns, says David Schultz at the University of Manchester, UK.

“There are a lot of unresolved processes that could blow up a forecast,” Schultz says. “They don’t represent any extremes at all. They can’t resolve them at this scale.”
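The resolution gap is easy to quantify from the numbers above; a quick calculation (ours):

```python
# Cells needed to tile the globe (360 x 180 degrees) with square cells.
for name, deg in [("Aardvark", 1.5), ("ERA5", 0.3)]:
    cells = round(360 / deg) * round(180 / deg)
    print(f"{name}: {deg}-degree cells -> {cells:,} cells")

# Aardvark: 240 x 120  =  28,800 cells
# ERA5:    1200 x 600  = 720,000 cells, i.e. 25x as many, so features
# such as individual thunderstorms fall through Aardvark's coarser grid.
```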

Turner argues that Aardvark can actually beat some existing models. However, he acknowledges that AI models like his still depend entirely on these physics-based models for training data. “It absolutely doesn’t work to just ditch the training data and train on observational data alone,” he says. “We tried to do that, going completely physics-model-free, but it didn’t work.”

He believes the future of weather forecasting could see scientists working on ever more accurate physics-based models, which are then used to train AI models that replicate their output faster and with less hardware. Others are even more optimistic about AI’s prospects.

Nikita Gourianov at the University of Oxford believes AI will eventually produce weather forecasts that genuinely exceed NWP: models trained solely on observational and historical weather data, producing accurate predictions completely independent of NWP, he says. “It’s a matter of scale, but also a matter of smartness. You have to be smart about how you deliver data and how you build the structure of a neural network.”


Source: www.newscientist.com

New Supercomputer Built to Simulate Nuclear Bombs is the Fastest in the World

El Capitan supercomputer at Lawrence Livermore National Laboratory

Garry McLeod/Lawrence Livermore National Laboratory

The top spot in the league table of the world's most powerful computers has changed hands, with one supercomputer built for US national security research overtaking another.

The Top500 list of the world’s most powerful computers is based on one metric: how fast a machine can solve a large system of equations, measured in floating point operations per second (FLOPS). A machine called Frontier, built in 2022, was the first to be publicly acknowledged to have reached exascale (a billion billion, or 10^18, FLOPS).

Frontier was built at Oak Ridge National Laboratory in Tennessee not to perform nuclear weapons simulations but to address a variety of complex scientific problems, such as climate modeling, fusion simulations and drug discovery.

Now, Lawrence Livermore National Laboratory (LLNL) in California has developed El Capitan, which has a measured performance of 1.742 exaFLOPS, more than any other supercomputer.
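To put that figure in perspective, a rough calculation (ours; the laptop number is an assumed ballpark, not a benchmark):

```python
el_capitan_flops = 1.742e18   # El Capitan's measured Top500 result
laptop_flops = 100e9          # assumption: a fast laptop sustains ~100 GFLOPS

# How long would one hour of El Capitan's work take on that laptop?
ops = el_capitan_flops * 3600
years = ops / laptop_flops / 3.15e7   # ~3.15e7 seconds per year
print(f"about {years:,.0f} years")    # roughly 2,000 years
```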

The machine was built under tight security in cooperation with the National Nuclear Security Administration, a division of the Department of Energy dedicated to developing nuclear weapons science. The agency was established in 2000 in response to revelations that nuclear secrets had been leaked from the Department of Energy to China.

Essentially, El Capitan will provide the vast computational power needed to ensure the effectiveness of the US nuclear deterrent without conducting physical nuclear tests. LLNL says that complex, high-resolution 3D simulations of nuclear explosions that previously took months on Sierra, its previous flagship system, can be completed in just hours or days on El Capitan.


Source: www.newscientist.com

The Age-Defying Power of the New Supercomputer for Your Brain

The human brain is likely the most advanced computer in the world. While it operates differently than a traditional computer and has a much softer structure, its computing power is unparalleled.

Neuromorphic computing, which models machines on the human brain and nervous system, has been a growing field since the 1980s. Many attempts have been made to realise it, with the DeepSouth project at the International Centre for Neuromorphic Systems at Western Sydney University aiming to be the most advanced yet, with the potential to perform 228 trillion synaptic operations per second.

How does a brain computer work?

DeepSouth uses an approach to computing that is inspired by the human brain and nervous system, aiming to combine processing and memory just as the brain does. By distributing processing across billions of tiny units (neurons) that interact through trillions of connections (synapses), the brain achieves enormous computational power while consuming very little energy.
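The neuron-and-synapse style of computing can be sketched with the simplest spiking model, a leaky integrate-and-fire neuron. This toy example is ours (the parameters are illustrative, not DeepSouth’s); it shows the event-driven character neuromorphic hardware exploits, where energy is spent only when a neuron actually fires.

```python
def simulate(inputs, leak=0.9, threshold=1.0):
    v, spikes = 0.0, []
    for t, current in enumerate(inputs):
        v = v * leak + current   # membrane potential leaks, then integrates input
        if v >= threshold:       # fire a spike and reset; energy is spent
            spikes.append(t)     # only on these sparse events
            v = 0.0
    return spikes

# Weak, steady input: the neuron integrates for a few steps, then fires.
print(simulate([0.3] * 20))   # spikes at t = 3, 7, 11, 15, 19
```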

What does this mean for the future of computers?

This approach could lead to significant improvements in energy efficiency and battery life for devices such as smartphones. It could also enable the development of smaller and more powerful computers, bringing high-powered computing to a variety of applications and industries.

How DeepSouth can help fight aging

While the primary goal of DeepSouth is to improve computing technology, the neuromorphic approach also offers insights into the workings of the human brain. This could lead to a better understanding of diseases such as Alzheimer’s, dementia, and Parkinson’s and potentially aid in developing treatments for these conditions.

Source: www.sciencefocus.com