How Quantum Computers Could Enhance Exoplanet Imaging for Clearer Views

Artist’s Impression of an Exoplanet

Credit: ESA/Hubble (M. Kornmesser)

Innovative quantum computers may enhance our ability to detect exoplanets and analyze their characteristics in unprecedented detail.

Astronomers have identified thousands of planets beyond our solar system, and they believe billions more remain to be discovered. Finding them matters for the search for extraterrestrial life, but their distance from Earth makes direct observation difficult.

Johannes Borregaard and his team at Harvard University propose that quantum computing technology could dramatically streamline this endeavor.

Capturing images of exoplanets means detecting their faint light signals, which weaken as they traverse vast cosmic distances. These signals can also be drowned out by the light of nearby stars, adding to the challenge.

Borregaard says colleagues at NASA illustrated the difficulty of the task by likening it to locating a single photon amid a sea of light during telescope observations.

Traditional processing methods struggle with such weak signals. Quantum computers, however, can harness the quantum states of incoming photons, exploiting their unique properties to extract crucial data about exoplanets. This approach could turn what would typically be an indistinct image or a single blurred point into a clear picture of a distant world, revealing light-based signatures of molecules present on these exoplanets.

The central concept of the team’s proposal is that light from an exoplanet first interacts with a quantum device crafted from specially engineered diamond, a technology that has already proved capable of storing the quantum states of photons. These states would then be transmitted to a more advanced quantum computer designed to process them and generate images of the exoplanet. In their model, Borregaard and his colleagues envision this second device being built from ultracold atoms, which have shown significant promise in recent experiments.

Research indicates that employing quantum devices in this manner could produce images using only one-hundredth, or even one-thousandth, of the photons needed in conventional methods. Essentially, in scenarios of extremely weak light, quantum systems could surpass existing technology.

“Since photons adhere to the principles of quantum mechanics, it is intuitive to explore quantum approaches for detecting and processing light from exoplanets,” notes Cosmo Lupo at the Polytechnic University of Bari, Italy. However, he acknowledges that realizing the proposal poses significant challenges, requiring precise control over both quantum computers and effective coordination between them.

Borregaard concurs, pointing to promising experimental advances with both diamond-based devices and ultracold-atom quantum computers. He notes that connecting the two kinds of system is currently a focus for several research teams, including his own.

Lupo also points to another strategy leveraging the quantum properties of light: initiatives using quantum devices have already begun observing stars in the Canis Minor constellation. “I am eager to witness the influence of quantum computing on imaging and astronomy in the future,” he states. “This new research represents a pivotal step in that direction.”

Topics:

  • Exoplanets
  • Quantum Computing

Source: www.newscientist.com

Why Some Quantum Computers Demand More Power Than Traditional Supercomputers

El Capitan, the National Nuclear Security Administration's leading exascale computer

Credit: LLNL/Garry McLeod

The advancement of large quantum computers offers the potential to solve complex problems beyond the reach of today’s most powerful classical supercomputers. However, this leap in capability may come with increased energy demands.

Currently, most quantum computers are small, with fewer than 1,000 qubits. These fragile qubits are susceptible to errors, which keeps them from tackling significant problems, such as aiding drug discovery. Experts agree that practical utility will require a fault-tolerant quantum computer (FTQC), with a much higher qubit count and robust error correction. The engineering hurdles involved are substantial, compounded by multiple competing designs.

Olivier Ezratty, from the Quantum Energy Initiative (QEI), warns that the energy consumption of utility-scale FTQCs has been largely overlooked. During the Q2B Silicon Valley Conference in Santa Clara, California, on December 9, he presented his preliminary estimates. Notably, some FTQC designs could eclipse the energy requirements of the world’s top supercomputers.

For context, El Capitan, the fastest supercomputer in the world, located at Lawrence Livermore National Laboratory, draws approximately 20 megawatts of electricity, three times the draw of the nearby city of Livermore and its 88,000 residents. Ezratty forecasts that FTQC designs scaling up to 4,000 logical qubits may demand even more energy: some of the power-hungrier designs could require upwards of 200 megawatts.

Ezratty’s estimates derive from accessible data, proprietary insights from quantum tech firms, and theoretical models. He outlines a wide energy consumption range for future FTQCs, from 100 kilowatts to 200 megawatts. Interestingly, he believes that three forthcoming FTQC designs could ultimately operate below 1 megawatt, aligning with conventional supercomputers utilized in research labs. This variance could significantly steer industry trends, particularly as low-power models become more mainstream.

The discrepancies in projected energy use stem from the different strategies quantum computing companies employ to build and maintain their qubits. Certain qubit technologies require extensive cooling to function: light-based qubits depend on photon sources and detectors that must be kept cold, driving up energy consumption, while superconducting circuits require entire chips to be housed in large refrigeration systems. Designs based on trapped ions or ultracold atoms, meanwhile, demand substantial energy input from the lasers or microwaves that precisely control the qubits.

Oliver Dial at IBM, which builds superconducting quantum computers, anticipates that the company’s large-scale FTQC will need approximately 2 to 3 megawatts of power, a fraction of what a hyperscale AI data center can consume. This demand could be reduced through integration with existing supercomputers. Meanwhile, a team from QuEra, which specializes in ultracold-atom quantum computing, estimates its FTQC will require around 100 kilowatts, at the lower end of Ezratty’s range.

Other companies, including Xanadu, which focuses on light-based quantum technologies, and Google Quantum AI, which works on superconducting qubits, declined to comment. PsiQuantum, another developer of light-based qubits, did not respond to New Scientist’s repeated requests for comment.

Ezratty also pointed out that the classical electronics responsible for directing and monitoring qubits add their own energy costs, particularly in FTQC systems, where qubits need further instructions to correct errors. That makes it important to understand how error-correction algorithms contribute to a machine’s energy footprint. Runtime adds another layer: energy savings from using fewer qubits can be negated if longer operation times are needed.

To effectively measure and report the energy consumption of machines, the industry must establish robust standards and benchmarks. Ezratty emphasizes that this is an integral element of QEI’s mission, with projects actively progressing in both the United States and the European Union.

As the field of quantum computing continues to mature, Ezratty anticipates that his research will pave the way for insights into FTQC energy consumption. This understanding could be vital for optimizing designs to minimize energy use. “Countless technological options could facilitate reduced energy consumption,” he asserts.

Source: www.newscientist.com

Will 2026 Mark the Breakthrough of Quantum Computers in Chemistry?

Quantum Computers: Solutions for Chemistry Challenges

Marijan Murat/DPA/Alamy

One of the critical questions in the quantum computing sector is whether these advanced machines can solve practical problems in fields like chemistry. Researchers in industrial and medical chemistry are poised to provide insights by 2026.

The complexity of determining the structure, reactivity, and other properties of molecules is inherently a quantum problem, primarily involving electrons. As molecular structures grow increasingly complex, these calculations become challenging, sometimes even surpassing the capabilities of traditional supercomputers.

Quantum computers, being inherently quantum, have a potential advantage in tackling these complex chemical calculations. As these computers develop and become more seamlessly integrated with conventional systems, they are gaining traction in the chemistry sector.

For instance, in 2025, IBM and Japan’s RIKEN research institute collaborated, pairing quantum computers with supercomputers to model various molecules. Google researchers have also been devising algorithms that unveil molecular structures. Additionally, RIKEN researchers are teaming up with Quantinuum to create efficient workflows that allow quantum computers to calculate molecular energies with remarkable precision. Notably, the quantum computing software firm Qunova Computing introduced an algorithm that reportedly operates ten times more efficiently than traditional methods for energy calculations.

Progress is expected to accelerate in 2026 as quantum computers become more capable. “Future larger machines will allow us to create enhanced workflows, ultimately solving prevalent quantum chemistry problems,” states David Muñoz Ramo at Quantinuum. While his team currently focuses on hydrogen molecules, it foresees stepping up to more intricate structures, such as catalysts for industrial reactions.

Other research entities are making strides in similar areas. In December, Microsoft announced a partnership with Algorithmiq, a quantum software startup, aimed at accelerating the development of quantum algorithms for chemistry. Furthermore, a study by Hyperion Research highlights chemistry as a focal area for advancement and investment in quantum computing, ranking it as one of the most promising applications in annual surveys.

However, meaningful progress in quantum chemical calculations depends on achieving error-free, or fault-tolerant, quantum computers, which would also unlock other potential applications for these devices. As Philipp Schleich and Alán Aspuru-Guzik emphasized in a commentary for Science magazine, the ability of quantum computers to outperform classical computers hinges on the development of fault-tolerant algorithms. Fortunately, fault tolerance is a widely shared goal among quantum computer manufacturers worldwide.

Source: www.newscientist.com

Remarkable Advances in Developing Practical Quantum Computers

Practical Quantum Computers Approaching Reality

Alexander Yakimov / Alamy

The quantum computing industry is concluding the year with renewed hope, despite the absence of fully operational quantum systems. At December’s Q2B Silicon Valley Conference, industry leaders and scientists expressed optimism regarding the future of quantum computing.

“We believe that it’s highly likely that someone, or perhaps several entities, will develop a genuinely industrially viable quantum computer, an outcome we had not anticipated until the end of 2025,” stated Joe Altepeter, program manager for the Defense Advanced Research Projects Agency’s Quantum Benchmarking Initiative (QBI). The QBI aims to evaluate which of the competing quantum computing approaches can yield practical devices capable of self-correction, or fault tolerance.

This initiative will extend over several years, involving hundreds of professional evaluators. Reflecting on the program’s initial six months, Altepeter noted that while “major roadblocks” were identified in each approach, none disqualified any team from the pursuit of practical quantum devices.

“My sense is that by late 2025 we have all the major hardware components in place with adequate fidelity; the remaining challenges are primarily engineering-focused,” asserted Scott Aaronson of the University of Texas at Austin, a key figure in the field, during his conference presentation. He acknowledged the ongoing challenge of discovering algorithms for practical quantum applications, but highlighted significant progress on the hardware side.

Though quantum computing hardware advances are encouraging, application development is lagging, according to Ryan Babbush at Google. During the conference, Google Quantum AI and its partners unveiled the finalists of an XPRIZE competition aimed at accelerating application development.

The research by the seven finalists spans simulations of biomolecules crucial for human health, algorithms enhancing classical simulations for clean energy materials, and calculations that could impact the diagnosis and treatment of complex health issues.

“A few years back, I was skeptical about running applications on quantum computers, but now my interest has significantly increased,” remarked John Preskill at Caltech, a pivotal voice in quantum computing who advocates near-term use of quantum systems in scientific discovery.

Over the past year, numerous quantum computers have been employed for calculations, including the physics of materials and high-energy particles, potentially rivaling or surpassing traditional computational methods.

While certain applications are deemed particularly suitable for quantum systems, challenges remain. For instance, Pranav Gokhale at Infleqtion, a company building quantum devices from ultracold atoms, described implementing Shor’s algorithm, a quantum method capable in principle of breaking many of the encryption systems banks use today. This initial implementation still lacks the computational power to decrypt real-world encrypted information, illustrating that significant enhancements in both hardware and software are still needed.

Dutch startup QuantWare has proposed a solution to the industry’s major hardware scaling challenge, asserting that quantum computers can grow in size and computational capacity while maintaining reliability. Its quantum processor design aims for 10,000 qubits, roughly 100 times the capacity of most current superconducting quantum computers. According to Matthijs Rijlaarsdam, QuantWare anticipates having its first such device operational within two and a half years. Other firms, such as IBM and Quantinuum, are working toward similarly large-scale quantum systems, while QuEra aims to build 10,000 qubits from ultracold atoms within a year, intensifying the competition.

Moreover, the quantum computing industry is projected to expand significantly, with global investments expected to rise from $1.07 billion in 2024 to approximately $2.2 billion by 2027, as noted in a Quantum Computing Industry Survey by Hyperion Research.

“More individuals than ever can now access quantum computers, and I believe they will accomplish things we can scarcely imagine,” said Jamie Garcia from IBM.

Source: www.newscientist.com

Quantum Computers Proved More Valuable Than Anticipated in 2025

Quantum Computers Could Shed Light on Quantum Behavior

Galina Nelyubova/Unsplash

Over the past year, I consistently shared the same narrative with my editor: Quantum computers are increasingly pivotal for scientific breakthroughs.

This was the intent from the very start: the ambition to use quantum computers for deeper insights into our universe has been part of the field since its conception, referenced even in Richard Feynman’s 1981 address. Discussing how to effectively simulate nature, he suggested: “Let’s construct the computer itself using quantum mechanical components that adhere to quantum laws.”

Currently, this vision is being brought to life by Google, IBM, and a multitude of academic teams. Their devices are now employed to simulate reality on a quantum scale. Below are some key highlights.

This year’s advancements in quantum technology began for me with two studies in high-energy particle physics that crossed my desk in June. Separate research teams used two distinct quantum computers to mimic the behavior of particle pairs within quantum fields. One used Google’s Sycamore chip, built from tiny superconducting circuits, while the other, developed by QuEra, employed a chip based on ultracold atoms controlled by lasers and electromagnetic forces.

Quantum fields describe how forces like electromagnetism influence particles across the universe, as well as the local structure that governs what you see when zooming in on a particle. Simulating these fields, especially particle dynamics, where behavior changes over time, is akin to producing a motion picture of the interactions rather than a single snapshot. The two quantum computers achieved this for simplified versions of the quantum fields found in the Standard Model of particle physics.

Jad Halimeh, a researcher at the University of Munich who was not part of either study, remarked that scaled-up versions of these experiments, simulating more intricate fields on larger quantum computers, could ultimately clarify particle behavior inside colliders.

In September, teams from Harvard University and the Technical University of Munich applied quantum computers to simulate two theoretical exotic states of matter that had previously eluded traditional experiments. The quantum computers adeptly predicted the properties of these unusual materials, a feat not achievable simply by growing and analyzing crystals in the lab.

October brought Google’s new superconducting quantum computer, Willow. Researchers from the company and their partners used it to run algorithms for interpreting data from nuclear magnetic resonance (NMR) spectroscopy, a technique frequently applied in molecular and biochemical studies.

While the team’s demonstration using actual NMR data did not achieve results beyond what conventional computers can handle, the mathematics underlying the algorithm holds the promise of one day exceeding classical machines’ capabilities, providing unprecedented insights into molecular structures. The speed of this development hinges on advancements in quantum hardware technology.

Later, a third category of quantum computer made headlines. Quantinuum’s Helios-1, designed with trapped ions, successfully executed simulations of mathematical models relating to perfect electrical conductivity, or superconductivity. Superconductors facilitate electricity transfer without loss, promising highly efficient electronics and potentially enhancing sustainable energy grids. However, currently known superconductors operate solely under extreme conditions, rendering them impractical. Mathematical models elucidating the reasons behind certain materials’ superconducting properties are crucial for developing functional superconductors.

What did Helios-1 simulate? According to Henrik Dreyer at Quantinuum, likely the most pivotal model in this domain, one that has captured physicists’ interest since the 1960s. Although the simulation didn’t unveil new insights into superconductivity, it established quantum computers as essential players in physicists’ ongoing quest for understanding.

A week later, I was on another call, discussing metamaterials with Sabrina Maniscalco of the quantum algorithm firm Algorithmiq. These materials can be finely tuned to possess unique attributes absent in naturally occurring substances, holding potential for applications that range from basic invisibility cloaks to catalysts accelerating chemical reactions.

Maniscalco’s team worked on metamaterials, a topic I delved into during my graduate studies. Their simulation utilized an IBM quantum computer built with superconducting circuits, enabling the tracking of how metamaterials manipulate information—even under conditions that challenge classical computing capabilities. Although this may seem abstract, Maniscalco mentioned that it could propel advancements in chemical catalysts, solid-state batteries, and devices converting light to electricity.

As if particle physics, new states of matter, molecular analysis, superconductors, and metamaterials weren’t enough, a recent tip led me to a study from the University of Maryland and the University of Waterloo in Canada. They utilized a trapped ion quantum computer to explore how particles bound by strong nuclear forces behave under varying temperatures and densities. Some of these behaviors are believed to occur within neutron stars—poorly understood cosmic entities—and are thought to have characterized the early universe.

While the researchers’ quantum computations involved approximations that diverged from the most sophisticated models of strong forces, the study offers evidence of yet another domain where quantum computers are emerging as powerful discovery tools.

Nevertheless, this wealth of examples comes with important caveats. Most mathematical models simulated on quantum systems require simplifications compared to the most complex models; many quantum computers are still prone to errors, necessitating post-processing of computational outputs to mitigate those inaccuracies; and benchmarking quantum results against top-performing classical computers remains an intricate challenge.

In simpler terms, conventional computing and simulation techniques continue to advance rapidly, with classical and quantum computing researchers engaging in a dynamic exchange where yesterday’s cutting-edge calculations may soon become routine. Last month, IBM joined forces with several other companies to launch a publicly accessible quantum advantage tracker. This initiative ultimately aims to provide a leaderboard showcasing where quantum computers excel or lag in comparison to classical ones.

Even if quantum systems don’t ascend to the forefront of that list anytime soon, the revelations from this past year have transformed my prior knowledge into palpable excitement and eagerness for the future. These experiments have effectively transitioned quantum computers from mere subjects of scientific exploration to invaluable instruments for scientific inquiry, fulfilling tasks previously deemed impossible just a few years prior.

At the start of this year, I anticipated primarily focusing on benchmark experiments. In benchmark experiments, quantum computers execute protocols showcasing their unique properties rather than solving practical problems. Such endeavors can illuminate the distinctions between quantum and classical computers while underscoring their revolutionary potential. However, transitioning from this stage to producing computations useful for active physicists appeared lengthy and undefined. Now, I sense this path may be shorter than previously envisioned, albeit with reasonable caution. I remain optimistic about uncovering more quantum surprises in 2026.

Source: www.newscientist.com

Why Is AI Driving Up the Cost of Computers and Game Consoles?

Machines for Semiconductor Chip Production

David Talukdar/Alamy

The AI industry is now heavily investing in computer memory, collaborating directly with manufacturers on chip deals worth billions of dollars. These are the same chips found in smartphones, laptops, and gaming consoles, so the buying spree could drive prices up significantly or cause shortages that hinder production.

What drives AI’s need for memory?

AI models are tremendously large, consisting of grids filled with billions or trillions of parameters (values stored in memory) that undergo complex and repetitive calculations. This process forms the basis of how large language models process input and generate output.

Transferring this expansive data between affordable yet slower hard drives (often referred to as storage) and the processor results in a significant bottleneck. To mitigate this, a considerable amount of faster RAM (commonly termed computer memory) is utilized.
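
To get a feel for the scale involved, here is a minimal back-of-the-envelope sketch in Python. The model sizes and bytes-per-parameter figures are illustrative assumptions, not numbers from this article.

```python
# Back-of-the-envelope RAM needed to hold AI model weights.
# Model sizes and bytes per parameter are illustrative assumptions.

def model_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Gigabytes needed for the weights alone (2 bytes ~= 16-bit floats)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for params in (7, 70, 700):  # hypothetical model sizes, in billions
    print(f"{params}B parameters -> ~{model_memory_gb(params):,.0f} GB of fast memory per copy")
```

Serving many users at once means keeping many copies, or shards, of those weights resident in RAM, which is one reason AI firms are buying up memory at this scale.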

Additionally, the models created by AI companies operate at a grand scale. This necessitates computers capable of managing hundreds, thousands, or even millions of iterations of these models to cater to numerous users simultaneously.

Handling compute-intensive workloads, scaling to a huge user base, and removing limits on expansion through virtually limitless investment together produce an unquenchable thirst for hardware, one that the firms producing millions of laptops and phones each year increasingly struggle to compete with.

Why can’t chip manufacturers increase output?

It’s more complex than it appears. Semiconductor factories face production capacity limits, and establishing a new facility demands substantial investment and often spans several years.

Additionally, there are indications that manufacturers may not want the current scarcity to end. Korean media report that Samsung Electronics and SK Hynix dominate memory chip production, together accounting for roughly 70 percent of it. Both are wary of expanding supply: newly built chip factories could sit underused if the AI sector hits a downturn.

With current demand flourishing, Samsung has been able to raise prices by as much as 60 percent, and it has little reason to disrupt that momentum. For instance, a 32-gigabyte chip that Samsung sold for $149 in September was priced at $239 by November.
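
That November figure is consistent with the quoted markup, as a one-line check shows (using only the two prices above):

```python
# Percentage increase implied by the two quoted Samsung chip prices.
old_price, new_price = 149, 239  # US dollars, September vs. November
print(f"Increase: {(new_price - old_price) / old_price:.0%}")  # ~60%
```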

Have shortages like this been experienced before?

Indeed. The surge in AI has compelled firms to aggressively accumulate graphics processing unit (GPU) chips to construct extensive data centers for training and running increasingly larger models. This persistent demand has driven Nvidia’s stock price up from $13 at the beginning of 2021 to over $200 recently.

The year 2021 also witnessed widespread chip shortages across the board, triggered by a combination of the global pandemic, trade disputes, natural disasters, and extreme weather events. This disruption impacted the production of items ranging from pickup trucks to microwave ovens.

That same year experienced storage shortages as a new cryptocurrency known as Chia, which depends on storage space rather than raw computing power, gained rapid popularity.

In summary, technological advancements are outpacing developments in global supply chains.

When could this shortage end?

Not in the immediate future. OpenAI has entered into contracts with Samsung and SK Hynix that could consume as much as 40% of global memory supply, and delivery timelines will likely be dictated by those deals. That is just one AI firm; Microsoft, Google, ByteDance, and others are similarly seeking to acquire as many chips as possible.

The resolution of this shortage may hinge on whether the anticipated AI downturn, frequently mentioned by economists and industry leaders, actually materializes, potentially leading to a surplus. However, this scenario poses risks of severe financial repercussions.

Should such a downturn not occur, projections suggest the shortage may not ease until 2028, when new factories from smaller firms begin to contribute, allowing supply and demand to reach some semblance of balance.

Some experts indicate that this prolonged shortage could become a broader manufacturing challenge. Sanchit Vir Gogia, an industry analyst at Greyhound Research, noted to Reuters, “Memory shortages have evolved from a component-level issue to a macroeconomic concern.”

Topics:

  • Artificial intelligence
  • Computers

Source: www.newscientist.com

Quantum Computers Require Classical Computing for Real-World Applications

Jonathan Cohen of Quantum Machines presenting at the AQC25 conference

Quantum Machines

Classical computers are emerging as a critical component in getting the most out of quantum computers. That was a key takeaway from this month’s gathering of researchers, who emphasized that classical systems are vital for managing quantum computers, interpreting their outputs, and improving future quantum computing methods.

Quantum computers operate on qubits—quantum entities manifesting as extremely cold atoms or miniature superconducting circuits. The computational capability of a quantum computer scales with the number of qubits it possesses.

Yet qubits are delicate and demand meticulous tuning, monitoring, and control. When those conditions are not met, computations yield inaccuracies, making the devices less useful. To manage qubits effectively, researchers turn to classical computing methods, a theme addressed at the AQC25 conference held on November 14 in Boston, Massachusetts.

Sponsored by Quantum Machines, a company specializing in controllers for various qubit types, the AQC25 conference gathered over 150 experts, including quantum computing researchers and CEOs of AI startups. Across numerous presentations, attendees detailed the enabling technologies vital to the future of quantum computing, and how classical computing sometimes acts as the bottleneck.

According to Shane Caldwell, fault-tolerant quantum computers able to tackle practical problems are only expected to materialize with a robust classical computing framework that operates at petascale, similar to today’s leading supercomputers. Although Nvidia does not produce quantum hardware, it has recently introduced a system that links quantum processors (QPUs) to traditional GPUs, which are commonly employed in machine learning and high-performance scientific computing.

Even in optimal operations, the results from a quantum computer reflect a series of quantum properties of the qubits. To utilize this data effectively, it requires translation into conventional formats, a process that again relies on classical computing resources.

Pooya Ronagh from Vancouver-based startup 1QBit discussed this translation process and its implications, noting that the speed of a fault-tolerant quantum computer often hinges on the operational efficiency of classical components such as controllers and decoders. Whether a sophisticated quantum machine takes hours or days to solve a problem can thus depend significantly on its classical parts.

In another presentation, Benjamin Lienhardt from the Walther Meissner Institute for Low Temperature Research in Germany presented findings on how classical machine learning algorithms can aid the interpretation of quantum states in superconducting qubits. Similarly, Mark Saffman from the University of Wisconsin-Madison highlighted the use of classical neural networks to enhance the readout of qubits made from ultracold atoms. Speakers broadly agreed that non-quantum devices are instrumental in unlocking the potential of various qubit types.

IBM’s Blake Johnson shared insights into a classical decoder his team is developing as part of an ambitious plan to create a quantum supercomputer by 2029. This endeavor will employ unconventional error correction strategies, making the efficient decoding process a significant hurdle.

“As we progress, the trend will shift increasingly towards classical [computing]. The closer one approaches the QPU, the more you can optimize your system’s overall performance,” stated Jonathan Cohen from Quantum Machines.

Classical computing is also instrumental in assessing the design and functionality of future quantum systems. For instance, Izhar Medalcy, co-founder of the startup Quantum Elements, discussed how an AI-powered virtual model of a quantum computer, often referred to as a “digital twin,” can inform actual hardware design decisions.

Representatives from the Quantum Scaling Alliance, co-led by 2025 Nobel Laureate John Martinis, were also present at the conference. This reflects the importance of collaboration between quantum and classical computing realms, bringing together qubit developers, traditional computing giants like Hewlett Packard Enterprise, and computational materials specialists such as the software company Synopsys.

The collective sentiment at the conference was unmistakable. The future of quantum computing is on the horizon, bolstered significantly by experts who have excelled in classical computing environments.

Topics:

  • Computing
  • Quantum Computing

Source: www.newscientist.com

Quantum Computers with Recyclable Qubits: A Solution for Reducing Errors

Internal optics of Atom Computing’s AC1000 system

Atom Computing

Quantum computers, utilizing qubits formed from extremely cold atoms, are rapidly increasing in size and may soon surpass classical computers in computational power. However, the frequency of errors poses a significant challenge to their practicality. Researchers have now found a way to replenish and recycle these qubits, enhancing computation reliability.

All existing quantum systems are susceptible to errors and are currently unable to perform calculations that would give them an edge over traditional computers. Nonetheless, researchers are making notable advancements in the creation of error correction methods to address this issue.

One approach involves dividing the components of quantum computers, known as qubits, into two primary categories: operational qubits that manipulate data and auxiliary qubits that monitor errors.

Developing large numbers of high-quality qubits for either role remains a significant technical hurdle. Matt Norcia and his team at Atom Computing have found a way to reduce the qubit requirement by recycling or substituting auxiliary qubits, demonstrating that an error-tracking qubit can be reused effectively for up to 41 consecutive rounds.

“The calculation’s duration is likely to necessitate numerous rounds of measurement. Ideally, we want to reuse qubits across these rounds, minimizing the need for a continuous influx of new qubits,” Norcia explains.

The team used qubits made from electrically neutral ytterbium atoms chilled close to absolute zero with lasers and electromagnetic pulses. Using “optical tweezers”, they can manipulate each atom’s quantum state, which encodes information. This approach allowed them to divide the quantum computer into three distinct zones.

In the first zone, 128 optical tweezers held the qubits that carried out calculations. The second zone comprised 80 tweezers holding qubits that tracked errors or could be swapped in for faulty qubits. The third zone functioned as a storage area, keeping a further 75 qubits in reserve. These last two zones enabled the researchers to reset or exchange auxiliary qubits as needed.

Norcia noted that the setup was challenging to build because stray laser light interfered with nearby qubits. Consequently, the researchers had to develop highly precise laser control, along with a way of adjusting the state of data qubits so that they remained “hidden” from specific harmful types of light.

“Ancilla reuse is crucial for advancing quantum computing,” says Yuval Boger from QuEra, a US quantum computing firm. Without this ability, even basic calculations would require millions, or even billions, of qubits, making them impractical on current or forthcoming quantum hardware, he adds.

This challenge is widely recognized across the atom-based qubit research community. “Everyone working with neutral atoms acknowledges the necessity of resetting and reloading during calculations,” Norcia asserts.

For instance, Boger highlights that a team from Harvard and MIT employed similar techniques to keep a quantum computer of 3,000 ultracold rubidium atoms operating for several hours. Other quantum setups, like Quantinuum’s recently launched Helios machine, which uses laser-controlled ions as qubits, also feature qubit reuse.

Source: www.newscientist.com

E-Waste Challenges: A Guide to Recycling Old Mobile Phones and Computers

The development of the electronics that support our daily lives requires significant time, resources, and fossil fuels. The journey from mining rare earth materials to processing, manufacturing, and shipping creates immense waste, and the feats of engineering and logistics that allow consumers to buy a new mobile phone every year only add to the problem.

According to the latest Global E-waste Monitor, the world generates 62 million tonnes of electronic waste each year, a figure projected to reach 82 million tonnes by 2030. Australia contributes 580,000 tonnes of this annually, and factors like planned obsolescence, technological advancement, and device failure are expected to push the total higher.

An estimated 23 million mobile phones sit unused and gathering dust in drawers across Australia, only some of which are truly non-functional. The average Australian produces around 22kg of e-waste annually, nearly three times the global average, according to recent research from the Productivity Commission.

“It’s the fastest growing waste stream, but it’s also the most valuable,” states Anne Stonier from the Australia New Zealand Recycling Platform (ANZRP). “Electronics also contain substantial amounts of hard plastics. Recycling can help ensure these materials are managed responsibly, contributing to a more circular economy.”

Wondering where to dispose of your old phone? Concerned about keeping your sensitive data safe? Here are some things to consider when recycling your old device.

Discover Local Recycling Programs

Recycling e-waste is more complex than putting it in the yellow bin. The first step is to identify the options available locally; local councils often have designated collection points and e-waste recycling programs, though locations vary. Note that e-waste is banned from landfill in Victoria, South Australia, and Western Australia.

Additionally, several major retailers run recycling initiatives. Officeworks, for example, collects and recycles batteries, computer accessories, printer cartridges, and mobile phones, and its drop-off days extend to many other electronic products. Bunnings also offers collection bins for batteries and larger electronics like TVs, computers, and printers. The MobileMuster program, run by the Australian telecommunications industry, collects a variety of devices, including mobile phones and streaming devices.

Many manufacturers also have take-back programs, allowing customers to trade in their old devices for discounts or credits on future purchases. If you’re thinking of upgrading, explore the options available to you.

Some charities accept donations of electronic devices, such as DV Safe Phone and Reconnection Project, which refurbish used devices and distribute them to those in need.

Disposing of Devices that Store Personal Information

Devices, whether it be a smartwatch tracking your daily runs or a tablet previously used for work, often store sensitive personal information.

Before disposing of electronic items, remember to back up or transfer any important files to another device or storage option (like an external hard drive or cloud service) and remove any identifying marks or stickers from the device.

What happens next depends on the device. For most smartphones, tablets, and smart devices, a factory reset will suffice. For computers, laptops, hard drives, and USBs, reformatting the drive and restoring factory settings should be enough. Don’t forget that fax machines, printers, and scanners may retain copies of printed documents, so perform a factory reset on those as well.

Finally, ensure you unpair your old device from any remaining computers or gadgets.

If your device is so outdated that it won’t power on and can’t connect to your computer, there’s not much you can do. Just make sure to remove any external memory cards before recycling.

What If My Device Contains Highly Sensitive Information?

Physically destroying a device, such as drilling holes in a hard drive, is usually unnecessary and counterproductive to the recycling process. It can also be dangerous, especially with devices featuring non-removable batteries, which can explode and create health hazards.

For devices containing highly confidential information you want to ensure is irretrievable, consider using data sanitization software to reformat your device and encrypt the hard drive. Secure data erasure services are also available, though their offerings and prices can differ widely. Some companies may provide free data destruction for donated hard drives so they can be repurposed, while other services are geared toward larger enterprises.

Some specialized recyclers also offer data destruction services, Stonier mentioned. “If you’re worried about your information falling into the wrong hands, it’s best to wipe it,” she advises. “Better safe than sorry.”

What Happens If I Don’t Take Any Action?

The severity of potential threats can vary. A hard drive filled solely with family photos poses less risk than one containing sensitive financial data. For the majority, it’s improbable they would be specifically targeted unless there’s a clear motive or pre-existing vulnerability.

Criminal activity is often opportunistic, and taking basic precautions can prevent future issues and anxieties.

Source: www.theguardian.com

Chemical Computers: Mastering Pattern Recognition and Multitasking

Molecules can be utilized for computational tasks

Shutterstock/Imageflow

Chemical computers composed of enzyme networks can carry out a range of functions, including temperature measurement and substance identification, all while avoiding the need for reconstruction after each use. This adaptability resembles biological systems more than traditional digital circuits, indicating a potential merger of computing and biological processes.

In nature, living organisms contain molecular systems that continuously integrate chemical and physical signals. For instance, cells detect nutrients, hormones, and temperature variations, adjusting to survive. Researchers have attempted to create analogs of this biological flexibility for years, including efforts to form logic gates with DNA; however, most artificial systems fall short due to their simplicity, inflexibility, or scalability challenges.

In a novel approach, Wilhelm Huck at Radboud University in the Netherlands and his colleagues let the enzymes interact autonomously rather than scripting every chemical step, giving rise to complex behaviors capable of recognizing chemical patterns.

The team built a system of seven distinct enzymes embedded in tiny hydrogel beads held in small tubes. A liquid introduced into the tubes carries short amino acid chains called peptides, which serve as the computer’s “inputs”. As the peptides travel through the enzymes, each enzyme attempts to cleave the peptide at designated sites along its length. Each cleavage alters the peptide’s structure and the cleavage sites still available, thereby affecting the actions of the other enzymes.

This interdependence of reactions means the enzymes form a continually evolving chemical network, yielding distinctive patterns for the system to analyze. “Enzymes serve as the hardware while peptides act as the software. We address novel challenges based on the input provided,” noted Dongyang Li at Caltech, who was not part of the study.

For instance, temperature influences the reaction rates of the enzymes. Elevated temperatures can accelerate certain enzymes faster than others, modifying the output’s mixture of peptide fragments. By employing machine learning algorithms to analyze these fragments, the researchers were able to correlate fragment patterns with specific temperatures.
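
As an illustration of that read-out step, here is a minimal sketch of how fragment patterns might be mapped to temperature with off-the-shelf machine learning. The fragment data are synthetic stand-ins, not the study’s measurements, and the model choice is an assumption for demonstration.

```python
# Minimal sketch: regress temperature from peptide-fragment patterns.
# The fragment-abundance data are synthetic stand-ins for the measured
# outputs described in the article.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
temps = rng.uniform(25, 55, size=200)  # temperatures in degrees C
# Pretend each temperature shifts the relative abundance of 10 fragments.
weights = rng.normal(size=10)
X = np.outer(temps, weights) + rng.normal(scale=2.0, size=(200, 10))

X_train, X_test, y_train, y_test = train_test_split(X, temps, random_state=0)
model = Ridge().fit(X_train, y_train)
errors = np.abs(model.predict(X_test) - y_test)
print(f"mean absolute error: {errors.mean():.2f} degrees C")
```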

Different chemical reactions can take place over various timescales, giving these systems a type of “memory” for previous inputs, enabling them to identify patterns over time. For example, they can distinguish between rapid and slow light pulses, allowing for both reactive and adaptive processing of changes in input.

The outcome is a versatile, dynamic chemical computer that interprets signals akin to a living organism rather than a static chemical circuit. “The same network undertook multiple roles seamlessly, including chemical categorization, temperature sensing with an average error margin of around 1.3°C from 25°C to 55°C, pH classification, and even responding to light pulse periodicity,” Li indicated.

The researchers were astonished by the effectiveness of the compact computer, with Huck expressing hopes for future advancements that might convert optical and electrical signals directly into chemical reactions, mimicking the behavior of living cells. “We started with just six or seven enzymes and six peptides,” he remarked. “Just imagine the possibilities with 100 enzymes.”

Source: www.newscientist.com

IBM Introduces Two Quantum Computers with Unmatched Complexity

IBM researchers hold components of the Loon quantum computer

IBM

In the competitive landscape of developing error-resistant quantum supercomputers, IBM is adopting a unique approach distinct from its primary rivals. The company has recently unveiled two new quantum computing models, dubbed Nighthawk and Loon, which may validate its methodology and deliver the advancements essential for transforming next-gen devices into practical tools.

IBM’s design for quantum supercomputers is modular, with its key innovation the connections between superconducting qubits both within and across different quantum units. When this interconnectivity was first proposed, some researchers were skeptical it was feasible. Jay Gambetta at IBM says critics told the team, in effect, “You exist in a theoretical realm; achieving this is impossible”, a view the new chips aim to refute.

Within Loon, every qubit links to six others, a degree of connectivity that allows signals to move vertically as well as laterally, something not previously seen in superconducting quantum systems. Nighthawk, by contrast, implements four-way connections among its qubits.

This enhanced connectivity may be pivotal in tackling some of the most pressing issues encountered by current quantum computers. The advancements could boost computational capabilities and reduce error rates. Gambetta indicated that initial tests with Nighthawk demonstrated the ability to execute quantum programs that are 30% more complex than those on most other quantum computers in use today. Such an increase in complexity is expected to facilitate further advancements in quantum computing applications, with IBM’s earlier models already finding utility in fields like chemistry.

The industry’s ultimate objective remains clustering physical qubits into error-free “logical qubits”. IBM is promoting strategies that require smaller groupings than those pursued by competitors like Google, which could let it achieve error-free computation while sidestepping some of the financial and engineering hurdles of creating millions of qubits. That goal, Gambetta says, hinges on the connectivity demonstrated with Loon.

Stephen Bartlett, a researcher at the University of Sydney in Australia, expressed enthusiasm about the enhanced qubit connectivity but noted that further testing and benchmarking of the new systems are required. “While this is not a panacea for scaling superconducting devices to a size capable of supporting genuinely useful algorithms, it represents a significant advancement,” he remarked.

However, there remain several engineering and physical challenges on the horizon. One crucial task is to identify the most effective method for reading the output of a quantum computer after calculations, an area where Gambetta mentioned recent IBM progress. The team, led by Matthias Steffen, also aims to enhance the “coherence time” for each qubit. This measure indicates how long a quantum state remains valid for computational purposes, but the introduction of new connections can often degrade this quantum state. Additionally, they are developing techniques to reset certain qubits while computations are ongoing.

Plans are in place for IBM to launch a modular quantum computer in 2026 capable of both storing and processing information, with future tests on Loon and Nighthawk expected to provide deeper insights.

Source: www.newscientist.com

Computers Could Resolve Mathematics’ Biggest Controversy

Computers can verify mathematical proofs

Monsisi/Getty Images

A major clash in the world of mathematics may see resolution thanks to computers, potentially bringing an end to a decade-long dispute surrounding a complex proof.

It all began in 2012, when Shinichi Mochizuki, a mathematician at Kyoto University in Japan, shocked the mathematical community with his roughly 500-page proof of the ABC conjecture, a major unsolved problem at the heart of number theory. Mochizuki’s proof relied on an intricate and obscure framework of his own devising, known as Inter-universal Teichmüller (IUT) theory, which even seasoned mathematicians have found hard to grasp.

The ABC conjecture, which has been around for over 40 years, concerns a seemingly straightforward equation involving three integers, a + b = c, and the relationships among the prime factors of these values. It offers profound insights into the interplay of addition and multiplication, with ramifications for other renowned results, including Fermat’s Last Theorem.
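
For readers who want the precise statement, here is the standard formulation (not spelled out in the article) as a LaTeX snippet, where rad(n) denotes the product of the distinct primes dividing n:

```latex
% Standard statement of the ABC conjecture.
% rad(n) is the radical of n: the product of its distinct prime factors.
\[
\text{For every } \varepsilon > 0 \text{, only finitely many coprime triples }
(a, b, c) \text{ with } a + b = c \text{ satisfy } \;
c > \operatorname{rad}(abc)^{1+\varepsilon}.
\]
```

For example, the triple 1 + 8 = 9 gives rad(1 · 8 · 9) = rad(72) = 2 · 3 = 6, which is smaller than c = 9; the conjecture says such “high-quality” triples must eventually run out as the exponent grows.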

Given these potential consequences, mathematicians were initially excited to verify the proof. Early attempts stalled, however, with Mochizuki maintaining that more effort was needed to understand his work. In 2018, two distinguished German mathematicians, Peter Scholze at the University of Bonn and Jakob Stix at Goethe University Frankfurt, announced that they had found possible flaws in the proof.

Mochizuki, however, dismissed these critiques. With no central authority to arbitrate the debate, the credibility of IUT theory has split the mathematical community into opposing factions, one of them a small group of researchers aligned with Mochizuki and the Research Institute for Mathematical Sciences in Kyoto, where he works.

Now, Mochizuki has suggested a way out of the deadlock. He proposes translating the proof from ordinary mathematical notation, written for human comprehension, into a programming language known as Lean, so that it can be checked step by step by computer.

This approach, known as formalization, represents a promising area of research that could revolutionize the practice of mathematics. Although there have been earlier suggestions for Mochizuki to formalize his proof, this marks the first time he has publicly indicated plans to advance this initiative.

Mochizuki was unavailable for comment on this article. However, in a recent report he asserted that Lean would be an excellent tool for clarifying certain disputes among mathematicians that have hindered acceptance of his proof. He stated: “This represents the best, and perhaps only, way to achieve significant progress in liberating mathematical truth from social and political constraints.”

Mochizuki became convinced of the advantages of formalization after attending a conference on Lean in Tokyo last July, particularly impressed by its capacity to manage the mathematical structures essential to his IUT theory.

This could be a vital step toward overcoming the stalemate, says Kevin Buzzard at Imperial College London. “If it’s written in Lean, that’s not strange at all. Much of what’s found in papers is written in unusual language, so being able to express it in Lean means that this unusual language has acquired a completely well-defined meaning,” he explains.

“We want to understand the why [of IUT], and we’ve been awaiting clarity for over a decade,” remarked Johan Commelin at Utrecht University in the Netherlands. “Lean will aid in uncovering those answers.”

However, both Buzzard and Commelin acknowledge that formalizing IUT theory is an immense challenge, requiring the translation of a vast body of mathematics that currently exists only in human-readable form. The effort is anticipated to be the largest formalization endeavor ever attempted; such projects often require teams of specialists and take months or even years.

This daunting reality may dissuade the limited number of mathematicians capable of undertaking this project. “Individuals will need to decide whether they are willing to invest significant time in a project that may ultimately lead to failure,” Buzzard remarked.

Even if mathematicians complete the project and the Lean code confirms that Mochizuki’s theorem is consistent, disputes about its interpretation could still arise among mathematicians, including Mochizuki himself, according to Commelin.

“Lean has the potential to make a significant impact and resolve the controversy, but this hinges on Mochizuki’s genuine commitment to formalizing his work,” he adds. “If he abandons it after four months, claiming ‘I’ve tried this, but Lean is too limited to grasp my proof,’ it would just add another chapter to the long saga of social issues persisting.”

Despite his enthusiasm about Lean, Mochizuki agrees with his critics that interpreting the meaning of the code might lead to ongoing disputes, conceding that Lean “does not appear to be a ‘magic cure’ for completely resolving social and political issues at this stage”.

Nevertheless, Buzzard remains optimistic that the formalization project, especially if successful, could propel the decade-old saga forward. “You can’t contest software,” he concludes.

Source: www.newscientist.com

Quantum Computers Confirm the Reality of Wave Functions

The wave function of a quantum object might extend beyond mere mathematical representation

Povitov/Getty Images

Does quantum mechanics accurately depict reality, or is it merely our flawed way of interpreting the peculiar behavior of minuscule entities? A notable experiment aimed at answering this question has now been conducted on quantum computers, yielding surprisingly solid results: quantum mechanics genuinely represents reality, at least for small quantum systems. The findings could also guide the development of more efficient and dependable quantum devices.

Since the development of quantum mechanics over a hundred years ago, its uncertain and probabilistic traits have confounded scientists. Take superposition: are particles truly in multiple locations simultaneously, or do the calculations of their positions merely assign probabilities to their actual whereabouts? If it’s the latter, there may be hidden aspects of reality that quantum mechanics fails to capture. These elusive aspects are termed “hidden variables”, and theories built on this premise are classified as hidden variable theories.

In the 1960s, physicist John Bell devised an experiment intended to test such theories. The Bell test probes quantum mechanics by evaluating the connections, or entanglement, between distant quantum particles. If the correlations between the particles surpass a certain threshold, their entanglement is nonlocal, spanning any distance, and local hidden variable theories can be dismissed. The Bell test has since been performed on various quantum systems, consistently affirming the intrinsic nonlocality of the quantum realm.

In 2012, physicists Matthew Pusey, Jonathan Barrett, and Terry Rudolph developed a more comprehensive test (dubbed PBR in their honor) that enables researchers to differentiate between interpretations of quantum systems. Among these is the ontic perspective, which asserts that the wavefunction (the mathematical representation of a quantum state) corresponds to reality itself. The rival epistemic view suggests that the wavefunction merely reflects our incomplete knowledge, concealing a richer reality beneath.

If we assume that quantum systems possess no hidden features influencing them beyond the wave function, the mathematics of PBR indicates we ought to interpret phenomena ontically. This implies that quantum behavior is genuine, no matter how peculiar it appears. PBR tests work by comparing different quantum elements, such as qubits in a quantum computer, and assessing how frequently they register consistent values for specific properties, like spin. If the epistemic perspective is accurate, the qubits will report identical values more often than quantum mechanics predicts, implying that additional factors are at play.
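To see the structure of the argument, here is a minimal NumPy sketch of the two-qubit PBR construction from the original 2012 paper, using the preparations |0⟩ and |+⟩. It is illustrative only; a real test on hardware must contend with noise, as described below. The key feature is that each of the four joint preparations has one measurement outcome it can never produce, something an epistemic model with overlapping distributions cannot arrange.

```python
import numpy as np
from itertools import product

# Single-qubit states
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
ketp = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>
ketm = np.array([1, -1], dtype=complex) / np.sqrt(2)  # |->

kron = np.kron
# PBR measurement basis: outcome k is orthogonal to the k-th joint preparation
xi = [
    (kron(ket0, ket1) + kron(ket1, ket0)) / np.sqrt(2),  # forbids |0>|0>
    (kron(ket0, ketm) + kron(ket1, ketp)) / np.sqrt(2),  # forbids |0>|+>
    (kron(ketp, ket1) + kron(ketm, ket0)) / np.sqrt(2),  # forbids |+>|0>
    (kron(ketp, ketm) + kron(ketm, ketp)) / np.sqrt(2),  # forbids |+>|+>
]

# Sanity check: the four outcomes form an orthonormal measurement basis
gram = np.array([[np.vdot(a, b) for b in xi] for a in xi])
assert np.allclose(gram, np.eye(4))

# Each joint preparation never triggers its forbidden outcome
preps = list(product([("0", ket0), ("+", ketp)], repeat=2))
for k, ((la, a), (lb, b)) in enumerate(preps):
    probs = [abs(np.vdot(x, kron(a, b))) ** 2 for x in xi]
    assert probs[k] < 1e-12  # outcome k has probability zero
    print(f"|{la}>|{lb}>: outcome probabilities {np.round(probs, 3)}")
```

On noiseless qubits the forbidden outcomes occur with probability exactly zero; the experimental question below is how much hardware noise can be tolerated before that signature is washed out.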

Yang Songqinghao and his colleagues at the University of Cambridge have created a method to perform PBR tests on a functioning IBM Heron quantum computer. The findings reveal that if the number of qubits is minimal, it’s possible to assert that a quantum system is ontic. In essence, quantum mechanics appears to operate as anticipated, as consistently demonstrated by the Bell test.

Yang and his team executed this validation by evaluating the overall output from pairs or groups of five qubits, such as a sequence of 1s and 0s, and determined how often this outcome aligned with predictions about the behavior of the quantum system, factoring in inherent errors.

“Currently, all quantum hardware is noisy and every operation introduces errors, so if we add this noise to the PBR threshold, what is the interpretation [of our system]?” remarks Yang. “We discovered that if we conduct the experiment on a small scale, we can fulfill the original PBR test and eliminate the epistemic interpretation.” The case for hidden variables vanishes.

While they successfully demonstrated this for a limited number of qubits, they encountered difficulties replicating the same results for a larger set of qubits on a 156-qubit IBM machine. The error or noise present in the system becomes excessive, preventing researchers from distinguishing between the two scenarios in a PBR test.

This implies that the test cannot definitively determine whether the world is entirely quantum. At certain scales, the ontic view may dominate, yet at larger scales, the precise actions of quantum effects remain obscured.

Utilizing this test to validate the “quantum nature” of quantum computers could provide assurance that these machines function as intended, bolstering their claim to quantum advantage: the capability to carry out tasks that would be impractically time-consuming for classical computers. “To obtain a quantum advantage, you must have quantum characteristics within your quantum computer. If not, you can discover a corresponding classical algorithm,” asserts team member Haom Yuan, also at the University of Cambridge.

“The concept of employing PBR as a benchmark for device efficacy is captivating,” notes Matthew Pusey at the University of York, UK, one of the original PBR authors. However, Pusey remains uncertain about its implications for reality. “The primary purpose of conducting experiments rather than relying solely on theory is to ascertain whether quantum theory can be erroneous. Yet, if quantum theory is indeed flawed, what questions does that raise? The entire framework of ontic and epistemic states presupposes quantum theory.”

Understanding Reality

To address that concern, it would be essential to devise a way of performing the test without presuming that quantum theory is accurate. “A minority of individuals contend that quantum physics fundamentally fails at mesoscopic scales,” states Terry Rudolph at Imperial College London, one of the PBR test’s originators. “This experiment might not be relevant to dismissing those proposals, but let me be straightforward: I am uncertain! Investigating fundamental aspects of quantum theory in progressively larger systems will always contribute to refining the search for alternative theories.”

Reference: arXiv, arxiv.org/abs/2510.11213


Source: www.newscientist.com

Analog Computers May Train AI 1,000 Times Faster While Consuming Less Energy

Analog computers use less energy compared to digital computers

Metamol Works/Getty Images

Analog computers that can swiftly resolve the primary types of equations essential for training artificial intelligence models may offer a viable solution to the growing energy demands of data centers spurred by the AI revolution.

Devices like laptops and smartphones are known as digital computers because they handle data in binary form (0s and 1s) and can be programmed for various tasks. Conversely, analog computers are generally crafted to tackle specific problems, using continuously variable quantities like electrical resistance rather than discrete binary values.

While analog computers excel in terms of speed and energy efficiency, they have historically lagged in accuracy compared to their digital counterparts. Recently, Zhong Sun and his team at Peking University in China developed two analog chips that work collaboratively to solve matrix equations accurately—crucial for data transmission, large-scale scientific simulations, and AI model training.

The first chip generates low-precision outputs for matrix computations at high speed, while the second chip refines these outputs through an iterative improvement algorithm to assess and minimize the error rate of the initial results. Sun noted that the first chip produced results with a 1% error rate, but after three iterations with the second chip, this rate dropped to 0.0000001%, comparable to the accuracy found in conventional digital calculations.
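The second chip’s role is a textbook case of iterative refinement: solve once at low precision, then repeatedly correct using a residual computed at higher precision. Here is a minimal NumPy sketch of that idea, with the analog chip’s fast but imprecise solve emulated by a deliberately degraded matrix inverse; the matrix, precision, and iteration count are illustrative, not the paper’s.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned test matrix
b = rng.standard_normal(n)

# Stand-in for the fast analog solver: an inverse rounded to low precision
A_inv_lowp = np.linalg.inv(A).astype(np.float16).astype(np.float64)

def lowp_solve(rhs):
    return A_inv_lowp @ rhs

x = lowp_solve(b)  # rough first answer
for it in range(3):
    r = b - A @ x          # residual, computed at full precision
    x = x + lowp_solve(r)  # cheap low-precision correction
    err = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
    print(f"iteration {it + 1}: relative residual {err:.2e}")
```

Each pass shrinks the error by roughly the accuracy of the cheap solver, which is why a handful of iterations can take a 1 percent answer down to near digital precision.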

Currently, the researchers have successfully designed a chip capable of solving 16 × 16 matrices, which equates to handling 256 variables, sufficient for addressing smaller problems. However, Sun acknowledges that addressing the complexities of today’s large-scale AI models will necessitate substantially larger circuits, potentially scaling up to 1 million by 1 million.

A unique advantage of analog chips is that their solving time barely grows with matrix size, unlike digital chips, where the cost of solving rises steeply (roughly cubically for general matrix solvers) as matrices get larger. This translates to a 32 × 32 analog chip outperforming the Nvidia H100 GPU, a leading chip for AI training.

Theoretically, further scaling could yield throughput up to 1,000 times greater than digital alternatives like GPUs while consuming 100 times less energy, according to Sun. However, he cautions that real-world problems may exceed what the circuits can handle, limiting the benefits in practice.

“This is merely a speed comparison; your specific challenges may differ in real-world scenarios,” Sun explains. “Our chip is designed exclusively for matrix computations. If these computations dominate your tasks, the acceleration will be substantial; otherwise, the benefits may be constrained.”

Sun suggests that the most realistic outcome may be the creation of hybrid chips that incorporate some analog circuitry alongside GPUs to tackle specific problem areas, although this development might still be years away.

James Millen, a professor at King’s College London, emphasizes that matrix calculations are pivotal in AI model training, indicating that analog computing has the potential to make a significant impact.

“The contemporary landscape is dominated by digital computers. These remarkable machines are universal, capable of tackling any computation, yet not necessarily with optimal efficiency or speed,” Millen states. “Analog computers excel at specific tasks, making them exceptionally fast and efficient. This research uses analog computing chips to speed up matrix inversion—essential for training certain AI models. Improving this efficiency could help mitigate the substantial energy demands accompanying our expanding reliance on AI.”


Source: www.newscientist.com

Google Unveils Quantum Computers’ Ability to Unlock Molecular Structures


Google’s Quantum Computing Willow Chip

Google Quantum AI

Researchers at Google Quantum AI have leveraged Willow quantum computers to enhance the interpretation of data sourced from nuclear magnetic resonance (NMR) spectroscopy—an essential research method within chemistry and biology. This significant advancement may open new horizons for the application of quantum computing in various molecular technologies.

Quantum computers’ most publicized potential lies in cryptography, but current devices are too limited in scale and error rates to handle decryption tasks. However, they show promise in expediting the discovery of new drugs and materials, tasks that align with the fundamentally quantum nature of the underlying chemistry and physics. Hartmut Neven and colleagues at Google Quantum AI have now showcased one instance where quantum computers can mimic the complex interactions found in natural processes.

The investigation centered on a computational method known as quantum echo and its application to NMR, a technique utilized to extract detailed information regarding molecular structures.

At its core, the concept of quantum echoes is akin to the butterfly effect, whereby minor perturbations—like the flap of a butterfly’s wings—can trigger substantial changes in a broader system. The researchers implemented the approach on a system of 103 of Willow’s qubits.

During the experiment, the team executed a specific sequence of operations to alter the quantum state of the qubits in a controlled way. They then chose one qubit to disrupt, acting as a “quantum butterfly,” before applying the inverse of the original sequence, effectively reversing time. Finally, the researchers evaluated the quantum characteristics of the qubits to extract insights about the entire system.

In a basic sense, the NMR technique applied in the lab also hinges on minor disturbances; it nudges actual molecules using electromagnetic waves and examines the system’s reactions to ascertain atomic positions—similar to using a molecular ruler. If the operations on qubits can replicate this process, the mathematical scrutiny of the qubits can likewise be translated into molecular structural details. This series of quantum computations could potentially enable the examination of atoms that are relatively distant from one another, said team member Tom O’Brien. “We’re constructing longer molecular rulers.”
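As a rough illustration of this forward-perturb-reverse structure, here is a small NumPy simulation: evolve a register with a random brickwork circuit U, flip one “butterfly” qubit, undo U, and check how much the state still resembles its starting point. This is a Loschmidt-echo-style toy, not Google’s actual quantum echoes protocol, which involves further measurement machinery; the circuit sizes and gate choices here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

def haar_unitary(dim):
    # Haar-random unitary via QR decomposition of a complex Gaussian matrix
    z = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_2q(state, gate, i, j, n):
    # Apply a 4x4 gate to qubits i and j of an n-qubit state vector
    psi = np.moveaxis(state.reshape([2] * n), (i, j), (0, 1))
    psi = (gate @ psi.reshape(4, -1)).reshape(psi.shape)
    return np.moveaxis(psi, (0, 1), (i, j)).reshape(-1)

def run_circuit(state, gates, n, reverse=False):
    # Apply a stored list of (i, j, gate) forward, or its exact inverse
    seq = reversed(gates) if reverse else gates
    for i, j, g in seq:
        state = apply_2q(state, g.conj().T if reverse else g, i, j, n)
    return state

n = 8
X = np.array([[0, 1], [1, 0]], dtype=complex)
for depth in [1, 2, 4, 8]:
    gates = [(i, i + 1, haar_unitary(4))
             for layer in range(depth)
             for i in range(layer % 2, n - 1, 2)]  # brickwork pattern
    psi0 = np.zeros(2 ** n, dtype=complex)
    psi0[0] = 1
    psi = run_circuit(psi0, gates, n)                        # forward evolution U
    psi = apply_2q(psi, np.kron(X, np.eye(2)), 0, 1, n)      # butterfly flip on qubit 0
    psi = run_circuit(psi, gates, n, reverse=True)           # time reversal U-dagger
    print(f"depth {depth}: echo signal {abs(np.vdot(psi0, psi)) ** 2:.4f}")
```

The deeper the circuit, the more widely the single-qubit flip scrambles, and the echo signal decays; reading out such decay is, loosely, how the protocol extracts information about the system.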

The researchers believe that a protocol akin to quantum echoes would require approximately 13,000 times longer on a conventional supercomputer. Their tests indicated that two distinct quantum systems could successfully perform a quantum echo and yield identical outcomes—a notable achievement given the inconsistencies faced in previous quantum algorithms supported by the team. O’Brien noted that enhancements in the quality of Willow’s hardware and reduced qubit error rates have contributed to this success.

Nonetheless, there remains ample opportunity for refinement. In their utilization of Willow and quantum echoes for two organic molecules, the researchers operated with a mere 15 qubits at most, yielding results comparable to traditional non-quantum methods. In essence, the team has not yet demonstrated a definitive practical edge for Willow over conventional systems. This current exhibition of quantum echo remains foundational and has not been subjected to formal peer review.

“Addressing molecular structure determination is crucial and pertinent,” states Keith Fratus from HQS Quantum Simulations, a German company focused on quantum algorithms. He emphasizes that bridging established techniques such as NMR with calculations executed by quantum computers represents a significant milestone, though the technology’s immediate utility might be confined to specialized research in biology.

Doris Sels, a professor at New York University, remarked that the Google team’s experiments involve larger quantum computers and more complex NMR protocols and molecules than prior efforts. “Quantum simulation is often highlighted as a promising application for quantum computers, yet there are surprisingly few examples with industrial relevance. I believe model inference of spectroscopic data like NMR could prove beneficial,” she added. “We’re not quite there, but initiatives like this inspire continued investigation into this issue.”

O’Brien expressed optimism that the application of quantum echo to NMR will become increasingly beneficial as they refine qubit performance. Fewer errors mean a greater capability to execute more operations simultaneously and accommodate larger molecular structures.

Meanwhile, the quest for optimal applications of quantum computers is ongoing. While the experimental implementation of quantum echoes on Willow is remarkable, the mathematical analysis it facilitates may not achieve widespread adoption, according to Kurt von Keyserlingk at King’s College London. Until NMR specialists pivot away from traditional methods cultivated over decades, he suggests, its primary allure will lie with theoretical physicists studying fundamental quantum systems. The protocol may also face competition from conventional computing: von Keyserlingk is already exploring how traditional methods might rival the approach.



Source: www.newscientist.com

Challenging Calculations: Quantum Computers May Struggle with ‘Nightmare’ Problems


Certain problems remain insurmountable for quantum computers.

Jaroslav Kushta/Getty Images

Researchers have uncovered a “nightmare scenario” computation tied to a rare form of quantum material that remains unsolvable, even with the most advanced quantum computers.

In contrast to the simpler task of determining the phase of standard matter, such as identifying whether water is in a solid or liquid state, the quantum equivalent can prove exceedingly challenging. Thomas Schuster and his team at the California Institute of Technology have demonstrated that identifying the quantum phase of matter can be notably difficult, even for quantum machines.

They mathematically examined a scenario in which a quantum computer receives a set of measurements of an object’s quantum state and must determine its phase. Schuster said this is not necessarily an impossible task, but his team showed that identifying a considerable number of quantum phases of matter—quantum analogues of the distinction between liquid water and ice, including unusual “topological” phases that can host exotic electrical currents—could require quantum computers to run for extremely protracted periods. This mirrors a worst-case scenario in laboratory settings, where instruments would need to operate for billions or even trillions of years to discern the characteristics of a sample.

This doesn’t imply that quantum computers are rendered obsolete for this analysis. As Schuster noted, these phases are unlikely to manifest in actual experiments involving materials or quantum systems, serving more as an indicator of our current limitations in understanding quantum computers than posing an immediate practical concern. “They’re like nightmare scenarios. It would be quite unfortunate if such a case arose. It probably won’t happen, but we need to improve our comprehension,” he stated.

Bill Fefferman from the University of Chicago raised intriguing questions regarding the overall capabilities of computers. “This might illuminate the broader limits of computation: while substantial speed improvements have been realized for specific tasks, there will inevitably be challenges that remain too daunting, even for efficient quantum computers,” he asserted.

Mathematically, he explained, this new research merges concepts from quantum information science employed in quantum cryptography with foundational principles from materials physics, potentially aiding progress in both domains.

Looking ahead, the researchers aspire to broaden their analysis to encompass more energetic or excited quantum phases of matter, which are recognized as challenging for wider calculations.


Source: www.newscientist.com

What Makes Quantum Computers So Powerful?

3D rendering of a quantum computer’s chandelier-like structure

Shutterstock / Phong Lamai Photography

Eleven years ago, I began my PhD in theoretical physics and honestly had never considered or written about quantum computers. Meanwhile, New Scientist was busy crafting the first “Quantum Computer Buyer’s Guide,” always ahead of its time. A glance through reveals how things have changed—John Martinis from UC Santa Barbara was recognized for developing an array of merely nine qubits and earned a Nobel Prize in Physics just last week. Curiously, there was no mention of quantum computers built using neutral atoms, which have rapidly transformed the field in recent years. This sparked my curiosity: how would a quantum computer buyer’s guide look today?

At present, around 80 companies globally are producing quantum computing hardware. My reporting on quantum computing has allowed me to witness firsthand how the industry evolves, complete with numerous sales pitches. If choosing between an iPhone and an Android is challenging, consider navigating the press lists of various quantum computing startups.

While there’s significant marketing hype, the challenge in comparing these devices stems from the lack of a clear standard for how to build a quantum computer. Potential qubit options include superconducting circuits, trapped ions, and light, among others. With such diverse components, how does one assess their differences? Ultimately, the comparison comes down to each quantum computer’s performance.

This marks a shift from the early days, where success was measured by the number of qubits—the foundational elements of quantum information processing. Many research teams have surpassed the 1000-qubit threshold, and the trajectory for achieving even more qubits appears to be becoming clearer. Researchers are exploring standard manufacturing methods, such as creating silicon-based qubits, and leveraging AI to enhance the size and capabilities of quantum computers.

Ideally, more qubits should always translate to greater computational power, enabling quantum computers to tackle increasingly complex challenges. However, in reality, ensuring each additional qubit doesn’t impede the performance of existing ones presents significant technical hurdles. Thus, it’s not just the number of qubits that counts, but how much information they can retain and how effectively they can communicate without losing data accuracy. A quantum computer could boast millions of qubits, but if they’re susceptible to errors that disrupt computations, they become virtually ineffective.

The extent of this noise can be measured by metrics like “gate fidelity,” which reflects how accurately a qubit or pair of qubits can perform operations, and “coherence time,” which gauges how long a qubit can maintain a viable quantum state. Even when those metrics look favorable, however, we must also consider the intricacies of getting data into a quantum computer and retrieving the outcomes. The growth of the quantum computing industry is partly attributable to the emergence of companies focused on qubit control and on interfacing quantum internals with non-quantum users. A thorough buyer’s guide for quantum computers in 2025 should encompass these essential add-ons: choosing a qubit means also selecting a qubit control system and an error correction mechanism. I recently spoke with a researcher developing an operating system for quantum computers, who suggested that such systems may become a necessity in the near future.

If I were to create a wish list for the short term, I would favor a machine capable of executing at least a million operations: a million-step quantum computing program with minimal error rates and robust error correction. John Preskill from the California Institute of Technology refers to this as the “Mega-Quop” machine. Last year, he expressed confidence that such machines would be fault-tolerant and powerful enough to yield scientifically significant discoveries. Yet, we aren’t there yet. The quantum computers at our disposal currently manage tens of thousands of operations, but error correction has only been effectively demonstrated for smaller tasks.
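To see why error rates, rather than raw qubit counts, are the bottleneck here, some back-of-the-envelope arithmetic helps. Assuming independent errors (a crude but standard model), a program of N operations succeeds with probability about (1 − p)^N, where p is the per-operation error rate; the rates below are illustrative, not any vendor’s specs.

```python
# Success probability of an N-operation quantum program under a simple
# independent-error model: P(success) ~ (1 - p)^N
N = 1_000_000  # a "mega-quop" program
for p in [1e-3, 1e-4, 1e-6, 1e-7]:
    print(f"per-op error {p:.0e}: success ~ {(1 - p) ** N:.4f}")
# A rate near 1e-3 (typical raw hardware today) gives essentially zero chance
# of finishing; useful mega-quop runs need ~1e-7, hence error correction.
```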

Quantum computers today are akin to adolescents—growing toward utility but still faced with developmental challenges. As a result, the question I frequently pose to quantum computer vendors is, “What can this machine actually accomplish?”

In this regard, it’s vital to compare not only various types of quantum computers but also contrast them with classical counterparts. Quantum hardware is costly and complex to manufacture, so when is it genuinely the sole viable solution for a given issue?

One method to tackle this inquiry is to pinpoint calculations that traditional computers cannot complete in any feasible amount of time. This concept is termed “quantum supremacy,” and it keeps quantum engineers and mathematicians consistently preoccupied. Instances of quantum supremacy do exist, but they come with caveats. To be meaningful, such a case must be practical enough that a real machine can execute it, while also being provable enough that mathematicians can assure us no conventional computer could compete.

In 1994, physicist Peter Shor devised a quantum computing algorithm for factoring large numbers, a technique that could potentially compromise the prevalent encryption methods utilized by banks worldwide. A sufficiently large quantum computer that could manage its own errors might execute this algorithm, yet mathematicians have yet to convincingly demonstrate that classical computers can’t efficiently factor large numbers. The most prominent claims of quantum supremacy often fall into this gray area, with some eventually being outperformed by classical machines. Ongoing demonstrations of quantum supremacy appear currently to serve primarily as confirmations of the quantum characteristics of the computers accomplishing them.
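Shor’s insight was that factoring reduces to period finding. A minimal sketch of the classical skeleton for N = 15 shows the division of labor: everything here runs on an ordinary computer except the order-finding loop, which is the one step a quantum computer would accelerate (below it is brute-forced, which is exactly what becomes infeasible at cryptographic sizes).

```python
from math import gcd

N, a = 15, 7  # the number to factor and a coprime base
assert gcd(a, N) == 1

# Order finding: the smallest r with a^r = 1 (mod N). This loop is the only
# part Shor's algorithm replaces with a quantum subroutine.
r = 1
while pow(a, r, N) != 1:
    r += 1

# Classical post-processing: for even r, gcd(a^(r/2) +/- 1, N) yields factors
assert r % 2 == 0
p, q = gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
print(f"order r = {r}; factors of {N}: {p} and {q}")  # r = 4; factors 3 and 5
```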

Conversely, in the mathematical discipline of “query complexity,” the superiority of quantum solutions is rigorously provable, but practical algorithms remain elusive. Recent experiments have also introduced the notion of “quantum information superiority,” wherein quantum computers solve tasks using fewer qubits than the bits a traditional computer would require, focusing on physical resources rather than time. Though this sounds promising—indicating that quantum computers might solve some problems without extensive scaling—it alone would not justify a purchase, because the tasks in question often lack pivotal real-world applications.

It’s undeniable that several real-world challenges are well-suited for quantum algorithms, like understanding molecular properties relevant to agriculture or medicine, or solving logistic issues like flight scheduling. Yet, researchers lack full clarity on these applications, often opting to state, “it seems.”

For instance, recent research on the prospective applications of quantum computing in genomics by Aurora Maurizio from the San Raffaele Scientific Institute in Italy and Guglielmo Mazzola at the University of Zurich suggests that traditional computing methods excel so significantly that “quantum computing may, in the near future, only yield speedups for a specific subset of sufficiently complex tasks.” Their findings indicate that while quantum computers could potentially enhance research in combinatorial problems within genomics, their application needs to be very precise and calculated.

In reality, for numerous problems not specifically designed to demonstrate quantum supremacy, there is a spectrum in what counts as “fast.” Quantum computers might ultimately run algorithms quicker than classical computers once noise and technical challenges are overcome, but the speedup may not always offset the hardware’s significant costs. For example, the second-best-known quantum algorithm, Grover’s search algorithm, offers a non-exponential speedup, reducing computation time only at a square-root level. Ultimately, the question of how fast is “fast enough” to justify the transition to quantum computing may depend on the individual buyer.
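To make that concrete, here is a toy cost comparison for unstructured search over n items, where Grover’s algorithm needs about √n steps. Suppose each quantum step costs k times as much as a classical one; the overhead k = 1000 is an arbitrary illustration, not a measured figure. The square-root speedup only pays off once n exceeds roughly k².

```python
import math

k = 1000  # assumed per-step quantum overhead (illustrative)
for n in [10**4, 10**6, 10**9, 10**12]:
    classical = n               # ~n classical queries
    quantum = k * math.sqrt(n)  # ~sqrt(n) quantum queries, each k times costlier
    winner = "quantum" if quantum < classical else "classical"
    print(f"n = {n:.0e}: classical {classical:.1e}, quantum {quantum:.1e} -> {winner}")
```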

While it’s frustrating to include this in a purported buyer’s guide, my discussions with experts indicate that there remains far more uncertainty about what quantum computers can achieve than established knowledge. Quantum computing is an intricate, costly future technology; however, its genuine added value to our lives remains vague beyond serving the financial interests of a select few companies. This might not be satisfying, but it reflects the unique, uncharted territory of quantum computing.

For those of you reading this out of the desire to invest in a powerful, reliable quantum computer, I encourage you to proceed and let your local quantum algorithm enthusiast experiment with it. They may offer better insights in the years to come.


Source: www.newscientist.com

Quantum Computers: Finally Attaining Unchallenged Dominance


Quantinuum’s Quantum Computer

Quantinuum

What unique capabilities do quantum computers possess that traditional computers cannot replicate? This question is central to a swiftly evolving industry, and recent findings aim to provide clarity on this topic.

Unlike classical bits, quantum computers utilize qubits that can occupy multiple states beyond just “0” or “1”, offering potential computational advantages. However, the debate on whether quantum computers can accomplish tasks beyond the reach of the most advanced traditional computers, including the notion of quantum supremacy, remains complex and contentious. This is primarily due to the stipulation that genuine demonstrations of quantum advantage must involve practical computational tasks, achievable with realistic quantum technology, while explicitly excluding any mathematical or algorithmic enhancements that may allow classical computers to eventually catch up.

William Crescher from the University of Texas at Austin and his colleagues have now conducted an experiment that satisfies both criteria. In contrast to earlier claims of quantum dominance, which were ultimately bridged by classical computing advancements, the researchers assert: “Our results are clear and enduring: no future classical algorithm development will close this gap.”

The team executed a complex mathematical experiment addressing communication challenges using 12 qubits made from laser-controlled ions by the quantum computing company Quantinuum. The experiment’s objective was for two virtual participants, referred to as Alice and Bob, to devise the most efficient method for exchanging messages and performing calculations.

One section of the quantum computer, acting as Alice, prepares a specific quantum state and transmits it to Bob, another segment of the machine. Bob must discern its properties and determine how to measure Alice’s state to produce an output. By iterating this process, the duo can establish a means to forecast Bob’s output before Alice discloses her state.

The researchers ran the procedure 10,000 times to refine how Alice and Bob execute their tasks. Analysis of these iterations, together with a rigorous mathematical examination of the protocol, showed that classical algorithms using fewer than 62 bits could not match the performance of the 12-qubit quantum computer on this task, and that equaling it in practice would require roughly 330 bits, a nearly 30-fold difference in resources.

“This is an extraordinary scientific achievement that illustrates the extent of the ‘quantum advantage’ landscape, which may be broader than previously understood,” said Ashley Montanaro from the University of Bristol, UK. “Unlike most demonstrations of quantum superiority, the prospect of discovering a superior classical algorithm is virtually impossible.”

Ronald de Wolf from the Dutch Institute for Mathematics and Computer Science highlights that this experiment effectively leverages the recent rapid enhancements in existing quantum technologies while drawing upon theories of communication complexity that have been explored for years.

“The intricacies of communication are known to contribute to a verifiable and realistic distinction between quantum and classical systems. The difference is that advancements in hardware have made it feasible to implement the model for the first time,” he explains. “Moreover, they tackled a novel challenge in communication complexity, revealing a significant gap between classical and quantum capabilities even with just 12 qubits.”

These new findings differentiate themselves from earlier demonstrations of quantum superiority, but share a crucial element: their immediate practicality remains uncertain. Notable examples of quantum advantage that could produce substantial real-world benefits, such as Shor’s algorithm which could revolutionize encryption, still await confirmation regarding their applicability.

In the future, research teams might enhance their findings further by separating Alice and Bob into distinct computers. While this limits the chances of unmonitored interactions affecting outcomes of the quantum computer, the true utility of quantum dominance remains a critical issue, according to De Wolf.

“Progress beyond mere [quantum] dominance is essential for achieving [quantum] utility: quantum computers outperforming classical ones in specific areas of genuine interest, like some chemical computations and logistics optimization,” he suggests.


Source: www.newscientist.com

Quantum Computers Are Now Practical and Valuable

3D illustration of a quantum computer

AdventTr/Getty Images

Amidst the excitement surrounding quantum computing, the technology may appear as a catch-all solution for various challenges. While the science is impressive, real-world applications are still developing. However, the quest for viable uses is starting to yield fruitful results. Particularly, the search for exotic quantum materials is gaining traction, which could revolutionize electronics and enhance computational power.

The discovery and exploration of new phases of matter—especially forms more exotic than ice or liquid water—remain foundational to condensed matter physics. Insights gained here can enhance our understanding of semiconductor functionality and lead to practical superconductors.

Yet, traditional experimental methods are increasingly inadequate for studying certain complex phases that theory suggests exist. For instance, the Kitaev honeycomb model predicts materials with a unique type of magnetism, but it took “decades of exploration to actually design this with real materials,” according to Simon Everred of Harvard University.

Everred and colleagues simulated this phenomenon using a quantum computer with 104 qubits made from ultra-cold atoms. They’re not alone in this endeavor; Frank Pollmann from the Technical University of Munich and his team utilized Google’s Sycamore and Willow Quantum Computers, which house 72 and 105 superconducting qubits respectively, to model conditions based on iterations of the Kitaev honeycomb framework. Both teams have documented their findings.

“These two projects harness quantum computers to investigate new phases of problems that had been theoretically predicted but not observed experimentally,” notes Petr Zapletal from the University of Erlangen-Nuremberg, who was not involved in the studies. “The advancement of quantum simulations for complex condensed matter systems is particularly thrilling.”

Both research teams confirmed the presence of anyons in their simulations, a significant progress that illustrates the growth and potential utility of quantum computers. Anyons differ fundamentally from qubits and represent exotic particles that are challenging to emulate.

Known particles fall into two categories: fermions and bosons. While chemists and materials scientists often care most about fermions, qubits generally behave like bosons. The distinctions—such as spin and collective behavior—complicate the simulation of fermions using bosons, but the cold-atom experiments used Kitaev models to bridge these gaps. Marcin Kalinowski of Harvard, who participated in the research, described the Kitaev model as a “canvas” for exploring new physics. Through this model, the team could tune quasiparticles in their simulations by adjusting interactions among the qubits. According to Kalinowski, some of these new particles might be employed to replicate novel materials.

Another critical aspect of the research was the use of Google’s quantum computer to examine materials outside equilibrium. Despite the significant exploration of equilibrium states in laboratories, the non-equilibrium realm remains largely uncharted. Pollmann notes that this aligns with laboratory trials where materials are repeatedly subjected to laser pulses. His team’s work reflects how condensed matter physicists study materials by exposing them to extreme temperatures or magnetic fields and then diagnosing changes in their phases. Such diagnostics are crucial for determining the conditions under which materials can be effectively utilized.

It’s important to clarify that these experiments don’t yield immediate real-world applications. To translate these findings into usable technologies, researchers will need to conduct further analysis on larger, less error-prone quantum computers. However, these preliminary studies carve out a niche for quantum computers in exploring physical phenomena, akin to the way traditional experimental tools have been employed for decades.

That material science might be the first field to showcase the value of quantum computing is not surprising. This aligns with how pioneers like Richard Feynman discussed quantum technology in the 1980s, envisioning its potential beyond mere devices. Moreover, this perspective diverges from the usual portrayal of quantum computing as technology primarily focused on outperforming classical computers in non-practical tasks.

“Viewing the advancement of quantum computing as a scientific approach, rather than simply through the lens of individual device performance, is undeniably supported by these experimental findings,” concludes Kalinowski.


Source: www.newscientist.com

Quantum Computers Exhibit Unexpected Randomness—And That’s Beneficial!

Quantum object shuffling is more complex than classic shuffling

Andriy Onofriyenko/Getty Images

Quantum computers are capable of generating randomness far more efficiently than previously anticipated. This remarkable discovery reveals the ongoing complexities at the intersection of quantum physics and computation.

Randomness is essential for numerous computational tasks. For instance, weather simulations require multiple iterations with randomly chosen slightly varied initial conditions. In the realm of quantum computing, researchers have demonstrated quantum advantage by arranging qubits randomly to yield outcomes that classical machines struggle to achieve.

Creating these random configurations effectively entails shuffling qubits and connecting them repeatedly, akin to shuffling a deck of cards. Initially, it was believed that adding more qubits to the system would extend the time required for shuffling, analogous to how larger decks of cards are harder to shuffle. With increased shuffling potentially compromising the delicate quantum states of qubits, the prospect of significant applications relying on randomness was thought to be limited to smaller quantum systems.

Recently, Thomas Schuster from the California Institute of Technology and his team found that generating these random sequences requires fewer shuffles than previously believed.

To illustrate this, Schuster and his colleagues conceptualized dividing the qubit ensemble into smaller segments, thereby mathematically demonstrating that each segment could independently produce a random sequence. They further established that these smaller qubit segments could be “joined” to create a well-shuffled version of the original collection of qubits in a manner that defies expectations.
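This claim can be poked at numerically on a few qubits. The sketch below is illustrative only (the team’s proof covers far more general circuits), but it builds brickwork circuits of random two-qubit gates and tracks the collision probability of the output distribution, which for a fully random (Haar) state sits near 2/(2^n + 1). The circuit gets close at surprisingly shallow depth.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_unitary(dim):
    # Haar-random unitary via QR decomposition of a complex Gaussian matrix
    z = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_2q(state, gate, i, n):
    # Apply a 4x4 gate to adjacent qubits (i, i+1) of an n-qubit state vector
    psi = np.moveaxis(state.reshape([2] * n), (i, i + 1), (0, 1))
    psi = (gate @ psi.reshape(4, -1)).reshape(psi.shape)
    return np.moveaxis(psi, (0, 1), (i, i + 1)).reshape(-1)

n = 10
haar_value = 2 / (2 ** n + 1)  # collision probability of a Haar-random state
for depth in [1, 2, 4, 6, 8]:
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1
    for layer in range(depth):
        for i in range(layer % 2, n - 1, 2):  # alternating brickwork layers
            psi = apply_2q(psi, haar_unitary(4), i, n)
    collision = np.sum(np.abs(psi) ** 4)  # sum over outcomes of p(x)^2
    print(f"depth {depth}: collision {collision:.5f} (Haar: {haar_value:.5f})")
```

Collision probability is only one coarse signature of randomness, but the same pattern of local blocks randomizing quickly and then gluing into global randomness is what the team established rigorously.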

“It’s quite astonishing because it indicates that classical random number generators don’t exhibit anything comparable,” states Schuster. For instance, in the case of card shuffling within a block, the top cards tend to remain near the top. This is not applicable in quantum systems, where quantum shuffles generate a random superposition of all possible arrangements.

“This is a significantly more intricate phenomenon compared to classical shuffling. The order of the top card is not preserved, as can be observed through classical methods where measuring the top card’s position post-shuffle yields a random output each time, devoid of any insights into the shuffling process itself. It’s genuinely a new and fundamentally quantum phenomenon.”

“We anticipated that this sort of random quantum behavior would be exceptionally challenging to achieve. Yet, the authors demonstrated that it can be realized with remarkable efficiency,” remarks Peter Craze from the Max Planck Institute for the Physics of Complex Systems in Germany. “This discovery was quite unexpected.”

“Random quantum circuits hold numerous applications as elements of quantum algorithms and for showcasing what is termed quantum advantage,” notes Ashley Montanaro from the University of Bristol, UK. “The authors have already identified various applications in quantum information and hope that additional applications will emerge.” While researchers can facilitate experiments demonstrating a type of quantum advantage they have previously conducted, Montanaro cautions that this does not imply we are closer to reaping the practical benefits of such advantages.


Source: www.newscientist.com

Unveiling the Quantum Computers That Can Make a Difference

Zhang Bin/China News Service/VCG Getty Images

In the last decade, quantum computing has evolved into a multi-billion dollar sector, attracting investments from major tech firms like IBM and Google, along with the U.S. military.

However, Ignacio Cirac, a trailblazer in this field from Germany’s Max Planck Institute for Quantum Optics, provides a more measured assessment: “Quantum computers are not yet a reality,” he states, because creating a functional and practical version is exceedingly challenging.

This article is part of our special feature delving into how experts perceive some of science’s most intriguing concepts.

These quantum systems utilize qubits to encode data, in contrast to the traditional “bits” of conventional computers. Qubits can be generated through various methods, ranging from small superconducting circuits to ultra-cold atoms, yet each method presents its own complexities in construction.

The primary advantage lies in their ability to leverage quantum attributes for performing certain calculations at a speed unattainable by classical computers.

This acceleration holds promise for various challenges that traditional computers face, such as simulating complex physical systems and optimizing passenger flight schedules or grocery deliveries. Five years ago, quantum computers appeared poised to tackle these and numerous other computational hurdles.

Today, the situation is even more intricate. Certainly, the progress in creating larger quantum computers is remarkable, with numerous companies developing systems exceeding 1000 qubits. However, this progress also highlights the formidable challenges that remain.

A significant issue is that as these computers scale up, they tend to generate increased errors, and developing methods to mitigate or correct them has proven more challenging than anticipated. Last year, Google researchers made notable strides in addressing this problem, but as Cirac emphasizes, a fully functional useful quantum computer remains elusive.

Consequently, the list of viable applications for such machines may be shorter than many previously anticipated. Weighing the costs of construction against the potential savings reveals that, in many scenarios, the economics may not favor them. “The most significant misconception is that quantum computers can expedite all types of problems,” Cirac explains.

So, which issues might still benefit from quantum computing? Quantum computers could potentially compromise the encryption systems currently employed for secure communications, making them appealing to governments and institutions concerned with security, notes Scott Aaronson at the University of Texas at Austin.

Another promising area for quantum computers is in modeling materials and chemical reactions. Because quantum computers operate within a framework of quantum objects, they are ideally suited for simulating other quantum systems, such as electrons, atoms, and molecules.

“These are simplified models that don’t perfectly reflect real materials. However, if you design your system appropriately, there are numerous properties of real materials you can learn about,” adds Daniel Gottesman at the University of Maryland.

While quantum chemical simulations might seem more specialized than flight scheduling, the potential outcomes (such as discovering room-temperature superconductors) could be groundbreaking.

The extent to which these ambitions can be realized heavily relies on the algorithms guiding quantum computations and methods for correcting those pesky errors. This is a complex new domain, as Vedran Dunjko of Leiden University in the Netherlands points out, prompting researchers like himself to confront fundamental questions about information and computation.

“This creates a significant incentive to investigate the complexity of the problem and the potential of computing devices,” Dunjko asserts. “For me, this alone justifies dedicating a substantial portion of my life to these inquiries.”



Source: www.newscientist.com

Trump grants tariff exemptions for smartphones, computers, and other electronic gadgets

Following more than a week of tariffs on imports from China, the Trump administration released regulations late Friday that spared smartphones, computers, semiconductors, and other electronic devices from various fees. This move significantly reduced prices for high-tech companies like Apple and Dell, as well as for consumers purchasing iPhones and other electronic products.

A message issued by US Customs and Border Protection on Friday listed the products excluded from the tariffs on Chinese goods, notably smartphones, computers, semiconductors, and other technology products. However, some other duties still apply to electronic devices and smartphones, and tariffs on semiconductors are expected to increase.

This exemption is a significant relief for tech giants like Apple and Nvidia, who would have faced substantial losses from punitive taxes. Many consumers rushed to purchase iPhones to avoid potential price hikes on electronic devices. These exemptions may help mitigate inflation and uncertainty in the economy.

The tariff relief marks a change in Trump’s trade policies aimed at promoting US manufacturing. Factories producing electronic devices like iPhones and laptops are primarily located in Asia, particularly China. The exemptions apply not only to China but also to other countries.

However, this relief may be short-lived as the Trump administration plans another trade investigation related to semiconductors. This could impact other technology products and result in additional tariffs. The administration aims to protect American semiconductor production, which is essential for various consumer products.

Despite the exemptions, Trump remains committed to domestic manufacturing of these products, signaling a shift towards US production. The policy change aims to secure the supply of American semiconductors, crucial for smartphones, cars, and various other goods.

The recent tariff exemptions signify a partial retreat from Trump’s trade war with China, covering a significant portion of US imports from the country. Other Asian countries stand to benefit as well, with the exemptions reducing tariffs on imports from Taiwan, Malaysia, Vietnam, and Thailand.

Trump’s decision to exempt certain product types followed a volatile week in which he reversed course on several tariffs imposed earlier. That earlier reprieve excluded China, which retaliated with its own tariffs, leading to a steep decline in the stock values of tech companies and notably denting Apple’s market capitalization.

The tech industry views Trump’s moderation as a positive development, as it eases tensions and supports continued investment in the US. Notably, Apple CEO Tim Cook has been actively engaging with the administration to secure exemptions for Apple products and promote US manufacturing.

However, the threat of further tariffs on semiconductors and other electronics looms, with potential implications for the industry. The Trump administration is considering additional duties under legal provisions, which could impact various sectors and imports.

Apple, responding to the recent tariff exemptions, remains committed to its manufacturing facilities in China, citing the greater availability of skilled labor there compared with the US. The company has faced pressure over the years to shift some iPhone manufacturing to the US, but logistical and workforce constraints pose significant hurdles.

The potential implications of Trump’s tariff policies on Apple products raise concerns about price increases and supply chain disruptions. Apple’s strategic decisions regarding manufacturing and pricing will have a significant impact on its operations and market positioning, considering ongoing trade tensions and regulatory changes.

The looming threat of additional tariffs on electronics underscores the uncertainty and volatility in the tech industry. As the US and China navigate trade negotiations and policy shifts, tech companies like Apple face challenging decisions to maintain competitiveness and comply with evolving regulations.

Apple’s stance on tariff exemptions and manufacturing challenges reflects the complex interplay between global trade dynamics and corporate strategies. The company’s extensive supply chain and reliance on Asian manufacturing facilities underscore the broader implications of trade policies on multinational corporations.

As trade tensions continue to escalate, tech companies like Apple must navigate regulatory uncertainties and market pressures. The potential impact of tariffs on product pricing, supply chains, and global competitiveness looms large as companies seek to balance operational efficiency and regulatory compliance.

The ongoing trade negotiations between the US and China, particularly regarding technology products, highlight the delicate balance between economic interests and national security concerns. The implications of tariff policies on semiconductors and electronics underscore the broader geopolitical challenges facing the tech industry.

As companies like Apple navigate shifting trade dynamics, regulatory changes, and market uncertainties, strategic decision-making becomes increasingly complex. The need to adapt to evolving trade policies while maintaining global competitiveness requires innovative solutions and proactive engagement with policymakers.

Source: www.nytimes.com

New US Tariffs: Smartphones and Computers Exempted from China by Trump Administration

Following more than a week of tariffs on Chinese imports, the Trump administration released new rules on Friday that exempted smartphones, computers, semiconductors, and other electronic devices from certain fees. This move significantly lowered prices for high-tech companies like Apple and Dell, as well as benefiting consumers who purchase products like iPhones.

A message was issued by US Customs and Border Protection on Friday, listing the products that had previously been subjected to tariffs on Chinese goods. Certain exclusions were granted for modems, routers, flash drives, and other tech products not commonly manufactured in the US.

The exemption does not completely eliminate tariffs on electronic devices and smartphones. The administration previously imposed a 20% tariff on Chinese goods due to concerns about the country’s involvement in fentanyl trade. Additionally, tariffs on semiconductors, crucial components in electronic devices, are expected to increase.

This exemption marks a significant development in the ongoing trade war with China and is expected to have far-reaching effects on the US economy. Tech giants like Apple and Nvidia will benefit from avoiding heavy taxes that could have impacted their profits. Consumers rushed to purchase iPhones to avoid potential price hikes, relieving concerns about inflation and economic instability.

While the tariff relief provides temporary respite for the tech industry, the Trump administration has indicated plans for further trade investigations, particularly targeting semiconductors. The aim is to secure the US supply chain for vital technologies used in various products, including smartphones and automobiles.

President Trump’s shift in trade policy has implications for various industries, especially as it relates to China. The tech sector, in particular, has closely engaged with the administration to navigate the changing landscape of tariffs and taxes on imports. Apple CEO Tim Cook has been instrumental in lobbying for exemptions and advocating for US manufacturing of tech products.

As the trade tensions continue to evolve, the tech industry remains a focal point in the US-China trade relationship. Consumers may see fluctuations in prices for electronic devices as the two countries negotiate their trade terms.

Source: www.nytimes.com

Light-based computers are nearing their commercial debut

Light-based computer chip, PACE, made by Lightelligence

Lightelligence

Computers that use light rather than electricity to represent and manipulate data can reduce data center power requirements while speeding up calculations. Two studies published today describe breakthroughs in solving real problems on light-based computers, with techniques that are on the verge of commercial application, the researchers say.

The electronic computers we all use today have historically followed Moore’s law, with machine power doubling roughly every two years. In recent years, however, progress has slowed as transistor miniaturization approaches its fundamental physical limits.

Researchers are working on many potential solutions, including quantum and photonic computing. Quantum computing still struggles to achieve true utility, but photonic computing has reached the point where chip designs like those described in the two new studies are performing genuine calculations. In addition, these photonic chips can be manufactured in the same factories that make silicon chips for electronic computers.

Photonic computers offer potential benefits over electronic ones. One is that photons travel through circuits faster than electrons do, allowing quicker calculations with shorter pauses between each step. Another is that photons move without resistance and are rarely absorbed by the material from which the chip is made, allowing the same job to be performed using less energy than on an electronic computer, which requires energy-intensive cooling.

In its study, Lightelligence, a Singapore-based company, shows that a device called a Photonic Arithmetic Computing Engine (PACE), which combines photonic and microelectronic chips, can successfully solve Ising problems, a class of optimization problems with direct applications in logistics and many other areas.
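An Ising problem asks for the spin assignment s_i = ±1 that minimizes an energy E(s) = −½ Σ J_ij s_i s_j, which is why it can stand in for so many scheduling and logistics tasks. For reference, here is a minimal classical simulated-annealing sketch of the same kind of instance; the random couplings and cooling schedule are illustrative, and PACE solves such instances in photonic hardware rather than with a loop like this.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random symmetric couplings J for a 20-spin Ising instance
n = 20
J = rng.standard_normal((n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)

def energy(s):
    # E(s) = -1/2 * sum_ij J_ij s_i s_j, each pair counted once
    return -0.5 * s @ J @ s

s = rng.choice([-1, 1], size=n)
for T in np.geomspace(2.0, 0.01, 20000):  # slowly cooling temperature
    i = rng.integers(n)
    dE = 2 * s[i] * (J[i] @ s)  # energy change from flipping spin i
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] *= -1  # accept the flip
print("final energy:", energy(s))
```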

Meanwhile, US startup Lightmatter claims that its own chip can run the AI model BERT to create text in the style of Shakespeare. New Scientist could not reach Lightmatter for comment.

Bo Peng at Lightelligence says the sector is increasingly busy with startups and the technology is rapidly maturing. “We’re more or less pre-production,” says Peng. “It’s more like a real product than just a lab demonstration.”

Just as the quantum computing world is trying to demonstrate quantum advantage, the point where quantum machines do something useful beyond the reach of classical computers, photonic computing is chasing a similar milestone. Peng won’t predict when this will happen, but says the technology is closer to being ready for commercial applications, perhaps as photonic chips working alongside electronic ones to boost the specific tasks they handle best, rather than replacing them entirely.

Both companies’ hardware is based on the PCI Express format, a standard motherboard add-on interface for desktop computers that allows you to add graphics cards and other devices. Their devices can already be added to a commercial desktop, but require the appropriate software to communicate.

Robert Hadfield at the University of Glasgow in the UK says the two studies show the field is bubbling with activity. “This is close to the point where the industry may consider photonic processors a viable alternative,” he says. “It’s really interesting to see how mature this architecture has become. These are photonic chips manufactured in one of the world’s leading foundries, so they can be scaled up for mass production.”

Stephen Sweeney, also at the University of Glasgow, says optical data transmission has already rolled out around the world, with optical computing approaching too. “With photonics, you can do things at a lower loss than electronics can,” says Sweeney. “And if you need to be able to do a huge amount of calculations, you need to start looking at it.”


Source: www.newscientist.com

Health monitoring technology can be integrated into clothing using thread-based computers

Computer threads woven with metal and textile yarn to make potential clothing

Hamilton Osoi, IFM

An elastic computer on threads sewn onto clothing can be used to record whole-body data that most medical sensors cannot pick up.

Wearable technologies such as smartwatches monitor body signals, such as heart rate and temperature, but usually only from a single location. This gives you an incomplete picture of how your body works.

Now, Yoel Fink at the Massachusetts Institute of Technology and his colleagues have developed a computer that can be sewn into clothing, made from chips connected with copper and elastic fiber threads.

Each thread has 256 kilobytes of onboard memory, around that of a simple calculator, and sensors that can detect temperature, heart rate, and body movement. There is also Bluetooth connectivity to allow the various threads to communicate.

This means that location-specific data can be collected across the whole body. Fink says this could, in theory, be interpreted by artificial intelligence to allow more accurate monitoring of human health. “We’re starting to write apps for fabrics, monitor our health and, frankly, we’re very close to the point where we can do all sorts of things that our phones can’t.”

To create the individual threads, Fink and his team folded the chips into conductive boxes and connected them with copper wire. The wire was then wrapped in a protective plastic casing and drawn into a thin tube that could be covered with fabrics such as cotton or synthetic Kevlar.

To test them, the researchers sewed four fibers into the legs and arms of a set of clothes. They found they could identify various movements the wearer made, such as lunges, squats, and arm circles.
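The article doesn’t detail the recognition pipeline, but identifying movements from a handful of body-mounted sensors is commonly done with simple windowed features and a classifier. Below is a hypothetical, minimal sketch of that idea on made-up synthetic data; the feature set, sensor count, and nearest-centroid classifier are illustrative assumptions, not the team’s published method.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(window):
    # window: (samples, threads) array of acceleration magnitudes;
    # summarize each sensor thread by its mean and standard deviation
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Synthetic stand-in data: three movements, 20 training windows each,
# 50 samples from 4 sensor threads per window
movements = ["lunge", "squat", "arm circle"]
train = {m: [features(rng.normal(i, 1 + 0.2 * i, (50, 4))) for _ in range(20)]
         for i, m in enumerate(movements)}
centroids = {m: np.mean(feats, axis=0) for m, feats in train.items()}

def classify(window):
    # Nearest-centroid classification in feature space
    f = features(window)
    return min(centroids, key=lambda m: np.linalg.norm(f - centroids[m]))

print(classify(rng.normal(2, 1.4, (50, 4))))  # expected: "arm circle"
```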

The team is currently testing thread-computer-made clothing on an Arctic expedition as part of Operation Nanook, an annual military exercise led by the Canadian Army. Clothes record temperature and data from various parts of the body. Fink says it could one day help protect people in extreme circumstances.

Threads are being tested by Army personnel during training

US Army Cold Area Research & Engineering Lab

As well as recording data, the technology could help detect dangerous falls among vulnerable people, says Theo Hughes-Riley at Nottingham Trent University, UK.

Because the sensors don’t need to be wired together, the design is much simpler than that of other electronic fabrics, he says. The researchers also demonstrated that the threads can be washed, although only with water rather than detergent, so their durability in everyday use must still be proven before they are widely adopted, says Hughes-Riley.


Source: www.newscientist.com

Windows computers worldwide suffer massive outage due to Blue Screen of Death

If you see a blue screen, it’s bad news

Alex Photostock/Alamy

A large number of Microsoft Windows computers around the world today failed to boot, instead displaying the so-called “Blue Screen of Death” (BSOD). Among those reportedly affected was the UK’s Sky News, which stopped live broadcasts just before 6am local time, while a number of airline and banking services also suffered outages.

What’s happening on my Windows computer?

Some users have reported that their Windows devices are refusing to boot up, while others have witnessed their computers suddenly display a BSOD while in use.

Eddie Major of the University of Adelaide in Australia…

Source: www.newscientist.com

Multiple nations implement baffling export restrictions on quantum computers

Exports of quantum computers are restricted in many countries

Saigh Anys/Shutterstock

As a result of secret international negotiations, governments around the world have imposed identical export controls on quantum computers while refusing to disclose the scientific rationale behind the controls. Although quantum computers could theoretically threaten national security by breaking encryption technology, even the most advanced quantum computers currently publicly available are too small and error-prone to achieve this, making the bans seem pointless.

The UK now restricts exports of quantum computers with more than 34 quantum bits (qubits) and error rates below a certain threshold. The intention seems to be to limit machines above a certain capability, but the UK government has not stated this explicitly, and a Freedom of Information request from New Scientist seeking the basis for these figures was declined on national security grounds.

France has imposed identical export controls, with the same qubit numbers and error-rate limits, as have Spain and the Netherlands. Identical limits across European countries might suggest an EU-wide regulation, but this is not the case. A spokesperson for the European Commission told New Scientist that EU member states are free to adopt national, rather than bloc-wide, measures when it comes to export controls. “The recent quantum computer restrictions by Spain and France are an example of such national measures,” they said. They declined to explain why the figures in the various countries’ export bans are completely consistent if these decisions were taken independently.

A spokesperson for the French embassy in London told New Scientist that the limits were set at a level “likely to indicate a cyber risk”. They noted that the regulations are the same in France, the UK, the Netherlands and Spain because of “multilateral negotiations that took place over several years under the Wassenaar Arrangement”.

“The limits chosen are based on scientific analysis of the performance of quantum computers,” the spokesperson told New Scientist. But when asked who carried out the analysis and whether its findings would be made public, they declined to comment further.

The Wassenaar Arrangement, which is followed by 42 participating countries including EU member states, the UK, the US, Canada, Russia, Australia, New Zealand and Switzerland, controls the export of items with potential military applications, known as dual-use technologies. Its members’ export bans on quantum computers all include similar language regarding 34 qubits.

New Scientist wrote to dozens of Wassenaar member states asking whether research showing that quantum computers at this level pose a risk exists, whether it has been made public and who conducted it. Only a few countries responded.

“We closely monitor other countries as they introduce national restrictions on certain technologies,” said a spokesperson for the Swiss Federal Department of Economic Affairs, Education and Research, “but in specific cases it is already possible to block the export of such technologies using existing mechanisms.”

“We are closely following the Wassenaar discussions on the exact technical control parameters for quantum,” says Milan Godin, a Belgian adviser to the EU Working Party on Dual-Use Goods. China does not appear to have implemented its own export controls yet, but Godin said quantum computers are a dual-use technology: they could potentially crack commercial or government codes, and their speed could ultimately allow militaries to plan faster and better, including for nuclear missile attacks.

A spokesperson for Germany’s Federal Office for Economics and Export Control confirmed that the export restrictions on quantum computers are the result of negotiations under the Wassenaar Arrangement, although Germany itself does not appear to have implemented any restrictions. “The negotiations are confidential and unfortunately we cannot provide any details or information about the considerations of the restrictions,” the spokesperson said.

Christopher Monroe, co-founder of quantum computing company IonQ, says people in the industry have been aware of these bans and have discussed their criteria, but he doesn’t know where the numbers come from.

“I don’t know who decided the logic behind these numbers,” he says, but it may have something to do with the threshold at which a quantum computer can no longer be simulated with an ordinary computer. Simulation gets exponentially harder as the number of qubits increases, so Monroe thinks the rationale behind the bans may be to restrict quantum computers that are too advanced to simulate, even though such devices have no practical use yet.
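
A back-of-envelope calculation (ours, not an official rationale) shows why a figure like 34 qubits could mark that simulation threshold: simulating n qubits exactly means storing 2^n complex amplitudes, so the memory required doubles with every added qubit.

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold the full state vector of an n-qubit machine.

    Each of the 2**n complex amplitudes takes 16 bytes at double precision.
    """
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 34, 40, 50):
    print(f"{n} qubits -> {statevector_bytes(n) / 2**30:,.0f} GiB")

# 30 qubits ->         16 GiB  (a laptop)
# 34 qubits ->        256 GiB  (a large server)
# 40 qubits ->     16,384 GiB  (a supercomputing cluster)
# 50 qubits -> 16,777,216 GiB  (beyond any machine on Earth)

On this reading, 34 qubits sits roughly where a single large server stops being able to hold the state vector, which fits Monroe’s suggestion that the limits track what classical machines can still simulate.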

“It would be a mistake to assume a quantum computer is useful simply because we can’t simulate it, and severely restricting research into advances in this grey area would certainly stifle innovation,” he says.


Source: www.newscientist.com

The mysterious glow of Venus evades detection by computers, but not by the human eye

“Ashen light”, or AL, is a faint, mysterious glow seen on the night hemisphere of Venus. It is often compared to earthshine, the reflected light from Earth that faintly illuminates the dark side of the Moon.

First described by the Italian astronomer Giovanni Riccioli in 1643, AL has been observed many times since, but its faint, ephemeral and elusive nature has prevented serious study.

Even more problematic, AL has so far been detected only by the human eye; no scientific instrument, whether ground-based or space-based, has recorded the phenomenon.

Some authorities have declared the phenomenon an illusion, perhaps a contrast effect in the eye or even “expectation bias”. Others have suggested that defects in equipment could explain it: light scattering, optical aberrations, background sky brightness or the weather.

But there are enough reliable reports of AL that scientists have offered several explanations, including light reflected from Earth, auroras, “airglow” emission, lightning and infrared (thermal) radiation from Venus’s atmosphere.

Most of these explanations can be ruled out for one reason or another. However, there is ample evidence that both ultraviolet light from the sun and high-energy solar wind particles can excite oxygen atoms in Venus’s atmosphere.

This creates a pale green glow similar to that seen in the aurora borealis on Earth. However, the process is somewhat different because auroras on Earth are caused by Earth’s magnetic field interacting with solar particles, whereas Venus has no appreciable magnetic field.

It remains to be seen whether this mechanism can account for all, or only some, of the AL observations, so the long-standing mystery may yet turn out to be an illusion.

This article is an answer to the question (asked by Herman Townsend of Liverpool): “What is Ashen Light?”


Source: www.sciencefocus.com

Google and XPRIZE collaboratively introduce $5 million reward to identify practical uses for quantum computers

Can quantum computers help?

Eric Lucero/Google

Google and XPRIZE are launching a $5 million competition to find practical uses for quantum computers that would actually benefit society. Quantum computers have been known to perform certain tasks faster than classical computers ever since Google first claimed quantum supremacy for its Sycamore processor in 2019, but those demonstration tasks were simple benchmarks with no real-world applications.

“There are a lot of fairly abstract mathematical problems for which quantum computers can provably provide very significant speedups,” says Ryan Babbush at Google. “However, much of the research community has been less focused on adapting those more abstract quantum speedups to concrete real-world applications, or on trying to figure out how quantum computers can be used.”

To this end, Google and the XPRIZE Foundation are inviting researchers to come up with new quantum algorithms as part of a three-year competition. A winning algorithm might address an existing problem, such as finding a new battery electrolyte that significantly increases storage capacity, but it doesn’t have to actually solve it, Babbush says. Instead, researchers only need to demonstrate how the algorithm would be applied and detail the exact specification of the quantum computer required. Alternatively, competitors can show how existing quantum algorithms could be applied to real-world problems that haven’t been considered before.

Entries will be judged on a variety of criteria, including how big an impact the algorithm could have, whether it tackles problems similar to those outlined in the United Nations’ Sustainable Development Goals, and how feasibly it could run on machines available now or in the near future.

The $5 million prize pool consists of a $3 million grand prize split between up to three winners, $1 million shared between five runners-up and $50,000 each for the 20 semi-finalists.

The prize could help shift the focus of quantum computing researchers from technical definitions of quantum advantage, such as those demonstrated by Google and IBM, towards real-world applications, says Nicolás Quesada at the Polytechnic University of Montreal, Canada. “[The prize is] a clear realization that this is a very important issue,” Quesada says. “We need to think about what we’re going to do with quantum computers.”

But finding socially beneficial quantum algorithms requires a deeper understanding of how quantum computers work, including how they deal with noise and errors, says Bill Fefferman at the University of Chicago. The prize does not address this fundamental aspect of building quantum computers, he says.

“I’m generally very optimistic that we’ll find an algorithm that’s really useful,” Fefferman says. “I’m not as optimistic that within the next three years we’ll be able to discover those algorithms and implement them on the hardware that will exist in that time.”


Source: www.newscientist.com

Years of Study and a Grand Vision to Merge Computers and Brains

Elon Musk’s announcement on Monday caught the attention of a small community of scientists who work with the body’s nervous system to treat disorders and conditions.

Robert Gaunt, an associate professor at the University of Pittsburgh’s School of Physical Medicine and Rehabilitation, said, “Inserting a device into a human body is not an easy task. But without neuroscience research and decades of demonstrated capabilities, I don’t think even Elon Musk would have taken on a project like this.”

Musk tweeted: “The first human received an implant from @Neuralink yesterday and is recovering well. Initial results show promising neuron spike detection.” However, many scientists are cautious about the company’s clinical trials, noting that little information has been made public.

Neuralink won FDA approval to conduct its first human clinical study last year, and the company is developing brain implants that allow people, including severely paralyzed patients, to control computers with their thoughts.

Although it’s too early to know if Neuralink’s implants will work in humans, Gaunt said the company’s announcement is an “exciting development.” His own research focuses on restoring motor control and function using brain-computer interfaces.

In 2004, a small device known as the Utah array was implanted in a human for the first time, allowing a paralyzed man to control a computer cursor with nerve impulses, according to the University of Utah. Since then, scientists have demonstrated how brain-computer interfaces can help people control robots, stimulate muscles, and decode handwriting and speech.

Musk said the clinical trials will aim to treat people with paralysis and paraplegia. However, many scientists believe enhancing human performance through brain-controlled devices is far in the future and not very realistic.

Still, Neuralink’s clinical trials represent a major advance for the fields of neuroscience and bioengineering. Funding for basic science research is key to enabling private companies to advance commercially viable products, says Gaunt.

Source: www.nbcnews.com

Faster computers on the horizon with first commercially available graphene semiconductor

The team's graphene device grown on a silicon carbide substrate chip

Georgia Tech

A functioning, scalable semiconductor has been created from graphene for the first time, potentially paving the way for new types of computers that are faster and more efficient than today's silicon chips.

Graphene is a material made from a single layer of carbon atoms that is stronger than an equivalent thickness of steel. It is an excellent conductor of electricity and has excellent resistance to heat and acids. But despite its benefits, practical graphene semiconductors that can be controlled to conduct or insulate electricity at will have eluded scientists. Such semiconductors are key to creating the logic chips that power computers.

The problem is that graphene naturally lacks something known as a band gap. In a semiconductor, electrons occupy a lower energy band and must be excited across an energy gap, the band gap, into a higher band before they can conduct. This lets the flow of current be switched on and off, making the material conducting or insulating, which underpins the binary zeros and ones used in digital computers.
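
As a simplified illustration of why the gap matters (our own sketch, not taken from the study), the chance of an electron being thermally kicked across a band gap falls off exponentially with the size of the gap, which is what keeps a semiconductor’s “off” state genuinely off:

import math

K_B = 8.617e-5  # Boltzmann constant in electronvolts per kelvin

def thermal_excitation_factor(band_gap_ev: float, temp_k: float = 300.0) -> float:
    """Boltzmann factor: rough relative odds of crossing the gap thermally."""
    return math.exp(-band_gap_ev / (K_B * temp_k))

print(thermal_excitation_factor(1.1))  # silicon's ~1.1 eV gap: ~3e-19, a crisp "off"
print(thermal_excitation_factor(0.0))  # gapless graphene: 1.0, always conducting

Real carrier statistics are more involved, but the exponential dependence is the key point: with no gap at all, there is no way to switch the current off.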

Previous research has shown that graphene can be made to behave like a semiconductor at small scales, and that wrinkles, domes and holes in graphene sheets can have unusual effects on the flow of electricity, suggesting that engineering the right kinds of defect could lead to logic chips. But so far, none of this has been scaled up to sizes usable in computer chips.

Now, Walter de Heer and his colleagues at the Georgia Institute of Technology in Atlanta have created graphene with a band gap and demonstrated its operation as a transistor, an on/off switch that allows or prevents the flow of current. Their process relies on technology similar to that used to make silicon chips, which should make it easier to scale up.

De Heer’s group heated silicon carbide wafers so that the silicon evaporated before the carbon, effectively leaving a layer of graphene on top. De Heer was not available for interview at the time of writing, but said in a statement that the electrical properties of the graphene semiconductor were much better than those of silicon. “It’s like driving on a gravel road versus driving on a highway,” he said.

Silicon chips are cheap to manufacture and supported by a huge manufacturing infrastructure around the world, but we are reaching the limits of what these chips can do. Moore’s law states that the number of transistors in a circuit doubles approximately every two years, yet the rate of miniaturization has slowed in recent years as circuits reach densities at which engineers can no longer reliably control the electrons. Graphene circuits have the potential to reignite that progress, but hurdles remain.

“The fact that we’re using wafers is important because it’s really scalable,” says David Carey at the University of Surrey, UK. “We can scale up this process using all the technologies that the entire semiconductor industry is familiar with.”

But Carey is skeptical that this development means the world will soon move from silicon to graphene chips: the new approach still needs many improvements in transistor size, quality and manufacturing technology, and silicon has a huge head start.

“Most people who work in silicon research are exposed every day to amazing new materials that are trying to replace silicon, and nothing has ever quite managed it,” he says. “If you’re a silicon enthusiast, you’re sitting pretty happily on top of the mountain. The idea of replacing your laptop with graphene isn’t quite there yet.”


Source: www.newscientist.com

Caltech Researchers Introduce Novel Error-Correction Technique for Quantum Computers

Researchers at the California Institute of Technology have developed a quantum erasure device to correct “erasure” errors in quantum computing systems. The technique allows fluorescent error detection and correction by manipulating alkaline earth neutral atoms with laser light “tweezers.” This innovation leads to a 10-fold increase in the entanglement rate of Rydberg neutral atomic systems, and is an important step forward in making quantum computers more reliable and scalable.

For the first time, researchers have successfully demonstrated the identification and removal of “erasure” errors.

Future quantum computers are expected to revolutionize problem-solving in a variety of fields, including creating sustainable materials, developing new drugs, and solving complex problems in fundamental physics. However, these pioneering quantum systems are more error-prone than the classical computers we use today. Wouldn’t it be great if researchers could whip out a special quantum eraser and remove mistakes?

Reporting in the journal Nature, a group led by researchers at the California Institute of Technology has demonstrated a type of quantum eraser for the first time: the physicists showed that mistakes in a quantum computing system, known as “erasure” errors, can be pinpointed and corrected.

“Typically, it’s very difficult to detect errors in quantum computers, because just the act of looking for errors creates more errors,” says Adam Shaw, co-lead author of the new study and a graduate student in the laboratory of Manuel Endres, a professor of physics at Caltech. “However, we found that with careful control, certain errors can be precisely identified and erased without significant impact. This is where the name erasure comes from.”

How quantum computing works

Quantum computers are based on the physical laws that govern the subatomic realm, such as entanglement, a phenomenon in which particles remain linked and mirror each other’s states even without direct contact. In the new study, the researchers focused on a type of quantum computing platform that uses arrays of neutral atoms, that is, atoms carrying no electric charge. Specifically, they manipulated individual alkaline-earth neutral atoms trapped inside “tweezers” made of laser light. The atoms were excited to a high-energy “Rydberg” state, in which neighboring atoms begin to interact.

Errors are typically difficult to spot in quantum devices, but researchers have shown that if carefully controlled, some errors can cause atoms to emit light. The researchers used this ability to perform quantum simulations using atomic arrays and laser beams, as shown in this artist’s concept. Experiments show that quantum simulations can be run more efficiently by discarding erroneous atoms that are glowing. Credit: Caltech/Lance Hayashida

“The atoms in our quantum system interact with each other and generate entanglement,” explains Pascal Scholl, the study’s other co-lead author, a former postdoctoral fellow at Caltech who now works at the French quantum computing company PASQAL.

Entanglement is what allows quantum computers to outperform classical computers. “But nature doesn’t like to stay in this entangled state,” Scholl explains. “Eventually an error will occur and the entire quantum state will be destroyed. You can think of these entangled states like a basket full of apples, where the atoms are the apples. Over time, some apples will start to rot. If you don’t remove these apples from the basket and replace them with fresh apples, all the apples will quickly rot. It’s not clear how to completely prevent these errors from occurring. Therefore, the only viable option at this time is to detect and remediate them.”

Innovation in error detection and correction

The new error-trapping system is designed so that atoms with errors fluoresce, or glow, when hit by a laser. “We have images of glowing atoms that show us where the errors are, so we can either exclude them from the final statistics or actively correct them by applying additional laser pulses,” says Scholl.
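
A toy Monte Carlo sketch (our own illustration with made-up rates, not the experiment’s numbers) shows why discarding the flagged atoms improves the statistics that remain:

import random

def run_pairs(n_pairs: int, error_rate: float = 0.01,
              detection_rate: float = 0.5) -> tuple[float, float]:
    """Each pair errs with probability error_rate; a detected ("glowing")
    erroneous pair can be excluded from the final statistics."""
    kept = kept_good = raw_good = 0
    for _ in range(n_pairs):
        errored = random.random() < error_rate
        detected = errored and random.random() < detection_rate
        raw_good += not errored
        if not detected:  # only undetected pairs enter the statistics
            kept += 1
            kept_good += not errored
    return raw_good / n_pairs, kept_good / kept

random.seed(1)
raw, postselected = run_pairs(100_000)
print(f"raw fidelity: {raw:.4f}; after discarding glowing atoms: {postselected:.4f}")

Even with only half the errors flagged, the postselected fidelity beats the raw one, and the better the detection, the closer the kept statistics get to error-free.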

The theory for implementing erasure detection in neutral-atom systems was first developed by Jeff Thompson, a professor of electrical and computer engineering at Princeton University, and his colleagues. Thompson’s team recently reported its own demonstration of the technique in the journal Nature.

The Caltech team says that by identifying and removing errors in the Rydberg atom system, the overall rate of entanglement, and therefore its fidelity, can be improved. In the new study, the researchers report that only one out of every 1,000 pairs of atoms failed to entangle, a 10-fold improvement over what was previously achieved and the highest entanglement rate ever observed in this type of system.
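
To put those figures in context (our own back-of-envelope arithmetic, not numbers from the paper): one failure per 1,000 pairs corresponds to an entanglement fidelity of 1 - 1/1000 = 99.9 percent, and a 10-fold improvement implies the previous best was roughly one failure per 100 pairs, or about 99 percent fidelity.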

Ultimately, these results bode well for quantum computing platforms based on Rydberg neutral-atom arrays. “Neutral atoms are the most scalable type of quantum computer, but until now they haven’t had high-fidelity entanglement,” Shaw says.

Reference: “Erasure conversion in a high-fidelity Rydberg quantum simulator” by Pascal Scholl, Adam L. Shaw, Richard Bing-Shiun Tsai, Ran Finkelstein, Joonhee Choi and Manuel Endres, 11 October 2023, Nature.
DOI: 10.1038/s41586-023-06516-4

The research was funded by the National Science Foundation (NSF) through the Institute for Quantum Information and Matter (IQIM) based at Caltech, the Defense Advanced Research Projects Agency, an NSF CAREER award, the Air Force Office of Scientific Research, the NSF Quantum Leap Challenge Institutes, the Department of Energy’s Quantum Systems Accelerator, a Taiwan–Caltech Fellowship and a Troesch Postdoctoral Fellowship. Other Caltech-affiliated authors include graduate student Richard Bing-Shiun Tsai; Ran Finkelstein, a Troesch Postdoctoral Scholar in physics; and former postdoc Joonhee Choi, now a professor at Stanford University.

Source: scitechdaily.com