Beyond Quantum, Antony Valentini, Oxford University Press
Physics is experiencing unexpected challenges. Despite extensive research, the elusive dark matter remains undetected, while the Higgs boson’s discovery hasn’t clarified our path forward. Moreover, string theory, often hailed as the ultimate theory of everything, lacks solid, testable predictions. This leaves us pondering: what’s next?
Recently, many physicists and science writers have shied away from addressing this question. While they used to eagerly anticipate groundbreaking discoveries, they now often revert to philosophical musings or reiterate known facts. However, Antony Valentini from Imperial College London stands out. In his book, Beyond Quantum: Exploring the Origins and Hidden Meanings of Quantum Mechanics, he introduces bold, innovative ideas.
The book’s focus is quantum mechanics, a pillar of physics for the last century. This field hinges on the concept of the wave function—a mathematical representation capable of detailing the complete state of any system, from fundamental particles to larger entities like us.
The enigma of wave functions is that they tend not to describe ordinary localized objects but rather diffuse, fuzzy versions of them. Upon observation, the wave function “collapses” into a random outcome, with probabilities given by the Born rule, a principle formulated by physicist Max Born. The result is objects manifesting with clear attributes in specific locations.
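For the computationally minded, the Born rule is easy to state concretely: each measurement outcome's probability is the squared magnitude of its wave-function amplitude. Here is a minimal sketch in Python, using made-up amplitudes for a hypothetical two-outcome system (the numbers are illustrative, not from the book):

```python
import math

# Hypothetical two-outcome quantum state. Amplitudes are complex numbers;
# these particular values are invented for illustration.
amplitudes = [complex(math.sqrt(0.98), 0), complex(0, math.sqrt(0.02))]

# Born rule: probability of each outcome = |amplitude|^2
probabilities = [abs(a) ** 2 for a in amplitudes]

# A valid quantum state is normalised: probabilities sum to 1.
assert math.isclose(sum(probabilities), 1.0)
```

Note that the phases of the amplitudes (the second one here is imaginary) do not affect the probabilities, yet they matter physically: they are what produce interference.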
The debate surrounding the interpretation of the wave function has persisted, with two primary perspectives emerging. One posits that wave functions represent reality itself, suggesting that electrons, cats, and humans exist in multiple states simultaneously across time and space—a many-worlds interpretation fraught with metaphysical implications.
“Pilot wave theory has long been known to reproduce all the predictions of quantum mechanics”
The alternative interpretation suggests that wave functions are not the entirety of reality. This is where pilot wave theory, significantly advanced by Valentini and initially proposed by Louis de Broglie in 1927, comes into play.
Louis de Broglie: Pioneer of Pilot Wave Theory
Granger – Historical Photo Archive/Alamy
Pilot wave theory posits that the wave function is real yet incomplete: the wave guides individual particles, much as a water wave carries along a floating plastic bottle. In this model, particles remain definite objects, and their wave-like behavior originates from the pilot wave itself.
This theory has long been known to reproduce all the predictions of quantum mechanics while dispensing with fundamental randomness. However, Valentini underscores that this agreement rests on the assumption that the particles are in equilibrium with their waves, an assumption consistent with current experimental data but not guaranteed to hold universally.
Valentini’s hypothesis suggests that in the universe’s infancy, particles existed far from quantum equilibrium before settling into their current states, akin to a cup of coffee cooling down. In this scenario, the Born rule and its inherent randomness morph from core natural features into historical anomalies shaped by cosmology.
Moreover, quantum randomness also blocks any practical use of nonlocality, the direct influence of one object on another across space. Valentini argues that if the Born rule did not hold in the universe’s earliest stages, instantaneous communication across vast distances may have been possible, potentially leaving traces in the cosmic microwave background. And if any relics from that era survive, superluminal signal transmission might still be feasible.
Though Valentini’s insights might appear speculative without concrete evidence, his rigorous examination of how conventional quantum mechanics became dominant makes his work noteworthy. While there could be gaps, especially in clearly explaining the pilot wave aspect, Valentini’s contributions illuminate what a ‘big idea’ looks like in a field rife with uncertainty.
Adolf Hitler’s death is recorded as April 30, 1945. At least, that’s the official narrative. Some historians have contested it, suggesting he escaped war-torn Berlin and lived on in secrecy. Today this alternative theory is largely viewed as a conspiracy, yet no rational historian can deny that, whatever the available evidence, there is a “fact of the matter”: Hitler was either dead that day or he was not. It is nonsensical to suggest that he was both alive and dead on April 30, 1945. But replace Adolf Hitler with Schrödinger’s renowned cat, and the historical “facts” become quite muddled.
Schrödinger is recognized as a foundational figure in quantum mechanics, the most successful scientific framework to date. It serves as the backbone of many fields, including chemistry, particle physics, materials science, molecular biology, and astronomy, and has yielded remarkable technological advancements, from lasers to smartphones. Yet, despite these successes, quantum mechanics appears deeply perplexing at its core.
In our daily lives, we operate under the assumption that an “external” real world exists where objects like tables and chairs possess clearly defined traits, such as position and orientation, independent of observation. In the macroscopic realm, our observations merely uncover a pre-existing reality. Conversely, quantum mechanics governs the microscopic domain of atoms and subatomic particles, where certainty and clarity dissolve into ambiguity.
Quantum uncertainty implies that the future is not entirely dictated by the present. For example, if an electron is directed toward a thin barrier with a known speed, it can either bounce back or tunnel through, emerging on the opposite side. Similarly, if an atom becomes excited, it might remain excited or decay and emit a photon a few microseconds later. In both scenarios, predicting outcomes with certainty is impossible—only probabilistic estimates can be offered.
Most individuals are comfortable with the idea that the future holds uncertainties. However, quantum indeterminacy applies to the past as well, at least so long as no measurement has taken place. When scrutinized at a minute scale, history dissolves into a blend of alternative possibilities, a state known as superposition.
The hazy picture of the quantum microcosm sharpens during measurements. Localizing an electron, for instance, may reveal it at a specific spot; however, quantum mechanics insists this does not mean the electron was already there, with the observation merely disclosing a pre-existing location. Rather, measurement transforms the electron from a state without a defined position into one with a defined position.
So, how should we conceptualize electrons prior to observation? Picture a multitude of semi-real “ghost electrons” dispersed through space, each representing a distinct possibility, with reality dwelling in an indeterminate state. This notion is sometimes expressed by saying that an electron occupies multiple locations simultaneously. Measurement then converts one of these “ghosts” into tangible reality while eliminating its counterparts.
Does the experimenter control the outcome? No: which ghost prevails is down to pure chance. Yet a layer of choice is present, and it is vital for grasping quantum reality. If, instead of measuring position, the experimenter decides to assess the electron’s speed, the fuzzy initial state resolves into a definite result of a different kind: the measurement yields not an electron with a location but an electron with a velocity. Interestingly, electrons with definite speed exhibit wave-like properties, distinct from their particle nature. Thus, electrons embody both wave and particle characteristics, contingent on the measurement approach.
In summary: whether electrons behave as waves or particles is dictated by the type of measurement the experimenter chooses. While this may seem bizarre, the situation grows even stranger: what happened to the electrons before the measurement depends on the experimenter’s later selections. In essence, one’s choices appear to reach back and shape the “external” world as it was prior to measurement.
Is this time travel? Retroactive causality? Telepathy? These terms are often overused in popular quantum physics discussions, but the clearest explanation comes from John Wheeler, who coined the term black hole: “The past exists solely as recorded in the present,” he asserted.
While Wheeler’s assertion is thought-provoking, is there an actual experiment that validates it? Over breakfast at the Hilton Hotel in Baltimore in 1980, Wheeler put a curious question to me: “How do you hold on to the ghost of a photon?” Recognizing my bewilderment, he proceeded to elaborate on a unique twist he had devised for a classic quantum experiment, one applicable to light, electrons, or even entire atoms.
This experiment traces back to the British polymath Thomas Young, who in 1801 set out to demonstrate the wave properties of light. Young set up a screen with two closely spaced slits and illuminated it with a pinprick of light. What transpired? Instead of the anticipated two blurred bands of light, Young observed a series of bright and dark stripes known as interference fringes. The waves spreading from each slit overlap: where they reinforce, constructive interference creates brighter sections; where they cancel, destructive interference leaves dark patches.
Light passing through two slits in a screen during a double-slit experiment
Russell Kightley/Science Photo Library
The conversation surrounding quantum mechanics began with scientists debating whether light consists of waves or of particles called photons. The resolution is that it is both. Thanks to modern advancements, we can conduct Young’s experiment one photon at a time. Each photon produces a minuscule dot on the second screen, and over time the dots accumulate, forming the characteristic striped pattern discovered by Young. This raises a question: if a photon is a minuscule particle, it should clearly pass through one slit or the other. Yet both slits are necessary to create the interference pattern.
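The photon-by-photon build-up of fringes can be mimicked numerically. The sketch below is a rough Monte Carlo illustration with invented parameters (green light, slits 20 micrometres apart, screen 1 metre away; none of these figures come from the article): each simulated “photon” lands at a random screen position, accepted with probability proportional to the two-slit wave intensity, so the dots accumulate into fringes even though each arrives alone.

```python
import math
import random

# Hypothetical setup, chosen only for illustration.
wavelength = 500e-9   # metres (green light)
slit_sep = 20e-6      # metres (slit separation)
screen_dist = 1.0     # metres (slits to screen)

def intensity(x):
    """Two-slit fringe intensity at screen position x (small-angle approximation)."""
    phase = math.pi * slit_sep * x / (wavelength * screen_dist)
    return math.cos(phase) ** 2

# Accumulate photons one at a time via rejection sampling: each dot lands
# at random, with probability given by the wave intensity (the Born rule).
random.seed(0)
hits = []
while len(hits) < 5000:
    x = random.uniform(-0.1, 0.1)       # candidate position on the screen
    if random.random() < intensity(x):  # accept in proportion to intensity
        hits.append(x)

# Bright fringes sit where intensity(x) = 1, spaced wavelength * L / d apart.
fringe_spacing = wavelength * screen_dist / slit_sep
print(f"fringe spacing: {fringe_spacing * 100:.1f} cm")
```

Histogramming `hits` would show the bright and dark stripes; blocking one slit (replacing `intensity` with a constant) makes them vanish, just as in the real experiment.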
What occurs if an astute experimenter wants to determine the slit a particular photon travels through? A detector can be placed near a slit to achieve this. Once that occurs, the interference pattern vanishes. The act of detecting effectively causes the photons to assume a particle-like behavior, obscuring their wave characteristics. The same principle applies to electrons; one can either pinpoint which slit the electrons traverse, resulting in the absence of interference stripes, or obscure their pathways and observe stripes manifest after numerous electrons have produced the pattern. Thus, experimenters can dictate whether photons, or electrons for that matter, act like waves or particles when they hit the detection screen.
Now, let’s discuss Wheeler’s twist. The decision to observe or not need not be made in advance: it can be deferred until after the photons (or electrons) have passed through the slits but before they reach the imaging screen. In effect, the experimenter can choose to glance back in time to see which slit each photon came through. Known as a delayed choice experiment, this setup has been executed and yields the anticipated outcomes: when the experimenter decides to observe, the photons fail to coalesce into a striped pattern. The essence of the phenomenon is that what the light was, a wave traversing both slits or a particle going through one, is contingent on the later choice of the experimenter. For clarity, in real studies the “choices” are automated and randomized to prevent bias, occurring faster than human response times.
In delayed choice experiments, the past is not changed. Rather, in the absence of measurement, multiple pasts coexist, intertwining distinct realities, and your choice of measurement narrows this history down. A unique past may remain elusive, but the number of possibilities can be reduced. Thus, this experiment is frequently referred to as the quantum eraser experiment.
Although the delay in actual experiments is mere nanoseconds, in principle it could stretch back to the dawn of the universe. This is what lay behind Wheeler’s intriguing question about holding on to the ghost of a photon. He envisaged a distant cosmic light source gravitationally lensed from our view by an intervening black hole, with two light paths bending around opposite sides of the black hole before converging on Earth: a two-slit experiment on a cosmic scale, where a photon’s ghost may arrive via one path while another, possibly longer, route carries a second ghost. To execute such a cosmic interference experiment, the first ghost must, as in Young’s original experiment, be preserved, or “held”, so that the waves can overlap simultaneously, awaiting the arrival of the second ghost before they merge.
Einstein claimed that past, present and future are mere illusions. If so, he erred only in speaking of “the” past. While the past as recorded in today’s history reads as a unique narrative on the macroscopic level, it comprises myriad interwoven “ghost pasts”; at the quantum level it is a mosaic of blurred partial realities that defies everyday intuition.
Paul Davies is a theoretical physicist, cosmologist, astrobiologist, and bestselling author. His book, Quantum 2.0, will be published by Penguin in November 2025.
John Stewart Bell developed a method to measure the unique correlations permitted in the quantum world
CERN
While some perceive a poltergeist in the attic and others spot a ghost on dark nights, the specter that haunts me is the physicist John Stewart Bell. His groundbreaking work and enduring legacy have intrigued me for years.
Consider this: how much of our reality can we claim to experience objectively? I ponder this frequently, especially when discussing the intricate nature of space, time, and quantum mechanics. Bell was deeply reflective about such matters, and his contributions have forever altered our comprehension of these concepts.
Born in Belfast in 1928, Bell was, by all accounts, a curious and cheerful child. He gravitated towards physics early and undertook his first role as a lab engineer at just 16. With training in both theoretical and experimental physics, he built a significant part of his career around particle accelerators. Yet, it was the inconsistencies he perceived within quantum theory that occupied his thoughts during late nights.
Today, quantum foundations is a well-established branch of physics, featured prominently in New Scientist. But physics has not always welcomed those who probe the borderlands of physics, mathematics and philosophy. In Bell’s time, scientists were still grappling with the legacy of quantum theory’s pioneers, including the heated debates between Niels Bohr and Albert Einstein.
My interest in Bell’s work began as a casual pursuit, though I have since devoted many hours to it. In 1963, he took a sabbatical with his physicist wife, using the time to craft a pair of original papers. Though initially published to little attention, their significance cannot be overstated.
Bell transformed philosophical inquiries into testable experiments, particularly concentrating on the notion of “hidden variables” in quantum mechanics.
Quantum mechanics inherently resists certainty and determinism, as elucidated by Bohr and his contemporaries in the early 20th century. Notably, definitive statements about quantum entities remain elusive until we engage with them. Predictive ability exists only in probabilistic terms—an electron, for instance, might have a 98% likelihood of exhibiting one energy level while being 2% likely to reveal another, but the actual outcome is intrinsically random.
How does nature make these seemingly random decisions? One theory proposes that certain properties remain hidden from observers. If physicists could identify these hidden variables, they could inject absolute predictability into quantum theory.
Bell crafted a test capable of ruling out a vast class of hidden variable theories that would alter or challenge quantum theory. The test typically involves two experimenters, Alice and Bob. A pair of entangled particles is produced repeatedly, with one particle sent to Alice and its partner dispatched to Bob in a separate laboratory. Upon receipt, Alice and Bob each independently measure specific properties; for instance, Alice might measure a particle’s spin.
Simultaneously, Bob conducts his own measurements, with no communication between the two experimenters. Once all the data is collected, it is fed into an inequality Bell derived in 1964, which evaluates the correlations between Alice’s and Bob’s observations. Some correlation can occur by mere chance even without quantum effects, but Bell established a threshold beyond which something more than randomness must be at work. Entangled particles exceed that threshold, demonstrating correlations unique to quantum physics and negating the presence of local hidden variables.
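To make the threshold concrete, here is a small sketch of the CHSH form of Bell's inequality, a standard textbook formulation rather than anything detailed in this essay. For spin measurements on an entangled singlet pair, quantum mechanics predicts a correlation of -cos(a - b) between detector angles a and b; any local hidden variable theory must keep the CHSH combination of four such correlations within ±2, yet the quantum prediction reaches 2√2.

```python
import math

def correlation(a, b):
    """Quantum-mechanical spin correlation for a singlet pair at detector angles a, b."""
    return -math.cos(a - b)

# Textbook optimal detector angles (an assumption of this sketch, not from the essay):
a, a2 = 0.0, math.pi / 2          # Alice's two possible settings
b, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two possible settings

# CHSH combination: local hidden variable theories require |S| <= 2.
S = (correlation(a, b) - correlation(a, b2)
     + correlation(a2, b) + correlation(a2, b2))

# The quantum prediction violates the classical bound: |S| = 2*sqrt(2) ≈ 2.83
assert abs(abs(S) - 2 * math.sqrt(2)) < 1e-12
print(abs(S))
```

In real experiments the correlations are estimated from many measured particle pairs rather than computed from a formula, but the logic is the same: a measured |S| above 2 rules out local hidden variables.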
Thus, Bell’s test does more than affirm quantum theory as a superior description of our reality; it also underscores the peculiar nature of “non-locality”. Quantum objects can maintain connections, their behaviors remaining profoundly intertwined despite vast separations. Einstein critiqued this notion vigorously, as it appears to clash with his special theory of relativity by insinuating a form of instantaneous influence between distant entities.
Bell, initially a disciple of Einstein’s theories, found himself ultimately proving his idol wrong. His tests compellingly indicated that our reality is indeed quantum. This debate continues to engage researchers, particularly regarding the persistent discrepancies between quantum theory and our best understanding of gravity, framed by Einstein himself.
Bell’s proposed experiments were technically challenging and received little recognition during his lifetime. The first test of this kind was conducted in 1972, and it wasn’t until 2015 that an experiment with minimal loopholes conclusively refuted local hidden variable theories. In 2022, physicists Alain Aspect, John F. Clauser and Anton Zeilinger received the Nobel Prize in Physics for their extensive work on these experiments.
So why does John Stewart Bell’s legacy resonate so strongly with me? Am I ensnared in some quantum malaise?
The answer lies in the fact that his work and the myriad experiments testing it have spawned as many questions about quantum physics and physical reality as they aim to resolve. For instance, numerous physicists concur that our universe is fundamentally non-local, yet they strive to uncover the underlying physical mechanisms at play. Others are busy formulating new hidden variable theories that evade the constraints set by Bell’s tests. Additionally, researchers are scrupulously reevaluating the mathematical assumptions Bell made in his original work, believing that fresh perspectives on Bell’s findings may be critical for advancing interpretations of quantum theory and developing cohesive theories.
The repercussions of Bell’s findings permeate the realm of quantum physics. We have been running Bell tests for nearly five decades, with ever-improving sources of entangled particles. But this is just the beginning. Recently, I collaborated with physicists to design a method to leverage Bell’s work in exploring whether free will might be partially constrained by cosmic factors. Afterwards, I received a call from another group of researchers keen to discuss Bell again, this time in relation to gravity and the foundational nature of space and time. Drawing inspiration from his methodology, they sought to create a test probing genuinely gravitational, rather than quantum, properties.
It’s no wonder I feel inextricably linked to Bell. His capacity to convert philosophical inquiries into tangible tests encapsulates the essence of physics: to unravel the world’s most baffling mysteries through experimental means. Bell’s test vividly embodies that promise.
If I must ponder a haunting presence, I couldn’t ask for a more remarkable specter.
Proteins come together to create the foam in gin fizz.
alex oberheiser
You may think that complex equations and alcohol don’t, or shouldn’t, mix. But when you make your favorite cocktail, you unknowingly encounter some of the most complex processes in fluid mechanics, the study of how liquids flow.
When researchers try to predict how fluids move, bubble, and wave, they run into formidable equations. The starting point for solving almost all of these problems is the Navier-Stokes equation, named after Claude-Louis Navier and George Gabriel Stokes, who developed it in the 1800s, coincidentally also the golden age of mixology.
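For readers curious what the equation the article names actually looks like, here is the momentum equation of the incompressible Navier-Stokes equations in a standard textbook form (not quoted from the article), where $\mathbf{u}$ is the flow velocity, $p$ the pressure, $\rho$ the density, $\mu$ the viscosity and $\mathbf{f}$ any body force such as gravity:

```latex
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
          + (\mathbf{u} \cdot \nabla)\,\mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u} + \mathbf{f},
\qquad
\nabla \cdot \mathbf{u} = 0
```

The nonlinear term $(\mathbf{u} \cdot \nabla)\,\mathbf{u}$ is what makes the equation so hard to solve in general, and it is the source of the swirls, jets and foams the rest of this article plays with.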
So what better way to learn about fluid mechanics than with a cocktail? From how bubbles form to unusual cloud formations to supersonic jets of liquid, there are some great surprises hidden inside your drinks. Roll up your sleeves and get out your cocktail shaker!
Gin fizz
Experience the wonders of miniature bubbles
First, something frothy. Made with two parts gin, one part lemon juice, a splash of syrup, and a splash of soda water, a gin fizz is easy to make and topped with a layer of foam.
Foams challenge physicists. Sometimes they behave like solids, sometimes like liquids. Soap suds flow like water when you wash the dishes, yet the stiff head on a beer can be sliced cleanly.
This difference comes down to the bubbles. A foam forms when bubbles gather. But how…