How Three Imaginary Physics Demons Challenged the Laws of Nature

There has always been a strong interplay between imagination and physics. Albert Einstein crafted his theory of relativity by envisioning a scenario where he chased a beam of light. Erwin Schrödinger famously introduced the idea of cats that are both alive and dead. German mathematician David Hilbert illustrated the paradox of infinity by conceptualizing a hotel with limitless rooms and patrons. Through inventive thought experiments, physicists rigorously examine concepts and deepen their comprehension.

Interestingly, three of the most enduring thought experiments revolve around what is now known as “the devil.” The most recognized is Maxwell’s Demon, conceived in 1867, envisioning a minuscule being endowed with unusual but logical abilities. Together with Laplace’s Devil and Loschmidt’s Devil, these thought experiments continue to baffle physicists today, suggesting that pondering these devils can illuminate some of the most complex principles in physics.

“What’s refreshing and unexpected is that scientists can gain profound insights about reality by engaging in these fictional realms,” says Michael Stuart, a philosopher of science at the University of York, UK. “Many would contend that the essence of science hinges upon such imaginings.”

Laplace’s Devil

The concept of our first demon originated from the mind of French polymath Pierre-Simon Laplace, who was largely influenced by Isaac Newton. In 1814, Laplace posed a straightforward query: if Newton’s laws can predict the fall of an apple, could we apply the same logic to predict everything? What if we had perfect knowledge about every particle and object? He invited us to picture a devil—which he referred to simply as an “intelligence”—that could do exactly that. If it knew the position and momentum of every particle, along with the laws of nature, it could foresee the entirety of the universe’s future. “Nothing would remain uncertain,” he asserted. “The future could be as clear as the past.”
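A toy illustration of Laplace’s claim, using only Newton’s laws and invented initial conditions (the numbers are hypothetical, chosen purely for the sketch): given an apple’s exact height and velocity now, every future instant of its fall is fixed.

```python
# Laplacian determinism in miniature: with exact initial conditions and
# Newton's laws, a falling apple's entire future trajectory is determined.
# (Illustrative sketch; the initial values are made up.)

G = 9.81  # gravitational acceleration, m/s^2

def predict_height(h0, v0, t):
    """Closed-form Newtonian prediction: h(t) = h0 + v0*t - (1/2)*g*t^2."""
    return h0 + v0 * t - 0.5 * G * t * t

# Knowing the present state exactly fixes every future instant.
future = [predict_height(h0=5.0, v0=0.0, t=t / 10) for t in range(11)]
```

Laplace’s demon simply scales this reasoning up from one apple to every particle in the universe at once.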

While we may never construct a machine endowed with Laplace’s demonic faculty, envisioning such a being assists in identifying logical inconsistencies in the theory. Does it imply that everything—from planets to humans—is predetermined? Does science assert that the laws of physics dictate all outcomes? Free will may appear to be, at best, an illusion, a mere byproduct of our ignorance.

Fortunately, the essence of the first demon is relatively straightforward to dismantle. Physicists are convinced that no entity could possess the knowledge attributed to Laplace’s demon. First, Einstein’s special theory of relativity establishes that information cannot travel faster than light. Distant events can still shape your future, but no observer, demon included, can learn of them until light from those events has had time to arrive. The demon can therefore never assemble the complete, instantaneous snapshot of the universe that its predictions would require.

Even in the event that this devil could access knowledge from every corner of the universe, quantum mechanics introduces another obstacle. Since the 1920s, it has been acknowledged that one cannot simultaneously ascertain both a particle’s position and momentum. Therefore, the devil cannot precisely determine where each particle is or what it is doing; it can only describe the probabilities surrounding particle properties.
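The limit the uncertainty principle places on the demon can be stated precisely. For a particle’s position and momentum, the spreads of repeated measurements obey Heisenberg’s relation:

```latex
\sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2}
```

Pinning down the position arbitrarily well forces the momentum spread to grow without bound, and vice versa, so the demon’s particle-by-particle ledger can never be complete.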

Laplace’s tidy particle-by-particle depiction of reality is superseded by a quantum universe, characterized by a vast, fluctuating wavefunction—an abstract mathematical construct that encapsulates all potential outcomes. Even if the devil were able to monitor these outcomes, there remains no certainty regarding which one would ultimately manifest in reality.

Loschmidt’s Devil

Though Laplace’s devil seems to have lost its potency, even more sinister thought experiments lie ahead. The second demon emerged during a period of rapid industrialization, where the steam engine intensified inquiries about heat, energy, and disorder. Austrian physicist Ludwig Boltzmann sought an explanation for entropy—a slippery concept that explains how systems devolve into chaos over time. Sandcastles fall apart, ice melts, and rust forms. Boltzmann believed that zooming into reality and observing the minute components of a larger system, like individual gas molecules filling a room, could clarify this concept.

However, his elder colleague, Austrian physicist Josef Loschmidt, challenged this approach in 1876 by posing a simple yet devastating dilemma. Imagine a universe in which time has halted; every molecule has a defined position and direction of movement. Loschmidt suggested that if you reversed the motion of each particle, you could essentially undo entropy. His original argument did not mention a “demon,” although later retellings, shaped by subsequent developments in the field, often cast one that could perceive and freeze every particle.
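Loschmidt’s reversal can be sketched in a few lines of code (a minimal sketch with made-up initial conditions, using free non-interacting particles): evolve the particles forward, flip every velocity, evolve forward again, and each particle retraces its path back to where it started.

```python
# Loschmidt's reversal for free particles: run the dynamics forward,
# flip every velocity, run forward again, and the system retraces its history.
dt, steps = 0.01, 1000
positions = [0.0, 1.0, -2.5]        # hypothetical starting positions
velocities = [1.3, -0.7, 0.4]       # hypothetical starting velocities
start = list(positions)

for _ in range(steps):              # forward in time
    positions = [x + v * dt for x, v in zip(positions, velocities)]

velocities = [-v for v in velocities]   # the demon flips every velocity

for _ in range(steps):              # "forward" again is backward in effect
    positions = [x + v * dt for x, v in zip(positions, velocities)]

# Each particle has returned (to within rounding error) to its start.
recovered = all(abs(x - x0) < 1e-9 for x, x0 in zip(positions, start))
```

Nothing in the microscopic rules objects to this rewinding, which is exactly what made Loschmidt’s challenge so troubling.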


Loschmidt’s scenario deeply unsettled physicists because it exposed a paradox about the direction of time. At the microscopic level, reversing every particle’s motion produces no contradiction. Yet at the macroscopic level we never witness the corresponding events: a world running in reverse, with puddles refreezing into ice and shattered vases reassembling. Why does time appear to flow in only one direction if, microscopically, nothing forbids reversing it?

Subsequent experiments attempted time reversal, playing the part of Loschmidt’s demon. In the 1950s, Erwin Hahn used radio waves to temporarily synchronize nuclear spins (such as those of hydrogen atoms in water) so that they precessed in step, momentarily decreasing the system’s entropy. This seemingly created the illusion of time moving backward. So, did Loschmidt’s demon manage to outsmart the concept of entropy?

Not entirely. It is now understood that entropy doesn’t imply that a system must always degenerate into disorder; some systems can briefly evolve into a more ordered state. However, as Hahn demonstrated, entropy ultimately prevails. Once the radio pulses stopped, the spins drifted back into disorder.

Why does entropy consistently rise? The standard answer is that the universe began in a highly ordered, low-entropy state, with everything systematically arranged. From there, almost every available path leads toward chaos, simply because there are vastly more ways for a system to be disordered than ordered. Loschmidt’s demon could, in principle, reverse the trajectories of a handful of particles, but the odds of a large system stumbling into one of its rare entropy-lowering arrangements are astronomically small.

“The situation with the second law differs fundamentally from Newton’s second law,” notes Katie Robertson, a philosopher at the University of Stirling in the UK. “Its probabilistic nature suggests that ‘You probably cannot reduce entropy.’”

Ultimately, probability dispelled this demon, though not painlessly. In response to Loschmidt, Boltzmann abandoned his original approach for an explicitly statistical framework, one that captured the delicate logic of chance. That thinking led to his statistical formula for entropy, now engraved on his gravestone.
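Boltzmann’s statistical definition of entropy, the formula engraved on his gravestone, ties the entropy S of a macrostate to the number W of microscopic arrangements that realise it (k is Boltzmann’s constant):

```latex
S = k \log W
```

The more microscopic arrangements a state admits, the higher its entropy, which is precisely why systems drift toward the overwhelmingly more numerous disordered configurations.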

Maxwell’s Devil

The third and perhaps best-known demon was proposed by Scottish physicist James Clerk Maxwell in 1867, nearly a decade before Loschmidt raised his concerns. Like Loschmidt, Maxwell grappled with the second law of thermodynamics, but he examined the notion of increasing entropy from a different perspective. What if, instead of rewinding the universe, we could intervene in it molecule by molecule? Envision a meddlesome being (later dubbed a demon by physicists such as William Thomson) that could manipulate gas molecules trapped in a box divided by a trapdoor. Over time, this entity could violate the second law by segregating faster-moving molecules from slower-moving ones.
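The demon’s sorting rule is simple enough to simulate (a purely illustrative sketch with an invented speed distribution and threshold): molecules arrive at the trapdoor at random, and the demon opens it only to pass fast molecules one way and slow ones the other. One chamber ends up hotter.

```python
# Maxwell's demon as a sorting rule: fast molecules go right, slow go left,
# producing a temperature difference without doing (appreciable) work.
import random

random.seed(1)
left, right = [], []

for _ in range(10_000):
    speed = random.expovariate(1.0)   # arbitrary illustrative distribution
    # The demon opens the trapdoor only for molecules on the "right" side
    # of its chosen threshold.
    (right if speed > 1.0 else left).append(speed)

mean_left = sum(left) / len(left)     # cooler chamber
mean_right = sum(right) / len(right)  # hotter chamber
```

Average speed tracks temperature, so `mean_right > mean_left` is exactly the hot/cold separation the second law forbids from arising for free.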

Various straightforward “solutions” might come to mind. Perhaps the demon expends energy opening and closing the door. However, in theory this “work” can be made arbitrarily small. The demon could operate as gently as desired, yet the paradox persists.



Instead, physicists began to suspect that the real cost wasn’t the energy the demon exerted, but the information it had to process. Some kind of memory is needed to record the position and momentum of each molecule, and any real memory is finite.

In the 1920s, Hungarian physicist Leo Szilard demonstrated that even a simplified version of Maxwell’s experiment, featuring only one molecule bouncing within a box, could enable a clever demon to extract work from the system. But, he argued, doing so requires observing the molecule and storing that information, and that bookkeeping carries a cost.

Ultimately, something must yield. In the 1960s, IBM physicist Rolf Landauer made a crucial point: to keep functioning, the demon must eventually erase its memory to make room for new records, and erasing information necessarily generates heat, increasing the entropy of the system. The second law remains intact.
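Szilard’s gain and Landauer’s cost balance exactly. Measuring which side of the box the molecule occupies yields one bit of information, worth at most kT ln 2 of extracted work at temperature T; erasing that bit from the demon’s memory dissipates at least the same amount of heat:

```latex
W_{\text{extracted}} \;\leq\; k_B T \ln 2 \;\leq\; Q_{\text{erasure}}
```

Over a full cycle the demon therefore gains nothing, and the second law survives.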


Moreover, physicists came to recognise that information, like energy, is a physical resource. Gaining knowledge of a system is not merely abstract bookkeeping; under the right conditions, information can serve as fuel. Maxwell’s demon, in effect, converts information into work. Today, the demon symbolizes devices that operate at the intersection of information and energy. These “information engines” not only challenge conventional wisdom but may also turn demonic logic into practical technology. In 2024, researchers devised a quantum variant of the Szilard engine to power batteries within quantum computers. Instead of demons, microwave pulses were used to sort higher-energy qubits from lower-energy ones, creating an energy difference capable of doing work, like a battery.

While we remain distant from utilizing these innovations to charge mobile devices, the aspiration is that these miniature quantum engines will aid in manipulating particles or toggling qubits.

In this light, Maxwell’s demons have not been vanquished at all. Rather, they have evolved into something Maxwell could never have envisioned: not a violation of the second law, but a means to explore the intricate and unexpected ways nature lets us use information as a physical resource.

Collectively, these demons challenge both theoretical limits and intuitive understanding. While some have been tackled, new paradoxes continue to emerge. Yet, these are dilemmas that physicists welcome. These intriguing thought experiments provide scientists with a compelling avenue to push the boundaries of their knowledge.


Source: www.newscientist.com

Why John Stewart Bell Has Challenged Quantum Mechanics for Decades

John Stewart Bell developed a method to measure the unique correlations permitted in the quantum world


Some people sense a poltergeist in the attic or glimpse a ghost on dark nights; the spectre that haunts me is the physicist John Stewart Bell, whose groundbreaking work and enduring legacy have intrigued me for years.

Consider this: how much of our reality can we claim to experience objectively? I ponder this frequently, especially when discussing the intricate nature of space, time, and quantum mechanics. Bell was deeply reflective about such matters, and his contributions have forever altered our comprehension of these concepts.

Born in Belfast in 1928, Bell was, by all accounts, a curious and cheerful child. He gravitated towards physics early and undertook his first role as a lab engineer at just 16. With training in both theoretical and experimental physics, he built a significant part of his career around particle accelerators. Yet, it was the inconsistencies he perceived within quantum theory that occupied his thoughts during late nights.

Today, this area has become a well-established branch of physics, featured prominently in New Scientist. It was not always so welcoming: physics has rarely made room for those who probe its edges, where it meets mathematics and philosophy. In Bell’s time, scientists were still grappling with the legacies of quantum theory’s pioneers, including the heated debates between Niels Bohr and Albert Einstein.

My interest in Bell’s work began as a casual pursuit, though I have since devoted many hours to it. In 1963, he took a sabbatical with his physicist wife, using the time to craft a pair of original papers. Though initially published to little attention, their significance could not be overstated.

Bell transformed philosophical inquiries into testable experiments, particularly concentrating on the notion of “hidden variables” in quantum mechanics.

Quantum mechanics inherently resists certainty and determinism, as elucidated by Bohr and his contemporaries in the early 20th century. Notably, definitive statements about quantum entities remain elusive until we engage with them. Predictive ability exists only in probabilistic terms—an electron, for instance, might have a 98% likelihood of exhibiting one energy level while being 2% likely to reveal another, but the actual outcome is intrinsically random.

How does nature make these seemingly random decisions? One theory proposes that certain properties remain hidden from observers. If physicists could identify these hidden variables, they could inject absolute predictability into quantum theory.

Bell crafted a test capable of ruling out a broad class of hidden-variable theories: “local” ones, in which no influence travels faster than light. The test typically involves two experimenters, Alice and Bob. A pair of entangled particles is produced repeatedly, with one particle sent to Alice and its partner dispatched to Bob in a separate laboratory. Upon receipt, Alice and Bob each independently measure specific properties; for instance, Alice might measure a particle’s spin.

Simultaneously, Bob conducts his measurements without any communication between the two experimenters. Once all the data is collected, it is fed into inequalities Bell derived in 1964, which evaluate the correlations between Alice and Bob’s observations. Even without quantum effects, some correlation can arise by mere chance, but Bell established a ceiling that no local hidden-variable theory can exceed. Entangled particles breach that ceiling, displaying correlations unique to quantum physics and ruling out local hidden variables.
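The standard quantitative form of Bell’s argument is the CHSH inequality: any local hidden-variable theory caps a particular combination S of correlations at 2, while quantum mechanics predicts values up to 2√2. A short sketch using the textbook singlet-state correlation E(a, b) = -cos(a - b), with the measurement angles that maximise the violation:

```python
# CHSH form of Bell's inequality: local hidden variables force S <= 2,
# but the quantum singlet-state prediction reaches 2*sqrt(2).
from math import cos, pi, sqrt

def E(a, b):
    """Quantum correlation for spin measurements at angles a and b
    on a singlet (maximally entangled) pair."""
    return -cos(a - b)

# Alice's two settings and Bob's two settings (radians), chosen to
# maximise the quantum violation.
a1, a2 = 0.0, pi / 2
b1, b2 = pi / 4, 3 * pi / 4

S = abs(E(a1, b1) - E(a1, b2)) + abs(E(a2, b1) + E(a2, b2))
# S = 2*sqrt(2) ~ 2.83, exceeding the local hidden-variable bound of 2
```

Any experimental run whose measured S lands above 2 cannot be explained by local hidden variables, which is exactly what the loophole-free tests confirmed.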

Thus, Bell’s test does more than affirm quantum theory as a superior explanation of our reality; it also underscores the peculiar nature of “non-locality,” revealing strange traits of our existence. This implies that quantum objects can maintain connections, with their behaviors remaining profoundly intertwined despite vast separations. Einstein critiqued this notion vigorously, as it contradicts the principles of his special theory of relativity by insinuating a form of instantaneous communication between entities.

Bell, initially a disciple of Einstein’s theories, found himself ultimately proving his idol wrong. His tests compellingly indicated that our reality is indeed quantum. This debate continues to engage researchers, particularly regarding the persistent discrepancies between quantum theory and our best understanding of gravity, framed by Einstein himself.

Bell’s proposed experiments were technically demanding and received little attention during his lifetime. The first test of this kind was conducted in 1972, and it wasn’t until 2015 that an experiment with minimal loopholes conclusively refuted local hidden-variable theories. In 2022, physicists Alain Aspect, John F. Clauser and Anton Zeilinger received the Nobel Prize in Physics for their extensive work on these experiments.

So why does John Stewart Bell’s legacy resonate so strongly with me? Am I ensnared in some quantum malaise?

The answer lies in the fact that his work and the myriad experiments testing it have spawned as many questions about quantum physics and physical reality as they aim to resolve. For instance, numerous physicists concur that our universe is fundamentally non-local, yet they strive to uncover the underlying physical mechanisms at play. Others are busy formulating new hidden variable theories that evade the constraints set by Bell’s tests. Additionally, researchers are scrupulously reevaluating the mathematical assumptions Bell made in his original work, believing that fresh perspectives on Bell’s findings may be critical for advancing interpretations of quantum theory and developing cohesive theories.

The repercussions of Bell’s findings permeate quantum physics. We have been running Bell tests for more than five decades, with ever-better sources of entangled particles. But this is just the beginning. Recently, I collaborated with physicists to design a method to leverage Bell’s work in exploring whether free will might be partially constrained by cosmic factors. Afterwards, I received a call from another group of researchers keen to discuss Bell again, this time in relation to gravity and the foundational nature of space and time. They drew inspiration from his methods and sought to create a test that would probe genuinely gravitational, rather than quantum, properties.

It’s no wonder I feel inextricably linked to Bell. His capacity to convert philosophical inquiries into tangible tests encapsulates the essence of physics: unravelling the world’s most baffling mysteries through experiment. Bell’s test vividly embodies that promise.

If I must ponder a haunting presence, I couldn’t ask for a more remarkable specter.

