Quantum Theory

Quantum Superposition Bridges the Classical World



Cracks have begun to show in one of quantum theory’s biggest controversies. The well-known Schrödinger’s cat thought experiment, which sought to illustrate the absurdity of applying quantum theory to the macro-physical world ruled by classical physics, has been challenged by a recent advance in quantum physics. An international team, led by Markus Arndt of the University of Vienna, successfully placed a large molecule of about 2,000 atoms—the biggest object yet—into a state of quantum superposition. The result shows that quantum effects can carry over into the classical world, laying the groundwork for scientists to demonstrate how the gap between these seemingly disparate worlds might be reconciled.

Quantum theory tells us that particles in superposition can shift between a wave-like state and a particle-like state, meaning they can effectively be in two places at once. Of course, from what we observe in the classical world, this cannot be true. If it were, our sense of what is “real” would be challenged, opening the door to a whole host of quantum weirdness that classical theory keeps at bay. Essentially, as Schrödinger tried to show with his thought experiment, if quantum mechanics carried over to the macro-physical scale, it would mean that human beings could also exist in two places at once. It does not take long for this idea to snowball into theories of time travel and multiple worlds, both of which find some basis in quantum theory.

On a fundamental level, the new work published in Nature illustrates that the multi-state paradox of quantum mechanics, known as superposition, operates on a larger scale than previously demonstrated. In theory, we already knew this to be true, but the experiment proves it at the largest scale yet; superposition had previously been demonstrated only with the smallest particles: atoms, photons and electrons. The setup used by Arndt and his team, essentially a souped-up version of the double-slit experiment, was first performed in 1801 and has long been used in quantum mechanics to observe the effects of superposition.

The simple experiment involves particles of light (photons) beamed toward a barrier with two slits in it. On a screen behind the barrier, the effects of quantum superposition appear in the form of what is known as an interference pattern: alternating bright and dark stripes.

The resulting striped pattern is interesting: one might assume that a single beam of photons would produce a solitary line on the screen, indicating that each photon fell along a single path. Instead, the stripes show that each photon takes all of its possible paths, which then interfere with one another, suggesting the particle in fact also acts as a wave. This reflects the probabilistic nature of quantum phenomena, challenging Einstein’s famous claim that “God does not play dice with the universe”.
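As an illustrative aside, the striped pattern described above can be sketched numerically. For idealised point slits, the far-field intensity on the screen follows I(x) ∝ cos²(πdx/λL), where d is the slit separation, λ the wavelength and L the distance to the screen. The numbers below (green light, 0.1 mm slit spacing, 1 m screen distance) are assumptions for illustration, not figures from the article:

```python
import math

def two_slit_intensity(x, wavelength, slit_sep, screen_dist):
    """Idealised far-field two-slit intensity (point slits, unit peak).

    Bright fringes appear where the path difference between the two
    slits is a whole number of wavelengths -- the stripes described above.
    """
    phase = math.pi * slit_sep * x / (wavelength * screen_dist)
    return math.cos(phase) ** 2

# Illustrative, assumed numbers: green light through slits 0.1 mm apart,
# observed on a screen 1 m away.
wavelength = 532e-9   # metres
d, L = 1e-4, 1.0

fringe_spacing = wavelength * L / d   # distance between bright fringes
print(f"fringe spacing: {fringe_spacing * 1e3:.2f} mm")   # 5.32 mm
print(two_slit_intensity(0.0, wavelength, d, L))          # central bright fringe: 1.0
```

Halfway between two bright fringes the two paths arrive exactly out of phase and the intensity drops to zero, which is what produces the dark bands.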

In order to pull their super-sized version of this experiment off, the international team had to not only create the perfect environment but also synthesize the massive molecule itself, to ensure it met the requirements for complex quantum behaviour to occur. The team built a custom interferometer—which, as the name suggests, is a tool that merges two or more sources of light (or matter waves) to create an interference pattern—called the Long-Baseline Universal Matter-Wave Interferometer (LUMI). The team’s LUMI also sets a record: it is the longest interferometer of its kind ever built, with a baseline of 2 metres. This specialised machine let the researchers fire a beam of heavy molecules (some more than 25,000 times the mass of a hydrogen atom) at the multiple-slit apparatus and observe the resulting interference pattern, confirming the molecules’ state of superposition.
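To get a feel for why such a long, carefully controlled instrument is needed, consider the de Broglie wavelength λ = h/(mv) of one of these molecules: the heavier and faster the object, the shorter its wavelength and the finer the interference pattern to resolve. A rough back-of-the-envelope calculation, in which only the ~25,000-hydrogen-atom mass comes from the article and the beam velocity is an assumed, illustrative value:

```python
# Back-of-the-envelope de Broglie wavelength for the heavy molecules.
PLANCK = 6.62607015e-34    # Planck constant, J*s
AMU = 1.66053906660e-27    # atomic mass unit, kg (a hydrogen atom is ~1 amu)

mass = 25_000 * AMU        # kg; "25,000 times the mass of a hydrogen atom"
velocity = 300.0           # m/s -- assumption for illustration, not from the article

# de Broglie relation: lambda = h / (m * v)
wavelength = PLANCK / (mass * velocity)
print(f"de Broglie wavelength ~ {wavelength:.1e} m")   # tens of femtometres
```

A wavelength of this order is vastly smaller than that of visible light, which is why observing wave behaviour in such massive objects demands record-setting apparatus.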

With records being broken in the quantum space with near-weekly regularity, this advance marks a unique turning point in the disagreement between quantum mechanics and general relativity: the two frameworks we use to understand the world around us have come as close to being bridged as ever before. While the success of this experiment does wedge the door open for a number of seemingly bizarre theories such as time travel and multiple worlds, it is doubtful that human beings or planets will be time travelling through multiple realities any time soon, if ever. Nevertheless, this scalable line of research pushes the known limits of quantum superposition further out, enabling and encouraging future work to keep exploring them.

Quantum Theory

Why Has Science Stopped Trying to Understand Quantum Theory?



Feature image via BBC Future

It is practically a truism that no one really understands quantum mechanics; yet quantum theory reigns as the dominant theory of the very small, just as relativity does of the very large. This is a paradox for a theory so fundamental that it underpins our current understanding of phenomena such as radioactive decay and why stars shine, as well as how lasers and transistors—embedded in just about everything—work. Physicists use and rely on quantum theory to predict the outcomes of experiments, but they have stopped, as Dr Sean Carroll asserts in a recent op-ed for The New York Times, trying to understand how it works. While there are many contributing factors to this conundrum, a salient inhibitor is the siloed way in which we have come to think about the discipline of science, and the way that the modern academic system reflects and perpetuates this way of thinking.

The barriers to understanding quantum mechanics begin with the fact that there are some truly sticky bits of theory that simply cannot be accounted for within our existing scientific frameworks. One such example is quantum’s measurement problem: a quantum waveform exists in a superposition of all its states until it is observed or measured, whereupon it collapses into a single state. The fundamental challenge posed by this problem is that science supposes the existence of a measurable, objective world. The centrality of the observer in quantum interactions defies this assumption by asserting that reality is observer-dependent and therefore non-fixed. This idea alone confronts science in a fundamental way, requiring an interpretation of reality that holds space for the “weird” and the “strange” of quantum mechanics—something that mathematics alone has not yet been able to provide.

This issue ignited a deep rift among the brightest minds of physics during the mid-1920s. Albert Einstein, representing the side of the argument which rejected the proposal that the quantum world could be characterized by probabilities rather than certainties, is famously quoted as saying, “God does not play dice with the universe”. The interpretation of quantum mechanics that prevailed is the Copenhagen Interpretation, which asserts the rather less-than-satisfying conclusion that we simply cannot know more about quantum mechanics than what we can measure using equations. Understanding the theory was thus placed in the “too hard” basket.

Still, divergent theories from quantum mechanics’ inception into the 1950s have attempted to make sense of this phenomenon. These theories had an undeniably philosophical bent and resulted in the majority of their proponents being shunned from science altogether. In 1957, for example, Hugh Everett constructed a theory to account for quantum superposition with his Many Worlds Interpretation (MWI). Essentially, Everett’s MWI proposes that for every state of an atom’s superposition in a quantum interaction, that atom simultaneously takes each potential path, creating multiple, coexistent physical realities, one for each state. Mainstream physicists ridiculed Everett for what they considered a scientifically blasphemous postulation, a fact which no doubt contributed to his move from science to defence analysis shortly after he submitted his dissertation.

Scientists’ resistance toward a multidisciplinary understanding of a scientific problem, however, is a relatively new phenomenon. For centuries, science and philosophy were pursued as one. In fact, the term ‘scientist’ was not even coined until the 19th century. Before that, great names such as Galileo and Newton considered themselves ‘natural philosophers’ rather than ‘scientists’. Even Einstein, Heisenberg, Dirac and their cohort, the fathers of quantum mechanics, were schooled in European philosophy. This deep grounding in both the “soft” and “hard” sciences influenced their way of thinking, the types of questions they posed and ultimately the theories they constructed. As such, it enabled them to think beyond the boundaries of what was generally accepted at the time and allowed them to construct new ideas that came to be known as fact.

However, since an epistemological transformation in the 1600s and 1700s, which produced the distinction of “science” as the empirical investigation of phenomena, science and philosophy have become increasingly separate disciplines. While it has been a gradual process, this disciplinary divorce has become ingrained in society with the help of most knowledge institutions worldwide, culminating in the propagation of an isolationist understanding of these and other disciplines. This poses a significant challenge to the kind of fruitful multidisciplinary thinking that conceived nearly all of science’s greatest discoveries to date.

Beyond reifying the isolation of disciplines through course structures, universities also play a significant role in shaping academic discovery by prioritising certain areas of research over others. As Carroll elaborates:

“Few modern physics departments have researchers working to understand the foundations of quantum theory. On the contrary, students who demonstrate an interest in the topic are gently but firmly — maybe not so gently — steered away, sometimes with an admonishment to “Shut up and calculate!” Professors who become interested might see their grant money drying up, as their colleagues bemoan that they have lost interest in serious work.”

This situation is compounded by the fact that the metrics by which academic researchers are hired, retained and promoted have undergone a transformation over the last half-century. During this time, research culture has been impacted drastically by the dawn of the Internet, which has enabled an open and thriving digital research economy. At the same time, an associated shift toward metrics of productivity, quantified largely through research output, has become dominant across knowledge institutions. These changes frame the pervasive expectation that academic researchers should devote the majority of their time to publishing on certain topics and in certain journals in order to remain relevant and successful. Among other challenges, this focus on publication as a distinct metric of prestige in the academic sciences has led many to game the system, with the resultant focus on quantity of output often detracting from its quality.

Beyond this, the phenomenon known in academia as the “publish or perish” culture—that is, the pressure on academics to continuously publish work in order to sustain and further their careers—has left academic scientists with little spare time for creative thinking. This modern academic environment has been lamented by Peter Higgs, the physicist who predicted the boson that bears his name and who doubts he could have achieved that breakthrough in today’s academic system:

“It’s difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964,” Higgs said. “Today I wouldn’t get an academic job. It’s as simple as that. I don’t think I would be regarded as productive enough.”

Exploratory and imaginative thought requires ample time and space, as well as an acceptance that, by the nature of trying new things, the researcher will likely encounter far more twists, turns and dead ends than solutions. While these qualities do not fit well into the “publish or perish” framework, it is well established that they are of critical value to innovation. Discovery demands that we challenge the very prejudices that have become ingrained in our conceptual structures. In order to do this, one must have the freedom and encouragement to shatter them, rather than be required to work within systems that reinforce them.