Feature image via BBC Future

It is practically a truism that no one really understands quantum mechanics; yet quantum theory reigns as the dominant theory of the very small, just as relativity does of the very large. This is a paradox for a theory so fundamental that it underpins our current understanding of phenomena such as atomic decay and why stars shine, as well as how lasers and transistors work, devices embedded in just about everything. Physicists use and rely on quantum theory to predict the outcomes of experiments, but, as Dr Sean Carroll asserts in a recent op-ed for the New York Times, they have stopped trying to understand how it works. While there are many contributing factors to this conundrum, a salient inhibitor is the siloed way in which we have come to think about the discipline of science, and the way that the modern academic system reflects and perpetuates this way of thinking.

The barriers to understanding quantum mechanics begin with the fact that there are some truly sticky bits of theory that simply cannot be accounted for within our existing scientific frameworks. One such example is the measurement problem: a quantum wavefunction exists in a superposition of all possible states until it is observed or measured, whereupon it collapses into a single state. The fundamental challenge posed by this problem is that science supposes the existence of a measurable, objective world. The centrality of the observer in quantum interactions defies this assumption by asserting that reality is observer-dependent and therefore not fixed. This idea alone confronts science in a fundamental way, requiring an interpretation of reality that holds space for the “weird” and the “strange” of quantum mechanics, something that mathematics alone has not yet been able to provide.
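To make the problem concrete, here is a minimal sketch in standard textbook (Dirac) notation, assuming nothing more than a two-state system; the symbols |0⟩, |1⟩, α and β are illustrative choices, not drawn from the article itself:

```latex
% Minimal sketch of the measurement problem for a two-state system
% (standard textbook notation; assumes the amsmath package).
% |0> and |1> are the two possible measurement outcomes; alpha and beta
% are complex amplitudes describing the superposition.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]
% On measurement, the superposition "collapses" to a single outcome,
% with probabilities given by the Born rule:
\[
  |\psi\rangle \;\xrightarrow{\text{measurement}}\;
  \begin{cases}
    |0\rangle & \text{with probability } |\alpha|^2,\\[2pt]
    |1\rangle & \text{with probability } |\beta|^2.
  \end{cases}
\]
% The equations predict these probabilities with great accuracy, but say
% nothing about why or how a single outcome is selected -- that gap is
% the measurement problem described above.
```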

This issue ignited a deep rift among the brightest minds of physics during the mid-1920s. Albert Einstein, representing the side that rejected the proposal that the quantum world could be characterized by probabilities rather than certainties, is famously quoted as declaring, “God does not play dice with the universe”. The interpretation of quantum mechanics that prevailed was the Copenhagen Interpretation, which asserts the rather less-than-satisfying conclusion that we simply cannot know more about quantum mechanics than what its equations allow us to predict and measure. Understanding the theory was thus placed in the “too hard” basket.

Still, from quantum theory’s inception into the 1950s, divergent theories attempted to make sense of this phenomenon. These theories had an undeniably philosophical bent, and most of their proponents were shunned from science altogether. In 1957, for example, Hugh Everett constructed a theory to account for quantum superposition: the Many Worlds Interpretation (MWI). Essentially, Everett’s MWI proposes that for every state in an atom’s superposition during a quantum interaction, the atom simultaneously takes each potential path, creating multiple, coexistent physical realities, one for each state. Mainstream physicists ridiculed Everett for what they considered a scientifically blasphemous postulation, a fact which no doubt contributed to his move from science into defence analysis shortly after he submitted his dissertation.
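For contrast, here is a similarly hedged sketch of Everett’s idea in the same two-state notation (the labels M_ready, M_0 and M_1 for the measuring observer are illustrative shorthand, not Everett’s own formalism): measurement merely entangles the observer with the system, nothing collapses, and each term in the resulting superposition is read as a separate, coexisting branch of reality.

```latex
% Hedged sketch of the Many Worlds picture for the same two-state system
% (assumes the amsmath package; M_ready, M_0, M_1 are illustrative labels).
% M_ready is the observer before measurement; M_0 and M_1 are the observer
% having recorded outcome 0 or 1.
\[
  \bigl(\alpha\,|0\rangle + \beta\,|1\rangle\bigr)\otimes|M_{\text{ready}}\rangle
  \;\longrightarrow\;
  \alpha\,|0\rangle\otimes|M_0\rangle \;+\; \beta\,|1\rangle\otimes|M_1\rangle .
\]
% Nothing collapses: both terms persist, and each is interpreted as a
% coexisting "branch" of physical reality in which one outcome occurred.
```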

Scientists’ resistance toward a multidisciplinary understanding of a scientific problem, however, is a relatively new phenomenon. For centuries, science and philosophy were pursued as a single discipline. In fact, the term ‘scientist’ was not even coined until the 19th century. Before that, great names such as Galileo and Newton considered themselves ‘natural philosophers’ rather than ‘scientists’. Even Einstein, Heisenberg, Dirac and their cohort, the fathers of quantum mechanics, were schooled in European philosophy. This deep grounding in both the “soft” and “hard” sciences influenced their way of thinking, the types of questions they posed and, ultimately, the theories they constructed. It enabled them to think beyond the boundaries of what was generally accepted at the time and to construct new ideas that came to be accepted as fact.

However, since an epistemological transformation in the 1600s and 1700s, which produced the distinction of “science” as the empirical investigation of phenomena, science and philosophy have become increasingly separate disciplines. While it has been a gradual process, this disciplinary divorce has become ingrained in society with the help of most knowledge institutions worldwide, culminating in an isolationist understanding of these and other disciplines. This poses a significant challenge to the kind of fruitful multidisciplinary thinking that conceived nearly all of science’s greatest discoveries to date.

Beyond reifying the isolation of disciplines through course structures, universities also play a significant role in shaping academic discovery by prioritising certain areas of research over others. As Carroll elaborates:

“Few modern physics departments have researchers working to understand the foundations of quantum theory. On the contrary, students who demonstrate an interest in the topic are gently but firmly — maybe not so gently — steered away, sometimes with an admonishment to “Shut up and calculate!” Professors who become interested might see their grant money drying up, as their colleagues bemoan that they have lost interest in serious work.”

This situation is compounded by the fact that the metrics by which academic researchers are hired, retained and promoted have undergone a transformation over the last half-century. During this time, research culture has been drastically reshaped by the dawn of the Internet, which has enabled an open and thriving digital research economy. At the same time, an associated shift towards metrics of productivity, quantified largely through research output, has become dominant across knowledge institutions. These changes underpin the pervasive expectation that academic researchers should devote the majority of their time to publishing on certain topics and in certain journals in order to remain relevant and successful. Among other challenges, this focus on publication as a metric of academic standing has led many to game the system, with the resultant emphasis on quantity of output often detracting from its quality.

This phenomenon, known in academia as “publish or perish” culture (the pressure on academics to publish continuously in order to sustain and further their careers), has also left scientists with little spare time for creative thinking. The modern academic environment has been lamented by Peter Higgs, the physicist who predicted the Higgs boson, who doubts he could have achieved that breakthrough in today’s academic system:

“It’s difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964,” Higgs said. “Today I wouldn’t get an academic job. It’s as simple as that. I don’t think I would be regarded as productive enough.”

Explorative and imaginative thought requires ample time and space, as well as an acceptance that, by the very nature of trying new things, the researcher will encounter far more twists, turns and dead ends than solutions. While these qualities do not fit well into the “publish or perish” framework, it is well established that they are of critical value to innovation. Discovery demands that we challenge the very prejudices that have become ingrained in our conceptual structures. To do this, one must have the freedom and encouragement to shatter them, rather than be required to work within systems that reinforce them.