The Reductionist West and Ancient Wisdom

Modern science tells us that what we can truly say about empirical reality has only relative (Relativity) or statistical (Quantum Mechanics) significance, with Quantum Theory in particular calling into question the role of the observer[1].  Each of these disciplines, however, rests on epistemological grounds that identify the boundaries of reality with static, observable phenomena that can be brought out by repeated experiment, i.e. what the scientific community refers to as verifiable results.  This epistemological position, although very rarely explicitly called out as such, is the essence of the empiricist view of reality, and it has the side effect of isolating and quarantining all other phenomena as “unscientific” or simply “subjective”, hence the increasingly common assertion that theological and philosophical inquiry is for those who are unscientifically minded.

The problem with this approach, however, is that it is very rarely pointed out what epistemological assumptions must in fact hold in order for this scientific approach to represent “truth”.  Nor is it made clear where the boundaries of this Truth lie; it is simply presumed that anything lying outside this domain is a matter of the humanities and is subjective, effectively outside the domain of natural philosophy as Aristotle carved out the domains of reality so many centuries ago.

The path to the intellectual destination we find ourselves at today was a long one, but it no doubt started with the Greek/Hellenic philosophical tradition.  It then evolved very closely alongside our understanding of the universe itself at the macroscopic scale, as we came to understand that the Earth revolved around the Sun and that, for all intents and purposes, the “observable” universe emerged from a singularity event some 14 billion years ago, leaving us with unwavering faith in mathematics, along with its cause-and-effect, materialistic and deterministic view of reality, as the Worldview.  This is physical reality, and while the “psychological” domain remains integral to any sort of perception – “I think therefore I am” – this essential philosophical pillar is left out of virtually all modern-day scientific discussions.  Quantum Mechanics, however, brings this basic truth right back into the forefront of the conversation about the nature of reality.  And yet physicists have turned to what can only be called “nonsensical” explanations (multi-dimensional reality for example) to explain away the strange and odd phenomena that reveal themselves in the subatomic world, speaking to our blind faith in mathematics as the only truth that can be relied on unequivocally, and to our stubborn reluctance to admit the role that cognitive perception plays in any universal model we create.

The problem with this is that, based upon our human experience, our psyche is the dominating force of each and every individual’s reality.  The fact that a table can be said to exist, that it has specific dimensions and a specific molecular structure that can be described precisely, that it can be said to have a specific weight and a specific height, that it can be said to exist in a certain time and space, and that it can be said to obey certain laws of radioactive decay is all well and good, and extraordinarily powerful when it comes to technological advancement (it underpins the Information Age and the scientific evolution of the last few centuries), but it says absolutely nothing about the purpose of the table – why it exists – nor does it say anything about how the experience of perceiving the table fits into our existence as individuals, or how that experience can be shaped to improve our way of life as well as the underlying social fabric of society.  These latter questions remain the fundamental topic of the philosophers of antiquity and are almost completely absent from academic, scientific, and even religious, dialogue today.

Why this table exists, and the role of “purpose” in defining the reality of anything that can be said to exist, is called teleology in philosophic circles.  It was a prime element of Aristotle’s causal model, on top of which his epistemology sat – a model that has been abandoned for the most part in modern scientific parlance, to the point where the standard interpretation of Quantum Mechanics, the one taught in most if not all academic institutions, is that the math is what the math is: it’s a calculation tool, and nothing can or should be said about what it tells us about the reality of the world we live in (the Copenhagen Interpretation).

Arguably worse, there are many who have interpreted the mechanical theory to imply, in as deterministic and materialistic a view as possible, that the stochastic (statistical) nature of the math indicates not that there is a higher-order principle at work, as Bohm proposed with his notion of the implicate order, but that each of the potential outcomes represents a reality in and of itself that actually exists in the real, empirical sense.  This is the multiverse interpretation of Quantum Mechanics that seems to be gaining so much ground in the physics (and popular science) community today – an interpretation that, although it aligns with the underlying mathematics quite nicely, goes against any shred of common sense.

The theory’s original author, Hugh Everett, postulated in his doctoral dissertation that the wave mechanics underlying quantum mechanics reflects the observer being an integral aspect of, and participating agent in, measurement phenomena.  This “Relative State” could be said to exist – but only in the theoretical, transcendental sense, rather than the wavefunction’s potential paths themselves having a real existence independent of observation.  What this Multiverse Interpretation does reflect, however, which is somewhat alarming, is a resounding and unwavering faith in determinism and local realism, even if it defies any sense of what we perceive to be “real” from a human cognitive perspective.  We’ve effectively left the reservation on this one.

But what has been lost in all of this empiricist obsession is that there is another way to look at the world that is not entirely reductionist, and that in this change of perspective a different kind of knowledge becomes available – what Jung called “absolute knowledge” in his treatise on synchronicity, written in collaboration with the physicist Wolfgang Pauli, author of the “exclusion principle”.  This broader definition of knowledge has a long-standing tradition: in the West with the Platonists and Stoics and even some of the pre-Socratics, and in the East in the Taoist, Vedic and Buddhist theo-philosophical schools, which are fundamentally non-reductionist and incorporate the reality, and interconnectedness, of the psychological domain directly into their theo-philosophical systems – their metaphysics as it were.

 

In studying the worldview and mindset of the ancients, specifically the first philosophical systems that were invented in antiquity, one cannot help asking why one is interested in, and what in fact we can learn from, studying these ancient belief systems.  Most certainly one can argue that by understanding our history, not just as a civilization but as a race in toto, we can come to a better understanding of ourselves in the modern, world-integrated society we live in today.  But is there something more than that?

Many scientists and academics today no doubt consider the study of ancient systems of belief interesting in its own right, and illustrative of how life was some three to four thousand years ago when civilization as we know it first took root in the Indus valley region, the Tigris-Euphrates delta, in Northern Africa/Egypt on the banks of the Nile, in the Mediterranean, and then to the Far East in modern China – all, incidentally, at around the same time give or take a few hundred years[2].  But it also appears that many in academia, specifically in the harder scientific fields (Engineering, Physics, etc.), write off much of the knowledge of these ancient peoples as outdated and misinformed – interesting from a mythological or historical perspective, but not very applicable to modern times where the great pillars of modern Science prevail.  In no small measure this camp of Scientific believers, as it were, digs into its blind faith in empirical, substantive reality to counteract the blind faith and irrational beliefs surrounding Religion, the presumed source of much of the world’s conflict in our modern era.

But what if, in looking into the mind of the ancients, in trying to understand and peel back the layers of dust and dogma that rest on top of the ancient texts as they have been discovered in finds across the world, there is some ancient wisdom that has in fact been lost – wisdom that can in fact be “re-discovered” if only the texts, the words, and their context are looked at and understood with the proper lens?  For there is no doubt some truth to the adage that for everything gained there is something lost, and given how much we have indeed gained in the last two thousand years, we must undoubtedly have lost something along the way.

It is this belief – and there is an element of faith here no doubt – that as we have progressed in so many different ways in the modern era, in order to support the massive global society we live in today, we have lost our connection to the world we live in and to its fundamental source and essence; that this connection has been shrouded in layer upon layer of specialized commoditization of goods and services; and that our connection with each other has been lost along with it, wrapped, warped and bastardized from a deep-rooted spiritual connection into a transaction with some sort of mutual benefit in mind.  With the rise of the consumerism and capitalism that now fuel almost the entire civilized world, we have also seen a significant rise in what we have termed “religious extremism”, which at some level is no doubt fueled by an equal and opposite reaction to these very same forces.  While at the very top of these movements there exists a quest for wealth and power consistent with all of the great world conflicts we have seen throughout the ages, at their core there is something else going on.  These movements would not be able to take root if there were not fertile ground for them to yield fruit, and there does exist a level of disenfranchisement, dissatisfaction and even abhorrence with the values and principles upon which the West conducts itself, which helps give rise to the extremist and violent factions that have gained such prominence in the last few decades.  One only has to listen to the dark rhetoric coming from some of the senior officials in Iran, or ISIS, to get a taste of this, and while we might be quick to write off these statements and this behavior as simply “evil” that must be “vanquished”, that type of simplistic thinking is just how we found ourselves in the current geopolitical mess we are in today.

This is not to suggest that there is some utopian ancient society that we should strive to bring back to life in some way, where everyone was happy and “enlightened”, but it does beg the very interesting question of whether, just maybe, there exists a thread and undercurrent of knowledge reflected in the teachings of the great sages of antiquity – the Buddhas, the Laozis, the Christs, the Socrateses – that harkens back to a time when we were connected to the world around us in a more direct and fundamental way, when everything was sacred more or less, and when there existed a much more direct, open and better understood connection with the divine, even if it could not be revealed in a scientific experiment.

 

In studying the wisdom of the ancients to the West (the Greeks and Romans primarily, but to a lesser extent this includes the wisdom of the Sumer-Babylonians, Egyptians, Persians and Arabs as well, which was essentially incorporated into this wisdom tree) we have what might be called a direct line of sight into the progression and advancement of knowledge.  While there are certainly gaps in terms of extant works, what we do have with this tradition is the inheritance of the very language, both written and spoken, of these ancient peoples.  For how we speak today, and how we write, in the West – in the Indo-European dialects and in all of the major alphabets and writing systems in use – is for the most part directly descended from the languages these ancient peoples spoke and wrote in.  This is even true of the language spoken and written in ancient India – Sanskrit.  This whole system of knowledge can be linearly mapped, in a certain way, to how we think, how we understand the world around us, and of course how we communicate with each other.

This is of course facilitated by the tremendous scholarship and intellectual integration work done by countless scholars over the last hundred years or so, who have painstakingly reviewed, dated, and translated ancient works from all of these disparate ancient civilizations – aided again by the advent of mass, worldwide communications and the digitization and wide availability of content like the world has never seen.  What results is a looking glass into the mind of these ancient peoples, these ancient authors really, that is relatively clear.  Questions, doubts and uncertainties no doubt remain in interpreting all of these ancient works, and scholars and academics will debate these nuances for decades to come, but there is a widespread, agreed-upon perspective on what these ancient authors were trying to convey, and on what these ideas and notions “mean” in our modern tongues and intellectual frameworks, that allows us to truly “understand” (for the most part) the essence of many if not all of these ancient works from all of these varying ancient civilizations.

As we go further East, however, we find a natural physical boundary – the Himalayas – that allowed for the almost completely independent development of a civilization that runs in some respects eerily parallel to the development of civilization and thought in the West.  On the one hand this provides for a study of intellectual and societal development in antiquity that can be compared and contrasted directly with its counterparts in the West, a comparative study that no doubt yields a better understanding of each of the intellectual frameworks or worldviews.  But one encounters different challenges when one crosses this great boundary in antiquity: mainly that the tools that were developed and used to communicate – language – although they shared some of the same characteristics as the tools to the West, were nonetheless different enough that trying to map ideas and concepts from our modern tongues back to these ancient systems of belief yields unique challenges that we do not find when studying the ancient civilizations of the West, which are, basically, our direct forefathers.

What we are speaking of here specifically, of course, is the language – specifically the written language – used by the ancient Chinese, and what impact that system of communication has on our ability in the West to understand the minds of the ancient Chinese intellectuals, to map their worldview and metaphysical constructs into proper Western concepts (if possible), to compare these ancient intellectual developments with what occurred to the West, in the Mediterranean primarily, and perhaps to come to a better understanding of the worldview of our Eastern neighbors today.  For their language evolved symbiotically with their culture, and vice versa.

 

[1] See Snow Cone Diaries, “Death of Local Realism” chapter for a review of the two theories and their fundamental limitations.

[2] Sometimes referred to as the “Axial Age”.

Quantum Mechanics: The Death of Local Realism

From Pantheism to Monotheism

Charlie had covered a lot of ground by this point.  He’d started in the age of mythology, at the dawn of civilization, looking at the cultural and socio-political forces that underpinned and supported the local mythology and priesthood classes of the ancient civilizations, and some of the broader themes that crossed different cultures around creation mythology, particularly in the Mediterranean region which drove Western civilization development.  There was most certainly a lot more rich academic and historical (and psychological) material to cover when looking at the mythology of the ancients but Charlie thought he had at least covered a portion of it and hit the major themes – at least showed how this cultural and mythological melting pot led to the predominance of the Abrahamic religions in the West, and the proliferation of Buddhism, Hinduism and Taoism in the East.

As civilizations and empires began to emerge in the West, there was a need, a vacuum if you will, for a theological or even religious binding force to keep these vast empires together.  You saw it initially with the pantheon of Egyptian/Greek/Roman gods that dominated the Mediterranean in the first millennium BCE, gods who were synthesized and merged as the civilizations from which they originated coalesced through trade and warfare.  Also in the first millennium BCE we encountered the first vast empires, first the Persian and then the Greek, both of which not only facilitated trade throughout the region but drove cultural assimilation as well.

In no small measure out of reaction to what were considered dated or ignorant belief systems – belief systems that merely reinforced the ruling class and were not designed to provide real insight and liberation for the individual – emerged the philosophical systems of the Greeks, reflecting a deep-seated dissatisfaction with the religious and mythological systems of the time, and even with the political systems that were very much integrated with these religious structures, to the detriment of society at large from the philosophers’ perspective.  The life and times of Socrates probably best characterize the forces at work during this period, from which emerged the likes of Plato and Aristotle, who guided the development of the Western mind for the next 2500 years, give or take a century.

Jesus’s life in many respects runs parallel to that of Socrates, manifesting and reacting to the same set of forces that Socrates reacted to, except slightly further to the East and within the context of Roman (Jewish) rule rather than Greek rule, but still reflecting the same rebellion against the forces that supported power and authority.  Jesus’s message was diluted however, surviving down to us through translation and interpretation that undoubtedly obscures his true teaching; only the core survives.  The works of Plato and Aristotle do survive down to us though, so we can analyze and digest their complete metaphysical systems that touch on all aspects of thought and intellectual development; the scope of Aristotle’s epistêmai.

In the Common Era (CE), the year of the Lord so to speak (AD), monotheism takes root in the West, coalescing and providing the driving force for the Roman Empire and then the Byzantine Empire that followed it, and then providing the basis of the Islamic Conquests and their subsequent Empire, the Muslims attesting to the same Abrahamic traditions and roots as the Christians and the Jews (of which Jesus was of course one, a fact Christians sometimes forget).  Although monotheism undoubtedly did borrow and integrate from the philosophical traditions that preceded it, mainly to justify and solidify its theological foundations for the intellectually minded, with the advent of the authority of the Church, which “interpreted” the Christian tradition for the good of the masses, you find a trend of suppression of rational or logical thinking that was in any way inconsistent with the Bible, the Word of God, or in any way challenged the power of the Church.  In many respects, with the rise in power and authority of the Church we see an abandonment of the powers of the mind, the intellect, which were held so fast and dear by Plato and Aristotle.  Reason was abandoned for faith as it were, blind faith in God.  The Dark Ages came and went.

 

The Scientific Revolution

Then, another revolution takes place, one that unfolds in Western Europe over centuries and covers first the Renaissance, then the Scientific Revolution and the Age of Enlightenment, where printing and publishing start to make many ancient texts, along with their interpretations and commentary, available to a broader public outside of the monasteries.  This intellectual groundswell provided the spark that ended up burning down blind faith in the Bible, and in the Church that held its literal interpretation so dear.  Educational systems akin to colleges, along with a core curriculum of sorts (scholasticism), start to crop up in Western Europe in the Renaissance and the Age of Enlightenment, providing access to many of the classic texts and rational frameworks to more and more learned men; ideas and thoughts that expanded mankind’s notion of reason and its limits, and its relationship to theology and society, begin to be exchanged via letters and published works in a way that was not possible before.  This era of intellectual growth culminates in the destruction of the geocentric model of the universe, dealing a crucial blow to the foundations of all of the Abrahamic religions and laying the foundation for the predominance of science (natural philosophy) and reason that marked the centuries that followed and underpins Western civilization to this day.

Then came Copernicus, Kepler, Galileo and Newton, with many great thinkers in between of course, alongside the philosophical and metaphysical advancements of the likes of Descartes and Kant, establishing without question empiricism, deduction and the scientific method as the guiding principles upon which knowledge and reality must be based, and providing the philosophical basis for the political revolutions that marked the end of the 18th century in France, England and America.

The geometry and astronomy of the Greeks, as it turned out – Euclid and Ptolemy in particular – provided the mathematical framework within which the advancements of the Scientific Revolution were made.  Ptolemy’s geocentric model was upended no doubt, but his was the model refuted by the new system put forth by Copernicus some fifteen centuries later; it was the reference point.  And Euclid’s geometry was superseded – expanded really – by Descartes’s model, i.e. the Cartesian coordinate system, which provided the basis for analytic geometry and calculus, the mathematical foundations of modern physics that are still with us today.

The twentieth century saw even more rapid developments in science, and in physics in particular, with the expansion of Newtonian physics by Einstein’s Theory of Relativity in the early 20th century, followed close behind by the subsequent advancement of Quantum Theory, which provides the theoretical foundation for the digital world we live in today[1].

But the Scientific Revolution of the 17th, 18th and 19th centuries did not correspond to a complete abandonment of the notion of an anthropomorphic God.  The advancements of this period of Western history provided more of an extension of monotheism – a broader theoretical and metaphysical framework within which God was to be viewed – rendering the holy texts not obsolete per se but relegating them more to the realm of allegory and mythology, and most certainly challenging the literal interpretations of the Bible and Qur’an that had prevailed for centuries.

The twentieth century was different though.  Although you see some scattered references to God (Einstein’s famous quotation “God does not play dice” for example), the split between religion and science was cemented in the twentieth century.  The analytic papers and studies of the period, produced primarily by physicists and scientists, although in some cases having a metaphysical bent or at least some form of metaphysical interpretation (i.e. what do the theories imply about the underlying reality they intend to explain), leave the notion of God out altogether – a marked contrast to the philosophers and scientists of the Scientific Revolution a century or two prior, for whom the notion of God, as perceived by the Church, continued to play a central role, if only in terms of the underlying faith of the authors.

The shift in the twentieth century, however – which can really only be described as radical, even though its implications are only inferred and rarely spoken of directly – is the change of faith from an underlying anthropomorphic entity/deity representing the guiding force of the universe and of mankind in particular, to faith in the idea that the laws of the universe can be discovered, i.e. that they exist eternally, and that these laws themselves are paramount relative to religion or theology, which does not rest on an empirical foundation.  Some Enlightenment philosophers of course would take issue with this claim, but twentieth-century science was about what could be proven experimentally in the physical world, not about what could be the result of reason or logical constructs.

This faith, this transformation of faith from religion toward science as it were, is implicit in all the scientific developments of the twentieth century, particularly in the physics community, where it is fair to say that any statement or position on the role of God in science reflected ignorance – ignorance of the underlying framework of laws that clearly governed the behavior of “things”, things which were real and which could be described in terms of qualities such as mass, energy, momentum, velocity, trajectory, etc.  These constructs were much more sound and real than the fluff of the philosophers and metaphysicians, for whom mind and reason, and in fact perception, were on par with the physical world to at least some extent.  Or were they?

In this century of great scientific advancement – advancement which fundamentally transformed the world within which we live, facilitating the development of nuclear energy, atomic bombs, and digital computer technology, to name but a few of what can only be described as the revolutionary advancements of the twentieth century, and which in turn drove tremendous economic progress and prosperity throughout the modern industrialized world post World War II – it is science, driven at its core by advanced mathematics, that emerges as the underlying truth within which the universe and reality are to be perceived.  Mathematical theories and their associated formulas predicted the data and behavior not only of the forces that prevail on our planet, but also of grand cosmological forces: laws which describe the creation and motion of the universe and galaxies, the motion of the planets and the stars, and the inner workings of planetary and galaxy formation, stars and black holes.

And then, to top things off, in the very same century we find that in the subatomic realm the world is governed by a seemingly very different set of laws, laws which appear fundamentally incompatible with those that govern the “classical world”.  With the discovery of quantum theory and the ability to experimentally verify its predictions, we begin to understand the behavior of the subatomic realm – a fantastic, mysterious and extraordinary (and seemingly random) world which truly defies imagination, a world where the notion of continuous existence itself is called into question.  The ancient Greek philosophers could never have foreseen wave-particle duality; no scientist before the twentieth century could.  The fabric of reality was in fact much more mysterious than anyone could have imagined.

From Charlie’s standpoint, though, something was lost as these advancements and “discoveries” were made.  He believed in progress no doubt – the notion that civilization progresses ever forward and that there was an underlying “evolution” of sorts that had taken place with humanity over the last several thousand years – but he did believe that some social and/or theological intellectual rift had been created in the twentieth century, and that some sort of replacement was needed.  Without religion, the moral and ethical framework of society is left governed only by the rule of law – a powerful force no doubt, and perhaps grounded in an underlying sense of morality and ethics – but the personal foundation of morality and ethics had been crushed with the advent of science from Charlie’s perspective, plunging the world into conflict and materialism despite the economic progress and greater access to resources for mankind at large.  It wasn’t science’s fault per se, but from Charlie’s view it was left to the intellectual community at large to find a replacement for that which had been lost.  There was no longer any self-governing force of “do good to thy neighbor” that permeated society, no fellowship of the common man; what was left to shape our world seemed to be a “what’s in it for me” and a “let’s see what I can get away with” attitude, one that flooded the court systems of the West and fueled radical religious groups and terrorism itself, leading to more warfare and strife rather than the peace and prosperity which was supposed to be the promise of science – wasn’t it?  With the loss of God, his complete removal from the intellectual framework of Western society, there was a break in the knowledge and belief in the interconnectedness of humanity and societies at large, and Quantum Mechanics called this loss of faith in interconnectedness directly into question from Charlie’s perspective.  If everything was connected, entangled, at the subatomic realm, if this was a proven and scientifically verified fact, how could we not take the next logical step and ask what that meant for our worldview?  “That’s a philosophical problem” did not seem to be good enough for Charlie.

Abandonment of religion for something more profound was a good thing no doubt, but what was it that people really believed in nowadays, in the Digital Era?  That things and people were fundamentally separate, that they were operated on by forces that determined their behavior, that the notion of God was for the ignorant and the weak, and that eventually all of the underlying behavior and reality could be described within the context of the same science which discovered Relativity and Quantum Mechanics?  Or worse, that these questions themselves were not of concern, that our main concern is the betterment of ourselves and our individual families, even if that meant those next to us would need to suffer for our gain?  Well, where did that leave us?  Where do ethics and morals fit into a world driven by greed and self-promotion?

To be fair, Charlie did see some movement toward a more refined theological perspective toward the end of the twentieth century and into the 21st, as Yoga started to become more popular and some of the Eastern theo-philosophical traditions such as Tai Chi and Buddhism began to gain a foothold in the West, looked at perhaps as more rational belief systems than the religions of the West, which have been and remain such a source of conflict and disagreement throughout the world.  But the driving force for this adoption of Yoga in the West seemed to be more aligned with materialism and self-gain than with spiritual advancement and enlightenment.  Charlie didn’t see this Eastern perspective permeating broader society; it wasn’t being taught in schools, the next generation – the Digital Generation – would be more materialistic than its predecessors, and theology was relegated to the domain of religion, which in the West wasn’t even fair game to teach in schools anymore.

The gap between science and religion that emerged as a byproduct of the Scientific Revolution remained significant; the last thing you were going to find was scientists messing around with the domain of religion, or even theology for that matter.  Metaphysics maybe, in terms of what the developments of science said about reality, but most certainly not theology and definitely not God.  And so our creation myth was bereft of a creator – the Big Bang had no actors, simply primal nuclear and subatomic forces at work on particles that expanded and formed gases and planets and ultimately led to us: the thinking, rational human mind, capable of contemplating and discovering the laws of the universe and of questioning our place in them, all a byproduct of natural selection, the guiding forces apparently random chance, time, and the genetic encoding of the will to survive as a species.

 

Quantum Physics

Perhaps quantum theory – quantum mechanics – could provide that bridge.  There are some very strange behaviors that have been witnessed and modeled (and proven by experiment) at the quantum scale, principles that defy the notions of space and time that were cemented at the beginning of the twentieth century by Einstein and others.  So Charlie dove into quantum mechanics to see what he could find and where it led.  For if there were gods or heroes in our culture today, they were the Einsteins, Bohrs, Heisenbergs and Hawkings of our time, those that defined our reality and determined what the next generation of minds were taught, those that broke open the mysteries of the universe with their minds and helped us better understand the world we live in.  Or did they?

From Charlie’s standpoint, Relativity Theory could be grasped intellectually by the educated, intelligent mind.  You didn’t need advanced degrees or a deep understanding of complex mathematics to understand that, at a very basic level, Relativity Theory implied that Mass and Energy were equivalent, related by the speed of light, which moves at a fixed rate no matter what your frame of reference; that space and time were not in fact separate and distinct concepts, and that our ideas of three-dimensional Cartesian space were inadequate for describing the world around us at the cosmic scale; that they were correlated concepts, more accurately grouped together in the notion of spacetime, which describes the motion and behavior of everything in the universe more accurately than the theorems devised by Newton, at least.

Relativity says that even gravity’s effects are subject to the same principles at the cosmic scale, i.e. that spacetime “bends” around massive bodies and at points of singularity (black holes for example) – bends to the extent that light itself is affected by the severe gravitational forces at these powerful places in the universe.  And indeed our measurements of time and space are “relative” – relative to the speed and frame of reference from which these measurements are made; the observer is in fact a key element in the process of measurement.  Although Relativity represented a major step in the metaphysical, or even scientific, approach that expanded our notions of how the universe around us could be described, it still left us with a deterministic and realist model of the universe.

But at their basic, core level, these concepts could be understood – grasped as it were – by the vast majority of the public, even if they had very little if any bearing on their daily lives and didn’t fundamentally change or shift their underlying religious or theological beliefs, or even their moral or ethical principles.  Relativity was accepted in the modern age; it just didn’t really affect the subjective frame of reference, the mental or intellectual frame of reference, within which the majority of humanity perceived the world around them.  It was relegated to the realm of physics, a problem for someone else to consider or, at best, a problem which needed to be understood to pass a physics or science exam in high school or college, to be buried in one’s consciousness in lieu of more pressing daily and life pursuits, be they family, career and money, or other forms of self-preservation in the modern, Digital era – an era most notably marked by materialism, self-promotion and greed.

Quantum Theory was different though.  Its laws were more subtle and complex than the world described by classical physics, the world described in painstaking mathematical precision by Newton, Einstein and others.  And after a lot of studying and research, the only conclusion that Charlie could definitively come to was that in order to understand quantum theory, or at least try to come to terms with it, a wholly different perspective on what reality truly is, or at the very least how reality is to be defined, was required.  In other words, in order to understand what quantum theory actually means – in order to grasp the underlying intellectual context within which the behaviors of the underlying particles/fields that quantum theory describes are to be understood – a new framework of understanding, a new description of reality, must be adopted.  What was real, as understood by the classical physics which had dominated the minds of humankind for centuries, needed to be abandoned, or at the very least significantly modified, in order for quantum theory to be comprehended by any mind – or at least any mind that had spent time struggling with quantum theory and trying to grasp it.  Things would never be the same from a physics perspective, this much was clear; whether or not the daily lives of the bulk of those who struggle to survive in the civilized world would evolve in concert with these developments remained to be seen.

Quantum Mechanics, also known as quantum physics or simply Quantum Theory, is the branch of physics that deals with the behavior of particles and matter in the atomic and subatomic realms – the quantum realm, so called given the quantized nature of “things” at this scale (more on this later).  To give some sense of scale: an atom is about 10⁻⁸ cm across, and the nucleus, or center of the atom, which is made up of what we now call protons and neutrons, is approximately 10⁻¹² cm across.  An electron or a photon, the name we give to a “particle” of light, cannot truly be “measured” from a size perspective in terms of classical physics, for many of the reasons we’ll get into below as we explore the boundaries of the quantum world; suffice it to say that at present our best estimates of the size of an electron are in the range of 10⁻¹⁸ cm or so[2].

Whether or not electrons, or photons (particles of light) for that matter, really exist as particles whose physical size and/or momentum can actually be “measured” is not as straightforward a question as it might appear, and it gets at some level to the heart of the problem we encounter when we attempt to apply the principles of “existence” or “reality” to the subatomic, or quantum, realm within the semantic and intellectual framework established by classical physics as it has evolved over the last three hundred years or so – namely, reality as defined by independently existing, deterministic and quantifiable measurements of size, location, momentum, mass or velocity.

The word quantum comes from the Latin quantus, meaning “how much”, and it is used in this context to identify the behavior of subatomic things that move from and between discrete states, rather than along a continuum of values or states as is presumed in classical physics.  The term had taken on meanings in several contexts within a broad range of scientific disciplines in the 19th and early 20th centuries, but it was formalized and refined as a specific field of study – “quantum mechanics” – beginning with the work of Max Planck at the turn of the 20th century, and it represents the prevailing and distinguishing characteristic of reality at this scale.

Newtonian physics, even as extended by Einstein with Relativity Theory at the beginning of the twentieth century (a theory whose accuracy is well established via experimentation at this point), assumes that particles – things made up of mass, energy and momentum – exist independent of the observer and their instruments of observation; that they exist in continuous form, moving along specific trajectories; and that their properties (mass, velocity, etc.) can only be changed by the action of some force upon them.  This is the essence of the Newtonian mechanics upon which the majority of modern-day physics – or at least the laws of physics that affect us at the human scale – is defined, and it falls philosophically into the realm of realism and determinism.

The only caveat to this view put forth by Einstein is that these measurements themselves – of speed, or even of the mass or energy content of a specific object – can only be said to be universally defined according to these physical laws within the specific frame of reference of an observer.  Their underlying reality is not questioned – these things clearly exist independent of observation or measurement, clearly (or so it seems) – but the values, the properties, of these things are relative to the frame of reference of the observer.  This is what Relativity tells us.  So the velocity of a massive body, and even the measurement of time itself, which is a function of distance and speed, is a function of the relative speed and position of the observer performing the measurement.  For the most part, the effects of Relativity can be ignored when we are referring to objects on Earth that are moving at speeds minimal with respect to the speed of light and are far less massive than, say, black holes.  But as we measure things at the cosmic scale, where distances are measured in light years and where black holes and other massive phenomena bend spacetime, aka singularities, the effects of Relativity cannot be ignored.[3]
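
To give a concrete sense of just how negligible these effects are at everyday speeds, here is a minimal sketch in Python – the speeds are illustrative choices of ours, not values from the text – computing the Lorentz time-dilation factor γ = 1/√(1 − v²/c²):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_factor(v: float) -> float:
    """Time-dilation factor gamma = 1 / sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Illustrative (assumed) speeds: everyday motion vs. a near-light-speed particle.
for label, v in [("airliner (250 m/s)", 250.0),
                 ("satellite (7,700 m/s)", 7_700.0),
                 ("particle at 0.99c", 0.99 * C)]:
    print(f"{label}: gamma = {lorentz_factor(v):.12f}")
```

For the airliner and the satellite, γ differs from 1 only in roughly the tenth decimal place or beyond, which is why Newtonian mechanics serves perfectly well at the human scale; only the particle moving at 0.99c shows a dramatic factor (γ ≈ 7).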

Leaving aside the field of cosmology for the moment and getting back to the history of the development of quantum mechanics (which arguably is integrally related to cosmology at a basic level): at the end of the 19th century Planck was commissioned by electric companies to create light bulbs that used less energy, and in this context he was trying to understand how the intensity of the electromagnetic radiation emitted by a black body (an object that absorbs all electromagnetic radiation regardless of frequency or angle of incidence) depended on the frequency of the radiation, i.e. the color of the light.  In his work, and after several iterations of hypotheses that failed to have predictive value, he fell upon the theory that energy is only absorbed or released in quantized form, i.e. in discrete packets of energy he referred to as “bundles” or “energy elements” – the so-called Planck postulate.  And so the field of quantum mechanics was born.[4]
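
In modern notation the Planck postulate is usually written as follows (the standard textbook form, given here for illustration rather than as a quotation from Planck’s own papers):

```latex
E = n h \nu, \qquad n = 1, 2, 3, \ldots
```

where ν is the frequency of the radiation and h ≈ 6.626 × 10⁻³⁴ J·s is what we now call Planck’s constant: energy is exchanged only in whole multiples of the quantum hν, never in arbitrary fractions of it.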

Despite the fact that Einstein is best known for his mathematical models and theories for the description of the forces of gravity and light at a cosmic scale, i.e. Relativity, his work was also instrumental in the advancement of quantum mechanics as well.   For example, in his work in the effect of radiation on metallic matter and non-metallic solids and liquids, he discovered that electrons are emitted from matter as a consequence of their absorption of energy from electromagnetic radiation of a very short wavelength, such as visible or ultraviolet radiation.  Einstein established that in certain experiments light appeared to behave like a stream of tiny particles that he called photons, not just a wave, lending more credence and authority to the particle theories describing of quantum realm.  He therefore hypothesized the existence of light quanta, or photons, as a result of these experiments, laying the groundwork for subsequent wave-particle duality discoveries and reinforcing the discoveries of Planck with respect to black body radiation and its quantized behavior.[5]
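
This is the photoelectric effect, and its energy balance is commonly summarized as follows (again in standard textbook notation, as an illustration):

```latex
E_{\mathrm{kinetic}} = h\nu - \phi
```

where hν is the energy of the absorbed photon and φ, the “work function”, is the minimum energy needed to liberate an electron from the material.  Below the threshold frequency ν = φ/h no electrons are emitted no matter how intense the light – behavior that a pure wave picture of light could not explain.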

 

Wave-Particle Duality and Wavefunction Collapse

Prior to the establishment of light’s wave-like properties, and then in turn the establishment of the wave-like characteristics of subatomic elements like photons and electrons by Louis de Broglie in the 1920s, it had been fairly well established that these subatomic particles – electrons and photons as they were later called – behaved like particles.  The debate over the nature of light and subatomic matter, however, went all the way back to the 17th century, where competing theories of the nature of light were proposed by Isaac Newton, who viewed light as a system of particles, and Christiaan Huygens, who postulated that light behaved like a wave.  It was not until the work of Einstein, Planck, de Broglie and other physicists of the twentieth century that these subatomic particles, both light and electrons, were shown to behave both like particles and like waves, the result depending upon the experiment and the context of the system being observed.  This paradoxical principle, known as wave-particle duality, is one of the cornerstones, and underlying mysteries, of the reality described by Quantum Theory.

As part of the discovery of subatomic wave-like behavior, what Planck found in his study of black body radiation (and Einstein as well, within the context of his study of light and photons) was that the measurements or states of a given particle such as a photon or an electron had to take on values that were multiples of very small and discrete quantities, i.e. non-continuous, the relation of which is represented by a constant value known as the Planck constant[6].

In the quantum realm, then, there was not a continuum of values and states of matter as had been assumed in physics up until that time; there were bursts of energy and changes of state that were ultimately discrete, and yet fixed, where certain states and certain values could in fact not exist – a dramatic departure from the way most of us think about movement and change in the “real world”, and most certainly a significant departure from the Newtonian mechanics upon which Relativity was based.[7]

The classic demonstration of light’s behavior as a wave, and perhaps one of the most astonishing experiments of all time, is what is called the double-slit experiment[8].  In the basic version of this experiment, a light source such as a laser beam is shone at a thin plate that is pierced by two parallel slits.  The light passes through the slits and displays on a screen behind the plate.  The image displayed on the screen is not the pair of constant bands of light you might expect if the light were simply a particle or set of particles passing through each slit; rather, it is a pattern of alternating light and dark bands, indicating that the light is behaving like a wave and is subject to interference, the strength of the light on the screen canceling itself out or becoming stronger depending upon how the individual waves interfere with each other.  This is exactly akin to fundamental wave-like behavior elsewhere in nature – like waves in water, which have greater strength where they synchronize (peaks meeting peaks) and cancel each other out where they do not (peaks meeting troughs).

What is even more interesting, and was most certainly unexpected, is what happened once equipment was developed that could reliably send a single particle (electron or photon, the behavior is the same) through the double-slitted slate at a time.  Each particle did end up at a single location on the screen after passing through the slate, as expected, but the location on the screen, as well as which slit the particle appeared to pass through (in later versions of the experiment this could in fact be detected), seemed to be random.  And as more and more of these subatomic particles were sent through the slate one at a time, researchers found that the same wave-like interference pattern emerged that had shown up when the experiment was run with a full beam of light, as had been done by Young some hundred years prior.

So hold on for a second.  Charlie had gone over this again and again, and all the literature he read on quantum theory and quantum mechanics said pretty much the same thing: namely, that the heart of the mystery of quantum mechanics could be seen in this very simple experiment.  And yet it was really hard, perhaps impossible, to understand what was actually going on – or at least to understand without abandoning some of the very foundational principles of physics, like, for example, the premise that these things called subatomic particles actually existed, because they seemed to behave like waves.  Or did they?

What was clear was that this subatomic particle, corpuscle, or whatever you wanted to call it, did not have a linear and fully deterministic trajectory in the classical physics sense; this much was very clear, given that the distribution against the screen appeared to be random when the particles were sent through the double slits individually.  But what was odder still was that when the experiment was run one corpuscle at a time, not only was each final location on the screen seemingly random, but the same pattern emerged after many, many single-particle runs as when a full wave, or set of these corpuscles, was sent through the double slits.  So not only did the individual photon seem to be aware of the final wave-like pattern of its parent wave, but the corpuscle appeared to be interfering with itself as it went through the two slits individually.  What?  What the heck was going on here?

Furthermore, to make things even more mysterious, as the final locations of the individual photons in the two-slit and other related experiments were evaluated and studied, it was discovered that although the final location of an individual particle could not be determined exactly before the experiment was performed – there was a fundamental element of uncertainty or randomness involved at the individual corpuscle level that could not be escaped – the final locations of these particles measured in toto, after many experiments were performed, exhibited statistical behavior that could be modeled quite precisely from a mathematical statistics and probability distribution perspective.  That is to say, the total distribution of the final locations of all the particles after passing through the slit(s) could be established stochastically, i.e. in terms of a well-defined probability distribution consistent with probability theory and the well-defined mathematics that governs statistical behavior.  So in total you could predict, in some sense, what the behavior would look like over a large distribution set, even if you couldn’t predict what the outcome would be for an individual corpuscle.

The mathematics behind this particle distribution is what is known as the wave function, typically denoted by the Greek letter psi, ψ, or its capital equivalent Ψ, which predicts what the probability distribution of these “particles” will look like on the screen behind the slate over a given period of time after many individual experiments are run – or, in quantum theoretical terms, the wavefunction predicts the quantum state of a particle throughout a fixed spacetime interval.  The very foundational and groundbreaking equation governing this wavefunction was discovered by the Austrian physicist Erwin Schrödinger in 1925 and published in 1926, and it is commonly referred to in the scientific literature as the Schrödinger equation, analogous in the field of quantum mechanics to Newton’s second law of motion in classical physics.
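
In its standard time-dependent form (the usual textbook presentation, given here for illustration), the Schrödinger equation, together with the rule connecting the wavefunction to observed probabilities, reads:

```latex
i\hbar \frac{\partial}{\partial t}\Psi(x,t) = \hat{H}\,\Psi(x,t),
\qquad
P(x,t) = |\Psi(x,t)|^{2}
```

where Ĥ is the Hamiltonian (total energy) operator of the system and |Ψ|² gives the probability density of finding the particle at position x at time t – the statistical interpretation associated with Max Born, who is quoted at the end of this section.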

With the discovery of the wave function, or wavefunction, it became possible to predict the potential locations or states of motion of these subatomic particles – an extremely potent theoretical model that has led to all sorts of inventions and technological advancements in the twentieth century and beyond.  The wavefunction represents a probability distribution of the potential states or outcomes that describe the quantum state of a particle, and it predicts with a great degree of accuracy the potential location of a particle given a prior location or state of motion.

Again, this implied that individual corpuscles were interfering with themselves when passing through the two slits on the slate, which was very odd indeed.  In other words, the individual particles were exhibiting wave-like characteristics even when they were sent through the double-slitted slate one at a time.  This phenomenon was shown to occur with atoms as well as electrons and photons, confirming that all of these subatomic so-called particles exhibit wave-like properties alongside their particle-like qualities, the behavior observed being determined by the type of experiment, or measurement as it were, that the “thing” is subjected to.

Louis de Broglie, the physicist responsible for bridging the theoretical gap between the study of corpuscles (particles, matter or atoms) and waves by establishing the symmetric relation between momentum and wavelength – a relation which had at its core Planck’s constant, i.e. the de Broglie equation – described this mysterious and somewhat counterintuitive relationship between wave-like and particle-like behavior as follows:

A wave must be associated with each corpuscle and only the study of the wave’s propagation will yield information to us on the successive positions of the corpuscle in space[9].
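
The relation itself is compact (stated here in its usual modern form):

```latex
\lambda = \frac{h}{p}
```

every corpuscle with momentum p has an associated wavelength λ, with Planck’s constant h setting the scale – which is why the wave-like character of matter only becomes apparent for things as small as electrons and atoms.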

So by the 1920s, then, there was a fairly well established mathematical theory governing the behavior of subatomic particles, backed by a large body of empirical and experimental evidence, indicating quite clearly that what we would call “matter” (or particles or corpuscles) in the classical sense behaves very differently, or at least has very different fundamental characteristics, in the subatomic realm.  It exhibits the properties of a particle – a thing or object – as well as those of a wave, depending upon the type of experiment that is run.  So the concept of matter itself, as we had been accustomed to dealing with and discussing and measuring it for centuries, at least as far back as the time of Newton (1642-1727), had to be reexamined within the context of quantum mechanics.  For in Newtonian physics, and indeed in the geometric and mathematical framework within which it was developed and conceived, which went back to ancient times (Euclid, c. 300 BCE), matter was presumed to be either a particle or a wave, but most certainly not both.

What complicated matters even further was that matter itself – again as defined by Newtonian mechanics and its extension via Relativity Theory, together commonly referred to as classical physics – was presumed to have some very definite, well-defined, fixed and real properties.  Properties like mass, location or position in space, and velocity or trajectory were all presumed to have a real existence independent of whether or not they were measured or observed, even if the actual values were relative to the frame of reference of the observer.  All of this hinged upon the notion that the speed of light was fixed no matter what the frame of reference of the observer; this was a fixed absolute – nothing could move faster than the speed of light.  Well, even this seemingly self-evident notion, or postulate one might call it, ran into problems as scientists continued to explore the quantum realm.

So by the 1920s the way scientists viewed matter as classically conceived – within the context of Newton’s postulates, extended further into the notion of spacetime put forth by Einstein – was encountering significant difficulties when applied to the behavior of elements in the subatomic, quantum world.  Furthermore, there was extensive empirical and scientific evidence lending significant credibility to quantum theory, illustrating irrefutably that these subatomic elements behaved not only like waves, exhibiting characteristics such as interference and diffraction, but also like particles in the classic Newtonian sense, with measurable, well-defined characteristics that could be quantified within the context of an experiment.

In his Nobel Lecture in 1929, Louis de Broglie summed up the challenge for physicists of his day, and to a large extent physicists of modern times, given the discoveries of quantum mechanics, as follows:

The necessity of assuming for light two contradictory theories – that of waves and that of corpuscles – and the inability to understand why, among the infinity of motions which an electron ought to be able to have in the atom according to classical concepts, only certain ones were possible: such were the enigmas confronting physicists at the time…[10]

 

Uncertainty, Entanglement, and the Cat in a Box

The other major tenet of quantum theory that rests alongside wave-particle duality, and that adds even more complexity when trying to wrap our minds around what is actually going on in the subatomic realm, is what is sometimes referred to as the uncertainty principle, or the Heisenberg uncertainty principle, named after the German theoretical physicist Werner Heisenberg.  It was Heisenberg who first put forth the theories and models representing the probability distribution of outcomes of the position of these subatomic particles in certain experiments like the double-slit experiment previously described, even though the wave function itself was the discovery of Schrödinger.

The uncertainty principle states that there is a fundamental limit on the accuracy with which certain pairs of physical properties of atomic particles, position and momentum being the classic example, can be known at any given time with certainty.  In other words, physical quantities come in conjugate pairs, only one member of which can be known precisely at any given time: when one quantity in a conjugate pair is measured and becomes determined, its complement becomes indeterminate.  What Heisenberg discovered, and proved, was that the more precisely one attempts to measure one of these complementary properties of a subatomic particle, the less precisely the other associated complementary attribute can be determined or known.
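In its standard quantitative form for the position/momentum pair, a formulation usually credited to Kennard rather than to Heisenberg’s original 1927 paper, the principle bounds the product of the two uncertainties from below by the reduced Planck constant:

\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}

Sharpen the position distribution and the momentum distribution necessarily broadens, and vice versa; no state of the system evades the bound.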

Published by Heisenberg in 1927, the uncertainty principle asserts that there are fundamental, conceptual limits of observation in the quantum realm, another radical departure from the principles of Newtonian mechanics which held that all attributes of a thing were measurable at any given time, i.e. existed or were real.  The uncertainty principle is a statement about a fundamental property of quantum systems as they are mathematically and theoretically modeled and defined, and of course empirically validated by experimental results, not a statement about the technology and method of the observational systems themselves.  This is an important point.  This wasn’t a problem with the state of the instrumentation being used for measurement; it was a characteristic of the domain itself.

Max Born, who won the Nobel Prize in Physics in 1954 for his work in quantum mechanics, specifically for his statistical interpretation of the wave function, describes this other seemingly mysterious attribute of the quantum realm as follows (the specific language he uses reveals at some level his interpretation of quantum theory, more on interpretations later):

…To measure space coordinates and instants of time, rigid measuring rods and clocks are required.  On the other hand, to measure momenta and energies, devices are necessary with movable parts to absorb the impact of the test object and to indicate the size of its momentum.  Paying regard to the fact that quantum mechanics is competent for dealing with the interaction of object and apparatus, it is seen that no arrangement is possible that will fulfill both requirements simultaneously.[11]

Whereas classical physicists, i.e. physics prior to the introduction of relativity and quantum theory, distinguished between the study of particles and the study of waves, the introduction of quantum theory and wave-particle duality established that this classic intellectual bifurcation of physics at the macroscopic scale was wholly inadequate in describing and predicting the behavior of these “things” that existed in the subatomic realm, all of which took on the characteristics of both waves and particles depending upon the experiment and the context of the system being observed.  Furthermore, the precision with which the state of a “thing” in the subatomic world could be defined was conceptually limited, another divergence from classical physics.  And then on top of this came the requirement of the mathematical principles of statistics and probability theory, as well as significant extensions to the underlying geometry, to describe and model behavior at this scale, all calling into question the classical materialistic notions and beliefs that we had held so dear for centuries.

Even after the continued refinement of, and accumulation of experimental evidence supporting, Quantum Theory however, there did arise some significant resistance to the completeness of the theory itself, or at least questions as to its true implications with respect to Relativity and Newtonian mechanics.  The most notable of these criticisms came from Einstein himself, most famously encapsulated in a paper he co-authored with two of his colleagues, Boris Podolsky and Nathan Rosen, published in 1935, which came to be known simply as the EPR paper, or the EPR paradox, and which called attention to what they saw as the underlying inconsistencies of the theory that still required explanation.  In this paper they extended some of the quantum theoretical models to different thought experiments/scenarios to yield what they considered to be at the very least improbable, if not impossible, conclusions.

They postulated that given the formulas and mathematical models that described the current state of quantum theory, i.e. the description of a wave function that encoded the probabilistic outcomes for a given subatomic system, if such a system were transformed into two systems – split apart if you will – then by definition both systems would be governed by the same wave function, and their subsequent behavior and states would be related, no matter what their separation in spacetime, violating one of the core tenets of classical physics, namely that no communication can travel faster than the speed of light.  This was held to be mathematically true and consistent with quantum theory, although at the time it could not be validated via experiment.

They went on to show that if this is true, it implies that if you have a single particle system that is split into two separate particles and subsequently measured, these two now separate and distinct particles would be governed by the same wave function, and in turn by the same uncertainty principles outlined by Heisenberg; namely that a defined measurement of a particle in system A will cause its conjugate value in system B to become indeterminate, i.e. “correlated”, even if the two systems had no classical physical contact with each other and were light years apart.

But hold on a second, how could this be possible?  How could you have two separate “systems”, governed by the same wave function, or behavioral equation so to speak, such that no matter how far apart they were, and no matter how much time elapsed between measurements, a measurement in one system fundamentally correlated with (or uncorrelated with, the argument is the same) a measurement in the other system from which it is separate?  They basically took the wave function theory, which governs the behavior of quantized particles, and its corresponding implication of uncertainty as outlined most notably by Heisenberg, and extended it to multiple, associated and related subatomic systems, related and governed by the same wave function despite their separation in space (and time), yielding a very awkward and somewhat unexplainable result, at least unexplainable in terms of classical physics.

The question they raised boiled down to this: how could you have two unrelated, distant systems whose measurements or underlying structure depended upon each other in a very well-defined, mathematically and (theoretically at the time, but subsequently verified via experiment) empirically measurable way?  Does that imply that these systems are communicating in some way, either explicitly or implicitly?  If so, that would seem to call into question the principle of the fixed speed of light that was core to Relativity Theory.  The alternative seemed to be that the theory was incomplete in some way, which was Einstein’s view.  Were there “hidden”, yet to be discovered variables that governed the behavior of quantum systems, what came to be known in the literature as hidden variable theories?

If it were true, and in the past half century or so many experiments have verified this, it is at the very least extremely odd behavior, or perhaps better put reflects very odd characteristics, certainly inconsistent with prevailing theories of physics, or at least with the characteristics we had grown accustomed to expect in our descriptions of “reality”.  Are these two subsystems, once correlated, communicating with each other?  Is there some information being passed between them that violates the speed of light boundary that forms the cornerstone of modern, classical physics?  This seems unlikely, and it is most certainly something that Einstein felt uncomfortable with.  This “spooky action at a distance”, as Einstein referred to it, seemed literally to defy the laws of physics.  But the alternative appeared to be that this notion of what we consider to be “real”, at least as it was classically defined, would need to be modified in some way to take into account this correlated behavior between particles or systems that were physically separated beyond classical boundaries.

From Einstein’s perspective, two possible explanations for this behavior were put forth: 1) there existed some model of behavior of the interacting systems/particles that was still yet undiscovered, the so-called hidden variables, or 2) the notion of locality, or perhaps more aptly put the tenet of local determinism (which Einstein and others associated directly and unequivocally with reality), which underpinned all of classical physics, had to be drastically modified if not completely abandoned.

In Einstein’s words however, the language he seemed to prefer for the first alternative was not that there were hidden variables per se, but rather that quantum theory as it stood in the first half of the twentieth century was incomplete.  That is to say that some variable, coefficient or hidden force was missing from quantum theory, a missing piece that was the driving force behind the correlated behavior of the attributes of these physically separate particles, particles separated beyond any classical means of communication.  For Einstein it was this incompleteness option that he preferred, unwilling as he was to consider the idea that the notion of locality was not absolute.  Ironically enough, hindsight being twenty-twenty and all, Einstein had just postulated with Relativity Theory that there was no such thing as absolute truth, or absolute reality, on the macroscopic and cosmic physical plane, so one might think that he would have been more open to relaxing this requirement in the quantum realm, but apparently not, speaking to the complexities and subtleties of the implications of quantum theory even for some of the greatest minds of the time.

Probably the most widely known metaphor illustrating Einstein’s and others’ criticism of quantum theory is the thought experiment, or paradox as it is sometimes referred to, called Schrödinger’s cat, or the Schrödinger’s cat paradox.[12]  In this thought experiment, which according to tradition emerged out of discussions between Schrödinger and Einstein just after the EPR paper was published, a cat is placed in a fully sealed and enclosed box with a radioactive source subject to a certain measurable and quantifiable rate of decay, a rate that is presumably slower than the lifetime of a cat.  In the box with the cat is an internal monitor which detects any radioactive decay in the box (the number of decayed particles being zero or one), and a flask of poison that is released if the monitor is triggered.  According to quantum theory, which governs the rate of decay with some random probability distribution over time, it is impossible to say at any given moment, until the box is opened in fact, whether the cat is dead or alive.  But how could this be?  The cat is in an undefined state until the box is opened?  There is nothing definitive that we can say about the state of the cat independent of actually opening the box?  This calls into question, bringing the analogy to the macroscopic level, whether or not according to quantum theory reality can be defined independent of observation (or measurement), within the context of the cat, the box, and the radioactive particle and its associated monitor.
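In modern notation the paradox is usually rendered schematically as a superposition of the two correlated outcomes; the expression below is the standard textbook sketch rather than Schrödinger’s own formulation, written for the moment at which the probability that a decay has occurred is exactly one half:

|\Psi\rangle = \frac{1}{\sqrt{2}} \big( |\text{no decay}\rangle\,|\text{cat alive}\rangle + |\text{decay}\rangle\,|\text{cat dead}\rangle \big)

Taken literally, the formalism assigns the cat no definite state of its own prior to observation, which is precisely the absurdity Schrödinger meant to dramatize.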

In the course of developing this thought experiment, Schrödinger coined the term entanglement[13], one of the great still unsolved mysteries, or perhaps better called paradoxes, that exist to this day in quantum theory/mechanics.  Mysterious not in the sense of whether or not the principle actually exists, for entanglement has been verified in a variety of physical experiments along the lines outlined in the EPR paper and illustrated in the cat paradox, and is accepted as a scientific fact in the physics community, but a mystery in the sense of how it can be possible given that it seems, at least on the face of it, to fly in the face of classical Newtonian mechanics, almost of determinism itself actually.  Schrödinger himself is probably the best person to turn to in order to understand quantum entanglement, and he describes it as follows:

When two systems, of which we know the states by their respective representatives, enter into temporary physical interaction due to known forces between them, and when after a time of mutual influence the systems separate again, then they can no longer be described in the same way as before, viz. by endowing each of them with a representative of its own. I would not call that one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought. By the interaction the two representatives [the quantum states] have become entangled.[14]

The principle of entanglement calls into question what is known as local realism; “local” in the sense that all the behaviors and datum of a given system are determined by the qualities or attributes of those objects within that given system bounded by spacetime as defined by Newtonian mechanics and Relativity, or by some force that is acting upon said system, and “real” in the sense that the system itself exists independent of observation or the apparatus/elements of observation.
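The canonical example, due in this spin-based form to Bohm rather than to the original EPR paper, is the singlet state of two spin-1/2 particles, rendered here in the standard textbook sketch:

|\Psi^{-}\rangle = \frac{1}{\sqrt{2}} \big( |\uparrow\rangle_{A}\,|\downarrow\rangle_{B} - |\downarrow\rangle_{A}\,|\uparrow\rangle_{B} \big)

Neither particle has a definite spin of its own, and yet a spin measurement on A along any chosen axis fixes with certainty the opposite outcome for the same measurement on B, however far apart the two particles may be.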

Taking the non-local explanation to the extreme, and something which has prompted quite a bit of what can reasonably be called hysterical reaction in some academic and pseudo-academic communities even to this day, is the observation that if a pair of entities separated far enough in spacetime that the speed of light boundary could not be crossed do indeed hold a distinct and mathematically predictable correlation, i.e. this notion of entanglement, or “action at a distance” as it is sometimes called, then all of classical physics is called into question.  Einstein specifically called out these “spooky action at a distance” theories as defunct, so firmly did he believe in the invariable tenets of Relativity, and it’s hard to argue with his position quite frankly, because correlation does not necessarily imply communication.  But if local realism and its underlying tenets of determinism are to be held fast to, then where does that leave quantum theory?

This problem became somewhat more crystallized, or well defined, in 1964 when the physicist John Stewart Bell (1928-1990), in his seminal paper entitled “On the Einstein Podolsky Rosen Paradox”, took the EPR argument one step further and proved mathematically, via a reductio ad absurdum argument, that if quantum theory is true, then no hidden parameter or variable theory could possibly exist that reproduces all of the predictions of quantum mechanics and is also consistent with locality[15].  In other words, Bell asserted that the hidden variable hypothesis, or at the very least a broad category of hidden variable hypotheses, was incompatible with quantum theory itself, unless the notion of locality was abandoned or at least relaxed to some extent.  In his own words:

In a theory in which parameters are added to quantum mechanics to determine the results of individual measurements, without changing the statistical predictions, there must be a mechanism whereby the setting of one measuring device can influence the reading of another instrument, however remote. Moreover, the signal involved must propagate instantaneously, so that a theory could not be Lorentz invariant.[16]

This assertion is called Bell’s theorem and it posits that quantum mechanics and the concept of locality, which again states that an object is influenced directly only by its immediate surroundings and is a cornerstone of the theories of Newton and Einstein regarding the behavior of matter and the objective world, are mathematically incompatible and inconsistent with each other, providing further impetus, as it were, to the view that this classical notion of locality was in need of closer inspection, modification, or perhaps even outright abandonment.
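The form of Bell’s result most often quoted today is the later CHSH inequality (Clauser, Horne, Shimony and Holt, 1969), and the sketch below follows that standard formulation rather than the algebra of Bell’s 1964 paper.  If E(a,b) denotes the correlation between measurement outcomes at detector settings a and b, then any local hidden variable theory must satisfy:

|E(a,b) - E(a,b')| + |E(a',b) + E(a',b')| \;\leq\; 2

Quantum mechanics predicts that suitably entangled states push this quantity as high as 2\sqrt{2}, and it is this measurable violation that later experiments, such as Aspect’s, set out to test.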

Although there still exists some debate among physicists as to whether or not there is enough experimental evidence to prove out Bell’s theorem beyond a shadow of a doubt, it seems to be broadly accepted in the scientific community that this property of entanglement exists beyond classical physical boundaries.  However, the question as to whether or not all types of hidden variable theories are ruled out by Bell’s theorem appears to be a legitimate one and is still up for debate, and perhaps it is this loophole more than any other that provides the path which Bohm and Hiley take with their Causal, or Ontological, Interpretation of Quantum Theory (more below).

Criticisms of Bell’s theorem and the related experiments aside however, if you believe quantum theory, and you’d be hard pressed not to at this point, you must conclude that the theory is inconsistent with Relativity in some way, a rather disconcerting and problematic conclusion for the twentieth century physicist to say the least, and a problem which plagues, and motivates, many modern theoretical physicists to this day.

Quantum Theory then, as expressed in Bell’s theorem, Heisenberg’s uncertainty principle and this idea of entanglement, asserts that there exists a level of interconnectedness between physically disparate systems that defies, at least at some level, the classical physics notion of deterministic locality, pointing either to the incompleteness of quantum theory or to the requirement of some sort of non-trivial modification of the concept of local realism which has underpinned classical physics for the last few centuries if not longer.

In other words, the implication of quantum theory, a theory that has very strong predictive and experimental evidence backing up the soundness and strength of the underlying mathematics, is that something else is at work that connects the state of particles or things at the subatomic scale, something that cannot be altogether described, pinpointed, or explained.  Einstein himself still struggled with this notion toward the end of his life, writing in 1948:

…The following idea characterizes the relative independence of objects far apart in space, A and B: external influence on A has no direct influence on B; this is known as the Principle of Local Action, which is used consistently only in field theory. If this axiom were to be completely abolished, the idea of the existence of quasi enclosed systems, and thereby the postulation of laws which can be checked empirically in the accepted sense, would become impossible….[17]

 

Interpretations of Quantum Theory: Back to First Philosophy

There is no question as to the soundness of the mathematics behind quantum theory and there is now a very large body of experimental evidence that supports the underlying mathematics, including empirical evidence of not only the particle behavior that it intends to describe (as in the two slit experiment for example), but also experimental evidence that validates Bell’s theorem and the EPR Paradox.  What is somewhat less clear however, and what arguably may belong more to the world of metaphysics and philosophy than to physics, is how quantum theory is to be interpreted as a representation of reality given the state of affairs that it introduces.  What does quantum theory tell us about the world we live in, irrespective of the soundness of its predictive power?  This is a question that physicists, philosophers and even theologians have struggled with ever since the theory gained wide acceptance and prominence in the scientific community in the 1930s.

There are many interpretations of quantum theory but there are three in particular that Charlie thought deserved attention due primarily to a) their prevalence or acceptance in the academic community, and/or b) their impact on scientific or philosophical inquiry into the limits of quantum theory.

These are, first, the standard, orthodox interpretation of quantum theory, the one against which differing interpretations are most often compared, most commonly referred to as the Copenhagen Interpretation, which confines the theoretical boundaries of interpretation of the theory to the experiment itself; second, the Many-Worlds (or Many-Minds) interpretation, which explores the boundaries of the nature of reality, proposing in some extreme variants the simultaneous existence of multiple universes/realities; and third, the Causal Interpretation, also sometimes called de Broglie-Bohm theory or Bohmian mechanics, which extends the theory to include the notion of quantum potential and at the same time abandons the classical notion of locality while still preserving objective realism and determinism.[18]

The most well established and most commonly accepted interpretation of Quantum Theory, the one that is most often taught in schools and textbooks and the one against which most alternative interpretations are compared, is the Copenhagen Interpretation[19].  The Copenhagen interpretation holds that the theories of quantum mechanics do not yield a description of an objective reality, but deal only with sets of probabilistic outcomes of experimental values borne from experiments observing or measuring various aspects of energy quanta, entities that do not fit neatly into classical interpretations of mechanics.  The underlying tenet here is that the act of measurement itself, the observer (or by extension the apparatus of observation), causes the set of probabilistic outcomes to converge on a single outcome, a feature of quantum mechanics commonly referred to as wavefunction collapse, and that any additional interpretation of what might actually be going on, i.e. the underlying reality, defies explanation, the very attempt at such an interpretation being inconsistent with the fundamental mathematical tenets of the theory itself.

In this interpretation of quantum theory, reality (used here in the classical sense of the term, as existing independent of the observer) is a function of the experiment: it is defined as a result of the act of observation and has no meaning independent of measurement.  In other words, reality in the quantum world from this point of view does not exist independent of observation, or put somewhat differently, the manifestation of what we think of or define as “real” is intrinsically tied to the act of observation of the system itself.
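The probabilistic core of this view is usually expressed via the Born rule; the expression below is a sketch of the standard textbook statement rather than anything specific to the Copenhagen school’s own writings.  If a system is in state |\psi\rangle and a measurement has possible outcomes a_i with associated states |a_i\rangle, then:

P(a_i) = |\langle a_i | \psi \rangle|^{2}, \qquad |\psi\rangle \longrightarrow |a_i\rangle \;\text{ upon obtaining outcome } a_i

The second expression is the wavefunction collapse referred to above: before measurement the theory offers only a distribution over possible outcomes, and upon measurement the state is simply replaced by the outcome obtained.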

Niels Bohr was one of the strongest proponents of this interpretation, an interpretation which refuses to associate any metaphysical implications with the underlying physics.  He held that given this proven interdependence between that which is observed and the act of observation, no metaphysical interpretation can in fact be extrapolated from the theory; it is and can only be a tool to describe and measure states and particle/wave behavior in the subatomic realm as they result from some well-defined experiment, i.e. attempting to make some determination as to what quantum theory actually meant violated the fundamental tenets of the theory itself.  From Bohr’s perspective, the inability to draw conclusions beyond the results of the experiments which the theory covers was a necessary consequence of the theory’s basic tenets, and that was the end of the matter.  This view can be seen as the logical conclusion of the notion of complementarity, one of the fundamental and intrinsic features of quantum mechanics that makes it so mysterious and hard to describe or understand in classical terms.

Complementarity, which is closely tied to the Copenhagen interpretation, expresses the notion that in the quantum domain the results of experiments, the values yielded (or observables), are fundamentally tied to the act of measurement itself, and that in order to obtain a complete picture of the state of any given system, as bound by the uncertainty principle, one would need to run multiple experiments across that system, each result in turn rounding out the notion of the state, or reality, of said system.  These combined features of the theory say something profound about the uncertainty underlying the theory itself; perhaps complementarity can be viewed as the twin of uncertainty, or its inverse postulate.  Bohr summarized this very subtle and yet at the same time very profound notion of complementarity in 1949 as follows:

…however far the [quantum physical] phenomena transcend the scope of classical physical explanation, the account of all evidence must be expressed in classical terms. The argument is simply that by the word “experiment” we refer to a situation where we can tell others what we have learned and that, therefore, the account of the experimental arrangements and of the results of the observations must be expressed in unambiguous language with suitable application of the terminology of classical physics.

This crucial point…implies the impossibility of any sharp separation between the behavior of atomic objects and the interaction with the measuring instruments which serve to define the conditions under which the phenomena appear…. Consequently, evidence obtained under different experimental conditions cannot be comprehended within a single picture, but must be regarded as complementary in the sense that only the totality of the phenomena exhausts the possible information about the objects.[20]

Complementarity was in fact the core underlying principle which drove the existence of the uncertainty principle from Bohr’s perspective; it was the underlying characteristic and property of the quantum world that captured at some level its very essence.  And complementarity, taken to its logical and theoretical limits, did not allow for or provide any framework for describing, or any definition of, the real world outside of the domain with which it dealt, namely the measurement values or results, the measurement instruments themselves, and the act of measurement itself.

Another interpretation, or possible question to be asked given the uncertainty implicit in Quantum Theory, was that perhaps all possible outcomes described in the wave function do in some respect manifest, even if they cannot all be seen or perceived in our objective reality.  This premise underlies an interpretation of quantum theory that has gained some prominence in the last few decades, especially within the computer science and computational complexity fields, and has come to be known as the Many-Worlds interpretation.

The original formulation of this theory was laid out by Hugh Everett in his PhD thesis in 1957, in a paper entitled The Theory of the Universal Wave Function, wherein he referred to the interpretation not as the many-worlds interpretation but as the Relative State formulation of Quantum Mechanics (more on this distinction below); the theory was subsequently developed and expanded upon by several authors and the term many-worlds sort of stuck.[21]

In Everett’s original exposition of the theory, he begins by calling out some of the problems with the original, or classic, interpretation of quantum mechanics: specifically, what he and other members of the physics community believed to be the artificial construct of wavefunction collapse, created to explain the transition from quantum uncertainty to deterministic behavior, as well as the difficulty this interpretation had in dealing with systems consisting of more than one observer.  These were the main drivers for an alternative viewpoint on the interpretation of quantum theory, or what he referred to as a metatheory, given that the standard interpretation could be derived from it.

Bohr, and presumably Heisenberg and von Neumann, whose collective views on the interpretation of quantum theory make up what is now commonly referred to as the Copenhagen Interpretation, would no doubt explain away these seemingly contradictory and inconsistent problems as out of scope of the theory itself (i.e. quantum theory is a theory that is intellectually and epistemologically bound by the experimental apparatus and its results, which provide the scope of the underlying mechanics).  Everett found this view lacking, as it fundamentally prevents us from any true explanation of what the theory says about “reality”, the real world as it were, a world considered to be governed by the laws of classical physics where things and objects exist independent of observers and have real, static, measurable and definable qualities, a world fundamentally incompatible with the stochastic and uncertain characteristics that govern the behavior of “things” in the subatomic or quantum realm.  In his own words:

The aim is not to deny or contradict the conventional formulation of quantum theory, which has demonstrated its usefulness in an overwhelming variety of problems, but rather to supply a new, more general and complete formulation, from which the conventional interpretation can be deduced.[22]

Everett starts by making the following basic assumptions, from which he derives his somewhat counterintuitive but now relatively widely accepted interpretation of quantum theory: 1) all physical systems, large or small, can be described as states within Hilbert space, the fundamental geometric framework upon which quantum mechanics is constructed; 2) the concept of an observer can be abstracted to be a machine-like entity with access to unlimited memory, which stores a history of previous states, or previous observations, and has the ability to make deductions, or associations, regarding actions and behavior solely based upon this memory and this simple deductive process, thereby incorporating observers and acts of observation (i.e. measurement) completely into the model; and 3) with assumptions 1 and 2, the entire state of the universe, which includes the observers within it, can be described in a consistent, coherent and fully deterministic fashion without the need for the notion of wavefunction collapse, or any additional assumptions for that matter.

Everett makes what he calls a simplifying assumption to quantum theory, i.e. removing the need for or notion of wavefunction collapse, and assumes the existence of a universal wave function which accounts for and describes the behavior of all physical systems and their interaction in the universe, absorbing the observer and the act of observation into the model, observers being simply another form of a quantum state that interacts with the environment.  Once these assumptions are made, he can then abstract the concept of measurement as just interactions between quantum systems all governed by this same universal wave function.  In Everett’s metatheory, the notion of what an observer means and how they fit into the overall model are fully defined, and the challenge stemming from the seemingly arbitrary notion of wavefunction collapse is resolved.
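Schematically, and this is a sketch in the standard von Neumann measurement notation rather than Everett’s own, a measurement in this picture is nothing more than a unitary interaction that correlates the observer’s memory with the state of the system:

\Big( \sum_i c_i\,|s_i\rangle \Big) \otimes |\text{observer ready}\rangle \;\longrightarrow\; \sum_i c_i\,|s_i\rangle \otimes |\text{observer recorded } s_i\rangle

Nothing collapses; each term of the resulting superposition contains an observer who has seen one definite outcome, which is exactly the branching discussed below.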

In Everett’s view, there exists a universal wavefunction which corresponds to an objective, deterministic reality, and the notion of wavefunction collapse as put forth by von Neumann (and reflected in the standard interpretation of quantum mechanics) represents not a collapse so to speak, but a manifestation of the realization of one possible outcome of measurement, the one that exists in our “reality”, or our branch of the multiverse.

From Everett’s perspective, if you take what can be described as a literal interpretation of the wavefunction as the overarching description of reality, this implies that the rest of the possible states reflected in the wave function of a system do not cease to exist with the act of observation, with the collapse of the quantum mechanical wave that describes said system state in Copenhagen nomenclature; rather, these other states have some existence that persists but are simply not perceived by us.  On Everett’s own account, and this is a subtle yet important distinction between his view and that of subsequent proponents of the many-worlds interpretation, they remain uncorrelated with the observer and therefore do not exist in the observer’s manifest reality.

We now consider the question of measurement in quantum mechanics, which we desire to treat as a natural process within the theory of pure wave mechanics. From our point of view there is no fundamental distinction between “measuring apparata” and other physical systems. For us, therefore, a measurement is simply a special case of interaction between physical systems – an interaction which has the property of correlating a quantity in one subsystem with a quantity in another.[23]

This implies of course that these unperceived states have some semblance of reality, that they do in fact exist as possible realities, realities that are thought to have varying levels of “existence” depending upon which version of the many-worlds interpretation you adhere to.  With DeWitt and Deutsch for example, a more literal, or “actual” you might say, interpretation of Everett’s original theory is taken, where these other states, these other realities or multi-verses, do in fact physically exist even though they cannot be perceived or validated by experiment.[24]  This is a more literal reading of Everett’s thesis however, because nowhere does Everett explicitly state that these other universes actually exist; what he does say on the matter seems to imply the existence of “possible” or potential universes that reflect non-measured or non-actualized states of physical systems, but not that these unrealized outcomes actually exist in some physical universe:

In reply to a preprint of this article some correspondents have raised the question of the “transition from possible to actual,” arguing that in “reality” there is—as our experience testifies—no such splitting of observer states, so that only one branch can ever actually exist. Since this point may occur to other readers the following is offered in explanation.

The whole issue of the transition from “possible” to “actual” is taken care of in the theory in a very simple way—there is no such transition, nor is such a transition necessary for the theory to be in accord with our experience. From the viewpoint of the theory all elements of a superposition (all “branches”) are “actual,” none any more “real” than the rest. It is unnecessary to suppose that all but one are somehow destroyed, since all the separate elements of a superposition individually obey the wave equation with complete indifference to the presence or absence (“actuality” or not) of any other elements. This total lack of effect of one branch on another also implies that no observer will ever be aware of any “splitting” process.

Arguments that the world picture presented by this theory is contradicted by experience, because we are unaware of any branching process, are like the criticism of the Copernican theory that the mobility of the earth as a real physical fact is incompatible with the common sense interpretation of nature because we feel no such motion. In both cases the argument fails when it is shown that the theory itself predicts that our experience will be what it in fact is. (In the Copernican case the addition of Newtonian physics was required to be able to show that the earth’s inhabitants would be unaware of any motion of the earth.)[25]

According to this view, the act of measurement of a quantum system, with its associated principles of uncertainty and entanglement, is simply the reflection of this splitting off of the observable universe from a higher order multiverse in which all possible outcomes and alternate histories have the potential to exist.  The radical form of the many-worlds view is that these potential, unmanifest realities do in fact exist, whereas Everett seems only to go so far as to imply that they “could” exist and that conceptually their existence should not be ignored.

As hard as the multiverse interpretation of quantum mechanics might be to wrap your head around, it does represent an elegant solution to some of the challenges raised by the broader physics community against quantum theory, most notably the EPR paradox and its extension to more everyday examples as illustrated in the famous Schrödinger’s cat paradox.  It does however raise some significant questions as to Everett’s theory of mind and subjective experience, a notion that he glosses over somewhat by abstracting observers into simple machines of sorts, but which nonetheless rests as a primary building block upon which his metatheory stands[26].

Another interpretation of these strange and perplexing findings of quantum mechanics in the early 20th century is Bohmian Mechanics, sometimes also referred to as de Broglie-Bohm theory, pilot-wave theory, or the Causal Interpretation of quantum theory.  The major contributors to the interpretation were initially Louis de Broglie, who originally developed pilot-wave theory in the early part of the twentieth century but dropped the work after he got stuck on how to extend it to multi-body systems, and most prominently David Bohm, who fully developed the theory in the second half of the twentieth century with the British physicist Basil Hiley.

Bohmian mechanics is most fully developed in Bohm and Hiley’s book entitled The Undivided Universe, first published in 1993, although much of its contents and the underlying theory had been thought out and published in previous papers on the topic since the 1950s.  In their book they refer to their interpretation not as the Causal Interpretation, or even as de Broglie-Bohm theory, but as the Ontological Interpretation of Quantum Theory, given that from their perspective it gives the only complete causal and deterministic model of quantum theory.

David Bohm was an American born British physicist of the twentieth century who made a variety of contributions to theoretical physics, but who also invested much time and thought into the metaphysical implications of quantum mechanics, and into metaphysics and philosophy in general, a topic that most theoretical physicists have steered away from, presumably due to the adverse effects it could have on their academic careers and pursuits in physics proper, effects Bohm himself encountered to some extent throughout his career.  In this respect, Bohm was a bit of a rebel relative to his peers in the academic community, because he extended the hard science of theoretical physics into the more abstract realm of descriptions of reality as a whole, incorporating first philosophy back into the discussion so to speak, but doing so with the tool of hard mathematics, making his interpretation very hard to ignore, or at least making it impossible to ignore its implications from a theoretical physics perspective.

Bohm, like many other physicists (Everett for example), was dissatisfied with the mainstream interpretations of quantum mechanics as represented by the Copenhagen school of thought, and in 1952 published an alternative theory which extended the pilot-wave theory that de Broglie had published some twenty-five years prior, applying its basic principles to multi-body quantum systems and developing the more robust mathematical foundation that pilot-wave theory had previously lacked.  He then, along with Hiley, further extended the underlying mathematics of quantum theory to include a concept called quantum potential, a principle that provided a deterministic underpinning to the probabilistic and stochastic nature of the standard interpretation of quantum theory, the actual positions and momenta of the underlying particle(s) in question being the so-called hidden variables.

De Broglie’s pilot-wave theory from 1927 affirmed the existence of subatomic particles, or corpuscles as they were called back then, but viewed these particles not as independently existing entities but as integrated into an undercurrent, or wave, which gave these subatomic particles their wave like characteristics of diffraction and interference while still explaining their particle like behavior as illustrated in certain experimental results.  This represented a significant divergence from the standard interpretations of quantum theory and was not well received, hence the silence from the physics community on the advancement of the theory for the next twenty-five years or so.  In his 1927 paper on the topic, de Broglie describes pilot-wave theory as follows:

One will assume the existence, as distinct realities, of the material point and of the continuous wave represented by the [wave function], and one will take it as a postulate that the motion of the point is determined as a function of the phase of the wave by the equation. One then conceives the continuous wave as guiding the motion of the particle. It is a pilot wave.[27]

De Broglie’s pilot-wave theory was dismissed by the broader academic community when it was presented, however, mainly due to the fact that its implications were understood to describe only single-body systems, and no doubt also due to the fact that the common interpretation of quantum mechanics postulated that nothing could be said about the “existence” of a subatomic particle until it was measured; the matter therefore wasn’t further pursued until Bohm picked the theory back up some twenty-five years later.  Bohm expanded the theory to apply to multi-body systems, giving it more solid scientific ground and providing a fully developed framework for further consideration by the broader physics community.

Bohmian Mechanics, as pilot-wave theory later evolved into its more mature form, provides a mathematical and metaphysical framework within which subatomic reality can indeed be thought of as actually existing independent of an observer or an act of measurement, a significant departure from the standard interpretations of the theory that were prevalent for most of the twentieth century (in philosophic terms it is a fully realist interpretation).  The theory is consistent with Bell’s Theorem because it abandons the notion of locality, and it is also fully deterministic, positing that once the values of these hidden variables are known, all future states, and even past states, can be calculated and known as well, consistent in this sense with classical physics.[28]

That the guiding wave, in the general case, propagates not in ordinary three-space but in a multidimensional-configuration space is the origin of the notorious ‘nonlocality’ of quantum mechanics. It is a merit of the de Broglie-Bohm version to bring this out so explicitly that it cannot be ignored.[29]

Bohmian Mechanics falls into the category of hidden variable theories.  It lays out a description of reality in the quantum realm in which the wave function guides actually existing particles.  In other words it states that there are in fact hidden variables which dictate the actual position, momentum et al of particles in the subatomic world, and it outlines a factor it refers to as quantum potential which governs or guides the behavior and description of a quantum system and determines its future and past states, irrespective of whether or not the quantum system is observed or measured.
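In the standard presentation of the theory, writing the wave function in polar form makes both the guiding role of the wave and the quantum potential explicit; the following is a sketch of that textbook formulation for a single particle of mass m:

\psi = R\,e^{iS/\hbar}, \qquad v = \frac{\nabla S}{m}, \qquad Q = -\frac{\hbar^{2}}{2m}\frac{\nabla^{2} R}{R}

The particle always has a definite position and follows the velocity field v determined by the phase S of the wave, while the quantum potential Q, which depends on the form of the wave rather than its amplitude alone, accounts for characteristically quantum behavior such as the interference pattern in the two-slit experiment.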

Along with being fully deterministic, the theory also explains away the notion of wavefunction collapse as put forth by von Neumann, by positing that the pilot-wave behaves according to the stochastic principles of Schrödinger’s wave function but that there is some element of intelligent, or active, information involved in the behavior of the underlying wave/particle.  In other words, from their perspective, the wave/particle knows about its environment and behaves in a pseudo-intelligent manner (they stay away from the word intelligence but Charlie couldn’t see any other way to describe what it is that they meant to say).  In two-slit experiment parlance, it knows whether one or both of the slits are open and in turn behaves, or moves so to speak, with this knowledge in mind.

According to Bohm, one of the motivations for exploring the possibility of a fully deterministic/causal extension of quantum theory was not necessarily that he believed it to be the right interpretation, the correct one, but to show the very possibility of such theories, the existence of which had been cast into serious doubt by the impossibility proofs of the preceding decades, most notably von Neumann’s.

… it should be kept in mind that before this proposal was made there had existed the widespread impression that no conceptions of hidden variables at all, not even if they were abstract, and hypothetical, could possibly be consistent with the quantum theory.[30]

Bohmian mechanics is consistent with Bell’s theorem, which rules out hidden variables only in theories which assume local realism, i.e. the assumption that all objects or things are governed by and behave according to the principles of classical physics, bound by the constraints of Relativity and the fixed speed of light.  This assumption has been shown not to hold in quantum mechanics, causing of course much consternation in the physics community and calling into question classical realism in general.[31]

Bohmian Mechanics (or the Ontological Interpretation of quantum theory, which is the terminology that Bohm and Hiley adopt to describe their hypothesis of what is actually happening in the quantum realm) agrees with all of the predictions and models of quantum mechanics as developed by Bohr, Heisenberg and von Neumann (the orthodox Copenhagen Interpretation) but extends the model with this notion of quantum potential, develops a metaphysical notion of active information which guides the subatomic particle(s), and makes nonlocality explicit, abandoning the locality which Einstein held to be absolute and immovable.  With respect to the importance of the development of Bohmian mechanics, at least from a theoretical and mathematical perspective, even if you don’t want to believe the interpretation, Bell himself (1987) had this to say:

But in 1952 I saw the impossible done. It was in papers by David Bohm. Bohm showed explicitly how parameters could indeed be introduced, into nonrelativistic wave mechanics, with the help of which the indeterministic description could be transformed into a deterministic one. More importantly, in my opinion, the subjectivity of the orthodox version, the necessary reference to the ‘observer,’ could be eliminated. …

But why then had Born not told me of this ‘pilot wave’? If only to point out what was wrong with it? Why did von Neumann not consider it? More extraordinarily, why did people go on producing ‘‘impossibility’’ proofs, after 1952, and as recently as 1978? … Why is the pilot wave picture ignored in text books? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show us that vagueness, subjectivity, and indeterminism, are not forced on us by experimental facts, but by deliberate theoretical choice?[32]

The uniqueness of Bohmian Mechanics lies not only in its yielding to the presumption of non-locality (which was and is consistent with experimental results showing that there is in fact a strong degree of correlation between physically separated and once integral quantum systems, i.e. systems that are entangled, what Einstein perhaps inaptly referred to as “spooky action at a distance”), but also in that it proves that hidden variable type theories are in fact mathematically possible and still consistent with the basic tenets of quantum mechanics, the latter point of which had been seriously called into question.

In other words, what Bohmian mechanics calls our attention to quite directly is that there are metaphysical assumptions about reality in general that are fundamentally non-classical in nature and that must be accounted for when interpreting quantum theory, namely the existence of what Bell refers to as the “multidimensional-configuration space” that underlies the correlation of entangled particles/systems.  That is the only way to explain how once integrated but subsequently separated quantum systems can be correlated in such a mathematically consistent and predictable way, behavior initially described by EPR as a natural theoretical extension of quantum theory in the first half of the twentieth century and subsequently proven experimentally in the latter part of the twentieth century by Aspect[33] among others.

And it was these same quantum systems, whose behavior was modeled so successfully with quantum mechanics, that in some shape or form constituted the basic building blocks of the entire “classically” physical world.  This latter fact could not be denied, and yet the laws and theorems that have been developed to describe this behavior were (and still are for that matter) fundamentally incompatible with classical physics and its underlying assumptions about what is “real” and how these objects of reality behave and are related to each other.[34]  Although the orthodox interpretation of Quantum Theory would have us believe that we can draw no metaphysical conclusions based upon what quantum mechanics tells us, that it is simply a tool for deriving values or observables from experimental results, Bohmian Mechanics shows us that this interpretation, albeit consistent and fully coherent, is lacking in many respects, and that a new perspective is required even if the Bohmian view is rejected.

Bohmian Mechanics, and to an extent Everett’s Relative State formulation of quantum mechanics as well, both extend well beyond the laws of classical physics to round out or complete their theories, both explicitly drawing on notions of metaphysics and the existence of some sort of underlying reality in the subatomic realm.  This is where they depart significantly from the standard Copenhagen interpretation and the view most rigorously defended by Bohr.  The Copenhagen view holds that quantum theory tells us about the measurement of observables within the context of the quantum world; it is an empirical measuring tool and nothing more, and further, that is all that can be extrapolated from it by definition.  There is no metaphysics explicit or implicit in the theory, and any epistemological interpretation is ruled out.  Bohmian Mechanics and Everett’s Relative State formulation (and by association the various Many-Worlds Interpretations that stemmed from it via DeWitt, Deutsch and others) attempt to explain what is really happening in the quantum realm in a manner that is consistent with the underlying model of behavior and the prediction of experimental results, and some adventure into metaphysics, Aristotle’s first philosophy, is required in order to do this, given that some departure from the assumptions of classical physics is required.

In the Relative State formulation, the wave function of Schrödinger is postulated to be a true representation of all of reality, abstracted to include observers at all levels, observers roughly corresponding to machines that can store the results of measurements (quantum states) and apply some level of deductive reasoning to correlate states and make subsequent observations.  From this perspective, the wave function represents perspectives (this is not the term that Everett uses but the one Charlie prefers) on a correlated reality that comes into existence, a correlated reality between one or many quantum system states/observers, all definable within the geometry of Hilbert space rather than the Cartesian space used in Newtonian mechanics (with an extra dimension of time in Relativity).

Bohm (and Hiley) lay out an extension to the quantum theoretical mathematical model which is not only fully deterministic, but also “real”, not yielding to the Copenhagen view that reality in the quantum world only exists upon measurement, i.e. positing a reality existing independent of any observation, albeit a fundamentally non-local reality, one which is completely consistent with Bell’s Theorem.  Both interpretations however, and others that fit into similar categories, as does Bell’s Theorem itself, call into serious question the notion of local realism which sits at the center of the Newtonian mechanics that has driven scientific development over the last three hundred years.

One can put it quite succinctly by observing that no matter what school of interpretation you adhere to, at the very least the classical notion of local realism must be abandoned; one would be hard pressed to find someone with a good understanding of Quantum Theory who would dispute this.  In other words, regardless of which interpretation is more attractive, or which one you adhere to, what cannot be ignored is that the classical conception of reality, as having intrinsic properties that exist independent of observation and can be precisely measured in a fully deterministic and predictive way, the assumption that drove the developments of the Scientific Revolution and provided the underlying metaphysical framework for Newton, Einstein and others, was in need of serious revision.


[1] Without quantum mechanics we wouldn’t have transistors which are the cornerstone of modern computing.

[2] Our current ability to measure the size of these subatomic particles goes down to approximately 10^-16 cm with currently available instrumentation, so at the very least we can say that measuring anything in the subatomic realm, most certainly the realm of the general constituents of basic atomic elements such as quarks or gluons for example, is very challenging to say the least.  Even the measurement of the estimated size of an atom is not so straightforward, as the measurement is dictated by the circumference of the atom, a measurement that relies specifically on the size or radius of the “orbit” of the electrons of said atom, “particles” whose actual “location” cannot be “measured” in tandem with their momentum, per the standard tenets of quantum mechanics, both of which together constitute what we consider measurement in the classic Newtonian sense.

[3] In some respects, even at the cosmic scale, there is still significant reason to believe that Relativity has room for improvement, as evidenced by what physicists call Dark Matter and Dark Energy, artifacts and principles that have been posited by theoretical physicists to describe matter and energy that they believe should exist according to Relativity Theory but whose existence is still yet “undiscovered”.  For more on Dark Matter see http://en.wikipedia.org/wiki/Dark_matter and on Dark Energy see http://en.wikipedia.org/wiki/Dark_energy, both of which remain mysteries and lines of active research for modern day cosmology.

[4] Quantum theory has its roots in this initial hypothesis by Planck, and in this sense he is considered by some to be the father of quantum theory and quantum mechanics.  It is for this work in the discovery of “energy quanta” that Max Planck received the Nobel Prize in Physics in 1918, some 18 years after publishing.

[5] Einstein termed this behavior the photoelectric effect, and it’s for this work that he won the Nobel Prize in Physics in 1921.

[6] The Planck constant was first described as the proportionality constant between the energy (E) of a photon and the frequency (ν) of its associated electromagnetic wave. This relation between the energy and frequency is called the Planck relation or the Planck–Einstein equation: E = hν.

[7] It is interesting to note that Planck and Einstein had a very symbiotic relationship toward the middle and end of their careers, and much of their work complemented and built off of each other. For example, Planck is said to have contributed to the establishment and acceptance of Einstein's revolutionary concept of Relativity within the scientific community after it was introduced by Einstein in 1905, the theory of course representing a radical departure from the standard classical physical and mechanical models that had held up for centuries prior. It was through the collaborative work and studies of Planck and Einstein in some sense, then, that the field of quantum mechanics and quantum theory took the shape it has today; Planck defined the term quanta with respect to the behavior of elements in the realms of matter, electricity, gas and heat, and Einstein used the term to describe the discrete emissions of light, or photons.

[8] The double slit experiment was first devised and used by Thomas Young in the early nineteenth century to display the wave-like characteristics of light. It wasn't until the technology was available to send a single "particle" (a photon or electron for example) through the apparatus that the wave-like and stochastically distributed nature of the underlying "particles" was discovered as well. http://en.wikipedia.org/wiki/Young%27s_interference_experiment

[9] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, Dec 12th, 1929

[10] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, Dec 12th, 1929

[11] Max Born, “The statistical interpretation of quantum mechanics” Nobel Lecture, December 11, 1954.

[12] Erwin Schrödinger made many of the fundamental discoveries in the foundation of quantum mechanics, most notably the wave function which describes the behavior of subatomic particles. He shared some of Einstein's concerns about standard interpretations of quantum mechanics, as illustrated in the cat paradox for which he is so well known.

[13] Actually Verschränkung in German.

[14] Schrödinger, E. (1935). "Discussion of Probability Relations Between Separated Systems", Proceedings of the Cambridge Philosophical Society, 31: p. 555.

[15] As later analysis and criticism has pointed out, Bell’s theorem rules out hidden variable theories of a given genre rather than all hidden variable theories in toto.

[16] Bell, John (1964). "On the Einstein Podolsky Rosen Paradox", Physics 1 (3): 195–200.

[17] Albert Einstein, "Quantum Mechanics and Reality" ("Quanten-Mechanik und Wirklichkeit"), Dialectica 2: 320–324, 1948.

[18] For a more complete review of a multitude of interpretations of Quantum Theory going well beyond this analysis see http://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics.

[19] This mode of thought was formulated primarily by Niels Bohr and Werner Heisenberg, stemming from their collaboration in Copenhagen in 1927; hence the name.  The term was further crystallized in writings by Heisenberg in the 1950s when addressing contradictory interpretations of quantum theory and still represents the most widely accepted, and widely taught, interpretation of quantum mechanics in physics today.

[20] Niels Bohr (1949). "Discussions with Einstein on Epistemological Problems in Atomic Physics". In P. Schilpp (ed.), Albert Einstein: Philosopher-Scientist. Open Court.

[21] Everett was a graduate student at Princeton at the time that he authored The Theory of the Universal Wave Function, and his advisor was John Wheeler, one of the most respected theoretical physicists of the latter half of the twentieth century. Incidentally, Everett did not continue in academia, and therefore subsequent interpretations and expansions upon his theory were left to later authors and researchers, most notably Bryce DeWitt, who coined the term "many-worlds" in 1973, and subsequent physicists such as David Deutsch among others who developed the theory even further. DeWitt's book on the subject, The Many-Worlds Interpretation of Quantum Mechanics, included several different viewpoints and research papers as well as a reprint of Everett's thesis. Deutsch's seminal work on the topic is probably his book The Fabric of Reality, published in 1997, where he expands and extends the many-worlds interpretation to disciplines outside of physics such as philosophy and epistemology, computer science and quantum computing, and even biology and theories of evolution.

[22] From the Introduction of Everett's 1957 thesis, "Relative State" Formulation of Quantum Mechanics.

[23] Hugh Everett, III. Theory of the Universal Wave Function, 1957, p. 53.

[24] Deutsch actually posits that proof of the "existence" of these other multi-verses is given by the wave interference pattern displayed in even the single-particle version of the classic double slit experiment, as well as by some of the running-time enhancements driven by quantum computing, namely Shor's algorithm, which finds the prime factors of a given number in polynomial time, dramatically faster than the best known algorithms on classical, 1-or-0 bit-based machines. This claim is controversial to say the least, or at least remains an open point of contention among the broader physics community. See http://daviddeutsch.physics.ox.ac.uk/Articles/Frontiers.html for a summary of his views on the matter.

[25] Everett's 1957 thesis, "Relative State" Formulation of Quantum Mechanics, note on page 15, presumably in response to criticisms he received upon circulating the draft of his thesis to various distinguished members of the physics community, one of whom was Niels Bohr.

[26] See note 24 above for Deutsch's argument regarding the "existence" of these other universes and the controversy surrounding it. See also Bohm and Hiley's chapter on Many-Worlds in their 1993 book The Undivided Universe: An Ontological Interpretation of Quantum Theory for a good overview of the strengths and weaknesses, mathematical and otherwise, of Everett's and DeWitt's different perspectives on the Many-Worlds approach.

[27] Louis de Broglie, "Wave mechanics and the atomic structure of matter and of radiation", Le Journal de Physique et le Radium, 8, 225 (1927).

[28] These features are why it is sometimes referred to as the Causal Interpretation, due to the fact that it outlined a fully causal description of the universe and its contents.

[29] From the Stanford Encyclopedia entry on Bohmian Mechanics by Sheldon Goldstein; quote from Bell, Speakable and Unspeakable in Quantum Mechanics, Cambridge: Cambridge University Press, 1987, p. 115.

[30] David Bohm, Wholeness and the Implicate Order, London: Routledge, 1980, p. 81.

[31] In fact, it was Bohm's extension of de Broglie's work on pilot-wave theory that provided at least some of the motivation for Bell to formulate his theorem in the first place; see Bell's 1964 paper On the Einstein Podolsky Rosen Paradox, published some 12 years after Bohm published his adaptation of de Broglie's pilot-wave theory.

[32] From the Stanford Encyclopedia entry on Bohmian Mechanics (2001) by Sheldon Goldstein; taken from Bell 1987, Speakable and Unspeakable in Quantum Mechanics, Cambridge University Press.

[33] "Experimental Realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A New Violation of Bell's Inequalities"; Aspect, Grangier, and Roger, July 1982.

[34] There has been significant progress in the last decade or two in reconciling quantum theory and classical mechanics, most notably with respect to Newtonian trajectory behavior, what is described in the literature as accounting for the classical limit. For a good review of the topic see the article The Emergence of Classical Dynamics in a Quantum World by Tanmoy Bhattacharya, Salman Habib, and Kurt Jacobs, published in Los Alamos Science in 2002.

Einstein and Spacetime: It’s all Relative

At this point, Charlie had gathered enough material and performed enough research to establish the core of his thesis, illustrating what at least from his perspective seemed the clear borrowing and synthesis of various religious and theological doctrines in ancient times, a synthesis that led ultimately to what could be considered the natural evolution of religion, i.e. monotheism. This began with the advent of Judaism, which established the basic tenets of monotheism, along with its basic Cosmology and theology, as well as its lineage in history starting with Adam and Eve and then following through the generations of Abraham, Moses and then Jesus, all of which were included in the religious traditions of the Abrahamic faiths at some level.

But what he found as he traced these developments through the Middle Ages, as naturalism, theism and at least the foundations of materialism were established, was that the belief in a Creator had not necessarily been abandoned per se; rather it had been superseded, subsumed so to speak, by the belief that the material universe, the substance of Aristotle (ousia, which stems originally from the Greek verb "to be"), obeyed natural laws which could be "discovered" and were best described by advanced mathematics. So a byproduct of the Scientific Revolution was not so much materialism and atheism, as Charlie had expected, but the introduction of advanced mathematics as the language of God.

With Newton (1642-1727 CE) then, in particular with his law of universal gravitation and laws of motion as articulated in his Philosophiæ Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), first published in 1687, Charlie saw not only the foundations of mechanism, the notion that the world can be completely and entirely explained through mechanical and mathematical laws, but also the natural extension of this premise, determinism, i.e. the belief that the course of the universe is laid out entirely by cause and effect, driven by these same mathematical laws and principles that governed materialism.

This is not to say that Newton himself believed this, or subsequent influential scientists for that matter (Einstein being a prime example), but it is with Newton's work in what we now term physics, Newtonian mechanics as it is typically referred to today, that these now ubiquitous philosophical principles which permeate our Western materialistic and capitalistic society have their strongest roots. If all is governed by pure mathematical laws, if the physical universe is all that truly "exists", and all the laws that govern this material universe are "discoverable", then what room is there for the Soul, the notion of free will, mythology (whose purpose is to speak to the soul at a very basic level), or even systems of ethics or morality, outside of their place in the social and political spheres, spheres focused on capitalism and the protection of basic property rights, liberty and the pursuit of happiness for their own sake, themselves relics of the Enlightenment era to at least some degree?

As Charlie had already discovered, the word "science" derives from the Latin scientia, meaning knowledge or "that which can be known", itself a derivation of the Latin verb scire, "to know". Scientia is the typical translation of the key Aristotelian term epistêmê, which meant the same, i.e. knowledge, although epistêmê as Aristotle used it had a much broader meaning than the term "science" does today, and Aristotle was compelled to spell out in painstaking detail the types of "knowledge" that existed, their categories, and even a broad description of being itself, or "existence", i.e. being qua being, which represented a fundamental conceptual counterpart to knowledge. At some level the bulk of the corpus of Aristotle's work could be considered a theory of knowledge, or epistemology, and it was very clear that the language that governs our different branches of science today, and even the word science itself, have their roots in the epistemology and underlying metaphysical and semantic framework that Aristotle established some 2,400 years ago.

The word scientia, as a translation of the Greek epistêmê, was carried down through the Middle Ages, into the Age of Enlightenment and into the modern era, and what we consider science today has become almost equivalent to our notion of "truth", which of course implies that if a field of knowledge lies outside of science, it is subject to opinion, to a level of subjectivity and un-verifiability that keeps it steps away from the grand pedestal of science. This was the essence of the basis of Charlie's ongoing argument with Niels.

Aristotle's epistêmê, or scientia, provides the basis for the categorization of the intellectual development that ensued during the Enlightenment era as branches of knowledge started to mature and evolve, culminating from a natural philosophical perspective in Newton's great work Philosophiæ Naturalis Principia Mathematica, which in many respects marks the beginning of modern science, and certainly physics, as we know it today.

Newton's prime treatise described core mathematical principles in the world of Natural Philosophy, harkening back to Aristotle's categorization of the fields of knowledge, which placed the laws that operate in the "physical" world within the discipline of natural philosophy, as distinguished from Aristotle's first philosophy, or what we now call "metaphysics". But in the centuries following the adoption and establishment of Newtonian mechanics in the academic and intellectual community, the word "science" became rooted in our vocabulary and replaced the old term "natural philosophy", and in turn the fields of knowledge or study outside of science started to play a much more secondary role in the development of human thought. Most certainly theology, or religion, from the perspective of the academic community at least, was pushed aside to make room for scientific development, and this separation of theology, or even first philosophy, from science was a significant byproduct of the Scientific Revolution and survives into the modern Information Age.

In some sense, Charlie mused, the term science itself, with its implication of signifying the branch of knowledge within which all that can truly be known is studied (implying that all that lies outside the domain of science is open to conjecture or simply a matter of opinion, rather than backed up by hard, empirically tested data), pointed almost directly at the glass wall that Charlie was trying to break through. This bifurcation of reality into scientific and non-scientific realms, albeit an important development that facilitated the advancement of science itself, freeing it from the shackles of conservative religious beliefs, had the unintended, subtle and yet at the same time profound consequence of limiting and boxing in the notions of Knowledge and Reality in and of themselves.

Between Newton and Einstein, the two most influential physicists of the modern era (if you can call Newton a physicist, even though there was no such thing in Newton's time), we find a variety of developments not only in the field of astronomy, which tested and verified Newton's laws of universal gravitation and motion, but also in the fields of optics, electricity and magnetism, culminating in the discovery of what are called "Maxwell's equations", a theoretical and mathematical model that consolidated and integrated the previously separate domains of optics, magnetism and electricity under the heading of electrodynamics, indicating that all three of these fields of study, with their implications and laws, were actually just manifestations of the same underlying force, i.e. electromagnetism.

As experimentation and testing of theories advanced, however, and instrumentation became more advanced and precise, various holes and inconsistencies developed which pointed to cracks in the armor, not only in Newtonian mechanics but also in Maxwell's mathematical and theoretical models surrounding the new, consolidated field of electromagnetism, which included optics as well as magnetism and electricity. These inconsistencies, or perhaps better termed irregularities, to a very great extent provided the impetus for Einstein's original work in physics before he developed his Relativity Theory.

Einstein is best known for two fundamentally radical scientific developments that forever changed the course of scientific history: Relativity Theory, which built upon and effectively supplanted Newtonian mechanics as the dominant model of the physical universe, reconciling inconsistencies in some of the astronomical observations of his time while upending the notion that time was a constant that moved at the same rate of progress no matter where you were or how fast you were traveling in "relative" space, and of course his discovery of the equivalence between mass and energy captured in the elegant and now famous equation E = mc², both revolutionary theories that were to forever change the nature of physics.

His Relativity Theory is actually broken into two parts: Special Relativity, which posits an altogether new structure of the physical universe by integrating the notions of space and time into spacetime, and General Relativity, which builds off of the developments of Special Relativity and develops a system of universal gravitation at the cosmic as well as the earthly scale, both theories resting on the notion that the speed of light in a vacuum is constant (approximately 186,000 miles per second) no matter what an observer's frame of reference and no matter how fast an observer is moving relative to the object of measurement.

Einstein was undoubtedly the most influential physicist of the 20th century, and his work was truly groundbreaking, representing a major step in the development of advanced mathematical models of the world around us at the cosmic scale and illustrating to the academic and intellectual communities of the time that the world as they knew it was not as simple as had previously been thought. Although Einstein is best known for his theories of Relativity and mass-energy equivalence, the work for which he actually won the Nobel Prize in 1921 (at the age of 42) created some of the building blocks for what later became the field of Quantum Mechanics, a theory incidentally that Einstein voiced great concern with over the course of his career, calling it "incomplete" or at the very least missing some key variables/inputs. It is from his concerns regarding Quantum Theory in fact that we have the famous quotation, "God does not play dice".

Einstein was just as much a philosopher as he was a physicist, however, and during much of the latter part of his career he not only questioned the premises of the quantum mechanical models that began to take shape during the middle of the twentieth century, but also spent a good deal of his time thinking and writing about what the great "discoveries" of twentieth century physics actually meant, i.e. their relevance to and implications for the world we live in, from a metaphysical and theological perspective. In his view, the advancements in physics marked by Relativity and Quantum Theory were not simply mathematical and measurement tools to aid the development of science and technological advancement, but had serious implications for the nature of reality itself, as well as for God's role in the creation and sustenance of said reality.

Perhaps the most notable example of the moral dilemma which Einstein faced with respect to technological advancement, as a result of developments in physics in the first half of the twentieth century and their social as well as ethical implications, is illustrated in his involvement with, and subsequent regret regarding, the famed Manhattan Project, the US Government funded initiative during WW II that developed, and of course then later used, the atomic bomb against Japan in 1945. Einstein's direct role was limited, his decisive contribution being the 1939 letter to President Roosevelt urging that such a program be established, but the effort his letter helped set in motion ran for some seven years, cost the United States nearly 2 billion dollars, and at its height employed more than 130,000 people.[1]

 

Albert Einstein was born in Germany in 1879 and spent most of his formative years there in school. His father was an electrical engineer, so you could say that an interest in electrical currents, and science in general, was inherited to a great extent. He is said to have written his first scientific paper at the age of 16, on the behavior of magnetic fields, a work entitled On the Investigation of the State of the Ether in a Magnetic Field. In 1900 Einstein was awarded his teaching diploma from the Zurich Polytechnic, and after struggling for almost two years to find a job, he finally landed work in Bern, Switzerland, at the Federal Office for Intellectual Property as an assistant examiner, evaluating patent applications for electromagnetic devices. Interestingly enough, his work in the patent office was very much in line with his later research and thinking with respect to the transmission of electric signals and the synchronization of time, concepts which played a significant role in the subsequent development of his theories in electromagnetism and physics which had such a profound effect on modern science.

On 30 April 1905, Einstein completed his doctoral thesis, A New Determination of Molecular Dimensions, for which the University of Zurich awarded him a doctorate in Physics. That same year he also published papers on the photoelectric effect (for which he later won the Nobel Prize in Physics), on Brownian motion, developing mathematical models describing the motion of particles suspended in a fluid, liquid or gas, on Special Relativity, and on the relationship of mass and energy as a function of the speed of light, marking the beginning of decades of revolutionary scientific developments at both the cosmic and subatomic scale.[2]

Einstein's work on the photoelectric effect in particular had significant impact on the subsequent development of Quantum Theory, for it showed that when certain types of matter were bombarded with short-wavelength electromagnetic radiation, they emitted electrons, so-called photoelectrons, and that this behavior was best explained if light itself was delivered in discrete quanta, quanta which later came to be known simply as photons. The study of this phenomenon led directly to some of the most odd and mysterious behaviors that have come to characterize Quantum Theory, i.e. the fact that light behaves both like a particle and a wave depending upon the experiment used to study it, and to important developments in understanding the quantized nature of light, i.e. its characteristic of moving from state to state in a non-continuous fashion, a discovery which in many respects formed the foundations of Quantum Theory.

At the beginning of the rise of Nazi power in the 1930s, while Einstein was visiting universities in the United States in 1933, Germany passed a law barring Jews from holding official positions, including teaching at universities, and it is said that Einstein also learned at this time that there was a bounty placed on his head. Einstein therefore moved to the United States permanently in 1933, taking up a position at the Institute for Advanced Study in Princeton, New Jersey, a position which he held until his death in 1955. During this period, Einstein devoted much of his intellectual pursuits to trying to come up with a unified theory incorporating his model of Relativity (the General case), which dealt with the behaviors of massive bodies, light and time at the cosmic scale, and Quantum Mechanics, which dealt with the description of the world at the microscopic and subatomic scale, an endeavor with which the field of theoretical physics still struggles to this day.

On a more personal level, Einstein was a great lover of music and an accomplished violinist. His mother was a pianist and Einstein was taught the violin at a very early age, supposedly starting at the age of 5, although he is said to have taken up music more passionately in his teenage years, when he grew a great affection for the work of Mozart. Music is thought to have played a significant role in his social life over the years: he is noted to have played violin in Germany and Switzerland with friends, most notably with Max Planck and his son prior to moving to the States in 1933, and then later in life at Princeton, where he is said to have joined in with the famed Juilliard Quartet on occasion.

From a scientific and physics perspective however, it is Einstein's work on Relativity and the equivalence of mass and energy that gained him the popularity and repute that still stand to this day. His theories of Relativity are separated into what he referred to as the "Special" case, published initially in 1905, where he posited the notion of spacetime as a holistic construct within which classical Newtonian observations of "physical bodies" and "motion" must be viewed in order to be fully consistent and coherent, and the "General" case, which expanded upon Special Relativity to cover frames of reference beyond the special, inertial case, providing mathematical formulae for measuring classical physical attributes such as mass and speed when no privileged reference system exists from which the measurements can be made.

Special Relativity is the physical theory of measurement in an inertial frame of reference, proposed by Einstein in a 1905 paper entitled On the Electrodynamics of Moving Bodies. The paper reconciled James Clerk Maxwell's mathematical models (aka Maxwell's equations) on electricity and magnetism, which had been published in the 1860s, with the laws of mechanics as described by Galileo and Newton, by introducing major changes to mechanics at speeds close to the speed of light. This work only later became known as the Special theory of relativity, distinguished from the General theory in that it restricts itself to inertial (non-accelerating) frames of reference, whereas the General theory extends the principle to all frames of reference, including accelerating ones.

In his work on Special Relativity, Einstein generalizes Galileo’s notion of relativity, which states that all uniform motion is relative and that there is no absolute and well-defined state of rest, from classical mechanics to all the laws of physics, including both the laws of mechanics and of electrodynamics, unifying these seemingly disparate and distinct scientific fields of study to a large extent.  Special relativity is built upon the notion that the speed of light is fixed in an absolute sense, and is the same for all inertial observers regardless of the state of motion of the source.

 

… the same laws of electrodynamics and optics will be valid for all frames of reference for which the equations of mechanics hold good.  We will raise this conjecture (the purport of which will hereafter be called the “Principle of Relativity”) to the status of a postulate, and also introduce another postulate, which is only apparently irreconcilable with the former, namely, that light is always propagated in empty space with a definite velocity c which is independent of the state of motion of the emitting body.  These two postulates suffice for the attainment of a simple and consistent theory of the electrodynamics of moving bodies based on Maxwell’s theory for stationary bodies.

 

 

Much of Einstein's work on Special Relativity was complemented, and given its now familiar geometric form, by the work of the German mathematician Hermann Minkowski, a contemporary (and former teacher) of Einstein. More specifically, it was Minkowski's theory of spacetime which extended the three-dimensional classical view of reality, based upon the geometry of Euclid, Galileo and Descartes among others, to include a fourth dimension of time to describe the true nature of reality, or frame of reference, of an event.

 

The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength.  They are radical.  Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.[3]

 

What the Special theory of Relativity says basically, and much of its theoretical implications have been experimentally verified at this point, is that the concepts of space and time, which had been looked at as constants no matter what the reference point for the previous two millennia, had to be considered relative, in the sense that their measurement and value depend upon the frame of reference, and the speed, at which the observer is moving. To arrive at these conclusions, and implicit in the theorems and mathematics behind the theory, the speed of light was presumed to be fixed from all vantage points and frames of reference. Furthermore, and this was no small contribution of course, the theory posits that mass and energy are equivalent, as expressed in the famous equation E = mc².
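What all inertial observers do agree on, in Minkowski's formulation, is not space or time separately but the spacetime interval between two events; as a minimal sketch (in standard notation, with c the speed of light):

s² = c²Δt² − Δx² − Δy² − Δz²

Different observers may disagree about the elapsed time Δt and the spatial separation of two events individually, but the combined quantity s² comes out the same in every inertial frame, which is the precise sense in which "only a kind of union of the two" survives.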

General Relativity, as it was later called to distinguish it from Special Relativity, was developed to apply the principle of relativity to the more general case, i.e. to any frame of reference. General Relativity introduces Einstein's theory of gravity as it exists and acts upon bodies in motion in the spacetime continuum established in Special Relativity. Whereas Special Relativity restricts itself to a flat spacetime continuum where cosmic scale gravitational effects are negligible, in General Relativity gravitational effects are represented as curvatures of spacetime, i.e. at the cosmic scale gravity affects the very nature of the spacetime continuum itself. And just as the curvature of the earth's surface is not noticeable in everyday life and can be effectively ignored (when measuring distance or speed for example), the curvature of spacetime can be effectively ignored at smaller, non-cosmic scales of measurement. In other words, Special Relativity is a valid approximation of General Relativity at smaller, non-cosmic scales.
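The mathematical heart of General Relativity can be stated compactly (a sketch, in standard notation): the Einstein field equations relate the curvature of spacetime to the matter and energy it contains,

G_{μν} = (8πG / c⁴) T_{μν}

where G_{μν} encodes the curvature of spacetime, T_{μν} the distribution of mass and energy, and G and c are the gravitational constant and the speed of light. As John Wheeler famously summarized it, matter tells spacetime how to curve, and curved spacetime tells matter how to move.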

From Einstein's General Relativity theory, then, we have not only the beginnings of the model within which the cosmos itself can be studied, introducing the basic principles that define modern Cosmology to this day and that culminated perhaps most notably in the development of the Big Bang Theory in the latter part of the twentieth century, but also a dissolution of the notions of space and time as absolute, independent entities, bringing an end to the era of absolute physical existence which had been an implicit assumption of Western physicists, philosophers, naturalists and theologians for some 2000 years[4].

As a thought experiment, and to illustrate the implications of Relativity when taken to extreme limits, imagine for a moment that you were able to travel at close to the speed of light. Not only would your measured (relativistic) mass grow enormously, without bound as you approached the speed of light itself, but your perception of time relative to your peers at rest would slow down dramatically, a notion known as time dilation; furthermore your idea of space as defined by any act of measurement would change dramatically as well, a concept referred to as length contraction, where lengths along your line of motion would contract, shrinking toward zero as your speed approached that of light.
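Both effects follow from the same factor; as a brief worked sketch in standard notation, with v the relative speed and c the speed of light:

γ = 1 / √(1 − v²/c²),  Δt′ = γΔt,  L′ = L/γ

At everyday speeds γ is indistinguishable from 1, which is why none of this is noticeable in daily life; at v = 0.99c, however, γ ≈ 7.1, so a traveler's clock runs roughly seven times slower than a clock at rest, and distances along the direction of travel contract by the same factor.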

 

Leaving Relativity aside for the moment, Charlie wanted to know how Einstein's revolutionary theories of universal gravitation, the idea that space and time were integrally related constructs, the fixed nature of the speed of light, and the equivalence of mass and energy played into the development of Quantum Theory, the mathematical principles which so accurately predict the behavior of particles (or waves) in the subatomic world, which have dominated the theoretical physics landscape since the middle of the twentieth century, and which have been the source of so much debate from an interpretation perspective ever since their predictive power was established. Furthermore, Charlie was interested in the views of Einstein, and other prominent physicists of the twentieth century, on Quantum Theory and what it meant for our view of reality itself, what is now commonly referred to as Interpretations of Quantum Theory, something Einstein clearly had a strong opinion on, as evidenced by his vocal and consistent criticism of the theory with respect to its completeness.

As Charlie had already found, atomic theory, which to a large extent forms the basis for our materialist modern day view of reality, posits that all matter, all substance or physical reality, is composed of discrete units called atoms, constructs which form the fundamental building blocks of the universe and were originally held to be indivisible in nature. Atomic theory from this basic perspective has its roots in Ancient Greek philosophy, attributed to the pre-Socratic philosophers Leucippus and Democritus of the 5th century BCE and later taken up by the Epicurean school.[5]

It wasn't until the end of the 18th century, more than two millennia after the Ancient Greek philosophers laid down the initial basic tenets of atomism, that physicists were able to expand upon this theory and provide a more empirical and mathematical basis for these essential building blocks of nature, building blocks which were eventually determined to be divisible in fact, consisting of electrons, protons and other even further divisible structures that are the basis of much study and debate in modern particle physics.

The first of these developments was the law of conservation of mass, formulated by Antoine Lavoisier in 1789, which states that the total mass in a chemical reaction remains constant, and the second was the law of definite proportions, first proven by the French chemist Joseph Louis Proust in 1799, which states that if a compound is broken down into its constituent elements, then the masses of the constituents will always have the same proportions, regardless of the quantity or source of the original substance; water, for example, always yields hydrogen and oxygen in a mass ratio of roughly 1 to 8.

Then, upon the publication of James Maxwell's Treatise on Electricity and Magnetism in 1873, Maxwell showed that the interactions of positive and negative charges, previously thought of as mediated by two separate forces, electricity and magnetism, were regulated by one force, electromagnetism, governed by four basic laws: 1) electric charges attract or repel one another with a force inversely proportional to the square of the distance between them, unlike charges attracting and like charges repelling; 2) magnetic poles, or states of polarization at individual points, attract or repel one another in a similar way and always come in pairs, every north pole yoked to a south pole; 3) an electric current in a wire creates a circular magnetic field around the wire, its direction, clockwise or counter-clockwise, depending on the direction of the current; and 4) a current is induced in a loop of wire when it is moved towards or away from a magnetic field, or a magnet is moved towards or away from it, the direction of the current depending on that of the movement.
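In modern differential notation these four laws take a compact form (a sketch in SI units; E and B are the electric and magnetic fields, ρ the charge density and J the current density):

∇·E = ρ/ε₀
∇·B = 0
∇×E = −∂B/∂t
∇×B = μ₀J + μ₀ε₀ ∂E/∂t

The extra term in the last equation, Maxwell's own addition, is what unifies the two forces and predicts self-propagating electromagnetic waves traveling at the speed of light.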

Then in 1897, J.J. Thomson discovered a particle, or corpuscle as he called it, estimated at the time to be roughly a thousand times lighter than the lightest atom. Thomson didn't know it then, but this corpuscle he had discovered was actually the electron. Thomson's discovery was followed closely thereafter by the discovery of a positively charged constituent of mass resting at the center of the atom by Ernest Rutherford in 1909, a student of Thomson. Rutherford, building on the work of his teacher, found that most of the mass and positive charge of an atom was concentrated in a very small fraction of its volume, which he presumed to be its center, what later came to be known as the nucleus of the atom. This result led Rutherford to propose a planetary model of the atom in which electrons of negative charge orbited a positively charged nucleus containing the majority of the mass of the atom.

Shortly after Rutherford's discovery, one of his students, Niels Bohr, landed on a broader and more well-defined model of the structure of the atom, one that leveraged findings from what would become Quantum Mechanics (although the field wasn't called that quite yet), and specifically some of Planck's work on quantization, to further describe and model the picture of the atom. By studying the hydrogen atom, Bohr theorized that an electron orbits the nucleus of an atom in particular, discrete circular orbits with fixed angular momentum and energy, the electron's distance from the nucleus being a function of its energy level.

Bohr's theory cleaned up some of the shortcomings of the planetary model proposed by Rutherford, because it explained how atoms could achieve stable states, a failing of the prior work. He further theorized that atoms could only make quantum leaps between energy states, and that when this occurred light was emitted or absorbed at a frequency proportional to the change in energy, explaining another phenomenon that was lacking in Rutherford's model and introducing basic building blocks of Quantum Theory. Essentially what Bohr discovered and contributed to Quantum Theory, leveraging Planck's model of the quantized nature of radiation emission, was that electrons orbit the nucleus of the atom at definite, discrete and fixed energy levels, and that when an electron jumps from one discrete state to another, it gives rise to the emission or absorption of electromagnetic radiation at a specific characteristic wavelength.[6]
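For the hydrogen atom, Bohr's quantization conditions can be sketched in a few lines (standard textbook results, stated here for illustration): the electron's angular momentum is restricted to integer multiples of the reduced Planck constant, L = nħ for n = 1, 2, 3, …, which confines the allowed energy levels to

Eₙ = −13.6 eV / n²

and a jump from a higher level to a lower one releases a photon whose frequency is fixed by the energy difference, ν = (E_initial − E_final)/h, which is precisely why hydrogen emits light only at its characteristic spectral lines.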

Atomic theory as it stands today was later refined through the work of many physicists in the fields of electromagnetism and radioactivity, developments which further subdivided atomic structure and gave rise to the term elementary particles, referring to the subatomic particles we are most familiar with today, namely electrons, protons and neutrons. But the story doesn't end here. Models in the world of theoretical physics started to get complicated pretty quickly in the decades after this wave (no pun intended) of discoveries in the early twentieth century. And as the theories became more complex, and the experimental results they predicted became more expansive, comprehensive and verified, some very interesting and revealing questions were posed that have still not been answered satisfactorily from Charlie's standpoint, and were never answered satisfactorily from Einstein's standpoint either.

 

[1] Toward the end of his life, Einstein is said to have remarked to his friend Linus Pauling, "I made one great mistake in my life — when I signed the letter to President Roosevelt recommending that atom bombs be made; but there was some justification — the danger that the Germans would make them". Quote from Einstein: The Life and Times by Ronald Clark, p. 752.

[2] 1905, the year in which Einstein's groundbreaking work on Brownian motion, the photoelectric effect, Special Relativity, and mass-energy equivalence was published, is sometimes referred to as his Annus Mirabilis, literally "miraculous year".

[3] From Minkowski’s address delivered at the 80th Assembly of German Natural Scientists and Physicians on September 21, 1908.

[4] See http://science.howstuffworks.com/science-vs-myth/what-if/what-if-faster-than-speed-of-light1.htm

[5] The word "atom" comes from the Greek adjective atomos, which literally means "indivisible". The ancient atomists, beginning with the pre-Socratics Leucippus and Democritus and continuing with the Epicurean school, posited that the world consisted of indivisible atoms that moved through a universal substratum of physical existence, i.e. the void, effectively defined as the parameters of space through which these indivisible atoms moved. It was believed that atoms could join together in various combinations, which explained the variety of things or substances that existed in the reality perceived by our senses.

[6] Since Bohr's model is essentially a quantized version of Rutherford's, some scholars refer to it as the Rutherford-Bohr model rather than simply the Bohr model. As a theory it may be considered obsolete given later advancements; however, because of its simplicity and its correct results for selected systems, the Bohr model is still commonly taught to introduce students to Quantum Mechanics.

Wave-Particle Duality: So Much for the Atom

From Charlie's standpoint, Relativity Theory could be grasped intellectually by the educated, intelligent mind. You didn't need advanced degrees or a deep understanding of complex mathematics to understand that, at a very basic level, Relativity Theory implied that basic measurements like speed, distance and even mass were relative and depended upon the observer's frame of reference; that mass and energy were equivalent and convertible into each other, related by the speed of light, which moved at a fixed rate no matter what your frame of reference; and that space and time were not in fact separate and distinct concepts, but needed to be combined into a single notion of spacetime in order for a more accurate picture of the universe to emerge. Relativity says that even gravity's effect is subject to the same principles at the cosmic scale, i.e. that spacetime "bends" around singularities (black holes for example), bends to the extent that light itself is affected by the severe gravitational forces at these powerful places in the universe. And indeed our measurements of time and space are "relative", relative to the speed and frame of reference from which they are made; the observer is in fact a key element in the process of measurement.

If you assumed all these things, you ended up with a more complete and accurate mathematical and theoretical understanding of the universe than Newtonian mechanics provided, one powerful enough that, despite the best efforts of many great minds over the last 100 years or so, it has yet to be supplanted by anything better, at least at the macro scale of the universe. Charlie didn't doubt that Relativity represented a major scientific and even metaphysical step forward in mankind's understanding of the physical universe, but a subtle and quite distinctive feature of this model was that it reinforced a deterministic and realist picture of the world. In other words, Relativity implicitly assumed that objects in the physical world did in fact exist, i.e. they were "real", real in the sense that they had an absolute existence somewhere in the spacetime continuum that could be described in terms of quantitative data like speed, mass, velocity, etc., and furthermore that if you knew a set of starting criteria, what scientists like to call a "system state", as well as the set of variables/forces acting on that system, you could in turn predict with certainty the outcome of those forces on the system, i.e. the observed properties of the objects in the system after the forces had acted upon it. In short, the physical world was fully deterministic.

Charlie didn't want to split hairs over these seemingly inconsequential and subtle assumptions, assumptions that underpinned not only Einstein's Relativity but to a great extent Newtonian mechanics as well, but these were in fact very modern metaphysical assumptions, and had not been assumed, at least not with the certainty of modern times, in the models of reality that existed prior to the Scientific Revolution. Prior to Newton, the world of the spirit, theology in fact, was very much considered to be just as real as the physical world, the world governed by science. This was true not only in the West but also in the East, and to a great extent it remains true in Eastern philosophical thought today, whereas in the West not so much.

But at their basic, core level, these concepts could be understood, grasped as it were, by the vast majority of the public, even if they had very little if any bearing on their daily lives and didn't fundamentally change or shift their underlying religious or theological beliefs, or even their moral or ethical principles. Relativity was accepted in the modern age, as were its deterministic and realist philosophical and metaphysical assumptions; the principles just didn't really affect the subjective frame of reference, the mental or intellectual frame of reference, within which the majority of humanity perceived the world around them. Relativity itself as a theoretical construct was relegated to the realm of physics, a problem to be understood in order to pass a physics or science exam in high school or college, then buried in one's consciousness in lieu of more pressing daily and life pursuits, be they family, career and money, or other forms of self-preservation in the modern Information Age; an era most notably marked by materialism, self-promotion, greed and capitalism, all of which, interestingly enough, pay homage to realism and determinism to a large extent.

Quantum Theory was altogether different however. Its laws were more subtle and complex than the world described by classical physics, the world described in painstaking mathematical precision by Newton, Einstein and others. And after a lot of studying and research, the only conclusion that Charlie could definitively come to was that in order to understand Quantum Theory, or at least to come to terms with it, a wholly different perspective on what reality truly is, or at the very least on how reality is to be defined, was required. In other words, in order to grasp what Quantum Theory actually means, the underlying intellectual context within which the behaviors of the particles/fields it describes are to be understood, a new framework of understanding, a new description of reality, must be adopted. What we considered to be "reality", or what was "real", as understood and implied by the classical physics that had dominated the minds of the Western world for over 300 years since the publication of Newton's Principia, needed to be abandoned, or at the very least significantly modified, in order for Quantum Theory to be comprehended in any meaningful way, in order for anyone to make any sense of what Quantum Theory "said" about the nature of the substratum of existence.

Things would never be the same from a physics perspective, this much was clear; whether the daily lives of the bulk of those struggling to survive in the civilized world would evolve in concert with these developments remained to be seen.

 

Quantum Mechanics is the branch of physics that deals with the behavior of particles and matter in the atomic and subatomic realms, the so-called quantum realm, given the quantized nature of "things" at this scale. To give some sense of scale: an atom is about 10⁻⁸ cm across, give or take, and the nucleus, or center of an atom, which is made up of what we now call protons and neutrons, is approximately 10⁻¹² cm across. An electron, or a photon for that matter, cannot truly be measured from a size perspective in terms of classical physics, for many of the reasons we'll get into below as we explore the boundaries of the quantum world, but suffice it to say that at present our best estimates of the size of an electron are in the range of 10⁻¹⁸ cm or so.[1]

Whether or not electrons, or photons (particles of light) for that matter, really exist as particles whose physical size and/or momentum can actually be "measured" is not as straightforward a question as it might appear, and it gets at some level to the heart of the problem we encounter when we attempt to apply the principles of existence or reality to the subatomic, or quantum, realm within the semantic and intellectual framework established by classical physics over the last three hundred years or so; namely, reality as defined by independently existing, deterministic and quantifiable measurements of size, location, momentum, mass or velocity.

The word quantum comes from the Latin quantus, meaning "how much", and it is used in this context to identify the behavior of subatomic things that move from and between discrete states, rather than through a continuum of values or states as is assumed in, and fundamental to, classical physics. The term had taken on meanings in several contexts within a broad range of scientific disciplines in the 19th and early 20th centuries, but it was given its modern, formal sense through Max Planck's work at the turn of the 20th century, and quantization arguably represents the prevailing and distinguishing characteristic of reality at this scale.

Newtonian physics, even as extended by Einstein's Relativity Theory at the beginning of the twentieth century (a theory whose accuracy is well established by experiment at this point), assumes that particles, things made up of mass, energy and momentum, exist independent of the observer and their instruments of observation, and are presumed to exist in continuous form, moving along specific trajectories, their properties (mass, velocity, etc.) changed only by the action of some force upon them. This is the essence of Newtonian mechanics, upon which the majority of modern day physics, or at least the laws of physics that affect us at the human scale, is defined, and it has at its philosophical heart the presumption of realism and determinism.

The only caveat Einstein put forth to this view is that these measurements themselves, of speed, or even of the mass or energy content of a specific object, can only be said to be universally defined according to these physical laws within the specific frame of reference of an observer. Their underlying reality is not questioned; these things clearly exist independent of observation or measurement, clearly (or so it seems), but the values, the properties, of these things change depending upon the frame of reference of the observer performing the measurement. This is what Relativity tells us. So the velocity of a massive body, and even the measurement of time itself, which is a function of distance and speed, is a function of the relative speed and position of the observer who is performing said measurement.

For the most part, the effects of Relativity can be ignored when we are referring to objects on Earth that are moving at speeds minimal with respect to the speed of light and are far less massive than, say, black holes. But as we measure things at the cosmic scale, where distances are measured in light years and where black holes and other massive phenomena bend spacetime itself (singularities), the effects of Relativity cannot be ignored.[2]

Leaving aside the field of Cosmology for the moment and returning to the history of the development of Quantum Mechanics: at the end of the 19th century, Planck was commissioned by electric companies to help create light bulbs that used less energy, and in this context he was trying to understand how the intensity of the electromagnetic radiation emitted by a black body (an object that absorbs all electromagnetic radiation regardless of frequency or angle of incidence) depends on the frequency of the radiation, i.e. the color of the light. In this work, after several iterations of hypotheses that failed to have predictive value, he fell upon the theory that energy is only absorbed or released in quantized form, i.e. in discrete packets of energy he referred to as "bundles" or "energy elements", the so-called Planck postulate. And so the field of Quantum Mechanics was born.[3]
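Planck's postulate can be written in a single line (a sketch in standard notation, with ν the frequency of the radiation and h what is now called the Planck constant): the energy exchanged by an oscillator of frequency ν is restricted to

E = nhν, n = 1, 2, 3, …

so energy comes only in whole multiples of the quantum hν, and it was precisely this restriction, rather than any continuous exchange of energy, that reproduced the observed black body spectrum.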

Despite the fact that Einstein is best known for his mathematical models and theories describing the forces of gravity and light at the cosmic scale, his work was also instrumental in the advancement of Quantum Mechanics. For example, in his work on the effect of radiation on metallic matter and non-metallic solids and liquids, he discovered that electrons are emitted from matter as a consequence of its absorption of energy from electromagnetic radiation of very short wavelength, such as visible or ultraviolet radiation. Einstein established that in certain experiments light appeared to behave like a stream of tiny particles, not just as a wave, lending more credence and authority to particle theories describing the quantum realm. He therefore hypothesized the existence of light quanta, or photons, as a result of these experiments, laying the groundwork for subsequent wave-particle duality discoveries and reinforcing Planck's discoveries with respect to black body radiation and its quantized behavior.[4]
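Einstein's account of the photoelectric effect reduces to one relation (again a sketch in standard notation): each absorbed light quantum of frequency ν delivers an energy hν to a single electron, which escapes the metal with a maximum kinetic energy of

K_max = hν − φ

where φ is the "work function", the minimum energy needed to free an electron from that particular material. This explains the otherwise puzzling observation that light below a threshold frequency ejects no electrons at all, no matter how intense the beam.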

Prior to the establishment of light's properties as waves, and in turn the establishment of the wave-like characteristics of subatomic elements such as electrons by Louis de Broglie in the 1920s, it had been fairly well established that these subatomic entities, electrons and photons as they were later called, behaved like particles. The debate over the nature of light and subatomic matter, however, goes all the way back to the 17th century, when competing theories of the nature of light were proposed by Isaac Newton, who viewed light as a system of particles, and Christiaan Huygens, who postulated that light behaved like a wave. It was not until the work of Einstein, Planck, de Broglie and other physicists of the twentieth century that these subatomic entities, light and electrons both, were shown to behave both like particles and like waves, the result depending upon the experiment and the context of the system being observed. This paradoxical principle, known as wave-particle duality, is one of the cornerstones, and underlying mysteries, of Quantum Theory.

As part of the discovery of subatomic wave-like behavior, what Planck found in his study of black body radiation, and Einstein as well in his study of light and photons, was that the measurements or states of a given particle such as a photon or an electron had to take on values that were multiples of very small and discrete quantities, i.e. were non-continuous, the relation of which is represented by a constant value known as the Planck constant[5].
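The scale involved is worth pausing on; as a quick worked example (using the standard values h ≈ 6.626 × 10⁻³⁴ J·s and c ≈ 3.0 × 10⁸ m/s), a single photon of green light with wavelength λ = 550 nm carries an energy of

E = hc/λ ≈ 3.6 × 10⁻¹⁹ joules

an almost unimaginably small step, which is why the graininess of energy goes entirely unnoticed at everyday scales and why classical physics works as well as it does there.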

In the quantum realm, then, there is not a continuum of values and states of matter, as had been assumed in physics up until that time; there are bursts of energy and changes of state that are ultimately discrete, at fixed amplitudes or values, where certain states and certain values simply cannot exist. This represents a dramatic departure from the way physicists, and the rest of us mortals, think about movement and change in the "real world", and most certainly a significant departure from the Newtonian mechanics upon which Relativity was based, where the idea of continuous motion, in fact continuous existence, was never even questioned.


The classic demonstration of light’s behavior as a wave, and perhaps one of the most astonishing and game-changing experiments of all time, is what is called the double-slit experiment.  In the basic version of this experiment, a light source such as a laser beam is shone at a thin plate that is pierced by two parallel slits.  The light passes through the slits and is displayed on a screen behind the plate.  The image on that screen, as it turns out, is not one constant band of light passing through each of the slits, as you might expect if light were simply a particle or set of particles.  Instead, it is a pattern of light and dark bands, indicating that the light is behaving like a wave and is subject to interference, the strength of the light on the screen cancelling out or reinforcing depending upon how the individual waves interfere with each other.  This is exactly the fundamental wave-like behavior we see, for example, in water, where waves gain strength where their peaks synchronize and cancel each other out where peak meets trough.
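
As a rough illustration, the banded pattern can be sketched in a few lines of code.  This is a minimal far-field model that ignores the single-slit diffraction envelope; the wavelength, slit separation and screen distance chosen below are illustrative, not taken from any particular experiment.

```python
import numpy as np

wavelength = 633e-9  # light wavelength in meters (a red laser, say)
d = 0.25e-3          # separation between the two slits (m)
L = 1.0              # distance from the slits to the screen (m)

x = np.linspace(-0.02, 0.02, 1001)  # positions along the screen (m)

# The phase difference between the two paths is 2*pi*d*x/(wavelength*L);
# the screen intensity is cos^2 of half that phase difference.
intensity = np.cos(np.pi * d * x / (wavelength * L)) ** 2

# intensity == 1 where the waves arrive in phase (bright bands),
# intensity == 0 where they cancel exactly (dark bands).
```

With these numbers the bright bands repeat every λL/d, about 2.5 mm across the screen.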

What is even more interesting, however, and was most certainly unexpected, is what happened once equipment was developed that could reliably send a single particle, an electron or photon for example, through a double-slitted plate.  Each individual particle did indeed end up at a single location on the screen, as expected, but – and here was the kicker – the location on the screen where the particle ended up, as well as which slit the particle appeared to pass through (in later versions of the experiment which slit “it” passed through could in fact be detected), was not consistent and followed seemingly random, erratic behavior.  What researchers found as more and more of these subatomic particles were sent through the plate one at a time was that the same wave-like interference pattern emerged that showed up when the experiment was run with a full beam of light, as had been done by Young some 100 years prior[6].

So hold on for a second.  Charlie had gone over this again and again, and all the literature he read on Quantum Theory and Quantum Mechanics pretty much said the same thing, namely that the heart of the mystery of Quantum Mechanics could be seen in this very simple experiment.  And yet it was really hard, perhaps impossible, to understand what was actually going on, or at least to understand without abandoning some of the foundational principles of classical physics, for example that these things called subatomic particles actually existed as independent particles, or “objects” as we might refer to them at the macroscopic scale, because they seemed to behave like waves when looked at in aggregate and yet behaved, sort of, like particles when looked at individually.

What was clear was that this subatomic particle, corpuscle or whatever you wanted to call it, did not appear to have a linear and fully deterministic trajectory in the classical physics sense; this much was evident from the fact that the distribution against the back screen, when the particles were sent through the double slits individually, appeared to be random.  But what was more odd was that when the experiment was run one corpuscle at a time, again whatever that term really means at the quantum level, not only was the final location on the screen seemingly random for each individual run, but the same aggregate pattern emerged after many, many single-corpuscle runs as when a full wave, or set of these corpuscles, was sent through the double slits.

So it appeared, and this was and still remains a very important and telling characteristic of the behavior of these “things” at the subatomic scale, that not only did the individual photon seem to be aware of the final wave-like pattern of its parent wave, but the corpuscle appeared to be interfering with itself as it passed through the two slits individually.  Charlie wanted to repeat this for emphasis, because he had done a heck of a lot of research into Quantum Theory for a guy who was inherently lazy, all in search of the source of the fundamentally mechanistic and materialistic worldview which so dominates Western society today, a view which clearly rested on philosophical and metaphysical foundations stemming from classical physical notions of objective reality.  With the double-slit experiment you could clearly see that the fundamental substratum of existence not only exhibited wave-like as well as particle-like behavior, but that, when looked at at the individual “particle” level, whatever the heck that actually means at the subatomic scale, the individual particle seemed to be aware of its parent wave structure, and the experimental results seemed to imply that it was interfering with itself.

Furthermore, to make things even more mysterious, as the final locations of the individual photons in the two-slit and other related experiments were evaluated and studied, it was discovered that although the final location of any individual particle could not be determined exactly before the experiment was performed, i.e. there was a fundamental element of uncertainty or randomness at the individual corpuscle level, the final locations of these particles measured in toto over many runs of the experiment exhibited a statistical distribution that could be modeled quite precisely.  That is to say, the sum total distribution of the final locations of all the particles after passing through the slit(s) could be established stochastically, in terms of a well-defined probability distribution consistent with probability theory and the well-defined mathematics that governs statistical behavior.  So in total you could predict what the particle-like behavior would look like over a large set of particles in the double-slit experiment even if you couldn’t predict with certainty what the outcome would be for any individual corpuscle.
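
Continuing the earlier sketch, this individually-random-but-collectively-lawful behavior is easy to mimic in code: treat the normalized fringe intensity as a probability density and draw single “hits” from it.  The numbers are again illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

x = np.linspace(-0.02, 0.02, 2001)                # screen positions (m)
density = np.cos(np.pi * 0.25e-3 * x / (633e-9 * 1.0)) ** 2
density /= density.sum()                          # normalize to sum to 1

# Each draw is one "single-particle run": an unpredictable landing spot.
hits = rng.choice(x, size=50_000, p=density)

# Any one element of `hits` looks random; a histogram of all of them
# traces out the same banded interference pattern as the full beam.
counts, edges = np.histogram(hits, bins=100)
```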

The mathematics behind this particle distribution is what is known as the wave function, typically denoted by the Greek letter psi, ψ, or its capital equivalent Ψ.  It predicts what the probability distribution of these “particles” will look like on the screen behind the plate after many individual experiments are run, or in quantum theoretical terms, it describes the quantum state of a particle and how that state evolves through a fixed spacetime interval.  The equation governing the wave function was discovered by the Austrian physicist Erwin Schrödinger in 1925 and published in 1926, and it is commonly referred to in the scientific literature as the Schrödinger equation, analogous in the field of Quantum Mechanics to Newton’s second law of motion in classical physics.
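
In its general time-dependent form, the equation reads

iℏ ∂Ψ/∂t = ĤΨ

where ℏ is the reduced Planck constant and Ĥ is the Hamiltonian operator representing the total energy of the system.  The wave function itself is never directly observed; on Max Born’s statistical interpretation (more on Born below), the squared magnitude |Ψ|² gives the probability density of finding the particle at a given place and time.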

This wave function represents a probability distribution of potential states or outcomes that describes the quantum state of a particle, and it predicts with a great degree of accuracy the potential locations of a particle given its initial state and conditions of motion.  With the discovery of the wave function, it became possible to predict the potential locations or states of these subatomic particles, an extremely potent theoretical model that has led to all sorts of inventions and technological advancements since its discovery.

Again, this implied that individual corpuscles were interfering with themselves when passing through the two slits in the plate, which was very odd indeed.  In other words, the individual particles were exhibiting wave-like characteristics even when they were sent through the double-slitted plate one at a time.  This phenomenon was shown to occur with atoms as well as electrons and photons, confirming that all of these subatomic so-called particles exhibit wave-like as well as particle-like qualities, the behavior observed depending upon the type of experiment, or measurement as it were, to which the “thing” was subjected.

As Louis de Broglie, the physicist responsible for bridging the theoretical gap between matter, in this case electrons, and waves by establishing the symmetric relation between momentum and wavelength which has at its core Planck’s constant (the de Broglie relation, given just after the passage below), described this mysterious and somewhat counterintuitive relationship between matter and waves: “A wave must be associated with each corpuscle and only the study of the wave’s propagation will yield information to us on the successive positions of the corpuscle in space.”[7]  In the Award Ceremony Speech in 1929 in honor of Louis de Broglie for his work in establishing the relationship between matter and waves for electrons, we find the essence of his groundbreaking and still mysterious discovery, which remains a core characteristic of Quantum Mechanics to this day.

 

Louis de Broglie had the boldness to maintain that not all the properties of matter can be explained by the theory that it consists of corpuscles. Apart from the numberless phenomena which can be accounted for by this theory, there are others, according to him, which can be explained only by assuming that matter is, by its nature, a wave motion. At a time when no single known fact supported this theory, Louis de Broglie asserted that a stream of electrons which passed through a very small hole in an opaque screen must exhibit the same phenomena as a light ray under the same conditions. It was not quite in this way that Louis de Broglie’s experimental investigation concerning his theory took place. Instead, the phenomena arising when beams of electrons are reflected by crystalline surfaces, or when they penetrate thin sheets, etc. were turned to account. The experimental results obtained by these various methods have fully substantiated Louis de Broglie’s theory. It is thus a fact that matter has properties which can be interpreted only by assuming that matter is of a wave nature. An aspect of the nature of matter which is completely new and previously quite unsuspected has thus been revealed to us.[8]
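
Stated quantitatively, de Broglie’s hypothesis associates with any particle of momentum p a wavelength λ given by

λ = h/p

with h once again the Planck constant.  The larger the momentum, the shorter the associated wavelength, which is why the wave nature of everyday objects is utterly imperceptible while that of electrons is not.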

 

So by the 1920s then, you have a fairly well established mathematical theory to govern the behavior of subatomic particles, backed by a large body of empirical and experimental evidence, that indicates quite clearly that what we would call “matter” (or particles or corpuscles) in the classical sense, behaves very differently, or at least has very different fundamental characteristics, in the subatomic realm.  It exhibits properties of a particle, or a thing or object, as well as a wave depending upon the type of experiment that is run.  So the concept of matter itself then, as we had been accustomed to dealing with and discussing and measuring for some centuries, at least as far back as the time of Newton (1642-1727), had to be reexamined within the context of Quantum Mechanics.  For in Newtonian physics, and indeed in the geometric and mathematical framework within which it was developed and conceived which reached far back into antiquity (Euclid circa 300 BCE), matter was presumed to be either a particle or a wave, but most certainly not both.

What complicated matters even further was that matter itself, again as defined by Newtonian mechanics and its extension via Relativity Theory, which taken together are commonly referred to as classical physics, was presumed to have some very definite, well-defined and fixed, real properties.  Properties like mass, location or position in space, and velocity or trajectory were all presumed to have a real existence independent of whether or not they were measured or observed, even if the actual values were relative to the frame of reference of the observer.  All of this hinged, of course, upon the notion that the speed of light was fixed no matter the frame of reference of the observer; this was a fixed absolute, and nothing could move faster than the speed of light.  Yet even this seemingly self-evident notion, or postulate one might call it, ran into problems as scientists continued to explore the quantum realm.

So by the 1920s, the way scientists viewed matter as we would classically consider it, within the context of Newton’s postulates from the late seventeenth century as extended into the notion of spacetime put forth by Einstein, was encountering some significant difficulties when applied to the behavior of elements in the subatomic, quantum, world.  Difficulties, it is important to point out, that persist to this day.  Furthermore, there was extensive empirical and scientific evidence lending significant credibility to Quantum Theory, evidence which illustrated irrefutably that these subatomic elements behaved not only like waves, exhibiting characteristics such as interference and diffraction, but also like particles in the classic Newtonian sense, with measurable, well-defined characteristics that could be quantified within the context of an experiment.

In his Nobel Lecture in 1929, Louis de Broglie, summed up the challenge for physicists of his day, and to a large extent physicists of modern times, given the discoveries of Quantum Mechanics as follows:

 

The necessity of assuming for light two contradictory theories – that of waves and that of corpuscles – and the inability to understand why, among the infinity of motions which an electron ought to be able to have in the atom according to classical concepts, only certain ones were possible: such were the enigmas confronting physicists at the time…[9]

 

The other major tenet of Quantum Theory that rests alongside wave-particle duality, and that adds even more complexity when trying to wrap our minds around what is actually going on in the subatomic realm, is what is referred to as the uncertainty principle, or the Heisenberg uncertainty principle, named after the German theoretical physicist Werner Heisenberg.  It was Heisenberg who first put forth the theoretical limits governing what can be known about the state of these subatomic particles in experiments like the double-slit experiment previously described, even though the wave function itself was the discovery of Schrödinger.

The uncertainty principle states that there is a fundamental theoretical limit on the accuracy with which certain pairs of physical properties of atomic particles, position and momentum being the classic pair, can be known at any given time.  Physical quantities come in such conjugate pairs, and only one member of a given pair can be known precisely at any given time; when one quantity in a conjugate pair is measured and becomes determined, its complementary partner becomes indeterminate.  What Heisenberg discovered, and proved mathematically, was that the more precisely one attempts to measure one of these complementary properties of a subatomic particle, the less precisely the other associated complementary attribute can be determined or known.
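
In its modern form, the principle for the position/momentum pair reads

Δx Δp ≥ ℏ/2

where Δx and Δp denote the statistical spreads of position and momentum measurements and ℏ is the reduced Planck constant: the product of the two spreads can never fall below ℏ/2, no matter the experiment.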

Published by Heisenberg in 1927, the uncertainty principle holds that there are fundamental, conceptual limits of observation in the quantum realm, another radical departure from the realistic and deterministic principles of classical physics, which held that all attributes of a thing were measurable at any given time, i.e. that the thing or object existed and was real and had measurable, well-defined properties irrespective of its state.  It’s important to point out here that the uncertainty principle is a statement about a fundamental property of quantum systems as they are mathematically and theoretically modeled and defined, and of course empirically validated by experimental results, not a statement about the technology and method of the observational systems themselves.  This wasn’t a problem with the maturity of the theory, or with the state of the instrumentation being used for measurement; it was a characteristic of the domain itself.

Max Born, who won the Nobel Prize in Physics in 1954 for his work in Quantum Mechanics, specifically for his statistical interpretation of the wave function, describes this other seemingly mysterious attribute of the quantum realm as follows (the specific language he uses reveals at some level his interpretation of the quantum theory, more on interpretations later):

 

…To measure space coordinates and instants of time, rigid measuring rods and clocks are required.  On the other hand, to measure momenta and energies, devices are necessary with movable parts to absorb the impact of the test object and to indicate the size of its momentum.  Paying regard to the fact that quantum mechanics is competent for dealing with the interaction of object and apparatus, it is seen that no arrangement is possible that will fulfill both requirements simultaneously.[10]

 

Whereas classical physicists, meaning physics prior to the introduction of Relativity and Quantum Theory, distinguished between the study of particles and the study of waves, the introduction of Quantum Theory and wave-particle duality established that this classic intellectual bifurcation of physics at the macroscopic scale was wholly inadequate for describing and predicting the behavior of these “things” that exist in the subatomic realm, all of which take on the characteristics of both waves and particles depending upon the experiment and the context of the system being observed.  Furthermore, the precision with which the state of a “thing” in the subatomic world could be defined was conceptually bound, establishing theoretical limits on how well a given subatomic state could be known, another divergence from classical physics.  And on top of this came the required mathematical machinery of statistics and probability theory, as well as significant extensions to the underlying geometry needed to map the wave function itself in subatomic spacetime, all of which called quite clearly into question our classical materialistic notions, based on realism and determinism, upon which scientific advancement had been built for centuries.

 

[1] Our current ability to measure the size of these subatomic particles goes down to approximately 10⁻¹⁶ cm with currently available instrumentation, so at the very least we can say that measuring anything in the subatomic realm, most certainly the realm of the general constituents of basic atomic elements such as quarks or gluons, is very challenging to say the least.  Even the measurement of the estimated size of an atom is not so straightforward, as the measurement is dictated by the circumference of the atom, a measurement that relies on the size or radius of the “orbit” of the electrons of said atom, “particles” whose actual “location” cannot be “measured” in tandem with their momentum, per the standard tenets of Quantum Mechanics, both of which constitute what we consider measurement in the classic Newtonian sense.

[2] In some respects, even at the cosmic scale, there is still significant reason to believe that Relativity has room for improvement, as evidenced by what physicists call Dark Matter and Dark Energy, constructs that have been created by theoretical physicists to describe matter and energy that they believe should exist according to Relativity Theory but whose existence remains as yet “undiscovered”.  Both Dark Matter and Dark Energy represent active lines of research in modern day Cosmology.

[3] Quantum theory has its roots in this initial hypothesis by Planck, and in this sense he is considered by some to be the father of quantum theory and quantum mechanics.  It is for this work in the discovery of “energy quanta” that Max Planck received the Nobel Prize in Physics in 1918, some 15 or so years after publishing.

[4] Einstein termed this behavior the photoelectric effect, and it’s for this work that he won the Nobel Prize in Physics in 1921.

[5] The Planck constant was first described as the proportionality constant between the energy (E) of a photon and the frequency (ν) of its associated electromagnetic wave.  This relation between the energy and frequency is called the Planck relation or the Planck–Einstein equation: E = hν.

[6] The double slit experiment was first devised and used by Thomas Young in the early nineteenth century to display the wave like characteristics of light.  It wasn’t until the technology was available to send a single “particle” (a photon or electron for example) that the wave like and stochastically distributed nature of the underlying “particles” was discovered as well.  http://en.wikipedia.org/wiki/Young%27s_interference_experiment

[7] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, Dec 12th, 1929

[8] Presentation Speech by Professor C.W. Oseen, Chairman of the Nobel Committee for Physics of the Royal Swedish Academy of Sciences, on December 10, 1929.  Taken from http://www.nobelprize.org/nobel_prizes/physics/laureates/1929/press.html.

[9] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, Dec 12th, 1929

[10] Max Born, “The statistical interpretation of quantum mechanics” Nobel Lecture, December 11, 1954.

To What End: The Limits of Science

Charlie could remember back to when some of this had all started to germinate.  He was still in school back then.  Back in Providence.  When he was a ‘student-athlete’, whatever the heck that meant.  But in his better moments, he was an amateur philosopher.  Exploring the nature and depths of his own mind, and looking at and analyzing the scientific and analytical models that were presented before him that described reality.  There was hard science, there were the arts, and there was philosophy.  And certainly if you read something in a textbook presented by a Professor with a PhD, it must be true.  A hard fact.  Undisputed.

The sciences were a little tough for Charlie though.  He steered clear of disciplines that had lab hours or were brutally difficult to get through.  That left out most of the sciences.  He didn’t get into software engineering until much later.  Until he had to find a way to make a living that didn’t involve hitting a yellow fuzzy ball.  He did read some Einstein though, and some Stephen Hawking, just to try and get an understanding of the scientific models that underlie the physical world that we lived in.

What struck Charlie about some of these models, not that he understood them completely of course (nor did he think that he completely understood them today), was the limitations that seemed to be present in their descriptive power.  Quantum Theory in particular had this embedded notion of “uncertainty”, some sort of probability distribution of outcomes that mapped the behavior of these subatomic things, a model that by design was incompatible with the tried and true notions of classical physics: that things and objects were real, had mass and velocity, and “existed” beyond any act of observation or measurement, even if their “reality”, as defined by these measurable quantities, was relative at a very basic level to the frame of reference of the observer.  The models were supposed to describe the world we lived in, at least better than any of the other theories out there, and yet they seemed to just beg more questions.

Quantum Mechanics even had a principle they called the uncertainty principle.  In physics?  So part of the theory is that there are certain limits on what can be known?  It seemed very odd to have a principle called the uncertainty principle, one that was so well defined, mathematically even (Δp Δq ≥ ℏ/2 if you must know), sitting right square in the middle of the hardest of sciences.

In an oft quoted passage, one of the greatest scientific minds of the 20th century and one of the original formulators of Quantum Mechanics, Max Born had this to say about the limits of Quantum Theory, which calls out directly the epistemological limits of science itself to some degree.

 

“ …To measure space coordinates and instants of time, rigid measuring rods and clocks are required. On the other hand, to measure momenta and energies, devices are necessary with movable parts to absorb the impact of the test object and to indicate the size of its momentum. Paying regard to the fact that quantum mechanics is competent for dealing with the interaction of object and apparatus, it is seen that no arrangement is possible that will fulfill both requirements simultaneously… ”[1]

 

So science had its limits then.  Scientists themselves recognized these limits.  And they gave the limits names.  They named this one the “uncertainty principle”, that one “relativity”.  That said it all, Charlie remembered thinking.  So even hardcore theoretical physicists recognized the limits of their models and of the methods they used to arrive at and measure outcomes.  What we call science, based upon empirical study and verifiable evidence, what we deemed to be the boundaries of our physical world, appeared to be simply a map of the territory, and a map with some very basic underlying limitations at that.

And yet the Western mind, if one could generalize such a thing, was rooted in a fundamental belief in the “reality” of the physical world, believing that all experience and reality was explainable and predictable, basing its assumptions on what appeared to be Reason and Logic, built essentially on empiricism, or what Pirsig called logical positivism.  So Charlie, and science itself it appeared, was presented with a philosophical problem: the implications of this belief system, its limits in fact, its basic assumptions about “reality”, should be well understood and well taught.  And yet these limitations weren’t taught.  The assumptions built into these models, assumptions that modern science itself directly called into question, seemed to be brushed under the rug so to speak.  David Bohm’s struggle throughout the end of his career in fact reflected this effort to have these basic assumptions, which rested at the heart and pinnacle of modern science, brought to light in some meaningful way, and it in turn forced him to construct a broader theory of knowledge which incorporated physics, and the mind, and spoke directly to the basic assumptions of Western science which no longer seemed tenable.

Then there was this whole religious orthodoxy thing that remained a mystery to Charlie, and yet still held tremendous influence and sway over millions and millions of people throughout the world.  Not to pick on Christians here, as the Muslims and Jews (mainly the Abrahamic religions for the most part it seemed) all had their religious orthodoxy which held their Scripture to be divine revelation, to be interpreted literally and used as a reference guide to life itself, despite the fact that this Scripture which they held so dear was clearly interpreted, translated and compiled by authors who were definitely not the prophets in question.  It did not take too much research to find out that neither Moses, nor Jesus, nor Muhammad actually wrote anything down; they were presumably too busy living and teaching and reveling in the glory of the Creator.  These Evangelicals held that the word of their God, as it was translated from the original Greek or Hebrew or Arabic as the case may be, should be interpreted literally, and that the ‘subjective’ experience of mystics should be ignored because it has no basis in objective scientific truth; God reveals himself only to his chosen people.  That just didn’t seem to hold water to Charlie.  The premise seemed to lack the very rational foundations that it held so dear.

His mind rolled back to his senior year in college.  He and Jenry were roommates.  They lived in some shabby old house right by the local pub they used to go to all the time – Oliver’s it was called.  The location was great, but the place was practically falling apart.  It was college though; you were supposed to live like that apparently.  And yet in this setting there was room for abstract thought, some exploration of the ideas and concepts that were being pressed into their formative minds.  And Charlie was doing enough reading, was exposed to enough of the basic principles that formed the basis of modern science, that his mind was able to see a hole in the framework.  A hole in the model as it stood.  And yet he didn’t understand why this hole wasn’t more obvious to everyone around him, or at the very least, why it didn’t grab as much attention as he thought it should.

 

Physics as it stands today rests on the conceptual framework that physical reality is measurable, quantifiable and fundamentally “real”.  And this entire framework rests on the belief in, the ultimate faith in, the predictive power of advanced mathematics to represent the physical world.  This branch of knowledge, and it is important to keep in mind that it is but one branch of knowledge, takes as given that the physical world is of three dimensions, dimensions represented by Cartesian (Euclidean) space that can be mapped in a basic x, y, and z coordinate system capable of representing the location of anything in physical space.  And in turn, that time is mapped over it as a fourth dimension and always moves linearly in one direction.  This model had lasted from the time of the Greeks until Einstein’s day, more than two thousand years.

But Einstein postulated, and later scientific experiments confirmed, that time and physical space were not only a function of the observer, but that time and space in and of themselves were “relative” in fact: the faster you approached the speed of light, the more your notion of time and space diverged from that of an observer at rest.  He also showed that in order to build a more comprehensive model of physical reality, space and time needed to be fundamentally linked as conceptual constructs, and in fact, at the cosmic scale, spacetime was elastic, it “bent”.

Quantum Mechanics in turn showed that not only did the subatomic world operate according to very different and seemingly irreconcilable laws than those of “classical physics”, but that the nature of physical reality was much more complex than perhaps we could ever imagine: the underlying physical structure of the universe, of all of the physical world in fact, behaved not only according to the principles of matter, or particles, but also according to wave-like principles.  Hence the wave-particle duality paradox that sits at the heart of quantum reality and remains one of the great mysteries of science.  But Quantum Theory also shows us unequivocally that the idea of “measurement”, which sits at the very core of the philosophy of Western science, has limits, and that irrespective of the conundrum between classical physics and Quantum Theory, there remain interpretative questions about Quantum Theory itself that are still unanswered and force us to incorporate philosophy, metaphysics, our definition of knowledge and of reality itself, back into the discussion.

What Bohm searched for, where he branched from theoretical physics into the realm of metaphysics, left the reservation so to speak, was a unified order, which he was compelled to establish after concluding that the only rational explanation of Quantum Theory was a notion he called undivided wholeness.  He did not see a purely mathematical and theoretical answer to the seemingly irreconcilable differences between classical physics and quantum reality.  His premise seemed to be not only that a mathematical model incorporating both the principles of Einstein’s General Relativity and those underlying Quantum Theory was impossible, but that in order to make sense of the “reality” of the models that described the two different domains, an adventure beyond physics was inevitable.

Bohm saw that the only path of reconciliation, as it were, lay outside the domain of physics and in the realm of metaphysics, where the notions of Mind and Intellect were an integral part of the process of experience, i.e. his holomovement concept, and were directly incorporated into the theoretical model.  He effectively concluded that any study of the nature of the physical universe led one, from a rational and empirical basis alone, to the notion of an underlying implicate and explicate order structure, in which various explicate orders are perceived as standing unfolded from an underlying, coherent implicate order characterized by some level of undivided wholeness, a concept within which thought itself is an integral part.

The implication of an explicate and implicate order framework for reality, given that Bohmian Mechanics illustrates the possibility of non-local hidden variable theories to explain Quantum Mechanics, is that the existence of a supposed “unified field theory”, or a model of “quantum gravity” so sought after by physicists since Einstein, is highly unlikely.  To take the implications one step further, Bohm’s model implies that mathematics as a model for describing reality, albeit powerful for describing various explicate orders such as Newtonian mechanics, Relativity (both the Special and General Theories) and Quantum Mechanics, is limited to explicate orders; in order to find a holistic model describing all of reality, and in turn all explicate orders, one must look to the concepts of consciousness, integrated wholeness and interdependence, leaning on what appear to be very Eastern philosophical principles, fundamental axioms as it were in Vedanta and Buddhism.

Einstein’s belief in this “Unified Field Theory”, the existence of which essentially forms the basis of his criticism of Quantum Mechanics as incomplete, seemed not only improbable, but perhaps even impossible given the fundamental incompatibilities of the assumptions of the different theories and models.  In other words, the very idea of local realism and local determinism as a construct, core to Einstein’s theories of Special and General Relativity and of course to Newtonian mechanics, seemed at best limited to a certain domain of experience and at worst fundamentally flawed as an assumption about the basis of physical reality.  And this violation of the principle of local realism has been established, at least at the quantum level, not only mathematically with the introduction of Bohmian Mechanics and the notion of Quantum Potential, but subsequently also experimentally, by showing the relationship and interdependence of particle properties in two systems separated by classical physical boundaries.

Charlie thought that this quest for a “unified field theory” was a bit of a fool’s errand, and that what we really should be focused on was a quest for a “unified knowledge theory”, making it explicit that some elements of metaphysics, theoretical constructs that could provide linking and overarching themes across all the branches of science, must be included in our models of reality so that all of our knowledge and all of our experience can be understood and comprehended in a fully coherent and consistent conceptual framework.

But in order to come up with a “unified knowledge theory”, you had to move beyond physics and modern science, and incorporate the science of mind and the act of perception itself into the overall framework.  Mathematical models of physical reality, whatever that was, appeared to take you only so far, which is the essence of Bohm’s case for an implicate and explicate order framework for “reality”, branching away from the orthodox interpretation of Quantum Theory (the Copenhagen Interpretation), which held that Quantum Theory was not in fact a framework for reality as we know it but simply a measuring tool that told us, approximately, the behavior of “stuff” at the subatomic level.

Hence Charlie’s ultimate conclusion that this search for a “Unified Field Theory”, which drives the field of theoretical physics, as well as particle physics to a great extent today, is fundamentally misguided.  String Theory and other abstract theoretical mathematical constructs represent the search for an answer to a metaphysical question using a tool that is wholly inadequate to the problem.  The use of mathematics to answer the metaphysical question of how the universe works, and of how Quantum Theory and General Relativity can be unified into a “unified field theory”, appeared to be a fruitless effort, akin to an attempt to use a hammer and nails to build a skyscraper.

What we should be looking for, and what Bohm really provided us with, and in fact what Aristotle spoke to over two thousand years ago, is a Unified Knowledge Theory, within which physics, metaphysics, biology, psychology, etc. can be viewed as branches of knowledge that complement each other to provide a complete picture of the world we live in.  Branches where these seemingly contradictory and separate domains can peacefully coexist and collectively give us a perspective on the nature of reality as a whole, as well as our place within it.

 

What Charlie thought he had fallen upon, something that seemed to go unnoticed in the modern era, the Age of Reason, the Age of Science, was that understanding starts and ends with language, the means by which we communicate ideas to one another and construct an understanding of, and are able to navigate through, the world around us.  Language in its most concrete form is reflected in the written word, as expressed in the various phonetic alphabets which were developed, invented, to express and codify language, and to encapsulate and communicate more abstract concepts into “systems” of thought that allowed us to express more complex ideas and formulate models of reality.  Of course writing was most likely invented to record various forms of trade and economic transactions, but from a broader perspective it was then used to communicate knowledge itself, which in turn formed the basis of how we look at the world around us and how we perceive our place in this reality, harkening back to those age-old questions that have plagued man since the dawn of history: who are we, and whence did we come?

The Greek language, the wellspring of so much of the Western European intellectual vocabulary, in many respects came to define how we look at knowledge itself in all its various forms.  As far as Charlie could gather, this seemed to be Aristotle’s unique and lasting contribution to the West.  It is from his branches of knowledge, his epistêmai, that the language of modern science as a whole is derived.  And out of this ancient Greek philosophical movement, mathematics also originated as one of the cornerstones of metaphysics.  These mathematical principles, even the principle of the One, were a core part of the Greek philosophical schools, as evidenced not only by Aristotle’s comprehensive discussion of them in his Physics and Metaphysics (even if he dismisses them as incoherent belief systems) but also more directly in the Pythagorean school, which was influential throughout Greece in the pre-Socratic era.  It is from this tradition that Euclid and Ptolemy come, and it is from these “scientists” that our modern reliance on mathematics as the ultimate expression of creation stems.

But mathematics, as we have found, is a limited and constrained abstract tool, even if it is perhaps our most powerful abstract tool for modeling (physical) reality.  It is powerful, yes, but clearly not powerful enough to explain the totality of the behavior of particles and bodies in both the subatomic world as outlined in Quantum Mechanics and the world of massive bodies which warp spacetime as described in Einstein’s theories of Special and General Relativity.  In order to construct a fully coherent descriptive model of all existence, it appeared that you needed more abstract symbols and more consistent, explicit assumptions about the grounding of existence, and the notion of perception, which encompasses the more scientific notions of observation and measurement, had to be built into the model somehow.

Mathematics, seen as the ultimate language for describing the physical world, a tenet solidified by Sir Isaac Newton and reinforced by the sheer beauty and elegance of Albert Einstein’s Special and General Theories of Relativity, in fact constrained us from seeing the limits of this language in describing the ultimate source of all things and the process by which the universe itself was created.  Einstein himself fell into this trap in his immovable belief that Quantum Theory was flawed, incomplete, failing to consider the possibility that perhaps some of the basic assumptions about the nature of physical reality needed to be at the very least relaxed, if not abandoned entirely, something he was unwilling even to consider, such was his conviction in the classical view of the world.

What Bohm searched for, and what ultimately led him out of physics proper and into metaphysics and philosophy, was some notion of unified order under which both classical physics and quantum reality could be explained.  He concluded that in order to explain the totality of even a purely physical reality, one had to formulate a theory of order that presumed some sort of hierarchical structure, where various explicate orders could manifest to explain a certain domain and yet at the same time be incorporated into a single model of reality, his implicate order.  He furthermore postulated that it was more accurate to look at reality as a process of unfoldment rather than as some hard and fast physical reality, as had been assumed by classical physics over the last few hundred years.  And therefore Bohm had to leave the world of physics proper and enter the realm of metaphysics, where the more abstract concepts of mind and the notion of perception itself could be, and had to be, incorporated into the model.

 

So Charlie came to what he thought was the logical conclusion: any unified knowledge theory must encompass levels of abstraction that go beyond mathematics, and yet at the same time must be constrained by language itself, which is the means through which we communicate ideas and thoughts to each other.  In other words, any unified knowledge theory, any comprehensive and coherent model of the world, must incorporate the act of perception directly into the model – this is the Mind of Anaxagoras, the Intellect of the Neo-Platonists.  But there was nothing less empirical, less scientific, than the science of the mind, i.e. psychology, right?

Interestingly enough, Carl Jung postulated that you could actually “prove” the existence of what he referred to as the collective unconscious, which as its name suggests represents an underlying universal mental framework from which individual psyches draw their source, or at least the source of what he called archetypes: universal archaic symbols and patterns, motifs and themes, that he found common in the psyches of a wide range of his patients, too common from his perspective to be the result of chance or happenstance.

Jung’s method of proof, as it were, was to establish the connection of the individual psyche with what he described as universal archetypal themes, thereby establishing the existence of some universal ground of human symbols from which the individual symbology, the psyche, must draw.  He reckoned that the individuals who perceived these archetypes, themes and motifs, which he saw manifest in a wide variety of his patients in his psychoanalytic work, could have had no prior knowledge of the existence of these archetypes, and that therefore the symbols themselves, the common mythical themes, must stem from a source that is present in some way in all human psyches and yet is not tied to the individual conscious mind as it were.

One of the examples from his psychoanalytic practice that Jung gives to illustrate the workings of this collective unconscious, and that ultimately led to his “discovery”, concerns a vision that one of his patients supposedly had in his office one day.  The patient was somewhat delusional and had visions that he was a Christ-like figure, and one day in Jung’s office this patient claimed to see a phallic, tube-like structure coming down and out of the sun.  He pointed out the existence of this symbol/structure to Jung, believing firmly in its existence, but Jung saw nothing out of the ordinary.  Jung then thought very little of the event until many years later, when he read of an archeological discovery of a text describing a mystic ritual that involved the vision of a tube-like structure emanating from the sun.  Jung surmised that his patient could have had no knowledge of the description of this ancient ritual which corresponded so closely to his vision in Jung’s office years earlier (the text had not even been discovered at the time of the patient’s original vision), and that it therefore must be evidence of the existence of some common symbolic denominator that individuals can tap into, so to speak, one which at some level underlies the psyche of all human existence, i.e. the collective unconscious.

 

My thesis then, is as follows: in addition to our immediate consciousness, which is of a thoroughly personal nature and which we believe to be the only empirical psyche (even if we tack on the personal unconscious as an appendix), there exists a second psychic system of a collective, universal, and impersonal nature which is identical in all individuals.  This collective unconscious does not develop individually but is inherited. It consists of pre-existent forms, the archetypes, which can only become conscious secondarily and which give definite form to certain psychic contents.[2]

 

Out of his psychoanalytic work, then, emerges Jung’s theory of the collective unconscious, as evidenced by the existence of these universal archetypal themes, as well as his psychoanalytical healing technique, which he called individuation, which Jung used to guide the individual psyche to a better understanding of its connection to this collective, universal “unconscious” by means of what he called active imagination.  This technique, as it turned out, was heavily reliant on symbols, mandalas in particular, which of course play such a strong role in the meditative practices and rituals of the Eastern philosophical traditions.

The existence of these motifs (or mythemes when looked at within the context of mythology, which can be viewed as the expression of the collective unconscious of a society or civilization as a whole) across the boundaries of time and space, manifesting in the mind of man throughout the course of history, spoke to the existence of a collective unconscious from which these archetypal images or themes must emerge.  To Jung, consciousness and its counterpart the unconscious, the sum total of which made up the psyche of man, represented the very ground of reality.

 

When one reflects upon what consciousness really is, one is profoundly impressed by the extreme wonder of the fact that an event which takes place outside in the cosmos simultaneously produces an internal image, that it takes place, so to speak, inside as well, which is to say; becomes conscious.[3]

 

Now this was interesting.  You start with the concept of cultural borrowing, you search for something deeper, something richer that connects the ancient cultures.  You look into their mythology (and theology, because arguably the further back you go into ancient history the less distinguishable a society’s mythology is from its theology), cultural cosmology and mythology in general, and you end up with some parallels but nothing concrete per se.  Then you look at mythology as a whole, and you end up, as both Jung and Campbell had really done, in the realm of psychology, which as it turns out is kind of where you end up if you follow modern physics to its end as well.  That seemed strange.  And yet it seemed to point back to the idea that if you wanted to really understand the world, understand it even at the physical level, you had to establish a broader perspective than models with a purely empirical and (physical) scientific basis.

 

This quest for ultimate knowledge, and order, is as old as mankind itself and is reflected in the cosmological traditions of all of the ancient civilizations – as evidenced by the Egyptian, Sumer/Babylonian, Greek and Judeo-Christian cosmologies which all attempt to lay down the structure of the world as we know it and how and why it came into existence – leaving aside the theological dogma whose only purpose was to serve the establishment of power and authority.

In much the same way as the ancients searched for a unified theory of order (the maat of the Egyptians, the Chronos of the Greek cosmological system, etc.), Plato, Aristotle, Euclid, Newton, Darwin, Kepler, Einstein, Planck and Bohm carried the torch of this quest for a unified metaphysical structure forward throughout the development of Western civilization, a quest which led ultimately to the branching of knowledge into the different sciences upon which our modern society rests and relies to guide us through life.  Charlie’s premise was that modern man, in the Information Age where empirical reality is so baked into our Western minds, has as much blind faith in science today as orthodox religious zealots and believers have in their God.

In prior eras, when mankind understood less about how things really worked, they could rely on religion, a grand creator God, as the underlying reason behind and explanation of how things worked and how things came to be.  This is the creation myth of Genesis, and it provides the rationale behind the cosmologies of all the ancient peoples, in the East and the West.  But we do not have that luxury today; we have science, and science has shown us things, things about the nature of reality that must be incorporated into our understanding of not only the physical world around us, but also its socio-biological foundations, as well as the integral role of mind, of consciousness, in forming the basis of how we “perceive” the world.

But none of these great thinkers, scientists, philosophers or sages that had so marked intellectual progress in the West over the centuries had access to the state of knowledge as it stands today, in the Information Age, where we understand that our species is some few hundred thousand years old and was not crafted from the clay of the earth as the mythologies of the West would have us believe, but emerged through the process of natural selection, evolution, engineered by our own genetic structure, a process which incorporated the role of chance into our development (genetic mutation).  We came to be able to speak and communicate with each other, to form abstract concepts and thoughts into words and syllables that could be passed from mind to mind, to cultivate the land and domesticate animals; then followed the invention of writing and the spread of mankind throughout the world, to the point where we now truly understand how connected and integrated we all are, not only as a species as a whole, but as an organism whose destiny is tied to the planet in a very real and tangible way.

Scholars and academics of today, and all those who are curious and have the time to explore the origins of mankind and how our belief systems have evolved since the dawn of civilization, have a much deeper and broader understanding of our species, which is in many respects characterized by our ability to speak, our ability to communicate with each other, and our ability to write down and develop complex and sophisticated models of thought, abilities that have led to a profound understanding not only of how the physical universe came to be, but also of how our minds have developed, and of the fundamental connection between the act of experience and our perception of the physical universe.

This is the logical conclusion that must be drawn when one takes a hard look at the sciences as they stand today, fields of knowledge based upon empirically verified and proven facts, facts which, followed to their end, point to the inevitable conclusion that there exist intellectual boundaries and limits to science itself, and that a broader perspective must be adopted if we want to truly understand this world we live in, as well as how our place in it has evolved and/or should evolve moving forward.  A perspective that must integrate at some level the role of consciousness itself, upon which any understanding of anything, in fact, must be based.


[1] From Max Born’s Nobel Laureate speech, reference http://originoftheuniverse.wikia.com/wiki/Uncertainty_Principle.

[2] C. G. Jung, The Archetypes and the Collective Unconscious (London 1996) p. 43

[3] C Jung; Basel Seminar, privately printed, 1934, p. i

Interpretations of Quantum Mechanics: Back to First Philosophy

If you believe in the power of mathematics to describe the universe, as the language of God so to speak, in its theoretical as well as predictive power at the cosmic and subatomic scales, then you must in turn accept Bell’s Theorem, which posits that no local hidden variable theory can reproduce the predictions of Quantum Mechanics at issue in the EPR Paradox.  This means that, from a mathematical perspective, there must exist some sort of non-local force, some connecting principle, that sits behind the behavior and complex relationships of these subatomic “particles” that are being measured and that interact with each other to form the basis of those things which we consider to be “real”, or which are said to exist in some way, shape or form.
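
A minimal numerical sketch can make concrete what Bell’s result rules out.  In the CHSH form of the argument, any local hidden variable account must keep a certain combination of correlations, S, within ±2, whereas quantum mechanics, using the singlet-state correlation E(a, b) = −cos(a − b), predicts values up to 2√2.  The angle choices below are the standard ones, and the code simply evaluates the quantum prediction rather than simulating any experiment.

```python
import numpy as np

def E(a: float, b: float) -> float:
    """Quantum correlation of spin measurements along angles a and b
    for two particles in the singlet state."""
    return -np.cos(a - b)

# Standard CHSH measurement angles (radians) that maximize |S|.
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# Local hidden variable theories require |S| <= 2; this prints ~2.828,
# i.e. 2*sqrt(2), the quantum mechanical (Tsirelson) bound.
print(abs(S))
```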

And if we as human beings, and all animals and physical objects for that matter, which consist of these subatomic particles, must adhere at some level to these very same principles, i.e. if there is some non-local underlying force driving an interconnectedness that is inconsistent with Einstein’s fundamental premise of Relativity, namely that nothing can travel faster than the speed of light, then we are left with the task of trying to make sense of what Quantum Theory actually implies, or means, and what those implications are for existence and life itself and how we are to live.  It is in this spirit that Charlie ventured into the next level of modern day theoretical physics, what present day physicists call Interpretations of Quantum Mechanics, or from a philosophical perspective the metaphysical implications of Quantum Theory.

Although at first glance the exercise might seem a purely intellectual one, and to be clear Charlie had no desire to do more research or homework than was required to wrap up his thesis, he did feel that before embarking on what he thought were the metaphysical and philosophical implications of Quantum Theory, and on how it might help him understand the basis, or lack of merit, of this underlying materialistic and objective view of reality, it was important that he cover at least some of the prevailing interpretations of the theory from within the physics community itself.  This was a group of academics much brighter than he, people who actually understood the underlying mathematics, something Charlie could follow only at a conceptual level, relying mostly on the physicists’ and other scientific authors’ interpretations of the underlying math to form his understanding of Quantum Theory: what it implied about the nature of these subatomic “things”, and in turn what it implied about the nature of the bigger things, consisting of these smaller subatomic “things”, that make up what we consider to be “physical reality”.

 

There are many Interpretations of Quantum Theory, but there are three in particular that Charlie thought deserved attention, due either to their prevalence or acceptance in the academic community and/or their impact on scientific or philosophical inquiry into the limits of Quantum Theory’s implications from a mathematical, theoretical, or metaphysical perspective.  The first is the standard orthodox interpretation, the one most often compared to or cited in reference when differing interpretations are put forth and explained.  This is most commonly referred to as the Copenhagen Interpretation, and it basically restricts the interpretation of Quantum Theory to the results of the experiment itself and no further.  This point of view can be looked at as a purely mathematical and behavioral-modeling view of Quantum Mechanics, and it fundamentally rejects any philosophical or metaphysical implications.

The second Charlie wanted to look at was definitely a little out there, but it still held some prevalence in the academic community, was mathematically and theoretically sound as far as he could gather, and was intellectually interesting, so he thought he should study it and understand what it said about Quantum Theory’s potential metaphysical implications in the somewhat extreme, theoretically abstract mathematical case.  This interpretation has a few variants but is mostly referred to in the literature as the Many-worlds, or Many-minds, interpretation, and it expands the theoretical boundaries of Quantum Mechanics by explaining its stochastic nature through the existence of multiple universes, or at least multiple possible universes.

The last interpretation, the one that at some level Charlie found the most appealing intellectually, particularly given the metaphysical extensions which it included explicitly, was the Causal Interpretation, also known as de Broglie-Bohm theory or simply Bohmian mechanics.  It extends Quantum Mechanics to include a principle it refers to as quantum potential, and while it abandons the classical notion of locality it still preserves objective realism and determinism to a large extent.  In most academic circles this theory is viewed as a hidden variable theory within the context of the EPR Paradox and Bell’s Theorem.

 

The most well established and most commonly accepted interpretation of Quantum Theory, the one most often taught in schools and textbooks and the one against which most alternative interpretations are compared, is the Copenhagen Interpretation.  This interpretation is most often associated with Niels Bohr and Werner Heisenberg, stemming from their collaboration in Copenhagen in 1927, hence the name.  The term was further crystallized in writings by Heisenberg in the 1950s in which he expressed his views on what he saw as contradictory interpretations of Quantum Theory.

The Copenhagen interpretation holds that Quantum Theory does not and cannot yield a description of an objective reality, but deals only with sets of probabilistic outcomes of experimental values borne from experiments observing or measuring various aspects of energy quanta, entities that do not fit neatly into classical interpretations of mechanics.  The underlying tenet here is that the act of measurement itself, the observer (or by extension the apparatus of observation), causes the set of probabilistic outcomes to converge on a single outcome, a feature of Quantum Mechanics commonly referred to as wave function collapse, and that any additional interpretation of what might actually be going on, i.e. the underlying “reality”, defies explanation and is in fact inconsistent with the fundamental mathematical tenets of the theory itself.
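The probabilistic machinery at issue is compactly captured by the Born rule.  As a minimal sketch, if a system is in state |\psi\rangle and a measurement has possible outcomes a_i with associated states |a_i\rangle, the probability of observing a given outcome is

\[
P(a_i) = \left| \langle a_i | \psi \rangle \right|^2,
\]

and upon measurement the system is thereafter found in the corresponding state |a_i\rangle.  This discontinuous transition is the wave function collapse of the Copenhagen view, which the interpretation records but declines to explain further.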

In this interpretation of Quantum Theory, reality (used here in the classical sense of the term, as existing independent of the observer) is a function of the experiment: it is defined as a result of the act of observation and has no meaning independent of the yielding of some measurement value.  In other words, reality in the quantum world from this point of view does not exist independent of observation; the manifestation of what we think of or define as “real” is intrinsically tied to the act of observation of the system itself.

Niels Bohr was one of the strongest proponents of this interpretation, an interpretation which refuses to associate any metaphysical implications with the underlying theoretical model.  His position was that given this proven interdependence between that which is observed and the act of observation, no metaphysical interpretation can, or should, be extrapolated from the theory; it is and can only be a tool to describe and measure states and particle/wave behavior in the subatomic realm as the result of some well-defined experiment.  In other words, in Bohr’s view, attempting to determine what Quantum Theory actually meant, beyond the results of a given experiment, violated the fundamental tenets of the theory itself.  From Bohr’s perspective, the inability to draw conclusions beyond the results of the experiments which the mathematical models predict and measure was a necessary consequence of the theory’s basic tenets, and that was the end of the matter.  This view can be seen as the logical conclusion of the principle of complementarity, one of the fundamental and intrinsic features of Quantum Mechanics that makes it so mysterious and hard to understand in classical terms.

Complementarity, which is closely tied to the Copenhagen interpretation, expresses the notion that in the quantum domain the results of experiments, the values yielded (or observables), are fundamentally tied to the act of measurement itself, and that in order to obtain a complete picture of the state of any given system, as bound by the uncertainty principle, one would need to run multiple experiments across that system, each result in turn rounding out the notion of the state, or reality, of the system.  These combinatorial features, intrinsic to Quantum Theory, say something profound about the underlying uncertainty of the theory itself from a classical realist perspective.  Perhaps complementarity can be viewed as the twin of uncertainty, or its inverse postulate.
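The uncertainty principle that bounds these complementary experiments has a precise quantitative form; for position x and momentum p it reads

\[
\Delta x \, \Delta p \ge \frac{\hbar}{2},
\]

so the more sharply one experimental arrangement fixes a particle’s position, the less any arrangement can say about its momentum, complementarity expressed as an inequality.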

Bohr summarized this very subtle and yet at the same time very profound notion of complementarity in 1949 as follows:

 

…however far the [quantum physical] phenomena transcend the scope of classical physical explanation, the account of all evidence must be expressed in classical terms. The argument is simply that by the word “experiment” we refer to a situation where we can tell others what we have learned and that, therefore, the account of the experimental arrangements and of the results of the observations must be expressed in unambiguous language with suitable application of the terminology of classical physics.

This crucial point…implies the impossibility of any sharp separation between the behavior of atomic objects and the interaction with the measuring instruments which serve to define the conditions under which the phenomena appear…. Consequently, evidence obtained under different experimental conditions cannot be comprehended within a single picture, but must be regarded as complementary in the sense that only the totality of the phenomena exhausts the possible information about the objects.[1]

 

Complementarity was in fact the core underlying principle which drove the existence of the uncertainty principle from Bohr’s perspective.  It was the underlying characteristic and property of the quantum world that captured at some level its very essence.  And complementarity, taken to its logical and theoretical limits, did not allow or provide any framework for describing the real world outside of the domain with which it dealt, namely the measurement values or results of a given experiment, the measurement instruments themselves that were part of a given experiment, and the act of measurement itself.

 

Another interpretation, or possible question to be asked given the uncertainty implicit in Quantum Mechanics, was that perhaps all possible outcomes described in the wave function did in some respect manifest, even if they could not all be seen or perceived in our objective reality.  Although on the surface it seems a rather outlandish premise, this interpretation of Quantum Theory has gained some prominence in the last few decades, especially within the Computer Science and Computational Complexity fields (in which Charlie was schooled), and has come to be known in the literature as the Many-worlds interpretation of Quantum Theory.

The original formulation of this theory was laid out by Hugh Everett in his 1957 PhD thesis, The Theory of the Universal Wave Function, wherein he referred to the interpretation not as Many-worlds but, much more aptly given his initial formulation of the theoretical extensions of Quantum Mechanics that he proposed, as the Relative State formulation of Quantum Mechanics.  Almost completely ignored by the broader scientific community for several decades after he published his work, the theory was subsequently developed and expanded upon by several authors in the decades since, and has come to be known, along with the variants that have cropped up, as the Many-worlds interpretation.

Everett was a graduate student at Princeton when he authored The Theory of the Universal Wave Function, and his advisor was John Wheeler, one of the most respected theoretical physicists of the latter half of the twentieth century.  After writing his thesis, Everett did not continue a career in academia, and subsequent interpretations and expansions of his theory were therefore left to later authors and researchers, most notably Bryce DeWitt, who coined the term “many-worlds”, and David Deutsch, among others.  DeWitt’s 1973 book on the topic, The Many-Worlds Interpretation of Quantum Mechanics, in many respects popularized this interpretation and brought it back into mainstream physics, and it included a reprint of Everett’s thesis.  Deutsch’s seminal work on the topic is The Fabric of Reality, published in 1997, in which he expands and extends the Many-worlds interpretation to academic disciplines outside of physics such as philosophy and epistemology, computer science and quantum computing, and even biology and theories of evolution.

In Everett’s original exposition of the theory, he begins by calling out some of the problems with the original, or classic, interpretation of Quantum Mechanics: specifically, what he and other members of the physics community believed to be the artificial creation of the notion of wave function collapse to explain the transition from uncertain quantum behavior to determinate measured outcomes, as well as the difficulty that standard interpretations of the theory had in dealing with systems that consisted of more than one observer.  These he considered to be the main drivers behind his search for an alternative view, interpretation, or even theoretical extension of Quantum Theory.  He actually referred to his Relative State formulation of Quantum Mechanics as a metatheory, given that the standard interpretation could be derived from it.

Although Bohr, and presumably Heisenberg and von Neumann as well, whose collective views on Quantum Theory’s philosophical implications make up the Copenhagen Interpretation, would no doubt explain away these strange and seemingly arbitrary assumptions as out of scope of the theory itself (i.e. Quantum Theory is intellectually and epistemologically bound by the experimental apparatus and the associated experimental results), Everett found this view philosophically limiting, and thought it at the very least worth exploring tweaks and extensions to the theory to see if these shortcomings could be removed, and in turn what the implications were, theoretically speaking, when some of the more standard and orthodox assumptions of Quantum Mechanics were relaxed.

To Everett, and this view has become more common in recent decades, the standard interpretation of Quantum Theory (read the Copenhagen Interpretation) fundamentally prevents us from any true explanation of what the theory says about the nature of “reality” itself, or the real world as it were: a world considered to be governed by the laws of classical physics, where things and objects exist independent of observers, where “objects” or “particles” have real, static, measurable and definable qualities that exist independently of the act of measurement or observation, and a world fundamentally incompatible with the stochastic and uncertain characteristics that govern the behavior of “things” in the subatomic or quantum realm.  In Everett’s own words, his intent in defining a Relative State formulation of Quantum Mechanics is:

 

The aim is not to deny or contradict the conventional formulation of quantum theory, which has demonstrated its usefulness in an overwhelming variety of problems, but rather to supply a new, more general and complete formulation, from which the conventional interpretation can be deduced.[2]

 

Everett starts by making two basic assumptions, from which he devises his somewhat counterintuitive but now relatively widely accepted interpretation of Quantum Theory.  Firstly, he assumes that all physical systems, large or small, can be described as states within Hilbert space, the fundamental geometric framework upon which Quantum Mechanics is constructed.  Secondly, he abstracts the notion of observer into a machine-like entity with access to unlimited memory, which stores a history of previous states, or previous observations, and which also has the ability to make simple deductions, or associations, regarding the actions and behavior of system states based solely upon this memory and deductive reasoning.

This second assumption represents a marked distinction between his formulation and Quantum Theory proper, incorporating observers and acts of observation (i.e. measurement) completely into the theoretical model.  Furthermore, Everett proposes, and this is the core of his thesis, that if you grant assumptions 1 and 2, you can come up with an extension to Quantum Mechanics that describes the entire state of the universe, including the observers and objects of observation, in a completely consistent, coherent and fully deterministic manner, without the need for the notion of wave function collapse or any additional assumptions for that matter.

Everett makes what he calls a simplifying assumption to Quantum Theory, i.e. removing the need for the notion of wave function collapse, and assumes the existence of a universal wave function (the title of his thesis in fact) which accounts for and describes the behavior of all physical systems and their interactions in the universe, fully incorporating the observer and the act of observation into the model, observers being viewed as simply another form of quantum state that interacts with the environment.  Once these assumptions are made, he can then abstract the notion of measurement, the source of much of the oddity and complexity surrounding Quantum Theory, as simply an interaction between quantum systems that are all governed by this same universal wave function.
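Schematically, and as a sketch of the idea rather than Everett’s exact notation, a measurement in this picture simply entangles the observer with the system: if the system begins in a superposition of states |s_i\rangle and the observer in a ready state, the ordinary Schrodinger dynamics alone carries the joint state to

\[
\Big( \sum_i c_i \, |s_i\rangle \Big) \otimes |O_{\text{ready}}\rangle \;\longrightarrow\; \sum_i c_i \, |s_i\rangle \otimes |O_i\rangle,
\]

where |O_i\rangle is the observer state that has recorded outcome i.  No term is ever discarded; each observer state sees a definite outcome relative to its own branch, which is precisely the “relative state” of his title.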

In Everett’s metatheory, the notion of what an observer is and how observers fit into the overall model is fully defined, and what he views as the seemingly arbitrary notion of wave function collapse is circumvented.  His metatheory rests on the assumed existence of a universal wave function, which corresponds to a fully deterministic, multiverse-based reality in which wave function collapse represents not a collapse so to speak, but the manifestation of one possible outcome of measurement in our “reality”, our specific universe, i.e. the one which we observe during our act of measurement.

But in Everett’s theoretical description of the universe, if you take what can be described as a literal interpretation of this universal wave function as the overarching description of reality, the other, unobserved, possible states reflected in the wave function of any system in question do not cease to exist with the act of observation.  The act of observation does not represent a collapse of the quantum mechanical wave that describes the system state, in Copenhagen nomenclature; rather, those states that do not manifest in our act of observation have some existence that persists (to what degree and level of reality they persist is a somewhat open-ended question and the subject of much debate in subsequent interpretations of Everett’s metatheory) but are simply not perceived by us.

In his own words, and this is a subtle yet important distinction between Everett’s view and the view of subsequent proponents of the Many-worlds interpretation, these unmanifest and unobserved states exist but remain uncorrelated with the observer in question, an observer that is incorporated and abstracted into the universal wave function model of reality.

 

We now consider the question of measurement in quantum mechanics, which we desire to treat as a natural process within the theory of pure wave mechanics. From our point of view there is no fundamental distinction between “measuring apparata” and other physical systems. For us, therefore, a measurement is simply a special case of interaction between physical systems – an interaction which has the property of correlating a quantity in one subsystem with a quantity in another.[3]

 

This implies of course that these unperceived states do have some semblance of reality, that they do in fact exist as possible realities, realities that are thought to have varying levels of “existence” depending upon which version of the Many-worlds interpretation you adhere to.  With DeWitt and Deutsch for example, a more literal, or “actual” you might say, interpretation of Everett’s original theory is taken, where these other states, these other realities or universes, do in fact physically exist even though they cannot be perceived or validated by experiment.[4]

This is a more literal interpretation of Everett’s thesis however, and certainly nowhere does Everett explicitly state that these other potential uncorrelated states, as he calls them, actually physically exist.  What he does say on the matter seems to imply some form of existence of these “possible” or potential universes that reflect non-measured or non-actualized states of physical systems, but not necessarily that these unrealized outcomes actually exist in some physical universe.

 

In reply to a preprint of this article some correspondents have raised the question of the “transition from possible to actual,” arguing that in “reality” there is—as our experience testifies—no such splitting of observer states, so that only one branch can ever actually exist. Since this point may occur to other readers the following is offered in explanation.

The whole issue of the transition from “possible” to “actual” is taken care of in the theory in a very simple way—there is no such transition, nor is such a transition necessary for the theory to be in accord with our experience. From the viewpoint of the theory all elements of a superposition (all “branches”) are “actual,” none any more “real” than the rest. It is unnecessary to suppose that all but one are somehow destroyed, since all the separate elements of a superposition individually obey the wave equation with complete indifference to the presence or absence (“actuality” or not) of any other elements. This total lack of effect of one branch on another also implies that no observer will ever be aware of any “splitting” process.

Arguments that the world picture presented by this theory is contradicted by experience, because we are unaware of any branching process, are like the criticism of the Copernican theory that the mobility of the earth as a real physical fact is incompatible with the common sense interpretation of nature because we feel no such motion. In both cases the argument fails when it is shown that the theory itself predicts that our experience will be what it in fact is. (In the Copernican case the addition of Newtonian physics was required to be able to show that the earth’s inhabitants would be unaware of any motion of the earth.)[5]

 

According to Everett’s view then, the act of measurement of a quantum system, and its associated principles of uncertainty and entanglement, is simply the reflection of this splitting off of the observable universe from a higher order notion of a multiverse where all possible outcomes and alternate histories have the potential to exist.  The radical form of the Many-worlds view is that these potential, unmanifest realities do in fact exist, whereas Everett seems to only go so far as to imply that they “could” exist and that conceptually their existence should not be ignored.

As hard as this Many-worlds or Many-minds interpretation of Quantum Mechanics might be to wrap your head around, it does represent a somewhat elegant, theoretically and mathematically sound solution to some of the criticisms and challenges raised by the broader physics community against Quantum Theory, namely the EPR paradox and the Schrodinger’s cat problem.  It does, however, raise some significant questions as to the validity of the underlying theory of mind and subjective experience in general, notions which Everett somewhat glosses over (albeit intentionally; he is not constructing a theory of mind, nor does he ever state that he intends to) by making the simple assumption that observers can be incorporated into his universal wave function view of reality by abstracting them into simple deductive reasoning and memory based machines.  Nonetheless this aspect of Everett’s interpretation of Quantum Theory, his implicit and simplified theory of observation and of the role of mind, remains one of the most hotly debated and widely criticized aspects of his metatheory, and one upon which arguably his entire theoretical model rests, which in some sense calls the validity of the metatheory itself into question.[6]

 

The question raised by these varying interpretations, or approaches if you will, of Quantum Theory is: what does the underlying theory, given that we can almost certainly guarantee its certitude, backed up as it is by very sound empirical evidence, really say about what is going on?  What is happening in this quantum world, and how does that affect, if it affects at all, how we should understand what is happening at the human scale, or even at the cosmic scale?  David Bohm, the main architect of what has come to be known as Bohmian Mechanics, an alternative viewpoint on Quantum Theory that although not taught in standard physics courses is at least becoming more widely accepted as a theoretically possible alternative, would say definitively that it does.[7]

The theoretical foundations for Bohmian Mechanics were laid by Louis de Broglie in 1927 when he originally proposed that Schrodinger’s wave function could be interpreted as describing the existence of a central physical particle accompanied by a pilot wave that governed its behavior, thereby explaining why these subatomic “particles” behaved like waves or particles depending upon the experiment.  David Bohm then picked up on de Broglie’s work in 1952, expanding it to encompass more complex, multi-body physical systems.  That limitation was an original criticism of the work which, along with John von Neumann’s 1932 proof (subsequently shown to be flawed) that all hidden variable theories were impossible, had led to the abandonment of the theory by de Broglie and the physics community for some twenty-five years.

Bohmian Mechanics is most fully developed in Bohm and Basil Hiley’s book The Undivided Universe, first published in 1993, although much of its contents and the underlying theory had been thought out and published in papers on the topic since the 1950s.  In their book they refer to their interpretation not as the Causal Interpretation, or even as de Broglie-Bohm theory, but as the Ontological Interpretation of Quantum Theory, given that from their perspective it gives the only complete causal and deterministic theoretical model of Quantum Theory, one in which it is the actual position and location of the particle within the pilot-wave that determines the statistical outcome of the experiment governed by the wave function.

David Bohm was an American-born British physicist of the twentieth century who made a variety of contributions to theoretical physics, but who also invested much time and thought into the metaphysical implications of Quantum Mechanics, and into metaphysics and philosophy in general, topics that most theoretical physicists have steered away from.  In this respect Bohm was a bit of a rebel relative to his peers in the academic community: he extended the hard science of theoretical physics into the more abstract realm of descriptions of reality as a whole, incorporating first philosophy back into the discussion in many respects, but doing so with the tool of hard mathematics, making his theories very hard, if not impossible, for the physics community at large to ignore, and establishing a scientific foothold for some very Eastern philosophical metaphysical assumptions, namely what he called undivided wholeness.

Bohm, like Everett, was dissatisfied with the mainstream interpretations of Quantum Mechanics, which basically said it’s just a model, try not to think about it.  This led him, apparently with some prodding by Einstein, with whom he had an ongoing dialogue toward the end of Einstein’s life, particularly with respect to the metaphysical implications and completeness of Quantum Theory, to look for an alternative approach that might a) prove that hidden variable theories were actually possible (something that still remained in doubt well into the 70s and 80s, decades after Bohm first published his adaptation of de Broglie’s pilot-wave theory supporting multi-body systems), and b) actually try to explain what the heck was going on, something wholly absent from Bohr’s, von Neumann’s and Heisenberg’s interpretations of Quantum Mechanics, whose views collectively gave weight to the orthodox position that the theory was simply a set of equations to predict results, that it had certain fundamental limitations inherent in the model itself, and that there was no point trying to further describe or explain what was actually going on.

Bohm then in the early 1950s (re)discovered, and subsequently expanded upon, the pilot-wave theory put forth by de Broglie in the 1920s, developing the more robust theoretical and mathematical foundation for de Broglie’s work which had previously been lacking, specifically extending the model to support multi-body systems.  He then further extended the underlying mathematics of Quantum Theory to include a fundamentally non-local force called quantum potential, which provided the basis for non-local correlations between subatomic particles and their associated measurements.

Furthermore, in Bohmian Mechanics, which is in many respects a hidden variable theory as defined in the context of the EPR Paradox and Bell’s Theorem, Bohm posits that it is the actual position and momentum of the underlying particle(s) in question that are the so-called hidden variables, values which govern, along with the quantum potential, how a quantum wave-particle will behave.  This not only proved that hidden variable theories were possible (non-local ones specifically, consistent with Bell’s Theorem), but also fundamentally sidestepped the measurement problem: there is no need conceptually for the notion of wave function collapse if you have a fully deterministic model of quantum behavior, behavior mapped by the Schrodinger equation with an additional term to account for the quantum potential of a given system, mapped out both for single-particle and many-particle systems through time and space, which is effectively what Bohmian Mechanics provides.

De Broglie’s pilot-wave theory from 1927 affirmed the existence of subatomic particles, or corpuscles as they were called back then, but viewed these particles not as independently existing entities but as integrated into an undercurrent, or wave, fully described by Schrodinger’s wave function, which gave these subatomic particles their wave-like characteristics of diffraction and interference while at the same time explaining their particle-like behavior as illustrated in certain experiments.  This represented a significant divergence from standard interpretations of Quantum Theory at the time.  In his original 1927 paper on the topic, de Broglie describes pilot-wave theory as follows:

 

One will assume the existence, as distinct realities, of the material point and of the continuous wave represented by the [wave function], and one will take it as a postulate that the motion of the point is determined as a function of the phase of the wave by the equation. One then conceives the continuous wave as guiding the motion of the particle. It is a pilot wave.[8]

 

De Broglie’s pilot-wave theory was dismissed by the broader academic community when it was presented, however, because the model as presented by de Broglie was understood to describe only single-body systems, combined with the then very strong belief that any variant of hidden variable theory was theoretically impossible, as put forth by von Neumann in 1932.[9]  Pilot-wave theoretical research wasn’t further pursued until Bohm picked the theory back up some twenty-five years later when, driven primarily by the desire to illustrate that hidden variable theories were in fact possible, he expanded the theory to apply to multi-body systems, giving it more solid scientific ground and providing a fully developed framework for further consideration by the broader physics community.
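The “equation” de Broglie refers to in the passage above is what is now called the guidance equation.  As a sketch, for a single particle of mass m, one writes the wave function in polar form and lets its phase S steer the particle:

\[
\psi = R \, e^{iS/\hbar}, \qquad \mathbf{v} = \frac{\nabla S}{m},
\]

so the particle always follows a definite trajectory while the wave, evolving under the Schrodinger equation, pilots it through whatever interference pattern the experimental arrangement sets up.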

Bohmian Mechanics, as Bohm’s exposition of de Broglie’s pilot-wave theory evolved into its more mature form, provides a mathematical framework within which subatomic reality can indeed be thought of as actually existing independent of an observer or an act of measurement, a significant departure from the standard interpretations of the theory that were prevalent for most of the twentieth century (in philosophic terms it is a fully realist interpretation).  The theory is consistent with Bell’s Theorem in that it abandons the notion of locality (Bohm’s pilot-wave theory actually inspired Bell in his work on the theorem), and it is also fully deterministic, positing that once the values of these hidden variables are known, all future states, and even past states, can be calculated and known as well, consistent in this sense with classical physics.[10]  As John Stewart Bell, a proponent of pilot-wave theory, puts it:

 

That the guiding wave, in the general case, propagates not in ordinary three-space but in a multidimensional-configuration space is the origin of the notorious ‘nonlocality’ of quantum mechanics. It is a merit of the de Broglie-Bohm version to bring this out so explicitly that it cannot be ignored.[11]

 

Bohmian Mechanics falls into the category of hidden variable theories.  It lays out a description of quantum reality where the Schrodinger wave function, along with the notion of quantum potential, is the guiding model of subatomic behavior, the position and momentum of a given particle being the so-called “hidden variables” which in turn determine the result of a given experiment or observable result.  Bohmian Mechanics is not only explicitly non-local, it is also fundamentally realistic and deterministic, and although it flies in the face of some of the basic assumptions of classical physics, it does not brush these features aside as the more standard, orthodox view of Quantum Theory does, but calls attention to them directly.  No wonder his work was not that well received during his lifetime.

One of the distinguishing features of Bohmian Mechanics, and one that provides the basis for Bohm’s metaphysical interpretations of Quantum Theory, is his notion of quantum potential: a force which, unlike the classical notion of force where the effect is a function of intensity or magnitude, is fundamentally non-local in the classical physics sense, and which, along with the Schrodinger wave equation, governs the behavior of a quantum system and determines its future and past states, irrespective of whether or not the system is observed or measured.  It’s the glue that holds Bohmian Mechanics together and, along with the establishment of the actual position and momentum of a given particle (or set of particles) as fundamentally real, is the mathematical (and metaphysical) tool that he uses to explain what’s actually going on in the quantum realm.

Quantum potential in Bohm’s view is a force that is universally present, not only in the quantum realm but underlying all of physics, a force that becomes effectively negligible as the quantum system becomes sufficiently large and complex, at which point the system is transformed from one that exhibits both wave-like and particle-like behavior to one governed by classical physics as described by Newton.

It provides us with an explanation for wave function collapse and the quantum measurement uncertainty put forth by Heisenberg, von Neumann and others, by positing that the Schrodinger wave function does in fact fully describe quantum system behavior, that the actual position and momentum of a given quantum state do in fact exist even when not measured or observed, and that there exists some element of non-local active information within the environment which explains the observable and experimentally verifiable correlation of classically separated (quantum) entities.  In other words, in Bohmian Mechanics the quantum system not only has some definite initial state, it also knows about its environment to a certain extent, information that is embedded in the underlying quantum potential of a given system and can be mathematically modeled by adding the notion of quantum potential to Quantum Theory.
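Mathematically, the quantum potential falls out of the same polar decomposition used in the guidance equation above.  Substituting \psi = R e^{iS/\hbar} into the Schrodinger equation yields, in sketch form, a classical-looking Hamilton-Jacobi equation with one extra term:

\[
\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0, \qquad Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 R}{R},
\]

where V is the ordinary classical potential and Q is Bohm’s quantum potential.  Because Q depends on the form of the amplitude R rather than on its magnitude, it need not fall off with distance, which is the mathematical root of the theory’s non-locality; and where Q becomes negligible the equation reduces to classical mechanics, recovering the classical limit.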

According to Bohm, one of the motivations for exploring the possibility of a fully deterministic, causal extension of Quantum Theory was not necessarily that he believed it to be the correct interpretation, but to show that such theories were possible at all, their existence having been cast into serious doubt after von Neumann’s mathematical work in the 1930s, and even after Bell’s refinement of these theoretical constraints in the 1960s, which did in fact allow for non-local hidden variable theories.

 

… it should be kept in mind that before this proposal was made there had existed the widespread impression that no conceptions of hidden variables at all, not even if they were abstract, and hypothetical, could possibly be consistent with the quantum theory.[12]

 

Bohmian mechanics is consistent with Bell’s Theorem, which rules out hidden variables only in theories that assume local realism, i.e. the premise that all objects and things are governed by, and behave according to, the principles of classical physics as bound by the constraints of Relativity and the fixed speed of light.  These principles have been demonstrated to be wholly inconsistent with Quantum Mechanics, causing of course much consternation in the physics community and calling classical realism in general into question.[13]

Bohmian Mechanics agrees with all of the mathematical predictions of the standard interpretations of Quantum Theory, i.e. it is mathematically equivalent, but extends the theoretical model to try to explain what is actually going on: what is driving the non-local behavior of these subatomic “things”, and what in fact can be said to be “known” about the state of quantum systems independent of the act of measurement or observation.  With the notion of quantum potential, Bohm provides a mathematical as well as metaphysical principle which “guides” subatomic particles, gives them some sense of environmental awareness, and is inherently non-local (in the classical sense).

With respect to the importance of Bohm’s work in Quantum Mechanics, Bell himself, albeit some 30 years after Bohm originally published his extension of de Broglie’s pilot-wave theory, had this to say:

 

But in 1952 I saw the impossible done. It was in papers by David Bohm. Bohm showed explicitly how parameters could indeed be introduced, into nonrelativistic wave mechanics, with the help of which the indeterministic description could be transformed into a deterministic one. More importantly, in my opinion, the subjectivity of the orthodox version, the necessary reference to the ‘observer,’ could be eliminated. …

But why then had Born not told me of this ‘pilot wave’? If only to point out what was wrong with it? Why did von Neumann not consider it? More extraordinarily, why did people go on producing ‘‘impossibility’’ proofs, after 1952, and as recently as 1978? … Why is the pilot wave picture ignored in text books? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show us that vagueness, subjectivity, and indeterminism, are not forced on us by experimental facts, but by deliberate theoretical choice?[14]

 

Bohmian Mechanics’ contribution to Quantum Mechanics, and to physics as a whole in fact, is not only that it calls into question the presumption of local realism specifically, embracing what Einstein derisively referred to as “spooky action at a distance”, but also that it proved unequivocally that hidden variable theories are in fact theoretically and mathematically possible and still consistent with the basic tenets of Quantum Mechanics.  Bohm in fact “completes” Quantum Mechanics in the very sense that the EPR Paradox initially called into question.

What Bohmian Mechanics calls our attention to quite directly, and in a very uncomfortable way from a classical physics perspective, is that there are metaphysical assumptions about reality in general that are fully baked into classical physics and that must be relaxed in order to understand, and in fact explain, Quantum Mechanics.  Furthermore, it was these same subatomic particles (and/or waves), whose behavior was modeled so successfully with Quantum Mechanics, that in some shape or form constituted the basic building blocks of the entire “classically” physical world.  This fact could not be denied.  And yet the laws and theorems that had been developed to describe this behavior, i.e. classical physics, were and still are fundamentally incompatible with the laws that govern the subatomic realm, specifically with respect to the underlying assumptions about what is “real” and how these objects of reality behave and are related to each other.[15]

 

While the standard interpretation of Quantum Theory holds that the model is simply a calculation tool, bound by certain metaphysical constraints inherent to the theoretical model itself, Bohmian Mechanics and Everett’s Relative State formulation of Quantum Theory (and by association the various Many-worlds interpretations that stemmed from it via DeWitt, Deutsch and others) attempt to explain what is really going on in a manner that is at least consistent with the underlying mathematics of Quantum Theory, albeit drawing very different conclusions about the nature of the reality being described.  In order to do this, some adventure into metaphysics, Aristotle’s first philosophy, is required.

Although the orthodox interpretation of Quantum Theory would have us believe that we can draw no metaphysical conclusions from what quantum mechanics tells us, that it is simply a tool for deriving values or observables from experimental results, Bohmian Mechanics, along with Everett’s Relative State formulation, tells us that there do exist, or at least can exist, alternative theoretical models of quantum behavior which give at least an explanation of what “might” be going on, albeit ones requiring an altogether different perspective on what we think of as “real” and on the nature of existence itself, forcing us to reconsider the underlying assumptions that sit at the very foundation of classical physics.

One can put it quite succinctly by observing that no matter what interpretation of Quantum Theory you find most attractive, at the very least the classical notion of local realism must be abandoned in order to make sense of what is going on.  One would be hard pressed to find someone with a good understanding of Quantum Theory who would dispute this.


[1] Niels Bohr (1949), “Discussions with Einstein on Epistemological Problems in Atomic Physics”. In P. Schilpp, Albert Einstein: Philosopher-Scientist. Open Court.

[2] From the introduction of Everett’s 1957 thesis, “Relative State” Formulation of Quantum Mechanics.

[3] Hugh Everett, III, The Theory of the Universal Wave Function, 1957, pg. 53.

[4] Deutsch actually posits that proof of the “existence” of these other universes is given by the wave interference pattern displayed even in the single-particle version of the classic double-slit experiment, as well as by some of the running-time improvements made possible by quantum computing, most notably Shor’s algorithm, which finds the prime factors of a given number in polynomial time, exponentially faster than the best known factoring algorithms on classical, bit-based machines.  This claim is controversial to say the least, or at least remains an open point of contention among the broader physics community.  See http://daviddeutsch.physics.ox.ac.uk/Articles/Frontiers.html for a summary of his views on the matter.

[5] Everett’s 1957 thesis, “Relative State” Formulation of Quantum Mechanics, note on page 15, presumably in response to criticisms he received upon circulating the draft of his thesis to various distinguished members of the physics community, one of whom was Niels Bohr.

[6] See Bohm and Hiley’s chapter on Many-worlds in their 1993 book The Undivided Universe: An Ontological Interpretation of Quantum Theory for a good overview of the strengths and weaknesses, mathematical and otherwise, of Everett’s and DeWitt’s different perspectives on the Many-worlds interpretation of Quantum Theory.

[7] Bohmian Mechanics is also sometimes referred to as De Broglie-Bohm Theory or the Causal Interpretation of Quantum Theory.  Its authors however prefer Ontological Interpretation.

[8] Louis de Broglie, “Wave mechanics and the atomic structure of matter and of radiation”, Le Journal de Physique et le Radium, 8, 225 (1927).

[9] John von Neumann was instrumental not only in laying the mathematical foundations of Quantum Mechanics but also in establishing the mathematical boundaries within which interpretations of the theory could be made, which included, as it turned out, a fairly comprehensive proof that ruled out (certain) classes of hidden variable theories to explain the underlying mathematics.  This line of research was followed by Bell, which of course led to an expansion of the theoretical limitations of hidden variable theories, i.e. Bell’s Theorem, which depending on which source you read proved von Neumann’s assumptions to be false, or at best misleading.  Von Neumann also, interestingly enough, posited the idea of consciousness as an explanation for wave function collapse, a notion that of course was not addressed or picked up by the broader physics community given its philosophical implications.

[10] These features are why it is sometimes referred to as the Causal Interpretation, due to the fact that it outlined a fully causal description of the universe and its contents.

[11] From Stanford Encyclopedia entry on Bohmian Mechanics by Sheldon Goldstein, quote from Bell, Speakable and Unspeakable in Quantum Mechanics, Cambridge: Cambridge University Press; 1987, p. 115.

[12]  David Bohm, Wholeness and the Implicate Order, London: Routledge 1980 pg. 81.

[13] In fact, it was Bohm’s extension of de Broglie’s work on pilot-wave theory that provided at least some of the motivation for Bell to come up with his theorem to begin with; see Bell’s 1964 paper On the Einstein Podolsky Rosen Paradox, published some twelve years after Bohm published his adaptation of de Broglie’s pilot-wave theory.

[14] From Stanford Encyclopedia entry on Bohmian Mechanics, 2001 by Sheldon Goldstein; taken from Bell 1987, “Speakable and Unspeakable in Quantum Mechanics”, Cambridge University Press.

[15] There has been significant progress in the last decade or two in reconciling quantum theory and classical mechanics, most notably with respect to Newtonian trajectory behavior, what is described in the literature as accounting for the classical limit.  For a good review of the topic see the article The Emergence of Classical Dynamics in a Quantum World by Tanmoy Bhattacharya, Salman Habib, and Kurt Jacobs, published in Los Alamos Science in 2002.

Aristotle and Democritus: Knowledge and the Atom

Having established the premise of his thesis, what appeared to be clear cultural borrowing of mythological and cosmological themes between and among the ancient Western civilizations, themes which crystallized and evolved into monotheism as it spread throughout the West after the death of Christ, it still wasn’t clear to Charlie where the hard distinction and separation had come from between the objective world and the means of perception associated with the subject, the perceiver of objects.

Aristotle’s teachings, which formed the basis of philosophical doctrine into and even beyond the Middle Ages, explored the nature of the physical or natural world and its divisions into fields of knowledge, fields which evolved into the branches of science as we know them today.[1]  He even explored the nature of being itself as it related to knowledge, and concluded that the basis of all knowledge rested on the understanding of the various causes, or purpose (aition, or aitia in Greek) of a thing which existed – the essence of his esoteric notion of being qua being, or that which provided the basis upon which we can say that a thing exists.  His theory of knowledge rested on the notion of substance, ousia in Greek, the distinction between the matter and form of a thing, the form of a thing being associated with its ultimate purpose (telos or final cause).

Aristotle’s worldview rested on the assumption that there was a cause, a purpose, to everything, the understanding of which was a prerequisite for any sort of knowledge of it, and upon which its existence in fact rested.  For in his model knowledge and existence were two sides of the same coin.  In his exploration of and attempt to define being qua being, he comes up with the idea of “substance”, which is closely tied to being and existence itself.  His notion of substance, however, upon which his notion of what we might call “reality” today clearly depended to no small extent, is ascribed features of both matter (hulê) and form (eidos or morphê), and covers the animate as well as the inanimate world.  In other words, it wasn’t simply the material world or objective reality that defined existence in Aristotle’s philosophy; existence had an underlying purpose which established the parameters within which “reality” could, and in fact had to, be understood.

It is also from this exploration of what substance might actually be, the definition of which represents a good chunk of Aristotle’s Metaphysics, that our word “essence” ultimately derives.  For in order to try to find a definition for substance, a term he clearly finds critical in his description of reality and how we are to understand the world around us, Aristotle uses the phrase to ti ên einai, literally “the what it was to be” of a thing, or the shorter variant to ti esti, “the what it is” of a thing, to try to describe what substance, ousia, might be.  Both of these phrases were translated into Latin with an altogether new word, essentia, given their obscure meaning in the source Greek, which of course is where our English “essence” comes from.  It is interesting to note that with all of our building and construction of science, of fields of knowledge or knowing, which rest in no small measure on the work and terminology of Aristotle, this notion of “essence” has been completely lost, replaced at best by properties or attributes that can be measured or empirically verified and that determine the reality of a thing, an entirely empirical view of reality and a significant departure from the theologians and philosophers of the past three thousand years.  This was the same view of reality posited and held so firmly by Niels as well, the perspective which denied the reality or relevance of the mystical experience, or of any other subjective experience which could not be empirically verified or measured for that matter.  And yet the notion of essence, along with the concept of purpose or causation and its relevance in defining being qua being, which had at its core this notion of substance, went well beyond the material definition of a thing in his model of reality, the very same intellectual framework within which knowledge itself has come to be defined over the ages.

Aristotle diverged from Plato’s theological premise of a divine, intelligent creator as laid out in the Timaeus and interpreted by subsequent philosophers (in what Aristotle refers to as Plato’s “unwritten teaching”) as the “One”, on the basis that Plato’s metaphysical foundations were weak and inconsistent and did not stand up to rational and logical criticism, with particular emphasis on the weakness of Plato’s Theory of Forms.  But Platonic doctrine, in its written or unwritten form, while perhaps resting on weaker metaphysical and rational foundations than Aristotle’s theory of knowledge, which rested squarely on causation and purpose, had no clear notion of separation between subject and object at all, simply a grand overarching nous, or intellect, from which the abstract Forms and Ideas emerged and through which they could ultimately be grasped or understood.

Aristotle no doubt breaks things down much more completely and thoroughly than his predecessor, and perhaps establishes the groundwork upon which subject and object came to be completely distinguished, such a marked characteristic of modern day materialism; but even in Aristotle’s profoundly rational and logical metaphysical model, the distinction between the object and the perceiver of said object is not clearly made, and is most certainly not emphasized in any way.  More importantly, the notion of the individual’s place in society and the establishment of the criteria that should form the basis of good living, i.e. ethics and morals, notions that were also critically relevant to Plato and even Socrates, were a core part of his philosophical doctrine.  These were elements of practical philosophy from his perspective, and they had a place in his teachings as important and relevant as his other philosophical works such as the Physics, Metaphysics, On the Heavens, On the Soul, and even the logical treatises that historically became grouped together as the Organon (Categories, On Interpretation, Prior Analytics, Posterior Analytics, Topics and Sophistical Refutations).

Aristotle’s teachings, along with Neo-Platonic doctrine and its notion of the One and its emanation into the many, formed the basis of the intellectual understanding not only of philosophy and the natural sciences as they were taught well into the Middle Ages, but of theology as well, as these metaphysical doctrines were integrated and synthesized with the interpretation of the Scripture of the Abrahamic religions as they developed and evolved in their respective intellectual communities.  Aristotle’s teaching, which became blended with Platonism in the 3rd century CE through the teachings of Plotinus as reflected in the Enneads[2], was preserved first by the Romans and then by the Muslims, and in the Middle Ages came to represent the core of a classical curriculum of sorts, a curriculum which included not just philosophy and theology, but also medicine, biology, astrology, ethics, political philosophy, logic and mathematics.  The words, terminology, and overarching structure of the fields of knowledge, and the approach to their study, had been established by Aristotle in the 4th century BCE and persist in the language we use to describe philosophy and physics even to this day.

But nowhere in these systems of belief, be they purely philosophical or theological, could Charlie see a clear distinction between the source of the existence of a thing (the creator God of the Jews, Muslims and Christians, the ultimate or final cause or prime mover of Aristotle, or the One from which all things emanated in Neo-Platonism) and that which was created, this objective world which was the focus of all the sciences of modern times and which had come to define reality itself.  This mechanistic worldview of modern times rested on the reality of the objects of our senses along with the forces which acted upon them, forces which incidentally also had their roots in Aristotle’s system of knowledge, as he (in the Physics and Metaphysics) emphasized the importance of the role of movement, or locomotion (kinesis), as a key defining element of reality: movement which connected the potential state (dunamis) of a thing and its actual state (energeia, or entelecheia), and which was bound by the notions of Time and Space that were integrally related to it.  Sounds an awful lot like modern physics, does it not?

This idea that the objective world, defined by the constituents of an object combined with the forces that acted upon it, was the one and only reality was clearly a modern invention however.  All of these ancient, and what we might call outdated, systems of philosophy and theology as they developed well into the Middle Ages incorporated not only what we would today call systems of physics, natural philosophy and even theology, but also extensive theories of the soul and of the relevance of ethics and morality in living a “good” life, theories completely integrated into their doctrines, either as “laws” handed down by God in the Abrahamic religions, or as very well thought out and rational extensions of the philosophic doctrines themselves, handbooks for harmonious living.

For example Aristotle’s theory of happiness (eudaimonia), which he proposes is the ultimate goal of life, the telos of the soul in fact, is tightly related to the notion of the pursuit, and ultimate understanding, of virtue or excellence (aretê), the subject of two of his most prominent extant works, the Nicomachean Ethics and the Eudemian Ethics.[3]  Ethics and morality, their importance and relevance to human life and happiness and to the proper and healthy functioning of the state and society as a whole, rested on the fundamental belief in the reality of the soul as the “form” of man, i.e. that which gave him purpose, his ultimate goal or end (telos).  The belief in the soul as a key element of reality (be it immortal or not) was and is a major and consistent theme of all of these ancient belief systems, be they philosophical or religious, and the moral and ethical foundations embedded in them were not divorced from their doctrines as they are so markedly from the sciences today.

In some sense, Charlie mused, language itself could be viewed as the beginning of this separation, the beginning of the bifurcation that characterized our collective, individually mutually exclusive and yet at the same time fundamentally interdependent (according to the Eastern philosophical systems at least) universal reality, the world in which we all lived and breathed.  The development of language itself, the bedrock of civilization as it were, required the notion of separation, required the concept of objectivity; objectivism to some degree was in fact a necessary precondition for the development of language.  For every word, some combination of strung together syllables that has meaning in one language or another, can only be understood relative to some other idea or concept, the metaphor or analogy being used, or conversely in contrast to some idea or concept which represents the opposite of the meaning of the word in question, that which something is not.  For it is the world of opposites in which we live, beyond which the great Indian sages tell us the infinite lies, and yet at the same time the greater the abstraction of a word or concept, the closer we come to truly understanding, and identifying with, this unified existence.[4]

Take the term Satchitananda from the Indian philosophical tradition for example: the word used to describe the essential nature of the non-dual ultimate experience of Brahman, from which all things emanate, that is described at great length in the Vedas, the Upanishads in particular.  Satchitananda is a composition of three Sanskrit words: sat, the present participle of the Sanskrit verb “to be”, combined with the nouns cit, meaning “consciousness”, and ānanda, meaning “bliss” or “absolute bliss” in this context.  Satchitananda is a word meant to convey the idea of “the existence of a pure essence that is present and active, and consists of pure consciousness and absolute bliss”, analogous in the Platonic school to the ultimate Idea or Form in Plato’s world of Forms and Ideas which emerges from the divine creator described in the Timaeus, and to the core Aristotelian teleological (causal) principle which gives purpose and form to everything that exists, being qua being.

But here’s the catch, and Charlie almost smiled wryly when his mind went down this road: the word itself, Satchitananda in this case, a word ironically intended to describe a state of being beyond words and the world of name and form, by its very manifestation – spoken or thought – implied that there is a thing, something that exists, something whose essential nature is the essence of bliss and consciousness, even if in its purest and most essential form, and of course some perceiver or subject who experiences this state.  Duality, or at least the existence of a perceiver and that which is perceived, was implicit even in the term Satchitananda, and to take it one step further into its Neo-Platonic form, there must exist a meta or supra Platonic Form or Idea that rests behind or above the notion of Satchitananda and lends it its meaning.

Although it was not so clear how we modern intellectuals came to latch so religiously onto this mechanistic world view, it was clear that as these ancient peoples evolved and progressed, a cultural melting pot emerged that facilitated the exchange of ideas, both of a religious and intellectual nature, as well as technological advancements that led to increased urbanization, which further reinforced an environment conducive to the more rapid exchange of thought and ideas.  Individuals transitioned into more specialized and “civilized” roles in their respective societies and civilizations, allowing for the progression of metaphysical and theological development beyond the prevailing mythologies and pantheistic traditions that had reigned supreme for thousands of years prior to the advent of civilization in the Mediterranean and in the East.  This specialization and evolution of thought ran parallel to the expansion of trade and cultural exchange that developed as civilization emerged in the Mediterranean and Near East, marked most notably by the advent of successive empires and cultures in this region:

  • the Persian Empire in the first half of the first millennium BCE in the Near East,
  • the period of Hellenic influence marked most notably by Alexander the Great’s conquest of the Eastern Mediterranean into the Near East, marking the rise of Greek influence (and philosophy) in the Mediterranean,
  • the period of Roman and Latin (and predominantly Christian) influence in the West, starting at the end of the first century BCE and carrying into the second millennium CE – the rise and fall of the Roman Empire and then the persistence of the Byzantine Empire in the Near East, which carried forth a Greek intellectual and philosophical bent albeit Christian in faith – and lastly
  • a period of Islamic influence in the Near East beginning in the latter part of the first millennium CE and extending into the second millennium CE, driven by the teachings of Muhammad and the empires established in his name.

This melting pot and theo-cultural exchange continued well into the Middle Ages, until the advent of what historians today call the Renaissance (14th and 15th centuries CE), the Scientific Revolution (16th and 17th centuries CE) and the Age of Enlightenment (17th and 18th centuries CE), from which eventually emerged what we would today call science.  Science in turn reinforced a more literal and materialistic form of atomism and mechanism, the belief in the atom being first associated with the Greek philosopher Leucippus and his student Democritus in the 5th and 4th centuries BCE, who are credited with formulating the concept of atomism and the void upon which it depends.  It was in this period of accelerated civilizational growth toward the end of the Middle Ages that the influence of all these competing cultures and theo-philosophies, which had been developing for centuries, for millennia really, was analyzed again within a more purely socio-political context, akin to Plato’s Republic or Al-Farabi’s Virtuous City, rather than the purely religious context that had prevailed with Christianity and Islam throughout the first millennium CE and beyond.

Even with the axes to grind among all the different competing religious systems that developed during this extended period of civilizational development and evolution in the West, each of these religious systems assimilated and incorporated Hellenistic philosophical principles in order to rationalize and justify its creed, for even into the period of Christian and Islamic influence in the West, the Hellenic philosophers were considered the torch bearers of reason and were still looked upon as pillars of philosophical and theological thought.

The prominence of Hellenistic metaphysical and philosophical thought extended well into the Middle Ages and through the Age of Enlightenment, speaking to the power of the traditions and disciplines that emerged in Classical Greece.  These ancient Greek philosophical systems from the Hellenistic era were integrated into the subsequent theological systems (mostly Abrahamic), and in each of them there existed a belief in a single Creator of the universe, a universe which in the Platonic sense emanated from an anthropomorphic God: the Yahweh of the Jews, the God of the Christians, and the Allah of the Muslims.

Each of these Abrahamic religions, religions which dominate even today’s religious landscape, views the universe’s existence as the result of the will of a benign and omniscient creator upon whose existence the universe depends.  Once integrated into their respective religious traditions, the Hellenic theo-philosophical traditions provided a much more rational foundation not only for the existence of the One, but also for the existence of its laws, which established the rules for proper ethics and moral conduct in these religious traditions.  These traditions leaned on the extensive metaphysical, fundamentally rational, foundations created by the Greeks and the subsequent interpreters of their teachings, foundations which were incorporated and synthesized into the mythology of the Old Testament and in turn leveraged to establish and reinforce the legitimacy of the teachings of each respective religious school’s founder – Muhammad of the Muslims, Moses of the Jews, and Jesus of the Christians.

Charlie believed without a doubt that religion, particularly after the fall of the Roman Empire straight through the Christian Crusades of the 11th, 12th and 13th centuries, accounted for more death, suffering and destruction than any other source in the history of mankind.  It no doubt accounts for much of the conflict we see in the world today as well, with fundamentalist Islamic factions taking moral and ethical stands against the materialism and sensualism so prevalent in the Western world, whilst the Jewish community still desperately tries to defend what it considers to be its homeland and birthright, one that took millennia to (re)establish against outside interlopers and invaders since the dawn of Western civilization.

But Charlie did believe that if you could cut through the religious dogma and the literal interpretation of Scripture that “believers” seemed to get so hung up on, all of these religions of the West contained inherent in them a fundamental notion of wholeness and unity, stemming from the faith in a creative and anthropomorphic God that was intrinsic to each of their respective traditions, albeit in allegorical form – even if this faith in a unified creative whole was exclusive and intolerant of alternative points of view, which in Charlie’s view was the source of so much conflict over the last two thousand years or so.

This belief in the existence of a single, anthropomorphic deity, such a marked characteristic of the religious development of Western civilization, has come under fire in the 20th and 21st centuries as science has advanced to the point where the creation of the universe itself can be explained within a rational and deterministic framework, as reflected in Big Bang theory, which sits atop widely accepted astronomical and physical evidence, providing the cornerstone for atheistic belief systems that have attacked the foundations of organized religion.

And it was this wholesale abandonment of religion as a structure of faith for morals and ethics that Charlie had a problem with, because whether or not you believed in its dogma, or in its exclusionary and almost arrogant tenet that its way, its path, was the one and only way, once you abandoned religion entirely, something was lost.  The soul had been cast aside as a mere tool of the Churches, Mosques and Synagogues to control their believers.  And along with the belief in the soul itself, the natural extension of the importance and relevance of morals and ethics to this soul’s happiness and ultimate liberation, the establishment of a moral and ethical society within which to promote this happiness, the inherent psychological and socially constructive value of the narrative of the soul, i.e. myth, and the soul’s essential link and connection to being itself had all been thrown out with it.  Talk about throwing out the baby with the bath water.

But as Charlie parsed through and studied the great philosophers and theologians who over the last three millennia crafted and evolved the sophisticated and complex theological systems that sat behind this faith in a single, unified and anthropomorphic God, he saw that this notion of unity and interconnectedness, which came from the philosophy espoused by Plato and Aristotle, was not lost; it was integrated into these religious systems.  And to understand how science, materialism, emerged from this age of imperialism and religious dogma that marked the two millennia after Socrates was executed for questioning authority, for espousing reason over faith, you had to look at how these theological systems evolved, who affected their evolution, and from what basis Descartes, Newton and the other prolific, ground-breaking thinkers who followed in their footsteps built the rational and metaphysical platforms that firmly established us in this current age of Science and Reason – a world where Science is the prevailing Religion, and Faith in the fundamental reality of the objective world, a world defined by that which can be measured and perceived by our senses and the instruments we have designed as an extension thereof, predominates intellectual thought.  For in modern times, faith in science (for good reason one might argue) has far eclipsed and overshadowed our faith and belief in religion, or God; a transformation driven by the intellectuals, scientists and learned scholars of the last few centuries which has relegated religion to the corners of the ignorant, uninformed and uneducated, and rendered it almost completely absent from academic study altogether.

There were centuries of thought and philosophical and theological inquiry between the time of Plato and Aristotle’s original writings in Classical Greece, writings which broke from the reigning traditions of belief in the prevailing theos and mythos of their time, and the ensuing interpretations of their work, which evolved and were assimilated into different cultural and religious systems not only throughout the Hellenistic and Roman eras, which lasted well into the 5th century CE and beyond, but up until the Renaissance and the revolutions that followed it, marked by thinkers such as Galileo, Descartes and Newton, who challenged the reigning Christian belief systems which had held a choke hold on Western civilization throughout much of the Middle Ages.  Running parallel to this development in the West was the evolution of Eastern theological and metaphysical systems, which had their roots in the Vedas reaching as far back into antiquity as the first half of the second millennium BCE[5] and continued to evolve and affect Eastern religious and philosophical development through the second millennium CE, marked most notably by the advent of Buddhism as professed by Siddhartha Gautama in the 5th and 4th centuries BCE, and the exposition of Vedanta philosophy by Shankara in the 8th and 9th centuries CE.

Alongside the Hellenistic philosophical traditions thriving at the time of Christ, there existed all of the religious and theological traditions that were brought into India by conquering nations and immigrants over the first and second millennia CE, most notable of which were Islam and Christianity.  Both flourished and were accepted side by side with the native Hindu and Buddhist cultures, which had at their core the acceptance of the Many alongside the One, both being perceived as various reflections of the same unified Brahman, or in the case of Buddhism the belief in no godhead at all but simply the way.

Ironically, it was most probably the polytheism inherent to the Hindu tradition, the belief in the joy and beauty of the celebration of the many different aspects of the divine, that allowed Indian society to be so tolerant of other theological and religious systems over the centuries, or at least so it appeared to Charlie from where he stood at the beginning of the third millennium CE.  But this polytheism, such a core tenet of the Hindu religion, was married to a fundamental belief in the direct perception of non-dual reality as the goal of all religious and spiritual traditions, the Satchitananda of the Vedas (a concept which Charlie looked at as a de-anthropomorphized Yahweh of the Jews, God of the Christians, and Allah of the Muslims), and it was this that created the foundation of tolerance upon which all these religious systems could thrive and flourish side by side.

Religious belief systems as espoused by Islam and Christianity, seen in juxtaposition to the teleological, epistemological, and non-anthropomorphic theological pursuits that characterized the Greek philosophical tradition, surely clouded some of these philosophical and metaphysical developments, but even within these religious systems there existed an undercurrent of philosophical inquiry that provided the foundation for the further pursuit of natural philosophy that took hold in the middle of the second millennium CE, culminating in what historians call the Age of Enlightenment, a thousand or so years after the onset of what present day historians call the Dark Ages.

And yet what Charlie was searching for, now that his thesis had been fairly well established, was where this fundamental and immovable faith in the reality of the world of the senses, the world that exists only insofar as it can be empirically measured or perceived by the senses or some extension of the senses, and which stood in contrast (at least in its most modern interpretation) to the belief in a divine creator, found its unquestionable foothold.  But he couldn’t find it, at least not in the theo-philosophical traditions of the Ancient Mediterranean and certainly not in the Abrahamic monotheistic faiths that emerged thereafter in the West.  He found great philosophers and profound and extensive theological systems; he found great religious figures who professed illumination and direct communion with the divine, from which the great Islamic and Christian religions sprang forth; he even found great theologians and religious figures through the Middle Ages who attempted to integrate the profound metaphysics of the Ancients with their own religious creeds and belief systems, like St Augustine (354-430 CE), Averroes (1126-1198 CE), and Thomas Aquinas (1225-1274 CE) among others.  But none of them professed the supremacy of the material world over the spiritual, and certainly none of them dismissed the idea of a divine or otherwise omniscient creative principle.  This was clearly a much, much later development.

Materialism did at some level have its roots in the Hellenic philosophical landscape however, in a school that did not dominate ancient thought as the Platonic, Peripatetic (Aristotle) and Stoic schools did in antiquity, but one that had a place nonetheless, and one that established if nothing else some of the semantics and language upon which modern science developed.  This was the Epicurean school, founded by Epicurus toward the end of the fourth century BCE in Ancient Greece.  Epicurus expanded and expounded on the philosophical work of his predecessors Leucippus and his student Democritus, who postulated that all things of the world were made up of atoms, an English word derived from the Greek atomos, which means “uncuttable” or “indivisible”.  In this school of thought, the atom represented the fundamental, indivisible building block of everything in the known universe, animate as well as inanimate, and originated out of the great void or ether[6].

This system of belief as passed down by Epicurus and his followers represents the first real materialistic philosophical school, materialistic in the sense that its adherents did not believe in any teleological, or first principle, foundation of the universe, nor in any sort of creative or divine principle as put forth by Plato or his followers.  The Epicurean school sat in contrast to the Platonic and Peripatetic philosophical systems, which still held that there was some core principle, or first cause, from which the physical (and spiritual) universe sprang forth.  From the Epicurean standpoint, the world was made of objects, indivisible entities that interacted with each other and that in their composite form made up the known universe, and no further teleological explanation was necessary, rendering the idea of free will a mere human construct lacking any rational foundation.

But it was important not to confuse the Epicurean philosophy of atomism, what we might call today a precursor to materialism, with mechanism, the belief that the known universe is simply a compilation of substances and corporeal objects that interact with each other and are governed by laws of science or mathematics.  Although the notions of movement and substance as fundamental principles of reality and of the description of existence did have a core place in Aristotle’s Physics, this mechanistic philosophical development, an offshoot of the materialism of Democritus, came much later, in the Age of Enlightenment, stemming primarily out of the work of Descartes (1596-1650 CE) and then Newton (1642-1726 CE), followed by many other great thinkers and authors, true lovers of wisdom, of the Scientific Revolution, who began to discover deeper laws of the natural order of the universe, laws based upon mathematical principles and upon the establishment of the supremacy of empiricism as the ground upon which any notion of reality must be constructed.  In their eyes, Truth could indeed be known, but it was grounded in the notion of law and the ability to predict and understand the behavior of the objective, material world.  The crowning discovery which characterizes this development was that the laws governing planetary motion pointed to a universe in which the Earth, God’s ultimate creation where mankind held a profound place, was not in fact the center around which the sun and stars revolved, overturning and bringing into question centuries-held beliefs and shaking the very foundations of monotheism.

But Epicureanism, like its Ancient Greek theo-philosophical counterparts Platonism and Stoicism, was developed primarily to establish a system of ethics and a way of life upon a more reasonable foundation than its mythical predecessors, a belief system which people could comprehend and understand, and one that rejected the notion of any sort of intelligent divine creative principle.  Such a principle was the concept of nous, Mind or Intellect, first established by Anaxagoras and later incorporated into Neo-Platonism as the name for the core principle which brought the world of the many into existence, from which the world as we know it emanated.

Epicureanism, like Platonism and Aristotle’s philosophy, was an answer to the question of “why we’re here” and the ultimate purpose of existence within a rational and logical framework of understanding, providing a rational foundation for a system of ethics and morals created in juxtaposition to the belief in mere godheads or straight mythology, or even to the seemingly rationally unfounded belief in “salvation” through the revelations of one prophet or another, depending upon which major religious faith you ascribed to.  Plato attempted to answer the same questions; he simply presented them in an open-ended form, the dialectic, which was meant to be used by his students as a tool for understanding.  In its essential form, Epicureanism rejected the notion of the reality of gods (theos) altogether, and even the existence of the soul, teaching its followers that the right and correct path was the pursuit of moderate pleasure, or the absence of pain, boiling life down to a pleasure optimization problem within which the notion of judgment upon death was absent.

In the words of the renowned Latin Epicurean poet and philosopher Lucretius from the 1st century BCE, we can find the rational underpinnings for the belief in atomism, a precursor to materialism, as well as for why a belief in an underlying materialistic and objective reality, a world consisting at its most basic level of atoms acting and reacting upon each other, would leave no room for any sort of divine creative principle as a natural conclusion.

And yet it is hard to believe that anything in nature could stand revealed as solid matter.  The lightning of heaven goes through the walls of houses, like shouts and speech; iron glows white in fire; red-hot rocks are shattered by savage steam; hard gold is softened and melted down by heat; chilly brass, defeated by heat, turns liquid; heat seeps through silver, so does piercing cold; by custom raising the cup, we feel them both as water is poured in, drop by drop, above.[7]

So in none of these ancient theo-philosophical systems, not even in the Epicurean school, could Charlie find this notion of true separateness which underlies today’s predominantly mechanistic world view, this notion that the world around us is distinct from the individual who lives within it.  Epicureanism reflected a belief in atomism for sure, that much was clear, but this atomistic philosophy underpinned a system of ethics that espoused a path of the greater good, or lesser evil, which implied a holistic view of man’s place in society and mankind’s place in the world around him.  Atoms were the indivisible components of the universe in the Epicurean view no doubt, and man and all animate creatures were made of these indivisible atomos, but this principle was subsumed within the ethical framework in which it sat, rather than being the primary driving force of the theo-philosophy, as is the case with the mechanism that predominates the thinking of modern man in today’s technologically advanced world.

Despite this ancient atomic worldview of the Epicureans, this relegation of the realm of the divine, religion as it were, to a mere figment of mankind’s imagination, the break between science and religion was a much later development, a development whose roots could be found in the Age of Enlightenment which swept up the socio-political and intellectual establishment of Europe in the 17th and 18th centuries.

But Charlie found, as he dug into the intellectual developments that occurred in the 15th, 16th and 17th centuries in Western Europe, categorized by later historians as the Age of Enlightenment or Age of Reason, that despite their strong anti-establishment and anti-religious roots, these developments still did not profess true mechanism, a more modern term (post Newton) that implies a strong atheistic bent combined with a fundamental belief that all reality has a purely mechanical explanation.

Platonic philosophy and Aristotelian metaphysics both played a significant role in the development of theology and epistemology in the centuries that followed their published works, developing and maturing into what modern scholars call Neo-Platonism – “neo” in the sense that it represented an assimilation of theological principles from both Ancient Judaic and Egyptian circles, combined with a broader interpretive and commentary tradition built atop the original works attributed to Plato and Aristotle.

Neo-Platonism, which in turn exerted a strong influence on the development of early Christian theology, as well as on Muslim and Jewish theology well into the Middle Ages, has its roots in the teachings of Plotinus (204/5-270 CE) and Porphyry (234-305 CE) in the 3rd century CE, some six centuries after Plato and then Aristotle lived, taught and authored, speaking to the depth and endurance of their teachings.  The Neo-Platonic teachings represent the first truly deep metaphysical framework centered around monotheism, in a much more direct and explicit way than in previous philosophical traditions which allude to and elaborate on a single unified creative principle, developments which ran parallel with the monotheistic developments occurring in the Mediterranean and Near East at the time with the spread of Christianity in the region.  The primary reference text for Neo-Platonism is the Enneads, compiled by Porphyry and consisting essentially of Plotinus’s teachings together with an introductory section on the life of Plotinus.[8]  In the Enneads, we find the first true monotheistic theological and metaphysical framework that rests alongside a system of ethics and morality based upon the concept of a hierarchical system of virtues.

Alongside Neo-Platonism, which provided the theological and metaphysical link between the theo-philosophical systems of the Ancient Greeks and Christian theology, it was the Peripatetic school founded by Aristotle that provided the language and categorization for the pursuit of knowledge (epistêmai in Greek) in general, Greek categorizations and frameworks which, translated into Latin, structured the studies of students throughout the Middle Ages and provided the underlying metaphysics for virtually all of the monotheistic traditions that followed.

To Aristotle, there were three main branches of knowledge: 1) “theoretical” knowledge, to which first philosophy (what came to be known as metaphysics, given that in this school it was meant to be studied after, “meta”, his Physics) and natural philosophy belong, 2) “practical” knowledge, which included the knowledge and intellectual pursuits of the ethical, moral and political spheres, and 3) “productive” knowledge, which included those disciplines that contributed toward the creation of beautiful and useful objects, of more practical consideration if you will.

And it was with Aristotle that we find the categorization of the fields of knowledge (or scientia in Latin, the translation of the Greek word epistêmai which Aristotle used in his writings) which carried down through the Middle Ages well into the Age of Enlightenment, providing the semantic framework within which truth and knowledge itself was to be explored, first in the Greek, then translated into Latin, and in turn into the modern European languages that followed, English of course being one of them.

Aristotle’s epistêmai, what came to be referred to as scientia, provided the basis for the categorization of research as the branches of knowledge started to mature and evolve in the Age of Enlightenment, culminating from a natural philosophical perspective in Newton’s great work Philosophiæ Naturalis Principia Mathematica, or Mathematical Principles of Natural Philosophy, which marks the beginning of science as we know it today.


[1] The word “science” in fact derives from the Latin scientia, from scire, “to know”, and is the direct translation of the Greek word epistêmai, which is the word Aristotle uses for knowledge in his teachings.

[2] The Platonic doctrine of the One, or the demiurge, from which the phenomenal world emanates via the nous or intellect, was developed by Plotinus (204/5-270 CE) and then encapsulated in the Enneads compiled by his student Porphyry (234-305 CE).  This tradition, which passed into the Muslim/Arabic philosophical tradition under the title The Theology of Aristotle, came to be known much later (19th century or so) as Neo-Platonism and represented a synthesis of Platonic and Aristotelian doctrine, espousing the belief that these seemingly at-odds philosophical teachings were not in conflict but complemented and were consistent with each other if their true and essential tenets were properly understood.

[3] These titles, Nicomachean Ethics and Eudemian Ethics, were not used by Aristotle himself and were a later editorial addition to these works, either in dedication to or by his son Nicomachus and his friend Eudemus respectively.  He refers to the principles therein in his Politics (1295a36) using the phrase ta êthika, which denotes the study of ethics and morals in general, a study which in Aristotle’s system of philosophy carried a connotation of its role not only in the development of individual character, but also in the individual practice of ethics and morals as it related to the proper functioning of society as a whole.  See “Aristotle’s Ethics” by Richard Kraut, published in the Winter 2012 edition of the Stanford Encyclopedia of Philosophy: http://plato.stanford.edu/archives/win2012/entries/aristotle-ethics/

[4] Aristotle asserts basically the same thing in his Metaphysics, where he attempts to establish the first principles, determining that at the very least there are three: a pair of opposites, or contraries, complemented by a substratum of sorts that underlies the two and gives them a platform for existence.  His notion of the importance and relevance of three among the first principles of the universe is reminiscent of the three established by the Neo-Platonists some six or seven centuries later, although Aristotle doesn’t name or establish the three; he simply deduces that three is the most likely candidate for the number of first principles.

[5] The Rig Veda is one of the oldest extant texts in any Indo-European language.  Philological and linguistic evidence indicates that it was composed roughly between 1700 and 1100 BCE, during what is known as the early Vedic period.

[6] This belief in the void is one of the philosophical concepts that Aristotle attacks as lacking a sound and coherent rational foundation in his Metaphysics.

[7] Lucretius, De Rerum Natura, Book I, lines 487-496.  ‘De rerum natura (On the Nature of Things) is a didactic poem intended to explain Epicurean philosophy to a Roman audience. The poem, written in some 7,400 dactylic hexameters, is divided into six untitled books, and explores Epicurean physics through richly poetic language and metaphors.  In it Lucretius presents the principles of atomism; the nature of the mind and soul; explanations of sensation and thought; the development of the world and its phenomena; and explains a variety of celestial and terrestrial phenomena. The universe described in the poem operates according to these physical principles, guided by fortuna, “chance”, and not the divine intervention of the traditional Roman deities.’  – from http://en.wikipedia.org/wiki/De_rerum_natura

[8] Porphyry tells us (cf. Life of Plotinus, chapters 24-26) that the First Ennead deals with human or ethical topics; the Second and Third Enneads are mostly devoted to cosmological subjects or physical reality; the Fourth concerns the Soul; the Fifth, knowledge and intelligible reality; and finally the Sixth treats of Being and what is above it, the One or first principle of all.  Outside of the Enneads, Porphyry was a prolific author and philosopher in his own right.  He wrote an introductory work on ancient philosophy and logic called the Isagoge, for example, which in its Latin translation served as the standard textbook on logic and philosophy taught to students well through the Middle Ages and even into the Renaissance and the Age of Enlightenment in the West.
