Theology Reconsidered: An Introduction

What follows is the introductory chapter from a newly published, two-volume work entitled Theology Reconsidered.  The book can be purchased, Volume I and Volume II separately, from Lambert Academic Publishing via their website.

 


When looking at the first mythological and philosophical works from antiquity, it is very easy to get lost in the “facts” surrounding these ancient works and lose sight of their true meaning and import to the people and cultures within which they emerged.  Much of the modern academic and scholarly literature concerning these ancient “theo-philosophical” works falls into this category.  To a large extent, the purpose of this work is to recover the meaning of these ancient works as much as possible, and to look at them within the much broader theological, mythological and philosophical narrative that we find throughout Eurasia in the first millennium BCE, the so-called “Axial Age” of modern man.

In order to do this, we take a primarily intellectual journey through the mind of ancient man, as he saw the world and as is reflected in the earliest literary evidence we have of him.  We try to understand these works not only within the broader “Eurasian” context, but also through the eyes of the ancient philosophers, theologians, priests and scholars who wrote these texts – or who, in many cases, “compiled” or “transcribed” longstanding theo-philosophical traditions – and within the theological, intellectual and socio-cultural context within which these works arose.  This broader meaning we call knowledge, the study of which is referred to in modern philosophy as epistemology.

This knowledge is what Philo Judaeus takes great pains to describe in his exegesis of the Pentateuch, Genesis in particular, and what the Neo-Platonists take pains to describe in the literature which arises in defense of their doctrines as Christianity takes root and begins to supplant and snuff out their schools of learning and wisdom.  It is what is alluded to in the so-called hidden, or unwritten, teachings of Plato; that which is hidden, kept secret, by the followers of Pythagoras and also in the Eleusinian mysteries and the alchemical Hermetic doctrines attributed to Hermes Trismegistus; and what the Upanishads refer to as Brahmavidyā, or knowledge of Brahman, which from deep antiquity is believed to be passed down from teacher to disciple – what Plato refers to in his Seventh Letter as that which is “brought to birth in the Soul, as light that is kindled by a leaping spark, and thereafter nourishes itself.”[1]

After the author completed his first major work, the Snow Cone Diaries, we considered the writing “experiment” complete; the Work was done.  Following that exercise, however, and for reasons we cannot completely explain, we found it necessary to flesh out some of the ideas therein, reflecting a continued interest, and ultimately curiosity, concerning the advent, development and evolution of what we term the theo-philosophical developments of mankind which abound in the historical record – following the intellectual journey, as it were, up until modern times, what we refer to herein as the Information Age, where Information is at our fingertips, where activity is “informed” by its surroundings (Quantum Theory), and yet where we all, for the most part (and in particular in Western academia), ignore the wisdom of the ancients.

The Snow Cone Diaries was somewhat manufactured in the sense that it was intentionally modelled after one of the most influential books the author has read: Zen and the Art of Motorcycle Maintenance, written by the modern self-proclaimed “metaphysician”, or philosophologer (i.e. one who studies philosophy), Robert Pirsig.  Initially published in 1974, the author encountered and read it during the summer between his freshman and sophomore years in college, years of great tumult and change for most, the author being no exception.

After college, as is documented in the Snow Cone Diaries, the author spent several years pursuing a career in professional tennis, which in the end amounted, more so than anything else, to a period of intense reading, personal analysis and introspection that led the author – in a manner that can perhaps best be described as “chance”, or if you believe in such things, Fate as it were – to Eastern philosophy and mysticism.  The author had been somewhat prepared for this adventure, having majored in Ancient History as an undergraduate at Brown University and written his thesis on the “Origins and Influence of Hermetism”.  In a sense then, the author’s initial, and now very persistent and long-lasting, foray into Eastern mysticism was a natural extension of the intellectual pursuits of his undergraduate years.

The impetus and source of the author’s interest in Eastern philosophy – Kuṇḍalinī Yoga specifically, in fact – stemmed primarily, at least at the beginning, from an interest in the mental and psychological demands of the game of tennis at the professional level, where many matches were determined by an individual’s performance at key points within a given match.  After spending 6-9 months on the professional tour, mostly travelling in Europe, it became very clear that an individual’s success (which was of course almost entirely measured in Wins and Losses), while depending of course on physical attributes such as power, strength and fitness, at the same time very much depended upon what is referred to in sports as the mental aspect of the game.

This somewhat revealing and interesting component of the game – which arguably is an aspect of all professional sports but is more accentuated, as it were, in tennis, given that it is mano a mano so to speak – became even more pronounced when two opponents were somewhat evenly matched and victory hinged upon just a few points, which as it turned out happened more often than you might think.  The author became somewhat intrigued by this phenomenon, if we may call it that, and he became fascinated with what in professional sports is referred to as The Zone.

As such, it became very clear that in order to get into The Zone, there was a very well documented and well-studied connection between peak performance and what we might call “clarity of mind”.  In turn, this clarity was connected to, and in many respects seemed to be dependent upon, what are referred to even in the sports psychology literature as the development and cultivation of various rituals and practices, both on and off the court – before, during and after matches – in order to facilitate and/or “bring about” these states of mind where peak performance could be “attained”, or in Eastern philosophical parlance, realized.[2]

With this background then, the author, while he was playing and studying, began writing – primarily about the so-called “mystical experience” which was such an integral part of the Eastern philosophical tradition, and its fundamental relationship to The Zone as it was understood in Sports Psychology.  The effort was centered around (and to a large extent this is also true for the Snow Cone Diaries as well as the current work) an attempt to establish a rational grounding, or intellectual footing as it were, within which these states of mind could be better understood and as such better integrated, or at least somewhat integrated, into what the author now calls the objective realist intellectual framework that underpins not just Western academia, but is also clearly the very rational ground of the Western mind, or psyche.

For despite all the author’s education and training – mental and physical gymnastics galore as it were – no one had ever even broached the topic or discussed with him this idea of peak performance and its relationship to the cultivation of clarity of mind, and the states of consciousness that were associated with these “states” as it were, despite the fact that there appeared to be an almost empirically proven correlation between the two – at least in Sports Psychology circles.  It also became clear over time that this goal of peak performance was not just dependent at some level upon these so-called states of mind, but that there in fact seemed to be a very direct, causal relationship between the two.  Furthermore, these very same states of mind that were described in the psychological literature around peak performance were not only clearly an extremely significant, and somewhat undervalued and under-practiced, component of competing as a professional athlete at the very highest levels, but were also an integral part of the mystical experience as it was described in almost all of the Eastern philosophical literature.  And of course, after some reflection, these states of mind seemed to be an integral component of success in life, however one might choose to define such a thing.

Following the model of Robert Pirsig then, and because the author felt strongly that the ideas he was exploring, presenting and analyzing were best understood only within the psychological and mental context within which he himself initially encountered and confronted them, the author felt compelled to take his initial, more “academic” works and wrap them around a loosely fictional character which he named Charlie, as well as create additional (also loosely fictional) characters to which Charlie was “responding” and “reacting”, in order to try and establish the empirical reality and power of the nature of mind, and along with it the fundamental truth and power of the ancient art of meditation and mysticism to which it is integrally tied.  For perhaps the hallmark of the Eastern philosophical tradition is the emphasis on and description of the art of meditation and its relationship to the attainment of these states of mind – what the author calls the Science of the Mind as it were, an altogether Eastern discipline.

Using Charlie as his mouthpiece then, we essentially argue in the Snow Cone Diaries, as Pirsig had done before, that not only are our current (Western) intellectual models lacking in some very basic and fundamental ways – given the lack of emphasis and focus on the mind, and on experience itself, as the basis of reality as we understand it – but that these limitations have, again as Pirsig had argued as well, significant implications for how society in the West functions, how individuals within that society behave toward each other, and the nature of the relationship of individuals and (Western) society as a whole to the world around them in general.[3]

Given the level of effort and personal sacrifice that went into publishing the Snow Cone Diaries, and given that the author is by no means a full-time writer and first and foremost has a demanding professional career and responsibilities as a parent that were and are first priority, we never thought that we would embark upon a subsequent work.  We thought we were done.  However, our interest in ancient philosophy and the art of meditation did not dwindle, and the author’s meditation practice continued to flourish and grow and (as it is wont to do, as any persistent and schooled meditation practitioner will tell you) the practice itself continued to have a profound impact on us – in the following ways as it pertains to this work specifically:

  1. our own personal conception of the nature of reality, and the disconnect between it and the commonly held and systematically taught “belief systems” which we are taught from early childhood and which are presented as “empirically true” in the West,
  2. a continued and increased dissatisfaction with the current prevailing “Western” worldviews and belief systems,
  3. a deeper appreciation for the eternal truths that the very first philosophers from classical antiquity, as reflected in the term the Axial Age, were trying to convey – truths which seemed to be missed or passed over in much of the scholarly work surrounding these ancient texts and authors,
  4. a realization that there were in fact many more parallels and commonalities between the various belief systems that were compiled in classical antiquity throughout “Eurasia”, much more so than is reflected in most if not all of the scholarly and academic work surrounding these traditions, and
  5. a recognition that the disciplines surrounding the study of the philosophers of deep antiquity were “siloed”, in the sense that the Sinologists (Chinese) weren’t collaborating with the Vedic or ancient Sanskrit scholars, and the “Classicists” who studied the ancient Greek philosophers didn’t seem to be collaborating with either of these two disciplines.

Given these facts, and after studying the various traditions from antiquity, again as reflected in the so-called Axial Age, even from a layperson’s perspective it seemed that there were underlying similarities and patterns that were being missed, primarily because each of these disciplines independently requires such a rigorous and deep level of knowledge of its specific domain.  Those who could understand and read ancient Chinese script were not necessarily the same people who knew and could read Vedic Sanskrit, and these people of course were not necessarily the same people who knew ancient Greek or Latin, and in turn these were not necessarily the same people who could read the cuneiform of the ancient Sumerians and Babylonians, or the hieroglyphs of the ancient Egyptians for that matter.

However, given that we now live in the Information Age, and that the translations of many of these texts – as well as the underlying meanings and etymologies of the various terms and words of the ancient languages themselves, as reflected in the ancient writing systems that developed in the 3rd and 2nd millennia BCE throughout Eurasia – are now readily available, the time seemed ripe for a generalist of sorts to pull together the knowledge from all these somewhat disparate domains and bring it together into some sort of cohesive whole, in a more comprehensive and somewhat more scholarly fashion than had been done in the Snow Cone Diaries, which was more of a personal journey than it was an intellectual or academic one.

So, given that the disciplines and domains of study and research described herein continued to impress themselves upon the author, and given our continued interest in ancient philosophy and in writing even after the publication of the Snow Cone Diaries, we ended up publishing two interim works thereafter that summed up and further explored some of the more esoteric and less well known aspects of Hellenic philosophy and their subsequent influence on the development and foundations of early Christianity, as well as an exposition of the philosophy of the Far East, the latter being an area of research that was relatively new to the author and that was not considered in the Snow Cone Diaries.[4]

The Far Eastern ancient philosophical tradition, what is referred to as ancient Chinese philosophy, which is covered in detail in the current work, is intriguing for many reasons, but for the sake of brevity in this Introduction, suffice it to say that the Yijing, more commonly known as the I Ching, is arguably the most fascinating and intriguing theo-philosophical work from antiquity.  And the more the author studied it and was exposed to its origins and influence throughout Chinese history, the more impressed he was with its place as one of the greatest intellectual achievements in the history of mankind, one that reached far back into Chinese antiquity (3rd millennium BCE at least) and one that undoubtedly rivals the Vedas and the Avesta as among the oldest theo-philosophical treatises of ancient man.[5]

Furthermore, as the author began to understand more and more of the nature, content, structure and origins of the Yijing – the most prolific and influential of all the ancient Chinese “philosophical” works, if we may call it that – it became apparent that its basic architecture, particularly from a numerological and metaphysical perspective, shared many characteristics with ancient Pythagorean philosophy, in particular as reflected in the symbol that perhaps more so than anything else has come to represent said philosophy, namely the Tetractys.  Following this intellectual thread as it were, the author published what was supposed to be a small academic piece on the similarities, from a numerological and arithmological perspective, between “Pythagorean” philosophy, or what we know of it, and classical Chinese philosophy as reflected in the I Ching, referred to in this work by its more modern Romanized form, i.e. the Yijing.[6]
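
To give a flavor of the arithmological parallel in question – a minimal illustrative sketch only, not a summary of the paper’s argument – both systems build their symbolic architecture out of small generative numbers:

```latex
% Tetractys: the fourth triangular number, sacred to the Pythagoreans;
% the first four integers sum to the "perfect" ten
\[ 1 + 2 + 3 + 4 = 10 \]
% Yijing: combinatorics of the two line types (broken/unbroken);
% three lines yield the eight trigrams, six lines the 64 hexagrams
\[ 2^3 = 8 \ \text{trigrams}, \qquad 2^6 = 64 \ \text{hexagrams} \]
```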

 

This work in its current form is to a large degree an outgrowth and evolution of the intellectual journey that is documented and mapped in the Snow Cone Diaries, and in particular an outgrowth of research done after the Snow Cone Diaries was written, exploring the nature and origins of early Hellenic philosophy and its relationship to early Chinese philosophy as well as to ancient Vedic or Indo-Aryan philosophy as reflected primarily in the Upanishads, the latter of which was rigorously and systematically studied at the Rāmakrishna-Vivekananda Center of New York under the guidance of Swami Adiswarananda, to whom this work is dedicated.

So while this work can at some level be considered an extensive revision and expansion of the academic and intellectual pursuits that are reflected in the Snow Cone Diaries, it is distinct from the author’s first major work in many respects: it represents a much deeper dive into the material covered therein, and it covers topics and areas of inquiry that were not covered there.  This work is also much more “academic” in the sense that it represents – at least from the author’s perspective – a much higher level of scholarship than is reflected in the Snow Cone Diaries, and of course the personal narrative, Charlie himself, has been put to rest (God rest his Soul).[7]

Given the extent of the material covered in this work, the author in no way intends to represent it as an exhaustive study of any of the specific topics that are covered herein.  In fact, each chapter or section of the work could be covered, and is covered, at much greater length in a variety of works that are cited as references and for further study and research.  The author has, however, taken great pains to refer to, and directly cite, the most influential and comprehensive works that cover the various topics in question, and of course the interested reader can follow these lines of inquiry and these references to learn more about any given topic.

The specific source material that is used is not only cited directly throughout in footnotes, but is also covered from a much broader perspective in the Sources and Bibliography section at the end of the work.  Perhaps more so than works written before the 21st century – an era the author refers to as the “Information Age” – this work stands directly on the shoulders of the many academics and scholars who have toiled and taken great pains to open up the world of antiquity to the modern Western reader and scholar through countless translations and historical books and records, many of which are now electronically available, and on whose easy access the author has greatly relied.

There are no doubt particular sections or chapters which the author has glossed over in a manner that may be considered “superficial”, particularly by academics and scholars who have spent the better part of their professional careers studying and writing about the specific topics in question.[8]  However, the lines of thought represented in each Chapter of each Part of this work form a coherent and cohesive whole and, in their entirety – and for the sake of brevity, as ironic as that term may be given the length and scope of this work – are intended to show as complete a picture as possible in one text.

The approach from a reference and bibliography standpoint is to include significant footnotes and references directly within the material itself rather than, as is the case with most academic works, at the end of a chapter or even at the end of the work.  These extensive footnotes – the explanations and small intellectual excursions included directly in the text – not only serve to give credit to the reference material and to the work and analysis put in by other academics and scholars on whose research this work ultimately depends and builds, but also serve as sidebar notes that may be of interest to the reader, providing direct links and references to works that the reader can refer to if they are interested in a certain topic that is not covered in detail in this work.[9]

The footnote style that is used is essentially adopted from the writings of Swami Nikhilananda (1895 – 1973), one of the foremost Sanskrit and Vedic scholars in the West in the 20th century.[10]  Nikhilananda’s works have in no small measure influenced the author, who studied at the Rāmakrishna-Vivekananda Center, which Nikhilananda founded in 1933 and which was led by the author’s teacher, Swami Adiswarananda, from 1973 until his passing in 2007.[11]

In this context, Vedānta, and more broadly what we refer to as “Indo-European philosophy” in this work, is a central and constant theme throughout, in particular with respect to the modern conception of ancient Indian philosophy as it is presented in the teachings and works of Swami Vivekananda (1863 – 1902), one of the foremost proponents, and most influential, of the modern “Indian philosophers”.[12]  From the author’s perspective, Vedānta, as reflective of one of, if not the, oldest and richest of the Indo-European theo-philosophical traditions, can (and should) be leveraged as an intellectual and theo-philosophical benchmark of sorts for the recasting of the definitions of knowledge and reality in the West, one of the main thrusts of this work.

 

The work is divided into four major sections, Books or Parts, following more or less the intellectual development of mankind since the dawn of “history” – history in this sense being marked by the invention and widespread use of writing, after which we have “direct” or “first hand” exposure to the mind of man, or at least into the minds of the authors of the works that are covered herein.

  1. On Creation and On Metaphysics, Parts I and II: how the ancients looked at the world and defined reality and knowledge,
  2. On Theology and Physics, Part III: how we came to our current, modern conceptions of reality and knowledge in the West, and
  3. On Ontology, Part IV: a deeper and more comprehensive look at the nature of reality – Being in the sense that it was looked at by Aristotle and Plato – as understood through a modern Western intellectual lens, and in particular in light of the knowledge of the East.

The chapters and sections in each of the respective Parts, or Books, are designed and written to be as modular as possible.  By “modular” we mean that they are written with the intention, again as much as possible, of being stand-alone essays or dissertations on their respective topics, such that the reader can read a particular chapter without necessarily reading the preceding chapters.  That is to say, the design of the work itself is such that it need not be approached or “read” in sequential fashion from start to finish.  As such, some material and content is repeated in the various sections and Parts of this work so that said “modular” design is achieved.  Given the breadth of the topics covered herein, this type of modular design is not only intentional but almost required in order for the work to have value.  For if it is not read, it of course cannot have the intended impact or influence on modes of thinking, which is to a large extent the intended purpose of the work.

One of the main underlying themes of the work, especially in Parts I and II, is an exploration and analysis of the potentially shared origins not just of the mythology of the first “civilized” peoples in Eurasia – the “Laurasian” Mythos hypothesis of Witzel – but also, in an expanded version of said hypothesis, of the potentially shared origins of “philosophy”, what is referred to throughout as theo-philosophy, following the terminology of the Snow Cone Diaries, which brings attention to the fact that the earliest systems of “philosophy” from antiquity are not just analytical or rational systems of thought, but are also fundamentally theological in nature.

Parts I and II of this work are primarily focused on this period in history, the 3rd to 1st millennia BCE, when evidence and documents that outline the Mythos of these early Eurasian peoples – specifically the creation narratives (what we refer to as cosmological or theogonical narratives) – are introduced into the historical record.  This is followed by a detailed analysis of the subsequent theo-philosophical tradition which emerges from, and is fundamentally and intrinsically related to, the underlying cosmogonical narrative, i.e. again the respective Mythos.

Part III focuses on the intellectual developments that take place in the West after classical antiquity: from the intellectual developments that characterize Hellenic philosophy, through the advent of more orthodox religious or theological developments, straight through the Enlightenment Era and Scientific Revolution periods of Western intellectual history, where effectively the prevailing worldview is overturned and Science, as we define it in more modern terms, begins to eclipse the dogmatic religious and theological worldviews that had dominated the intellectual landscape in the West for some thousand years prior, the so-called “Dark Ages”.

Part III then goes on to look at scientific developments in the 20th century, Relativity Theory and Quantum Mechanics in particular, which call into question our modern (and pervasive) notions of deterministic, objective-based frameworks of reality, what we refer to collectively as objective realism.  These frameworks represent, from the author’s perspective, a somewhat unintended byproduct of the Scientific Revolution, and given their limitations with respect to understanding reality from a comprehensive or holistic perspective (i.e. ontology, the study of the nature of being or reality), they require – in the same intellectual spirit and intent pursued by Kant, Pirsig and other more modern Western philosophers – a wholesale revision, both in order for the two theoretical pillars of modern Science (Classical Mechanics and Quantum Mechanics) to be understood in any meaningful way, and so that the knowledge and wisdom of the East is integrated into our conception and understanding of reality as well.

Part IV covers in detail much of the material that was first introduced in the Snow Cone Diaries with respect to the fundamental incompatibilities of Quantum and Classical Mechanics, going into (theoretical) detail not just on Relativity but also on Quantum Theory, as well as on some of the philosophical, and ultimately metaphysical, implications of Quantum Theory – covering two interpretative models in particular that the author thinks are relevant to the ontological questions that are the topic of Part IV, namely the Relative-State formulation of Quantum Mechanics by Hugh Everett and the pilot-wave theory attributed to Louis de Broglie and David Bohm.  The Metaphysics of Quality as presented by Robert Pirsig is also offered up as an alternate model for ontological inquiry, given its adoption and incorporation of the direct perception of “intuitive” reality directly into its metaphysics as it were.
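
For readers who want the formal flavor of the two interpretations just named, the following is a minimal sketch in standard textbook notation (not the author’s own, and greatly simplified):

```latex
% Everett's relative-state formulation: the universal wavefunction
% evolves only by the Schroedinger equation; there is no separate
% "collapse" upon measurement
\[ i\hbar \, \frac{\partial \Psi}{\partial t} = \hat{H} \, \Psi \]
% de Broglie-Bohm pilot-wave theory: particles always have definite
% positions Q_k, which are "guided" by the wavefunction
\[ \frac{dQ_k}{dt} = \frac{\hbar}{m_k} \, \operatorname{Im}\!\left( \frac{\nabla_k \Psi}{\Psi} \right) \Bigg|_{(Q_1, \ldots, Q_N)} \]
```

Both approaches retain the deterministic wave equation and dispense with a separate measurement postulate, which is part of what makes them relevant to the ontological questions taken up in Part IV.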

Part IV then offers up various alternative interpretations of reality that attempt to present and synthesize what we understand about the nature of reality both from a scientific perspective and from what we might term a mystical or spiritual perspective – models which directly take experiential reality into account when defining reality, and the extent of knowledge itself, i.e. what is referred to as epistemology in modern philosophical nomenclature.  The models and analysis in Part IV directly take into account the role of active consciousness, cognition and perception – what in Quantum Theory has come to be known as the act of observation – which from a Scientific perspective, at least again from the author’s standpoint, must be taken into account in any formulation of reality and in any definition of knowledge.

The alternative approaches to defining reality and knowledge that are presented and described in Part IV essentially synthesize typically “Eastern” and “Western” worldviews and, from the author’s standpoint, are far better suited than existing philosophical or religious intellectual frameworks not just to prepare us as individuals to survive and thrive in the modern Information Age, but also to serve society as a whole, from a national as well as a global perspective, given the level of interdependence and interconnectedness not just of the human race, but also of the natural world within which we live and upon which we depend for our survival moving forward into the future.

The last several chapters of Part IV – much more so than the author originally intended, in fact – are dedicated to a fairly lengthy discussion of a relatively modern debate surrounding the different ways or approaches to interpret, and how best to understand, the life and teachings of the 19th century Bengali (Indian/Hindu) sage Rāmakrishna Paramhamsa, a tradition to which this author is of course closely linked from a theo-philosophical perspective.  Rāmakrishna in this sense, as he is perceived and approached in these final chapters, is the full manifestation of, and in turn the perfect example of, the limitations of Western “thinking” – of the implicit epistemological restrictions and assumptions that, while true, are fundamentally limited in their capacity to deal with anything that falls outside the realm of Science proper.  As such, he is dealt with as a case study of sorts for the need to integrate the Science of the Mind, as it were, into any ontological framework we might choose that would include the knowledge of the East along with the knowledge of the West.

This so-called mystical, or supraconscious, experience – the intended result of the practice of the ancient art of meditation as it has been passed down to us through various classically Eastern theo-philosophical traditions, in the Upanishads in particular but also implicit in the writings and teachings of Plato and the Greek Eleusinian mystery and Orphic traditions, and of course in the teachings of Buddha as well – is presented as a necessary and integral component of any “redefinition” of reality and knowledge which, following any sort of rational interpretation of Quantum Theory, must take the role of the observer and the act of cognition, i.e. perception, into account in any coherent and complete model of reality.

Along these lines, various intellectual frameworks and models which include direct experiential reality are explored and discussed at length in Part IV, as well as in the Epilogue, with specific chapters dedicated to the re-interpretation of Upanishadic philosophy as presented by Vivekananda in the early 20th century, as well as to an objective analysis of the experiences and interpretation of the life of Paramhamsa Rāmakrishna in particular, who according to tradition was of course the primary influence and inspiration for Vivekananda’s teachings and life in general.

Rāmakrishna as a mystic then, and mysticism in general – specifically defined by the practices and experiences associated with the direct perception of the ground of reality and existence itself, which is the hallmark of Eastern philosophy – is not only one of the main, recurrent (and under-emphasized) themes of ancient theo-philosophy in all its forms throughout Eurasian antiquity, as reflected in the material in Parts I and II of this work, but also, from an ontological standpoint, represents one of the other main thrusts of this work, which is covered in Part IV and summed up in the Epilogue which follows.

This “Western” view of Rāmakrishna, which is primarily represented in the book Kālī’s Child (a work which is critiqued at length in Part IV and in the first section of the Epilogue as well), is from the author’s perspective a perfect illustration of the fundamental limitations of Scientific inquiry as we understand it in the modern Era in the West: an intellectual domain that rests squarely on the implicit, and very often left out, assumptions not just of empiricism and rationalism, philosophical modes of thought which characterized the Enlightenment Era for the most part, but also of causal determinism and, again, objective realism – assumptions which provide the very basis for epistemology (i.e. our scope and understanding of knowledge itself) in the West, be they recognized as such or not.

A problem therefore arises when it comes to understanding, or again interpreting from a Western intellectual perspective, fundamentally Eastern theo-philosophical constructs such as Satcitānanda, Brahman, Purusha, Dao, or Nirvana – all words and terms that fall outside of Science proper in the West, given their lack of empirical, objective reality, and yet at the same time concepts and ontological principles that are fundamentally required to come to any sort of understanding of any great sage, saint or prophet in the history of man, Paramhamsa Rāmakrishna included.  The choices we are left with, given the modern Western intellectual landscape, are either to study the specific domains from which these words and their associated meanings originate, or alternatively to expand our intellectual domain in the West to include some sort of corollary to these ideas, to what they inherently mean and signify – the latter being the approach that Pirsig takes by formulating a new metaphysics which he calls the Metaphysics of Quality, but which he unfortunately falls short of completing, a topic covered at length in the Epilogue as well.

This analysis of course lends itself to one of the core and final arguments of this work: namely that the intellectual and metaphysical model that is applied to reality in the West, i.e. our ontological framework, while extraordinarily powerful from a natural philosophical perspective, i.e. Science, is in fact an inadequate conceptual framework for the comprehension of the full scope of reality, and is therefore in need of wholesale revision and/or significant expansion and extension, metaphysically and theo-philosophically speaking, in order to support a broader definition of reality through which a more complete and fuller understanding of existence itself can at least be approached.  Hence the title of Part IV of the work, On Ontology.

 


[1] See Plato, Letters, Letter 7 (aka the Seventh Letter), 341c – 341d.  From Plato, Plato in Twelve Volumes, Vol. 7, translated by R.G. Bury.  Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1966.  See http://www.perseus.tufts.edu/hopper/text?doc=Perseus%3Atext%3A1999.01.0164%3Aletter%3D7%3Asection%3D341c.  While the actual authenticity of the letter is debated by scholars, it does for the most part reflect the writing style and philosophy of Plato from the author’s perspective, and so while perhaps not written by Plato’s hand, it nonetheless seems to accurately represent something akin to what Plato would write, specifically with respect to the part of the work cited herein.

[2] The psychology of peak performance was spearheaded at the Nick Bollettieri Academy in the 70s and 80s by the now well-renowned and prolific Dr. Jim Loehr, founder and Chairman of the Human Performance Institute.  See https://www.jjhpi.com/why-hpi/our-people/dr-jim-loehr.

[3] An overview of Pirsig’s work, and of his invention of a new mode of thinking – which he refers to as the Metaphysics of Quality – to address some of the inherent limitations of the modern, Western worldview, is covered in some detail in the final section of this work.

[4] These works are Philosophy in Antiquity: The Greeks (2015) and Philosophy in Antiquity: The Far East (2016), respectively, both published by Lambert Academic Publishing.

[5] Tradition has it that Confucius said that if he had fifty years to spare, he would spend them contemplating and studying the Yijing.

[6] The original paper by the author regarding the similarities between Pythagorean philosophy and the Yijing is entitled “Numerology and Arithmology in Pythagorean Philosophy and the Yijing”; it was published in 2016 and can be found at https://www.academia.edu/27439070/Numerology_and_Arithmology_in_Pythagorean_Philosophy_and_the_Yijing.

[7] Two interim works were published by the author covering Hellenic philosophy and Chinese philosophy specifically that were leveraged as source material for some of the content herein, specifically some of the content in Parts I and II of this work.  See Philosophy in Antiquity: The Greeks (2015) and Philosophy in Antiquity: The Far East (2016), both published by Lambert Academic Publishing.

[8] In particular the author cites the sections on Enlightenment Era philosophy as well as Arabic/Muslim philosophy as examples of chapters which could be expanded upon greatly and which, to a large extent, do not do justice to the actors and individuals described therein, or to the belief systems which they put forward in their writings.

[9] The footnotes also incidentally serve as reminders and reference points to the author himself, so that as sections of material are revisited, reworked and/or revised, the pertinent sources are readily available.

[10] Swami Nikhilananda was a direct disciple of Sarada Devi (1853 – 1920), the consort and wife of the 19th century Bengali sage Paramhamsa Rāmakrishna (1836 – 1886).  He was the founder, and subsequent leader from 1933 to 1973, of the Rāmakrishna-Vivekananda Center of New York, and one of the foremost interpreters (and translators) of Vedic philosophy into English in the 20th century.  He authored definitive translations with extensive commentaries on the Upanishads and the Bhagavad Gītā, and he is also known for providing the definitive English translation of the Srī Srī Rāmakrishna Kathāmrita, commonly referred to in the West as the Gospel of Srī Rāmakrishna, a monumental work covering detailed teachings and events of the last few years of Rāmakrishna’s life as seen through the eyes of one of his foremost (householder) disciples, Mahendranath Gupta (1854 – 1932), or simply ‘M’.

[11] See https://www.ramakrishna.org/ for information regarding the Rāmakrishna-Vivekananda Center of New York.

[12] Swami Vivekananda was the first to introduce Yoga and Vedānta to the West at the end of the 19th century.  He was the foremost student and spiritual successor of Paramhamsa Rāmakrishna, a figure who is dealt with at length in Part IV of this work.  Vivekananda’s modern conception of Vedānta, and of Indian philosophy more broadly, is also covered at length in Part IV.

Quantum Mechanics: The Death of Local Realism

From Pantheism to Monotheism

Charlie had covered a lot of ground by this point.  He’d started in the age of mythology, at the dawn of civilization, looking at the cultural and socio-political forces that underpinned and supported the local mythologies and priesthood classes of the ancient civilizations, and at some of the broader themes around creation mythology that crossed different cultures, particularly in the Mediterranean region, which drove the development of Western civilization.  There was most certainly a lot more rich academic and historical (and psychological) material to cover when looking at the mythology of the ancients, but Charlie thought he had at least covered a portion of it and hit the major themes – at least shown how this cultural and mythological melting pot led to the predominance of the Abrahamic religions in the West, and the proliferation of Buddhism, Hinduism and Taoism in the East.

As civilizations and empires began to emerge in the West, there was a need, a vacuum if you will, for a theological or even religious binding force to keep these vast empires together.  You saw it initially with the pantheon of Egyptian/Greek/Roman gods that dominated the Mediterranean in the first millennium BCE, gods who were synthesized and brought together as the civilizations from which they originated coalesced through trade and warfare.  Also in the first millennium BCE we encounter the first vast empires, first the Persian and then the Greek, both of which not only facilitated trade throughout the region but drove cultural assimilation as well.

In no small measure out of reaction to what were considered dated or ignorant belief systems – belief systems that merely reinforced the ruling class and were not designed to provide real, true insight and liberation for the individual – emerged the philosophical systems of the Greeks, reflecting a deep-seated dissatisfaction with the religious and mythological systems of the time, and even with the political systems that were very much integrated with these religious structures, to the detriment of society at large from the philosophers’ perspective.  The life and times of Socrates probably best characterize the forces at work during this period, from which emerged the likes of Plato and Aristotle, who guided the development of the Western mind for the next 2500 years, give or take a century.

Jesus’s life in many respects runs parallel to that of Socrates, manifesting and reacting to the same set of forces that Socrates reacted to, except slightly further to the East and within the context of Roman (Jewish) rule rather than Greek rule, but still reflecting the same rebellion against the forces that supported power and authority.  Jesus’s message comes down to us, however, only through translation and interpretation that undoubtedly dilutes his true teaching; only the core survives.  The works of Plato and Aristotle survive down to us intact though, so we can analyze and digest their complete metaphysical systems, which touch on all aspects of thought and intellectual development: the scope of Aristotle’s epistêmai.

In the Common Era (CE), the year of the Lord so to speak (AD), monotheism takes root in the West, coalescing and providing the driving force for the Roman Empire and then the Byzantine Empire that followed it, and then providing the basis of the Islamic Conquests and their subsequent Empire, the Muslims attesting to the same Abrahamic traditions and roots as the Christians and the Jews (of which Jesus was of course one, a fact Christians sometimes forget).  Although monotheism undoubtedly did borrow and integrate from the philosophical traditions that preceded it, mainly to justify and solidify its theological foundations for the intellectually minded, with the advent of the authority of the Church, which “interpreted” the Christian tradition for the good of the masses, you find a trend of suppression of rational or logical thinking that was in any way inconsistent with the Bible, the Word of God, or that in any way challenged the power of the Church.  In many respects, with the rise in power and authority of the Church we see an abandonment of the powers of the mind, the intellect, which were held so fast and dear by Plato and Aristotle.  Reason was abandoned for faith as it were, blind faith in God.  The Dark Ages came and went.

 

The Scientific Revolution

Then another revolution takes place, one that unfolds in Western Europe over centuries and covers first the Renaissance, then the Scientific Revolution and the Age of Enlightenment, where printing and publishing start to make many ancient texts, and their interpretations and commentary, available to a broader public outside of the monasteries.  This intellectual groundswell provided the spark that ended up burning down the blind faith in the Bible, and in the Church that held its literal interpretation so dear.  Educational systems akin to colleges, along with a core curriculum of sorts (scholasticism), start to crop up in Western Europe in the Renaissance and Age of Enlightenment, providing access to many of the classic texts and rational frameworks to more and more learned men; ideas and thoughts that expanded upon mankind’s notion of reason and its limits, and its relationship to theology and society, begin to be exchanged via letters and published works in a way that was not possible prior.  This era of intellectual growth culminates in the destruction of the geocentric model of the universe, dealing a crucial blow to the foundations of all of the Abrahamic religions and laying the foundation for the predominance of science (natural philosophy) and reason that marked the centuries that followed and underpins Western civilization to this day.

Then came Copernicus, Kepler, Galileo and Newton, with many great thinkers in between of course, alongside the philosophical and metaphysical advancements from the likes of Descartes and Kant among others, establishing without question empiricism, deduction and the scientific method as the guiding principles upon which knowledge and reality must be based, and providing the philosophical basis for the political revolutions that marked the end of the 18th century in France, England and America.

The geometry and astronomy of the Greeks, as it turned out – Euclid and Ptolemy in particular – provided the mathematical framework within which the advancements of the Scientific Revolution were made.  Ptolemy’s geocentric model was upended, no doubt, but his was the model that was refuted in the new system put forth by Copernicus some 15 centuries later; it was the reference point.  And Euclid’s geometry was superseded – expanded, really – by Descartes’s model, i.e. the Cartesian coordinate system, which provided the basis for analytic geometry and calculus, the mathematical foundations of modern physics that are still with us today.

The twentieth century saw even more rapid developments in science, and in physics in particular, with the expansion of Newtonian physics with Einstein’s Theory of Relativity in the early 20th century, and then with the subsequent advancement of Quantum Theory, which followed close behind and provides the theoretical foundation for the digital world we live in today.[1]

But the Scientific Revolution of the 17th, 18th and 19th centuries did not correspond to the complete abandonment of the notion of an anthropomorphic God.  The advancements of this period of Western history provided more of an extension of monotheism – a broader theoretical and metaphysical framework within which God was to be viewed – rendering the holy texts not obsolete per se, but relegating them more to the realm of allegory and mythology, and most certainly challenging the literal interpretations of the Bible and Qur’an that had prevailed for centuries.

The twentieth century was different though.  Although you see some scattered references to God (Einstein’s famous quotation “God does not play dice” for example), the split between religion and science is cemented in the twentieth century.  The analytic papers and studies that are done, primarily by physicists and scientists, leave the notion of God out altogether, even where they have a metaphysical bent or at least admit some form of metaphysical interpretation (i.e. what do the theories imply about the underlying reality which they intend to explain) – a marked contrast to the philosophers and scientists of the Scientific Revolution some century or two prior, for whom the notion of God, as perceived by the Church, continued to play a central role, if only in terms of the underlying faith of the authors.

The shift in the twentieth century, however, which can really only be described as radical even though its implications are only inferred and rarely spoken of directly, is the change of faith from an underlying anthropomorphic entity/deity that represents the guiding force of the universe, and of mankind in particular, to a faith in the idea that the laws of the universe can be discovered – i.e. that they exist eternally – and that these laws themselves are paramount relative to religion or theology, which does not rest on an empirical foundation.  Some Enlightenment philosophers would of course take issue with this claim, but twentieth century science was about what could be proven experimentally in the physical world, not about what could be arrived at through reason or logical constructs.

This faith, this transformation of faith from religion toward science as it were, is implicit in all the scientific developments of the twentieth century, particularly in the physics community, where it is fair to say that any statement or position on the role of God in science reflected ignorance – ignorance of the underlying framework of laws that clearly governed the behavior of “things”, things which were real and which could be described in terms of qualities such as mass, energy, momentum, velocity, trajectory, etc.  These constructs were much more sound and real than the fluff of the philosophers and metaphysicians, for whom mind and reason, and in fact perception, were on par with the physical world to at least some extent.  Or were they?

In this century of great scientific advancement – advancement which fundamentally transformed the world within which we live, facilitating the development of nuclear energy, atomic bombs, and digital computer technology, to name but a few of what can only be described as the revolutionary advancements of the twentieth century, and which in turn in many respects drove tremendous economic progress and prosperity throughout the modern industrialized world post World War II – it is science, driven at its core by advanced mathematics, which emerges as the underlying truth within which the universe and reality are to be perceived.  Mathematical theories and their associated formulas predicted the data and behavior not only of the objective reality of the forces that prevailed on our planet, but also explained and predicted the behavior of grand cosmological forces: laws which describe the creation and motion of the universe and galaxies, the motion of the planets and the stars, laws that describe the inner workings of planetary and galaxy formation, stars and black holes.

And then, to top things off, in the very same century we find that in the subatomic realm the world is governed by a seemingly very different set of laws, laws which appear fundamentally incompatible with the laws that govern the “classical world”.  With the discovery of quantum theory and the ability to experimentally verify its predictions, we begin to understand the behavior of the subatomic realm – a fantastic, mysterious and extraordinary (and seemingly random) world which truly defies imagination, a world where the notion of continuous existence itself is called into question.  The Ancient Greek philosophers could never have foreseen wave-particle duality; no scientist before the twentieth century could.  The fabric of reality was in fact much more mysterious than anyone could have imagined.
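
The duality in question can at least be stated compactly.  As a standard textbook illustration (supplied here, not drawn from the original text), the two relations below tie the “particle” picture and the “wave” picture of light and matter directly together:

```latex
% Planck-Einstein relation: light of wave frequency nu arrives
% in discrete "particles" (photons) of energy E
\[ E = h\nu \]
% de Broglie relation: conversely, a particle of momentum p
% behaves as a wave of wavelength lambda
\[ \lambda = \frac{h}{p} \]
```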

From Charlie’s standpoint, though, something was lost as these advancements and “discoveries” were made.  He believed in progress no doubt – the notion that civilization progresses ever forward and that there was an underlying “evolution” of sorts that had taken place with humanity over the last several thousand years – but he did believe that some social and/or theological intellectual rift had been created in the twentieth century, and that some sort of replacement was needed.  Without religion, the moral and ethical framework of society is left governed only by the rule of law – a powerful force no doubt, and perhaps grounded in an underlying sense of morality and ethics – but the personal foundation of morality and ethics had been crushed with the advent of science from Charlie’s perspective, plunging the world into conflict and materialism despite the economic progress and greater access to resources for mankind at large.  It wasn’t science’s fault per se, but from Charlie’s view it was left to the intellectual community at large to find a replacement for that which had been lost.  There was no longer any self-governing force of “do good to thy neighbor” that permeated society, no fellowship of the common man; what was left to shape our world seemed to be a “what’s in it for me” and a “let’s see what I can get away with” attitude, one that flooded the court systems of the West and fueled radical religious groups and terrorism itself, leading to more warfare and strife rather than the peace and prosperity which was supposed to be the promise of science – wasn’t it?  With the loss of God, his complete removal from the intellectual framework of Western society, there was a break in the knowledge and belief in the interconnectedness of humanity and societies at large, and Quantum Mechanics called this loss of faith in interconnectedness directly into question from Charlie’s perspective.  If everything was connected, entangled, at the subatomic level, and if this was a proven and scientifically verified fact, how could we not take the next logical step and ask what that meant for our world-view?  “That’s a philosophical problem” did not seem to be good enough for Charlie.
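
The “entanglement” Charlie has in mind here can be made concrete with a standard textbook example (again supplied for illustration, not taken from the original text): the Bell state, in which neither particle has a definite state of its own, yet a measurement on one immediately fixes the outcome statistics of the other, regardless of the distance between them:

```latex
% A Bell state: two particles (qubits) in a maximally entangled
% superposition; the pair has a definite joint state even though
% neither member has a definite individual state
\[ |\Phi^{+}\rangle = \frac{1}{\sqrt{2}} \bigl( |00\rangle + |11\rangle \bigr) \]
```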

Abandonment of religion for something more profound was a good thing no doubt, but what was it that people really believed in nowadays, in the Digital Era?  That things and people were fundamentally separate, that they were operated on by forces that determined their behavior, that the notion of God was for the ignorant and the weak, and that eventually all of the underlying behavior and reality could be described within the context of the same science which discovered Relativity and Quantum Mechanics?  Or worse, that these questions themselves were not of concern, that our main concern is the betterment of ourselves and our individual families, even if that meant those next to us would need to suffer for our gain?  Well, where did that leave us?  Where do ethics and morals fit into a world driven by greed and self-promotion?

To be fair, Charlie did see some movement toward some sort of more refined theological perspective toward the end of the twentieth century and into the 21st, as Yoga started to become more popular and some of the Eastern theo-philosophical traditions such as Tai Chi and Buddhism started to gain a foothold in the West, looked at perhaps as more rational belief systems than the religions of the West, which have been and remain such a source of conflict and disagreement throughout the world.  But the driving force for this adoption of Yoga in the West seemed to be more aligned with materialism and self-gain than with spiritual advancement and enlightenment.  Charlie didn’t see this Eastern perspective permeating into broader society; it wasn’t being taught in schools; the next generation, the Digital Generation, would be more materialistic than its predecessors; theology was relegated to the domain of religion, and in the West this wasn’t even fair game to teach in schools anymore.

The gap between science and religion that emerged as a byproduct of the Scientific Revolution remained significant; the last thing you were going to find was scientists messing around in the domain of religion, or even theology for that matter.  Metaphysics maybe, in terms of what the developments of science said about reality, but most certainly not theology, and definitely not God.  And so our creation myth was bereft of a creator – the Big Bang had no actors, simply primal nuclear and subatomic forces at work on particles that expanded and formed gases and planets and ultimately led to us, the thinking, rational human mind, capable of contemplating and discovering the laws of the universe and questioning our place in them, all a byproduct of natural selection; the guiding force was apparently random chance, time, and the genetic encoding of the will to survive as a species.

 

Quantum Physics

Perhaps quantum theory, quantum mechanics, could provide that bridge.  There are some very strange behaviors that have been witnessed and modeled (and proven by experiment) at the quantum scale, principles that defy the notions of space and time that were cemented at the beginning of the twentieth century by Einstein and others.  So Charlie dove into quantum mechanics to see what he could find and where it led.  For if there were gods or heroes in our culture today, they were the Einsteins, Bohrs, Heisenbergs and Hawkings of our time, those that defined our reality and determined what the next generation of minds were taught, those that broke open the mysteries of the universe with their minds and helped us better understand the world we live in.  Or did they?

From Charlie’s standpoint, Relativity Theory could be grasped intellectually by the educated, intelligent mind.  You didn’t need advanced degrees or a deep understanding of complex mathematics to understand that, at a very basic level, Relativity Theory implied that mass and energy were equivalent, related by the speed of light, which is fixed no matter what your frame of reference.  It implied that space and time were not in fact separate and distinct concepts, that our ideas of three-dimensional Cartesian space were inadequate for describing the world around us at the cosmic scale, and that these correlated concepts are more accurately grouped together in the notion of spacetime, which describes the motion and behavior of everything in the universe more accurately than the theorems devised by Newton at least.
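In modern notation, this mass-energy equivalence is captured in what is perhaps the most famous equation in physics:

E = mc²

where E is energy, m is mass, and c is the fixed speed of light.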

Relativity says that even gravity’s effect is subject to the same principles that play out at the cosmic scale, i.e. that spacetime “bends” around massive bodies, most dramatically at points of singularity (black holes for example), to the extent that even light is affected by the severe gravitational forces at these powerful places in the universe.  And indeed our measurements of time and space are “relative”, relative to the speed and frame of reference from which these measurements are made; the observer is in fact a key element in the process of measurement.  Although Relativity represented a major step forward in our scientific, and even metaphysical, approach, expanding our notions of how the universe around us could be described, it still left us with a deterministic and realist model of the universe.

But at their basic, core level, these concepts could be understood, grasped as it were, by the vast majority of the public, even if they had very little if any bearing on people’s daily lives and didn’t fundamentally change or shift their underlying religious or theological beliefs, or even their moral or ethical principles.  Relativity was accepted in the modern age; it just didn’t really affect the subjective frame of reference, the mental or intellectual frame of reference, within which the majority of humanity perceived the world around them.  It was relegated to the realm of physics, a problem for someone else to consider and, at best, a problem which needed to be understood to pass a physics or science exam in high school or college, to be buried in one’s consciousness in lieu of more pressing daily and life pursuits, be they family, career and money, or other forms of self-preservation in the modern, Digital era; an era most notably marked by materialism, self-promotion and greed.

Quantum Theory was different though.  Its laws were more subtle and complex than the world described by classical physics, the world described in painstaking mathematical precision by Newton, Einstein and others.  And after a lot of studying and research, the only conclusion that Charlie could definitively come to was that in order to understand quantum theory, or at least try to come to terms with it, a wholly different perspective on what reality truly was, or at the very least how reality was to be defined, was required.  In other words, in order to grasp what quantum theory actually means, or the underlying intellectual context within which the behaviors of the particles/fields that quantum theory describes were to be understood, a new framework of understanding, a new description of reality, must be adopted.  What was real, as understood by the classical physics which had dominated the minds of humankind for centuries, needed to be abandoned, or at the very least significantly modified, in order for quantum theory to be comprehended by any mind, or at least any mind that had spent time struggling with quantum theory and trying to grasp it.  Things would never be the same from a physics perspective, this much was clear; whether or not the daily lives of the bulk of those who struggle to survive in the civilized world would evolve in concert with these developments remained to be seen.

Quantum Mechanics, also known as quantum physics or simply Quantum Theory, is the branch of physics that deals with the behavior of particles and matter in the atomic and subatomic realms, the quantum realm so called given the quantized nature of “things” at this scale (more on this later).  So you have some sense of scale, an atom is 10⁻⁸ cm across give or take, and the nucleus, or center of an atom, which is made up of what we now call protons and neutrons, is approximately 10⁻¹² cm across.  An electron or a photon, the name we give for a “particle” of light, cannot truly be “measured” from a size perspective in terms of classical physics for many of the reasons we’ll get into below as we explore the boundaries of the quantum world, but suffice it to say that at present our best estimates of the size of an electron are in the range of 10⁻¹⁸ cm or so[2].

Whether or not electrons, or photons (particles of light) for that matter, really exist as particles whose physical size, and/or momentum can be actually “measured” is not as straightforward a question as it might appear and gets at some level to the heart of the problem we encounter when we attempt to apply the principles of “existence” or “reality” to the subatomic realm, or quantum realm, within the context of the semantic and intellectual framework established in classical physics that has evolved over the last three hundred years or so; namely as defined by independently existing, deterministic and quantifiable measurements of size, location, momentum, mass or velocity.

The word quantum comes from the Latin quantus, meaning “how much”, and it is used in this context to identify the behavior of subatomic things that move from and between discrete states rather than along a continuum of values or states as is presumed in classical physics.  The term itself had taken on meanings in several contexts within a broad range of scientific disciplines in the 19th and early 20th centuries, but the study of quanta was formalized and refined as a specific field of study, what became “quantum mechanics”, beginning with Max Planck at the turn of the 20th century, and this quantized behavior represents the prevailing and distinguishing characteristic of reality at this scale.

Newtonian physics, or even the extension of Newtonian physics “discovered” by Einstein with Relativity Theory at the beginning of the twentieth century (a theory whose accuracy is well established via experimentation at this point), assumes that particles – things made up of mass, energy and momentum – exist independent of the observer or their instruments of observation, are presumed to exist in continuous form, move along specific trajectories, and have properties (mass, velocity, etc.) that can only be changed by the action of some force by which these things or objects are affected.  This is the essence of Newtonian mechanics, upon which the majority of modern day physics, or at least the laws of physics that affect us here at the human scale, is defined, and philosophically it falls into the realm of realism and determinism.

The only caveat to this view that was put forth by Einstein is that these measurements themselves, of speed or even of the mass or energy content of a specific object, can only be said to be universally defined according to these physical laws within the specific frame of reference of an observer.  Their underlying reality is not questioned – these things clearly exist independent of observation or measurement, clearly (or so it seems) – but the values, or the properties, of these things are relative to the frame of reference of the observer.  This is what Relativity tells us.  So the velocity of a massive body, and even the measurement of time itself, which is a function of distance and speed, is a function of the relative speed and position of the observer who is performing said measurement.  For the most part, the effects of Relativity can be ignored when we are referring to objects on Earth that are moving at speeds that are minimal with respect to the speed of light and are far less massive than, say, black holes.  But as we measure things at the cosmic scale, where distances are measured in terms of light years and where black holes and other massive phenomena bend spacetime, aka singularities, the effects of Relativity cannot be ignored.[3]

Leaving aside the field of cosmology for the moment and getting back to the history of the development of quantum mechanics (which arguably is integrally related to cosmology at a basic level), at the end of the 19th century Planck was commissioned by electric companies to create light bulbs that used less energy, and in this context was trying to understand how the intensity of electromagnetic radiation emitted by a black body (an object that absorbs all electromagnetic radiation regardless of frequency or angle of incidence) depended on the frequency of the radiation, i.e. the color of the light.  In his work, and after several iterations of hypotheses that failed to have predictive value, he fell upon the theory that energy is only absorbed or released in quantized form, i.e. in discrete packets of energy he referred to as “bundles” or “energy elements”, the so-called Planck postulate.  And so the field of quantum mechanics was born.[4]
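In modern notation, the Planck postulate is typically written as

E = hν

where E is the energy of a single “bundle”, ν is the frequency of the radiation, and h is what we now call the Planck constant, approximately 6.626 × 10⁻³⁴ joule-seconds; energy exchange at this scale occurs only in whole multiples of hν.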

Despite the fact that Einstein is best known for his mathematical models and theories describing the forces of gravity and light at a cosmic scale, i.e. Relativity, his work was also instrumental in the advancement of quantum mechanics.  For example, in his work on the effect of radiation on metallic matter and non-metallic solids and liquids, he discovered that electrons are emitted from matter as a consequence of their absorption of energy from electromagnetic radiation of a very short wavelength, such as visible or ultraviolet radiation.  Einstein established that in certain experiments light appeared to behave like a stream of tiny particles that he called photons, not just a wave, lending more credence and authority to the particle theories describing the quantum realm.  He therefore hypothesized the existence of light quanta, or photons, as a result of these experiments, laying the groundwork for subsequent wave-particle duality discoveries and reinforcing the discoveries of Planck with respect to black body radiation and its quantized behavior.[5]
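This photoelectric effect, for which Einstein was awarded the Nobel Prize in 1921, obeys a simple quantized relation:

E = hν − W

where the maximum kinetic energy E of an emitted electron is the energy hν of a single absorbed photon less the “work function” W, the minimum energy required to liberate an electron from the particular material.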

 

Wave-Particle Duality and Wavefunction Collapse

Prior to the establishment of the wave-like characteristics of subatomic elements such as electrons by Louis de Broglie in the 1920s, it had been fairly well established that these subatomic constituents, electrons or photons as they were later called, behaved like particles.  The debate and study of the nature of light and subatomic matter, however, went all the way back to the 17th century, where competing theories of the nature of light were proposed by Isaac Newton, who viewed light as a system of particles, and Christiaan Huygens, who postulated that light behaved like a wave.  It was not until the work of Einstein, Planck, de Broglie and other physicists of the twentieth century that the nature of these subatomic entities, both light and electrons, was shown to be both particle-like and wave-like, the result dependent upon the experiment and the context of the system which was being observed.  This paradoxical principle, known as wave-particle duality, is one of the cornerstones, and underlying mysteries, of the reality described by Quantum Theory.

As part of the discoveries of subatomic wave-like behavior, what Planck discovered in his study of black body radiation (and Einstein as well within the context of his study of light and photons) was that the measurements or states of a given particle such as a photon or an electron had to take on values that were multiples of very small and discrete quantities, i.e. non-continuous, the relation of which was represented by a constant value known as the Planck constant[6].

In the quantum realm, then, there was not a continuum of values and states of matter as had been assumed in physics up until that time; there were bursts of energy and changes of state that were ultimately discrete and fixed, such that certain states and certain values simply could not exist, representing a dramatic departure from the way most of us think about movement and change in the “real world” and most certainly a significant departure from the Newtonian mechanics upon which Relativity was based.[7]

The classic demonstration of light’s behavior as a wave, and perhaps one of the most astonishing experiments of all time, is illustrated in what is called the double-slit experiment[8].  In the basic version of this experiment, a light source such as a laser beam is shone at a thin plate that is pierced by two parallel slits.  The light in turn passes through each of the slits and is displayed on a screen behind the plate.  The image that appears on the screen is not one of two constant bands of light passing through each of the slits, as you might expect if the light were simply a particle or sets of particles; rather, the light displayed on the screen behind the double-slitted plate forms a pattern of light and dark bands, indicating that the light is behaving like a wave and is subject to interference, the strength of the light on the screen cancelling itself out or becoming stronger depending upon how the individual waves interfere with each other.  This is exactly akin to what we consider fundamental wavelike behavior, for example waves in water, which reinforce one another where their peaks align and cancel each other out where a peak meets a trough.
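For reference, in the idealized far-field case the brightness of these bands follows the standard two-slit interference relation

I(x) ∝ cos²(πdx / λL)

where x is the position on the screen, d the slit separation, λ the wavelength of the light, and L the distance from the slits to the screen; bright bands appear where the cosine term peaks and dark bands where it vanishes.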

What is even more interesting, and was most certainly unexpected, is that once equipment was developed that could reliably send a single particle (electron or photon, for example; the behavior was the same) through the double-slitted plate, each of these particles did end up at a single location on the screen after passing through a slit, as was expected, but the location on the screen, as well as which slit the particle appeared to pass through (in later versions of the experiment which slit “it” passed through could in fact be detected), seemed to be somewhat random.  What researchers found was that as more and more of these subatomic particles were sent through the plate one at a time, the same wave-like interference pattern emerged that showed up when the experiment was run with a full beam of light, as had been done by Young some hundred years prior.

So hold on for a second.  Charlie had gone over this again and again, and according to all the literature he read on quantum theory and quantum mechanics, it all pretty much said the same thing, namely that the heart of the mystery of quantum mechanics could be seen in this very simple experiment.  And yet it was really hard, perhaps impossible, to understand what was actually going on, or at least to understand it without abandoning some of the very foundational principles of physics – like, for example, that these things called subatomic particles actually existed, because they seemed to behave like waves.  Or did they?

What was clear was that this subatomic particle, corpuscle or whatever you wanted to call it, did not have a linear and fully deterministic trajectory in the classical physics sense; this much was very clear from the fact that the distribution on the screen, when the particles were sent through the double slits individually, appeared to be random.  But what was more odd was that when the experiment was run one corpuscle at a time, not only was the final location on the screen seemingly random for each individual particle, but the same pattern emerged after many, many single experiment runs as when a full wave, or set of these corpuscles, was sent through the double slits.  So not only did the individual photon seem to be aware of the final wave-like pattern of its parent wave, but this corpuscle appeared to be interfering with itself when it went through the two slits individually.  What?  What the heck was going on here?

Furthermore, to make things even more mysterious, as the final locations of the individual photons in the two-slit and other related experiments were evaluated and studied, it was discovered that although the final location of an individual particle could not be determined exactly before the experiment was performed – there was a fundamental element of uncertainty or randomness involved at the individual corpuscle level that could not be escaped – the final locations of these particles measured in toto, after many experiments were performed, exhibited statistical behavior that could be modeled quite precisely from a mathematical statistics and probability distribution perspective.  That is to say, the sum total distribution of the final locations of all the particles after passing through the slit(s) could be established stochastically, i.e. in terms of a well-defined probability distribution consistent with probability theory and the well-defined mathematics that governs statistical behavior.  So in total you could predict in some sense what the behavior would look like over a large distribution set even if you couldn’t predict what the outcome would be for an individual corpuscle.
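As a minimal illustration of this stochastic-yet-patterned behavior, the following Python sketch (a toy model, not a description of any particular experiment; all parameter values are illustrative) draws individual detection positions at random from the standard two-slit probability distribution and shows that the fringe pattern only emerges in the aggregate:

    import numpy as np

    rng = np.random.default_rng(0)

    wavelength = 500e-9   # 500 nm light (illustrative value)
    d = 50e-6             # slit separation (illustrative value)
    L = 1.0               # distance from slits to screen, in meters
    a = 10e-6             # slit width (illustrative value)
    x = np.linspace(-0.02, 0.02, 2000)   # positions on the screen, in meters

    # Two-slit interference term modulated by a single-slit diffraction envelope;
    # np.sinc(t) computes sin(pi*t)/(pi*t), so we divide the argument by pi.
    beta = np.pi * a * x / (wavelength * L)
    envelope = np.sinc(beta / np.pi) ** 2
    p = np.cos(np.pi * d * x / (wavelength * L)) ** 2 * envelope
    p /= p.sum()          # normalize into a discrete probability distribution

    # Send "particles" through one at a time: each detection is a single random
    # draw, yet the histogram of many draws reproduces the interference fringes.
    for n in (10, 1000, 100_000):
        hits = rng.choice(x, size=n, p=p)
        counts, _ = np.histogram(hits, bins=50)
        contrast = float((counts.max() - counts.min()) / counts.max())
        print(n, "particles -> fringe contrast:", round(contrast, 2))

Each draw is random, just as each individual corpuscle lands at an unpredictable spot, yet the accumulated histogram converges on the very interference fringes that the wave description predicts.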

The mathematics behind this particle distribution is what is known as the wave function, typically denoted by the Greek letter psi, ψ, or its capital equivalent Ψ, which predicts what the probability distribution of these “particles” will look like on the screen behind the plate over a given period of time after many individual experiments are run; in quantum theoretical terms, the wavefunction predicts the quantum state of a particle throughout a fixed spacetime interval.  This very foundational and groundbreaking equation was discovered by the Austrian physicist Erwin Schrödinger in 1925, published in 1926, and is commonly referred to in the scientific literature as the Schrödinger equation, analogous in the field of quantum mechanics to Newton’s second law of motion in classical physics.
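In its general, time-dependent form the Schrödinger equation is written

iħ ∂Ψ/∂t = ĤΨ

where Ψ is the wavefunction of the system, ħ is the reduced Planck constant (h/2π), and Ĥ is the Hamiltonian, the operator corresponding to the total energy of the system; given an initial state, the equation determines how the wavefunction evolves deterministically in time.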

With the discovery of the wave function, or wavefunction, it now became possible to predict the potential locations or states of motion of these subatomic particles, an extremely potent theoretical model that has led to all sorts of inventions and technological advancements in the twentieth century and beyond.  This wavefunction represents a probability distribution of potential states or outcomes that describes the quantum state of a particle and predicts with a great degree of accuracy the probability of finding the particle in a given location or state of motion.
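The link between the wavefunction and these observed frequencies is what is now called the Born rule, after Max Born, whose statistical interpretation of the wavefunction is discussed further below:

P(x) = |Ψ(x)|²

that is, the probability of finding the particle at a position x is given by the squared magnitude of the wavefunction at that point.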

Again, this implied that individual corpuscles were interfering with themselves when passing through the two slits on the plate, which was very odd indeed.  In other words, the individual particles were exhibiting wave-like characteristics even when they were sent through the double-slitted plate one at a time.  This phenomenon was shown to occur with atoms as well as electrons and photons, confirming that all of these subatomic so-called particles exhibit wave-like properties alongside their particle-like qualities, the behavior observed depending upon the type of experiment, or measurement as it were, that the “thing” was subjected to.

As Louis de Broglie, the physicist responsible for bridging the theoretical gap between the study of corpuscles (particles, matter or atoms) and waves by establishing the symmetric relation between momentum and wavelength which had at its core Planck’s constant, i.e. the de Broglie equation, described this mysterious and somewhat counterintuitive relationship between wave-like and particle-like behavior:

A wave must be associated with each corpuscle and only the study of the wave’s propagation will yield information to us on the successive positions of the corpuscle in space[9].
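The de Broglie relation referred to above is strikingly simple:

λ = h/p

where λ is the wavelength associated with a corpuscle, p is its momentum, and h is again the Planck constant; every piece of matter, in other words, has a wavelength, though for macroscopic objects it is far too small to be observed.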

So by the 1920s then, you have a fairly well established mathematical theory governing the behavior of subatomic particles, backed by a large body of empirical and experimental evidence, which indicates quite clearly that what we would call “matter” (or particles or corpuscles) in the classical sense behaves very differently, or at least has very different fundamental characteristics, in the subatomic realm.  It exhibits properties of a particle, or a thing or object, as well as of a wave, depending upon the type of experiment that is run.  So the concept of matter itself, as we had been accustomed to dealing with and discussing and measuring it for some centuries, at least as far back as the time of Newton (1642-1727), had to be reexamined within the context of quantum mechanics.  For in Newtonian physics, and indeed in the geometric and mathematical framework within which it was developed and conceived, which went back to ancient times (Euclid, c. 300 BCE), matter was presumed to be either a particle or a wave, but most certainly not both.

What further complicated matters was that matter itself, again as defined by Newtonian mechanics and its extension via Relativity Theory – taken together, what is commonly referred to as classical physics – was presumed to have some very definite, well-defined and fixed, real properties.  Properties like mass, location or position in space, and velocity or trajectory were all presumed to have a real existence independent of whether or not they were measured or observed, even if the actual values were relative to the frame of reference of the observer.  All of this hinged upon the notion that the speed of light was fixed no matter what the frame of reference of the observer; this was a fixed absolute, and nothing could move faster than the speed of light.  Well, even this seemingly self-evident notion, or postulate one might call it, ran into problems as scientists continued to explore the quantum realm.

So by the 1920s, the way scientists viewed matter as we would classically consider it, within the context of Newton’s postulates from the early 1700s as extended into the notion of spacetime put forth by Einstein, was encountering some significant difficulties when applied to the behavior of elements in the subatomic, quantum world.  Furthermore, there was extensive empirical and scientific evidence which lent significant credibility to quantum theory, illustrating irrefutably that these subatomic elements behaved not only like waves, exhibiting characteristics such as interference and diffraction, but also like particles in the classic Newtonian sense, with measurable, well-defined characteristics that could be quantified within the context of an experiment.

In his Nobel Lecture in 1929, Louis de Broglie summed up the challenge for physicists of his day, and to a large extent physicists of modern times, given the discoveries of quantum mechanics, as follows:

The necessity of assuming for light two contradictory theories – that of waves and that of corpuscles – and the inability to understand why, among the infinity of motions which an electron ought to be able to have in the atom according to classical concepts, only certain ones were possible: such were the enigmas confronting physicists at the time…[10]

 

Uncertainty, Entanglement, and the Cat in a Box

The other major tenet of quantum theory that rests alongside wave-particle duality, and that provides even more complexity when trying to wrap our minds around what is actually going on in the subatomic realm, is what is sometimes referred to as the uncertainty principle, or the Heisenberg uncertainty principle, named after the German theoretical physicist Werner Heisenberg, who first put forth the principle within the context of the probabilistic models governing the position of subatomic particles in experiments such as the double-slit experiment previously described, even though the wave function itself was the discovery of Schrödinger.

The uncertainty principle states that there is a fundamental limit on the accuracy with which certain pairs of physical properties of atomic particles, position and momentum being the classic pair, can be known at any given time.  Physical quantities come in conjugate pairs, of which only one member can be known precisely at any given time; when one quantity in a conjugate pair is measured and becomes determined, its complementary partner becomes indeterminate.  What Heisenberg discovered, and proved, was that the more precisely one attempts to measure one of these complementary properties of a subatomic particle, the less precisely the other, associated complementary attribute can be determined or known.
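For the canonical pair of position and momentum, the principle takes the quantitative form

Δx · Δp ≥ ħ/2

where Δx and Δp are the uncertainties (standard deviations) in position and momentum respectively and ħ is the reduced Planck constant; shrinking one uncertainty necessarily inflates the other.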

Published by Heisenberg in 1927, the uncertainty principle asserts that there are fundamental, conceptual limits of observation in the quantum realm, another radical departure from the principles of Newtonian mechanics, which held that all attributes of a thing were measurable at any given time, i.e. existed or were real.  The uncertainty principle is a statement about a fundamental property of quantum systems as they are mathematically and theoretically modeled and defined, and of course empirically validated by experimental results, not a statement about the technology and method of the observational systems themselves.  This is an important point.  This wasn’t a practical problem with the state of instrumentation that was being used for measurement; it was a characteristic of the domain itself.

Max Born, who won the Nobel Prize in Physics in 1954 for his work in quantum mechanics, specifically for his statistical interpretation of the wave function, describes this other seemingly mysterious attribute of the quantum realm as follows (the specific language he uses reveals at some level his interpretation of quantum theory; more on interpretations later):

…To measure space coordinates and instants of time, rigid measuring rods and clocks are required.  On the other hand, to measure momenta and energies, devices are necessary with movable parts to absorb the impact of the test object and to indicate the size of its momentum.  Paying regard to the fact that quantum mechanics is competent for dealing with the interaction of object and apparatus, it is seen that no arrangement is possible that will fulfill both requirements simultaneously.[11]

Whereas classical physicists – physics prior to the introduction of Relativity and quantum theory – distinguished between the study of particles and the study of waves, the introduction of quantum theory and wave-particle duality established that this classic intellectual bifurcation of physics at the macroscopic scale was wholly inadequate for describing and predicting the behavior of these “things” that exist in the subatomic realm, all of which take on the characteristics of both waves and particles depending upon the experiment and the context of the system being observed.  Furthermore, the precision with which the state of a “thing” in the subatomic world could be defined was conceptually limited, establishing a limit to the definition of any given subatomic state, another divergence from classical physics.  And then on top of this was the requirement of the mathematical principles of statistics and probability theory, as well as significant extensions to the underlying geometry, to describe and model the behavior at this scale, all calling into question the classical materialistic notions and beliefs that we had held so dear for centuries.

Even after the continued refinement of and experimental evidence supporting Quantum Theory, however, there did arise some significant resistance to the completeness of the theory itself, or at least questions as to its true implications with respect to Relativity and Newtonian mechanics.  The most notable of these criticisms came from Einstein himself, most famously encapsulated in a paper he co-authored with two of his colleagues, Boris Podolsky and Nathan Rosen, published in 1935, which came to be known simply as the EPR paper, or the EPR paradox, and which called attention to what they saw as the underlying inconsistencies of the theory that still required explanation.  In this paper they extended some of the quantum theoretical models to different thought experiments/scenarios to yield what they considered to be at the very least improbable, if not impossible, conclusions.

They postulated that given the formulas and mathematical models that described the current state of quantum theory, i.e. the description of a wave function that governs the probabilistic outcomes for a given subatomic system, if such a system were transformed into two systems – split apart if you will – by definition both systems would then be governed by the same wave function, and their subsequent behavior and states would be related no matter what their separation in spacetime, violating one of the core tenets of classical physics, namely the prohibition of communication faster than the speed of light.  This was held to be mathematically true and consistent with quantum theory, although at the time it could not be validated via experiment.

They went on to show that if this is true, it implies that if you have a single-particle system that is split into two separate particles and subsequently measured, these two now separate and distinct particles would be governed by the same wave function, and in turn by the same uncertainty principles outlined by Heisenberg; namely that a definite measurement of a particle in system A will fix, or “correlate” with, the corresponding conjugate value in system B, even if the two systems have no classical physical contact with each other and are light years apart.
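The standard textbook illustration of such a state – following David Bohm’s later recasting of the EPR scenario in terms of spin rather than the original position and momentum variables – is the two-particle “singlet” state:

|ψ⟩ = ( |↑⟩A |↓⟩B − |↓⟩A |↑⟩B ) / √2

where the subscripts label particles A and B.  Neither particle has a definite spin on its own, yet a measurement finding particle A spin-up along some axis implies, immediately, that particle B will be found spin-down along that same axis, however far apart the two may be.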

But hold on a second, how could this be possible?  How could you have two separate “systems”, governed by the same wave function, or behavioral equation so to speak, such that no matter how far apart they were, or no matter how much time elapsed between measurements, a measurement in one system fundamentally correlated with (or anti-correlated with, the argument is the same) a measurement in the other system from which it is separate?  They had basically taken the wave function theory, which governs the behavior of quantized particles, and its corresponding implication of uncertainty as outlined most notably by Heisenberg, and extended it to multiple, associated and related subatomic systems, related and governed by the same wave function despite their separation in space (and time), yielding a very awkward and somewhat unexplainable result, at least unexplainable in terms of classical physics.

The question they raised boiled down to this: how could you have two unrelated, distant systems whose measurements or underlying structure depended upon each other in a very well-defined and mathematically (and, theoretically at the time but subsequently verified via experiment, empirically) measurable way?  Does that imply that these systems are communicating in some way, either explicitly or implicitly?  If so, that would seem to call into question the principle of the fixed speed of light that was core to Relativity Theory.  The other alternative seemed to be that the theory was incomplete in some way, which was Einstein’s view.  Were there “hidden”, yet to be discovered variables that governed the behavior of quantum systems, what came to be known in the literature as hidden variable theories?

If it were true, and in the past half century or so many experiments have verified this, it is at the very least extremely odd behavior, or perhaps better put reflects very odd characteristics, certainly inconsistent with prevailing theories of physics – or at least characteristics we had not come to expect in the descriptions of “reality” we had grown accustomed to.  Are these two subsystems, once correlated, communicating with each other?  Is there some information being passed between them that violates the speed of light boundary that forms the cornerstone of modern, classical physics?  This seems unlikely, and most certainly is something that Einstein felt uncomfortable with.  This “spooky action at a distance”, as Einstein referred to it, seemed literally to defy the laws of physics.  But the alternative appeared to be that this notion of what we consider to be “real”, at least as it was classically defined, would need to be modified in some way to take into account this correlated behavior between particles or systems that were physically separated beyond classical boundaries.

From Einstein’s perspective, two possible explanations for this behavior presented themselves: 1) either there existed some model of the behavior of the interacting systems/particles that was still yet undiscovered, so-called hidden variables, or 2) the notion of locality, or perhaps more aptly the tenet of local determinism (which Einstein and others associated directly and unequivocally with reality), which underpinned all of classical physics, had to be drastically modified if not completely abandoned.

In Einstein’s words, however, the language for the first alternative that he seemed to prefer was not that there were hidden variables per se, but rather that quantum theory as it stood in the first half of the twentieth century was incomplete.  That is to say, some variable, coefficient or hidden force was missing from quantum theory, a missing piece which was the driving force behind the correlated behavior of the attributes of these particles that were physically separated beyond any classical means of communication.  For Einstein it was this incompleteness explanation that he preferred, unwilling as he was to consider the idea that the notion of locality was not absolute.  Ironically enough, hindsight being twenty-twenty and all, Einstein had just postulated that there was no such thing as absolute truth, or absolute reality, on the macroscopic and cosmic physical plane with Relativity Theory, so one might think that he would have been more open to relaxing this requirement in the quantum realm, but apparently not, speaking to the complexities and subtleties of quantum theory’s implications even for some of the greatest minds of the time.

Probably the most widely known metaphor illustrating Einstein and others’ criticism of quantum theory is the thought experiment, or paradox as it is sometimes referred to, called Schrödinger’s cat, or the Schrödinger’s cat paradox.[12]  In this thought experiment, which according to tradition emerged out of discussions between Schrödinger and Einstein just after the EPR paper was published, a cat is placed in a fully sealed and enclosed box with a radioactive source subject to a certain measurable and quantifiable rate of decay.  In the box with the cat is an internal radioactivity monitor which detects the emission of a radioactive particle within the box (the number of radioactive particles detected being at most one), and a flask of poison that is shattered if the monitor is triggered.  According to quantum theory, which governs the rate of decay with some random probability distribution over time, it is impossible to say at any given moment, until the box is opened in fact, whether the cat is dead or alive.  But how could this be?  The cat is in an undefined state until the box is opened?  There is nothing definitive that we can say about the state of the cat independent of actually opening the box?  This calls into question, bringing the analogy to the macroscopic level, whether or not according to quantum theory reality can be defined independent of observation (or measurement), within the context of the cat, the box and the radioactive particle and its associated monitor.
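In the quantum mechanical notation introduced above, and taking the simplest equal-odds case purely for illustration, the state of the whole system before the box is opened would be written as the superposition

|Ψ⟩ = ( |no decay⟩ |alive⟩ + |decay⟩ |dead⟩ ) / √2

in which neither “alive” nor “dead” is the state of the cat; only the act of opening the box, on the standard interpretation, resolves the system into one branch or the other.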

In the course of developing this thought experiment, Schrödinger coined the term entanglement[13], one of the great still-unsolved mysteries, or perhaps better called paradoxes, that exist to this day in quantum theory/mechanics.  Mysterious not in the sense of whether or not the principle actually exists – entanglement has been verified in a variety of physical experiments along the lines outlined in the EPR paper and illustrated in the cat paradox, and is accepted as scientific fact in the physics community – but a mystery in the sense of how this can be possible, given that it seems, at least on the face of it, to fly in the face of classical Newtonian mechanics, almost of determinism itself actually.  Schrödinger himself is probably the best person to turn to in order to understand quantum entanglement, and he describes it as follows:

When two systems, of which we know the states by their respective representatives, enter into temporary physical interaction due to known forces between them, and when after a time of mutual influence the systems separate again, then they can no longer be described in the same way as before, viz. by endowing each of them with a representative of its own. I would not call that one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought. By the interaction the two representatives [the quantum states] have become entangled.[14]

The principle of entanglement calls into question what is known as local realism; “local” in the sense that all the behaviors and data of a given system are determined by the qualities or attributes of those objects within that given system, bounded by spacetime as defined by Newtonian mechanics and Relativity, or by some force that is acting upon said system, and “real” in the sense that the system itself exists independent of observation or the apparatus/elements of observation.

Taking the non-local explanation to the extreme – something which has prompted quite a bit of what can reasonably be called hysterical reaction in some academic and pseudo-academic communities even to this day – the argument runs that if two entities separated far enough in spacetime that the speed of light boundary could not be crossed do indeed hold a distinct and mathematically predictable correlation, i.e. this notion of entanglement, or “action at a distance” as it is sometimes called, then all of classical physics is called into question.  Einstein specifically called out these “spooky action at a distance” theories as defunct, so firmly did he believe in the invariable tenets of Relativity, and it’s hard to argue with his position quite frankly, because correlation does not necessarily imply communication.  But if local realism and its underlying tenets of determinism are to be held fast to, then where does that leave quantum theory?

This problem became somewhat more crystallized, or well defined, in 1964, when the physicist John Stewart Bell (1928-1990), in his seminal paper entitled “On the Einstein Podolsky Rosen Paradox”, took the EPR argument one step further and asserted, proving mathematically via a reductio ad absurdum argument, that if quantum theory is true, then in fact no hidden parameter or variable theory could possibly exist that reproduces all of the predictions of quantum mechanics and is also consistent with locality[15].  In other words, Bell asserted that the hidden variable hypothesis, or at the very least a broad category of hidden variable hypotheses, was incompatible with quantum theory itself, unless the notion of locality was abandoned or at least relaxed to some extent.  In his own words:

In a theory in which parameters are added to quantum mechanics to determine the results of individual measurements, without changing the statistical predictions, there must be a mechanism whereby the setting of one measuring device can influence the reading of another instrument, however remote. Moreover, the signal involved must propagate instantaneously, so that a theory could not be Lorentz invariant.[16]

This assertion is called Bell’s theorem, and it posits that quantum mechanics and the concept of locality – which again states that an object is influenced directly only by its immediate surroundings, and which is a cornerstone of the theories of Newton and Einstein regarding the behavior of matter and the objective world – are mathematically incompatible and inconsistent with each other, providing further impetus, as it were, to the view that this classical notion of locality was in need of closer inspection, modification, or perhaps even outright abandonment.
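The quantitative gap that Bell-type arguments expose can be seen in a few lines of arithmetic.  The following Python sketch (an illustration of the standard CHSH form of Bell’s inequality, not of any particular experiment) compares the quantum mechanical correlation for the singlet state, E(a, b) = −cos(a − b), against the bound |S| ≤ 2 that any local hidden variable theory must satisfy:

    import numpy as np

    # CHSH combination of correlations: any local hidden variable theory
    # satisfies |S| <= 2, while quantum mechanics predicts up to 2*sqrt(2).

    def E(a, b):
        # Quantum correlation of spin measurements along angles a and b
        # for two particles in the singlet (entangled) state.
        return -np.cos(a - b)

    # Standard measurement angles that maximize the quantum violation.
    a, a_prime = 0.0, np.pi / 2
    b, b_prime = np.pi / 4, 3 * np.pi / 4

    S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

    print("CHSH S =", S)                           # ~ -2.828, i.e. |S| = 2*sqrt(2)
    print("Violates classical bound of 2:", abs(S) > 2)

That |S| can reach 2√2 quantum mechanically is precisely why no local theory of the kind Bell describes can reproduce the quantum predictions.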

Although there still exists some debate among physicists as to whether or not there is enough experimental evidence to prove out Bell’s theorem beyond a shadow of a doubt, it seems to be broadly accepted in the scientific community that this property of entanglement exists beyond classical physical boundaries.  However, the question as to whether or not all types of hidden variable theories are ruled out by Bell’s theorem appears to be a legitimate one and is still up for debate, and perhaps it is this loophole, more so than any other, that provides the path which Bohm and Hiley take with their Causal, or Ontological, Interpretation of Quantum Theory (more below).

Criticisms of Bell’s theorem and the related experiments aside however, if you believe quantum theory, and you’d be hard pressed not to at this point, you must conclude that the theory violates and is inconsistent with Relativity in some way, a rather disconcerting and problematic conclusion for the twentieth century physicist to say the least and a problem which plagues, and motivates, many modern theoretical physicists to this day.

Quantum Theory then, as expressed in Bell’s theorem, Heisenberg’s uncertainty principle and this idea of entanglement, asserts that there exists a level of interconnectedness between physically disparate systems that defies, at least at some level, the classical physics notion of deterministic locality, pointing either to the incompleteness of quantum theory or to the requirement of some sort of non-trivial modification of the concept of local realism which has underpinned classical physics for the last few centuries if not longer.

In other words, the implication of quantum theory, a theory with very strong predictive and experimental evidence backing up the soundness and strength of the underlying mathematics, is that something else is at work connecting the states of particles or things at the subatomic scale which cannot be altogether described, pinpointed, or explained.  Einstein himself was still struggling with this notion toward the end of his life, writing in 1954:

…The following idea characterizes the relative independence of objects far apart in space, A and B: external influence on A has no direct influence on B; this is known as the Principle of Local Action, which is used consistently only in field theory. If this axiom were to be completely abolished, the idea of the existence of quasi enclosed systems, and thereby the postulation of laws which can be checked empirically in the accepted sense, would become impossible….[17]

 

Interpretations of Quantum Theory: Back to First Philosophy

There is no question as to the soundness of the mathematics behind quantum theory, and there is now a very large body of experimental evidence that supports the underlying mathematics, including empirical evidence not only of the particle behavior that it intends to describe (as in the two-slit experiment for example), but also experimental evidence that validates Bell’s theorem and the EPR paradox.  What is somewhat less clear, however, and what arguably belongs more to the world of metaphysics and philosophy than to physics, is how quantum theory is to be interpreted as a representation of reality given the state of affairs that it introduces.  What does quantum theory tell us about the world we live in, irrespective of the soundness of its predictive power?  This is a question that physicists, philosophers and even theologians have struggled with ever since the theory gained wide acceptance and prominence in the scientific community in the 1930s.

There are many interpretations of quantum theory but there are three in particular that Charlie thought deserved attention due primarily to a) their prevalence or acceptance in the academic community, and/or b) their impact on scientific or philosophical inquiry into the limits of quantum theory.

These are: the Copenhagen Interpretation, the standard, orthodox interpretation of quantum theory and the one against which differing interpretations are most often compared, which confines the theoretical boundaries of interpretation to the experiment itself; the Many-Worlds (or Many-Minds) interpretation, which explores the boundaries of the nature of reality, proposing in some extreme variants the simultaneous existence of multiple universes/realities; and the Causal Interpretation, also sometimes called de Broglie-Bohm theory or Bohmian mechanics, which extends the theory to include the notion of quantum potential and at the same time abandons the classical notion of locality while still preserving objective realism and determinism.[18]

The most well established and most commonly accepted interpretation of Quantum Theory, the one that is most often taught in schools and textbooks and the one that most alternative interpretations are compared against, is the Copenhagen Interpretation[19]. The Copenhagen interpretation holds that the theories of quantum mechanics do not yield a description of an objective reality, but deal only with sets of probabilistic outcomes of experimental values borne from experiments observing or measuring various aspects of energy quanta, entities that do not fit neatly into classical interpretations of mechanics.  The underlying tenet here is that the act of measurement itself, the observer (or by extension the apparatus of observation) causes the set of probabilistic outcomes to converge on a single outcome, a feature of quantum mechanics commonly referred to as wavefunction collapse, and that any additional interpretation of what might actually be going on, i.e. the underlying reality, defies explanation and the interpretation of which is in fact inconsistent with the fundamental mathematical tenets of the theory itself.

In this interpretation of quantum theory, reality (used here in the classical sense of the term as existing independent of the observer) is a function of the experiment, and is defined as a result of the act of observation and has no meaning independent of measurement.  In other words, reality in the quantum world from this point of view does not exist independent of observation, or put somewhat differently, the manifestation of what we think of or define as “real” is intrinsically tied to and related to the act of observation of the system itself.

Niels Bohr was one of the strongest proponents of this interpretation, an interpretation which refuses to associate any metaphysical implications with the underlying physics.  He held that given this proven interdependence between that which was being observed and the act of observation, no metaphysical interpretation can in fact be extrapolated from the theory; it is, and can only be, a tool to describe and measure states and particle/wave behavior in the subatomic realm as made manifest through some well-defined experiment – i.e. attempting to make some determination as to what quantum theory actually meant violated the fundamental tenets of the theory itself.  From Bohr’s perspective, the inability to draw conclusions beyond the results of the experiments which the theory covers was a necessary consequence of the theory’s basic tenets, and that was the end of the matter.  This view can be seen as the logical conclusion of the notion of complementarity, one of the fundamental and intrinsic features of quantum mechanics that makes it so mysterious and hard to describe or understand in classical terms.

Complementarity, which is closely tied to the Copenhagen interpretation, expresses the notion that in the quantum domain the results of experiments, the values yielded (or observables), are fundamentally tied to the act of measurement itself, and that in order to obtain a complete picture of the state of any given system, as bound by the uncertainty principle, one would need to run multiple experiments across that system, each result in turn rounding out the notion of the state, or reality, of said system.  These combined features of the theory said something profound about the underlying uncertainty of the theory itself.  Perhaps complementarity can be viewed as the twin of uncertainty, or its inverse postulate.  Bohr summarized this very subtle and yet at the same time very profound notion of complementarity in 1949 as follows:

…however far the [quantum physical] phenomena transcend the scope of classical physical explanation, the account of all evidence must be expressed in classical terms. The argument is simply that by the word “experiment” we refer to a situation where we can tell others what we have learned and that, therefore, the account of the experimental arrangements and of the results of the observations must be expressed in unambiguous language with suitable application of the terminology of classical physics.

This crucial point…implies the impossibility of any sharp separation between the behavior of atomic objects and the interaction with the measuring instruments which serve to define the conditions under which the phenomena appear…. Consequently, evidence obtained under different experimental conditions cannot be comprehended within a single picture, but must be regarded as complementary in the sense that only the totality of the phenomena exhausts the possible information about the objects.[20]

Complementarity was in fact the core underlying principle which drove the existence of the uncertainty principle from Bohr’s perspective; it was the underlying characteristic and property of the quantum world that captured at some level its very essence.  And complementarity, taken to its logical and theoretical limits, did not allow for or provide any framework for describing, or any definition of, the real world outside of the domain with which it dealt, namely the measurement values or results, the measurement instruments themselves, and the act of measurement itself.

Another interpretation, or possible question to be asked given the uncertainty implicit in Quantum Theory, was that perhaps all possible outcomes as described in the wave function did in some respect manifest, even if they all could not be seen or perceived in our objective reality.  This premise underlies an interpretation of quantum theory that has gained some prominence in the last few decades, especially within the computer science and computational complexity fields, and has come to be known as the Many-Worlds interpretation.

The original formulation of this theory was laid out by Hugh Everett in his PhD thesis in 1957, in a paper entitled The Theory of the Universal Wave Function, wherein he referred to the interpretation not as the many-worlds interpretation but as the Relative-State formulation of Quantum Mechanics (more on this distinction below); the theory was subsequently developed and expanded upon by several authors, and the term many-worlds sort of stuck.[21]

In Everett’s original exposition of the theory, he begins by calling out some of the problems with the original, or classic, interpretation of quantum mechanics: specifically, what he and other members of the physics community believed to be the artificial creation of the wavefunction collapse construct to explain the transition from uncertain quantum behavior to deterministic observed outcomes, as well as the difficulty this interpretation had in dealing with systems that consisted of more than one observer.  These were the main drivers for an alternative viewpoint on the interpretation of quantum theory, or what he referred to as a metatheory, given that the standard interpretation could be derived from it.

Although Bohr, and presumably Heisenberg and von Neumann, whose collective views on the interpretation of quantum theory make up what is now commonly referred to as the Copenhagen Interpretation, would no doubt explain away these seemingly contradictory and inconsistent problems as out of scope of the theory itself (i.e. quantum theory is a theory that is intellectually and epistemologically bound by the experimental apparatus and its results, which provide the scope of the underlying mechanics), Everett finds this view lacking, as it fundamentally prevents us from any true explanation of what the theory says about “reality”, or the real world as it were – a world considered to be governed by the laws of classical physics, where things and objects exist independent of observers and have real, static, measurable and definable qualities, a world fundamentally incompatible with the stochastic and uncertain characteristics that govern the behavior of “things” in the subatomic or quantum realm.

The aim is not to deny or contradict the conventional formulation of quantum theory, which has demonstrated its usefulness in an overwhelming variety of problems, but rather to supply a new, more general and complete formulation, from which the conventional interpretation can be deduced.[22]

Everett starts by making the following basic assumptions, from which he devises his somewhat counterintuitive but now relatively widely accepted interpretation of quantum theory: 1) all physical systems, large or small, can be described as states within Hilbert space, the fundamental geometric framework upon which quantum mechanics is constructed; 2) the concept of an observer can be abstracted to a machine-like entity with access to unlimited memory, which stores a history of previous states, or previous observations, and has the ability to make deductions, or associations, regarding actions and behavior solely based upon this memory and this simple deductive process, thereby incorporating observers and acts of observation (i.e. measurement) completely into the model; and 3) with assumptions 1 and 2, the entire state of the universe, which includes the observers within it, can be described in a consistent, coherent and fully deterministic fashion without the need for the notion of wavefunction collapse, or any additional assumptions for that matter.

Everett makes what he calls a simplifying assumption to quantum theory, i.e. removing the need for or notion of wavefunction collapse, and assumes the existence of a universal wave function which accounts for and describes the behavior of all physical systems and their interaction in the universe, absorbing the observer and the act of observation into the model, observers being simply another form of a quantum state that interacts with the environment.  Once these assumptions are made, he can then abstract the concept of measurement as just interactions between quantum systems all governed by this same universal wave function.  In Everett’s metatheory, the notion of what an observer means and how they fit into the overall model are fully defined, and the challenge stemming from the seemingly arbitrary notion of wavefunction collapse is resolved.

In Everett’s view, there exists a universal wavefunction which corresponds to an objective, deterministic reality, and the notion of wavefunction collapse as put forth by von Neumann (and reflected in the standard interpretation of quantum mechanics) represents not a collapse so to speak, but rather the manifestation, or realization, of one possible outcome of measurement that exists in our “reality”, or multi-verse.

But from Everett’s perspective, if you take what can be described as a literal interpretation of the wavefunction as the overarching description of reality, this implies that the rest of the possible states reflected in the wave function of that system do not cease to exist with the act of observation – with the collapse of the quantum mechanical wave that describes said system’s state, in Copenhagen nomenclature – but that these other states have some existence that persists and are simply not perceived by us.  In his own words, and this is a subtle yet important distinction between Everett’s view and the view of subsequent proponents of the many-worlds interpretation, they remain uncorrelated with the observer and therefore do not exist in our manifest reality.

We now consider the question of measurement in quantum mechanics, which we desire to treat as a natural process within the theory of pure wave mechanics. From our point of view there is no fundamental distinction between “measuring apparata” and other physical systems. For us, therefore, a measurement is simply a special case of interaction between physical systems – an interaction which has the property of correlating a quantity in one subsystem with a quantity in another.[23]

This implies of course that these unperceived states have some semblance of reality, that they do in fact exist as possible realities, realities that are thought to have varying levels of “existence” depending upon which version of the many-worlds interpretation you adhere to.  With DeWitt and Deutsch for example, a more literal, or “actual” you might say, interpretation of Everett’s original theory is taken, where these other states, these other realities or universes, do in fact physically exist even though they cannot be perceived or validated by experiment.[24]  This is a more literal reading than Everett’s thesis requires, however, because nowhere does Everett explicitly state that these other universes actually exist; what he does say on the matter seems to imply the existence of “possible” or potential universes that reflect non-measured or non-actualized states of physical systems, but not that these unrealized outcomes actually exist in some physical universe:

In reply to a preprint of this article some correspondents have raised the question of the “transition from possible to actual,” arguing that in “reality” there is—as our experience testifies—no such splitting of observer states, so that only one branch can ever actually exist. Since this point may occur to other readers the following is offered in explanation.

The whole issue of the transition from “possible” to “actual” is taken care of in the theory in a very simple way—there is no such transition, nor is such a transition necessary for the theory to be in accord with our experience. From the viewpoint of the theory all elements of a superposition (all “branches”) are “actual,” none any more “real” than the rest. It is unnecessary to suppose that all but one are somehow destroyed, since all the separate elements of a superposition individually obey the wave equation with complete indifference to the presence or absence (“actuality” or not) of any other elements. This total lack of effect of one branch on another also implies that no observer will ever be aware of any “splitting” process.

Arguments that the world picture presented by this theory is contradicted by experience, because we are unaware of any branching process, are like the criticism of the Copernican theory that the mobility of the earth as a real physical fact is incompatible with the common sense interpretation of nature because we feel no such motion. In both cases the argument fails when it is shown that the theory itself predicts that our experience will be what it in fact is. (In the Copernican case the addition of Newtonian physics was required to be able to show that the earth’s inhabitants would be unaware of any motion of the earth.)[25]

According to this view, the act of measurement of a quantum system, along with its associated principles of uncertainty and entanglement, is simply the reflection of a splitting off of the observable universe from a higher-order multiverse in which all possible outcomes and alternate histories have the potential to exist.  The radical form of the many-worlds view is that these potential, unmanifest realities do in fact exist, whereas Everett seems to go only so far as to imply that they “could” exist and that conceptually their existence should not be ignored.

As hard as the multiverse interpretation of quantum mechanics might be to wrap your head around, it does represent an elegant solution to some of the challenges raised by the broader physics community against quantum theory, most notably the EPR paradox and its extension to more everyday examples, as illustrated in the famous Schrödinger’s cat paradox.  It does however raise some significant questions as to Everett’s theory of mind and subjective experience, a notion that he glosses over somewhat by abstracting observers into simple machines of sorts, but which nonetheless serves as a primary building block upon which his metatheory rests.[26]

Another interpretation of these strange and perplexing findings of quantum mechanics in the early 20th century is Bohmian Mechanics, sometimes also referred to as de Broglie-Bohm theory, pilot-wave theory, or the Causal Interpretation of quantum theory.  The major contributors to the interpretation were Louis de Broglie, who originally developed pilot-wave theory in the early part of the twentieth century but dropped the work after he got stuck on how to extend it to multi-body systems, and most prominently David Bohm, who fully developed the theory in the second half of the twentieth century with the British physicist Basil Hiley.

Bohmian mechanics is most fully developed in Bohm and Hiley’s book The Undivided Universe, first published in 1993, although much of its contents and the underlying theory had been thought out and published in papers on the topic since the 1950s.  In their book they refer to their interpretation not as the Causal Interpretation, or even as de Broglie-Bohm theory, but as the Ontological Interpretation of quantum theory, given that from their perspective it gives the only complete causal and deterministic model of quantum theory.

David Bohm was an American-born British physicist of the twentieth century who made a variety of contributions to theoretical physics, but who also invested much time and thought into the metaphysical implications of quantum mechanics, and into metaphysics and philosophy in general, topics that most theoretical physicists have steered away from, presumably due to the adverse effects they could have on one’s academic career and pursuits in physics proper, effects Bohm himself encountered to some extent throughout his career.  In this respect, Bohm was a bit of a rebel relative to his peers in the academic community, because he extended the hard science of theoretical physics into the more abstract realm of descriptions of reality as a whole, incorporating first philosophy back into the discussion so to speak, but doing so with the tools of hard mathematics, making his interpretation, or at the very least its implications for theoretical physics, very hard to ignore.

Bohm, like many other physicists (Everett, for example), was dissatisfied with the mainstream interpretation of quantum mechanics as represented by the Copenhagen school of thought, and in 1952 published an alternative theory which extended the pilot-wave theory that de Broglie had published some twenty-five years prior, applying its basic principles to multi-body quantum systems and developing the more robust mathematical foundation that pilot-wave theory had previously lacked.  He then, along with Hiley, further extended the underlying mathematics of quantum theory to include a concept called quantum potential, a principle that provided a deterministic underpinning for the probabilistic and stochastic nature of the standard interpretation of quantum theory, the actual positions and momenta of the underlying particle(s) in question being the so-called hidden variables.

De Broglie’s pilot-wave theory of 1927 affirmed the existence of subatomic particles, or corpuscles as they were called back then, but viewed these particles not as independently existing entities but as integrated into an undercurrent, or wave, which gave these subatomic particles their wave-like characteristics of diffraction and interference while still explaining their particle-like behavior as illustrated in certain experimental results.  This represented a significant divergence from the standard interpretation of quantum theory and was not well received, hence the theory lay dormant within the physics community for the next twenty-five years or so.  In his 1927 paper on the topic, de Broglie describes pilot-wave theory as follows:

One will assume the existence, as distinct realities, of the material point and of the continuous wave represented by the [wave function], and one will take it as a postulate that the motion of the point is determined as a function of the phase of the wave by the equation. One then conceives the continuous wave as guiding the motion of the particle. It is a pilot wave.[27]

De Broglie’s pilot-wave theory was dismissed by the broader academic community when it was presented, however, mainly due to the fact that its implications were understood to describe only single-body systems, and no doubt also because the common interpretation of quantum mechanics postulated that nothing could be said about the “existence” of a subatomic particle until it was measured; the matter therefore wasn’t further pursued until Bohm picked the theory back up in the early 1950s.  Bohm expanded the theory to apply to multi-body systems, giving it more solid scientific ground and providing a fully developed framework for further consideration by the broader physics community.

Bohmian Mechanics, as pilot-wave theory was later called in its more mature form, provides a mathematical and metaphysical framework within which subatomic reality can indeed be thought of as actually existing independent of an observer or an act of measurement, a significant departure from the standard interpretations of the theory that were prevalent for most of the twentieth century (in philosophic terms it is a fully realist interpretation).  The theory is consistent with Bell’s Theorem because it abandons the notion of locality, and it is also fully deterministic, positing that once the values of the hidden variables are known, all future states, and even past states, can be calculated and known as well, consistent in this sense with classical physics.[28]

That the guiding wave, in the general case, propagates not in ordinary three-space but in a multidimensional-configuration space is the origin of the notorious ‘nonlocality’ of quantum mechanics. It is a merit of the de Broglie-Bohm version to bring this out so explicitly that it cannot be ignored.[29]

Bohmian Mechanics falls into the category of hidden variable theories; it lays out a description of reality in the quantum realm in which the wave function corresponds to something that really exists.  In other words, it states that there are in fact hidden variables which dictate the actual position, momentum et al of particles in the subatomic world, and it outlines a factor it refers to as quantum potential which governs or guides the behavior and description of a quantum system and determines its future and past states, irrespective of whether or not the quantum system is observed or measured.

Along with being fully deterministic, the theory also explains away the notion of wavefunction collapse as put forth by von Neumann, by positing that the pilot-wave behaves according to the stochastic principles of Schrödinger’s wave function but that there is some element of intelligent, or active, information involved in the behavior of the underlying wave/particle.  In other words, from their perspective, the wave/particle knows about its environment and behaves in a pseudo-intelligent manner (they stay away from the word intelligence but Charlie couldn’t see any other way to describe what it is that they meant to say).  In two-slit experiment parlance, it knows whether one or both of the slits are open, and in turn behaves, or moves so to speak, with this knowledge in mind.
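
The mathematical core of this picture is compact (the standard textbook form of the de Broglie-Bohm equations, summarized here rather than quoted from Bohm and Hiley): writing the wave function in polar form, the particle is guided by the phase of the wave, and the quantum potential appears as an additional term alongside any classical potential:

\[ \psi = R \, e^{iS/\hbar}, \qquad \frac{d\mathbf{x}}{dt} = \frac{\nabla S}{m}, \qquad Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 R}{R} \]

Because Q depends on the form of the amplitude R rather than its overall magnitude, its influence need not fall off with distance, which is one way of seeing how the theory accommodates the kind of context-dependence, the “knowledge” of the slits, described above.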

According to Bohm, one of the motivations for exploring the possibility of a fully deterministic/causal extension of quantum theory was not necessarily that he believed it to be the right interpretation, the correct one, but to show the very possibility of such theories, the existence of which had been cast into serious doubt by von Neumann’s widely accepted “impossibility” proof of the 1930s.

… it should be kept in mind that before this proposal was made there had existed the widespread impression that no conceptions of hidden variables at all, not even if they were abstract, and hypothetical, could possibly be consistent with the quantum theory.[30]

Bohmian mechanics is consistent with Bell’s theorem, which rules out hidden variables only in theories that assume local realism, i.e. the assumption that all objects or things are governed by and behave according to the principles of classical physics, bound by the constraints of Relativity and the fixed speed of light.  This assumption has been shown not to hold in quantum mechanics, causing of course much consternation in the physics community and calling into question classical realism in general.[31]
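
Bell’s result can be stated compactly in its CHSH form (a standard formulation, given here only for concreteness): for any local hidden variable theory, the correlations E between measurement outcomes along detector settings a, a′ and b, b′ must satisfy

\[ |E(a,b) - E(a,b')| + |E(a',b) + E(a',b')| \le 2 \]

whereas quantum mechanics predicts, and experiment confirms, values up to 2√2 for suitably entangled systems.  Bohmian mechanics evades the inequality not by restoring locality but by abandoning it explicitly.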

Bohmian Mechanics (or the Ontological Interpretation of quantum theory, which is the terminology that Bohm and Hiley adopt to describe their hypothesis of what is actually happening in the quantum realm) agrees with all of the predictions and models of quantum mechanics as developed by Bohr, Heisenberg and von Neumann (the orthodox Copenhagen Interpretation), but extends the model with this notion of quantum potential, develops a metaphysical notion of active information which guides the subatomic particle(s), and makes nonlocality explicit (abandoning the locality which Einstein held to be absolute and immovable).  With respect to the importance of the development of Bohmian mechanics, at least from a theoretical and mathematical perspective even if you don’t want to believe the interpretation, Bell himself (1987) had this to say:

But in 1952 I saw the impossible done. It was in papers by David Bohm. Bohm showed explicitly how parameters could indeed be introduced, into nonrelativistic wave mechanics, with the help of which the indeterministic description could be transformed into a deterministic one. More importantly, in my opinion, the subjectivity of the orthodox version, the necessary reference to the ‘observer,’ could be eliminated. …

But why then had Born not told me of this ‘pilot wave’? If only to point out what was wrong with it? Why did von Neumann not consider it? More extraordinarily, why did people go on producing ‘‘impossibility’’ proofs, after 1952, and as recently as 1978? … Why is the pilot wave picture ignored in text books? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show us that vagueness, subjectivity, and indeterminism, are not forced on us by experimental facts, but by deliberate theoretical choice?[32]

The uniqueness of Bohmian Mechanics lies not only in that it yields to the presumption of non-locality (which was and is consistent with experimental results showing a strong degree of correlation between physically separated but once integral quantum systems, i.e. systems that are entangled, what Einstein perhaps misleadingly referred to as “spooky action at a distance”), but also in that it proves that hidden variable theories are in fact mathematically possible and still consistent with the basic tenets of quantum mechanics, the latter point having been seriously called into question.

In other words, what Bohmian mechanics calls our attention to quite directly is that there are metaphysical assumptions about reality in general, fundamentally non-classical in nature, that must be accounted for when interpreting quantum theory: the existence of what Bell refers to as “multidimensional-configuration space” that underlies the correlation of entangled particles/systems.  That is the only way to explain how once integrated but subsequently separated quantum systems could be correlated in such a mathematically consistent and predictable way, behavior first described by EPR as a natural theoretical extension of quantum theory in the first half of the twentieth century and subsequently proven experimentally in the latter part of the century by Aspect[33] among others.

And it was these same quantum systems, whose behavior was modeled so successfully by quantum mechanics, that in some shape or form constituted the basic building blocks of the entire “classical” physical world.  This latter fact could not be denied, and yet the laws and theorems that were developed to describe this behavior were (and still are for that matter) fundamentally incompatible with classical physics and its underlying assumptions about what is “real” and how these objects of reality behave and relate to each other.[34]  Although the orthodox interpretation of Quantum Theory would have us believe that we can draw no metaphysical conclusions from what quantum mechanics tells us, that it is simply a tool for deriving values or observables from experimental results, Bohmian Mechanics shows us that this interpretation, albeit consistent and fully coherent, is lacking in many respects, and that a new perspective is required even if the Bohmian view is rejected.

Bohmian Mechanics, and to an extent Everett’s Relative State formulation of quantum mechanics as well, both extend well beyond the laws of classical physics to round out or complete their theories, both explicitly drawing on notions of metaphysics and the existence of some sort of underlying reality in the subatomic realm, and this is where they depart significantly from the standard Copenhagen interpretation and the view most rigorously defended by Bohr.  The Copenhagen view holds that quantum theory tells us about the measurement of observables within the context of the quantum world; it is an empirical measuring tool and nothing more, and that, by definition, is all that can be extrapolated from it.  There is no metaphysics explicit or implicit in the theory, and any epistemological interpretation is ruled out.  Bohmian Mechanics and Everett’s Relative State formulation (and by association the various Many-Worlds interpretations that stemmed from it via DeWitt, Deutsch and others) attempt to explain what is really happening in the quantum realm in a manner that is consistent with the underlying model of behavior and the prediction of experimental results, and some adventure into metaphysics, Aristotle’s first philosophy, is required in order to do this, given that some departure from the assumptions of classical physics is required.

In the Relative State formulation, Schrödinger’s wave function is postulated to be a true representation of all of reality, abstracted to include observers at all levels, observers roughly corresponding to machines that can store the results of measurements (quantum states) and apply some level of deductive reasoning to correlate states and make subsequent observations.  From this perspective, the wave function represents perspectives (this is not the term that Everett uses but the one Charlie prefers) on a correlated reality that comes into existence, a correlated reality between one or many quantum system states/observers, all definable within the geometry of Hilbert space rather than the Cartesian space used in Newtonian mechanics (with an extra dimension of time in Relativity).

Bohm (and Hiley) lay out an extension to the quantum theoretical mathematical model which is not only fully deterministic but also “real”, refusing the Copenhagen view that reality in the quantum world only exists upon measurement; that is, a reality existing independent of any observation, albeit a fundamentally non-local reality, one completely consistent with Bell’s Theorem.  Both interpretations, however, along with others that fit into similar categories, and as does Bell’s Theorem itself, call into serious question the notion of local realism which sits at the center of the Newtonian mechanics that has driven scientific development for the last three hundred years.

One can put it quite succinctly by observing that no matter what school of interpretation you adhere to, at the very least the classical notion of local realism must be abandoned; one would be hard pressed to find someone with a good understanding of Quantum Theory who would dispute this.  In other words, regardless of which interpretation is more attractive, or which one you adhere to, what cannot be ignored is that the classical picture of reality, in which things have intrinsic properties that exist independent of observation and can be precisely measured in a fully deterministic and predictive way, the assumption that drove the developments of the Scientific Revolution and provided the underlying metaphysical framework for Newton, Einstein and others, is in need of serious revision.


[1] Without quantum mechanics we wouldn’t have transistors which are the cornerstone of modern computing.

[2] Our current ability to measure the size of these subatomic particles goes down to approximately 10⁻¹⁶ cm with currently available instrumentation, so at the very least we can say that measuring anything in the subatomic realm, and most certainly the general constituents of basic atomic elements such as quarks or gluons, is very challenging to say the least.  Even the measurement of the estimated size of an atom is not so straightforward, as the measurement is dictated by the circumference of the atom, which relies on the radius of the “orbit” of its electrons, “particles” whose actual “location” cannot be “measured” in tandem with their momentum per the standard tenets of quantum mechanics, both of which constitute what we consider measurement in the classic Newtonian sense.

[3] In some respects, even at the cosmic scale, there is still significant reason to believe that Relativity has room for improvement, as evidenced by what physicists call Dark Matter and Dark Energy, artifacts and principles created by theoretical physicists to describe matter and energy that they believe should exist according to Relativity Theory but the evidence for whose existence remains “undiscovered”.  For more on Dark Matter see http://en.wikipedia.org/wiki/Dark_matter and on Dark Energy see http://en.wikipedia.org/wiki/Dark_energy, both of which remain mysteries and lines of active research for modern day cosmology.

[4] Quantum theory has its roots in this initial hypothesis by Planck, and in this sense he is considered by some to be the father of quantum theory and quantum mechanics.  It is for this work in the discovery of “energy quanta” that Max Planck received the Nobel Prize in Physics in 1918, some 15 or so years after publishing.

[5] Einstein termed this behavior the photoelectric effect, and it’s for this work that he won the Nobel Prize in Physics in 1921.

[6] The Planck constant was first described as the proportionality constant between the energy (E) of a photon and the frequency (ν) of its associated electromagnetic wave.  This relation between the energy and frequency is called the Planck relation or the Planck–Einstein equation: E = hν.

[7] It is interesting to note that Planck and Einstein had a very symbiotic relationship toward the middle and end of their careers, and much of their work complemented and built off of each other.  For example Planck is said to have contributed to the establishment and acceptance of Einstein’s revolutionary concept of Relativity within the scientific community after being introduced by Einstein in 1905, the theory of course representing a radical departure from the standard classical physical and mechanics models that had held up for centuries prior.  It was through the collaborative work and studies of Planck and Einstein in some sense then that the field of quantum mechanics and quantum theory is shaped how it is today; Planck who defined the term quanta with respect to the behavior of elements in the realms of matter, electricity, gas and heat, and Einstein who used the term to describe the discrete emissions of light, or photons.

[8] The double slit experiment was first devised and used by Thomas Young in the early nineteenth century to display the wave like characteristics of light.  It wasn’t until the technology was available to send a single “particle” (a photon or electron for example) that the wave like and stochastically distributed nature of the underlying “particles” was discovered as well.  http://en.wikipedia.org/wiki/Young%27s_interference_experiment

[9] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, December 12, 1929.

[10] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, December 12, 1929.

[11] Max Born, “The statistical interpretation of quantum mechanics” Nobel Lecture, December 11, 1954.

[12] Erwin Schrödinger made many of the fundamental discoveries in the foundation of quantum mechanics, most notably the wave function which describes the behavior of subatomic particles.  He shared some of Einstein’s concerns about standard interpretations of quantum mechanics, as illustrated in the cat paradox for which he is so well known.

[13] Actually Verschränkung in German.

[14] Schrödinger, E. (1935), “Discussion of Probability Relations Between Separated Systems”, Proceedings of the Cambridge Philosophical Society, 31: pg. 555.

[15] As later analysis and criticism has pointed out, Bell’s theorem rules out hidden variable theories of a given genre rather than all hidden variable theories in toto.

[16] Bell, John (1964), “On the Einstein Podolsky Rosen Paradox”, Physics 1 (3): 195–200.

[17] Albert Einstein, Quantum Mechanics and Reality (“Quanten-Mechanik und Wirklichkeit”, Dialectica 2:320-324, 1948)

[18] For a more complete review of a multitude of interpretations of Quantum Theory going well beyond this analysis see http://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics.

[19] This mode of thought was formulated primarily by Niels Bohr and Werner Heisenberg, stemming from their collaboration in Copenhagen in 1927; hence the name.  The term was further crystallized in writings by Heisenberg in the 1950s when addressing contradictory interpretations of quantum theory and still represents the most widely accepted, and widely taught, interpretation of quantum mechanics in physics today.

[20] Niels Bohr (1949), “Discussions with Einstein on Epistemological Problems in Atomic Physics”.  In P. Schilpp, Albert Einstein: Philosopher-Scientist.  Open Court.

[21] Everett was a graduate student at Princeton at the time that he authored The Theory of the Universal Wave Function, and his advisor was John Wheeler, one of the most respected theoretical physicists of the latter half of the twentieth century.  Incidentally, Everett did not continue in academia, and therefore subsequent interpretations and expansions of his theory were left to later authors and researchers, most notably Bryce DeWitt, who coined the term “many-worlds”; his 1973 book on the subject, The Many-Worlds Interpretation of Quantum Mechanics, included several different viewpoints and research papers as well as a reprint of Everett’s thesis.  The theory was developed even further by subsequent physicists such as David Deutsch among others.  Deutsch’s seminal work on the topic is probably his book The Fabric of Reality, published in 1997, where he expands and extends the many-worlds interpretation to disciplines outside of physics such as philosophy and epistemology, computer science and quantum computing, and even biology and theories of evolution.

[22] From the Introduction of Everett’s thesis in 1957 “Relative State” Formulation of Quantum Mechanics.

[23] Hugh Everett, III, Theory of the Universal Wave Function, 1957, pg. 53.

[24] Deutsch actually posits that proof of the “existence” of these other multi-verses is given by the wave interference pattern displayed even in the single-particle version of the classic double-slit experiment, as well as by some of the running-time enhancements driven by quantum computing, namely Shor’s algorithm, which finds the prime factors of a given number and runs dramatically faster on quantum computers than the best known methods on classical, 1-or-0 bit-based machines.  This claim is controversial to say the least, or at least remains an open point of contention among the broader physics community.  See http://daviddeutsch.physics.ox.ac.uk/Articles/Frontiers.html for a summary of his views on the matter.

[25] Everett’s 1957 thesis, “Relative State” Formulation of Quantum Mechanics, note on page 15, presumably in response to criticisms he received upon circulating the draft of his thesis to various distinguished members of the physics community, one of whom was Niels Bohr.

[26] See note [24] above regarding Deutsch’s claims concerning the physical existence of these other multi-verses, and http://daviddeutsch.physics.ox.ac.uk/Articles/Frontiers.html for a summary of his views on the matter.  See also the chapter on Many-Worlds in Bohm and Hiley’s 1993 book The Undivided Universe: An Ontological Interpretation of Quantum Theory for a good overview of the strengths and weaknesses, mathematical and otherwise, of Everett’s and DeWitt’s different perspectives on the Many-Worlds approach.

[27] Louis de Broglie, “Wave mechanics and the atomic structure of matter and of radiation”, Le Journal de Physique et le Radium, 8, 225 (1927).

[28] These features are why it is sometimes referred to as the Causal Interpretation, due to the fact that it outlines a fully causal description of the universe and its contents.

[29] From Stanford Encyclopedia entry on Bohmian Mechanics by Sheldon Goldstein, quote from Bell, Speakable and Unspeakable in Quantum Mechanics, Cambridge: Cambridge University Press; 1987, p. 115.

[30]  David Bohm, Wholeness and the Implicate Order, London: Routledge 1980 pg. 81.

[31] In fact, it was Bohm’s extension of de Broglie’s work on pilot-wave theory that provided at least some of the motivation for Bell to come up with his theorem to begin with; see Bell’s paper “On the Einstein Podolsky Rosen Paradox”, published in 1964, some twelve years after Bohm published his adaptation of de Broglie’s pilot-wave theory.

[32] From Stanford Encyclopedia entry on Bohmian Mechanics, 2001 by Sheldon Goldstein; taken from Bell 1987, “Speakable and Unspeakable in Quantum Mechanics”, Cambridge University Press.

[33] “Experimental Realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A New Violation of Bell’s Inequalities”; Aspect, Grangier, and Roger, July 1982.

[34] There has been significant progress in the last decade or two in reconciling quantum theory and classical mechanics, most notably with respect to Newtonian trajectory behavior, what is described in the literature as accounting for the classical limit.  For a good review of the topic see the article “The Emergence of Classical Dynamics in a Quantum World” by Tanmoy Bhattacharya, Salman Habib, and Kurt Jacobs, published in Los Alamos Science in 2002.

Wave-Particle Duality: So Much for the Atom

From Charlie’s standpoint, Relativity Theory could be grasped intellectually by the educated, intelligent mind.  You didn’t need advanced degrees or a deep understanding of complex mathematics to understand that, at a very basic level, Relativity Theory implied that basic measurements like speed, distance and even mass were relative and depended upon the observer’s frame of reference; that mass and energy were convertible into each other and equivalent, related by the speed of light, which moved at a fixed speed no matter what your frame of reference; and that space and time were not in fact separate and distinct concepts, but needed to be combined into a single notion of spacetime in order for a more accurate picture of the universe to emerge.  Relativity says that even gravity’s effect is subject to the same principles at the cosmic scale, i.e. that spacetime “bends” at points of singularity (black holes for example), bends to the extent that light itself is affected by the severe gravitational forces at these powerful places in the universe.  And indeed our measurements of time and space were “relative”, relative to the speed and frame of reference from which these measurements were made; the observer was in fact a key element in the process of measurement.

If you assumed all these things, you ended up with a more complete and accurate mathematical and theoretical understanding of the universe than you had with Newtonian mechanics, one powerful enough that, despite the best efforts of many great minds over the last 100 years or so, it has yet to be supplanted by anything better, at least at the macro scale of the universe.  Charlie didn’t doubt that Relativity represented a major scientific, and even metaphysical, step forward in mankind’s understanding of the physical universe, but a subtle and quite distinctive feature of this model was that it reinforced a deterministic and realist picture of the universe.  In other words, Relativity implicitly assumed that objects in the physical world did in fact exist, i.e. that they were “real”, real in the sense that they had an absolute existence in the spacetime continuum somewhere that could be described in terms of quantitative data like speed, mass, velocity, etc., and furthermore that if you knew a set of starting criteria, what scientists like to call a “system state”, as well as a set of variables/forces that acted on said system, you could in turn predict with certainty the outcome of those forces on the system, i.e. the set of observed descriptive qualities of the objects in the system after the forces had acted upon them; in short, the physical world was fully deterministic.

Charlie didn’t want to split hairs on these seemingly inconsequential and subtle assumptions, assumptions that underpinned not only Einstein’s Relativity but to a great extent Newtonian mechanics as well, but these were in fact very modern metaphysical assumptions, and had not been assumed, at least not to the degree of certainty of modern times, in theoretical models of reality that existed prior to the Scientific Revolution.  Prior to Newton, the world of the spirit, theology in fact, was very much considered to be just as real as the physical world, the world governed by science.  This was true not only in the West but also in the East, and to a great extent it remains true in Eastern philosophical thought today, whereas in the West not so much.

But at their basic, core level, these concepts could be understood, grasped as it were, by the vast majority of the public, even if they had very little if any bearing on their daily lives and didn’t fundamentally change or shift their underlying religious or theological beliefs, or even their moral or ethical principles.  Relativity was accepted in the modern age, as were its determinist and realist philosophical and metaphysical assumptions; the principles just didn’t really affect the subjective frame of reference, the mental or intellectual frame of reference, within which the majority of humanity perceived the world around them.  Relativity itself as a theoretical construct was relegated to the realm of physics, a problem to be understood in order to pass a physics or science exam in high school or college, then buried in one’s consciousness in lieu of more pressing daily and life pursuits, be they family, career and money, or other forms of self-preservation in the modern Information Age; an era most notably marked by materialism, self-promotion, greed, and capitalism, all of which, interestingly enough, pay homage to realism and determinism to a large extent.

Quantum Theory was altogether different however.  Its laws were more subtle and complex than the world described by classical physics, the world described in painstaking mathematical precision by Newton, Einstein and others.  And after a lot of studying and research, the only conclusion that Charlie could definitively come to was that in order to understand Quantum Theory, or at least try to come to terms with it, a wholly different perspective on what reality truly was, or at the very least how reality was to be defined, was required.  In other words, in order to understand what Quantum Theory actually means, in order to grasp the underlying intellectual context within which the behaviors of the particles/fields it describes were to be understood, a new framework of understanding, a new description of reality, must be adopted.  What we considered to be “reality”, or what was “real”, as understood and implied by the classical physics that had dominated the minds of the Western world for over 300 years since the publication of Newton’s Principia, needed to be abandoned, or at the very least significantly modified, in order for Quantum Theory to be comprehended in any meaningful way, in order for anyone to make any sense of what Quantum Theory “said” about the nature of the substratum of existence.

Things would never be the same from a physics perspective, this much was clear; whether the daily lives of the bulk of those who struggle to survive in the civilized world would evolve in concert with these developments, as the physicists had, remained to be seen.

 

Quantum Mechanics is the branch of physics that deals with the behavior of particles and matter in the atomic and subatomic realms, the quantum realm so called given the quantized nature of “things” at this scale.  So you have some sense of scale: an atom is 10⁻⁸ cm across give or take, and the nucleus, or center, of an atom, which is made up of what we now call protons and neutrons, is approximately 10⁻¹² cm across.  An electron, or a photon for that matter, cannot truly be measured from a size perspective in terms of classical physics, for many of the reasons we’ll get into below as we explore the boundaries of the quantum world, but suffice it to say that at present our best estimate of the size of an electron is in the range of 10⁻¹⁸ cm or so.[1]

Whether or not electrons, or photons (particles of light) for that matter, really exist as particles whose physical size and/or momentum can actually be “measured” is not as straightforward a question as it might appear.  It gets, at some level, to the heart of the problem we encounter when we attempt to apply the principles of existence or reality to the subatomic, or quantum, realm within the semantic and intellectual framework established by classical physics over the last three hundred years or so; namely, existence as defined by independent, deterministic and quantifiable measurements of size, location, momentum, mass or velocity.

The word quantum comes from the Latin quantus, meaning “how much”, and it is used in this context to identify the behavior of subatomic things that move from and between discrete states, rather than across a continuum of values or states as is assumed in, and fundamental to, classical physics.  The term itself had taken on meanings in several contexts within a broad range of scientific disciplines in the 19th and early 20th centuries, but the field of study now known as Quantum Mechanics was formalized by Max Planck at the turn of the 20th century, and quantization arguably represents the prevailing and distinguishing characteristic of reality at this scale.

Newtonian physics, and even the extension of Newtonian physics put forth by Einstein as Relativity Theory at the beginning of the twentieth century (a theory whose accuracy is well established by experimentation at this point), assumes that particles, things made up of mass, energy and momentum, exist independent of the observer and their instruments of observation, and are presumed to exist in continuous form, moving along specific trajectories, with properties (mass, velocity, etc.) that can only be changed by the action of some force upon these things or objects.  This is the essence of Newtonian mechanics, upon which the majority of modern day physics, or at least the laws of physics that affect us at a human scale, is defined, and it has at its philosophical heart the presumption of realism and determinism.

The only caveat to this view put forth by Einstein is that these measurements themselves, of the speed or even the mass or energy content of a specific object, can only be said to be universally defined according to these physical laws within the specific frame of reference of an observer.  Their underlying reality is not questioned; these things clearly exist independent of observation or measurement, clearly (or so it seems), but the values, or properties, of these things are relative, changing depending upon the frame of reference of the observer performing the measurement.  This is what Relativity tells us.  So the velocity of a massive body, and even the measurement of time itself, which is a function of distance and speed, is a function of the relative speed and position of the observer who is performing said measurement.

For the most part, the effects of Relativity can be ignored when we are referring to objects on Earth that are moving at speeds that are minimal with respect to the speed of light and are less massive than say black holes.  As we measure things at the cosmic scale, where distances are measured in terms of light years and black holes and other massive phenomena exist which bend spacetime (aka singularities) the effects of Relativity cannot be ignored however.[2]

Leaving aside the field of Cosmology for the moment and getting back to the history of the development of Quantum Mechanics: at the end of the 19th century Planck was commissioned by electric companies to create light bulbs that used less energy, and in this context was trying to understand how the intensity of the electromagnetic radiation emitted by a black body (an object that absorbs all electromagnetic radiation regardless of frequency or angle of incidence) depended on the frequency of the radiation, i.e. the color of the light.  In his work, and after several iterations of hypotheses that failed to have predictive value, he fell upon the theory that energy is only absorbed or released in quantized form, i.e. in discrete packets of energy he referred to as “bundles” or “energy elements”, the so-called Planck postulate.  And so the field of Quantum Mechanics was born.[3]

Despite the fact that Einstein is best known for his mathematical models and theories describing the forces of gravity and light at a cosmic scale, his work was also instrumental in the advancement of Quantum Mechanics.  For example, in his work on the effect of radiation on metallic matter and non-metallic solids and liquids, he discovered that electrons are emitted from matter as a consequence of their absorption of energy from electromagnetic radiation of a very short wavelength, such as visible or ultraviolet radiation.  Einstein established that in certain experiments light appeared to behave like a stream of tiny particles, not just as a wave, lending more credence and authority to the particle theories describing the quantum realm.  He therefore hypothesized the existence of light quanta, or photons, as a result of these experiments, laying the groundwork for subsequent wave-particle duality discoveries and reinforcing the discoveries of Planck with respect to black body radiation and its quantized behavior.[4]

Prior to the establishment of light’s properties as waves, and then in turn the establishment of the wave-like characteristics of subatomic elements like photons and electrons by Louis de Broglie in the 1920s, it had been fairly well established that these subatomic particles, or electrons and photons as they were later called, behaved like particles.  The debate over the nature of light and subatomic matter, however, went all the way back to the 17th century, when competing theories of the nature of light were proposed by Isaac Newton, who viewed light as a system of particles, and Christiaan Huygens, who postulated that light behaved like a wave.  It was not until the work of Einstein, Planck, de Broglie and other physicists of the twentieth century that these subatomic particles, both light and electrons, were proven to behave both like particles and like waves, the result dependent upon the experiment and the context of the system being observed.  This paradoxical principle, known as wave-particle duality, is one of the cornerstones, and underlying mysteries, of Quantum Theory.

As part of the discovery of subatomic wave-like behavior, what Planck found in his study of black body radiation, and Einstein as well in his study of light and photons, was that the measurements or states of a given particle such as a photon or an electron had to take on values that were multiples of very small and discrete quantities, i.e. were non-continuous, the relation of which was represented by a constant value known as the Planck constant.[5]
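
In its simplest modern statement (a standard textbook form of the Planck postulate, not Planck’s original notation), the energy exchanged by a system oscillating at frequency ν is restricted to integer multiples of a fundamental quantum:

\[ E = n h \nu, \qquad n = 0, 1, 2, \ldots, \qquad h \approx 6.626 \times 10^{-34} \ \text{J·s} \]

The sheer minuteness of h is why this graininess is invisible at everyday scales and only asserts itself in the atomic and subatomic realms.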

In the quantum realm then, there was not a continuum of values and states of matter as had been assumed in physics up until that time; there were bursts of energy and changes of state that were ultimately discrete, fixed in amplitude or value, such that certain states and certain values could in fact not exist at all.  This represented a dramatic departure from the way physicists, and the rest of us mortals, think about movement and change in the “real world”, and most certainly a significant departure from the Newtonian mechanics upon which Relativity was based, where the idea of continuous motion, in fact continuous existence, was never even questioned.

As noted above, Planck and Einstein had a very symbiotic relationship toward the middle and end of their careers, with much of their work complementing and building off of each other’s; it was through their collaborative work and studies in some sense that the fields of Quantum Mechanics and Quantum Theory took the shape they have today, Planck defining the term quanta with respect to the behavior of elements in the realms of matter, electricity, gas and heat, and Einstein using the term to describe the discrete emissions of light, or photons.

The classic demonstration of light’s behavior as a wave, and perhaps one of the most astonishing and game-changing experiments of all time, is what is called the double-slit experiment.  In the basic version of this experiment, a light source such as a laser beam is shone at a thin plate that is pierced by two parallel slits.  The light passes through the slits and displays on a screen behind the plate.  The image displayed on the screen, as it turns out, is not the constant band of light passing through each of the slits that you might expect if light were simply a particle or set of particles; instead, it is a pattern of light and dark bands, indicating that the light is behaving like a wave and is subject to interference, the strength of the light on the screen cancelling itself out or becoming stronger depending upon how the individual waves interfere with each other.  This is exactly akin to fundamental wavelike behavior in nature, for example waves in water, where the waves have greater strength if they synchronize correctly (peaks of waves) and cancel each other out (troughs of waves) if not.

What is even more interesting, and was most certainly unexpected, is that once equipment was developed that could reliably send a single particle, an electron or photon for example, through a double-slitted slate, the individual particles did indeed end up at a single location on the screen after passing through just one of the slits, as was expected, but – and here was the kicker – the location on the screen that the particle ended up at, as well as which slit the particle appeared to pass through (in later versions of the experiment which slit “it” passed through could in fact be detected), was not consistent and followed seemingly random and erratic behavior.  What researchers found as more and more of these subatomic particles were sent through the slate one at a time was that the same wave-like interference pattern emerged that showed up when the experiment was run with a full beam of light, as had been done by Young some 100 years prior.[6]
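
The banded pattern itself has a simple idealized form (the standard far-field result for two narrow slits, stated here for concreteness): for slit separation d, wavelength λ, and slate-to-screen distance L, the intensity at position x on the screen varies as

\[ I(x) \propto \cos^2\!\left( \frac{\pi d x}{\lambda L} \right) \]

bright bands where the two paths arrive in phase, dark bands where they arrive half a wavelength apart and cancel.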

So hold on for a second.  Charlie had gone over this again and again, and all the literature he read on Quantum Theory and Quantum Mechanics pretty much said the same thing, namely that the heart of the mystery of Quantum Mechanics could be seen in this very simple experiment.  And yet it was really hard, perhaps impossible, to understand what was actually going on, at least without abandoning some of the very foundational principles of classical physics, for example that these things called subatomic particles actually exist as independent particles or “objects” as we might refer to them at the macroscopic scale, because they seemed to behave like waves when looked at in aggregate but at the same time behaved, sort of, like particles when looked at individually.

What was clear was that this subatomic particle, corpuscle, or whatever you wanted to call it, did not appear to have a linear and fully deterministic trajectory in the classical physics sense; this much was very clear from the fact that the distribution against the back screen, when the particles were sent through the double slits individually, appeared to be random.  But what was more odd was that when the experiment was run one corpuscle at a time, again whatever that term really means at the quantum level, not only was the final location on the screen seemingly random for each individual particle, but the same aggregate pattern emerged after many, many single-corpuscle runs as when a full wave, or set of these corpuscles, was sent through the double slits.

So it appeared, and this was and still remains a very telling and mysterious characteristic of the behavior of these “things” at the subatomic scale, that not only did the individual photon seem to be aware of the final wave-like pattern of its parent wave, but the corpuscle appeared to be interfering with itself as it went through the two slits individually.  Charlie wanted to repeat this for emphasis, because these conclusions, reached after a heck of a lot of research into Quantum Theory for a guy who was inherently lazy but was still looking to understand the source of our fundamentally mechanistic and materialistic world view which so dominates Western society today, a view which clearly rests on philosophical and metaphysical foundations stemming from classical physical notions of objective reality, were remarkable: with the double-slit experiment you could clearly see that the fundamental substratum of existence not only exhibited wave-like as well as particle-like behavior, but, when looked at at the individual “particle” level, whatever the heck that actually means at the subatomic scale, the individual particle seemed not only to be aware of its parent wave structure but, per the experimental results, to be interfering with itself.

Furthermore, to make things even more mysterious, as the final locations of the individual photons in the two-slit and other related experiments were evaluated and studied, it was discovered that although the final location of any individual particle could not be determined exactly before the experiment was performed, i.e. there was a fundamental element of uncertainty or randomness involved at the individual corpuscle level, the final locations of these particles measured in toto, after many experiments were performed, exhibited a statistical distribution that could be modeled quite precisely from a mathematical probability perspective.  That is to say, the sum total distribution of the final locations of all the particles after passing through the slit(s) could be established stochastically, i.e. in terms of a well-defined probability distribution.  So in total you could predict what the particle behavior would look like over a large distribution of particles in the double-slit experiment, even if you couldn’t predict with certainty what the outcome would be for any individual corpuscle.
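
This statistical character is easy to illustrate numerically.  The sketch below (illustrative only: the geometry is arbitrary, and it uses the idealized two-slit cos² distribution given earlier rather than a full wave function) draws one random screen position per “particle” and shows the fringes emerging only in the aggregate counts:

import numpy as np

wavelength = 500e-9   # 500 nm light (illustrative value)
slit_sep   = 50e-6    # distance between the two slits, in meters
screen_d   = 1.0      # slate-to-screen distance, in meters

# Idealized two-slit probability distribution over screen positions
x = np.linspace(-0.02, 0.02, 2001)
intensity = np.cos(np.pi * slit_sep * x / (wavelength * screen_d)) ** 2
prob = intensity / intensity.sum()

# Each "particle" lands at one random position drawn from that distribution
rng = np.random.default_rng(0)
hits = rng.choice(x, size=50_000, p=prob)

# Individually the hits look random; binned together, the fringes appear
counts, _ = np.histogram(hits, bins=100)
print(counts)

Any single draw is unpredictable; only the histogram over many draws reproduces the banded distribution, which is exactly the situation the experimenters faced.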

The mathematics behind this particle distribution is what is known as the wave function, typically denoted by the Greek letter psi, ψ, or its capital equivalent, Ψ; it predicts what the probability distribution of these “particles” will look like on the screen behind the slate after many individual experiments are run, or in quantum theoretical terms, it predicts the quantum state of a particle throughout a fixed spacetime interval.  The wave function was discovered by the Austrian physicist Erwin Schrödinger in 1925, published in 1926, and is commonly referred to in the scientific literature as the Schrödinger equation, analogous in the field of Quantum Mechanics to Newton’s second law of motion in classical physics.

This wave function represents a probability distribution of potential states or outcomes that describes the quantum state of a particle and predicts, with a great degree of accuracy, the potential location of a particle given an initial location or state of motion.  With the discovery of the wave function, it became possible to predict the potential locations and states of these subatomic particles, an extremely potent theoretical model that has led to all sorts of inventions and technological advancements since its discovery.
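
For reference, the equation itself in its modern time-dependent form, together with the rule, due to Born, that connects ψ to observed statistics (standard textbook renderings, not reproduced from the text):

\[ i\hbar \frac{\partial \psi}{\partial t} = \left[ -\frac{\hbar^2}{2m} \nabla^2 + V(\mathbf{x}) \right] \psi, \qquad P(\mathbf{x}) = |\psi(\mathbf{x})|^2 \]

The equation evolves ψ smoothly and deterministically; it is only the Born rule’s squared amplitude, applied at measurement, that introduces probability, and it is exactly this seam between the two that the interpretations surveyed earlier attempt to explain.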

Again, this implied that individual corpuscles were interfering with themselves when passing through the two slits on the slate, which was very odd indeed.  In other words, the individual particles were exhibiting wave-like characteristics even when they were sent through the double-slitted slate one at a time.  This phenomenon was shown to occur with atoms as well as electrons and photons, confirming that all of these subatomic so-called particles exhibit wave-like properties as well as particle-like qualities, the behavior observed being determined by the type of experiment, or measurement as it were, to which the “thing” is subjected.

As Louis de Broglie, the physicist responsible for bridging the theoretical gap between matter, in this case electrons, and waves by establishing the symmetric relation between momentum and wavelength which had at its core Planck’s constant (the de Broglie equation), described this mysterious and somewhat counterintuitive relationship between matter and waves: “A wave must be associated with each corpuscle and only the study of the wave’s propagation will yield information to us on the successive positions of the corpuscle in space.”[7]  In the Award Ceremony Speech given in 1929 in honor of Louis de Broglie, for his work in establishing the relationship between matter and waves for electrons, we find the essence of his groundbreaking and still mysterious discovery, which remains a core characteristic of Quantum Mechanics to this day.

 

Louis de Broglie had the boldness to maintain that not all the properties of matter can be explained by the theory that it consists of corpuscles. Apart from the numberless phenomena which can be accounted for by this theory, there are others, according to him, which can be explained only by assuming that matter is, by its nature, a wave motion. At a time when no single known fact supported this theory, Louis de Broglie asserted that a stream of electrons which passed through a very small hole in an opaque screen must exhibit the same phenomena as a light ray under the same conditions. It was not quite in this way that Louis de Broglie’s experimental investigation concerning his theory took place. Instead, the phenomena arising when beams of electrons are reflected by crystalline surfaces, or when they penetrate thin sheets, etc. were turned to account. The experimental results obtained by these various methods have fully substantiated Louis de Broglie’s theory. It is thus a fact that matter has properties which can be interpreted only by assuming that matter is of a wave nature. An aspect of the nature of matter which is completely new and previously quite unsuspected has thus been revealed to us.[8]
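The de Broglie relation itself is simply λ = h/p, the wavelength λ associated with a corpuscle of momentum p.  As a worked example under our own illustrative assumptions (a non-relativistic electron accelerated through 100 volts), the wavelength comes out on the order of atomic spacings, which is why crystals served as the diffraction gratings in the experiments described above:

```python
from scipy.constants import h, m_e, electron_volt  # CODATA physical constants
import numpy as np

# de Broglie relation: lambda = h / p.  Worked example (our own numbers):
# the wavelength of an electron accelerated through 100 V.
E_kinetic = 100 * electron_volt                 # kinetic energy in joules
p = np.sqrt(2 * m_e * E_kinetic)                # non-relativistic momentum
wavelength = h / p
print(f"electron de Broglie wavelength ≈ {wavelength * 1e12:.1f} pm")  # ~123 pm
```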

 

So by the 1920s, then, there was a fairly well-established mathematical theory governing the behavior of subatomic particles, backed by a large body of empirical and experimental evidence, indicating quite clearly that what we would call “matter” (or particles, or corpuscles) in the classical sense behaves very differently, or at least has very different fundamental characteristics, in the subatomic realm.  It exhibits the properties of a particle, a thing or object, as well as those of a wave, depending upon the type of experiment that is run.  So the concept of matter itself, as we had been accustomed to dealing with and measuring it for centuries, at least as far back as the time of Newton (1642-1727), had to be reexamined within the context of Quantum Mechanics.  For in Newtonian physics, and indeed in the geometric and mathematical framework within which it was developed and conceived, a framework that reached far back into antiquity (Euclid, circa 300 BCE), matter was presumed to be either a particle or a wave, but most certainly not both.

What complicated matters even further was that matter itself, again as defined by Newtonian mechanics and its extension via Relativity Theory, taken together what is commonly referred to as classical physics, was presumed to have some very definite, well-defined and fixed, real properties.  Properties like mass, location or position in space, and velocity or trajectory were all presumed to have a real existence independent of whether or not they were measured or observed, even if their actual values were relative to the frame of reference of the observer.  All of this hinged upon the notion that the speed of light was fixed no matter what the frame of reference of the observer; this was a fixed absolute, and nothing could move faster than the speed of light.  Yet even this seemingly self-evident notion, or postulate one might call it, ran into problems as scientists continued to explore the quantum realm.

So by the 1920s, the way scientists viewed matter as classically conceived, within the context of Newton’s postulates from the early 1700s as extended into the notion of spacetime put forth by Einstein, was encountering some significant difficulties when applied to the behavior of elements in the subatomic, quantum world.  Difficulties, it is important to point out, that persist to this day.  Furthermore, there was extensive empirical and scientific evidence lending significant credibility to Quantum Theory, evidence which illustrated irrefutably that these subatomic elements behaved not only like waves, exhibiting characteristics such as interference and diffraction, but also like particles in the classic Newtonian sense, with measurable, well-defined characteristics that could be quantified within the context of an experiment.

In his Nobel Lecture in 1929, Louis de Broglie summed up the challenge for physicists of his day, and to a large extent for physicists of modern times, given the discoveries of Quantum Mechanics, as follows:

 

The necessity of assuming for light two contradictory theories – that of waves and that of corpuscles – and the inability to understand why, among the infinity of motions which an electron ought to be able to have in the atom according to classical concepts, only certain ones were possible: such were the enigmas confronting physicists at the time…[9]

 

The other major tenet of Quantum Theory, resting alongside wave-particle duality and adding even more complexity when trying to wrap our minds around what is actually going on in the subatomic realm, is what is referred to as the uncertainty principle, or the Heisenberg uncertainty principle, named after the German theoretical physicist Werner Heisenberg who first formulated it, even though the wave function itself, which represents the probability distribution of outcomes of the position of these subatomic particles in experiments like the double-slit experiment previously described, was the discovery of Schrödinger.

The uncertainty principle states that there is a fundamental theoretical limit on the accuracy with which certain pairs of physical properties of atomic particles, position and momentum being the classic pair, can be known at any given time.  In other words, physical quantities come in conjugate pairs, and only one of the quantities in a given pair can be measured precisely at any given time; when one quantity in a conjugate pair is measured and becomes determined, its complement becomes indeterminate.  What Heisenberg discovered, and proved mathematically, was that the more precisely one attempts to measure one of these complementary properties of a subatomic particle, the less precisely the other associated complementary attribute can be determined or known.
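In modern notation this is usually written, for the position/momentum pair, as the inequality

$$ \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2} $$

where σx and σp are the standard deviations of repeated position and momentum measurements on identically prepared systems, and ℏ = h/2π is the reduced Planck constant.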

Published by Heisenberg in 1927, the uncertainty principle asserts that there are fundamental, conceptual limits of observation in the quantum realm, another radical departure from the realistic and deterministic principles of classical physics, which held that all attributes of a thing were measurable at any given time, i.e. that the thing or object existed and was real and had measurable and well-defined properties irrespective of its state.  It is important to point out here that the uncertainty principle is a statement about a fundamental property of quantum systems as they are mathematically and theoretically modeled and defined, and of course as empirically validated by experimental results, not a statement about the technology and method of the observational systems themselves.  This was not a problem with the state of the instrumentation being used for measurement; it was a characteristic of the domain itself.

Max Born, who won the Nobel Prize in Physics in 1954 for his work in Quantum Mechanics, specifically for his statistical interpretation of the wave function, describes this other seemingly mysterious attribute of the quantum realm as follows (the specific language he uses reveals at some level his interpretation of quantum theory; more on interpretations later):

 

…To measure space coordinates and instants of time, rigid measuring rods and clocks are required.  On the other hand, to measure momenta and energies, devices are necessary with movable parts to absorb the impact of the test object and to indicate the size of its momentum.  Paying regard to the fact that quantum mechanics is competent for dealing with the interaction of object and apparatus, it is seen that no arrangement is possible that will fulfill both requirements simultaneously.[10]

 

Whereas classical physicists, i.e. physics prior to the introduction of Relativity and Quantum Theory, distinguished between the study of particles and the study of waves, the introduction of Quantum Theory and wave-particle duality established that this classic intellectual bifurcation of physics at the macroscopic scale was wholly inadequate for describing and predicting the behavior of these “things” that exist in the subatomic realm, all of which take on the characteristics of both waves and particles depending upon the experiment and the context of the system being observed.  Furthermore, the precision with which the state of a “thing” in the subatomic world could be defined was conceptually bound, establishing theoretical limits on how completely a given subatomic state could be specified, another divergence from classical physics.  And on top of this came the required mathematical machinery of statistics and probability theory, as well as significant extensions to the underlying geometry needed to map the wave function itself in subatomic spacetime.  All of this called quite clearly into question our classical materialistic notions, based on realism and determinism, upon which scientific advancement had been built for centuries.

 

[1] Our current ability to measure the size of subatomic particles goes down to approximately 10⁻¹⁶ cm with currently available instrumentation, so at the very least we can say that measuring anything in the subatomic realm, certainly in the realm of the basic constituents of atomic elements such as quarks or gluons for example, is very challenging to say the least.  Even the measurement of the estimated size of an atom is not so straightforward, as the measurement is dictated by the circumference of the atom, which relies in turn on the size or radius of the “orbit” of the atom’s electrons, “particles” whose “location” cannot be “measured” in tandem with their momentum per the standard tenets of Quantum Mechanics, both of which together constitute what we consider measurement in the classic Newtonian sense.

[2] In some respects, even at the cosmic scale, there is still significant reason to believe that Relativity has room for improvement, as evidenced by what physicists call Dark Matter and Dark Energy, constructs created by theoretical physicists to describe matter and energy which they believe should exist according to Relativity Theory but whose existence remains “undiscovered”.  Both Dark Matter and Dark Energy represent active lines of research in modern day Cosmology.

[3] Quantum theory has its roots in this initial hypothesis by Planck, and in this sense he is considered by some to be the father of quantum theory and quantum mechanics.  It is for this work on the discovery of “energy quanta” that Max Planck received the Nobel Prize in Physics in 1918, nearly two decades after its publication.

[4] Einstein termed this behavior the photoelectric effect, and it’s for this work that he won the Nobel Prize in Physics in 1921.

[5] The Planck constant was first described as the proportionality constant between the energy (E) of a photon and the frequency (ν) of its associated electromagnetic wave.  This relation between energy and frequency is called the Planck relation or the Planck–Einstein equation: E = hν.

[6] The double slit experiment was first devised and used by Thomas Young in the early nineteenth century to display the wave-like characteristics of light.  It wasn’t until the technology became available to send a single “particle” (a photon or electron for example) through the apparatus one at a time that the wave-like and stochastically distributed nature of the underlying “particles” was discovered as well.  http://en.wikipedia.org/wiki/Young%27s_interference_experiment

[7] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, December 12, 1929.

[8] Presentation Speech by Professor C.W. Oseen, Chairman of the Nobel Committee for Physics of the Royal Swedish Academy of Sciences, on December 10, 1929.  Taken from http://www.nobelprize.org/nobel_prizes/physics/laureates/1929/press.html.

[9] Louis de Broglie, “The wave nature of the electron”, Nobel Lecture, December 12, 1929.

[10] Max Born, “The statistical interpretation of quantum mechanics” Nobel Lecture, December 11, 1954.

To What End: The Limits of Science

Charlie could remember back to when some of this had all started to germinate.  He was still in school back then.  Back in Providence.  When he was a ‘student-athlete’, whatever the heck that meant.  But in his better moments, he was an amateur philosopher.  Exploring the nature and depths of his own mind, and looking at and analyzing the scientific and analytical models that were presented before him as descriptions of reality.  There was hard science, there were the arts, and there was philosophy.  And certainly if you read something in a textbook presented by a Professor with a PhD, it must be true.  A hard fact.  Undisputed.

The sciences were a little tough for Charlie though.  He steered clear of disciplines that had lab hours or were brutally difficult to get through.  That left out most of the sciences.  He didn’t get into software engineering until much later.  Until he had to find a way to make a living that didn’t involve hitting a yellow fuzzy ball.  He did read some Einstein though, and some Stephen Hawking, just to try and get an understanding of the scientific models that underlie the physical world that we lived in.

What struck Charlie about some of these models, not that he understood them completely of course (nor did he think he completely understood them today), was the limitation that seemed to be present in their descriptive power.  Quantum Theory in particular had this embedded notion of “uncertainty”, some sort of probability distribution of outcomes that mapped the behavior of these subatomic things, a model that by design was incompatible with the tried and true notions of classical physics: that things and objects were real, had mass and velocity, and “existed” beyond any act of observation or measurement, even if their “reality”, as defined by these measurable quantities, was relative at a very basic level to the frame of reference of the observer.  The models were supposed to describe the world we lived in, at least better than any of the other theories out there, and yet they seemed only to beg more questions.

Quantum Mechanics even had a principle they called the uncertainty principle.  In physics?  So part of the theory is that there are certain limits on what can be known?  It seemed very odd to have a principle called the uncertainty principle, one that was so well defined, mathematically even (Δq Δp ≥ ℏ/2 if you must know), sitting right square in the middle of the hardest of sciences.

In an oft-quoted passage, one of the greatest scientific minds of the 20th century and one of the original formulators of Quantum Mechanics, Max Born, had this to say about the limits of Quantum Theory, directly calling out, to some degree, the epistemological limits of science itself.

 

…To measure space coordinates and instants of time, rigid measuring rods and clocks are required. On the other hand, to measure momenta and energies, devices are necessary with movable parts to absorb the impact of the test object and to indicate the size of its momentum. Paying regard to the fact that quantum mechanics is competent for dealing with the interaction of object and apparatus, it is seen that no arrangement is possible that will fulfill both requirements simultaneously…[1]

 

So science had its limits then.  Scientists themselves recognized these limits.  And they gave the limits names, names like the “uncertainty principle” and “relativity”.  That said it all, Charlie remembered thinking.  So even hardcore theoretical physicists recognized the limits of their models and of the methods they used to arrive at and measure outcomes.  What we call science, based upon empirical study and verifiable evidence and deemed to mark the boundaries of our physical world, appeared to be simply a map of the territory, and smacked of some very basic underlying limitations.

And yet the Western mind, if one could generalize such a thing, was rooted in a fundamental belief in the “reality” of the physical world, believing that all experience and reality was explainable and predictable, basing its assumptions on what appeared to be Reason and Logic, built essentially on empiricism, or what Pirsig called logical positivism.  So Charlie, and science itself it appeared, was presented with a philosophical problem: the implications of this belief system, its limits in fact, its basic assumptions about “reality”, should be well understood, and well taught.  And yet these limitations weren’t taught.  The assumptions built into these models, assumptions that modern science itself directly called into question, seemed to be brushed under the rug, so to speak.  David Bohm’s struggle throughout the end of his career in fact reflected this effort to have these basic assumptions, which rested at the heart and pinnacle of modern science, brought to light in some meaningful way, an effort which in turn forced him to construct a broader theory of knowledge, one which incorporated physics and the mind, and directly spoke to the basic assumptions of Western science that no longer seemed tenable.

Then there was this whole religious orthodoxy thing that remained a mystery to Charlie, and yet still held tremendous influence and sway over millions and millions of people throughout the world.  Not to pick on Christians here, as the Muslims and Jews (mainly the Abrahamic religions, for the most part it seemed) all had their religious orthodoxies, which held their Scripture to be divine revelation, to be interpreted literally and used as a reference guide to life itself, despite the fact that this Scripture which they held so dear was clearly interpreted, translated and compiled by authors who were definitely not the prophets in question.  It did not take too much research to find out that neither Moses, nor Jesus, nor Muhammad actually wrote anything down; they were presumably too busy living and teaching and reveling in the glory of the Creator.  These Evangelicals held that the word of their God, as translated from the original Hebrew or Greek or Arabic as the case may be, should be interpreted literally, and that the ‘subjective’ experience of mystics should be ignored because it has no basis in objective scientific truth: God reveals himself only to his chosen people.  That just didn’t seem to hold water to Charlie.  That premise seemed to lack the very rational foundations that it held so dear.

His mind rolled back to his senior year in college.  He and Jenry were roommates.  They lived in some shabby old house right by the local pub they used to go to all the time – Oliver’s, it was called.  The location was great, but the place was practically falling apart.  It was college though; you were supposed to live like that, apparently.  And yet in this setting, there was room for abstract thought, some exploration of the ideas and concepts that were being pressed into their formative minds.  And Charlie was doing enough reading, was exposed to enough of the basic principles that formed the basis of modern science, that his mind was able to see a hole in the framework.  A hole in the model as it stood.  And yet he didn’t understand why this hole wasn’t more obvious to everyone around him, or at the very least why it didn’t grab as much attention as he thought it should.

 

Physics as it stands today rests on the conceptual framework that physical reality is measurable and quantifiable and fundamentally “real”.  And this entire framework rests on the belief in, ultimate faith in, the predictive power of advanced mathematics to represent the physical world.  This branch of knowledge, and it is important to keep in mind that it is but one branch of knowledge, takes as given that the physical world is of three dimensions, dimensions represented by Cartesian (Euclidean) space that can be mapped in a basic x, y, and z coordinate system capable of representing the location of anything in physical space.  And in turn, that time is mapped over it as the fourth dimension and always moves linearly in one direction.  This model had lasted from the time of the Greeks until Einstein’s day, more than two thousand years.

But Einstein postulated, and later scientific experiments confirmed, that time and physical space were not only a function of the observer, that time and space in and of themselves were “relative” in fact: the faster you approached the speed of light, the more your notion of time and space diverged from that of an observer at rest.  But he also showed that, in order to build a more comprehensive model of physical reality, space and time needed to be fundamentally linked as conceptual constructs, and in fact, at the cosmic scale, spacetime was elastic, it “bent”.
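The quantitative heart of that claim is the Lorentz factor γ = 1/√(1 − v²/c²).  A small sketch, our own illustration rather than anything from the text, shows how sharply time dilation grows as v approaches c:

```python
import numpy as np

# The special-relativistic Lorentz factor gamma, which quantifies how much
# a moving clock runs slow relative to an observer at rest.
c = 299_792_458.0                          # speed of light, m/s

def lorentz_gamma(v):
    return 1.0 / np.sqrt(1.0 - (v / c) ** 2)

for frac in (0.1, 0.5, 0.9, 0.99, 0.999):
    g = lorentz_gamma(frac * c)
    print(f"v = {frac:>5.3f} c -> 1 s on board spans {g:.3f} s for the rest observer")
```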

Quantum Mechanics in turn showed that not only does the subatomic world operate according to very different and wholly irreconcilable laws from those of “classical physics”, but that the nature of physical reality is much more complex than perhaps we could ever imagine: the underlying physical structure of the universe, of all of the physical world in fact, behaves not only according to the principles of matter, or particles, but according to wave-like principles as well.  Hence the wave-particle duality paradox that sits at the heart of quantum reality and remains one of the great mysteries of science.  But Quantum Theory also shows us unequivocally that the idea of “measurement”, which sits at the very core of the philosophy of Western science, has limits, and that irrespective of the conundrum between classical physics and Quantum Theory, there remain interpretative questions about Quantum Theory itself that are still unanswered and that force us to incorporate philosophy, metaphysics, and our definition of knowledge and reality itself back into the discussion.

What Bohm searched for, where he branched from theoretical physics into the realm of metaphysics, left the reservation so to speak, was a unified order, which he was compelled to establish after concluding that the only rational explanation of Quantum Theory was a notion he called undivided wholeness.  He did not see a purely mathematical and theoretical answer to the seemingly irreconcilable differences between classical physics and quantum reality.  His premise seemed to be not only that a mathematical model incorporating both the principles inherent to Einstein’s General Relativity and those underlying Quantum Theory was impossible, but that in order to make sense of the “reality” of the models describing the two different domains, an adventure beyond physics was inevitable.

Bohm saw that the only path of reconciliation, as it were, lay outside the domain of physics and in the realm of metaphysics, where the notions of Mind and Intellect were an integral part of the process of experience, i.e. his holomovement concept, and were directly incorporated into the theoretical model.  He effectively concluded that any study of the nature of the physical universe led one, from a rational and empirical basis alone, to the notion of an underlying implicate and explicate order structure, in which various explicate orders are perceived as unfolded from an underlying coherent implicate order characterized by some level of undivided wholeness, a concept within which thought itself is an integral part.

The implication of an explicate and implicate order framework for reality, given that Bohmian Mechanics illustrates the possibility of non-local hidden variable theories to explain Quantum Mechanics, is that the existence of a supposed “unified field theory”, or a model of “quantum gravity” so sought after by physicists since Einstein, is highly unlikely.  To take the implications one step further, Bohm’s model of reality implies that mathematics as a model for describing reality, albeit powerful for describing various explicate orders such as Newtonian mechanics, Relativity (both the Special and General Theories) and Quantum Mechanics, is limited to explicate orders, and that in order to find a holistic model for describing all of reality, and in turn all explicate orders, one must look to the concepts of consciousness, integrated wholeness and interdependence, leaning on what appear to be very Eastern philosophical principles, fundamental principles, axioms as it were, in Vedanta and Buddhism.

Einstein’s belief in this “Unified Field Theory”, the existence of which essentially forms the basis of his criticism of Quantum Mechanics as incomplete, seemed not only improbable but perhaps even impossible, given the fundamental incompatibilities of the assumptions of the different theories and models.  In other words, the very idea of local realism and local determinism as a construct, core to Einstein’s theories of Special and General Relativity and of course to Newtonian mechanics, seemed at best limited to a certain domain of experience and at worst fundamentally flawed as an assumption about the basis of physical reality.  And this violation of the principle of local realism has been empirically demonstrated, at least at the quantum level, not only mathematically with the introduction of Bohmian Mechanics and the notion of Quantum Potential, but also subsequently in experiments showing the interdependence of particle properties in two systems separated by classical physical boundaries.
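The canonical modern demonstration of this is the Bell/CHSH test.  As a minimal numerical sketch, our own illustration using the textbook singlet-state correlation E(a,b) = −cos(a−b) rather than any experimental data, the CHSH combination exceeds the bound of 2 that any local hidden-variable theory must respect:

```python
import numpy as np

# CHSH test illustration.  For the quantum singlet state, the correlation
# between spin measurements along directions a and b is E(a, b) = -cos(a - b).
# Local hidden-variable theories bound the CHSH combination |S| <= 2;
# quantum mechanics reaches 2 * sqrt(2).
def E(a, b):
    return -np.cos(a - b)

a, a2 = 0.0, np.pi / 2                 # Alice's two measurement settings
b, b2 = np.pi / 4, 3 * np.pi / 4       # Bob's two measurement settings

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(f"CHSH |S| = {S:.4f} (classical bound 2, quantum max {2 * np.sqrt(2):.4f})")
```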

Charlie thought that this quest for a “unified field theory” was a bit of a fool’s errand, and that what we really should be focused on was a quest for a “unified knowledge theory”, making it explicit that some elements of metaphysics, theoretical constructs that could provide linking and overarching themes across all the branches of science, must be included in our models of reality, in order that all of our knowledge and all of our experience can be understood and comprehended within a fully coherent and consistent conceptual framework.

But in order to come up with a “unified knowledge theory”, you had to move beyond physics and modern science, and incorporate the science of mind and the act of perception itself into the overall framework.  Mathematical models of physical reality, whatever that was, appeared to only take you so far.  This is the essence of Bohm’s case for an implicate and explicate order framework for “reality”, branching away from the orthodox interpretation of Quantum Theory (the Copenhagen Interpretation), which held that Quantum Theory was not in fact a framework for reality as we know it, but simply a measuring tool that told us, approximately, how “stuff” behaves at the subatomic level.

Hence Charlie’s ultimate conclusion that this search for a “Unified Field Theory”, which drives the field of theoretical physics, and particle physics to a great extent, today, is fundamentally misguided.  String Theory and other abstract theoretical mathematical constructs represent the search for an answer to a metaphysical question using a tool that is wholly inadequate to the problem.  The use of mathematics to answer the metaphysical question of how the universe works, and of how Quantum Theory and General Relativity can be unified into a “unified field theory”, appeared to be a fruitless effort, akin to attempting to use a hammer and nails to build a skyscraper.

What we should be looking for, and what Bohm really provided us with, and in fact what Aristotle spoke to over two millennia ago, is a Unified Knowledge Theory, within which physics, metaphysics, biology, psychology, etc. can be viewed as branches of knowledge that complement each other to provide a complete picture of the world we live in; where these seemingly contradictory and separate domains can peacefully coexist and collectively give us a perspective on the nature of reality as a whole, as well as our place within it.

 

What Charlie thought he had fallen upon, something that seemed to go unnoticed in the modern era, the Age of Reason, the Age of Science, was that understanding starts and ends with language, the means by which we communicate ideas to one another and construct an understanding of, and are able to navigate through, the world around us.  Language in its most concrete form is reflected in the written word, as expressed in the various phonetic alphabets which were developed, invented, to express and codify language and to encapsulate and communicate more abstract concepts into “systems” of thought, systems that allowed us to express more complex ideas and to formulate models of reality.  Of course writing was most likely invented to record various forms of trade and economic transactions, but from a broader perspective it was then used to communicate knowledge itself, which in turn formed the basis of how we look at the world around us and how we perceive our place in this reality, and which at some level harkens back to those age-old questions that have plagued man since the dawn of history: who are we, and from whence did we come?

The Greek language, which in many respects shaped all the Western European languages that followed, came to define how we look at knowledge itself in all its various forms.  As far as Charlie could gather, this seemed to be Aristotle’s unique and lasting contribution to the West: it is from his branches of knowledge, his epistêmai, that the language of modern science as a whole is derived.  And out of this ancient Greek philosophical movement mathematics also originated, as one of the cornerstones of metaphysics.  These mathematical principles, even the principle of the One, were a core part of the Greek philosophical schools, as evidenced not only by Aristotle’s comprehensive discussion of these principles in his Physics and Metaphysics (even if he dismisses them as incoherent belief systems), but also more directly in the Pythagorean school, which was influential throughout Greece in the pre-Socratic era.  It is from this tradition that Euclid and Ptolemy come, and it is from these “scientists” that our modern reliance on mathematics as the ultimate expression of creation stems.

But Mathematics, as we have found, is a limited and constrained abstract tool, even if it is perhaps our most powerful abstract tool for modeling (physical) reality.  It is powerful, yes, but clearly not powerful enough to explain the totality of behavior of particles and bodies in both the subatomic world as outlined by Quantum Mechanics and the world of massive bodies which warp spacetime as described by Einstein’s theories of Special and General Relativity.  In order to construct a fully coherent descriptive model of all existence, it appeared that you needed more abstract symbols and more consistent, explicit assumptions about the grounding of existence, and the notion of perception, which encompasses the more scientific notions of observation and measurement, had to be built into the model somehow.

Mathematics, seen as the ultimate language for describing the physical world, a tenet solidified by Sir Isaac Newton and reinforced by the sheer beauty and elegance of the General and Special theories of relativity as espoused by Albert Einstein, in fact constrained us from seeing the limits of this language in describing the ultimate source of all things and the process by which the universe itself was created.  Einstein himself fell into this trap in his immovable belief that Quantum Theory was flawed, incomplete, failing to consider the possibility that some of the basic assumptions about the nature of physical reality needed to be at the very least relaxed, if not abandoned entirely, something he was unwilling even to consider, such was his conviction in the classical view of the world.

What Bohm searched for, and what ultimately led him out of physics proper and into metaphysics and philosophy, was some notion of unified order under which classical physics and quantum reality could both be explained.  He concluded that in order to explain the totality of even a purely physical reality, one had to formulate a theory of order that presumed some sort of hierarchical structure, where various explicate orders could manifest to explain a certain domain and yet at the same time be incorporated into a single model of reality, his implicate order.  He furthermore postulated that it was more accurate to look at reality as a process of unfoldment than as some hard and fast physical reality, as had been assumed by classical physics over the last few hundred years.  And therefore Bohm had to leave the world of physics proper and enter the realm of metaphysics, where the more abstract concepts of mind and the notion of perception itself could be, and had to be, incorporated into the model.

 

So Charlie came to what he thought was the logical conclusion that any unified knowledge theory must encompass levels of abstraction that go beyond mathematics, and yet at the same time must be constrained by language itself, which is the means through which we communicate ideas and thoughts to each other.  In other words, any unified knowledge theory, any comprehensive and coherent model of the world, must incorporate the act of perception directly into the model – this is the Mind of Anaxagoras, the Intellect of the Neo-Platonists.  But there was nothing less empirical, less scientific, than the science of the mind, i.e. psychology, right?

Interestingly enough, Carl Jung postulated that you could actually “prove” the existence of what he referred to as the collective unconscious, which as its name suggests represents an underlying universal mental framework from which individual psyches draw their source, or at least the source of what he called archetypes: universal archaic symbols and patterns, motifs and themes, that he found common in the psyches of a wide range of his patients, too common from his perspective to be the result of chance or happenstance.

Jung’s method of proof, as it were, was to establish the connection of the individual psyche with what he described as universal archetypal themes, thereby establishing the existence of some universal ground of human symbols from which individual symbology, the psyche, must draw.  He reckoned that the individuals who perceived these archetypes, themes and motifs, which he saw manifest in a wide variety of his patients in his psychoanalytic work, could have had no precursory knowledge of the existence of these archetypes, and that therefore the symbols themselves, the common mythical themes, must stem from a source that is present in some way in all human psyches and yet is not tied to the individual conscious mind as it were.

One of the examples Jung gives to illustrate the workings of this collective unconscious, one that he found in his psychoanalytic practice and that ultimately led to his “discovery”, concerns a vision that one of his patients supposedly had in his office one day.  The patient was somewhat delusional and had visions that he was a Christ-like figure, and one day in Jung’s office this patient claimed to see a phallic, tube-like structure coming down and out of the sun.  He pointed out the existence of this symbol/structure to Jung, believing firmly in its existence, but Jung saw nothing out of the ordinary.  Jung then proceeded to think very little of the event until, many years later, he read of an archeological discovery of a text describing a mystic ritual that involved the vision of a tube-like structure emanating from the sun.  Jung surmised that his patient could have had no knowledge of the description of this ancient ritual which corresponded so closely to his vision in Jung’s office years earlier (the text had not even been discovered at the time of the patient’s original vision), and that the vision must therefore be evidence of the existence of some common symbolic denominator that individuals can tap into, so to speak, one that at some level underlies the psyche of all human existence, i.e. the collective unconscious.

 

My thesis then, is as follows: in addition to our immediate consciousness, which is of a thoroughly personal nature and which we believe to be the only empirical psyche (even if we tack on the personal unconscious as an appendix), there exists a second psychic system of a collective, universal, and impersonal nature which is identical in all individuals.  This collective unconscious does not develop individually but is inherited. It consists of pre-existent forms, the archetypes, which can only become conscious secondarily and which give definite form to certain psychic contents.[2]

 

Out of his psychoanalytic work, then, emerges Jung’s theory of the collective unconscious, as evidenced by the existence of these universal archetypal themes, as well as his psychoanalytical healing technique, which he called individuation, which Jung used to guide the individual psyche to a better understanding of its connection to this collective, universal “unconscious” by means of what he called active imagination, a technique which as it turned out was heavily reliant on symbols, mandalas in particular, which of course play such a strong role in the meditative practices and rituals of the Eastern philosophical traditions.

The existence of these motifs (or mythemes when looked at within the context of mythology, which can be viewed as the expression of the collective unconscious of a society or civilization as a whole) across the boundaries of time and space, manifesting in the mind of man throughout the course of history, spoke to the existence of a collective unconscious from which these archetypal images or themes must emerge.  To Jung, consciousness and its counterpart the unconscious, the sum total of which made up the psyche of man, represented the very ground of reality.

 

When one reflects upon what consciousness really is, one is profoundly impressed by the extreme wonder of the fact that an event which takes place outside in the cosmos simultaneously produces an internal image, that it takes place, so to speak, inside as well, which is to say: becomes conscious.[3]

 

Now this was interesting.  You start with the concept of cultural borrowing, you search for something deeper, something richer that connects the ancient cultures.  You look into their mythology (and theology, because arguably the further back you go into ancient history, the less distinguishable a society’s mythology is from its theology), their cosmology and mythology in general, and you end up with some parallels but nothing concrete per se.  Then you look at mythology as a whole, and you end up, as both Jung and Campbell had done really, in the realm of psychology, which as it turns out is more or less where you end up if you follow modern physics to its limits as well.  That seemed strange.  And yet it seemed to point back to the idea that if you wanted to really understand the world, understand it even at the physical level, you had to establish a broader perspective than models that had a purely empirical and (physical) scientific basis.

 

This quest for ultimate knowledge, and order, is as old as mankind itself and is reflected in the cosmological traditions of all of the ancient civilizations – as evidenced by the Egyptian, Sumer/Babylonian, Greek and Judeo-Christian cosmologies which all attempt to lay down the structure of the world as we know it and how and why it came into existence – leaving aside the theological dogma whose only purpose was to serve the establishment of power and authority.

In much the same way as the ancients searched for a unified theory of order (the maat of the Egyptians, the Chronos of the Greek cosmological system, etc.), Plato, Aristotle, Euclid, Newton, Darwin, Kepler, Einstein, Planck and Bohm carried the torch of this quest for a unified metaphysical structure forward throughout the development of Western civilization, a quest which ultimately led to the branching of knowledge into the different sciences of today, upon which our modern society rests and relies to guide us through life.  In fact, Charlie’s premise was that modern man, in the Information Age where empirical reality is so baked into our Western minds, has as much blind faith in science today as orthodox religious zealots and believers have had in their God.

In prior eras, when mankind understood less about how things really worked, they could rely on religion, a grand creator God, as the underlying reason behind and explanation of how things worked and how things came to be.  This is the creation myth of Genesis, and it provides the rationale behind the cosmologies of all the ancient peoples, in the East and the West.  But we do not have that luxury today; we have science, and science has shown us things about the nature of reality that must be incorporated into our understanding of not only the physical world around us, but also its socio-biological foundations, as well as the integral role of mind, of consciousness, in forming the basis of how we “perceive” the world.

But none of these great thinkers, scientists, philosophers or sages who had so marked intellectual progress in the West over the centuries had access to the state of knowledge as it stands today, in the Information Age, where we as a species understand not only that mankind as a whole is some few hundred thousand years old, and was not crafted from the clay of the earth as the mythologies of the West would have us believe, but arose through the process of natural selection, evolution, engineered by our own genetic structure, a process which incorporated the role of chance (genetic mutation).  We came to be able to speak and communicate with each other, to form abstract concepts and thoughts into words and syllables that could be communicated from mind to mind, to cultivate the land and domesticate animals, and then, with the invention of writing and the spread of mankind throughout the world, we arrived at the point where we truly understand how connected and integrated we all are, not only as a species as a whole, but as an organism whose destiny is tied to the planet in a very real and tangible way.

Scholars and academics of today, and all those who are curious and have the time to explore the origins of mankind and how our belief systems have evolved since the dawn of civilization, have a much deeper and broader understanding of our species, which is in many respects characterized by our ability to speak, our ability to communicate with each other, and our ability to write down and develop complex and sophisticated models of thought, capabilities that have led to a profound understanding not only of how the physical universe has come to be, but also of how our minds have developed, and of the fundamental connection between the act of experience and our perception of the physical universe.

This is the logical conclusion that must be drawn when one takes a hard look at the sciences as they stand today, fields of knowledge based upon empirically verified and proven facts, facts which point to the inevitable conclusion that there exist intellectual boundaries and limits to science itself, and that a broader perspective must be adopted if we want to truly understand this world we live in, as well as how our place in it has evolved and/or should evolve moving forward.  A perspective that must integrate at some level the role of consciousness itself, upon which any understanding of anything, in fact, must be based.


[1] From Max Born’s Nobel Lecture; reference http://originoftheuniverse.wikia.com/wiki/Uncertainty_Principle.

[2] C. G. Jung, The Archetypes and the Collective Unconscious (London 1996) p. 43

[3] C. G. Jung, Basel Seminar, privately printed, 1934, p. i
