Reference Material: A Primer on Determinism by John Earman (1986)
My second entry extends the question asked in my first post to Quantum Mechanics
(QM). Is QM deterministic? Unlike for classical mechanics (CM), an adequate
response here demands consideration of the various interpretations of QM.
Relative to CM, QM is much less well-understood, and the different approaches
to making sense of it affect the status of determinism within its structure.
I’ll choose three of the well-known interpretations and attempt to evaluate
determinism within each. All interpretations share some set of theoretical
material. What distinguishes the approaches, in my understanding, can be
satisfactorily described by how each purports to resolve the measurement
problem (see reference material and/or additional links). Let’s jump right in.
First, there is a theoretical core that is common to all the interpretations of QM.
Some of this core is quite mathematical (e.g. operators in Hilbert space), but
the more physical aspect is basically that quantum states evolve according to
the Schrödinger equation. As far as my understanding goes, it is relatively
uncontroversial to claim that time evolution according to the Schrödinger
equation is deterministic. So the basic core of the theory is deterministic,
and each interpretation will then build around this core to either preserve or
violate this determinism.
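For concreteness, the deterministic evolution in question is the time-dependent Schrödinger equation, which governs how the state of a system changes under its Hamiltonian:

```latex
% Time-dependent Schrödinger equation: the state |psi(t)> evolves under the Hamiltonian H
i\hbar \frac{\partial}{\partial t} \lvert \psi(t) \rangle = \hat{H} \lvert \psi(t) \rangle
```

Because the equation is first-order in time, fixing the state at one moment fixes it at all other moments, which is precisely the sense of determinism at issue here.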
Perhaps the most well-known interpretation of QM is the so-called Copenhagen
interpretation. It is actually a family of views that diverge in important
ways, and a one paragraph summary certainly doesn’t do it justice, but let’s
crudely summarize anyhow. According to this view, quantum mechanics (in
particular, the wave function) is a tool for determining the probabilities of
certain experimental outcomes. The process of measurement, in some mysterious
manner, causes the system under study to suddenly assume one of a number of
different possible states, a process known as the “collapse of the wave
function”. The exact result that one observes in any particular experiment
cannot be predicted, only described probabilistically. There seem to be
antirealist and pragmatist themes to this view – don’t worry too much about
fundamental ontology, just use the theory for what it’s fantastically good at,
and infer nothing more. I think it’s clear that this view leaves QM decidedly
indeterministic. We can describe the probability distribution that obtains
among a set of possible experimental outcomes, but the result of each
individual experiment is fundamentally random.
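The probabilistic description mentioned above is codified in the Born rule: for a system in state |ψ⟩, the probability of obtaining the measurement outcome associated with eigenstate |a⟩ is

```latex
% Born rule: probability of obtaining outcome a when measuring a system in state psi
P(a) = \lvert \langle a \mid \psi \rangle \rvert^{2}
```

On the Copenhagen view, this probability is all the theory offers; nothing in the formalism determines which outcome actually occurs on a given run.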
We move from a kind of antirealism to a kind of realism. The “many-worlds”
interpretation regards the wave function as objectively real, and the
theoretical core of QM as in some sense complete. Systems evolve only in
accordance with the Schrödinger equation. It follows that the wave function
never actually collapses, despite what we seem to observe; instead, every possible
outcome of a given quantum experiment is somehow realized. The apparent
incongruence between this claim and our experience is resolved by a radical assertion. When a
quantum experiment is performed, the world “branches” into many parallel worlds.
Each parallel world exhibits a possible outcome of the experiment such that all
possible outcomes are realized. Within the many-worlds framework, there are
ways (too involved to delve into here) of trying to explain the apparent
probabilistic nature of quantum experiments that we observe. The question
vis-à-vis determinism is if these explanations succeed in banishing the
indeterminism of, say, the Copenhagen interpretation. My empiricist reaction is
to say that the apparently probabilistic nature of the outcomes of quantum
experiments in this world, the one we
observe, is the explanandum. If your explanation involves positing countless
other worlds that are, almost by construction, unobservable, rendering personal
identity unintelligible along the way, then perhaps it’s not much of an
explanation after all. It’s unclear to me whether many-worlds gives us a
deterministic QM.
The last interpretation I’ll mention is Bohmian mechanics. This interpretation of
QM differs from the previous two in two important respects. First, the
classical notion of definite position is restored. In addition to the wave
function, a complete description of the state of a system also includes the
definite positions of the constituent particles. Second, mathematical structure
is added to the common QM core. The wave function evolves according to the
Schrödinger equation, but the particle positions evolve via the “guiding
equation”. The time evolution of a system is completely deterministic. The wave
function describes the randomness (arising via ignorance) in initial particle
positions, and this randomness is deterministically evolved through the
experiment to yield the apparently random outcomes we observe. I understand
Bohmian mechanics to be completely deterministic, or at least as deterministic
as CM.
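As a sketch, the added structure is the guiding equation, which determines each particle’s velocity from the wave function (here for N particles, with Q_k and m_k the position and mass of the k-th particle):

```latex
% Guiding equation: particle velocities are fixed by the wave function psi,
% evaluated at the actual configuration (Q_1, ..., Q_N)
\frac{dQ_k}{dt} = \frac{\hbar}{m_k}\,
  \mathrm{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)(Q_1, \ldots, Q_N)
```

Given the initial wave function and the initial particle configuration, both evolve uniquely, so the only randomness left over is our ignorance of the initial positions.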
So where does this leave us? I come back to the conclusion that I attempted to
articulate at the end of my first post. Let’s imagine a world “governed”
entirely by QM at the most fundamental level. Is such a world deterministic? It
depends on which flavor of QM, but let’s take the Copenhagen interpretation for
argument’s sake. The state of a system (a single particle, the universe, etc.)
as described by its wave function evolves deterministically according to the
Schrödinger equation. Whenever a quantum experiment occurs (a double-slit
experiment is performed, a uranium atom undergoes alpha decay, etc.), the
system evolves in a purely random way, albeit in a way we can describe with the
language of probability. The overall picture of the world that results is one
in which events unfold in two distinct ways: deterministically or in a purely
random, acausal manner. In my understanding, this is the picture of the world that
CM and QM are giving us.
Additional Links: