Wave function collapse

In quantum mechanics, wave function collapse, also called reduction of the state vector,[1] occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world.

Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.

Historically, Werner Heisenberg was the first to use the idea of wave function reduction to explain quantum measurement.

Writing \(|\phi_i\rangle\) for an eigenstate of the observable and \(\lambda_i\) for the corresponding observed value, any arbitrary state of the quantum system can be expressed as a vector using bra–ket notation:

\[|\psi\rangle = \sum_i c_i |\phi_i\rangle,\]

where the complex coefficients \(c_i\) give the contribution of each eigenstate.

Wave functions can therefore always be expressed in terms of eigenstates of an observable, though the converse is not necessarily true.

To account for the experimental result that repeated measurements of a quantum system give the same results, the theory postulates a "collapse" or "reduction of the state vector" upon observation,[5]: 566  abruptly converting an arbitrary state into a single-component eigenstate of the observable:

\[|\psi\rangle = \sum_i c_i |\phi_i\rangle \;\rightarrow\; |\phi_i\rangle,\]

where the arrow represents a measurement of the observable, yielding the eigenvalue \(\lambda_i\) associated with the eigenstate \(|\phi_i\rangle\).

The probability of collapsing to a given eigenstate \(|\phi_i\rangle\) is the Born probability \(P_i = |c_i|^2\), and the sum of the probabilities over all possible outcomes must be one:[7]

\[\sum_i |c_i|^2 = 1.\]

As an example, individual counts in a double-slit experiment with electrons appear at random locations on the detector; after many counts are summed, the distribution shows a wave interference pattern.
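This statistical behavior can be sketched in a few lines of Python. The amplitudes below are hypothetical, chosen only so that the probabilities sum to one; the point is that each simulated measurement collapses to a single random outcome, while repeated measurements of identically prepared states reproduce the \(|c_i|^2\) statistics:

```python
import random

# A hypothetical two-state superposition |psi> = c0|phi0> + c1|phi1>.
# These amplitudes are illustrative, not taken from any experiment.
c = [3 / 5, (4 / 5) * 1j]  # complex amplitudes

# Born probabilities P_i = |c_i|^2
probs = [abs(ci) ** 2 for ci in c]
assert abs(sum(probs) - 1.0) < 1e-12  # normalization: sum_i |c_i|^2 = 1

def measure(probabilities):
    """Simulate one projective measurement: collapse to eigenstate index i
    with probability |c_i|^2."""
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probabilities):
        cumulative += p
        if r < cumulative:
            return i
    return len(probabilities) - 1

# Many measurements of identically prepared systems reproduce the
# Born-rule statistics (here roughly 36% outcome 0, 64% outcome 1).
random.seed(0)
counts = [0, 0]
for _ in range(10000):
    counts[measure(probs)] += 1
```

Each call to `measure` returns a single definite outcome, mirroring the individual random detector counts in the double-slit experiment; only the accumulated `counts` reveal the underlying distribution.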

This statistical aspect of quantum measurements differs fundamentally from classical mechanics.

To predict measurement outcomes from quantum solutions, the orthodox interpretation of quantum theory postulates wave function collapse and uses the Born rule to compute the probable outcomes.[10] Despite the widespread quantitative success of these postulates, scientists remain dissatisfied and have sought more detailed physical models.[11]: 127  Quantum theory offers no dynamical description of the "collapse" of the wave function.[12]

Various interpretations of quantum mechanics attempt to provide a physical model for collapse. Results from tests of Bell's theorem show that any hidden variables invoked in such models would need to be non-local.

While models in all groups have contributed to a better understanding of quantum theory, no alternative explanation for individual events has emerged as more useful than collapse followed by statistical prediction with the Born rule.[citation needed]

Quantum decoherence explains why a system interacting with an environment transitions from a pure state, exhibiting superpositions, to a mixed state, an incoherent combination of classical alternatives.[14] This transition is fundamentally reversible, as the combined state of system and environment is still pure, but it is for all practical purposes irreversible in the same sense as the second law of thermodynamics: the environment is a very large and complex quantum system, and it is not feasible to reverse their interaction.
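The pure-to-mixed transition can be sketched numerically. In this minimal example, one qubit plays the "system" and a second qubit stands in for the environment (a deliberately tiny stand-in for a real, macroscopic one); after they entangle into a Bell state, tracing out the environment leaves the system in a mixed state with no off-diagonal coherences, even though the combined state remains pure:

```python
import math

# Combined system+environment state: the Bell state (|00> + |11>)/sqrt(2),
# written over the basis |00>, |01>, |10>, |11> (system qubit first).
amp = 1 / math.sqrt(2)
psi = [amp, 0.0, 0.0, amp]

# Density matrix of the combined state, rho = |psi><psi| (still pure).
rho = [[psi[i] * psi[j] for j in range(4)] for i in range(4)]

def partial_trace_env(rho_full):
    """Trace out the environment qubit:
    rho_sys[a][b] = sum_e rho_full[2a+e][2b+e]."""
    rho_sys = [[0.0, 0.0], [0.0, 0.0]]
    for a in range(2):
        for b in range(2):
            for e in range(2):
                rho_sys[a][b] += rho_full[2 * a + e][2 * b + e]
    return rho_sys

rho_sys = partial_trace_env(rho)
# rho_sys == [[0.5, 0.0], [0.0, 0.5]]: the coherences are gone, leaving an
# incoherent 50/50 mixture of the two classical alternatives.

# Purity tr(rho^2): 1 for a pure state, < 1 for a mixed one.
purity_full = sum(rho[i][j] * rho[j][i] for i in range(4) for j in range(4))
purity_sys = sum(rho_sys[i][j] * rho_sys[j][i] for i in range(2) for j in range(2))
```

Note that both classical alternatives survive in `rho_sys` with equal weight; nothing in this calculation selects one of them, which is exactly why decoherence alone does not account for collapse.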

Decoherence is thus very important for explaining the classical limit of quantum mechanics, but it cannot explain wave function collapse, as all classical alternatives are still present in the mixed state and wave function collapse selects only one of them.[15][16][14]

The form of decoherence known as environment-induced superselection proposes that when a quantum system interacts with the environment, the superpositions apparently reduce to mixtures of classical alternatives. The combined wave function of the system and environment continues to obey the Schrödinger equation throughout this apparent collapse.[17] More importantly, this is not enough to explain actual wave function collapse, as decoherence does not reduce the mixed state to a single eigenstate.[15][14]

The concept of wave function collapse was introduced by Werner Heisenberg in his 1927 paper on the uncertainty principle, "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik", and incorporated into the mathematical formulation of quantum mechanics by John von Neumann in his 1932 treatise Mathematische Grundlagen der Quantenmechanik.[18]

Niels Bohr never mentions wave function collapse in his published work, but he repeatedly cautioned that we must give up a "pictorial representation".

Despite the differences between Bohr and Heisenberg, their views are often grouped together as the "Copenhagen interpretation", of which wave function collapse is regarded as a key feature.[19]

John von Neumann's influential 1932 work Mathematical Foundations of Quantum Mechanics took a more formal approach, developing an "ideal" measurement scheme[20][21]: 1270  that postulated two processes of wave function change:

1. the discontinuous, probabilistic change brought about by observation and measurement (collapse), and
2. the deterministic, continuous time evolution of an isolated system governed by the Schrödinger equation.

In 1957 Hugh Everett III proposed a model of quantum mechanics that dropped von Neumann's first postulate.
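The contrast between the two processes can be shown in a toy single-qubit calculation; the particular states and unitary below are illustrative choices, not tied to any specific measurement setup:

```python
import math

def norm(v):
    """Euclidean norm of a state vector."""
    return math.sqrt(sum(abs(x) ** 2 for x in v))

# Process 2: continuous, deterministic unitary evolution |psi> -> U|psi>.
# Here U is the Hadamard transform; any unitary preserves the norm.
h = 1 / math.sqrt(2)
U = [[h, h], [h, -h]]
psi = [1.0, 0.0]  # start in the eigenstate |0>
psi = [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]
# psi is now the superposition (|0> + |1>)/sqrt(2); its norm is still 1.

# Process 1: discontinuous collapse. A measurement that finds outcome 0
# projects onto |0> and renormalizes, discarding the other component.
projected = [psi[0], 0.0]
collapsed = [x / norm(projected) for x in projected]
# collapsed is [1.0, 0.0]: a single eigenstate again.
```

Process 2 is invertible (applying `U` again undoes the Hadamard), whereas process 1 irreversibly discards the amplitude on the unobserved alternative; dropping process 1, as Everett did, is what forces the "splitting" picture described below.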

He proposed that the discontinuous change is instead a splitting of a wave function representing the universe.[21]: 1288  While Everett's approach rekindled interest in foundational quantum mechanics, it left core issues unresolved.[21]: 1290 [20]: 5

Beginning in 1970, H. Dieter Zeh sought a detailed quantum decoherence model for the discontinuous change without postulating collapse.

Further work by Wojciech H. Zurek in 1980 led eventually to a large number of papers on many aspects of the concept.[21]: 1273  Decoherence has been shown to work very quickly and within a minimal environment, but it has not yet succeeded in providing a detailed model to replace the collapse postulate of orthodox quantum mechanics.

Von Neumann's projection postulate was conceived based on experimental evidence available during the 1930s, in particular Compton scattering.