Possibly the first systematic attempt at a mathematical theory of quantum measurement was developed by John von Neumann.[1]
That theory rested in turn on two earlier developments: the theory of projection-valued measures for self-adjoint operators, which had recently been worked out by von Neumann and independently by Marshall Stone, and the Hilbert space formulation of quantum mechanics, which von Neumann attributed to Paul Dirac.
In this formulation, the state of a physical system corresponds to a vector of length 1 in a Hilbert space H over the complex numbers.
An observable is represented by a self-adjoint (i.e. Hermitian) operator A on H. If H is finite dimensional, then by the spectral theorem A has an orthonormal basis of eigenvectors, and the possible outcomes of a measurement of A are its eigenvalues.
Quantum mechanics, moreover, gives a recipe for computing a probability distribution Pr over the possible outcomes, given that the initial state of the system is ψ: the probability of obtaining a particular eigenvalue is the squared length of the projection of ψ onto the corresponding eigenspace (the Born rule).
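As a concrete illustration of this recipe in the finite-dimensional case, the following sketch (in Python with NumPy; the particular operator A and state ψ below are arbitrary choices made for illustration, not taken from the text) diagonalizes a self-adjoint matrix and assigns each eigenvalue its Born-rule probability.

```python
import numpy as np

# A minimal sketch of the Born rule in a finite-dimensional Hilbert space.
# The observable A and the state psi are arbitrary illustrative choices.
A = np.array([[1.0, 1.0 - 1.0j],
              [1.0 + 1.0j, -1.0]])          # a Hermitian (self-adjoint) operator
psi = np.array([1.0, 1.0j]) / np.sqrt(2.0)  # a unit vector (the system state)

# Spectral theorem: a Hermitian matrix has real eigenvalues and an
# orthonormal basis of eigenvectors.
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Born rule: Pr(outcome = lambda_i) = |<e_i, psi>|^2
probabilities = np.abs(eigenvectors.conj().T @ psi) ** 2

for lam, p in zip(eigenvalues, probabilities):
    print(f"outcome {lam:+.3f} with probability {p:.3f}")
print("probabilities sum to", probabilities.sum())   # ~1.0 for a unit vector
```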
In the two-dimensional case of a spin-1/2 particle, the state space can be geometrically represented as the surface of a sphere, known as the Bloch sphere.
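To make this correspondence explicit, here is a minimal sketch assuming the standard identification of a spin-1/2 state with the expectation values of the Pauli matrices (a convention not spelled out above); pure unit states land exactly on the surface of the unit sphere.

```python
import numpy as np

def bloch_coordinates(psi):
    """Map a normalized two-component state to Bloch-sphere coordinates.

    Uses the expectation values of the Pauli matrices, (x, y, z) = <sigma_i>,
    which lie on the unit sphere exactly when psi is a pure unit state.
    """
    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
    return tuple(np.real(psi.conj() @ (s @ psi)) for s in (sigma_x, sigma_y, sigma_z))

# Example: the state (|0> + |1>)/sqrt(2) sits on the equator at (1, 0, 0).
psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2.0)
x, y, z = bloch_coordinates(psi)
print(x, y, z, "radius:", np.sqrt(x**2 + y**2 + z**2))
```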
Von Neumann formulated the question of whether quantum indeterminacy could be explained by supplementing the quantum state with hidden variables, and provided an argument that the answer had to be no, if one accepted the formalism he was proposing. However, according to John Bell, von Neumann's formal proof did not justify his informal conclusion.
Quantum indeterminacy can also be illustrated by a particle with a definitely measured momentum, for which there must be a fundamental limit to how precisely its location can be specified; quantitatively, the standard deviations of position and momentum obey the Heisenberg relation σxσp ≥ ħ/2.
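The sketch below illustrates this limit numerically for a discretized Gaussian wave packet, working in units where ħ = 1; the grid size and packet width are arbitrary choices, and the Gaussian is the minimum-uncertainty case, so the product of spreads comes out close to ħ/2.

```python
import numpy as np

hbar = 1.0                                # work in units where hbar = 1
x = np.linspace(-40.0, 40.0, 4096)        # position grid (arbitrary size)
dx = x[1] - x[0]

# A Gaussian wave packet: the minimum-uncertainty case, so sigma_x * sigma_p ~ hbar/2.
width = 2.0
psi_x = np.exp(-x**2 / (4.0 * width**2))
psi_x /= np.sqrt(np.sum(np.abs(psi_x)**2) * dx)   # normalize in position space

def spread(values, density, step):
    """Standard deviation of a probability density sampled on a uniform grid."""
    mean = np.sum(values * density) * step
    return np.sqrt(np.sum((values - mean)**2 * density) * step)

sigma_x = spread(x, np.abs(psi_x)**2, dx)

# Momentum-space wave function via FFT (a discrete approximation of the Fourier transform).
p = 2.0 * np.pi * hbar * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
psi_p = np.fft.fftshift(np.fft.fft(psi_x)) * dx / np.sqrt(2.0 * np.pi * hbar)
dp = p[1] - p[0]
psi_p /= np.sqrt(np.sum(np.abs(psi_p)**2) * dp)   # renormalize in momentum space

sigma_p = spread(p, np.abs(psi_p)**2, dp)

print("sigma_x * sigma_p =", sigma_x * sigma_p, ">=", hbar / 2)
```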
Quantum indeterminacy is the assertion that the state of a system does not determine a unique collection of values for all its measurable properties.
The values of an observable will be obtained non-deterministically in accordance with a probability distribution that is uniquely determined by the system state.
For example, in the spin-1/2 example discussed above, the system can be prepared in the state ψ by using measurement of σ1 as a filter that retains only those particles for which σ1 yields +1. By the so-called von Neumann projection postulate, immediately after the measurement the system is assuredly in the state ψ.
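The following sketch simulates this preparation-and-measurement scenario, assuming the standard Pauli-matrix representation of σ1, σ2 and σ3 (the text above does not fix a representation): the prepared state returns +1 for σ1 with certainty, while measurements of σ2 or σ3 each yield ±1 with probability 1/2.

```python
import numpy as np

# Pauli matrices: the standard representation of sigma_1, sigma_2, sigma_3.
sigma = {
    "sigma_1": np.array([[0, 1], [1, 0]], dtype=complex),
    "sigma_2": np.array([[0, -1j], [1j, 0]], dtype=complex),
    "sigma_3": np.array([[1, 0], [0, -1]], dtype=complex),
}

# Preparation by filtering: keep the +1 eigenstate of sigma_1.
eigvals, eigvecs = np.linalg.eigh(sigma["sigma_1"])
psi = eigvecs[:, np.argmax(eigvals)]        # eigenvector with eigenvalue +1

# Outcome probabilities for each observable in the prepared state psi.
for name, op in sigma.items():
    vals, vecs = np.linalg.eigh(op)
    probs = np.abs(vecs.conj().T @ psi) ** 2
    print(name, {f"{v:+.0f}": round(p, 3) for v, p in zip(vals, probs)})
# sigma_1 yields +1 with certainty; sigma_2 and sigma_3 are 50/50.
```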
In fact, Albert Einstein, Boris Podolsky and Nathan Rosen showed that if quantum mechanics is correct, then the classical view of how the real world works (at least as that view stood after special relativity) is no longer tenable.
This view rested on two ideas: that a measurable property of a physical system whose value can be predicted with certainty is an element of reality, and that the effects of local actions propagate at a finite speed. This failure of the classical view was one of the conclusions of the EPR thought experiment, in which two remotely located observers, now commonly referred to as Alice and Bob, perform independent measurements of spin on a pair of electrons prepared at a source in a special state called a spin singlet state. Using the formalism of quantum mechanics, EPR concluded that once Alice measured spin in the x direction, Bob's measurement in the x direction was determined with certainty, whereas immediately before Alice's measurement Bob's outcome was only statistically determined. From this it follows that either the value of spin in the x direction is not an element of reality, or that the effect of Alice's measurement has an infinite speed of propagation.
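A minimal numerical sketch of this conclusion (using the standard singlet vector and Pauli matrices, conventions not given in the text above): each of Alice's x-spin outcomes occurs with probability 1/2, yet conditional on her result, Bob's x-spin is opposite with certainty, which is the perfect correlation the EPR argument turns on.

```python
import numpy as np

# Two-electron spin singlet state (|01> - |10>)/sqrt(2), written in the z basis
# with ordering |00>, |01>, |10>, |11> (Alice's spin first, Bob's second).
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2.0)

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
identity = np.eye(2, dtype=complex)

# Spectral decomposition of sigma_x gives Alice's two possible outcomes (+1, -1)
# and the corresponding projectors.
eigvals, eigvecs = np.linalg.eigh(sigma_x)

for value, vec in zip(eigvals, eigvecs.T):
    proj_alice = np.outer(vec, vec.conj())           # projector onto Alice's outcome
    P = np.kron(proj_alice, identity)
    prob = np.real(singlet.conj() @ (P @ singlet))   # each outcome occurs with prob 1/2
    post = (P @ singlet) / np.sqrt(prob)             # state right after Alice's result
    bob_x = np.real(post.conj() @ (np.kron(identity, sigma_x) @ post))
    print(f"Alice measures sigma_x = {value:+.0f} with probability {prob:.2f}; "
          f"Bob's <sigma_x> is then {bob_x:+.1f}")
```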
Quantum randomness is the statistical manifestation of that indeterminacy, which can be witnessed in the results of experiments repeated many times.
Logical independence, by contrast, refers to the null logical connectivity that exists between mathematical propositions (in the same language) that neither prove nor disprove one another.
In the work of Paterek et al., the researchers demonstrate a link connecting quantum randomness and logical independence in a formal system of Boolean propositions.[11]
In experiments measuring photon polarisation, Paterek et al. demonstrate statistics correlating predictable outcomes with logically dependent mathematical propositions, and random outcomes with propositions that are logically independent.
He showed how the indefiniteness of quantum indeterminacy arises in evolved density operators representing mixed states, where the measurement process encounters an irreversible 'lost history' and an ingression of ambiguity.