The need for contextuality was discussed informally in 1935 by Grete Hermann,[13] but it was more than 30 years later that Simon B. Kochen and Ernst Specker, and separately John Bell, constructed proofs that any realistic hidden-variable theory able to explain the phenomenology of quantum mechanics is contextual for systems of Hilbert space dimension three and greater.
The Kochen–Specker theorem proves that realistic noncontextual hidden-variable theories cannot reproduce the empirical predictions of quantum mechanics.
In addition, Kochen and Specker constructed an explicitly noncontextual hidden-variable model for the two-dimensional qubit case in their paper on the subject,[1] thereby completing the characterisation of the dimensionality of quantum systems that can demonstrate contextual behaviour.[2]
The sheaf-theoretic, or Abramsky–Brandenburger, approach to contextuality, initiated by Samson Abramsky and Adam Brandenburger, is theory-independent and can be applied beyond quantum theory to any situation in which empirical data arises in contexts.
Adán Cabello, Simone Severini, and Andreas Winter introduced a general graph-theoretic framework for studying contextuality of different physical theories.[5]
In the CbD approach,[20][21][22] developed by Ehtibar Dzhafarov, Janne Kujala, and colleagues, (non)contextuality is treated as a property of any system of random variables, defined as a set of random variables each labeled by its content (the property it measures) and its context (the set of conditions under which it is recorded, including which other variables it is measured together with). Variables within a context are jointly distributed, whereas variables in different contexts are stochastically unrelated. If the random variables measuring the same content in different contexts are always identically distributed, the system is called consistently connected (satisfying the "no-disturbance" or "no-signaling" principle).
Except for certain logical issues,[7][21] in this case CbD specializes to traditional treatments of contextuality in quantum physics.
CbD essentially coincides with the probabilistic part of Abramsky's sheaf-theoretic approach if the system is strongly consistently connected, which means that the joint distributions of the variables measuring any given set of contents coincide across all contexts in which those contents are jointly measured.[27]
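In the standard CbD notation, where R_q^c denotes the random variable measuring content q in context c (a paraphrase added here for concreteness rather than a quotation from the cited papers), these two conditions read:
\[ R_q^{c} \stackrel{d}{=} R_q^{c'} \quad \text{(consistent connectedness)}, \]
\[ \bigl(R_{q_1}^{c},\ldots,R_{q_n}^{c}\bigr) \stackrel{d}{=} \bigl(R_{q_1}^{c'},\ldots,R_{q_n}^{c'}\bigr) \quad \text{(strong consistent connectedness)}, \]
whenever the contents involved are measured in both contexts c and c', with \(\stackrel{d}{=}\) denoting equality in distribution.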
In particular, Víctor Cervantes, Ehtibar Dzhafarov, and colleagues have demonstrated that random variables describing certain paradigms of simple decision making form contextual systems,[28][29][30] whereas many other decision-making systems are noncontextual once their inconsistent connectedness is properly taken into account.[27]
An extended notion of contextuality due to Robert Spekkens applies to preparations and transformations as well as to measurements, within a general framework of operational physical theories.[31]
With respect to measurements, it removes the assumption of determinism of value assignments that is present in standard definitions of contextuality.
This breaks the interpretation of nonlocality as a special case of contextuality, and does not treat irreducible randomness as nonclassical.
This was further explored by Simmons et al.,[32] who demonstrated that other notions of contextuality could also be motivated by Leibnizian principles and could be thought of as tools enabling ontological conclusions from operational statistics.
However, such a number does not define a full probability distribution, i.e. a set of values over mutually exclusive events that sum to 1.
Starting now from extracontextuality as a postulate, the fact that certainty can be transferred between contexts, and is then associated with a given projector, is the very basis of the hypotheses of Gleason's theorem, and thus of Born's rule.
[37][38] Also, associating a state vector with an extravalence class clarifies its status as a mathematical tool to calculate probabilities connecting modalities, which correspond to the actual observed physical events or results.
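For reference, the standard statement being appealed to here is Gleason's theorem: in Hilbert space dimension three or greater, any assignment of probabilities to projectors that is additive over mutually orthogonal projectors and assigns 1 to the identity must take the Born form
\[ p(\Pi) = \operatorname{Tr}(\rho\,\Pi) \]
for some density operator \(\rho\). The hypothesis that a probability attaches to the projector itself, independently of the measurement context containing it, is exactly what the extracontextual transfer of certainty described above is meant to secure.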
Examples explored to date rely on additional memory constraints which have a more computational than foundational motivation.[40]
The Kochen–Specker theorem proves that quantum mechanics is incompatible with realistic noncontextual hidden-variable models.
On the other hand, Bell's theorem proves that quantum mechanics is incompatible with factorisable hidden-variable models in an experiment in which measurements are performed at distinct spacelike-separated locations.
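In the bipartite case, factorisability is the standard Bell-locality condition that, conditional on the hidden variable, the outcome probabilities at the two sites are independent:
\[ p(a, b \mid x, y) = \int \rho(\lambda)\, p_A(a \mid x, \lambda)\, p_B(b \mid y, \lambda)\, d\lambda , \]
where x and y label the local measurement choices, a and b the outcomes, and \(\rho(\lambda)\) is a probability distribution over the hidden variable \(\lambda\).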
Arthur Fine showed that in the experimental scenario in which the famous CHSH inequalities and proof of nonlocality apply, a factorisable hidden-variable model exists if and only if a noncontextual hidden-variable model exists.[8]
This equivalence was proven to hold more generally in any experimental scenario by Samson Abramsky and Adam Brandenburger.
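For concreteness, the CHSH scenario involves two ±1-valued measurement choices per party; writing \(E(a_i, b_j)\) for the expectation of the product of outcomes when settings \(a_i\) and \(b_j\) are used, every noncontextual (equivalently, by Fine's result, every factorisable) hidden-variable model satisfies
\[ \bigl| E(a_0, b_0) + E(a_0, b_1) + E(a_1, b_0) - E(a_1, b_1) \bigr| \le 2 , \]
whereas quantum mechanics can reach \(2\sqrt{2}\) (Tsirelson's bound).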
It has also been proved that CF(e) is an upper bound on the extent to which e violates any normalised noncontextuality inequality.
Moreover, the linear program dual to the one that maximises λ computes a noncontextuality inequality for which this violation is attained.
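To make the linear-programming formulation concrete, here is a minimal sketch for the standard (2,2,2) Bell–CHSH scenario; the choice of empirical model, the helper function e, and the use of scipy.optimize.linprog are illustrative assumptions, not taken from the cited work.

```python
# Minimal sketch: the primal LP behind the (non)contextual fraction,
# for the (2,2,2) Bell-CHSH scenario. Illustrative only.
import itertools
import numpy as np
from scipy.optimize import linprog

# Contexts are the four setting pairs (x, y); a global (noncontextual)
# assignment fixes outcomes for (a0, a1, b0, b1).
contexts = list(itertools.product([0, 1], [0, 1]))
global_assignments = list(itertools.product([0, 1], repeat=4))

# Illustrative empirical model: quantum correlations saturating Tsirelson's
# bound, p(a = b | x, y) = cos^2(pi/8) except in context (1, 1), where the
# outcomes anticorrelate with the same probability.
q = np.cos(np.pi / 8) ** 2
def e(x, y, a, b):
    agree = q if (x, y) != (1, 1) else 1 - q
    return agree / 2 if a == b else (1 - agree) / 2

# Incidence matrix: one row per (context, joint outcome), one column per
# global assignment; entry 1 iff the assignment restricts to that outcome.
rows = [(x, y, a, b) for (x, y) in contexts for a in (0, 1) for b in (0, 1)]
M = np.array([[1.0 if (g[x] == a and g[2 + y] == b) else 0.0
               for g in global_assignments] for (x, y, a, b) in rows])
v = np.array([e(x, y, a, b) for (x, y, a, b) in rows])

# Noncontextual fraction: maximise the total weight lambda of a sub-probability
# mixture of global assignments dominated by the empirical model:
#   maximise sum(b)  subject to  M b <= v,  b >= 0.
res = linprog(c=-np.ones(len(global_assignments)), A_ub=M, b_ub=v, bounds=(0, None))
ncf = -res.fun
print(f"noncontextual fraction ~ {ncf:.4f}, contextual fraction ~ {1 - ncf:.4f}")
```

On this model the printed contextual fraction should come out as √2 − 1 ≈ 0.4142, matching the maximal normalised violation of the CHSH inequality mentioned above.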
One of them, denoted CNT3, uses the notion of a quasi-coupling, which differs from a coupling in that the probabilities in the joint distribution of its values are replaced with arbitrary reals (allowed to be negative but summing to 1).[45]
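Schematically (an illustrative restatement, not a formal definition from the cited work): a quasi-coupling assigns a real weight \(w(s)\) to each pattern s of joint values of all the variables, with
\[ \sum_s w(s) = 1, \qquad w(s) \in \mathbb{R} \ \text{(possibly negative)}, \]
subject to the requirement that, within each context, these weights marginalise to the observed joint distribution of that context's variables.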
Extensions to the qubit case have been investigated by Juani Bermejo-Vega et al.[41] This line of research builds on earlier work by Ernesto Galvão,[40] which showed that Wigner function negativity is necessary for a state to be "magic"; it later emerged that Wigner negativity and contextuality are in a sense equivalent notions of nonclassicality.[47]
In 2009, Janet Anders and Dan Browne showed that two specific examples of nonlocality and contextuality were sufficient to compute a non-linear function.
In 2013, Robert Raussendorf showed more generally that access to strongly contextual measurement statistics is necessary and sufficient for an l2-MBQC to compute a non-linear function.
He also showed that computing non-linear Boolean functions with sufficiently high probability requires contextuality.[47]
A further generalization and refinement of these results due to Samson Abramsky, Rui Soares Barbosa and Shane Mansfield appeared in 2017, proving a precise quantifiable relationship between the probability of successfully computing any given non-linear function and the degree of contextuality present in the l2-MBQC, as measured by the contextual fraction.
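Schematically, and as a hedged paraphrase rather than a verbatim statement of that result, the relationship takes the form
\[ 1 - \bar{p}_S \;\ge\; \mathrm{NCF}(e)\,\nu(f), \]
where \(\bar{p}_S\) is the average probability of successfully computing the non-linear function f, \(\mathrm{NCF}(e) = 1 - \mathrm{CF}(e)\) is the noncontextual fraction of the empirical model e realised by the measurements, and \(\nu(f)\) measures how far f is from the nearest \(\mathbb{Z}_2\)-linear function (its average distance under the input distribution).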