History of randomness

Beyond religion and games of chance, randomness has been used for sortition since at least ancient Athenian democracy, in the form of the kleroterion.

The early part of the twentieth century saw a rapid growth in the formal analysis of randomness, and mathematical foundations for probability were introduced, leading to its axiomatization by Andrey Kolmogorov in 1933.

Over 3,000 years ago, problems concerning the tossing of several coins were considered in the I Ching, one of the oldest Chinese mathematical texts, which probably dates to 1150 BC.

The two principal elements, yin and yang, were combined in the I Ching in various forms to produce heads-and-tails permutations of the type HH, TH, HT, and so on.
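
Such an enumeration is easy to reproduce; the following minimal Python sketch (illustrative only, and not part of the historical source; the function name coin_permutations is ours) lists every ordered heads-and-tails outcome for a given number of coins:

    from itertools import product

    def coin_permutations(n):
        # List every ordered heads/tails outcome for n coin tosses.
        return ["".join(outcome) for outcome in product("HT", repeat=n)]

    # Two coins yield the four permutations of the kind described above.
    print(coin_permutations(2))  # ['HH', 'HT', 'TH', 'TT']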

Deborah J. Bennett suggests that ordinary people face an inherent difficulty in understanding randomness, even though the concept is often taken to be obvious and self-evident.

She cites studies by Kahneman and Tversky; these concluded that statistical principles are not learned from everyday experience because people do not attend to the detail necessary to gain such knowledge.

Around 400 BC, Democritus presented a view of the world as governed by unambiguous laws of order, and considered randomness a subjective concept that arises only from the human inability to understand the nature of events.

The Chinese analyzed the cracks in turtle shells, while the Germans, who according to Tacitus had the highest regard for lots and omens, utilized strips of bark.

Over the centuries, many Christian scholars wrestled with the conflict between the belief in free will and its implied randomness, and the idea that God knows everything that happens.

Saints Augustine and Aquinas tried to reach an accommodation between foreknowledge and free will, but Martin Luther argued against randomness and took the position that God's omniscience renders human actions unavoidable and determined.

In his 1565 Liber de Ludo Aleae (a gambler's manual published after his death), Gerolamo Cardano wrote one of the first formal tracts to analyze the odds of winning at various games.

Around 1620, Galileo wrote a paper called On a discovery concerning dice that used an early probabilistic model to address specific questions.

The first known suggestion for viewing randomness in terms of complexity was made by Leibniz in an obscure 17th-century document discovered after his death.

While the mathematical elite was making progress in understanding randomness from the 17th to the 19th century, the public at large continued to rely on practices such as fortune telling in the hope of taming chance.

[29] "I know of scarcely anything so apt to impress the imagination as the wonderful form of cosmic order expressed by the "Law of Frequency of Error."

Whenever a large sample of chaotic elements are taken in hand and marshalled in the order of their magnitude, an unsuspected and most beautiful form of regularity proves to have been latent all along.

The tops of the marshalled row form a flowing curve of invariable proportions; and each element, as it is sorted into place, finds, as it were, a pre-ordained niche, accurately adapted to fit it."

By the end of the 19th century, Newton's model of a mechanical universe was fading away as the statistical view of the collision of molecules in gases was studied by Maxwell and Boltzmann.

During the 20th century, the five main interpretations of probability theory (classical, logical, frequency, propensity, and subjective) became better understood, and were discussed, compared, and contrasted.

In 1900 Louis Bachelier applied Brownian motion to evaluate stock options, effectively launching the fields of financial mathematics and stochastic processes.

Richard von Mises's frequency approach led him to suggest a definition of randomness that was later refined and made mathematically rigorous by Alonzo Church, using computable functions, in 1940.

The advent of quantum mechanics in the early 20th century and the formulation of the Heisenberg uncertainty principle in 1927 marked the end of the Newtonian mindset among physicists regarding the determinacy of nature.

Given that the frequency approach cannot deal with "a single toss" of a coin, and can only address large ensembles or collectives, single-case probabilities were instead treated as propensities or chances.

Martingales for the study of chance and betting strategies were introduced by Paul Lévy in the 1930s and were formalized by Joseph L. Doob in the 1950s.

In 1964, Benoît Mandelbrot suggested that most statistical models approached only a first stage of dealing with indeterminism, and that they ignored many aspects of real-world turbulence.

Despite mathematical advances, reliance on other methods of dealing with chance, such as fortune telling and astrology, continued into the 20th century.

White House Chief of Staff Donald Regan criticized the involvement of astrologer Joan Quigley in decisions made during Ronald Reagan's presidency in the 1980s.

Ancient fresco of dice players in Pompeii
Depiction of the Roman goddess Fortuna, who determined fate, by Hans Beham, 1541
Hotei, the deity of fortune, observing a cockfight in a 16th-century Japanese print
The Fortune Teller by Vouet, 1617
Antony Gormley's Quantum Cloud sculpture in London was designed by a computer using a random walk algorithm.
Café Central, one of the early meeting places of the Vienna Circle