In the case of equal probabilities for the transitions, probabilistic Turing machines can be defined as deterministic Turing machines having an additional "write" instruction where the value of the write is uniformly distributed in the Turing machine's alphabet (generally, an equal likelihood of writing a "1" or a "0" onto the tape).
Another common formulation equips a deterministic Turing machine with two transition functions; at each step, the Turing machine probabilistically applies one of the two.
In this way, the process of selecting a transition function at each step of the computation resembles a coin flip.
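A minimal Python sketch of this two-transition-function view (the tables delta1 and delta2 and the toy machine below are assumed purely for illustration, not taken from any standard formulation):

```python
import random

# Illustrative sketch: one step of a probabilistic Turing machine given as a
# deterministic machine with two transition functions chosen by a fair coin.
# Each transition function maps (state, symbol) -> (new_state, write, move).

def step(state, tape, head, delta1, delta2):
    symbol = tape.get(head, '0')                          # blank cells read as '0'
    delta = delta1 if random.random() < 0.5 else delta2   # the "coin flip"
    new_state, write, move = delta[(state, symbol)]
    tape[head] = write
    return new_state, tape, head + (1 if move == 'R' else -1)

# Toy machine: from state 'q0' reading '0', delta1 writes '0' and delta2
# writes '1', so the machine writes a uniformly random bit and moves right,
# matching the "random write instruction" definition above.
delta1 = {('q0', '0'): ('q0', '0', 'R')}
delta2 = {('q0', '0'): ('q0', '1', 'R')}
state, tape, head = 'q0', {}, 0
for _ in range(8):
    state, tape, head = step(state, tape, head, delta1, delta2)
print(''.join(tape[i] for i in range(8)))  # e.g. 01101001
```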
One notion of acceptance that gives rise to several important complexity classes allows for an error probability of 1/3; for example, the class BPP consists of the languages decidable in polynomial time by a probabilistic Turing machine that gives the correct answer with probability at least 2/3 on every input.
Another class defined using this notion of acceptance is BPL, which is the same as BPP but places the additional restriction that languages must be solvable in logarithmic space.
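The particular bound of 1/3 is not essential: any error probability strictly below 1/2 can be driven down by running the machine several times and taking a majority vote. A hedged Python sketch of this amplification, where noisy_decide is a hypothetical stand-in for any such bounded-error decider:

```python
import random
from collections import Counter

def noisy_decide(x, correct_answer=True, error=1/3):
    # Hypothetical bounded-error decider: correct with probability 2/3.
    return correct_answer if random.random() > error else not correct_answer

def amplified_decide(x, trials=51):
    # Run the decider independently many times and take the majority vote;
    # by a Chernoff bound, the probability of a wrong majority falls
    # exponentially in the number of trials.
    votes = Counter(noisy_decide(x) for _ in range(trials))
    return votes[True] > votes[False]

print(amplified_decide("some input"))  # almost always the correct answer (True)
```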
Complexity classes arising from other definitions of acceptance include RP, co-RP, and ZPP.
If the machine is restricted to logarithmic space instead of polynomial time, the analogous RL, co-RL, and ZPL complexity classes are obtained.
Enforcing both restrictions yields RLP, co-RLP, BPLP, and ZPLP.
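These classes differ in the kind of error they tolerate. RP, for instance, allows only one-sided error: a machine for a language in RP never accepts a string outside the language, and accepts a string in the language with probability at least 1/2. A hedged Python sketch of why one-sided error amplifies so easily, with rp_decide as a hypothetical stand-in for such a machine:

```python
import random

def rp_decide(x, in_language=True):
    # Hypothetical one-sided-error decider in the style of RP: never answers
    # "yes" on inputs outside the language, and answers "yes" with probability
    # at least 1/2 on inputs in the language.
    return in_language and random.random() < 0.5

def amplified_rp_decide(x, trials=40):
    # Any single "yes" is trustworthy, so repetition shrinks the chance of a
    # false "no" to 2**-trials without introducing any false "yes" answers.
    return any(rp_decide(x) for _ in range(trials))

print(amplified_rp_decide("some input"))  # True except with probability 2**-40
```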
Probabilistic computation is also critical for the definition of most classes of interactive proof systems, in which the verifier machine depends on randomness to avoid being predicted and tricked by the all-powerful prover machine.
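As an illustration of the verifier's use of private randomness, here is a hedged Python sketch of one round of the well-known interactive proof for graph non-isomorphism (an assumed example chosen for concreteness, not drawn from the text above): the verifier secretly picks one of two graphs, relabels its vertices at random, and asks the prover which graph the challenge came from; if the two graphs were in fact isomorphic, even an all-powerful prover could only guess the hidden coin and would be caught with probability 1/2 per round.

```python
import random
from itertools import permutations

def relabel(edges, n, perm=None):
    # Apply a (random) vertex relabelling to an undirected edge list.
    if perm is None:
        perm = list(range(n))
        random.shuffle(perm)
    return frozenset(frozenset((perm[u], perm[v])) for u, v in edges)

def honest_prover(g0, g1, n, challenge):
    # All-powerful prover: brute-force whether the challenge is a relabelling of g0.
    for perm in permutations(range(n)):
        if relabel(g0, n, perm) == challenge:
            return 0
    return 1

def verifier_round(g0, g1, n, prover):
    secret = random.randrange(2)                   # the verifier's private coin
    challenge = relabel([g0, g1][secret], n)
    return prover(g0, g1, n, challenge) == secret  # accept iff the prover identifies it

# Toy run: a triangle and a path on 3 vertices are not isomorphic, so the
# honest prover answers every challenge correctly and the verifier accepts.
triangle = [(0, 1), (1, 2), (2, 0)]
path = [(0, 1), (1, 2)]
print(all(verifier_round(triangle, path, 3, honest_prover) for _ in range(20)))  # True
```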
One of the central questions of complexity theory is whether randomness adds power; that is, is there a problem that can be solved in polynomial time by a probabilistic Turing machine but not a deterministic Turing machine?
It is widely conjectured that it does not, i.e., that deterministic Turing machines can simulate probabilistic ones with at most a polynomial slowdown, which would imply P = BPP. On the other hand, the power randomness gives to interactive proof systems, as well as the simple algorithms it creates for difficult problems such as polynomial-time primality testing and log-space graph connectedness testing, suggests that randomness may add power.
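One of the simple randomized algorithms alluded to here is the Miller-Rabin primality test (named here for concreteness; the text above refers only to polynomial-time primality testing in general): each randomly chosen base detects a composite number with probability at least 3/4, so a handful of independent rounds gives a polynomial-time test with negligible error. A Python sketch:

```python
import random

def is_probably_prime(n, rounds=20):
    # Miller-Rabin primality test: a composite n passes any single round
    # with probability at most 1/4, so 'rounds' trials leave error <= 4**-rounds.
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:          # write n - 1 as d * 2**r with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False       # the base a witnesses that n is composite
    return True                # probably prime

print(is_probably_prime(2**61 - 1), is_probably_prime(2**61 + 1))  # True False
```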