A binary symmetric channel (or $\text{BSC}_p$) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver receives a bit. The transmitted bit is "flipped" with a "crossover probability" of $p$, and otherwise is received correctly.
This model can be applied to varied communication channels such as telephone lines or disk drive storage.
The noisy-channel coding theorem applies to $\text{BSC}_p$, saying that information can be transmitted at any rate up to the channel capacity with arbitrarily low error.
A binary symmetric channel with crossover probability $p$, denoted $\text{BSC}_p$, is a channel with binary input and binary output and probability of error $p$. That is, if $X$ is the transmitted random variable and $Y$ the received variable, then the channel is characterized by the conditional probabilities:[1]

$$\Pr[Y = 0 \mid X = 0] = 1 - p$$
$$\Pr[Y = 1 \mid X = 0] = p$$
$$\Pr[Y = 0 \mid X = 1] = p$$
$$\Pr[Y = 1 \mid X = 1] = 1 - p$$

It is assumed that $0 \le p \le 1/2$. If $p > 1/2$, then the receiver can swap the output (interpret 1 when it sees 0, and vice versa) and obtain an equivalent channel with crossover probability $1 - p \le 1/2$.
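The channel law above can be simulated directly. The following minimal Python sketch (the function name `bsc_transmit` is illustrative, not from any reference implementation) flips each transmitted bit independently with probability $p$:

```python
import random

def bsc_transmit(bits, p, rng=random):
    """Send bits through a BSC with crossover probability p.

    Each bit is flipped independently with probability p and is
    received correctly with probability 1 - p.
    """
    return [b ^ (rng.random() < p) for b in bits]

# Example: ten bits through a BSC with p = 0.1.
random.seed(0)
sent = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
received = bsc_transmit(sent, 0.1)
print(sent)
print(received)
```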
The channel capacity of the binary symmetric channel, in bits, is:

$$C_{\text{BSC}} = 1 - H_b(p),$$

where $H_b(p)$ is the binary entropy function, defined by:[2]

$$H_b(p) = p \log_2 \frac{1}{p} + (1 - p) \log_2 \frac{1}{1 - p}.$$

The capacity equals the maximum of the mutual information $I(X;Y)$ over all input distributions. The mutual information can be reformulated as

$$I(X;Y) = H(Y) - H(Y \mid X) = H(Y) - \sum_{x} p_X(x)\, H(Y \mid X = x) = H(Y) - H_b(p),$$

where the first and second steps follow from the definition of mutual information and of conditional entropy, respectively. The entropy at the output for a given, fixed input symbol, $H(Y \mid X = x)$, equals the binary entropy function $H_b(p)$, which gives the final expression.
In the final expression, only the first term $H(Y)$ depends on the input distribution. The entropy of a binary variable is at most 1 bit, and equality is attained if its probability distribution is uniform. It remains to show that a uniform output distribution can be achieved: it is a property of any binary symmetric channel that a uniform probability distribution of the input results in a uniform probability distribution of the output. Hence the maximum mutual information, and thus the capacity, is $1 - H_b(p)$.
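To make the formula concrete, here is a short Python sketch (the names `binary_entropy` and `bsc_capacity` are illustrative, not from the cited sources) that evaluates $H_b(p)$ and the resulting capacity:

```python
import math

def binary_entropy(p):
    """H_b(p) = p*log2(1/p) + (1 - p)*log2(1/(1 - p)), with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

def bsc_capacity(p):
    """Capacity of BSC_p in bits per channel use: C = 1 - H_b(p)."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p:<4}: C = {bsc_capacity(p):.4f} bits per use")
```

As expected, the capacity is 1 bit per channel use for a noiseless channel ($p = 0$) and falls to 0 at $p = 1/2$, where the output is statistically independent of the input.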
Shannon's noisy-channel coding theorem gives a result about the rate of information that can be transmitted through a communication channel with arbitrarily low error.
Applied to $\text{BSC}_p$, it states that a message chosen from $\{0,1\}^k$, encoded by an encoding function $E : \{0,1\}^k \to \{0,1\}^n$, and sent across the noisy channel can be recovered by decoding with a very high probability, provided $k/n < 1 - H_b(p + \epsilon)$ for some $\epsilon > 0$; in effect, the rate of the channel is bounded above by the quantity stated in the theorem.
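For example, with $p = 0.1$ and $\epsilon = 0.01$, the bound guarantees vanishing error probability at any rate $k/n < 1 - H_b(0.11) \approx 0.50$ bits per channel use, just below the capacity $1 - H_b(0.1) \approx 0.53$.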
The theorem can be proved directly with a probabilistic method. Consider an encoding function $E : \{0,1\}^k \to \{0,1\}^n$ that is selected at random, so that each codeword $E(m)$ is a uniformly random $n$-bit string, and let the decoding function $D : \{0,1\}^n \to \{0,1\}^k$ perform maximum likelihood decoding: given a received word $y$, it outputs the message $m$ minimizing the Hamming distance $\Delta(y, E(m))$. One then shows that a random $E$ satisfies the conclusion of the theorem, by integration over the probabilities.

Suppose $p$ and $\epsilon$ are fixed. First we show that, for a fixed message $m \in \{0,1\}^k$ and $E$ chosen randomly, the probability of failure over the $\text{BSC}_p$ noise is exponentially small in $n$. At this point, the proof works for a fixed message $m$; next we extend this result to work for all messages $m$. We achieve this by eliminating half of the codewords from the code with the argument that the proof for the decoding error probability holds for at least half of the codewords. The latter method is called expurgation. This gives the total process the name random coding with expurgation.

For a fixed $m$, decoding can fail only if the received word $y$ lies outside the Hamming ball of radius $(p + \epsilon)n$ around $E(m)$, or if the codeword $E(m')$ of some other message $m' \neq m$ lies inside the Hamming ball of radius $(p + \epsilon)n$ around $y$. By the Chernoff bound, the probability of the first event is exponentially small in $n$. For the second event, using approximation to estimate the number of codewords in the Hamming ball, we have

$$|B(y, (p + \epsilon)n)| \le 2^{H_b(p + \epsilon)\, n},$$

so for each fixed $m'$ the probability that the uniformly random codeword $E(m')$ lands in this ball is at most $2^{H_b(p + \epsilon) n} / 2^n$, and a union bound over the $2^k - 1$ other messages keeps the total exponentially small whenever $k/n < 1 - H_b(p + \epsilon)$. Now taking expectation on both sides we have, by appropriately choosing the value of $k$,

$$\mathbb{E}_E\!\left[\Pr_{\text{noise}}\big[D(E(m) + \text{noise}) \neq m\big]\right] \le 2^{-\delta n}$$

for some constant $\delta > 0$, so in particular some fixed encoding function $E$ achieves this bound on the error probability averaged over messages. Now by applying Markov's inequality, we can show that the decoding error probability for at least half of the messages is at most $2 \cdot 2^{-\delta n}$; discarding the worse half of the codewords (the expurgation step) yields a code of essentially the same rate in which every message is decoded correctly except with exponentially small probability. Without this step, only the average error probability is controlled, and the error probability for the worst message could remain large, a case we would like to avoid to keep the decoding error probability exponentially small.
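The random-coding argument can also be checked empirically at small block lengths. The following toy Python sketch (parameter values and function names are illustrative only) draws a random code, transmits over a simulated $\text{BSC}_p$, decodes by minimum Hamming distance, and estimates the decoding error probability at a fixed rate below capacity:

```python
import random

def hamming(a, b):
    """Hamming distance between two equal-length bit lists."""
    return sum(x != y for x, y in zip(a, b))

def random_code(k, n, rng):
    """Random encoding: each of the 2^k messages gets a uniform n-bit codeword."""
    return [[rng.randint(0, 1) for _ in range(n)] for _ in range(2 ** k)]

def ml_decode(y, code):
    """Maximum likelihood decoding over a BSC with p < 1/2: nearest codeword."""
    return min(range(len(code)), key=lambda m: hamming(y, code[m]))

def error_rate(k, n, p, trials, rng):
    """Monte Carlo estimate of the decoding error probability."""
    errors = 0
    for _ in range(trials):
        code = random_code(k, n, rng)
        m = rng.randrange(2 ** k)
        # Pass the codeword through BSC_p: flip each bit with probability p.
        y = [b ^ (rng.random() < p) for b in code[m]]
        errors += ml_decode(y, code) != m
    return errors / trials

rng = random.Random(0)
p = 0.1  # rate k/n = 1/4 is well below the capacity 1 - H_b(0.1) ~ 0.53
for k, n in ((2, 8), (4, 16), (6, 24)):
    print(f"k = {k}, n = {n:2}: error rate ~ {error_rate(k, n, p, 1000, rng):.3f}")
```

At the fixed rate $k/n = 1/4$, the estimated error probability should shrink as $n$ grows, consistent with the exponential bound above.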
Recently, much work has been done, and is ongoing, to design explicit error-correcting codes that achieve the capacities of several standard communication channels.
The approach behind the design of codes which meet the channel capacity of the BSC has been to correct a smaller number of errors with a high probability, and to achieve the highest possible rate. Shannon's theorem gives the best rate achievable over a $\text{BSC}_p$, but it does not give us an idea of any explicit codes which achieve that rate.
In fact such codes are typically constructed to correct only a small fraction of errors with a high probability, but achieve a very good rate.
This, when expressed in asymptotic terms, gives an error probability that is exponentially small in the block length, of the form $2^{-\Omega(n)}$.
LDPC codes have been considered for this purpose for their faster decoding time.[4]
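For contrast with such capacity-approaching constructions, the simplest explicit code for the BSC is the repetition code, which trades rate for reliability. A minimal Python sketch (illustrative only; a repetition code is far from capacity-achieving) decodes by majority vote:

```python
import random

def repeat_encode(bits, r):
    """Repetition code of rate 1/r: send each bit r times."""
    return [b for b in bits for _ in range(r)]

def repeat_decode(received, r):
    """Majority vote over each block of r received bits (r odd)."""
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

rng = random.Random(1)
p, r = 0.1, 5
message = [rng.randint(0, 1) for _ in range(1000)]
# Transmit the encoded stream through BSC_p.
noisy = [b ^ (rng.random() < p) for b in repeat_encode(message, r)]
decoded = repeat_decode(noisy, r)
bit_errors = sum(a != b for a, b in zip(message, decoded))
print(f"rate = 1/{r}, residual bit error rate ~ {bit_errors / len(message):.4f}")
```

Majority voting drives the residual bit error rate well below $p$ (about $0.0086$ in expectation for $p = 0.1$, $r = 5$), but only at rate $1/5$; the point of the constructions discussed above is to obtain small error probability at rates close to $1 - H_b(p)$.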
The binary symmetric channel can model a disk drive used for memory storage: the channel input represents a bit being written to the disk and the output corresponds to the bit later being read.
Conversely, being able to transmit effectively over the BSC can give rise to solutions for more complicated channels.