Decoding methods

There have been many common methods of mapping messages to codewords. These are often used to recover messages sent over a noisy channel, such as a binary symmetric channel.

One may be given the received vector x ∈ F₂ⁿ; ideal observer decoding then generates the codeword y ∈ C that maximises the conditional probability P(y sent | x received). For example, a person can choose the codeword y that is most likely to be received as the message x after transmission.

There may be more than one codeword with an equal likelihood of mutating into the received message. In such a case, the sender and receiver(s) must agree ahead of time on a decoding convention.

Popular conventions include:

1. Request that the codeword be resent (automatic repeat-request).
2. Choose any random codeword from the set of nearest codewords.
3. If another code follows, mark the ambiguous bits of the codeword as erasures and hope that the outer code disambiguates them.

Given a received vector x ∈ F₂ⁿ, maximum likelihood decoding picks a codeword y ∈ C that maximises P(x received | y sent), the probability that x was received given that y was sent.

If all codewords are equally likely to be sent then this scheme is equivalent to ideal observer decoding, since by Bayes' theorem P(y sent | x received) = P(x received | y sent) · P(y sent) / P(x received), and the factor P(y sent) / P(x received) is the same for every codeword when the prior is uniform.

The maximum likelihood decoding algorithm is an instance of the "marginalize a product function" problem which is solved by applying the generalized distributive law.[1]
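As an illustration, here is a minimal brute-force sketch of maximum likelihood decoding over a binary symmetric channel (the function names, code, and parameters below are illustrative choices, not from the text). On a BSC with crossover probability p, a codeword y at Hamming distance d from the received word x has likelihood p^d (1 − p)^(n − d):

```python
# Hypothetical toy example: brute-force ML decoding over a binary
# symmetric channel with crossover probability p. The likelihood
# P(x received | y sent) = p^d * (1 - p)^(n - d) depends on x and y
# only through their Hamming distance d.
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def ml_decode_bsc(received, codebook, p):
    """Return the codeword maximizing P(received | codeword sent)."""
    n = len(received)
    def likelihood(y):
        d = hamming_distance(received, y)
        return (p ** d) * ((1 - p) ** (n - d))
    return max(codebook, key=likelihood)

# A 3-repetition code: two codewords.
codebook = [(0, 0, 0), (1, 1, 1)]
print(ml_decode_bsc((1, 0, 1), codebook, p=0.1))  # -> (1, 1, 1)
```

Brute force over the codebook is only feasible for tiny codes; practical decoders exploit code structure instead.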

Given a received vector x ∈ F₂ⁿ, minimum distance decoding picks a codeword y ∈ C to minimise the Hamming distance d(x, y): i.e. choose the codeword y that is as close as possible to x.

Note that if the probability of error p on a discrete memoryless channel is strictly less than one half, then minimum distance decoding is equivalent to maximum likelihood decoding: a codeword at Hamming distance d from the received word has likelihood p^d (1 − p)^(n − d), which is a decreasing function of d when p < 1/2, so minimising the distance maximises the likelihood.
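A small sketch of this equivalence, using an arbitrary toy code (the codebook and p below are illustrative assumptions): a brute-force minimum distance decoder and a brute-force ML decoder pick the same codeword when p < 1/2.

```python
# Sketch: minimum distance decoding agrees with maximum likelihood
# decoding on a BSC with p < 1/2, because the likelihood
# p^d (1 - p)^(n - d) is strictly decreasing in the distance d.
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def min_distance_decode(received, codebook):
    return min(codebook, key=lambda y: hamming_distance(received, y))

def ml_decode_bsc(received, codebook, p):
    n = len(received)
    def likelihood(y):
        d = hamming_distance(received, y)
        return (p ** d) * ((1 - p) ** (n - d))
    return max(codebook, key=likelihood)

# Illustrative 4-word code of length 5.
codebook = [(0,0,0,0,0), (1,1,1,0,0), (0,0,1,1,1), (1,1,0,1,1)]
received = (1, 0, 1, 0, 0)
assert min_distance_decode(received, codebook) == \
       ml_decode_bsc(received, codebook, p=0.1)
```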

Minimum distance decoding is a reasonable decoding method when the following conditions are met:

1. The probability p that an error occurs is independent of the position of the symbol.
2. Errors are independent events: an error at one position in the message does not affect other positions.

These assumptions may be reasonable for transmissions over a binary symmetric channel.

They may be unreasonable for other media, such as a DVD, where a single scratch on the disk can cause an error in many neighbouring symbols or codewords.

Syndrome decoding is a highly efficient method of decoding a linear code over a noisy channel, i.e. one on which errors are made.

A code with minimum Hamming distance d is capable of correcting up to t = ⌊(d − 1)/2⌋ errors made by the channel (since if no more than t errors are made then minimum distance decoding will still correctly decode the incorrectly transmitted codeword).
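The bound can be checked by brute force on a small example (the code below is an arbitrary illustrative choice): compute the minimum distance d over all codeword pairs, set t = ⌊(d − 1)/2⌋, and verify that every pattern of at most t bit flips is corrected by nearest-codeword decoding.

```python
# Illustrative check of t = floor((d - 1) / 2): compute the minimum
# distance d of a small code by brute force, then verify that flipping
# up to t bits of any codeword is always corrected.
from itertools import combinations

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

# Assumed example code of length 5 with 4 codewords.
codebook = [(0,0,0,0,0), (1,1,1,0,0), (0,0,1,1,1), (1,1,0,1,1)]
d = min(hamming_distance(a, b) for a, b in combinations(codebook, 2))
t = (d - 1) // 2
print(d, t)  # -> 3 1

for y in codebook:
    for w in range(1, t + 1):
        for positions in combinations(range(len(y)), w):
            corrupted = list(y)
            for i in positions:
                corrupted[i] ^= 1  # flip w bits
            decoded = min(codebook,
                          key=lambda c: hamming_distance(corrupted, c))
            assert tuple(decoded) == y  # still decodes correctly
```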

Ordinary minimum distance decoding would look up the received vector x in a table of size |C| for the nearest match, i.e. an element (not necessarily unique) y ∈ C minimising the Hamming distance d(x, y).

Syndrome decoding takes advantage of the property of the parity-check matrix H that Hy^T = 0 for all y ∈ C. The syndrome of the received vector x = y + e, where y is the sent codeword and e is the error pattern, is defined to be Hx^T = H(y + e)^T = He^T; it depends only on the error pattern. To perform ML decoding in a binary symmetric channel, one has to look up a precomputed table of size 2^(n − k), mapping the syndrome He^T to the most likely error pattern e.

Note that this is already of significantly less complexity than that of a standard array decoding.
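As a concrete sketch, assume the [7,4] Hamming code with the common parity-check matrix whose i-th column is the binary expansion of i (this specific code and matrix are an assumption for illustration). The table has one entry per correctable single-bit error pattern, and decoding is one syndrome computation plus one lookup:

```python
# Sketch of syndrome decoding for an assumed [7,4] Hamming code.
# H is 3 x 7; its i-th column (1-indexed) is i written in binary,
# so the syndrome of a single-bit error at position i is just i.
H = [[(i >> b) & 1 for i in range(1, 8)] for b in (2, 1, 0)]

def syndrome(v):
    # He^T over GF(2)
    return tuple(sum(h * x for h, x in zip(row, v)) % 2 for row in H)

# Precompute: syndrome -> weight-1 error pattern.
table = {}
for i in range(7):
    e = [0] * 7
    e[i] = 1
    table[syndrome(e)] = e

def syndrome_decode(received):
    s = syndrome(received)
    if s == (0, 0, 0):            # zero syndrome: already a codeword
        return list(received)
    e = table[s]                  # look up the error pattern
    return [(r + ei) % 2 for r, ei in zip(received, e)]

c = [0, 0, 1, 0, 1, 1, 0]         # a codeword: Hc^T = 0
r = c.copy()
r[4] ^= 1                         # single-bit channel error
assert syndrome_decode(r) == c    # error corrected by one lookup
```

The lookup table here has 7 entries rather than the 2⁷ rows of a standard array, matching the complexity point made above.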

Under the additional assumption that no more than t errors were made during transmission, the receiver can look up the value He^T in a further reduced table of size C(n, 0) + C(n, 1) + ⋯ + C(n, t), the number of error patterns of weight at most t.

Information set decoding is a family of Las Vegas probabilistic methods, all based on the observation that it is easier to guess enough error-free positions than it is to guess all the error positions.

In the simplest form, one selects k positions of the received word at random, where k is the dimension of the code. If the selected positions contained no errors, and hence equalled the corresponding positions of the sent codeword, then we may decode, since k error-free coordinates determine the codeword.

If t errors occurred, the probability of such a fortunate selection of columns is given by C(n − t, k) / C(n, k), where n is the code length.
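A quick numeric sketch of this probability, with arbitrary illustrative parameters; the closed form C(n − t, k)/C(n, k) can be cross-checked against the step-by-step product of avoiding the t error positions on each of the k draws:

```python
# The k guessed positions are all error-free iff they avoid the
# t error positions; the parameters below are illustrative only.
from math import comb

n, k, t = 20, 10, 2
p_success = comb(n - t, k) / comb(n, k)

# Equivalent product form: draw positions one at a time without
# replacement, avoiding the t bad positions each time.
p_product = 1.0
for j in range(k):
    p_product *= (n - t - j) / (n - j)
assert abs(p_success - p_product) < 1e-12

# Expected number of Las Vegas iterations is 1 / p_success.
print(p_success, 1 / p_success)
```

When p_success is small, the expected iteration count 1/p_success is what makes the improvements cited below worthwhile.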

This method has been improved in various ways, e.g. by Stern[4] and by Canteaut and Sendrier.[5]

Partial response maximum likelihood (PRML) is a method for converting the weak analog signal from the head of a magnetic disk or tape drive into a digital signal.

A Viterbi decoder uses the Viterbi algorithm for decoding a bitstream that has been encoded using forward error correction based on a convolutional code.

The Hamming distance is used as a metric for hard decision Viterbi decoders.

The squared Euclidean distance is used as a metric for soft decision decoders.
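A minimal hard-decision Viterbi sketch for the common rate-1/2, constraint-length-3 convolutional code with generators 7 and 5 in octal (the code choice, message, and error position are illustrative assumptions); branch metrics are Hamming distances as described above. Real decoders use traceback with windowed survivors rather than storing full paths.

```python
def encode(bits):
    # Rate-1/2 convolutional encoder, generators 111 and 101 (7, 5 octal).
    s1 = s2 = 0                    # two memory bits
    out = []
    for u in bits:
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out

def viterbi_decode(received, n_steps):
    # Hard-decision Viterbi: one survivor per state (s1, s2),
    # path metric = accumulated Hamming distance to the received bits.
    metrics = {(0, 0): (0, [])}    # encoder starts in the zero state
    for t in range(n_steps):
        r = received[2 * t : 2 * t + 2]
        new = {}
        for (s1, s2), (m, path) in metrics.items():
            for u in (0, 1):
                expected = [u ^ s1 ^ s2, u ^ s2]
                bm = sum(a != b for a, b in zip(expected, r))
                ns = (u, s1)
                if ns not in new or m + bm < new[ns][0]:
                    new[ns] = (m + bm, path + [u])
        metrics = new
    return min(metrics.values(), key=lambda v: v[0])[1]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
tx = encode(msg + [0, 0])          # two tail bits terminate the trellis
rx = tx.copy()
rx[5] ^= 1                         # one channel bit error
assert viterbi_decode(rx, len(msg) + 2)[:len(msg)] == msg
```

A soft-decision variant would replace the Hamming branch metric with squared Euclidean distance between the received analog samples and the expected symbols.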

An optimal decision decoding algorithm (ODDA) has also been proposed for an asymmetric two-way relay channel (TWRC) system.