Forward algorithm

The forward algorithm, in the context of a hidden Markov model (HMM), is used to calculate a 'belief state': the probability of a state at a certain time, given the history of evidence.

The forward and backward algorithms should be placed within the broader context of probability: they are simply names given to a set of standard mathematical procedures that appear in several fields.

For example, neither "forward algorithm" nor "Viterbi" appears in the Cambridge encyclopedia of mathematics.

The main observation to take away from these algorithms is how to organize Bayesian updates and inference to be computationally efficient in the context of directed graphs of variables (see sum-product networks).

The backward algorithm complements the forward algorithm by taking future evidence into account when one wants to improve the estimates for past times (smoothing).

To obtain the most likely state sequence, the Viterbi algorithm is required instead.

The goal of the forward algorithm is to compute the joint probability p(x_t, y_{1:t}), where for notational convenience we abbreviate x(t) as x_t and (y(1), ..., y(t)) as y_{1:t}.

Both the hidden state x_t and the observation y_t are assumed to be discrete, finite random variables.

The hidden Markov model's state transition probabilities p(x_t | x_{t-1}) and emission probabilities p(y_t | x_t) are assumed to be known. Computing p(x_t, y_{1:t}) naively would require marginalizing over all possible state sequences {x_{1:t-1}}, the number of which grows exponentially with t.

Instead, the forward algorithm takes advantage of the conditional independence rules of the hidden Markov model (HMM) to perform the calculation recursively.

Defining the forward probability alpha_t(x_t) = p(x_t, y_{1:t}), the recursion is

    alpha_t(x_t) = p(y_t | x_t) * sum_{x_{t-1}} p(x_t | x_{t-1}) * alpha_{t-1}(x_{t-1}).

Since both p(y_t | x_t) and p(x_t | x_{t-1}) are given by the model's emission distributions and transition probabilities, which are assumed to be known, one can quickly calculate alpha_t(x_t) from alpha_{t-1}(x_{t-1}) and so avoid incurring exponential computation time.

The recursion formula given above can be written in a more compact form: stacking the values alpha_t(x_t) into a vector alpha_t, the update is alpha_t = O_t A^T alpha_{t-1}, where A is the state transition matrix with entries A_{ij} = p(x_t = j | x_{t-1} = i) and O_t is the diagonal matrix with the emission probabilities p(y_t | x_t) on its diagonal.

The initial condition is set in accordance with the prior probability over x_1, i.e. alpha_1(x_1) = p(x_1, y_1) = p(y_1 | x_1) p(x_1).
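As a minimal sketch, the initial condition and recursion can be implemented as follows; the function and parameter names here are illustrative, not from any particular library:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Compute alpha[t][s] = p(state at time t is s, and observations y_1..y_t).

    start_p[s]    -- prior p(x_1 = s)
    trans_p[r][s] -- transition probability p(x_t = s | x_{t-1} = r)
    emit_p[s][y]  -- emission probability p(y | x = s)
    """
    # Initial condition: alpha_1(x) = p(y_1 | x) * p(x)
    alpha = [{s: emit_p[s][obs[0]] * start_p[s] for s in states}]
    # Recursion: alpha_t(x) = p(y_t | x) * sum_{x'} p(x | x') * alpha_{t-1}(x')
    for y in obs[1:]:
        prev = alpha[-1]
        alpha.append({
            s: emit_p[s][y] * sum(trans_p[r][s] * prev[r] for r in states)
            for s in states
        })
    return alpha
```

Each step reuses only the previous step's values, which is what makes the computation linear in the sequence length.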

Once alpha_t(x_t) = p(x_t, y_{1:t}) has been computed using the forward algorithm, we can easily obtain the related conditional probability by normalizing:

    p(x_t | y_{1:t}) = alpha_t(x_t) / sum_{x_t} alpha_t(x_t).

Once the conditional probability has been calculated, we can also find the point estimate of x_t, for example the maximum a posteriori estimate given by argmax_{x_t} p(x_t | y_{1:t}).

The forward algorithm is easily modified to account for observations from variants of the hidden Markov model as well, such as the Markov jump linear system.
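Assuming the forward probabilities for a single time step are available as a dictionary mapping each state to alpha_t(x_t), the normalization and the point estimate can be sketched as follows (function names are hypothetical):

```python
def filtered_posterior(alpha_t):
    """Normalize forward probabilities: p(x_t | y_1..t) = alpha_t(x) / sum_x alpha_t(x)."""
    total = sum(alpha_t.values())
    return {s: a / total for s, a in alpha_t.items()}

def map_state(alpha_t):
    """Point estimate: the state maximizing the filtered posterior.

    Normalization does not change the argmax, so we can maximize
    alpha_t directly.
    """
    return max(alpha_t, key=alpha_t.get)
```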

As an example, suppose we have observations of seaweed on three consecutive days: dry, damp, and soggy, in that order.

The possible states of weather can be sunny, cloudy, or rainy.

Enumerating every possible sequence of hidden weather states is exponentially expensive. To reduce this complexity, the forward algorithm comes in handy: the trick lies in using the conditional independence of the sequence steps to calculate partial probabilities alpha_t(x_t) = p(y_1, ..., y_t, x_t), reusing each partial result at the next step.

The resulting complexity is Theta(N^2 T), where N is the number of hidden or latent states, like the weather in the example above, and T is the length of the observed sequence.

This is a clear reduction from the ad hoc method of exploring all possibilities, which must examine all N^T possible state sequences.
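To make the reduction concrete, here is a self-contained sketch of the seaweed example. The numerical parameters are made up purely for illustration, and none of the names come from a library; the sketch checks that the forward recursion yields the same observation likelihood as brute-force enumeration of all N^T state sequences:

```python
from itertools import product

states = ["sunny", "cloudy", "rainy"]
obs = ["dry", "damp", "soggy"]  # seaweed on three consecutive days

# Illustrative (made-up) parameters for the example HMM.
start_p = {"sunny": 0.5, "cloudy": 0.3, "rainy": 0.2}
trans_p = {
    "sunny":  {"sunny": 0.6, "cloudy": 0.3, "rainy": 0.1},
    "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
    "rainy":  {"sunny": 0.2, "cloudy": 0.3, "rainy": 0.5},
}
emit_p = {
    "sunny":  {"dry": 0.7, "damp": 0.2, "soggy": 0.1},
    "cloudy": {"dry": 0.3, "damp": 0.5, "soggy": 0.2},
    "rainy":  {"dry": 0.1, "damp": 0.3, "soggy": 0.6},
}

def forward_likelihood():
    """p(y_1..T) via the forward recursion: Theta(N^2 T) work."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for y in obs[1:]:
        alpha = {
            s: emit_p[s][y] * sum(trans_p[r][s] * alpha[r] for r in states)
            for s in states
        }
    return sum(alpha.values())

def brute_force_likelihood():
    """p(y_1..T) by summing over all N^T state sequences."""
    total = 0.0
    for seq in product(states, repeat=len(obs)):
        p = start_p[seq[0]] * emit_p[seq[0]][obs[0]]
        for t in range(1, len(obs)):
            p *= trans_p[seq[t - 1]][seq[t]] * emit_p[seq[t]][obs[t]]
        total += p
    return total
```

Both functions return the same likelihood; only the forward version remains tractable as the sequence grows.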

With the development of speech recognition[4], pattern recognition, and related fields such as computational biology that use HMMs, the forward algorithm has gained popularity.

The forward algorithm is mostly used in applications that require determining the probability of being in a specific state given a known sequence of observations.

The forward algorithm then tells us how probable the observed data are with respect to what is expected from our model.

One application is in the domain of finance, where it can help decide when to buy or sell tangible assets.

It can be applied in any field where hidden Markov models are used.

Popular examples include natural language processing tasks such as part-of-speech tagging and speech recognition.

The forward algorithm can also be applied to weather prediction.

We can have an HMM describing the weather and its relation to a sequence of observations over a few consecutive days (for example, observations such as dry, damp, and soggy, and hidden states such as sunny, cloudy, and rainy).

Temporal evolution of a hidden Markov model