This family of models was proposed by David Blei and John Lafferty and is an extension of Latent Dirichlet Allocation (LDA) that can handle documents arranged sequentially in time.
Whereas words are still assumed to be exchangeable, in a dynamic topic model the order of the documents plays a fundamental role.
More precisely, the documents are grouped by time slice (e.g., by year), and it is assumed that the documents of each slice are generated from a set of topics that evolved from the topics of the previous slice.
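The grouping step can be sketched as follows; the toy corpus and the one-slice-per-year convention are illustrative assumptions, not part of the model itself:

```python
from collections import defaultdict

# Hypothetical toy corpus: (year, text) pairs.
corpus = [
    (1990, "gene expression dna"),
    (1990, "neural network model"),
    (1991, "genome sequence dna"),
    (1992, "network learning data"),
]

def group_by_slice(docs):
    """Group tokenized documents into time slices (here: one slice per year)."""
    slices = defaultdict(list)
    for year, text in docs:
        slices[year].append(text.split())
    # Return the slices in chronological order, as the model assumes.
    return [slices[year] for year in sorted(slices)]

time_slices = group_by_slice(corpus)
```

Each entry of `time_slices` then holds the documents of one slice, and the topics of slice *t* are modeled as an evolution of those of slice *t − 1*.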
The multinomial representation of the topics has some disadvantages, because its parameters are constrained to be non-negative and to sum to one, which makes it awkward to model their smooth evolution over time. Since both distributions are in the exponential family, one solution to this problem is to represent them in terms of their natural parameters, which can take any real value and can be varied independently.
Using the natural parameterization, the dynamics of the topic model are given by

$$\beta_{t,k} \mid \beta_{t-1,k} \sim N(\beta_{t-1,k}, \sigma^2 I)$$

and

$$\alpha_t \mid \alpha_{t-1} \sim N(\alpha_{t-1}, \delta^2 I).$$

The generative process at time slice $t$ is therefore:

1. Draw topics $\beta_t \mid \beta_{t-1} \sim N(\beta_{t-1}, \sigma^2 I)$.
2. Draw $\alpha_t \mid \alpha_{t-1} \sim N(\alpha_{t-1}, \delta^2 I)$.
3. For each document $d$:
   1. Draw $\eta \sim N(\alpha_t, a^2 I)$.
   2. For each word $n$:
      1. Draw a topic $Z \sim \mathrm{Mult}(\pi(\eta))$.
      2. Draw a word $W_{t,d,n} \sim \mathrm{Mult}(\pi(\beta_{t,Z}))$.

where $\pi(x)_w = \frac{\exp(x_w)}{\sum_{w'} \exp(x_{w'})}$ maps the natural parameters back to the mean parameterization.
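The generative process above can be sketched in a few lines; the dimensions, variance hyperparameters, and function names below are toy values chosen for illustration, not the settings of the original paper:

```python
import numpy as np

rng = np.random.default_rng(0)

V, K, T = 5, 2, 3                  # vocabulary size, topics, time slices (toy values)
sigma, delta, a = 0.1, 0.1, 0.5    # illustrative standard deviations

def softmax(x):
    """Map natural parameters to the probability simplex."""
    e = np.exp(x - x.max())
    return e / e.sum()

# State-space dynamics: topic and proportion parameters drift as Gaussians.
beta = np.zeros((T, K, V))    # natural parameters of the K topics per slice
alpha = np.zeros((T, K))      # natural parameters of the topic proportions
for t in range(1, T):
    beta[t] = rng.normal(beta[t - 1], sigma)
    alpha[t] = rng.normal(alpha[t - 1], delta)

def generate_document(t, n_words=10):
    """Sample one document at time slice t from the generative process."""
    eta = rng.normal(alpha[t], a)                  # document-level proportions
    theta = softmax(eta)
    words = []
    for _ in range(n_words):
        z = rng.choice(K, p=theta)                 # draw a topic assignment
        w = rng.choice(V, p=softmax(beta[t, z]))   # draw a word from that topic
        words.append(int(w))
    return words

doc = generate_document(1)
```

Each time slice reuses the drifted parameters of the previous one, which is what ties the topics of adjacent slices together.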
Blei and Lafferty argue that applying Gibbs sampling to do inference in this model is more difficult than in static models, due to the nonconjugacy of the Gaussian and multinomial distributions.
In the original paper, a dynamic topic model is applied to the corpus of Science articles published between 1881 and 1999, with the aim of showing that the method can be used to analyze trends of word usage within topics.
A continuous dynamic topic model was developed by Wang et al. and applied to predict the timestamp of documents.