In statistics, a generalized linear mixed model (GLMM) is an extension of the generalized linear model (GLM) in which the linear predictor contains random effects in addition to the usual fixed effects. Conditional on the random effects, the response is assumed to be distributed according to an exponential family, with its expectation related to the linear predictor through a link function.
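In a common matrix formulation (notation may vary across texts), the model can be written as

$$
\eta = X\beta + Zu, \qquad u \sim \mathcal{N}(0, G), \qquad \operatorname{E}[y \mid u] = g^{-1}(\eta),
$$

where $X$ and $Z$ are the design matrices for the fixed effects $\beta$ and the random effects $u$, $G$ is the covariance matrix of the random effects, and $g$ is the link function.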
Generalized linear mixed models are special cases of hierarchical generalized linear models in which the random effects are normally distributed.
The complete likelihood[5] has no general closed form, and integrating over the random effects is usually extremely computationally intensive.[6] Approximate methods are therefore widely used; for example, the penalized quasi-likelihood method, which essentially involves repeatedly fitting (i.e., is doubly iterative) a weighted normal mixed model with a working variate,[7] is implemented by various commercial and open-source statistical programs.
Fitting generalized linear mixed models via maximum likelihood (for instance, when model selection is based on the Akaike information criterion (AIC)) requires integrating over the random effects.
For this reason, methods involving numerical quadrature or Markov chain Monte Carlo have increased in use, as increasing computing power and advances in methods have made them more practical.
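As a minimal sketch of the quadrature approach, the integral over a normal random intercept can be approximated with Gauss–Hermite quadrature. The example below evaluates the marginal log-likelihood of a random-intercept logistic GLMM; the function `marginal_loglik` is a hypothetical illustration written for this article, not part of any library.

```python
import numpy as np

def marginal_loglik(beta, sigma, X, y, groups, n_nodes=20):
    """Approximate the marginal log-likelihood of a random-intercept
    logistic GLMM by Gauss-Hermite quadrature over each group's
    N(0, sigma^2) random effect (illustrative helper, not a library API)."""
    # Nodes and weights for integrals of the form \int e^{-x^2} f(x) dx
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    total = 0.0
    for g in np.unique(groups):
        idx = groups == g
        eta = X[idx] @ beta          # fixed-effect part of the linear predictor
        group_lik = 0.0
        for x_i, w_i in zip(nodes, weights):
            u = np.sqrt(2.0) * sigma * x_i   # change of variables for N(0, sigma^2)
            p = 1.0 / (1.0 + np.exp(-(eta + u)))
            # Conditional likelihood of this group's responses given u
            group_lik += w_i * np.prod(p ** y[idx] * (1 - p) ** (1 - y[idx]))
        # The 1/sqrt(pi) factor comes from the Gaussian change of variables
        total += np.log(group_lik / np.sqrt(np.pi))
    return total

# Tiny synthetic example: two groups, intercept-only fixed effect
X = np.ones((4, 1))
y = np.array([1.0, 0.0, 1.0, 1.0])
groups = np.array([0, 0, 1, 1])
ll = marginal_loglik(np.array([0.2]), 0.5, X, y, groups)
```

Because each group's likelihood factorizes given its random effect, the marginal likelihood is a product of one-dimensional integrals, which is what makes quadrature practical for simple random-effect structures; with crossed or high-dimensional random effects, the integral no longer factorizes and Markov chain Monte Carlo or Laplace-type approximations are typically preferred.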
Estimates of the Akaike information criterion for generalized linear mixed models based on certain exponential family distributions have recently been obtained.