Seemingly unrelated regressions

Each equation is a valid linear regression on its own and can be estimated separately, which is why the system is called seemingly unrelated,[3]: 332  although some authors suggest that the term seemingly related would be more appropriate,[1]: 306  since the error terms are assumed to be correlated across the equations.

Each equation i has a single response variable y_ir and a k_i-dimensional vector of regressors x_ir, so that for each observation r = 1, …, R the i-th equation reads y_ir = x_irᵀ β_i + ε_ir.

Finally, if we stack these m vector equations on top of each other, the system takes the form[4]

    (1)   y = Xβ + ε,

where y and ε are the stacked mR × 1 vectors of responses and errors, X is the mR × (k_1 + … + k_m) block-diagonal matrix of regressors, and β stacks the coefficient vectors β_1, …, β_m.
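As a concrete illustration of the stacking, here is a minimal NumPy sketch; the simulated data, dimensions, and cross-equation error correlation of 0.6 are illustrative assumptions, not taken from the source:

```python
import numpy as np

rng = np.random.default_rng(0)
R = 100  # observations per equation

# Two equations with different regressor sets (intercepts included).
X1 = np.column_stack([np.ones(R), rng.normal(size=R)])        # k_1 = 2
X2 = np.column_stack([np.ones(R), rng.normal(size=(R, 2))])   # k_2 = 3

# Errors correlated across equations: Cov(eps_1r, eps_2r) = 0.6.
eps = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=R)
y1 = X1 @ np.array([1.0, 2.0]) + eps[:, 0]
y2 = X2 @ np.array([-1.0, 0.5, 3.0]) + eps[:, 1]

# Stack the m = 2 equations: y = X beta + eps with block-diagonal X.
y = np.concatenate([y1, y2])                     # shape (m*R,) = (200,)
X = np.zeros((2 * R, X1.shape[1] + X2.shape[1]))
X[:R, :2] = X1                                   # equation 1 block
X[R:, 2:] = X2                                   # equation 2 block
```

The off-diagonal blocks of X are zero because each equation only involves its own regressors; the cross-equation linkage enters solely through the error covariance.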

The SUR model is usually estimated using the feasible generalized least squares (FGLS) method.

In the first step we run ordinary least squares regression for (1); the residuals from this regression are used to estimate the elements of the matrix Σ, σ̂_ij = (1/R) ε̂_iᵀ ε̂_j.[6]: 198 In the second step we run generalized least squares regression for (1) using the estimated variance matrix Ω̂ = Σ̂ ⊗ I_R.
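The two steps can be sketched in NumPy as follows; the function name `fgls_sur` and the simulated two-equation system are illustrative assumptions, not part of the source:

```python
import numpy as np

def fgls_sur(Xs, ys):
    """Two-step FGLS for a SUR system (illustrative sketch).
    Xs: list of (R, k_i) regressor matrices; ys: list of (R,) responses.
    Returns the stacked coefficient vector (beta_1, ..., beta_m)."""
    m, R = len(Xs), ys[0].shape[0]
    ks = [Xi.shape[1] for Xi in Xs]

    # Build the block-diagonal stacked design X and the stacked response y.
    X = np.zeros((m * R, sum(ks)))
    col = 0
    for i, Xi in enumerate(Xs):
        X[i * R:(i + 1) * R, col:col + ks[i]] = Xi
        col += ks[i]
    y = np.concatenate(ys)

    # Step 1: equation-by-equation OLS; residuals estimate Sigma.
    resid = np.column_stack([yi - Xi @ np.linalg.lstsq(Xi, yi, rcond=None)[0]
                             for Xi, yi in zip(Xs, ys)])
    Sigma_hat = resid.T @ resid / R

    # Step 2: GLS with Omega_hat = Sigma_hat kron I_R.
    Omega_inv = np.kron(np.linalg.inv(Sigma_hat), np.eye(R))
    return np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)

# Simulated two-equation system with cross-correlated errors.
rng = np.random.default_rng(42)
R = 500
X1 = np.column_stack([np.ones(R), rng.normal(size=R)])
X2 = np.column_stack([np.ones(R), rng.normal(size=R)])
eps = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=R)
y1 = X1 @ np.array([1.0, 2.0]) + eps[:, 0]
y2 = X2 @ np.array([-1.0, 0.5]) + eps[:, 1]
beta_hat = fgls_sur([X1, X2], [y1, y2])   # close to [1, 2, -1, 0.5]
```

Forming the full mR × mR matrix Ω̂⁻¹ keeps the sketch close to the formulas; a production implementation would exploit the Kronecker structure instead of materializing it.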

This estimator is unbiased in small samples assuming the error terms ε_ir have a symmetric distribution; in large samples it is consistent and asymptotically normal, with limiting distribution √R(β̂ − β) →d N(0, ((1/R) Xᵀ(Σ⁻¹ ⊗ I_R) X)⁻¹).[6]: 198

Other estimation techniques besides FGLS have been suggested for the SUR model:[7] the maximum likelihood (ML) method, under the assumption that the errors are normally distributed; the iterative generalized least squares (IGLS) method, in which the residuals from the second step of FGLS are used to recalculate the matrix Σ̂, after which the coefficients are re-estimated by GLS, and so on, until convergence is achieved; and the iterative ordinary least squares (IOLS) scheme, in which estimation is performed on an equation-by-equation basis, but every equation includes as additional regressors the residuals from the previously estimated equations in order to account for the cross-equation correlations; the estimation is run iteratively until convergence is achieved.[8] Zellner and Ando (2010) developed a direct Monte Carlo method for the Bayesian analysis of the SUR model.
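The IGLS scheme mentioned above can be sketched as follows; the helper name `igls_sur`, the convergence tolerance, and the simulated data are illustrative assumptions, not from the source:

```python
import numpy as np

def igls_sur(Xs, ys, tol=1e-10, max_iter=100):
    """Iterative GLS for SUR (illustrative sketch): alternate between
    estimating Sigma from the current residuals and re-estimating beta
    by GLS, until the coefficients stop changing."""
    m, R = len(Xs), ys[0].shape[0]
    ks = [Xi.shape[1] for Xi in Xs]
    X = np.zeros((m * R, sum(ks)))
    col = 0
    for i, Xi in enumerate(Xs):
        X[i * R:(i + 1) * R, col:col + ks[i]] = Xi
        col += ks[i]
    y = np.concatenate(ys)

    # Initialize with OLS on the stacked system (equivalent to running
    # per-equation OLS, since X is block-diagonal).
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(max_iter):
        resid = (y - X @ beta).reshape(m, R).T        # (R, m) residuals
        Sigma = resid.T @ resid / R
        Omega_inv = np.kron(np.linalg.inv(Sigma), np.eye(R))
        beta_new = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
        if np.max(np.abs(beta_new - beta)) < tol:     # converged
            return beta_new
        beta = beta_new
    return beta

# Usage on a simulated two-equation system.
rng = np.random.default_rng(7)
R = 400
X1 = np.column_stack([np.ones(R), rng.normal(size=R)])
X2 = np.column_stack([np.ones(R), rng.normal(size=R)])
eps = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=R)
y1 = X1 @ np.array([2.0, -1.0]) + eps[:, 0]
y2 = X2 @ np.array([0.5, 1.5]) + eps[:, 1]
beta_hat = igls_sur([X1, X2], [y1, y2])   # close to [2, -1, 0.5, 1.5]
```

Note that the first iteration reproduces the two-step FGLS estimate; subsequent iterations refine Σ̂ using GLS rather than OLS residuals.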

[9] There are two important cases when the SUR estimates turn out to be equivalent to equation-by-equation OLS: when the error terms are in fact uncorrelated between the equations (so that Σ is diagonal and the system is truly unrelated), and when each equation contains exactly the same set of regressors on its right-hand side.
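The identical-regressors case can be checked numerically. The following sketch (simulated data and variable names are illustrative assumptions) runs FGLS on a two-equation system sharing one regressor matrix and compares it with per-equation OLS:

```python
import numpy as np

rng = np.random.default_rng(1)
R = 50
X0 = np.column_stack([np.ones(R), rng.normal(size=(R, 2))])  # shared regressors

# Two responses with strongly correlated errors.
Y = np.column_stack([X0 @ np.array([1.0, 2.0, -1.0]),
                     X0 @ np.array([0.5, -3.0, 2.0])])
Y += rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=R)

# Equation-by-equation OLS: column j holds the coefficients of equation j.
beta_ols = np.linalg.lstsq(X0, Y, rcond=None)[0]             # shape (3, 2)

# FGLS on the stacked system; identical regressors make X = I_m kron X0.
m = Y.shape[1]
X = np.kron(np.eye(m), X0)                                   # block-diagonal
y = Y.T.reshape(-1)                                          # y1 stacked on y2
resid = Y - X0 @ beta_ols
Sigma = resid.T @ resid / R
Oi = np.kron(np.linalg.inv(Sigma), np.eye(R))
beta_fgls = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)

# With identical regressors the two coincide up to floating-point noise.
same = np.allclose(beta_fgls, beta_ols.T.reshape(-1))
```

The equality holds because with X = I_m ⊗ X0 the GLS normal equations factor as (Σ⁻¹ ⊗ X0ᵀX0)β̂ = (Σ⁻¹ ⊗ X0ᵀ)y, and the Σ⁻¹ factor cancels, leaving the OLS solution regardless of the estimated Σ̂.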