Chapter 8 Expectation maximization algorithms

Somewhat surprisingly, it is possible to develop an algorithm, known as the expectation-maximization (EM) algorithm, for computing the maximizer of a likelihood function in situations where computing the likelihood itself is quite difficult. This is possible when the model is defined in terms of certain unobserved components, and when likelihood computation and optimization would be relatively easy had we observed the complete data. The EM algorithm exploits this special structure, and is thus not a general optimization algorithm, but the situation where it applies is common enough in statistics that it is one of the core optimization algorithms used for computing maximum-likelihood estimates.
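The structure described above can be summarized in one generic iteration. Writing Y for the complete data, X for the observed part of it, and θ' for the current parameter estimate (this notation is assumed here, not taken from the text), one EM step reads:

```latex
% E-step: expected complete-data log-likelihood given the observation X = x,
% computed under the current parameter \theta':
Q(\theta \mid \theta') = \mathrm{E}_{\theta'}\!\left[\log f(Y; \theta) \mid X = x\right]

% M-step: the next estimate maximizes Q over \theta:
\theta'' = \operatorname*{arg\,max}_{\theta} \, Q(\theta \mid \theta')
```

The point is that both steps only involve the complete-data log-likelihood, which by assumption is easy to work with, and never the intractable observed-data likelihood itself.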

In this chapter it is shown that the algorithm is in general a descent algorithm for the negative log-likelihood, and implementation examples are given for multinomial cell collapsing and Gaussian mixtures. The theoretical results needed to apply the EM algorithm to a special case of mixed models are given as well. Finally, some theoretical results as well as practical implementations for computing estimates of the Fisher information are presented.
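To make the Gaussian mixture case concrete, the following is a minimal sketch of EM for a two-component univariate Gaussian mixture, written in Python with NumPy. It is not the chapter's implementation; all function and variable names are illustrative, and the initialization (quartile split) is one simple choice among many.

```python
import numpy as np


def norm_pdf(x, mu, s):
    """Density of the normal distribution with mean mu and std. dev. s."""
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))


def em_gaussian_mixture(x, n_iter=200, tol=1e-8):
    """EM for a two-component Gaussian mixture (illustrative sketch).

    Returns (p, mu1, mu2, s1, s2): the mixing proportion of component 1
    and the two component means and standard deviations.
    """
    # Crude initialization: quartiles for the means, overall std. dev. for both.
    p = 0.5
    mu1, mu2 = np.quantile(x, [0.25, 0.75])
    s1 = s2 = np.std(x)
    loglik_old = -np.inf
    for _ in range(n_iter):
        d1 = p * norm_pdf(x, mu1, s1)
        d2 = (1.0 - p) * norm_pdf(x, mu2, s2)
        # The observed-data log-likelihood is monotonically non-decreasing
        # along EM iterations (the descent property for the negative
        # log-likelihood), so it can serve as a convergence criterion.
        loglik = np.sum(np.log(d1 + d2))
        if loglik - loglik_old < tol:
            break
        loglik_old = loglik
        # E-step: posterior probability that each x_i belongs to component 1.
        gamma = d1 / (d1 + d2)
        # M-step: weighted maximum-likelihood updates of all parameters.
        p = gamma.mean()
        mu1 = np.sum(gamma * x) / np.sum(gamma)
        mu2 = np.sum((1.0 - gamma) * x) / np.sum(1.0 - gamma)
        s1 = np.sqrt(np.sum(gamma * (x - mu1) ** 2) / np.sum(gamma))
        s2 = np.sqrt(np.sum((1.0 - gamma) * (x - mu2) ** 2) / np.sum(1.0 - gamma))
    return p, mu1, mu2, s1, s2
```

Note that both steps use only complete-data quantities: the E-step computes conditional expectations of the unobserved component labels, and the M-step solves a weighted version of the easy complete-data estimation problem.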