An expanded theoretical treatment of iteration-dependent majorize-minimize algorithms

Matthew W. Jacobson, Jeffrey A. Fessler

Research output: Contribution to journal › Article › peer-review

67 Scopus citations

Abstract

The majorize-minimize (MM) optimization technique has received considerable attention in signal and image processing applications, as well as in the statistics literature. At each iteration of an MM algorithm, one constructs a tangent majorant function that majorizes the given cost function and is equal to it at the current iterate. The next iterate is obtained by minimizing this tangent majorant function, resulting in a sequence of iterates that reduces the cost function monotonically. A well-known special case of MM methods is the class of expectation-maximization algorithms. In this paper, we expand on previous analyses of MM, due to Fessler and Hero, that allowed the tangent majorants to be constructed in iteration-dependent ways. This paper also corrects an error in one of those earlier analyses. Our analysis builds upon previous work in three main respects. First, our treatment relaxes many assumptions related to the structure of the cost function, feasible set, and tangent majorants. For example, the cost function can be nonconvex and the feasible set for the problem can be any convex set. Second, we propose convergence conditions, based on upper curvature bounds, that can be easier to verify than more standard continuity conditions. Furthermore, these conditions allow considerable design freedom in the iteration-dependent behavior of the algorithm. Finally, we give an original characterization of the local region of convergence of MM algorithms based on connected (e.g., convex) tangent majorants. For such algorithms, cost function minimizers will locally attract the iterates over larger neighborhoods than is typically guaranteed with other methods. This expanded treatment widens the scope of the MM algorithm designs that can be considered for signal and image processing applications, allows us to verify the convergent behavior of previously published algorithms, and gives a fuller understanding overall of how these algorithms behave.
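To make the iteration described above concrete, the following is a minimal sketch of a generic MM loop with an iteration-dependent quadratic tangent majorant built from an upper curvature bound. It is not the specific algorithm analyzed in the paper; the function names (mm_minimize, curvature_bound) and the example cost are illustrative assumptions chosen so the monotone-descent property can be checked numerically.

```python
import numpy as np

def mm_minimize(cost, grad, curvature_bound, x0, n_iters=100):
    """Minimize `cost` by MM with a separable quadratic tangent majorant.

    At the current iterate x_k, the surrogate
        phi_k(x) = cost(x_k) + grad(x_k).T @ (x - x_k) + 0.5 * c_k * ||x - x_k||**2
    majorizes `cost` whenever c_k upper-bounds its curvature, and
    phi_k(x_k) = cost(x_k).  Minimizing phi_k exactly gives a
    gradient-like step, and each step decreases `cost` monotonically.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        c_k = curvature_bound(x)                 # iteration-dependent upper curvature bound
        x_next = x - grad(x) / c_k               # exact minimizer of the quadratic majorant
        assert cost(x_next) <= cost(x) + 1e-12   # monotone descent guaranteed by majorization
        x = x_next
    return x


# Example: separable nonconvex cost f(x) = sum_i log(1 + x_i**2).
# Its per-coordinate second derivative 2*(1 - t**2)/(1 + t**2)**2 never
# exceeds 2, so a constant curvature bound of 2 gives a valid majorant.
cost = lambda x: np.sum(np.log1p(x**2))
grad = lambda x: 2.0 * x / (1.0 + x**2)
bound = lambda x: 2.0
x_min = mm_minimize(cost, grad, bound, x0=np.array([3.0, -1.5]), n_iters=200)
print(x_min)  # iterates are drawn toward the minimizer at the origin
```

Allowing curvature_bound(x) to vary with the iterate is what makes the scheme iteration-dependent; tighter bounds near the current point yield larger, faster steps while preserving the majorization property.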

Original language: English (US)
Pages (from-to): 2411-2422
Number of pages: 12
Journal: IEEE Transactions on Image Processing
Volume: 16
Issue number: 10
DOIs
State: Published - Oct 2007
Externally published: Yes

Keywords

  • Expectation-maximization (EM)
  • Majorize-minimize (MM)
  • Optimization transfer
  • SAGE

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
