Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm

Research output: Contribution to journal › Article › peer-review



Classical robust statistical methods for noisy data are often based on modifications of convex loss functions. In recent years, robust methods based on nonconvex losses have become increasingly popular. A nonconvex loss can provide robust estimation for data contaminated with outliers, but the significant challenge is that a nonconvex loss can be numerically difficult to optimize. This article proposes a quadratic majorization algorithm for nonconvex loss (QManc). QManc decomposes the optimization of a nonconvex loss into a sequence of simpler optimization problems. QManc is then applied to a powerful machine learning method, yielding the quadratic majorization boosting algorithm (QMBA). We develop QMBA for robust classification (binary and multi-category) and regression. In high-dimensional cancer genetics data and in simulations, QMBA is comparable with convex loss-based boosting algorithms on clean data, and outperforms them on data contaminated with outliers. QMBA is also superior to boosting implemented directly on the nonconvex loss functions. Supplementary material for this article is available online.
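To illustrate the general quadratic majorization idea behind the abstract (not the paper's exact QManc or QMBA algorithm), here is a minimal sketch for robust location estimation. The Welsch loss, the location-estimation setting, and the curvature bound M = 1 are assumptions chosen for the illustration: since the Welsch loss phi(r) = 1 - exp(-r^2/2) has second derivative bounded above by 1, it is majorized at any point r_t by the quadratic phi(r_t) + phi'(r_t)(r - r_t) + (M/2)(r - r_t)^2, and repeatedly minimizing that surrogate turns the nonconvex problem into a sequence of trivial quadratic ones:

```python
import math
import random

def welsch_loss(r):
    # Welsch (correntropy-type) loss: bounded and nonconvex in the residual,
    # so gross outliers contribute at most 1 to the objective.
    return 1.0 - math.exp(-0.5 * r * r)

def welsch_grad(r):
    # First derivative of the Welsch loss.
    return r * math.exp(-0.5 * r * r)

def qm_location(y, mu0, M=1.0, iters=200):
    """Quadratic-majorization (MM) iterations for robust location estimation.

    Because phi'' <= M = 1 for the Welsch loss, the quadratic surrogate
    phi(r_t) + phi'(r_t)(r - r_t) + (M/2)(r - r_t)^2 majorizes phi, and
    minimizing the summed surrogate over mu gives the closed-form update
    below. The MM property guarantees a monotonically nonincreasing
    objective.
    """
    mu = mu0
    objs = [sum(welsch_loss(yi - mu) for yi in y)]
    for _ in range(iters):
        # Closed-form minimizer of the quadratic surrogate.
        mu = mu + sum(welsch_grad(yi - mu) for yi in y) / (M * len(y))
        objs.append(sum(welsch_loss(yi - mu) for yi in y))
    return mu, objs

random.seed(0)
# Clean data near 0, plus a few gross outliers at 10.
y = [random.gauss(0.0, 1.0) for _ in range(95)] + [10.0] * 5
mu_hat, objs = qm_location(y, mu0=sum(y) / len(y))
print(mu_hat)
```

Starting from the contaminated sample mean, the iterations shrug off the outliers (whose Welsch gradients are essentially zero) and settle near the clean-data center, with the objective decreasing at every step, which is the descent guarantee that makes majorization attractive for nonconvex losses.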

Original language: English (US)
Pages (from-to): 491-502
Number of pages: 12
Journal: Journal of Computational and Graphical Statistics
Issue number: 3
State: Published - Jul 3 2018
Externally published: Yes


Keywords

  • Boosting
  • Machine learning
  • Nonconvex
  • Quadratic majorization
  • Robust

ASJC Scopus subject areas

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty


