Exterior-Point Optimization for Sparse and Low-Rank Optimization

Shuvomoy Das Gupta, Bartolomeo Stellato, Bart P.G. Van Parys

Research output: Contribution to journal › Article › peer-review

Abstract

Many problems of substantial current interest in machine learning, statistics, and data science can be formulated as sparse and low-rank optimization problems. In this paper, we present the nonconvex exterior-point optimization solver (NExOS)—a first-order algorithm tailored to sparse and low-rank optimization problems. We consider the problem of minimizing a convex function over a nonconvex constraint set, where the set can be decomposed as the intersection of a compact convex set and a nonconvex set involving sparse or low-rank constraints. Unlike convex relaxation approaches, NExOS finds a locally optimal point of the original problem by exploiting the nonconvex geometry: it solves a sequence of penalized problems with strictly decreasing penalty parameters. NExOS solves each penalized problem by applying a first-order algorithm, which converges linearly to a local minimum of the corresponding penalized formulation under regularity conditions. Furthermore, the local minima of the penalized problems converge to a local minimum of the original problem as the penalty parameter goes to zero. We then implement and test NExOS on many instances from a wide variety of sparse and low-rank optimization problems, empirically demonstrating that our algorithm outperforms specialized methods.
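The exterior-point scheme described above can be illustrated with a minimal sketch. The example below is not the authors' NExOS implementation; it is a hypothetical gradient-based variant, assuming a sparse least-squares instance where the nonconvex set is {‖x‖₀ ≤ k} intersected with a box {‖x‖∞ ≤ M}, and the penalty is the squared distance to that set, scaled by 1/(2μ) with μ decreased geometrically:

```python
import numpy as np

def proj_sparse_box(x, k, M):
    # Projection onto {||x||_0 <= k} ∩ {||x||_inf <= M}:
    # keep the k largest-magnitude entries, then clip them to [-M, M].
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = np.clip(x[idx], -M, M)
    return z

def exterior_point_sketch(A, b, k, M, mu0=1.0, mu_min=1e-6,
                          rho=0.5, inner_iters=200):
    # Outer loop: minimize f(x) + (1/(2*mu)) * dist(x, X)^2 for a
    # strictly decreasing sequence of penalty parameters mu, warm-starting
    # each stage from the previous solution. Here f(x) = 0.5*||Ax - b||^2
    # and X is the sparse-plus-box set above. The gradient of the penalty
    # term is (x - proj(x)) / mu wherever the projection is unique.
    n = A.shape[1]
    lip_f = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f
    x = np.zeros(n)
    mu = mu0
    while mu >= mu_min:
        step = 1.0 / (lip_f + 1.0 / mu)  # 1/L for the penalized objective
        for _ in range(inner_iters):
            grad_f = A.T @ (A @ x - b)
            grad_pen = (x - proj_sparse_box(x, k, M)) / mu
            x = x - step * (grad_f + grad_pen)
        mu *= rho  # tighten the penalty
    # As mu -> 0 the iterate approaches the nonconvex set; project to finish.
    return proj_sparse_box(x, k, M)
```

The warm-starting across stages is what makes the decreasing-penalty homotopy effective: each penalized problem is only a small perturbation of the previous one, so the inner solver starts near a stationary point.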

Original language: English (US)
Pages (from-to): 795-833
Number of pages: 39
Journal: Journal of Optimization Theory and Applications
Volume: 202
Issue number: 2
DOIs
State: Published - Aug 2024

All Science Journal Classification (ASJC) codes

  • Control and Optimization
  • Management Science and Operations Research
  • Applied Mathematics

Keywords

  • 65K05
  • 90C30
  • First-order algorithms
  • Low-rank optimization
  • Nonconvex optimization
  • Sparse optimization
