Linear convergence of a Frank-Wolfe type algorithm over trace-norm balls

Zeyuan Allen-Zhu, Elad Hazan, Wei Hu, Yuanzhi Li

Research output: Contribution to journal › Conference article › peer-review

29 Scopus citations

Abstract

We propose a rank-k variant of the classical Frank-Wolfe algorithm to solve convex optimization over a trace-norm ball. Our algorithm replaces the top singular-vector computation (1-SVD) in Frank-Wolfe with a top-k singular-vector computation (k-SVD), which can be done by repeatedly applying 1-SVD k times. Alternatively, our algorithm can be viewed as a rank-k restricted version of projected gradient descent. We show that our algorithm has a linear convergence rate when the objective function is smooth and strongly convex, and the optimal solution has rank at most k. This improves the convergence rate and the total time complexity of the Frank-Wolfe method and its variants.
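To make the "rank-k restricted projected gradient descent" view of the abstract concrete, here is a minimal NumPy sketch of one such step: take a gradient step, keep only the top-k singular triplets (the k-SVD), and project the retained singular values back onto the trace-norm ball of radius tau. This is only an illustration of the per-iteration structure suggested by the abstract, not the paper's exact algorithm or its convergence analysis; the names `rank_k_step`, `project_l1`, `eta`, and `tau` are illustrative assumptions.

```python
import numpy as np

def project_l1(s, tau):
    """Project a nonnegative vector s onto the l1-ball of radius tau
    (standard sort-and-threshold simplex projection)."""
    if s.sum() <= tau:
        return s
    u = np.sort(s)[::-1]
    css = np.cumsum(u)
    ks = np.arange(1, len(u) + 1)
    rho = np.max(np.where(u - (css - tau) / ks > 0)[0])
    theta = (css[rho] - tau) / (rho + 1)
    return np.maximum(s - theta, 0.0)

def rank_k_step(X, grad, eta, k, tau):
    """One rank-k restricted projected-gradient step over the trace-norm ball:
    gradient step -> top-k SVD -> project singular values so that ||X||_* <= tau."""
    Y = X - eta * grad                       # plain gradient step
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    U, s, Vt = U[:, :k], s[:k], Vt[:k, :]    # keep top-k singular triplets (k-SVD)
    s = project_l1(s, tau)                   # pull the rank-k iterate into the ball
    return (U * s) @ Vt

# Tiny usage example on a synthetic objective f(X) = 0.5 * ||X - M||_F^2,
# whose gradient is X - M (purely for illustration).
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 15))
X = np.zeros((20, 15))
for _ in range(50):
    X = rank_k_step(X, X - M, eta=0.5, k=3, tau=5.0)
```

The paper's claimed linear convergence applies to its own rank-k Frank-Wolfe variant under smoothness, strong convexity, and an optimal solution of rank at most k; the sketch above only shows the kind of k-SVD-based update the abstract describes.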

Original language: English (US)
Pages (from-to): 6192-6201
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
State: Published - 2017
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: Dec 4, 2017 - Dec 9, 2017

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
