Abstract
In the framework of online convex optimization, most iterative algorithms require the computation of projections onto convex sets, which can be computationally expensive. To tackle this problem, Hazan and Kale (2012) proposed the study of projection-free methods that replace projections with less expensive computations. The most common approach is based on the Frank-Wolfe method, which uses linear optimization computations in lieu of projections. Recent work by Garber and Kretzu (2022) gave sublinear adaptive regret guarantees with projection-free algorithms based on the Frank-Wolfe approach. In this work we give projection-free algorithms that are based on a different technique, inspired by Mhammedi (2022), which replaces projections with set-membership computations. We propose a simple lazy gradient-based algorithm with a Minkowski regularization that attains near-optimal adaptive regret bounds. For general convex loss functions we improve previous adaptive regret bounds from O(T^{3/4}) to O(√T), and further to the tight interval-dependent bound Õ(√I), where I denotes the interval length. For strongly convex functions we obtain the first poly-logarithmic adaptive regret bounds using a projection-free algorithm.
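The membership-oracle idea underlying this line of work can be illustrated with a minimal sketch (not the paper's algorithm; the helper names `minkowski_gauge` and `gauge_pullback`, the bisection tolerance, and the toy unit-ball oracle are illustrative assumptions): the Minkowski gauge of a point is approximated by bisection using only set-membership queries, and an infeasible iterate is mapped back into the set by radial scaling rather than by Euclidean projection.

```python
import numpy as np

def minkowski_gauge(x, members, tol=1e-9):
    """Bisection estimate of gauge_K(x) = inf{t > 0 : x/t in K},
    using only the membership oracle `members`. Assumes 0 lies in the
    interior of K, so feasibility of x/t is monotone in t."""
    hi = 1.0
    while not members(x / hi):      # doubling search for a feasible scale
        hi *= 2.0
    lo = 0.0
    while hi - lo > tol:            # shrink the bracket [lo, hi] around the gauge
        mid = 0.5 * (lo + hi)
        if members(x / mid):
            hi = mid
        else:
            lo = mid
    return hi

def gauge_pullback(x, members):
    """Replace projection by radial scaling: if x is outside K,
    shrink it along the ray through the origin until it lies in K."""
    g = minkowski_gauge(x, members)
    return x if g <= 1.0 else x / g

# Toy example: K is the Euclidean unit ball, so membership is a norm check.
in_ball = lambda y: np.linalg.norm(y) <= 1.0
x = np.array([3.0, 4.0])
print(gauge_pullback(x, in_ball))   # -> approximately [0.6, 0.8]
```

Each pullback costs only O(log(1/tol)) membership queries, which is the source of the computational savings over exact projection when membership tests are cheap.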
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1055-1073 |
| Number of pages | 19 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 201 |
| State | Published - 2023 |
| Event | 34th International Conference on Algorithmic Learning Theory, ALT 2023 - Singapore, Singapore. Duration: Feb 20 2023 → Feb 23 2023 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability