Abstract
(Stochastic) gradient descent and the multiplicative update method are among the most popular algorithms in machine learning. We introduce and study a new regularization that unifies the additive and multiplicative updates. This regularization is derived from a hyperbolic analogue of the entropy function, which we call hypentropy, and is motivated by a natural extension of the multiplicative update to negative numbers. The hypentropy has a natural spectral counterpart, which we use to derive a family of matrix-based updates that bridge gradient methods and the multiplicative method for matrices. While the latter applies only to positive semidefinite matrices, the spectral hypentropy method can naturally be used with general rectangular matrices. We analyze the new family of updates by deriving tight regret bounds. We empirically study the applicability of the new update in settings such as multiclass learning, where the parameters constitute a general rectangular matrix.
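As a rough illustration of the vector case, the sketch below shows a single mirror-descent step under the hypentropy regularizer, assuming it takes the form φ_β(w) = Σ_i (w_i·arcsinh(w_i/β) − √(w_i² + β²)), whose mirror map is ∇φ_β(w) = arcsinh(w/β). The function name `hypentropy_update` and the toy quadratic loss are illustrative only, not from the paper; the example is a minimal sketch under these assumptions, intended to show how large β behaves like an additive (gradient) step and small β like a signed multiplicative step.

```python
import numpy as np

def hypentropy_update(w, grad, lr=0.1, beta=1.0):
    """One mirror-descent step with the (assumed) hypentropy mirror map.

    Dual coordinates are theta = arcsinh(w / beta); the step is additive in
    dual space and mapped back via w = beta * sinh(theta). Large beta acts
    like (scaled) gradient descent, small beta like a signed
    exponentiated-gradient / multiplicative update.
    """
    theta = np.arcsinh(w / beta)   # map to dual (mirror) coordinates
    theta = theta - lr * grad      # additive step in dual space
    return beta * np.sinh(theta)   # map back to primal coordinates

# Toy usage: minimize ||w - w_star||^2 over an unconstrained vector.
rng = np.random.default_rng(0)
w_star = rng.normal(size=5)
w = np.zeros(5)
for _ in range(200):
    grad = 2.0 * (w - w_star)                      # gradient of the toy loss
    w = hypentropy_update(w, grad, lr=0.05, beta=0.5)
print(np.round(w - w_star, 3))                     # should be close to zero
```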
Original language | English (US) |
---|---|
Pages (from-to) | 386-407 |
Number of pages | 22 |
Journal | Proceedings of Machine Learning Research |
Volume | 117 |
State | Published - 2020 |
Event | 31st International Conference on Algorithmic Learning Theory, ALT 2020 - San Diego, United States |
Duration | Feb 8, 2020 → Feb 11, 2020 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability
Keywords
- Experimentation
- Exponentiated Gradient
- Gradient Descent
- Online Convex Optimization