Abstract
The stochastic composition optimization problem, proposed recently by Wang et al. [2014], minimizes an objective in the compositional expectation form min_x (E_i F_i ∘ E_j G_j)(x). It covers many important applications in machine learning, statistics, and finance. In this paper, we consider the finite-sum scenario for composition optimization: min_x f(x) := (1/n) ∑_{i=1}^{n} F_i( (1/m) ∑_{j=1}^{m} G_j(x) ). We propose two algorithms to solve this problem by combining stochastic compositional gradient descent (SCGD) with the stochastic variance reduced gradient (SVRG) technique. A linear convergence rate is proved for strongly convex optimization, which substantially improves the sublinear rate O(K^{-0.8}) of the best known algorithm.
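As a rough illustration of the finite-sum compositional structure (not the algorithms proposed in the paper), the sketch below builds a toy strongly convex instance in Python and runs plain full-gradient descent using the exact chain-rule gradient. All names and problem sizes (A, b, c, n, m, d) are assumptions made for illustration; SCGD- and SVRG-style methods would replace the exact inner average and outer gradient with sampled, variance-reduced estimates.

```python
# Minimal sketch of f(x) = (1/n) sum_i F_i( (1/m) sum_j G_j(x) ) on a toy
# instance: G_j(x) = A_j x + b_j (linear inner maps), F_i(y) = 0.5*||y - c_i||^2.
# This is an illustrative baseline, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n, m, d = 5, 4, 3                      # outer terms, inner terms, dimension

A = rng.standard_normal((m, d, d))     # Jacobians of the inner maps G_j
b = rng.standard_normal((m, d))
c = rng.standard_normal((n, d))        # targets of the outer losses F_i

def G_mean(x):
    """Inner finite sum (1/m) sum_j G_j(x); returns value and Jacobian."""
    y = np.mean(A @ x + b, axis=0)
    J = np.mean(A, axis=0)             # Jacobian of the averaged inner map
    return y, J

def f(x):
    """Compositional objective value."""
    y, _ = G_mean(x)
    return 0.5 * np.mean(np.sum((y - c) ** 2, axis=1))

def full_grad(x):
    """Exact chain-rule gradient: J(x)^T * (1/n) sum_i grad F_i(y)."""
    y, J = G_mean(x)
    outer_grad = np.mean(y - c, axis=0)
    return J.T @ outer_grad

# Full-gradient descent as a correctness baseline; stochastic variants would
# subsample i and j and correct the estimates at a stored reference point.
x = np.zeros(d)
_, J = G_mean(x)
step = 1.0 / (np.linalg.norm(J, 2) ** 2 + 1e-12)   # 1/L for this quadratic
for k in range(500):
    x -= step * full_grad(x)
print("final objective:", f(x))
```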
| Original language | English (US) |
| --- | --- |
| State | Published - 2017 |
| Event | 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 - Fort Lauderdale, United States. Duration: Apr 20 2017 → Apr 22 2017 |
Conference
| Conference | 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 |
| --- | --- |
| Country/Territory | United States |
| City | Fort Lauderdale |
| Period | 4/20/17 → 4/22/17 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence
- Statistics and Probability