TY - GEN
T1 - Composite objective mirror descent
AU - Duchi, John C.
AU - Shalev-Shwartz, Shai
AU - Singer, Yoram
AU - Tewari, Ambuj
PY - 2010/12/1
Y1 - 2010/12/1
N2 - We present a new method for regularized convex optimization and analyze it under both online and stochastic optimization settings. In addition to unifying previously known first-order algorithms, such as the projected gradient method, mirror descent, and forward-backward splitting, our method yields new analysis and algorithms. We also derive specific instantiations of our method for commonly used regularization functions, such as ℓ1, mixed norm, and trace-norm.
UR - http://www.scopus.com/inward/record.url?scp=84898075653&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84898075653&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84898075653
SN - 9780982252925
T3 - COLT 2010 - The 23rd Conference on Learning Theory
SP - 14
EP - 26
BT - COLT 2010 - The 23rd Conference on Learning Theory
T2 - 23rd Conference on Learning Theory, COLT 2010
Y2 - 27 June 2010 through 29 June 2010
ER -