TY - CONF
T1 - Proximal Newton-type methods for convex optimization
AU - Lee, Jason D.
AU - Sun, Yuekai
AU - Saunders, Michael A.
PY - 2012
Y1 - 2012
AB - We seek to solve convex optimization problems in composite form: minimize_{x ∈ ℝⁿ} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : ℝⁿ → ℝ is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. We derive a generalization of Newton-type methods to handle such convex but nonsmooth objective functions. We prove such methods are globally convergent and achieve superlinear rates of convergence in the vicinity of an optimal solution. We also demonstrate the performance of these methods using problems of relevance in machine learning and statistics.
UR - http://www.scopus.com/inward/record.url?scp=84877762791&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84877762791&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84877762791
SN - 9781627480031
T3 - Advances in Neural Information Processing Systems
SP - 827
EP - 835
BT - Advances in Neural Information Processing Systems 25
T2 - 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012
Y2 - 3 December 2012 through 6 December 2012
ER -