A stochastic compositional gradient method using Markov samples

Mengdi Wang, Ji Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Scopus citations

Abstract

Consider the convex optimization problem min_x f(g(x)), where both f and g are unknown but can be estimated through sampling. We consider the stochastic compositional gradient descent method (SCGD), which updates based on random function and subgradient evaluations generated by a conditional sampling oracle. We focus on the case where the samples are corrupted by Markov noise. Under certain diminishing-stepsize assumptions, we prove that the iterates of SCGD converge almost surely to an optimal solution if such a solution exists. Under certain constant-stepsize assumptions, we obtain finite-sample error bounds for the averaged iterates of the algorithm. We illustrate an application to online value evaluation in dynamic programming.
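As background for the abstract, the following is a minimal sketch of the two-timescale SCGD update on a toy composite problem, with samples driven by a simple Markov chain. The quadratic objective, the sampling oracles (sample_g, sample_jac_g, sample_grad_f), the noise chain, and the stepsize exponents are all illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of the basic SCGD update, assuming a conditional sampling
# oracle that returns noisy evaluations of g, its Jacobian, and the gradient
# of f. The toy objective, the Markov noise chain, and the stepsize choices
# below are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: minimize f(g(x)) with f(y) = ||y||^2 / 2 and g(x) = A x + b,
# so the optimum is x* = -A^{-1} b (A is invertible).
A = np.array([[2.0, 0.3], [0.3, 1.5]])
b = np.array([1.0, -0.5])

# Two-state Markov chain driving the sampling noise (hypothetical).
P = np.array([[0.9, 0.1], [0.2, 0.8]])  # transition matrix
noise_scale = np.array([0.1, 0.3])      # per-state noise level
state = 0

def step_chain(s):
    return rng.choice(2, p=P[s])

def sample_g(x, s):           # noisy evaluation of g(x)
    return A @ x + b + noise_scale[s] * rng.standard_normal(2)

def sample_jac_g(x, s):       # noisy evaluation of the Jacobian of g
    return A + noise_scale[s] * rng.standard_normal((2, 2))

def sample_grad_f(y, s):      # noisy evaluation of grad f(y) = y
    return y + noise_scale[s] * rng.standard_normal(2)

x = np.zeros(2)
y = np.zeros(2)               # auxiliary estimate tracking g(x)
x_avg = np.zeros(2)

for k in range(1, 20001):
    alpha = 1.0 / k ** 0.75   # illustrative diminishing stepsizes
    beta = 1.0 / k ** 0.5
    state = step_chain(state)                        # Markov sample
    y = (1 - beta) * y + beta * sample_g(x, state)   # fast timescale
    x = x - alpha * sample_jac_g(x, state).T @ sample_grad_f(y, state)
    x_avg += (x - x_avg) / k  # running average of iterates

print("averaged iterate:", x_avg)
print("optimal solution:", -np.linalg.solve(A, b))
```

In this sketch the fast-timescale average y is meant to track g(x), so that the composite gradient estimate stays close to the true gradient even though the samples are correlated along the Markov chain.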

Original language: English (US)
Title of host publication: 2016 Winter Simulation Conference
Subtitle of host publication: Simulating Complex Service Systems, WSC 2016
Editors: Theresa M. Roeder, Peter I. Frazier, Robert Szechtman, Enlu Zhou
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 702-713
Number of pages: 12
ISBN (Electronic): 9781509044863
State: Published - Jul 2 2016
Event: 2016 Winter Simulation Conference, WSC 2016 - Arlington, United States
Duration: Dec 11 2016 - Dec 14 2016

Publication series

Name: Proceedings - Winter Simulation Conference
Volume: 0
ISSN (Print): 0891-7736

All Science Journal Classification (ASJC) codes

  • Software
  • Modeling and Simulation
  • Computer Science Applications
