TY - JOUR

T1 - Noise conditions for prespecified convergence rates of stochastic approximation algorithms

AU - Chong, Edwin K.P.

AU - Wang, I-Jeng

AU - Kulkarni, Sanjeev R.

N1 - Funding Information:
Manuscript received September 5, 1997; revised April 9, 1998. This work was supported in part by the National Science Foundation under Grants ECS-9501652 and IRI-9457645. E. K. P. Chong is with the School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907-1285 USA. I-J. Wang is with the Research and Technology Development Center, The Johns Hopkins University Applied Physics Laboratory, Laurel, MD 20723-6099 USA (e-mail: I-Jeng.Wang@jhuapl.edu). S. R. Kulkarni is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA. Communicated by P. Moulin, Associate Editor for Nonparametric Estimation, Classification, and Neural Networks. Publisher Item Identifier S 0018-9448(99)01417-0.

PY - 1999

Y1 - 1999

N2 - We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose {ρ_n} is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm x_{n+1} = x_n - a_n(A_n x_n - b_n) + a_n e_n, where {x_n} is the iterate sequence, {a_n} is the step-size sequence, {e_n} is the noise sequence, and x* is the desired zero of the function f(x) = Ax - b. Then, under appropriate assumptions, we show that x_n - x* = o(ρ_n) if and only if the sequence {e_n} satisfies one of five equivalent conditions. These conditions are based on well-known formulas for noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on {e_n} to achieve a convergence rate of {ρ_n} is basically that the sequence {e_n/ρ_n} satisfies any one of the five well-known conditions above. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if a_n = a/n and {e_n} is a martingale difference process with bounded variance, then x_n - x* = o(n^{-1/2}(log n)^β) for any β > 1/2.

AB - We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose {ρ_n} is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm x_{n+1} = x_n - a_n(A_n x_n - b_n) + a_n e_n, where {x_n} is the iterate sequence, {a_n} is the step-size sequence, {e_n} is the noise sequence, and x* is the desired zero of the function f(x) = Ax - b. Then, under appropriate assumptions, we show that x_n - x* = o(ρ_n) if and only if the sequence {e_n} satisfies one of five equivalent conditions. These conditions are based on well-known formulas for noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on {e_n} to achieve a convergence rate of {ρ_n} is basically that the sequence {e_n/ρ_n} satisfies any one of the five well-known conditions above. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if a_n = a/n and {e_n} is a martingale difference process with bounded variance, then x_n - x* = o(n^{-1/2}(log n)^β) for any β > 1/2.

KW - Convergence rate

KW - Kiefer-Wolfowitz

KW - Necessary and sufficient conditions

KW - Noise sequences

KW - Robbins-Monro

KW - Stochastic approximation

UR - http://www.scopus.com/inward/record.url?scp=0033100443&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033100443&partnerID=8YFLogxK

U2 - 10.1109/18.749035

DO - 10.1109/18.749035

M3 - Article

AN - SCOPUS:0033100443

SN - 0018-9448

VL - 45

SP - 810

EP - 814

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

IS - 2

ER -