Abstract
We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose {ρn} is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm xn+1 = xn - an(Anxn - bn) + anen, where {xn} is the iterate sequence, {an} is the step size sequence, {en} is the noise sequence, and x* is the desired zero of the function f(x) = Ax - b. Then, under appropriate assumptions, we show that xn - x* = o(ρn) if and only if the sequence {en} satisfies one of five equivalent conditions. These conditions are based on well-known formulas for noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on {en} to achieve a convergence rate of {ρn} is basically that the sequence {en/ρn} satisfies any one of the above five well-known conditions. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if an = a/n and {en} is a martingale difference process with bounded variance, then xn - x* = o(n^(-1/2)(log n)^β) for any β > 1/2.
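As a rough illustration of the rate recovered in the last sentence, the following sketch simulates the scalar recursion xn+1 = xn - an(xn) + anen with A = 1, b = 0 (so x* = 0), step size an = a/n, and i.i.d. Gaussian noise (a martingale difference sequence with bounded variance), then compares the final error against ρn = n^(-1/2)(log n)^β. All concrete choices (initial point, a = 1, β = 0.6, the seed) are illustrative assumptions, not from the paper, and a single run only suggests, rather than proves, the o(ρn) behavior.

```python
import math
import random

def run_sa(n_steps, a=1.0, seed=0):
    """Simulate x_{n+1} = x_n - a_n*(A*x_n - b) + a_n*e_n with
    scalar A = 1, b = 0 (so x* = 0), step size a_n = a/n, and
    i.i.d. standard normal noise e_n (a martingale difference
    sequence with bounded variance)."""
    rng = random.Random(seed)
    x = 1.0  # arbitrary starting point (an assumption for illustration)
    for n in range(1, n_steps + 1):
        a_n = a / n
        e_n = rng.gauss(0.0, 1.0)
        x = x - a_n * x + a_n * e_n
    return x

# Compare |x_n - x*| against the claimed rate rho_n = n^(-1/2) * (log n)^beta.
n = 100_000
beta = 0.6  # any beta > 1/2 works in the theorem
err = abs(run_sa(n))
rho = n ** -0.5 * math.log(n) ** beta
print(f"error = {err:.5f}, rho_n = {rho:.5f}, ratio = {err / rho:.3f}")
```

On a typical run the ratio err/ρn is well below 1, consistent with (though of course not a proof of) xn - x* = o(ρn).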
| Original language | English (US) |
|---|---|
| Pages (from-to) | 810-814 |
| Number of pages | 5 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 45 |
| Issue number | 2 |
| DOIs | |
| State | Published - 1999 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Convergence rate
- Kiefer-Wolfowitz
- Necessary and sufficient conditions
- Noise sequences
- Robbins-Monro
- Stochastic approximation