## Abstract

We develop deterministic necessary and sufficient conditions on the individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose {ρ_{n}} is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm x_{n+1} = x_{n} - a_{n}(A_{n}x_{n} - b_{n}) + a_{n}e_{n}, where {x_{n}} is the iterate sequence, {a_{n}} is the step-size sequence, {e_{n}} is the noise sequence, and x* is the desired zero of the function f(x) = Ax - b. Then, under appropriate assumptions, we show that x_{n} - x* = o(ρ_{n}) if and only if the sequence {e_{n}} satisfies one of five equivalent conditions. These conditions are based on well-known conditions on noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on {e_{n}} for a convergence rate of {ρ_{n}} is essentially that the sequence {e_{n}/ρ_{n}} satisfy any one of the five well-known conditions above. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if a_{n} = a/n and {e_{n}} is a martingale difference process with bounded variance, then x_{n} - x* = o(n^{-1/2}(log(n))^{β}) for any β > 1/2.
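The martingale-difference example in the abstract is easy to check numerically. The sketch below (an illustration, not taken from the paper) runs the scalar recursion x_{n+1} = x_{n} - a_{n}(A x_{n} - b) + a_{n}e_{n} with a_{n} = a/n and i.i.d. Gaussian noise, then rescales the final error by ρ_{n} = n^{-1/2}(log n)^{β}; the specific constants A = b = a = 1 and β = 1 are illustrative assumptions.

```python
import math
import random

def sa_iterates(n_steps, a=1.0, A=1.0, b=1.0, x0=0.0, seed=0):
    """Run the scalar stochastic approximation
    x_{n+1} = x_n - a_n (A x_n - b) + a_n e_n
    with step sizes a_n = a/n and i.i.d. Gaussian noise e_n
    (a martingale difference sequence with bounded variance)."""
    rng = random.Random(seed)
    x = x0
    xs = [x]
    for n in range(1, n_steps + 1):
        a_n = a / n
        e_n = rng.gauss(0.0, 1.0)
        x = x - a_n * (A * x - b) + a_n * e_n
        xs.append(x)
    return xs

x_star = 1.0  # zero of f(x) = x - 1 (A = b = 1)
xs = sa_iterates(100_000)
n = len(xs) - 1
beta = 1.0  # any beta > 1/2 works in the stated rate

# If x_n - x* = o(n^{-1/2} (log n)^beta), this ratio should be small.
rho_n = n ** -0.5 * math.log(n) ** beta
ratio = abs(xs[-1] - x_star) / rho_n
print(ratio)
```

Since sqrt(n)(x_{n} - x*) stays stochastically bounded in this setting, dividing by the extra (log n)^β factor drives the ratio toward zero as n grows.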

| Original language | English (US) |
|---|---|
| Pages (from-to) | 810-814 |
| Number of pages | 5 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 45 |
| Issue number | 2 |
| State | Published - 1999 |
| Externally published | Yes |

## All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences

## Keywords

- Convergence rate
- Kiefer-Wolfowitz
- Necessary and sufficient conditions
- Noise sequences
- Robbins-Monro
- Stochastic approximation