The complexity of model classes, and smoothing noisy data

Peter L. Bartlett, Sanjeev R. Kulkarni

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

We consider the problem of smoothing a sequence of noisy observations using a fixed class of models. Via a deterministic analysis, we obtain necessary and sufficient conditions on the noise sequence and model class that ensure that a class of natural estimators gives near-optimal smoothing. In the case of i.i.d. random noise, we show that the accuracy of these estimators depends on a measure of complexity of the model class involving covering numbers. Our formulation and results are quite general and are related to a number of problems in learning, prediction, and estimation. As a special case, we consider an application to output smoothing for certain classes of linear and nonlinear systems. The performance of output smoothing is given in terms of natural complexity parameters of the model class, such as bounds on the order of linear systems, the l1-norm of the impulse response of stable linear systems, or the memory of a Lipschitz nonlinear system satisfying a fading memory condition.
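The following is a minimal illustrative sketch, not taken from the paper: it assumes the "natural estimator" referred to in the abstract is the member of a fixed model class that minimizes empirical squared error against the noisy observations, and it uses a toy finite class of polynomial trajectories as a stand-in for an epsilon-cover of the model class. All names and parameters below are hypothetical.

```python
# Hypothetical sketch: smooth a noisy sequence by picking the member of a
# fixed, finite model class that best fits the observations in squared error.
# The paper's estimators and bounds are more general; this only illustrates
# the basic setup (model class, noisy observations, empirical fit).
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = np.linspace(0.0, 1.0, n)

# Toy finite model class: random low-order polynomial trajectories,
# standing in for an epsilon-cover of a richer class (the covering-number
# viewpoint mentioned in the abstract).
model_class = [np.polyval(coeffs, t)
               for coeffs in rng.uniform(-1.0, 1.0, size=(500, 3))]

# Noisy observations of an (unknown) target trajectory from the class.
target = model_class[17]
y = target + 0.1 * rng.standard_normal(n)

# "Natural" estimator (as assumed here): empirical least-squares fit
# over the model class.
errors = [np.mean((y - f) ** 2) for f in model_class]
f_hat = model_class[int(np.argmin(errors))]

print("smoothing error vs. target:", np.mean((f_hat - target) ** 2))
print("noise variance:            ", 0.1 ** 2)
```

In this toy setting the smoothing error of the selected model is well below the noise variance; the paper's results characterize when and how sharply such near-optimal smoothing is possible in terms of covering numbers of the model class.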

Original language: English (US)
Pages (from-to): 133-140
Number of pages: 8
Journal: Systems and Control Letters
Volume: 34
Issue number: 3
DOIs
State: Published - Jun 18 1998

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • General Computer Science
  • Mechanical Engineering
  • Electrical and Electronic Engineering

Keywords

  • Computational learning theory
  • Covering numbers
  • Smoothing
  • System identification
