Consistency in models for distributed learning under communication constraints

Research output: Contribution to journal › Article › peer-review

26 Scopus citations

Abstract

Motivated by sensor networks and other distributed settings, several models for distributed learning are presented. The models differ from classical works in statistical pattern recognition by allocating the observations of an independent and identically distributed (i.i.d.) sampling process among the members of a network of simple learning agents. The agents are limited in their ability to communicate with a central fusion center, and thus the amount of information available for classification or regression is constrained. For several basic communication models, in both the binary classification and regression frameworks, we ask whether agent decision rules and fusion rules exist that yield a universally consistent ensemble; the answers to this question raise new issues regarding universal consistency. The paper thereby addresses whether the guarantees provided by Stone's theorem in centralized environments carry over to distributed settings.
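To make the setting concrete, the sketch below simulates one simple communication model of the kind described in the abstract; it is an illustrative assumption, not the paper's exact protocol. Each agent holds a single labeled observation and, for a query point, sends at most one bit to the fusion center (its label if the query falls within a shrinking radius, silence otherwise); the fusion center classifies by majority vote. The radius schedule and the abstention option are hypothetical choices for illustration.

```python
# Minimal sketch of a communication-constrained ensemble (illustrative only):
# n agents each hold one i.i.d. labeled observation (x_i, y_i). For a query x,
# an agent reports its label bit only if ||x_i - x|| <= r_n; otherwise it
# abstains. The fusion center takes a majority vote over the received bits.
import numpy as np

rng = np.random.default_rng(0)

def agent_messages(X, y, x_query, radius):
    """Bits sent to the fusion center: labels of agents near the query."""
    close = np.linalg.norm(X - x_query, axis=1) <= radius
    return y[close]

def fusion_rule(bits):
    """Majority vote; with no messages or a tie, default to class 0."""
    if bits.size == 0:
        return 0
    return int(bits.mean() > 0.5)

# Toy experiment: the label depends on the first coordinate of x.
n, d = 2000, 2                       # number of agents (one observation each)
X = rng.uniform(-1, 1, size=(n, d))
y = (X[:, 0] > 0).astype(int)

r_n = n ** (-1.0 / (d + 2))          # shrinking radius, an illustrative choice
x_test = np.array([0.4, -0.3])
print("ensemble decision:", fusion_rule(agent_messages(X, y, x_test, r_n)))
```

The question studied in the paper is whether rules of this general form can be chosen so that the ensemble's risk converges to the Bayes risk for every distribution, as it does for classical centralized rules covered by Stone's theorem.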

Original language: English (US)
Pages (from-to): 52-63
Number of pages: 12
Journal: IEEE Transactions on Information Theory
Volume: 52
Issue number: 1
DOIs
State: Published - Jan 2006

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Classification
  • Consistency
  • Distributed learning
  • Nonparametric
  • Regression
  • Sensor networks
  • Statistical pattern recognition
