Consistency in models for communication constrained distributed learning

Research output: Contribution to journal › Conference article

1 Scopus citation

Abstract

Motivated by sensor networks and other distributed settings, several models for distributed learning are presented. The models differ from classical works in statistical pattern recognition by allocating the observations of an i.i.d. sampling process among members of a network of learning agents. The agents are limited in their ability to communicate with a fusion center, so the amount of information available for classification or regression is constrained. For several simple communication models, questions of universal consistency are addressed; i.e., the asymptotics of several agent decision rules and fusion rules are considered in both binary classification and regression frameworks. These models resemble distributed environments and raise new questions regarding universal consistency. Insofar as these models offer a useful picture of distributed scenarios, this paper considers whether the guarantees provided by Stone's Theorem in centralized environments continue to hold in distributed settings.
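To make the setting concrete, the following is a minimal sketch of one agent-decision-rule/fusion-rule pair of the kind the abstract alludes to, not the paper's actual scheme: each agent holds a few labeled observations, sends the fusion center a single bit (a hypothetical nearest-neighbor vote on the query point), and the fusion center takes a majority vote. All names and the particular rules are illustrative assumptions.

```python
import random

def agent_vote(samples, x):
    """Illustrative agent decision rule (assumed, not from the paper):
    send one bit, the label of the agent's observation nearest to x."""
    nearest = min(samples, key=lambda s: abs(s[0] - x))
    return nearest[1]

def fusion_center(votes):
    """Illustrative fusion rule: majority vote over the one-bit messages."""
    return int(sum(votes) > len(votes) / 2)

# Toy data: labels follow a threshold rule y = 1{x > 0.5}.
random.seed(0)
def draw(n):
    return [(x, int(x > 0.5)) for x in (random.random() for _ in range(n))]

agents = [draw(3) for _ in range(51)]   # 51 agents, 3 observations each
x_query = 0.8
votes = [agent_vote(a, x_query) for a in agents]
print(fusion_center(votes))
```

Consistency questions of the kind the paper studies then ask whether such fused decisions converge to the Bayes decision as the number of agents and observations grows, despite each agent contributing only one bit.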

Original language: English (US)
Pages (from-to): 442-456
Number of pages: 15
Journal: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 3120
DOIs
State: Published - Jan 1 2004
Event: 17th Annual Conference on Learning Theory, COLT 2004 - Banff, Canada
Duration: Jul 1 2004 - Jul 4 2004

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)
