Maximum entropy distribution estimation with generalized regularization

Miroslav Dudík, Robert E. Schapire

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

23 Scopus citations

Abstract

We present a unified and complete account of maximum entropy distribution estimation subject to constraints represented by convex potential functions or, alternatively, by convex regularization. We provide fully general performance guarantees and an algorithm with a complete convergence proof. As special cases, we can easily derive performance guarantees for many known regularization types, including ℓ1, ℓ2, ℓ2², and ℓ1 + ℓ2²-style regularization. Furthermore, our general approach enables us to use information about the structure of the feature space or about sample selection bias to derive entirely new regularization functions with superior guarantees. We propose an algorithm solving a large and general subclass of generalized maxent problems, including all those discussed in the paper, and prove its convergence. Our approach generalizes techniques based on information geometry and Bregman divergences as well as those based more directly on compactness.
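To make the ℓ1-regularized special case concrete: on a finite domain, regularized maxent can be solved through its dual, minimizing log Z(w) − w·Ê[f] + λ‖w‖₁ over feature weights w. The sketch below uses proximal gradient descent (soft-thresholding) on hypothetical toy data; it is an illustration of the ℓ1 case only, not the paper's algorithm, and all names and data are invented for the example.

```python
import numpy as np

def maxent_l1(feats, emp_mean, lam=0.01, lr=0.5, iters=2000):
    """l1-regularized maxent dual via proximal gradient (ISTA).

    feats: (n_points, n_features) feature values over a finite domain.
    emp_mean: empirical feature means from a sample.
    Returns the weights w and the Gibbs distribution p_w.
    """
    w = np.zeros(feats.shape[1])
    for _ in range(iters):
        logits = feats @ w
        p = np.exp(logits - logits.max())
        p /= p.sum()                       # Gibbs distribution q_w over the domain
        grad = feats.T @ p - emp_mean      # gradient of log Z(w) - w . emp_mean
        w = w - lr * grad                  # gradient step on the smooth part
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # prox of lam*||w||_1
    return w, p

# Toy domain of four points with two binary features (hypothetical data).
X = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]])
samples = np.array([0, 0, 1, 2, 1, 0])     # indices of observed points
emp = X[samples].mean(axis=0)
w, p = maxent_l1(X, emp)
```

At the optimum, each model feature expectation lies within λ of the empirical mean, which is exactly the relaxed "box" constraint that ℓ1 regularization corresponds to in the primal.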

Original language: English (US)
Title of host publication: Learning Theory - 19th Annual Conference on Learning Theory, COLT 2006, Proceedings
Publisher: Springer Verlag
Pages: 123-138
Number of pages: 16
ISBN (Print): 3540352945, 9783540352945
DOI: https://doi.org/10.1007/11776420_12
State: Published - 2006
Event: 19th Annual Conference on Learning Theory, COLT 2006 - Pittsburgh, PA, United States
Duration: Jun 22 2006 – Jun 25 2006

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4005 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)

  • Cite this

    Dudík, M., & Schapire, R. E. (2006). Maximum entropy distribution estimation with generalized regularization. In Learning Theory - 19th Annual Conference on Learning Theory, COLT 2006, Proceedings (pp. 123-138). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 4005 LNAI). Springer Verlag. https://doi.org/10.1007/11776420_12