Training Conditional Random Fields for Maximum Labelwise Accuracy

Samuel S. Gross, Chuong B. Do, Olga Russakovsky, Serafim Batzoglou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations

Abstract

We consider the problem of training a conditional random field (CRF) to maximize per-label predictive accuracy on a training set, an approach motivated by the principle of empirical risk minimization. We give a gradient-based procedure for minimizing an arbitrarily accurate approximation of the empirical risk under a Hamming loss function. In experiments with both simulated and real data, our optimization procedure gives significantly better testing performance than several current approaches for CRF training, especially in situations of high label noise.
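The abstract describes gradient-based optimization of a smooth approximation to per-label (Hamming) accuracy. The sketch below is not the paper's exact objective; it assumes a sigmoid-smoothed margin between the posterior marginal of the true label and its strongest competitor in a toy linear-chain CRF, with marginals computed by forward-backward and gradients obtained through JAX autodiff. All function names, shapes, the smoothing constant `lam`, and the step size are illustrative.

```python
# Hypothetical sketch (not the paper's exact formulation): gradient ascent on a
# sigmoid-smoothed surrogate of labelwise accuracy for a linear-chain CRF.
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp

def log_potentials(params, x):
    """Node and edge log-potentials for a toy linear-chain CRF.
    x: (T, F) observed features -> node scores (T, K), edge scores (K, K)."""
    W, A = params  # W: (F, K) emission weights, A: (K, K) transition weights
    return x @ W, A

def marginals(params, x):
    """Posterior label marginals P(y_t = k | x) via forward-backward in log space."""
    node, edge = log_potentials(params, x)  # (T, K), (K, K)
    T, K = node.shape

    def fwd(alpha, n_t):  # alpha[t, k] = logsumexp_j(alpha[t-1, j] + edge[j, k]) + node[t, k]
        a = logsumexp(alpha[:, None] + edge, axis=0) + n_t
        return a, a
    _, alphas = jax.lax.scan(fwd, node[0], node[1:])
    alphas = jnp.vstack([node[0][None, :], alphas])  # (T, K)

    def bwd(beta, n_t):  # beta[t, j] = logsumexp_k(edge[j, k] + node[t+1, k] + beta[t+1, k])
        b = logsumexp(edge + (n_t + beta)[None, :], axis=1)
        return b, b
    _, betas = jax.lax.scan(bwd, jnp.zeros(K), node[::-1][:-1])
    betas = jnp.vstack([betas[::-1], jnp.zeros((1, K))])  # (T, K)

    log_marg = alphas + betas
    return jnp.exp(log_marg - logsumexp(log_marg, axis=1, keepdims=True))

def smoothed_labelwise_accuracy(params, x, y, lam=10.0):
    """Differentiable surrogate for per-label accuracy: sigmoid of the margin
    between the true label's marginal and its best competing marginal."""
    p = marginals(params, x)                      # (T, K)
    p_true = p[jnp.arange(len(y)), y]
    mask = jax.nn.one_hot(y, p.shape[1]) > 0
    p_other = jnp.where(mask, -1.0, p).max(axis=1)
    return jnp.mean(jax.nn.sigmoid(lam * (p_true - p_other)))

# Toy usage: ascend the surrogate with plain gradient steps (data is random).
key = jax.random.PRNGKey(0)
T, F, K = 20, 5, 3
x = jax.random.normal(key, (T, F))
y = jax.random.randint(key, (T,), 0, K)
params = (0.01 * jax.random.normal(key, (F, K)), jnp.zeros((K, K)))
grad_fn = jax.grad(smoothed_labelwise_accuracy)
for _ in range(100):
    g = grad_fn(params, x, y)
    params = jax.tree_util.tree_map(lambda p, gi: p + 0.1 * gi, params, g)
```

As `lam` grows, the sigmoid approaches the 0/1 indicator of a correct per-label prediction, so the surrogate can be made an arbitrarily close (though increasingly non-smooth) approximation of empirical Hamming accuracy.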

Original language: English (US)
Title of host publication: NIPS 2006
Subtitle of host publication: Proceedings of the 19th International Conference on Neural Information Processing Systems
Editors: Bernhard Schölkopf, John C. Platt, Thomas Hofmann
Publisher: MIT Press Journals
Pages: 529-536
Number of pages: 8
ISBN (Electronic): 0262195682, 9780262195683
State: Published - 2006
Externally published: Yes
Event: 19th International Conference on Neural Information Processing Systems, NIPS 2006 - Vancouver, Canada
Duration: Dec 4 2006 - Dec 7 2006

Publication series

Name: NIPS 2006: Proceedings of the 19th International Conference on Neural Information Processing Systems

Conference

Conference: 19th International Conference on Neural Information Processing Systems, NIPS 2006
Country/Territory: Canada
City: Vancouver
Period: 12/4/06 - 12/7/06

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
