Randomized optimum models for structured prediction

Daniel Tarlow, Ryan P. Adams, Richard S. Zemel

Research output: Contribution to journal › Conference article › peer-review

14 Scopus citations

Abstract

One approach to modeling structured discrete data is to describe the probability of states via an energy function and Gibbs distribution. A recurring difficulty in these models is the computation of the partition function, which may require an intractable sum. However, in many such models, the mode can be found efficiently even when the partition function is unavailable. Recent work on Perturb-and-MAP (PM) models (Papandreou and Yuille, 2011) has exploited this discrepancy to approximate the Gibbs distribution for Markov random fields (MRFs). Here, we explore a broader class of models, called Randomized Optimum models (RandOMs), which include PM as a special case. This new class of models encompasses not only MRFs, but also other models that have intractable partition functions yet permit efficient mode-finding, such as those based on bipartite matchings, shortest paths, or connected components in a graph. We develop likelihood-based learning algorithms for RandOMs, which, empirical results indicate, can produce better models than PM.
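For intuition, here is a minimal, hypothetical sketch (Python/NumPy, not code from the paper) of the perturb-then-optimize idea on a toy state space small enough to enumerate: adding an independent Gumbel perturbation to each configuration's negative energy and returning the mode of the perturbed objective yields exact samples from the Gibbs distribution. PM and RandOMs instead use structured, lower-order randomizations so that the argmax remains an efficient mode-finding problem; the energy values below are invented purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy energy function over 4 discrete configurations (hypothetical values).
energies = np.array([1.0, 0.5, 2.0, 0.2])

# Gibbs distribution: p(x) proportional to exp(-E(x)). The partition function
# is a simple sum here, but in structured models it is typically intractable.
gibbs = np.exp(-energies) / np.exp(-energies).sum()

def perturb_and_map_sample(energies, rng):
    # Perturb each configuration's negative energy with i.i.d. Gumbel noise,
    # then return the mode (argmax) of the perturbed objective.
    gumbel = rng.gumbel(size=energies.shape)
    return np.argmax(-energies + gumbel)

# With a full independent Gumbel perturbation per configuration, the histogram
# of perturb-and-MAP samples matches the Gibbs distribution (Gumbel-max trick).
samples = [perturb_and_map_sample(energies, rng) for _ in range(100_000)]
empirical = np.bincount(samples, minlength=len(energies)) / len(samples)
print("Gibbs:    ", np.round(gibbs, 3))
print("Empirical:", np.round(empirical, 3))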

Original language: English (US)
Pages (from-to): 1221-1229
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 22
State: Published - 2012
Externally published: Yes
Event: 15th International Conference on Artificial Intelligence and Statistics, AISTATS 2012 - La Palma, Spain
Duration: Apr 21 2012 - Apr 23 2012

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability

