The hierarchical hidden Markov model: Analysis and applications

Shai Fine, Yoram Singer, Naftali Tishby

Research output: Contribution to journal › Article › peer-review

687 Scopus citations


We introduce, analyze and demonstrate a recursive hierarchical generalization of the widely used hidden Markov models, which we name Hierarchical Hidden Markov Models (HHMM). Our model is motivated by the complex multi-scale structure that appears in many natural sequences, particularly in language, handwriting and speech. We seek a systematic unsupervised approach to the modeling of such structures. By extending the standard Baum-Welch (forward-backward) algorithm, we derive an efficient procedure for estimating the model parameters from unlabeled data. We then use the trained model for automatic hierarchical parsing of observation sequences. We describe two applications of our model and its parameter estimation procedure. In the first application we show how to construct hierarchical models of natural English text. In these models, different levels of the hierarchy correspond to structures on different length scales in the text. In the second application we demonstrate how HHMMs can be used to automatically identify repeated strokes that represent combinations of letters in cursive handwriting.
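The abstract's estimation procedure builds on the standard forward-backward recursion. As a point of reference, here is a minimal sketch of the E-step of Baum-Welch for a *flat* HMM in NumPy; the paper's contribution is to generalize this recursion to run over a state hierarchy, which this sketch does not attempt. All names and the two-state example below are illustrative, not from the paper.

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """E-step of Baum-Welch for a flat HMM (illustrative baseline only).

    pi:  (K,)   initial state probabilities
    A:   (K, K) transitions, A[i, j] = P(state j | state i)
    B:   (K, M) emissions,   B[i, o] = P(obs o | state i)
    obs: sequence of observation indices

    Returns gamma (T, K): per-timestep posterior state probabilities.
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))  # forward messages
    beta = np.zeros((T, K))   # backward messages

    # Forward pass: alpha[t, j] = P(obs[:t+1], state_t = j)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] = P(obs[t+1:] | state_t = i)
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # Combine and normalize to get state posteriors
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma
```

In the HHMM setting, the analogous recursion must additionally track which internal (non-leaf) state activated each sub-model, which is what the paper's extended algorithm handles.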

Original language: English (US)
Pages (from-to): 41-62
Number of pages: 22
Journal: Machine Learning
Issue number: 1
State: Published - Jul 1998

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence


Keywords

  • Cursive handwriting
  • Hidden variable models
  • Statistical models
  • Temporal pattern recognition


