A Systolic Neural Network Architecture for Hidden Markov Models

Jenq Neng Hwang, John A. Vlontzos, Sun Yuan Kung

Research output: Contribution to journal › Comment/debate › peer-review

30 Scopus citations

Abstract

This paper advocates a systolic neural network architecture for implementing hidden Markov models (HMMs). A programmable systolic array is proposed, which maximizes the strength of VLSI in terms of intensive and pipelined computing and yet circumvents its limitation on communication. A unified algorithmic formulation for recurrent back-propagation (RBP) networks and HMMs is exploited in the architectural design, yielding the basic structure of a universal simulation tool for these connectionist networks. These networks accomplish information storage and retrieval by altering the pattern of connections among a large number of primitive units, and/or by modifying the weights associated with each connection. Important concerns are also discussed: partitioning for large networks, fault tolerance for the ring array architecture, scaling to avoid underflow, and architectures for locally interconnected networks. Finally, implementations based on commercially available VLSI chips (e.g., the Inmos T800) and on custom VLSI technology are discussed.
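The "scaling for avoiding underflow" that the abstract mentions refers to a standard difficulty in HMM evaluation: the forward variables shrink geometrically with sequence length and underflow in fixed- or floating-point hardware. A common remedy (shown here as an illustrative sketch in NumPy, not the paper's systolic formulation; the function name and variable names are assumptions) renormalizes the forward variables at every time step and accumulates the log of the scale factors:

```python
import numpy as np

def scaled_forward(A, B, pi, obs):
    """HMM forward algorithm with per-step scaling to avoid underflow.

    A  : (N, N) state-transition matrix, A[i, j] = P(state j | state i)
    B  : (N, M) emission matrix, B[i, k] = P(symbol k | state i)
    pi : (N,)   initial state distribution
    obs: sequence of observation symbol indices

    Returns the scaled forward variables (each row sums to 1) and
    log P(obs | model), recovered as the sum of log scale factors.
    """
    T = len(obs)
    N = A.shape[0]
    alpha = np.zeros((T, N))
    scale = np.zeros(T)

    # Initialization, then rescale so the row sums to 1.
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]

    # Induction: propagate one step, then rescale immediately,
    # so intermediate values never underflow.
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    return alpha, np.log(scale).sum()
```

Because the product of the scale factors equals the unscaled likelihood, log P(obs) is obtained without ever forming the underflow-prone product itself.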

Original language: English (US)
Pages (from-to): 1967-1979
Number of pages: 13
Journal: IEEE Transactions on Acoustics, Speech, and Signal Processing
Volume: 37
Issue number: 12
DOIs
State: Published - Dec 1989
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Signal Processing

