A delay damage model selection algorithm for NARX neural networks

Tsung-Nan Lin, C. Lee Giles, Bill G. Horne, Sun-Yuan Kung

Research output: Contribution to journal › Article › peer-review

105 Scopus citations


Recurrent neural networks have become popular models for system identification and time series prediction. Nonlinear autoregressive models with exogenous inputs (NARX neural networks) are a popular subclass of recurrent networks and have been used in many applications. Although embedded memory can be found in all recurrent network models, it is particularly prominent in NARX models. We show that intelligent memory order selection through pruning, combined with good initialization heuristics, significantly improves the generalization and predictive performance of these nonlinear systems on problems as diverse as grammatical inference and time series prediction.

Original language: English (US)
Pages (from-to): 2719-2730
Number of pages: 12
Journal: IEEE Transactions on Signal Processing
Issue number: 11
State: Published - 1997

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering


Keywords

  • Automata
  • Autoregressive
  • Embedding theory
  • Gradient descent training
  • Latching
  • Long-term dependencies
  • Memory
  • NARX
  • Networks
  • Pruning
  • Recurrent neural networks
  • Tapped-delay lines
  • Temporal sequences
  • Time series


