Past is important: A method for determining memory structure in NARX neural networks

C. Lee Giles, Tsungnan Lin, Bill G. Horne, S. Y. Kung

Research output: Contribution to conference › Paper

4 Scopus citations

Abstract

Recurrent networks have become popular models for system identification and time series prediction. NARX (Nonlinear AutoRegressive models with eXogenous inputs) network models are a popular subclass of recurrent networks and have been used in many applications. Though embedded memory can be found in all recurrent network models, it is particularly prominent in NARX models. We show that using intelligent memory order selection through pruning and good initial heuristics significantly improves the generalization and predictive performance of these nonlinear systems on problems as diverse as grammatical inference and time series prediction.
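The memory-order selection the abstract describes can be illustrated with a minimal sketch. This is not the authors' pruning method; it is a simplified, hypothetical example assuming a linear NARX map, where candidate memory orders are compared by validation error and the best one is kept. The function names (`narx_design`, `fit_predict`) and the data-generating process are illustrative assumptions, not from the paper.

```python
import numpy as np

def narx_design(y, u, ny, nu):
    """Build the NARX regressor matrix (the embedded tapped-delay memory).
    Row t contains [y(t-1)..y(t-ny), u(t-1)..u(t-nu)]; the target is y(t)."""
    start = max(ny, nu)
    rows, targets = [], []
    for t in range(start, len(y)):
        rows.append(np.concatenate([y[t - ny:t][::-1], u[t - nu:t][::-1]]))
        targets.append(y[t])
    return np.array(rows), np.array(targets)

def fit_predict(ytr, utr, yva, uva, ny, nu):
    """Least-squares NARX fit (linear f, for simplicity of illustration);
    returns one-step-ahead mean squared error on the validation split."""
    Xtr, ttr = narx_design(ytr, utr, ny, nu)
    w, *_ = np.linalg.lstsq(np.c_[Xtr, np.ones(len(Xtr))], ttr, rcond=None)
    Xva, tva = narx_design(yva, uva, ny, nu)
    pred = np.c_[Xva, np.ones(len(Xva))] @ w
    return float(np.mean((pred - tva) ** 2))

# Hypothetical data from a known second-order NARX process.
rng = np.random.default_rng(0)
u = rng.standard_normal(600)
y = np.zeros(600)
for t in range(2, 600):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + 0.8 * u[t - 1] \
           + 0.01 * rng.standard_normal()

# Select the memory order that minimizes validation error.
errors = {ny: fit_predict(y[:400], u[:400], y[400:], u[400:], ny, ny)
          for ny in range(1, 6)}
best = min(errors, key=errors.get)
print("selected memory order:", best)
```

Because the generating process has order two, an order-one model underfits and larger orders add only unneeded parameters, so validation error picks out (approximately) the true memory depth. The paper's approach operates on nonlinear NARX networks via pruning and initialization heuristics rather than this exhaustive linear comparison.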

Original language: English (US)
Pages: 1834-1839
Number of pages: 6
State: Published - Jan 1 1998
Event: Proceedings of the 1998 IEEE International Joint Conference on Neural Networks. Part 1 (of 3) - Anchorage, AK, USA
Duration: May 4 1998 - May 9 1998


All Science Journal Classification (ASJC) codes

  • Software


  • Cite this

Giles, C. L., Lin, T., Horne, B. G., & Kung, S. Y. (1998). Past is important: A method for determining memory structure in NARX neural networks (pp. 1834-1839). Paper presented at the 1998 IEEE International Joint Conference on Neural Networks, Part 1 (of 3), Anchorage, AK, USA.