Smooth function approximation using neural networks

Silvia Ferrari, Robert Frank Stengel

Research output: Contribution to journal › Article › peer-review

222 Scopus citations

Abstract

An algebraic approach for representing multidimensional nonlinear functions by feedforward neural networks is presented. In this paper, the approach is implemented for the approximation of smooth batch data containing the function's input, output, and possibly gradient information. The training set is associated with the network's adjustable parameters by nonlinear weight equations. The cascade structure of these equations reveals that they can be treated as sets of linear systems; hence, the training process and the network approximation properties can be investigated via linear algebra. Four algorithms are developed to achieve exact or approximate matching of input-output and/or gradient-based training sets. Their application to the design of forward and feedback neurocontrollers shows that algebraic training is characterized by faster execution speeds and better generalization properties than contemporary optimization techniques.
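The paper's four algorithms are not reproduced in this abstract, but the core idea it describes, treating the nonlinear weight equations as linear systems, can be illustrated with a minimal sketch: if the input-side weights and biases of a single-hidden-layer network are chosen and held fixed, exact matching of an input-output training set reduces to solving one linear system for the output weights. All names below, and the choice of tanh as the sigmoidal activation, are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    # tanh as the sigmoidal activation (an assumption for this sketch)
    return np.tanh(x)

# Toy training set: p samples of a smooth scalar function y = f(x)
rng = np.random.default_rng(0)
p = 20
X = rng.uniform(-1.0, 1.0, size=(p, 2))      # inputs, 2-dimensional
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2     # outputs

# One hidden node per training sample; input weights W and biases d are
# chosen at random and then held fixed, which makes the remaining
# weight equations linear in the output weights v.
W = rng.normal(size=(p, 2))
d = rng.normal(size=p)

# Linear weight equations: S v = y, with S[i, j] = sigma(w_j . x_i + d_j)
S = sigmoid(X @ W.T + d)                     # p x p sigmoid matrix
v = np.linalg.solve(S, y)                    # output weights via linear algebra

# The network now matches the training data exactly (up to round-off)
y_hat = sigmoid(X @ W.T + d) @ v
print(np.max(np.abs(y_hat - y)))
```

In this sketch the number of hidden nodes equals the number of training samples, so the sigmoid matrix is square and (generically) invertible; the paper's approximate-matching and gradient-based variants relax this one-node-per-sample structure.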

Original language: English (US)
Pages (from-to): 24-38
Number of pages: 15
Journal: IEEE Transactions on Neural Networks
Volume: 16
Issue number: 1
DOIs
State: Published - Jan 1 2005

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

