Smooth function approximation using neural networks

Silvia Ferrari, Robert F. Stengel

Research output: Contribution to journal › Article › peer-review

262 Scopus citations


An algebraic approach for representing multidimensional nonlinear functions by feedforward neural networks is presented. In this paper, the approach is implemented for the approximation of smooth batch data containing the function's input, output, and possibly gradient information. The training set is associated with the network's adjustable parameters by nonlinear weight equations. The cascade structure of these equations reveals that they can be treated as sets of linear systems. Hence, the training process and the network approximation properties can be investigated via linear algebra. Four algorithms are developed to achieve exact or approximate matching of input-output and/or gradient-based training sets. Their application to the design of forward and feedback neurocontrollers shows that algebraic training is characterized by faster execution speeds and better generalization properties than contemporary optimization techniques.
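The core idea of reducing training to linear systems can be illustrated with a minimal sketch. Assuming a single hidden layer of sigmoidal units with the input weights held fixed (chosen randomly here for illustration; this is not the paper's exact algorithm), exact matching of p input-output pairs with p hidden units amounts to one linear solve for the output weights:

```python
import numpy as np

# Minimal sketch of exact input-output matching via a linear system.
# Assumptions (not from the paper): random fixed input weights and biases,
# one hidden sigmoid layer, as many hidden units as training pairs.

rng = np.random.default_rng(0)

def f(x):
    # Smooth scalar function to be matched at the training points
    return np.sin(x)

p = 10                                     # training pairs = hidden units
x = np.linspace(-np.pi, np.pi, p)          # training inputs
y = f(x)                                   # training outputs

W = rng.normal(size=(p, 1))                # fixed input weights (assumed)
b = rng.normal(size=p)                     # fixed input biases (assumed)

# p-by-p matrix of hidden-unit sigmoid activations at the training inputs
S = 1.0 / (1.0 + np.exp(-(x[:, None] * W.T + b)))

# With S generically invertible, the output weights follow from one solve
v = np.linalg.solve(S, y)

y_hat = S @ v                              # network output at training inputs
print(np.max(np.abs(y_hat - y)))           # near machine precision
```

Because the nonlinearity is confined to the fixed hidden layer, the output-weight equations are linear in the unknowns, so existence and quality of a solution can be analyzed with standard linear algebra (rank, conditioning) rather than iterative optimization.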

Original language: English (US)
Pages (from-to): 24-38
Number of pages: 15
Journal: IEEE Transactions on Neural Networks
Issue number: 1
State: Published - Jan 2005

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Computer Networks and Communications
  • Computer Science Applications


Keywords

  • Algebraic
  • Function approximation
  • Gradient
  • Input-output
  • Training


