TY - JOUR
T1 - Decision-Based Neural Networks with Signal/Image Classification Applications
AU - Kung, S. Y.
AU - Taur, J. S.
N1 - Funding Information:
Manuscript received March 31, 1992; revised March 17, 1993. This research was supported in part by a grant and a research contract from the Air Force Office of Scientific Research and the Defense Advanced Research Projects Agency. The authors are with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA. IEEE Log Number 9409819.
PY - 1995/1
Y1 - 1995/1
N2 - Supervised learning networks based on a decision-based formulation are explored. More specifically, a decision-based neural network (DBNN) is proposed, which combines a perceptron-like learning rule with a hierarchical nonlinear network structure. The decision-based mutual training can be applied to both static and temporal pattern recognition problems. For static pattern recognition, two hierarchical structures are proposed: the hidden-node and subcluster structures. The relationships between DBNNs and other models (linear perceptron, piecewise-linear perceptron, LVQ, and PNN) are discussed. For temporal DBNNs, model-based discriminant functions may be chosen to compensate for possible temporal variations, such as waveform warping and alignment; typical examples include the DTW distance, prediction error, and likelihood functions. For classification applications, DBNNs are very effective in terms of both computation time and performance, as confirmed by simulations conducted for several applications, including texture classification, OCR, and ECG analysis.
AB - Supervised learning networks based on a decision-based formulation are explored. More specifically, a decision-based neural network (DBNN) is proposed, which combines a perceptron-like learning rule with a hierarchical nonlinear network structure. The decision-based mutual training can be applied to both static and temporal pattern recognition problems. For static pattern recognition, two hierarchical structures are proposed: the hidden-node and subcluster structures. The relationships between DBNNs and other models (linear perceptron, piecewise-linear perceptron, LVQ, and PNN) are discussed. For temporal DBNNs, model-based discriminant functions may be chosen to compensate for possible temporal variations, such as waveform warping and alignment; typical examples include the DTW distance, prediction error, and likelihood functions. For classification applications, DBNNs are very effective in terms of both computation time and performance, as confirmed by simulations conducted for several applications, including texture classification, OCR, and ECG analysis.
UR - http://www.scopus.com/inward/record.url?scp=0029182227&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0029182227&partnerID=8YFLogxK
U2 - 10.1109/72.363439
DO - 10.1109/72.363439
M3 - Article
C2 - 18263296
AN - SCOPUS:0029182227
SN - 1045-9227
VL - 6
SP - 170
EP - 181
JO - IEEE Transactions on Neural Networks
JF - IEEE Transactions on Neural Networks
IS - 1
ER -