TY - GEN
T1 - Improving Performance in Neural Networks Using a Boosting Algorithm
AU - Drucker, Harris
AU - Schapire, Robert
AU - Simard, Patrice
N1 - Publisher Copyright:
© 1992 Neural information processing systems foundation. All rights reserved.
PY - 1992
Y1 - 1992
N2 - A boosting algorithm converts a learning machine with error rate less than 50% to one with an arbitrarily low error rate. However, the algorithm discussed here depends on having a large supply of independent training samples. We show how to circumvent this problem and generate an ensemble of learning machines whose performance in optical character recognition problems is dramatically improved over that of a single network. We report the effect of boosting on four databases (all handwritten) consisting of 12,000 digits from segmented ZIP codes from the United States Postal Service (USPS) and the following from the National Institute of Standards and Technology (NIST): 220,000 digits, 45,000 upper case alphas, and 45,000 lower case alphas. We use two performance measures: the raw error rate (no rejects) and the reject rate required to achieve a 1% error rate on the patterns not rejected. Boosting improved performance in some cases by a factor of three.
AB - A boosting algorithm converts a learning machine with error rate less than 50% to one with an arbitrarily low error rate. However, the algorithm discussed here depends on having a large supply of independent training samples. We show how to circumvent this problem and generate an ensemble of learning machines whose performance in optical character recognition problems is dramatically improved over that of a single network. We report the effect of boosting on four databases (all handwritten) consisting of 12,000 digits from segmented ZIP codes from the United States Postal Service (USPS) and the following from the National Institute of Standards and Technology (NIST): 220,000 digits, 45,000 upper case alphas, and 45,000 lower case alphas. We use two performance measures: the raw error rate (no rejects) and the reject rate required to achieve a 1% error rate on the patterns not rejected. Boosting improved performance in some cases by a factor of three.
UR - https://www.scopus.com/pages/publications/105021006154
UR - https://www.scopus.com/pages/publications/105021006154#tab=citedBy
U2 - 10.5555/645753.668055
DO - 10.5555/645753.668055
M3 - Conference contribution
AN - SCOPUS:105021006154
T3 - Advances in Neural Information Processing Systems
SP - 42
EP - 49
BT - Advances in Neural Information Processing Systems 5, NIPS 1992
A2 - Hanson, Stephen Jose
A2 - Cowan, Jack D.
A2 - Giles, C. Lee
PB - Neural information processing systems foundation
T2 - 5th Advances in Neural Information Processing Systems, NIPS 1992
Y2 - 30 November 1992 through 3 December 1992
ER -