TY - GEN
T1 - Multiclass Ridge-adjusted Slack Variable Optimization using selected basis for fast classification
AU - Yu, Yinan
AU - Diamantaras, Konstantinos I.
AU - McKelvey, Tomas
AU - Kung, S. Y.
N1 - Publisher Copyright:
© 2014 EURASIP.
PY - 2014/11/10
Y1 - 2014/11/10
N2 - Kernel techniques for classification are especially challenging in terms of computation and memory requirements when data fall into more than two categories. In this paper, we extend a binary classification technique called Ridge-adjusted Slack Variable Optimization (RiSVO) to its multiclass counterpart, where the label information encoding scheme allows the computational complexity to remain the same as in the binary case. The main features of this technique are summarized as follows: (1) Only a subset of the data is pre-selected to construct the basis for kernel computation; (2) Simultaneous active training set selection for all classes helps reduce complexity while improving robustness; (3) With the proposed active set selection criteria, the inclusion property is verified empirically. The inclusion property means that once a pattern is excluded, it will no longer return to the active training set and can therefore be permanently removed from the training procedure. This property greatly reduces the complexity. The proposed techniques are evaluated on the standard multiclass datasets MNIST, USPS, pendigits and letter, allowing easy comparison with existing results.
AB - Kernel techniques for classification are especially challenging in terms of computation and memory requirements when data fall into more than two categories. In this paper, we extend a binary classification technique called Ridge-adjusted Slack Variable Optimization (RiSVO) to its multiclass counterpart, where the label information encoding scheme allows the computational complexity to remain the same as in the binary case. The main features of this technique are summarized as follows: (1) Only a subset of the data is pre-selected to construct the basis for kernel computation; (2) Simultaneous active training set selection for all classes helps reduce complexity while improving robustness; (3) With the proposed active set selection criteria, the inclusion property is verified empirically. The inclusion property means that once a pattern is excluded, it will no longer return to the active training set and can therefore be permanently removed from the training procedure. This property greatly reduces the complexity. The proposed techniques are evaluated on the standard multiclass datasets MNIST, USPS, pendigits and letter, allowing easy comparison with existing results.
KW - RKHS basis construction
KW - RiSVO
KW - kernel
KW - large scale data
KW - multiclass classification
UR - http://www.scopus.com/inward/record.url?scp=84911895850&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84911895850&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84911895850
T3 - European Signal Processing Conference
SP - 1178
EP - 1182
BT - 2014 Proceedings of the 22nd European Signal Processing Conference, EUSIPCO 2014
PB - European Signal Processing Conference, EUSIPCO
T2 - 22nd European Signal Processing Conference, EUSIPCO 2014
Y2 - 1 September 2014 through 5 September 2014
ER -