TY - GEN
T1 - Perturbation regulated kernel regressors for supervised machine learning
AU - Kung, S. Y.
AU - Wu, Pei Yuan
PY - 2012
Y1 - 2012
AB - This paper develops a kernel perturbation-regulated (KPR) regressor based on errors-in-variables models. KPR offers a strong smoothing capability critical to the robustness of regression or classification results. For Gaussian cases, the notion of orthogonal polynomials is instrumental to optimal estimation and its error analysis. More precisely, the regressor may be expressed as a linear combination of many simple Hermite regressors, each focusing on one (and only one) orthogonal polynomial. For Gaussian or non-Gaussian cases, this paper formally establishes a Two-Projection Theorem allowing the estimation task to be divided into two projection stages: the first projection reveals the effect of model-induced error (caused by under-represented regressor models), while the second projection reveals the extra estimation error due to the (inevitable) input measurement error. The two-projection analysis leads to a closed-form error formula critical for the order/error tradeoff. The simulation results not only confirm the theoretical prediction but also demonstrate the superiority of KPR over the conventional ridge regression method in MSE reduction.
UR - http://www.scopus.com/inward/record.url?scp=84870673143&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84870673143&partnerID=8YFLogxK
U2 - 10.1109/MLSP.2012.6349743
DO - 10.1109/MLSP.2012.6349743
M3 - Conference contribution
AN - SCOPUS:84870673143
SN - 9781467310260
T3 - IEEE International Workshop on Machine Learning for Signal Processing, MLSP
BT - 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012
T2 - 2012 22nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2012
Y2 - 23 September 2012 through 26 September 2012
ER -