TY - GEN
T1 - Gradient weights help nonparametric regressors
AU - Kpotufe, Samory
AU - Boularias, Abdeslam
PY - 2012
Y1 - 2012
N2 - In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the i-th derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
AB - In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the i-th derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
UR - http://www.scopus.com/inward/record.url?scp=84877732148&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84877732148&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84877732148
SN - 9781627480031
T3 - Advances in Neural Information Processing Systems
SP - 2861
EP - 2869
BT - Advances in Neural Information Processing Systems 25
T2 - 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012
Y2 - 3 December 2012 through 6 December 2012
ER -