TY - GEN
T1 - Fast, smooth and adaptive regression in metric spaces
AU - Kpotufe, Samory
PY - 2009
Y1 - 2009
N2 - It was recently shown that certain nonparametric regressors can escape the curse of dimensionality when the intrinsic dimension of data is low ([1, 2]). We prove some stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, adapts to intrinsic dimension, operates on general metrics, yields a smooth function, and evaluates in time O(log n). We derive a tight convergence rate of the form n^(-2/(2+d)), where d is the Assouad dimension of the input space.
AB - It was recently shown that certain nonparametric regressors can escape the curse of dimensionality when the intrinsic dimension of data is low ([1, 2]). We prove some stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, adapts to intrinsic dimension, operates on general metrics, yields a smooth function, and evaluates in time O(log n). We derive a tight convergence rate of the form n^(-2/(2+d)), where d is the Assouad dimension of the input space.
UR - http://www.scopus.com/inward/record.url?scp=84858739063&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84858739063&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84858739063
SN - 9781615679119
T3 - Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference
SP - 1024
EP - 1032
BT - Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference
PB - Neural Information Processing Systems
T2 - 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009
Y2 - 7 December 2009 through 10 December 2009
ER -