### Abstract

It was recently shown that certain nonparametric regressors can escape the curse of dimensionality when the intrinsic dimension of data is low ([1, 2]). We prove some stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, adapts to intrinsic dimension, operates on general metrics, yields a smooth function, and evaluates in time O(log n). We derive a tight convergence rate of the form n^{-2/(2+d)} where d is the Assouad dimension of the input space.
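The practical significance of the rate n^{-2/(2+d)} is easiest to see numerically: when d is the low intrinsic (Assouad) dimension of the data rather than the ambient dimension, the bound shrinks much faster with n. The sketch below (illustrative only, not the paper's algorithm; the sample size and dimension values are hypothetical) compares the two regimes.

```python
# Illustrative sketch: evaluate the convergence rate n^{-2/(2+d)}
# quoted in the abstract, comparing an ambient dimension against a
# (hypothetical) low intrinsic dimension. Constants are ignored.

def rate(n: int, d: int) -> float:
    """Excess-risk rate up to constants: n^(-2/(2+d))."""
    return n ** (-2.0 / (2.0 + d))

n = 10_000
ambient_d = 100    # hypothetical ambient dimension of the feature space
intrinsic_d = 5    # hypothetical intrinsic (Assouad) dimension of the data

# With d = 100 the bound barely improves with n; with d = 5 it decays
# at a rate close to the low-dimensional nonparametric rate.
print(f"rate at ambient   d={ambient_d}: {rate(n, ambient_d):.4f}")
print(f"rate at intrinsic d={intrinsic_d}:   {rate(n, intrinsic_d):.4f}")
```

A regressor that adapts to intrinsic dimension achieves the second, much smaller bound even when the data live in a high-dimensional ambient space.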

| Original language | English (US) |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference |
| Pages | 1024-1032 |
| Number of pages | 9 |
| State | Published - Dec 1 2009 |
| Event | 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 - Vancouver, BC, Canada. Duration: Dec 7 2009 → Dec 10 2009 |

### Publication series

| Name | Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference |
|---|---|

### Other

| Other | 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 |
|---|---|
| Country | Canada |
| City | Vancouver, BC |
| Period | 12/7/09 → 12/10/09 |

### All Science Journal Classification (ASJC) codes

- Information Systems

## Cite this

Kpotufe, S. K. (2009). Fast, smooth and adaptive regression in metric spaces. In *Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference* (pp. 1024-1032). (Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference).