From graph to manifold Laplacian: The convergence rate

Research output: Contribution to journal › Letter › peer-review

210 Scopus citations


The convergence of the discrete graph Laplacian to the continuous manifold Laplacian in the limit of sample size N → ∞ while the kernel bandwidth ε → 0 is the justification for the success of Laplacian-based algorithms in machine learning, such as dimensionality reduction, semi-supervised learning and spectral clustering. In this paper we improve the convergence rate of the variance term recently obtained by Hein et al. [From graphs to manifolds - Weak and strong pointwise consistency of graph Laplacians, in: P. Auer, R. Meir (Eds.), Proc. 18th Conf. Learning Theory (COLT), Lecture Notes Comput. Sci., vol. 3559, Springer-Verlag, Berlin, 2005, pp. 470-485], improve the bias term error, and find an optimal criterion for determining the parameter ε given N.
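The pointwise consistency described in the abstract can be illustrated with a minimal numerical sketch on the unit circle. The Gaussian kernel, the random-walk normalization, and all parameter values below are illustrative assumptions for this sketch, not the specific estimator or constants analyzed in the paper: applying the normalized graph Laplacian to f(θ) = cos θ should yield, up to a kernel-dependent constant, the Laplace-Beltrami result Δf = -cos θ.

```python
import numpy as np

# Hedged sketch (assumed kernel and normalization, not the paper's exact
# estimator): graph-Laplacian approximation of the manifold Laplacian
# on the unit circle, with N sample points and kernel bandwidth eps.

N = 500                                   # sample size
eps = 0.01                                # kernel bandwidth
theta = 2 * np.pi * np.arange(N) / N      # regular grid on the circle
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # points in R^2

f = np.cos(theta)  # test function; Laplace-Beltrami on the unit circle gives -cos(theta)

# Gaussian kernel weights from pairwise squared Euclidean distances
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-D2 / eps)

# Random-walk-normalized graph Laplacian applied to f, scaled by 1/eps
Lf = ((K @ f) / K.sum(axis=1) - f) / eps

# Up to a positive kernel-dependent constant, Lf should be proportional
# to Delta f = -cos(theta); the correlation should be very close to 1.
corr = np.corrcoef(Lf, -np.cos(theta))[0, 1]
print(round(corr, 4))
```

Shrinking eps while growing N trades the bias of the kernel average against the variance of the random sample, which is exactly the trade-off behind the optimal choice of ε given N discussed in the abstract.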

Original language: English (US)
Pages (from-to): 128-134
Number of pages: 7
Journal: Applied and Computational Harmonic Analysis
Issue number: 1
State: Published - Jul 2006
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Applied Mathematics
