TY - JOUR
T1 - Spectral convergence of the connection Laplacian from random samples
AU - Singer, Amit
AU - Wu, Hau-Tieng
N1 - Funding Information:
A.S. was supported by Award Number R01GM090200 from the NIGMS, by Award Numbers FA9550-12-1-0317 and FA9550-13-1-0076 from AFOSR, and by Award Number LTR DTD 06-05-2012 from the Simons Foundation. H.-T.W. was supported by AFOSR grant FA9550-09-1-0551, NSF grant CCF-0939370, FRG grant DMS-1160319, and Sloan Research Fellowship FR-2015-65363.
Publisher Copyright:
© The authors 2016.
PY - 2017
Y1 - 2017
N2 - Spectral methods that are based on eigenvectors and eigenvalues of discrete graph Laplacians, such as Diffusion Maps and Laplacian Eigenmaps, are often used for manifold learning and nonlinear dimensionality reduction. It was previously shown by Belkin & Niyogi (2007, Convergence of Laplacian eigenmaps. Advances in Neural Information Processing Systems, vol. 19. The MIT Press, p. 129) that the eigenvectors and eigenvalues of the graph Laplacian converge to the eigenfunctions and eigenvalues of the Laplace-Beltrami operator of the manifold in the limit of infinitely many data points sampled independently from the uniform distribution over the manifold. Recently, we introduced Vector Diffusion Maps and showed that the connection Laplacian of the tangent bundle of the manifold can be approximated from random samples. In this article, we present a unified framework for approximating other connection Laplacians over the manifold by considering its principal bundle structure. We prove that the eigenvectors and eigenvalues of these Laplacians converge in the limit of infinitely many independent random samples. We generalize the spectral convergence results to the case where the data points are sampled from a non-uniform distribution, and to manifolds with and without boundary.
AB - Spectral methods that are based on eigenvectors and eigenvalues of discrete graph Laplacians, such as Diffusion Maps and Laplacian Eigenmaps, are often used for manifold learning and nonlinear dimensionality reduction. It was previously shown by Belkin & Niyogi (2007, Convergence of Laplacian eigenmaps. Advances in Neural Information Processing Systems, vol. 19. The MIT Press, p. 129) that the eigenvectors and eigenvalues of the graph Laplacian converge to the eigenfunctions and eigenvalues of the Laplace-Beltrami operator of the manifold in the limit of infinitely many data points sampled independently from the uniform distribution over the manifold. Recently, we introduced Vector Diffusion Maps and showed that the connection Laplacian of the tangent bundle of the manifold can be approximated from random samples. In this article, we present a unified framework for approximating other connection Laplacians over the manifold by considering its principal bundle structure. We prove that the eigenvectors and eigenvalues of these Laplacians converge in the limit of infinitely many independent random samples. We generalize the spectral convergence results to the case where the data points are sampled from a non-uniform distribution, and to manifolds with and without boundary.
KW - Connection Laplacian
KW - Diffusion maps
KW - Graph connection Laplacian
KW - Orientable diffusion maps
KW - Principal bundle
KW - Vector diffusion distance
KW - Vector diffusion maps
UR - http://www.scopus.com/inward/record.url?scp=85071697604&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85071697604&partnerID=8YFLogxK
U2 - 10.1093/imaiai/iaw016
DO - 10.1093/imaiai/iaw016
M3 - Article
AN - SCOPUS:85071697604
SN - 2049-8772
VL - 6
SP - 58
EP - 123
JO - Information and Inference
JF - Information and Inference
IS - 1
ER -