Divergence estimation for multidimensional densities via κ-nearest-neighbor distances

Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú

Research output: Contribution to journal › Article › peer-review


Abstract

A new universal estimator of divergence is presented for multidimensional continuous densities based on κ-nearest-neighbor (κ-NN) distances. Assuming independent and identically distributed (i.i.d.) samples, the new estimator is proved to be asymptotically unbiased and mean-square consistent. In experiments with high-dimensional data, the κ-NN approach generally exhibits faster convergence than previous algorithms. It is also shown that the speed of convergence of the κ-NN method can be further improved by an adaptive choice of κ.
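To make the abstract's description concrete, the following is a minimal Python sketch (using NumPy and SciPy, which the paper itself does not mention) of a κ-NN divergence estimator of the form D̂(p‖q) = (d/n) Σᵢ log(νκ(i)/ρκ(i)) + log(m/(n−1)), where ρκ(i) is the distance from sample xᵢ to its κ-th nearest neighbor among the other samples of p, and νκ(i) is its distance to the κ-th nearest neighbor among the samples of q. The function name and the fixed-κ choice are illustrative assumptions, not taken from the paper, and this is a sketch rather than the authors' reference implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(p || q) from samples x ~ p and y ~ q.

    x: (n, d) array of samples from p; y: (m, d) array of samples
    from q. Illustrative sketch, not the paper's implementation.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_k(i): distance from x_i to its k-th nearest neighbor among the
    # remaining x samples; query k+1 neighbors because the closest point
    # in x to x_i is x_i itself (distance 0).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_k(i): distance from x_i to its k-th nearest neighbor in y.
    nu = cKDTree(y).query(x, k=k)[0]
    if nu.ndim > 1:  # query returns a 2-D array when k > 1
        nu = nu[:, -1]

    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

A quick sanity check on Gaussians, where the divergence is known in closed form (for p = N(0, I) and q = N(μ, I) in d dimensions, D(p‖q) = ‖μ‖²/2, here 3·0.25/2 = 0.375):

```python
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 3))  # samples from p = N(0, I)
y = rng.normal(0.5, 1.0, size=(5000, 3))  # samples from q = N(0.5*1, I)
print(knn_kl_divergence(x, y, k=5))       # should be near 0.375
```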

Original language: English (US)
Pages (from-to): 2392-2405
Number of pages: 14
Journal: IEEE Transactions on Information Theory
Volume: 55
Issue number: 5
State: Published - 2009

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Divergence
  • Information measure
  • Kullback-Leibler
  • Nearest-neighbor
  • Partition
  • Random vector
  • Universal estimation

Cite this

Wang, Q., Kulkarni, S. R., & Verdú, S. (2009). Divergence estimation for multidimensional densities via κ-nearest-neighbor distances. IEEE Transactions on Information Theory, 55(5), 2392-2405.