Abstract
We present a new algorithm for independent component analysis which has provable performance guarantees. In particular, suppose we are given samples of the form y = Ax + η, where A is an unknown but non-singular n×n matrix, x is a random variable whose coordinates are independent and have a fourth-order moment strictly less than that of a standard Gaussian random variable, and η is an n-dimensional Gaussian random variable with unknown covariance Σ. We give an algorithm that provably recovers A and Σ up to an additive ϵ and whose running time and sample complexity are polynomial in n and 1/ϵ. To accomplish this, we introduce a novel "quasi-whitening" step that may be useful in other applications where there is additive Gaussian noise whose covariance is unknown. We also give a general framework for finding all local optima of a function (given an oracle for approximately finding just one); this is a crucial step in our algorithm, one that has been overlooked in previous attempts, and it allows us to control the accumulation of error when we find the columns of A one by one via local search.
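The setting rests on a fact that motivates the quasi-whitening step: cumulants of order higher than two vanish for Gaussian random variables, so the fourth-order cumulant of any projection of y = Ax + η is unaffected by the noise η. The following minimal sketch (not the authors' implementation; all dimensions, sample sizes, and distributions are illustrative assumptions) checks this numerically for a coordinate-wise uniform x, whose fourth moment is below the Gaussian value of 3.

```python
# Illustrative sketch: the fourth cumulant of a projection of y = A x + eta
# does not depend on the Gaussian noise eta, since Gaussian cumulants of
# order > 2 are zero. Parameters below are hypothetical choices.
import numpy as np

rng = np.random.default_rng(0)
n, num_samples = 4, 500_000

A = rng.normal(size=(n, n))                          # unknown non-singular mixing matrix
# Independent coordinates, unit variance, fourth moment 9/5 < 3 (Gaussian value)
x = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(num_samples, n))
L = rng.normal(size=(n, n))
Sigma = L @ L.T                                      # unknown noise covariance
eta = rng.multivariate_normal(np.zeros(n), Sigma, size=num_samples)

y = x @ A.T + eta                                    # observed samples y = A x + eta

def cum4(z):
    """Empirical fourth cumulant of a mean-zero 1-d sample: E[z^4] - 3 E[z^2]^2."""
    return np.mean(z ** 4) - 3.0 * np.mean(z ** 2) ** 2

u = rng.normal(size=n)                               # an arbitrary projection direction
kappa4_x = cum4(x[:, 0])                             # fourth cumulant of each coordinate of x

empirical = cum4(y @ u)                              # computed from noisy observations
predicted = kappa4_x * np.sum((A.T @ u) ** 4)        # noise term contributes nothing
print(empirical, predicted)                          # agree up to sampling error
```

In this sketch the quantity cum4(y @ u) depends only on A and the cumulants of x, which is the kind of noise-free information the quasi-whitening and local-search steps of the algorithm exploit.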
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 215-236 |
| Number of pages | 22 |
| Journal | Algorithmica |
| Volume | 72 |
| Issue number | 1 |
| DOIs | |
| State | Published - May 1, 2015 |
All Science Journal Classification (ASJC) codes
- General Computer Science
- Computer Science Applications
- Applied Mathematics
Keywords
- Cumulants
- Independent component analysis
- Method of moments
- Mixture models