Fast variational sparse Bayesian learning with automatic relevance determination for superimposed signals

Dmitriy Shutin, Thomas Buchgraber, Sanjeev R. Kulkarni, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review

68 Scopus citations

Abstract

In this work, a new fast variational sparse Bayesian learning (SBL) approach with automatic relevance determination (ARD) is proposed. Sparse Bayesian modeling, exemplified by the relevance vector machine (RVM), allows a sparse regression or classification function to be constructed as a linear combination of a few basis functions. It is demonstrated that, by computing the stationary points of the variational update expressions with noninformative (ARD) hyperpriors, a fast version of variational SBL can be constructed. Analysis of the computed stationary points indicates that SBL with Gaussian sparsity priors and noninformative hyperpriors corresponds to removing components with a signal-to-noise ratio below a 0 dB threshold; this threshold can also be adjusted to significantly improve the convergence rate and sparsity of SBL. It is demonstrated that the pruning conditions derived for fast variational SBL coincide with those obtained for fast marginal likelihood maximization; moreover, the parameters that maximize the variational lower bound also maximize the marginal likelihood function. The effectiveness of fast variational SBL is demonstrated with synthetic as well as real data.
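The pruning rule described in the abstract lends itself to a compact illustration. Below is a minimal Python/NumPy sketch of a fast SBL loop built on the sparsity and quality factors (s_m, q_m) of fast marginal likelihood maximization, with which the paper shows the fast variational pruning conditions coincide: a component is retained only while its estimated SNR q_m^2/s_m stays above an adjustable threshold (0 dB by default). The function name, variable names, and the cyclic update schedule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fast_sbl(Phi, t, noise_var, threshold_db=0.0, n_iter=200):
    """Sketch of fast SBL with an adjustable component-SNR pruning threshold.

    Uses the sparsity/quality factors (s_m, q_m) of fast marginal likelihood
    maximization; a component is kept only while q_m^2 / s_m exceeds the
    threshold (1, i.e. 0 dB, in the standard formulation). Thresholds at or
    above 0 dB are assumed. Variable names are illustrative.
    """
    N, M = Phi.shape
    beta = 1.0 / noise_var                      # noise precision
    threshold = 10.0 ** (threshold_db / 10.0)
    alpha = np.full(M, np.inf)                  # np.inf marks a pruned component
    norms = np.sum(Phi**2, axis=0)

    # Start from the basis function best aligned with the observations.
    proj = (Phi.T @ t)**2 / norms
    m0 = int(np.argmax(proj))
    alpha[m0] = norms[m0] / max(proj[m0] - noise_var, 1e-12)

    for it in range(n_iter):
        active = np.flatnonzero(np.isfinite(alpha))
        Phi_a = Phi[:, active]
        Sigma = np.linalg.inv(np.diag(alpha[active]) + beta * Phi_a.T @ Phi_a)

        # Sparsity (S) and quality (Q) factors for every candidate component.
        A = beta * (Phi.T @ Phi_a)
        S = beta * norms - np.einsum('mk,kl,ml->m', A, Sigma, A)
        Q = beta * (Phi.T @ t) - A @ (beta * Sigma @ (Phi_a.T @ t))
        s, q = S.copy(), Q.copy()
        s[active] = alpha[active] * S[active] / (alpha[active] - S[active])
        q[active] = alpha[active] * Q[active] / (alpha[active] - S[active])

        # Visit one candidate per iteration: add/re-estimate or prune it.
        m = it % M
        if q[m]**2 > s[m] * threshold:
            alpha[m] = s[m]**2 / (q[m]**2 - s[m])   # component SNR above threshold
        elif np.isfinite(alpha).sum() > 1:
            alpha[m] = np.inf                        # prune: SNR below threshold

    # Posterior mean of the weights for the surviving components.
    active = np.flatnonzero(np.isfinite(alpha))
    Phi_a = Phi[:, active]
    Sigma = np.linalg.inv(np.diag(alpha[active]) + beta * Phi_a.T @ Phi_a)
    w = np.zeros(M)
    w[active] = beta * Sigma @ (Phi_a.T @ t)
    return w, alpha
```

Raising `threshold_db` above 0 dB prunes components more aggressively, which is the mechanism the abstract credits with improving both the convergence rate and the sparsity of the solution.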

Original language: English (US)
Article number: 6020818
Pages (from-to): 6257-6261
Number of pages: 5
Journal: IEEE Transactions on Signal Processing
Volume: 59
Issue number: 12
DOIs
State: Published - Dec 2011

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering
