TY - JOUR
T1 - Fast variational sparse Bayesian learning with automatic relevance determination for superimposed signals
AU - Shutin, Dmitriy
AU - Buchgraber, Thomas
AU - Kulkarni, Sanjeev R.
AU - Poor, H. Vincent
N1 - Funding Information:
Manuscript received October 26, 2010; revised March 14, 2011; accepted August 22, 2011. Date of publication September 15, 2011; date of current version November 16, 2011. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Jerome Idier. This research was supported in part by an Erwin Schrödinger Postdoctoral Fellowship, Austrian Science Fund Project J2909-N23, in part by the Austrian Science Fund under Grant NFN-SISE-S10610-N13, and in part by the NSF Science and Technology Center under Grant CCF-0939370, the U.S. Army Research Office under Grant W911NF-07-1-0185, and the U.S. Office of Naval Research under Grant N00014-09-1-0342.
PY - 2011/12
Y1 - 2011/12
N2 - In this work, a new fast variational sparse Bayesian learning (SBL) approach with automatic relevance determination (ARD) is proposed. Sparse Bayesian modeling, exemplified by the relevance vector machine (RVM), allows a sparse regression or classification function to be constructed as a linear combination of a few basis functions. It is demonstrated that, by computing the stationary points of the variational update expressions with noninformative (ARD) hyperpriors, a fast version of variational SBL can be constructed. Analysis of the computed stationary points indicates that SBL with Gaussian sparsity priors and noninformative hyperpriors corresponds to removing components with signal-to-noise ratio below a 0 dB threshold; this threshold can also be adjusted to significantly improve the convergence rate and sparsity of SBL. It is demonstrated that the pruning conditions derived for fast variational SBL coincide with those obtained for fast marginal likelihood maximization; moreover, the parameters that maximize the variational lower bound also maximize the marginal likelihood function. The effectiveness of fast variational SBL is demonstrated with both synthetic and real data.
AB - In this work, a new fast variational sparse Bayesian learning (SBL) approach with automatic relevance determination (ARD) is proposed. Sparse Bayesian modeling, exemplified by the relevance vector machine (RVM), allows a sparse regression or classification function to be constructed as a linear combination of a few basis functions. It is demonstrated that, by computing the stationary points of the variational update expressions with noninformative (ARD) hyperpriors, a fast version of variational SBL can be constructed. Analysis of the computed stationary points indicates that SBL with Gaussian sparsity priors and noninformative hyperpriors corresponds to removing components with signal-to-noise ratio below a 0 dB threshold; this threshold can also be adjusted to significantly improve the convergence rate and sparsity of SBL. It is demonstrated that the pruning conditions derived for fast variational SBL coincide with those obtained for fast marginal likelihood maximization; moreover, the parameters that maximize the variational lower bound also maximize the marginal likelihood function. The effectiveness of fast variational SBL is demonstrated with both synthetic and real data.
UR - http://www.scopus.com/inward/record.url?scp=81455134312&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=81455134312&partnerID=8YFLogxK
U2 - 10.1109/TSP.2011.2168217
DO - 10.1109/TSP.2011.2168217
M3 - Article
AN - SCOPUS:81455134312
SN - 1053-587X
VL - 59
SP - 6257
EP - 6261
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 12
M1 - 6020818
ER -