TY - CONF
T1 - Collaborative hierarchical sparse modeling
AU - Sprechmann, Pablo
AU - Ramirez, Ignacio
AU - Sapiro, Guillermo
AU - Eldar, Yonina C.
PY - 2010
Y1 - 2010
N2 - Sparse modeling is a powerful framework for data analysis and processing. Traditionally, encoding in this framework is done by solving an ℓ1-regularized linear regression problem, usually called the Lasso. In this work we first combine the sparsity-inducing property of the Lasso model, at the individual feature level, with the block-sparsity property of the group Lasso model, where sparse groups of features are jointly encoded, obtaining a hierarchically structured sparsity pattern. This results in the hierarchical Lasso, which shows important practical modeling advantages. We then extend this approach to the collaborative case, where a set of simultaneously coded signals share the same sparsity pattern at the higher (group) level but not necessarily at the lower one. The signals then share the same active groups, or classes, but not necessarily the same active sets within those groups. This makes the model well suited for applications such as source separation. An efficient optimization procedure, which guarantees convergence to the global optimum, is developed for these new models. The presentation of the new framework and optimization approach is complemented with experimental examples and preliminary theoretical results.
AB - Sparse modeling is a powerful framework for data analysis and processing. Traditionally, encoding in this framework is done by solving an ℓ1-regularized linear regression problem, usually called the Lasso. In this work we first combine the sparsity-inducing property of the Lasso model, at the individual feature level, with the block-sparsity property of the group Lasso model, where sparse groups of features are jointly encoded, obtaining a hierarchically structured sparsity pattern. This results in the hierarchical Lasso, which shows important practical modeling advantages. We then extend this approach to the collaborative case, where a set of simultaneously coded signals share the same sparsity pattern at the higher (group) level but not necessarily at the lower one. The signals then share the same active groups, or classes, but not necessarily the same active sets within those groups. This makes the model well suited for applications such as source separation. An efficient optimization procedure, which guarantees convergence to the global optimum, is developed for these new models. The presentation of the new framework and optimization approach is complemented with experimental examples and preliminary theoretical results.
UR - https://www.scopus.com/pages/publications/77953703525
UR - https://www.scopus.com/inward/citedby.url?scp=77953703525&partnerID=8YFLogxK
U2 - 10.1109/CISS.2010.5464845
DO - 10.1109/CISS.2010.5464845
M3 - Conference contribution
AN - SCOPUS:77953703525
SN - 9781424474172
T3 - 2010 44th Annual Conference on Information Sciences and Systems, CISS 2010
BT - 2010 44th Annual Conference on Information Sciences and Systems, CISS 2010
T2 - 44th Annual Conference on Information Sciences and Systems, CISS 2010
Y2 - 17 March 2010 through 19 March 2010
ER -