Robust iteratively reweighted Lasso for sparse tensor factorizations

Hyon Jung Kim, Esa Ollila, Visa Koivunen, H. Vincent Poor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations


A new tensor approximation method is developed based on the CANDECOMP/PARAFAC (CP) factorization that enjoys both sparsity (i.e., yielding factor matrices in which many elements are zero) and resistance to outliers and non-Gaussian measurement noise. The method uses a robust bounded loss function for the errors in the low-rank tensor approximation while encouraging sparsity through Lasso (ℓ1) regularization on the factor matrices of the tensor data. A simple alternating, iteratively reweighted (IRW) Lasso algorithm is proposed to solve the resulting optimization problem. Simulation studies illustrate that the proposed method provides excellent mean-square-error accuracy under heavy-tailed noise, with relatively small loss under conventional Gaussian noise.
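The alternating IRW-Lasso idea described in the abstract can be sketched for one factor of a 3-way CP model. This is a minimal illustration, not the authors' implementation: the specific bounded loss (here a Cauchy-type weight function with scale `c`), the ISTA solver for the weighted Lasso subproblems, and all parameter values are assumptions made for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_lasso(Z, y, w, lam, n_iter=500):
    """Solve min_b sum_i w_i (y_i - z_i^T b)^2 + lam * ||b||_1 via ISTA.

    ISTA (proximal gradient) is an assumed choice of inner solver, not
    necessarily the one used in the paper.
    """
    Zw = np.sqrt(w)[:, None] * Z          # fold weights into the design
    yw = np.sqrt(w) * y
    step = 1.0 / (2.0 * np.linalg.norm(Zw, 2) ** 2 + 1e-12)
    b = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * Zw.T @ (Zw @ b - yw)
        b = soft_threshold(b - step * grad, step * lam)
    return b

def update_factor_A(X, A, B, C, lam, c=1.0):
    """One IRW-Lasso sweep over the rows of A (the mode-1 subproblem).

    Each row of the mode-1 unfolding is regressed on the Khatri-Rao design,
    with robust weights recomputed from the current residuals.
    """
    I, J, K = X.shape
    R = A.shape[1]
    X1 = X.reshape(I, J * K)                              # mode-1 unfolding (C-order)
    Z = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)   # Khatri-Rao design matrix
    A_new = np.empty_like(A)
    for i in range(I):
        r = X1[i] - Z @ A[i]                              # current residuals
        w = 1.0 / (1.0 + (r / c) ** 2)                    # bounded-loss (Cauchy-type) weights
        A_new[i] = weighted_lasso(Z, X1[i], w, lam)
    return A_new
```

A full algorithm would cycle such updates over all three factors, recomputing the robust weights at each sweep; large residuals caused by outliers receive weights near zero and thus barely influence the fit.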

Original language: English (US)
Title of host publication: 2014 IEEE Workshop on Statistical Signal Processing, SSP 2014
Publisher: IEEE Computer Society
Number of pages: 4
ISBN (Print): 9781479949755
State: Published - 2014
Event: 2014 IEEE Workshop on Statistical Signal Processing, SSP 2014 - Gold Coast, QLD, Australia
Duration: Jun 29, 2014 to Jul 2, 2014

Publication series

Name: IEEE Workshop on Statistical Signal Processing Proceedings


Other: 2014 IEEE Workshop on Statistical Signal Processing, SSP 2014
City: Gold Coast, QLD

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • Applied Mathematics
  • Signal Processing
  • Computer Science Applications

Keywords

  • Iteratively reweighted least squares
  • Lasso
  • big data
  • regularization
  • robust loss function


