Learning Nonnegative Factors from Tensor Data: Probabilistic Modeling and Inference Algorithm

Lei Cheng, Xueke Tong, Shuai Wang, Yik Chung Wu, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review

27 Scopus citations

Abstract

Tensor canonical polyadic decomposition (CPD) with nonnegative factor matrices, which extracts useful latent information from multidimensional data, has found widespread application in various big data analytic tasks. Most existing algorithms, however, require knowledge of the tensor rank, which in practice is unknown and difficult to acquire. To address this issue, this paper takes a probabilistic approach. Unlike previous works, it first introduces a sparsity-promoting nonnegative Gaussian-gamma prior, based on which a novel probabilistic model for the CPD problem with nonnegative and continuous factors is established. This probabilistic model further enables the derivation of an efficient inference algorithm that accurately learns the nonnegative factors from the tensor data, with automatic rank determination as an integrated feature. Numerical results on synthetic data and real-world applications demonstrate the remarkable performance of the proposed algorithm.
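To make the problem setting concrete, the sketch below factorizes a 3-way tensor into nonnegative factor matrices using classical multiplicative updates. This is only an illustration of nonnegative CPD with a user-supplied rank; it is not the paper's algorithm, which instead performs variational inference under a nonnegative Gaussian-gamma prior and determines the rank automatically. All function names here (`unfold`, `khatri_rao`, `nn_cpd`) are hypothetical helpers written for this example.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: rows indexed by `mode`, columns by the remaining modes.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Kronecker product: (I*J) x R.
    R = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, R)

def nn_cpd(T, rank, n_iter=500, eps=1e-9, seed=0):
    """Nonnegative CPD of a 3-way tensor via Lee-Seung multiplicative updates.

    Illustrative only: `rank` must be supplied, whereas the paper's
    variational algorithm infers it from the data.
    """
    rng = np.random.default_rng(seed)
    factors = [rng.random((dim, rank)) + eps for dim in T.shape]
    for _ in range(n_iter):
        for n in range(3):
            A = factors[n]
            others = [factors[m] for m in range(3) if m != n]
            # Khatri-Rao of the other two factors, matching the unfolding order.
            KR = khatri_rao(others[0], others[1])
            num = unfold(T, n) @ KR
            den = A @ (KR.T @ KR) + eps
            # Elementwise update preserves nonnegativity of the factors.
            factors[n] = A * num / den
    return factors
```

A factorization can then be checked by reconstructing the tensor, e.g. `np.einsum('ir,jr,kr->ijk', *factors)`, and comparing it to the input. The multiplicative update is used here purely because it is compact; the paper's contribution is precisely to avoid fixing `rank` in advance.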

Original language: English (US)
Article number: 9006902
Journal: IEEE Transactions on Signal Processing
Volume: 68
DOIs
State: Published - 2020
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering

Keywords

  • Tensor decomposition
  • automatic rank determination
  • nonnegative factors
  • variational inference
