A Self-Paced Regularization Framework for Multilabel Learning

Changsheng Li, Fan Wei, Junchi Yan, Xiaoyu Zhang, Qingshan Liu, Hongyuan Zha

Research output: Contribution to journal › Article › peer-review

30 Scopus citations


In this brief, we propose a novel multilabel learning framework, called multilabel self-paced learning, in an attempt to incorporate the self-paced learning (SPL) scheme into the regime of multilabel learning. Specifically, we first propose a new multilabel learning formulation by introducing a self-paced function as a regularizer, so as to simultaneously prioritize label learning tasks and instances in each iteration. Considering that different multilabel learning scenarios often require different self-paced schemes during learning, we provide a general way to derive the desired self-paced functions. To the best of our knowledge, this is the first work to study multilabel learning by jointly taking into consideration the complexities of both training instances and labels. Experimental results on four publicly available data sets demonstrate the effectiveness of our approach compared with state-of-the-art methods.
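The self-paced idea described in the abstract can be sketched with the standard "hard" self-paced regularizer, under which each instance-label pair receives a binary weight and low-loss (easy) pairs are learned first while the pace parameter grows. This is a minimal illustrative sketch, not the paper's general framework or its code; all names and the choice of regularizer are assumptions.

```python
import numpy as np

def self_paced_weights(loss, lam):
    """Closed-form weights for the hard self-paced regularizer:
    v_ij = 1 if loss_ij < lam else 0, over an (instances x labels) loss matrix."""
    return (loss < lam).astype(float)

# Toy per-(instance, label) losses for 5 instances and 3 labels.
rng = np.random.default_rng(0)
loss = rng.random((5, 3))

# As the pace parameter lam increases across iterations, harder
# instance-label pairs are gradually admitted into the weighted objective.
for it, lam in enumerate([0.3, 0.6, 1.1]):
    v = self_paced_weights(loss, lam)
    obj = (v * loss).sum()  # weighted loss a model update would minimize
    print(f"iter {it}: lam={lam}, active pairs={int(v.sum())}/15, obj={obj:.3f}")
```

Jointly weighting both dimensions of the loss matrix is what lets the scheme prioritize labels and instances at the same time, as opposed to classical SPL, which weights instances only.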

Original language: English (US)
Pages (from-to): 2660-2666
Number of pages: 7
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 6
State: Published - Jun 2018

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Computer Networks and Communications
  • Computer Science Applications


Keywords

  • Local correlation
  • Multi-label learning
  • Self-paced learning

