A Self-Paced Regularization Framework for Multilabel Learning

Changsheng Li, Fan Wei, Junchi Yan, Xiaoyu Zhang, Qingshan Liu, Hongyuan Zha

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

In this brief, we propose a novel multilabel learning framework, called multilabel self-paced learning, in an attempt to incorporate the self-paced learning (SPL) scheme into the regime of multilabel learning. Specifically, we first propose a new multilabel learning formulation that introduces a self-paced function as a regularizer, so as to simultaneously prioritize label learning tasks and instances in each iteration. Considering that different multilabel learning scenarios often need different self-paced schemes during learning, we provide a general way to find the desired self-paced functions. To the best of our knowledge, this is the first work to study multilabel learning by jointly taking into consideration the complexities of both training instances and labels. Experimental results on four publicly available data sets demonstrate the effectiveness of our approach compared with state-of-the-art methods.
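To make the idea concrete, the sketch below illustrates the generic SPL mechanism the abstract builds on: alternating between fitting a model on currently selected "easy" instance-label pairs and re-selecting pairs whose loss falls below a growing pace threshold. This is a minimal hard-weighting sketch with a ridge-regression base learner, not the paper's actual formulation or self-paced regularizer; all function and parameter names here are illustrative assumptions.

```python
import numpy as np

def self_paced_multilabel(X, Y, lam=0.5, mu=1.3, n_outer=10, ridge=1e-3):
    """Illustrative self-paced multilabel learner (hard weighting).

    Alternates between:
      1) fitting one ridge-regression model per label, weighted by the
         currently selected ("easy") instance-label pairs, and
      2) re-selecting pairs whose squared loss is below the pace
         parameter lam, which grows by factor mu each round so that
         harder pairs are gradually admitted.
    This is a generic SPL sketch, not the method from the paper.
    """
    n, d = X.shape
    _, k = Y.shape
    W = np.zeros((d, k))
    V = np.ones((n, k))  # selection weight for each instance-label pair
    for _ in range(n_outer):
        # Step 1: weighted least squares, one column of W per label
        for j in range(k):
            v = V[:, j]
            A = X.T @ (v[:, None] * X) + ridge * np.eye(d)
            b = X.T @ (v * Y[:, j])
            W[:, j] = np.linalg.solve(A, b)
        # Step 2: self-paced reweighting via a hard threshold on the loss
        loss = (X @ W - Y) ** 2
        V = (loss < lam).astype(float)
        lam *= mu  # relax the pace: admit harder pairs next round
    return W, V
```

A self-paced function as in the paper would replace the hard 0/1 threshold in step 2 with a regularizer-induced soft weighting; the alternating structure stays the same.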

Original language: English (US)
Pages (from-to): 2660-2666
Number of pages: 7
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 6
DOIs
State: Published - Jun 2018
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

Keywords

  • Local correlation
  • multi-label learning
  • self-paced learning
