TY - GEN
T1 - Semi-Supervised Few-Shot Learning from A Dependency-Discriminant Perspective
AU - Hou, Zejiang
AU - Kung, Sun Yuan
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - We study the few-shot learning (FSL) problem, where a model learns to recognize new objects with extremely few labeled training examples per category. Most previous FSL approaches resort to the meta-learning paradigm, where the model accumulates inductive bias by learning from many training tasks in order to solve new, unseen few-shot tasks. In contrast, we propose a simple semi-supervised FSL approach that exploits the unlabeled data accompanying the few-shot task to improve FSL performance. More specifically, to train a classifier, we propose a Dependency Maximization loss based on the Hilbert-Schmidt norm of the cross-covariance operator, which maximizes the statistical dependency between the embedded features of the unlabeled data and their label predictions, together with the supervised loss over the support set. The obtained classifier is used to infer the pseudo-labels of the unlabeled data. Furthermore, we propose an Instance Discriminant Analysis to evaluate the credibility of the pseudo-labeled examples and select the faithful ones into an augmented support set, which is used to retrain the classifier. We iterate this process until the pseudo-labels of the unlabeled data become stable. Extensive experiments on four widely used few-shot classification benchmarks, including mini-ImageNet, tiered-ImageNet, CUB, and CIFAR-FS, show that the proposed method outperforms previous state-of-the-art FSL methods.
AB - We study the few-shot learning (FSL) problem, where a model learns to recognize new objects with extremely few labeled training examples per category. Most previous FSL approaches resort to the meta-learning paradigm, where the model accumulates inductive bias by learning from many training tasks in order to solve new, unseen few-shot tasks. In contrast, we propose a simple semi-supervised FSL approach that exploits the unlabeled data accompanying the few-shot task to improve FSL performance. More specifically, to train a classifier, we propose a Dependency Maximization loss based on the Hilbert-Schmidt norm of the cross-covariance operator, which maximizes the statistical dependency between the embedded features of the unlabeled data and their label predictions, together with the supervised loss over the support set. The obtained classifier is used to infer the pseudo-labels of the unlabeled data. Furthermore, we propose an Instance Discriminant Analysis to evaluate the credibility of the pseudo-labeled examples and select the faithful ones into an augmented support set, which is used to retrain the classifier. We iterate this process until the pseudo-labels of the unlabeled data become stable. Extensive experiments on four widely used few-shot classification benchmarks, including mini-ImageNet, tiered-ImageNet, CUB, and CIFAR-FS, show that the proposed method outperforms previous state-of-the-art FSL methods.
UR - http://www.scopus.com/inward/record.url?scp=85137828851&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85137828851&partnerID=8YFLogxK
U2 - 10.1109/CVPRW56347.2022.00319
DO - 10.1109/CVPRW56347.2022.00319
M3 - Conference contribution
AN - SCOPUS:85137828851
T3 - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
SP - 2816
EP - 2824
BT - Proceedings - 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2022
PB - IEEE Computer Society
T2 - 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2022
Y2 - 19 June 2022 through 20 June 2022
ER -