TY - GEN
T1 - Few-Shot Learning Via Dependency Maximization and Instance Discriminant Analysis
AU - Hou, Zejiang
AU - Kung, Sun Yuan
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - We study the few-shot learning (FSL) problem, where a model learns to recognize new objects with extremely few labeled training data per category. Most previous FSL approaches resort to the meta-learning paradigm, where the model accumulates inductive bias through learning many training tasks so as to solve a new unseen few-shot task. In contrast, we propose a simple approach to exploit unlabeled data accompanying the few-shot task for improving few-shot performance. First, we propose a Dependency Maximization method based on the Hilbert-Schmidt norm of the cross-covariance operator, which maximizes the statistical dependency between the embedded features of the unlabeled data and their label predictions, together with the supervised loss over the support set. We then use the obtained model to infer pseudo-labels for the unlabeled data. Furthermore, we propose an Instance Discriminant Analysis to evaluate the credibility of each pseudo-labeled example and select the most faithful ones into an augmented support set, which is used to retrain the model as in the first step. We iterate the above process until the pseudo-labels for the unlabeled data become stable. Following the standard transductive and semi-supervised FSL settings, our experiments show that the proposed method outperforms previous state-of-the-art methods on four widely used benchmarks: mini-ImageNet, tiered-ImageNet, CUB, and CIFAR-FS.
AB - We study the few-shot learning (FSL) problem, where a model learns to recognize new objects with extremely few labeled training data per category. Most previous FSL approaches resort to the meta-learning paradigm, where the model accumulates inductive bias through learning many training tasks so as to solve a new unseen few-shot task. In contrast, we propose a simple approach to exploit unlabeled data accompanying the few-shot task for improving few-shot performance. First, we propose a Dependency Maximization method based on the Hilbert-Schmidt norm of the cross-covariance operator, which maximizes the statistical dependency between the embedded features of the unlabeled data and their label predictions, together with the supervised loss over the support set. We then use the obtained model to infer pseudo-labels for the unlabeled data. Furthermore, we propose an Instance Discriminant Analysis to evaluate the credibility of each pseudo-labeled example and select the most faithful ones into an augmented support set, which is used to retrain the model as in the first step. We iterate the above process until the pseudo-labels for the unlabeled data become stable. Following the standard transductive and semi-supervised FSL settings, our experiments show that the proposed method outperforms previous state-of-the-art methods on four widely used benchmarks: mini-ImageNet, tiered-ImageNet, CUB, and CIFAR-FS.
KW - cross-domain
KW - dependency maximization
KW - few-shot learning
KW - semi-supervised
UR - http://www.scopus.com/inward/record.url?scp=85122786702&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85122786702&partnerID=8YFLogxK
U2 - 10.1109/MLSP52302.2021.9596284
DO - 10.1109/MLSP52302.2021.9596284
M3 - Conference contribution
AN - SCOPUS:85122786702
T3 - IEEE International Workshop on Machine Learning for Signal Processing, MLSP
BT - 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing, MLSP 2021
PB - IEEE Computer Society
T2 - 31st IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2021
Y2 - 25 October 2021 through 28 October 2021
ER -