BatMan-CLR: Making Few-Shots Meta-learners Resilient Against Label Noise
Conference proceedings contribution
Publication date:
2025
Abstract:
The negative impact of label noise is well studied in classical supervised learning, yet it remains an open research question in meta-learning. Meta-learners aim to adapt to unseen tasks by learning a good initial model during meta-training and fine-tuning it to new tasks during meta-testing. In this paper, we present an extensive analysis of the impact of label noise on the performance of meta-learners, specifically gradient-based N-way K-shot learners. We show that the accuracy of Reptile, iMAML, and foMAML drops by up to 34% when meta-training is affected by label noise on three representative datasets: Omniglot, CifarFS, and MiniImageNet. To strengthen resilience against label noise, we propose two sampling techniques, namely manifold (Man) and batch manifold (BatMan), which transform noisy supervised learners into semi-supervised learners to increase the utility of noisy labels. We construct N-way 2-contrastive-shot tasks through augmentation, learn the embedding via a contrastive loss in meta-training, and perform classification through zeroing on the embeddings in meta-testing. We show that our approach can effectively mitigate the impact of meta-training label noise. Even with 60% wrong labels, BatMan and Man limit the meta-testing accuracy drop to 2.5, 9.4, and 1.1 percentage points with existing meta-learners across Omniglot, CifarFS, and MiniImageNet, respectively. We provide our code online: https://gitlab.ewi.tudelft.nl/dmls/publications/batman-clr-noisy-meta-learning.
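The abstract's task construction, building an N-way 2-contrastive-shot task from augmented views with task-local pseudo-labels and scoring it with a contrastive loss, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the jitter augmentation stands in for image transforms, the encoder is omitted (raw views are used as embeddings), and all function names are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(x, rng):
    # Hypothetical augmentation: additive jitter stands in for image transforms.
    return x + 0.01 * rng.standard_normal(x.shape)

def make_contrastive_task(samples, rng):
    """Build an N-way 2-contrastive-shot task: for each of the N sampled
    inputs (one per way), emit two augmented views sharing a task-local
    pseudo-label, ignoring the (possibly noisy) original labels."""
    shots, pseudo_labels = [], []
    for way, x in enumerate(samples):
        shots += [augment(x, rng), augment(x, rng)]
        pseudo_labels += [way, way]
    return np.stack(shots), np.array(pseudo_labels)

def nt_xent_loss(z, pseudo_labels, tau=0.5):
    """NT-Xent-style contrastive loss over the 2N embeddings: each view's
    positive is the other view carrying the same pseudo-label."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)  # exclude self-similarity
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    pos = pseudo_labels[:, None] == pseudo_labels[None, :]
    np.fill_diagonal(pos, False)
    return -log_prob[pos].mean()

# A 5-way task built from 5 raw inputs (e.g. flattened images).
samples = rng.standard_normal((5, 16))
shots, pseudo = make_contrastive_task(samples, rng)
loss = nt_xent_loss(shots, pseudo)  # encoder omitted: raw views as embeddings
```

Because the pseudo-labels come from the sampled input itself rather than its (possibly corrupted) annotation, the loss is unaffected by label noise, which is the mechanism the abstract attributes to Man and BatMan.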
CRIS type:
04A-Conference paper in volume
Authors:
Galjaard, Jeroen M.; Birke, Robert; Pérez, Juan F.; Chen, Lydia Y.
Link to full record:
Link to full text:
Book title:
Lecture Notes in Computer Science
Published in: