A review on NLP zero-shot and few-shot learning: methods and applications
Date
2025
Journal Title
Journal ISSN
Volume Title
Publisher
Springer Nature
Abstract
This comprehensive review surveys zero-shot and few-shot learning techniques in natural language processing (NLP), tracing their evolution from traditional methods to cutting-edge approaches: transfer learning with pre-trained language models, semantic embeddings, attribute-based approaches, and generative models for data augmentation in zero-shot learning; and meta-learning, model-agnostic meta-learning (MAML), relation networks, and prototypical networks in few-shot learning. Real-world applications underscore the adaptability and efficacy of these techniques across various NLP tasks in both industry and academia. Acknowledging the challenges inherent in zero-shot and few-shot learning, this review identifies limitations and suggests avenues for improvement. It emphasizes theoretical foundations alongside practical considerations such as accuracy and generalization across diverse NLP tasks. By consolidating key insights, this review provides researchers and practitioners with guidance on the current state and future potential of zero-shot and few-shot learning techniques for addressing real-world NLP challenges. Looking ahead, it aims to stimulate further research and foster a deeper understanding of the complexities and applicability of these techniques in NLP. By offering a roadmap for future exploration, it seeks to contribute to the ongoing advancement and practical implementation of NLP technologies across various domains. © The Author(s) 2025.
Keywords
Artificial intelligence, Deep learning, Few-shot learning, Machine learning, Model agnostic meta-learning (MAML), Natural language processing, Zero-shot learning
Citation
Discover Applied Sciences, 2025, Vol.7, 9, p. -
