Title: Optimizing Hyperparameters in Meta-Learning for Enhanced Image Classification
Authors: Vincent, A. M.; Padikkal, P.; Bini, A. A.
Published in: IEEE Access, 2025, vol. 13, pp. 130816-130831
DOI: https://doi.org/10.1109/ACCESS.2025.3591142
Repository handle: https://idr.nitk.ac.in/handle/123456789/20665
Date deposited: 2026-02-03

Abstract: This paper investigates the significance of hyperparameter optimization in meta-learning for image classification tasks. Despite advancements in deep learning, real-time image classification applications often suffer from data inadequacy. Few-shot learning addresses this challenge by enabling learning from limited samples. Meta-learning, a prominent tool for few-shot learning, learns across multiple classification tasks. We explore different types of meta-learners, with a particular focus on metric-based models. We analyze the potential of hyperparameter optimization techniques, specifically Bayesian optimization and its variants, to enhance the performance of these models. Experimental results on the Omniglot and ImageNet datasets demonstrate that incorporating Bayesian optimization, particularly its evolutionary strategy variant, into meta-learning frameworks leads to improved accuracy compared to settings without hyperparameter optimization. Here, we show that by optimizing hyperparameters for individual tasks rather than using a uniform setting, we achieve notable gains in model performance, underscoring the importance of tailored hyperparameter configurations in meta-learning. © 2013 IEEE.

Keywords: Bayesian networks; Classification (of information); Deep learning; Evolutionary algorithms; Image enhancement; Learning algorithms; Learning systems; Optimization; Bayesian optimization; Classification tasks; Few-shot learning; Hyper-parameter; Hyper-parameter optimizations; Image classification; Few-shot learning; Meta-learning; Real-time images
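The abstract's central idea, tuning hyperparameters per task with Bayesian optimization rather than using one uniform setting, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses a toy unimodal "validation accuracy" as a stand-in for per-task episode accuracy, a simple Gaussian-process surrogate with an RBF kernel, and an expected-improvement acquisition maximized over a grid of candidate learning rates. All function names and the search range are assumptions for illustration.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def task_accuracy(log_lr, opt):
    # Toy stand-in for a task's validation accuracy as a function of
    # log10 learning rate (assumption: smooth, unimodal response;
    # the paper's real objective is meta-learner episode accuracy).
    return math.exp(-((log_lr - opt) ** 2) / (2 * 0.4 ** 2))

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel over scalar inputs.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(X, y, Xs):
    # Noiseless GP regression posterior (jitter for stability).
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(Kss) - np.einsum("ij,jk,ki->i", Ks.T, Kinv, Ks)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sd, best):
    # EI for maximization: (mu - best) * Phi(z) + sd * phi(z).
    z = (mu - best) / sd
    Phi = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (mu - best) * Phi + sd * phi

def bayes_opt_task(task_opt, n_init=3, n_iter=10):
    # Bayesian optimization of log10 learning rate for ONE task.
    grid = np.linspace(-5.0, -1.0, 201)  # candidate log10 learning rates
    X = rng.uniform(-5.0, -1.0, n_init)
    y = np.array([task_accuracy(x, task_opt) for x in X])
    for _ in range(n_iter):
        mu, sd = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sd, y.max()))]
        X = np.append(X, x_next)
        y = np.append(y, task_accuracy(x_next, task_opt))
    i = int(np.argmax(y))
    return X[i], y[i]

# Per-task tuning: each task has its own (hypothetical) optimal
# learning rate, so a single uniform setting cannot suit both.
for task_opt in (-3.2, -2.1):
    best_lr, best_acc = bayes_opt_task(task_opt)
    print(f"task opt {task_opt}: tuned log10(lr)={best_lr:.2f}, acc={best_acc:.3f}")
```

The per-task loop is the point: the same BO routine is re-run for each classification task, so each task ends up with its own tailored configuration, mirroring the abstract's claim that task-specific settings beat a uniform one. The paper's best-performing variant couples this with an evolutionary strategy, which is not shown here.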