Browsing by Author "Prasad, N."
Now showing 1 - 3 of 3
Item
Bio-Inspired Hyperparameter Tuning of Federated Learning for Student Activity Recognition in Online Exam Environment (Multidisciplinary Digital Publishing Institute (MDPI), 2024)
Ramu, R.; Prasad, N.; Guddeti, R.M.R.; Mohan, B.R.
Nowadays, online examination (exam, in short) platforms are becoming more popular, demanding strong security measures for digital learning environments. This includes addressing key challenges such as head pose detection and estimation, which are integral to applications like automatic face recognition, advanced surveillance systems, intuitive human–computer interfaces, and driving-safety measures. The proposed work can significantly enhance the security and reliability of online exam platforms by accurately classifying students' attentiveness based on distinct head poses, leveraging federated learning and deep learning models. In this work, we considered five head poses: front face, down face, right face, up face, and left face. A federated learning (FL) framework with a pre-trained deep learning model (ResNet50) was used to accomplish the classification task; the ResNet50 model runs on each local client device of the FL framework to classify students' activity (behavior) in an online exam environment. However, identifying the best hyperparameters for the local client ResNet50 model is challenging. Hence, in this study, we proposed two hybrid bio-inspired optimization methods, namely Particle Swarm Optimization with Genetic Algorithm (PSOGA) and Particle Swarm Optimization with Elitist Genetic Algorithm (PSOEGA), to fine-tune the hyperparameters of the ResNet50 model, which is then trained to classify students' behavior in an online exam environment.
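The abstract does not give the exact PSOGA update equations, but the general pattern of a PSO search augmented with GA crossover and mutation over hyperparameter vectors can be sketched as follows. The search space (learning rate, dropout) and the toy fitness function standing in for validation accuracy are illustrative assumptions, not the paper's setup.

```python
import random

# Hypothetical PSOGA-style hyperparameter search: PSO velocity/position
# updates plus a GA step (crossover of the two best, then mutation).
BOUNDS = [(1e-4, 1e-1), (0.0, 0.5)]  # assumed space: (learning_rate, dropout)

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def fitness(pos):
    # Stand-in for validation accuracy of a model trained with `pos`;
    # a smooth toy function peaking near lr=0.01, dropout=0.25.
    lr, dr = pos
    return -((lr - 0.01) ** 2 * 1e4 + (dr - 0.25) ** 2)

def psoga_search(n_particles=10, iters=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    vel = [[0.0] * len(BOUNDS) for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pos, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            # PSO step: pull each particle toward its own and the global best.
            for d, (lo, hi) in enumerate(BOUNDS):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = clip(pos[i][d] + vel[i][d], lo, hi)
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
        # GA step: crossover the two best personal bests, mutate one
        # dimension, and replace the worst particle with the child.
        a, b = sorted(range(n_particles), key=lambda i: fitness(pbest[i]))[-2:]
        child = [(pbest[a][d] + pbest[b][d]) / 2 for d in range(len(BOUNDS))]
        d = rng.randrange(len(BOUNDS))
        lo, hi = BOUNDS[d]
        child[d] = clip(child[d] + rng.gauss(0, 0.05 * (hi - lo)), lo, hi)
        worst = min(range(n_particles), key=lambda i: fitness(pos[i]))
        pos[worst] = child
        gbest = max([gbest] + pbest, key=fitness)[:]
    return gbest

best = psoga_search()
```

In practice each fitness evaluation would train and validate the client-side ResNet50 model, so the swarm is kept small and the iteration budget low.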
The FL framework trains the client model locally and sends the updated weights to the server model. The proposed hybrid bio-inspired algorithms outperform the GA and PSO when each is used independently. The proposed PSOGA outperforms not only the proposed PSOEGA but also the benchmark algorithms considered for performance evaluation, achieving an accuracy of 95.97%. © 2024 by the authors.

Item
Hybrid Bio-Optimized Algorithms for Hyperparameter Tuning in Machine Learning Models: A Software Defect Prediction Case Study (Multidisciplinary Digital Publishing Institute (MDPI), 2024)
Das, M.; Mohan, B.R.; Guddeti, R.M.R.; Prasad, N.
Addressing real-time optimization problems becomes increasingly challenging as their complexity escalates. Bio-optimization algorithms (BoAs) are well suited to such problems owing to their global search capability, adaptability, versatility, parallelism, and robustness. This article performs hyperparameter tuning of machine learning (ML) models by integrating them with BoAs. Aiming to maximize the accuracy of the hybrid bio-optimized defect prediction (HBoDP) model, this research paper develops four novel hybrid BoAs: the gravitational force Lévy flight grasshopper optimization algorithm (GFLFGOA), the gravitational force Lévy flight grasshopper optimization algorithm–sparrow search algorithm (GFLFGOA-SSA), the gravitational force grasshopper optimization algorithm–sparrow search algorithm (GFGOA-SSA), and the Lévy flight grasshopper optimization algorithm–sparrow search algorithm (LFGOA-SSA). These algorithms combine the good exploration capacity of the SSA with the faster convergence of the LFGOA and GFGOA. The performance of the GFLFGOA, GFLFGOA-SSA, GFGOA-SSA, and LFGOA-SSA is verified by conducting two different experiments.
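The Lévy-flight variants above draw exploratory steps from a heavy-tailed Lévy distribution. A common way to sample such steps is Mantegna's algorithm, sketched below; this is a standard formulation offered as an assumption, since the abstract does not reproduce the papers' exact equations.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """Draw one Levy-flight step via Mantegna's algorithm.

    beta is the stability index (1 < beta <= 2); the heavy tails
    produce occasional long jumps that help an optimizer escape
    local optima during exploration.
    """
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = rng.gauss(0, sigma)  # numerator: Gaussian with Mantegna's sigma
    v = rng.gauss(0, 1)      # denominator: standard Gaussian
    return u / abs(v) ** (1 / beta)

# Illustrative use inside a position update (the 0.01 scale is an assumption):
# new_pos = pos + 0.01 * levy_step() * (pos - best_pos)
```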
Firstly, the experimentation was conducted on nine benchmark functions (BFs) to assess the mean, standard deviation (SD), and convergence rate. The second experiment focuses on boosting the accuracy of the HBoDP model by fine-tuning the hyperparameters of the artificial neural network (ANN) and XGBOOST (XGB) models. To justify the effectiveness and performance of these novel hybrid algorithms, we compared them with four base algorithms, namely the grasshopper optimization algorithm (GOA), the sparrow search algorithm (SSA), the gravitational force grasshopper optimization algorithm (GFGOA), and the Lévy flight grasshopper optimization algorithm (LFGOA). Our findings demonstrate the effectiveness of this hybrid approach in improving the convergence rate and accuracy: the experimental results show faster convergence on the BFs and improved software defect prediction accuracy on the NASA defect datasets compared with baseline methods. © 2024 by the authors.

Item
Improving Machine Learning Models with Hybrid Metaheuristic Algorithm for Software Defect Prediction (Springer Science and Business Media Deutschland GmbH, 2025)
Das, M.; Prasad, N.; Mohan, B.R.
Software defect prediction has always been an area of interest in software engineering. As the prediction of software defects plays a vital role, researchers are increasingly turning to metaheuristic algorithms to develop better prediction models. In this paper, we focus on tuning the hyperparameters of machine learning (ML) models using hybrid metaheuristic algorithms. We used three metaheuristic algorithms, namely the sparrow search algorithm, the wolf pack algorithm, and the artificial bee colony (ABC) optimization algorithm, to optimize the hyperparameters of the ML models, and developed hybrid versions of these algorithms for better performance. The sparrow search algorithm (SSA) has high search accuracy and a slow convergence speed, along with the advantages of good stability and strong robustness.
The wolf pack algorithm (WPA) has a robust global optimization ability, a fast convergence speed, and various optimization strategies. The artificial bee colony (ABC) optimization algorithm has the advantage of not being influenced by the initial parameters, enabling search over a wider search space. Combining the strongest features of the aforementioned algorithms, two new hybrid algorithms have been developed, namely the sparrow search algorithm–wolf pack algorithm (SSA-WPA) and the sparrow search algorithm–artificial bee colony (SSA-ABC). These two algorithms are combined with the artificial neural network and XGBOOST models for better accuracy. The correctness of the proposed method is verified on five defective NASA datasets and compared against the base methods. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
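The general hybridization pattern these abstracts describe, i.e. one algorithm's wide exploration feeding another algorithm's local refinement, can be sketched generically. The toy objective, the uniform exploration phase, and the hill-climbing refinement below are illustrative assumptions standing in for the SSA-WPA/SSA-ABC update rules and for the validation error of an ANN or XGBoost model.

```python
import random

def sphere(x):
    # Toy objective standing in for the validation error of an ML model
    # trained with hyperparameter vector x (lower is better).
    return sum(v * v for v in x)

def explore(rng, n, dim, span):
    # Global exploration phase (stand-in for SSA "producer" moves):
    # sample candidate hyperparameter vectors widely across the space.
    return [[rng.uniform(-span, span) for _ in range(dim)] for _ in range(n)]

def refine(rng, x, step, iters):
    # Local refinement phase (stand-in for a WPA-style siege around the
    # current best): random perturbations, keeping only improvements.
    best = x[:]
    for _ in range(iters):
        cand = [v + rng.gauss(0, step) for v in best]
        if sphere(cand) < sphere(best):
            best = cand
    return best

def hybrid_search(dim=3, seed=0):
    rng = random.Random(seed)
    swarm = explore(rng, n=30, dim=dim, span=5.0)
    best = min(swarm, key=sphere)        # hand exploration's best point...
    return refine(rng, best, step=0.3, iters=200)  # ...to the refiner

best = hybrid_search()
```

The design intent matches the abstract: the first phase contributes breadth (avoiding poor regions of the hyperparameter space), the second contributes convergence speed around a promising point.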
