Faculty Publications

Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736

Publications by NITK Faculty

Search Results

Now showing 1 - 4 of 4
  • Item
    A Multimodal Contrastive Federated Learning for Digital Healthcare
    (Springer, 2023) Sachin, D.N.; Annappa, B.; Ambesange, S.; Tony, A.E.
    Digital healthcare applications have gained enormous global interest due to the rapid development of the Internet of Medical Things (IoMT), which provides access to massive amounts of multimodal healthcare data. Using this rich multimodal data without violating user privacy is crucial. Federated learning (FL) isolates data and protects user privacy: clients collaboratively learn a global model without transmitting their data. Most current FL approaches still depend on single-modal data, even though multimodal data benefit from the complementarity of different modalities. This paper proposes a multimodal contrastive federated learning framework for digital healthcare that addresses the multimodal FL problem. The architecture uses a geometric multimodal contrastive representation learning method to learn representations of multiple modalities in a shared, high-dimensional space, which optimizes the representations to better capture inter-modal relationships and improves the multimodal model’s overall performance. Experiments show that the proposed framework outperforms both conventional single-modality FL and existing multimodal FL approaches. Given its generality and extensibility, the framework can be applied to many downstream tasks in healthcare applications. © 2023, The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd.
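The core idea of contrastive multimodal representation learning — projecting each modality into a shared space and pulling paired samples together — can be sketched as follows. This is a minimal illustration, not the paper's actual architecture: the linear projections, dimensions, and InfoNCE-style loss are assumptions standing in for the geometric contrastive method described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def project(x, W):
    """Linear projection into the shared space, L2-normalized per sample."""
    z = x @ W
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def contrastive_loss(za, zb, temperature=0.1):
    """InfoNCE-style loss: paired rows of za/zb are positives, other rows negatives."""
    logits = (za @ zb.T) / temperature           # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives lie on the diagonal

# Toy batch: 8 paired samples from two modalities (e.g. image + sensor signal).
x_img = rng.normal(size=(8, 32))
x_sig = rng.normal(size=(8, 16))
W_img = rng.normal(size=(32, 64))  # modality-specific encoders into a
W_sig = rng.normal(size=(16, 64))  # shared 64-dimensional space

loss = contrastive_loss(project(x_img, W_img), project(x_sig, W_sig))
print(float(loss))  # a non-negative scalar to be minimized during training
```

Minimizing this loss aligns the two modality encoders so that representations of the same sample agree in the shared space, which is what lets the framework exploit inter-modal complementarity.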
  • Item
    FedCure: A Heterogeneity-Aware Personalized Federated Learning Framework for Intelligent Healthcare Applications in IoMT Environments
    (Institute of Electrical and Electronics Engineers Inc., 2024) Sachin, D.N.; Annappa, B.; Hegde, S.; Abhijit, C.S.; Ambesange, S.
    The advent of Internet of Medical Things (IoMT) devices has led to a healthcare revolution, introducing a new era of smart applications driven by Artificial Intelligence (AI). These technologies have greatly influenced the healthcare industry and play a crucial role in enhancing the quality of life globally. Federated Learning (FL) has become popular as a technique for building globally shared models from the vast datasets collected by IoMT devices while maintaining data privacy. However, the complex variations in IoMT environments, including diverse devices, data characteristics, and model complexities, make the straightforward application of traditional FL methods ill-suited for deployment in such contexts. This paper introduces FedCure, a personalized FL framework tailored for intelligent IoMT-based healthcare applications operating within a cloud-edge architecture. FedCure addresses the challenges of IoMT environments by employing personalized FL techniques that effectively mitigate the impact of heterogeneity, while the integration of edge computing enhances processing speed and minimizes latency in intelligent IoMT applications. Finally, this research presents several case studies of IoMT-based applications, including Eye Retinopathy Detection, Diabetes Monitoring, Maternal Health, Remote Health Monitoring, and Human Activity Recognition. These case studies assess the effectiveness of the proposed FedCure framework and demonstrate strong accuracy with minimal communication overhead, especially in addressing the challenges posed by heterogeneity. © 2013 IEEE.
  • Item
    EdgeFedNet: Edge Server Based Communication and Computation Efficient Federated Learning
    (Springer, 2025) Gowtham, L.; Annappa, B.; Sachin, D.N.
    Federated learning (FL) is a learning framework for training machine learning and deep learning models on data spread over many edge devices. Edge devices such as mobile phones and IoT devices have limited computational power, resources, and connectivity for training the model. Moreover, many model parameters are exchanged during training, leading to high communication costs in FL when bandwidth is limited. This paper presents EdgeFedNet, a new approach to training models in FL. The proposed method reduces the number of model parameters by pruning the model and restricts communication between clients and the cloud server by introducing edge servers. An edge server near a set of clients forms a cluster and coordinates the FL training. The aggregated model updates from all the edge servers are sent to the cloud server, limiting frequent communication between the clients and the cloud server. The experimental results show a remarkable reduction in the number of model parameters (up to 54%) and effectively address the communication overhead by reducing communication rounds by 59% compared to the FedAvg baseline. These enhancements are achieved without sacrificing accuracy, with promising implications for more efficient model parameter pruning and communication strategies. © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. 2025.
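The two-tier aggregation described above can be sketched in a few lines: clients send pruned updates to a nearby edge server, which averages them, and only the edge-level averages travel to the cloud. This is an illustrative assumption-laden sketch, not EdgeFedNet itself — the cluster sizes, the 50% unstructured magnitude-pruning ratio, and plain FedAvg averaging are stand-ins for the paper's method.

```python
import numpy as np

def prune(weights, ratio=0.5):
    """Zero out the smallest-magnitude weights (unstructured magnitude pruning)."""
    k = int(weights.size * ratio)
    threshold = np.sort(np.abs(weights), axis=None)[k]
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def fedavg(models):
    """Plain FedAvg: element-wise mean of model parameter vectors."""
    return np.mean(models, axis=0)

rng = np.random.default_rng(1)

# Two edge clusters with 3 and 4 clients; each client holds a 100-parameter model.
clusters = [[rng.normal(size=100) for _ in range(3)],   # clients behind edge server 1
            [rng.normal(size=100) for _ in range(4)]]   # clients behind edge server 2

# Tier 1: each edge server averages its cluster's pruned client models.
edge_models = [fedavg([prune(m) for m in cluster]) for cluster in clusters]

# Tier 2: only the 2 edge models reach the cloud, instead of all 7 client models.
global_model = fedavg(edge_models)
print(global_model.shape)
```

The communication saving comes from the fan-in: per round, the cloud receives one upload per edge server rather than one per client, and pruning shrinks each upload.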
  • Item
    Smart client selection strategies for enhanced federated learning in digital healthcare applications
    (Springer, 2025) Sachin, D.N.; Annappa, B.; Ambesange, S.
    Federated Learning (FL) trains AI models in healthcare without sharing patient data: client models are computed locally and combined into a global model. However, involving all clients in every round is impractical due to resource limitations, and randomly selecting a subset of clients in each FL round can overburden resource-limited devices, leading to longer processing times and potential training failures. To tackle these obstacles, this research proposes a novel client selection strategy for each FL training round that improves the efficiency and effectiveness of FL in healthcare applications, where data privacy is paramount. The approach begins by calculating an uncertainty value for each client, which quantifies the contribution of the client’s data to the overall model. Clients are then ranked by their uncertainty values, and those with higher loss values are given a higher probability of participating in the training process. The experimental outcomes show that the proposed strategy makes training 1.3x faster, lowers communication expenses by 30%, conserves computational resources, and enhances model performance compared with random client selection. © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024.
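The selection rule sketched above — rank clients by an uncertainty score and favor high-loss clients — can be illustrated as loss-weighted sampling. This is a hedged sketch under assumptions: using per-client validation loss as the uncertainty value and sampling with probability proportional to it is one plausible reading, not necessarily the paper's exact scheme.

```python
import numpy as np

def select_clients(losses, k, rng):
    """Sample k distinct clients, with probability proportional to reported loss."""
    losses = np.asarray(losses, dtype=float)
    probs = losses / losses.sum()
    return rng.choice(len(losses), size=k, replace=False, p=probs)

rng = np.random.default_rng(42)
client_losses = [0.1, 0.2, 1.5, 0.3, 2.0, 0.05]  # toy per-client uncertainty values

# Repeat many rounds to observe the selection bias toward high-loss clients.
counts = np.zeros(6)
for _ in range(2000):
    for c in select_clients(client_losses, k=2, rng=rng):
        counts[c] += 1

print(counts)  # clients 2 and 4 (highest losses) are selected far more often
```

Compared with uniform random selection, this concentrates training rounds on clients whose data the global model handles worst, which is what drives the reported speed-up and communication savings.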