EdgeFedNet: Edge Server Based Communication and Computation Efficient Federated Learning

Date

2025

Publisher

Springer

Abstract

Federated learning (FL) is a learning framework for training machine learning and deep learning models on data spread over several edge devices. Edge devices such as mobile phones and IoT devices have limited computational power, resources, and connectivity for model training. Moreover, many model parameters are exchanged during training, leading to high communication costs in FL when bandwidth is limited. This paper presents EdgeFedNet, a new approach to model training in FL. The proposed method reduces the number of model parameters by pruning the model and restricts communication between clients and the cloud server by introducing edge servers. An edge server near a set of clients forms a cluster and coordinates the FL training. The aggregated model updates from all the edge servers are sent to the cloud server, limiting frequent communication between the clients and the cloud server. The experimental results show a substantial reduction in the number of model parameters (up to 54%) and effectively address the communication overhead by reducing communication rounds by 59% compared to the baseline approach, FedAvg. These enhancements are achieved without sacrificing accuracy, with promising implications for more efficient model-parameter pruning and communication strategies. © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. 2025.
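The hierarchical aggregation the abstract describes (clients report to a nearby edge server, and only the edge servers' aggregated models reach the cloud) can be sketched as below. This is an illustrative reconstruction based solely on the abstract, not the authors' implementation; the function names and the sample-count-weighted averaging (standard FedAvg-style aggregation) are assumptions.

```python
# Hypothetical sketch of EdgeFedNet-style hierarchical aggregation.
# Each client is a dict: {"model": <parameter vector>, "num_samples": <int>}.
# Models are represented as plain lists of floats for simplicity.

def weighted_average(models, weights):
    """Average parameter vectors, weighted by per-source sample counts."""
    total = sum(weights)
    dim = len(models[0])
    return [sum(w * m[i] for m, w in zip(models, weights)) / total
            for i in range(dim)]

def edge_aggregate(cluster):
    """An edge server aggregates its own clients' models locally,
    so clients never contact the cloud directly."""
    models = [c["model"] for c in cluster]
    weights = [c["num_samples"] for c in cluster]
    return weighted_average(models, weights), sum(weights)

def cloud_aggregate(clusters):
    """The cloud server averages only the edge servers' models,
    each weighted by the total samples in its cluster."""
    edge_models, edge_weights = zip(*(edge_aggregate(c) for c in clusters))
    return weighted_average(list(edge_models), list(edge_weights))
```

Because the edge-level averages are re-weighted by cluster sample counts, the cloud model equals the flat FedAvg average over all clients; the benefit is that per-round traffic to the cloud scales with the number of edge servers rather than the number of clients.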

Keywords

Communication, Computation, Edge computing, Federated learning

Citation

SN Computer Science, 2025, 6, 3, pp. -
