Faculty Publications

Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736

Publications by NITK Faculty

Search Results

Now showing 1 - 3 of 3
  • Item
    VMAP: Matching-based Efficient Offloading in IoT-Fog Environments with Variable Resources
    (IEEE Computer Society, 2023) Morey, J.V.; Satpathy, A.; Addya, S.K.
    Fog computing is a promising technology for critical, resource-intensive, and time-sensitive applications. A significant challenge in this regard is generating an offloading solution that minimizes latency, energy, and the number of outages in a dense IoT-Fog environment. However, existing solutions either focus on a single objective or mainly dedicate fixed-sized resources as virtual resource units (VRUs); being restrictive rather than comprehensive, they yield poor performance. To overcome these challenges, this paper proposes VMAP, a model that addresses the above gaps. The offloading problem is abstracted as a one-to-many matching game between two sets of entities, tasks and fog nodes (FNs), that considers the preferences of both. The preferences and parameter weights are generated using the Analytic Hierarchy Process (AHP). Exhaustive simulations indicate that the proposed strategy outperforms the baseline algorithms, reducing average task latency and energy consumption by 35% and 22.2%, respectively. Additionally, resource utilization improves by 28.57%, and 97.98% of tasks complete their execution within their deadlines. © 2023 IEEE.
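The AHP step described in the abstract can be illustrated with a small sketch. The criteria, the pairwise comparison matrix, and the function name below are hypothetical, chosen only to show how AHP turns pairwise importance judgments into normalized weights (here via the common row geometric-mean approximation of the principal eigenvector), not the paper's actual parameters.

```python
from math import prod

def ahp_weights(A):
    """Approximate the principal eigenvector of a pairwise
    comparison matrix A via the row geometric-mean method,
    normalized so the weights sum to 1."""
    n = len(A)
    gm = [prod(row) ** (1.0 / n) for row in A]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical criteria for ranking fog nodes: latency, energy, outage risk.
# A[i][j] = relative importance of criterion i over criterion j.
A = [
    [1,     3,   5],   # latency vs (latency, energy, outage)
    [1 / 3, 1,   2],   # energy
    [1 / 5, 1 / 2, 1], # outage
]
w = ahp_weights(A)  # largest weight goes to the most important criterion
```

The resulting weight vector can then score each candidate fog node as a weighted sum of its (normalized) per-criterion values when building a preference list.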
  • Item
    LBA: Matching Theory Based Latency-Sensitive Binary Offloading in IoT-Fog Networks
    (Institute of Electrical and Electronics Engineers Inc., 2024) Soni, P.; Deshlahre, O.C.; Satpathy, A.; Addya, S.K.
    The Internet of Things (IoT) is growing more popular with applications such as healthcare services, traffic monitoring, video streaming, and smart homes. These applications produce an enormous amount of data, so a realistic option is to offload computational tasks to nearby fog nodes (FNs) instead of the remote cloud. However, a negligent offloading strategy may cause an anomalous computational traffic load at the FNs, creating congestion that adversely affects latency. The latency of a task flow from an IoT device comprises the communication latency at the base station (BS) and the computational latency at the FNs. Therefore, designing offloading algorithms that distribute the computational load evenly across FNs and utilize FN resources efficiently is crucial. To solve this problem, we propose LBA, a matching-theory-based binary offloading strategy for fog networks. We use the Analytic Hierarchy Process (AHP) to generate the preference lists, model the complete offloading problem as a one-to-many matching game, and apply the deferred acceptance algorithm (DAA) to produce a stable assignment. Comprehensive simulations show that LBA achieves a better-balanced assignment than all baseline algorithms for both homogeneous and heterogeneous inputs. © 2024 IEEE.
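The deferred acceptance step can be sketched for the one-to-many task-to-FN setting. This is a generic Gale–Shapley-style sketch under assumed data structures (the names `task_prefs`, `fn_prefs`, and `capacity` are illustrative), not the paper's exact formulation:

```python
def deferred_acceptance(task_prefs, fn_prefs, capacity):
    """One-to-many deferred acceptance: tasks propose to fog nodes (FNs)
    in preference order; each FN tentatively keeps its most-preferred
    proposers up to its capacity and rejects the rest."""
    free = list(task_prefs)                   # tasks still proposing
    next_choice = {t: 0 for t in task_prefs}  # index into each task's list
    accepted = {f: [] for f in fn_prefs}      # FN -> tentatively held tasks
    rank = {f: {t: i for i, t in enumerate(p)} for f, p in fn_prefs.items()}
    while free:
        t = free.pop()
        if next_choice[t] >= len(task_prefs[t]):
            continue                          # t exhausted its list; unmatched
        f = task_prefs[t][next_choice[t]]
        next_choice[t] += 1
        accepted[f].append(t)
        accepted[f].sort(key=lambda x: rank[f][x])
        if len(accepted[f]) > capacity[f]:
            free.append(accepted[f].pop())    # reject worst-ranked proposer
    return accepted

# Tiny illustrative instance: three tasks, two FNs.
match = deferred_acceptance(
    task_prefs={"t1": ["f1", "f2"], "t2": ["f1", "f2"], "t3": ["f1", "f2"]},
    fn_prefs={"f1": ["t1", "t2", "t3"], "f2": ["t3", "t1", "t2"]},
    capacity={"f1": 2, "f2": 1},
)
```

The tentative-acceptance-with-rejection loop is what yields stability: no task and FN would both prefer each other over their final assignment.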
  • Item
    Containerized deployment of micro-services in fog devices: a reinforcement learning-based approach
    (Springer, 2022) Nath, S.B.; Chattopadhyay, S.; Karmakar, R.; Addya, S.K.; Chakraborty, S.; Ghosh, S.K.
    The real power of fog computing emerges in smart environments, where the raw data sensed by Internet of Things (IoT) devices should not cross the data boundary, to preserve the privacy of the environment, yet fast computation and processing of that data are required. Devices such as home network gateways, WiFi access points, or core network switches can serve as fog devices in such scenarios, since their computing resources can be leveraged by applications for data processing. However, these devices have their own primary workload (like packet forwarding in a router/switch) that is time-varying and often produces spikes in resource demand when bandwidth-hungry end-user applications are started. In this paper, we propose pick–test–choose (PTC), a dynamic micro-service deployment and execution model that accounts for such time-varying primary workloads and workload spikes in the fog nodes. The proposed mechanism uses a reinforcement learning technique, Bayesian optimization, to decide the target fog node for an application micro-service based on prior observations of the system's states. We implement PTC in a testbed setup and evaluate its performance, observing that PTC outperforms four other baseline models for micro-service offloading in a fog computing framework. In an experiment with an optical character recognition service, PTC achieves average response times in the range of 9.71 sec–50 sec, better than Foglets (24.21 sec–80.35 sec), first-fit (16.74 sec–88 sec), best-fit (11.48 sec–57.39 sec), and a mobility-based method (12 sec–53 sec). A further scalability study with an emulated setup over Amazon EC2 confirms the superiority of PTC over the other baselines. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
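The placement decision PTC makes, choosing a target fog node from prior observations while balancing exploitation and exploration, can be suggested with a heavily simplified stand-in. The sketch below uses a UCB-style bandit score rather than the paper's Bayesian-optimization surrogate; every name, constant, and the scoring rule are illustrative assumptions.

```python
import math

def pick_fog_node(nodes, observations, t, c=1.0):
    """UCB-style selection (illustrative stand-in, not PTC itself):
    pick the node minimizing mean observed response time minus an
    exploration bonus that shrinks as a node is observed more often."""
    best, best_score = None, float("inf")
    for n in nodes:
        obs = observations.get(n, [])
        if not obs:
            return n  # try every node at least once
        mean = sum(obs) / len(obs)
        bonus = c * math.sqrt(math.log(t) / len(obs))
        score = mean - bonus  # lower is better (response time)
        if score < best_score:
            best, best_score = n, score
    return best

# Hypothetical response-time observations (seconds) per fog node.
obs = {"f1": [10, 12], "f2": [30]}
first = pick_fog_node(["f1", "f2", "f3"], obs, t=3)   # unobserved node first
obs["f3"] = [40]
second = pick_fog_node(["f1", "f2", "f3"], obs, t=4)  # then the fastest node
```

After each deployment, the measured response time would be appended to the chosen node's history, so the exploration bonus decays and the policy converges toward the consistently fastest node.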