
Browsing by Author "Martin, J.P."

Now showing 1 - 17 of 17
    Elucidating the challenges for the praxis of fog computing: An aspect-based study
    (John Wiley and Sons Ltd, 2019) Martin, J.P.; Kandasamy, A.; Chandrasekaran, K.; Joseph, C.T.
    Evolutionary advancements in the field of technology have led to the advent of cloud computing. The Internet of Things paradigm stimulated the extensive use of sensors distributed across the network edges. The cloud datacenters are assigned the responsibility for processing the collected sensor data. Recently, fog computing was conceived as a solution for the overwhelmed narrow bandwidth. The fog acts as a complementary layer that interplays with the cloud and edge computing layers for processing the data streams. The fog paradigm, like any distributed paradigm, has its set of inherent challenges. The fog environment necessitates the development of management platforms that effect the orchestration of fog entities. Owing to the plenitude of research efforts directed toward these issues in a relatively young field, there is a need to organize the different research works. In this study, we provide a compendious review of the research approaches in the domain, with special emphasis on the approaches for orchestration, and propose a multilevel taxonomy to classify the existing research. The study also highlights the application realms of fog computing and delineates the open research challenges in the domain. © 2019 John Wiley & Sons, Ltd.
    Explicating fog computing key research challenges and solutions
    (CRC Press, 2021) Martin, J.P.; Singh, V.; Chandrasekaran, K.; Kandasamy, A.
    [No abstract available]
    Exploring the support for high performance applications in the container runtime environment
    (Springer Berlin Heidelberg, 2018) Martin, J.P.; Kandasamy, A.; Chandrasekaran, K.
    Cloud computing is the driving power behind the current technological era. Virtualization is rightly referred to as the backbone of cloud computing. The impacts of virtualization employed in high performance computing (HPC) have been much reviewed by researchers. The overhead in the virtualization layer was one of the reasons which hindered its application in the HPC environment. Recent developments in virtualization, especially OS-container-based virtualization, provide a solution that employs a lightweight virtualization layer and promises lower overhead. Containers are advantageous over virtual machines in terms of performance overhead, which is a major concern in the case of both data-intensive applications and compute-intensive applications. Currently, several industries have adopted container technologies such as Docker. While Docker is widely used, it has certain pitfalls such as security issues. The recently introduced CoreOS Rkt container technology overcomes these shortcomings of Docker. There has not been much research on how the Rkt environment is suited for high performance applications. The differences in the stack of the Rkt containers suggest better support for high performance applications. High performance applications consist of CPU-intensive and data-intensive applications. The High Performance Linpack Library and the Graph500 are the commonly used computation-intensive and data-intensive benchmark applications, respectively. In this work, we explore the feasibility of the inter-operable Rkt container for high performance applications by running the HPL and Graph500 applications and compare its performance with commonly used container technologies such as LXC and Docker containers. © 2018, The Author(s).
    Fog Assisted Personalized Dynamic Pricing for Smartgrid
    (Institute of Electrical and Electronics Engineers Inc., 2023) Joseph, C.T.; Martin, J.P.; Chandrasekaran, K.; Raja, S.P.
    Unit electricity pricing is of vital importance in an electric grid network. It is essential to charge the customers in a fair manner. Traditional pricing models are found to be inadequate in the ability to charge customers fairly due to a lack of support for real-time communication between customers and electricity providers. With the introduction of smart devices in the electric grid domain, the real-time gathering of information is a seamless process. Such an electric network that uses smart devices is called a smart grid. In a smart grid network, electricity providers can monitor the electricity usage pattern of customers in a real-time manner, which can then be analyzed to determine the appropriate prices. To analyze the customer's history of usage and price the electricity in a real-time manner, the computation must be performed with minimal latencies. Adoption of a fog computing layer in the smart grids can aid in the attainment of this goal. In this article, we propose a novel method for the pricing of electricity. In our approach, the electric demand of a household is predicted based on their past usage patterns. Users are then clustered into different bins based on their demands, and an evolutionary algorithm is used to generate the prices for the users present in different bins in a real-time manner to ensure the maximum attainable profit to a service provider. © 2014 IEEE.
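    The pricing pipeline the abstract describes (predict demand, cluster users into bins, price each bin) can be sketched compactly. This is a hypothetical illustration, not the paper's algorithm: the demand figures, bin edges, and unit prices below are invented, and the evolutionary price search is replaced by a fixed tariff per bin.

    ```python
    # Hypothetical sketch of bin-based pricing: users are bucketed by their
    # predicted demand and each bin is charged its own unit price. The paper
    # tunes the per-bin prices with an evolutionary algorithm; here they are
    # fixed constants for illustration.
    demands = {"u1": 2.1, "u2": 7.8, "u3": 4.9, "u4": 9.3}  # predicted kWh (invented)
    BIN_EDGES = [0.0, 3.0, 6.0, 10.0]   # demand-bin boundaries (invented)
    UNIT_PRICE = [0.10, 0.12, 0.15]     # price per kWh for each bin (invented)

    def price_for(demand):
        """Return the unit price of the bin that the demand falls into."""
        for lo, hi, p in zip(BIN_EDGES, BIN_EDGES[1:], UNIT_PRICE):
            if lo <= demand < hi:
                return p
        return UNIT_PRICE[-1]  # demands above the last edge use the top price

    bills = {u: d * price_for(d) for u, d in demands.items()}
    ```

    In the paper's setting, the per-bin prices would themselves be decision variables, evolved in real time to maximise the provider's attainable profit.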
    Fuzzy Reinforcement Learning based Microservice Allocation in Cloud Computing Environments
    (Institute of Electrical and Electronics Engineers Inc., 2019) Joseph, C.T.; Martin, J.P.; Chandrasekaran, K.; Kandasamy, A.
    Nowadays, the Cloud Computing paradigm has become the de facto platform for deploying and managing user applications. Monolithic Cloud applications pose several challenges in terms of scalability and flexibility. Hence, Cloud applications are designed as microservices. Application scheduling and energy efficiency are key concerns in Cloud computing research. Allocating the microservice containers to the hosts in the datacenter is an NP-hard problem. There is a need for efficient allocation strategies to determine the placement of the microservice containers in Cloud datacenters to minimize Service Level Agreement violations and energy consumption. In this paper, we design a Reinforcement Learning-based Microservice Allocation (RL-MA) approach. The approach is implemented in the ContainerCloudSim simulator. The evaluation is conducted using the real-world Google cluster trace. Results indicate that the proposed method reduces both the SLA violations and energy consumption when compared to the existing policies. © 2019 IEEE.
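    As a toy sketch of learning placement decisions from rewards (this is not the paper's fuzzy RL-MA implementation; the hosts, capacities, and reward function below are invented, and the state is collapsed to a single "container arrived" state), a one-state Q-learning agent can learn to spread containers across hosts when the reward penalises already-loaded hosts:

    ```python
    import random

    random.seed(7)

    HOSTS, CAPACITY = 3, 4   # invented datacenter: 3 hosts, 4 containers each
    q = [0.0] * HOSTS        # Q-value per action "place container on host h"
    load = [0] * HOSTS       # containers currently on each host
    alpha, epsilon = 0.3, 0.2

    def reward(host):
        # Fuller hosts cost more: a crude stand-in for energy use / SLA risk.
        return -load[host]

    for _ in range(200):     # stream of 200 container-placement requests
        if random.random() < epsilon:                 # explore
            a = random.randrange(HOSTS)
        else:                                         # exploit
            a = max(range(HOSTS), key=lambda h: q[h])
        q[a] += alpha * (reward(a) - q[a])            # one-state Q-learning update
        load[a] += 1
        if sum(load) >= CAPACITY * HOSTS:             # datacenter drains; new episode
            load = [0] * HOSTS
    ```

    Because loaded hosts yield lower rewards, the greedy action rotates among hosts, balancing the load. The paper's approach additionally incorporates fuzzy logic and a much richer state representation.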
    HTmRPL++: A Trust-Aware RPL Routing Protocol for Fog Enabled Internet of Things
    (Institute of Electrical and Electronics Engineers Inc., 2020) Subramanian, N.; Mitra, S.; Martin, J.P.; Chandrasekaran, K.
    With the proliferation of Fog computing, computation is moved to edge devices and is not based on a purely centralized approach. In a Fog computing network, the topology is dynamic: nodes continually join and leave. One of the major issues in Fog computing is trust. Trust is the level of assurance that an object will behave in a satisfactory manner. The Routing Protocol for Low Power and Lossy Networks (RPL) is a protocol used for routing in IoT networks. RPL provides meager protection against routing or other forms of attacks. The resource-constrained nature of Fog nodes prevents the use of heavyweight cryptographic algorithms to achieve secured communication. A lightweight mechanism is thus essential to impart security in Fog-IoT networks. Trust analysis provides a behavior-based analysis of entities in the system with the power to predict future behavior. In this paper, a lightweight Recommendation-based Trust Mechanism is proposed to impart security to RPL. © 2020 IEEE.
    Location Privacy Using Data Obfuscation in Fog Computing
    (Institute of Electrical and Electronics Engineers Inc., 2019) Naik, C.; Sri Siddhartha, M.; Martin, J.P.; Chandrasekaran, K.
    In the past few decades, smartphones and Global Positioning System (GPS) devices have led to the popularity of Location Based Services. Large companies rely on collecting large volumes of data from users in order to tailor their services accordingly. On the other side, however, concern for privacy has also increased among users, who would like to hide their whereabouts. The rise of data consumption and the hunger for faster network speeds have also led to the emergence of new concepts such as Fog Computing. The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing towards the edge of the network, removing load from the server centers and decreasing latency at the edge devices. Fog computing will support the continued growth of location services, and its adoption calls for more secure and robust algorithms for location privacy. One of the ways we can alter the information regarding the location of the user is Location Obfuscation. This can be done reversibly or irreversibly. In this paper, we address the problem of location privacy and present a solution based on the type of data that has to be preserved (in our case, it is distance). A mobile application has been designed and developed to test and validate the feasibility of the proposed obfuscation techniques for Fog computing environments. © 2019 IEEE.
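    As a minimal illustration of reversible, distance-preserving obfuscation (the paper's own techniques are not reproduced here; the angle and offsets below are invented), a rigid rotation plus translation hides absolute coordinates while leaving every pairwise distance intact:

    ```python
    import math

    def obfuscate(points, theta, dx, dy):
        """Rigidly transform (x, y) points: rotate by theta, then shift by (dx, dy).

        Pairwise distances are preserved, so distance-based services keep working
        on the obfuscated coordinates; knowing (theta, dx, dy) makes it reversible.
        """
        c, s = math.cos(theta), math.sin(theta)
        return [(x * c - y * s + dx, x * s + y * c + dy) for x, y in points]

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    pts = [(0.0, 0.0), (3.0, 4.0)]           # true locations (invented)
    obf = obfuscate(pts, theta=1.1, dx=52.0, dy=-7.0)
    # The distance between the two points is the same before and after
    # obfuscation, but the published coordinates no longer reveal the user.
    ```

    An irreversible variant would add random noise on top of the rigid transform, trading service accuracy for stronger privacy.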
    Machine Learning Approaches for Resource Allocation in the Cloud: Critical Reflections
    (Institute of Electrical and Electronics Engineers Inc., 2018) Murali, A.; Das, N.N.; Sukumaran, S.S.; Chandrasekaran, K.; Joseph, C.T.; Martin, J.P.
    Resource Allocation is the effective and efficient use of a Cloud's resources and is a very challenging problem in cloud environments. Many attempts have been made to make Resource Allocation automated and optimal in terms of profit. The best of these methods used Machine Learning, but this comes with a computational overhead, and much research has been done in this domain to find more efficient methods. Distributed Neural Networks (DNNs), currently a heavily researched area, promise to make computation over large-scale data faster and easier. This paper summarizes the major research works in these fields. A new taxonomy is proposed that can serve as a reference for future research in this domain. The paper also identifies areas that need more research in the foreseeable future. © 2018 IEEE.
    Machine Learning Powered Autoscaling for Blockchain-Based Fog Environments
    (Springer Science and Business Media Deutschland GmbH, 2022) Martin, J.P.; Joseph, C.T.; Chandrasekaran, K.; Kandasamy, A.
    Internet-of-Things devices generate huge amounts of data that need to be processed. Fog computing provides a decentralized infrastructure for processing these huge volumes of data. Fog computing environments provide a low-latency, location-aware alternative to conventional cloud computing by placing the processing nodes closer to the end devices. Coordination among end devices can become cumbersome and complex with the increasing number of IoT devices. Some of the major challenges faced while executing services in the fog environment are resource provisioning for user services, service placement among the fog devices, and scaling of fog devices based on the current load on the network. Being a decentralized infrastructure, fog computing is vulnerable to external threats such as data theft. This work presents a blockchain-based fog framework for making autoscaling decisions with the use of machine learning techniques. Evaluation is done by performing a series of experiments that show how services are handled by the fog framework and how the autoscaling decisions are made. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
    Mobility aware autonomic approach for the migration of application modules in fog computing environment
    (Springer Science and Business Media Deutschland GmbH, 2020) Martin, J.P.; Kandasamy, A.; Chandrasekaran, K.
    The fog computing paradigm has emanated as a widespread computing technology to support the execution of Internet of Things applications. The paradigm introduces a distributed, hierarchical layer of nodes collaboratively working together as the Fog layer. User devices connected to Fog nodes are often non-stationary. The location-aware attribute of Fog computing deems it necessary to provide uninterrupted services to users, irrespective of their locations. Migration of user application modules among the Fog nodes is an efficient solution to tackle this issue. In this paper, an autonomic framework, MAMF, is proposed to perform migrations of containers running user modules while satisfying the Quality of Service requirements. The hybrid framework, employing MAPE loop concepts and a Genetic Algorithm, addresses the migration of containers in the Fog environment while ensuring application delivery deadlines. The approach uses the pre-determined value of user location for the next time instant to initiate the migration process. The framework was modelled and evaluated in the iFogSim toolkit. The re-allocation problem was also mathematically modelled as an Integer Linear Programming problem. Experimental results indicate that the approach offers an improvement in terms of network usage, execution cost and request execution delay over the existing approaches. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.
    Toward efficient autonomic management of clouds: A CDS-based hierarchical approach
    (Springer Verlag, 2018) Martin, J.P.; Kandasamy, A.; Chandrasekaran, K.
    Cloud computing is one of the most sought-after technologies today. Beyond a shadow of a doubt, the number of clients opting for the Cloud is increasing, which increases the complexity of managing the Cloud computing environment. In order to serve the demands of customers, Cloud providers are resorting to more resources. Relying on a single managing element to coordinate the entire pool of resources is no longer an efficient solution. Therefore, we propose to use a hierarchical approach for autonomic management. The problem that we consider here is to determine the nodes at which we have to place the Autonomic Managers (AMs), in order to ease the management process and minimize the cost of communication between the AMs. We propose a graph-theory-based model using a Connected Dominating Set (CDS) that allows us to determine an effective placement of AMs in different Data Centers (DCs) and their collaboration with the Global Manager (GM). The approach considers the construction of domination sets and then distributes the control of the dominees among the dominators. © 2018, Springer Nature Singapore Pte Ltd.
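    The core graph-theoretic ingredient, a connected dominating set, admits a simple textbook construction that is worth sketching (this is an illustrative sketch, not the paper's placement model; the toy topology below is invented): the internal nodes of any spanning tree of a connected graph dominate every node and remain connected to each other.

    ```python
    from collections import deque

    def cds_from_bfs_tree(adj, root):
        """Return a connected dominating set: the internal (non-leaf) nodes of
        a BFS spanning tree. Every leaf is dominated by its tree parent, and
        the internal nodes stay connected through the tree edges."""
        parent = {root: None}
        queue = deque([root])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent:
                    parent[v] = u
                    queue.append(v)
        return {p for p in parent.values() if p is not None}

    # Toy topology (invented): the AMs would be placed on the CDS nodes,
    # which then coordinate the remaining (dominated) nodes.
    adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
    cds = cds_from_bfs_tree(adj, 0)
    ```

    Placing the Autonomic Managers on such a set guarantees every other node is one hop from a manager, while the managers can communicate among themselves without relying on non-manager nodes.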
    Unraveling the challenges for the application of fog computing in different realms: A multifaceted study
    (Springer Verlag, 2019) Martin, J.P.; Kandasamy, A.; Chandrasekaran, K.
    Fog computing is an emerging paradigm that deals with distributing data and computation at intermediate layers between the cloud and the edge. Cloud computing was introduced to support the increasing computing requirements of users. Later, it was observed that end users experienced a delay involved in uploading the large amounts of data to the cloud for processing. Such a seemingly centralized approach did not provide a good user experience. To overcome this limitation, processing capability was incorporated in devices at the edge. This led to the rise of edge computing. This paradigm suffered because edge devices had limited capability in terms of computing resources and storage requirements. Relying on these edge devices alone was not sufficient. Thus, a paradigm was needed without the delay in uploading to the cloud and without the resource availability constraints. This is where fog computing came into existence. This abstract paradigm involves the establishment of fog nodes at different levels between the edge and the cloud. Fog nodes can be different entities, such as personal computers (PCs). There are different realms where fog computing may be applied, such as vehicular networks and the Internet of Things. In all realms, resource management decisions will vary based on the environmental conditions. This chapter attempts to classify the various approaches for managing resources in the fog environment based on their application realm, and to identify future research directions. © Springer Nature Singapore Pte Ltd. 2019.
    Virtual machine migration—a perspective study
    (Springer Verlag, 2018) Joseph, C.; Martin, J.P.; Chandrasekaran, K.; Kandasamy, A.
    The technology of Cloud computing has been ruling the IT world for the past few decades. One of the most notable tools that helped in prolonging the reign of Cloud computing is virtualization. While virtualization continues to be a boon for Cloud technology, it is not without its own pitfalls. One such pitfall results from the migration of virtual machines. Though migration incurs an overhead on the system, an efficient system cannot neglect migrating the virtual machines. This work attempts to carry out a perspective study on virtual machine migration. The various migration techniques proposed in the literature have been classified based on the aspects of migration that they consider. A survey of the various metrics that characterize the performance of a migration technique is also presented. © 2018, Springer Nature Singapore Pte Ltd.

Maintained by Central Library NITK | DSpace software copyright © 2002-2026 LYRASIS
