Journal Articles
Permanent URI for this collection: https://idr.nitk.ac.in/handle/123456789/19884
25 results
Search Results
Item Semantic Segmentation of Remotely Sensed Images for Land-use and Land-cover Classification: A Comprehensive Review (Taylor and Francis Ltd., 2025) Putty, A.; Annappa, B.; Pariserum Perumal, S.
Remotely Sensed Images (RSI) based land-use and land-cover (LULC) mapping facilitates applications such as forest logging, biodiversity protection, and urban topographical kinetics. This process has gained more attention with the widespread availability of geospatial and remote sensing data. With recent advances in machine learning and the possibility of processing nearly real-time information on the computer, LULC mapping methods broadly fall into two categories: (i) framework-dependent algorithms, where mappings are done using the in-built algorithms in Geographical Information System (GIS) software, and (ii) framework-independent algorithms, which are mainly based on deep learning techniques. Both approaches have their unique advantages and challenges. Along with the working patterns and performances of these two methodologies, this comprehensive review thoroughly analyzes deep learning architectures catering to different technical capabilities such as feature extraction, boundary extraction, transformer-based mechanisms, attention mechanisms, pyramid pooling, and lightweight models. To fine-tune these semantic segmentation processes, current technical and domain challenges and insights into future directions for analysing RSIs of varying spatial and temporal resolutions are summarized. Cross-domain users with application-specific requirements can use this study to select appropriate LULC semantic segmentation models. © 2025 IETE.

Item Best resource recommendation for a stochastic process (International Information Institute Ltd. No. 509 Fujimi-Cho 6-64-3 Tachikawa City, Tokyo 190-0013, 2016) Thomas, L.; Manoj Kumar, M.V.; Annappa, B.
The aim of this study was to develop an Artificial Neural Network recommendation model for an online process using the complexity of load and the performance of the resources. The proposed model investigates resource performance using the stochastic gradient descent method and a probabilistic cost function for learning the ranking function. The test result on the CoSeLoG project is presented, with an accuracy of 72.856%. © 2016 International Information Institute.

Item Concept drifts detection and localisation in process mining (International Information Institute Ltd. No. 509 Fujimi-Cho 6-64-3 Tachikawa City, Tokyo 190-0013, 2016) Manoj Kumar, M.V.; Thomas, L.; Annappa, B.
Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, techniques in process mining assume that the process is a static entity. This is often not the case, due to the possibility of a phenomenon called concept drift. During execution, a process can experience concept drift and evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. This paper presents a method for detecting and localizing sudden concept drifts in the control-flow perspective of a process, using features extracted by processing the traces in the process log. © 2016 International Information Institute.

Item Social network pruning for building optimal social network: A user perspective (Elsevier B.V., 2017) Sumith, N.; Annappa, B.; Bhattacharya, S.
Social networks with millions of nodes and edges are difficult to visualize and understand. Therefore, approaches to simplify social networks are needed. This paper addresses the problem of pruning a social network while not only retaining but also improving its information propagation properties.
The paper presents an approach that examines the nodal attributes of a node and develops a criterion to retain a subset of nodes to form a pruned graph of the original social network. To validate the feasibility of the proposed approach for the information propagation process, it is evaluated on small-world properties such as average clustering coefficient, diameter, path length, connected components, and modularity. The pruned graph, when compared to the original social network, shows improvement in the small-world properties that are essential for information propagation. The results also give a significantly more refined picture of the social network than has previously been highlighted. The efficacy of the pruned graph is demonstrated in the information diffusion process under the Independent Cascade (IC) and Linear Threshold (LT) models on various seeding strategies. In all size ranges and across various seeding strategies, the proposed approach performs consistently well in the IC model and outperforms other approaches in the LT model. Although the paper discusses the problem in the context of information propagation for viral marketing, the pruned graph generated by the proposed approach is also suitable for any application where information propagation has to take place reasonably fast and effectively. © 2016 Elsevier B.V.

Item A holistic approach to influence maximization in social networks: STORIE (Elsevier Ltd, 2018) Sumith, N.; Annappa, B.; Bhattacharya, S.
Crowdsourcing techniques are used in social networks to propagate information at a faster pace through campaigns. One of the challenges of a crowdsourcing system is to recruit the right users to be part of successful campaigns. Fetching this right group of people, who influence a vast population to adopt information, is termed influence maximization. Concerns of scalability and effectiveness call for an effective and viable solution. This paper proposes a solution in three stages.
At the first stage, the large social network is pruned based on nodal properties to make the solution scalable. At the second stage, Outdegree Rank (OR) is proposed, and at the third stage, an Influence Estimation (IE) approach estimates user influence. This work amalgamates aspects of structure, heuristics, and user influence to form STORIE. The proposed approach is compared to standard heuristics on various experimental setups such as RNNDp, RNUDp, and TVM. The spread of information is observed for HEP, PHY, Twitter, Infectious, and YouTube data under the Independent Cascade model, and STORIE gives optimal results, with an increase of up to 50%. Although the paper discusses influence maximization, the proposed approach is also applicable to understanding the spread of epidemics, computer viruses, and rumors in the real world, and can be extended to detect anomalies in web and social networks. © 2017 Elsevier B.V.

Item Transcriptional processes: Models and inference (World Scientific Publishing Co. Pte Ltd, 2018) Shetty, K.S.; Annappa, B.
Many biochemical events involve multistep reactions. One of the most important biological processes involving a multistep reaction is the transcriptional process. Models for multistep reactions necessarily need multiple states, and it is a challenge to compute model parameters that best agree with experimental data. Therefore, the aim of this work is to design a multistep promoter model which accurately characterizes transcriptional bursting and is consistent with observed data. To address this issue, we develop a model for promoters with several OFF states and a single ON state using the Erlang distribution. To explore the combined effects of model and data, we combine a Monte Carlo extension of Expectation Maximization (MCEM) with the delay Stochastic Simulation Algorithm (DSSA) and call the resultant algorithm delay Bursty MCEM.
We apply this algorithm to time-series data of the endogenous mouse glutaminase promoter to validate the model assumptions and infer the kinetic parameters. Our results show that with multiple OFF states, we are able to infer and produce a model that is more consistent with experimental data. Our results also show that delay Bursty MCEM inference is more efficient. © 2018 World Scientific Publishing Europe Ltd.

Item Influence maximization in large social networks: Heuristics, models and parameters (Elsevier B.V., 2018) Sumith, N.; Annappa, B.; Bhattacharya, S.
Online social networks play a major role not only on the socio-psychological front but also in the economic aspect. The way a social network serves as a platform for information spread has attracted a wide range of applications. In recent years, many efforts have been directed at using this phenomenon of vast information spread via social networks in various applications, ranging from poll analysis and product marketing to identifying influential users. One such application that has gained research attention is the influence maximization problem, which aims to fetch the top influential users in a social network. The aim of this paper is to provide a comprehensive analysis of the state-of-the-art approaches to identifying influential users. In this review, we discuss various challenges and approaches to identifying influential users in online social networks. The review concludes with future research directions, helping researchers bring possible improvements to the existing body of work. © 2018 Elsevier B.V.

Item COVID-19: Automatic detection from X-ray images by utilizing deep learning methods (Elsevier Ltd, 2021) Nigam, B.; Nigam, A.; Jain, R.; Dodia, S.; Arora, N.; Annappa, B.
In recent months, a novel virus named Coronavirus has emerged to become a pandemic. The virus is spreading not only among humans but is also affecting animals.
The first case of Coronavirus was registered in the city of Wuhan, Hubei province of China, on 31 December 2019. Coronavirus-infected patients display symptoms very similar to pneumonia, and the virus attacks the respiratory organs of the body, causing difficulty in breathing. The disease is diagnosed using a Real-Time Reverse Transcriptase Polymerase Chain Reaction (RT-PCR) kit and requires laboratory time to confirm the presence of the virus. Due to insufficient availability of the kits, suspected patients cannot be treated in time, which in turn increases the chance of spreading the disease. To overcome this problem, radiologists observed the changes appearing in radiological images such as X-ray and CT scans. Using deep learning algorithms on a suspected patient's X-ray or Computed Tomography (CT) scan, a healthy person can be differentiated from a patient affected by Coronavirus. In this paper, popular deep learning architectures are used to develop Coronavirus diagnostic systems. The architectures used are VGG16, DenseNet121, Xception, NASNet, and EfficientNet. Multiclass classification is performed, with the classes being COVID-19 positive patients, normal patients, and an "other" class, which includes chest X-ray images of pneumonia, influenza, and other illnesses related to the chest region. The accuracies obtained for VGG16, DenseNet121, Xception, NASNet, and EfficientNet are 79.01%, 89.96%, 88.03%, 85.03%, and 93.48%, respectively. Deep learning with radiological images is needed in this critical situation, as it provides radiologists with a fast and accurate second opinion. These deep learning Coronavirus detection systems can also be useful in regions where expert physicians and well-equipped clinics are not easily accessible.
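As a small illustration of the multiclass evaluation behind the reported accuracies, the following sketch computes overall accuracy and a confusion matrix over the three classes. The label arrays, class names, and functions are hypothetical placeholders for illustration only, not data or code from the paper.

```python
# Illustrative only: a minimal three-class evaluation sketch
# (COVID-19 / normal / other). Labels below are hypothetical.

CLASSES = ["covid19", "normal", "other"]

def confusion_matrix(y_true, y_pred, classes=CLASSES):
    """Rows = true class, columns = predicted class."""
    index = {c: i for i, c in enumerate(classes)}
    matrix = [[0] * len(classes) for _ in classes]
    for t, p in zip(y_true, y_pred):
        matrix[index[t]][index[p]] += 1
    return matrix

def accuracy(y_true, y_pred):
    """Fraction of samples whose predicted class matches the true class."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# Hypothetical predictions from a classifier over six chest X-rays.
truth = ["covid19", "covid19", "normal", "normal", "other", "other"]
preds = ["covid19", "normal", "normal", "normal", "other", "covid19"]

print(accuracy(truth, preds))           # 4 of 6 correct
print(confusion_matrix(truth, preds))
```

Per-class accuracies like those reported per architecture would be read off the diagonal of such a matrix, row by row.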
© 2021 Elsevier Ltd

Item GPU-aware resource management in heterogeneous cloud data centers (Springer, 2021) Kulkarni, A.K.; Annappa, B.
The power of rapid scalability and easy maintainability of cloud services is driving many high-performance computing applications from company server racks into cloud data centers. With the evolution of Graphics Processing Units (GPUs), which are composed of extensive arrays of parallel single-instruction-multiple-data processors, GPUs are being considered as a platform for high-performance computing because of their high throughput. Many cloud providers have begun offering GPU-enabled services for users where GPUs are essential (for high computational power) to meet the desired Quality-of-Service. Virtual machine (VM) placement and load balancing of GPUs in virtualized environments like the cloud is still an evolving area of research, and it is of prime importance for achieving higher resource efficiency and saving energy. Current VM placement techniques do not consider the impact of VM workload type and GPU memory status on placement decisions. This paper discusses the current issues with the First Fit policy of virtual machine placement used in VMware Horizon and proposes a GPU-aware VM placement technique for GPU-enabled virtualized environments such as cloud data centers. Experiments conducted using synthetic workloads indicate reductions in energy consumption, in the search space of physical hosts, and in the makespan of the system. The paper also presents a summary of the current challenges of GPU resource management in virtualized environments and of specific issues in developing cloud applications targeting GPUs under the virtualization layer.
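The contrast between a GPU-blind First Fit policy and a GPU-memory-aware policy can be sketched as follows. The host and VM records, field names, and best-fit selection rule are illustrative assumptions, not the placement algorithm actually proposed in the paper.

```python
# Illustrative sketch: First Fit vs. a GPU-memory-aware placement policy.
# Host/VM records and the selection rule are hypothetical assumptions.

def first_fit(hosts, vm):
    """Place the VM on the first host with enough free CPU, ignoring GPUs."""
    for host in hosts:
        if host["free_cpu"] >= vm["cpu"]:
            return host["name"]
    return None

def gpu_aware_fit(hosts, vm):
    """Consider only hosts whose free GPU memory also fits the VM's demand,
    then prefer the candidate with the least leftover GPU memory (best fit),
    which shrinks the candidate set and packs GPUs more tightly."""
    candidates = [
        h for h in hosts
        if h["free_cpu"] >= vm["cpu"] and h["free_gpu_mem"] >= vm["gpu_mem"]
    ]
    if not candidates:
        return None
    best = min(candidates, key=lambda h: h["free_gpu_mem"] - vm["gpu_mem"])
    return best["name"]

hosts = [
    {"name": "h1", "free_cpu": 8, "free_gpu_mem": 2},   # CPU fits, GPU does not
    {"name": "h2", "free_cpu": 8, "free_gpu_mem": 16},
    {"name": "h3", "free_cpu": 8, "free_gpu_mem": 6},
]
vm = {"cpu": 4, "gpu_mem": 4}

print(first_fit(hosts, vm))      # h1: ignores the GPU shortfall
print(gpu_aware_fit(hosts, vm))  # h3: tightest host that still fits the GPU demand
```

The point of the sketch is the filtering step: a GPU-blind policy can strand a GPU-hungry VM on a host whose GPU memory is already exhausted.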
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Item A novel receptive field-regularized V-net and nodule classification network for lung nodule detection (John Wiley and Sons Inc, 2022) Dodia, S.; Annappa, B.; Mahesh, M.
Recent advancements in deep learning have achieved great success in building reliable computer-aided diagnosis (CAD) systems. In this work, a novel deep learning architecture, named receptive field-regularized V-Net (RFR V-Net), is proposed for detecting lung cancer nodules with reduced false positives (FPs). The method applies receptive field regularization to the convolution layers of the encoder block and the deconvolution layers of the decoder block in the V-Net model. Further, nodule classification is performed using a new combination of SqueezeNet and ResNet, named the nodule classification network (NCNet). Postprocessing image enhancement is performed on the 2D slices by increasing image intensity through the addition of pseudo-color or fluorescence contrast. The proposed RFR V-Net achieved a Dice similarity coefficient of 95.01% and an intersection over union of 0.83. The proposed NCNet achieved a sensitivity of 98.38% and 2.3 FPs/scan for 3D representations, a considerable improvement over existing CAD systems. © 2021 Wiley Periodicals LLC.
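The two segmentation metrics reported for RFR V-Net, the Dice similarity coefficient and intersection over union (IoU), have standard definitions over binary masks; a minimal sketch, using tiny hypothetical masks rather than data from the paper:

```python
# Dice similarity coefficient and intersection over union (IoU) for
# binary segmentation masks, represented as sets of foreground pixel
# indices. The masks below are tiny hypothetical examples.

def dice_coefficient(pred, truth):
    """Dice = 2|A ∩ B| / (|A| + |B|)."""
    if not pred and not truth:
        return 1.0
    return 2 * len(pred & truth) / (len(pred) + len(truth))

def iou(pred, truth):
    """IoU (Jaccard index) = |A ∩ B| / |A ∪ B|."""
    if not pred and not truth:
        return 1.0
    return len(pred & truth) / len(pred | truth)

# Foreground pixel indices of a predicted mask and a ground-truth mask.
predicted = {(0, 0), (0, 1), (1, 1)}
ground_truth = {(0, 1), (1, 1), (1, 2)}

print(dice_coefficient(predicted, ground_truth))  # 2*2 / (3+3) ≈ 0.667
print(iou(predicted, ground_truth))               # 2 / 4 = 0.5
```

Dice weights the overlap twice relative to the mask sizes, so for the same prediction it is always at least as large as IoU, which matches the 95.01% Dice vs. 0.83 IoU pairing reported above.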
