Faculty Publications
Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736
Publications by NITK Faculty
19 results
Search Results
Item Civil Engineering for Multi-hazard Risk Reduction-An Introduction (Springer Science and Business Media Deutschland GmbH, 2024) Sreekeshava, K.S.; Kolathayar, S.; Vinod Chandra Menon, N.; Bhargavi, C.
The modern built environment faces diverse hazards, emphasizing the need for engineering practices that prioritize safety and resilience. This exploration delves into key aspects of civil engineering: Accessibility and Convenience, Geotechnical Engineering, Risk Analysis, and Structural Analysis. It aims to provide a foundational understanding of the multidisciplinary approaches used to mitigate risks in civil engineering. In the realm of Accessibility and Convenience, research explores alternative construction materials such as bamboo and innovative concrete formulations. Studies investigate the use of metakaolin, ground granulated blast-furnace slag, alkali-activated concrete, and coconut coir fibres to enhance durability and sustainability. Polyethylene glycol and chemical admixtures such as red mud and silica fume are also examined for their impact on concrete properties. Geotechnical Engineering focuses on subsurface characteristics crucial for safety assessments. Soft computing techniques, including the Group Method of Data Handling and Random Forest classifiers, are applied for slope stability analysis. Digital Image Correlation is employed to study soil displacement, while artificial intelligence models predict residual strength post-liquefaction. Risk Analysis approaches cover climate-smart agriculture, floodplain mapping, solid waste management, and disaster resilience. Machine learning aids in land use classification, flood forecasting, earthquake prediction, and identifying risk factors in road construction. The study also evaluates safety distances around gas and oil pipelines. Structural Analysis involves transient and modal analysis of structures under various loads.
Contributions include crack propagation studies using digital image segmentation and the application of deep convolutional neural networks for surface crack detection. Building surface crack detection, construction sequence analysis, and seismic studies on different building types are explored for structural integrity. The overarching theme underscores the interdisciplinary nature of civil engineering in addressing contemporary challenges, including climate change impacts, disaster resilience, sustainable materials, and advanced technologies such as IoT and AI. As civil engineering plays a pivotal role in developing hazard-resilient structures, the presented research contributes to the evolving landscape of risk reduction and safety enhancement in the built environment. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.

Item GPU implementation of non-local maximum likelihood estimation method for denoising magnetic resonance images (Springer Verlag, 2017) Upadhya, A.H.K.; Talawar, B.; Rajan, J.
Magnetic resonance imaging (MRI) is a widely deployed medical imaging technique used for applications such as neuroimaging, cardiovascular imaging, and musculoskeletal imaging. However, MR images degrade in quality due to noise. Magnitude MRI data in the presence of noise generally follow a Rician distribution if acquired with single-coil systems. Several methods have been proposed in the literature for denoising MR images corrupted with Rician noise; among them, the non-local maximum likelihood (NLML) method and its variants are popular. In spite of its denoising quality, the NLML algorithm suffers from a high time complexity of O(m³N³), where m³ and N³ represent the search window and image size, respectively, for a 3D image.
This makes the algorithm challenging to deploy in real-time applications where fast results are required. A viable solution to this shortcoming is a data-parallel processing framework such as Nvidia CUDA, which exploits the mutually independent and computationally intensive calculations to advantage. The GPU-based implementation of NLML-based image denoising achieves significant speedup compared to the serial implementation. This paper describes the first successful attempt to implement a GPU-accelerated version of the NLML algorithm. The main focus of the research was the parallelization and acceleration of one computationally intensive section of the algorithm, so as to demonstrate the execution-time improvement obtained by applying parallel processing concepts on a GPU. Our results suggest the possibility of practical deployment of NLML and its variants for MRI denoising. © 2016, Springer-Verlag Berlin Heidelberg.

Item Daily pan evaporation modeling in climatically contrasting zones with hybridization of wavelet transform and support vector machines (Springer Verlag, 2017) Pammar, L.; Deka, P.C.
The estimation of evaporation has long been studied by researchers for applications in hydrology and water resources management. Owing to the complexities associated with its estimation, research has employed both direct and indirect methods, and accurate estimation remains a thrust area of research in these fields. Pan evaporation estimation with the help of data-modeling techniques has provided better results in the recent past. Advances in data modeling have introduced several techniques that can best fit the data type and provide accurate estimations. The novel gamma test (GT) was used to decide the best input–output combination.
Parameter optimization was carried out by grid search. The developed models gave better estimations of pan evaporation but exhibited some limitations with nonlinearity and with sparse and noisy data. These limitations paved the way for data pre-processing techniques such as the wavelet transform. This study explores hybrid modeling using the discrete wavelet transform (DWT) and support vector regression (SVR) for pan evaporation estimation. Two stations representing contrasting climatic zones, namely Bajpe and Bangalore, located in the state of Karnataka, India, are selected for this study. The meteorological datasets recorded at these stations are analyzed using the gamma test and grid search to determine the best input–output combinations for the models. The modeled pan evaporation estimates are very promising toward the ever-demanding accuracy expected in the associated fields. © 2017, The International Society of Paddy and Water Environment Engineering and Springer Japan.

Item Elucidating the challenges for the praxis of fog computing: An aspect-based study (John Wiley and Sons Ltd, 2019) Martin, J.P.; Kandasamy, A.; Chandrasekaran, K.; Joseph, C.T.
Evolutionary advancements in technology have led to the rise of cloud computing. The Internet of Things paradigm stimulated the extensive use of sensors distributed across the network edges, and cloud datacenters are assigned the responsibility of processing the collected sensor data. Recently, fog computing was conceptualized as a solution to the resulting strain on limited network bandwidth. The fog acts as a complementary layer that interplays with the cloud and edge computing layers to process data streams. The fog paradigm, like any distributed paradigm, has its set of inherent challenges, and the fog environment necessitates the development of management platforms that orchestrate fog entities.
Owing to the plenitude of research efforts directed toward these issues in a relatively young field, there is a need to organize the different research works. In this study, we provide a compendious review of the research approaches in the domain, with special emphasis on approaches for orchestration, and propose a multilevel taxonomy to classify the existing research. The study also highlights the application realms of fog computing and delineates the open research challenges in the domain. © 2019 John Wiley & Sons, Ltd.

Item Hybrid wavelet packet machine learning approaches for drought modeling (Springer, 2020) Das, P.; Naganna, S.R.; Deka, P.C.; Pushparaj, J.
Among all natural disasters, drought has the most catastrophic impact on the surroundings and the environment. Gulbarga, one of the semi-arid districts of Karnataka state, India, receives about 700 mm of average annual rainfall and is drought-prone. In this study, drought forecasting for the district has been carried out for lead times of 1 month and 6 months. The multi-temporal Standardized Precipitation Index (SPI) has been used as the drought-quantifying parameter because it is computed from a single parameter, rainfall, and is easy to use. Fine-resolution daily gridded precipitation data (0.25° × 0.25°) procured from the Indian Meteorological Department (IMD) for 21 grid locations within the study area have been used for the analysis. Drought forecasting plays a significant role in drought preparedness and mitigation plans. With the advent of machine learning (ML) techniques over the past few decades, forecasting of hydrologic events has become easier and more accurate. However, the use of these techniques for drought forecasting remains relatively unexplored.
In this study, Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques have been employed to examine their accuracy in drought forecasting over shorter and longer lead times. Furthermore, two hybrid approaches have been formulated by coupling a data transformation method with each of the aforementioned ML approaches. At the outset, pre-processing of the input data (i.e., SPI) has been carried out using the Wavelet Packet Transform (WPT); the transformed series are then used as inputs to the ANN and SVR models to form hybrid WP-ANN and WP-SVR models. The performance of the hybrid models has been evaluated using statistical indices such as R² (coefficient of determination), RMSE (root mean square error), and MAE (mean absolute error). The results showed that the hybrid techniques have better forecast performance than the standalone machine learning approaches, with the hybrid WP-ANN model performing relatively better than the WP-SVR model for most grid locations. Also, the forecasting results deteriorated as the lead time increased from 1 to 6 months. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.

Item An improved sliding window prediction-based outlier detection and correction for volatile time-series (John Wiley and Sons Ltd, 2021) Ranjan, K.G.; Tripathy, D.S.; Prusty, B.R.; Jena, D.
Steady-state forecasting is indispensable for power system planning and operation. A forecasting model for inputs considering their historical record is a preliminary step for such studies. Since historical data quality is decisive in building an accurate forecasting model, data preprocessing is essential. Primarily, the quality of raw data is affected by the presence of outliers, and preprocessing here refers to outlier detection and correction. In this paper, an effort is made to improve the existing sliding window prediction-based preprocessing method. The recommended reforms are the calculation of an appropriate window width and a new outlier correction approach.
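The general sliding-window idea can be sketched as follows: predict each point from the preceding window and flag large residuals as outliers. This is a minimal generic illustration, not the authors' implementation; the window width, the mean predictor, and the 3-sigma threshold are all assumptions made here for the sketch.

```python
import statistics

def clean_series(values, width=5, k=3.0):
    """Sliding-window prediction-based outlier detection and correction.

    Each point is predicted as the mean of the preceding `width` points;
    a point whose residual exceeds `k` standard deviations of the window
    is flagged as an outlier and replaced by its prediction.
    """
    cleaned = list(values[:width])          # seed with the first window
    outliers = []
    for i in range(width, len(values)):
        window = cleaned[i - width:i]
        pred = statistics.fmean(window)
        spread = statistics.pstdev(window)
        if spread > 0 and abs(values[i] - pred) > k * spread:
            outliers.append(i)              # flag the outlier index
            cleaned.append(pred)            # correct by replacement
        else:
            cleaned.append(values[i])
    return cleaned, outliers
```

On a series such as `[10, 11, 10, 12, 11, 95, 12, 11]`, the spike at index 5 is detected and replaced by the window mean, while the remaining points pass through unchanged.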
The proposed method, denoted improved sliding window prediction-based preprocessing, is applied to historical data of PV generation, load power, and ambient temperature at different time-steps, collected from various places in the United States and India. First, the method's efficacy is demonstrated through a detailed result analysis showing that the proposed preprocessing performs better than its precursor and the k-nearest neighbor approach. Then, the improved out-of-sample forecasting accuracy confirms the proposed method's superior performance compared to both of the above techniques and to the case without preprocessing. © 2020 John Wiley & Sons Ltd

Item Performance enhancement of SVM model using discrete wavelet transform for daily streamflow forecasting (Springer Science and Business Media Deutschland GmbH, 2021) Kambalimath S, S.; Deka, P.C.
Streamflow modeling is a vital task in any hydrological study for improved planning and management of water resources. Soft computing and machine learning techniques are becoming increasingly popular for their predictive capability when limited input data are available. In the present study, the Support Vector Machine (SVM) technique is applied to forecast 1-day, 3-day, and 5-day ahead streamflow using daily streamflow time-series of the Khanapur, Cholachguda, and Navalgund gauging stations in the Malaprabha sub-basin located in the Karnataka state of India. Furthermore, the Discrete Wavelet Transform is used as a data pre-processing method to evaluate the performance enhancement of the SVM model, for which four different mother wavelet functions are used and tested separately: Haar, Daubechies, Coiflets, and Symlets. Models are evaluated using the coefficient of determination (R²), root-mean-square error, and Nash–Sutcliffe efficiency. The study indicates that the performance of the SVM model improves considerably when the wavelet method is coupled.
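The decomposition step used in such wavelet-coupled models can be illustrated with a single-level Haar transform, the simplest of the four mother wavelets mentioned above. This is a generic sketch of the transform itself, not the study's actual pre-processing pipeline.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Splits a signal of even length into approximation (low-frequency)
    and detail (high-frequency) coefficients.
    """
    pairs = list(zip(signal[::2], signal[1::2]))
    approx = [(a + b) / math.sqrt(2) for a, b in pairs]
    detail = [(a - b) / math.sqrt(2) for a, b in pairs]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level; reconstructs the original signal."""
    signal = []
    for a, d in zip(approx, detail):
        signal.append((a + d) / math.sqrt(2))
        signal.append((a - d) / math.sqrt(2))
    return signal
```

In hybrid models of this kind, the approximation and detail series (rather than the raw series) are typically fed as inputs to the machine learning model; the inverse transform confirms the decomposition is lossless.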
It is found that the R² values for the Khanapur station using SVM are 0.91, 0.66, and 0.46 for 1-day, 3-day, and 5-day lead-time forecasts, respectively. However, when the wavelet method is coupled with the SVM model, R² improves to 0.99, 0.73, and 0.68 for the same lead times. © 2021, The Author(s), under exclusive licence to Springer-Verlag GmbH, DE part of Springer Nature.

Item Adopting elitism-based Genetic Algorithm for minimizing multi-objective problems of IoT service placement in fog computing environment (Academic Press, 2021) Natesha, B.V.; Guddeti, R.M.R.
Fog computing is an emerging computing technology for handling and processing data from IoT devices. Devices such as routers, smart gateways, or micro-datacenters are used as fog nodes to host and serve IoT applications. However, the primary challenge in fog computing is to find suitable nodes to deploy and run IoT application services, as these devices are geographically distributed and have limited computational resources. In this paper, we design a two-level resource provisioning fog framework using Docker containers and formulate the service placement problem in the fog computing environment as a multi-objective optimization problem that minimizes service time, cost, and energy consumption, thus ensuring the QoS of IoT applications. We solve this multi-objective problem using an Elitism-based Genetic Algorithm (EGA). The proposed approach is evaluated on a fog computing testbed built with Docker containers on 1.4 GHz 64-bit quad-core processor devices. The experimental results demonstrate that the proposed method outperforms the other state-of-the-art service placement strategies considered for performance evaluation in terms of service cost, energy consumption, and service time.
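The elitism mechanism named above can be sketched on a toy placement problem: a chromosome assigns each service to a fog node, and the best few chromosomes survive each generation unchanged. Everything here (the scalarized cost matrix, operator choices, and rates) is an illustrative assumption, not the paper's EGA.

```python
import random

def elitist_ga(costs, n_services, n_nodes, pop_size=30, generations=60,
               elite=2, mut_rate=0.1, seed=42):
    """Elitism-based GA for a toy service-placement problem.

    A chromosome maps each service to a fog node; costs[s][n] is the
    scalarized (time + cost + energy) penalty of placing service s on
    node n. The `elite` best chromosomes survive unchanged each generation.
    """
    rng = random.Random(seed)

    def fitness(chrom):
        return sum(costs[s][n] for s, n in enumerate(chrom))

    pop = [[rng.randrange(n_nodes) for _ in range(n_services)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        next_pop = [c[:] for c in pop[:elite]]             # elitism
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(pop[:pop_size // 2], 2)    # parents from better half
            cut = rng.randrange(1, n_services)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            for s in range(n_services):                    # per-gene mutation
                if rng.random() < mut_rate:
                    child[s] = rng.randrange(n_nodes)
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=fitness)
```

Because the elite chromosomes are copied forward verbatim, the best solution found so far can never be lost to crossover or mutation, which is the defining property of elitism.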
© 2021 Elsevier Ltd

Item The utility of proper orthogonal decomposition for dimensionality reduction in understanding behavior of concrete (Techno-Press, 2021) Manoj, A.; Babu Narayan, K.S.B.
Properties of wet and set concrete are influenced by a wide range of variables. With new formulations being tried and adopted, understanding the workability, strength, and durability characteristics of these formulations is of utmost importance. Identifying the most vital variables, the interplay between them, and quantifying their influence, for judicious manipulation of mix proportioning, placement, compaction, and curing to achieve the targeted end results, can be vastly improved by employing state-of-the-art data handling tools. The Group Method of Data Handling (GMDH), a set of mathematical algorithms, has great potential in multi-variable data modeling, optimization, and pattern recognition. Proper Orthogonal Decomposition (POD), a related technique for systematic dimensionality reduction and pattern recognition, is of great importance in studying complex datasets. This paper presents the need for adopting GMDH techniques in concrete technology, with an account of trends in this direction, and illustrates POD's utility as a valid decision-making tool in dimensionality reduction and in projecting the behavior of concrete subjected to elevated temperature. © 2021 Techno-Press, Ltd.

Item Temperature-Dependent Conformational Evolution of SARS CoV-2 RNA Genome Using Network Analysis (American Chemical Society, 2021) Singh, O.; Venugopal, P.P.; Mathur, A.; Chakraborty, D.
Understanding the dynamics of the SARS CoV-2 RNA genome and its dependence on temperature is necessary to fight the current COVID-19 crisis. Computationally, the handling of large data is a major challenge in elucidating the structures of RNA.
This work presents network analysis as an important tool for following the conformational evolution and identifying the most dominant structures of the RNA genome at six different temperatures. It effectively distinguishes different communities of RNA structures exhibiting structural variation. It is found that at higher temperatures (348 K and above), 80% of the RNA structure is destroyed in both the SPC/E and mTIP3P water models. The thermal denaturation free energy change ΔG calculated for the long-lived structure at the higher temperatures of 348 and 363 K ranges from 2.58 to 2.78 kcal/mol for the SPC/E water model, which agrees well with the experimentally reported thermal denaturation free energy of 2.874 kcal/mol for SARS CoV-NP at normal pH. At higher temperatures, the stability of the RNA conformation is found to be due to the existence of non-native base pairs in the SPC/E water model. © 2021 American Chemical Society
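Denaturation free energies of this kind follow from the standard two-state relation ΔG = -RT ln K, where K is the unfolded-to-folded population ratio at a given temperature. The sketch below is a generic illustration of that relation only; the 98% folded fraction is an arbitrary example value, not a population taken from the paper.

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def delta_g_unfolding(frac_folded, temp_kelvin):
    """Two-state unfolding free energy: dG = -R*T*ln(K), K = [U]/[F].

    A positive dG means the folded state is favored at this temperature.
    """
    k_eq = (1.0 - frac_folded) / frac_folded
    return -R * temp_kelvin * math.log(k_eq)

# A structure that is 98% folded at 348 K (illustrative number only)
dg = delta_g_unfolding(0.98, 348.0)
```

With these example inputs the relation gives roughly 2.7 kcal/mol, i.e., the same order of magnitude as the 2.58-2.78 kcal/mol range quoted above; at a 50% folded population it gives exactly zero, the midpoint of a two-state melting transition.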
