Faculty Publications
Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736
Publications by NITK Faculty
18 results
Search Results
Item: Development of wavelet transform based numeric relay for differential protection of power transformer (2003). Vittal, K.P.; Gaonakar, D.N.; Fakruddin, D.B.

This paper proposes a wavelet transform (WT) based algorithm for the digital differential protection of power transformers. It is shown that the wavelet transform has a distinct feature-extraction property, owing to its ability to extract information from transient signals simultaneously in both time and frequency. A decision logic has been devised using features extracted from the differential current to distinguish an internal fault from inrush/over-excitation. The proposed algorithm is evaluated using simulated inrush, over-excitation, and internal fault current signals. For this purpose, a transient behavioral model of a power transformer has been developed in MATLAB. Results of the evaluation study show that the proposed WT-based differential protection scheme can overcome the problem of false tripping due to inrush/over-excitation.

Item: NeuralDoc-Automating Code Translation Using Machine Learning (Springer Science and Business Media Deutschland GmbH, 2022). Sree Harsha, S.; Sohoni, A.C.; Chandrasekaran, K.

Source code documentation is the process of writing concise, natural-language descriptions of how source code behaves at run time. In this work, we propose a novel approach called NeuralDoc for automating source code documentation using machine learning techniques. We model automatic code documentation as a language translation task: the source code serves as the input sequence, which the model translates into natural-language sentences describing the functionality of the program. The model we use is the Transformer, which leverages self-attention and multi-headed attention to capture long-range dependencies effectively and has been shown to perform well on a range of natural language processing tasks.
We integrate the copy attention mechanism and incorporate BERT, a pre-training technique, into the basic Transformer architecture to create a novel approach for automating code documentation. We build an intuitive interface for users to interact with our models and deploy our system as a web application. We carry out experiments on two datasets, consisting of Java and Python source programs and their documentation, to demonstrate the effectiveness of the proposed method. © 2022, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

Item: An Improved Transformer Transducer Architecture for Hindi-English Code Switched Speech Recognition (International Speech Communication Association, 2022). Antony, A.; Kota, S.R.; Lade, A.; Spoorthy, V.; Koolagudi, S.G.

Due to the extensive use of technology across many languages throughout the world, interest in Automatic Speech Recognition (ASR) systems for Code-Switching (CS) in speech has grown in recent years. Several studies have shown that End-to-End (E2E) ASR is easier to adopt and works much better in monolingual settings. E2E systems are also widely recognised for requiring massive quantities of labelled speech data. Since large amounts of CS speech are scarce, E2E ASR takes longer to compute and does not offer promising results. In this work, an E2E ASR system using a transformer-transducer architecture is introduced for code-switched Hindi-English speech, and training-data scarcity is addressed by leveraging the vastly available monolingual data. Specifically, the language-specific modules in the Transformer are pre-trained on widely available single-language speech datasets. The proposed method achieves a Word Error Rate (WER) of 29.63% and a Transliterated Word Error Rate (T-WER) of 27.42%, which improves on the state of the art by 2.19%.
© 2022 ISCA.

Item: Design of a New Single-Phase 15-Level Inverter with Minimized Components (Institute of Electrical and Electronics Engineers Inc., 2023). Nageswar Rao, B.N.; Yellasiri, Y.; Shiva Naik, B.S.; Aditya, K.; Karunakaran, E.; Kumar, M.V.

Multilevel inverters (MLIs) present a number of challenges, the most significant of which is the requirement for a large number of power semiconductors and separate dc supplies to assimilate renewable energy into a grid successfully. Because of this, reducing the number of components used in these inverters is quite important. Because transformer-based multilevel inverters (TBMIs) have become more commonplace, the use of many dc supplies in the cascaded inverter is no longer necessary. Based on the outcomes of this study, a new transformer-based MLI with fifteen levels (15L) and eight switches can be built with only one dc source. The suggested MLI consists of three isolated transformers. The suggested MLI structure has several unique benefits, including the use of fewer switching components and the availability of self-galvanic isolation. MATLAB simulations are carried out to evaluate the effectiveness of the suggested TBMLI. In addition, a comparison of the suggested structure with other recent configurations is presented. © 2023 IEEE.

Item: Face Detection and Recognition Using OpenCV and Vision Transformer (Institute of Electrical and Electronics Engineers Inc., 2023). Kumar, K.; Pingale, N.; Rudra, B.

Face recognition technology is vital in the real world, with diverse applications. It is primarily used in security, law enforcement, personalization, healthcare, and education. Face recognition systems use biometric features such as facial landmarks, texture, and shape to identify and verify individuals. The suggested approach employs a transformer-based architecture that relies solely on self-attention and does not use convolutional layers.
This design choice enables the model to be trained efficiently with minimal computational power and fewer parameters than a CNN. The application of the Vision Transformer (ViT) to various computer vision tasks has been highly successful, making it a state-of-the-art approach. Given its superior performance, we are interested in exploring whether ViT can enhance the accuracy of face recognition. In this paper, we show that ViT can be a useful technique for facial recognition. Since there was no predefined dataset for face recognition, a PCI dataset was built for this investigation. Along with the PCI dataset, two more well-known datasets, AT&T and 5-Celebrity, were used to examine performance. Our model showed that ViT could identify human faces on the PCI dataset with a 99% accuracy rate and perform much better than other face recognition algorithms such as Eigenface, FisherFace, and LBPH. © 2023 IEEE.

Item: Complex Aware Transformer-CNN for Refractive Index Prediction in Plasmonic Waveguide (Institute of Electrical and Electronics Engineers Inc., 2025). Chaurasia, A.R.; Marwade, V.; Singh, M.

Estimating the effective refractive index of a plasmonic waveguide with high precision is essential for various photonic applications. Traditional analytical and numerical methods often involve extensive computation. Deep learning-based approaches have shown promise in improving both accuracy and efficiency. This paper presents a deep learning-based approach for effective refractive index estimation using a hybrid Complex Aware Transformer-Convolutional Neural Network (CAT-CNN) model, which combines convolutional feature extraction, transformer-based attention mechanisms, and squeeze-and-excitation blocks to improve predictive accuracy.
Trained on a dataset of plasmonic waveguide parameters at a fixed frequency of 193.2 THz, the model achieves a combined testing R² score of 0.99978, demonstrating high precision in predicting the real and imaginary parts of the effective refractive index. Our results demonstrate that CAT-CNN achieves state-of-the-art performance in terms of prediction accuracy and computational efficiency. The proposed model has significant implications for the design of high-performance plasmonic sensors and integrated photonic devices. © 2025 IEEE.

Item: A novel single source multilevel inverter with hybrid switching technique (John Wiley and Sons Ltd, 2022). Nageswar Rao, B.; Yellasiri, Y.; Shiva Naik, B.; Venkataramanaiah, J.; Aditya, K.; Panda, A.

A novel multilevel inverter (MLI) configuration with a hybrid switching technique is presented in this paper. The proposed MLI consists of an H-bridge combination with unidirectional switches, half-bridges, and transformers. With an additional cascaded connection, the suggested MLI scales to higher voltage levels. The number of components employed in this topology is drastically minimized; therefore, the complexity, cost, and volume of the proposed topology are also reduced. The operation of the suggested topology is tested with the improved novel switching technique. This modulation method reduces the total harmonic distortion (THD) and produces a high root-mean-square (RMS) voltage. Further, a comprehensive comparison with recent MLI topologies is performed to validate the merits of the suggested inverter. Simulation and experimental results verify the performance of the suggested topology using the new modulation technique at different loading conditions and modulation indices.
© 2021 John Wiley & Sons, Ltd.

Item: Hindi fake news detection using transformer ensembles (Elsevier Ltd, 2023). Praseed, A.; Rodrigues, J.; Santhi Thilagam, P.S.

In the past few decades, owing to the growth of social networking sites such as WhatsApp and Facebook, information has been distributed at a level never seen before. Knowing the integrity of information has been a long-standing problem, even more so for regional languages. Regional languages such as Hindi raise challenging problems for fake news detection, as they tend to be resource constrained. This limits the amount of data available to train models for these languages efficiently. Most existing techniques for detecting fake news are targeted at the English language, or involve manually translating the text to English and then proceeding with deep learning methods. Pre-trained transformer-based models such as BERT are commonly fine-tuned for the task of fake news detection. Other pre-trained transformer models, such as ELECTRA and RoBERTa, have also been shown to detect fake news in multiple languages after suitable fine-tuning. In this work, we propose a method for detecting fake news in resource-constrained languages such as Hindi more efficiently, using an ensemble of pre-trained transformer models, each individually fine-tuned for the task of fake news detection. We demonstrate that such a transformer ensemble, consisting of XLM-RoBERTa, mBERT, and ELECTRA, improves the efficiency of fake news detection in Hindi by overcoming the drawbacks of the individual transformer models.
© 2022 Elsevier Ltd.

Item: A novel nine-level inverter with reduced component count using common leg configuration (Springer Science and Business Media Deutschland GmbH, 2023). Nageswar Rao, B.; Yellasiri, Y.; Shiva Naik, B.S.; Aditya, K.

This article proposes a nine-level (9L) inverter with a common leg configuration employing transformers and a single dc source. The suggested inverter uses eight switches and two transformers to produce a 9L output voltage. The suggested circuit minimizes the number of switches and transformers compared with existing transformer-based multilevel inverters (TMLIs); therefore, the cost, volume, and complexity of the proposed circuit are also reduced. Additionally, a thorough comparison with various 9L inverter circuits is conducted to confirm the benefits of the suggested TMLI. A basic logic gate-based pulse width modulation (PWM) scheme is implemented for the suggested 9L inverter. Simulation and hardware studies verifying the feasibility and proficiency of the suggested inverter are performed. © 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.

Item: A new single-phase multilevel inverter with improved modulation technique (John Wiley and Sons Ltd, 2023). Nageswar Rao, B.; Yellasiri, Y.; Shiva Naik, B.; Aditya, K.; K Panda, A.

This article proposes a seventeen-level (17L) inverter with a common leg configuration and an improved modulation technique. The proposed inverter uses only 10 switches, one toroidal-core transformer, and one dc source; therefore, the proposed design offers lower control complexity with reduced cost and volume. Additionally, the suggested modulation technique improves the load voltage quality by minimizing the harmonic content. Simulation and laboratory studies are performed to confirm the proficiency of the suggested inverter with the new modulation technique.
Further, a thorough comparison with recent transformer-based circuits is carried out to highlight the benefits of the proposed structure. © 2023 John Wiley & Sons Ltd.
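Several of the inverter items above produce staircase multilevel output waveforms. As an illustrative aside (not taken from any of the papers, whose improved modulation techniques are not detailed in the abstracts), nearest-level control, a common baseline modulation for such inverters, can be sketched as follows; the sample count and level count here are arbitrary choices for the example.

```python
import math

def nearest_level(v_ref: float, levels: int = 17) -> int:
    """Quantize a normalized reference voltage (-1..1) to the nearest step."""
    steps = (levels - 1) // 2  # e.g. 8 positive steps for a 17-level inverter
    return round(v_ref * steps)

def staircase(samples: int = 64, levels: int = 17) -> list:
    """One fundamental cycle of a nearest-level staircase approximating a sine."""
    return [nearest_level(math.sin(2 * math.pi * n / samples), levels)
            for n in range(samples)]
```

With 64 samples per cycle, the resulting waveform visits all 17 levels from -8 to +8, which is the staircase that the inverter's switching pattern would synthesize.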

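The transformer-transducer item above reports results as Word Error Rate (WER). As a minimal illustration of the metric itself (standard word-level edit distance, not code from any of the papers), WER can be computed as:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i                      # deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j                      # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(substitution,
                           dp[i - 1][j] + 1,   # deletion
                           dp[i][j - 1] + 1)   # insertion
    return dp[-1][-1] / len(ref)
```

The Transliterated WER (T-WER) mentioned in the same abstract applies the same computation after mapping both strings into a common script, which avoids penalizing correctly recognized words written in the other script.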