Conference Papers
Permanent URI for this collection: https://idr.nitk.ac.in/handle/123456789/28506
6 results
Item: Development of wavelet transform based numeric relay for differential protection of power transformer (2003). Vittal, K.P.; Gaonakar, D.N.; Fakruddin, D.B.
This paper proposes a wavelet transform (WT)-based algorithm for digital differential protection of transformers. It is shown that the wavelet transform has a more distinct feature-extraction property, owing to its ability to extract information from transient signals simultaneously in both time and frequency. A decision logic has been devised using features extracted from the differential current, so as to distinguish an internal fault from inrush/over-excitation. The proposed algorithm is evaluated using simulated inrush, over-excitation, and internal-fault current signals. For this purpose, a transient behavioral model of a power transformer has been developed in MATLAB. Results of the evaluation study show that the proposed WT-based differential protection scheme can overcome the problem of false tripping due to inrush/over-excitation.

Item: NeuralDoc-Automating Code Translation Using Machine Learning (Springer Science and Business Media Deutschland GmbH, 2022). Sree Harsha, S.; Sohoni, A.C.; Chandrasekaran, K.
Source code documentation is the process of writing concise, natural-language descriptions of how source code behaves at run time. In this work, we propose a novel approach called NeuralDoc for automating source code documentation using machine learning techniques. We model automatic code documentation as a language translation task, where the source code serves as the input sequence, which is translated by the machine learning model into natural-language sentences depicting the functionality of the program. The machine learning model we use is the Transformer, which leverages self-attention and multi-headed attention to effectively capture long-range dependencies and has been shown to perform well on a range of natural language processing tasks.
We integrate the copy attention mechanism and incorporate BERT, a pre-training technique, into the basic Transformer architecture to create a novel approach for automating code documentation. We build an intuitive interface for users to interact with our models and deploy our system as a web application. We carry out experiments on two datasets consisting of Java and Python source programs and their documentation to demonstrate the effectiveness of our proposed method. © 2022, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

Item: An Improved Transformer Transducer Architecture for Hindi-English Code Switched Speech Recognition (International Speech Communication Association, 2022). Antony, A.; Kota, S.R.; Lade, A.; Spoorthy, V.; Koolagudi, S.G.
Due to the extensive usage of technology across many languages throughout the world, interest in Automatic Speech Recognition (ASR) systems for Code-Switching (CS) in speech has grown in recent years. Several studies have shown that End-to-End (E2E) ASR is easier to adopt and works much better in monolingual settings. E2E systems are likewise widely recognised for requiring massive quantities of labelled speech data. Since large amounts of CS speech are scarce, E2E ASR takes longer to compute and does not offer promising results. In this work, an E2E ASR system using a transformer-transducer architecture is introduced for code-switched Hindi-English speech, and training-data scarcity is addressed by leveraging the vastly available monolingual data. Specifically, the language-specific modules in the Transformer are pre-trained on widely available single-language speech datasets. The proposed method achieves a Word Error Rate (WER) of 29.63% and a Transliterated Word Error Rate (T-WER) of 27.42%, improving on the state of the art by 2.19%.
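The WER and T-WER figures reported above are word-level edit distances normalized by the length of the reference transcript. As an illustrative sketch only (not the authors' code), the standard WER computation can be written as:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table: d[i][j] = edit distance between ref[:i] and hyp[:j].
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,       # deletion
                          d[i][j - 1] + 1,       # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

T-WER follows the same formula after transliterating both transcripts into a common script, so that a Hindi word recognised in Latin script is not penalised as an error.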
© 2022 ISCA.

Item: Design of a New Single-Phase 15-Level Inverter with Minimized Components (Institute of Electrical and Electronics Engineers Inc., 2023). Nageswar Rao, B.N.; Yellasiri, Y.; Shiva Naik, B.S.; Aditya, K.; Karunakaran, E.; Kumar, M.V.
Multilevel inverters (MLIs) present a number of challenges, the most significant of which is the requirement for a large number of power semiconductors and separate dc supplies to integrate renewable energy into a grid successfully. Because of this, reducing the number of components used in these kinds of inverters is quite important. Because transformer-based multilevel inverters (TBMIs) have become more commonplace, many dc supplies are no longer necessary for the cascaded inverter to function. Based on the outcomes of this study, a new transformer-based MLI with fifteen levels (15L) and eight switches can be built with only one dc source. The suggested MLI consists of three isolated transformers. The suggested structure has several unique benefits, including fewer switching components and inherent galvanic isolation. MATLAB simulations are carried out to evaluate the effectiveness of the suggested TBMLI. In addition, a comparison of the suggested structure to other recent configurations is presented. © 2023 IEEE.

Item: Face Detection and Recognition Using OpenCV and Vision Transformer (Institute of Electrical and Electronics Engineers Inc., 2023). Kumar, K.; Pingale, N.; Rudra, B.
Face recognition technology is vital in the real world, with diverse applications. It is primarily used for security, law enforcement, personalization, healthcare, and education. Face recognition systems use biometric features such as facial landmarks, texture, and shape to identify and verify individuals. The suggested approach employs a transformer-based architecture that relies solely on self-attention and does not use convolutional layers.
This design choice enables the model to be trained efficiently with minimal computational power and fewer parameters than a CNN. The application of the Vision Transformer (ViT) to various computer vision tasks has been highly successful, making it a state-of-the-art approach. Given its superior performance, we are interested in exploring whether ViT can enhance the accuracy of face recognition. In this paper, we show that ViT can be a useful technique for facial recognition. Since there was no predefined dataset for face recognition, a PCI dataset was built for this investigation. Along with the PCI dataset, two more well-known datasets, AT&T and 5-Celebrity, were used to examine performance. In our experiments, ViT identified human faces on the PCI dataset with a 99% accuracy rate and performed much better than other face recognition algorithms such as Eigenface, FisherFace, and LBPH. © 2023 IEEE.

Item: Complex Aware Transformer-CNN for Refractive Index Prediction in Plasmonic Waveguide (Institute of Electrical and Electronics Engineers Inc., 2025). Chaurasia, A.R.; Marwade, V.; Singh, M.
Estimating the effective refractive index of a plasmonic waveguide with high precision is essential for various photonic applications. Traditional analytical and numerical methods often involve extensive computation. Deep learning-based approaches have shown promise in improving both accuracy and efficiency. This paper presents a deep learning-based approach to effective refractive index estimation using a hybrid Complex Aware Transformer-Convolutional Neural Network (CAT-CNN) model, which combines convolutional feature extraction, transformer-based attention mechanisms, and squeeze-and-excitation blocks to improve predictive accuracy.
Trained on a dataset of plasmonic waveguide parameters at a fixed frequency of 193.2 THz, the model achieves a combined testing R² score of 0.99978, demonstrating high precision in predicting the real and imaginary parts of the effective refractive index. Our results demonstrate that CAT-CNN achieves state-of-the-art performance in terms of prediction accuracy and computational efficiency. The proposed model has significant implications for the design of high-performance plasmonic sensors and integrated photonic devices. © 2025 IEEE.
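The squeeze-and-excitation blocks mentioned in the last abstract recalibrate channel responses: each feature channel is "squeezed" to a scalar by global average pooling, passed through a small bottleneck network, and the resulting sigmoid gate rescales that channel. A minimal pure-Python sketch of this idea (the weight shapes and names here are illustrative assumptions, not the CAT-CNN implementation):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def squeeze_excite(feature_maps, w1, w2):
    """feature_maps: list of C channels, each a flat list of activations.
    w1: C x C_r reduction weights; w2: C_r x C expansion weights."""
    # Squeeze: global average pool each channel to one descriptor.
    z = [sum(ch) / len(ch) for ch in feature_maps]
    c, c_r = len(z), len(w1[0])
    # Excite: bottleneck FC -> ReLU -> FC -> sigmoid gate per channel.
    h = [max(0.0, sum(z[i] * w1[i][k] for i in range(c))) for k in range(c_r)]
    s = [sigmoid(sum(h[k] * w2[k][i] for k in range(c_r))) for i in range(c)]
    # Scale: reweight each channel's activations by its gate.
    return [[a * s[i] for a in feature_maps[i]] for i in range(c)]
```

With zero expansion weights every gate is sigmoid(0) = 0.5, so all channels are uniformly halved; learned weights instead emphasise informative channels and suppress weak ones.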
