Conference Papers
Permanent URI for this collection: https://idr.nitk.ac.in/handle/123456789/28506
4 results
Item: Scale independent raga identification using chromagram patterns and swara based features (2013)
Dighe, P.; Agrawal, P.; Karnick, H.; Thota, S.; Raj, B.
In Indian classical music, a raga describes the constituent structure of notes in a musical piece. In this work, we investigate the problem of scale-independent automatic raga identification, achieving state-of-the-art results with GMM-based hidden Markov models over a collection of features consisting of chromagram patterns, mel-cepstral coefficients, and timbre features. We also perform the above task using (1) discrete HMMs and (2) classification trees over swara-based features created from chromagrams using the concept of the vadi of a raga. On a dataset of four ragas (darbari, khamaj, malhar, and sohini), we have achieved an average accuracy of ~97%. This improves on previous works, which rely on knowledge of the scale used in the raga performance. We believe that with a more careful selection of features, and by fusing results from multiple classifiers, we should be able to improve results further.
© 2013 IEEE.

Item: Raga classification for Carnatic music (Springer Verlag, 2015)
Suma, S.M.; Koolagudi, S.G.
In this work, an effort has been made to identify the raga of a given piece of Carnatic music. In the proposed method, direct raga classification, without the use of a note sequence, is performed using pitch as the primary feature. Primitive features extracted from the probability density function (pdf) of the pitch contour are used for classification; a 36-dimensional feature vector is obtained by extracting parameters from the pdf. Since the features extracted from the signal are non-sequential, an artificial neural network (ANN) is used as the classifier. The database used to validate the system consists of 162 songs from 12 ragas. The average classification accuracy is found to be 89.5%.
© Springer India 2015.

Item: Rhythm and timbre analysis for Carnatic music processing (Springer Science and Business Media Deutschland GmbH, 2016)
Heshi, R.; Suma, S.M.; Koolagudi, S.G.; Bhandari, S.; Sreenivasa Rao, K.S.
In this work, an effort has been made to analyze rhythm- and timbre-related features to identify the raga and tala of a piece of Carnatic music. Raga and tala classification is performed using both kinds of features: rhythm patterns and the rhythm histogram serve as rhythm features, while zero-crossing rate (ZCR), spectral centroid, spectral roll-off, flux, and entropy serve as timbre features. The music clips contain both instrumentals and vocals. The t-test is used as a similarity measure between feature vectors, and classification is done using Gaussian mixture models (GMMs). The results show that rhythm patterns are able to distinguish different ragas and talas with average accuracies of 89.98% and 86.67%, respectively.
© Springer India 2016.

Item: Note Transcription from Carnatic Music (Springer, 2020)
Suma, S.M.; Koolagudi, S.G.; Ramteke, P.B.; Sreenivasa Rao, K.S.
In this work, an effort has been made to identify the note sequences of different ragas of Carnatic music. The proposed heuristic method uses standard just-intonation frequency ratios between notes for basic transcription of a music piece into a written sequence of notes. The notes present in a given piece of music are obtained using pitch histograms. The normalized pitch contour of the piece is segmented based on detection of note boundaries, and these segments are labeled using the note information already available. Without prior knowledge of the raga, 30 out of 64 sequences are identified accurately and an additional 18 sequences are identified with one note error. With prior raga knowledge, 76.56% accuracy is observed in note-sequence identification.
© 2020, Springer Nature Singapore Pte Ltd.
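The just-intonation labeling idea in the last item can be sketched as follows. This is a minimal illustration, not the authors' code: the 12-swara ratio table and the function name are assumptions chosen for the example, and the comparison is done in cents after folding the pitch into one octave of the tonic.

```python
import math

# Just-intonation frequency ratios relative to the tonic (Sa); a common
# 12-swara table -- an illustrative assumption, not the paper's exact table.
JI_RATIOS = {
    "Sa": 1/1, "Ri1": 16/15, "Ri2": 9/8, "Ga1": 6/5, "Ga2": 5/4,
    "Ma1": 4/3, "Ma2": 45/32, "Pa": 3/2, "Da1": 8/5, "Da2": 5/3,
    "Ni1": 16/9, "Ni2": 15/8,
}

def label_note(f0_hz, tonic_hz):
    """Map a pitch value to the nearest just-intonation swara.

    The pitch is folded into one octave above the tonic and compared,
    in cents, against each ratio in JI_RATIOS.
    """
    ratio = f0_hz / tonic_hz
    while ratio >= 2.0:   # fold down into [1, 2)
        ratio /= 2.0
    while ratio < 1.0:    # fold up into [1, 2)
        ratio *= 2.0
    cents = lambda r: 1200.0 * math.log2(r)
    return min(JI_RATIOS, key=lambda s: abs(cents(ratio) - cents(JI_RATIOS[s])))

# Example: with the tonic at 220 Hz, 330 Hz is a 3:2 ratio -> "Pa",
# and 440 Hz folds down an octave to the tonic -> "Sa".
print(label_note(330.0, 220.0))
print(label_note(440.0, 220.0))
```

Labeling per pitch segment like this still leaves boundary detection and octave tracking to the surrounding transcription pipeline, which is where the paper's pitch histograms and segmentation come in.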

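Both the chromagram patterns of the first item and the pitch-contour pdf features of the second rest on the same primitive: a tonic-normalized pitch-class distribution, which is what makes the features scale independent. A minimal sketch of that primitive, assuming a monophonic pitch track with unvoiced frames marked as 0 (the function name and 100-cent binning are illustrative assumptions):

```python
import numpy as np

def pitch_class_histogram(f0_hz, tonic_hz, bins=12):
    """12-bin, tonic-normalized pitch-class histogram from a pitch track.

    Folding pitch into cents relative to the tonic makes the feature
    scale independent: the histogram's shape depends only on the
    interval structure of the melody, not on the absolute tonic.
    """
    f0 = np.asarray(f0_hz, dtype=float)
    f0 = f0[f0 > 0]                                # drop unvoiced frames
    cents = 1200.0 * np.log2(f0 / tonic_hz)        # pitch relative to tonic
    folded = np.mod(cents, 1200.0)                 # fold into one octave
    hist, _ = np.histogram(folded, bins=bins, range=(0.0, 1200.0))
    return hist / max(hist.sum(), 1)               # normalize to a pdf

# A track alternating between the tonic and its perfect fifth puts all
# of its mass in bins 0 (Sa) and 7 (Pa).
track = [220.0] * 50 + [330.0] * 50 + [0.0] * 10   # 0.0 = unvoiced
print(pitch_class_histogram(track, tonic_hz=220.0).round(2))
```

A classifier such as the HMMs or ANN described above would then consume these per-frame or per-clip distributions (or a finer-grained pdf of the pitch contour) as its input features.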