Conference Papers

Permanent URI for this collection: https://idr.nitk.ac.in/handle/123456789/28506

Now showing 1 - 2 of 2
  • Item
    Scale independent raga identification using chromagram patterns and swara based features
    (2013) Dighe, P.; Agrawal, P.; Karnick, H.; Thota, S.; Raj, B.
    In Indian classical music, a raga describes the constituent structure of notes in a musical piece. In this work, we investigate the problem of scale-independent automatic raga identification, achieving state-of-the-art results using GMM-based Hidden Markov Models over a collection of features consisting of chromagram patterns, mel-cepstrum coefficients, and timbre features. We also perform the above task using 1) discrete HMMs and 2) classification trees over swara-based features created from chromagrams using the concept of the vadi of a raga. On a dataset of 4 ragas (darbari, khamaj, malhar, and sohini), we achieved an average accuracy of ∼97%. This is a clear improvement over previous works, which rely on knowledge of the scale used in the raga performance. We believe that with a more careful selection of features, and by fusing results from multiple classifiers, the results can be improved further. © 2013 IEEE.
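The abstract does not give implementation details, but the two core ingredients it names, chromagram patterns and scale (tonic) independence, can be illustrated with a minimal numpy sketch. The code below folds STFT bin energies into 12 pitch classes per frame, and the `tonic_normalize` rotation is one common way to make chroma features tonic-relative; it is an assumption for illustration, not necessarily the authors' exact method.

```python
import numpy as np

def chromagram(signal, sr, frame_len=2048, hop=512, fmin=65.4):
    """Fold Hann-windowed STFT bin energies into 12 pitch classes per frame."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
    valid = freqs >= fmin  # ignore DC and sub-bass bins
    # nearest-semitone pitch class of each retained FFT bin (fmin taken as C)
    pc = (np.round(12 * np.log2(freqs[valid] / fmin)) % 12).astype(int)
    window = np.hanning(frame_len)
    chroma = np.zeros((12, n_frames))
    for t in range(n_frames):
        frame = signal[t * hop : t * hop + frame_len] * window
        mag = np.abs(np.fft.rfft(frame))[valid]
        for k in range(12):
            chroma[k, t] = mag[pc == k].sum()
    # normalize each frame to a pitch-class distribution
    return chroma / (chroma.sum(axis=0, keepdims=True) + 1e-12)

def tonic_normalize(chroma, tonic_pc):
    """Rotate rows so the tonic (Sa) is row 0 -- this is what makes
    chroma-derived features independent of the performance scale."""
    return np.roll(chroma, -tonic_pc, axis=0)

# toy check: a pure A4 (440 Hz) tone concentrates energy in pitch class 9 (A)
sr = 22050
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440.0 * t)
C = chromagram(tone, sr)
print(int(C.mean(axis=1).argmax()))  # → 9
```

Swara-based features of the kind the paper mentions would then be computed on the tonic-normalized chroma, so that the same raga sung at different tonics yields comparable feature vectors.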
  • Item
    Rhythm and timbre analysis for carnatic music processing
    (Springer Science and Business Media Deutschland GmbH, 2016) Heshi, R.; Suma, S.M.; Koolagudi, S.G.; Bhandari, S.; Sreenivasa Rao, K.S.
    In this work, an effort has been made to analyze rhythm- and timbre-related features to identify the raga and tala of a piece of Carnatic music. Raga and tala classification is performed using both rhythm and timbre features. Rhythm patterns and rhythm histograms are used as rhythm features. Zero-crossing rate (ZCR), spectral centroid, spectral roll-off, flux, and entropy are used as timbre features. The music clips contain both instrumental and vocal content. To measure the similarity between feature vectors, the T-test is used. Classification is then done using Gaussian Mixture Models (GMM). The results show that rhythm patterns are able to distinguish different ragas and talas with average accuracies of 89.98% and 86.67%, respectively. © Springer India 2016.
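Several of the timbre features this abstract lists (ZCR, spectral centroid, spectral roll-off) are standard short-time descriptors; libraries such as librosa provide them directly. As a self-contained sketch of what these features measure, assuming per-frame computation on a Hann-windowed spectrum (the paper's exact frame and window settings are not given):

```python
import numpy as np

def _spectrum(frame, sr):
    """Hann-windowed magnitude spectrum and its frequency axis."""
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    return mag, freqs

def zcr(frame):
    """Zero-crossing rate: fraction of adjacent sample pairs changing sign."""
    return float(np.mean(np.signbit(frame[:-1]) != np.signbit(frame[1:])))

def spectral_centroid(frame, sr):
    """Magnitude-weighted mean frequency (perceptual 'brightness')."""
    mag, freqs = _spectrum(frame, sr)
    return float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))

def spectral_rolloff(frame, sr, pct=0.85):
    """Frequency below which `pct` of the spectral energy is contained."""
    mag, freqs = _spectrum(frame, sr)
    cum = np.cumsum(mag ** 2)
    idx = int(np.searchsorted(cum, pct * cum[-1]))
    return float(freqs[min(idx, len(freqs) - 1)])

# sanity check on two synthetic tones: the brighter tone should score
# higher on all three timbre descriptors
sr = 22050
t = np.arange(2048) / sr
low = np.sin(2 * np.pi * 220.0 * t)
high = np.sin(2 * np.pi * 3000.0 * t)
print(spectral_centroid(low, sr) < spectral_centroid(high, sr))  # True
print(zcr(low) < zcr(high))                                      # True
```

Spectral flux and entropy, the remaining timbre features named above, follow the same per-frame pattern; per-frame values would then be pooled (e.g., mean and variance per clip) before GMM classification.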