|Title:||Raga and Tonic Identification in Carnatic Music|
|Authors:||Samsekai Manjabhat, S.|
|Citation:||Journal of New Music Research, 2017, Vol. 46, No. 3, pp. 229–245|
|Abstract:||Raga and tonic are the basic elements on which melody is constructed in Carnatic music. Raga is the framework for building melody, whereas the tonic frequency establishes the base, and a swara (R, G, etc.) is identified relative to that base frequency. In this work, an effort has been made to identify the raga and tonic of a given piece of Carnatic music. The proposed method is divided into two phases. In the first phase, the tonic and raga are determined independently using features extracted from the pitch histogram. In the second phase, the raga and tonic are updated iteratively using the derived note information. The raga is recognised from features extracted from the probability density function (pdf) of pitch values obtained from the music clip, using different classifiers such as feedforward neural networks, Gaussian mixture models and decision trees. A mathematical model based on the parameters of the pitch pdf is proposed for tonic identification. The proposed raga and tonic identification system is evaluated on two datasets: 213 music clips from 14 ragas and the CompMusic dataset (538 clips from 17 ragas). On the first dataset, the average accuracies of raga and tonic identification are 90.14% and 94.83%, respectively. On the CompMusic dataset, an average accuracy of 95% is achieved for raga identification. © 2017 Informa UK Limited, trading as Taylor & Francis Group.|
|Appears in Collections:||1. Journal Articles|
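The abstract's central feature, a probability density function of pitch values, can be illustrated with a minimal sketch: fold frame-level pitch estimates (in Hz) into one octave relative to a tonic, expressed in cents, and normalise the histogram so it sums to 1. This is only an illustrative assumption of how such a pitch pdf might be computed, not the paper's exact procedure; the function name `pitch_pdf`, the `tonic_hz` value, and the bin count are choices made for the example.

```python
import numpy as np

def pitch_pdf(pitch_hz, tonic_hz=220.0, bins=120):
    """Empirical pdf of pitch values folded into one octave.

    Illustrative sketch only (not the paper's exact method):
    `tonic_hz` and `bins` are assumptions for this example.
    """
    pitch_hz = np.asarray(pitch_hz, dtype=float)
    pitch_hz = pitch_hz[pitch_hz > 0]              # drop unvoiced frames (0 Hz)
    cents = 1200.0 * np.log2(pitch_hz / tonic_hz)  # distance from tonic in cents
    cents = np.mod(cents, 1200.0)                  # fold into a single octave
    hist, _ = np.histogram(cents, bins=bins, range=(0.0, 1200.0))
    return hist / hist.sum()                       # normalise to a pdf

# Example: frames clustered at the tonic (Sa) and the fifth (Pa),
# plus a few unvoiced frames that get discarded.
frames = [220.0] * 50 + [330.0] * 30 + [0.0] * 5
pdf = pitch_pdf(frames)
```

Peaks in such a pdf fall at the sung swara positions, which is why its parameters can serve as features for both raga classification and tonic estimation.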