Faculty Publications
Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736
Publications by NITK Faculty
Item: Performance evaluation of dimensionality reduction techniques on hyperspectral data for mineral exploration (Springer Science and Business Media Deutschland GmbH, 2023). C, D.; Shetty, A.; Narasimhadhan, A.V.

With recent advances in hardware and a wide range of applications, hyperspectral remote sensing has proved to be a promising technology for analysing terrain. However, the sheer volume of bands, strong inter-band correlation, and redundant information make interpretation of hyperspectral data a tedious task. These issues can be addressed to a considerable extent by reducing the dimensionality of the data. Although a plethora of algorithms exist to downsize hyperspectral data, the quality assessment of these techniques remains an open question. Since Dimensionality Reduction (DR) is a special case of unsupervised learning, classification accuracy cannot be used directly to compare the performance of different DR techniques. Consequently, a different kind of goodness measure is needed, one that is easily interpretable, robust against outliers, and applicable to most algorithms and datasets. In this paper, fifteen popular dimensionality reduction algorithms are reviewed, evaluated, and compared on a hyperspectral dataset for mineral exploration. The performance of the various DR algorithms is tested on hyperspectral mineral data because extensive study of DR for mineral mapping is scarce compared with land-cover mapping. In addition, the DR techniques are evaluated using co-ranking criteria, which are independent of label information. This makes it possible to identify robust techniques for mineral mapping and also provides meaningful insight into topology preservation. Such techniques play a vital role in mineral exploration, since in-field observation is expensive, time consuming, and labour intensive.
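The label-free, co-ranking-based evaluation described in this abstract can be illustrated with a minimal sketch. The helper names (`coranking_matrix`, `qnx`) and the exact quality measure below are assumptions for illustration, not the paper's own code: the sketch builds the standard co-ranking matrix from pairwise distances in the original and embedded spaces, then reports the fraction of K-nearest neighbours preserved by the embedding.

```python
import numpy as np


def coranking_matrix(d_high, d_low):
    """Co-ranking matrix Q from pairwise distance matrices of the
    high-dimensional data and its low-dimensional embedding.
    Entry Q[k, l] counts point pairs whose neighbour rank is k+1
    in the original space and l+1 in the embedding."""
    n = d_high.shape[0]
    # Double argsort turns distances into neighbour ranks
    # (rank 0 is the point itself, since its self-distance is 0).
    rank_high = d_high.argsort(axis=1).argsort(axis=1)
    rank_low = d_low.argsort(axis=1).argsort(axis=1)
    Q = np.zeros((n - 1, n - 1), dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j:
                Q[rank_high[i, j] - 1, rank_low[i, j] - 1] += 1
    return Q


def qnx(Q, K):
    """Q_NX(K): average fraction of K-nearest neighbours preserved.
    1.0 means the K-neighbourhoods are perfectly preserved."""
    n = Q.shape[0] + 1
    return Q[:K, :K].sum() / (K * n)
```

Because the measure depends only on neighbour ranks, it needs no class labels, which is what lets such criteria compare DR algorithms on unlabelled mineral data.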
From the experimental results it is evident that deep autoencoders provide better embeddings than other existing nonlinear techniques, with a quality-index value of 0.9938 at K = 120. The conclusions presented are novel, since previous studies have not evaluated results qualitatively, and comparison between conventional machine learning and deep learning algorithms has been limited. © 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.

Item: Knowledge distillation: A novel approach for deep feature selection (Elsevier B.V., 2023). C, D.; Shetty, A.; Narasimhadhan, A.V.

High-dimensional data in hyperspectral remote sensing leads to computational, analytical, and storage complexities. Dimensionality reduction serves as an efficient tool to remove redundant, irrelevant, and highly correlated features. Recently, deep learning approaches have made remarkable progress in hyperspectral data analysis. In this paper, a new end-to-end deep learning framework based on a teacher-student network, inspired by knowledge distillation, is proposed for deep feature selection. First, a complex teacher deep neural network is trained on high-dimensional data to learn its best low-dimensional representation. The knowledge from this network is then transferred to a simple student network that performs feature selection. This eventually leads to deep neural network compression, which is of prime concern in hyperspectral remote sensing. Limited studies have explored the benefits of knowledge distillation on hyperspectral data. The proposed method can be employed to select deep features for both supervised and unsupervised tasks. Experimental results reveal the performance of the proposed scheme using a limited number of features.
In comparison to 1D and simple autoencoder models, the 2D model based on a convolutional autoencoder delivers higher classification accuracies: 96.15% on the Indian Pines dataset and 97.82% on the Pavia University dataset. A similar trend is observed with unsupervised learning. Furthermore, the proposed model has a low degree of sensitivity to parameter selection. © 2022 National Authority of Remote Sensing & Space Science
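The knowledge-transfer step at the heart of the teacher-student framework above can be sketched with the generic distillation loss of Hinton-style knowledge distillation. This is not the paper's feature-selection architecture, only the core transfer term it builds on: the student is trained to match the teacher's temperature-softened output distribution via a KL divergence, with the temperature `T` and the toy logits below chosen purely for illustration.

```python
import numpy as np


def softmax(z, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 as in standard knowledge distillation.
    Minimising it pushes the student to mimic the teacher's outputs."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()
```

In a compression setting like the one the abstract describes, this term lets a small student network inherit the behaviour of a much larger teacher, which is why the approach yields compact models for hyperspectral data.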
