Please use this identifier to cite or link to this item:
https://idr.nitk.ac.in/jspui/handle/123456789/17722
Title: | Deep Learning Models For Change Detection Analysis Using Geo-Spatial Data |
Authors: | Navnath, Naik Nitesh |
Supervisors: | Chandrasekaran, K; Venkatesan, M |
Keywords: | Environmental monitoring; Change Detection; Land use and Land Cover; Time-series |
Issue Date: | 2023 |
Publisher: | National Institute Of Technology Karnataka Surathkal |
Abstract: | Due to advances in space technology, the number of sensors is increasing steadily, contributing to the high availability of remote sensing images. Multiple satellites have been placed into orbit around the Earth, and more launches are anticipated. There is a need to monitor the surface of the Earth continuously. Every day, many images with high spatial and temporal resolution are captured, yielding a large amount of remote sensing data that must be analyzed in a timely manner. These developments contribute to what is known as the Big Data revolution in remote sensing. The tremendous volume of imagery makes numerous global and local environmental monitoring applications possible; assessment of climate change, disaster monitoring, and urban planning are only a few examples. Determining changes in land use and land cover is crucial for conserving natural resources and the environment, as it can reveal how humans utilize the land in a particular area. Understanding how land use patterns have changed worldwide has become necessary to address global climate change and promote sustainable development. This research proposes novel approaches and methods for autonomously utilizing the remote sensing data collected by the growing number of sensors; reliable new strategies are required to extract information from these remote-sensing images. The primary focus of this thesis is on Change Detection (CD) methods that identify regions in remote sensing images where the land cover or land use has changed. CD is the first step in comprehending the dynamics and evolution of the Earth's surface. This thesis investigates methods for better information extraction by exploiting the temporal correlation found in bitemporal and multitemporal image time series. Three significant innovative contributions to the state of the art are presented in the thesis.
The spatial-temporal information is modeled with various pre-processing and clustering techniques and integrated using deep learning approaches. This results in a significantly more accurate change map identifying when, where, and what land cover changes have occurred. The first contribution is a novel framework for bitemporal image CD: a hybrid approach using superpixel segmentation, fuzzy-based clustering, and lightweight deep learning to exploit the changes in remote sensing data. The second contribution presents a novel hybrid encoder-decoder model for land use and land cover CD that considers spatial and temporal aspects of binary and multiclass changes in remote sensing images. The final contribution presents an iterative method for enhancing the overall land cover classification performance for every pair of images defined inside a time series. For the five datasets used in this research, the overall accuracy of the proposed methods improved significantly, rising above 85% when both space and time are considered with maximum likelihood. These results indicate that combining the space and time domains substantially improves the accuracy of temporal CD analysis and can produce high-quality land cover prediction maps. A thorough qualitative and quantitative analysis complements the experimental results. |
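The core bitemporal CD idea the abstract describes, clustering a difference image into changed and unchanged regions with fuzzy memberships, can be illustrated with a minimal sketch. This is a toy demonstration, not the thesis's actual pipeline: the function names are hypothetical, and a plain two-cluster fuzzy c-means on per-pixel difference magnitudes stands in for the superpixel-plus-deep-learning hybrid the thesis proposes.

```python
import numpy as np

def fuzzy_cmeans_1d(values, m=2.0, iters=50, seed=0):
    """Two-cluster fuzzy c-means on a 1-D array of difference magnitudes.

    Returns (memberships, centers); memberships[:, 1] is the degree of
    belonging to the higher-magnitude ('changed') cluster.
    """
    rng = np.random.default_rng(seed)
    n = values.shape[0]
    u = rng.random((n, 2))
    u /= u.sum(axis=1, keepdims=True)           # memberships sum to 1 per pixel
    for _ in range(iters):
        w = u ** m                               # fuzzified weights
        centers = (w * values[:, None]).sum(axis=0) / w.sum(axis=0)
        d = np.abs(values[:, None] - centers[None, :]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))            # standard FCM membership update
        u = inv / inv.sum(axis=1, keepdims=True)
    order = np.argsort(centers)                  # column 1 = higher center
    return u[:, order], centers[order]

def change_map(img_t1, img_t2, threshold=0.5):
    """Binary change map from two co-registered grayscale images."""
    diff = np.abs(img_t2.astype(float) - img_t1.astype(float)).ravel()
    u, _ = fuzzy_cmeans_1d(diff)
    return (u[:, 1] > threshold).reshape(img_t1.shape)

# Example: a synthetic pair where a 4x4 block changes between dates.
t1 = np.zeros((16, 16))
t2 = t1.copy()
t2[4:8, 4:8] = 100.0
cm = change_map(t1, t2)
```

In a real pipeline, the per-pixel differences would typically be replaced by superpixel-level statistics before clustering, which suppresses speckle noise and respects object boundaries.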
URI: | http://idr.nitk.ac.in/jspui/handle/123456789/17722 |
Appears in Collections: | 1. Ph.D Theses |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
197091-CS004-Nitesh Naik.pdf | | 18.2 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.