Semantic Segmentation of Remotely Sensed Images using Multisource Data: An Experimental Analysis

dc.contributor.author: Putty, A.
dc.contributor.author: Annappa, B.
dc.contributor.author: Prajwal, R.
dc.contributor.author: Pariserum Perumal, S.P.
dc.date.accessioned: 2026-02-06T06:33:46Z
dc.date.issued: 2024
dc.description.abstract: Remotely sensed data obtained from diverse sensors provide rich information for a wide range of remote sensing applications, such as land-use and land-cover mapping. Because a large amount of such data is available, advanced deep-learning techniques have been adopted in this domain. However, these techniques require a significant amount of annotated data, which can be challenging to obtain for land-use and land-cover mapping. Multisource data fusion has become crucial in remotely sensed image analysis to overcome this challenge, providing significant benefits across various applications. This paper analyzes the fusion of multisource data tailored for land-use and land-cover mapping. The analysis shows that incorporating the proposed knowledge-transfer approach from multisource data achieves a 1-6% improvement in mIoU on the Kaggle Aerial Image dataset. © 2024 IEEE.
dc.identifier.citation: 2024 15th International Conference on Computing Communication and Networking Technologies, ICCCNT 2024, 2024
dc.identifier.uri: https://doi.org/10.1109/ICCCNT61001.2024.10725213
dc.identifier.uri: https://idr.nitk.ac.in/handle/123456789/28840
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.subject: Multisource Data
dc.subject: Remotely Sensed Images
dc.subject: Semantic Segmentation
dc.subject: Transfer Learning
dc.title: Semantic Segmentation of Remotely Sensed Images using Multisource Data: An Experimental Analysis