Title: Semantic Segmentation of Remotely Sensed Images using Multisource Data: An Experimental Analysis
Authors: Putty, A.; Annappa, B.; Prajwal, R.; Pariserum Perumal, S.P.
Published in: 2024 15th International Conference on Computing Communication and Networking Technologies (ICCCNT 2024), 2024
DOI: https://doi.org/10.1109/ICCCNT61001.2024.10725213
Repository: https://idr.nitk.ac.in/handle/123456789/28840
Date available: 2026-02-06

Abstract: Remotely sensed data obtained from diverse sensors provide rich information for a wide range of remote sensing applications, such as land-use and land-cover mapping. Because large volumes of such data are available, advanced deep-learning techniques have been adopted in this domain. However, these techniques require a significant amount of annotated data, which can be difficult to obtain for land-use and land-cover mapping. Multisource data fusion has become crucial in remotely sensed image analysis to overcome this challenge, offering significant benefits across various applications. This paper analyzes the fusion of multisource data for land-use and land-cover mapping. The analysis shows that the proposed knowledge-transfer approach based on multisource data achieves a 1-6% improvement in mIoU on the Kaggle Aerial Image dataset. © 2024 IEEE.

Keywords: Multisource Data; Remotely Sensed Images; Semantic Segmentation; Transfer Learning
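
The abstract reports segmentation quality as mIoU (mean Intersection-over-Union). As a point of reference, here is a minimal sketch of how per-class IoU and the class-averaged mIoU are typically computed from predicted and ground-truth class maps; this is the standard metric definition, not code from the paper, and the function name and conventions (skipping classes absent from both maps) are illustrative assumptions.

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean Intersection-over-Union over classes.

    pred, target: integer class-label maps of identical shape
    (e.g. H x W arrays from a semantic segmentation model).
    Classes absent from both prediction and ground truth are
    skipped so they do not distort the mean (a common convention).
    """
    ious = []
    for c in range(num_classes):
        p = pred == c                      # pixels predicted as class c
        t = target == c                    # pixels labeled as class c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue                       # class missing from both maps
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Tiny 2x2 example: class 0 -> IoU 1/2, class 1 -> IoU 2/3
pred = np.array([[0, 0], [1, 1]])
target = np.array([[0, 1], [1, 1]])
print(mean_iou(pred, target, num_classes=2))  # (1/2 + 2/3) / 2
```

A "1-6% improvement in mIoU" then corresponds to this averaged score rising by 0.01-0.06 in absolute terms across the evaluated dataset.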