Faculty Publications
Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736
Publications by NITK Faculty
Search Results (3 results)
Item: Non-subsampled Shearlet Domain-based De-speckling Framework for Optical Coherence Tomography Images (International Hellenic University - School of Science, 2023) Gupta, P.K.; Chanchal, A.K.; Lal, S.; Gupta, V.
An optical coherence tomography (OCT) imaging device is an effective instrument for obtaining an image of the retina. Retinal OCT images are useful for diagnosing and tracking eye diseases. However, different physical configurations in the imaging apparatus cause speckle noise in retinal OCT images, and this noise reduces OCT image quality and assessment reliability. This paper offers a framework for reducing speckle noise motivated by its mathematical formulation. Speckle noise comprises two distinct components, one additive and the other multiplicative in nature, and the proposed structure employs a different filter for each. Wiener filtering reduces the additive component, while an arrangement based on the non-subsampled shearlet transform (NSST) minimizes the multiplicative component. It is now widely acknowledged that NSST overcomes the limitations of the traditional wavelet transform and handles distributed discontinuities well, which is why it is preferred in this work. Real retinal OCT images are used to assess the proposed framework's quantitative and qualitative performance, with comparisons based on the PSNR, MSE, SSIM, and CNR metrics. Compared with existing state-of-the-art filters, the proposed framework suppresses noise better while preserving structure: it achieves the highest PSNR, SSIM, and CNR values and the lowest MSE value, indicating the effectiveness of the proposed work.
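The two-component noise model described above, with Wiener filtering handling the additive part, can be illustrated on synthetic data. This is a toy sketch, not the paper's NSST pipeline: the flat test image, the gamma/Gaussian noise parameters, and the 5x5 window size are illustrative assumptions.

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(0)

# Two-component speckle model: observed = clean * multiplicative + additive.
clean = np.full((64, 64), 100.0)                            # flat toy patch
mult = rng.gamma(shape=10.0, scale=0.1, size=clean.shape)   # mean ~1 speckle
add = rng.normal(0.0, 5.0, size=clean.shape)                # additive noise
noisy = clean * mult + add

# Wiener filtering targets the additive component; the multiplicative part
# would be handled separately (by the NSST-based arrangement in the paper).
denoised = wiener(noisy, mysize=5)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
```

On this flat patch the adaptive Wiener filter suppresses most of the noise energy, so the MSE against the clean reference drops sharply.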
The proposed work gives better-enhanced images than other existing filters; it may therefore help reveal abnormalities in OCT images and improve the diagnosis of retinal OCT images. © 2023 School of Science, IHU. All rights reserved.

Item: Two dimensional cuckoo search optimization algorithm based despeckling filter for the real ultrasound images (Springer Science and Business Media Deutschland GmbH, 2024) Gupta, P.K.; Lal, S.; Kiran, M.S.; Husain, F.
Clinical ultrasound imaging plays a significant role in the proper diagnosis of patients because it is a cost-effective and non-invasive technique compared with other methods. Speckle noise contamination introduced during acquisition degrades the visual quality of ultrasound images, making the diagnosis task difficult for physicians. Hence, despeckling filters are commonly used to improve their visual quality. However, several disadvantages of existing despeckling filters discourage their use for reducing the effect of speckle noise. In this paper, a two-dimensional cuckoo search optimization algorithm based despeckling filter is proposed to avoid the limitations of various existing despeckling filters. The proposed filter combines a fast non-local means filter and a 2D finite impulse response (FIR) filter with the cuckoo search optimization algorithm, which is used to optimize the coefficients of the 2D FIR filter. The quantitative comparison between the proposed despeckling filter and other existing despeckling filters is analyzed by evaluating PSNR, MSE, MAE, and SSIM values for different real ultrasound images. Results reveal that the visual quality obtained by the proposed despeckling filter is better than that of other existing despeckling filters.
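The central step of this filter, tuning the coefficients of a small 2D FIR kernel with cuckoo search so that a distortion measure is minimized, might be sketched as follows. This is a minimal, generic cuckoo search on synthetic data: the 3x3 kernel size, the step rule, and the MSE fitness function are illustrative assumptions, and the paper's fast non-local means stage is omitted.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(1)

# Toy setup: a smooth gradient image corrupted by multiplicative speckle
# (hypothetical data, not the paper's ultrasound set).
clean = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
noisy = clean * rng.gamma(4.0, 0.25, clean.shape)

def fitness(coeffs):
    """MSE between the FIR-filtered noisy image and the clean reference."""
    k = coeffs.reshape(3, 3)
    k = k / (k.sum() + 1e-12)  # normalize so brightness is preserved
    return np.mean((convolve2d(noisy, k, mode="same") - clean) ** 2)

def cuckoo_search(n_nests=15, n_iter=100, pa=0.25, alpha=0.05):
    """Minimal cuckoo search over the 9 FIR coefficients (a sketch,
    not the paper's exact algorithm)."""
    nests = rng.random((n_nests, 9))
    scores = np.array([fitness(n) for n in nests])
    for _ in range(n_iter):
        # Heavy-tailed (Levy-style) perturbation of every nest.
        u = rng.standard_normal((n_nests, 9))
        v = np.abs(rng.standard_normal((n_nests, 1))) + 1e-9
        cand = np.clip(nests + alpha * u / v ** 0.5, 0.0, 1.0)
        cand_scores = np.array([fitness(c) for c in cand])
        better = cand_scores < scores
        nests[better], scores[better] = cand[better], cand_scores[better]
        # Abandon a fraction pa of the worst nests and rebuild them.
        worst = np.argsort(scores)[-int(pa * n_nests):]
        nests[worst] = rng.random((len(worst), 9))
        scores[worst] = [fitness(n) for n in nests[worst]]
    best = np.argmin(scores)
    return nests[best].reshape(3, 3), scores[best]

kernel, best_mse = cuckoo_search()
baseline = np.mean((noisy - clean) ** 2)
```

Because every normalized positive kernel acts as a weighted local average, even early candidates suppress the speckle variance, and the search then refines the weights toward the lowest-MSE smoothing kernel.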
The numerical results also reveal that the proposed despeckling filter is highly effective for despeckling clinical ultrasound images. © Springer-Verlag GmbH Germany, part of Springer Nature 2018.

Item: TransSARNet: a deep learning framework for despeckling of SAR images (Institute of Physics, 2025) Kevala, V.D.; Sravya, N.; Lal, S.; Suresh, S.; Dell’Acqua, F.
Synthetic Aperture Radar (SAR) images are extensively used for Earth observation because of their all-weather, day-and-night imaging capabilities. However, speckle noise in SAR images significantly reduces their usability in a variety of applications. Deep learning models developed for SAR despeckling exhibit promising noise reduction capabilities, but balancing graininess reduction against preservation of texture details is a challenging task. In addition, supervised training of a robust deep learning model requires noisy images that capture SAR speckle dynamics together with the corresponding speckle-free ground truth, which is generally not available. This study proposes the first hybrid CNN-Halo attention-based transformer model for SAR despeckling. CNN-based feature extraction modules provide multiscale, multidirectional, and large-scale feature maps, and a halo-attention transformer block used in the skip connection aids in better preservation of radiometric information in the despeckled SAR images. TransSARNet is trained in a supervised manner using a new synthetic SAR dataset that combines the Kylberg and UCMerced land-use datasets; the study also analyzes the effect of this combination on texture preservation in despeckled SAR images. Visual and qualitative metrics evaluated on Sentinel-1 Single Look Complex SAR data show that the proposed TransSARNet approach outperforms the other models under consideration.
TransSARNet achieves a harmonious balance among model complexity, despeckling ability, edge preservation, radiometric information preservation, and smoothing in homogeneous regions. © 2025 IOP Publishing Ltd. All rights, including for text and data mining, AI training, and similar technologies, are reserved.
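All three abstracts rank despeckling filters by PSNR, MSE, and SSIM. A minimal sketch of these metrics follows; note that SSIM is computed here over a single global window for brevity, whereas standard SSIM averages many local windows.

```python
import numpy as np

def mse(x, y):
    """Mean squared error between two images."""
    return np.mean((x - y) ** 2)

def psnr(x, y, peak=255.0):
    """Peak signal-to-noise ratio in decibels."""
    return 10.0 * np.log10(peak ** 2 / mse(x, y))

def ssim_global(x, y, peak=255.0):
    """Single-window (global) SSIM with the usual stabilizing constants."""
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(2)
ref = rng.uniform(0.0, 255.0, (64, 64))
degraded = np.clip(ref + rng.normal(0.0, 10.0, ref.shape), 0.0, 255.0)
```

A "better" despeckling result is one with higher PSNR and SSIM and lower MSE against the reference, which is how the comparisons in the abstracts above are scored.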
