Conference Papers
Permanent URI for this collection: https://idr.nitk.ac.in/handle/123456789/28506
3 results
Item: Depthwise Separable Convolutional Neural Network Model for Intra-Retinal Cyst Segmentation (Institute of Electrical and Electronics Engineers Inc., 2019)
Girish, G.N.; Saikumar, B.; Roychowdhury, S.; Kothari, A.R.; Rajan, J.
Intra-retinal cysts (IRCs) are significant in detecting several ocular and retinal pathologies. Segmentation and quantification of IRCs from optical coherence tomography (OCT) scans is a challenging task due to the presence of speckle noise and scan intensity variations across vendors. This work proposes a convolutional neural network (CNN) model with an encoder-decoder architecture for IRC segmentation across cross-vendor OCT scans. Since deep CNN models have high computational complexity owing to their large number of parameters, the proposed use of depthwise separable convolutional filters reduces the parameter count, aiding model generalizability and preventing over-fitting. In addition, the swish activation function is employed to mitigate the vanishing gradient problem. The OPTIMA cyst segmentation challenge (OCSC) dataset, comprising scans from four different OCT device vendors, is used to evaluate the proposed model. The model achieves a mean Dice score of 0.74 and a mean recall/precision of 0.72/0.82 across imaging vendors, outperforming existing algorithms on the OCSC dataset. © 2019 IEEE.

Item: Retinal-Layer Segmentation Using Dilated Convolutions (Springer Science and Business Media Deutschland GmbH, 2020)
Guru Pradeep Reddy, T.; Ashritha, K.S.; Prajwala, T.M.; Girish, G.N.; Kothari, A.R.; Koolagudi, S.G.; Rajan, J.
Visualization and analysis of Spectral Domain Optical Coherence Tomography (SD-OCT) cross-sectional scans have gained importance in the diagnosis of several retinal abnormalities. Quantitative analytic techniques such as retinal thickness and volumetric analysis are performed on cross-sectional images of the retina for early diagnosis and prognosis of retinal diseases.
However, segmentation of retinal layers from OCT images is a complicated task on account of factors such as speckle noise, low image contrast, and low signal-to-noise ratio, among others. Owing to the importance of retinal-layer segmentation in diagnosing ophthalmic diseases, manual segmentation techniques have been proposed and adopted in clinical practice. Nonetheless, manual segmentation suffers from erroneous boundary detection. This paper therefore proposes a fully automated semantic segmentation technique that uses an encoder-decoder architecture to accurately segment the prominent retinal layers. © 2020, Springer Nature Singapore Pte Ltd.

Item: Attention Assisted Patch-Wise CNN for the Segmentation of Fluids from the Retinal Optical Coherence Tomography Images (Springer Science and Business Media Deutschland GmbH, 2024)
Anoop, B.N.; Parida, S.; Ajith, B.; Girish, G.N.; Kothari, A.R.; Kavitha, M.S.; Rajan, J.
Optical Coherence Tomography (OCT) is an important imaging modality in ophthalmology for visualizing abnormalities present in the retina. A major cause of blindness is the accumulation of fluid in the various layers of the retina, forming retinal cysts. Accurate estimation of the type of cyst and its volume is important for effective treatment planning. In this paper, we propose an attention-assisted convolutional neural network architecture to detect and quantify three types of retinal cysts, namely intra-retinal cysts, sub-retinal cysts, and pigment epithelial detachment, from OCT images of the human retina. The proposed architecture has an encoder-decoder structure with an attention module and a multi-scale module. The qualitative and quantitative performance of the model is evaluated on the publicly available RETOUCH retinal OCT fluid detection challenge dataset. The proposed model outperforms state-of-the-art methods in terms of precision, recall, and Dice coefficient.
Furthermore, the proposed model is computationally efficient owing to its smaller number of model parameters. © Springer Nature Switzerland AG 2024.
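The parameter savings that the first abstract attributes to depthwise separable convolutions can be sketched with a simple count. This is an illustration only; the layer sizes below are hypothetical examples, not taken from the paper:

```python
def conv_params(c_in, c_out, k):
    """Parameters in a standard k x k convolution (bias omitted):
    every output channel gets its own k x k filter over all inputs."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise separable factorization: one k x k filter per input
    channel (depthwise), then a 1 x 1 pointwise convolution that
    mixes channels."""
    depthwise = c_in * k * k
    pointwise = c_in * c_out
    return depthwise + pointwise

# Hypothetical example layer: 128 -> 256 channels, 3 x 3 kernel.
standard = conv_params(128, 256, 3)                   # 294912
separable = depthwise_separable_params(128, 256, 3)   # 33920
print(standard, separable, round(standard / separable, 1))
```

For this example layer the separable factorization needs roughly 8.7 times fewer parameters, which is the kind of reduction the abstract credits for better generalization and reduced over-fitting.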

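The receptive-field growth that motivates dilated convolutions in the retinal-layer segmentation paper can also be sketched with a short calculation. The kernel sizes and dilation rates below are hypothetical examples, not the paper's actual configuration:

```python
def effective_kernel(k, d):
    """Effective spatial extent of a k-tap kernel with dilation d:
    the taps are spread d pixels apart, covering d*(k-1)+1 pixels."""
    return d * (k - 1) + 1

def receptive_field(layers):
    """Receptive field of a stack of stride-1 dilated convolutions,
    given as (kernel_size, dilation) pairs."""
    rf = 1
    for k, d in layers:
        rf += effective_kernel(k, d) - 1
    return rf

# Three 3 x 3 layers with exponentially increasing dilation rates
# cover a 15-pixel context, versus 7 pixels for undilated layers.
print(receptive_field([(3, 1), (3, 2), (3, 4)]))  # 15
print(receptive_field([(3, 1), (3, 1), (3, 1)]))  # 7
```

This is why dilated stacks capture wider retinal context at the same parameter cost: the dilation enlarges the window each layer sees without adding filter weights.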