Conference Papers
Permanent URI for this collection: https://idr.nitk.ac.in/handle/123456789/28506
Item: Damage identification and assessment using image processing on post-disaster satellite imagery (Institute of Electrical and Electronics Engineers Inc., 2017)
Authors: Joshi, A.R.; Tarte, I.; Suresh, S.; Koolagudi, S.G.
Abstract: Natural disasters such as earthquakes and tsunamis often have a devastating effect on human life and cause noticeable damage to infrastructure. Active research is ongoing to mitigate the impact of these catastrophes and preclude economic losses. Existing methods that utilize pre-event and post-event images not only require the immediate and guaranteed availability of an appropriate data set but are also encumbered by manual mapping of the images, which necessitates marking corresponding control points in the two images. This paper highlights the use of only post-event imagery, in the absence of reference data, to produce damage maps in a more timely manner; this eliminates the need for manual georeferencing of images. Our method applies simple linear iterative clustering (SLIC) to segment the images into uniform superpixels and extracts 62 features for each superpixel. Among the various classifiers tried, the Random Forest classifier gave a comparatively high accuracy of 90.4%. To evaluate the accuracy of the proposed method, we used 1500 data regions, of which 80% were used for training and 20% for testing. Aerial images taken by GeoEye-1 after the 2011 Christchurch earthquake and the 2011 Japan earthquake and tsunami are used in this study to detect building damage. Where ground truth is available, we compare the histograms of the pre- and post-event imagery to quantify similarity as an SSD (Sum of Squared Distances) value; our approach thus produces an assessment map displaying the extent of damage in the area covered by each superpixel. We consider six levels of damage, ranging from 1 to 6, where 1 signifies no damage and 6 maximum damage.
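The abstract above describes two quantifiable steps: classifying 62-feature superpixel vectors with a Random Forest (1500 regions, 80/20 train/test split) and comparing pre-/post-event histograms via SSD. The sketch below illustrates both steps on synthetic data; the feature values, label signal, and `histogram_ssd` helper are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for the paper's setup: 1500 superpixel regions,
# 62 features each, 6 damage levels (1 = no damage, 6 = maximum damage).
X = rng.normal(size=(1500, 62))
y = rng.integers(1, 7, size=1500)
X[:, 0] += y  # inject weak signal so the classifier has something to learn

# 80% training / 20% testing, matching the split reported in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)


def histogram_ssd(img_a, img_b, bins=64):
    """Sum of Squared Distances between intensity histograms of two
    image patches (hypothetical helper for the pre/post comparison)."""
    h_a, _ = np.histogram(img_a, bins=bins, range=(0, 1), density=True)
    h_b, _ = np.histogram(img_b, bins=bins, range=(0, 1), density=True)
    return float(np.sum((h_a - h_b) ** 2))


pre = rng.uniform(size=(32, 32))   # synthetic pre-event superpixel intensities
post = rng.uniform(size=(32, 32))  # synthetic post-event superpixel intensities
ssd = histogram_ssd(pre, post)
```

On real imagery the 62 features would be extracted per SLIC superpixel rather than drawn at random, and the SSD value would be mapped to one of the six damage levels.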
© 2017 IEEE.

Item: Prostate Cancer Grading Using Multistage Deep Neural Networks (Springer Science and Business Media Deutschland GmbH, 2023)
Authors: Bygari, R.; Rithesh, K.; Ambesange, S.; Koolagudi, S.G.
Abstract: Prostate cancer is the second most commonly occurring cancer in men, with a high incidence-to-mortality ratio. Accurate grading of prostate cancer is the foremost step in determining the precise treatment process and preventing patient mortality. Currently, grading is carried out by pathologists, but the limited worldwide availability of super-specialist doctors makes expert grading unaffordable for many, and grading by non-specialists is error prone. This paper avoids the need for an expert pathologist by proposing a novel deep learning method for automatically screening prostate images to detect cancer and assign a severity grade. Explainability is built into the classification model using gradient-weighted class activation mapping (Grad-CAM) visualization, which generates a heatmap of the image regions that influenced the model's decision. The proposed method grades prostate cancer in three stages using ensembled deep neural networks. First, a U-Net segments the histopathological image. The segmented image is then overlaid on the original image, which helps underscore the regions most critical to determining the grade of cancer. Finally, the overlaid image is passed to an ensemble of Xception, ResNet-50, and EfficientNet-B7 models to predict the final grade of the histopathological image. A dataset of 10,000 histopathological images from Karolinska and Radboud, made publicly available through the Prostate Cancer Grade Assessment Challenge hosted on Kaggle, is used for training and evaluation. The method achieves a classification accuracy of 92.38% and outperforms many state-of-the-art methods.
© 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
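The second abstract's final stage combines predictions from three backbones (Xception, ResNet-50, EfficientNet-B7) into one grade, but does not state the combination rule. A common choice is soft voting, i.e. averaging the per-class probabilities; the sketch below assumes that rule, with hand-written arrays standing in for the three networks' softmax outputs over six grade classes.

```python
import numpy as np


def ensemble_grade(prob_maps):
    """Soft-voting ensemble (an assumed combination rule): average the
    class-probability vectors from each model, then take the argmax as
    the final grade index."""
    avg = np.mean(prob_maps, axis=0)
    return int(np.argmax(avg))


# Hypothetical softmax outputs over 6 grade classes; in the real system
# these would come from the trained Xception, ResNet-50 and
# EfficientNet-B7 models applied to the overlaid image.
xception_p = np.array([0.05, 0.10, 0.50, 0.20, 0.10, 0.05])
resnet50_p = np.array([0.10, 0.05, 0.40, 0.30, 0.10, 0.05])
efficientnet_p = np.array([0.05, 0.10, 0.45, 0.25, 0.10, 0.05])

grade = ensemble_grade([xception_p, resnet50_p, efficientnet_p])
```

Here all three models put most mass on class index 2, so the averaged vector also peaks there and the ensemble returns grade index 2.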
