Multi-task deep neural network models for learning COVID-19 disease representations from multimodal data
Date
2023
Publisher
Inderscience Publishers
Abstract
Over the continued course of the COVID-19 pandemic, a significant volume of expert-written diagnosis reports has accumulated, capturing a multitude of symptoms and observations on diagnosed COVID-19 cases alongside expert-validated chest X-ray scans. The rich latent information embedded in these unstructured diagnosis reports, and its value as a source of disease-specific knowledge, has so far been explored to a very limited extent. In this work, a convolutional attention-based dense (CAD) neural model for COVID-19 prediction is proposed. The model is trained on disease-specific parameters extracted from chest X-ray images and expert-written diagnostic text reports to support evidence-based diagnosis. Scalability is ensured by incorporating content-based learning models that automatically generate diagnosis reports for identified COVID-19 cases, reducing radiologists' cognitive burden. Experimental evaluation showed that multimodal patient data plays a vital role in diagnosing early-stage cases, thus helping to hasten the diagnosis process. © 2023 Inderscience Enterprises Ltd.
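The abstract describes fusing chest X-ray image features with diagnostic text features through an attention mechanism before a dense prediction head. The paper's actual architecture is not given here, so the following is only a minimal illustrative sketch of one common way such multimodal attention fusion is done: text-token features attend over image-patch features via scaled dot-product attention, the fused features are pooled, and a dense sigmoid head produces a binary prediction. All shapes, feature extractors, and weights below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fuse(img_feats, txt_feats):
    """Scaled dot-product attention: text features attend over image features.

    img_feats: (n_patches, d) image-patch features (e.g. from a CNN backbone)
    txt_feats: (n_tokens, d) report-token features (e.g. from a text encoder)
    returns:   (n_tokens, d) image context vector for each text token
    """
    d = img_feats.shape[-1]
    scores = txt_feats @ img_feats.T / np.sqrt(d)  # (n_tokens, n_patches)
    weights = softmax(scores, axis=-1)             # rows sum to 1
    return weights @ img_feats

# Hypothetical pre-extracted features: 49 image patches, 16 report tokens, 64-dim
img = rng.normal(size=(49, 64))
txt = rng.normal(size=(16, 64))

fused = attention_fuse(img, txt)   # (16, 64)
pooled = fused.mean(axis=0)        # (64,) pooled multimodal representation

# Dense sigmoid head for a binary COVID-19 prediction (untrained random weights)
w, b = rng.normal(size=64), 0.0
prob = 1.0 / (1.0 + np.exp(-(pooled @ w + b)))
print(fused.shape, prob)
```

In a trained system, the image and text encoders and the dense head would be learned jointly; this sketch only shows the data flow the abstract implies, not the CAD model itself.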
Keywords
accuracy, Article, attention, clinical evaluation, convolutional neural network, coronavirus disease 2019, data analysis, decision tree, deep neural network, human, image retrieval, learning, learning algorithm, multimodal data, prediction, sensitivity and specificity, support vector machine, thorax radiography, training
Citation
International Journal of Medical Engineering and Informatics, 2023, 15(6), pp. 501-515
