Stratification of Depressed and Non-Depressed Texts from Social Media using LSTM and its Variants
Date
2024
Publisher
Elsevier B.V.
Abstract
This work examines the performance of long short-term memory (LSTM) variants on social media text data. Five architectures are evaluated: classic LSTM, bidirectional LSTM, stacked LSTM, gated recurrent unit (GRU), and bidirectional GRU, on a dataset of texts extracted from multiple social media platforms. The aim is to identify the most effective of the five variants for text analysis through a comparative study of the models' precision, recall, F1-score, and accuracy. The findings show that the classic LSTM and the GRU outperform the other models in accuracy, while the bidirectional models (bidirectional LSTM and bidirectional GRU) achieve better precision than their unidirectional counterparts. This research has implications for developing more efficient models for natural language processing applications and offers useful insights into detecting depression on social media platforms through text analysis. © 2024 Elsevier B.V. All rights reserved.
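The abstract compares LSTM- and GRU-based recurrent cells. For readers unfamiliar with the difference, the following is a minimal NumPy sketch of one forward time step for each cell, using the standard gating equations (the weight shapes, gate ordering, and toy dimensions here are illustrative assumptions, not details from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    W: (4H, D), U: (4H, H), b: (4H,).
    Gate order assumed here: input, forget, cell candidate, output.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    g = np.tanh(z[2*H:3*H])      # cell candidate
    o = sigmoid(z[3*H:4*H])      # output gate
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

def gru_step(x, h_prev, W, U, b):
    """One GRU time step: fewer gates, no separate cell state.

    W: (3H, D), U: (3H, H), b: (3H,).
    Gate order assumed here: update, reset, candidate.
    """
    H = h_prev.shape[0]
    zx = W @ x + b
    zh = U @ h_prev
    z = sigmoid(zx[0:H] + zh[0:H])                  # update gate
    r = sigmoid(zx[H:2*H] + zh[H:2*H])              # reset gate
    n = np.tanh(zx[2*H:3*H] + r * zh[2*H:3*H])      # candidate state
    return (1.0 - z) * h_prev + z * n               # new hidden state

# Tiny demo: run a random 5-step sequence through both cells.
rng = np.random.default_rng(0)
D, H, T = 8, 4, 5                      # input dim, hidden dim, sequence length
xs = rng.standard_normal((T, D))

h = np.zeros(H)
c = np.zeros(H)
Wl = rng.standard_normal((4 * H, D)) * 0.1
Ul = rng.standard_normal((4 * H, H)) * 0.1
bl = np.zeros(4 * H)
for x in xs:
    h, c = lstm_step(x, h, c, Wl, Ul, bl)
print("LSTM final hidden state shape:", h.shape)

hg = np.zeros(H)
Wg = rng.standard_normal((3 * H, D)) * 0.1
Ug = rng.standard_normal((3 * H, H)) * 0.1
bg = np.zeros(3 * H)
for x in xs:
    hg = gru_step(x, hg, Wg, Ug, bg)
print("GRU final hidden state shape:", hg.shape)
```

A bidirectional variant, as compared in the paper, simply runs a second cell over the reversed sequence and concatenates the two final hidden states, which is why the bidirectional models see context on both sides of each token.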
Keywords
Deep Learning, Depression, Gated Recurrent Unit, Long Short-Term Memory, Performance Analysis
Citation
Procedia Computer Science, 2024, Vol. 235, p. 1353-1363
