Long Short Term Memory Networks for Lexical Normalization of Tweets

dc.contributor.author: Nayak, P.
dc.contributor.author: Praueeth, G.
dc.contributor.author: Kulkarni, R.
dc.contributor.author: Anand Kumar, M.
dc.date.accessioned: 2026-02-06T06:36:05Z
dc.date.issued: 2021
dc.description.abstract: Lexical normalization is the process of converting non-standard text into standard text that is more readable and universal. Data obtained from social media sites and tweets often contain much noise and use non-canonical sentence structures, such as non-standard abbreviations, skipped words, spelling errors, etc. Hence such data needs to be appropriately processed before it can be used. This processing can be done by lexical normalization, which reduces randomness and converts the sentence structure to a predefined standard. Hence, lexical normalization can help improve the performance of systems that use user-generated text as input. There are several ways to perform lexical normalization, such as dictionary lookups, most frequent replacements, etc. However, we aim to explore the domain of deep learning to find approaches that can be used to normalize texts lexically. © 2021 IEEE.
dc.identifier.citation: 2021 12th International Conference on Computing Communication and Networking Technologies, ICCCNT 2021, 2021.
dc.identifier.uri: https://doi.org/10.1109/ICCCNT51525.2021.9579684
dc.identifier.uri: https://idr.nitk.ac.in/handle/123456789/30232
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.subject: Deep Learning
dc.subject: Lexical Normalization
dc.subject: LSTM
dc.subject: Tweets
dc.title: Long Short Term Memory Networks for Lexical Normalization of Tweets