Long short-term memory network for learning sentences similarity using deep contextual embeddings

Date

2021

Publisher

Springer Science and Business Media B.V.

Abstract

Semantic textual similarity (STS) is a challenging problem in natural language processing because of the variability and ambiguity of linguistic expression. Sentence similarity quantifies the degree of likeness between two sentences and plays a prominent role in many applications, such as information retrieval (IR), plagiarism detection (PD), question answering, and text paraphrasing. Deep contextualised word representations have recently become an effective means of feature extraction for sentences, yielding promising experimental results in recent studies. In this paper, we propose a deep contextual LSTM network for semantic textual similarity, in which deep contextual embeddings supply high-level semantic knowledge to the LSTM network. We demonstrate the model's effectiveness by applying the architecture to multiple semantic similarity datasets, covering both regression and classification tasks. Detailed experimentation and results show that the proposed deep contextual model performs better than the human annotation baseline. © 2021, Bharati Vidyapeeth's Institute of Computer Applications and Management.
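The pipeline the abstract describes (contextual token embeddings fed into an LSTM, whose final states are compared to produce a similarity score) can be sketched in a few lines. This is a minimal NumPy sketch, not the paper's implementation: it assumes a siamese setup with a single shared LSTM and an exponential negative Manhattan distance scoring head (a common choice; the paper's exact head is not given here), and random vectors stand in for real ELMo/BERT-style contextual embeddings.

```python
import numpy as np

rng = np.random.default_rng(42)

D_EMB, D_HID = 16, 8  # toy embedding and hidden sizes

# Randomly initialised LSTM parameters, shared between both sentences.
# Gates are packed in the order [input, forget, cell, output].
Wx = rng.normal(scale=0.1, size=(D_EMB, 4 * D_HID))
Wh = rng.normal(scale=0.1, size=(D_HID, 4 * D_HID))
b = np.zeros(4 * D_HID)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_encode(x):
    """Run the LSTM over a (T, D_EMB) sequence; return the final hidden state."""
    h = np.zeros(D_HID)
    c = np.zeros(D_HID)
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # update cell state
        h = o * np.tanh(c)           # update hidden state
    return h

def similarity(emb_a, emb_b):
    """Score two sentences in (0, 1] via exp(-||h_a - h_b||_1)."""
    d = np.abs(lstm_encode(emb_a) - lstm_encode(emb_b)).sum()
    return float(np.exp(-d))

# Placeholder "contextual embeddings" for two sentences (5 and 7 tokens).
sent_a = rng.normal(size=(5, D_EMB))
sent_b = rng.normal(size=(7, D_EMB))

print(similarity(sent_a, sent_a))  # identical inputs score exactly 1.0
print(similarity(sent_a, sent_b))  # differing inputs score below 1.0
```

Because the LSTM is shared, identical sentences always score 1.0, and the score decays smoothly as the encoded states diverge, which is what makes this head usable as either a regression target or, with a threshold, a classifier.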

Keywords

Deep contextual embedding, LSTM, Regression, Sentence similarity

Citation

International Journal of Information Technology (Singapore), 2021, 13, 4, pp. 1633-1641
