Role of Activation Functions and Order of Input Sequences in Question Answering

dc.contributor.author: Chenna Keshava, B.S.
dc.contributor.author: Sumukha, P.K.
dc.contributor.author: Chandrasekaran, K.
dc.contributor.author: Divakarla, D.
dc.date.accessioned: 2026-02-06T06:37:08Z
dc.date.issued: 2020
dc.description.abstract: This paper describes a solution to the Question Answering problem in Natural Language Processing using LSTMs. We analyze the effect of the choice of activation function in the final layer of the LSTM cell on accuracy. Facebook Research's bAbI dataset is used for our experiments. We also propose an alternative solution that exploits the structure and word order of the English language: reversing the word order of the paragraph introduces many short-term dependencies between the textual data and the initial tokens of a question. This method improves accuracy on more than half of the tasks by more than 30% over the current state of the art. Our contributions in this paper are improving the accuracy of most of the Q&A tasks by reversing the order of words in the query and story sections, and providing a comparison of different activation functions and their respective accuracies across all 20 NLP tasks. © 2020, Springer Nature Singapore Pte Ltd.
dc.identifier.citation: Advances in Intelligent Systems and Computing, 2020, Vol. 1016, p. 377-390
dc.identifier.issn: 2194-5357
dc.identifier.uri: https://doi.org/10.1007/978-981-13-9364-8_27
dc.identifier.uri: https://idr.nitk.ac.in/handle/123456789/30895
dc.publisher: Springer
dc.subject: LSTM (long short-term memory)
dc.subject: NLP (natural language processing)
dc.subject: QA (question answering)
dc.subject: RNN (recurrent neural networks)
dc.subject: Seq2seq models (sequence-to-sequence models)
dc.title: Role of Activation Functions and Order of Input Sequences in Question Answering
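
The reversal idea described in the abstract can be sketched minimally: reverse the token order of the story and the query before feeding them to the sequence model, so the end of the paragraph sits close to the start of the question. This is a hedged illustration only; the function names and the sample bAbI-style sentences are hypothetical and do not reproduce the authors' actual preprocessing code.

```python
# Hypothetical sketch of the input-reversal preprocessing (names are
# illustrative, not the authors' code): reverse token order of the
# story and query before they reach the LSTM.

def reverse_tokens(tokens):
    """Return the token sequence in reverse order."""
    return list(reversed(tokens))

# bAbI-style example (sample sentences are made up for illustration)
story = ["Mary", "moved", "to", "the", "bathroom", "."]
query = ["Where", "is", "Mary", "?"]

reversed_story = reverse_tokens(story)
reversed_query = reverse_tokens(query)

print(reversed_story)  # ['.', 'bathroom', 'the', 'to', 'moved', 'Mary']
print(reversed_query)  # ['?', 'Mary', 'is', 'Where']
```

With the story reversed, its final tokens are processed first, shortening the distance between the paragraph's content and the initial tokens of the question, which is the short-term-dependency effect the abstract attributes to the improvement.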