Please use this identifier to cite or link to this item: https://idr.nitk.ac.in/jspui/handle/123456789/6565
Title: Role of Activation Functions and Order of Input Sequences in Question Answering
Authors: Chenna, Keshava, B.S.
Sumukha, P.K.
Chandrasekaran, K.
Usha, D.
Issue Date: 2020
Citation: Advances in Intelligent Systems and Computing, 2020, Vol.1016, pp.377-390
Abstract: This paper describes a solution to the Question Answering problem in Natural Language Processing using LSTMs. We analyse the effect that the choice of activation function in the final layer of the LSTM cell has on accuracy. Facebook Research's bAbI dataset is used for our experiments. We also propose an alternative solution that exploits the structure and word order of the English language: reversing the word order of the paragraph introduces many short-term dependencies between the textual data and the initial tokens of a question. This method improves the accuracy on more than half of the tasks by more than 30% over the current state of the art. Our contributions in this paper are improving the accuracy of most of the Q&A tasks by reversing the order of words in the query and story sections, and providing a comparison of different activation functions and their respective accuracies across all 20 NLP tasks. © 2020, Springer Nature Singapore Pte Ltd.
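Illustrative sketch: the following is a minimal, hypothetical Python/tf.keras example (not the authors' code) of the two ideas in the abstract: reversing the token order of the story and question before they reach an LSTM encoder, and varying the activation of the final layer. The vocabulary size, sequence lengths, layer widths, and the helper names reverse_tokens and build_model are assumptions made purely for illustration.

    import numpy as np
    from tensorflow.keras import layers, models

    VOCAB_SIZE = 50   # assumed toy vocabulary size
    STORY_LEN = 60    # assumed padded story length
    QUERY_LEN = 5     # assumed padded question length

    def reverse_tokens(batch):
        """Reverse the word order of each padded sequence in a batch."""
        return batch[:, ::-1]

    def build_model(final_activation="softmax"):
        """Tiny story+question LSTM reader; the final activation is configurable."""
        story_in = layers.Input(shape=(STORY_LEN,), dtype="int32")
        query_in = layers.Input(shape=(QUERY_LEN,), dtype="int32")

        embed = layers.Embedding(VOCAB_SIZE, 32)
        story_vec = layers.LSTM(64)(embed(story_in))   # encode the (reversed) story
        query_vec = layers.LSTM(64)(embed(query_in))   # encode the (reversed) question

        merged = layers.concatenate([story_vec, query_vec])
        answer = layers.Dense(VOCAB_SIZE, activation=final_activation)(merged)
        return models.Model([story_in, query_in], answer)

    # Usage: reverse the inputs, then compare final activations such as softmax vs. sigmoid.
    stories = np.random.randint(1, VOCAB_SIZE, size=(8, STORY_LEN))
    queries = np.random.randint(1, VOCAB_SIZE, size=(8, QUERY_LEN))
    model = build_model("softmax")
    preds = model.predict([reverse_tokens(stories), reverse_tokens(queries)])
    print(preds.shape)  # (8, VOCAB_SIZE)

In an actual bAbI experiment, the same model would be trained per task while only the final activation (or the input ordering) is varied, so that accuracy differences can be attributed to that single change.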
URI: http://idr.nitk.ac.in/jspui/handle/123456789/6565
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.
