Role of Activation Functions and Order of Input Sequences in Question Answering
dc.contributor.author | Chenna, Keshava, B.S. | |
dc.contributor.author | Sumukha, P.K. | |
dc.contributor.author | Chandrasekaran, K. | |
dc.contributor.author | Usha, D. | |
dc.date.accessioned | 2020-03-30T09:45:52Z | |
dc.date.available | 2020-03-30T09:45:52Z | |
dc.date.issued | 2020 | |
dc.description.abstract | This paper describes a solution to the Question Answering problem in Natural Language Processing using LSTMs. We analyze the effect of the choice of activation function in the final layer of the LSTM cell on accuracy. Facebook Research's bAbI dataset is used for our experiments. We also propose an alternative solution that exploits the structure and word order of the English language: reversing the order of the paragraph introduces many short-term dependencies between the textual data and the initial tokens of a question. This method improves accuracy in more than half of the tasks by more than 30% over the current state of the art. Our contributions in this paper are improving the accuracy of most of the Q&A tasks by reversing the order of words in the query and story sections, and providing a comparison of different activation functions and their respective accuracies across all 20 NLP tasks. © 2020, Springer Nature Singapore Pte Ltd. | en_US |
dc.identifier.citation | Advances in Intelligent Systems and Computing, 2020, Vol.1016, pp.377-390 | en_US |
dc.identifier.uri | https://idr.nitk.ac.in/handle/123456789/6565 | |
dc.title | Role of Activation Functions and Order of Input Sequences in Question Answering | en_US |
dc.type | Book chapter | en_US |
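The input-reversal idea summarized in the abstract can be sketched as a preprocessing step. This is a minimal illustration, not the authors' code: the tokenizer and the bAbI-style example sentences are assumptions, and in the actual pipeline the reversed token sequences would then be embedded and fed to an LSTM.

```python
def tokenize(text):
    """Naive whitespace tokenizer (an assumption; the paper's
    preprocessing is not specified in this record)."""
    return text.lower().replace(".", " .").replace("?", " ?").split()

def prepare_reversed(story, question):
    """Reverse the token order of both the story and the question,
    as the abstract proposes, so that the end of the story and the
    start of the question sit close together in the input sequence."""
    return tokenize(story)[::-1], tokenize(question)[::-1]

# Hypothetical bAbI-style example.
story = "Mary moved to the bathroom. John went to the hallway."
question = "Where is Mary ?"
s_rev, q_rev = prepare_reversed(story, question)
print(q_rev)  # question tokens in reversed order
```

The intended effect is that tokens which must be matched against the question no longer sit a full paragraph away in the sequence, shortening the dependencies the LSTM has to carry.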