Commonsense and Named Entity Aware Knowledge Grounded Dialogue Generation

Date

2022

Publisher

Association for Computational Linguistics (ACL)

Abstract

Grounding dialogue on external knowledge and interpreting linguistic patterns in the dialogue history, such as ellipsis, anaphora, and co-reference, is critical for dialogue comprehension and generation. In this paper, we present a novel open-domain dialogue generation model that effectively utilizes large-scale commonsense and named entity-based knowledge in addition to the unstructured topic-specific knowledge associated with each utterance. We enhance the commonsense knowledge with named entity-aware structures using co-references. Our proposed model utilizes a multi-hop attention layer to preserve the most accurate and critical parts of the dialogue history and the associated knowledge. In addition, we employ a Commonsense and Named Entity Enhanced Attention Module, which starts with the triples extracted from various sources and gradually finds the relevant supporting set of triples using multi-hop attention with the query vector obtained from the interactive dialogue-knowledge module. Empirical results on two benchmark datasets demonstrate that our model significantly outperforms the state-of-the-art methods in terms of both automatic evaluation metrics and human judgment. Our code is publicly available at https://github.com/deekshaVarshney/CNTF; https://www.iitp.ac.in/-ai-nlp-ml/resources/codes/CNTF.zip. © 2022 Association for Computational Linguistics.
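
The abstract describes multi-hop attention in which a query vector from the dialogue-knowledge module repeatedly attends over embeddings of extracted knowledge triples to narrow in on a relevant supporting set. The PyTorch sketch below illustrates that general idea only; it is not the released CNTF code, and the class name, dimensions, and hop count are illustrative assumptions.

    # Minimal sketch of query-guided multi-hop attention over triple embeddings.
    # Not the authors' implementation; names and shapes are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiHopTripleAttention(nn.Module):
        def __init__(self, dim: int, hops: int = 3):
            super().__init__()
            # One projection per hop to refine the query from the attended triples.
            self.updates = nn.ModuleList([nn.Linear(2 * dim, dim) for _ in range(hops)])

        def forward(self, query: torch.Tensor, triples: torch.Tensor) -> torch.Tensor:
            # query:   (batch, dim)            -- e.g. from a dialogue-knowledge encoder
            # triples: (batch, n_triples, dim) -- embeddings of extracted knowledge triples
            for update in self.updates:
                scores = torch.einsum("bd,bnd->bn", query, triples)    # attention logits
                weights = F.softmax(scores, dim=-1)                     # relevance over triples
                context = torch.einsum("bn,bnd->bd", weights, triples)  # supporting-triple summary
                query = torch.tanh(update(torch.cat([query, context], dim=-1)))  # refined query
            return query  # knowledge-aware query after multi-hop reasoning

    # Toy usage with random tensors.
    attn = MultiHopTripleAttention(dim=64, hops=3)
    q = torch.randn(2, 64)
    k = torch.randn(2, 10, 64)
    print(attn(q, k).shape)  # torch.Size([2, 64])

At each hop the query is updated with the attended triple summary, so later hops can attend to triples that are only indirectly related to the original query, which is the intuition behind gradually finding the supporting set.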

Citation

NAACL 2022 - 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference, 2022, pp. 1322-1335.
