
KARNA at COIN Shared Task 1: Bidirectional Encoder Representations from Transformers with relational knowledge for machine comprehension with common sense

Yash Jain, Chinmay Singh


Abstract
This paper describes our model for the COmmonsense INference in Natural Language Processing (COIN) shared task 1: Commonsense Inference in Everyday Narrations. It explores the use of Bidirectional Encoder Representations from Transformers (BERT) together with external relational knowledge from ConceptNet to tackle the problem of commonsense inference. The input passage, question, and answer are augmented with relational knowledge from ConceptNet. Using this technique, we achieve an accuracy of 73.3% on the official test data.
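The abstract's augmentation step can be illustrated with a short sketch. The Python below is a minimal, hypothetical illustration of retrieving ConceptNet triples via the public Web API (api.conceptnet.io) and appending them to the input text before BERT encoding; the function names, the textual triple format, and the choice of query terms are assumptions for illustration, not the authors' actual pipeline.

    import requests

    def conceptnet_relations(term, limit=3):
        """Fetch up to `limit` edges involving `term` from the public
        ConceptNet API. Illustrative only; the paper does not specify
        how its ConceptNet triples were retrieved."""
        url = f"http://api.conceptnet.io/query?node=/c/en/{term}&limit={limit}"
        edges = requests.get(url).json().get("edges", [])
        triples = []
        for e in edges:
            rel = e["rel"]["label"]      # e.g. "IsA", "UsedFor"
            start = e["start"]["label"]
            end = e["end"]["label"]
            triples.append(f"{start} {rel} {end}")
        return triples

    def augment(text, terms):
        """Append retrieved relational facts to the input text, so the
        augmented string can be fed to BERT as usual. The concatenation
        format here is an assumption."""
        facts = [f for t in terms for f in conceptnet_relations(t)]
        return text + " " + " . ".join(facts)

    # Hypothetical usage on a passage; question and answer strings
    # would be augmented the same way.
    passage = augment("I put the milk in the fridge.", ["milk", "fridge"])

In this sketch the augmented passage, question, and answer would then be packed into a standard BERT input sequence; how the query terms are selected from the text is left open, as the paper's abstract does not say.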
Anthology ID:
D19-6008
Volume:
Proceedings of the First Workshop on Commonsense Inference in Natural Language Processing
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Simon Ostermann, Sheng Zhang, Michael Roth, Peter Clark
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
75–79
URL:
https://aclanthology.org/D19-6008
DOI:
10.18653/v1/D19-6008
Cite (ACL):
Yash Jain and Chinmay Singh. 2019. KARNA at COIN Shared Task 1: Bidirectional Encoder Representations from Transformers with relational knowledge for machine comprehension with common sense. In Proceedings of the First Workshop on Commonsense Inference in Natural Language Processing, pages 75–79, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
KARNA at COIN Shared Task 1: Bidirectional Encoder Representations from Transformers with relational knowledge for machine comprehension with common sense (Jain & Singh, 2019)
PDF:
https://aclanthology.org/D19-6008.pdf