
Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, But Cannot Fly

Nora Kassner, Hinrich Schütze


Abstract
Building on Petroni et al. (2019), we propose two new probing tasks analyzing factual knowledge stored in Pretrained Language Models (PLMs). (1) Negation. We find that PLMs do not distinguish between negated ("Birds cannot [MASK]") and non-negated ("Birds can [MASK]") cloze questions. (2) Mispriming. Inspired by priming methods in human psychology, we add "misprimes" to cloze questions ("Talk? Birds can [MASK]"). We find that PLMs are easily distracted by misprimes. These results suggest that PLMs still have a long way to go to adequately learn human-like factual knowledge.
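
To make the probing setup concrete, here is a minimal sketch (not the authors' LAMA code) that queries bert-base-uncased through the Hugging Face transformers fill-mask pipeline with a non-negated, a negated, and a misprimed cloze question; the probe sentences and the top_k setting are illustrative choices, not taken from the paper's evaluation data.

# Minimal sketch of cloze-style probing of a PLM (illustrative, not the LAMA code).
# Assumes the Hugging Face `transformers` library and the bert-base-uncased checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

probes = [
    "Birds can [MASK].",        # non-negated cloze question
    "Birds cannot [MASK].",     # negated cloze question
    "Talk? Birds can [MASK].",  # misprimed cloze question ("Talk?" is the misprime)
]

for probe in probes:
    # Each call returns the highest-probability fillers for the [MASK] slot.
    predictions = fill_mask(probe, top_k=3)
    fillers = ", ".join(f"{p['token_str']} ({p['score']:.2f})" for p in predictions)
    print(f"{probe:28s} -> {fillers}")

Comparing the filler distributions across the three variants is the core of both probes: a model with robust factual knowledge should change its predictions under negation and ignore the misprime.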
Anthology ID:
2020.acl-main.698
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7811–7818
URL:
https://aclanthology.org/2020.acl-main.698/
DOI:
10.18653/v1/2020.acl-main.698
Cite (ACL):
Nora Kassner and Hinrich Schütze. 2020. Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, But Cannot Fly. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7811–7818, Online. Association for Computational Linguistics.
Cite (Informal):
Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, But Cannot Fly (Kassner & Schütze, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.698.pdf
Video:
http://slideslive.com/38929168
Code:
facebookresearch/LAMA + additional community code
Data:
LAMA, SQuAD, T-REx