
Team Howard Beale at SemEval-2019 Task 4: Hyperpartisan News Detection with BERT

Osman Mutlu, Ozan Arkan Can, Erenay Dayanik


Abstract
This paper describes our system for SemEval-2019 Task 4: Hyperpartisan News Detection (Kiesel et al., 2019). We use the pretrained BERT architecture (Devlin et al., 2018) and investigate the effect of different fine-tuning regimes on the final classification task. We show that additional pretraining on the news domain improves performance on the Hyperpartisan News Detection task. Our system ranked 8th out of 42 teams with 78.3% accuracy on the held-out test dataset.
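The abstract describes fine-tuning a pretrained BERT model as a binary sequence classifier. Below is a minimal sketch of that general setup (not the authors' code), using the Hugging Face transformers library; the model name, sequence length, learning rate, and toy inputs are assumptions, and the additional news-domain pretraining step mentioned in the abstract is not shown.

```python
# Minimal sketch: fine-tuning BERT for binary hyperpartisan classification.
# Assumptions: bert-base-uncased, 512-token truncation, AdamW with lr=2e-5.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # 2 classes: hyperpartisan vs. not
)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Toy batch of article texts with labels (1 = hyperpartisan, 0 = not).
texts = ["Example news article text ...", "Another article ..."]
labels = torch.tensor([1, 0])

model.train()
inputs = tokenizer(
    texts, padding=True, truncation=True, max_length=512, return_tensors="pt"
)
outputs = model(**inputs, labels=labels)  # cross-entropy loss over [CLS] output
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```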
Anthology ID:
S19-2175
Volume:
Proceedings of the 13th International Workshop on Semantic Evaluation
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota, USA
Editors:
Jonathan May, Ekaterina Shutova, Aurelie Herbelot, Xiaodan Zhu, Marianna Apidianaki, Saif M. Mohammad
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1007–1011
URL:
https://aclanthology.org/S19-2175
DOI:
10.18653/v1/S19-2175
Cite (ACL):
Osman Mutlu, Ozan Arkan Can, and Erenay Dayanik. 2019. Team Howard Beale at SemEval-2019 Task 4: Hyperpartisan News Detection with BERT. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 1007–1011, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
Cite (Informal):
Team Howard Beale at SemEval-2019 Task 4: Hyperpartisan News Detection with BERT (Mutlu et al., SemEval 2019)
PDF:
https://aclanthology.org/S19-2175.pdf