
The Unbearable Weight of Generating Artificial Errors for Grammatical Error Correction

Phu Mon Htut, Joel Tetreault


Abstract
In this paper, we investigate the impact of using four recent neural models for generating artificial errors to help train neural grammatical error correction models. We conduct a battery of experiments on the effect of data size, the choice of model, and a comparison with a rule-based approach.
Anthology ID:
W19-4449
Volume:
Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Helen Yannakoudakis, Ekaterina Kochmar, Claudia Leacock, Nitin Madnani, Ildikó Pilán, Torsten Zesch
Venue:
BEA
SIG:
SIGEDU
Publisher:
Association for Computational Linguistics
Pages:
478–483
URL:
https://aclanthology.org/W19-4449
DOI:
10.18653/v1/W19-4449
Cite (ACL):
Phu Mon Htut and Joel Tetreault. 2019. The Unbearable Weight of Generating Artificial Errors for Grammatical Error Correction. In Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications, pages 478–483, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
The Unbearable Weight of Generating Artificial Errors for Grammatical Error Correction (Htut & Tetreault, BEA 2019)
PDF:
https://aclanthology.org/W19-4449.pdf