
Cleaner Pretraining Corpus Curation with Neural Web Scraping

Zhipeng Xu, Zhenghao Liu, Yukun Yan, Zhiyuan Liu, Ge Yu, Chenyan Xiong


Abstract
The web contains large-scale, diverse, and abundant information that satisfies the information-seeking needs of humans. Through meticulous data collection, preprocessing, and curation, webpages can serve as a fundamental data resource for language model pretraining. However, as webpages grow increasingly complex and continue to evolve, rule-based and feature-based web scrapers are becoming inadequate. This paper presents a simple, fast, and effective Neural web Scraper (NeuScraper) that extracts primary, clean text content from webpages. Experimental results show that NeuScraper outperforms baseline scrapers by more than 20%, demonstrating its potential to extract higher-quality data for language model pretraining. All of the code is available at https://github.com/OpenMatch/NeuScraper.
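As a concrete illustration of the approach the abstract describes, below is a minimal Python sketch of block-level content classification: split a page into candidate text blocks and keep the blocks a classifier judges to be primary content. This is not NeuScraper's actual API (see the repository above for that); text_blocks, score_block, and scrape are hypothetical names, and the scoring stub marks where a trained neural classifier would plug in.

    # Illustrative sketch of neural-style block classification for content
    # extraction. NOT NeuScraper's real interface; the classifier is a stub.
    from bs4 import BeautifulSoup

    BOILERPLATE_TAGS = ["script", "style", "nav", "footer", "header", "aside"]
    CONTENT_TAGS = ["p", "li", "h1", "h2", "h3", "td"]

    def text_blocks(html):
        """Split a page into candidate text blocks, one per content element."""
        soup = BeautifulSoup(html, "html.parser")
        for tag in soup.find_all(BOILERPLATE_TAGS):
            tag.decompose()  # drop obvious non-content subtrees up front
        for el in soup.find_all(CONTENT_TAGS):
            text = el.get_text(" ", strip=True)
            if text:
                yield text

    def score_block(text):
        """Hypothetical stand-in for a learned classifier estimating
        P(block is primary content); a real system would encode each
        block with a neural model and apply a trained head."""
        return min(1.0, len(text.split()) / 20.0)  # toy proxy: length

    def scrape(html, threshold=0.5):
        """Keep only the blocks the (stub) classifier accepts."""
        return "\n".join(b for b in text_blocks(html)
                         if score_block(b) >= threshold)

    # Example: nav links are removed structurally; short residue such as
    # a cookie notice is filtered out by the block score.
    page = "<nav>Home | About</nav><p>" + "word " * 30 + "</p><p>Cookie notice</p>"
    print(scrape(page))

The design point the sketch makes is the one the paper argues: decisions about what to keep are made per text block by a model, rather than by hand-written tag or feature rules, so the extractor can adapt to page structures the rules never anticipated.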
Anthology ID:
2024.acl-short.72
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
802–812
URL:
https://aclanthology.org/2024.acl-short.72
DOI:
10.18653/v1/2024.acl-short.72
Cite (ACL):
Zhipeng Xu, Zhenghao Liu, Yukun Yan, Zhiyuan Liu, Ge Yu, and Chenyan Xiong. 2024. Cleaner Pretraining Corpus Curation with Neural Web Scraping. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 802–812, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Cleaner Pretraining Corpus Curation with Neural Web Scraping (Xu et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-short.72.pdf