A Survey of Knowledge Enhanced Pre-trained Language Models
Publisher: Association for Computing Machinery, New York, NY, United States