DOI: 10.1145/3568364.3568378
Chinese Short Text Entity Linking Model Based on PET

Published: 23 December 2022

Abstract

Existing entity linking models for Chinese short text are few, and short texts are hampered by missing context and noise, leaving considerable room to improve accuracy. This paper proposes a Chinese short text entity linking model that encodes mention and entity representations with Pattern-Exploiting Training (PET) and learns the latent relationships between entities in the knowledge base through contrastive learning. Experiments on the DuEL 2.0 dataset show improved results.
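The abstract couples PET-style mention/entity encoding with a contrastive objective, but the paper's implementation is not reproduced on this page. As a rough, hypothetical sketch only (the embeddings, entity names, and function names below are invented for illustration, not taken from the paper), contrastive mention–entity linking can be reduced to scoring a mention embedding against candidate entity embeddings and training with an InfoNCE-style loss:

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def link_entity(mention_vec, candidates):
    """Pick the knowledge-base candidate whose embedding best matches the mention."""
    scores = {name: cosine(mention_vec, vec) for name, vec in candidates.items()}
    return max(scores, key=scores.get)

def info_nce_loss(mention_vec, positive, negatives, temperature=0.1):
    """Contrastive objective: pull the gold entity close, push other candidates away."""
    pos = math.exp(cosine(mention_vec, positive) / temperature)
    neg = sum(math.exp(cosine(mention_vec, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))

# Toy vectors standing in for PET-encoded mention/entity representations.
mention = [1.0, 0.0, 0.0]
candidates = {
    "apple (fruit)": [0.9, 0.1, 0.0],
    "Apple (company)": [0.0, 1.0, 0.0],
}
best = link_entity(mention, candidates)  # the geometrically closer candidate wins
loss = info_nce_loss(mention, candidates["apple (fruit)"],
                     [candidates["Apple (company)"]])
```

In a real system the toy vectors would come from a pretrained encoder, and the temperature and negative-sampling strategy would be tuned; this sketch only shows the shape of the scoring and loss computation.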


    Published In

    WSSE '22: Proceedings of the 4th World Symposium on Software Engineering
    September 2022
    187 pages
    ISBN:9781450396950
    DOI:10.1145/3568364
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. PET
    2. deep learning
    3. entity linking
    4. short text

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Science and Technology Project of the Headquarters of State Grid Corporation of China

    Conference

    WSSE 2022


