Multidimensional relational knowledge embedding for coreference resolution

Published: 07 November 2023

Abstract

Coreference resolution models that draw on a knowledge base currently face two main problems. First, knowledge is complex and diverse, and it is difficult to select knowledge that appropriately complements the conceptual relationships between entities. Second, it is difficult to integrate the retrieved external knowledge into the model. In this paper, we propose a multidimensional relational knowledge (MDR) model for coreference resolution that extends in both high-dimensional and low-dimensional directions from the antecedent words to be resolved: it abstracts upward to high-dimensional concepts that represent the essential relations between things, and diffuses downward to words within the sentence so that the knowledge stays close to the sentence meaning. This provides the model with more generalized multidimensional relational knowledge and higher sentence relevance. To let the model make full use of this knowledge, we adjust the attention mechanism so that external knowledge guides changes in intra-sentence relations, with the degree of knowledge dominance tuned through the back-propagation of the neural network. We also design a knowledge noise-reduction module based on a multiplexed mixing approach, which dilutes the proportion of knowledge in the total information and thereby reduces noise. We evaluate the MDR model on the Definite Pronoun Resolution and Winograd Schema Challenge datasets, comparing it with existing models and conducting ablation experiments, and we analyze the role of knowledge in individual test cases, demonstrating that multidimensional relational knowledge improves the model's coreference resolution ability.
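The abstract describes two mechanisms concretely enough to sketch: external knowledge biases the intra-sentence attention, a learnable gate sets the degree of knowledge dominance (updated by back-propagation), and a mixing step dilutes the knowledge share of the fused representation to limit noise. The PyTorch sketch below is a hypothetical illustration of that idea only; the module name, the gate `alpha`, the mixing weight `beta`, and all tensor shapes are assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeGuidedAttention(nn.Module):
    """Illustrative sketch (not the paper's code): external knowledge
    biases intra-sentence attention; a learnable gate controls how
    dominant the knowledge bias is, and a mixing weight keeps the
    knowledge share of the fused output small to reduce noise."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # Degree of knowledge dominance, adjusted by back-propagation.
        self.alpha = nn.Parameter(torch.tensor(0.5))
        # Mixing weight that dilutes knowledge in the total information.
        self.beta = nn.Parameter(torch.tensor(0.2))

    def forward(self, tokens: torch.Tensor, knowledge: torch.Tensor) -> torch.Tensor:
        # tokens:    (batch, seq_len, d_model) sentence-token encodings
        # knowledge: (batch, seq_len, d_model) token-aligned knowledge embeddings
        d = tokens.size(-1)
        scores = self.q(tokens) @ self.k(tokens).transpose(-2, -1) / d ** 0.5
        # Knowledge-derived similarity guides the intra-sentence relations.
        k_bias = knowledge @ knowledge.transpose(-2, -1) / d ** 0.5
        attn = F.softmax(scores + self.alpha * k_bias, dim=-1)
        context = attn @ self.v(tokens)
        # Dilute the proportion of knowledge in the fused representation.
        return (1 - self.beta) * context + self.beta * knowledge

# Shape check with random inputs (batch 2, sequence 8, width 16).
layer = KnowledgeGuidedAttention(16)
out = layer(torch.randn(2, 8, 16), torch.randn(2, 8, 16))
print(out.shape)  # torch.Size([2, 8, 16])
```

In this reading, back-propagation can drive `alpha` toward zero when the knowledge bias hurts, while a small `beta` keeps knowledge a minority share of the output; both behaviors are assumptions about how the described adjustment might work, and other fusions (e.g., per-token gating) would fit the abstract equally well.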



Published In

Neural Computing and Applications, Volume 36, Issue 4, Feb 2024, 593 pages

Publisher

Springer-Verlag, Berlin, Heidelberg

Publication History

Published: 07 November 2023
Accepted: 16 October 2023
Received: 17 January 2023

Author Tags

1. Coreference resolution
2. Multidimensional relational knowledge
3. Knowledge embedding
4. Attention mechanism

Qualifiers

• Research-article
