
Distantly-Supervised Long-Tailed Relation Extraction Using Constraint Graphs

Published: 01 July 2023

Abstract

Label noise and long-tailed distributions are two major challenges in distantly supervised relation extraction. Recent studies have made great progress on denoising, but have paid little attention to the problem of long-tailed relations. In this paper, we introduce a constraint graph to model the dependencies between relation labels. On top of that, we further propose a novel constraint graph-based relation extraction framework (CGRE) to handle the two challenges simultaneously. CGRE employs graph convolutional networks to propagate information from data-rich relation nodes to data-poor relation nodes, and thus boosts the representation learning of long-tailed relations. To further improve noise immunity, a constraint-aware attention module is designed in CGRE to integrate the constraint information. Extensive experimental results indicate that CGRE achieves significant improvements over previous methods for both denoising and long-tailed relation extraction.
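The core idea above — propagating information over a constraint graph so that long-tailed relations borrow signal from data-rich ones — can be illustrated with a minimal graph-convolution sketch. This is not the authors' implementation: the toy graph, node features, and layer sizes below are assumptions for illustration, and the constraint-aware attention module is omitted. Each relation node is linked to the entity-type nodes it constrains, so relations sharing type constraints become two-hop neighbors and exchange information through the shared type nodes.

```python
import numpy as np

# Toy constraint graph: nodes 0-1 are entity-type nodes (e.g. "person",
# "location"); nodes 2-4 are relation nodes, each linked to the entity
# types it constrains. Node 4 plays the role of a long-tailed relation
# that shares both type constraints with the data-rich relation node 2,
# so information reaches it through the shared type nodes.
A = np.array([
    [0, 0, 1, 1, 1],   # type "person"   <-> relations 2, 3, 4
    [0, 0, 1, 0, 1],   # type "location" <-> relations 2, 4
    [1, 1, 0, 0, 0],   # relation 2 (data-rich)
    [1, 0, 0, 0, 0],   # relation 3
    [1, 1, 0, 0, 0],   # relation 4 (data-poor, long-tailed)
], dtype=float)

def gcn_layer(A, H, W):
    """One graph-convolution step: ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))                   # initial node embeddings
W = rng.normal(size=(8, 8))                   # layer weights

# After one layer, the data-poor relation's embedding aggregates the
# type-node embeddings, which in turn carry the data-rich relation's signal.
H1 = gcn_layer(A, H, W)
print(H1.shape)  # (5, 8)
```

In the paper's setting these propagated relation representations would then feed the constraint-aware attention over sentence bags; here they simply show the propagation mechanism in isolation.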


Cited By

  • (2024) "Research on Distant Supervision Relation Extraction based on Attention Graph Enhancement and Dynamic Loss," in Proc. Int. Conf. Modeling, Natural Language Processing and Machine Learning, pp. 139–146, May 2024, doi: 10.1145/3677779.3677802.
  • (2024) "FPrompt-PLM: Flexible-Prompt on Pretrained Language Model for Continual Few-Shot Relation Extraction," IEEE Trans. Knowl. Data Eng., vol. 36, no. 12, pp. 8267–8282, Dec. 2024, doi: 10.1109/TKDE.2024.3419117.
  • (2024) "Sentence Bag Graph Formulation for Biomedical Distant Supervision Relation Extraction," IEEE Trans. Knowl. Data Eng., vol. 36, no. 9, pp. 4890–4903, Sep. 2024, doi: 10.1109/TKDE.2024.3377229.
  • (2024) "A More Balanced Loss-Reweighting Method for Long-Tailed Traffic Sign Detection and Recognition," IEEE Trans. Intell. Transp. Syst., vol. 25, no. 12, pp. 20729–20740, Sep. 2024, doi: 10.1109/TITS.2024.3456232.


Published In

IEEE Transactions on Knowledge and Data Engineering, Volume 35, Issue 7, July 2023, 1090 pages.

Publisher: IEEE Educational Activities Department, United States.
