DOI: 10.1145/3357384.3357997

A Fine-grained and Noise-aware Method for Neural Relation Extraction

Published: 03 November 2019

Abstract

    Distant supervision is an efficient way to generate large-scale training data for relation extraction without human effort. However, the automatically annotated labels are problematic, which can be summarized as the multi-instance multi-label problem and the coarse-grained (bag-level) supervision signal. To address these problems, we propose two reasonable assumptions and employ reinforcement learning to capture the expressive sentence for each relation mentioned in a bag. More specifically, we extend the original expressed-at-least-once assumption to the multi-label level and introduce a novel express-at-most-one assumption. In addition, we design a fine-grained reward function and model the sentence-selection process as an auction, in which the different relations labeled for a bag compete for possession of a specific sentence based on its expressiveness. In this way, our model dynamically adapts itself and eventually implements an accurate one-to-one mapping from each relation label to its chosen expressive sentence, which then serves as a training instance for the extractor. Experimental results on a public dataset demonstrate that our model consistently and substantially outperforms current state-of-the-art methods for relation extraction.
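    The auction-style selection described in the abstract can be sketched as follows. This is an illustrative greedy approximation only: the relation names and expressiveness scores below are hypothetical, and in the paper the selection is learned with reinforcement learning rather than read from a fixed score matrix.

    ```python
    def auction_assign(scores):
        """Greedily build a one-to-one mapping relation -> sentence.

        scores: dict mapping (relation, sentence_index) -> expressiveness score.
        Returns a dict mapping each relation to the sentence it "wins".
        """
        assignment = {}
        taken = set()
        # Highest-scoring (relation, sentence) pairs win the auction first;
        # each sentence can be possessed by at most one relation.
        for (rel, sent), _ in sorted(scores.items(), key=lambda kv: -kv[1]):
            if rel not in assignment and sent not in taken:
                assignment[rel] = sent
                taken.add(sent)
        return assignment

    # A bag of 3 sentences for one entity pair, labeled with 2 relations
    # (hypothetical relations and scores, for illustration only).
    scores = {
        ("founder_of", 0): 0.9, ("founder_of", 1): 0.2, ("founder_of", 2): 0.4,
        ("ceo_of", 0): 0.8,     ("ceo_of", 1): 0.7,     ("ceo_of", 2): 0.1,
    }
    print(auction_assign(scores))  # {'founder_of': 0, 'ceo_of': 1}
    ```

    Note how "ceo_of" scores highest on sentence 0 but loses that auction to "founder_of" and settles for sentence 1, yielding the one-to-one relation-to-sentence mapping the abstract describes.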




      Published In

      CIKM '19: Proceedings of the 28th ACM International Conference on Information and Knowledge Management
      November 2019
      3373 pages
      ISBN:9781450369763
      DOI:10.1145/3357384
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. coarse-grained supervised signal
      2. distant supervision
      3. multi-instance multi-label
      4. reinforcement learning
      5. relation extraction

      Qualifiers

      • Research-article

      Funding Sources

      • Natural Science Foundation of China
      • Australian Research Council

      Conference

CIKM '19

      Acceptance Rates

      CIKM '19 Paper Acceptance Rate 202 of 1,031 submissions, 20%;
      Overall Acceptance Rate 1,861 of 8,427 submissions, 22%


      Cited By

      • (2023) Evidence Reasoning and Curriculum Learning for Document-level Relation Extraction. IEEE Transactions on Knowledge and Data Engineering, 1-14. DOI: 10.1109/TKDE.2023.3292974. Online publication date: 2023.
      • (2022) Evidence-aware Document-level Relation Extraction. Proceedings of the 31st ACM International Conference on Information & Knowledge Management, 2311-2320. DOI: 10.1145/3511808.3557313. Online publication date: 17-Oct-2022.
      • (2022) MiDTD: A Simple and Effective Distillation Framework for Distantly Supervised Relation Extraction. ACM Transactions on Information Systems, 40(4), 1-32. DOI: 10.1145/3503917. Online publication date: 11-Jan-2022.
      • (2022) Information Resilience: the nexus of responsible and agile approaches to information use. The VLDB Journal, 31(5), 1059-1084. DOI: 10.1007/s00778-021-00720-2. Online publication date: 16-Jan-2022.
      • (2021) NS-Hunter: BERT-Cloze Based Semantic Denoising for Distantly Supervised Relation Classification. Chinese Computational Linguistics, 324-340. DOI: 10.1007/978-3-030-84186-7_22. Online publication date: 8-Aug-2021.
      • (2020) Temporal knowledge extraction from large-scale text corpus. World Wide Web, 24(1), 135-156. DOI: 10.1007/s11280-020-00836-5. Online publication date: 2-Sep-2020.
      • (2020) A Noise Adaptive Model for Distantly Supervised Relation Extraction. Natural Language Processing and Chinese Computing, 519-530. DOI: 10.1007/978-3-030-60450-9_41. Online publication date: 2-Oct-2020.
