DOI: 10.1007/978-981-97-5492-2_16
Article

Reinforced Subject-Aware Graph Neural Network for Related Work Generation

Published: 16 August 2024

Abstract

The objective of automatic related work generation is to gather the primary contributions of relevant prior work in a research field and provide a comprehensive analysis, which assists authors in drafting a related work section efficiently, saving them time and effort. The unique characteristics of related work generation make the task challenging. However, most existing abstractive related work generation methods operate at a coarse granularity, so the complex relationships and interactions among multiple papers are not effectively modeled. In this study, we propose an abstractive Reinforced Subject-aware Graph Neural Network for Related work Generation (RSG) to explore the relationships between the target paper and the related reference papers based on the writing style of the related work section. Since these relationships are often not explicit, we first leverage the capability of a large language model (LLM) to extract keyphrases from the given papers. Building upon this, we introduce a keyphrase-guided selective encoding mechanism to augment the representations of the given papers. Treating the keyphrases as the subjects discussed within the papers, we propose a subject-aware graph that models the relationships between the papers and the subjects through a hierarchical structure. In the decoding phase, we extend the transformer decoder with a keyphrase-augmented attention mechanism to integrate the various sources of information into the generation process. Extensive experiments on two benchmark datasets demonstrate the effectiveness of the proposed model.
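The keyphrase-guided selective encoding described above can be illustrated with a minimal NumPy sketch of a selective gate: each token representation is scaled by a sigmoid gate computed from the token and a pooled keyphrase vector. The parameter names (Wh, Wk, b) and the mean-pooling of keyphrase embeddings are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def selective_encode(H, k, Wh, Wk, b):
    """Keyphrase-guided selective gate (illustrative sketch).

    H  : (n, d) token representations of one paper
    k  : (d,)   pooled keyphrase representation
    Wh, Wk : (d, d) gate projections (hypothetical names)
    b  : (d,)   gate bias

    Each token h_i is rescaled elementwise by
    g_i = sigmoid(Wh @ h_i + Wk @ k + b), so content related to the
    keyphrases is emphasized and the rest is attenuated.
    """
    gates = sigmoid(H @ Wh.T + k @ Wk.T + b)  # (n, d), values in (0, 1)
    return gates * H

# Toy usage with random parameters.
rng = np.random.default_rng(0)
n, d = 4, 8
H = rng.normal(size=(n, d))                  # token representations
K = rng.normal(size=(3, d))                  # 3 keyphrase embeddings
k = K.mean(axis=0)                           # pooled keyphrase vector
Wh = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
b = np.zeros(d)
H_sel = selective_encode(H, k, Wh, Wk, b)    # gated representations
```

Because the gate values lie strictly in (0, 1), the gated representations are always attenuated versions of the originals; in the full model these would feed the subject-aware graph rather than be used directly.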



Published In

Knowledge Science, Engineering and Management: 17th International Conference, KSEM 2024, Birmingham, UK, August 16–18, 2024, Proceedings, Part I
Aug 2024
461 pages
ISBN:978-981-97-5491-5
DOI:10.1007/978-981-97-5492-2
  • Editors:
  • Cungeng Cao,
  • Huajun Chen,
  • Liang Zhao,
  • Junaid Arshad,
  • Taufiq Asyhari,
  • Yonghao Wang

Publisher

Springer-Verlag

Berlin, Heidelberg

Author Tags

  1. Related work generation
  2. Keyphrases
  3. Subject-aware graph
