DOI: 10.1145/3605098.3636062

Contextual Embeddings and Graph Convolutional Networks for Concept Prerequisite Learning

Published: 21 May 2024

Abstract

Concept prerequisite learning (CPL) plays a crucial role in education: its objective is to predict prerequisite relations between concepts. In this paper, we present a new approach to CPL using Sentence Transformers and Relational Graph Convolutional Networks (R-GCNs). The approach creates concept embeddings from single-sentence Wikipedia definitions using a Sentence Transformer. These embeddings serve as the input feature matrix for the R-GCN, alongside a graph structure that encodes prerequisites and non-prerequisites as distinct link types. The R-GCN is further optimized jointly on CPL and concept domain classification, so that prerequisite prediction generalizes better to unseen domains. Extensive experiments on the AL-CPL dataset show the effectiveness of our approach in both in-domain and cross-domain settings, where it outperforms state-of-the-art (SOTA) methods. Finally, we introduce a novel data split algorithm for this task that addresses a methodological issue found in previous studies; the new split makes CPL more challenging, but also more realistic, as it excludes pairs that follow by simple transitivity.
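To make the pipeline concrete, here is a minimal sketch in Python, assuming the sentence-transformers and PyTorch Geometric libraries. The checkpoint name (all-mpnet-base-v2), hidden sizes, head design, and loss weighting are illustrative assumptions, not details taken from the paper.

    import torch
    import torch.nn.functional as F
    from sentence_transformers import SentenceTransformer
    from torch_geometric.nn import RGCNConv

    # Concept embeddings from single-sentence Wikipedia definitions.
    definitions = [
        "A matrix is a rectangular array of numbers arranged in rows and columns.",
        "The determinant is a scalar value computed from a square matrix.",
        "An eigenvalue is a scalar associated with a linear transformation.",
    ]
    encoder = SentenceTransformer("all-mpnet-base-v2")  # assumed checkpoint
    x = encoder.encode(definitions, convert_to_tensor=True)  # [3, 768]

    # Graph with two link types: 0 = prerequisite, 1 = non-prerequisite.
    edge_index = torch.tensor([[0, 0], [1, 2]])
    edge_type = torch.tensor([0, 1])

    class CPLModel(torch.nn.Module):
        # Two R-GCN layers followed by two heads trained jointly:
        # pairwise prerequisite prediction and per-concept domain classification.
        def __init__(self, in_dim=768, hid=128, num_domains=4):
            super().__init__()
            self.conv1 = RGCNConv(in_dim, hid, num_relations=2)
            self.conv2 = RGCNConv(hid, hid, num_relations=2)
            self.link_head = torch.nn.Linear(2 * hid, 2)  # prerequisite vs. not
            self.domain_head = torch.nn.Linear(hid, num_domains)

        def forward(self, x, edge_index, edge_type, pairs):
            h = F.relu(self.conv1(x, edge_index, edge_type))
            h = self.conv2(h, edge_index, edge_type)
            # Concatenate the two concept representations of each candidate pair.
            pair_repr = torch.cat([h[pairs[0]], h[pairs[1]]], dim=-1)
            return self.link_head(pair_repr), self.domain_head(h)

    model = CPLModel()
    pairs = torch.tensor([[0], [1]])  # query: is concept 0 a prerequisite of 1?
    link_logits, domain_logits = model(x, edge_index, edge_type, pairs)

    # Joint objective (the 0.5 weighting is an assumption):
    # loss = F.cross_entropy(link_logits, link_labels) \
    #        + 0.5 * F.cross_entropy(domain_logits, domain_labels)

The transitivity-aware split can be sketched in the same spirit: any test pair already implied by the transitive closure of the training prerequisite edges is excluded. This illustrates the principle, not the paper's exact algorithm.

    import networkx as nx

    def transitivity_aware_split(train_pairs, test_pairs):
        # Drop any test pair (a, c) implied by training edges
        # (a, b) and (b, c) via transitivity.
        closure = nx.transitive_closure(nx.DiGraph(train_pairs))
        return [p for p in test_pairs if not closure.has_edge(*p)]

    train = [("matrix", "determinant"), ("determinant", "eigenvalue")]
    test = [("matrix", "eigenvalue"), ("matrix", "rank")]
    print(transitivity_aware_split(train, test))  # [("matrix", "rank")]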


      Published In

SAC '24: Proceedings of the 39th ACM/SIGAPP Symposium on Applied Computing
April 2024, 1898 pages
ISBN: 9798400702433
DOI: 10.1145/3605098
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. concept prerequisite relation
      2. large language models
      3. sentence transformers
      4. graph convolutional networks

      Qualifiers

      • Research-article

      Conference

      SAC '24

      Acceptance Rates

Overall acceptance rate: 1,650 of 6,669 submissions (25%)
