
Heterogeneous Evolution Network Embedding with Temporal Extension for Intelligent Tutoring Systems

Published: 08 November 2023

Abstract

Graph embedding (GE) aims to learn low-dimensional node representations while preserving a graph's structural and semantic properties. Intelligent tutoring systems (ITS) represent a notable achievement in the fusion of AI and education, and modeling ITS with GE can improve their performance on prediction and annotation tasks. Current GE techniques, whether designed for heterogeneous or dynamic graphs, struggle to model ITS data efficiently: graph embeddings for ITS should preserve the data's semidynamic, independent, and smooth characteristics. This article introduces a heterogeneous evolution network (HEN) to represent the entities and relations within an ITS, together with a temporal extension graph neural network (TEGNN) that models both evolving and static nodes in the HEN. In TEGNN, dynamic nodes are first evolved over time through temporal extension (TE), yielding an accurate depiction of each learner's implicit state at every time step. We then propose a stochastic temporal pooling (STP) strategy to estimate the embedding sets of all evolving nodes, which improves both model efficiency and usability. Finally, a heterogeneous aggregation network extracts heterogeneous features from the HEN, using node-level and relation-level attention mechanisms to produce aggregated node features. To demonstrate the advantages of TEGNN, we conduct experiments on several real ITS datasets and show that our method significantly outperforms state-of-the-art approaches. The experiments confirm that TE is an effective framework for modeling temporal information in GE, and that STP not only accelerates training but also improves the resulting accuracy.
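
To make the pipeline described in the abstract concrete, the sketch below is a minimal, hypothetical PyTorch rendering of its three ingredients: a temporal extension (TE) module that evolves dynamic (learner) node states over time, a stochastic temporal pooling (STP) step that samples time steps to summarize each evolving node, and a node-level plus relation-level attention aggregator over heterogeneous neighborhoods. All module names, shapes, and concrete choices (a GRU for TE, mean-of-sampled-steps for STP, linear attention scorers) are assumptions made for illustration and are not the authors' implementation.

```python
# Hypothetical sketch of the TE / STP / heterogeneous-attention ideas
# described in the abstract; shapes and design choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalExtension(nn.Module):
    """Evolve dynamic node embeddings across time steps (GRU assumed)."""
    def __init__(self, dim):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, x_seq):                  # x_seq: (num_nodes, T, dim)
        states, _ = self.gru(x_seq)            # implicit state at every time step
        return states                          # (num_nodes, T, dim)


def stochastic_temporal_pooling(states, k=4):
    """Sample k time steps per node and average them (one plausible STP variant)."""
    num_nodes, T, dim = states.shape
    idx = torch.randint(0, T, (num_nodes, k))                     # random time-step indices
    sampled = states[torch.arange(num_nodes).unsqueeze(1), idx]   # (num_nodes, k, dim)
    return sampled.mean(dim=1)                                    # (num_nodes, dim)


class HeteroAttentionAggregator(nn.Module):
    """Node-level attention within each relation, then relation-level attention."""
    def __init__(self, dim):
        super().__init__()
        self.node_att = nn.Linear(2 * dim, 1)  # scores a (target, neighbor) pair
        self.rel_att = nn.Linear(dim, 1)       # scores each relation summary

    def forward(self, target, neighbors_by_rel):
        # target: (dim,); neighbors_by_rel: list of (num_neighbors, dim) tensors
        rel_summaries = []
        for nbrs in neighbors_by_rel:
            pair = torch.cat([target.expand_as(nbrs), nbrs], dim=-1)
            alpha = F.softmax(self.node_att(pair).squeeze(-1), dim=0)    # node-level weights
            rel_summaries.append((alpha.unsqueeze(-1) * nbrs).sum(dim=0))
        rel_stack = torch.stack(rel_summaries)                           # (num_rels, dim)
        beta = F.softmax(self.rel_att(rel_stack).squeeze(-1), dim=0)     # relation-level weights
        return (beta.unsqueeze(-1) * rel_stack).sum(dim=0)               # aggregated embedding


if __name__ == "__main__":
    dim, T, num_learners = 16, 10, 5
    te = TemporalExtension(dim)
    states = te(torch.randn(num_learners, T, dim))   # evolve learner states over time
    pooled = stochastic_temporal_pooling(states)     # (num_learners, dim)
    agg = HeteroAttentionAggregator(dim)
    out = agg(pooled[0], [torch.randn(3, dim), torch.randn(4, dim)])
    print(out.shape)                                 # torch.Size([16])
```

Sampling only a few time steps per evolving node is what makes the pooling "stochastic"; the paper reports that this speeds up training while improving accuracy, a claim the sketch does not attempt to reproduce.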




Published In

ACM Transactions on Information Systems, Volume 42, Issue 2
March 2024
897 pages
EISSN: 1558-2868
DOI: 10.1145/3618075

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 November 2023
Online AM: 29 August 2023
Accepted: 11 August 2023
Revised: 10 July 2023
Received: 25 November 2022
Published in TOIS Volume 42, Issue 2


Author Tags

  1. Intelligent education
  2. knowledge tracing
  3. graph embedding
  4. heterogeneous information network
  5. dynamic graph

Qualifiers

  • Research-article

Funding Sources

  • National Key R&D Program of China
  • National Natural Science Foundation of China
  • China Postdoctoral Science Foundation
  • Hubei Provincial Natural Science Foundation of China
  • Fundamental Research Funds for the Central Universities

Article Metrics

  • Downloads (Last 12 months): 411
  • Downloads (Last 6 weeks): 46
Reflects downloads up to 12 Nov 2024

Citations

Cited By

  • A survey on feature extraction and learning techniques for link prediction in homogeneous and heterogeneous complex networks. Artificial Intelligence Review 57, 12 (2024). DOI: 10.1007/s10462-024-10998-7. Online publication date: 28 October 2024.
