DOI: 10.5555/2886521.2886563
Article

Identifying at-risk students in massive open online courses

Published: 25 January 2015
  Abstract

    Massive Open Online Courses (MOOCs) have received widespread attention for their potential to scale higher education, with multiple platforms such as Coursera, edX, and Udacity recently appearing. Despite their successes, a major problem faced by MOOCs is low completion rates. In this paper, we explore the accurate early identification of students who are at risk of not completing courses. We build predictive models weekly, over multiple offerings of a course. Furthermore, we envision student interventions that present meaningful probabilities of failure, enacted only for marginal students. To be effective, predicted probabilities must be both well-calibrated and smoothed across weeks. Based on logistic regression, we propose two transfer learning algorithms to trade off smoothness and accuracy by adding a regularization term that minimizes the difference in failure probabilities between consecutive weeks. Experimental results on two offerings of a Coursera MOOC establish the effectiveness of our algorithms.
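
    As a concrete illustration of the smoothing idea described in the abstract, the sketch below fits a week-t logistic regression whose loss adds a penalty on the squared difference between this week's predicted failure probabilities and the previous week's, so a student's risk estimate changes gradually across weeks. This is a minimal reading of the abstract, not the authors' exact algorithm: the feature matrices, the squared-difference form of the penalty, and the hyperparameters lam and l2 are illustrative assumptions.

    ```python
    # Minimal sketch (assumed formulation): logistic regression for week t with a
    # cross-week smoothness penalty on predicted failure probabilities.
    import numpy as np
    from scipy.optimize import minimize

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_week(X_t, y, p_prev, lam=1.0, l2=1e-3):
        """Return week-t weights minimizing logistic loss + lam * smoothness + l2 * ||w||^2.

        X_t    : (n_students, n_features) features observed up to week t
        y      : (n_students,) 1 = did not complete the course, 0 = completed
        p_prev : (n_students,) failure probabilities predicted in week t-1
        """
        n, d = X_t.shape

        def objective(w):
            p = sigmoid(X_t @ w)
            eps = 1e-12
            log_loss = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
            smooth = np.mean((p - p_prev) ** 2)  # discourage week-to-week jumps in risk
            return log_loss + lam * smooth + l2 * np.dot(w, w)

        return minimize(objective, np.zeros(d), method="L-BFGS-B").x

    # Toy usage on synthetic data for two consecutive weeks.
    rng = np.random.default_rng(0)
    X1 = rng.normal(size=(200, 5))
    X2 = X1 + 0.1 * rng.normal(size=(200, 5))                     # week-2 features drift slightly
    y = (rng.random(200) < sigmoid(X1 @ np.array([1.0, -1.0, 0.5, 0.0, 0.0]))).astype(float)

    w1 = fit_week(X1, y, p_prev=np.full(200, y.mean()), lam=0.0)  # week 1: no previous week
    w2 = fit_week(X2, y, p_prev=sigmoid(X1 @ w1), lam=1.0)        # week 2: smoothed toward week 1
    print(np.round(sigmoid(X2 @ w2)[:5], 3))                      # smoothed failure probabilities
    ```

    The penalty only addresses the smoothness requirement; the abstract also calls for well-calibrated probabilities, which would typically be obtained with a separate calibration step (e.g. Platt scaling) fit on held-out data.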




      Published In

      AAAI'15: Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence
      January 2015
      4331 pages
      ISBN: 0262511290

      Sponsors

      • Association for the Advancement of Artificial Intelligence

      Publisher

      AAAI Press



      Cited By

      • (2022) Dropout Rate Prediction for MOOC based on Inceptiontime Model. Proceedings of the 7th International Conference on Distance Education and Learning, 54-59. DOI: 10.1145/3543321.3543330. Online publication date: 20-May-2022.
      • (2020) A Survey of Machine Learning Approaches for Student Dropout Prediction in Online Courses. ACM Computing Surveys 53(3), 1-34. DOI: 10.1145/3388792. Online publication date: 28-May-2020.
      • (2020) Challenges and Solutions to the Student Dropout Prediction Problem in Online Courses. Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 3513-3514. DOI: 10.1145/3340531.3412172. Online publication date: 19-Oct-2020.
      • (2019) Predict and Intervene. Proceedings of the Sixth (2019) ACM Conference on Learning @ Scale, 1-9. DOI: 10.1145/3330430.3333634. Online publication date: 24-Jun-2019.
      • (2019) Predicting learning status in MOOCs using LSTM. Proceedings of the ACM Turing Celebration Conference - China, 1-5. DOI: 10.1145/3321408.3322855. Online publication date: 17-May-2019.
      • (2019) Effective Feature Learning with Unsupervised Learning for Improving the Predictive Models in Massive Open Online Courses. Proceedings of the 9th International Conference on Learning Analytics & Knowledge, 135-144. DOI: 10.1145/3303772.3303795. Online publication date: 4-Mar-2019.
      • (2019) Using Detailed Access Trajectories for Learning Behavior Analysis. Proceedings of the 9th International Conference on Learning Analytics & Knowledge, 290-299. DOI: 10.1145/3303772.3303781. Online publication date: 4-Mar-2019.
      • (2019) Predicting Academic Performance for College Students. ACM Transactions on Intelligent Systems and Technology 10(3), 1-21. DOI: 10.1145/3299087. Online publication date: 7-May-2019.
      • (2019) Where is the Human? Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, 1-9. DOI: 10.1145/3290607.3299002. Online publication date: 2-May-2019.
      • (2019) Using machine learning to predict student difficulties from learning session data. Artificial Intelligence Review 52(1), 381-407. DOI: 10.1007/s10462-018-9620-8. Online publication date: 1-Jun-2019.
