DOI: 10.1145/3430665.3456320
Research article
Open access

Are Working Habits Different Between Well-Performing and at-Risk Students in Online Project-Based Courses?

Published: 26 June 2021

Abstract

We analyze differences in working habits between well-performing and at-risk students using highly granular data collected over two semesters of an online, project-based, upper-level course on cloud computing at a US institution of higher education. Such differentiating metrics may provide deeper insights than interim grades, which are often the only quantifiable data captured and available to an instructor as a proxy for students' learning. Interim grades provide little insight into students' broader work habits and may mask unsustainable learning strategies that result in shallow learning or quickly forgotten skills and knowledge. The adoption of technology-enhanced learning tools for course delivery, automatic feedback, and grading enables data-informed insight into, and reflection on, students' working habits. This data could allow the detection of early signs of under-prepared students or students in crisis. We empirically assess which working habits, if any, differ between well-performing and at-risk students. From clickstream and other activity data, we derive 22 metrics, such as time spent reading project write-ups, the timing of starting and finishing work, and break-taking. For each metric we also calculate two measures of consistency, a coefficient of variation and the variance of a student's ranking over the semester, as well as a measure of a student's outlier behavior. Using Z-tests and Kolmogorov-Smirnov tests, we confirm differences in multiple behavior patterns. Notably, our data suggest that well-performing students start and finish working on a project earlier than at-risk students, but they also tend to make fewer submissions, which indicates that they are more thoughtful about the feedback they receive.
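The analysis pipeline described above (per-metric group comparison plus consistency measures) can be illustrated concretely. The following Python sketch is a minimal, hypothetical reconstruction, not the authors' code: the group sizes, the example metric (hours before the deadline at which a student starts a project), and all numbers are invented, and SciPy's ks_2samp together with a hand-rolled two-sample Z-test stand in for the tests named in the abstract.

# Hypothetical sketch: compare one working-habit metric between
# well-performing and at-risk students. The data layout (rows: students,
# columns: the course's projects) and all values are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic per-student, per-project values of "hours before the deadline
# at which work started". Real values would come from clickstream logs.
well_performing = rng.normal(loc=120, scale=30, size=(40, 5))
at_risk = rng.normal(loc=48, scale=40, size=(15, 5))

def coefficient_of_variation(per_project_values):
    # Per-student consistency: std / |mean| across projects (lower = steadier).
    means = per_project_values.mean(axis=1)
    stds = per_project_values.std(axis=1, ddof=1)
    return stds / np.abs(means)

wp_cv = coefficient_of_variation(well_performing)
ar_cv = coefficient_of_variation(at_risk)

# Variance of a student's ranking over the semester, the second consistency
# notion: rank all students within each project, then take the per-student
# variance of those ranks.
all_students = np.vstack([well_performing, at_risk])
ranks = stats.rankdata(all_students, axis=0)   # rank within each project
rank_variance = ranks.var(axis=1, ddof=1)

# Compare semester-average metric values between the two groups.
wp_mean = well_performing.mean(axis=1)
ar_mean = at_risk.mean(axis=1)

# Large-sample two-sample Z-test on group means.
se = np.sqrt(wp_mean.var(ddof=1) / len(wp_mean) +
             ar_mean.var(ddof=1) / len(ar_mean))
z = (wp_mean.mean() - ar_mean.mean()) / se
p_z = 2 * stats.norm.sf(abs(z))

# Kolmogorov-Smirnov test compares the whole distributions, not just means.
ks_stat, p_ks = stats.ks_2samp(wp_mean, ar_mean)

print(f"Z = {z:.2f} (p = {p_z:.4f}); KS = {ks_stat:.2f} (p = {p_ks:.4f})")

With 22 metrics (and their consistency variants) each tested for group differences, the resulting p-values would in practice need a multiple-comparison correction such as the Holm procedure before any habit is declared significantly different.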




    Published In

    ITiCSE '21: Proceedings of the 26th ACM Conference on Innovation and Technology in Computer Science Education V. 1
    June 2021
    611 pages
ISBN: 9781450382144
DOI: 10.1145/3430665
This work is licensed under a Creative Commons Attribution 4.0 International License.


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. educational data mining
    2. online education
    3. student behavior
    4. student performance
    5. work habits

    Qualifiers

    • Research-article

    Conference

    ITiCSE 2021

    Acceptance Rates

Overall acceptance rate: 552 of 1,613 submissions (34%)
