DOI: 10.1145/3649217.3653561
research-article
Open access

Combining Local Testing with Automatic Commits: Benefits for Progress Tracking and CS2 Students' Learning Experience

Published: 03 July 2024

Abstract

Many instructors in introductory programming courses experience high dropout and failure rates. Identifying struggling students early is a prerequisite for addressing this problem. To this end, instructors and learning analytics researchers may leverage version control by analyzing students' commit histories. This approach relies on frequent pushes to the version control platform, which many instructors incentivize by offering test results each time a student pushes a commit. However, instructors who provide test cases that can be run locally (i.e., without creating a commit) may face coarse-grained commit histories.
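To illustrate what "coarse-grained" can mean in practice, a minimal sketch of such a commit-history analysis might compute the gaps between a student's commit timestamps. The function names and thresholds below are illustrative assumptions, not taken from the paper:

```python
from datetime import datetime, timedelta
from statistics import median

def commit_gaps(timestamps):
    """Return the time gaps between consecutive commits,
    given a list of commit timestamps."""
    ts = sorted(timestamps)
    return [b - a for a, b in zip(ts, ts[1:])]

def is_coarse_grained(timestamps, min_commits=5, max_median_gap=timedelta(hours=6)):
    """Heuristic: a history is coarse-grained if it contains few
    commits or the median gap between commits is long.
    Both thresholds are illustrative defaults."""
    if len(timestamps) < min_commits:
        return True
    return median(commit_gaps(timestamps)) > max_median_gap
```

In a real pipeline, the timestamps would come from each student's repository (e.g., via `git log`); the heuristic above only shows the kind of granularity signal such an analysis could extract.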
In this study, we analyze a CS2 course that offers both local and remote testing. Students were provided with tools that automatically create and push a commit on each local test run. We investigate to what extent these automatically created snapshots contribute to fine-grained commit histories and early initial push events. Our analysis uncovers distinct commit patterns among high- and low-performing students. Furthermore, we find that despite the commit automation and encouragement to start early, many students pushed their first commit late. We triangulate this observation with survey results, which confirm the late start of many students. The survey also identified reasons why students opt out of automatic commit creation. Moreover, many students expressed a positive attitude towards testing their programs locally. Our survey results thus underline that instructors should strive to provide comprehensive feedback that students can conveniently obtain.
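The commit-automation tooling itself is not reproduced on this page; a minimal sketch of a wrapper of the kind described, assuming a make-based local test runner, a configured git remote, and an opt-out flag (all names and defaults here are illustrative), might look like:

```python
import subprocess
from datetime import datetime, timezone

def snapshot_message(exit_code, when=None):
    """Build a commit message tagging the local test outcome, so a
    later analysis can distinguish passing from failing snapshots."""
    when = when or datetime.now(timezone.utc)
    status = "pass" if exit_code == 0 else "fail"
    return f"auto-snapshot: local tests {status} ({when:%Y-%m-%d %H:%M}Z)"

def run_tests_and_snapshot(test_cmd=("make", "test"), opt_out=False):
    """Run the local test suite; unless the student opted out,
    commit the working tree and push the snapshot afterwards."""
    result = subprocess.run(test_cmd)
    if not opt_out:
        subprocess.run(["git", "add", "-A"])
        subprocess.run(["git", "commit", "--allow-empty",
                        "-m", snapshot_message(result.returncode)])
        subprocess.run(["git", "push"])
    return result.returncode
```

Keeping the snapshot step after every local test run, rather than only on explicit pushes, is what yields the fine-grained histories the study examines; the opt-out flag mirrors the survey finding that some students prefer not to have commits created automatically.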


Index Terms

  1. Combining Local Testing with Automatic Commits: Benefits for Progress Tracking and CS2 Students' Learning Experience

    Recommendations

    Comments

    Information & Contributors

    Information

    Published In

    ITiCSE 2024: Proceedings of the 2024 on Innovation and Technology in Computer Science Education V. 1
    July 2024
    776 pages
ISBN: 9798400706004
DOI: 10.1145/3649217
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. computer science education
    2. cs2
    3. educational data mining
    4. git
    5. learning analytics
    6. version control


Acceptance Rates

Overall acceptance rate: 552 of 1,613 submissions, 34%
