STOP THE (AUTOGRADER) INSANITY: Regression Penalties to Deter Autograder Overreliance

Published: 05 March 2021

Abstract

Autograders are an invaluable tool for deploying assessments in large classes. However, students sometimes rely on the autograder, rather than careful thought, to find ways to improve their solutions. We sought to naturally encourage students to check their own solutions more, and hammer the grader less. To do this, we imposed a penalty each time a student's grade went down; we called these regression penalties. We assessed whether introducing these penalties reduced reliance on the autograder without hurting student performance.
Encouragingly, the number of autograder submissions was roughly halved, while the median final grade decreased only slightly. Students reported feeling nervous about their submissions, but noted that they checked their own solutions by testing their code far more than they would have without the penalty. Students also expressed positive views of the regression-penalty model.
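As a concrete illustration of the mechanism described above, here is a minimal sketch of a regression-penalty policy. The paper does not give an implementation; the function name, the fixed penalty size, and the choice to compare each submission against the immediately preceding one are illustrative assumptions.

```python
# Hypothetical sketch of a regression-penalty grading policy (illustrative
# only; not the authors' implementation). Each submission that scores lower
# than the submission immediately before it incurs a fixed deduction.

def apply_regression_penalties(scores, penalty=2.0):
    """Compute a final grade from a chronological list of autograder scores.

    A "regression" is any submission whose score is lower than the previous
    submission's score; each regression costs `penalty` points, and the final
    grade is the last score minus accumulated penalties, floored at zero.
    """
    if not scores:
        return 0.0
    # Count submissions whose score dropped relative to the previous one.
    regressions = sum(1 for prev, cur in zip(scores, scores[1:]) if cur < prev)
    return max(scores[-1] - penalty * regressions, 0.0)


# Example: the third submission regresses (90 -> 85), costing one penalty,
# so the final grade is 95 - 2 = 93.
print(apply_regression_penalties([80.0, 90.0, 85.0, 95.0]))  # 93.0
```

One design choice worth noting: penalizing drops relative to the previous submission (as the abstract's "each time a student's grade went down" suggests) is gentler than penalizing drops relative to the best score so far, which would also punish students who experiment after a lucky high score.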




      Published In

      SIGCSE '21: Proceedings of the 52nd ACM Technical Symposium on Computer Science Education
      March 2021
      1454 pages
      ISBN:9781450380621
      DOI:10.1145/3408877
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. assessment
      2. autograder
      3. software engineering

      Qualifiers

      • Research-article

      Conference

      SIGCSE '21

      Acceptance Rates

      Overall Acceptance Rate 1,595 of 4,542 submissions, 35%



Cited By

• (2024) Bridging the Gap between Project-Oriented and Exercise-Oriented Automatic Assessment Tools. Computers 13, 7 (162). https://doi.org/10.3390/computers13070162. Online publication date: 30-Jun-2024
• (2024) The Trees in the Forest: Characterizing Computing Students' Individual Help-Seeking Approaches. In Proceedings of the 2024 ACM Conference on International Computing Education Research - Volume 1. 343-358. https://doi.org/10.1145/3632620.3671099. Online publication date: 12-Aug-2024
• (2024) Learning with Style: Improving Student Code-Style Through Better Automated Feedback. In Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1. 1175-1181. https://doi.org/10.1145/3626252.3630889. Online publication date: 7-Mar-2024
• (2024) Automated feedback for participants of hands-on cybersecurity training. Education and Information Technologies 29, 9. 11555-11584. https://doi.org/10.1007/s10639-023-12265-8. Online publication date: 1-Jun-2024
• (2023) "It's Weird That it Knows What I Want": Usability and Interactions with Copilot for Novice Programmers. ACM Transactions on Computer-Human Interaction 31, 1. 1-31. https://doi.org/10.1145/3617367. Online publication date: 29-Nov-2023
• (2023) Evaluation of Submission Limits and Regression Penalties to Improve Student Behavior with Automatic Assessment Systems. ACM Transactions on Computing Education 23, 3. 1-24. https://doi.org/10.1145/3591210. Online publication date: 20-Jun-2023
• (2023) Seeing Program Output Improves Novice Learning Gains. In Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 1. 180-186. https://doi.org/10.1145/3587102.3588796. Online publication date: 29-Jun-2023
• (2023) Automated Questionnaires About Students' JavaScript Programs: Towards Gauging Novice Programming Processes. In Proceedings of the 25th Australasian Computing Education Conference. 49-58. https://doi.org/10.1145/3576123.3576129. Online publication date: 30-Jan-2023
• (2023) Automated Assessment: Experiences From the Trenches. In Proceedings of the 25th Australasian Computing Education Conference. 1-10. https://doi.org/10.1145/3576123.3576124. Online publication date: 30-Jan-2023
• (2023) Eastwood-Tidy: C Linting for Automated Code Style Assessment in Programming Courses. In Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1. 799-805. https://doi.org/10.1145/3545945.3569817. Online publication date: 2-Mar-2023
