DOI: 10.1145/2960310.2960337
Research article

Benchmarking Introductory Programming Exams: Some Preliminary Results

Published: 25 August 2016

Abstract

The programming education literature includes many observations that pass rates are low in introductory programming courses, but few or no comparisons of student performance across courses. This paper addresses that shortcoming. Having included a small set of identical questions in the final examinations of a number of introductory programming courses, we illustrate the use of these questions to examine the relative performance of the students both across multiple institutions and within some institutions. We also use the questions to quantify the size and overall difficulty of each exam. We find substantial differences across the courses, and venture some possible explanations of the differences. We conclude by explaining the potential benefits to instructors of using the same questions in their own exams.




    Information

    Published In

    cover image ACM Conferences
    ICER '16: Proceedings of the 2016 ACM Conference on International Computing Education Research
    August 2016
    310 pages
    ISBN:9781450344494
    DOI:10.1145/2960310
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 25 August 2016

    Permissions

    Request permissions for this article.

    Author Tags

    1. assessment
    2. benchmarking
    3. cs1
    4. examination
    5. introductory programming

    Qualifiers

    • Research-article

    Conference

    ICER '16: International Computing Education Research Conference
    September 8-12, 2016
    Melbourne, VIC, Australia

    Acceptance Rates

    ICER '16 Paper Acceptance Rate 26 of 102 submissions, 25%;
    Overall Acceptance Rate 189 of 803 submissions, 24%

    Bibliometrics & Citations

    Bibliometrics

    Article Metrics

    • Downloads (Last 12 months): 40
    • Downloads (Last 6 weeks): 5
    Reflects downloads up to 22 Sep 2024

    Citations

    Cited By

    • (2023) Computing Education Research in Australasia. In Past, Present and Future of Computing Education Research, 373-394. DOI: 10.1007/978-3-031-25336-2_17. Online publication date: 5-Jan-2023.
    • (2021) Identifying Informatively Easy and Informatively Hard Concepts. ACM Transactions on Computing Education, 22:1, 1-28. DOI: 10.1145/3477968. Online publication date: 18-Oct-2021.
    • (2020) Analysis of Programming Assessments — Building an Open Repository for Measuring Competencies. Proceedings of the 20th Koli Calling International Conference on Computing Education Research, 1-10. DOI: 10.1145/3428029.3428039. Online publication date: 19-Nov-2020.
    • (2020) Revisiting Self-Efficacy in Introductory Programming. Proceedings of the 2020 ACM Conference on International Computing Education Research, 158-169. DOI: 10.1145/3372782.3406281. Online publication date: 10-Aug-2020.
    • (2019) Introductory Programming Exams and Their Benchmarking in Slovakia. 2019 17th International Conference on Emerging eLearning Technologies and Applications (ICETA), 279-284. DOI: 10.1109/ICETA48886.2019.9039988. Online publication date: Nov-2019.
    • (2018) Explicit short program practice in a programming languages course. Journal of Computing Sciences in Colleges, 33:4, 114-122. DOI: 10.5555/3199572.3199588. Online publication date: 1-Apr-2018.
    • (2018) Introductory programming: a systematic literature review. Proceedings Companion of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education, 55-106. DOI: 10.1145/3293881.3295779. Online publication date: 2-Jul-2018.
    • (2018) Developing Assessments to Determine Mastery of Programming Fundamentals. Proceedings of the 2017 ITiCSE Conference on Working Group Reports, 47-69. DOI: 10.1145/3174781.3174784. Online publication date: 30-Jan-2018.
