DOI: 10.1145/2999541.2999565
Poster

Paper-based vs computer-based exams in CS1

Published: 24 November 2016

Abstract

In this study, we examine the "test mode effect" in a CS1 exam using the Rainfall problem. The participants first worked with pen and paper; after that, they had access to a computer and could rework their solution with the help of a test suite developed by the authors. In the computer-based phase, many students were able to fix the errors they had made during the paper-based phase. These errors included well-known corner cases, such as an empty array or division by zero.
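The poster does not reproduce the exam task or the authors' test suite, but the Rainfall problem it refers to is conventionally stated as: given a list of rainfall readings, compute the average of the non-negative values that appear before a sentinel, ignoring invalid (negative) readings. The Java sketch below only illustrates the corner cases named in the abstract; the class and method names, the 99999 sentinel, the choice to return 0.0 when there is nothing to average, and the checks in main are assumptions, not material from the paper.

```java
public class Rainfall {

    /**
     * Illustrative Rainfall solution (not the paper's reference solution):
     * average the non-negative readings that occur before the sentinel 99999,
     * ignoring negative readings. Returns 0.0 when there is nothing to average
     * (empty array, sentinel first, or only negative readings), which avoids
     * division by zero.
     */
    public static double averageRainfall(int[] readings) {
        int sum = 0;
        int count = 0;
        for (int r : readings) {
            if (r == 99999) {   // sentinel: stop processing
                break;
            }
            if (r >= 0) {       // ignore negative (invalid) readings
                sum += r;
                count++;
            }
        }
        return count == 0 ? 0.0 : (double) sum / count;  // corner case: nothing to average
    }

    // Checks of the kind a test suite for such an exam task might contain.
    public static void main(String[] args) {
        System.out.println(averageRainfall(new int[] {1, 2, 3}));            // 2.0
        System.out.println(averageRainfall(new int[] {1, -5, 3, 99999, 8})); // 2.0: negatives and post-sentinel values ignored
        System.out.println(averageRainfall(new int[] {}));                   // 0.0: empty array
        System.out.println(averageRainfall(new int[] {-1, -2}));             // 0.0: no valid readings, no division by zero
    }
}
```

A test suite like the one the abstract mentions would plausibly exercise exactly these inputs: a normal list, a list with negatives and a sentinel, an empty array, and a list with no valid readings (the division-by-zero case).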




Information & Contributors

Information

Published In

Koli Calling '16: Proceedings of the 16th Koli Calling International Conference on Computing Education Research
November 2016
189 pages
ISBN:9781450347709
DOI:10.1145/2999541
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Sponsors

  • Univ. Eastern Finland: University of Eastern Finland
  • Univ. Turku: University of Turku
  • Monash University, Australia: Monash University, Australia


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 24 November 2016


Author Tags

  1. CS1
  2. computer science education
  3. rainfall

Qualifiers

  • Poster

Conference

Koli Calling 2016

Acceptance Rates

Koli Calling '16 Paper Acceptance Rate: 21 of 57 submissions, 37%
Overall Acceptance Rate: 80 of 182 submissions, 44%



Cited By

  • (2023) Pseudocode vs. Compile-and-Run Prompts: Comparing Measures of Student Programming Ability in CS1 and CS2. Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 1, pages 519-525. DOI: 10.1145/3587102.3588834. Online publication date: 29 June 2023.
  • (2023) Executable Exams. Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1, pages 381-387. DOI: 10.1145/3545945.3569724. Online publication date: 2 March 2023.
  • (2022) A systematic review of paper-based versus computer-based testing in engineering and computing education. 2022 IEEE Global Engineering Education Conference (EDUCON), pages 364-372. DOI: 10.1109/EDUCON52537.2022.9766469. Online publication date: 28 March 2022.
  • (2021) Extending Computational Thinking into Information and Communication Technology Literacy Measurement. ACM Transactions on Computing Education, 21(1):1-25. DOI: 10.1145/3427596. Online publication date: 19 January 2021.
  • (2020) Paper Or IDE? Proceedings of the 51st ACM Technical Symposium on Computer Science Education, pages 706-712. DOI: 10.1145/3328778.3366857. Online publication date: 26 February 2020.
  • (2018) BlueBook. Proceedings of the 49th ACM Technical Symposium on Computer Science Education, pages 562-567. DOI: 10.1145/3159450.3159587. Online publication date: 21 February 2018.
  • (2018) Coding by hand or on the computer? Evaluating the effect of assessment mode on performance of students learning programming. Journal of Computers in Education, 5(2):199-219. DOI: 10.1007/s40692-018-0103-3. Online publication date: 3 May 2018.
