DOI: 10.1145/2960310.2960333

Learning Curve Analysis for Programming: Which Concepts do Students Struggle With?

Published: 25 August 2016

Abstract

The recent surge of interest in applying educational data mining to student-written programs has led to discoveries about which compiler errors students encounter while learning to program. However, less attention has been paid to the code that students actually produce. In this paper, we investigate programming data by using learning curve analysis to determine which programming elements students struggle with most when learning to program in Python. Our analysis extends the traditional use of learning curve analysis to less structured data, and it also reveals new possibilities for when to teach students new programming concepts. One particular discovery is that while we find evidence of student learning in some cases (for example, in function definitions and comparisons), other programming elements do not demonstrate typical learning curves. In those cases, we discuss how further changes to the model could affect both demonstrated learning and our understanding of the different concepts that students learn.
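Learning curve analysis conventionally models a knowledge component's error rate as a power-law function of the number of practice opportunities (the "law of practice"). As a rough illustration of that idea only — the error rates below are made up for the sketch, not drawn from the paper's dataset — one can fit the power law by linear regression in log-log space:

```python
import math

# Illustrative sketch of learning curve analysis: the error rate for one
# knowledge component (KC), e.g. "function definition", is modeled as a
# power law of the number of practice opportunities. All numbers below
# are hypothetical, not taken from the paper's dataset.
opportunities = [1, 2, 3, 4, 5, 6, 7, 8]
error_rates = [0.52, 0.38, 0.31, 0.27, 0.24, 0.22, 0.20, 0.19]

def fit_power_law(xs, ys):
    """Least-squares fit of y = a * x**(-b), done in log-log space."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
             / sum((x - mx) ** 2 for x in lx))
    intercept = my - slope * mx
    return math.exp(intercept), -slope  # (a, b)

a, b = fit_power_law(opportunities, error_rates)
# A clearly positive decay exponent b is the classic signature of
# learning; a flat curve (b near 0) for some KC suggests the model may
# be conflating distinct concepts.
print(f"error_rate ~ {a:.2f} * opportunity^(-{b:.2f})")
```

A smoothly declining fitted curve is what "typical learning" looks like in this framework; the paper's point is that some Python elements produce curves that do not decline this way.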


Published In

ICER '16: Proceedings of the 2016 ACM Conference on International Computing Education Research
August 2016
310 pages
ISBN:9781450344494
DOI:10.1145/2960310
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States



Author Tags

  1. educational data mining
  2. knowledge components
  3. learning curve analysis
  4. programming syntax

Qualifiers

  • Research-article

Funding Sources

  • Department of Education

Conference

ICER '16: International Computing Education Research Conference
September 8-12, 2016
Melbourne, VIC, Australia

Acceptance Rates

ICER '16 Paper Acceptance Rate: 26 of 102 submissions, 25%
Overall Acceptance Rate: 189 of 803 submissions, 24%

Cited By

  • (2024) Beyond Repetition: The Role of Varied Questioning and Feedback in Knowledge Generalization. Proceedings of the Eleventh ACM Conference on Learning @ Scale, 451-455. 10.1145/3657604.3664688
  • (2024) Validating, Refining, and Identifying Programming Plans Using Learning Curve Analysis on Code Writing Data. Proceedings of the 2024 ACM Conference on International Computing Education Research - Volume 1, 263-279. 10.1145/3632620.3671120
  • (2024) TriviaHG: A Dataset for Automatic Hint Generation from Factoid Questions. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2060-2070. 10.1145/3626772.3657855
  • (2024) Exploring Novice Programmers' Testing Behavior: A First Step to Define Coding Struggle. Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1, 1251-1257. 10.1145/3626252.3630851
  • (2024) Aligning open educational resources to new taxonomies: How AI technologies can help and in which scenarios. Computers & Education, 105027. 10.1016/j.compedu.2024.105027
  • (2024) Enhancing LLM-Based Feedback: Insights from Intelligent Tutoring Systems and the Learning Sciences. Artificial Intelligence in Education. Posters and Late Breaking Results, Workshops and Tutorials, Industry and Innovation Tracks, Practitioners, Doctoral Consortium and Blue Sky, 32-43. 10.1007/978-3-031-64315-6_3
  • (2023) Human-Centered Deferred Inference: Measuring User Interactions and Setting Deferral Criteria for Human-AI Teams. Proceedings of the 28th International Conference on Intelligent User Interfaces, 681-694. 10.1145/3581641.3584092
  • (2023) Ira: a free and open-source Fourier transform infrared (FTIR) data analysis widget for pharmaceutical applications. Analytical Letters, 56(16), 2637-2648. 10.1080/00032719.2023.2180516
  • (2023) Automated Extraction of Domain Models from Textbook Indexes for Developing Intelligent Tutoring Systems. Augmented Intelligence and Intelligent Tutoring Systems, 124-136. 10.1007/978-3-031-32883-1_11
  • (2022) Types of Errors in Block Programming: Driven by Learner, Learning Environment. Journal of Educational Computing Research, 61(1), 178-207. 10.1177/07356331221102312
