Using CodeBrowser to seek differences between novice programmers

Published: 05 March 2014
DOI: 10.1145/2538862.2538981

Abstract

A large body of systems that gather data on students' programming process exists, and with the rise of massive open online courses in programming, the amount of gathered data is growing at an even higher rate. A common issue for data analysis is the lack of shared tools for visualizing source code snapshots. We have created a browser-based snapshot analysis tool called CodeBrowser that provides a clean REST API through which anyone can integrate their own snapshot data. In this article, we describe CodeBrowser and, as an example, discuss how it has been used to seek differences between novice programmers who passed (n=10) or failed (n=10) an introductory programming course.
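The abstract only states that CodeBrowser exposes a REST API for integrating snapshot data; it does not describe the API itself. The following is therefore a minimal, hypothetical sketch of what pushing one source-code snapshot to such an API could look like. The endpoint URL and the JSON field names (student, exercise, timestamp, source) are illustrative assumptions, not CodeBrowser's documented schema.

```python
# Hypothetical sketch: POST one source-code snapshot to a REST endpoint.
# The URL and JSON field names are assumptions for illustration only.
import json
import urllib.request
from datetime import datetime, timezone

SNAPSHOT_ENDPOINT = "https://example.org/codebrowser/api/snapshots"  # assumed URL


def post_snapshot(student_id: str, exercise: str, source: str) -> int:
    """Send a single snapshot as JSON and return the HTTP status code."""
    payload = json.dumps({
        "student": student_id,                                # assumed field
        "exercise": exercise,                                 # assumed field
        "timestamp": datetime.now(timezone.utc).isoformat(),  # assumed field
        "source": source,                                     # assumed field
    }).encode("utf-8")
    request = urllib.request.Request(
        SNAPSHOT_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    # One snapshot per save/compile event is a common granularity in
    # snapshot-collection systems of this kind.
    print(post_snapshot("s001", "week1-ex3", "public class Hello {}"))
```

A tool-agnostic JSON-over-HTTP interface like this is what makes it possible for other data-collection systems to feed their snapshots into a shared visualizer, which is the interoperability problem the abstract highlights.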


Cited By

  • (2024) InsProg: Supporting Teaching Through Visual Analysis of Students' Programming Processes. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, pages 1-5. DOI: 10.1145/3656650.3656668. Online publication date: 3 Jun 2024.
  • (2024) Instructor Perceptions of AI Code Generation Tools - A Multi-Institutional Interview Study. Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1, pages 1223-1229. DOI: 10.1145/3626252.3630880. Online publication date: 7 Mar 2024.
  • (2023) Automated Questionnaires About Students' JavaScript Programs: Towards Gauging Novice Programming Processes. Proceedings of the 25th Australasian Computing Education Conference, pages 49-58. DOI: 10.1145/3576123.3576129. Online publication date: 30 Jan 2023.
  • (2023) Evaluating Distance Measures for Program Repair. Proceedings of the 2023 ACM Conference on International Computing Education Research - Volume 1, pages 495-507. DOI: 10.1145/3568813.3600130. Online publication date: 7 Aug 2023.
  • (2023) Factors Affecting Compilable State at Each Keystroke in CS1. Proceedings of the 45th International Conference on Software Engineering: Software Engineering Education and Training, pages 314-323. DOI: 10.1109/ICSE-SEET58685.2023.00036. Online publication date: 17 May 2023.
  • (2023) Clusters of Solvers' Behavioral Patterns Based on Analysis of the Programming Process. 2023 IEEE Frontiers in Education Conference (FIE), pages 1-6. DOI: 10.1109/FIE58773.2023.10343479. Online publication date: 18 Oct 2023.
  • (2023) Learning analytics in programming courses: Review and implications. Education and Information Technologies, 28(9):11221-11268. DOI: 10.1007/s10639-023-11611-0. Online publication date: 16 Feb 2023.
  • (2022) CodeProcess Charts: Visualizing the Process of Writing Code. Proceedings of the 24th Australasian Computing Education Conference, pages 46-55. DOI: 10.1145/3511861.3511867. Online publication date: 14 Feb 2022.
  • (2022) Evaluating CodeClusters for Effectively Providing Feedback on Code Submissions. 2022 IEEE Frontiers in Education Conference (FIE), pages 1-9. DOI: 10.1109/FIE56618.2022.9962751. Online publication date: 8 Oct 2022.
  • (2021) The importance of using the CodeInsights monitoring tool to support teaching programming in the context of a pandemic. 2021 IEEE Frontiers in Education Conference (FIE), pages 1-8. DOI: 10.1109/FIE49875.2021.9637292. Online publication date: 13 Oct 2021.


Published In

SIGCSE '14: Proceedings of the 45th ACM technical symposium on Computer science education
March 2014, 800 pages
ISBN: 9781450326056
DOI: 10.1145/2538862

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. programming education
  2. snapshot analysis
  3. snapshots
  4. source code analysis
  5. visualization

Qualifiers

  • Research-article

Conference

SIGCSE '14

Acceptance Rates

SIGCSE '14 Paper Acceptance Rate: 108 of 274 submissions (39%)
Overall Acceptance Rate: 1,595 of 4,542 submissions (35%)


