DOI: 10.1145/2632320.2632354

Predicting student success using fine grain clicker data

Published: 28 July 2014

Abstract

Recent research suggests that the first weeks of a CS1 course have a strong influence on end-of-course student performance. The present work aims to refine the understanding of this phenomenon by using in-class clicker questions as a source of student performance data. Clicker questions generate per-lecture and per-question data with which to assess student understanding. This work demonstrates that clicker question performance early in the term predicts student outcomes at the end of the term. The predictive nature of these questions applies to code-writing questions, multiple-choice questions, and the final exam as a whole. The most predictive clicker questions are identified, and the relationships between these questions and final exam performance are examined.




Published In

ICER '14: Proceedings of the Tenth Annual Conference on International Computing Education Research
July 2014, 186 pages
ISBN: 9781450327558
DOI: 10.1145/2632320
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. assessment
  2. clickers
  3. cs1

Qualifiers

  • Research-article


Conference

ICER '14: International Computing Education Research Conference
August 11-13, 2014
Glasgow, Scotland, United Kingdom

Acceptance Rates

ICER '14 Paper Acceptance Rate: 17 of 69 submissions (25%)
Overall Acceptance Rate: 189 of 803 submissions (24%)


Article Metrics

  • Downloads (last 12 months): 41
  • Downloads (last 6 weeks): 1

Reflects downloads up to 25 Dec 2024.

Cited By
  • (2024) Early Identification of Struggling Students in Large Computer Science Courses: A Replication Study. 2024 IEEE 48th Annual Computers, Software, and Applications Conference (COMPSAC), pages 88-93. DOI: 10.1109/COMPSAC61105.2024.00022. Online publication date: 2-Jul-2024.
  • (2023) Evolving Towards a Trustworthy AIEd Model to Predict at Risk Students in Introductory Programming Courses. Proceedings of the 2023 Conference on Human Centered Artificial Intelligence: Education and Practice, pages 22-28. DOI: 10.1145/3633083.3633190. Online publication date: 14-Dec-2023.
  • (2023) Toward CS1 Content Subscales: A Mixed-Methods Analysis of an Introductory Computing Assessment. Proceedings of the 23rd Koli Calling International Conference on Computing Education Research, pages 1-13. DOI: 10.1145/3631802.3631828. Online publication date: 13-Nov-2023.
  • (2023) A learning analytics dashboard for data-driven recommendations on influences of non-cognitive factors in introductory programming. Education and Information Technologies, 29(8), pages 9221-9256. DOI: 10.1007/s10639-023-12125-5. Online publication date: 7-Sep-2023.
  • (2023) A Review on the Impact of Cognitive Factors in Introductory Programming. Proceedings of Fourth International Conference on Communication, Computing and Electronics Systems, pages 1019-1032. DOI: 10.1007/978-981-19-7753-4_77. Online publication date: 15-Mar-2023.
  • (2022) Time-on-task metrics for predicting performance. ACM Inroads, 13(2), pages 42-49. DOI: 10.1145/3534564. Online publication date: 17-May-2022.
  • (2022) Methodological Considerations for Predicting At-risk Students. Proceedings of the 24th Australasian Computing Education Conference, pages 105-113. DOI: 10.1145/3511861.3511873. Online publication date: 14-Feb-2022.
  • (2022) Pausing while programming. Proceedings of the ACM/IEEE 44th International Conference on Software Engineering: Software Engineering Education and Training, pages 187-198. DOI: 10.1145/3510456.3514146. Online publication date: 21-May-2022.
  • (2022) PreSS. Proceedings of the 27th ACM Conference on Innovation and Technology in Computer Science Education, Vol. 1, pages 54-60. DOI: 10.1145/3502718.3524755. Online publication date: 7-Jul-2022.
  • (2022) Assessing Workload Perception in Introductory Computer Science Projects using NASA-TLX. Proceedings of the 53rd ACM Technical Symposium on Computer Science Education, Volume 1, pages 668-674. DOI: 10.1145/3478431.3499406. Online publication date: 22-Feb-2022.
