DOI: 10.1145/3510456.3514149
Research article · Open access

ASPA: A static analyser to support learning and continuous feedback on programming courses. An empirical validation

Published: 17 October 2022

Abstract

For decades there has been debate about how to teach programming in introductory courses. Supportive intervention methods to improve students' learning, as well as methods to improve the assessment process, have been widely studied. There are various successful methods for each topic separately, but only a few fit both. In this work, we aimed to validate ASPA, a static analyser tool that supports learning and continuous feedback in programming courses. For this purpose, we designed and conducted an empirical study among 236 students enrolled in an introductory programming course who voluntarily adopted the tool during project development activities. We first profiled the students and then evaluated their attitude toward using ASPA, its perceived ease of use, and its perceived usefulness. Results showed that ASPA supported students throughout the course, especially during programming assignments, and that its use also helped to improve students' grades.



Published In

ICSE-SEET '22: Proceedings of the ACM/IEEE 44th International Conference on Software Engineering: Software Engineering Education and Training
May 2022, 292 pages
ISBN: 9781450392259
DOI: 10.1145/3510456

In-Cooperation: IEEE CS

Publisher: Association for Computing Machinery, New York, NY, United States


Author Tags

  1. CS1
  2. empirical software engineering
  3. programming education
  4. software education
  5. static analysis tools

Conference

ICSE '22

Article Metrics

  • Downloads (last 12 months): 167
  • Downloads (last 6 weeks): 21
Reflects downloads up to 16 Feb 2025

Cited By

  • (2025) Introducing Code Quality at CS1 Level: Examples and Activities. 2024 Working Group Reports on Innovation and Technology in Computer Science Education, 10.1145/3689187.3709615, 339-377. Online publication date: 22-Jan-2025.
  • (2024) Are a Static Analysis Tool Study's Findings Static? A Replication. Proceedings of the 2024 on Innovation and Technology in Computer Science Education V. 1, 10.1145/3649217.3653545, 80-86. Online publication date: 3-Jul-2024.
  • (2024) A Static Analysis Tool in CS1: Student Usage and Perceptions of PythonTA. Proceedings of the 26th Australasian Computing Education Conference, 10.1145/3636243.3636262, 172-181. Online publication date: 29-Jan-2024.
  • (2023) On the Use of Static Analysis to Engage Students with Software Quality Improvement: An Experience with PMD. Proceedings of the 45th International Conference on Software Engineering: Software Engineering Education and Training, 10.1109/ICSE-SEET58685.2023.00023, 179-191. Online publication date: 17-May-2023.
