DOI: 10.1145/1734263.1734432
Research article, SIGCSE Conference Proceedings

Does studio-based instruction work in CS 1?: an empirical comparison with a traditional approach

Published: 10 March 2010

Abstract

Given the increasing importance of communication, teamwork, and critical thinking skills in the computing profession, we believe there is good reason to provide students with increased opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have been exploring studio-based instructional methods, which have been successfully employed in architecture and fine arts education for over a century. We have developed an adaptation of studio-based instruction for computing education called the pedagogical code review, which is modeled after the code inspection process used in the software industry. To evaluate its effectiveness, we carried out a quasi-experimental comparison of a "studio-based" CS 1 course with pedagogical code reviews and an identical "traditional" CS 1 course without pedagogical code reviews. We found no learning outcome differences between the two courses; however, we did observe two interesting attitudinal trends: (a) self-efficacy decreased more in the traditional course than in the studio-based course; and (b) peer learning decreased in the traditional course, but increased in the studio-based course. Additional questionnaire and interview data provide further evidence of the positive impact of studio-based instruction.



    Published In

    SIGCSE '10: Proceedings of the 41st ACM technical symposium on Computer science education
    March 2010
    618 pages
    ISBN:9781450300063
    DOI:10.1145/1734263

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. code inspection
    2. cs1
    3. pedagogical code review
    4. peer review
    5. studio-based learning and instruction


    Conference

    SIGCSE '10

    Acceptance Rates

    Overall acceptance rate: 1,595 of 4,542 submissions (35%)


    Cited By

    • (2023) Impacting the Submission Timing of Student Work Using Gamification. Proceedings of the 16th Annual ACM India Compute Conference, 7-12. DOI: 10.1145/3627217.3627218. Online publication date: 9-Dec-2023.
    • (2023) Augmented Cognition Instructional Design for Studio-Based Learning. Augmented Cognition, 250-268. DOI: 10.1007/978-3-031-35017-7_17. Online publication date: 9-Jul-2023.
    • (2020) A Review of Peer Code Review in Higher Education. ACM Transactions on Computing Education 20, 3, 1-25. DOI: 10.1145/3403935. Online publication date: 9-Sep-2020.
    • (2020) What Do We Think We Think We Are Doing? Proceedings of the 2020 ACM Conference on International Computing Education Research, 2-13. DOI: 10.1145/3372782.3406263. Online publication date: 10-Aug-2020.
    • (2019) A Maker Studio Model for High School Classrooms: The Nature and Role of Critique in an Electronic Textiles Design Project. Teachers College Record 121, 9, 1-34. DOI: 10.1177/016146811912100906. Online publication date: 1-Sep-2019.
    • (2019) 50 Years of CS1 at SIGCSE. Proceedings of the 50th ACM Technical Symposium on Computer Science Education, 338-344. DOI: 10.1145/3287324.3287432. Online publication date: 22-Feb-2019.
    • (2019) Students As Teachers and Communicators. The Cambridge Handbook of Computing Education Research, 827-858. DOI: 10.1017/9781108654555.030. Online publication date: 15-Feb-2019.
    • (2019) Leveraging the Integrated Development Environment for Learning Analytics. The Cambridge Handbook of Computing Education Research, 679-706. DOI: 10.1017/9781108654555.024. Online publication date: 15-Feb-2019.
    • (2019) The Cambridge Handbook of Computing Education Research. DOI: 10.1017/9781108654555. Online publication date: 15-Feb-2019.
    • (2018) Advancing the Science of Collaborative Problem Solving. Psychological Science in the Public Interest 19, 2, 59-92. DOI: 10.1177/1529100618808244. Online publication date: 29-Nov-2018.
