DOI: 10.1145/3328778.3366881
Research article | Public Access

The Role of Evidence Centered Design and Participatory Design in a Playful Assessment for Computational Thinking About Data

Published: 26 February 2020

Abstract

The K-12 CS Framework provides guidance on what concepts and practices students are expected to know and demonstrate within different grade bands. For these guidelines to be useful in CS education, a critical next step is to translate the guidelines to explicit learning targets and design aligned instructional tools and assessments. Our research and development goal in this paper is to design a playful, curriculum-neutral assessment aligned with the 'Data and Analysis' concept (grades 6-8) from the CS framework. Using Evidence Centered Design and Participatory Design, we present a set of assessment guidelines for assessing data and analysis, as well as a set of design considerations for integrating data and analysis across middle school curricula in CS and non-CS contexts. We outline these contributions, describe how they were applied to the development of a game-based formative assessment for data and analysis, and present preliminary findings on student understanding and challenges inferred from student gameplay.



Published In

SIGCSE '20: Proceedings of the 51st ACM Technical Symposium on Computer Science Education
February 2020, 1502 pages
ISBN: 9781450367936
DOI: 10.1145/3328778

Publisher

Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. cs assessment
        2. data and analysis
        3. evidence centered design
        4. game-based assessment
        5. participatory design


Conference

SIGCSE '20
Overall acceptance rate: 1,595 of 4,542 submissions, 35%



        Cited By

• (2024) Developing and validating interoperable ontology-driven game-based assessments. Expert Systems with Applications 248(C). https://doi.org/10.1016/j.eswa.2024.123370. Online publication date: 15-Aug-2024.
• (2023) Concepts, practices, and perspectives for developing computational data literacy: Insights from workshops with a new data programming system. Proceedings of the 22nd Annual ACM Interaction Design and Children Conference, 100-111. https://doi.org/10.1145/3585088.3589364. Online publication date: 19-Jun-2023.
• (2023) A Systematic Literature Review of Game-Based Assessment Studies: Trends and Challenges. IEEE Transactions on Learning Technologies 16(4), 500-515. https://doi.org/10.1109/TLT.2022.3226661. Online publication date: Aug-2023.
• (2023) Assessment of Computational Thinking Skills: A Systematic Review of the Literature. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje 18(4), 319-330. https://doi.org/10.1109/RITA.2023.3323762. Online publication date: Nov-2023.
• (2022) Is it time we get real? A systematic review of the potential of data-driven technologies to address teachers' implicit biases. Frontiers in Artificial Intelligence 5. https://doi.org/10.3389/frai.2022.994967. Online publication date: 11-Oct-2022.
• (2022) Learning analytics application to examine validity and generalizability of game-based assessment for spatial reasoning. British Journal of Educational Technology 54(1), 355-372. https://doi.org/10.1111/bjet.13286. Online publication date: 13-Nov-2022.
• (2022) Taking data feminism to school: A synthesis and review of pre-collegiate data science education projects. British Journal of Educational Technology 53(5), 1096-1113. https://doi.org/10.1111/bjet.13251. Online publication date: 24-Jun-2022.
• (2022) Participatory design and participatory debugging: Listening to students to improve computational thinking by creating games. International Journal of Child-Computer Interaction 34, 100525. https://doi.org/10.1016/j.ijcci.2022.100525. Online publication date: Dec-2022.
• (2021) A Systematic Literature Review of Gameful Feedback in Computer Science Education. International Journal of Information and Education Technology 11(10), 464-470. https://doi.org/10.18178/ijiet.2021.11.10.1551. Online publication date: 2021.
• (2021) Assessing computational thinking in libraries. Computer Science Education, 1-22. https://doi.org/10.1080/08993408.2021.1874229. Online publication date: 18-Jan-2021.
