An Evaluation Tool To Measure Computational Thinking Skills: Pilot Investigation
Ung L. Ling 1*, Tammie C. Saibin 2, Nasrah Naharu 3, Jane Labadin 4, Norazila A. Aziz 5
1, 2, 3 Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA (UiTM), 88999 Kota Kinabalu, Sabah, Malaysia
4, 5 Institute of Social Informatics and Technological Innovations (ISITI), Universiti Malaysia Sarawak (UNIMAS), 94300 Kota Samarahan, Sarawak, Malaysia
Abstract. In this pilot study we present an evaluation rubric to measure Computational Thinking (CT) skills at primary school level. The rubric is designed based on criteria drawn from prominent CT trainers and researchers, and it produces measurable CT evidence from tasks performed by students and from written tests. The experiment kit was sent to thirty schools, and nine Primary 1 teachers responded to this pilot investigation. They recorded and reported the learning outcomes of 359 students, using the pre-designed evaluation as pre- and post-tests around a semester of teaching and learning activities. Our results demonstrate a positive attitude among the teachers towards the proposed evaluation tool, which was able to measure the learning outcome of Computational Thinking skills. Teachers' concerns about CT skill evaluation were also obtained from this investigation.
Keywords: computational thinking (CT), teaching and learning (TL), primary school students, evaluation rubric, curriculum.
Introduction. The Prime Minister of Malaysia announced the integration of Computational Thinking (CT) skills into all subjects, starting in 2017 with Primary 1 students [10]. With this announcement, there is an urgent need to equip schools, and especially teachers, for the teaching and learning (TL) of CT skills in their daily lessons. The newly revised curricula have been uploaded to the Ministry of Education (MOE) website; they describe content improvements informed by global trends and international benchmarking. The teaching and learning pedagogy will concentrate on deep, contextual and effective learning, and the development of student learning is assessed on an ongoing basis [45].
CT skills expose students to logical thinking, problem solving and life-long learning skills, as well as Information and Communication Technology (ICT) skills [3, 13, 69]. Previous studies have shown that CT skills are able to improve an individual's higher-order thinking skills, a crucial element for surviving in the 21st century and especially for excelling in the workforce [13, 35, 38, 48, 61]. Despite the numerous studies undertaken to understand CT skills, many concerns still need to be addressed in order to ensure an effective CT teaching and learning process [22, 62], especially in Malaysia, a newcomer to this field. The key question in this study is: how can teachers effectively assess the outcome of their teaching and learning (TL)?
CT Concepts. Wing [67] stressed that CT is as important as reading, writing and arithmetic, and that it should be included as part of the school curriculum. Researchers report that CT elevates one's higher-order thinking skills and improves problem-solving skills [23, 31]. Experiments have shown that learners exposed to CT score better not only in computing lessons, but also in mathematics, languages and sciences, compared to those who are not [1, 6, 28]. CT is claimed to be a must-have skill for living and working in today's challenging world [63]. It is defined by Cuny, Snyder and Wing [16] as "the thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information-processing agent". However, adapting CT concepts and delivering them in everyday school practice will not be easy for teachers [26], and thorough study is required to seek and determine the most effective ways of teaching and learning CT [2, 22, 62, 69].
Despite all these studies, CT still lacks a common definition, and the elements of its skill set differ across perspectives, which might be due to the immaturity of the investigations carried out so far [24, 25, 57, 65]. The most prominent operational definitions of CT are those of the Computer Science Teachers Association (CSTA) and the International Society for Technology in Education (ISTE), which define CT skills as the human ability to formulate solutions in a computational manner so that the solutions can be carried out by a computer [15]. Table 1 shows different definitions of CT from different institutions, organizations and bodies. Various publications have also proposed different CT definitions for different fields [4, 7, 13, 19, 53, 57, 67]. CT is claimed to comprise a variety of skills, namely logical, algorithmic, modeling, simulation and abstraction thinking [37, 51, 53, 54, 66, 68]. Selby and Woollard [57] and Curzon et al. [17] suggested that the core concepts of CT are algorithmic thinking, evaluation, decomposition, abstraction and generalization, while continuing work on Scratch [12] has developed a definition of CT that involves three key dimensions: computational concepts, computational practices and computational perspectives. There are also reports investigating the relevance of CT to mathematics and science [44, 50, 55, 56, 58]. According to [21], CT has shared a long history with computer science since the 1950s, with the same skill components such as algorithmic thinking, conditional logic and modeling, but CT is not all about computer programming [37, 52, 65, 67]. Some researchers have also associated CT skills with attitudes.
Table 1: Definitions of CT skill sets by CAS Barefoot, CSTA, Google for Education, MDEC and K-12 (US)

Computing At School Barefoot (CAS): Computational thinking is the thought processes involved in formulating a problem and expressing its solution in a way that a computer, human or machine, can effectively carry out [8].

Computer Science Teachers Association (CSTA): Computational thinking is a problem-solving process that includes: formulating problems logically; organizing and analyzing data; representing data; identifying, analyzing and implementing possible solutions; and generalizing and transferring this problem-solving process to a wide variety of problems [18].

Google for Education: CT is a problem-solving process that includes a number of characteristics, such as logically ordering and analyzing data and creating solutions using a series of ordered steps (or algorithms), and dispositions, such as the ability to confidently deal with complexity and open-ended problems [17].

Malaysia Digital Economy Corporation (MDEC): "CT is the ability to dissect problems and formulate solutions by drawing from concepts in computer science…" [59]. It comprises six main concepts: decomposition, pattern recognition, abstraction, algorithms, logical reasoning and evaluation.

K-12 (US): CT elements include, and its development is assessed through: abstractions and pattern generalizations (including models and simulations); systematic processing of information; symbol systems and representations; algorithmic notions of flow of control; structured problem decomposition (modularizing); iterative, recursive and parallel thinking; conditional logic; efficiency and performance constraints; and debugging and systematic error detection [30].
Participants. Written permission was obtained from the Malaysian Ministry of Education (MOE), the State Education Departments (Jabatan Pendidikan Negeri, JPN) and the school principals. Teachers from 10 schools across Malaysia (Sabah, Sarawak, Selangor, Pulau Pinang, Johor and Kelantan) were randomly picked for this study. The participants were teachers who teach Primary 1 students and volunteered to take part in this investigation. The participating teachers, who taught different subjects, carried out the experiment with their students. They conducted their TL activities in the classroom according to their teaching plans over one semester, which is approximately 20 weeks. Each teacher carried out a pre-test before any lesson commenced and administered a post-test after completing the lessons, to measure the results of the TL conducted over the semester. The method of assessment was not set in the curriculum, so the teachers were free to decide which type of assessment to carry out.
Data Collection Instruments. The data collection instruments consisted of several items: a set of information on CT skills prepared by referring to available online resources, mainly Google for Education and My Digital Maker by MDEC; the instructions for the experiment; and the CT evaluation rubric (in Malay and English versions). The CT skills selected for the rubric are based on the CT skills mentioned in the MDEC teaching module for trainers [43]. The rubric has 6 performance indicators, namely 1-Very Limited, 2-Limited, 3-Fair, 4-Moderate, 5-Good and 6-Excellent, designed based on the reporting standard in the MOE template [5, 39-41]. Each performance indicator is described with an expected CT learning outcome (a sketch of how such a rubric can be represented is given below). As part of this study, a feedback form was distributed (by post), and the teachers were required to fill in the form each time they conducted an assessment. The form required participating teachers to record the following information: the number of students; a short summary of the lesson; the CT concepts intended to be covered (a list of possible selections was provided); the mode of assessment (selected from oral test, written test, games, presentation, role-play and project, with an "other" option for stating any other assessment method); and a free-text column for any other comments or observations they wished to record. To acquire the teachers' perception of the proposed rubric, 5 Likert-type scale questions were included in the form, adapted and modified from multiple resources [25, 36, 46]. An open-ended section was provided for the teachers to state their concerns about the proposed rubric as a tool for assessing their TL outcomes.
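For illustration, the rubric can be viewed as a mapping from each assessed CT skill and performance level to a descriptor of the expected learning outcome. The full descriptor text is not reproduced in this paper, so the skill name and descriptors in the following Python sketch are hypothetical placeholders; only the six level names are taken from the rubric itself.

    # Minimal sketch of the 6-level rubric structure; descriptors are hypothetical.
    LEVELS = {1: "Very Limited", 2: "Limited", 3: "Fair",
              4: "Moderate", 5: "Good", 6: "Excellent"}

    # Hypothetical descriptors for one CT skill; the actual rubric describes an
    # expected CT learning outcome for every skill/level pair.
    RUBRIC = {
        "pattern recognition": {
            1: "cannot identify a repeating pattern, even with guidance",
            6: "identifies, extends and explains patterns independently",
            # levels 2-5 would be filled in from the actual rubric text
        },
    }

    def record_assessment(student_id: str, skill: str, level: int) -> dict:
        """Record one pre- or post-test observation against the rubric."""
        if level not in LEVELS:
            raise ValueError("rubric level must be an integer from 1 to 6")
        return {"student": student_id, "skill": skill,
                "level": level, "indicator": LEVELS[level]}

    print(record_assessment("S001", "pattern recognition", 4))
    # {'student': 'S001', 'skill': 'pattern recognition', 'level': 4, 'indicator': 'Moderate'}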
Data Analysis. A descriptive quantitative method was applied to analyze the data collected during the pre-test and the post-test, and the data were analyzed using SPSS. Cronbach's alpha was applied to measure the internal consistency of the questionnaire items in the survey form used to acquire the teachers' perception of the proposed rubric [60].
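Cronbach's alpha for k items is alpha = (k / (k - 1)) * (1 - sum_i var_i / var_total), where var_i is the variance of item i and var_total is the variance of each respondent's total score. The paper computes it in SPSS; the same statistic can be cross-checked directly, as in the Python sketch below. The rating matrix shown is hypothetical, since the raw survey responses are not published here.

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x k_items) matrix of item scores."""
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical ratings: 9 teachers answering 5 Likert-type items.
    ratings = np.array([
        [4, 4, 5, 4, 4], [5, 5, 5, 5, 4], [3, 3, 4, 3, 3],
        [4, 5, 4, 4, 5], [2, 2, 3, 2, 2], [5, 4, 5, 5, 5],
        [4, 4, 4, 4, 4], [3, 4, 3, 3, 4], [5, 5, 4, 5, 5],
    ])
    print(f"alpha = {cronbach_alpha(ratings):.3f}")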
Results. A total of 9 teachers participated voluntarily in the experiment, with 359 students involved in the TL activities. The assessment methods carried out by the teachers were written tests, games, singing and calculation exercises. Table 2 shows the number of students involved in the respective assessments.
Table 2: Number of students involved in the respective assessments

Teacher  Students  Subject      Type of assessment    Tests           Assessed CT skills
1-3      …         …            …                     Pre-/post-test  …
4        41        Science      Written test          Pre-/post-test  Logical reasoning, pattern recognition
5        36        Science      Written test          Pre-/post-test  Pattern recognition
6        36        Science      Written test          Pre-/post-test  Logical reasoning, pattern recognition
7        41        Mathematics  Calculation exercise  Pre-/post-test  Logical reasoning, decomposition
8        36        Mathematics  Written test          Pre-/post-test  Algorithm, pattern recognition
9        40        Mathematics  Game                  Pre-/post-test  Logical reasoning, decomposition
9        37        Mathematics  Game                  Pre-/post-test  Logical reasoning, decomposition
Research question 1: Is there any difference in the learning outcomes before and after a lesson was conducted?
Table 3: Pre-test results by subject

Subject                  Score 1   Score 2   Score 3   Score 4   Score 5   Score 6
Mathematics     count    29        38        44        26        14        3
                %        18.8%     24.7%     28.6%     16.9%     9.1%      1.9%
Science         count    41        35        50        13        10        5
                %        26.6%     22.7%     32.5%     8.4%      6.5%      3.2%
Malay Language  count    7         22        9         8         5         0
                %        13.7%     43.1%     17.6%     15.7%     9.8%      0.0%
Table 3 shows the marks for the pre-test by subject. For Mathematics, 28.6% (44) of the students attained a score of 3, followed by scores of 2 (24.7%) and 1 (18.8%); only 9.1% and 1.9% of the students attained the higher scores of 5 and 6 respectively. For Science, 32.5% of the students attained a score of 3, followed by scores of 1 (26.6%) and 2 (22.7%); only 6.5% and 3.2% attained scores of 5 and 6 respectively. For Malay Language, 43.1% of the students attained a score of 2, followed by scores of 3 (17.6%), 4 (15.7%), 1 (13.7%) and 5 (9.8%).
Table 4: Post-test results by subject (count of students per score)

Subject          Score 1   Score 2   Score 3   Score 4   Score 5   Score 6
Mathematics      1         7         33        63        40        10
Science          3         10        35        48        39        19
Malay Language   0         3         18        23        7         0
Table 4 shows the marks for the post-test by subject. For Mathematics, 40.9% of the students managed to get a score of 4, followed by scores of 5 (26.0%), 3 (21.4%), 6 (6.5%) and 2 (4.5%); the most common score attained by the students rose slightly, from 3 in the pre-test to 4 in the post-test. For Science, 31.2% of the students attained a score of 4, followed by scores of 5 (25.3%), 3 (22.7%), 6 (12.3%), and 2 and 1 with 6.5% and 1.9% respectively; the number of students attaining higher scores in Science also increased, with the most common score again rising from 3 to 4. For Malay Language, 45.1% of the students attained a score of 4, followed by scores of 3 (35.3%), 5 (13.7%) and 2 (5.9%); for this subject the most common score rose markedly, from 2 in the pre-test to 4 in the post-test.
Table 5 presents the mean scores from pre-test to post-test by subject. For Mathematics, the mean score of 2.79 in the pre-test increased to 4.06 in the post-test. For Science, the pre-test mean of 2.55 increased to 4.08 in the post-test. For Malay Language, the pre-test mean of 2.65 increased to 3.67 in the post-test. Overall, we observed that the mean scores for all subjects increased, and more students attained higher marks in the post-test than in the pre-test. This finding suggests that the students managed to learn and understand the respective subjects after TL commenced, and that the proposed rubric successfully assessed students' performance in the assessed subjects.
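The means in Table 5 follow directly from the score distributions in Tables 3 and 4: each mean is the average of the rubric levels 1 to 6 weighted by the number of students at each level. The short Python sketch below reproduces the reported values from those counts.

    import numpy as np

    levels = np.arange(1, 7)  # rubric levels 1 (Very Limited) .. 6 (Excellent)

    # Counts of students per level, from Table 3 (pre-test) and Table 4 (post-test).
    counts = {
        "Mathematics":    {"pre": [29, 38, 44, 26, 14, 3], "post": [1, 7, 33, 63, 40, 10]},
        "Science":        {"pre": [41, 35, 50, 13, 10, 5], "post": [3, 10, 35, 48, 39, 19]},
        "Malay Language": {"pre": [7, 22, 9, 8, 5, 0],     "post": [0, 3, 18, 23, 7, 0]},
    }

    for subject, tests in counts.items():
        pre = np.average(levels, weights=tests["pre"])    # weighted mean of levels
        post = np.average(levels, weights=tests["post"])
        print(f"{subject}: pre = {pre:.2f}, post = {post:.2f}")
    # Mathematics: pre = 2.79, post = 4.06
    # Science: pre = 2.55, post = 4.08
    # Malay Language: pre = 2.65, post = 3.67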
Research question 2: Is the evaluation rubric able to measure students' CT skills after a lesson?
Table 6: Pre-test results by CT concept (count of students per score)

CT skill             Score 1   Score 2   Score 3   Score 4   Score 5   Score 6
Logical reasoning    53        51        49        23        14        5
Algorithm            2         6         11        12        5         0
Pattern recognition  18        28        39        8         6         3
Abstraction          4         10        4         4         4         0
Table 6 shows the pre-test scores by the CT skills involved; four of the six CT skills were assessed in this research. For logical reasoning, 27.2% of the students attained the lowest score, followed by scores of 2 (26.2%), 3 (25.1%), 4 (11.8%), 5 (7.2%) and 6 (2.6%). For the CT skill involving algorithms, 33.3% of the students attained a score of 4, followed by scores of 3 (30.6%), 2 (16.7%), 5 (13.9%) and 1 (5.6%). For pattern recognition, 38.2% of the students attained a score of 3, followed by scores of 2 (27.5%), 1 (17.6%), 4 (7.8%), 5 (5.9%) and 6 (2.9%). For abstraction, 38.5% of the students attained a score of 2; no student attained a score of 6.
Table 7: Post-test results by CT concept (count of students per score)

CT skill             Score 1   Score 2   Score 3   Score 4   Score 5   Score 6
Logical reasoning    2         12        44        75        46        16
Algorithm            0         1         4         16        13        2
Pattern recognition  2         4         28        32        25        11
Abstraction          0         3         10        11        2         0
Table 7 presents the marks attained by the students in the post-test according to the CT skills involved. For logical reasoning, 38.5% of the students attained a score of 4, followed by scores of 5 (23.6%), 3 (22.6%), 6 (8.2%), 2 (6.2%) and 1 (1.0%). For algorithms, 44.4% of the students attained a score of 4, followed by scores of 5 (36.1%), 3 (11.1%), 6 (5.6%) and 2 (2.8%). For pattern recognition, 31.4% of the students attained a score of 4, followed by scores of 3 (27.5%), 5 (24.5%), 6 (10.8%), 2 (3.9%) and 1 (2.0%). For abstraction, 42.3% of the students attained a score of 4, followed by scores of 3 (38.5%), 2 (11.5%) and 5 (7.7%); no student attained a score of 1 or 6.
Table 8 presents the mean scores achieved by the students according to the CT skills assessed. Overall, the mean scores for all assessed CT skills increased. The CT skill with the largest gain from pre-test to post-test was logical reasoning, which increased by 1.49, followed by pattern recognition (+1.39), algorithms (+0.98) and abstraction (+0.69). This finding suggests that the students were able to learn and acquire CT skills after the TL of the subjects was conducted, and that the proposed rubric managed to record students' CT understanding before and after a lesson.
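The mean scores in Table 8 can likewise be recovered from the per-level counts in Tables 6 and 7 as weighted averages, exactly as for the subject means above. The Python sketch below reproduces the reported gains (means are rounded to two decimals before differencing, which matches the paper's +0.98 for algorithms).

    import numpy as np

    levels = np.arange(1, 7)

    # Counts per score level from Table 6 (pre-test) and Table 7 (post-test).
    ct_counts = {
        "Logical reasoning":   {"pre": [53, 51, 49, 23, 14, 5], "post": [2, 12, 44, 75, 46, 16]},
        "Algorithm":           {"pre": [2, 6, 11, 12, 5, 0],    "post": [0, 1, 4, 16, 13, 2]},
        "Pattern recognition": {"pre": [18, 28, 39, 8, 6, 3],   "post": [2, 4, 28, 32, 25, 11]},
        "Abstraction":         {"pre": [4, 10, 4, 4, 4, 0],     "post": [0, 3, 10, 11, 2, 0]},
    }

    for skill, tests in ct_counts.items():
        pre = round(np.average(levels, weights=tests["pre"]), 2)
        post = round(np.average(levels, weights=tests["post"]), 2)
        print(f"{skill}: {pre} -> {post} (gain {post - pre:+.2f})")
    # Logical reasoning: 2.53 -> 4.02 (gain +1.49)
    # Algorithm: 3.33 -> 4.31 (gain +0.98)
    # Pattern recognition: 2.66 -> 4.05 (gain +1.39)
    # Abstraction: 2.77 -> 3.46 (gain +0.69)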
Research question 3: What are the comments from the teachers about the evaluation rubric?
The teachers were then asked to respond to 5 statements regarding their perception of the evaluation rubric. To test whether the statements measure the same underlying concept, Cronbach's alpha was used. Nunnally and Bernstein [47] provide guidance on interpreting the reliability coefficient, stating that a value of 0.70 is sufficient for the early stages of a research. As shown in Table 9, the questionnaire achieved excellent reliability with α = 0.936; the questions in the survey form are therefore reliable, and the questionnaire is internally consistent.

Table 9: Reliability statistics of the perception questionnaire
Cronbach's alpha = .936; N of items = 5
Table 10: Summary of teachers' perception of the proposed rubric (responses on a five-point scale: Strongly disagree, Disagree, Not sure, Agree, Strongly agree)
Table 10 shows the teachers' level of perception of the evaluation rubric. 5 of the respondents agreed and 3 strongly agreed that they could measure students' CT skills using the newly introduced rubric. On the other hand, 2 of the respondents had difficulties using the evaluation rubric to assess students' performance. Despite these difficulties, 8 of the respondents agreed or strongly agreed that the newly introduced rubric was suitable for assessing students' CT skills in their respective subjects. The majority of the teachers also believed that the rubric can be used to help them plan their TL strategy. 2 of the respondents were not sure whether they would use the evaluation rubric in future, and 1 respondent disagreed with using it; nevertheless, 6 respondents agreed or strongly agreed that they would use the rubric as a method to assess students' CT skills in future. In the form, the teachers were also required to state their concerns regarding the integration of the proposed rubric into their daily TL practices. Only 7 of the 9 participating teachers stated their concerns. 2 teachers were concerned about the duration of assessment required if a bigger group of students were involved. 2 teachers were concerned about the use of the proposed rubric with written tests; these were the teachers who had conducted non-written assessments. 2 respondents stated that they would use the rubric if it were made compulsory to measure CT skills in their assessment
practices. The other 3 teachers stated that they were satisfied with the rubric but stressed that training should be provided to teachers before application of the rubric commences.
Conclusion. The present study was designed to describe the usability of the proposed rubric as an assessment tool for measuring CT TL outcomes. Positive outcomes were found in terms of the rubric's usability in measuring CT TL outcomes, despite 4 different assessment methods being carried out.
The results showed that most of the students developed a better understanding of CT skills after the TL activities commenced, with the majority improving from a very limited understanding of CT concepts to a relatively uniform distribution of understanding across levels 4, 5 and 6. The teachers indicated that the proposed rubric was flexible enough to be used with 4 different assessment methods. The proposed rubric also proved practical, as 7 of the 9 participating teachers conducted written tests as their assessment, and 4 teachers conducted assessments with more than 30 students in a class.
This small-scale study has important implications for incorporating CT into Malaysian classrooms: although some researchers have suggested methods to evaluate the learning outcomes of CT skills, most of these methods are applicable only to programming environments [8, 12, 33, 56]. As CT skills are to be integrated from Primary 1 and continued to secondary level, teachers will need a mechanism to measure TL outcomes that aligns with the existing curriculum design, and this rubric might be one of the tools applicable in the Malaysian education environment. Future studies will include a bigger number of participants in the experiment to examine teachers' use of the proposed rubric.
References
1 Sheikh Iqbal Ahamed, Dennis Brylow, Rong Ge, Praveen Madiraju, Stephen J. Merrill, Craig A. Struble, and James P. Early, 'Computational Thinking for the Sciences: A Three Day Workshop for High School Science Teachers', in Proceedings of the 41st ACM Technical Symposium on Computer Science Education (ACM, 2010), pp. 42-46.
2 Aman Yadav, Chris Stephenson, and Hai Hong, 'Computational Thinking for Teacher Education', Communications of the ACM (2017).
3 Aman Yadav, Jon Good, Joke Voogt, and Petra Fisser, 'Computational Thinking as an Emerging Competence Domain', in Technical and Vocational Education and Training: Issues, Concerns and Prospects, ed. by M. Mulder, Vol. 23 (Switzerland: Springer International Publishing, 2017).
4 Gabriella Anton, 'Power of Play: Exploring Computational Thinking through Game Design', The Velvet Light
Trap - A Critical Journal of Film and Television (2013), 74-75.
5 Bahagian Pembangunan Kurikulum, Kementerian Pendidikan Malaysia [Curriculum Development Division, Ministry of Education Malaysia], TMK dalam Kurikulum Standard Sekolah Rendah (Semakan 2017) [ICT in the Primary School Standard Curriculum (2017 Revision)], 2017.
6 David Barr, John Harrison, and Leslie Conery, 'Computational Thinking: A Digital Age Skill for Everyone',
Learning & Leading with Technology, 38 (2011), 20-23.
7 Valerie Barr, and Chris Stephenson, 'Bringing Computational Thinking to K-12: What Is Involved and What Is
the Role of the Computer Science Education Community?', ACM Inroads, 2 (2011), 48-54.
8 Ashok Basawapatna, Kyu Han Koh, Alexander Repenning, David C. Webb, and Krista Sekeres Marshall, 'Recognizing Computational Thinking Patterns', in Proceedings of the 42nd ACM Technical Symposium on Computer Science Education (Dallas, TX, USA: ACM, 2011), pp. 245-50.
9 Ashok R. Basawapatna, Kyu Han Koh, and Alexander Repenning, 'Using Scalable Game Design to Teach Computer Science from Middle School to Graduate School', in Proceedings of the Fifteenth Annual Conference on Innovation and Technology in Computer Science Education (ACM, 2010), pp. 224-28.
10 BERNAMA, 'Pemikiran Komputasional, Sains Komputer Akan Diajar Di Sekolah Tahun Depan' [Computational Thinking and Computer Science Will Be Taught in Schools Next Year], Utusan Online, 11 August 2016.
11 Matt Bower, and Katrina Falkner, 'Computational Thinking, the Notional Machine, Pre-Service Teachers, and Research Opportunities', in Proceedings of the 17th Australasian Computing Education Conference (ACE 2015) (2015), p. 30.
12 Karen Brennan, and Mitchel Resnick, 'New Frameworks for Studying and Assessing the Development of Computational Thinking', in Proceedings of the 2012 Annual Meeting of the American Educational Research Association (Vancouver, Canada, 2012).
13 Sheryl Buckley, 'The Role of Computational Thinking and Critical Thinking in Problem Solving in a Learning Environment' (Kidmore End: Academic Conferences International Limited, 2012), pp. 63-XI.
14 Simon Collins, 'All Kiwi Kids to Learn "Computational Thinking" for 10 Years', NZ Herald, 28 June 2017.
15 Computer Science Teachers Association (CSTA), 'CSTA: The Voice of K-12 Computer Science Education and Its Educators', CSTA Voice: Computational Thinking (2016).
16 Jan Cuny, Larry Snyder, and Jeannette M. Wing, 'Demystifying Computational Thinking for Non-Computer Scientists', unpublished manuscript in progress, referenced in http://www.cs.cmu.edu/~CompThink/resources/TheLinkWing.pdf (2010).
17 Paul Curzon, Mark Dorling, Thomas Ng, Cynthia Selby, and John Woollard, 'Developing Computational Thinking in the Classroom: A Framework' (Computing At School, 2014).
44 Lio Moscardini, 'Developing Equitable Elementary Mathematics Classrooms through Teachers Learning About Children's Mathematical Thinking: Cognitively Guided Instruction as an Inclusive Pedagogy', Teaching and Teacher Education, 43 (2014), 69-79.
45 Mun Y. Yi, and Yujong Hwang, 'Predicting the Use of Web-Based Information Systems: Self-Efficacy, Enjoyment, Learning Goal Orientation, and the Technology Acceptance Model', International Journal of Human-Computer Studies, 59 (2003), 431-49.
46 Andrea Niosi, 'Creating Rubrics for Assessment' (2012).
47 Jum C. Nunnally, and I. H. Bernstein, 'The Assessment of Reliability', Psychometric Theory, 3 (1994), 248-92.
48 Kian L. Pokorny, and Nathan White, 'Computational Thinking Outreach: Reaching across the K-12 Curriculum',
J. Comput. Sci. Coll., 27 (2012), 234-42.
49 Dylan J. Portelance, 'Code and Tell: An Exploration of Peer Interviews and Computational Thinking with ScratchJr in the Early Childhood Classroom' (M.A., Tufts University, 2015), p. 77.
50 Siwarak Promraksa, Kiat Sangaroon, and Maitree Inprasitha, 'Characteristics of Computational Thinking About
the Estimation of the Students in Mathematics Classroom Applying Lesson Study and Open Approach', in Journal of
Education and Learning (Toronto: Canadian Center of Science and Education, 2014), pp. 56-66.
51 Christie Lee Lili Prottsman, 'Computational Thinking and Women in Computer Science' (M.S., University of
Oregon, 2011), p. 51.
52 Alexander Repenning, David Webb, and Andri Ioannidou, 'Scalable Game Design and the Development of a Checklist for Getting Computational Thinking into Public Schools', in Proceedings of the 41st ACM Technical Symposium on Computer Science Education (ACM, 2010), pp. 265-69.
53 Mitchel Resnick, John Maloney, Andrés Monroy-Hernández, Natalie Rusk, Evelyn Eastmond, Karen Brennan,
Amon Millner, Eric Rosenbaum, Jay Silver, and Brian Silverman, 'Scratch: Programming for All', Communications of
the ACM, 52 (2009), 60-67.
54 Justin Richards, 'Computational Thinking: A Discipline with Uses Outside the Computer Lab?', Computer
Weekly (2007), 52.
55 Se Young Park, and Young Jeon, 'Teachers' Perception on Computational Thinking in Science Practices', Vol. 9 (2015), p. 5.
56 Linda Seiter, and Brendan Foreman, 'Modeling the Learning Progressions of Computational Thinking of Primary Grade Students', in Proceedings of the Ninth Annual International ACM Conference on International Computing Education Research (San Diego, California, USA: ACM, 2013), pp. 59-66.
57 Cynthia Selby, and John Woollard, 'Computational Thinking: The Developing Definition', (2013).
58 Pratim Sengupta, John S. Kinnebrew, Satabdi Basu, Gautam Biswas, and Douglas Clark, 'Integrating
Computational Thinking with K-12 Science Education Using Agent-Based Computation: A Theoretical Framework',
Education and Information Technologies, 18 (2013), 351-80.
59 Karamjit Singh, 'Computational Thinking Comes to the Fore in Malaysian Schools', Digital News Asia (DNA), 12 August 2016.
60 Mohsen Tavakol, and Reg Dennick, 'Making Sense of Cronbach's Alpha', International journal of medical
education, 2 (2011), 53.
61 Allen Tucker, 'A Model Curriculum for K–12 Computer Science', in Final Report of the ACM K–12 Task Force Curriculum Committee, ed. by Computer Science Teachers Association (New York: Computer Science Teachers Association, 2003).
62 L. Ling Ung, Tammie C. Saibin, Jane Labadin, and Norazila Abdul Aziz, 'Preliminary Investigation: Teachers' Perception on Computational Thinking Concepts', Journal of Telecommunication, Electronic and Computer Engineering (JTEC) (2017).
63 Viswanath Venkatesh, Michael G Morris, Gordon B Davis, and Fred D Davis, 'User Acceptance of Information
Technology: Toward a Unified View', MIS quarterly (2003), 425-78.
64 Garret Walliman, 'Genost: A System for Introductory Computer Science Education with a Focus on
Computational Thinking' (M.S., Arizona State University, 2015), p. 374.
65 Andrea Elizabeth Weinberg, 'Computational Thinking: An Investigation of the Existing Scholarship and
Research' (Ph.D., Colorado State University, 2013), p. 99.
66 Michael Philetus Weller, Ellen Yi-Luen Do, and Mark D. Gross, 'Escape Machine: Teaching Computational Thinking with a Tangible State Machine Game', in Proceedings of the 7th International Conference on Interaction Design and Children (ACM, 2008), pp. 282-89.
67 Jeannette M. Wing, 'Computational Thinking', Communications of the ACM, 49 (March 2006), 33-35.
68 Aman Yadav, Chris Mayfield, Ninger Zhou, Susanne Hambrusch, and John T. Korb, 'Computational Thinking in Elementary and Secondary Teacher Education', ACM Transactions on Computing Education, 14 (2014), 1-16.
69 Aman Yadav, Ninger Zhou, Chris Mayfield, Susanne Hambrusch, and John T. Korb, 'Introducing Computational Thinking in Education Courses', in Proceedings of the 42nd ACM Technical Symposium on Computer Science Education (ACM, 2011), pp. 465-70.