
International Journal of Technology and Design Education (2022) 32:267–285

https://doi.org/10.1007/s10798-020-09608-8

T/E design based learning: assessing student critical thinking and problem solving abilities

Susheela Shanta (Governor's STEM Academy, Roanoke, VA, USA; sshanta@rcps.us) · John G. Wells (Virginia Tech, Blacksburg, VA, USA; jgwells@vt.edu)

Accepted: 26 June 2020 / Published online: 7 July 2020


© Springer Nature B.V. 2020

Abstract
The research presented is of an investigation into the critical thinking (CT) and problem
solving (PS) abilities used by high school technology and engineering (T/E) students when
attempting to achieve a viable solution for an authentic engineering design-no-make chal-
lenge presented outside the context of the classroom in which their STEM content was first
learned. Five key abilities were identified and assessed as indicators of a student’s ability
to problem solving within the context of authentic engineering design. Findings from data
analyses indicates T/E students who acquire STEM content through T/E design base learn-
ing demonstrate significantly better CT and PS abilities in designing an engineering solu-
tion compared with a hypothesized mean for students receiving their STEM content via
traditional classroom instruction. Furthermore, student abilities associated with selecting
and utilizing relevant science and math content and practices, and communicating logical
reasoning in their design solution were found to be critical to successful problem solving.

Keywords  Design based · Authentic problems · T/E DBL · Critical thinking · Problem
solving · Student abilities

Motivation

The rapid pace of change, the complexity of human problems, and the ease of global
access to technologies and human resources have created the demand for individu-
als well prepared to utilize their knowledge of science, technology, engineering and
mathematics (STEM) in collaboration with professionals from diverse disciplines to
solve complex novel problems. The European Commission for New Assessment Tools
for Cross-Curricular Competencies in the Domain of Problem Solving (NATCCC-PS)
brought together various experts in 2000 to establish appropriate definitions for the
required student competencies in the 21st century (NATCCC 2000). These experts
agreed that critical thinking and problem solving were cross curricular competences
required within the context of increasingly complex sociocultural demands. In recog-
nizing the relationship between critical thinking, problem solving, and sociocultural
needs, their charge to education writ large was to prepare students able to meet the
demands of their complex technological world. One key conclusion drawn by the com-
mittee was that learning to problem-solve required competence “within and between
several subjects” (p. 14); i.e., transdisciplinary competence. Preparing individuals
who possess sufficient transdisciplinary competence is of great importance for meet-
ing tomorrow’s sociocultural demands. However, evidence suggests that progress in
doing so is hampered by most practitioners who still interpret the integration of STEM
as simply meaning an increased emphasis on their silo discipline and who continue to
teach adhering to a mono-discipline pedagogy (Wells 2013, 2016a, b, c).
Preparing learners to succeed in their chosen postsecondary STEM disciplines is
dependent upon a clearly articulated curricular framework and definition for achieving
STEM literacy: the ability to use content knowledge and skills in science, technol-
ogy, engineering and math in solving human problems in a collaborative manner (NRC
2009; Wiggins and McTighe 2005). Literacy as defined within these individual disci-
plines is the ability to understand information or claims in the context of those disci-
plines, evaluate the validity of the same, and communicate evaluations and predictions
with a vocabulary that is consistent within those disciplines (NGSS Lead States 2013;
NCTM 2000). Defining literacy from within a mono-disciplinary context promotes
the misconception that literacy in one discipline is separate from that in another. This
notion of artificially separating disciplinary literacy is not representative of authentic
or real world situations where individuals are challenged to devise solutions or solve
problems requiring knowledge at the intersections of STEM disciplines (NATCCC
2000; Pope et al. 2015). Regardless of their postsecondary goals, for students to suc-
ceed in their increasingly complex technological world they must possess key critical
thinking and problem solving abilities that enable them to function within the con-
tinuum of disciplinary demands. Preparing such individuals who are able to meet the
cognitive demands for solving complex problems necessitates that STEM education be
intentionally integrative in its approach (Huber and Hutchings 2004) and best facili-
tated by technological/engineering design based learning (T/E DBL) pedagogical strat-
egies requiring solutions within complex problem scenarios (Fortus et al. 2004; Barak
and Assal 2016; Wells 2017).
As an instructional strategy, T/E DBL is unique in how cognitive demands inherent
to complex design problems are naturally imposed on the learner throughout the phases
of design. The inherent cognitive demands require learners to use key critical think-
ing skills in conjunction with problem solving skills as a means for making informed
decisions leading to viable design solutions. However, to date little empirical research
is available documenting the validity of T/E DBL for promoting student use of criti-
cal thinking (CT) and problem solving (PS) skills while engaged in authentic design
challenges. In light of this, the research presented in this paper addresses that need
for empirical evidence. Specifically, this research was designed to empirically demon-
strate the correlation between student engagement in T/E DBL and the extent to which
they use key CT and PS abilities in arriving at a viable solution for an authentic T/E
design based learning challenge. The research was conducted in a mid-eastern US high
school Governor’s STEM Academy where the Integrative STEM Education pedagogi-
cal approach is employed throughout the curriculum.


Theoretical framework

I‑STEM ED pedagogical approach

Integrative STEM Education (I-STEM ED) is the use of technological/engineering design
based learning (T/E DBL) to teach Technological/Engineering content and practices, and
to concurrently and intentionally teach other targeted STEM content/practices inherent
within the design of the solution. When effectively employed, this pedagogical approach
promotes active learning through T/E design challenges within an environment of struc-
tured student discovery. This type of teaching environment provides the context for the
challenge, along with the impetus for learning and utilizing science and mathematics con-
tent/practices through design of a solution, guided by an instructional intent to foster active
construction of understanding through concrete experiences (Wells 2016b, p. 15); the goal
is to develop habits-of-mind as derived from habits-of-hand through a pedagogy where
learners design to understand (D2U) (Wells 2017).
The T/E DBL approach engages students in a design challenge as the pedagogical strat-
egy through which students learn. Design, a process that is inseparable from innovation, is
a socio-collaborative activity within which a group of people tackle an ill-defined, authen-
tic problem constrained by the resources available and parameters of real-world conditions.
Designing is the unifying practice among the technology and engineering disciplines, and
educationally serves as the pedagogical commons (Wells 2019) for both domains. Though
designing serves as the unifying practice, instructionally it is important to recognize the
distinctions in the targeted T/E design outcomes (Daugherty et  al. 2011). Technological
design entails the creation or innovation of a technological artifact, with learning outcomes
largely focused on student demonstration of descriptive (declarative), prescriptive (proce-
dural), and tacit knowledge in addressing the purpose, function, and fabrication of the arti-
fact. The end-goal for engineering design differs in that the outcome is a set of specifica-
tions for either fabricating a device or implementing a system as an engineering solution
(design-no-build). T/E DBL employs a transdisciplinary approach to design based learn-
ing that achieves a blend of learning end-goals. Pedagogically, regardless of discipline, the
design based learning approach engages students in a T/E design challenge that naturally
imposes a need to know the STEM content and practices that are both requisite and inher-
ent to the design of the technological/engineering solution. And as with most problems in
life, T/E design problems are ill-defined, ill-structured, and have multiple viable solutions
(Ormrod 2012). The best solutions involve optimization within constraints and will
lead to the greatest benefits. Student construction of knowledge occurs within the relevant
context of the design challenge, the culture of peer-to-peer questioning, and the researching
of unknown variables in a collaborative environment (Brown et al. 1989; Bransford et al.
2000).
This student-centered instructional style has been found to be superior to the traditional
teacher-centered instructional style (Felder and Brent 2016). Using a process of (1) iden-
tifying and defining the problem that must be solved, (2) determining criteria for the solu-
tion, and, (3) using appropriate disciplinary content areas relative to the problem, students
engage in a collaborative process of questioning, researching and ideating to design, build,
evaluate and re-iterate until a satisficing solution (Middleton 2005) is created. The teaching
strategies used in T/E DBL adhere to Gagne’s theory of information processing where les-
sons are designed around events of instruction to intentionally promote student knowledge
construction in the procedural, declarative, schematic and strategic knowledge domains

(Gagne et al. 2004; Wells 2016c). Iterations throughout the phases of design require stu-
dents to reflect on their existing knowledge in all relevant subject areas as a means for
constructing new knowledge, which is critical to their developing habits of mind (Jonassen
1997; Wells 2017). The predictive nature of designing a solution and testing the model or
prototype through multiple iterations presents opportunities (with instructional guidance)
for students to refine their understandings as a result of being confronted with internal cog-
nitive dissonance generated by realized inconsistencies (Festinger 1962; Nicholas and Oak
2020; Puntambekar and Kolodner 2005). The central tenet of education is to increase stu-
dent understanding, and to foster their ability to think critically as a result. Many research-
ers have argued that T/E design based learning is an effective approach for increasing
students' understanding through just such cognitive dissonance (Cajas 2001; Wells 2010;
Puntambekar and Kolodner 2005; Barlex 2003; Puente et al. 2013) and which ultimately
leads to their development of higher order thinking skills.

Cognitive science underpinning I‑STEM ED pedagogical approach

A widely-accepted model explaining the process of learning posits that human senses
receive new input, which is held in a sensory register for a fraction of a second before
being passed on to working memory. In working memory, the information is processed
and either stored in long-term memory or lost. New information becomes stored in long-
term memory when learners make strong emotional associations with the information or
make sense of it through connections to some previously stored information (Bransford
et al. 2000; Brown et al. 1989). Without the strong associations or the sense-making con-
nections, the new information is lost.
Although learners’ construction of knowledge is linked to the activity, context, and cul-
ture in which it is learned, constructing knowledge requires the individual to be engaged as
the sole or co-constructor of his/her knowledge, uniquely related to his/her previous knowl-
edge (Brown et  al. 1989). Repeated retrieval of stored information strengthens the inter-
connectedness of stored information and enhances its future recall at appropriate times.
In alignment with social constructivist practices and with cognitive science research, the
I-STEM ED pedagogical approach supports the development of cognitive connections
required for integration and the transfer of knowledge and skills to novel situations (Huber
and Hutchings 2004; Brown and Kane 1988; Perkins and Salomon 1989).
Building from research in cognitive science regarding student learning, cognitive
demands naturally imposed through T/E design based learning challenges will require stu-
dent use of the following four knowledge types (NAGB 2009; Shavelson et al. 2005):

• Declarative knowledge (includes definitions, concepts and principles of a subject)
• Procedural knowledge (those practices used within the subject or discipline)
• Schematic knowledge (the reasoning and relationship between concepts)
• Strategic knowledge (recognizing when and where to utilize concepts).

These hierarchical knowledge types reflect a progression toward deeper learning
when students are able to exhibit the interconnections between concepts and disci-
plines and demonstrate the use of the knowledge in developing a solution to the posed
design challenge (Webb 1997). However, isolated singular experiences of T/E DBL
where students learn in this manner will not achieve the goal of developing the desired
habits of mind and hand. Needed are repeated experiences in T/E DBL throughout
their education where students will develop these habits.
Problem solving skills are not assessed in traditional science and mathematics
standardized testing. Assessment of student problem solving abilities in the traditional
classroom focuses primarily on the extent of correctness of the end-result, and rarely,
if ever, on the reasoning or procedures leading to the result (Docktor and Heller 2009;
Shavelson et al. 2003; Steif and Dantzler 2005). Furthermore, the content knowledge
tested is directly related to what has been recently taught in the classroom, which does
not require the solver to demonstrate the transfer of knowledge or the higher order
thinking processes required for selection of discipline specific content and practice
knowledge.
In an integrative STEM education program, the T/E DBL approaches are consist-
ently used “to intentionally teach content and practices of science and mathematics
education through the content and practices of technology/engineering education”
(Wells and Ernst 2012/2015). In Virginia, Career and Technical Education (CTE)
offerings have an engineering coursework sequence available in grades 9–12 that can
serve as a framework for an engineering program. Engineering by Design™ is the
flagship curriculum for ITEEA that uses a select series of CTE courses to provide an
engineering curriculum for grades 9–12. Project Lead the Way® is yet another P-12
engineering curriculum that includes both introductory and advanced course work in
engineering. Yet in spite of their availability, widespread delivery of these programs
is not realized, due primarily to a lack of funding, qualified teachers, or adequate
instructional space. Furthermore, these pro-
grams are independent of the science and math coursework that students take in their
high-schools. As a result, not all students are provided an opportunity to participate
in such design based learning experiences. And even among those who do participate,
most students are not afforded the opportunity to experience an integrative STEM edu-
cation approach. Additionally, in many instances students cannot even choose to take
these courses because their schedules are already burdened with required core courses
and critical AP courses.
The Academy intentionally created a unique four-year curriculum to provide
increasingly complex engineering courses and integrate science and math courses in
each of the 4  years. Instruction is intended to focus students towards investigations
and explorations into the interrelated facets of the disciplines. Design challenges in
the engineering classes are the vehicles through which students are taught the tools of
engineering and the design process (analytical and visual), where the complexities of
design challenges are intentional for guiding students to learn and utilize science and
math content as needed; i.e., researching and learning new content and practices on a
need-to-know basis. For example, first-year students learn new tools and techniques
(hand tools, machinery, CAD, CAM, light programming, etc.) and undertake teacher-
guided design projects following a prescribed engineering design process. In contrast,
fourth-year students work in teams on open-ended design challenges addressing real-
world problems they identify within authentic contexts. Team members work collab-
oratively towards designing and implementing a solution with the instructor serving
primarily as the facilitator. Industry professionals participate in mentoring with their
specialized expertise and evaluations of the designed solution. By necessity, assess-
ments of student learning in such a program must be designed to evaluate higher order
thinking skills and problem solving abilities in addition to the more traditional stand-
ardized assessments.


Characterizing problem solving to develop assessment strategy

Researchers agree that problem solving writ large is a goal-driven process that requires
recognition of the nature of the problem, identification of the end-state that implies suc-
cess, creation of a strategy to go from the current-state to the end-state, execution of the
strategy, and adaptation of changes in strategy based on difficulties encountered along
the way (Barak and Assal 2018; Reeff 1999; Martinez 1998; Hayes 1989). Furthermore,
there are fundamental differences in problem solving among disciplines. In the natural
sciences, for example, the problems to be solved are those requiring causal explanations
of natural phenomena. In contrast, technological and/or engineering problems require
design solutions that do not necessarily result in the correct solution, but the best via-
ble solution within a set of imposed constraints and parameters. When a problem is
based in the real-world context with many unknowns, recognizing and understanding
the problem is key to the solver’s ability to devise a process for solving the problem.
This implies that two different real-world problems cannot be solved using the same
knowledge or the exact same process (Reeff 1999).
Problem solving has been characterized within two broad categories: (1) well-
structured, where all the information needed to solve the problem is provided, and (2)
ill-structured, where there are many unknowns, many conflicting goals and multiple
approaches possible for solving the problem (Jonassen 1997). Well-structured problems
are typical of problems practiced and assessed in the traditional science and mathemat-
ics classrooms. In contrast, ill-structured problems are more typical of real-world sit-
uations (known as authentic problems), and are more reflective of what professionals
will encounter in the workplace. Solving ill-structured problems requires the engagement
of multiple disciplinary experts, often involves several conflicting or vague goals, and
must proceed without all the necessary information being fully known. As presented in the T/E DBL
educational setting, a T/E design challenge, with or without prototyping, is closer to the
ill-structured end of the spectrum of problems (Heywood 2005) and more authentic. To
develop an effective assessment of student problem solving skills requires knowledge of
the processes that are involved in solving such problems.
For solving authentic problems, methods can be algorithmic, heuristic, or a combi-
nation of both (Martinez 1998; Jonassen 2000; Ormrod 2012). Algorithmic methods
are typical of mathematical problem solving in the context of a classroom, where students
learn step-by-step procedures for factoring quadratic functions or carrying out long
division. Heuristic methods are more like general strategies or rules employed
when engaged in an engineering design based problem that can only be solved through
execution and testing. Neither algorithmic methods nor general heuristics alone are
enough for solving authentic problems without deep understanding of the content areas
within which a given problem is embedded (Perkins and Salomon 1989). In an authentic
problem requiring the design of a solution and situated within the context of science and
mathematics instruction, the method may be heuristic in creating general strategies for
solving the problem, but algorithmic when it comes to utilizing specific mathematics
and science knowledge to solve a discrete sub-problem. Furthermore, frequently prac-
ticed and accessed pathways to content stored in long-term memory help with recogni-
tion of content areas relevant to the problem. Moreover, thinking and reasoning forward
(predictive analysis) to evaluate the results of any particular action helps the solver
progress logically towards a solution (Perkins and Salomon 1989).


Cognitive science reveals that the mental processes involved in problem solving are
based on prior knowledge and experiences of the solver (Ormrod 2012; Newell and Simon
1972). Prior knowledge is stored in long term memory and information gleaned from the
problem is stored in short term or working memory. The latter has limited capacity and
therefore can become overloaded during problem solving. This cognitive overload can hin-
der the solver’s ability to successfully complete the design of a solution. Therefore, the
science, mathematics and engineering methods of problem solving recommend identify-
ing and writing down (symbolically and visually) identified information (Heywood 2005;
Jonassen 1997; Jonassen et  al. 2006; Chi et  al. 1981). An essential practice unique to
technological and engineering design is communication of relevant details regarding the
identified problem through documentation that provides explanations necessary to ongo-
ing iterations in the development and implementation of a solution. As an integral part
of T/E design, documentation is central throughout the design process beginning with the
first phase of solving an engineering design based problem: identification of the problem
parameters, such as useful information given and unknowns.
The complex practices of T/E design impose on learners the cognitive demand for crit-
ical thinking, the development of which is fostered through use of detailed documenta-
tion whereby abstract thought is concretized through “visual mental imagery” (Middleton
2005, p. 66). Metacognition is involved in the mental activities of identifying and select-
ing appropriate conceptual knowledge, planning a strategy to use the conceptual knowl-
edge, and monitoring one’s progress towards a goal (Jonassen 1997; Chi et al. 1981; White
and Frederiksen 1998). When an authentic problem is encountered in a situation outside
the classroom where the content was learned, it creates a demand on the solver to recog-
nize the discipline or topic and the specific content and practices relevant to the problem.
Repeated experiences in solving problems of this type create strong interconnected organ-
ization of information between the long term memory of the solver and their practiced hab-
its of mind. In contrast, problems encountered in the traditional classroom are often related
directly to the content recently learned and the course or subject matter is used as the clue
for helping students recognize the discipline and topic. This approach, unlike T/E design
based learning, limits the transferability and utility of the learning and knowledge gained
during instruction to novel situations.
Prior research on development of problem solving skills indicates there are specific
skills associated with solving authentic, ill-structured T/E design problems: (1) recognizing
and identifying the problem, (2) recalling and organizing specific subject content relevant
to the problem, (3) carrying out the procedural steps that are common practices within the
subject, (4) looking back to see if the progression is logical, and, (5) stating the solution to
the identified problem (Newell and Simon 1972; Polya 1973; Perkins and Salomon 1989;
Heller and Reif 1984; Reeff 1999). Assessment of problem solving skills would provide
insight into the transferability of learning within the context of T/E design based learning.

Method

Research design

The research reported is based on the precepts of the Design-No-Make (DNM) approach
as characterized in 1999 by David Barlex through the Young Foresight initiative (Barlex
2003). The aim of the DNM is to help focus students' learning and teachers' instruction
toward the design phase instead of the culminating act of making the designed product.
This approach removes the typical distractions of making the prototype from the learning
experience, thereby affording students the opportunity to explore various ideas and con-
cepts in greater depth.
Research demonstrates the DNM approach to be valuable in helping students explore a
wide range of design criteria and develop greater understanding of both technological and
engineering concepts (Barlex and Trebell 2008). Furthermore, over the past several dec-
ades the use of DNM to demonstrate improved design cognition has been well documented
(Atman and Bursic 1998; Lammi 2011; Ball and Christensen 2019; Becker and Mentzer
2015; Gero and Kan 2016; Wells 2016a), providing support for its use as an instructional
strategy. As in previous studies, removing the making aspect from the design process mini-
mizes potential constraints of students’ limited tool skills or tacit knowledge, as well as
the availability of materials that would be required for implementation of their design. The
DNM approach therefore provides an opportunity for students to focus their efforts on the
utilization of domain-specific knowledge and decision making abilities. Specifically, those
associated with the utilization of the schematic and strategic knowledge domains (knowing
when, where and why) within the context of the design problem or challenge presented
(Wells 2016a). Consequently, the research being reported employed an exploratory design
as a means for investigating the extent to which student problem solving abilities are pro-
moted through the I-STEM ED approach, where the use of physics and mathematics is
imposed by the Design-No-Make engineering challenge, as compared to the traditional
mono-disciplinary teaching approach.

Participants

Equivalent academic preparation

Participants in the experiment (Academy) group were 11 students in 12th grade who had
completed Advanced Placement (AP) Calculus AB, and who were also currently enrolled
in AP Calculus BC. The five students who were part of the control group participants were
12th grade students in a traditional classroom who received the same curricula and level
of physics and mathematics preparation in grades 9 through 12 as those in the I-STEM
ED (Academy) program. As well, self-reported grade point averages (GPAs) for students
in both control and experiment groups indicated all participants were equally eligible for
valedictorian (at or above GPA 4.0) status in their schools. Furthermore, control group par-
ticipants in the traditional classroom, had they applied to the Academy during grade 9,
would have met the same academic eligibility requirements for admission to the Academy.
To further ensure that experiment and control groups were similar, the physics and math
curricula of both groups were examined for instructional consistency, and the caliber of
students was compared on the basis of equivalent GPAs. This examination
revealed no significant differences in physics and math instruction, and all students were
deemed academically equivalent.
In addition to the engineering courses taken every year, students in the Academy take
Algebra II, Pre-Calculus, AP Calculus AB, and AP Calculus BC consecutively in grades
9 through 12. During the 11th grade, students in the Academy take Pre-AP Chemistry and
Physics. All the mathematics and science courses, inclusive of Algebra II and Physics, in
the Academy follow the same curricula used by all schools within the school system where
the Academy was situated. As previously noted, the design-no-make challenge (DNMC) used for the study was aligned
with the Algebra II and Physics curricula in the school system. The experiment group
would therefore have received the same math and science preparation as all other students
in their school system. Consequently, the only instructional difference between experiment
and control groups is the added engineering courses and the integrative STEM education
pedagogical approach used at the academy.

Measures

Assessment of problem solving skills

The T/E design based learning challenge used in the research being reported was situated
within the physics and mathematics content areas. Both physics and mathematics are com-
ponents of the curricula followed by all schools in the county, including the Governor’s
STEM Academy. Prior research indicates that a lack of literacy in these two content areas
contributes to challenges faced by undergraduate students when entering postsecondary
engineering programs (Budny et al. 1998; Steif and Dantzler 2005). As a result, one rea-
son first year undergraduate students do not continue in engineering programs is that they
have not been adequately prepared to apply the foundational knowledge in these subjects
(ibid). Research into the nature and characterization of problem solving over several dec-
ades has identified a set of key student abilities requisite of success for solving authen-
tic problems outside the confines of a typical classroom setting (Newell and Simon 1972;
Polya 1980; Perkins and Salomon 1989; Martinez 1998; Jonassen et al. 2006; Barak and
Assal 2018). Specifically, these key student abilities are: (1) useful description, both sym-
bolic and descriptive, (2) recognition and selection of relevant content applicable to the
problem, (3) use of the principles and practices of specific content identified to solve the
problem, and (4) adherence to a devised logical strategy for solving the problem. In design-
ing this research, all four key student abilities, along with several other elements from the
previously discussed studies, were incorporated in developing, validating and utilizing an
assessment rubric for answering stated research questions.

Design-no-make challenge (DNMC)

The DNMC presented to the Governor’s Academy high school students was based on an
authentic T/E design challenge encountered by engineering students at a major research
university in the United States as an authentic case for resolving engineering problems
confronting humanitarian workers in the country of Malawi (Fig.  1). For use in the
current research the DNMC was first aligned with the physics and mathematics state
standards in the high school curriculum. Metacognitive questions were then embedded
in the DNMC as prompts for eliciting student responses that would demonstrate the key
abilities used in solving authentic problems outside the classroom context where the tar-
geted physics and mathematics subjects had initially been learned. A group of I-STEM
ED experts were tasked with aligning the metacognitive prompts in the DNMC with an
assessment rubric (adapted from Docktor 2009), and validating the key ability levels
associated with rubric scores. Arbitration among experts resulted in a modified assess-
ment rubric, as well as modifications to several of the prompts used in the DNMC.
This finalized DNMC was then administered to students who received their high school
instruction in the traditional classroom setting, all of whom had received the same level
of physics and mathematics instruction as did those students receiving instruction in the
I-STEM ED Governor's Academy School program.

Fig. 1  The DNMC
Responses to the DNMC from traditional classroom participants were scored by the
I-STEM ED experts and used in conducting reliability testing for the rubric. Interrater
reliability among the experts scoring student responses was found to be consistently
above 94%, which is an acceptable threshold for establishing consistency in using the
rubric (Jonsson and Svingby 2007). The same DNMC was then administered to stu-
dents enrolled in the I-STEM ED Governor’s Academy School program, all of whom
were at the equivalent high school academic level. These student responses were scored
by the same qualified educators previously trained in scoring the traditional classroom
student responses using the rubric, and done so in accordance with prior research pro-
tocols (Docktor et al. 2016) where 30 min was spent on independent scoring, followed
by an iterative process of collaborative scoring, discussion, and arbitration to achieve
consensus.
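
As an illustration of the agreement check, the sketch below computes simple percent exact agreement between two raters. The paper reports the agreement level (above 94%) but not the specific statistic used, and the rater scores shown here are hypothetical.

```python
import numpy as np

# Hypothetical rubric scores (0-5) from two raters, one row per student
# response and one column per SA category; the actual scores are not
# reported in the paper.
rater_a = np.array([[5, 4, 3, 4, 5], [4, 4, 3, 3, 4], [5, 5, 4, 4, 5]])
rater_b = np.array([[5, 4, 3, 4, 5], [4, 4, 3, 3, 5], [5, 5, 4, 4, 5]])

# Percent exact agreement across all scored cells, one common way to
# quantify inter-rater reliability with analytic rubrics.
agreement = (rater_a == rater_b).mean() * 100
print(f"exact agreement = {agreement:.1f}%")  # 93.3% for this toy data
```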

Scoring rubric

The scoring rubric (Table 1) for DNMC responses was adapted from the rubric developed
by Docktor (2009) to measure key student abilities (SAs) identified as indicators of their
abilities to solve authentic problems outside the classroom context where the relevant sub-
jects were initially learned.
The dependent variables were Overall Student Success (OSS) and the five key Student
Abilities (SA1, SA2, SA3, SA4 and SA5). To obtain measures for each of the five SAs, stu-
dent responses to the DNMC question prompts were scored using the adapted rubric,
thus providing numerical scores. The OSS was the sum of the student ability scores
(OSS = SA1 + SA2 + SA3 + SA4 + SA5). The following prompts were used to elicit responses
matching each of the key student abilities to be measured.

Q 1a) What is your understanding of the challenge described above? Describe using your
own words, in a few sentences.
Q 1b) Based on what you wrote above, draw a sketch to describe the scenario, and label
the sketch to show the information provided above (e.g., the depth of the well is 10 m).
Use variables for what you do not know.
Q 2) How could you determine the power of the water pump? State any laws and equations
you would use and explain your strategy in a few sentences.
Q 3) Based on your response to question 2, go through the process you have outlined,
and show your calculations to determine the power of the pump (in horsepower).
Q 4) Based on your work in question 3, what is the power of the motor you need? Using
the motor pricing information provided, which motor would you pick and how much
will it cost?

Table 1  Scoring rubric: key student abilities

SA1 Useful description (specific to a given problem)
  5  The description provides appropriate details, and is complete
  4  The description provides appropriate details but contains 1 omission or error
  3  The description provides appropriate details but contains 2 omissions or errors
  2  The description provides details but contains 3 omissions or errors
  1  There is a description but it contains more than 3 omissions or errors, or is incorrect
  0  The response does not include a description

SA2 Sketch (contains dimensioning, legible and correct units of measurement, labels for specific features or known items)
  5  The sketch provides appropriate details and is complete
  4  The sketch provides details but contains 1 omission or error
  3  The sketch provides details but contains 2 omissions or errors
  2  The sketch provides details but contains 3 omissions or errors
  1  There is a sketch but it contains more than 3 omissions or errors, or is incorrect
  0  The response does not include a sketch

SA3 Specific application of physics
  5  The specific application of physics is appropriate and complete
  4  The specific application of physics contains 1 omission or error
  3  The specific application of physics contains 2 omissions or errors
  2  The specific application of physics contains 3 omissions or errors
  1  The specific application of physics is inappropriate, has more than 3 omissions or errors, or is incorrect
  0  The specific application of physics is missing

SA4 Application of mathematics
  5  The mathematical procedures are appropriate for solving this problem and complete
  4  The mathematical procedures are appropriate for solving this problem with 1 omission or error
  3  The mathematical procedures are appropriate for solving this problem with 2 omissions or errors
  2  The mathematical procedures are appropriate for solving this problem with 3 omissions or errors
  1  The mathematical procedures are inappropriate for solving this problem, have more than 3 omissions or errors, or are incorrect
  0  The mathematical procedures are entirely missing

SA5 Logical progression
  5  The problem solution is clear, focused, logically connected and complete
  4  The solution is clear and focused with 1 logical inconsistency, and complete
  3  Parts of the solution are unclear, unfocused, and there are 2 logical inconsistencies
  2  Most of the solution parts are unclear, unfocused, with 3 logical inconsistencies
  1  The problem solution is unclear, unfocused, and inconsistent
  0  There is no evidence of logical progression
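
For orientation, the following sketch works through the physics that prompts Q 2 and Q 3 target, using the 10 m well depth given in Q 1b and an assumed flow rate. The full specifications from the DNMC (Fig. 1) are not reproduced here, so the numbers are illustrative only.

```python
# Hydraulic power needed to lift water from the well: P = rho * g * h * Q.
RHO = 1000.0   # density of water, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2
HEAD = 10.0    # lift height (well depth from Q 1b), m
FLOW = 0.001   # volumetric flow rate, m^3/s (assumed: 1 L/s)

power_watts = RHO * G * HEAD * FLOW
power_hp = power_watts / 745.7  # 1 hp = 745.7 W

print(f"pump power = {power_watts:.1f} W = {power_hp:.3f} hp")
# With these assumptions: 98.1 W, about 0.13 hp; a motor would then be
# selected from the pricing list with the next-larger rated horsepower.
```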

Criterion-related validity, associated with how the scoring criteria reflect the com-
petencies being assessed and whether the measures or scoring levels are appropriate, is
of concern for obtaining evidence on how students perform outside the school or in a
situation different from the classroom (Moskal and Leydens 2000). Authentic problem-
solving (or performance) assessments have been shown to benefit from explicitly stated
scoring rubrics (also known as analytic scoring; Jonsson and Svingby 2007, p. 131) hav-
ing more than three levels (Moskal and Leydens 2000). Establishing significance of cor-
relation measures between scorers and measuring the degree to which scores can be
attributed to common scoring rather than to error in scoring, are considered essential for
validating such assessments (Stemler 2004).
To establish the validity of the modified rubric used in this research, a pilot study
was conducted to ensure that (a) the DNMC question prompts and the modified rubric
aligned with the research questions guiding this study and (b) that they could be reliably
used for scoring student responses. Five student volunteers were recruited from a physics
class offered at a high school within the school system (the same course as offered in the
Academy) and agreed to participate in the pilot study by providing responses to the
DNMC. Pilot study students were informed that their participation would not be tied to
any assessment of their class performance or grade, and only those volunteers who
returned consent forms participated in the problem solving activity (DNMC).
Two types of reliability are discussed when using rubrics for scoring: intra-rater reli-
ability and inter-rater reliability. Intra-rater reliability concerns the consistency of
a single rater over time across large numbers of scored assessments. Establishing a scoring
routine through training and use of a rubric with explicitly defined scoring levels can
eliminate this type of a reliability concern (Jonsson and Svingby 2007). With authen-
tic problem-solving assessments, acceptable scoring agreement or inter-rater reliabil-
ity can be achieved by various acceptable means—training of raters, having two raters
instead of more than two, benchmarks or instructor’s solutions and rubrics with detailed
definitions for levels of scores (p. 135). In the current study, guidelines developed by
Moskal and Leydens (2000) were followed to establish rubric reliability by preparing
well defined scoring categories, ensuring clear separation between the scoring catego-
ries, and checking the concurrence of scores given by two independent raters for the
same assessment.
A panel of I-STEM ED experts was identified and asked to review each of the ques-
tion prompts, and compare them to the associated research sub-question as a means of
rating the degree of alignment between the question prompts and the research questions.


Using a rating scale of one (poor) to four (excellent), experts provided a numerical score
for each question prompt, along with comments and suggestions for improvement of the
prompts. Based on panelist feedback, any necessary adjustments were made, followed
by a second panel review conducted in the same manner. The intent was for each question
to receive a rating of 3 or above from all members of the expert panel. Following two
iterations, with corresponding revisions to the language of the question prompts, con-
sensus was achieved (rating of 3 or 4) for all question prompts.
The validity of the finalized rubric was tested by having the experts apply the rubric in
scoring student responses from the pilot study. Rubric validation was initiated by providing
each expert with the same scanned copy of one participant’s response and asking them to
independently score the responses. An instructor’s solution developed by physics educa-
tors was also provided to the experts to assist them with the scoring. Following their inde-
pendent evaluation of responses, experts then reviewed each other’s scores (tabulations and
comments) and engaged in arbitration to determine if there was a need to change, revise, or
accept assigned scores. The final mean point score on the first student response (SR1) was
12. The lowest and highest point scores assigned by the expert reviewers were 11 and 13,
respectively. Scoring of the second and third student responses (SR2 and SR3) followed the
same arbitration process and achieved expert consensus based on previously described cri-
teria. This process was repeated on the fourth and fifth student responses (SR4 and SR5),
where achievement of final consensus resulted in a validated rubric for use in the research
study.
The DNMC was then administered to the student research participants at the Gover-
nor’s Academy in adherence to the appropriate and previously approved protocols. Two
content area specialists were solicited for scoring these student responses. Criteria used
in the selection of scorers were that they: (a) had a minimum of 5 years of high school con-
tent area teaching experience, (b) held a valid teaching license in science, mathematics or
technology education, and (c) held a master’s degree or higher in a STEM or STEM edu-
cation discipline. Inter-rater reliability among scorers was established following the pre-
vious method of scorer training. This included independent scoring of individual student
responses from the pilot study, followed by an iterative process of arbitration to compare
and discuss assigned scores, with the goal of achieving a sufficiently high level of scor-
ing consistency. Independent scoring of the final two pilot study student responses resulted
in identical scores, at which point raters were deemed sufficiently trained in scoring the
DNMC responses using the validated rubric. Trained scorers were then provided all eleven
of the Academy students’ responses and asked to score them independently. Once indepen-
dently scored, scorers met to compare and discuss their assigned scores, making any neces-
sary final adjustments before submitting the scores. The scores in each category were then
used for the data analysis.

Results

Numerical scores on the scoring rubric were aligned with each of the five student
abilities (SAs) as depicted in Table 1. Data collected from the scored DNMC question
prompts were examined using descriptive statistics, a one-sample t test for measuring
the significance of the Overall Student Success (OSS, the sum of scores of the five SA),
the covariance between each of the SAs and the OSS, and the strength of the correla-
tion using an adjusted correlation coefficient.


Table 2  Results from the two-tailed one sample t-test for overall success score (OSS)

        t       df    Sig. (2-tailed)   Mean difference   95% CI of the difference
                                                          Lower     Upper
OSS     3.708   10    0.004*            4.455             1.78      7.13

Bootstrapping generated 100 samples, resulting in a test value (hypothesized mean) of 12
* p < 0.05

Table 3  Pearson's correlations and calculated coefficient of determination for the SAs

Student abilities                  PPM correlation (r)    Coefficient of determination (r²)
Useful description                 0.121 (p > .05)        0.015 (1.5%)
Sketch                             0.635 (p < .05)**      0.403 (40.3%)
Specific application of physics    0.916 (p < .01)***     0.839 (83.9%)
Application of mathematics         0.953 (p < .01)***     0.908 (90.8%)
Logical progression                0.918 (p < .01)***     0.843 (84.3%)

** Correlation statistically significant at p < 0.05 level; *** correlation statistically significant at p < 0.01 level

Although the sample size for this research was small, the sample-distribution represented
a good approximation of the population-distribution. Given this approximation, the
bootstrapping method was appropriate for
estimating population statistics through repeated sampling of the dataset with replace-
ment (Efron 1994; Cumming 2014). In this way the bootstrapping technique was used to
create a large number of repeated samples prior to calculating statistics for the popula-
tion samples used in this research.
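
A minimal sketch of the resampling step is shown below, assuming hypothetical OSS scores for the 11 Academy students; the raw scores are not reported, and the values are chosen only so that the sample mean matches the reported 12 + 4.455 = 16.455.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical OSS scores for the 11 Academy students (raw data not
# reported); chosen so the mean is 16.455, matching the reported mean
# difference of 4.455 over the hypothesized mean of 12.
oss = np.array([9, 12, 13, 14, 16, 17, 18, 19, 20, 21, 22])

# Bootstrap: resample the dataset with replacement and recompute the
# mean for each resample (the paper reports 100 bootstrap samples).
boot_means = np.array([rng.choice(oss, size=oss.size).mean()
                       for _ in range(100)])

# Percentile confidence interval for the population mean.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"bootstrap mean = {boot_means.mean():.2f}, "
      f"95% CI = ({lo:.2f}, {hi:.2f})")
```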
Results from the two-tailed one-sample t-test (Table 2), indicate that the mean OSS
was statistically significantly higher by 4.455 (95% CI 1.78–7.13) than the hypothesized
mean performance score of 12, t (10) = 3.708, p = 0.004. The hypothesized mean was
generated from the pilot study conducted in a traditional classroom (one not within the
Academy), where those students were completing the same physics course (using the
same curriculum) as students in the Academy; i.e., these students received the same
STEM content through traditional classroom instruction. The calculated effect size
(Cohen's d) of 1.117 exceeded the conventional 0.8 threshold for a large effect, implying
that the difference detected by the t-test is large enough to be practically significant.
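
The test itself is a standard one-sample t-test against the hypothesized mean of 12. The sketch below reproduces the computation with the same hypothetical scores as above, so t and Cohen's d come out near, but not exactly at, the reported 3.708 and 1.117.

```python
import numpy as np
from scipy import stats

oss = np.array([9, 12, 13, 14, 16, 17, 18, 19, 20, 21, 22])  # hypothetical
hypothesized_mean = 12  # generated from the traditional-classroom pilot

t_stat, p_val = stats.ttest_1samp(oss, popmean=hypothesized_mean)

# Cohen's d for a one-sample test: (sample mean - test value) / sample SD,
# which equals t / sqrt(n).
d = (oss.mean() - hypothesized_mean) / oss.std(ddof=1)

print(f"t({oss.size - 1}) = {t_stat:.3f}, p = {p_val:.4f}, d = {d:.3f}")
```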
The coefficient of determination (r²) represents the proportion of the variation in one
variable that is explained by the other in the regression model, and is a measure of how
well the regression line represents the data. A higher coefficient indicates a better
goodness of fit and a stronger basis for predicting the variation of one variable from
the other. Data analysis of student abilities found there was
a statistically significant correlation with student performance among 4 of 5 abilities
(Table  3). Useful Description was the only student ability not found to correlate well
with student performance.
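
The correlation computation behind Table 3 is sketched below for a single ability, again with hypothetical paired scores; r is the Pearson product-moment (PPM) correlation and r² its square, the coefficient of determination.

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores for one ability (here SA4, application of
# mathematics) and the overall success score (OSS) for 11 students;
# the raw scores are not reported in the paper.
sa4 = np.array([3, 4, 4, 5, 3, 2, 5, 4, 3, 5, 4])
oss = np.array([13, 16, 17, 19, 14, 12, 20, 17, 15, 21, 17])

r, p = stats.pearsonr(sa4, oss)
r_squared = r ** 2  # share of OSS variance accounted for by SA4

print(f"r = {r:.3f} (p = {p:.4f}), r^2 = {r_squared:.3f}")
```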


Discussion and conclusions

The main purpose of the research presented was to measure and assess five key student
abilities (SAs) as indicators of a student’s ability to solve authentic engineering design
problems outside the context of the traditional classroom where their subject-specific con-
tent was initially learned. Assessment of student Design-No-Make-Challenge (DNMC)
responses were used to generate individual SA scores, and the sum of those five SA scores
was used to establish the Overall Student Success (OSS) score. The t-test data presented
in Table 2 indicates there was a statistically significant difference between the mean per-
formance score of students from the I-STEM ED program when compared to students
from the traditional classroom learning environment. The calculated effect size of 1.117
(Cohen's d) is considered large, and as such indicates a substantial, practically important dif-
ference between the control and experiment groups.
Results (Table  3) further demonstrate that only specific student abilities (Sketching
with p < 0.05, Specific Application of Physics, Application of Math and Logical Progres-
sion with p < 0.01) are significantly correlated to the students’ overall success in solving
the problem. When students design a solution to a DNMC, their ability to organize and
describe useful information in written statements was not correlated to their overall success
score. However, their ability to organize essential information into graphical representa-
tions, ability to select and utilize science content and practices necessary to design a solu-
tion, ability to select and utilize relevant mathematical content and practices, and, ability to
demonstrate logical progression towards the design of a solution were strongly positively
correlated to their overall success score after adjusting for the small sample size used in
this study; i.e., these abilities predicted success when utilizing content/practice knowledge
outside the confines of the classroom wherein their science and/or mathematics was learned.
Based on results, the primary conclusion drawn from this research is that students
immersed in T/E design based learning through an integrative STEM education pedagogi-
cal approach will out-perform their traditional classroom counterparts in drawing on resi-
dent knowledge for designing solutions to a design-no-make-challenge. One can also con-
clude that student abilities associated with the specific application of physics (science) and
mathematics, as well as the ability to logically progress towards the solution, are shown to
be strong predictors of successful T/E design based problem solving. These stated conclu-
sions have direct implications for instruction in K-12 T/E design education, student learn-
ing and assessment, and engineering program design in secondary schools. One of the pri-
mary motivations for this research was to address the overall need for adequately preparing
high school students who are STEM literate and who possess the skills needed to tackle the
challenges in the 21st Century. The results of this study would suggest that instructional
strategies need to be further strengthened to help students learn to select and utilize sci-
ence and mathematics in problem solving in diverse contexts. There may also be reason to
investigate the same skills in the lower grades, with a focus on helping all students
develop these abilities earlier. Furthermore, the rubric developed in this
study has the potential to be used as an assessment tool in the technology education class-
room, and therefore this study has implications for alternative methods of demonstrating
student growth.
In spite of the findings and conclusions drawn, questions still remain regarding what
specific I-STEM ED instructional strategies are most effective in preparing students who
are better able to tackle authentic problems through T/E design based learning. A larger
study with students from many institutions and diverse socioeconomic backgrounds will
be needed before being able to generalize the conclusions and the applicability of results.
As well, further research on student learning, specifically using qualitative methods to
investigate how they select and utilize scientific and mathematical principles in solving T/E
design based problems, is necessary for providing additional insights into student learning
and transfer of that learning.

References
Atman, C. J., & Bursic, K. M. (1998). Verbal protocol analysis as a method to document engineering student
design processes. Journal of Engineering Education, 87, 121–132.
Ball, J. L., & Christensen, B. T. (2019). Advancing an understanding of design cognition and design meta-
cognition: Progress and prospects. Design Studies, 65, 35–59.
Barak, M., & Assal, M. (2018). Robotics and STEM learning: Students' achievements in assignments
according to the P3 task taxonomy—practice, problem solving, and projects. International Journal of
Technology and Design Education, 28, 121–144. https://doi.org/10.1007/s10798-016-9385-9.
Barlex, D. (2003). Considering the impact of design and technology on society—The experience of the
Young Foresight project. In J. R. Dakers & M. J. Devries (Eds.), The place of design and technology in
the curriculum PATT conference 2003 (pp. 142–147). Glasgow: University of Glasgow.
Barlex, D. M., & Trebell, D. (2008). Design-without-make: Challenging the conventional approach to
teaching and learning in a design and technology classroom. International Journal of Technology and
Design Education, 18(2), 119–138.
Becker, K., & Mentzer, N. (2015). Engineering design thinking: High school students’ performance and
knowledge. In Interactive collaborative learning (ICL), 2015 international conference on (pp. 5–12).
Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience
and school. Washington, DC: National Academy Press.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational
Researcher, 18(1), 32–42.
Brown, A. L., & Kane, M. J. (1988). Preschool children can learn to transfer: Learning to learn and learning
from example. Cognitive Psychology, 20(4), 493–523. https://doi.org/10.1016/0010-0285(88)90014-X.
Budny, D., LeBold, W., & Bjedov, G. (1998). Assessment of the impact of freshman engineering courses.
Journal of Engineering Education, 87(4), 405–411.
Cajas, F. (2001). The science/technology interaction: Implications for science literacy. Journal of Research
in Science Teaching, 38(7), 715–729.
Chi, T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by
experts and novices. Cognitive Science, 5(2), 121–152.
Cumming, G. (2014). The new statistics: Why and how. Psychological Science, 25(1), 7–29.
Daugherty, J., Mentzer, N., & Kelley, T. (2011, April). Technology design and engineering design: Is there a
difference? Paper presented at IAJC-ASEE joint international conference, Hartford, CT.
Docktor, J. (2009). Development and validation of a physics problem solving assessment rubric (doctoral
dissertation). University of Minnesota, Minneapolis and St. Paul, Minnesota.
Docktor, J., & Heller, K. (2009). Robust assessment instrument for student problem solving. In Proceed-
ings of the NARST 2009 annual meeting. Retrieved from http://groups.physics.umn.edu/physed/People/Docktor/research.htm#Research_Documents.
Docktor, J. L., Dornfeld, J., Frodermann, E., Heller, K., Hsu, L., Jackson, K. A., et al. (2016). Assessing
student written problem solutions: A problem-solving rubric with application to introductory physics.
Physical Review Physics Education Research, 12, 301–318.
Efron, B. (1994). Missing data, imputation and the bootstrap. Journal of American Statistics Association,
89, 463–479.
Felder, R. M., & Brent, R. (2016). Teaching and learning STEM: A practical guide. San Francisco:
Jossey-Bass.
Festinger, L. (1962). A theory of cognitive dissonance (Vol. 2). Stanford: Stanford University Press.
Fortus, D., Dershimer, R. C., Krajcik, J., Marx, R. W., & Mamlok-Naaman, R. (2004). Design based science
and student learning. Journal of Research in Science Teaching, 41(10), 1081–1110.


Gagné, R. M., Wager, W. W., Golas, K. C., & Keller, J. M. (2004). Principles of instructional design (5th ed.). Independence: Cengage Learning.
Gero, J. S., & Kan, J. (2016). Empirical results from measuring design creativity: Use of an augmented cod-
ing scheme in protocol analysis. Paper presented at the 4th international conference on design creativ-
ity, Atlanta, Georgia.
Hayes, J. R. (1989). The complete problem solver (2nd ed.). Hillsdale: Lawrence Erlbaum Associates.
Heller, J. I., & Reif, F. (1984). Prescribing effective human problem solving processes: Problem description
in physics. Cognition and Instruction, 1(2), 177–216.
Heywood, J. (2005). Engineering education: Research and development in curriculum and instruction.
Hoboken: Wiley.
Huber, M., & Hutchings, P. (2004). Integrative learning: Mapping the terrain. New York: The Carnegie
Foundation for the Advancement of Teaching and Association of American Colleges and Universities.
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.
Jonassen, D. H. (2000). Toward a design theory of problem solving. Educational Technology Research and
Development, 48(4), 63–85.
Jonassen, D. H., Strobel, J., & Lee, C. B. (2006). Everyday problem solving in engineering: Lessons for
engineering educators. Journal of Engineering Education, 95(2), 1–14.
Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational conse-
quences. Educational Research Review, 2, 130–144.
Lammi, M. (2011). Characterizing high school students’ systems thinking in engineering design through the function-behavior-structure (FBS) framework (Doctoral dissertation). Utah State University, Logan, Utah.
Martinez, M. E. (1998). What is problem solving? The Phi Delta Kappan, 79(8), 605–609.
Middleton, H. (2005). Creative thinking, values and design and technology education. International Journal
of Technology and Design Education, 15, 61–71.
Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical
Assessment, Research and Evaluation, 7, 71–81.
National Assessment Governing Board (NAGB). (2009). Science framework for the 2009 national assess-
ment of educational progress. Washington, DC: U.S. Department of Education.
National Council of Teachers of Mathematics (NCTM). (2000). Principles and standards for school mathematics: Executive summary. Retrieved from https://www.nctm.org/uploadedFiles/Standards_and_Positions/PSSM_ExecutiveSummary.pdf.
National Research Council (NRC). (2009). Engineering in K-12 education: Understanding the status and improving the prospects. Washington, DC: The National Academies Press.
Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs: Prentice-Hall Inc.
New Assessment Tools for Cross-Curricular Competencies in the Domain of Problem Solving (NATCCC). (2000). Final report of project ERB-SOE2-CT98-2042. Luxembourg: European Commission.
NGSS Lead States. (2013). Next generation science standards: For states, by states. Washington, DC: The National Academies Press.
Nicholas, C., & Oak, A. (2020). Make and break details: The architecture of design-build education. Design
Studies, 66, 35–53.
Ormrod, J. E. (2012). Human learning. Boston: Pearson.
Perkins, D. N., & Salomon, G. (1989). Are cognitive skills context bound? Educational Researcher, 18(1),
16–25.
Polya, G. (1973). How to solve it. Princeton, NJ: Princeton University Press. (Original work published
1945).
Polya, G. (1980). On solving mathematical problems in high school. In S. Krulik & R. Reys (Eds.), Prob-
lem-solving in school mathematics: 1980 yearbook (pp. 1–2). Reston, VA: National Council of Teach-
ers of Mathematics.
Pope, D., Brown, M., & Miles, S. (2015). Overloaded and underprepared: Strategies for stronger schools
and healthy successful kids. San Francisco: Jossey-Bass.
Puente, S. M. G., Eijck, M. V., & Jochems, W. (2013). Exploring the effects of design-based learning char-
acteristics on teachers and students. International Journal of Engineering Education, 29(2), 491–503.
Puntambekar, S., & Kolodner, J. L. (2005). Towards implementing distributed scaffolding: Helping students
learn science from design. Journal of Research in Science Teaching, 42(2), 185–217.
Reeff, J. P. (1999). New assessment tools for cross-curricular competencies in the domain of problem solving. Retrieved January 21, 2017, from http://www.ppsw.rug.nl/~peschar/TSE.pdf.
Shavelson, R., Ruiz-Primo, M. A., Li, M., & Ayala, C. C. (2003). Evaluating new approaches to assessing
learning. Los Angeles: National Center for Research on Evaluation, Standards, and Student Testing.


Shavelson, R. J., Ruiz-Primo, M. A., & Wiley, E. W. (2005). Windows into the mind. Higher Education,
49(4), 413–430.
Steif, P. S., & Dantzler, J. A. (2005). A statics concept inventory: Development and psychometric analysis.
Journal of Engineering Education, 94(4), 363–371.
Stemler, S. E. (2004). A comparison of consensus, consistency, and measurement approaches to estimating
interrater reliability. Practical Assessment, Research and Evaluation, 9, 80–89.
Webb, N. (1997). Research monograph number 6: Criteria for alignment of expectations and assessments in mathematics and science education. Washington, DC: CCSSO.
Wells, J. G. (2010). Research on teaching and learning in science education: Potentials in technology educa-
tion. In P. A. Reed & J. E. LaPorte (Eds.), 59th Yearbook, 2010: Research in technology education (pp.
192–217). Reston: Council on Technology Teacher Education.
Wells, J. G. (2013). Integrative STEM education at Virginia Tech: Graduate preparation for tomorrow’s
leaders. Technology and Engineering Teacher, 72(5), 28–35.
Wells, J. G. (2016a). Efficacy of the technological/engineering design approach: Imposed cognitive demands
within design based biotechnology instruction. Journal of Technology Education, 27(2), 4–20.
Wells, J. G. (2016b). PIRPOSAL model of integrative STEM education: Conceptual and pedagogical frame-
work for classroom implementation. Technology and Engineering Teacher, 75, 12–19.
Wells, J. G. (2016c). I-STEM ED exemplar: Implementation of the PIRPOSAL© model. Technology and
Engineering Teacher, 76, 16–23.
Wells, J. (2017). Design to understand: Promoting higher order thinking through T/E design based learning. In Proceedings of the technology education New Zealand and international conference on technology education-Asia Pacific (pp. 325–339). TEMS Education Research Center, University of Waikato, New Zealand. ISBN: 978-0-9951039-0-0. https://tenzcon.org/wpcontent/uploads/2017/10/TENZICTE-2017-Proceedings.pdf.
Wells, J. G. (2019). STEM education: The potential of technology education. Chapter 11 in M. Daugherty & V. Carter (Eds.), The most influential papers presented at the Mississippi Valley technology teacher education conference. Council on Technology and Engineering Teacher Education, 62nd Yearbook, Ball State University, Muncie, IN.
Wells, J., & Ernst, J. (2012/2015). Integrative STEM education. Blacksburg, VA: Virginia Tech: Invent the Future, School of Education. Retrieved from www.soe.vt.edu/istemed/.
White, B. Y., & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible
to all students. Cognition and Instruction, 16(1), 3–118.
Wiggins, G., & McTighe, J. (2005). Understanding by design. Alexandria: Association for Supervision and
Curriculum Development.

Publisher’s Note  Springer Nature remains neutral with regard to jurisdictional claims in published maps and
institutional affiliations.

