
Session 3513

Performance Assessment of EC-2000 Student Outcomes
in the Unit Operations Laboratory

Ronald L. Miller, Barbara M. Olds
Colorado School of Mines
Golden, Colorado

Summary

The new ABET Engineering Criteria 2000 (EC-2000) describe eleven student outcomes which
must be demonstrated by graduates of accredited programs. Many of these outcomes focus on
professional engineering practice including an ability to design and conduct experiments;
analyze and interpret data; identify, formulate, and solve engineering problems; function on
multidisciplinary teams; communicate effectively; and use the techniques, skills, and modern
engineering tools necessary for engineering practice. Each of these outcomes is best evaluated
using some form of performance assessment where students are judged on their ability to
complete the required task in a setting approaching authentic engineering practice rather than by
traditional objective tests which focus on non-contextual recall of facts and closed-ended
problem-solving.

This paper describes the process we followed to design and implement performance assessment
in the unit operations laboratory including setting goals and objectives compatible with our
departmental program objectives, defining appropriate student performance criteria, developing
and testing a scoring rubric for laboratory written reports, deciding when and how often we
assess, and determining how to evaluate the collected assessment data. Assessment results from
our first pilot study are presented and discussed to illustrate the utility of the data obtained in
improving the laboratory course and chemical engineering curriculum.

Introduction

Most of us are aware that traditional engineering courses overemphasize “plugging and
cranking” on well-defined, closed-ended, numerical problems at the expense of helping students
become better critical thinkers and engineering practitioners. As a result, and in sharp contrast
with other professions such as medicine and law, too few of our engineering graduates are
capable of immediately practicing engineering when they leave college. Yet, industry expects to
hire engineering graduates who “can go beyond the numbers” by understanding how technical
results fit into a larger systems perspective, who can integrate knowledge to find new solutions
to problems rather than relying on a traditional reductionist approach, who can deal with
uncertainty and develop engineering judgment skills, and who can communicate the results of
their work to many different audiences. [1,2] In short, they want engineers who can “think
outside the academic box.” In response to these expectations, many of the new ABET EC-2000
outcomes focus on professional practice including: (3b) “an ability to design and conduct
experiments, as well as to analyze and interpret data;” (3e) “an ability to identify, formulate, and
solve engineering problems;” (3d) “an ability to function on multi-disciplinary teams;” and (3g) “an
ability to communicate effectively.” [3] Student ability in each of these areas is best assessed
using some form of performance assessment—a quantitative judgment of the quality of student
work by experienced raters based on agreed-upon criteria. [4]

We believe that the unit operations laboratory provides an ideal setting to help chemical
engineering students become better engineering practitioners and to assess their progress toward
achieving this goal. At the Colorado School of Mines (CSM), we offer the laboratory course as
an intensive six week summer experience designed to enhance and assess students’ higher order
thinking skills and knowledge of many aspects of chemical engineering professional practice
including data collection and analysis, evaluation and interpretation of results to draw
meaningful conclusions, team processes, and communication of technical results to a variety of
audiences. The course structure allows us to collect rich performance assessment data on
students’ ability to apply classroom knowledge of unit operations to situations approaching the
“real-world” including dealing with “noisy” data, overcoming equipment limitations, using
statistics in a prudent way, comparing “real” data to theoretical and empirical models, and
communicating complex results to a variety of audiences under significant time constraints.

As presently taught, the course relies heavily on a constructivist approach—that is, the cognitive
theory suggesting that learners construct their own internal interpretation of objective knowledge
based, in part, on formal instruction, but also influenced by social and contextual aspects of the
learning environment and previous life experiences. [5] This view suggests that students “make
their own meaning” of what they are learning by relying on mental models of the world, models
that may be correct or may contain strongly held misconceptions. [6] Rather than acting as
acknowledged authorities transmitting objective knowledge to passive students, laboratory
faculty use coaching and Socratic questioning techniques to help students understand complex
technical phenomena by constructing mental models which reflect reality as perceived by
acknowledged experts while minimizing models containing significant misconceptions. Use of
constructivist pedagogies creates an ideal context for assessing students’ abilities to complete
authentic engineering tasks rather than relying on artificial examinations which emphasize non-
contextual recall of facts and closed-ended problem-solving.

In this paper, we briefly describe the course structure, the process we used to create and validate
a new performance assessment scoring rubric for written laboratory reports, and results of our
pilot assessment work using the rubric.

Overview of the CSM Unit Operations Laboratory

To facilitate development of each student’s engineering abilities in the unit operations laboratory
course, supervising faculty place as much responsibility for the planning, execution, analysis,
evaluation, and reporting of experiments on the students as possible. Each student performs a
total of eight experiments in fluid mechanics, heat transfer, and mass transfer working in teams
of two or three. Teams are randomly sorted from experiment to experiment so that students
work with all their peers in the course and each student has the opportunity to serve as a “team
leader” on several experiments. Since the students have received extensive team-building
instruction and practice in previous CSM coursework, no explicit team-building work is
required in the laboratory. Details of the course objectives and pedagogical structure have been
reported. [7]

The afternoon prior to performing an experiment, each student team meets to become familiar
with the general experimental objectives and safety guidelines provided by faculty supervisors,
to study the equipment and how to measure and model its performance, to create a list of
detailed experimental objectives, to develop an experimental design for data collection, and to
decide what statistical analysis strategies will be used with the experimental data. Less than one
page of written guidelines (including safety issues) is typically available for each experiment;
faculty supervisors act as coaches or mentors to the teams but do not portray themselves as
authority figures. Early on the morning the experiment is scheduled, each student team presents
the results of the “prelab” preparation to a supervising faculty member who questions members
of the team on all aspects of the experiment including background theory; working equations;
data collection; measurement errors and data reproducibility; and data analysis and evaluation.

After successfully passing the prelab oral “exam,” each student team controls its own destiny in
the laboratory and operates without input from faculty supervisors or teaching assistants (except
for potential safety issues). Students make all decisions about ranges of data to collect, about
the amount of data to collect, and about conducting reproducibility runs.

With data in hand, the team begins the process of data analysis, comparison of results with
theoretical predictions or accepted correlations, and statistical error analysis. This is an intense
time for the team—they must either prepare and deliver a 20-minute oral presentation describing
their work one day after completing the experiment or must submit a draft written report five
days after completing the experiment. In either case, they must complete calculations, develop
appropriate correlations of engineering parameters such as friction factors or heat transfer
coefficients, prepare figures and tables of results, develop error propagation and statistical
analyses, provide logical explanations for any deviations of their results from expected values,
and develop overall conclusions based on evaluation of their work.
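The paper does not spell out the error-propagation relations the teams use; as a sketch of the
standard first-order approach (our illustration, assuming independent measurement uncertainties),
a derived quantity y = f(x_1, ..., x_n) carries an uncertainty

\sigma_y \approx \sqrt{ \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^{2} \sigma_{x_i}^{2} }

so that, for example, a heat transfer coefficient computed as h = Q / (A \, \Delta T) has relative
uncertainty (\sigma_h / h)^2 \approx (\sigma_Q / Q)^2 + (\sigma_A / A)^2 + (\sigma_{\Delta T} / \Delta T)^2.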

Students produce four oral and four written reports on experiments completed during the course.
Oral presentations are attended by other students in the course and by one or more faculty
supervisors; presenters are expected to focus largely on the conclusions drawn from their results
and reasons for any obvious discrepancies from expected trends. Once again, faculty use
Socratic questioning to probe for evidence of analysis, synthesis, and evaluation by student
teams. Each written report is submitted first in draft form for review by the faculty supervisor
and a technical communication specialist. Draft review meetings are then held with individual
student teams to provide feedback and discuss remaining difficulties in technical and rhetorical
content before the final version of the report is submitted for grading.

Implementing Performance Assessment in the Unit Operations Laboratory

As the CSM chemical engineering department faculty began developing our assessment and
evaluation plan to meet ABET EC-2000 requirements, we recognized that the learning
objectives and pedagogical methods associated with the unit operations laboratory experience
make the course an ideal setting to assess how well our students can actually apply their
knowledge of unit operations, statistics, communications, and teamwork to the operation and
analysis of laboratory equipment. While our entire assessment plan consists of three goals and
fourteen objectives, we agreed that the following educational objectives were the appropriate
ones to assess in the course:

• Students will be able to apply knowledge of unit operations to the identification,
formulation, and solution of chemical engineering problems (ref. ABET 3a, 3e).

• Students will be able to design and conduct experiments of chemical engineering
processes or systems and they will be able to analyze and interpret data from
chemical engineering experiments (ref. ABET 3b).

• Students will demonstrate an ability to communicate effectively in writing (ref.
ABET 3g).

Once these objectives were identified and agreed upon, we began considering possible
assessment tools for measuring each objective including examinations, faculty and student
surveys, portfolios, interviews, focus groups, and performance assessment. While we eventually
hope to expand the number of instruments used, we are initially focusing on assessing written
laboratory reports because faculty are already comfortable evaluating students’ written work and
because we believe we can reliably assess all three objectives listed above using well-designed
and tested scoring rubrics.

After deciding to collect and assess written reports, we had to determine what evidence would
demonstrate that students had met the objectives and how the evidence would be evaluated.
This required developing articulated levels of performance against which to measure student
achievement and organizing the performance levels into a scoring rubric. Greene [8] has listed
the steps required to design and test a reliable rubric instrument (a brief illustrative sketch of the
resulting rubric structure follows the list):

• Determine what objectives the rubric will be used to assess.
• Determine what student work will be assessed with the rubric.
• Determine an appropriate rating scale for scoring student performance.
• Draft the rubric using samples of student work at all performance levels as a guide.
• Test the rubric on student work (preferably using double-blind scoring to evaluate the
instrument’s reliability).
• Revise the rubric language until inter-rater reliability meets faculty expectations.
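Here is the promised sketch (our illustration, not from Greene or from the study itself) of how a
rubric built by these steps, such as the four-level instrument that eventually became Table I, might
be represented so that objectives, rating scale, and level descriptors stay together between
test-and-revise iterations. The Python names and the abbreviated descriptor text are ours.

from dataclasses import dataclass

@dataclass
class RubricObjective:
    name: str                    # educational objective being assessed
    descriptors: dict[int, str]  # rating level -> description of student work at that level

# One of the three objectives, with descriptors abbreviated from Table I
writing = RubricObjective(
    name="ChE graduates will demonstrate an ability to communicate effectively in writing.",
    descriptors={
        4: "Written report is virtually error-free ...",                  # Exemplary
        3: "Written report presents results and analysis logically ...",  # Proficient
        2: "Written report is generally well written but ...",            # Apprentice
        1: "Written report does not present results clearly ...",         # Novice
    },
)

def valid_score(objective: RubricObjective, score: int) -> bool:
    """A rater's score must be one of the articulated performance levels."""
    return score in objective.descriptors
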
Based on a review of many rubrics, we initially developed a scoring rubric with three levels of
articulated performance: 1) “exceeds standards,” 2) “meets standards,” and 3) “does not meet
standards.” We then drafted language describing student work at each of these levels for each of
the three objectives being assessed. However, when we used the draft rubric to score student
reports, we quickly found that three scoring levels were insufficient to differentiate between
acceptable and unacceptable student performance (virtually none of the reports exceeded the
standard or didn’t meet the standard, so almost every report was judged to “meet standards”—
not a very useful result to help improve the course and chemical engineering curriculum, which
of course is the ultimate reason to assess student work).

We also found that the use of the term “standards” in our rubric seemed to denote for some
faculty the notion of a “minimum standard” which then erroneously implied some sort of
“lowest common denominator graduation requirement” that absolutely every student would have
to achieve. These examples served to remind us that we had to be extremely careful in our
choice of language used in scoring rubrics we developed. We eliminated these difficulties by
expanding the rubric to include four levels of articulated performance (“exemplary,”
“proficient,” “apprentice,” and “novice”) and avoiding use of the term “standard.”

After about three iterations of rubric testing and redrafting by the four-member departmental
assessment committee, we were able to design an instrument that is easy to use (a typical 15-20
page lab report can be scored in 5-7 minutes), reliable (based on blind-scoring of sample reports
by multiple raters), and descriptive of expected student performance. Table I shows the current
version of the rubric we are using to evaluate unit operations laboratory reports. We also plan to
share the rubric with students in future sections of the unit operations laboratory course to help
them better understand our expectations for satisfactory and superior performance.

Once the rubric instrument was developed to our satisfaction, we conducted a “norming” session
to make sure we could achieve and document inter-rater reliability when assessing laboratory
reports. Four sample laboratory reports were independently scored by all four assessment
committee faculty members using the rubric shown in Table I. Results from this work are
summarized in Table II and show that the range of scores for 10 of 12 scored objectives (4
reports with 3 objectives per report) agreed within 1 point, our self-imposed reliability criterion.
We consider this level of agreement to be satisfactory for holistic program assessment purposes.
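The within-1-point criterion is simple to check mechanically. The sketch below is our own
illustration (not code from the study); it applies the criterion to the norming scores reported in
Table II, where each of the four raters scored each of the four sample reports on the three
objectives using the 1-4 scale.

# Check the "scores agree within 1 point" reliability criterion on the
# Table II norming data: report -> objective -> scores from the four raters.
norming_scores = {
    "a": {"apply knowledge": [4, 3, 3, 3], "design/conduct and analyze": [4, 2, 4, 3], "written communication": [3, 2, 3, 3]},
    "b": {"apply knowledge": [3, 2, 3, 3], "design/conduct and analyze": [3, 2, 3, 3], "written communication": [2, 1, 2, 2]},
    "c": {"apply knowledge": [1, 2, 2, 3], "design/conduct and analyze": [2, 2, 2, 2], "written communication": [2, 2, 2, 2]},
    "d": {"apply knowledge": [2, 2, 2, 3], "design/conduct and analyze": [2, 2, 2, 3], "written communication": [2, 1, 2, 2]},
}

def agrees_within(scores, tolerance=1):
    """True if the spread of rater scores is no larger than the tolerance."""
    return max(scores) - min(scores) <= tolerance

agreed = sum(agrees_within(s) for report in norming_scores.values() for s in report.values())
total = sum(len(report) for report in norming_scores.values())
print(f"{agreed} of {total} scored objectives agree within 1 point")   # prints "10 of 12 ..."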

Table I
Scoring Rubric for Unit Operations Laboratory Reports

Group members: ______________________________   Lab Session: ____________   Experiment: ____________

Scoring levels: 4 = Exemplary, 3 = Proficient, 2 = Apprentice, 1 = Novice

Objective 1: ChE graduates will be able to apply knowledge of unit operations to the
identification, formulation, and solution of chemical engineering problems.           Score: ____
  4 (Exemplary): Student groups apply knowledge with virtually no conceptual or procedural
     errors affecting the quality of the experimental results.
  3 (Proficient): Student groups apply knowledge with no significant conceptual errors and only
     minor procedural errors.
  2 (Apprentice): Student groups apply knowledge with occasional conceptual errors and only
     minor procedural errors.
  1 (Novice): Student groups make significant conceptual and/or procedural errors affecting the
     quality of the experimental results.

Objective 2: ChE graduates will be able to design and conduct experiments of chemical
engineering processes or systems and they will be able to analyze and interpret data from
chemical engineering experiments.                                                      Score: ____
  4 (Exemplary): Student groups design and conduct unit operations experiments with virtually
     no errors; analysis and interpretation of results exceed requirements of the experiment and
     demonstrate significant higher-order thinking ability.
  3 (Proficient): Student groups design and conduct the experiment with virtually no errors;
     analysis and interpretation of results meet requirements of the experiment and demonstrate
     some higher-order thinking ability.
  2 (Apprentice): Student groups design and conduct the experiment with no significant errors;
     results are analyzed but not interpreted; very limited evidence of higher-order thinking ability.
  1 (Novice): Student groups design and conduct experiments with major conceptual and/or
     procedural errors; no evidence of significant analysis and interpretation of results; fail to meet
     requirements of the experiment; demonstrate only lower-level thinking ability.

Objective 3: ChE graduates will demonstrate an ability to communicate effectively in
writing.                                                                               Score: ____
  4 (Exemplary): Written report is virtually error-free, presents results and analysis logically, is
     well organized and easy to read, contains high quality graphics, and articulates interpretation
     of results beyond requirements of the experiment.
  3 (Proficient): Written report presents results and analysis logically, is well organized and easy
     to read, contains high quality graphics, contains few minor grammatical and rhetorical errors,
     and articulates interpretation of results which meets requirements of the experiment.
  2 (Apprentice): Written report is generally well written but contains some grammatical,
     rhetorical and/or organizational errors; analysis of results is mentioned but not fully developed.
  1 (Novice): Written report does not present results clearly, is poorly organized, and/or contains
     major grammatical and rhetorical errors; fails to articulate analysis of results meeting
     requirements of the experiment.

Table II
Summary of Inter-rater Reliability Results Using the Scoring Rubric

Objectives assessed:
(1) ChE students will be able to apply knowledge of unit operations to the identification,
    formulation, and solution of unit operations problems.
(2) ChE students will be able to design and conduct experiments of chemical engineering
    processes or systems and they will be able to analyze and interpret data from chemical
    engineering experiments.
(3) ChE students will demonstrate an ability to communicate effectively in writing.

(Each report was scored by all four assessment committee members; scores are listed in rater order.)

Report    Objective (1)    Objective (2)    Objective (3)
a         4,3,3,3          4,2,4,3          3,2,3,3
b         3,2,3,3          3,2,3,3          2,1,2,2
c         1,2,2,3          2,2,2,2          2,2,2,2
d         2,2,2,3          2,2,2,3          2,1,2,2

Once inter-rater reliability was established among assessment committee members, we began the
task of assessing a sample of reports from each of two 1998 summer laboratory sessions.
Results are summarized in Table III. All reports produced by the students in the last two weeks
of the course (approximately 65 per summer) were retained and a random sample of 20 was
collected for assessment purposes. Each report was then independently assessed by two
assessment committee members using the scoring rubric. As the results in Table III show,
scores for 18 of the 20 reports agreed within 1 point for each objective (our reliability criterion)
and therefore only 2 of the 20 reports had to be scored by a third faculty member. Such a high
rate of agreement among a widely disparate group of faculty (two senior professors, one associate
professor, and one new assistant professor who has never taught the laboratory course) is not
uncommon when the rubric development process recommended by Greene [8] is closely
followed.
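The two-rater workflow for the sampled reports can be expressed compactly; the following is our
own sketch (the function and variable names are hypothetical), flagging a report for a third rater
whenever the two blind scores for any objective differ by more than 1 point.

OBJECTIVES = ("apply knowledge", "design/conduct and analyze", "written communication")

def needs_third_rater(scores_by_objective, tolerance=1):
    """scores_by_objective: objective -> (rater 1 score, rater 2 score) on the 1-4 scale."""
    return any(abs(a - b) > tolerance for a, b in scores_by_objective.values())

# Example using report 2 from Table III: no pair differs by more than 1 point,
# so the two-rater scores stand and no third rater is needed.
report_2 = {"apply knowledge": (3, 2), "design/conduct and analyze": (2, 2), "written communication": (1, 2)}
print(needs_third_rater(report_2))   # False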

Table III
Results of Summer Lab Report Assessment (1998 Summer Lab Sessions)

Objectives assessed:
(1) ChE students will be able to apply knowledge of unit operations to the identification,
    formulation, and solution of unit operations problems.
(2) ChE students will be able to design and conduct experiments of chemical engineering
    processes or systems and they will be able to analyze and interpret data from chemical
    engineering experiments.
(3) ChE students will demonstrate an ability to communicate effectively in writing.

Report                        Objective (1)   Objective (2)   Objective (3)
1                             3,3             2,2             2,2
2                             3,2             2,2             1,2
3                             1,2             2,2             3,2
4                             2,2             2,2             2,2
5                             1,1             2,1             1,2
6                             2,3             2,2             2,2
7                             3,3             3,3             3,2
8                             3,3             2,3             3,3
9                             2,3             2,2             3,2
10                            2,3             2,3             2,3
11                            4,4             3,4             3,2
12                            3,4,3           2,4,2           2,3,2
13                            3,3             2,3             2,3
14                            3,3             3,2             3,3
15                            1,1             2,2             1,2
16                            2,1             2,2             2,2
17                            3,3             3,3             2,3
18                            3,3             3,3             3,2
19                            4,3             4,3             3,2
20                            3,2,2           4,2,3           3,2,2
% of scores at 2 or above     80.0%           95.0%           85.0%
% of scores at 3 or above     50.0%           30.0%           10.0%
Average score                 2.6             2.5             2.3

Note: Each report was independently blind-scored by two assessment committee faculty
members. Reports whose scores varied by more than 1 point in any of the three objective
categories were scored by a third faculty member.

Scoring levels: Exemplary = 4, Proficient = 3, Apprentice = 2, Novice = 1

Once we obtained assessment data on the lab reports, we began evaluating the results and how
we could use them to improve the unit operations laboratory course and our overall curriculum.
As part of our assessment plan development, department faculty had agreed upon a stringent
performance criterion for each of the three objectives being assessed: “100% of the reports will
rate at 2 or above and 50% will rate at 3 or above.” As shown in Table III, we are not yet
meeting our performance criterion for any of the three objectives, although students are
performing reasonably well in all three areas. These data are now being used to guide faculty
discussions about what specific curricular changes are needed to further improve student
performance. Possibilities under consideration include adding new unit operations learning
objectives to prerequisite courses in fluid mechanics, heat transfer, and mass transfer; placing
more emphasis on data analysis in chemistry and physics laboratory courses; and adding writing
assignments in chemical engineering courses leading to the unit operations laboratory.
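The departmental criterion is also easy to check mechanically. The sketch below is our
illustration (the names are ours), comparing the criterion against the per-objective summary
percentages reported in Table III rather than the raw scores.

# Check "100% of reports rate at 2 or above and 50% rate at 3 or above"
# against the Table III summary percentages for each objective.
CRITERION = {"pct_at_or_above_2": 100.0, "pct_at_or_above_3": 50.0}

table_iii_summary = {
    "apply knowledge":            {"pct_at_or_above_2": 80.0, "pct_at_or_above_3": 50.0},
    "design/conduct and analyze": {"pct_at_or_above_2": 95.0, "pct_at_or_above_3": 30.0},
    "written communication":      {"pct_at_or_above_2": 85.0, "pct_at_or_above_3": 10.0},
}

for objective, summary in table_iii_summary.items():
    met = all(summary[key] >= CRITERION[key] for key in CRITERION)
    print(f"{objective}: criterion {'met' if met else 'not yet met'}")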

Conclusions and Recommendations

The unit operations laboratory course represents an ideal context for performance assessment of
chemical engineering students. To obtain reliable and valid data requires a multi-step process to
create and test scoring rubrics which clearly articulate the objectives being assessed and describe
in detail each level of observed student performance. The process of developing a performance
assessment plan is itself valuable because faculty members (for perhaps the first time) must
discuss and agree upon the attributes of acceptable student work.

We have developed and tested an easy-to-use scoring rubric for using unit operations laboratory
reports to assess three objectives: 1) knowledge of unit operations, 2) designing and conducting
experiments and analyzing and interpreting data, and 3) written communication. Data from
1998 summer laboratory sessions indicate that the rubric is a reliable and valid means of
holistically assessing student performance.

Based on our success assessing written reports, we hope to expand assessment activities in the
laboratory to include performance in the prelab oral examination, oral presentations, and
teamwork. We also plan to extend this method of assessment to other courses in the curriculum
which contain project and laboratory work.

References Cited

[1] “Educating Tomorrow’s Engineers,” ASEE Prism, pp. 11-15, May/June 1995.

[2] Lee, W.E. and R.R. Rhinehart, “Do We Really Want ‘Academic Excellence?’,” Chemical Engineering Progress,
pp. 82-89, October 1997.

[3] “Criteria for Accrediting Programs in Engineering,” Accreditation Board for Engineering and Technology,
Baltimore, MD, 1999 (available on the ABET WWW homepage: www.abet.org).

[4] “Performance Assessment,” Office of Research, U.S. Department of Education, Washington, DC, 1999
(available on the DOE WWW homepage: inet.ed.gov/pubs/OR/ConsumerGuides/perfasse.html).

[5] Teslow, J.L., L.E. Carlson, and R.L. Miller, “Constructivism in Colorado: Applications of Recent Trends in
Cognitive Science,” ASEE Proceedings, pp. 136-144, 1994.

[6] Atman, C.J. and I. Nair, “Constructivism: Appropriate for Engineering Education?,” ASEE Proceedings, pp.
1310-1312, 1992.

[7] Miller, R.L., J.F. Ely, R.M. Baldwin, and B.M. Olds, “Higher Order Thinking in the Unit Operations
Laboratory,” Chemical Engineering Education, 32(2), pp. 146-151, 1998.

[8] Greene, A., “Developing Rubrics for Open-Ended Assignments, Performance Assessments, and Portfolios,”
Proceedings of the Best Assessment Processes in Engineering Education Conference, Rose-Hulman Institute of
Technology, April 11-12, 1997.

RONALD L. MILLER

Ronald L. Miller is Associate Professor of Chemical Engineering and Petroleum Refining at the Colorado School of
Mines where he has taught chemical engineering and interdisciplinary courses and conducted research in
educational methods since 1986. He has received three university-wide teaching awards and the Helen Plants
Award for Best Workshop at the 1992 Frontiers in Education national conference. Dr. Miller is chair of the
chemical engineering department assessment committee and acting chair of the CSM assessment committee.

BARBARA M. OLDS

Barbara M. Olds is Principal Tutor of the McBride Honors Program in Public Affairs for Engineers and Professor
of Liberal Arts and International Studies at the Colorado School of Mines where she has taught since 1984. She is
the chair of CSM's assessment committee and has given numerous workshops and presentations on assessment in
engineering education. Dr. Olds has received the Brown Innovative Teaching Grant and Amoco Outstanding
Teaching Award at CSM and was the CSM Faculty Senate Distinguished Lecturer for 1993-94. She also received
the Helen Plants Award for Best Workshop at the 1992 Frontiers in Education national conference and was
awarded a Fulbright fellowship to teach and conduct research in Sweden during the 1998-99 academic year.
