Streamlining The Assessment Process With The Faculty Course Assessment Report
John K. Estell Electrical & Computer Engineering and Computer Science Department Ohio Northern University
The Faculty Course Assessment Report (FCAR) presents a methodology that allows assessment reports to be written in a format conducive for use in ABET Criterion 3 program outcomes assessment. In addition to traditional assessment reporting, the FCAR lists modifications incorporated into the course, reflection on the part of the instructor as to what was or was not effective, and suggestions for further improvements. To assist in program outcomes assessment, additional information is collected in certain specified areas and incorporated into the FCAR. This facilitates program-level assessment in that, instead of processing raw data, assessors review the pertinent sections of an appropriate set of FCARs, thereby reducing the assessment workload. Reports are collected and disseminated to allow instructors to inspect prior offerings of specific courses and adopt the suggestions found therein, thereby improving the course with each offering.
The FCAR consists of the following sections:

1. Header. Provide both the subject code and course number, followed by the course title. If this course is offered in multiple sections by different faculty, then each faculty member is to submit an FCAR that summarizes the assessment of all sections for which he/she is responsible. Indicate within parentheses the section(s) that the Report is covering. List the academic term that the Report is for and the instructor of record for the course.

2. Catalog description. Give the catalog description under which this course was taught. Providing this information will, over time, document changes made to the catalog description without the need for keeping previous university catalogs on file. Additionally, the catalog only shows that the course description was changed; it does not document why it was changed, nor does it indicate what feedback elements of the assessment process led to this change. The FCAR documents this activity in the Course Modifications section.

3. Grade distribution. List the distribution of grades for the course, including withdrawals. While it is possible to obtain most of this information from one's Office of Instructional Research, it is preferable that the instructor directly provide this data so that (a) it is obtained in a timely manner, and (b) by actively engaging in this computation, the instructor can better reflect upon the results. At no time is any information included that would reveal the identity of individual students or their grades for the course.

4. Modifications made to course. When the continuous quality improvement process is working, changes are fed back into the program; this is often referred to as "closing the loop" on the assessment process. However, without appropriate documentation, changes made to the organization or operation of individual courses will go unrecognized. Accordingly, this is an important section, as it provides contemporaneous documentation of course improvements made because of the assessment process. Please list any substantive changes made to the current offering of the course, and cite the source of each improvement (e.g., a previous FCAR, an action plan, minutes of a committee meeting), especially if it has been documented. These references are necessary so that each modification can be traced back to its source if required. By combining this information with the relevant portions of the referenced items documenting the assessment process, one can easily demonstrate how the loop was closed for any particular modification.

5. Course outcomes assessment. List and address each outcome separately. Appropriate documentation stating what items were used for the assessment and the results of that assessment must be provided. There is no need to assess every question on every assignment; keep your workload manageable by picking an appropriate selection of items (e.g., specific exam questions, noteworthy assignments) and use those for your assessment. In order to have a uniform reporting method, the following four categories1 are suggested for course outcomes:
Category         Point value   General Description
Exemplary        3             Student applies knowledge with virtually no conceptual or procedural errors.
Adequate         2             Student applies knowledge with no significant conceptual errors and only minor procedural errors.
Minimal          1             Student applies knowledge with occasional conceptual errors and only minor procedural errors.
Unsatisfactory   0             Student makes significant conceptual and/or procedural errors when applying knowledge.
The specification of the performance criteria that correlate to these categories is up to the instructor; accordingly, the description for each category should be developed to apply to the specifics for the course. This information can be presented in an introductory paragraph if the same criteria are used for all outcomes. When reported, each outcome should present a count of the number of students in each of the four categories. For conciseness, the count can be reported in vector format; the EAMU vector would contain the following four fields in order: Exemplary, Adequate, Minimal, and Unsatisfactory. Performance criteria vectors are to be constructed with data from only those students who received a passing grade in the course, as the primary question is whether students are graduating without achieving the specified outcomes. Please refer to the section at the end of this document, Using Microsoft Excel to Automate the Outcomes Assessment Process, for a way to simplify the collection and processing of this information.
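As a sketch of how such a count could be tallied programmatically (the function name and the ratings list below are illustrative, not part of the FCAR format):

```python
# Illustrative sketch: tally an EAMU vector from per-student category
# ratings, where 3 = Exemplary, 2 = Adequate, 1 = Minimal, and
# 0 = Unsatisfactory. Only students who received a passing grade in the
# course are included in the ratings list.
def eamu_vector(ratings):
    """Return (Exemplary, Adequate, Minimal, Unsatisfactory) counts."""
    return tuple(ratings.count(level) for level in (3, 2, 1, 0))

# Hypothetical outcome rated for eight passing students:
print(eamu_vector([3, 3, 2, 2, 2, 1, 1, 0]))  # (2, 3, 2, 1)
```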
1 Adapted from R. L. Miller and B. M. Olds, "Performance Assessment of EC-2000 Student Outcomes in the Unit Operations Laboratory," 1999 ASEE Annual Conference Proceedings.

6. Program outcome assessment documentation ("Components"). The assessment of course outcomes is, by itself, insufficient to meet the criteria for program outcomes assessment. The data presented for satisfying the requirements for Criterion 3 have to be relative to the adopted program outcomes. However, this does not mean that the course outcomes assessment process cannot be used to assist in the program outcomes assessment process. This section of the Report is organized into components that roughly correspond to the individual items listed in the 3(a)-(k) program outcomes. While writing metrics for some of these outcomes borders on the trivial, and a wide variety of assessment data are readily available, some outcomes are more difficult to deal with and not easily documented save at the course level.

As an example, take outcome (b) of ABET Criterion 3: an ability to design and conduct experiments, as well as to analyze and interpret data. How does one sufficiently prove to a program evaluator that a graduate of the program has experience and expertise in designing experiments? Merely stating that this activity is being accomplished is insufficient and would likely result in the citing of a shortcoming. Documentation is needed to back up the claim, and this can be provided in the courses where design of experiments is occurring by the inclusion of a Design of Experiments Component in the submitted Reports for those courses. When writing this portion of the Report, the instructor presents a synopsis regarding the assignment(s) in question and what steps were undertaken by students in order to design the experiment, along with assessment of the results. The person performing program outcome assessment in this area can now document that this activity is taking place by citing the FCARs of the relevant courses. For added convenience, the header for each component should parenthetically include a reference to the specific metric for which this data is being collected; in this way, the data can be easily processed from the FCAR and placed under the appropriate program outcome.
Going through the list of outcomes in ABET Criterion 3, some areas worth documenting (provided you are doing something of sufficient substance that it can be pointed to as an example) are the following: design of experiments, professional/ethical responsibility, communications (both written and oral), impact of solutions in a global and societal context, and contemporary issues. This is not meant to be an exhaustive list; however, it does cover some of the harder items to prove for Criterion 3. By providing contemporaneous documentation here, it at least demonstrates that these items have been addressed. A component should be listed only when there is something to report or when one is specifically instructed to do so as part of an assessment plan.

The reporting of components can be performed as follows:

a. An Assessment & Evaluation Committee may select a small set of courses where a particular ability, knowledge, or understanding (which may or may not be part of the course outcomes for those courses) is demonstrated. The use of a component, with its own assessment, will provide the needed documentation. The assessment of these items will be performed in the same manner as a course outcome assessment, using the categories provided in item 5 above, with the indicators being tailored specifically for that particular metric.
b. An Assessment & Evaluation Committee may select a small set of courses where a student's growth in a particular ability, knowledge, or understanding (which may or may not be part of the course outcomes for those courses) is demonstrated. In this
instance, one is performing cohort longitudinal analysis (CLA), where the parameters for the performance evaluation are held constant throughout the curriculum and are set for the performance expected from a student at the time of graduation. CLA is used to determine the effectiveness and appropriateness of the curriculum by measuring the ability of students as they progress through the curriculum. In order to have a uniform reporting method, the following four categories2 can be used for CLA-related components:
Category     Point value   General Description
Excellent    3             Student applies knowledge with virtually no conceptual or procedural errors.
Proficient   2             Student applies knowledge with no significant conceptual errors and only minor procedural errors.
Apprentice   1             Student applies knowledge with occasional conceptual errors and only minor procedural errors.
Novice       0             Student makes significant conceptual and/or procedural errors when applying knowledge.
2 Adapted from R. L. Miller and B. M. Olds, "Performance Assessment of EC-2000 Student Outcomes in the Unit Operations Laboratory," 1999 ASEE Annual Conference Proceedings.

Note that this approach takes a different viewpoint in its judgment of student performance. In an introductory course, it is perfectly acceptable for an overall rating of novice to be reported. The desire here is to show that the curriculum is providing useful instruction by targeting areas where the incoming cohorts are novices; if all is going well, the cohort will initially rate as novices and will graduate with at least a proficient performance rating. If the cohort enters with a rating higher than that of novice, then that could indicate that the introductory course is too basic and is not serving as an appropriate challenge to the students. If a cohort graduates with a less than proficient rating, then that could indicate a failure in some portion of the curriculum to deliver appropriate instruction. Ideally, at the time of graduation, a cohort should rate as either proficient or excellent, and no member of a cohort should reside at either the novice or apprentice level. When reported, each outcome should present a count of the number of students in each of the four categories. For conciseness, the count can be reported in vector format; the EPAN vector would contain the following four fields in order: Excellent, Proficient, Apprentice, and Novice.

c. The third situation is where the instructor voluntarily reports component information, either as a statement of fact or with assessment information in one of the two specified formats. This is done to provide additional documentation that these activities are occurring in the course, should that evidence be needed later.

7. Student feedback. When performing assessment, input should be obtained from all of the appropriate constituents; accordingly, it is reasonable and proper to incorporate student feedback into the Report. Please provide a synopsis of the course evaluation form feedback as it relates to the course. While some of the comments received from students are of dubious quality, or constitute criticism directed toward the instructor, there are other comments regarding course content and organization that are worthy of being shared. This section of the FCAR allows an instructor to publicly document and share constructive comments concerning the course. By sharing this information, the student comments regarding the course reach a wider audience, increasing the likelihood that these comments will find their way into an action plan for improving the content of the course.

8. Reflection. The primary purpose of this section is to promote self-awareness on the part of the instructor. Given that the goal of assessment is to improve the program, it is imperative that the instructor keep an open mind while looking at the results so that shortcomings can be identified and corrected. The reflection section also provides the instructor the opportunity to document impressions regarding the effectiveness of instruction, extenuating circumstances that might have affected student performance, or items that fall outside the scope of the current set of course and program outcomes. Having this opportunity for reflection is very beneficial for both the improvement of the course and the improvement of the instructional methods used by the instructor. From an assessment standpoint, it allows for the documentation of those things that are not easily measurable, and of things that are measurable but not encapsulated in the current set of course or program outcomes.

9. Proposed actions for course improvement. The specification of proposed actions for course improvement begins the "closing the loop" process, as these items constitute the result of the instructor's evaluation of the course via assessment, student feedback, and reflection.
There are no restrictions as to what can be proposed; it could be as simple as a note to include material on a certain subject in an assignment, or a recommendation to the curriculum committee to create a new course to better deal with some of the subject material. Whatever suggestions are recorded by the instructor, it is essential that the appropriate parties in the department review these suggestions; to that end, one needs to incorporate the FCAR review into the overall assessment process as a regularly scheduled activity.
The following is an FCAR example. Please use this standard format when writing your Report.
Note: Our outcomes assessment process utilizes vectors to aggregate data. The "EAMU" vector reflects the number of students whose proficiency for that outcome was rated Exemplary, Adequate, Minimal, or Unsatisfactory. The "EPAN" vector is used for cohort longitudinal analysis (CLA) and rates students as Excellent, Proficient, Apprentice, or Novice on their abilities in various areas, such as communication skills. The goal of CLA is to use the data to demonstrate skill improvement over the course of a cohort's academic career so that by the time of graduation, all students are at least proficient in all areas that are being measured.
Faculty Course Assessment Report

ECCS 000 Introduction to ECCS (sections 00 and 01), 1.00 credit
Fall Quarter 2005 - John K. Estell

Catalog Description:
Orientation to the department. Familiarization with requirements for the majors, planning program of courses, university catalog, and library. Exposure to TLAs such as PHP, ASP, PLC, BJT, etc. Philosophical discussion of the metavariables foo and bar.
Grade Distribution:
Grade   A   B   C   D   F   W   Total
Count   3   5  11   4   2   1      26
Student Feedback:
On the student course evaluation forms, students indicated a general dissatisfaction with the lecture on career opportunities available to our majors. Some expressed an interest in having a mentoring program to ease the transition into college life. A couple of students indicated that we should spend less time on dealing with university paperwork and more on what it is like to be an engineer.
Reflection:
Overall, the course went well, but some areas need work. Half of the class demonstrated less than adequate proficiency with metavariables. I don't think we did a sufficient job of explaining the rationale behind our common freshman core course sequences. We should advertise the successes of our alumni. The addition of the ethics lectures was well received; students enjoyed talking with a real engineer about the situations she's encountered in the workplace.
Using Microsoft Excel to Automate the Outcomes Assessment Process

Assume that we have a class of ten students in a course with three course outcomes, CO-1, CO-2, and CO-3. The final exam (FE) is broken down into three pages, with questions concerning CO-1 placed on page 1 (P1) of the exam, CO-2 questions on page 2 (P2), and CO-3 questions on page 3 (P3). (Please note that this is a contrived example; it is not expected that every page of your exams will correspond to, or be used for, course outcomes assessment.) The scores for each page are entered into the course's Excel spreadsheet; the results are tabulated and the grades assigned. Please note that we also record at the top of each column the number of possible points available for each set of questions.
ECCS 000 - Fall 2005   Introduction to ECCS

Name               FE-P1   FE-P2   FE-P3   Final Exam   Grade
(points possible)     25      35      40          100
Aardvark, Alan        20      32      37           89    B
Bear, Betty           25      29      39           93    A
Cougar, Clyde         15      20      14           49    F
Dingo, Daisy          22      30      30           82    B
Elephant, Ed          15      30      32           77    C
Flamingo, Fiona       20      20      20           60    D
Groundhog, Gus        10      35      40           85    B
Hyena, Henry          12      30      35           77    C
Iguana, Isaac         14      27      34           75    C
Jackalope, Jack        5      15      20           40    F
At this point, we sort the results in descending order by the final score in the class (which in this case is the final exam score), and then add one column for each course outcome for only those students who have passed the course. One of the purposes of program assessment is to measure the abilities of our students against a set of outcomes we expect each student to meet by the time of graduation. A student who has failed the course is of no interest to us for the purposes of this assessment; our concern is directed toward those students who passed the course without achieving one or more of the course outcomes. If our analysis shows that a "significant" number of students are passing the course without achieving a particular course outcome, then that points out an area to focus our efforts for quality improvement.
ECCS 000 - Fall 2005   Introduction to ECCS

Name               FE-P1   FE-P2   FE-P3   Final Exam   Grade   CO-1   CO-2   CO-3
(points possible)     25      35      40          100
Bear, Betty           25      29      39           93    A
Aardvark, Alan        20      32      37           89    B
Groundhog, Gus        10      35      40           85    B
Dingo, Daisy          22      30      30           82    B
Elephant, Ed          15      30      32           77    C
Hyena, Henry          12      30      35           77    C
Iguana, Isaac         14      27      34           75    C
Flamingo, Fiona       20      20      20           60    D
Cougar, Clyde         15      20      14           49    F
Jackalope, Jack        5      15      20           40    F
The "CO-1," "CO-2," and "CO-3" columns in the figure below reflect the processing of the raw course outcome data into the 4-point "compliance level" scale used by the EAMU and EPAN vector reporting mechanisms. The rows beneath the "Outcome Analysis" label collate the point scale values into reportable vectors.
1. To calculate letter grades (based on the traditional 90-80-70-60 boundaries), start with cell F4:

   =IF(E4>=90,"A",IF(E4>=80,"B",IF(E4>=70,"C",IF(E4>=60,"D","F"))))

where cell E4 is the corresponding final score for the student in the course. Copy and paste the formula into the remaining cells (F5-F13).
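For readers more comfortable outside of Excel, the same nested IF can be sketched in Python (the function name is ours, not part of the spreadsheet):

```python
# Illustrative Python equivalent of the nested IF grading formula.
def letter_grade(score):
    """Map a 0-100 final score to a letter grade (90-80-70-60 boundaries)."""
    for cutoff, grade in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if score >= cutoff:
            return grade
    return "F"

print(letter_grade(89))  # B
print(letter_grade(40))  # F
```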
2. To calculate the course outcome compliance level (using our 0-1-2-3 model), start with cell G4:

   =IF(B4/B$3>=0.9,3,IF(B4/B$3>=0.75,2,IF(B4/B$3>=0.6,1,0)))

Note the use of the dollar sign; it is needed to anchor the row reference to the points-possible row. You can now copy this formula to the remaining course outcome categorization cells.
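For comparison, the same thresholding can be sketched in Python; the 0.9/0.75/0.6 cutoffs come directly from the IF expression above, while the function name is illustrative:

```python
# Illustrative Python equivalent of the compliance-level formula:
# map a raw score to the 0-1-2-3 scale using 90%/75%/60% cutoffs.
def compliance_level(points, possible):
    """Return the compliance level for a score out of `possible` points."""
    ratio = points / possible
    if ratio >= 0.9:
        return 3
    if ratio >= 0.75:
        return 2
    if ratio >= 0.6:
        return 1
    return 0

print(compliance_level(20, 25))  # 2 (80% of the points possible)
```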
3. To automate the counting of the number of students at each level of performance, enter the following, starting with cell G16 and ending with G19:

   =COUNTIF(G$4:G$11,"=3")    (formula for cell G16)
   =COUNTIF(G$4:G$11,"=2")    (formula for cell G17)
   =COUNTIF(G$4:G$11,"=1")    (formula for cell G18)
   =COUNTIF(G$4:G$11,"=0")    (formula for cell G19)
These cells can be entered for one column of analysis reporting, then that column can be copied and pasted into the other reporting columns.
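The COUNTIF step can likewise be sketched in Python. The levels below are the CO-1 compliance levels of the eight passing students in the example (each FE-P1 score out of 25, mapped through the 90%/75%/60% cutoffs); the variable names are illustrative:

```python
from collections import Counter

# CO-1 compliance levels for the eight passing students (rows 4-11 of
# the sorted spreadsheet), derived from their FE-P1 scores out of 25.
levels = [3, 2, 0, 2, 1, 0, 0, 2]

# Tally one count per EAMU category, mirroring the four COUNTIF cells.
counts = Counter(levels)
eamu = [counts[v] for v in (3, 2, 1, 0)]
print(eamu)  # [1, 3, 1, 3]
```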