This document surveys major approaches to program evaluation, including objectives-oriented, expertise-oriented, participant-oriented, management-oriented, and consumer-oriented approaches, with examples of how each may be applied and a discussion of its strengths and weaknesses. It also covers evaluation methods such as surveys, interviews, and mixed methods, and closes with references to related research and examples of evaluation studies.
Management-Oriented Evaluation
- Context, Input, Process, Product (CIPP) Model
- UCLA Model

Strengths & Limitations
- Rational and orderly
- Informs high-level decision makers
- Costly and complex
- Stability vs. the need for adjustment
Consumer-Oriented Approach
- Typically a summative evaluation approach
- Advocates consumer education and independent reviews of products
- Scriven's contributions grew out of the groundswell of federally funded educational programs in the 1960s
- Differentiates between formative and summative evaluation
What is a consumer-oriented evaluation approach?
- Independent agencies, governmental agencies, and individuals compile information on educational or other human services products for the consumer
- Goal: to help consumers become more knowledgeable about products
For what purposes is it applied?
- Typically applied to educational products and programs
- Used by governmental agencies, independent consumer groups, and the Educational Products Information Exchange
- Represents the voice and concerns of consumers
How is it generally applied?
- By creating and using stringent checklists and criteria
- Examples: Michael Scriven, the Educational Products Information Exchange, the U.S. Department of Education's Program Effectiveness Panel
- Criteria address processes, content, transportability, and effectiveness
Consumer-Oriented Checklist
- Need
- Market
- Performance:
  - True field trials [tests in a "real" setting]
  - True consumer tests [tests with real users]
  - Critical comparisons [comparative data]
  - Long term [effects over the long term]
  - Side effects [unintended outcomes]
  - Process [product use fits its description]
  - Causation [experimental study]
  - Statistical significance [supports product effectiveness]
  - Educational significance
(a sketch of encoding this checklist follows below)
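Below is a minimal sketch of how a checklist like this might be encoded and applied in practice. The criterion names come from the slide above; the 0-4 rating scale, the acceptance threshold, and the `evaluate` helper are hypothetical choices for illustration, not part of Scriven's methodology.

```python
# A minimal sketch of applying a consumer-oriented checklist.
# Criterion names come from the slide; the 0-4 rating scale and the
# acceptance threshold are hypothetical, for illustration only.

CRITERIA = [
    "Need", "Market", "True field trials", "True consumer tests",
    "Critical comparisons", "Long term", "Side effects", "Process",
    "Causation", "Statistical significance", "Educational significance",
]

def evaluate(ratings: dict[str, int], threshold: float = 2.0) -> str:
    """Summarize ratings (0 = poor ... 4 = excellent) across all criteria."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    mean = sum(ratings[c] for c in CRITERIA) / len(CRITERIA)
    verdict = "acceptable" if mean >= threshold else "not recommended"
    return f"mean rating {mean:.2f} -> {verdict}"

# Example: rate a hypothetical reading program on every criterion.
ratings = {c: 3 for c in CRITERIA}
ratings["Side effects"] = 1   # unintended outcomes were observed
print(evaluate(ratings))
```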
Strengths…
- Makes evaluations of products and programs available to consumers who may not have the time or resources to do the evaluation themselves
- Increases consumers' knowledge about using criteria and standards to evaluate educational and human services products objectively and effectively
- Consumers have become more aware of market strategies

…and Weaknesses
- Increases product costs, which are passed on to the consumer
- Raises concern about growing dependence on outside products and consumer services rather than the development of local initiative

Consumer-Oriented Promotions
- Goals: encourage repurchases by rewarding current users, boost sales of complementary products, and increase impulse purchases
- Coupons: the most widely used form of sales promotion
- Refunds or rebates: help packaged-goods companies increase purchase rates, promote multiple purchases, and reward product users
More Consumer-Oriented Promotions
- Samples, bonus packs, and premiums: a "try it, you'll like it" approach
- Contests: require entrants to complete a task such as solving a puzzle or answering questions in a trivia quiz
- Sweepstakes: choose winners by chance; no product purchase is necessary
- Specialty advertising: a sales promotion technique that places the advertiser's name, address, and advertising message on useful articles that are then distributed to target consumers
What other criteria for selection does this approach suggest to you?

Expertise-Oriented Evaluation
- Direct application of professional expertise to judge the quality of educational endeavors, especially their resources and processes
Approaches
- Formal Professional Review System
- Informal Professional Review System
- Ad Hoc Panel Review
- Ad Hoc Individual Review
- Educational Connoisseurship and Criticism
Formal Professional Review System
- A structure or organization established to conduct periodic reviews of educational endeavors
- Published standards
- Pre-specified schedule
- Opinions of several experts
- Impact on the status of what is reviewed
Informal Professional Review System
- Examples: state review of district funding programs; review of professors for rank advancement or tenure; a graduate student's supervisory committee
- Characteristics: existing structure, no published standards, infrequent schedule, expert opinions, status usually affected

Other Approaches
- Ad Hoc Panel Reviews: journal manuscript reviews, funding agency review panels, blue-ribbon panels
- Multiple opinions; status sometimes affected
- Ad Hoc Individual Reviews (e.g., a consultant): status sometimes affected
- Educational Connoisseurship and Criticism: modeled on the theater, art, and literary critic
Weaknesses
- Whose standards? (personal bias)
- How are expertise credentials established?
- Can this approach be used with issues of classroom life, texts, and other evaluation objects, or only with the bigger institutional questions?

Expertise-Type Questions
- What outsiders review your program or organization? How expert are they in your program's context, process, and outcomes?
- What are the characteristics of the most and least helpful reviewers?
- This approach encourages representation of marginalized, oppressed, and/or powerless parties

Participant-Oriented Characteristics
- Depends on inductive reasoning [observe, discover, understand]
- Confirmability vs. objectivity (neutrality of the evaluation)

Participant-Oriented Examples
- Participatory Evaluation: collaboration between evaluators and key organizational personnel for practical problem solving
- Utilization-Focused Evaluation: base all evaluation decisions on how they will affect use
- Empowerment Evaluation: advocates for society's disenfranchised, voiceless minorities
  - Advantages: training, facilitation, advocacy, illumination, liberation
  - It is unclear how this is a distinct participant-oriented approach; some in the evaluation field argue it is not 'evaluation' at all
Strengths and Weaknesses
- Strengths: emphasizes the human element; yields new insights and theories; flexibility; attention to contextual variables; encourages multiple data collection methods; provides rich, persuasive information; establishes dialogue with, and empowers, quiet or powerless stakeholders
- Weaknesses: too complex for practitioners (better suited to theorists); political element; subjective, "loose" evaluations; labor intensive, which limits the number of cases studied; costly; potential for evaluators to lose objectivity

Participant-Oriented Questions
- What current program are you involved in that could benefit from this type of evaluation?
- Who are the stakeholders?
What's going on in the field?
- Educational preparation: http://www.duq.edu/program-evaluation/
- TEA: http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949
What's going on in the field?
- Rockwood School District
- Clear Creek ISD
- Educational link posted by Austin ISD
- Houston ISD
- Austin ISD
District Initiatives
- Houston ISD "Real Men Read": http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
- Alvin ISD "MHS (Manvel HS) Reading Initiative Program": http://www.alvinisd.net/education/staff/staff.php?sectionid=245
What does the research say?
"Rossman and Salzman (1995) have proposed a classification system for organizing and comparing evaluations of inclusive school programs. They suggest that evaluations be described according to their program features (purpose, complexity, scope, target population, and duration) and features of the evaluation (design, methods, instrumentation, and sample)."
Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
What does the research say?
"Twenty-eight school counselors from a large Southwestern school district participated in a program evaluation training workshop designed to help them develop evaluation skills necessary for demonstrating program accountability. The majority of participants expressed high levels of interest in evaluating their programs but believed they needed more training in evaluation procedures."
Astramovich, R., Coker, J., & Hoskins, W. (2005). Professional School Counseling, 9, 49-54.
What does the research say?
Group Interview Questions
"Graduate research assistants conducted group interviews in Grades 2-5 during the final weeks of the school year. We obtained parent permission by asking teachers to distribute informed consent forms to students in their classes, which invited the students to participate in the group interviews. We received informed consent forms from at least 3 students - the criterion number for a group interview at the school - for 21 schools (66% participation rate). If participation rates were high enough, the research assistants conducted separate interviews for Grades 2-3 and 4-5; the assistants conducted 23 interviews. The research assistants tape recorded all interviews, which averaged about 25 min, for data analysis. The interviewer encouraged responses from all group members. Four questions guided the group interviews."
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
What does the research say?
Survey Collection
"A survey collected teachers' self-reports of the frequency with which they implemented selected literacy activities and the amount of time in minutes that they used the literacy activities. Teachers also reported their level of satisfaction with the literacy resources available to them."
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
Tips on making a survey (Langbein, 2006):
- Keep the survey response time to around 20 minutes
- Make the survey easy to answer
- Avoid vague questions: "What changes should the school board make in its policies regarding the placement of computers in elementary school?" Is this question effective or vague?
- Clarify the time period: "During the past year, has computer use by the average child in your classroom increased, decreased, or stayed the same?"
- Avoid double- (or triple-, or quadruple-) barreled questions and responses: "My classroom aide performed his/her tasks carefully, impartially, thoroughly, and on time."
A toy sketch applying these tips follows below.
Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.
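The sketch below encodes two of these tips as a crude survey-item "linter". The heuristics (comma-and-"and" counting for barreled items, a 30-seconds-per-question estimate against the 20-minute budget) are illustrative assumptions of ours, not rules from Langbein's book.

```python
# A toy "linter" for survey items, illustrating the tips above.
# The heuristics below are rough illustrative assumptions, not
# rules taken from Langbein (2006).

TIME_CUES = ("past year", "past month", "past week", "during", "since")

def lint_item(question: str) -> list[str]:
    """Return warnings for a single survey question."""
    warnings = []
    q = question.lower()
    # Double-barreled check: several comma-separated qualities joined by "and".
    if " and " in q and q.count(",") >= 2:
        warnings.append("possibly double- (or more) barreled")
    # Tip: survey questions should clarify the time period.
    if not any(cue in q for cue in TIME_CUES):
        warnings.append("no time period specified")
    return warnings

def estimated_minutes(questions: list[str], sec_per_item: int = 30) -> float:
    """Rough response-time estimate, to compare against a 20-minute budget."""
    return len(questions) * sec_per_item / 60

items = [
    "My classroom aide performed his/her tasks carefully, impartially, "
    "thoroughly, and on time.",
    "During the past year, has computer use by the average child in your "
    "classroom increased, decreased, or stayed the same?",
]
for q in items:
    print(lint_item(q) or ["looks ok"])
print(f"estimated length: {estimated_minutes(items):.1f} min")
```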
Examples of Evaluation Methods
One study uses a mixed-methods design combining objectives-oriented, expertise-oriented, and participant-oriented approaches. The evaluations were based on the models provided in Program Evaluation: Alternative Approaches and Practical Guidelines.
The purpose of this report is to illustrate the procedures necessary to complete an evaluation of the Naval Aviation Survival Training Program (NASTP). It was written by Anthony R. Artino Jr., a program manager and instructor within the NASTP for eight years.
"In very few instances have we adhered to any particular 'model' of evaluation. Rather, we find we can ensure a better fit by snipping and sewing bits and pieces off the more traditional ready-made approaches and even weaving a bit of homespun, if necessary, rather than by pulling any existing approach off the shelf. Tailoring works" (Worthen, Sanders, & Fitzpatrick, p. 183).
Objectives-oriented evaluation: this approach was used because (a) the NASTP has a number of well-written objectives; (b) student attainment of those objectives is relatively easy to measure with pre- and post-assessments; and (c) the program sponsor, the Chief of Naval Operations (CNO), would be very interested to know whether the objectives he approves are in fact being met.
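Since this piece of the evaluation rests on pre- and post-assessments, here is a minimal sketch of how objective attainment might be checked statistically. The scores, the 70% mastery cutoff, and the choice of a paired t-test are all illustrative assumptions, not details drawn from the NASTP report.

```python
# A minimal sketch of a pre/post attainment check for an
# objectives-oriented evaluation. All scores are fabricated, and the
# 70% mastery cutoff and paired t-test are illustrative assumptions.
from scipy import stats

pre  = [52, 61, 48, 70, 55, 63, 58, 66, 50, 59]   # pre-test scores (%)
post = [78, 85, 70, 88, 74, 81, 79, 90, 72, 80]   # post-test scores (%)

# Paired t-test: did the same trainees score higher after training?
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}")

# Objective attainment: share of trainees at or above the mastery cutoff.
mastery = sum(s >= 70 for s in post) / len(post)
print(f"{mastery:.0%} of trainees met the 70% mastery criterion")
```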
Expertise-oriented evaluation: an outside opinion was obtained from an aviation survival training subject matter expert (SME), someone very familiar with the topics being taught and with the current research literature in survival training.
Participant-oriented evaluation: it was important for the evaluators and the SME to be fully immersed in the training environment. This included a focus on audience concerns and issues (i.e., those of managers, instructors, and students) and an examination of the program in situ, without any attempt to manipulate or control it (Worthen, Sanders, & Fitzpatrick, 1997).
References
Astramovich, R., Coker, J., & Hoskins, W. (2005). Professional School Counseling, 9, 49-54.
Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.