BMHR5103, November 2012
- Supervised by: Dr. Nayal Rashed
- Presented by: Eyad Al-Samman
I.D.#: 201010047
(1)TRAINING EVALUATION – BMHR5103
This presentation concisely discusses the following issues
related to the training evaluation process:
(2)TRAINING EVALUATION – BMHR5103
• Definition of Training Evaluation
• Benefits of Training Evaluation
• Steps of Training Evaluation:
• STEP (1): Identifying Purposes of Evaluation
• STEP (2): Selecting Evaluation Method
• STEP (3): Designing Evaluation Tools
• STEP (4): Collecting Data
• STEP (5): Analyzing and Reporting Results
• Kirkpatrick Four-Level Evaluation Model (1959)
• Purposes of Training Evaluation
Training evaluation involves assessing the effectiveness of training
programs. This assessment is done by collecting data on whether:
(3)TRAINING EVALUATION – BMHR5103
1. Participants were satisfied with the training program.
2. The effectiveness of participants’ skills is enhanced.
3. Participants are able to apply new skills at their workplace.

Training Management Cycle:
STEP 1: CONDUCT Training Needs Analysis
STEP 2: DESIGN Training Program
STEP 3: SELECT Training Method
STEP 4: EVALUATE Training Program
(4)TRAINING EVALUATION – BMHR5103
• Ensuring accountability (training programs comply with competency gaps).
• Matching training costs with desired outcomes (e.g., improving employee behaviour).
• Determining the program’s strengths and weaknesses (does the program meet its learning objectives?).

1) Feedback: giving feedback to candidates by defining objectives and linking
these objectives to learning outcomes.
2) Research: ascertaining the relationship between training, acquired
knowledge, and the transfer of that knowledge to the workplace.
3) Control: controlling the training program to ensure its effectiveness.
4) Power Games: using evaluative data for the benefit of top management.
5) Intervention: determining whether actual outcomes are aligned with expected
outcomes.
(5)TRAINING EVALUATION – BMHR5103
STEP 1: Identifying Purposes of Evaluation (Why do we want to evaluate training programs?)
STEP 2: Selecting Evaluation Method (Many methods are available; the most famous is Kirkpatrick’s four-level model.)
STEP 3: Designing Evaluation Tools (Qualitative or quantitative tools?)
STEP 4: Collecting Data (Who, when, and how to collect data?)
STEP 5: Analyzing and Reporting Results (Evaluation data analysis and reporting.)
Learning from experience for future improvement
• Evaluation helps us to learn from experience of past training programs.
(6)TRAINING EVALUATION – BMHR5103
Why do we want to evaluate training programs?
Knowing different purposes of training evaluation
• Seeing how knowledge and skills learned in training are put into practice.
• Identifying strengths and weaknesses of training programs.
Accountability issues
• Evaluation increases accountability of implementing agencies to concerned
stakeholders.
Other related reasons
• Comparing costs and benefits of a human resource development program.
• Deciding who should participate in future programs.
(7)TRAINING EVALUATION – BMHR5103
Many methods are available; the most famous is Kirkpatrick’s four-level model.

Goal-based approach:
• Kirkpatrick (1959): 1. Reaction, 2. Learning, 3. Behavior, 4. Results

System-based approaches:
• CIPP Model (1987): 1. Context, 2. Input, 3. Process, 4. Product
• IPO Model (1990): 1. Input, 2. Process, 3. Output, 4. Outcomes
• TVS Model (1994): 1. Situation, 2. Intervention, 3. Impact, 4. Value
Level 1: REACTION
(8)TRAINING EVALUATION – BMHR5103
Donald
Kirkpatrick’s
Four-Level
Training
Evaluation
Model (1959)
Measuring how participants react to the training program.
• For measuring reaction, survey or questionnaire questions like these
should be considered:
» Did the trainees feel that the training was worth their time?
» What were the biggest strengths and weaknesses of the training?
After gathering this information, changes can be made based on
the trainees’ feedback and suggestions.
Level 2: LEARNING
(9)TRAINING EVALUATION – BMHR5103
• Measuring learning can be done by identifying what the evaluator
wants to evaluate (i.e., changes in knowledge, skills, or attitudes).

BEFORE Training:
• Learners’ skills and knowledge are assessed before the training program.
• Candidates are unaware of the objectives and learning outcomes of the program.

DURING Training:
• The phase where instruction starts.
• This phase consists of short tests and quizzes at regular intervals.

AFTER Training:
• Learners’ skills and knowledge are assessed again to measure the
effectiveness of the training program.
• This phase determines whether training has had the desired effect at the
individual and organizational levels.
Level 3: BEHAVIOR
(10)TRAINING EVALUATION – BMHR5103
Measuring the extent to which change in participants’ behavior has
occurred because of attending training programs.
• For measuring changes in behavior, questions like these should be considered:
» Did the trainees put any of their learning to use?
» Are trainees able to teach their new knowledge, skills, or attitudes to
other people?
• For changes to happen, trainees should:
- have the desire to change;
- know what to do and how to do it;
- be trained in the right climate;
- be rewarded for changing.
Level 4: RESULTS
(11)TRAINING EVALUATION – BMHR5103
Measuring the final results that occurred because participants attended
the training program.
• For measuring results, outcomes that are closely related to the training
program should be considered, such as:
» Increased employee retention.
» Higher morale.
» Increased sales.
» Higher quality ratings.
» Increased customer satisfaction.
Level 5: RoI (Return on Investment)
(12)TRAINING EVALUATION – BMHR5103
Level 5 measures the return on investment (RoI) for a training program.

How to calculate Return on Investment (RoI)?

1. Program Benefits:
 Convert output data to cost savings.
 Calculate the cost of quality.
 Use historical training costs.
 Use external databases to estimate the cost of data items.

2. Program Costs:
 Cost to design and develop the program.
 Cost of all program materials.
 Cost of instructors.
 Cost of facilities.

Net benefit = Program benefits − Program costs
Return on Investment (RoI) = Net benefit / Program costs
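These formulas can be sketched as a short calculation. The following is a minimal illustration with hypothetical figures (the function names and the $50,000/$20,000 amounts are my own for the example, not from the slides):

```python
# Worked RoI calculation following the slide's two formulas.

def net_benefit(program_benefits: float, program_costs: float) -> float:
    """Net benefit = Program benefits - Program costs."""
    return program_benefits - program_costs

def roi(program_benefits: float, program_costs: float) -> float:
    """RoI = Net benefit / Program costs (expressed as a ratio)."""
    return net_benefit(program_benefits, program_costs) / program_costs

# Hypothetical program: $50,000 in converted benefits against $20,000
# in total costs (design, materials, instructors, facilities).
benefits, costs = 50_000.0, 20_000.0
print(net_benefit(benefits, costs))  # 30000.0
print(roi(benefits, costs))          # 1.5, i.e. a 150% return
```

An RoI above zero means the program returned more than it cost; here every dollar spent returned $1.50 beyond the original outlay.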
Evaluation tools can be selected depending on the purposes and methods of evaluation.
Qualitative tools: interviews, case studies, focus groups.
Quantitative tools: surveys, questionnaires, experiments.
(13)TRAINING EVALUATION – BMHR5103
Designing qualitative, quantitative, or mixed tools?
SAMPLE QUESTIONNAIRE
• The training met my expectations.
Strongly Agree () Agree () Neutral ()
Disagree () Strongly Disagree ()
• The trainer was knowledgeable.
Strongly Agree () Agree () Neutral ()
Disagree () Strongly Disagree ()
SAMPLE INTERVIEW
• What did you enjoy most about
today?
• What is the most valuable thing you
learned today (knowledge or skills)?
Qualitative Tools: Interviews
Quantitative Tools: Questionnaires
(14)TRAINING EVALUATION – BMHR5103
Guidelines for effective questionnaire data collection:
• Keep responses anonymous.
• Distribute questionnaire forms in advance.
• Explain the purposes of the questionnaire and how its information will be used.
• Allow enough time for completing the questionnaire.

Guidelines for effective interviews:
• Introduction
 Appreciation
 Self-introduction
 Purpose of the interview
• Interviewer’s positive attitude
 Face interviewees positively and respectfully.
 Smile and be friendly.
 Give positive feedback (maintain eye contact, repeat answers).
Who, when, and how to collect data?
Analyzing Collected Data
(15)TRAINING EVALUATION – BMHR5103
Evaluation data analysis and reporting.

1. Data Input: data collected in STEP 4 should be entered into a
computer using MS EXCEL or SPSS.
2. Data Analysis: analysis should be simple and limited to necessary
data, using frequency distributions and average statistics.
3. Data Presentation: data collected in STEP 4 should be presented
using figures and charts.
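As a minimal sketch of this analysis step, a frequency distribution and an average statistic for one Likert-scale questionnaire item can be computed as follows. The responses and the 1–5 coding are hypothetical; the slides name MS EXCEL and SPSS, so plain Python here is only an illustration of the same calculations:

```python
from collections import Counter

# Hypothetical 5-point coding for one questionnaire item,
# e.g. "The training met my expectations."
SCALE = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly Agree": 5}

# Hypothetical responses collected in STEP 4.
responses = ["Agree", "Strongly Agree", "Neutral", "Agree",
             "Disagree", "Agree", "Strongly Agree", "Neutral"]

# Frequency distribution: how many trainees chose each option.
frequencies = Counter(responses)
print(frequencies["Agree"])  # 3

# Average statistic: mean score on the 1-5 scale.
mean_score = sum(SCALE[r] for r in responses) / len(responses)
print(mean_score)  # 3.75
```

The frequencies feed the figures and charts of the data-presentation step, while the mean score gives a single number that can be compared across items or training programs.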
Reporting Evaluation Findings
(16)TRAINING EVALUATION – BMHR5103
1. Who needs to know what? Primary users include:
 Training program director.
 Funding agency.
 Decision makers.

2. Evaluation report outline: a simple evaluation report should include:
 Summary.
 Program description.
 Evaluation design and methods.
 Findings and results.
 Recommendations.
 Appendices.
(17)TRAINING EVALUATION – BMHR5103
© 2012 Eyad Al-Samman