Training
12 January 2004
SYSTEMS APPROACH TO TRAINING: EVALUATION
__________________________________________________________
Summary This pamphlet provides implementing guidance, formats, and techniques
for evaluation and quality assurance programs of the U.S. Army Training
and Doctrine Command (TRADOC) as described in TRADOC Regulation
350-70, Part III.
Applicability This pamphlet applies to TRADOC activities and The Army School
System (TASS) Training Battalions responsible for managing or
performing Training Development (TD) or TD-related functions, including
evaluation/quality assurance of the training, products, and institutions that
present the training. It also applies to non-TRADOC agencies/
organizations having Memorandums of Understanding, Memorandums of
Agreement, and contracts for developing training or training products for
TRADOC and TASS agencies and organizations.
Forms The “R” forms at the back of this pamphlet are for local reproduction.
Suggested improvements The proponent for this pamphlet is the Deputy Chief of Staff for Operations and Training (DCSOPS&T). Send comments and suggested improvements on DA Form 2028 (Recommended Changes to Publications and Blank Forms) through channels to Commander, TRADOC (ATTG-CD), 5 Fenwick Road, Fort Monroe, VA 23651-1049. Suggested improvements may also be submitted using DA Form 1045 (Army Ideas for Excellence Program (AIEP) Proposal).
Paragraph Page
Chapter 1
Introduction
Purpose 1-1 4
References 1-2 4
Explanations of abbreviations and terms 1-3 4
Systems Approach to Training (SAT) overview 1-4 4
Regulation, pamphlet, and job aid (JA) relationships 1-5 5
TRADOC Pam 350-70-4
Contents (cont)
Paragraph Page
Chapter 2
Evaluation Process
Evaluation overview................................................................. 2-1 13
Evaluator’s role........................................................................ 2-2 14
Evaluation process description ................................................ 2-3 21
Chapter 3
Planning Evaluations
Planning overview ................................................................... 3-1 23
Planning an evaluation ............................................................ 3-2 23
Evaluation project management plans..................................... 3-3 28
Quality control criteria for planning .......................................... 3-4 30
Chapter 4
Collecting Evaluation Data
Data collection overview.......................................................... 4-1 31
Data collection procedures ...................................................... 4-2 31
Quality control criteria for data collection................................. 4-3 39
Chapter 5
Analyzing Evaluation Data
Evaluation analysis overview .................................................. 5-1 39
Analysis description................................................................. 5-2 39
Quality control criteria for evaluation analysis ......................... 5-3 43
Chapter 6
Preparing Evaluation Reports
Evaluation report preparation overview ................................... 6-1 43
Evaluation report descriptions ................................................. 6-2 44
Report preparation steps ......................................................... 6-3 44
Prepare draft report ................................................................. 6-4 44
Staff draft report for review/concurrence ................................. 6-5 46
Obtain final approval of recommendations .............................. 6-6 47
Distribute report/recommendations for action, and conduct
follow-up check...................................................................... 6-7 47
Quality control criteria for preparing evaluation reports ........... 6-8 48
Chapter 7
Conducting Evaluation Follow-Ups
Evaluation follow-up overview ................................................. 7-1 48
Follow-up description............................................................... 7-2 49
Quality control criteria for conducting follow-up evaluations .... 7-3 51
Chapter 8
Internal Evaluation
Internal evaluation overview .................................................... 8-1 51
Internal evaluation description ................................................. 8-2 52
Internal evaluation procedures ................................................ 8-3 54
Areas to consider/review during internal evaluations............... 8-4 56
Outputs of internal evaluations ................................................ 8-5 59
Internal evaluation issues/concerns......................................... 8-6 60
Quality control criteria for internal evaluations......................... 8-7 61
Chapter 9
External Evaluation
External evaluation overview................................................... 9-1 61
External evaluation description................................................ 9-2 62
External evaluation procedures ............................................... 9-3 63
Unit training evaluation ............................................................ 9-4 63
AUTOGEN software program .................................................. 9-5 64
Outputs of external evaluations ............................................... 9-6 66
External evaluation issues/concerns ....................................... 9-7 66
Quality control criteria for external evaluations ........................ 9-8 67
Chapter 10
Accreditation
Accreditation overview............................................................. 10-1 67
Accreditation description ......................................................... 10-2 68
Self-assessment description.................................................... 10-3 70
Quality control criteria for accreditation ................................... 10-4 71
Appendixes
A. References......................................................................... 72
B. SAT Process ...................................................................... 73
C. Job Aid Hyperlinks ............................................................. 77
Glossary ................................................................................. 78
Chapter 1
Introduction
1-1. Purpose. This pamphlet provides detailed guidance in support of
TRADOC Reg 350-70 on the following areas of the evaluation process
for TRADOC courses and courseware:
• Evaluation process.
• Internal evaluations.
• External evaluations.
• Accreditation.
Introduction a. Quality assurance (QA), quality control (QC), and evaluation are
employed throughout the SAT process to ensure quality. These
functions are not synonymous. Each has a distinct purpose within
training to achieve the ultimate goal of quality. A brief description of
quality, QA, QC, and evaluation follows.
Pamphlet organization b. Figure 1-3 shows how this pamphlet is organized. Some chapters are supported by guidance provided in other chapters; refer to each of these to accomplish the evaluation. The procedural JAs, product templates, product samples, and information papers will help in accomplishing the work.
Evaluation/QC relationship c. Evaluations are the feedback mechanism within the SAT process to complete the QC. The QC process begins with the prescribed minimum quality standards for the relevant product or process. Within the QC process, the function of the evaluation element is to gather/collect information to analyze, and then provide management with data on which to make quality judgments. Figure 1-4 illustrates the role evaluation plays within the QC process.
Figure 1-4. The role of evaluation within the QC process (a cycle from the prescribed standard to gathering data, analyzing/processing information, determining the delta, and making a quality judgment, with rework as needed; evaluation starts at data gathering and ends at the quality judgment)
Table 1-1
Project Management Plan procedures
Step Action
1. Identify project team members, and consult as needed for plan requirements.
2. Identify TD resources required to complete the development project.
3. Establish TD milestones for starting and completing the project, and intermediate milestones, as needed.
4. Determine budget requirements and provide to the budget office. Include such things as—
Table 1-2
Training Development Plan procedures
Step Action
1. Summarize all TDPMP requirements in prioritized order.
2. Determine which requirements can be accomplished based on available resources (i.e., resourced requirements).
3. Identify the impact of each unmet training requirement on unit mission and task accomplishment.
4. Finalize the TD Plan.
5. Obtain command approval of the TD Plan.
1-10. Quality control criteria. Each chapter in this pamphlet will include QC criteria for applying the evaluation process.
Chapter 2
Evaluation Process
SME role (3) The SME is the content or technical expert on the team. This
SME is, or should be, the master performer of the action/activity being
evaluated. No matter what the job—a training developer, a
combat/doctrine developer, or an instructor—the SME is involved
directly in the evaluation function. The SME is specifically responsible
for the accuracy and completeness of the technical content, and the
comprehensiveness of the content presented. The SME may perform
other duties the team leader assigns, such as data collection.
Evaluation areas (a) Evaluate quality of unit training strategies (unit long- and short-range training strategies).
(2) The right soldier was trained (i.e., all students fell within the
target audience).
Planning overview b. The first step in doing anything is proper planning. Evaluations are no exception and require thorough planning. Some routine evaluation duties, such as conducting a classroom observation, reviewing a test, or analyzing a group of student end-of-course critiques, do not require in-depth evaluation plans. Explain procedures for performing these routine duties in a local standard operating procedure or equivalent document. However, major evaluations (i.e., in-depth evaluations of school training programs, products, or processes) require an evaluation plan—the end product of the planning phase. This phase of the process is discussed in detail in chapter 3.
Data collection overview c. The data collection process involves determining what type of data is required; what data to collect (student performance, student feedback, audit trails, supervisors, graduates, etc.); where and from whom data is collected (source); how much; and how data is collected (method/technique/instrument). Next, structure and develop the data collection method/instrument. The final step is to administer the instrument/technique to collect the data. This phase of the process is discussed in detail in chapter 4.
Reporting findings and recommendations overview e. After evaluation data is collected and analyzed, the next step is to identify major findings and recommendations. Once identified, write a report to include:

(1) References.

(2) Background/problem.
Chapter 3
Planning Evaluations
3-1. Planning overview. This chapter addresses the planning phase of the evaluation process. It will assist evaluators in planning an evaluation and creating Master Evaluation Plans (MEPs) and Evaluation Project Management Plans. Add or delete actions as the specific evaluation requires. Evaluations that are not well planned or managed will result in the collection of information that cannot be trusted (i.e., invalid and of no use). The Quality Assurance Office (QAO) is responsible for the development of MEPs and Evaluation Project Management Plans. However, involvement from other directorates and departments is essential in their development.
• Helps ensure there is a valid need for doing the evaluation (i.e.,
reduces the chance of doing an evaluation that is not needed).
• Focuses the intent of the evaluation and prevents the evaluation
from getting “off track”.
• Forces the evaluator to “think through” the entire evaluation and
plan for all actions required.
• Identifies and optimizes the use of the limited resources available
for doing the evaluation.
• Ensures that everyone involved in the evaluation is informed of,
and knows, their responsibilities.
• Identifies and prioritizes initiatives.
(2) The MEP is the planning document that provides all evaluation requirements for the next fiscal year (FY), and projections for the following 3-5 years. Evaluation requirements outlined in Evaluation Project Management Plans are included in the MEP. The Combined Arms Center (CAC) and Army Accessions Command (AAC) QAOs and each TRADOC center/school will prepare and provide the MEP to HQ TRADOC by September for the upcoming FY. Centers/schools should send a courtesy copy of their MEP to CAC and AAC QAOs. These plans are mandatory per TRADOC Reg 350-70. Job Aid 350-70-4.3c provides a format for a MEP.
Planning procedural steps b. There are six major steps involved in planning an evaluation (see table 3-1). The training developer will perform these steps to successfully plan an evaluation. An Evaluation Project Management Plan is the outcome of this procedure.
Table 3-1
Steps for planning an evaluation
No. Step
1. Determine what areas need evaluating.
2. Define the purpose of the evaluation.
3. Determine the scope of the evaluation and available resources.
4. Collect and research information pertinent to the evaluation (feedback and training documentation, POI, STP, TSP, critical task list).
5. Develop and coordinate the initial Evaluation Project Management Plan.
6. Develop and coordinate the final Evaluation Project Management Plan.
Note: The evaluation plan (steps 5 and 6) is a by-product of planning an evaluation.
Table 3-2
Actions for determining what to evaluate
No. Action
1. List all appropriate proponent areas that could require an evaluation each year (i.e., courses, TD organizations, instructors, processes, graduates/graduates' supervisors, etc.).
2. Identify from this list those areas that have not been evaluated within 2-3 years, and areas that, through feedback, have indicated a potential problem.
3. Obtain any written documentation that may have been initiated, or will be initiated.
4. Prioritize evaluation initiatives based on personnel and resource availability per year (i.e., number of people, time to complete the evaluation, and dollars required to accomplish the job).
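The selection rule in actions 1 and 2 above can be sketched in Python. This is a minimal illustration; the area names, dates, and the choice of a 3-year cutoff within the pamphlet's 2-3 year window are assumptions, not data from the pamphlet:

```python
from datetime import date

# Hypothetical proponent areas, each with the date it was last evaluated
# and whether feedback has flagged a potential problem (illustrative data).
areas = [
    {"name": "Course A", "last_evaluated": date(2000, 6, 1), "flagged": False},
    {"name": "Course B", "last_evaluated": date(2003, 9, 1), "flagged": True},
    {"name": "TD process C", "last_evaluated": date(2003, 3, 1), "flagged": False},
]

def needs_evaluation(area, today, years=3):
    """Action 2: select areas not evaluated within the cutoff window,
    or areas that feedback has flagged as a potential problem."""
    age_days = (today - area["last_evaluated"]).days
    return area["flagged"] or age_days > years * 365

today = date(2004, 1, 12)  # the pamphlet's publication date
candidates = [a["name"] for a in areas if needs_evaluation(a, today)]
print(candidates)
```

Action 4 (prioritizing by personnel, time, and dollars) would then rank this candidate list against available resources.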
Determine evaluation purpose e. Clarify and delineate the purpose of each evaluation conducted. This provides justification for expending funds and allocating time to do the work. However, keep in mind that it is not always necessary to have preset goals and objectives for evaluations; they may limit or constrain the evaluation. Table 3-3 summarizes the actions to address when answering why the evaluation is being performed; JA 350-70-4.3b (Step 2) addresses each action in detail.
Table 3-3
Actions for determining the purpose of the evaluation
No. Action
1. If the evaluation request was initiated outside of the organization, obtain appropriate interpretation of the reason/problem/issue/concern.
2. Discuss concerns with other evaluators.
3. Develop an initial interpretation of the evaluation requirements.
4. Develop an initial interpretation of the impact of not solving any deficiencies or discrepancies.
5. Develop a list of questions and/or issues that need answering as a result of the evaluation.
6. Get senior leader sponsorship/approval. Make initial recommendations to continue or discontinue the evaluation. If the evaluation will not continue, write a memorandum on why it stopped. If it will continue, write a memorandum requesting POCs.
7. List the evaluation POCs and other POCs.
Table 3-4
Actions to research and collect pertinent information
No. Action
1. Initiate a literature search, if required.
2. Identify the criteria the evaluation is based on, to ensure goals/objectives are met.
3. Begin to collect and review training feedback.
4. Begin to collect and review pertinent education/training documents, e.g., regulations, pamphlets, products, etc.
(3) If there are other related training issues that need evaluating
as part of the evaluation.
(2) Are there written comments from the field commanders? Who were the field commanders that made the comments, and who recorded the commanders' comments?
Elements of evaluation plans b. There are specific critical elements that all Evaluation Project Management Plans must include. Each basic question listed above has associated elements to address for a successful evaluation. Each essential element is discussed in JA 350-70-4.3b (Step 5). The evaluation plan elements are outlined in general terms in table 3-5.
Table 3-5
Evaluation plan elements
No. Element
1. Why is the evaluation being conducted?
   a. Reason/problem/issue/concern.
   b. Impact.
2. What will the evaluation accomplish?
   a. Limitations.
   b. Assumptions.
   c. Essential elements of analysis.
   d. Objectives.
   e. Purpose.
3. How is the evaluation performed?
   a. Data collection/analysis methodology.
      (1) Identify sources of relevant data and the general approach to data collection, reduction, and analysis.
      (2) Use automation support.
      (3) Use computer scan sheets when possible for collecting and inputting data into a computer.
   b. Data reporting/follow-up methodology.
   c. Identify the criteria the evaluation is based on to ensure goals/objectives are being met.
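The elements in table 3-5 could be captured in a simple structure when drafting a plan. This is a minimal sketch only; the field names and sample values are illustrative assumptions, not a format the pamphlet prescribes:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationPlan:
    # 1. Why is the evaluation being conducted?
    reason: str
    impact: str
    # 2. What will the evaluation accomplish?
    purpose: str
    objectives: List[str] = field(default_factory=list)
    limitations: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)
    # 3. How is the evaluation performed?
    data_collection_methods: List[str] = field(default_factory=list)
    reporting_methodology: str = ""
    criteria: List[str] = field(default_factory=list)

# Hypothetical example values for illustration only.
plan = EvaluationPlan(
    reason="Field feedback indicates graduates cannot perform task X",
    impact="Degraded unit mission accomplishment",
    purpose="Determine whether course content matches critical tasks",
    objectives=["Compare POI tasks to the critical task list"],
    data_collection_methods=["survey", "graduate interviews"],
)
print(plan.purpose)
```

A structure like this makes it easy to check that every element of table 3-5 is addressed before the plan is staffed.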
(1) Purpose.
(2) Background.
(3) Scope.
(4) Objectives.
(6) Methodology.
(9) Timelines.
Chapter 4
Collecting Evaluation Data
4-1. Data collection overview. This chapter will discuss the data collection
phase of the evaluation process, to include selecting, using, and
defending appropriate methods of collecting data. It will provide
guidance on designing and using surveys and interviews, as well as the
use of observations.
4-2. Data collection procedures. During the planning phase of an evaluation, identify the information required to address the overall evaluation effort, as well as how to collect the required information. Data collection is the process of gathering, collating, and preparing data for processing (analysis) to obtain the desired results. To determine the worth of any training, training product, or process, collect data to analyze. Relevant data should come from several sources, with more than one method used to collect data. General data collection sources and methods are document reviews, individual interviews, group interviews, surveys, tests or time trials, and personal observations. The intent is to collect sufficient raw data to ensure a successful analysis. The techniques or instruments used will depend on the type of data required. See table 4-1 for general procedural guidance for collecting data.
Technical approach to data collection c. There are two technical approaches to collecting data—quantitative or qualitative.

(1) Quantitative data indicates an amount (how much or how many) and is measured on a numerical scale. This method is used most frequently. A quantitative approach is:

(a) Objective.

(b) Reliable.
Table 4-2
Steps for determining completed survey requirements
No. Step
1. Determine the target audience.
2. Estimate how many individuals are in the target audience. If the population size is 200 or less, survey the entire population.
3. Determine the confidence level required to ensure results are representative. The JA recommends the 95 percent confidence level, a level commonly used.
4. Determine the estimated rate of usable questionnaires to use.
5. Determine the number of questionnaires to distribute.
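The arithmetic behind steps 2-5 can be sketched with a standard sample-size calculation (Cochran's formula with a finite population correction) at the 95 percent confidence level. The formula choice, the 5 percent margin of error, and the 60 percent usable-return rate are illustrative assumptions; only the whole-population rule for 200 or fewer comes from the table above:

```python
import math

def required_sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Estimate completed surveys needed using Cochran's formula with a
    finite population correction. Parameter values are assumptions."""
    if population <= 200:
        # Step 2: survey the entire population when it is 200 or less.
        return population
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    n = n0 / (1 + (n0 - 1) / population)  # finite population correction
    return math.ceil(n)

def questionnaires_to_distribute(needed, usable_rate):
    """Step 5: scale up for the estimated rate of usable returns (step 4)."""
    return math.ceil(needed / usable_rate)

needed = required_sample_size(population=1000)
print(needed)                                      # completed surveys required
print(questionnaires_to_distribute(needed, 0.60))  # questionnaires to distribute
```

For a hypothetical target audience of 1,000, this yields 278 completed surveys, and 464 questionnaires to distribute at a 60 percent usable-return rate.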
Raw data collected j. The end product of the collection phase is raw information. This raw data is in the form of completed questionnaires, interview guides, observation checklists, and other completed data collection instruments. Analyze this raw data.
Chapter 5
Analyzing Evaluation Data
5-1. Evaluation analysis overview. This chapter discusses the conduct of
the data analysis phase of the evaluation process. This will assist
evaluators with analyzing findings to identify trends and systemic
problems. This chapter includes reviewing, summarizing, and analyzing
raw data, as well as interpreting the results of the analysis.
Analysis procedural steps a. Table 5-1 provides the steps a training developer should perform when conducting any analysis.
Table 5-1
Steps for analyzing data
Step Action
1. Review the raw data for integrity (ensure data is reliable).
2. Prepare data for analysis. Summarize the data into some form of table, to avoid searching through individual questionnaires or interview guides. This ensures every reply is counted.
3. Analyze the data by converting the raw figures into percentages, proportions, averages, or some other quantitative form that is more easily understood. The choice of statistical description will depend on the purpose of the data (which is determined during the planning phase of the evaluation).
4. Interpret the analysis result.
The end product of the analysis is a series of initial findings that are addressed in the Evaluation Report.
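Steps 2 and 3 above can be sketched for a small batch of survey replies. The response scale and the replies themselves are illustrative assumptions:

```python
from collections import Counter

# Hypothetical raw survey replies, one answer per respondent (illustrative data).
replies = ["agree", "agree", "neutral", "disagree", "agree", "neutral"]

# Step 2: summarize the data into a table so every reply is counted.
counts = Counter(replies)

# Step 3: convert the raw figures into percentages for easier interpretation.
total = len(replies)
percentages = {answer: round(100 * n / total, 1) for answer, n in counts.items()}
print(percentages)
```

The resulting percentage table is the kind of quantitative summary step 3 calls for; an automated statistical program would do the same at scale.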
Types of data analysis methods b. Evaluators may analyze data in various ways. The data is either qualitative (expressed in narratives or words) or quantitative (expressed in numbers). The type of data collected will depend on the objectives of the evaluation. These are determined when planning the evaluation, and ultimately when developing an evaluation plan.
Review data c. Ensure data is valid and reliable. Triangulation (using multiple
methods to study the same thing) can corroborate evidence, and
increase validity, especially for qualitative findings. Examples of events
that could result in invalid and unreliable data:
Summarize data f. When data is summarized, simply condense the data for analysis.

(1) The easiest and most accurate method for summarizing large amounts of quantitative data is to use an automated statistical program.
Analyze data g. Before analyzing any data, ensure all quantitative data is entered into a computer, and all qualitative data is summarized and condensed into categories. Keep in mind exactly why the data is being analyzed (i.e., identify what specific questions need answering).
Interpret the analysis results i. Interpret the analysis in common-sense terms, and be able to explain the results. Interpretation of the analysis is one of the most difficult steps in this phase of evaluation. Keep in mind the purpose of the evaluation as data results are interpreted. Annotate all trends identified, and include them in the final report. Qualitative data is often considered less objective than quantitative data, but can provide very useful information if procedures are followed, especially when looking at themes and relationships at the case level. Quantitative data, though more scientific, requires statistical manipulation to represent findings.
Chapter 6
Preparing Evaluation Reports
Table 6-1
Steps for preparing an evaluation report
No. Steps
1. Prepare the draft report, to include findings, conclusions, and recommendations. Develop findings, conclusions, and/or recommendations based on how they relate to the objective of the evaluation.
2. Staff the draft report for review/concurrence. Staff internally, and with involved organizations. Obtain review and concurrence of recommendations.
3. Revise the draft report into a final report.
4. Obtain final approval of recommendations. Provide to the approving authority (i.e., decisionmaker) for final approval of the final report and recommendations.
5. Distribute the report/recommendations for action. Distribute the final report with approval authority documentation to all organizations with implementing requirements.
6-4. Prepare draft report. The evaluation report is a vital document that
summarizes the results of the evaluation (i.e., findings, conclusions,
and recommendations). From the findings, develop conclusions, and
for each identified problem area, provide specific recommendations for
management to consider. There are different ways of reporting
information, depending on how the report is used, the target audience,
and the impact that the evaluation will have. Be as concise and simple
as possible, while ensuring all required information is included.
(a) Background.
(c) Methods.
a. The procedures for staffing a draft report should begin within the organization and then extend to involved organizations. The steps are as follows:
Draft report a. Send a memorandum to the decisionmaker, along with the report
and copies of the responses received as a result of the staffing from the
Staffing Draft Report step. Job Aid 350-70-4.6c provides a sample
format for the memorandum sent to the decisionmaker.
(1) Findings.
(2) Conclusions.
(3) Recommendations.
Chapter 7
Conducting Evaluation Follow-Ups
Follow-up steps a. Though the decisionmaker has signed off on the evaluation and approved the recommendations, there is still no guarantee that those responsible will implement the recommendations. Conduct a follow-up to ensure implementation. See table 7-1 for procedural steps for conducting a follow-up.
Table 7-1
Steps for conducting a follow-up
No. Steps
1. Input action milestones into a tracking system. Collect responses from organizations responsible for implementing recommendations, and input milestones into a system for tracking actions taken.
2. Conduct the follow-up to ensure actions have been taken to implement recommendations.
3. Prepare and write a follow-up report.
4. Staff the follow-up report.
Input action milestones into tracking system b. During the reporting phase, determine actions the responsible organizations have taken so far, and their plans for future actions. To begin the follow-up, update information from these organizations to determine the status of actions taken and planned. Put these actions into a tracking system.
(3) Has the projected date for the follow-up. Projected date will
depend on the milestones established for the actions, and on local
policies and procedures for performing follow-ups.
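The tracking described above can be sketched minimally in Python. The recommendations, milestone dates, and the 30-day scheduling offset after the last milestone are illustrative assumptions standing in for local policy:

```python
from datetime import date, timedelta

# Hypothetical approved recommendations with their action milestones.
actions = [
    {"recommendation": "Revise lesson plan", "milestone": date(2004, 3, 1), "done": False},
    {"recommendation": "Update test item bank", "milestone": date(2004, 4, 15), "done": False},
]

def projected_follow_up(actions, grace_days=30):
    """Project the follow-up date from the latest action milestone
    (the offset stands in for local policy)."""
    return max(a["milestone"] for a in actions) + timedelta(days=grace_days)

def open_actions(actions, as_of):
    """List actions past their milestone that remain unimplemented."""
    return [a["recommendation"] for a in actions
            if not a["done"] and a["milestone"] <= as_of]

print(projected_follow_up(actions))            # date to conduct the follow-up
print(open_actions(actions, date(2004, 4, 1)))  # overdue, unimplemented actions
```

Marking each action done as organizations report completion gives the status picture the follow-up report summarizes.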
Conduct the follow-up d. The purpose of the follow-up is to ensure that those responsible have implemented the approved recommendations. How a follow-up is conducted depends on the actions themselves, resources available, and local policy for follow-ups. The primary goal of a follow-up is to ensure the organization has implemented the actions. Follow-up measures may include:
(2) Be brief.
Staff follow-up report i. Staff the follow-up report to inform everyone of the follow-up results, and to document the follow-up. Staff the report to:
Chapter 8
Internal Evaluation
(4) Unit.
(5) Home.
Table 8-1
Steps for conducting an internal evaluation
No. Steps
1. Prepare Internal Evaluation Project Management Plans.
2. Establish feedback channels.
3. Prepare/modify checklists to evaluate products and processes.
4. Observe training and testing events.
5. Prepare and administer student, instructor, and training manager questionnaires.
6. Collect and analyze data or information (see chapter 4).
7. Prepare and staff draft evaluation reports.
8. Distribute final evaluation reports.
9. Monitor compliance with recommendations.
10. Establish responsibilities for QA and QC.
Other areas to review f. Several methods to use for collecting internal evaluation data are described below.
(a) The instructors follow the lesson plan, and teach to the
standard.
c. Needs assessment.
(2) The POI is different in some ways from the course being
implemented.
(5) Training materials are not correlated with the test and
measurement instruments, the terminal or enabling objectives, or the
instructional content identified in the task and learning analyses.
Chapter 9
External Evaluation
Where conducted b. External evaluations gather data from the field to assess soldiers' on-the-job performance. A common misconception is that external evaluations are anything conducted outside of the proponent schoolhouse. This is not true. External evaluations are conducted on soldiers and/or supervisors after the individual has graduated from a course and is performing their job/duty in the unit.
Table 9-1
Steps for conducting an external evaluation
Steps Actions
1. Prepare external Evaluation Project Management Plans.
2. Obtain senior leadership approval/sponsorship. Establish feedback channels.
3. Prepare visitation plans for observations and/or interviews.
4. Prepare statements of work for contracted studies.
5. Prepare survey instrument.
6. Administer survey.
7. Collect data/information.
8. Analyze data/information.
9. Prepare evaluation reports.
10. Distribute evaluation reports.
11. Monitor compliance with recommendations.
12. Establish responsibilities for QA and QC.
9-4. Unit training evaluation. In the context of this pamphlet, when unit
training evaluation is discussed, the focus of TRADOC proponents is
on the training (and doctrinal) materials provided to support training in
units; the purpose is NOT to evaluate the unit. The assessment of unit
training and proficiency is, and must remain, a unit responsibility.
However, TRADOC must work in close cooperation with the unit to
provide the products (MTPs, STPs) necessary to support unit
evaluation efforts. Evaluation of the education/training products used in
the unit is critical to determine the effectiveness of collective and
individual task performance and products. The intent of a
training/education evaluation within the unit is to improve training and
task performance proficiency.
ARI server c. The Army Research Institute has made available a server for
administering AUTOGEN-generated job analysis and external
evaluation surveys. This user-friendly site requires minimal interface
with ARI to load surveys and download the survey answer file (data). It
is anticipated that this capability will provide easier access to AC, as
well as RC, soldiers, increasing their participation. Distribute surveys
using either the ARI server, or a center/school server. However, if the
ARI server is used to distribute an AUTOGEN-generated survey, the
Survey Control Number is automatically generated and provided to
ARI. The ARI server will automatically maintain (back up daily) the
survey data and supporting files.
c. Needs assessment.
Chapter 10
Accreditation
Accreditation report f. There are several accreditation reports, either received from other organizations or reports the team is responsible for writing. These formats include:
(3) Raise higher headquarters issues (HHI) that are beyond the
scope of the training institution to the appropriate level.
Appendix A
References
Section I
Required Publication
TRADOC Reg 350-70
Systems Approach to Training Management, Processes, and Products
Section II
Related Publications
AR 5-5
Army Studies and Analysis
AR 10-87
Major Army Commands in the Continental United States
AR 25-55
The DA Freedom of Information Act Program
AR 200-1
Environmental Protection and Enhancement
AR 310-25
Dictionary of United States Army Terms
AR 310-50
Authorized Abbreviations, Brevity Codes, and Acronyms
AR 340-21
The Army Privacy Program
AR 350-1
Army Training
MIL-HDBK 29612-1A
Guidance for Acquisition of Training Data Products and Services
Note: This is the first of a 4-part DOD Handbook that supports MIL-PRF-29612B
(Training Data Products) and its associated DIDs.
MIL-HDBK 29612-2A
Instructional Systems Development/Systems Approach to Training and Education
MIL-HDBK 29612-4A
Glossary for Training
Section III
Prescribed Forms
Appendix B
SAT Process
B-1. The SAT process involves five training-related phases: evaluation, analysis,
design, development, and implementation. Each phase, and each product developed,
has "minimum essential requirements" to meet. Table B-1 describes the five phases of
SAT and the specific chapters in TRADOC Reg 350-70 that address each phase.
Table B-1
SAT Process
Phases Description
1. Evaluation a. Evaluation, which is the focus of this pamphlet, provides the
means to determine how well the training takes place, Army
personnel/units perform, and products support training. Use the
process to—
2. Analysis a. Provides the means to determine the need for training, those
that get the training, and what critical tasks (collective and
individual (including leader) tasks) and supporting skills and
knowledge are trained.
3. Design a. Provides the means to establish when, where, and how the
education/training is presented. Use the process to—
B-2. The SAT process can be viewed from various perspectives. Two such graphical
representations are the integration of the SAT phases (fig B-1) and the SAT pyramid
(fig B-2).
[Figure B-1. Integration of the SAT phases: Analysis, Design, Development,
Implementation, and Evaluation, bounded by External Evaluation.]
B-3. The SAT pyramid (fig B-2) shows how each phase of the SAT model builds upon
preceding phases. While the phases build upon each other, remember this is not
necessarily a linear process. Following all phases in order is not required; enter each
phase individually, as needed for revisions. The process is a continuous series of
analysis, design, development, implementation, and evaluation, with resultant revision
during any phase to maintain product currency and efficiency.
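The nonlinear, continuous nature of this cycle can be sketched in code. This is an illustration of the idea only, not a TRADOC tool; the phase names come from Table B-1, and the notion of a fixed cyclic order is a simplification of the model.

```python
# Illustrative sketch of the SAT cycle described above: the five phases
# form a loop that may be entered at any phase when a revision is needed.
SAT_PHASES = ["analysis", "design", "development", "implementation", "evaluation"]

def revision_path(entry_phase, steps):
    """List the phases visited when entering the cycle at entry_phase."""
    start = SAT_PHASES.index(entry_phase)
    return [SAT_PHASES[(start + k) % len(SAT_PHASES)] for k in range(steps)]

# A revision can begin at design and continue around the cycle:
print(revision_path("design", 3))  # ['design', 'development', 'implementation']
```

The modulo arithmetic captures the point of paragraph B-3: the cycle has no fixed entry point, and evaluation wraps back into analysis as revisions continue.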
[Figure B-2. The SAT pyramid: Analysis, Design, and Development build upon one
another, with Evaluation at every level, within DOTMLPF and resource constraints.]
B-4. TRADOC’s most significant products are efficiently and effectively trained soldiers,
leaders, and units that can perform their mission. For a complete description of the SAT
process and a flowchart of the linkages between the various SAT products, see
TRADOC Reg 350-70, Executive Summary. Flowcharts of the various TD processes
are in TRADOC Reg 350-70, appendix G.
______________________________________________________________________
Appendix C
Job Aid Hyperlinks
Number Title
______________________________________________________________________
Glossary
Section I
Abbreviations
AC Active Component
DL Distributed Learning
FY fiscal year
HQ headquarters
JA Job Aid
QA quality assurance
QC quality control
RC Reserve Component
TD training development
Section II
Terms
accreditation
The recognition afforded an educational/training institution when it has met accepted
standards of quality applied by an accepted accreditation authority.
external evaluation
The evaluation process that provides the means to determine if the training received
meets the needs of the operational Army. This evaluation ensures the system
continues to effectively and cost-efficiently produce graduates who meet established job
performance requirements.
internal evaluation
Assessment of whether the training and training development objectives were met.
Internal evaluations also verify the effective use of the SAT process to meet minimum
essential analysis, design, development, implementation, and evaluation requirements.
//signed//
JANE F. MALISZEWSKI
Colonel, GS
Chief Information Officer
Observation Worksheet
(For use of this form, see TRADOC Pam 350-70-4; the proponent is DCSOPS&T)
14. TLO/ELOs Match POI? YES NO If "NO", mandatory comments and recommendations:
16. Does doctrine reflect COE? YES NO If "NO", mandatory comments and recommendations:
17. LP task on Critical Task List? YES NO If "NO", mandatory comments and recommendations:
19. LP time/MOI on TMA sheet? YES NO If "NO", mandatory comments and recommendations:
PERFORMANCE RATING: GO / NO GO
NO GO - Less than 75% of the evaluated items were rated "Go", or waiver(s) not available.
i. Medevac Plan.
B. Classroom Preparation (Go / No Go / NA / Comments)
1. Lesson plan current, DOTD and DOT
approved, and IAW POI.
2. Classroom had adequate lighting, neat,
orderly, free from noise and interruptions.
Seating arrangement appropriate. Class
prepared prior to training.
C. Introduction (Go / No Go / NA / Comments)
1. Used a motivational statement that
explains the relevance and importance of
the task.
2. Displayed and clearly stated the
Learning Objectives (Action, Condition,
Standard), and briefly outlined the
sequence of the lesson.
3. Stated the Risk Assessment Level,
warnings, safety hazards, and the
environmental considerations.
4. Explained how the objective would be
tested.
D. Demonstration Techniques (Go / No Go / NA / Comments)
1. Ensured students could see all parts of
demonstration.
2. Steps were properly demonstrated.
3. Students were involved in
demonstration, if appropriate.
4. Assisted students as needed.
5. Gave on-the-spot corrections and
praise.
HQ TRADOC Form 350-70-4-1-R-E (Jul 03) Page 3 of 6
Section III - Instructor Checklist (cont)
E. Hands-on Training Method (Go / No Go / NA / Comments)
1. Summarized points covered during the
demonstration.
2. Gave detailed directions before the
practical exercise.
3. Ensured students performed the
practical exercise correctly.
4. Provided timely feedback.
5. Encouraged group members to
participate.
6. Conducted an after action review with
the students after practical exercise.
F. Communications Skills (Go / No Go / NA / Comments)
1. Used correct enunciation and grammar.
2. Did not excessively use distracting
mannerisms such as "Ah", "OK" and "You
know".
3. Instructor's voice quality, volume, and
variations (pitch, rate, and inflection) were
adequate.
G. Question/Answer Techniques (Go / No Go / NA / Comments)
1. Questions were phrased clearly and to
the point (ask, pause, call, respond,
evaluate).
2. Questions were appropriate for the
lesson.
3. Covered all key points with questions.
4. Student's questions were answered
adequately.
H. Presentation Skills (Go / No Go / NA / Comments)
1. Made eye contact with all students.
2. Movement and gestures were natural
and appropriate.
3. Instructor was poised and enthusiastic.
I. Use of Training Aids/Materials (Go / No Go / NA / Comments)
1. Training aids, instructional materials,
equipment listed in POI were used
appropriately.
2. Whiteboard and/or other visual aids
were used in an effective manner.
J. Classroom Management (Go / No Go / NA / Comments)
1. Maintained proper control of the class.
2. Used appropriate techniques to assist
and motivate students.
3. Managed time appropriately; lesson
was well paced.
4. Encouraged student participation.
K. Test Management (Go / No Go / NA / Comments)
1. Maintained accountability of tests.
2. Complied with Test Administration
Guide (TAG).
3. Test matched method of training.
4. Test evaluated what was trained.
5. Conducted AAR with students.
L. Instructor Preparation (Go / No Go / NA / Comments)
1. Demonstrated knowledge of class
material.
2. Explained key performance points.
3. Followed the sequence as outlined in
the lesson plan.
4. Covered all objectives.
5. Used smooth transitions.
6. Put training activity into job context at
least once.
7. Ensured all students could see and
hear all instruction.
8. Properly used internal summaries.
9. Properly conducted lesson summary
(see 9a - 9d below).
9a. Restated action.
9b. Restated main learning steps.
9c. Checked on learning.
9d. Provided closing summary.
M. Personal Qualities (Go / No Go / NA / Comments)
1. Instructor's professionalism set the
proper example for bearing, behavior, and
appearance.
2. Showed respect to students.
3. Established a positive rapport with
students.
Part III - AAR with Instructor
GO - At least 75% of the evaluated items (Part II) were rated "Go".
NO GO - Less than 75% of the evaluated items were rated "Go". Command emphasis needed.
PERFORMANCE RATING: GO / NO GO
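The 75-percent rule above can be expressed as a short calculation. This sketch is illustrative only; the list-of-strings item representation and the treatment of an all-NA checklist are assumptions for the example, not guidance from the form.

```python
# Sketch of the rating rule: NA items are excluded from the count, and the
# overall rating is GO when at least 75% of the counted items are "Go".
def performance_rating(items):
    """Return "GO" or "NO GO" per the 75-percent rule; NA items not counted."""
    counted = [i for i in items if i != "NA"]
    if not counted:               # all items NA: assumed NO GO for this sketch
        return "NO GO"
    go_share = sum(1 for i in counted if i == "Go") / len(counted)
    return "GO" if go_share >= 0.75 else "NO GO"

# Example: 3 of 4 counted items (exactly 75%) rated "Go" yields GO.
print(performance_rating(["Go", "Go", "Go", "No Go", "NA"]))  # prints GO
```

Note that the boundary case matters: exactly 75% "Go" still rates GO, matching the "at least 75%" wording in the legend.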
Part V - Backbrief
Acknowledgement of Evaluation
Person briefed: Position: Date:
PART II - Ratings
NOTE: Overall performance is derived from the evaluation in Sections I, II, and III. Items marked "Not
Applicable" are not counted when computing the overall performance rating.
Administrative Data
1. Organization being evaluated:
Name:
Location/address:
Type of Training (Check one): Initial Military Training - BCT / OSUT / AIT / WOCS / OCS
Reporting Focus - Area Evaluated (Check one): Conduct of Training
Recommendation (Check one): Professional Accreditation / Conditional Accreditation / Full Accreditation
Remarks:
TRAINING SUPPORT
PROPONENT FUNCTIONS