
Department of the Army TRADOC Pamphlet 350-70-4

Headquarters, United States Army
Training and Doctrine Command
Fort Monroe, Virginia 23651-1047

12 January 2004

Training
SYSTEMS APPROACH TO TRAINING: EVALUATION
__________________________________________________________
Summary This pamphlet provides implementing guidance, formats, and techniques
for evaluation and quality assurance programs of the U.S. Army Training
and Doctrine Command (TRADOC) as described in TRADOC Regulation
350-70, Part III.

Applicability This pamphlet applies to TRADOC activities and The Army School
System (TASS) Training Battalions responsible for managing or
performing Training Development (TD) or TD-related functions, including
evaluation/quality assurance of the training, products, and institutions that
present the training. It also applies to non-TRADOC agencies/
organizations having Memorandums of Understanding, Memorandums of
Agreement, and contracts for developing training or training products for
TRADOC and TASS agencies and organizations.

Forms The “R” forms at the back of this pamphlet are for local reproduction.

Suggested improvements  The proponent for this pamphlet is the Deputy Chief of Staff for Operations and Training (DCSOPS&T). Send comments and suggested improvements on DA Form 2028 (Recommended Changes to Publications and Blank Forms) through channels to Commander, TRADOC (ATTG-CD), 5 Fenwick Road, Fort Monroe, VA 23651-1049. Suggested improvements may also be submitted using DA Form 1045 (Army Ideas for Excellence Program (AIEP) Proposal).

Availability  This publication is distributed solely through the TRADOC homepage at http://www.tradoc.army.mil.
__________________________________________________________
Contents

Paragraph

Chapter 1
Introduction
Purpose 1-1
References 1-2
Explanations of abbreviations and terms 1-3
Systems Approach to Training (SAT) overview 1-4
Regulation, pamphlet, and job aid (JA) relationships 1-5
Overview of quality assurance, quality control, and evaluation 1-6
Types of evaluations 1-7
Overview of training development management/planning 1-8
Electronic support 1-9
Quality control criteria 1-10

Chapter 2
Evaluation Process
Evaluation overview 2-1
Evaluator's role 2-2
Evaluation process description 2-3

Chapter 3
Planning Evaluations
Planning overview 3-1
Planning an evaluation 3-2
Evaluation project management plans 3-3
Quality control criteria for planning 3-4

Chapter 4
Collecting Evaluation Data
Data collection overview 4-1
Data collection procedures 4-2
Quality control criteria for data collection 4-3

Chapter 5
Analyzing Evaluation Data
Evaluation analysis overview 5-1
Analysis description 5-2
Quality control criteria for evaluation analysis 5-3

Chapter 6
Preparing Evaluation Reports
Evaluation report preparation overview 6-1
Evaluation report descriptions 6-2
Report preparation steps 6-3
Prepare draft report 6-4
Staff draft report for review/concurrence 6-5
Obtain final approval of recommendations 6-6
Distribute report/recommendations for action, and conduct follow-up check 6-7
Quality control criteria for preparing evaluation reports 6-8

Chapter 7
Conducting Evaluation Follow-Ups
Evaluation follow-up overview 7-1
Follow-up description 7-2
Quality control criteria for conducting follow-up evaluations 7-3

Chapter 8
Internal Evaluation
Internal evaluation overview 8-1
Internal evaluation description 8-2
Internal evaluation procedures 8-3
Areas to consider/review during internal evaluations 8-4
Outputs of internal evaluations 8-5
Internal evaluation issues/concerns 8-6
Quality control criteria for internal evaluations 8-7

Chapter 9
External Evaluation
External evaluation overview 9-1
External evaluation description 9-2
External evaluation procedures 9-3
Unit training evaluation 9-4
AUTOGEN software program 9-5
Outputs of external evaluations 9-6
External evaluation issues/concerns 9-7
Quality control criteria for external evaluations 9-8

Chapter 10
Accreditation
Accreditation overview 10-1
Accreditation description 10-2
Self-assessment description 10-3
Quality control criteria for accreditation 10-4

Appendixes
A. References
B. SAT Process
C. Job Aid Hyperlinks

Glossary


Chapter 1
Introduction
1-1. Purpose. This pamphlet provides detailed guidance in support of
TRADOC Reg 350-70 on the following areas of the evaluation process
for TRADOC courses and courseware:
• Evaluation process.
• Internal evaluations.
• External evaluations.
• Accreditation.

1-2. References. The references for this pamphlet appear in appendix A.

1-3. Explanations of abbreviations and terms. Abbreviations and terms appear in the glossary of this publication.

1-4. Systems Approach to Training (SAT) overview.

a. It is important to have an understanding of the Army's peacetime mission, and the Systems Approach to Training (SAT), before addressing the specifics of this pamphlet. See appendix B for a detailed explanation of the SAT process and each of its five phases.

b. The Army's peacetime mission is to prepare the Army, i.e., Active Component (AC), Reserve Component (RC), and National Guard components, to perform (fight, win, and survive) across the entire spectrum of military operations. Soldiers and units must be trained to perform their mission to standard, and survive.

c. Education/training development (TD) is a vital component of the mission to prepare the Army for war. The Army's TD process is subsumed in the SAT process. The goal of the SAT process is to support the Army's mission by providing mission-focused, task-based, hard- and soft-skill education/training. This training must be rigorous; relevant to the units, soldiers, and leaders trained; and conducive to safety and environmental protection.

d. The SAT is a systematic, spiral approach to making collective, individual, and self-development education/training decisions for the Army. It is used to determine whether training is needed; what is trained; who gets the training; how, how well, and where the training is presented; and the training support/resources required to produce, distribute, implement, and evaluate the required education/training products.


1-5. Regulation, pamphlet, and job aid (JA) relationships.

a. This pamphlet supports and provides procedural guidance for the policy established in TRADOC Reg 350-70. The regulation directs the use of this pamphlet in the planning and conduct of evaluations. Job aids, product templates, product samples, information papers, and other supporting documents/products support this pamphlet. Print the pamphlet and JAs as individual files, or as a single document.

Supporting JAs  b. Figure 1-1 provides a list of JAs referenced in this pamphlet. Appendix C provides a consolidated list of JAs with hyperlinks. Figure 1-2 depicts the relationship of this pamphlet and supporting documents/products with TRADOC Reg 350-70.

Figure 1-1. List of JAs


1-6. Overview of quality assurance, quality control, and evaluation. Quality training and training products result in soldiers who can perform and survive in the full spectrum of operations. This requires timely training and training products that conform to established standards and meet identified requirements, i.e., are efficient and effective. The SAT is the management control system the Army uses to produce quality education/training, and training products that meet the needs of the Army.

Introduction a. Quality assurance (QA), quality control (QC), and evaluation are
employed throughout the SAT process to ensure quality. These
functions are not synonymous. Each has a distinct purpose within
training to achieve the ultimate goal of quality. A brief description of
quality, QA, QC, and evaluation follows.

Quality  (1) Quality is the timeliness, accuracy, and conformance to specified standards for products, processes, and/or programs. Quality is engineered into a process; it is not an attribute that is added later. It is the goal at all stages of development, and is achieved through continuous evaluation during each phase of the SAT process. Built-in checks in each phase ensure the quality of the SAT process and instructional products, with emphasis on the units' or graduates' performance.

Quality assurance  (2) Quality assurance is the function involving evaluative processes that assure the command that training is efficient and effective, and meets the current, Stryker, and future training needs of the operational force. The prime aim of QA is to furnish the chain of command with the confidence that the TRADOC mission is being achieved, while minimizing the risk of error or failure. It provides an oversight function for increasing organizational effectiveness, efficiency, and economy. The objective of QA is to:

(a) Provide the Army with the maximum return on investment.

(b) Ensure and maintain quality, up-to-date products to fulfill the needs of the operational Army.

(c) Ensure quality training and training products are delivered in a timely manner, and comply with Department of the Army (DA) and TRADOC policy.

Quality assurance is achieved through decisions based on results of accreditation, internal and external evaluation, and QC functions.


Quality control  (3) Quality control is an evaluative action or event, conducted to effect QA, that ensures all education/TD and implementation procedures and processes, and/or education/training products, meet or exceed prescribed standards. Every QC activity provides a degree of QA. The SAT process provides a series of QC mechanisms/checks that are applied to the development of all education/training products, procedures, and processes. These checks may be formal or informal.

Note: The SAT "How-To" pamphlets provide the QC requirements/checks. The general QC process is discussed in more detail in chapter 2.

Evaluation  (4) Evaluation is a systematic, continuous process to appraise the quality (efficiency, deficiency, and effectiveness) of a program, process, or product. It may determine the worth of a training program; determine if objectives have been met; and/or appraise the value of a new training technique. It is the means by which an evaluator provides management (i.e., decisionmakers) with information/recommendations so it can decide on actions to improve the education/training. It also provides information/recommendations to prove the value/worth of the education/training (summative evaluation). Evaluations:

(a) Identify both intended and unintended outcomes so decisionmakers can make necessary adjustments in the instructional program.

(b) Provide feedback used to modify the education/training program, as necessary.

Pamphlet organization  b. Figure 1-3 shows how this pamphlet is organized. Some chapters are supported by guidance provided in other chapters; refer to each of these to accomplish the evaluation. The procedural JAs, product templates, product samples, and information papers will help in accomplishing the work.

Evaluation/QC relationship  c. Evaluations are the feedback mechanism within the SAT process to complete the QC. The QC process begins with the prescribed minimum quality standards for the relevant product or process. Within the QC process, the function of the evaluation element is to gather/collect information to analyze, and then provide management with data on which to make quality judgments. Figure 1-4 illustrates the role evaluation plays within the QC process.


(1) Evaluation is a key component of the QC process, but is not synonymous with it.

(2) Evaluation is a tool that not only measures, but also contributes to, the success of Army education/training.

Figure 1-3. Pamphlet organization


[Figure 1-4, not reproduced here, depicts the QC cycle: starting from the prescribed standard, data is gathered (evaluation starts here), information is analyzed/processed, the delta is determined, and a quality judgment is made (evaluation ends here), which may send the product back for rework.]

Figure 1-4. Quality control process
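
A minimal sketch of this cycle may help make the loop concrete. The function names and the 90 percent standard below are illustrative assumptions only; the pamphlet prescribes no code or numeric metric:

```python
# Hypothetical sketch of the figure 1-4 QC cycle; names and the 90% standard
# are illustrative assumptions, not TRADOC requirements.

PRESCRIBED_STANDARD = 0.90  # assumed minimum standard (e.g., share of graduates meeting the objective)

def gather_data() -> list[float]:
    """Evaluation starts here: collect raw observations (stubbed)."""
    return [0.95, 0.88, 0.91, 0.86]  # placeholder measurements

def analyze(observations: list[float]) -> float:
    """Analyze/process the information into a single measure."""
    return sum(observations) / len(observations)

measure = analyze(gather_data())
delta = PRESCRIBED_STANDARD - measure   # determine the delta
if delta > 0:                           # make quality judgment (evaluation ends here)
    print(f"Rework required; shortfall of {delta:.2%} against the standard")
else:
    print("Meets or exceeds the prescribed standard")
```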


1-7. Types of evaluations. Evaluations are categorized into two types:
internal and external.

Internal evaluation  a. Internal evaluation gathers internal feedback and management data from the education/training instructional system environment to determine if—

(1) The SAT process is being appropriately applied in the development of products or programs.

(2) The instructional base is providing the appropriate/intended training.

(3) The objectives of the training have been met.

(4) The instructional system is producing the required qualified graduates.

(5) The staff and faculty receive the required training.

(6) Instructors provide quality instruction.

(7) Required infrastructure is in place to support training, whether resident or Distributed Learning (DL).


b. This process is conducted no matter where or how the training takes place (i.e., institution, TASS Training Battalion, DL, unit, or at home). The purpose of an internal evaluation is to improve the quality and effectiveness of the instructional system. Internal evaluations are discussed in detail in chapter 8.

External evaluation  c. External evaluation determines if soldiers can meet job performance requirements, require all the instruction received, and need any additional instruction not received. Ultimately, this evaluation process determines if the training meets the needs of the Army. This process gathers data from the field to assess a graduate's performance in a job environment, i.e., to determine if the graduate was trained to meet real-world job performance requirements. External evaluations are discussed in detail in chapter 9.


1-8. Overview of training development management/planning. Through planning, a manager will develop a realistic estimate of the resources required to accomplish evaluation projects, establish milestones, and allocate the available resources to the project. Initial TD management planning begins with an education/TD requirement that is the result of a needs analysis, or a new/revised training strategy.

Types of planning  a. Planning serves as a top-level tool to manage education/training requirements and resources. Proponent schools develop two types of plans:

(1) A Training Development Project Management Plan (TDPMP) is developed for every education/TD process or product (see table 1-1 for the procedure). This plan may be simple and informal, or detailed and complex. The TDPMP will tell the "who", "what", "when", "where", "why", and "how", and ultimately estimate the cost of every project.

(a) The TDPMP documents workload and resource requirements for the duration of a specific project. Examples include—
• Conduct a job analysis of the 84B30.
• Design/develop a drill.

(b) The TDPMP identifies manpower and resourcing requirements for the specific project, to include such things as—
• Personnel.
• Milestones.
• Costs.
• Material.
• Temporary duty (TDY).
• Any other factor required for the TD project.

Table 1-1
Project Management Plan procedures
Step  Action
1.  Identify project team members, and consult as needed for plan requirements.
2.  Identify TD resources required to complete the development project.
3.  Establish TD milestones for starting and completing the project, and intermediate milestones, as needed.
4.  Determine budget requirements and provide to the budget office. Include such things as—
    • TDY requirements, to include TDY location, length, transportation requirements, personnel, and cost estimates.
    • Material requirements.
    • Distribution requirements and costs.
    • Printing/reproduction costs.
5.  Determine audiovisual support requirements (personnel and equipment).
6.  Identify training product requirements.
7.  Coordinate between all directorates and departments within the proponent school, and appropriate external units.
8.  Complete the TDPMP.
9.  Obtain appropriate command approval of the TDPMP.
10. Update the Proponent TD Plan, Individual Training Plans (ITPs), System Training Plan, etc., as appropriate to the project.
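
As an illustration only, the TDPMP fields described above (who, what, when, where, why, how, and cost) could be recorded in a simple structure. The field names and example values below are assumptions for demonstration, not a prescribed TRADOC format:

```python
# Hypothetical record of the TDPMP fields described above; field names and
# example values are illustrative assumptions, not a prescribed format.
from dataclasses import dataclass, field

@dataclass
class TDPMP:
    project: str                # what, e.g., "Conduct a job analysis of the 84B30"
    team: list[str]             # who
    milestones: dict[str, str]  # when (milestone -> date)
    location: str               # where
    justification: str          # why
    method: str                 # how
    costs: dict[str, float] = field(default_factory=dict)  # TDY, material, printing, ...

    def total_cost(self) -> float:
        """Ultimately estimate the cost of the project."""
        return sum(self.costs.values())

plan = TDPMP(
    project="Design/develop a drill",
    team=["training developer", "content SME"],
    milestones={"start": "2004-03-01", "complete": "2004-06-30"},
    location="proponent school",
    justification="new TD requirement from needs analysis",
    method="SAT design/development phases",
    costs={"TDY": 4200.0, "material": 1500.0, "printing": 300.0},
)
print(f"Estimated project cost: ${plan.total_cost():,.2f}")
```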

(2) The Proponent TD Plan is a proponent's internal, living document that includes all requirements (resourced and unresourced). If a new TD requirement is identified or an existing one is changed, adjust the TD Plan. The TD Plan is a rollup of the requirements outlined in the TDPMPs. It is a long-range document covering multiple years, and provides data to various resource, budget, and manpower reports, such as the Monthly Status Report, Installation Contract, Program Objective Memorandum, and Command Operating Budget.

(a) The TD Plan documents TD workload and resource requirements for the programming, planning, budgeting, and execution years. The details within the TD Plan will increase each year.

(b) The TD Plan addresses all required collective and individual training products (e.g., resident and nonresident courses, training support packages (TSP), manuals, etc.) as well as all TD processes (e.g., mission analysis, evaluation, etc.). Table 1-2 provides the procedures.

Table 1-2
Training Development Plan procedures
Step  Action
1.  Summarize all TDPMP requirements in prioritized order.
2.  Determine which requirements can be accomplished based on available resources (i.e., resourced requirements).
3.  Identify the impact of each unmet training requirement on unit mission and task accomplishment.
4.  Finalize the TD Plan.
5.  Obtain command approval of the TD Plan.
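
Steps 1 and 2 of table 1-2 amount to funding prioritized TDPMP requirements until available resources run out. A toy sketch follows, with invented costs and a simple greedy rule standing in for a school's actual resourcing decisions:

```python
# Illustrative sketch only: splitting prioritized TDPMP requirements into
# resourced and unresourced given available funds (table 1-2, steps 1-2).
# The requirements, costs, and greedy-by-priority rule are assumptions.

requirements = [  # (priority, name, estimated cost) -- hypothetical
    (1, "Revise 84B30 job analysis", 25_000),
    (2, "Design/develop a drill", 40_000),
    (3, "Update STP", 30_000),
]
available = 60_000

resourced, unresourced = [], []
for priority, name, cost in sorted(requirements):
    if cost <= available:           # fund requirements in priority order
        available -= cost
        resourced.append(name)
    else:                           # step 3: document impact of unmet requirements
        unresourced.append(name)

print("Resourced:", resourced)
print("Unresourced:", unresourced)
```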


b. For further information, see TRADOC Reg 350-70, chapters II-2 and II-3.

1-9. Electronic support. Training developers should take advantage of TRADOC-provided automation tools available to support the applicable TD process. The automation tools and programs available change over time, both in what they can accomplish and in ease of use. Remain cognizant of these changes, and bring the need for upgraded automation requirements to the attention of the chain of command. A good source of current information is the Army Training Support Center website. Throughout this pamphlet there are references to the current automation tools, and guidance on how a tool will help perform a specific process/procedure. The current, primary automated training development tool is the Automated Systems Approach to Training (ASAT). Each Training/TD (task) proponent should have an ASAT point of contact (POC) in the TD organization who can assist with obtaining and using the program. Another resource is the ASAT Help Desk.

1-10. Quality control criteria. Each chapter in this pamphlet includes QC criteria for applying the evaluation process.

Chapter 2
Evaluation Process

2-1. Evaluation overview. The evaluation process is used to appraise the quality (efficiency and effectiveness) of education/training and TD. There is a process to follow to conduct an evaluation of training programs, products, and/or processes, whether evaluating—
• The effectiveness of a training program, or the meeting of training objectives;
• The effectiveness of new training, equipment, or environmental factors; or
• The extent that training and training products meet the needs of the Army (i.e., determine if training received is what is required in the field).
The evaluation process involves the following phases: planning, collecting, analyzing, reporting, and follow-up.


2-2. Evaluator's role. The prevailing role of an education/training evaluator is to identify, articulate, and provide information to decisionmakers to assist them in making education/training decisions. These could include decisions on whether to continue training, whether and how to improve the training, and/or the cost effectiveness of the training. Evaluation is the process used to complete QC checks throughout the entire SAT process. As an evaluator, look for both strengths and weaknesses in the entire instructional system. Focus on—
• How well the graduates are meeting job performance requirements.
• Whether instruction is being provided that is not needed.
• Whether instruction that is needed is not being provided.
• Ways to improve the learner's performance on the job, as well as the instructional system.
• How well each instructional system component is contributing to overall instructional system quality. This includes, but is not limited to, lesson plans, instructors, equipment, training devices, interactive courseware, training schedules, audiovisual media, facilities, manpower, and costs.
• Whether the SAT process is being appropriately applied.
• How efficiently the education/training products meet the identified needs.

Support to decisionmakers  a. Evaluators support decisionmakers at all levels by—

(1) Collecting, analyzing, evaluating, and distributing feedback concerning such areas as:

(a) Quality of current training and training support.

(b) Sufficiency of doctrine.

(c) Operability and maintainability of equipment and weapon systems from the field user's viewpoint.

(d) Readiness to meet new training requirements.

(2) Providing standards and guidance for evaluating and accrediting Army training, training products, and institutions.

(3) Identifying performance deficiencies.

(4) Providing successful initiatives from the collection and analysis of trends data.


(5) Ensuring quality of training by determining if the:

(a) Instruction follows objectives and implementation procedures listed in the lesson plan, course management plan, student evaluation plan, and the approved program of instruction (POI).

(b) Training aids used in classrooms support the objectives, and are appropriate, understandable, and readable.

(c) Environmental conditions contribute to a proper learning environment.

(d) Instructor performance meets instructional standards.

(e) Training development and training management are effective and efficient.

(f) Collective training products are effective and efficient.

(6) Ensuring staff and faculty have received required training.

Evaluation team make-up  b. An evaluator is an independent observer who provides guidance and assistance while ensuring a quality process is applied, and quality products are produced. Conducting an evaluation is a team effort, guided by the training developer, who functions as the project leader. Executing an evaluation as a team effort is the most effective way to accomplish this process. Building the team may involve a matrix management approach. The team will consist of a project leader and subject matter experts (SMEs) (e.g., training developers, soldiers in units, instructors, task performers, and their supervisors). Depending on the purpose of the evaluation, the other team members will vary (i.e., ad hoc teams are built on the requirements of what is being evaluated). Several major players perform on every team.

(1) Evaluators on the evaluation team primarily consist of a training developer (an expert in evaluation), and experts in other areas, depending upon the evaluation initiative.

(a) The training developer, a GS-1750 (Instructional Systems Specialist), is normally in charge of the project. This is the individual trained in the conduct of evaluations. This person is the "training development" SME.


(b) Ensure the "content/technical" SMEs are master experts in the military occupational specialty (MOS), area of concentration (AOC), or other areas being evaluated.

(2) A difficulty encountered when setting up this team is selecting master SMEs. There are three levels of SMEs—apprentice, journeyman, and master. Make sure master training developers and master content-area SMEs are on this team.

Team member roles  c. Everyone taking part in an evaluation is part of the evaluation team. TRADOC Reg 350-70, paragraph II-3-4, provides the basic policy on TD teams. A variety of people are needed during the evaluation process, but the number and mix of personnel will vary based on the evaluation.

Evaluation team roles  (1) The evaluation team shall ensure the evaluation:

(a) Incorporates all elements that make up a SAT.

(b) Determines if the right things are being trained; students learned; and training transferred to the job, met the unit's needs, and was efficient.

(c) Provides support to decisionmakers at all levels.

(d) Is thorough and comprehensive.

(e) Is technically correct.

(f) Results in a quality product by applying QC measures.

(g) Complies with TRADOC TD guidance and policy.

(h) Meets milestone requirements.

Training developer role  (2) The training developer (GS-1750) is a key player on the evaluation team, who usually leads and manages the internal evaluation efforts (within the proponent school) and, depending on the purpose, may lead the external evaluation efforts. This role includes—

(a) Keeping all people involved in the evaluation process informed of progress, problems encountered, developments, changes, and constraints.

(b) Being responsive.

(c) Providing results in a timely manner.

(d) Keeping the project management plan updated, as appropriate.

(e) Providing guidance to the team members concerning how

(j) Conducting data collection and analysis, identifying deficiencies and efficiencies.

(k) Providing recommendations for actual or potential problems/deficiencies.

(l) Sharing identified efficiencies with appropriate organizations.

(m) Assisting with the center/school's self-assessment efforts.

(n) Ensuring all necessary corrective actions were completed.

SME role  (3) The SME is the content or technical expert on the team. This SME is, or should be, the master performer of the action/activity being evaluated. No matter what the job—a training developer, a combat/doctrine developer, or an instructor—the SME is involved directly in the evaluation function. The SME is specifically responsible for the accuracy and completeness of the technical content, and the comprehensiveness of the content presented. The SME may perform other duties the team leader assigns, such as data collection.

Additional members  (4) Depending on the purpose of the evaluation, the composition of the evaluation team may include, in addition to the training developer and SME, the following:

(a) Visual information specialists.

(b) Installation support person.

(c) Safety officer.

(d) Resource management personnel.


Evaluation areas  d. Evaluators should routinely evaluate all aspects of the production, management, and conduct of the Army education/training system. This includes such areas as:

(1) Enlisted, officer, warrant officer, and DA civilian training.

(a) Evaluate quality of instruction, standardization, test objectives, etc., to prepare students to perform effectively.

(b) Ensure horizontal and vertical integration of tasks and training.

(c) Ensure progressive and sequential training.

(d) Evaluate quality of individual training strategies and the Career Development Model.

(2) Unit Training Products.

(a) Evaluate quality of unit training strategies (unit long- and short-range training strategies).

(b) Review Combined Arms Training Strategies (CATS), Mission Training Plans (MTP), drills, exercises, Warfighting TSPs, and Training Aids, Devices, Simulators, and Simulations (TADSS) to ensure—
• Product usefulness.
• Technical/doctrinal content accuracy.
• Design and development are consistent with analysis data.
• Effective training sequence.
• Product meets needs.
• Safety, environment, and risk assessment/management considerations are incorporated.
• Deficiencies identified were corrected.

Note: Unit feedback on MTPs, Soldier Training Publications (STP), and TSPs is supposed to come through the automated unit training management program (i.e., Standard Army Training System/Unit Training Management Configuration).


(3) Combined Arms Training Strategies.

(a) Army Training Strategy (Headquarters (HQ) TRADOC only).

(b) Unit Long Range Training Strategy.

(c) Unit Short Range Training Strategy.

(d) Individual Long Range Training Strategy.

(e) Individual Short Range Training Strategy.

(f) Self Development Training Strategy.

(4) Threat. Evaluate to determine the school's compliance with TRADOC regulations on threat, i.e., ensuring valid, accurate, and consistent threat portrayal in the training and TD areas.

(5) Doctrinal development. Evaluate the organization, procedures, and management of the development process, and the integration of doctrine into training.

(6) School evaluation system. Evaluate feedback systems and how training institutions use feedback to verify/validate decisions, or cause change.

(7) Education/training management. Evaluate the management of the implementation of education and training.

(8) Training Resource Management.

(a) Evaluate how well the school aligns resources with mission/training requirements to—
• Accomplish training student workload to standard.
• Allocate resources to support Commanding General, TRADOC priorities.
• Identify tradeoffs, and impacts of these tradeoffs.
• Consider efficiencies and economies.
• Apply savings to meet unresourced priorities.


(b) Evaluate training resources involving programming and budgeting, manpower management, ammunition, equipment, and facilities.

(c) Evaluate assignment and management of contracts.

(d) Evaluate management of personnel assignments and staff and faculty training.

(9) Training Requirements Analysis System (TRAS). Determine how well the institution is managing TRAS implementation and process, to include:

(a) TRAS document implementation and updates.

(b) Maintenance of milestone schedules.

(c) Meeting programming and budgeting windows for resources.

(10) Systems Approach to Training process. Ensure the appropriate application of minimum essential elements of the SAT process.

(11) Staff and faculty training. Evaluate the adequacy and relevancy of the institution's training and development programs for the staff and faculty.

(12) Automation systems, which should provide—

(a) Proponent-approved functionality (operational) requirements.

(b) Seamless data sharing.

(c) Ease of operation.

(d) Multiple levels of help, to include use of the program, technical assistance, and embedded training.

(e) The capability to maintain the integrity of current data.

(f) Key databases in the production cycle (available through the TD automated support system).


Remember, evaluators must have access to decisionmakers, and the evaluation findings, conclusions, and recommendations must be credible to be useful.

2-3. Evaluation process description. Evaluations assess the quality of training and TD. Evaluators conduct evaluations by analyzing the current status of unit and individual performance, training products, programs, and processes using a 5-phased process (planning, collecting data, analyzing data, providing recommendations/reporting findings, and following up on recommendations to ensure implementation). This process produces valid and reliable results that identify training deficiencies. Findings provide the basis for corrective recommendations through the chain of command. Findings also identify those areas of the training program performing efficiently and effectively.
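
The five phases chain together, each consuming the output of the one before it. The stub functions below are purely illustrative (none of these names come from the pamphlet); each stands in for one of chapters 3 through 7:

```python
# Illustrative sketch only: the 5-phased evaluation process as a pipeline.
# All function bodies are stubs; real content is in chapters 3 through 7.

def plan(issue):            # chapter 3: planning
    return {"issue": issue, "plan": "evaluation project management plan"}

def collect(ctx):           # chapter 4: collecting data
    ctx["data"] = ["survey responses", "observations"]
    return ctx

def analyze(ctx):           # chapter 5: analyzing data
    ctx["findings"] = f"initial findings from {len(ctx['data'])} sources"
    return ctx

def report(ctx):            # chapter 6: reporting findings/recommendations
    ctx["report"] = "findings, conclusions, recommendations"
    return ctx

def follow_up(ctx):         # chapter 7: verifying implementation
    ctx["implemented"] = True
    return ctx

result = follow_up(report(analyze(collect(plan("Repair the Widget course")))))
print(result["report"], "| recommendations implemented:", result["implemented"])
```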

Evaluation findings  a. Evaluation efforts will determine whether:

(1) Training and TD efforts have accomplished their intended purpose (i.e., the objective of the training has been met).

(2) The right soldier was trained (i.e., all students fell within the target audience).

(3) All students met specified entry-level standards.

(4) Accreditation was performed properly, and accreditation data was obtained.

(5) The TD process met minimum essential requirements of analysis, design, development, and implementation.

(6) Training products met unit needs.

(7) Soldiers’ training met unit needs.

(8) Appropriate quantity of training was received.

(9) Soldiers needed instruction they did not receive.

(10) Soldiers received instruction they did not need.

(11) Training institutions are adhering to command training and education guidance.

Note: An overview of each evaluation phase is provided in paragraphs 2-3b through 2-3f below.


Planning overview  b. The first step in doing anything is proper planning; evaluations are no exception and require thorough planning. Some routine evaluation duties, such as conducting a classroom observation, reviewing a test, or analyzing a group of student end-of-course critiques, do not require in-depth evaluation plans. Explain procedures for performing these routine duties in a local standard operating procedure, or equivalent document. However, major evaluations (i.e., in-depth evaluations of school training programs, products, or processes) require an evaluation plan—the end product of the planning phase. This phase of the process is discussed in detail in chapter 3.

Data collection overview  c. The data collection process involves determining what type of data is required; what data to collect (student performance, student feedback, audit trails, supervisors, graduates, etc.); where/who data is collected from (source); how much; and how data is collected (method/technique/instrument). Next, structure and develop the data collection method/instrument. The final step is to administer the instrument/technique to collect the data. This phase of the process is discussed in detail in chapter 4.

Analysis overview  d. Analysis is the process of reviewing, synthesizing, summarizing, and processing the evaluation data collected to develop initial findings concerning the issue being evaluated. Analysis reduces the huge volumes of raw data collected into a series of initial findings. The method used will depend on the type of data collected. This phase of the process is discussed in detail in chapter 5.

Reporting findings and recommendations overview  e. After evaluation data is collected and analyzed, the next step is to identify major findings and recommendations. Once identified, write a report to include:

(1) References.

(2) Background/problem.

(3) Purpose of evaluation.

(4) Summary of data collection procedures.

(5) Results – major findings, conclusions, and recommendations.

This phase of the process is discussed in detail in chapter 6.


Follow-up overview  f. The follow-up phase determines if recommendations were implemented. This phase should take place within one year of the evaluation approval. This phase of the process is discussed in detail in chapter 7.

Feedback system  g. Evaluators receive feedback from students, supervisors, soldiers, instructors, commanders, training developers, and affected organizations, and provide feedback to the same. The entire evaluation process is based on feedback, either formal or informal. Informal feedback consists of unsolicited information, which is transmitted either directly or indirectly to an action office, verbally or in writing. Formal feedback, however, is solicited, specific in content, and transmitted through channels by means of a document designed specifically for the purpose (i.e., a survey or questionnaire). The feedback system, therefore, includes both the instruments or mechanisms used to elicit the information, and the means by which it is transmitted to the appropriate action office. It is the evaluator's responsibility to ensure that the feedback system is functioning, and that evaluation results are relayed back to the appropriate action point in the system. Formal or informal feedback, via external and internal evaluations, may drive future evaluation plans.

Chapter 3
Planning Evaluations

3-1. Planning overview. This chapter addresses the planning phase of the evaluation process. It will assist evaluators in planning an evaluation and creating Master Evaluation Plans (MEP) and Evaluation Project Management Plans. Add or delete actions as the specific evaluation conducted requires. Evaluations that are not well planned or managed will result in the collection of information that cannot be trusted (i.e., invalid and of no use). The Quality Assurance Office (QAO) is responsible for the development of MEPs and Evaluation Project Management Plans. However, involvement from other directorates and departments is essential in their development.

3-2. Planning an evaluation. The first step of any evaluation is planning, which is probably the most difficult step. Everything that follows depends on how well the evaluation is conceptualized, i.e., how well thoughts, ideas, and/or intuitions are expressed. In the planning process, assess all major goals/objectives, needs, resources and capabilities, and/or any other areas, which are eventually represented in an evaluation plan. There are several reasons to carefully plan an evaluation and document the plan in a written evaluation plan before conducting the evaluation. Planning—
• Helps ensure there is a valid need for doing the evaluation (i.e., reduces the chance of doing an evaluation that is not needed).
• Focuses the intent of the evaluation and prevents the evaluation from getting "off track".
• Forces the evaluator to "think through" the entire evaluation and plan for all actions required.
• Identifies and optimizes the use of the limited resources available for doing the evaluation.
• Ensures that everyone involved in the evaluation is informed of, and knows, their responsibilities.
• Identifies and prioritizes initiatives.

Types of evaluation plans  a. Thorough planning is imperative to the success of a well-conducted evaluation. The complexity of the evaluation determines the amount of planning required. Routine evaluations will not require an in-depth evaluation plan. However, major evaluations of school training programs, products, or processes will require an in-depth evaluation plan. The end product of a well-executed planning phase is an effective and efficient evaluation plan. Proponents are involved in developing two types of evaluation plans.

(1) Evaluation Project Management Plans are the individual plans developed for each evaluation conducted. They support the MEP. These plans may be informal, simple, and unwritten (the project requirement may exist in a database, but not as a formal report), or they may be formal, very detailed, and complex. Job Aid 350-70-4.3a provides a format for an Evaluation Project Management Plan. Guidance for developing a project management plan is provided in JA 350-70-4.3b.

(2) The MEP is the planning document that provides all evaluation requirements for the next fiscal year (FY), and projections for the following 3-5 years. Evaluation requirements outlined in Evaluation Project Management Plans are included in the MEP. The Combined Arms Center (CAC) and Army Accessions Command (AAC) QAOs and each TRADOC center/school will prepare and provide the MEP to HQ TRADOC by September for the upcoming FY. Centers/schools should send a courtesy copy of their MEP to the CAC and AAC QAOs. These plans are mandatory per TRADOC Reg 350-70. Job Aid 350-70-4.3c provides a format for a MEP.

Planning procedural steps  b. There are six major steps involved in planning an evaluation (see table 3-1). The training developer will perform these steps to successfully plan an evaluation. An Evaluation Project Management Plan is the outcome of this procedure.


Table 3-1
Steps for planning an evaluation
No.  Step
1.  Determine what areas need evaluating.
2.  Define the purpose of the evaluation.
3.  Determine the scope of the evaluation and available resources.
4.  Collect and research information pertinent to the evaluation (feedback and training documentation, POI, STP, TSP, critical task list).
5.  Develop and coordinate the initial Evaluation Project Management Plan.
6.  Develop and coordinate the final Evaluation Project Management Plan.
Note: The evaluation plan (steps 5 and 6) is a by-product of planning an evaluation.

JA  c. A brief description of each step is provided below. Job Aid 350-70-4.3b is provided to help plan the evaluation. This JA is not a lock-step procedure; it is provided as an aid to assist in planning the evaluation.

Determine what to evaluate  d. The first step of planning an evaluation is to determine what to evaluate, and what is accomplishable with given resources. Job Aid 350-70-4.3b (Step 1) provides guidance to determine what to evaluate. Table 3-2 below provides a summary of this guidance.

Table 3-2
Actions for determining what to evaluate
No.  Action
1.  List all appropriate proponent areas that could require an evaluation each year (i.e., courses, TD organizations, instructors, processes, graduates/graduates' supervisors, etc.).
2.  Identify from this list those areas that have not been evaluated within 2-3 years, and areas that, through feedback, have indicated a potential problem.
3.  Obtain any written documentation that may have been initiated, or will be initiated.
4.  Prioritize evaluation initiatives based on personnel and resource availability per year (i.e., number of people, time to complete the evaluation, and dollars required to accomplish the job).
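
Action 2 of table 3-2 is essentially a filter over the proponent's list of areas. A small illustrative sketch follows, with fabricated dates and a 2-year cutoff standing in for the 2-3 year window:

```python
# Illustrative sketch only: flag areas not evaluated within the 2-3 year
# window, or areas with feedback indicating a potential problem (table 3-2,
# action 2). All data below is fabricated.
from datetime import date

AREAS = [  # (area, last evaluated, problem feedback?)
    ("Repair the Widget course", date(2001, 5, 1), False),
    ("Instructor certification", date(2003, 9, 1), True),
    ("STP development process", date(2003, 1, 15), False),
]
CUTOFF_YEARS = 2  # assumed low end of the 2-3 year window

today = date(2004, 1, 12)
candidates = [
    area for area, last, problem in AREAS
    if problem or (today - last).days > CUTOFF_YEARS * 365
]
print("Candidates for evaluation:", candidates)
```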


Determine evaluation purpose  e. Clarify and delineate the purpose for each evaluation conducted. This provides justification for expending funds and allocating time to do the work. However, keep in mind that it is not always necessary to have preset goals and objectives for evaluations; they may limit or constrain the evaluation. Table 3-3 summarizes the actions to address to answer why the evaluation is being performed; detailed assistance on conducting these steps, with each action addressed in detail, is in JA 350-70-4.3b (Step 2).

Table 3-3
Actions for determining the purpose of the evaluation
No.  Action
1.  If the evaluation request was initiated outside of the organization, obtain appropriate interpretation of the reason/problem/issue/concern.
2.  Discuss concerns with other evaluators.
3.  Develop an initial interpretation of the evaluation requirements.
4.  Develop an initial interpretation of the impact of not solving any deficiencies or discrepancies.
5.  Develop a list of questions and/or issues that need answering as a result of the evaluation.
6.  Get senior leader sponsorship/approval. Make an initial recommendation to continue or discontinue the evaluation. If the evaluation will not continue, write a memorandum on why it stopped. If the evaluation will continue, write a memorandum requesting POCs.
7.  List the evaluation POCs and other POCs.

Determine scope of evaluation  f. Determine the scope of each evaluation initiative; that is, detail the breadth and depth/size of the evaluation. The purpose of any evaluation is to identify training and/or TD efficiencies (which can be shared) or deficiencies (which will require improvement/correction). When planning evaluations, ensure sufficient and appropriate information is acquired to provide reasonable assurance of a successful evaluation. Detailed assistance on conducting these steps is in JA 350-70-4.3b (Step 3).


Example of evaluation scope  An example of an evaluation scope follows. When individual training presented by a specific school is evaluated, determine the extent to which the individual training for that school is evaluated. Also determine if the evaluation will include just one STP; multiple STPs; all individual training products (TSPs and lesson plans); one/all courses; one organization's process for developing individual training products; and/or standardization of the process within several organizations. Here, the decision is made to evaluate just a particular course. An example scope statement follows:

"Course, Repair the Widget, will be evaluated from March to June of the next FY. The evaluation will include determining if the course trains current critical tasks for the job, whether the lessons are based on the approved critical tasks and supporting skills and knowledge, whether the tests administered are criterion-referenced and valid, whether the instructors are qualified and present the material as required by the course documentation, whether the training utilizes the appropriate training strategy, and whether a reduced training time is possible, while maintaining or improving the training standards. The proponent will use results to improve the course."

Research/collect pertinent information  g. Review feedback and training documentation relevant to the evaluation being conducted. Though this has been designated as a separate step, it can begin early and continue throughout the evaluation. Table 3-4 provides a summary of this step in planning an evaluation. Job Aid 350-70-4.3b (Step 4) provides detailed assistance on conducting this step.

Table 3-4
Actions to research and collect pertinent information
No.  Action
1.  Initiate a literature search, if required.
2.  Identify the criteria the evaluation is based on, to ensure goals/objectives are met.
3.  Begin to collect and review training feedback.
4.  Begin to collect and review pertinent education/training documents, e.g., regulations, pamphlets, products, etc.

h. Conducting a literature search and reviewing training feedback may assist in determining—

(1) If the reason/concern/problem/issue has been previously identified (determine if this is a one-time situation, or represents a recurring trend).


(2) If the reason/concern/problem/issue has been previously studied. If the information required to address the problem/issue has already been addressed, it reduces the possibility of duplicating something that has already been done.

(3) If there are other related training issues that need evaluating as part of the evaluation.

(4) If any studies pertinent to the evaluation already exist.

Note: The establishment and maintenance of a feedback system will assist the organization in organizing feedback collected from different sources.

i. Collect and review training documents to become familiar with the training programs, products, and/or processes being evaluated. Job Aid 350-70-4.3b (Step 4) provides a list of the different training documents that may need reviewing. Basic questions to ask during this step to assist in collecting existing documentation are:

(1) Is there a written tasking from the assistant commandant or appropriate tasking authority to do an evaluation?

(2) Are there written comments from field commanders? Who were the field commanders that made the comments, and who recorded the commanders' comments?

(3) Is there a trip report indicating field problems?

(4) Is there other documentation indicating a training or education problem?

Evaluation plans  j. Development of evaluation plans is discussed in paragraph 3-3 below.

3-3. Evaluation project management plans.

a. A manager must develop a realistic estimate of the resources required to accomplish evaluation projects, establish milestones, and allocate the available resources to the project. Evaluation plans achieve this by identifying the total data/information required; specifying the methodology for collecting, processing, and analyzing that data/information; and validating the measuring instruments. These plans answer the basic questions:

(1) Why is the evaluation being conducted?

(2) What will the evaluation accomplish?

(3) How is the evaluation done?

(4) Who will conduct and support the evaluation?

(5) What resources will the evaluation require?

(6) When will the evaluation be completed?

(7) What references are used during the evaluation?

(8) Who will receive the final report?

Elements of the evaluation plan are discussed below. These elements provide additional insight into the basic questions addressed in an evaluation plan.

Elements of evaluation plans  b. There are specific critical elements that all Evaluation Project Management Plans must include. Each basic question listed above has associated elements to address for a successful evaluation. Each essential element is discussed in JA 350-70-4.3b (Step 5). The evaluation plan elements are outlined in general terms in table 3-5.

Table 3-5
Evaluation plan elements
No.  Element
1.  Why is the evaluation being conducted?
    a. Reason/problem/issue/concern.
    b. Impact.
2.  What will the evaluation accomplish?
    a. Limitations.
    b. Assumptions.
    c. Essential elements of analysis.
    d. Objectives.
    e. Purpose.
3.  How is the evaluation performed?
    a. Data collection/analysis methodology.
       (1) Identify sources of relevant data and the general approach to data collection, reduction, and analysis.
       (2) Use automation support.
       (3) Use computer scan sheets when possible for collecting and inputting data into a computer.
    b. Data reporting/follow-up methodology.
    c. Identify the criteria the evaluation is based on to ensure goals/objectives are being met.
4.  Who will conduct and support the evaluation?
    a. Evaluation representatives.
    b. Representative responsibilities.
5.  What resources will the evaluation require?
    a. Resource requirements.
    b. Support requirements.
6.  What is the evaluation schedule?
    a. Evaluation start date.
    b. Evaluation completion date.
    c. Evaluation critical elements.
7.  What references are used during the evaluation?
    a. References.
    b. Related studies.
8.  What is the feedback plan and who will receive the final report?

3-4. Quality control criteria for planning. TD/task proponents perform QC actions as an inherent part of the SAT process. When planning an evaluation, ensure:

a. Areas selected for evaluation were based on one or more of the following:

(1) Identified deficiency/discrepancy.

(2) Scheduled evaluation.

(3) A request from an internal or external source.

b. Purpose of the evaluation is clearly stated.

c. Scope of the evaluation describes precisely what is covered.

d. A thorough literature search was conducted to identify relevant information.

e. The final evaluation plan was developed and includes:

(1) Purpose.

(2) Background.

(3) Scope.

(4) Objectives.


(5) Essential elements of analysis.

(6) Methodology.

(7) Evaluation recommendations and responsibilities.

(8) Resource and support requirements.

(9) Timelines.

(10) Feedback plans of results.

f. The Master Evaluation Plan was revised to include all evaluation project management plans as they were developed or modified.
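
The element list in paragraph 3-4e lends itself to a mechanical QC check. A sketch follows, assuming plans are stored as simple dictionaries, which is a local implementation choice and not anything this pamphlet prescribes:

```python
# Illustrative sketch only: a QC check that a final evaluation plan contains
# the ten elements listed in paragraph 3-4e. The dictionary representation
# of a plan is a hypothetical stand-in for a school's actual records.

REQUIRED_ELEMENTS = [
    "purpose", "background", "scope", "objectives",
    "essential elements of analysis", "methodology",
    "recommendations and responsibilities",
    "resource and support requirements", "timelines", "feedback plans",
]

draft_plan = {  # fabricated, deliberately incomplete draft
    "purpose": "...", "background": "...", "scope": "...",
    "objectives": "...", "methodology": "...",
}

missing = [e for e in REQUIRED_ELEMENTS if e not in draft_plan]
if missing:
    print("Plan fails QC; missing elements:", ", ".join(missing))
else:
    print("Plan contains all required elements")
```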

Chapter 4
Collecting Evaluation Data

4-1. Data collection overview. This chapter discusses the data collection phase of the evaluation process, to include selecting, using, and defending appropriate methods of collecting data. It provides guidance on designing and using surveys and interviews, as well as the use of observations.

4-2. Data collection procedures. During the planning phase for an evaluation, identify the information required to address the overall evaluation effort, as well as how to collect the required information. Data collection is the process of gathering, collating, and preparing data for the purpose of processing (analyzing) it to obtain desired results. To determine the worth of any training, training product, or process, data is collected for analysis. Relevant data should come from several sources, with more than one method used to collect data. General data collection sources and methods are document reviews, individual interviews, group interviews, surveys, tests or time trials, and personal observations. The intent is to collect sufficient raw data to ensure a successful analysis. The techniques or instruments used will depend on the type of data required. See table 4-1 for general procedural guidance for collecting data.


Data collection procedural steps

Table 4-1
Steps for collecting data
No.  Step
1.  Review what information needs collecting and why (data required).
2.  Select the data sources.
3.  Determine the data collection method (types of data collection methods).
4.  Develop a draft of the instrument, if required (design).
    a. Select the type of questions/checklists to use.
    b. Construct the questions/checklist.
    c. Sequence the questions/checklist.
    d. Format the instrument.
5.  Evaluate the instrument. Pilot it with peers; make changes as needed.
6.  Administer the instrument.
7.  Collect raw data.

A brief description of the steps is provided in JA 350-70-4.4a (excluding data sources, which are discussed in paragraph 4-2b below).

Data required  a. Information collected is based on:

(1) Area of interest.

(2) Questions that must be answered.

(3) Information needed to answer these questions.

Job Aid 350-70-4.4a, paragraph 1a, provides general guidance on achieving this requirement.

Note: If an evaluation is conducted on the TD process, collect data based on the QC measures provided in the individual TRADOC pamphlets (i.e., analysis, design, development, validation).

Data sources  b. Collecting data on the effectiveness and quality of training is an important responsibility of evaluators. In concert with this, evaluators ensure that data is obtained from appropriate sources. A list of possible data sources to assess is provided below.

(1) Center for Army Lessons Learned (CALL).


(2) Combat Training Center (CTC) rotation feedback.

(3) Lesson learned collectors.

(4) Internal training review report.

(5) Supervisor feedback (surveys/questionnaire/interview).

(6) Graduate feedback (surveys/questionnaire/interview).

(7) Student feedback (surveys/questionnaire/interview).

(8) Instructor feedback (surveys/questionnaire/interview).

(9) Unit commander feedback (surveys/questionnaire/interview).

(10) Subject matter experts.

(11) Monitoring of training and instructors (surveys/questionnaire/
interview).

(12) Training developers.

(13) Doctrine and combat developers.

(14) Field visits or unit feedback.

(15) Comparison between what was supposed to be trained
versus what was trained (validated training).

(16) Unsolicited feedback.

(17) Analysis of trainee test results.

(18) Analysis of validity of assessment/test employed.

(19) Second party training review reports.

(20) Accident reports.

(21) Conferences and seminars.

(22) Document reviews.

Technical approach to data collection

c. There are two technical approaches to collect data that may be
used—quantitative or qualitative.

(1) Quantitative data indicates an amount (how much or how
many) and is measured on a numerical scale. This method is used
most frequently. A quantitative approach is:

(a) Objective.

(b) Reliable.

(c) Easy to use.

(d) Not as time consuming as qualitative.

For example, a runner’s time in a 10-kilometer race is a quantitative
variable, because it measures the amount of time it took to run 10
kilometers.

(2) Qualitative data is usually in the form of words, not numbers.
It can be thought of as the perception of the quality of something.
Qualitative data either labels sampling units, or places them into
categories. In general terms, such data is identified or named by some
quality, category, or characteristic. Such data are nominal scales (as in
agree, disagree, or no opinion in survey instruments). When qualitative
data is analyzed, the analysis usually reports the proportion that lies in a
given category. For example, what proportion of the survey respondents
agreed training developers are best qualified to develop ITPs?
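To make the quantitative/qualitative distinction concrete, the short
Python sketch below (a hypothetical illustration; the data values are
invented) reduces nominal survey responses to a category proportion and
summarizes a quantitative variable on its numerical scale.

# Qualitative (nominal) data: the analysis reports category proportions.
survey = ["agree", "agree", "disagree", "no opinion", "agree"]
prop_agree = survey.count("agree") / len(survey)

# Quantitative data: measured on a numerical scale (10-km times, minutes).
race_times_min = [42.5, 47.1, 39.8, 51.0]
mean_time = sum(race_times_min) / len(race_times_min)

print(f"{prop_agree:.0%} agreed; mean 10-km time {mean_time:.1f} min")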

Types of data collection methods

d. Data collection instruments contain some style of questions
designed in a systematic, highly defined approach. Job Aid
350-70-4.4a, paragraph 1b, provides general guidance on achieving this
requirement. Their purpose is to obtain consistent data that can be
compared, summarized, and, if the data is quantitative, subjected to
statistical analysis. Use a single data collection method, or a
combination of methods. Obtain evaluative data from the sources
previously discussed. Methods for collecting data are discussed below.
Details are provided for the first four, as they are the primary methods
used.

(1) Questionnaires/surveys collect large samples of information
from individuals associated with, or affected by, the training programs,
products, or process being evaluated (students, graduates, soldiers’
first line supervisors, and instructors). Structured questionnaires are
most valuable in obtaining straightforward factual information. A
well-prepared and administered questionnaire will yield valid data. An
appropriate timeframe for administering questionnaires/surveys to
graduates, or first line supervisors, is usually no sooner than 6 months
after the soldier has graduated from the course. Carefully prepared,
properly distributed, objectively executed, and critically analyzed
questionnaires can provide constructive information on:

(a) The ability of recent graduates to perform specific tasks
for which they were prepared.

(b) The specific nature of instructional deficiencies, as
perceived by the student or graduate.

(c) Details of the jobs actually being performed by the
graduates.

(d) Instruction not needed by the graduates in their jobs.

(e) Other areas that require analysis.

Note: For external evaluation, Army Research Institute (ARI) has
developed the Automated Survey Generator (AUTOGEN) Software
Program that creates the survey. This software is discussed in more
detail in chapter 9.

(2) Interview guides, like questionnaires, collect information from
individuals associated with, or affected by, the training programs,
products, or process being evaluated. When interviewing, generally a
list of questions is used (i.e., an interview guide), the questions are
asked verbally, and the participants’ answers are recorded. The
questions are not necessarily in a mandated order, but should ensure
the data is collected in a systematic and standardized way. Depending
on the participant’s response (i.e., a thorough answer to a question that
is asked later has already been provided), the order of the questions
may need to change, which requires knowledge and understanding of
the questions. Interviews are economical, flexible, and a good means
of collecting information, but the interviewer must maintain neutrality
and not affect the participant’s perception of the question or the answer
given. Interviews allow evaluators to gain a great depth of
understanding of problems and issues, and can also yield anecdotal
examples and vignettes that bring life to the problems or issues at hand.

(3) Observations are used during the implementation of training
to assure training is being delivered in the right sequence, and
conditions and standards are being met. This method also involves
sending evaluators to observe and interview soldiers and their
supervisors on the job. Direct observation of soldiers as they perform
their job, combined with interviews, provides a useful source of task
information. Job aids (which are also referred to as worksheets,
observation forms, or checklists) assist in collecting information during
the observation of training programs, products, or processes. Use HQ
TRADOC Form 350-70-4-1-R-E when observing training programs,
observing individual or unit performance, or reviewing training
documents, products, or processes. Use this form to record TD,
training management, and instructor observations. In addition to the
reproducible format at the back of this pamphlet, a FormFlow version is
also available to provide the observer the capability to digitally record
observations. The Staff and Faculty Development Program also uses
Instructor Performance Checklists to evaluate basic instructor
performance, classroom instructors, video teletraining instructors, small
group instructors, and After-Action Review (AAR) performance. These
checklists are found in TRADOC Reg 350-70, chapter III-4.

(4) Tests or timed trials measure the learner’s ability to perform
critical tasks to standards, and/or the learner’s obtainment of supporting
skills and knowledge. Test results are also used to determine the
effectiveness of training, and identify areas of improvement.

(5) Student critiques.

(6) Instructor questionnaires or interviews.

(7) Examination of training documentation/publications.

(8) Expert panel comments.

(9) Job performance evaluations.

(10) Major training exercises.

(11) Job Aids.

(12) Reports (course, evaluation and/or company).

Design data collection instruments

e. Detailed procedural guidance for developing data collection
instruments is found in JA 350-70-4.4a, paragraph 2. Remember to
consider early in the design of collection instruments how the measures
are scored, who will score them, and how the data is analyzed.

(1) Development and administration of questionnaires and
structured interviews (JA 350-70-4.4a, paragraphs 2c through 2e(4)).

(2) Additional guidance on interviews (JA 350-70-4.4b and JA
350-70-4.4c).

f. When preparing to develop a questionnaire or interview,
constantly apply the following criteria:

(1) To what extent might a question influence the respondents to
show themselves in a good light?

(2) To what extent might a question influence a respondent to be
unduly helpful by attempting to anticipate what the researcher wants to
hear or find out?

(3) To what extent might a question be asking respondents for
information about themselves that they are not certain, and perhaps not
likely, to know?

The validity of questionnaire and interview items is limited by all three
considerations above.

Methods to administer data collection instruments

g. Once the data collection instrument is developed, consider to
whom, and how, it is administered. (Determining the sample size is
discussed in the next section.) Various methods to employ are listed
below.
(1) Mailed questionnaire.

(2) Personally administered questionnaire.

(3) Electronically administered questionnaire.

(4) Telephonic interview.

(5) Face-to-face interview.

(6) Video teleconference interview.

Regardless of the data collection instrument used, administer those
designed to research training transfer to the job, to the graduate or
supervisor, at least 6 months after the graduate’s return to duty.

Sample size for surveys

h. Providing accurate data to senior leadership requires determining
how many completed surveys are needed to produce a reliable report.
Additionally, provide senior leadership with confidence that the
information collected is representative of the target audience. Table 4-2
provides steps to determine how many completed surveys are needed.
Job Aid 350-70-4.4d assists with accomplishing the procedures in
table 4-2.

Table 4-2
Steps for determining completed survey requirements
No. Step
1. Determine the target audience.
2. Estimate how many individuals are in the target audience. If the
population size is 200 or less, survey the entire population.
3. Determine the confidence level required that results are representative.
The JA recommends choosing the 95 percent confidence level, a
level commonly used.
4. Determine the estimated rate of usable questionnaires to use.
5. Determine the number of questionnaires that need distributing.
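The arithmetic behind table 4-2 can be sketched in Python. This is a
minimal illustration, assuming the common finite-population sample size
formula and an invented 50 percent usable-return rate; the values in JA
350-70-4.4d govern for actual surveys.

import math

def completed_surveys_needed(population, z=1.96, margin=0.05, p=0.5):
    # z = 1.96 corresponds to the 95 percent confidence level that
    # table 4-2, step 3 recommends; margin is the sampling error.
    if population <= 200:          # step 2: survey the entire population
        return population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

needed = completed_surveys_needed(population=1500)   # about 306 completed
to_distribute = math.ceil(needed / 0.50)             # steps 4-5: about 612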

Standard sampling techniques

i. Since it is frequently impossible to study an entire population,
evaluators rely on sampling to acquire a section of the population from
which data is collected. Sampling is selecting a segment of a target
population that is representative of that target population. If the sample
is not a true representation of the population, then the data analyzed
may be erroneous. The most commonly used sampling procedures
are:

(1) Simple Random Sampling—a sampling technique in which
each individual in a population is chosen by chance, and each member
of the population has an equal chance of being included. The group of
subjects (the sample) is selected from a larger group (the population).

(2) Stratified Random Sampling—a sampling technique that
divides the population into categories (strata), and then data is collected
from the strata by simple random sampling. These categories are
based on certain characteristics relevant to the questionnaire (i.e., age,
training level, or gender).
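Both procedures are easy to express with the Python standard library;
the roster, strata, and sampling rate below are hypothetical.

import random

roster = [f"soldier_{i}" for i in range(1, 501)]

# (1) Simple random sampling: every member has an equal chance.
simple_sample = random.sample(roster, k=50)

# (2) Stratified random sampling: divide into strata (here, by training
# level), then draw a simple random sample from each stratum.
strata = {"initial": roster[:300], "advanced": roster[300:]}
rate = 0.10
stratified_sample = []
for level, members in strata.items():
    stratified_sample.extend(random.sample(members, k=round(len(members) * rate)))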

Raw data collected

j. The end product of the collection phase is raw information. This
raw data is in the form of completed questionnaires, interview guides,
observation checklists, and other completed data collection
instruments. Analyze this raw data.

4-3. Quality control criteria for data collection. Training development/
task proponents perform QC actions as an inherent part of the SAT
process. Regardless of the data collection instrument used, ensure—

a. Everyone involved understands the purpose of the evaluation.

b. Data sources were relevant to provide appropriate, pertinent, and
reliable information.

c. Data collection instruments were appropriately designed to collect
required data.

d. Data sources provided appropriate, pertinent, and reliable
information.

e. Instrument was validated prior to distribution.

f. Acceptable sample size has been determined to collect sufficient
data.

g. Appropriate sampling technique has been used to select either a
simple or stratified sample of the target audience.

Chapter 5
Analyzing Evaluation Data
5-1. Evaluation analysis overview. This chapter discusses the conduct of
the data analysis phase of the evaluation process. This will assist
evaluators with analyzing findings to identify trends and systemic
problems. This chapter includes reviewing, summarizing, and analyzing
raw data, as well as interpreting the results of the analysis.

5-2. Analysis description. The analysis process transforms large volumes
of raw data collected into usable findings. Simply stated, analysis is the
process of reviewing, summarizing, and processing information to
develop initial, sound findings/recommendations concerning the issue
being evaluated. (Note: Do not confuse this analysis with the analysis
phase of the SAT.) Before starting an analysis of collected data,
ensure—

• Data has been collected from a sufficient and appropriate
sampling/reliable source.
• Adequate data samples have been collected to validate the
reliability of the findings.
• Special attention is given to notes made by respondents on
questionnaires or answers to supplemental questions included in
the questionnaire.
• Data that contains the halo effect (indiscriminate rating of all
items positively) or central tendency (indiscriminate rating of
items in the center of the scale) is used with caution.
• Data for electronic analysis has been collected in an easily
processed form.

Analysis procedural steps

a. Table 5-1 provides the steps a training developer should perform
when conducting any analysis.
Table 5-1
Steps for analyzing data
Step Action
1. Review the raw data for integrity (ensure data is reliable).
2. Prepare data for analysis. Summarize the data into some form
of table, to avoid searching through individual questionnaires, or
interview guides. This ensures every reply is counted.
3. Analyze the data by converting the raw figures into percentages,
proportions, averages, or some other quantitative form that is
more easily understood. The choice of statistical description will
depend on the purpose of the data (which is determined during
the planning phase of the evaluation).
4. Interpret the analysis result.

The end product of the analysis is a series of initial findings that are
addressed in the Evaluation Report.
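Steps 2 and 3 of table 5-1 amount to tallying raw replies and converting
the counts to a more easily understood form. A minimal Python sketch
(the answers are hypothetical):

from collections import Counter

answers = ["agree", "disagree", "agree", "no opinion", "agree", "agree"]

summary = Counter(answers)                 # step 2: summarize into a table
total = sum(summary.values())
for category in ("agree", "disagree", "no opinion"):
    pct = 100 * summary[category] / total  # step 3: convert to percentages
    print(f"{category:<12}{summary[category]:>3}{pct:>8.1f}%")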

Types of data analysis methods

b. Evaluators may analyze data in various ways. The data is either in
the form of qualitative data (expressed in narratives or words), or
quantitative data (expressed in numbers). The type of data collected will
depend on the objectives of the evaluation. These are determined when
planning the evaluation, and ultimately developing an evaluation plan.

Review data

c. Ensure data is valid and reliable. Triangulation (using multiple
methods to study the same thing) can corroborate evidence, and
increase validity, especially for qualitative findings. Examples of events
that could result in invalid and unreliable data:

(1) Different data collectors doing interviews or observations
(resulting in different interpretations).

(2) Different interpretations of notes recorded during observations,
if not interpreted by the evaluator conducting the observation.

(3) Questionnaire answers that are incomplete, illegible, or not
understandable.

Data integrity

d. Check data for integrity as a part of every evaluation, especially
when there were several data collectors involved, questionnaires were
mailed, or an unstructured data collection method was used. Closely
check data for integrity when there is less control over how it is
collected. When reviewing data for integrity, ensure:

(1) Responses are complete—a blank next to a question could
mean do not know, refused to answer, or the question was not
applicable.

(2) Responses are understandable—either the data collector’s
written response to an observation, or the answer provided by the
survey taker.

Adequate samples

e. Ensure adequate data samples are collected to validate the
reliability of the findings. When reviewing data for validity, ensure:

(1) Responses are consistent—different questions, pertaining to
the same subject, on the same instrument, are consistent (ask the same
question in two different ways, to see if the same answer is obtained).
Also, when using a rating scale for rating a list of items, look for patterns
of responses that may indicate the respondent did not seriously answer
the question.

(2) Responses are uniform—if different data collectors for
administering interviews or observations are used, make sure that
collectors follow uniform procedures for collecting and recording data.

(3) Responses are appropriate—if a response does not pertain to
the purposes of the evaluation or the question asked, discard the
responses. If integrity problems are not resolved, discard the data.
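The integrity and consistency checks in paragraphs d and e above lend
themselves to a simple screening pass. The Python sketch below is one
hypothetical way to flag incomplete returns and straight-line rating
patterns (possible halo effect or central tendency); the field names and
data are invented.

import statistics

def screen(returns, rating_items):
    # Returns (usable, flagged) lists of survey returns.
    usable, flagged = [], []
    for r in returns:
        ratings = [r.get(item) for item in rating_items]
        if any(v is None for v in ratings):        # incomplete response
            flagged.append(r)
        elif statistics.pvariance(ratings) == 0:   # ratings never vary
            flagged.append(r)
        else:
            usable.append(r)
    return usable, flagged

returns = [{"q1": 5, "q2": 5, "q3": 5},      # straight-line: flag
           {"q1": 4, "q2": 2, "q3": None},   # incomplete: flag
           {"q1": 4, "q2": 3, "q3": 5}]      # usable
usable, flagged = screen(returns, ["q1", "q2", "q3"])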

Summarize data

f. When data is summarized, simply condense the data for analysis.

(1) The easiest and most accurate method for summarizing large
amounts of quantitative data is to use an automated statistical program.

(2) Qualitative data is much more difficult to summarize
accurately and effectively. Job Aid 350-70-4.5 provides guidance for
summarizing qualitative data.
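One common first step in summarizing qualitative data is coding
free-text comments into categories. The Python sketch below is a
hypothetical illustration only (the category keywords are invented, and
the JA’s own procedures govern); it maps each comment to any matching
categories.

CATEGORIES = {
    "equipment": ("radio", "device", "simulator"),
    "instruction": ("instructor", "lesson", "pace"),
}

def code_comment(comment):
    # Return the categories whose keywords appear in the comment.
    text = comment.lower()
    matched = [cat for cat, keys in CATEGORIES.items()
               if any(k in text for k in keys)]
    return matched or ["other"]

print(code_comment("The lesson pace was too fast"))  # ['instruction']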

Analyze data

g. Before analyzing any data, ensure all quantitative data is entered
into a computer, and all qualitative data is summarized and condensed
into categories. Keep in mind exactly why the data is being analyzed
(i.e., identify what specific questions need answering).

h. Statistical analysis is required for analyzing quantitative data.
(Note: Statistics simply show things about the data that are otherwise
not seen.) There are two types of statistics:

(1) Descriptive statistics give measurements that simply
“describe” the data collected. This includes such measures as:

(a) Mean - the average value.

(b) Range - the extreme values.

(c) Standard deviation - the degree to which values are
dispersed.
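All three measures are available in Python’s standard library; the test
scores below are hypothetical.

import statistics

scores = [72, 85, 91, 68, 88, 79]

print("mean:", statistics.mean(scores))                # average value: 80.5
print("range:", min(scores), "to", max(scores))        # extreme values
print("std dev:", round(statistics.stdev(scores), 1))  # dispersion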

(2) Inferential statistics “infer” something about the total population
from which the data was collected, usually by a random process. It
includes such measurements as—

(a) t tests—a statistical analysis used to infer something
about two different populations, based on data collected from a sample
of each of the populations.

(b) Chi-square—a statistical test commonly used to compare
observed or actual frequency with expected or hypothesized frequency
of two variables.

(c) Analysis of Variance—a statistical analysis used to
determine whether there are any statistically significant mean
differences between two or more groups, or one or more variables, or
factors.
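For orientation only, the sketch below shows how each of these tests
can be run with SciPy (an assumption; the pamphlet names no statistical
software, and the sample scores are invented). As noted next, consult a
statistician when these methods are unfamiliar.

from scipy import stats

resident = [84, 79, 91, 88, 76, 85]       # test scores, resident course
distributed = [80, 74, 83, 78, 72, 81]    # test scores, DL course

t, p = stats.ttest_ind(resident, distributed)         # (a) t test
observed = [[40, 10],                                 # pass/fail counts
            [30, 20]]                                 # by course type
chi2, p2, dof, expected = stats.chi2_contingency(observed)   # (b) chi-square
group_c = [82, 77, 85, 90, 79, 84]
f, p3 = stats.f_oneway(resident, distributed, group_c)       # (c) ANOVA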

Individual descriptions on how to use these various methods for
analyzing data are not provided. If the evaluation plan requires using
unfamiliar statistical analysis, consult with a statistician or evaluator
experienced in statistics.

Interpret the analysis results

i. Interpret the analysis in common sense terms, and be able to
explain the results. Interpretation of analysis is one of the most difficult
steps in this phase of evaluation. Keep in mind the purpose of the
evaluation as data results are interpreted. Annotate all trends identified,
and include them in the final report. Qualitative data is often considered
less objective than quantitative data, but can provide very useful
information, if procedures are followed, especially when looking at
themes and relationships at the case level. Quantitative data, though
more scientific, requires statistical manipulation to represent findings.

As with analysis, interpretation of results may call for the use of
unfamiliar statistical procedures. Solicit the assistance of a statistician,
or more experienced evaluator, when using unfamiliar statistical
methods.

5-3. Quality control criteria for evaluation analysis. TD/task proponents
perform QC actions as an inherent part of the SAT process. When
analyzing data, ensure—

a. The integrity of the raw data, by verifying the completeness and
understandability of responses.

b. Accuracy of data summaries was verified.

c. Data analysis was appropriate for the type of data collected.

d. Results of the analysis were interpreted to provide usable,
accurate, and pertinent recommendations.

Chapter 6
Preparing Evaluation Reports

6-1. Evaluation report preparation overview. This chapter provides
guidance on how to develop a report, which captures the results of an
evaluation (phase four of the evaluation process). An evaluation report
will include findings, conclusions, and recommendations. This chapter
will assist in preparing, staffing, and distributing the
report/recommendations.

6-2. Evaluation report descriptions. Formal and informal reporting is an
ongoing process during the life of an evaluation; do not consider reports
just an “end of evaluation” product. Establish the primary audience
during the planning phase, and write the report to that audience’s
interest. The report should be concise and presented in a way that will
ensure the evaluation information is used to its fullest potential. The
report summarizes the results of the evaluation (i.e., findings,
conclusions, and recommendations). The report is provided to senior
leaders for their information and guidance.

6-3. Report preparation steps. Evaluators should perform the steps in
table 6-1, which provides the general procedural guidance for preparing
evaluation reports.

Table 6-1
Steps for preparing an evaluation report
No. Steps
1. Prepare the draft report, to include findings, conclusions, and
recommendations. Develop findings, conclusions, and/or
recommendations on how they relate to the objective of the
evaluation.
2. Staff draft report for review/concurrence. Staff internally, and with
involved organizations. Obtain review and concurrence of
recommendations.
3. Revise draft report into a final report.
4. Obtain final approval of recommendations. Provide to approving
authority (i.e., decisionmaker) for final approval of final report,
and recommendations.
5. Distribute report/recommendations for action. Distribute final
report with approval authority documentation to all organizations
with implementing requirements.

6-4. Prepare draft report. The evaluation report is a vital document that
summarizes the results of the evaluation (i.e., findings, conclusions,
and recommendations). From the findings, develop conclusions, and
for each identified problem area, provide specific recommendations for
management to consider. There are different ways of reporting
information, depending on how the report is used, the target audience,
and the impact that the evaluation will have. Be as concise and simple
as possible, while ensuring all required information is included.

Types of evaluation reports

a. The length and format of an evaluation report can vary
significantly. Factors to consider are the audience for whom the report
is intended, complexity of the evaluation, and/or the impact the
evaluation results will have on the organization. For purposes of
standardization, the reports are categorized into two types—Executive
Summary and Detailed Evaluation Report.

Executive Summary

(1) The Executive Summary is a synopsis of the evaluation, and
requires a response to the evaluation recommendations. The
Executive Summary is a memorandum designed for senior leadership
that includes:

(a) Background.

(b) Purpose and objectives.

(c) Methods.

(d) Major findings, recommendations, and who is responsible
to implement (or lead the implementation of) recommendations.

(e) Suspense date for a response to recommendations.

Job Aid 350-70-4.6a provides a format of an Executive Summary. For
most evaluations, the Executive Summary is sufficient. Keep on file the
supporting data and documentation required to explain and support
findings.

Detailed Evaluation Report

(2) Detailed Evaluation Reports are lengthy, formal evaluation
reports that explain in detail those findings included in the Executive
Summary listed above. A detailed report is required if:

(a) Intended audience is outside the installation, and the
evaluation specifics (to include supporting data) cannot be explained in
person.

(b) Evaluation will have a very significant impact on
training/training resources.

(c) Evaluation involves very complex data collection or
analysis methodologies that require explanation, and are supported
with data.

(3) In addition to the items in an executive summary, the detailed
report includes:

(a) An Executive Summary.

(b) A discussion of limitations.

(c) A list of assumptions.

(d) An identification of Essential Elements of Analysis.

(e) Appendices for data summaries, which may include Data
Collection Instruments, the Evaluation Plan, etc.

(f) A memorandum requesting concurrence of the director on
the recommendations, and the organization responsible for implementing
recommendations. Format the memorandum similar to paragraph five
of the Executive Summary cited in JA 350-70-4.6a. A format for a
Detailed Report is provided in JA 350-70-4.6b.

Report preparation hints

b. Whether an Executive Summary or a Detailed Evaluation Report
is prepared, keep the report as brief as possible. The following
suggestions will assist in preparing either report.

(1) Only include information that the intended reader needs.

(2) Keep it simple.

(3) Do not use acronyms unknown to readers.

(4) Do not write at a reading level drastically above, or below, the
reader.

(5) Explain in simple terms any complex data collection, or
analysis methodologies.

(6) Stay focused on the Essential Elements of Analysis when
writing the report.

6-5. Staff draft report for review/concurrence.

a. The procedures for staffing a draft report begin within the
organization, and then extend to involved organizations. The steps are
as follows:

(1) Distribute the draft report internally to individuals within the
office to review the report content.

(2) Provide the director a copy of the draft report for
review/approval.

(3) Ensure the director signs a memorandum to distribute to
organizations responsible for implementing recommendations. (Note:
If an Executive Summary is prepared, the director will sign it. If a
Detailed Evaluation Report is prepared, the director signs the
accompanying memorandum.)

(4) Distribute the draft report to all organizations affected by the
evaluation for their review and concurrence (or nonconcurrence) on
recommendations made. Request a response back within 30 days.

b. If nonconcurrence comments are received, resolve any
disagreements prior to sending the final report to the decisionmaker.
Use coordination meetings, if necessary, to resolve disagreements.

c. If no agreements are reached on nonconcurrences, send the
report with explanations of all nonconcurrences to the decisionmaker
for a final decision.

6-6. Obtain final approval of recommendations. In this phase, a final
report is provided to the decisionmaker (i.e., assistant commandant) for
final approval of the overall report and the individual recommendations.

Draft report

a. Send a memorandum to the decisionmaker, along with the report
and copies of the responses received as a result of the staffing from the
Staffing Draft Report step. Job Aid 350-70-4.6c provides a sample
format for the memorandum sent to the decisionmaker.

b. After the decisionmaker has approved or disapproved the
recommendations, distribute the final report.

6-7. Distribute report/recommendations for action, and conduct
follow-up check. In this phase, the final report is distributed to all
organizations that are responsible for implementing the approved
recommendations, with a copy of the memorandum from the
decisionmaker, showing final approval of recommendations.

a. Provide a suspense date for written responses to responsible
organizations.

b. Responses must include:

(1) Actions they have taken thus far.

(2) Milestones for future actions they will take.

c. Inform the organization that a follow-up check is conducted to
ensure that they have taken the actions, and provide a predetermined
time for this check.

6-8. Quality control criteria for preparing evaluation reports. Training
development/task proponents perform QC actions as an inherent part
of the SAT process. When developing an evaluation report, ensure—

a. The final evaluation report was focused on the objectives of the
evaluation.

b. The final evaluation report includes, at a minimum:

(1) Findings.

(2) Conclusions.

(3) Recommendations.

c. Reports were staffed to all appropriate organizations.

d. Final report was approved by the chain of command before
distribution.

e. Final report was distributed to all relevant organizations (i.e., all
organizations that are responsible for implementing the approved
recommendations).

Chapter 7
Conducting Evaluation Follow-Ups

7-1. Evaluation follow-up overview. This chapter provides guidance on
how to conduct a follow-up of the evaluation effort (phase 5 of the
evaluation process). During this phase, it is determined if
recommendations made as a result of the evaluation were actually
implemented. This chapter will assist in conducting follow-ups, and
includes—
• Preparing for a follow-up evaluation.
• Conducting the follow-up evaluation.
• Preparing and staffing the report.
• Preparing for a follow-up evaluation.
• Conducting the follow-up evaluation.
• Preparing and staffing the report.

7-2. Follow-up description. The follow-up phase is sometimes considered
the most important part of the evaluation process, but often is the most
overlooked. Conducting a follow-up will determine if recommendations
were implemented, and if they resulted in an improvement to the
training. This phase should take place within one year of the evaluation
approval. If the recommendations will not be followed up, do not conduct
the evaluation.

Follow-up steps

a. Though the decisionmaker has signed off on the evaluation and
approved the recommendations, there is still no guarantee that those
responsible will implement the recommendations. Conduct a follow-up
to ensure implementation. See table 7-1 for procedural steps for
conducting a follow-up.

Table 7-1
Steps for conducting a follow-up
No. Steps
1. Input action milestones into a tracking system. Collect responses
from organizations responsible for implementing
recommendations. Input milestones into a system for tracking
actions taken.
2. Conduct the follow-up to ensure actions have been taken to
implement recommendations.
3. Prepare and write a follow-up report.
4. Staff the follow-up report.

Input action milestones into tracking system

b. During the reporting phase, determine actions the responsible
organizations have taken so far, and their plans for future actions. To
begin the follow-up, update information from these organizations to
determine the status of actions taken and planned. These actions are
put into a tracking system.

c. The tracking system:

(1) Includes all present or future actions taken by an
organization.

(2) Is automated to simplify use of data collected.

(3) Has the projected date for the follow-up. Projected date will
depend on the milestones established for the actions, and on local
policies and procedures for performing follow-ups.

(4) Provides an audit trail on all actions taken as a result of the
evaluation.
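A tracking system of this kind can be as simple as one record per
approved recommendation. The Python sketch below is a minimal,
hypothetical data structure (the field names are not a mandated format)
that holds the milestone and an audit trail.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class TrackedAction:
    recommendation: str
    organization: str
    milestone: date                  # projected completion date
    complete: bool = False
    audit_trail: list = field(default_factory=list)

    def record(self, note):
        # Append a dated entry so every action leaves an audit trail.
        self.audit_trail.append((date.today(), note))

action = TrackedAction("Revise POI lesson 3", "Training Dev Division",
                       milestone=date(2004, 6, 1))
action.record("Draft revision received for review")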

Conduct the follow-up

d. The purpose of the follow-up is to ensure that those responsible
have implemented the approved recommendations. How a follow-up is
conducted is dependent on the actions themselves, resources
available, and local policy for follow-ups. The primary goal of a
follow-up is to ensure the organization has implemented the actions.
Follow-up measures may include:

(1) Meeting with personnel from the organizations.

(2) Observing training.

(3) Reviewing training documentation.

(4) Examining training records.

e. Reconcile any differences of opinion on actions that the
organizations have or have not taken.

f. Enter new milestones into the tracking system if the organization
has not implemented actions.

g. Record the action as complete in the tracking system when all
corrective actions have occurred.

Prepare a follow-up report

h. Write up the results of the follow-up in a Follow-up Report. A
sample Follow-up Report is found in JA 350-70-4.7. The report should:

(1) Summarize each action taken for each approved
recommendation.

(2) Be brief.

(3) Include a new milestone, if more work is required.

Staff follow-up report

i. Staff the follow-up report to inform everyone of the follow-up
results, and to document the follow-up. Staff the report to:

(1) Organizations responsible for implementing the
recommendations.

(2) Other affected offices.

7-3. Quality control criteria for conducting follow-up evaluations.
Training development/task proponents perform QC actions as an
inherent part of the SAT process. When conducting a follow-up of an
evaluation, ensure—

a. Follow-on actions were entered into a tracking system. This
tracking system includes—

(1) Actions taken by all responsible organizations.

(2) Projected dates for verifying corrective actions were
implemented.

(3) Dates for scheduled follow-up evaluation.

(4) An audit trail of all actions taken as a result of the evaluation.

b. Recommendations were implemented.

c. A follow-up report has been prepared, detailing the corrective
action taken for each approved recommendation.

d. Follow-up report was staffed to all appropriate organizations.

Chapter 8
Internal Evaluation

8-1. Internal evaluation overview. This chapter provides “how-to” guidance
for conducting internal evaluations. Internal evaluations cover the
evaluation of—
• Overall training development process.
• Student learning.
• Instructional materials.
• Personnel.
• Instructional resources.
• Implementation.
• Training products.
• Staff and faculty education/training.
• Instructor performance.
• Infrastructure requirements.
• Course Management Plan.
• TRAS documents.
• Training Support Products.

8-2. Internal evaluation description. The purpose of an internal evaluation
is to improve the quality and effectiveness of the instructional system, by
providing sufficient, high-quality data to decisionmakers upon which they
can make sound, informed decisions about the training and education.
During an internal evaluation, gather internal feedback and management
data from the education/training instructional system environment.
Periodic internal evaluations may identify weaknesses/problems, as well
as strengths, of the TD and instructional system. Internal evaluation is a
deeper requirement than checking instructor techniques and method of
instruction. It is a check of the quality of the content with regard to what
is being taught, and what the students are assimilating. In an internal
evaluation, make a comparison between the course objectives and
standards applied in the training environment, and the objectives and
standards specified in course development documents. In addition,
evaluate the school’s/center’s control of the total training environment,
and promptness in moving graduates to units, to include proper
application of the TD process.

a. Internal evaluations focus on the TD process and the
measurement of learning that was gained from the training program, in
an effort to continually improve instructional quality and effectiveness.

Where conducted

b. If an internal evaluation on the application of the SAT process (i.e.,
analysis, design, development, implementation, and/or evaluation) is
conducted, then conduct the evaluation in the TD environment. The QC
checks outlined in the related TRADOC pamphlets will assist in
accomplishing an evaluation of these areas. If an internal evaluation of
the implementation of training is conducted, then conduct the evaluation
wherever the training is being implemented:

(1) Training institution.

(2) TASS Training Battalion.

(3) Distributed Learning Facilities.

(4) Unit.

(5) Home.

NOTE: Internal Evaluations are NOT conducted ONLY in resident
schools.

Objectives of internal evaluation

c. Internal evaluations provide school/decisionmakers with a method
of assuring that training and training products are correctly developed
and implemented IAW the appropriate standards of SAT. The
school’s/decisionmakers’ responsibilities include ensuring that:

(1) A systems approach is being appropriately applied to the
analysis, design, development, implementation, and evaluation of all
training and training materials (i.e., products and/or programs).

(2) Records/documents that clearly explain the decision process
are maintained.

(3) Analysts have interacted with school/center personnel charged
to conduct the evaluation function, to ensure that feedback data required
for analysis are identified in evaluation plans.

(4) Training development, and training functions, are effective and
efficient.

(5) Task inventories from new or modified equipment
development processes have been provided to the appropriate office.

(6) Access to information from analysis of collective training
requirements, such as MTP and standard drills, is available.

(7) Collective training programs, including the appropriate MTP
and standard drills, are developed that will support units/organizations
for which they are proponent.

(8) Staff and faculty personnel receive training relevant to their
duty assignment. As a minimum, staff and faculty training will consist of
an introduction to a SAT and, when required, courses where
instructional/facilitation techniques are certified.

(9) The instructional base is providing the appropriate/intended
training, and using the appropriate training products (POIs and lesson
plans/TSPs are examples).

(10) The instructional system is producing a qualified graduate in
an effective and cost-efficient way.

(11) Quality control mechanisms are in place for developing and
implementing training and training products.

(12) Objectives of the training have been met.

(13) The infrastructure (e.g., TD facilities, classrooms, shop areas,
learning facilities, billets, training areas, and ranges) adequately
supports all phases of the SAT.

8-3. Internal evaluation procedures. Table 8-1 provides the general
procedural guidance for conducting an internal evaluation. The “how-to”
procedures for a majority of the steps below are discussed in-depth in
chapters 3 through 7.

Table 8-1
Steps for conducting an internal evaluation
No. Steps
1. Prepare Internal Evaluation Project Management Plans.
2. Establish feedback channels.
3. Prepare/modify checklists to evaluate products and processes.
4. Observe training and testing events.
5. Prepare and administer student, instructor, and training manager
questionnaires.
6. Collect and analyze data or information (see chapter 4).
7. Prepare and staff draft evaluation reports.
8. Distribute final evaluation reports.
9. Monitor compliance with recommendations.
10. Establish responsibilities for QA and QC.

Project management plans

a. Plans to conduct an internal evaluation are detailed in Evaluation
Project Management Plans. A guide for preparation of Evaluation
Plans is provided in chapter 3.

Feedback channels

b. It is essential to establish a process that ensures feedback is
provided to the appropriate individuals/organizations. It is also
imperative that feedback is constantly evaluated and applied, to check
and improve the training system. Feedback helps:

(1) Ensure training materials are maintained that reflect current
doctrine, conditions, equipment, and procedures.

(2) Ensure intended educational objectives have been met.

(3) Provide information to students on individual performance.

(4) Identify substandard performance or trends as early as
possible for individual assistance or corrective action.

(5) Assess the effectiveness of training and instructional methods
used in training implementation.

(6) Provide information to base shifts in emphasis and allocation
of scarce resources.

Checklists

c. Job Aids listed in paragraph 8-4g below provide assistance in
evaluating products and processes.

Data collection

d. Data collection techniques used during internal evaluations are
discussed in chapter 4, to include questionnaires/surveys, interviews,
observations, and tests.

Analyze data

e. Detailed guidance for analyzing data is provided in chapter 5. In
addition to the procedures outlined in chapter 5, an analysis of internal
evaluation data should specifically include the following measures:

(1) Review the POI and instructional materials (lesson
plans/TSPs, instructor guides, student guides, and other instructional
materials are examples) to determine whether they are current,
adequate, and in agreement.

(a) Compare terminal and/or enabling objective standards
with the standards in the POI, to determine if the requirements of the
POI are being met.

(b) Compare the lesson plan/TSP with the course being
taught to determine if they are the same.

(2) Compare stated training resource requirements with actual
resources, to determine if adequate resources are available to support,
operate, and/or maintain the instructional system.

(3) Review instructor records to ensure instructors are qualified
to teach the POI.

(4) Review test and measurement data to determine if students
are meeting the terminal and enabling objectives.

(5) Analyze test and measurement instruments to determine if
they are valid and reliable (a starter sketch follows this list).

(6) Employ QC checklists and JAs for the various TD phases.
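One common starting point for the check in measure (5) is classical
item analysis. The Python sketch below is a hypothetical illustration:
item difficulty is the proportion of students answering an item correctly,
and a simple discrimination index compares top and bottom scorers.
The score matrix (rows are students; 1 means correct) is invented.

def item_analysis(matrix):
    n = len(matrix)
    ranked = sorted(range(n), key=lambda i: sum(matrix[i]))
    low, high = ranked[: n // 3], ranked[-(n // 3):]
    for item in range(len(matrix[0])):
        difficulty = sum(row[item] for row in matrix) / n
        discrimination = (sum(matrix[i][item] for i in high) / len(high)
                          - sum(matrix[i][item] for i in low) / len(low))
        print(f"item {item + 1}: difficulty {difficulty:.2f}, "
              f"discrimination {discrimination:+.2f}")

item_analysis([[1, 1, 0], [1, 0, 0], [1, 1, 1],
               [0, 0, 0], [1, 1, 1], [1, 0, 1]])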

Evaluation reports

f. Provide results of data analyses to the appropriate elements within
the school to influence the development process. Detailed guidance for
preparing and distributing reports is provided in chapter 6.

Follow-up

g. It is important to determine if deficiencies identified during an
internal evaluation have been corrected. The steps for conducting a
follow-up evaluation are provided in chapter 7.

8-4. Areas to consider/review during internal evaluations. Internal
evaluation will assist in establishing and maintaining the desired level of
QA. Therefore, it is not excessive to have a system of in-process
reviews of key areas. Depending on the circumstances, and the
directives of proponent evaluation policies and plans, the evaluator
needs to review various documents/data during an internal evaluation.
This will ensure all required data to conduct a successful internal
evaluation is available. Some areas to consider are listed below:

Documentation to review

a. A record of training needs assessment, to determine if training is
required, or some other aspect of the installation/unit environment
needs addressing, e.g., personnel management, maintenance and
logistical support, and equipment availability and operability.

b. The specific target population description (developed during job
analysis), to ensure training and training support materials are developed
with the user in mind. Internal evaluations should check how well the
analysis, design, development, and implementation processes and
products match the target population description, and the needs of field
users.

c. Records/documents and data sampled to determine compliance
with HQ TRADOC and local policy and guidelines. There is certain
documentation of critical “decision points” in the TD process which
internal evaluators should check, as a minimum, to see how well the
decisions stand up to the tests of soundness of rationale, and validity of
selection. This may include, but is not limited to:

(1) Critical task selection.

(2) Training site selection.

(3) Job performance measure description.

(4) Training method and media selection.

(5) Validation of tests, training materials, and courses.

d. Course control documents to determine if there are any
discrepancies between the planned course, and the course that was
actually implemented. Study each component and procedure
authorized, and/or required by these documents.

e. Resource documents to ensure all required resources are
available and sufficient. Recommended corrective actions should
consider system constraints. Evaluate actions taken and whether:

(1) Facilities (instructional and support) are available and
adequate.

(2) Equipment and training devices (instructional, support, and
test and measurement), and supplies are available and meet system
requirements.

(3) Human resources (instructional developers, instructors,
students, and courseware maintenance personnel) are available.

(4) There is adequate time allotted for the instruction (adequate
course length, and sufficient time to maintain the course).

(5) Adequate funds are available to support, operate, and
maintain the course.

Other areas to review

f. Several methods to use for collecting internal evaluation data are
described below.

(1) Visit instructional facilities. Make enough visits of sufficient
length, and ensure there is a representative sampling to evaluate or
check:

(a) The quality of implemented instruction. Ensure that the
visit is long enough to observe samples of representative instruction for
the entire course.

(b) Hardcopy instructional materials, such as instructor and
student guides, workbooks, and reference materials for quality and
availability.

(c) Equipment, training devices, instructional media, and
training aids for condition, operation, maintenance reliability, and
appropriateness.

(d) Instructional literature for availability and quality.

(2) Evaluate instructor performance. Instructors must show
acceptable application of instructional system technology, and their
activities should conform to those specified in the lesson plan. Ensure
instructor records are current, and show amounts of in-service training
and special training completed. Ensure:

(a) The instructors follow the lesson plan, and teach to the
standard.

(b) The instructors use instructional media properly, detect
student problems and respond to student needs, and are qualified to
teach.

(c) The instructors’ records are current, and show that the
required amounts of in-service training and special training have been
completed.

(d) Noted weaknesses on instructor evaluation forms have
been corrected.

(3) Measurement (Testing Program). Ensure that measurement
programs are not compromised. If they are, the tests cannot provide
useful data/feedback on student performance. To avoid tests being
compromised, monitor the test and measurement program, to ensure
the quality of the test and measurement items, and student performance;
and evaluate instruction in terms of student performance. Test and
measurement instruments are the performance measures that
determine student achievement of course objectives. A satisfactory
measurement program should:

(a) Provide students and instructors with goals.

(b) Inform each student of their progress in meeting program
objectives.

(c) Establish a permanent record of each student’s
achievement, and make it available to the student.

(d) Identify any need for a remedial program.

(e) Identify students not meeting course standards, so
appropriate action is taken.

(f) Provide feedback data to establish a constant quality
control check on the instructional system.

Note: Monitoring the measurement (test) program is a very important
aspect in verifying instructional quality. But it is just one aspect of
internal evaluation. Do not exclude other aspects, such as visiting
classrooms, or checking equipment adequacy and audiovisual aids.
Internal evaluation analyzes all actions that occur in system operation.

JA

g. Job Aids 350-70-4.8a through 8d provide four checklists for
conducting internal evaluations on the following functional areas:

(1) SAT Process (JA 350-70-4.8a).

(2) Training Institutions/Facilities (JA 350-70-4.8b).

(3) Products (JA 350-70-4.8c).

(4) Training Development Management (JA 350-70-4.8d).

8-5. Outputs of internal evaluations. Possible outputs of internal
evaluations, and subsequent corrective actions, may include:

a. Reports with identified deficiencies, and corrective actions and
follow-ups on identified deficiencies.

b. Efficient and effective individual training, training programs, and
products.

c. Needs assessment.

d. Use of data for improvement of trainee performance and revision
of learning materials.

e. Validated course developed in accordance with (IAW) the SAT
process; and validated evaluation instruments.

f. Certified instructors, and qualified evaluators and training
developers.

g. Updated Master Evaluation Plan and supporting TD Project
Management Plans as required.

8-6. Internal evaluation issues/concerns.

Possible causes of problems

a. Though an instructional system has been validated before
implementation, students may still have difficulty with instruction during
the day-to-day training. Managers, training developers, instructors, as
well as evaluators, need to consistently review and address these
possible problems. Problems may include—

(1) Instructors that do not follow the POI or lesson plans.

(2) The POI is different in some ways from the course being
implemented.

(3) Training resources that are required to support, operate,
and/or maintain the POI differ in some respects from the resources that
have been allocated.

(4) The training resources are inadequate for the student to
master specific terminal or enabling objectives.

(5) Training materials are not correlated with the test and
measurement instruments, the terminal or enabling objectives, or the
instructional content identified in the task and learning analyses.

(6) Students do not have the prerequisites required.

Managerial questions

b. Competent management is key to an effective evaluation.
Management has the overall responsibility for ensuring that all
components of the evaluation are fully integrated. Internal evaluations
are specifically focused to obtain answers to the questions “How good is
the training?” “Are students learning?” “Has the training development
process been applied?” and “Do we need to change anything?”
Questions that managers seek to answer via internal evaluations are
found at JA 350-70-4.8e.

Higher headquarters issues

c. During internal evaluations, deficiencies may be found over which
the school does not have complete control (i.e., the classroom is
overcrowded). Correction of deficiencies may depend on support from
higher HQ. In such cases, it is imperative to bring these issues to the
attention of the senior leadership, and report them to higher HQ.
Maintain documentation of proponent reports to higher HQ, and their
response to issues.

8-7. Quality control criteria for internal evaluations. Training
development/task proponents perform QC actions as an inherent part of
the SAT process. When conducting an internal evaluation, ensure:

a. Internal evaluation project management plans were prepared in a
timely manner to impact the resource requirements.

b. Feedback channels were established that:

(1) Were accessible to units and school.

(2) Collected usable, pertinent data.

(3) Distributed evaluation results to appropriate
organizations/personnel.

c. Appropriate checklists of JAs were completed in support of the
evaluation.

d. As appropriate, training and testing events were observed, and
recommended improvements discussed with proponent.

e. Data collection was completed following the QC criteria in
chapter 4.

f. Data was analyzed sufficiently to develop viable training/TD
recommendations following the QC criteria in chapter 5.

g. Evaluation reports were prepared and distributed to appropriate
organizations following the QC criteria in chapter 6.

h. Recommended improvements were thoroughly discussed with
training/TD proponents.

i. Follow-ups were conducted to verify that proponent took action on
recommendations.

Chapter 9
External Evaluation

9-1. External evaluation overview. This chapter provides “how-to”
guidance for conducting external evaluations. External evaluations
cover the evaluations of education/training products used in the unit,
and the capability of soldiers to perform after receiving specified training.

9-2. External evaluation description. External evaluation determines if
soldiers can meet job performance requirements, need all the instruction
they received, or need any additional instruction they did not receive.
This process gathers data from the field to assess graduates’ on-the-job
performance in a job environment, and assess if the soldier can satisfy
real-world job performance requirements. Evaluators must realize that
the responses to the surveys are opinions of supervisors/soldiers in a
specific unit configuration that may or may not relate to wartime or
battlefield requirements, and may or may not be in a peacetime
environment. Likewise, it is important to compare what the field says is
being done with regard to a particular task, with what other
documentation indicates should be done to support a particular unit
mission, and/or equipment configuration, or operations capability.

External evaluation definition

a. External evaluation is the evaluation process that provides the
means to determine if the training received meets the needs of the
operational Army. This evaluation ensures the system continues to
effectively and cost-efficiently produce graduates who meet established
job performance requirements. External evaluations are considered a
quality improvement operation, ensuring soldiers and training products
continue to meet established job performance requirements, as well as
continually improving system quality.

Where conducted

b. External evaluations gather data from the field to assess soldiers’
on-the-job performance. A misconception often made is that external
evaluations are anything conducted outside of the proponent
schoolhouse. This is not true. External evaluations are conducted on
soldiers and/or supervisors after the individual has graduated from a
course and is performing their job/duty in the unit.

Objectives of external evaluations

c. External evaluations assist in learning how well graduates meet
job performance requirements. When conducting external evaluations,
look for both strengths and weaknesses of the training system. External
evaluations will help determine—

(1) How well the graduates are meeting job performance
requirements.

(2) Whether training is being provided that is not needed.

(3) Whether any needed training is not being provided.

(4) Ways to improve the graduate’s performance as well as the
training system.

9-3. External evaluation procedures. Table 9-1 provides the general
procedural guidance for conducting an external evaluation. This
process relies on input from the job environment (field) to establish how
well the soldiers are performing. Data is gathered and analyzed from
outside the instructional environment. The “how-to” procedures for a
majority of the steps below are discussed in chapters 3 through 7.

Table 9-1
Steps for conducting an external evaluation
Steps Actions
1. Prepare external Evaluation Project Management Plans.
2. Obtain senior leadership approval/sponsorship. Establish
feedback channels.
3. Prepare visitation plans for observations and/or interviews.
4. Prepare statements of work for contracted studies.
5. Prepare survey instrument.
6. Administer survey.
7. Collect data/information.
8. Analyze data/information.
9. Prepare evaluation reports.
10. Distribute evaluation reports.
11. Monitor compliance with recommendations.
12. Establish responsibilities for QA and QC.

9-4. Unit training evaluation. In the context of this pamphlet, when unit
training evaluation is discussed, the focus of TRADOC proponents is
on the training (and doctrinal) materials provided to support training in
units; the purpose is NOT to evaluate the unit. The assessment of unit
training and proficiency is, and must remain, a unit responsibility.
However, TRADOC must work in close cooperation with the unit to
provide the products (MTPs, STPs) necessary to support unit
evaluation efforts. Evaluation of the education/training products used in
the unit is critical to determine the effectiveness of collective and
individual task performance and products. The intent of a
training/education evaluation within the unit is to improve training and
task performance proficiency.

Factors to consider during an external evaluation

a. A unit evaluation of education/training products (i.e., students,
products, etc.) may require assessing one or several factors to
determine the effectiveness and efficiency. Such factors to consider
include:


(1) Proponent’s training programs and products.

(2) CALL trends.

(3) Feedback from units (e.g., commander).

(4) Responses provided to unit feedback.

(5) Feedback provided to training developers for needs analysis.

Combat Training Center interface

b. Many proponent schools can no longer afford to send evaluation teams to units to evaluate the validity and effectiveness of training/training products. Therefore, CTCs have become a critical source of that information. The CTC program provides highly realistic and stressful joint and combined arms training, IAW Army and Joint doctrine. Combat training center rotations and reviews provide an invaluable source of collective training feedback that may impact the determination of unit missions, critical collective tasks, and collective task analysis data used in the development of collective training products. Feedback from CTCs should trigger the proponent to revisit analysis and revise products. Evaluators should ensure:

(1) Training scenario missions and unit training products are tactically sound, based on approved doctrine, and developed IAW all provisions of TRADOC Reg 350-70 and the SAT process.

(2) Interface with CTCs occurs to receive unit task training performance feedback that identifies the need to develop or revise training scenario missions IAW TRADOC Reg 350-70 and the SAT process.

(3) The TRADOC Remedial Action Program (T-RAP) (TRADOC Reg 11-13) is supported through detailed reviews of priority issues to determine solutions in all doctrine, organizations, training, materiel, leadership and education, personnel, and facilities areas.

(4) CALL trends (provided by the CTCs' Concept of Operations collection efforts) are reviewed and, as appropriate, lessons learned are applied to training and doctrinal products.

Checklists

c. Job Aid 350-70-4.9a provides assistance in conducting external evaluations.

9-5. AUTOGEN software program.


Description

a. The AUTOGEN is a cost-effective, user-friendly, automated survey development program that gives each school its own survey development, data collection, and analysis capability. The AUTOGEN currently comprises two modules: one to accomplish job analysis surveys and the other to conduct external evaluation surveys. Conducting an external evaluation is critical to acquiring feedback from course graduates and their supervisors at least 6 months after graduation for use in improving the quality of Army education/training. Additionally, the feedback assists the proponent in ensuring the training meets the needs of the operational force. The AUTOGEN is now available for download from the ARI AUTOGEN download site. Use password 'qualityjob' (one word, not case sensitive).

b. The AUTOGEN provides—

(1) The proponent the capability to quickly and effectively obtain valid individual task performance data directly from active and RC soldiers, and their supervisors, in field units for a specific job, or for an entire MOS/AOC.

(2) A DCSOPS&T-approved template and incumbent background information that captures Frequency of Performance and Task Training Emphasis data.

(3) The proponent the capability to quickly and effectively build an external evaluation survey to assess the effectiveness and efficiency of provided individual education/training courses.

ARI server

c. The Army Research Institute has made available a server for administering AUTOGEN-generated job analysis and external evaluation surveys. This user-friendly site requires minimal interface with ARI to load surveys and download the survey answer file (data). It is anticipated that this capability will provide easier access to AC, as well as RC, soldiers, increasing their participation. Distribute surveys using either the ARI server or a center/school server. If the ARI server is used to distribute an AUTOGEN-generated survey, the Survey Control Number is automatically generated and provided to ARI, and the ARI server automatically maintains (backs up daily) the survey data and supporting files.
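
Before deeper analysis, the downloaded survey answer file lends itself to a quick automated summary. This pamphlet does not specify the AUTOGEN answer-file layout, so the following is a minimal sketch under an assumed layout: a comma-separated file with one row per respondent-task rating and hypothetical columns task_id, freq_performance, and training_emphasis. It computes mean Frequency of Performance and Task Training Emphasis ratings per task, the kind of first-pass summary an evaluator might run on data pulled from the server.

# Minimal sketch: summarize a downloaded survey answer file.
# The CSV layout (task_id, freq_performance, training_emphasis) is a
# hypothetical assumption; the actual AUTOGEN answer-file format may differ.
import csv
from collections import defaultdict

def summarize_answers(path):
    totals = defaultdict(lambda: {"freq": 0.0, "emph": 0.0, "n": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["task_id"]]
            t["freq"] += float(row["freq_performance"])
            t["emph"] += float(row["training_emphasis"])
            t["n"] += 1
    # Mean ratings per task; unusually low means may flag tasks for a
    # closer look during analysis.
    return {task: (t["freq"] / t["n"], t["emph"] / t["n"])
            for task, t in totals.items()}

if __name__ == "__main__":
    for task, (freq, emph) in sorted(summarize_answers("answers.csv").items()):
        print(f"{task}: frequency={freq:.2f}, emphasis={emph:.2f}")

Field and file names here are illustrative only; adapt them to the answer-file layout actually produced by the survey tool.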

Recommending AUTOGEN changes

d. Forward recommended changes to the AUTOGEN program to Commander, TRADOC (ATTG-CD (QAO)), 5 Fenwick Road, Fort Monroe, VA 23651-1049 by 1 May of each FY. A panel of HQ TRADOC and ARI representatives will prioritize suggestions and fund them according to requirements and available resources. Centers/schools are notified of all changes/updates to the software, which are downloaded by the school's designated AUTOGEN POCs (previously identified to HQ TRADOC). Recommendations are considered in terms of meeting the needs of all schools, not just a few.

9-6. Outputs of external evaluations. Possible outputs of an external evaluation and subsequent actions include:

a. Reports with identified deficiencies and corrective actions, and follow-ups on identified deficiencies.

b. Efficient and effective individual and collective training, training programs, and products.

c. Needs assessments.

d. Use of data for improvement of trainee performance and revision of learning materials.

9-7. External evaluation issues/concerns.

Possible causes of problems

a. Problems that may be identified during external evaluations include:

(1) Criterion test(s) do not measure the graduate's ability to meet job performance requirements.

(2) Terminal or enabling learning objectives do not reflect job performance requirements.

(3) Job performance requirements were incorrectly identified during job and task analyses.

(4) Job performance requirements changed after job and task analyses.

Managerial questions

b. Competent management is key to an effective evaluation. Management has the overall responsibility for ensuring that all components of the evaluation are fully integrated. External evaluations are specifically focused on answering two questions: "How good are our graduates and training support products?" and "Do we need to change anything?" Questions that managers may seek to answer via external evaluations are found at JA 350-70-4.9b.


9-8. Quality control criteria for external evaluations. Training development/task proponents perform QC actions as an inherent part of the SAT process. When conducting an external evaluation, ensure—

a. External Evaluation Project Management Plans were prepared early enough to influence resource requirements.

b. Senior leader approval/sponsorship was obtained.

c. Feedback channels were established that:

(1) Were accessible to units and school.

(2) Collected usable, pertinent data.

(3) Distributed evaluation results to appropriate organizations/personnel.

d. Appropriate checklists and JAs were completed in support of the evaluation.

e. Data collection instruments were prepared and data collection conducted following the QC criteria in chapter 4.

f. Visitation plans were prepared and followed.

g. Data were analyzed sufficiently to develop viable training/TD recommendations.

h. Evaluation reports were prepared and distributed to appropriate senior leaders and organizations following the QC criteria in chapter 6.

i. Recommended improvements were thoroughly discussed with training/TD proponents.

Chapter 10
Accreditation

10-1. Accreditation overview. This chapter provides an overview of accreditation as it applies to all TRADOC assigned and affiliated training institutions, and "how-to" guidance for centers and schools to conduct a self-assessment. Standards for self-assessing the following areas are discussed:


a. Professional Military Education (PME), also known as Institutional Leader Development.

b. Initial Military Training (IMT).

c. Combat Training Center programs.

10-2. Accreditation description. Accreditation is the TRADOC Commander's formal recognition of a training institution, which gives it the authority to conduct (or continue to conduct) education/training. It is the result of an evaluative process that certifies an institution's training program, processes, personnel, administration, operations, and logistical support (infrastructure) are adequate to support training to course standards, and that the institution adheres to TRADOC Command Training Guidance and directives. All AC and RC training institutions are reevaluated for accreditation every 3 years.

a. Accreditation is a QA function that helps assure the command that the training and education provided meet the competency needs of today's Army and the objective force. Accreditation assures:

(1) Standardized training and training products are doctrinally correct, and set the correct standards for the Army.

(2) Staffs, faculties, and observer/controllers are trained to standard, and provide quality instruction.

(3) Institutional infrastructure meets required standards.

(4) Training programs provide relevant, realistic training to meet opposing force (OPFOR)/Contemporary Operational Environment (COE) requirements.

(5) Training institutions are preparing to meet the training and education needs of the Stryker and Future Forces.

(6) Senior leaders receive feedback regarding significant training issues.

Accreditation approach

b. Teams of both military and civilian educators conduct all accreditation visits. Do not consider these visits as inspections only; the teams also provide staff assistance. Presently, PME and IMT are accredited at proponent schools and affiliated TASS battalions, Army Training Centers, and Noncommissioned Officer Academies (NCOAs). When sufficient resources are available, the accreditation will also include functional courses and any other training conducted by Army institutions. Organizations leading the accreditation effort are:

(1) Commander, AAC - Recommends IMT accreditation to Commander, TRADOC.

(2) Commander, CAC - Recommends PME and CTC Program accreditation to Commander, TRADOC.

(3) Proponent schools will continue to accredit RC training institutions.

(4) Currently, the U.S. Army Sergeants Major Academy will continue to accredit Phase 1 of the Advanced Noncommissioned Officer Course and Basic Noncommissioned Officer Course, and all of the Primary Leadership Development Course. The focus of the accreditation effort is on conduct of training; training support; proponent functions; and command training guidance and directives. The focus of CTC program accreditation is on Operations Groups, OPFOR/COE, TADSS, and facilities.

Guidelines for TRADOC accreditation

c. The Guidelines for TRADOC Accreditation establishes the policies, procedures, objectives, and responsibilities for accreditation of IMT and PME, as well as accreditation of CTC programs. The standards against which the programs will be evaluated are listed in these guidelines.

Accreditation standards guide

d. The Standards Guide contains the TRADOC accreditation standards, with references, evaluation criteria, mandatory comments, and guidance for evaluators and training institution staffs preparing for accreditation. There are two different Accreditation Standards Guides: one for self-assessment and accreditation of IMT/PME, and the other for the accreditation of the CTC program. The IMT/PME Accreditation Standards Guide provides the basis for conducting formal accreditation evaluations. The CTC Program Accreditation Standards Guide includes separate standards guides for each of the four CTC pillars evaluated.

Evaluation of accreditation standards

e. HQ TRADOC Form 350-70-4-2-R-E, Record for Evaluation of Accreditation Standards, is a form for entering all of the approved standards evaluated, and for noting accreditation evaluation findings for each. The reproducible version of this form is available at the back of this pamphlet. Base remarks on the guidance in the Accreditation Standards Guide and, if appropriate, attach detailed notes pertaining to the evaluation of particular standards.


Accreditation reports

f. There are several accreditation reports, either received from other organizations or written by the team. These formats include:

(1) A memorandum of Notification of Accreditation Status for the training institution, prepared by HQ TRADOC based on recommendations from the CAC and AAC QAOs. This memorandum advises the training institution of its accreditation level (i.e., status). Job Aid 350-70-4.10a provides a sample format for the Notification of Accreditation Status.

(2) The center/school QAOs are responsible for conducting accreditations of the RC TASS Training Battalions. They provide a memorandum to their commandant with accreditation findings and a recommended accreditation status. A format for this report is shown in JA 350-70-4.10b.

(3) The commandant of the proponent school provides a memorandum to the commander of the RC Training Battalion, awarding accreditation status. A sample format for this memorandum is provided at JA 350-70-4.10c.

10-3. Self-assessment description. Training institutions conduct self-assessments to ensure standards established by HQ TRADOC have been met. The self-assessment process is based on the Accreditation Standards Guide established by HQ TRADOC.

a. Conducting a self-assessment provides the institution an opportunity to assess its situation prior to an official accreditation visit. A self-assessment will:

(1) Demonstrate that the institution meets the accreditation standards established by HQ TRADOC QAO.

(2) Provide visibility and an objective, critical evaluation of the institution's performance (strengths, weaknesses, and challenges), with specific recommendations for improvement.

(3) Raise higher headquarters issues (HHI) that are beyond the scope of the training institution to the appropriate level.

(4) Identify ways for the school to sustain strengths, correct weaknesses, and improve training.


(5) Analyze the resources and effectiveness of the institution in fulfilling its mission.

(6) Serve as a baseline for measuring progress in the coming decade, and provide a sound basis for institutional planning and improvement.

(7) Demonstrate that the performance, competence, and achievements of students who complete programs are commensurate with the certificates, diplomas, and degrees awarded by the institution.

Self-assessment procedures

b. Prior to an accreditation visit, the education/training institution will complete a self-assessment based on the Accreditation Standards Guide. Job Aid 350-70-4.10d provides the detailed guidance for preparing and conducting a self-assessment, and preparing a self-assessment report. A self-assessment is conducted on every course within a 3-year cycle.

Self-assessment cover letter and report

c. A self-assessment report is prepared and provided to the Accreditation Team 60 days prior to the accreditation. The report identifies the results of the self-assessment, including the training institution's strengths and weaknesses, areas of deficiency, planned initiatives, and HHI. It also transmits documents needed by the accreditation team for review. The format of the self-assessment cover letter and report is provided at JA 350-70-4.10e.

10-4. Quality control criteria for accreditation. Training development/task proponents and training institutions perform quality control actions as an inherent part of the SAT process. When conducting a self-assessment and reporting the results, ensure—

a. Findings of previous self-assessments are considered.

b. Personnel at all levels of the institution are involved in conducting the self-assessment.

c. The self-assessment report identifies strengths, weaknesses, and challenges.

d. The self-assessment report includes recommendations for improvement.

e. The self-assessment report includes all the information and documents described by JA 350-70-4.10e.

f. Higher headquarters issues are reported to the appropriate headquarters.

Appendix A
References

Section I
Required Publications

TRADOC Reg 11-13
TRADOC Remedial Action Program (T-RAP)

TRADOC Reg 350-70
Systems Approach to Training Management, Processes, and Products

Section II
Related Publications

AR 5-5
Army Studies and Analysis

AR 10-87
Major Army Commands in the Continental United States

AR 25-55
The DA Freedom of Information Act Program

AR 200-1
Environmental Protection and Enhancement

AR 310-25
Dictionary of United States Army Terms

AR 310-50
Authorized Abbreviations, Brevity Codes, and Acronyms

AR 340-21
The Army Privacy Program

AR 350-1
Army Training

MIL-HDBK 29612-1A
Guidance for Acquisition of Training Data Products and Services


Note: This is the first of a 4-part DOD Handbook that supports MIL-PRF 29612B (Training Data Products) and its associated DIDs.

MIL-HDBK 29612-2A
Instructional Systems Development/Systems Approach to Training and Education

MIL-HDBK 29612-4A
Glossary for Training

TRADOC Reg 385-2
TRADOC Safety Program

Section III
Prescribed Forms

HQ TRADOC Form 350-70-4-1-R
Observation Worksheet

HQ TRADOC Form 350-70-4-2-R
Record for Evaluation of Accreditation Standards
_____________________________________________________________________

Appendix B
SAT Process

B-1. The SAT Process involves five training-related phases: evaluation, analysis, design, development, and implementation. Each phase and product developed has "minimum essential requirements" to meet. Table B-1 describes the five phases of SAT, and the specific chapters in TRADOC Reg 350-70 that address each phase.

Table B-1
SAT Process
Phases Description
1. Evaluation a. Evaluation, which is the focus of this pamphlet, provides the
means to determine how well the training takes place, Army
personnel/units perform, and products support training. Use the
process to—

(1) Identify deficiencies and corrective actions; follow up on identified deficiencies.

(2) Give assurance that the Army provides efficient, effective, and economical individual and collective education/training, training programs, and products.

(3) Ensure validated training courses/products are provided.

(4) Accredit training institutions.

(5) Certify instructors.

(6) Determine if training meets field needs.

b. See TRADOC Reg 350-70, Part III for more information on evaluation.

2. Analysis a. Provides the means to determine the need for training, who gets the training, and what critical tasks (collective and individual (including leader) tasks) and supporting skills and knowledge are trained.

b. The purpose of each type of training analysis:

(1) Needs analysis provides training and non-training solutions to the performance deficiency(ies), the requirement to improve training, and TD requirement(s). See TRADOC Reg 350-70, part IV, chapter IV-1 for more information on needs analysis.

(2) Mission analysis provides the unit mission and critical collective task lists. These tasks form the foundation for Army unit training. See TRADOC Reg 350-70, part V, chapter V-1 for more information on mission analysis.

(3) Collective critical task analysis provides the collective task performance specifications, and identifies the individual tasks performed as part of the critical collective task. See TRADOC Reg 350-70, part V, chapter V-2 for more information on collective critical task analysis.

(4) Job analysis provides the command-approved critical tasks for a specific job, or special category, and a collective-to-individual task matrix. These tasks form the foundation for Army individual education/training. See TRADOC Reg 350-70, part VI, chapter VI-1 for more information on job analysis.

(5) Individual critical task analysis provides the individual task performance specifications, including task performance standard, the STP task summary data, the individual-to-collective task matrix, and the individual-to-skill/knowledge matrix. See TRADOC Reg 350-70, part VI, chapter VI-2 for more information on individual critical task analysis.


3. Design a. Provides the means to establish when, where, and how the education/training is presented. Use the process to—

(1) Establish the CATS long- and short-range unit, individual, and self-development training strategies/milestones.

(2) Design efficient, effective, and economical education/training products such as individual training courses/courseware, TADSS, TSPs, and drills.

(3) Produce student performance measurement documents, e.g., tests, and Student Evaluation Plan.

(4) Identify all resources required to implement the education/training.

b. See TRADOC Reg 350-70, chapters IV-2, V-3, VI-4, and VI-7, for more information on design.

4. Development a. Provides the means to produce valid education/training products based on the design. Use the process to—

(1) Produce the education/training material, e.g., lesson plans, TSPs, training media/training aids, devices, simulators, and simulations.

(2) Validate the education/training materials.

(3) Reproduce the education/training products and materials.

(4) Acquire training resources.

(5) Prepare the instructors, training managers, staff, faculty, and cadre to present the education/training.

(6) Prepare facilities and equipment.

b. See TRADOC Reg 350-70, parts V and VI for more information on development.

5. Implementation a. The actual presentation of standardized education/training to soldiers and DA civilians. The process is used to—

(1) Distribute the education/training material.

(2) Schedule the education/training.

(3) Manage and administer the execution of the education/training, to include controlling progress through the training, maintaining records, and conducting AARs.

b. See TRADOC Reg 350-70, part VII for more information on implementation.

B-2. Various perspectives exist to view the SAT process. Two such graphical
representations are the integration of the SAT phases (fig B-1) and the SAT pyramid
(fig B-2).
[Figure: the SAT phases (analysis, design, development, and implementation) shown as a cycle around evaluation, with external evaluation encompassing the cycle.]

Figure B-1. Integration of the SAT phases

B-3. The SAT pyramid (fig B-2) shows how each phase of the SAT model builds upon
preceding phases. While the phases build upon each other, remember this is not
necessarily a linear process. Following all phases in order is not required; enter each
phase individually, as needed for revisions. The process is a continuous series of
analysis, design, development, implementation, and evaluation, with resultant revision
during any phase to maintain product currency and efficiency.


[Figure: the SAT pyramid, built on a base of resource constraints, with analysis, design, development, and implementation stacked as successive layers, evaluation alongside every layer, DOTMLPF feeding the structure, and trained units and trained soldiers at the apex.]

Figure B-2. SAT pyramid

B-4. TRADOC’s most significant products are efficiently and effectively trained soldiers,
leaders, and units that can perform their mission. For a complete description of the SAT
process and a flowchart of the linkages between the various SAT products, see
TRADOC Reg 350-70, Executive Summary. Flowcharts of the various TD processes
are in TRADOC Reg 350-70, appendix G.
______________________________________________________________________

Appendix C
Job Aid Hyperlinks

Number Title

JA 350-70-4.3a Format of an Evaluation Project Management Plan
JA 350-70-4.3b Guidance for Developing a Project Management Plan
JA 350-70-4.3c Master Evaluation Plan Format
JA 350-70-4.4a Developing Questionnaires and Interview Guides
JA 350-70-4.4b Interviewing Practices/Procedures
JA 350-70-4.4c Interview Guide
JA 350-70-4.4d Guidelines for Determining Sampling Size
JA 350-70-4.5 Summarizing Qualitative Data (Written Comments)
JA 350-70-4.6a Executive Summary Format
JA 350-70-4.6b Detailed Evaluation Report Format
JA 350-70-4.6c Format for Memorandum to Decisionmaker
JA 350-70-4.7 Format for Follow-Up Report
JA 350-70-4.8a Evaluator’s Checklist: SAT Process
JA 350-70-4.8b Evaluator’s Checklist: Training Institutions/Facilities
JA 350-70-4.8c Evaluator’s Checklist: Products
JA 350-70-4.8d Evaluator’s Checklist: TD Management
JA 350-70-4.8e Managerial Internal Evaluation Questions


JA 350-70-4.9a Evaluator's Checklist: External Evaluation
JA 350-70-4.9b Managerial External Evaluation Questions
JA 350-70-4.10a Notification of Accreditation Status
JA 350-70-4.10b Format of Proponent Accreditation Team Recommendation to their
Commandant of NCOA or RC TASS BN Accreditation
JA 350-70-4.10c Sample Final Accreditation Report to NCOA and/or RC TASS BN
by Proponent School
JA 350-70-4.10d Guidelines for Preparing and Conducting a Self-Assessment
JA 350-70-4.10e Cover Letter for Self-Assessment Report

______________________________________________________________________

Glossary

Section I
Abbreviations

AAC Army Accessions Command

AAR After-Action Review

AC Active Component

AOC area of concentration

ARI Army Research Institute

ASAT Automated Systems Approach to Training

AUTOGEN Automated Survey Generator

CAC Combined Arms Center (Ft Leavenworth, Kansas)

CALL Center for Army Lessons Learned

CATS Combined Arms Training Strategies

COE Contemporary Operational Environment

CTC Combat Training Center

DA Department of the Army

DCSOPS&T Deputy Chief of Staff for Operations & Training

DL Distributed Learning


FY fiscal year

HHI higher headquarters issues

HQ headquarters

IAW in accordance with

IMT Initial Military Training

ITP Individual Training Plan

JA Job Aid

MEP Master Evaluation Plan

MOS military occupational specialty

MTP Mission Training Plan

NCOA Noncommissioned Officer Academy

OPFOR Opposing Force

PME Professional Military Education

POC point of contact

POI program of instruction

QA quality assurance

QAE Quality Assurance Element

QAO Quality Assurance Office

QC quality control

RC Reserve Component

SAT Systems Approach to Training

SME subject matter expert

STP Soldier Training Publication


T-RAP TRADOC Remedial Action Program

TADSS Training Aids, Devices, Simulators, and Simulations

TASS The Army School System

TD training development

TDPMP Training Development Project Management Plan

TDY temporary duty

TRADOC U.S. Army Training and Doctrine Command

TRAS Training Requirements Analysis System

TSP training support package

Section II
Terms

accreditation
The recognition afforded an educational/training institution when it has met accepted
standards of quality applied by an accepted accreditation authority.

external evaluation
The evaluation process that provides the means to determine if the training received
meets the needs of the operational Army. This evaluation ensures the system
continues to effectively and cost-efficiently produce graduates who meet established job
performance requirements.

internal evaluation
Assessment of whether the training and training development objectives were met.
Internal evaluations also verify the effective use of the SAT process to meet minimum
essential analysis, design, development, implementation, and evaluation requirements.


FOR THE COMMANDER:

ANTHONY R. JONES
Lieutenant General, U.S. Army
Deputy Commanding General/
Chief of Staff

OFFICIAL:

//signed//
JANE F. MALISZEWSKI
Colonel, GS
Chief Information Officer

Observation Worksheet
(For use of this form, see TRADOC Pam 350-70-4; the proponent is DCSOPS&T)

SECTION I - Training Development

PART I - Administrative Data


1. School: 2. Course/POI:

3. Date: 4. Name of Evaluator:

PART II - Course Design/Implementation Plan


1. POI File No:   2. Lesson Plan (LP)/Training Support Package (TSP) Title:
3. LP/TSP approved IAW local policy? YES NO   4. Date LP/TSP approved:
5. LP/TSP risk assessed? YES NO   6. LP environmentally assessed? YES NO   7. POI time matches LP time? YES NO
8. POI Method of Instruction (MOI) matches LP Method of Instruction? YES NO
9. Foreign disclosure statement listed? YES NO
10. POI date: 11. CMP date:
12. Critical Task List date:
13. TLO/ELOs written IAW TR 350-70? YES NO If "NO", mandatory recommendation for rewrite:

14. TLO/ELOs Match POI? YES NO If "NO", mandatory comments and recommendations:

15. Is doctrine current? YES NO If "NO", mandatory comments and recommendations:

16. Does doctrine reflect COE? YES NO If "NO", mandatory comments and recommendations:

17. LP task on Critical Task List? YES NO If "NO", mandatory comments and recommendations:

18. LP task in POI? YES NO If "NO", mandatory comments and recommendations:

19. LP time/MOI on TMA sheet? YES NO If "NO", mandatory comments and recommendations:

Part III - Section I Performance Rating


GO - At least 75% of the evaluated items (Part II, Items 3-19) were rated "Go".
NO GO - Less than 75% of the evaluated items were rated "Go". Command emphasis needed.

PERFORMANCE RATING GO NO GO

HQ TRADOC Form 350-70-4-1-R-E (Jul 03) Page 1 of 6


SECTION II - Training Management
PART I - Administrative Data
1. School: 2. Course/POI:

3. Date: 4. Name of Evaluator:

PART II - Training Resource Material


1. LP equipment in POI? YES NO If "NO", mandatory comments and recommendations:

2. LIN/nomen IAW FedLog? YES NO If "NO", mandatory comments and recommendations:

3. POI reflects updated AV equipment requirements/Classroom XXI requirements: YES NO


4. LP facilities in POI? YES NO If "NO", mandatory comments and recommendations:

5. LP ammo in POI? YES NO If "NO", mandatory comments and recommendations:

6. LP TADSS in POI? YES NO If "NO", mandatory comments and recommendations:

Part III - Training Ratios


Required Assigned Available Comments
a. Instructor/Student
b. Equipment/Student
c. Drill/Student
d. Operator/Student
Part IV - Other Areas
Go   No Go   NA   Comments
1. Facilities
2. Safety
3. Other (specify):
PART V - Training Implementation
1. Deviation from LP/POI:
1a. Caused by: 1b. Explanation: 1c. Status
Reported: YES/NO
Recurring: YES/NO
Safety Impact: YES/NO
Part VI - Section II Performance Rating
GO - At least 75% of the evaluated items (Part II, 1-6) were rated "Go"; and all applicable sections in Parts III and IV match
the LP/TSP/POI or have a waiver.

NO GO - Less than 75% of the evaluated items were rated "Go" or waiver(s) not available.

GO NO GO

HQ TRADOC Form 350-70-4-1-R-E (Jul 03) Page 2 of 6


SECTION III - Instructor Checklist
PART I - Administrative Data
1. School/Course: 2. Class Number: 3. Date:

4. Name of Instructor/SGL: 5. Rank/MOS/SC: 6. Instructor qualified IAW TR 350-70? YES NO
PART II - Evaluation
A. Administrative Preparation          Go   No Go   NA   Comments
1. Visitor's book was current and
available.
a. TSP, Student H/O at visitor's area.
b. Training schedule available.
c. ITC Certificate or Memo of certified
instructors.
d. Visitor's sign in sheet.
e. Student Roster.
f. Range Safety/Demo Certification.
g. Inclement Weather Plan.
h. Risk Management Worksheet/Daily Risk
Assessment.

i. Medevac Plan.

B. Classroom Preparation          Go   No Go   NA   Comments
1. Lesson plan current, DOTD and DOT
approved, and IAW POI.
2. Classroom had adequate lighting, neat,
orderly, free from noise and interruptions.
Seating arrangement appropriate. Class
prepared prior to training.

3. Training materials, aids, and safety


equipment available and serviceable prior
to training.

C. Introduction          Go   No Go   NA   Comments
1. Used a motivational statement that
explains the relevance and importance of
the task.
2. Displayed and clearly stated the
Learning Objectives (Action, Condition,
Standard), and briefly outlined the
sequence of the lesson.
3. Stated the Risk Assessment Level,
warnings, safety hazards, and the
environmental considerations.
4. Explained how the objective would be
tested.
D. Demonstration Techniques          Go   No Go   NA   Comments
1. Ensured students could see all parts of
demonstration.
2. Steps were properly demonstrated.
3. Students were involved in
demonstration, if appropriate.
4. Assisted students as needed.
5. Gave on-the-spot corrections and
praise.
HQ TRADOC Form 350-70-4-1-R-E (Jul 03) Page 3 of 6
Section III - Instructor Checklist (cont)
E. Hands-on Training Method          Go   No Go   NA   Comments
1. Summarized points covered during the
demonstration.
2. Gave detailed directions before the
practical exercise.
3. Ensured students performed the
practical exercise correctly.
4. Provided timely feedback.
5. Encouraged group members to
participate.
6. Conducted an after action review with
the students after practical exercise.
F. Communications Skills          Go   No Go   NA   Comments
1. Used correct enunciation and grammar.
2. Did not excessively use distracting
mannerisms such as "Ah", "OK" and "You
know".
3. Instructor's voice quality, volume, and
variations (pitch, rate, and inflection) were
adequate.
G. Question/Answer Techniques          Go   No Go   NA   Comments
1. Questions were phrased clearly and to
the point (ask, pause, call, respond,
evaluate).
2. Questions were appropriate for the
lesson.
3. Covered all key points with questions.
4. Student's questions were answered
adequately.
H. Presentation Skills          Go   No Go   NA   Comments
1. Made eye contact with all students.
2. Movement and gestures were natural
and appropriate.
3. Instructor was poised and enthusiastic.
I. Use of Training Aids/Materials          Go   No Go   NA   Comments
1. Training aids, instructional materials,
equipment listed in POI were used
appropriately.
2. Whiteboard and/or other visual aids
were used in an effective manner.

J. Classroom Management          Go   No Go   NA   Comments
1. Maintained proper control of the class.
2. Used appropriate techniques to assist
and motivate students.
3. Managed time appropriately; lesson
was well paced.
4. Encouraged student participation.
K. Test Management          Go   No Go   NA   Comments
1. Maintained accountability of tests.
2. Complied with Test Administration
Guide (TAG).
3. Test matched method of training.
HQ TRADOC Form 350-70-4-1-R-E (Jul 03) Page 4 of 6
Section III - Instructor Checklist (cont)
4. Test evaluated what was trained.
5. Conducted AAR with students.
L. Instructor Preparation          Go   No Go   NA   Comments
1. Demonstrated knowledge of class
material.
2. Explained key performance points.
3. Followed the sequence as outlined in
the lesson plan.
4. Covered all objectives.
5. Used smooth transitions.
6. Put training activity into job context at
least once.
7. Ensured all students could see and
hear all instruction.
8. Properly used internal summaries.
9. Properly conducted lesson summary
(see 9a - 9d below).
9a. Restated action.
9b. Restated main learning steps.
9c. Checked on learning.
9d. Provided closing summary.
M. Personal Qualities          Go   No Go   NA   Comments
1. Instructor's professionalism set the
proper example for bearing, behavior, and
appearance.
2. Showed respect to students.
3. Established a positive rapport with
students.
Part III - AAR with Instructor

Part IV - Section III Performance Rating

GO - At least 75% of the evaluated items (Part II) were rated "Go".

NO GO - Less than 75% of the evaluated items were rated "Go". Command emphasis needed.

PERFORMANCE RATING: GO NO GO

Part V - Backbrief
Acknowledgement of Evaluation
Person briefed: Position: Date:

Signature of Evaluator: Signature of Course Manager:

HQ TRADOC Form 350-70-4-1-R-E (Jul 03) Page 5 of 6


SECTION IV - Overall Performance Rating
PART I - Administrative Data
1. School: 2. Course/POI:

3. Date: 4. Name of Evaluator:

PART II - Ratings

Section I: Training Development GO NO GO


Section II: Training Management GO NO GO
Section III: Instructor Checklist GO NO GO
Overall Rating: GO NO GO

NOTE: Overall performance is derived from the evaluation in Sections I, II, and III. Items marked "Not Applicable" are not counted when computing the overall performance rating.

HQ TRADOC Form 350-70-4-1-R (Jul 03) Page 6 of 6
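
The performance-rating rule used throughout the preceding form (a section rates GO when at least 75% of its evaluated items are rated "Go", with items marked "Not Applicable" excluded from the computation) can be illustrated in a few lines. The function below is a minimal sketch of that arithmetic only; it is not part of the form itself, and its handling of an all-NA section is an assumption the form does not address.

# Minimal sketch of the form's GO/NO GO rule: a section rates GO when at
# least 75% of its applicable items are "Go"; "NA" items are excluded.

def section_rating(items):
    """items: list of "Go", "No Go", or "NA" marks for one section."""
    applicable = [mark for mark in items if mark != "NA"]
    if not applicable:
        # Nothing was evaluated; treating this as NA is an assumption,
        # since the form does not cover the all-NA case.
        return "NA"
    go_share = sum(1 for mark in applicable if mark == "Go") / len(applicable)
    return "GO" if go_share >= 0.75 else "NO GO"

if __name__ == "__main__":
    # Example: 7 of 9 applicable items rated "Go" (77.8%) -> GO.
    marks = ["Go"] * 7 + ["No Go"] * 2 + ["NA"] * 3
    print(section_rating(marks))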


RECORD FOR EVALUATION OF ACCREDITATION STANDARDS
for Initial Military Training, Reclassification Training, and Professional Military Education
(The prescribing directive for this form is TRADOC Pamphlet 350-70-4; the proponent is DCSOPS&T)

Administrative Data
1. Organization being evaluated:
Name:
Location/address:

2. Accrediting agency name:


3. Evaluator: Phone: DSN
E-mail address: Comm
Address:

Reporting Focus

Type of Training (Check one):
  Initial Military Training: BCT / OSUT / AIT / WOCS / OCS
  Reclassification Training
  Professional Military Education: NCOES / WOES / OES (indicate education system)

Area Evaluated (Check one):
  Conduct of Training
  Training Support
  Proponent Functions

Recommendation

Professional Accreditation     Conditional Accreditation     Full Accreditation

Remarks:

(Attach additional comments keyed to item numbers.)

HQ TRADOC Form 350-70-4-2-R-E (Oct 03)


CONDUCT OF TRAINING

STD NO.   Standard   Met   Met w/cmt   Not Met   N/A-N/O   HHI

TRAINING SUPPORT

STD NO.   Standard   Met   Met w/cmt   Not Met   N/A-N/O   HHI

PROPONENT FUNCTIONS

STD NO.   Standard   Met   Met w/cmt   Not Met   N/A-N/O   HHI

HQ TRADOC Form 350-70-4-2-R-E (Oct 03) Page 2
