
Downloaded from http://www.everyspec.com

MIL-STD-785B
15 September 1980

SUPERSEDING
MIL-STD-785A
28 March 1969

MILITARY STANDARD

RELIABILITY PROGRAM
FOR
SYSTEMS AND EQUIPMENT
DEVELOPMENT AND PRODUCTION



DEPARTMENT OF DEFENSE
Washington, D. C. 20301

RELIABILITY PROGRAM FOR SYSTEMS AND EOUIPMENT DEVELOPMENT AND PRODUCTION

MIL-STD-785B

1. This Military Standard is approved for use by all Departments and
Agencies of the Department of Defense.

2. Beneficial comments (recommendations, additions, deletions) and any
pertinent data which may be of use in improving this document should be
addressed to: ASD/ENESS, Wright-Patterson AFB, OH 45433 by using the
self-addressed Standardization Document Improvement Proposal (DD Form 1425)
appearing at the end of this document or by letter.

FOREWORD

This military standard consists of basic application requirements, specific
tailorable reliability program tasks, and an appendix which includes an
application matrix and guidance and rationale for task selection.

Effective reliability programs must be tailored to fit program needs and
constraints, including life cycle costs (LCC). This document is intentionally
structured to discourage indiscriminate blanket applications. Tailoring is
forced by requiring that specific tasks be selected and, for those tasks
identified, that certain essential information relative to implementation of
the task be provided by the procuring activity.

Many of the tasks solicit facts and recommendations from the contractors on
the need for, and scope of, the work to be done rather than requiring that a
specific task be done in a specific way. The selected tasks can be tailored to
meet specific and peculiar program needs.

Although not all-encompassing, the guidance and rationale provided in
Appendix A is intended to serve as an aid in selecting and scoping the tasks
and requirements.

This revision contains the following fundamental changes from MIL-STD-785A:

a. Increased emphasis has been placed on reliability engineering tasks and
tests. The thrust is toward prevention, detection, and correction of design
deficiencies, weak parts, and workmanship defects. Emphasis on reliability
accounting has been retained, and expanded to serve the needs of acquisition,
operation, and support management, but cost and schedule investment in
reliability demonstration (qualification and acceptance) tests must be made
clearly visible and carefully controlled.

b. A sharp distinction has been established between basic reliability and
mission reliability. Measures of basic reliability such as
Mean-Time-Between-Failures (MTBF) now include all item life units (not just
mission time) and all failures within the item (not just mission-critical
failures of the item itself). Basic reliability requirements apply to all
items.

c. Mission reliability (MIL-STD-785A "Reliability") is now one of four
system reliability parameters. The other three are directly related to
operational readiness, demand for maintenance, and demand for logistic
support. Separate requirements will be established for each reliability
parameter that applies to a system, and translated into basic reliability
requirements for subsystems, equipments, components, and parts.


CONTENTS

1.        SCOPE
1.1       Purpose
1.2       Applicability
1.3       Method of reference

2.        REFERENCED DOCUMENTS

3.        TERMS, DEFINITIONS, AND ACRONYMS

4.        GENERAL REQUIREMENTS
4.1       Reliability program
4.2       Program requirements
4.2.1     Reliability engineering
4.2.2     Reliability accounting
4.3       Reliability program interfaces
4.4       Quantitative requirements
4.4.1     Categories of quantitative requirements
4.4.2     System reliability parameters
4.4.3     Statistical criteria

5.        TASK DESCRIPTIONS

TASK SECTION 100 - PROGRAM SURVEILLANCE AND CONTROL
101       RELIABILITY PROGRAM PLAN
102       MONITOR/CONTROL OF SUBCONTRACTORS AND SUPPLIERS
103       PROGRAM REVIEWS
104       FAILURE REPORTING, ANALYSIS, AND CORRECTIVE ACTION SYSTEM (FRACAS)
105       FAILURE REVIEW BOARD (FRB)

TASK SECTION 200 - DESIGN AND EVALUATION
201       RELIABILITY MODELING
202       RELIABILITY ALLOCATIONS
203       RELIABILITY PREDICTIONS
204       FAILURE MODES, EFFECTS, AND CRITICALITY ANALYSIS (FMECA)
205       SNEAK CIRCUIT ANALYSIS (SCA)
206       ELECTRONIC PARTS/CIRCUITS TOLERANCE ANALYSIS
207       PARTS PROGRAM
208       RELIABILITY CRITICAL ITEMS
209       EFFECTS OF FUNCTIONAL TESTING, STORAGE, HANDLING, PACKAGING,
          TRANSPORTATION, AND MAINTENANCE

TASK SECTION 300 - DEVELOPMENT AND PRODUCTION TESTING
301       ENVIRONMENTAL STRESS SCREENING (ESS)
302       RELIABILITY DEVELOPMENT/GROWTH TEST (RDGT) PROGRAM
303       RELIABILITY QUALIFICATION TEST (RQT) PROGRAM
304       PRODUCTION RELIABILITY ACCEPTANCE TEST (PRAT) PROGRAM

APPENDIX A

APPLICATION GUIDANCE FOR IMPLEMENTATION OF RELIABILITY PROGRAM REQUIREMENTS

10.       GENERAL
10.1      Scope
10.2      Purpose
10.3      User

20.       REFERENCE DOCUMENTS

30.       DEFINITIONS

40.       TASK SELECTION
40.1      Selection criteria
40.2      Application matrix for program phases
40.3      Task prioritization

50.       RATIONALE AND GUIDANCE FOR TASK SECTIONS
50.1      Task section 100 - Program surveillance and control
50.1.1    Structuring the program requirements
50.1.1.1  Identifying and quantifying reliability needs
50.1.1.2  Selecting tasks to fit the needs
50.1.1.3  Reliability program plan (task 101)
50.1.1.4  Monitor/control of subcontractors and suppliers (task 102)
50.1.2    Program management
50.1.2.1  Continual program assessment
50.1.2.2  Program reviews (task 103)
50.1.2.3  Failure reporting, analysis, and corrective action systems
          (FRACAS) (task 104)
50.1.2.4  Failure review board (FRB) (task 105)
50.1.2.5  Government plant representatives
50.1.3    Conducting the program
50.1.3.1  Essential considerations
50.1.3.2  Preparing for follow-on phases
50.2      Task section 200 - Design and evaluation
50.2.1    General considerations
50.2.1.1  Criteria and analyses are resource allocation tools
50.2.1.2  Analyses as work direction tools
50.2.1.3  Analysis applicability
50.2.2    Models, allocations, and predictions
50.2.2.1  Reliability model (task 201)
50.2.2.2  Tops down allocation (task 202)
50.2.2.3  Reliability predictions (task 203)
50.2.3    Configuration analyses
50.2.3.1  Failure modes, effects, and criticality analysis (FMECA) (task 204)
50.2.3.2  Sneak circuit analysis (SCA) (task 205)
50.2.3.3  Electronic parts/circuits tolerance analysis (task 206)
50.2.4    Design criteria
50.2.4.1  Failure tolerant design criteria improve mission reliability
50.2.4.2  Parts selection/application criteria (task 207)
50.2.4.3  Reliability critical items (task 208)
50.2.4.4  Life criteria (task 209)
50.3      Task section 300 - Development and production testing
50.3.1    General considerations
50.3.1.1  Reliability testing
50.3.1.2  Integrated testing
50.3.1.3  Test realism
50.3.1.4  Reliability estimates and projections
50.3.1.5  Relevant failures and chargeable failures
50.3.1.6  Statistical test plans
50.3.1.7  Independent testing
50.3.1.8  Testing compliance
50.3.1.9  Documentation
50.3.2    Reliability engineering tests
50.3.2.1  Environmental stress screening (ESS) (task 301)
50.3.2.2  Reliability development/growth testing (RDGT) (task 302)
50.3.3    Reliability accounting tests
50.3.3.1  Reliability qualification test (RQT) (task 303)
50.3.3.2  Production reliability acceptance test (PRAT) (task 304)

60.       DATA ITEM DESCRIPTIONS (DID)

TABLES

A-1       Application matrix

RELIABILITY PROGRAM FOR SYSTEMS AND EQUIPMENT
DEVELOPMENT AND PRODUCTION

1. SCOPE

1.1 Purpose. This standard provides general requirements and specific tasks
for reliability programs during the development, production, and initial
deployment of systems and equipment.

1.2 Applicability.

1.2.1 Tasks described in this standard are to be selectively applied in DOD
contract-definitized procurements, requests for proposals, statements of
work, and Government in-house developments requiring reliability programs for
the development, production, and initial deployment of systems and equipment.
The word "contractor" herein also includes Government activities developing
military systems and equipment.

1.2.2 Task descriptions are intended to be tailored as required by governing
regulations and as appropriate to particular system or equipment program
type, magnitude, and funding. When preparing his proposal, the contractor may
include additional tasks or task modifications with supporting rationale for
each addition or modification.

1.2.2.1 The "Details To Be Specified" paragraph under each task description
is intended for listing the specific details, additions, modifications,
deletions, or options to the requirements of the task that should be
considered by the procuring activity when tailoring the task description to
fit program needs. Details annotated by an (R) are essential and shall be
provided to the contractor for proper implementation of the task.

1.2.3 Application guidance and rationale for selecting tasks to fit the
needs of a particular reliability program is included in appendix A; this
appendix is not contractual.

1.3 Method of reference. When specifying the task descriptions of this
standard as requirements, both the standard and the specific task description
number(s) are to be cited. Applicable "Details To Be Specified" shall be
included in the statement of work.

2. REFERENCED DOCUMENTS

2.1 Issues of documents. The following documents, of the issue in effect on
the date of invitation for bids or request for proposal, form a part of this
standard to the extent specified herein:

STANDARDS

MILITARY

MIL-STD-105   Sampling Procedures and Tables for Inspection by Attributes
MIL-STD-721   Definitions of Terms For Reliability and Maintainability
MIL-STD-781   Reliability Design Qualification and Production Acceptance
              Tests: Exponential Distribution
MIL-STD-965   Parts Control Program


PUBLICATIONS

MILITARY HANDBOOK

MIL-HDBK-217  Reliability Prediction of Electronic Equipment

(Copies of specifications, standards, drawings, and publications required by
contractors in connection with specific procurement functions should be
obtained from the procuring activity or as directed by the contracting
officer.)

3. TERMS, DEFINITIONS, AND ACRONYMS

3.1 Terms. The terms used herein are defined in MIL-STD-721.

3.2 Definitions. Definitions applicable to this standard are as follows:

a. Tailoring. The process by which the individual requirements (sections,
paragraphs, or sentences) of the selected specifications and standards are
evaluated to determine the extent to which each requirement is most suitable
for a specific materiel acquisition and the modification of these
requirements, where necessary, to assure that each tailored document invoked
states only the minimum needs of the Government. Tailoring is not a license
to specify a zero reliability program, and must conform to provisions of
existing regulations governing reliability programs.

b. Acquisition phases.

(1) Conceptual (CONCEPT). The identification and exploration of alternative
solutions or solution concepts to satisfy a validated need.

(2) Validation (VALID). The period when selected candidate solutions are
refined through extensive study and analyses; hardware development, if
appropriate; test; and evaluations.

(3) Full-scale engineering development (FSED). The period when the system
and the principal items necessary for its support are designed, fabricated,
tested, and evaluated.

(4) Production (PROD). The period from production approval until the last
system is delivered and accepted.

c. Reliability accounting. That set of mathematical tasks which establish
and allocate quantitative reliability requirements, and predict and measure
quantitative reliability achievements.

d. Reliability engineering. That set of design, development, and
manufacturing tasks by which reliability is achieved.

e. Reliability. The duration or probability of failure-free performance
under stated conditions. Basic reliability terms, such as
Mean-Time-Between-Failures (MTBF) or Mean-Cycles-Between-Failures (MCBF),
shall include all item life units (not just mission time) and all failures
within the item (not just mission-critical failures at the item level of
assembly). Basic reliability requirements shall be capable of describing item
demand for maintenance manpower (e.g., Mean-Time-Between-Maintenance-Actions
(MTBMA)). The other system reliability parameters shall employ clearly
defined subsets of all item life units and all failures.
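The difference between "all life units, all failures" and a mission-oriented subset can be made concrete with a short numerical sketch. This illustration is not part of the standard; the failure log, its field layout, and all counts are hypothetical:

```python
# Illustration only (not part of MIL-STD-785B): basic MTBF counts ALL item
# life units and ALL failures, while a mission-oriented measure such as MTBCF
# counts only mission time and mission-critical failures. Data are invented.

# (total life units [h], mission hours, total failures, mission-critical failures)
item_log = [
    (1200.0, 400.0, 3, 1),
    ( 900.0, 350.0, 2, 0),
    (1500.0, 500.0, 4, 2),
]

total_life        = sum(rec[0] for rec in item_log)   # all life units
total_mission     = sum(rec[1] for rec in item_log)   # mission-time subset
all_failures      = sum(rec[2] for rec in item_log)   # every failure counts
critical_failures = sum(rec[3] for rec in item_log)   # mission-critical subset

basic_mtbf = total_life / all_failures       # 3600 h / 9 failures = 400 h
mtbcf = total_mission / critical_failures    # 1250 h / 3 failures ≈ 417 h

print(f"basic MTBF = {basic_mtbf:.1f} h, MTBCF = {mtbcf:.1f} h")
```

The two measures can move independently: correcting non-critical failure modes raises basic MTBF without changing MTBCF, which is one reason the standard keeps the requirements separate.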

f. Mission reliability. The ability of an item to perform its required
functions for the duration of a specified mission profile.

g. Item life units. A measure of use duration applicable to the item (e.g.,
operating hours, cycles, distance, rounds fired, attempts to operate).

h. Environmental stress screening (ESS). A series of tests conducted under
environmental stresses to disclose weak parts and workmanship defects for
correction.

i. Reliability development/growth test (RDGT). A series of tests conducted
to disclose deficiencies and to verify that corrective actions will prevent
recurrence in the operational inventory. (Also known as TAAF testing.)

j. Reliability qualification test (RQT). A test conducted under specified
conditions, by, or on behalf of, the Government, using items representative
of the approved production configuration, to determine compliance with
specified reliability requirements as a basis for production approval. (Also
known as a "Reliability Demonstration" or "Design Approval" test.)

k. Production reliability acceptance test (PRAT). A test conducted under
specified conditions, by, or on behalf of, the Government, using delivered or
deliverable production items, to determine the producer's compliance with
specified reliability requirements.

3.3 Acronyms. Acronyms used in this document are defined as follows:

CDR     Critical Design Review
CDRL    Contract Data Requirements List
CFE     Contractor Furnished Equipment
DID     Data Item Description(s)
ESS     Environmental Stress Screening
FMECA   Failure Modes, Effects, and Criticality Analysis(es)
FRACAS  Failure Reporting, Analysis(es), and Corrective Action Systems
FRB     Failure Review Board
FSED    Full Scale Engineering Development
GFE     Government Furnished Equipment
GIDEP   Government/Industry Data Exchange Program
GPR     Government Plant Representative
LSAP    Logistic Support Analysis Program
LSAR    Logistic Support Analysis Records
MCBF    Mean-Cycles-Between-Failures
MCSP    Mission Completion Success Probability
MTBCF   Mission-Time-Between-Critical-Failures
MTBDE   Mean-Time-Between-Downing-Events
MTBF    Mean-Time-Between-Failures
MTBMA   Mean-Time-Between-Maintenance-Actions
MTBR    Mean-Time-Between-Removals
PA      Procuring Activity (including Program/Project Offices)


PCB     Parts Control Board
PDR     Preliminary Design Review
PPSL    Program Parts Selection List
PRAT    Production Reliability Acceptance Test
PRST    Probability Ratio Sequential Test
RDGT    Reliability Development/Growth Test
RFP     Request For Proposal
RQT     Reliability Qualification Test
SCA     Sneak Circuit Analysis(es)
SOW     Statement Of Work
TAAF    Test, Analyze, and Fix

4. GENERAL REQUIREMENTS

4.1 Reliability program. The contractor shall establish and maintain an
efficient reliability program to support economical achievement of overall
program objectives. To be considered efficient, a reliability program shall
clearly: (1) improve operational readiness and mission success of the major
end-item; (2) reduce item demand for maintenance manpower and logistic
support; (3) provide essential management information; and (4) hold down its
own impact on overall program cost and schedule.

4.2 Program requirements. Each reliability program shall include an
appropriate mix of reliability engineering and accounting tasks depending on
the life cycle phase. These tasks shall be selected and tailored according to
the type of item (system, subsystem, or equipment) and for each applicable
phase of the acquisition (CONCEPT, VALID, FSED, and PROD). They shall be
planned, integrated, and accomplished in conjunction with other design,
development, and manufacturing functions. The overall acquisition program
shall include the resources, schedule, management structure, and controls
necessary to ensure that specified reliability program tasks are
satisfactorily accomplished.

4.2.1 Reliability engineering. Tasks shall focus on the prevention,
detection, and correction of reliability design deficiencies, weak parts, and
workmanship defects. Reliability engineering shall be an integral part of the
item design process, including design changes. The means by which reliability
engineering contributes to the design, and the level of authority and
constraints on this engineering discipline, shall be identified in the
reliability program plan. An efficient reliability program shall stress early
investment in reliability engineering tasks to avoid subsequent costs and
schedule delays.

4.2.2 Reliability accounting. Tasks shall focus on the provision of
information essential to acquisition, operation, and support management,
including properly defined inputs for estimates of operational effectiveness
and ownership cost. An efficient reliability program shall provide this
information while ensuring that cost and schedule investment in efforts to
obtain management data (such as demonstrations, qualification tests, and
acceptance tests) is clearly visible and carefully controlled.

4.3 Reliability program interfaces. The contractor shall utilize reliability
data and information resulting from applicable tasks in the reliability
program to satisfy LSAP requirements. All reliability data and information
used and provided shall be based upon, and traceable to, the outputs of the
reliability program for all logistic support and engineering activities
involved in all phases of the system/subsystem/equipment acquisition.

4.4 Quantitative requirements. The system/subsystem/equipment reliability
requirements shall be specified contractually. Quantitative reliability
requirements for the system, all major subsystems, and equipments shall be
included in appropriate sections of the system and end item specifications.
The sub-tier values not established by the procuring activity shall be
established by the system or equipment contractor at a contractually
specified control point prior to detail design.

4.4.1 Categories of quantitative requirements. There are three different
categories of quantitative reliability requirements: (1) operational
requirements for applicable system reliability parameters; (2) basic
reliability requirements for item design and quality; and (3) statistical
confidence/decision risk criteria for specific reliability tests. These
categories must be carefully delineated, and related to each other by clearly
defined audit trails, to establish clear lines of responsibility and
accountability.

4.4.2 System reliability parameters. System reliability parameters shall be
defined in units of measurement directly related to operational readiness,
mission success, demand for maintenance manpower, and demand for logistic
support, as applicable to the type of system. Operational requirements for
each of these parameters shall include the combined effects of item design,
quality, operation, maintenance, and repair in the operational environment.
Examples of system reliability parameters include: readiness,
Mean-Time-Between-Downing-Events (MTBDE); mission success,
Mission-Time-Between-Critical-Failures (MTBCF); maintenance demand,
Mean-Time-Between-Maintenance-Actions (MTBMA); and logistics demand,
Mean-Time-Between-Removals (MTBR).
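A minimal numerical sketch of how these four parameters are formed from operating experience follows. It is offered for illustration only; the operating hours and event counts are invented, not drawn from the standard:

```python
# Hypothetical sketch of the four system reliability parameters named above.
# Each parameter divides a defined pool of life units by a clearly defined
# subset of events; all counts below are invented for illustration.

operating_hours = 10_000.0   # total system life units accumulated

downing_events      = 8      # events leaving the system not operationally ready
critical_failures   = 4      # failures that would abort or degrade a mission
maintenance_actions = 25     # all maintenance actions, critical or not
removals            = 12     # items removed, driving logistic support demand

mtbde = operating_hours / downing_events        # readiness:          MTBDE
mtbcf = operating_hours / critical_failures     # mission success:    MTBCF
mtbma = operating_hours / maintenance_actions   # maintenance demand: MTBMA
mtbr  = operating_hours / removals              # logistics demand:   MTBR

for name, hours in (("MTBDE", mtbde), ("MTBCF", mtbcf),
                    ("MTBMA", mtbma), ("MTBR", mtbr)):
    print(f"{name:5s} = {hours:7.1f} h")
```

In practice each parameter would use the life-unit subset appropriate to it (e.g., mission time rather than total operating time for MTBCF), per the definitions in 3.2; a single pool is used here only to keep the sketch short.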

4.4.3 Statistical criteria. Statistical criteria for reliability
demonstrations, Reliability Qualification Tests (RQT), and Production
Reliability Acceptance Tests (PRAT) shall be carefully tailored to avoid
driving cost or schedule without improving reliability. Such criteria include
specified confidence levels or decision risks, Upper Test MTBF, Lower Test
MTBF, etc., as embodied in statistical test plans. They shall be clearly
separated from specified values and minimum acceptable values to prevent test
criteria from driving item design. They shall be selected and tailored
according to the degree that confidence intervals are reduced by each
additional increment of total test time.
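The diminishing-returns point can be made concrete. The sketch below is not part of this standard or of MIL-STD-781; it computes a two-sided confidence interval on MTBF under the exponential-distribution assumption used by MIL-STD-781 test plans, approximating chi-square quantiles with the Wilson-Hilferty transform so that only the Python standard library is needed. The test times and failure counts are invented:

```python
# Sketch (illustration only): how the confidence interval on MTBF narrows as
# total test time grows, under an exponential time-to-failure assumption.
# Chi-square quantiles are approximated via the Wilson-Hilferty transform.

from statistics import NormalDist
import math

def chi2_quantile(p: float, k: int) -> float:
    """Approximate p-quantile of a chi-square with k degrees of freedom."""
    z = NormalDist().inv_cdf(p)
    return k * (1.0 - 2.0 / (9.0 * k) + z * math.sqrt(2.0 / (9.0 * k))) ** 3

def mtbf_interval(total_time: float, failures: int, conf: float = 0.90):
    """Two-sided CI for MTBF from a failure-truncated exponential test."""
    alpha = 1.0 - conf
    lower = 2.0 * total_time / chi2_quantile(1.0 - alpha / 2.0, 2 * failures)
    upper = 2.0 * total_time / chi2_quantile(alpha / 2.0, 2 * failures)
    return lower, upper

# Same observed point estimate (200 h) with increasing total test time:
for t, r in [(1000.0, 5), (4000.0, 20), (16000.0, 80)]:
    lo, hi = mtbf_interval(t, r)
    print(f"T = {t:7.0f} h, r = {r:2d}: 90% CI = ({lo:6.1f}, {hi:6.1f}) h")
```

Each quadrupling of total test time roughly halves the relative interval width, so each extra increment of test time buys less confidence than the one before; that trade is what the tailoring guidance above asks the procuring activity to weigh.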

4.4.3.1 For electronic equipment, the Lower Test MTBF shall be set equal to
the minimum acceptable MTBF for the item. Conformance to the minimum
acceptable MTBF requirements shall be demonstrated by tests selected from
MIL-STD-781, or alternatives specified by the PA.

4.4.3.2 For munitions and mechanical equipment, a given lower confidence
limit shall be set equal to the minimum acceptable reliability for the item.
An adequate number of samples shall be selected per MIL-STD-105, or by other
valid means approved by the PA, and tested for conformance to reliability
requirements as specified by the PA.
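For the attributes (pass/fail) case, the relationship between sample size, observed failures, and the lower confidence limit on reliability can be sketched as follows. This is an exact binomial (Clopper-Pearson) bound offered for illustration only; it is not a MIL-STD-105 sampling plan, and the sample numbers are invented:

```python
# Illustration of a one-sided lower confidence limit on reliability from an
# attributes (pass/fail) test. Not a MIL-STD-105 plan; numbers are invented.

from math import comb

def lower_reliability_bound(n: int, failures: int, conf: float = 0.90) -> float:
    """Exact (Clopper-Pearson) lower bound R_L: solves
    P(X <= failures | n trials, failure prob 1 - R_L) = 1 - conf."""
    alpha = 1.0 - conf

    def p_at_most_f(r: float) -> float:
        q = 1.0 - r
        return sum(comb(n, i) * q**i * r**(n - i) for i in range(failures + 1))

    lo, hi = 0.0, 1.0
    for _ in range(60):                 # bisection; p_at_most_f rises with r
        mid = 0.5 * (lo + hi)
        if p_at_most_f(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# 50 samples tested, zero failures, 90% confidence:
print(f"R_L = {lower_reliability_bound(50, 0):.4f}")   # ≈ 0.9550
```

More samples with no failures push the bound up, and any observed failure pulls it down; this is the sense in which the number of samples must be "adequate" for the required confidence.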

5. TASK DESCRIPTIONS

5.1 The task descriptions following are divided into three general sections:
Section 100, Program Surveillance and Control; Section 200, Design and
Evaluation; and Section 300, Development and Production Testing.

Custodians:                          Preparing Activity:
  Army - CR                            Air Force - 11
  Navy - AS
  Air Force - 11                     Project RELI-0008

Review Activities:
  Army - AR, AV, AT, ME, MI, SC, TE
  Navy - EC, OS, SA, SH, YD, TD, MC, CC
  Air Force - 10, 13, 17, 18, 19, 26, ?5

TASK SECTION 100

PROGRAM SURVEILLANCE AND CONTROL


TASK 101

RELIABILITY PROGRAM PLAN

101.1 Purpose. The purpose of task 101 is to develop a reliability program
plan which identifies, and ties together, all program management tasks
required to accomplish program requirements.

101.2.1 A reliability program plan shall be prepared and shall include, but
not be limited to, the following:

a. A description of how the reliability program will be conducted to meet
the requirements of the SOW.

b. A detailed description of how each specified reliability accounting and
engineering design task(s) will be performed or complied with.

c. The procedures (wherever existing procedures are applicable) to evaluate
the status and control of each task, and identification of the organizational
unit with the authority and responsibility for executing each task.

d. Description of the interrelationships of reliability tasks and
activities, and description of how reliability tasks will interface with
other system oriented tasks. The description shall specifically include the
procedures to be employed which assure that applicable reliability data
derived from, and traceable to, the reliability tasks specified are
integrated into the LSAP and reported on appropriate LSAR.

e. A schedule with estimated start and completion points for each
reliability program activity or task.

f. The identification of known reliability problems to be solved, an
assessment of the impact of these problems on meeting specified requirements,
and the proposed solutions or the proposed plan to solve these problems.

g. The procedures or methods (if procedures do not exist) for recording the
status of actions to resolve problems.

h. The designation of reliability milestones (includes design and test).

i. The method by which the reliability requirements are disseminated to
designers and associated personnel.

j. Identification of key personnel for managing the reliability program.

k. Description of the management structure, including interrelationships
between line, service, staff, and policy organizations.

l. Statement of what source of reliability design guidelines or reliability
design review checklists will be utilized.


m. Description of how reliability contributes to the total design, and the
level of authority and constraints on this engineering discipline.

n. Identification of inputs that the contractor needs from operation and
support experience with a predecessor item or items. Inputs should include
measured basic reliability and mission reliability values, measured
environmental stresses, typical failure modes, and critical failure modes.

101.2.2 The contractor may propose additional tasks or modifications with
supporting rationale for such additions or modifications.

101.2.3 When approved by the procuring activity and if incorporated into the
contract, the reliability program plan shall become, together with the SOW,
the basis for contractual compliance.

101.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Identification of each reliability accounting and engineering design
task.

(R) b. Identification of contractual status of the program plan.

c. Identification of additional tasks to be performed or additional
information to be provided.

d. Identification of any specific indoctrination or training requirements.

e. Delivery identification of any data items required.


TASK 102

MONITOR/CONTROL OF SUBCONTRACTORS AND SUPPLIERS

102.1 Purpose. The purpose of task 102 is to provide the prime contractor
and PA with appropriate surveillance and management control of
subcontractors'/suppliers' reliability programs so that timely management
action can be taken as the need arises and program progress is ascertained.

102.2.1 The contractor shall insure that system elements obtained from
suppliers will meet reliability requirements. This effort shall apply to CFE
items obtained from any supplier, whether in the first or any subsequent
tier, or whether the item is obtained by an intra-company order from any
element of the contractor's organization. All subcontracts shall include
provisions for review and evaluation of the supplier's reliability efforts by
the prime contractor, and by the procuring activity at their discretion.

102.2.2 The contractor shall assure, and advise the PA, that his
subcontractors' and suppliers' reliability efforts are consistent with
overall system requirements, and that provisions are made for surveillance of
their reliability activities. The contractor shall, as appropriate:

a. Incorporate quantitative reliability requirements in subcontracted
equipment specifications.

b. Assure that subcontractors have a reliability program that is compatible
with the overall program and includes provisions to review and evaluate their
supplier(s)' reliability efforts.

c. Attend and participate in subcontractors' design reviews.

d. Review subcontractors' predictions and analyses for accuracy and
correctness of approach.

e. Furnish subcontractors with data from testing or usage of their product
when testing and usage are outside their control.

f. Review subcontractors' test plans, procedures, and reports for
correctness of approach and test details.

g. Review subcontractors' progress reports.

h. Assure that subcontractors have, and are pursuing, a vigorous corrective
action effort to eliminate causes of unreliability.

i. Reserve for himself and for the PA the right to send personnel into the
subcontractors' facilities as necessary to monitor and evaluate the
subcontractors' reliability programs and related activities.

j. Assure that subcontractors/suppliers will provide him with the necessary
technical and administrative support for the items they supply during
production and deployment of the hardware. This support may include failure
analyses and corrective action for failures occurring in the total use
environment, if specified under 102.2 herein.

k. Ensure that selected items (critical items, et cetera) obtained from
suppliers are covered by specifications, drawings, and other technical
documents, and that the requirements called out adequately control those
parameters and characteristics that may affect reliability of the end item.

l. Unless otherwise specified by the PA, conduct or control his
subcontractors'/suppliers' reliability demonstration (qualification and
acceptance) tests on behalf of the government to provide a defensible basis
for determining the suppliers' contractual compliance with quantitative
reliability requirements.

102.3.1 Details to be specified in the SOW shall include the following, as
applicable:

a. Notification requirements for attendance at special meetings, program
reviews, PDRs, CDRs, et cetera.

b. Responsible activity to conduct or control reliability demonstration
(qualification and acceptance) tests on behalf of the government.


TASK 103

PROGRAM REVIEWS

103.1 Purpose. The purpose of task 103 is to establish a requirement for the
prime (or associate) contractor to conduct reliability program reviews at
specified points in time to assure that the reliability program is proceeding
in accordance with the contractual milestones and that the weapon system,
subsystem, equipment, or component quantitative reliability requirements will
be achieved.

103.2.1 The reliability program shall be planned and scheduled to permit the
contractor and the PA to review program status. Formal review and assessment
of contract reliability requirements shall be conducted at major program
points, identified as system program reviews, as specified by the contract.
As the program develops, reliability progress shall also be assessed by the
use of additional reliability program reviews as necessary. The contractor
shall schedule reviews as appropriate with his subcontractors and suppliers
and insure that the PA is informed in advance of each review.

103.2.2 The reviews shall identify and discuss all pertinent aspects of the
reliability program such as the following, when applicable:

a. At PDRs:

(1) Updated reliability status including:

(a) Reliability modeling
(b) Reliability apportionment
(c) Reliability predictions
(d) FMECA
(e) Reliability content of specifications
(f) Design guideline criteria
(g) Other tasks as identified.

(2) Other problems affecting reliability

(3) Parts program progress
(4) Reliability critical items program.

b. At CDRs:

(1) Reliability content of specifications

(2) Reliability prediction and analyses

(3) Parts program status


(4) Reliability critical items program

(5) Other problems affecting reliability


(6) FMECA

(7) Identification of circuit reference designators whose stress levels
exceed the recommended parts application criteria.

(8) Other tasks as identified.

c. At reliability program reviews:
(1) Discussion of those items reviewed at PDRs and CDRs

(2) Results of failure analyses

(3) Test schedule: start dates and completion dates

(4) Parts, design, reliability, and schedule problems

(5) Status of assigned action items

(6) Contractor assessment of reliability task effectiveness

(7) Other topics and issues as deemed appropriate by the contractor and
the PA.

d. At test readiness reviews:

(1) Reliability analyses status, primarily prediction

(2) Test schedule

(3) Test profile


(4) Test plan including failure definition
(5) Test report format

(6) FRACAS implementation.

e. At production readiness reviews:

(1) Results of applicable RQTs

(2) Results of applicable reliability growth testing.

103.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

103.3.1 Details to be specified in the SOW shall include the following, as
applicable:

a. Advance notification to the PA of all scheduled reviews. The specific
number of days advance notice should be contractual.

b. Recording of the results of the reviews.


c. Identification of PA and contractor follow-up methods on review of
open items.

d. Identification of reviews other than system program reviews.

e. Delivery identification of any data item required.


TASK 104

FAILURE REPORTING, ANALYSIS, AND CORRECTIVE ACTION SYSTEM (FRACAS)

104.1 PURPOSE. The purpose of task 104 is to establish a closed loop failure
reporting system, procedures for analysis of failures to determine cause, and
documentation for recording corrective action taken.
104.2 TASK DESCRIPTION
104.2.1 The contractor shall have a closed loop system that collects,
analyzes, and records failures that occur for specified levels of assembly
prior to acceptance of the hardware by the procuring activity. The
contractor's existing data collection, analysis and corrective action system
shall be utilized, with modification only as necessary to meet the requirements
specified by the PA.

104.2.2 Procedures for initiating failure reports, the analysis of failures,
and feedback of corrective action into the design, manufacturing and test
processes shall be identified. Flow diagram(s) depicting failed hardware and
data flow shall also be documented. The analysis of failures shall establish
and categorize the cause of failure.

104.2.3 The closed loop system shall include provisions to assure that
effective corrective actions are taken on a timely basis by a follow-up audit
that reviews all open failure reports, failure analyses, and corrective action
suspense dates, and the reporting of delinquencies to management. The failure
cause for each failure shall be clearly stated.

104.2.4 When applicable, the method of establishing and recording operating
time, or cycles, on equipments shall be clearly defined.

104.2.5 The contractor's closed loop failure reporting system data shall be
transcribed to Government forms only if specifically required by the procuring
activity.

104.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

104.3.1 Details to be specified in the SOW shall include the following, as


applicable:

a. Identification of the extent to which the contractor's FRACAS must be
compatible with the PA's data system.

(R) b. Identification of level of assembly for failure reporting.

c. Definitions for failure cause categories.

d. Identification of logistic support requirements for LSAR.

e. Delivery identification of any data item required.


TASK 105

FAILURE REVIEW BOARD (FRB)

105.1 PURPOSE. The purpose of task 105 is to require the establishment of a
failure review board to review failure trends, significant failures, corrective
action status, and to assure that adequate corrective actions are taken in a
timely manner and recorded during the development and production phases of the
program.

105.2 TASK DESCRIPTION

105.2.1 The FRB shall review functional/performance failure data from
appropriate inspections and testing including subcontractor qualification,
reliability, and acceptance test failures. All failure occurrence information
shall be available to the FRB. Data including a description of test conditions
at the time of failure, symptoms of failure, failure isolation procedures, and
known or suspected causes of failure shall be examined by the FRB. Open FRB
items shall be followed up until failure mechanisms have been satisfactorily
identified and corrective action initiated. The FRB shall also maintain and
disseminate the status of corrective action implementation and effectiveness.
Minutes of FRB activity shall be recorded and kept on file for examination by
the procuring activity during the term of the contract. Contractor FRB members
shall include appropriate representatives from design, reliability, system
safety, maintainability, manufacturing, and parts and quality assurance
activities. The procuring activity reserves the right to appoint a
representative to the FRB as observer. If the contractor can identify and
utilize an already existing and operating function for this task, then he shall
describe in his proposal how that function will be employed to meet the
procuring activity requirements. This task shall be coordinated with Quality
Assurance organizations to insure there is no duplication of effort.

105.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

105.3.1 Details to be specified in the SOW shall include the following,


as applicable.

(R) a. The imposition of task 104 as a requisite task.


TASK 201

RELIABILITY MODELING

201.1 PURPOSE. The purpose of task 201 is to develop a reliability model for
making numerical apportionments and estimates to evaluate
system/subsystem/equipment reliability.

201.2 TASK DESCRIPTION

201.2.1 A reliability mathematical model based on system/subsystem/equipment
functions shall be developed and maintained. As the design evolves, a
reliability block diagram shall be developed and maintained for the
system/subsystem with associated allocations and predictions for all items in
each reliability block. The reliability block diagram shall be keyed and
traceable to the functional block diagram, schematics, and drawings, and shall
provide the basis for accurate mathematical representation of reliability.
Nomenclature of items used in reliability block diagrams shall be consistent
with that used in functional block diagrams, drawings, and schematics, weight
statements, power budgets, and specifications. The model outputs shall be
expressed in terms of contractual reliability requirements and other
reliability terms as specified. When required for the PROD phase, the model
shall be updated to include hardware design changes.

201.2.2 The reliability mathematical model shall be updated with information
resulting from reliability and other relevant tests as well as changes in item
configuration, mission parameters and operational constraints. Inputs and
outputs of the reliability mathematical model shall be compatible with the
input and output requirements of the system and subsystem level analysis
models.

201.2.3 Modeling techniques shall provide separate outputs for: (1) basic
reliability, and (2) mission reliability, of the system/subsystem/equipment. A
single series calculation of basic reliability, and the modeling techniques
described in appendix A of MIL-HDBK-217 for mission reliability, shall be used
unless otherwise specified.
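The distinction between the two model outputs can be illustrated with a small numerical sketch (not part of the standard; the block failure rates and mission time below are hypothetical, and an exponential failure model is assumed):

```python
import math

def basic_reliability(failure_rates, hours):
    """Basic (series) reliability: every block must survive, so under an
    exponential model the block failure rates simply add."""
    return math.exp(-sum(failure_rates) * hours)

def redundant_group_reliability(failure_rates, hours):
    """Mission reliability for one active-redundant group: the group
    fails only if all of its parallel blocks fail."""
    q_all_fail = 1.0
    for lam in failure_rates:
        q_all_fail *= 1.0 - math.exp(-lam * hours)
    return 1.0 - q_all_fail

# Hypothetical: three series blocks (failures per hour), 10-hour mission
rates = [100e-6, 250e-6, 50e-6]
print(basic_reliability(rates, 10.0))
# Two identical blocks that back each other up: modeled in parallel for
# mission reliability, though both still count in series for basic reliability
print(redundant_group_reliability([100e-6, 100e-6], 10.0))
```

A redundant pair improves mission reliability while both of its blocks still degrade basic reliability (and drive maintenance demand), which is why separate outputs are required.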

201.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

201.3.1 Details to be specified in the SOW shall include the following, as
applicable:

a. Imposition of tasks 202 and 203 as requisite tasks in the FSED phase.

b. Identification of alternative modeling techniques.

(R) c. Identification of mission parameters and operational constraints.

d. Logistic support coordinated reporting requirements for LSAR.

e. Identification of additional reliability terms.

f. Delivery identification of any data item required.


TASK 202

RELIABILITY ALLOCATIONS

202.1 PURPOSE. The purpose of task 202 is to assure that once quantitative
system requirements have been determined, they are allocated or apportioned to
lower levels.

202.2 TASK DESCRIPTION

202.2.1 Both basic reliability and mission reliability requirements shall be
allocated to the level specified and shall be used to establish baseline
requirements for designers. Requirements consistent with the allocations shall
be imposed on the subcontractors and suppliers. The apportioned values shall
be included in appropriate sections of procurement specifications, critical
item specifications, and contract end item specifications. All allocated
reliability values established by the contractor and included in contract end
item specifications shall be consistent with the reliability model (see task
201) and any change thereto, and subject to procuring activity review.
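A minimal sketch of one common apportionment scheme (weighting by relative complexity; the weights and system requirement below are hypothetical, and the standard does not mandate any particular allocation method):

```python
def allocate_failure_rate(system_lambda, weights):
    """Split a system-level failure-rate requirement among lower-level
    items in proportion to assumed complexity weights, so the allocated
    rates sum back to the system requirement."""
    total = sum(weights.values())
    return {name: system_lambda * w / total for name, w in weights.items()}

# Hypothetical: 400 failures per 10^6 hours across three subsystems
alloc = allocate_failure_rate(400e-6, {"receiver": 2, "processor": 5, "display": 1})
for name, lam in alloc.items():
    print(name, lam)
```

Because the allocated values sum exactly to the system requirement, each designer's baseline remains traceable to the task 201 model.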

202.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

202.3.1 Details to be specified in the SOW shall include the following, as
applicable:

a. Imposition of tasks 201 and 203 as requisite tasks in the FSED phase.

(R) b. Identification of the level to which the PA will require allocations
to be made.

c. Logistic support coordinated reporting requirements for LSAR.

(R) d. Pertinent reliability information of any specified GFE. This
information shall include the environmental/operational conditions under which
the reliability information was derived.

e. Delivery identification of any data item required.


TASK 203

RELIABILITY PREDICTIONS

203.1 PURPOSE. The purpose of task 203 is to estimate the basic reliability
and mission reliability of the system/subsystem/equipment and to make a
determination of whether these reliability requirements can be achieved with
the proposed design.

203.2 TASK DESCRIPTION

203.2.1 Reliability predictions shall be made for the
system/subsystem/equipment. When required, predictions shall account for, and
differentiate between, each mode of item operation as defined in the item
specification. Predictions shall be made showing: (1) basic reliability of
the item during the life profile specified by the PA, to provide a basis for
life cycle cost and logistics support analysis; and (2) mission reliability of
the item during the mission profile(s) specified by the PA, to provide a basis
for analysis of item operational effectiveness. These predictions shall be
made using the associated reliability block diagram and failure rate data
approved by, or provided by, the procuring activity. Items shall not be
excluded from the MCSP or other mission reliability predictions unless
substantiating documentation (such as FMECA) verify that the item failure has
no influence on the required measure of mission reliability. Prior to such
exclusions from the predictions, an assessment and approval shall be obtained
from the procuring activity.

203.2.1.1 Failure rates other than those established at contract award may be
used only upon approval of the procuring activity.

203.2.1.2 The permissible failure rate adjustment factors for standby
operation and storage shall be as specifically agreed to by the procuring
activity.

203.2.1.3 When the individual part operating conditions are defined, the
prediction procedure in section 2 of MIL-HDBK-217, or PA approved alternative,
shall be used.

203.2.1.4 If the part type and quantity is the only information available,
the prediction procedure of section 3 of MIL-HDBK-217, or PA approved
alternative, shall be used.
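The parts-count situation of 203.2.1.4 can be sketched as follows (the part types, generic rates, and quality factors are illustrative placeholders, not MIL-HDBK-217 values):

```python
def parts_count_prediction(parts):
    """Parts-count style estimate: with only part types and quantities
    known, approximate the equipment failure rate as the sum of
    quantity * generic failure rate * quality factor per part type."""
    return sum(qty * lam_generic * pi_q for qty, lam_generic, pi_q in parts)

# (quantity, generic failures per 10^6 h, quality factor) - hypothetical
parts = [
    (120, 0.012, 1.0),  # resistors
    (80,  0.020, 1.0),  # capacitors
    (15,  0.150, 2.0),  # integrated circuits
]
print(parts_count_prediction(parts))  # failures per 10^6 hours
```

When operating stresses become known later in design, the part-stress method of 203.2.1.3 replaces these generic rates with stress-adjusted ones.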

203.2.2 Predictions for electronic equipment shall be made using one of the
two methods contained in MIL-HDBK-217, or alternatives approved or provided by
the PA. Predictions for mechanical, electrical, and electro-mechanical
equipment shall be made using either contractor data or alternatives, both of
which shall require PA approval.


203.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

203.3.1 Details to be specified in the SOW shall include the following, as
applicable:

a. Imposition of tasks 201 and 202 as requisite tasks in the FSED phase.

b. Identification of dormancy factors.

(R) c. Identification of item life profile, to include one or more mission
profiles.

d. Identification of requirement to update predictions using actual
experience and test data.

e. Source from which failure rate data will be obtained (i.e.,
MIL-HDBK-217 or other sources).

f. Establishment of PA approval requirements for failure rate data and
source of data.

g. Identification of alternative methods to be used for predictions.

h. Logistic support coordinated reporting requirements for LSAR.

i. Identification of additional reliability terms for which predictions
are required.

(R) j. Pertinent reliability information of any specified GFE. This
information shall include the environmental/operational conditions under which
the reliability information was derived.

k. Delivery identification of any data item required.


TASK 204

FAILURE MODES, EFFECTS, AND CRITICALITY ANALYSIS (FtIECA)

204.1 PURPOSE. The purpose of task 204 is to identify potential design
weaknesses through systematic, documented consideration of the following: all
likely ways in which a component or equipment can fail; causes for each mode;
and the effects of each failure (which may be different for each mission
phase).

204.2 TASK DESCRIPTION

204.2.1 FMECA shall be performed to the level specified (subsystem,
equipment, functional circuit, module, or piece part level). All failure modes
shall be postulated at that level and the effects on all higher levels shall be
determined. The FMECA shall consider failure mode, failure effect and
criticality (impact on safety, readiness, mission success, and demand for
maintenance/logistics support), and the failure indication to the operator and
maintenance personnel by life/mission profile phase. This analysis shall be
scheduled and completed concurrently with the design effort so that the design
will reflect analysis conclusions and recommendations. The results and current
status of FMECA shall be used as inputs to design trade-offs, safety
engineering, maintenance engineering, maintainability, logistic support
analysis, test equipment design and test planning activities, et cetera.

204.2.2 A sample FMECA worksheet format shall be submitted to the PA for
approval, and details such as who (by discipline) shall perform the analysis,
who shall review it for adequacy and accuracy, when and how it shall be
updated, and what specific uses shall be made of the results (e.g., identifying
potential system weaknesses, as a tool for evaluating the effectiveness of
built-in test, updating reliability assessments, updating critical item control
procedures, development of safety, maintainability, and human engineering
design and operational criteria, et cetera) shall be identified.
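As a sketch only (the standard defers the actual worksheet format to MIL-STD-1629 and PA approval), a worksheet row might carry fields like these:

```python
from dataclasses import dataclass

@dataclass
class FmecaRow:
    """One hypothetical FMECA worksheet entry; the field names are
    illustrative and mirror the considerations listed in 204.2.1."""
    item: str
    failure_mode: str
    cause: str
    local_effect: str
    end_effect: str
    criticality: str         # impact on safety/readiness/mission/maintenance
    mission_phase: str
    failure_indication: str  # what the operator or maintainer sees

row = FmecaRow(
    item="power supply", failure_mode="output short",
    cause="capacitor dielectric breakdown", local_effect="loss of 5 V rail",
    end_effect="loss of mission function", criticality="critical",
    mission_phase="cruise", failure_indication="BIT fault lamp")
print(row.criticality)
```

Keeping each mode's effect and criticality tied to a mission phase is what lets the same worksheet feed safety, maintainability, and built-in-test evaluations.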

204.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

204.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Identification of the level to which the FMECA shall be conducted.

(R) b. Procedure identification in accordance with MIL-STD-1629, or an
alternative approved by the PA.

(R) c. Identification of life/mission profile.

d. Logistic support coordinated reporting requirements for LSAR.

e. Submittal of sample FMECA worksheets per 204.2.2.

f. Delivery identification of any data items required.


TASK 205

SNEAK CIRCUIT ANALYSIS (SCA)

205.1 PURPOSE. The purpose of task 205 is to identify latent paths which
cause occurrence of unwanted functions or inhibit desired functions, assuming
all components are functioning properly.

205.2 TASK DESCRIPTION

205.2.1 Sneak circuit analyses of critical circuitry shall be conducted to
identify latent paths which cause unwanted functions to occur or which inhibit
desired functions. In making these analyses, all components shall be assumed
to be functioning properly. These analyses shall be made using production
manufacturing documentation for each circuit analyzed.

205.2.2 A list of those functions/circuits to be analyzed, and the priorities
given each subassembly in the analysis, shall be presented for PA approval at
CDR, together with the supporting rationale for the selections made. Results
of the analyses and actions taken as a result of analyses findings shall be
made available to the procuring activity upon request.

205.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

205.3.1 Details to be specified in the SOW shall include the following,
as applicable:

(R) a. Specification of criteria for selection of circuits/functions to be
analyzed.

b. Delivery identification of any data items required.


TASK 206

ELECTRONIC PARTS/CIRCUITS TOLERANCE ANALYSIS

206.1 PURPOSE. The purpose of task 206 is to examine the effects of
parts/circuits electrical tolerances and parasitic parameters over the range
of specified operating temperatures.

206.2 TASK DESCRIPTION
206.2.1 Parts/circuits tolerance analyses shall be conducted on critical
circuitry as defined in the contract. These analyses shall verify that, given
reasonable combinations of within-specification characteristics and parts
tolerance buildup, the circuitry being analyzed will perform within
specification performance. In making these analyses the contractor shall
examine the effect of component parasitic parameters, input signal and power
tolerances, and impedance tolerances on electrical parameters, both at
internal nodes (component interconnections) and at input and output points.
Since all of the stated factors may not be significant to all circuits, only
the critical factors for that circuit shall be considered.

206.2.2 Component characteristics (life-drift and temperature) shall be
factored into the analyses. These characteristics or values shall include
resistance, capacitance, transistor gain, relay opening or closing time, et
cetera.

206.2.3 The inductance of wire-wound resistors, parasitic capacitance, and
any other similar phenomena shall be taken into account, where appropriate.
Maximum variations in input signal or power supply voltage, frequency
bandwidth, impedance, phase, et cetera shall be used in the analyses. The
impedance characteristics of the load shall be considered as well. Circuit
node parameters (including voltage, current, phase, and waveform), circuit
element rise time, timing of sequential events, circuit power dissipation, and
circuit-load impedance matching under worst case conditions shall also be
considered. These parameters shall be analyzed for their effect on the
performance of circuit components.
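A toy worst-case pass of the kind described above, for a resistive divider (the component values and 5 percent tolerance are hypothetical; a real analysis would cover the full parameter set in 206.2.3):

```python
from itertools import product

def worst_case_divider(vin, r1, r2, tol):
    """Evaluate Vout = Vin * R2 / (R1 + R2) at every tolerance corner
    and return the extreme (worst case) output band."""
    outs = []
    for s1, s2 in product((-tol, +tol), repeat=2):
        a, b = r1 * (1 + s1), r2 * (1 + s2)
        outs.append(vin * b / (a + b))
    return min(outs), max(outs)

lo, hi = worst_case_divider(10.0, 1000.0, 1000.0, 0.05)
print(lo, hi)  # output band for 5 percent resistors
```

Exhaustive corner evaluation is tractable for a handful of parameters; larger circuits typically use root-sum-square or Monte Carlo variants of the same idea.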

206.2.4 A list of those functions/circuits to be analyzed shall be presented
at PDR. The most unfavorable combination of realizable conditions to be
considered in the parts/circuits tolerance analyses shall be defined for
approval by the procuring activity. Results of the analyses and actions taken
as a result of analyses findings shall be made available to the procuring
activity upon request.

206.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

206.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Identification of range of equipment operating temperatures.

(R) b. Specification of criteria for selection of parts/circuits to be
analyzed.


c. Delivery identification of any data items required.


TASK 207

PARTS PROGRAM

207.1 PURPOSE. The purpose of task 207 is to control the selection and use
of standard and nonstandard parts.

207.2 TASK DESCRIPTION

207.2.1 A parts control program shall be established in accordance with
MIL-STD-965 procedures, as designated in the contract.

207.2.2 Reliability design guidelines shall be developed and documented to
include derating criteria, junction temperatures, and parts application
criteria. Safety margins for nonelectronic parts will also be included when
appropriate. The guidelines shall be consistent with guidance provided by the
PA.

207.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

207.3.1 Details to be specified in the SOW shall include the following, as


applicable:

(R) a. Identification of MIL-STD-965 procedures (procedure I or II).

b. Identification of PA part approval procedures.

c. Identification of review procedures with design activity.

d. Identification of detailed design guidelines, including:

(1) Order of preference of part quality/reliability/screening
levels.

(2) Documentation of a prohibited parts/materials list.

e. Contractor/supplier participation in the GIDEP program per
MIL-STD-1556.


TASK 208

RELIABILITY CRITICAL ITEMS

208.1 PURPOSE. The purpose of task 208 is to identify and control those
items which require special attention because of complexity, application of
advanced state-of-the-art techniques, and the impact of potential failure on
safety, readiness, mission success, and demand for maintenance/logistics
support.

208.2 TASK DESCRIPTION

208.2.1 Reliability critical items shall be identified by FMECA or other
methods and shall be controlled. Methods and procedures for control and
testing of the reliability critical items shall be identified along with
justification(s) for decontrolling the item if that is intended. When
specified, the procedures shall include engineering support of critical items
during FSED government field testing, which shall include provisions for
confirming failures which may occur, expediting failure cause determination,
and determining and incorporating, or verifying, the necessary corrective
action.

208.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

208.3.1 Details to be specified in the SOW shall include the following, as


applicable:

a. Specific identification of reliability critical item criteria such as:

(1) A failure of the item would critically affect system safety,
cause the system to become unavailable or unable to achieve mission
objectives, or cause extensive/expensive maintenance and repair. (NOTE:
High-value items are reliability-critical for design-to-life-cycle cost.)

(2) A failure of the item would prevent obtaining data to evaluate
system safety, availability, mission success, or need for maintenance/repair.

(3) The item has stringent performance requirement(s) in its
intended application relative to state-of-the-art techniques for the item.

(4) The sole failure of the item causes system failure.

(5) The item is stressed in excess of specified derating criteria.

(6) The item has a known operating life, shelf life, or
environmental exposure limitation, such as vibration, thermal, or propellant,
which warrants controlled surveillance under specified conditions.

(7) The item is known to require special handling, transportation,
storage, or test precautions.

(8) The item is difficult to procure or manufacture relative to


state-of-the-art techniques.

(9) The item has exhibited an unsatisfactory operating history.

(10) The item does not have sufficient history of its own, or
similarity to other items having demonstrated high reliability, to provide
confidence in its reliability.

(11) The item's past history, nature, function, or processing has a
deficiency warranting total traceability.

(12) The item is used in large quantities (typically, at least 10
percent of the configured item's electronic parts count).

b. Identification of requirement for engineering support during FSED
field testing.

c. Logistic support coordinated reporting requirements for LSAR.

d. Delivery identification of any data items required.


TASK 209

EFFECTS OF FUNCTIONAL TESTING, STORAGE, HANDLING, PACKAGING,


TRANSPORTATION, AND MAINTENANCE

209.1 PURPOSE. The purpose of task 209 is to determine the effects of
storage, handling, packaging, transportation, maintenance, and repeated
exposure to functional testing on hardware reliability.

209.2 TASK DESCRIPTION

209.2.1 Procedures shall be established, maintained, and implemented to
determine by test and analysis, or estimation, the effects of storage,
handling, packaging, transportation, maintenance, and repeated exposure to
functional testing on the design and reliability of the hardware. The results
of this effort shall include items such as:

a. Identification of equipments and their major or critical
characteristics which deteriorate with storage age or environmental conditions
(including shock and vibration, et cetera).

b. Identification of procedures for periodic field inspection or tests
(including recall for test) or stockpile reliability evaluation. The
procedures shall include suggested quantity of items for test and acceptable
levels of performance for parameters under test.

c. Identification of special procedures for maintenance or restoration.

The results of this effort shall be used to support long term failure rate
predictions, design trade-offs, definition of allowable test exposures, retest
after storage decisions, packaging, handling, or storage requirements, and
refurbishment plans.

209.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

209.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Identification of functional testing, storage, handling, packaging,
transportation, and maintenance profiles.

b. Logistic support coordinated reporting requirements for LSAR.


TASK SECTION 300

DEVELOPMENT AND PRODUCTION TESTING


TASK 301

ENVIRONMENTAL STRESS SCREENING (ESS)

301.1 PURPOSE. The purpose of task 301 is to establish and implement
environmental stress screening procedures so that early failure(s) due to weak
parts, workmanship defects, and other non-conformance anomalies can be
identified and removed from the equipment.

301.2 TASK DESCRIPTION

301.2.1 Environmental stress screening (also known as preconditioning,
burn-in, et cetera) shall be conducted on parts, subassemblies, and complete
units for both developmental and production items.

301.2.1.1 During development, ESS test procedures, taking into consideration
the equipment design, part/component technology, and production fabrication
techniques, shall be formulated. ESS procedures shall be designed for the end
item and for all lower level items which will be procured separately as spare
or repair parts. A plan for implementing these procedures shall also be
prepared, indicating the proposed application of ESS during development and
production. The proposed ESS procedures and implementation plan shall be
subject to approval by the PA.

301.2.2 ESS testing shall be designed to stimulate relevant failures by
stressing the item. The stressing need not simulate the precise operational
environment the item will see. Environmental stress types may be applied in
sequence. During ESS, the item shall be cycled through its operational modes
while simultaneously being subjected to the required environmental stresses.

301.2.3 Upon approval of the proposed ESS procedures and implementation
plan, a detailed environmental stress screening test plan shall be prepared
and included as part of the reliability test plan. The ESS detailed test plan
shall include the following, subject to PA approval prior to initiation of
testing:

a. Description of environmental stress types, levels, profiles, and
exposure times to be applied.

b. Identification of level (board, subassembly, assembly) at which
testing will be accomplished.

c. Identification of item performance and stress parameters to be
monitored during ESS.

d. Proposed test duration (failure-free interval and maximum ESS test
time per item).

301.2.4 The results of ESS testing during development shall be analyzed and
used as the basis for the ESS procedures to be specified for production.

301.3 DETAILS TO BE SPECIFIED BY THE PA (reference 1.2.7.1)

301.3.1 Details to be specified in the SOW shall include the following, as
applicable:


a. Identification of the requirement to develop ESS procedures,
implementation plan, and detailed test plan.

b. Specification of detailed ESS requirements.

c. Delivery identification of any data items required.


TASK 302

RELIABILITY DEVELOPMENT/GROWTH TEST (RDGT) PROGRAM

302.1 PURPOSE. The purpose of task 302 is to conduct pre-qualification
testing (also known as TAAF) to provide a basis for resolving the majority of
reliability problems early in the development phase, and incorporating
corrective action to preclude recurrence, prior to the start of production.

302.2 TASK DESCRIPTION

302.2.1 A reliability development/growth test (TAAF test) shall be conducted
for the purpose of enhancing system reliability through the identification,
analysis, and correction of failures and the verification of the corrective
action effectiveness. Mere repair of the test item does not constitute
corrective action.

302.2.1.1 To enhance mission reliability, corrective action shall be focused
on mission-critical failure modes. To enhance basic reliability, corrective
action shall be focused on the most frequent failure modes regardless of their
mission criticality. These efforts shall be balanced to meet predicted growth
for both parameters.

302.2.1.2 Growth testing will emphasize performance monitoring, failure
detection, failure analysis, and the incorporation and verification of design
corrections to prevent recurrence of failures.

302.2.2 A TAAF test plan shall be prepared and shall include the following,
subject to PA approval prior to initiation of testing:

a. Test objectives and requirements, including the selected growth model
and growth rate and the rationale for both selections.

b. Identification of the equipment to be tested and the number of test
items of each equipment.

c. Test conditions, environmental, operational and performance profiles,
and the duty cycle.

d. Test schedules expressed in calendar time and item life units,
including the test milestones and test program review schedule.

e. Test ground rules, chargeability criteria, and interface boundaries.

f. Test facility and equipment descriptions and requirements.

g. Procedures and timing for corrective actions.

h. Blocks of time and resources designated for the incorporation of
design corrections.

i. Data collection and recording requirements.


j. FRACAS.

k. Government furnished property requirements.

l. Description of preventive maintenance to be accomplished during test.

m. Final disposition of test items.

n. Any other relevant considerations.
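Item a's "selected growth model and growth rate" is commonly expressed with a Duane-type model, in which cumulative MTBF grows as a power of accumulated test time. The sketch below is a hedged illustration of that planning arithmetic, not a model prescribed by this standard: it projects the TAAF test time needed to reach a goal instantaneous MTBF, and every parameter value used with it is hypothetical.

```python
def taaf_time_to_goal(mtbf_cum_initial, t_initial, mtbf_goal, alpha):
    """Project the test hours needed to grow to a goal instantaneous MTBF
    under a Duane model:

        cumulative MTBF(T)    = mtbf_cum_initial * (T / t_initial) ** alpha
        instantaneous MTBF(T) = cumulative MTBF(T) / (1 - alpha)

    Solving instantaneous MTBF(T) = mtbf_goal for T gives the projection.
    """
    if not 0.0 < alpha < 1.0:
        raise ValueError("growth rate alpha must be between 0 and 1")
    return t_initial * (mtbf_goal * (1.0 - alpha) / mtbf_cum_initial) ** (1.0 / alpha)
```

For example, with a cumulative MTBF of 50 hours observed at 100 test hours and a growth rate of 0.5, reaching a 200-hour instantaneous MTBF projects to about 400 test hours.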

302.2.3 As specified by the procuring activity, the TAAF test plan shall be
submitted to the procuring activity for its review and approval. This plan, as
approved, shall be incorporated into the contract and shall become the basis
for contractual compliance.

302.3 DETAILS TO BE SPECIFIED

302.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Imposition of task 104 as a requisite task.

(R) b. Identification of a life/mission/environmental profile to represent
equipment usage in service.

c. Identification of equipment and quantity to be used for reliability
development/growth testing.

d. Delivery identification of any data items required.


TASK 303

RELIABILITY QUALIFICATION TEST (RQT) PROGRAM

303.1 PURPOSE. The purpose of task 303 is to determine that the specified
reliability requirements have been achieved.

303.2 TASK DESCRIPTION

303.2.1 Reliability qualification tests shall be conducted on equipments which
shall be identified by the PA and which shall be representative of the approved
production configuration. The reliability qualification testing may be
integrated with the overall system/equipment qualification testing, when
practicable, for cost-effectiveness; the RQT plan shall so indicate in this
case. The PA shall retain the right to disapprove the test failure relevancy
and chargeability determinations for the reliability demonstrations.

303.2.2 An RQT plan shall be prepared in accordance with the requirements of
MIL-STD-781, or an alternative approved by the PA, and shall include the
following, subject to PA approval prior to initiation of testing:

a. Test objectives and selection rationale.

b. Identification of the equipment to be tested (with identification of
the computer programs to be used for the test, if applicable) and the number of
test items of each equipment.

c. Test duration and the appropriate test plan and test environments.
The test plan and test environments (if life/mission profiles are not specified
by the PA) shall be derived from MIL-STD-781. If it is deemed that alternative
procedures are more appropriate, prior PA approval shall be requested with
sufficient selection rationale to permit procuring activity evaluation.

d. A test schedule that is reasonable and feasible, permits testing of
equipment which are representative of the approved production configuration,
and allows sufficient time, as specified in the contract, for PA review and
approval of each test procedure and test setup.
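At decision time, a fixed-duration test plan of the kind drawn from MIL-STD-781 reduces to comparing the observed failure count against the plan's accept number. The sketch below illustrates only that final accept/reject comparison; the hours and accept number in the example are hypothetical, and the actual plan selection (producer/consumer risks, discrimination ratio) must come from MIL-STD-781 itself.

```python
def rqt_fixed_duration_decision(total_test_hours, failures, accept_number):
    """Accept/reject decision for a fixed-duration reliability test.

    Accepts if the observed failure count does not exceed the plan's
    accept number; also returns the observed point-estimate MTBF
    (total test hours divided by failures).
    """
    decision = "accept" if failures <= accept_number else "reject"
    observed_mtbf = total_test_hours / failures if failures else float("inf")
    return decision, observed_mtbf
```

For instance, under a hypothetical plan allowing three failures in 1000 test hours, two observed failures accept (observed MTBF 500 hours), while five observed failures reject.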

303.2.3 Detailed test procedures shall be prepared for the tests that are
included in the RQT plan.

303.2.4 As specified by the procuring activity, the RQT plan and test
procedures shall be submitted to the procuring activity for its review and
approval. These documents, as approved, shall be incorporated into the
contract and shall become the basis for contractual compliance.

303.3 DETAILS TO BE SPECIFIED

303.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Identification of equipment to be used for reliability qualification
testing.


(R) b. Identification of MIL-STD-781, MIL-STD-105, or alternative procedures
to be used for conducting the RQT (i.e., test plan, test conditions, etc.).

c. Identification of a life/mission/environmental profile to represent
equipment usage in service.

d. Logistic support coordinated reporting requirements for the RQT.

e. Delivery identification of any data items required.


TASK 304

PRODUCTION RELIABILITYACCEPTANCE TEST (PRAT) PROGRAM

304.1 PURPOSE. The purpose of task 304 is to assure that the reliability of
the hardware is not degraded as the result of changes in tooling, processes,
work flow, design, parts quality, or other characteristics identified by the
PA.

304.2 TASK DESCRIPTION

304.2.1 Production reliability acceptance testing shall be conducted on


production equipments which shall be identified by the procuring activity.

304.2.2 A PRAT plan shall be prepared in accordance with the requirements of
MIL-STD-781, or an alternative approved by the PA, and shall include the
following, subject to PA approval prior to initiation of testing:

a. Test objectives and selection rationale.

b. Identification of the equipment to be tested and the number of test
samples of each equipment.

c. Test duration, test frequency, and the appropriate test plan and test
environments. The test plan and test environments (if mission profiles are not
specified by the PA) shall be derived from MIL-STD-781. If it is deemed that
alternative procedures are more appropriate, prior PA approval shall be
requested with sufficient selection rationale to permit procuring activity
evaluation.

d. A test schedule that is reasonable and feasible, and in consonance
with the production delivery schedule.

304.2.3 Detailed test procedures shall be prepared for the tests that are
included in the PRAT plan or the equipment specification.

304.2.4 As specified by the procuring activity, the PRAT plan and procedures
shall be submitted to the procuring activity for its review and approval.
These documents, as approved by the procuring activity, shall be incorporated
into the contract and shall become the basis for contractual compliance.

304.3 DETAILS TO BE SPECIFIED

304.3.1 Details to be specified in the SOW shall include the following, as
applicable:

(R) a. Identification of equipment to be used for production reliability
acceptance testing.

(R) b. Identification of MIL-STD-781, MIL-STD-105, or alternative procedures
to be used for conducting the PRAT (i.e., test plan, test conditions, etc.).


c. Identification of a mission/environmental profile to represent
equipment usage.

d. Logistic support coordinated reporting requirements for the PRAT.

e. Delivery identification of any data items required.


APPENDIX A

APPLICATION GUIDANCE FOR IMPLEMENTATION OF

RELIABILITY PROGRAM REQUIREMENTS

10. GENERAL

10.1 Scope. This appendix provides rationale and guidance for the selection
of tasks to fit the needs of any reliability program, and identifies applicable
data items for implementation of required tasks.

10.2 Application. This appendix is to be used to tailor reliability
requirements in the most cost-effective manner that meets established program
objectives. HOWEVER, IT IS NOT TO BE REFERENCED, OR IMPLEMENTED, IN
CONTRACTUAL DOCUMENTS.

10.3 User. The user of this appendix may include the Department of Defense
procuring activity, Government in-house activity, and prime contractor, or
subcontractor, who wishes to impose reliability tasks upon his supplier(s).

20. REFERENCED DOCUMENTS

MIL-E-5400    Electronic Equipment, Airborne, General Specification For
MIL-Q-9858    Quality Program Requirements
MIL-P-11268   Parts, Materials, and Processes Used in Electronic
              Equipment
MIL-STD-781   Reliability Design Qualification and Production Acceptance
              Tests, Exponential Distribution
MIL-STD-810   Environmental Test Methods
MIL-STD-965   Parts Control Program
MIL-STD-1521  Technical Reviews and Audits for Systems, Equipment, and
              Computer Programs
MIL-STD-1556  Government/Industry Data Exchange Program Contractor
              Participation Requirements
MIL-HDBK-217  Reliability Prediction of Electronic Equipment

30. DEFINITIONS

Reference MIL-STD-785, basic document.

40. TASK SELECTION

40.1.1 A major problem which confronts all government and industry
organizations responsible for a reliability program is the selection of tasks
which can materially aid in attaining program reliability requirements.
Today's schedule and funding constraints mandate a cost-effective selection,
one that is based on identified program needs. The considerations presented
herein are intended to provide guidance and rationale for this selection. They
are also intended to jog the memory for lessons learned, to provoke questions
which must be answered, and to encourage dialogue with other engineers,
operations, and support personnel so that answers to questions and solutions to
problems can be found.


40.1.2 Once appropriate tasks have been selected, the tasks themselves can be
tailored as outlined in the Details To Be Specified By The PA. It is also
important to coordinate task requirements with other engineering support groups,
such as Logistics Support, System Safety, et cetera, to eliminate duplication
of tasks and to be aware of any additional information of value to reliability
which these other groups can provide. Finally, the timing and depth required
for each task, as well as action to be taken based on task outcome, are largely
dependent on individual experience and program requirements. For these
reasons, hard and fast rules are not stated.

40.2 Task application matrix. Table A-I herein provides general
guidance, in summary form, of when and what to include in an RFP to establish
an acceptable and cost-effective reliability program. This table can be used
to initially identify those tasks which typically are included in an effective
reliability program for the particular acquisition phase involved. The user of
the document can then refer to the particular task referenced by the matrix and
determine from the detailed purpose at the beginning of the task if it is
appropriate to identify as a program task. The use of this matrix for
developing a reliability program is to be considered as optional guidance only
and is not to be construed as covering all procurement situations. The
provisions of applicable regulations must also be followed.

40.3 The problem of prioritizing or establishing a baseline group from all
the tasks in this document cannot be solved unless variables like system
complexity, program phase, availability of funds, schedule, et cetera are
known. The reliability program plan (task 101) should always be considered
for selection, and total program complexity is one consideration for
determining the need for this task. However, individual tasks may be cited
without requiring a reliability program plan.


TABLE A-I. Application matrix

                                                       PROGRAM PHASE
TASK  TITLE                                     TYPE   CONCEPT  VALID    FSED
101   RELIABILITY PROGRAM PLAN                  MGT    S
102   MONITOR/CONTROL OF SUBCONTRACTORS         MGT    S        S        G
      AND SUPPLIERS
103   PROGRAM REVIEWS                           MGT    S        S(2)     G(2)
104   FAILURE REPORTING, ANALYSIS, AND          ENG    NA       S        G
      CORRECTIVE ACTION SYSTEM (FRACAS)
105   FAILURE REVIEW BOARD (FRB)                MGT    NA
201   RELIABILITY MODELING                      ENG    S
202   RELIABILITY ALLOCATIONS                   ACC    S
203   RELIABILITY PREDICTIONS                   ACC    S        S(2)
204   FAILURE MODES, EFFECTS, AND               ENG    S        S(1)(2)
      CRITICALITY ANALYSIS (FMECA)
205   SNEAK CIRCUIT ANALYSIS (SCA)              ENG    NA       NA       G(1)
206   ELECTRONIC PARTS/CIRCUITS                 ENG    NA       NA       G
      TOLERANCE ANALYSIS
207   PARTS PROGRAM                             ENG    S
208   RELIABILITY CRITICAL ITEMS                MGT    S(1)
209   EFFECTS OF FUNCTIONAL TESTING,            ENG    NA       S(1)
      STORAGE, HANDLING, PACKAGING,
      TRANSPORTATION, AND MAINTENANCE
301   ENVIRONMENTAL STRESS SCREENING (ESS)      ENG    NA       S
302   RELIABILITY DEVELOPMENT/GROWTH            ENG    NA       S(2)
      TESTING
303   RELIABILITY QUALIFICATION TEST            ACC    NA       S(2)
      (RQT) PROGRAM
304   PRODUCTION RELIABILITY ACCEPTANCE         ACC    NA       NA
      TEST (PRAT) PROGRAM

LEGEND

ACC - RELIABILITY ACCOUNTING        S  - SELECTIVELY APPLICABLE
ENG - RELIABILITY ENGINEERING       G  - GENERALLY APPLICABLE
MGT - MANAGEMENT                    GC - GENERALLY APPLICABLE TO DESIGN
NA  - NOT APPLICABLE                     CHANGES ONLY

(1) - REQUIRES CONSIDERABLE INTERPRETATION OF INTENT TO BE COST EFFECTIVE

(2) - MIL-STD-785 IS NOT THE PRIMARY IMPLEMENTATION REQUIREMENT; OTHER
      MIL-STDS OR STATEMENT OF WORK REQUIREMENTS MUST BE INCLUDED TO
      DEFINE THE REQUIREMENTS.


HIL-STD-785B
APPENDIX A .
15 September 1980

50. RATIONALE AND GUIDANCE FOR TASK SECTIONS

50.1

50.1.1

50.1.1.1 The elements of a reliability program must be selected to meet
reliability needs. These needs are identified by higher authority through
documentation such as the Decision Coordinating Paper (DCP), the Program
Management Directive (PMD), and Program Management Plan (PMP). Identifying and
quantifying these needs must be accomplished prior to release of an RFP for the
appropriate acquisition phase so that tasks and requirements commensurate with
the needs may be included in the RFP. The tasks and requirements which are
included establish the framework for the continuing reliability dialogue
between the procuring activity and the proposing contractors, one or more of
whom will ultimately be selected to develop the hardware. It is essential to
make appropriate analyses and exercise mature judgment in determining
reliability needs.

50.1.1.1.1 In making this determination, it is necessary to assemble program
data concerning mission and performance requirements (preferably at the
subsystem level), anticipated environments, and mission reliability and basic
reliability requirements. This information is initially gathered in the
CONCEPT phase and refined throughout development. It is the base upon which
the reliability needs are determined and adjusted as this information is
refined. The initial life/mission profile definition shall define, as a
minimum, the boundaries of the performance envelope and provide the timeline
(environmental conditions and applied/induced stresses versus time) typical of
operations within that envelope. The quantitative requirements (basic
reliability and mission reliability) shall be determined for the defined
life/mission profile.

50.1.1.1.2 Using these data and the information on equipment contemplated to
provide the required performance, a separate apportionment or allocation of
basic reliability (MTBMA or MTBF) and mission reliability (MCSP or MTBCF) can
be made to the equipment level. This apportionment is usually based on
available reliability data modified to reflect changes in performance
requirements (e.g., greater accuracy), duty cycles, and anticipated
environments. If the hardware to be procured is a subsystem or equipment, the
allocations discussed herein would apply down to the lowest assembly level in
terms of MTBMA, MTBF, or failure rate. The required modifications are largely
a matter of judgment, particularly when a new or considerably modified
equipment concept must be synthesized to provide a specified function.
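For a series reliability model with exponentially distributed failures, the apportionment arithmetic above is straightforward: equipment failure rates add, system MTBF is the reciprocal of the summed rates, and MCSP over a mission follows from the rates of the mission-critical items. The sketch below illustrates that arithmetic under those standard assumptions only; the equipment values used with it are hypothetical, not values from this standard.

```python
import math

def series_mtbf(equipment_mtbfs):
    """System MTBF for equipment in series (exponential failure model):
    failure rates (1/MTBF) add, so the system MTBF is the reciprocal
    of the summed rates."""
    return 1.0 / sum(1.0 / m for m in equipment_mtbfs)

def mcsp(mission_hours, critical_mtbfs):
    """Mission completion success probability over a mission of the given
    length: exp(-t * sum of the failure rates of the mission-critical
    equipment)."""
    rate = sum(1.0 / m for m in critical_mtbfs)
    return math.exp(-mission_hours * rate)
```

For example, two 1000-hour equipments in series yield a 500-hour system MTBF; over a 10-hour mission their MCSP is exp(-10/500), roughly 0.980.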

50.1.1.1.3 A reliability estimate should be made of each equipment
independent of, and reasonably soon after, completing the initial
apportionment. The equipment estimates should be combined to provide an
initial estimate of basic reliability and mission reliability. During the
CONCEPT and VALID phases design details will probably not be available.
Therefore, estimates made during these phases and early in FSED will provide
ball-park numbers, which are nevertheless adequate for initial comparisons
with, and for establishing the reasonableness of, the initial apportionment.
Reapportionment based on a comparison with details of the estimate may be

advisable at this time. The apportionment and the estimate procedures should
be repeated until reasonable apportioned values are obtained. The
apportionment should be frozen prior to the contractor's awarding subcontracts
which have firm reliability requirements.

50.1.1.2 Task selection. Some reliability tasks should be accomplished for
an entire weapons system, e.g., develop and use an effective FRACAS, make
periodic estimates of basic reliability and mission reliability. In most
cases, the need for them is self-evident. These same tasks, and others which
must be selected, apply to subsystems and equipment. While experience plays a
key role in task selection, it should be supplemented by analysis and
investigation.

50.1.1.2.1 A useful initial analytical procedure is to compare reliability
estimates at the subsystem and equipment level with the corresponding
apportionments. If the estimate is less than the apportionment, the need for
improvement is indicated. Where considerable improvement is required (and
"considerable" is a judgment), the subsystem or equipment should be identified
as "reliability critical". This identification shall be made as early as
possible in the program phase so as to impact the equipment through the proper
selection of tasks.
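The comparison just described can be mechanized as a simple screen: any item whose current estimate falls short of its apportionment by more than a chosen margin is flagged "reliability critical". This is an illustrative sketch only; the margin and the item names in the example are hypothetical, and where the "considerable" threshold sits remains a program judgment.

```python
def flag_reliability_critical(apportioned, estimated, margin=0.8):
    """Flag items whose estimated MTBF falls below margin * apportioned MTBF.

    apportioned, estimated -- dicts mapping item name to MTBF (hours)
    margin -- fraction of the apportionment below which the shortfall is
              judged 'considerable' (hypothetical threshold)

    Items missing from the estimate are treated as having no demonstrated
    reliability and are flagged.
    """
    return sorted(
        name for name, goal in apportioned.items()
        if estimated.get(name, 0.0) < margin * goal
    )
```

For example, with hypothetical apportionments of 500 hours (radar) and 800 hours (nav) against estimates of 300 and 790 hours, only the radar is flagged at the default margin.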

50.1.1.2.2 Reasons for the disparity between the apportioned and the
estimated values of the reliability critical items should be investigated.
Discussions of these reasons and tentative ways to attain the apportioned
values (e.g., relaxed performance requirements, either more or less design
redundancy, additional environmental protection) should be held with
appropriate project personnel. The object of the investigations and
discussions is viable recommendation(s) for action to overcome the
deficiencies. A significant benefit which can be gained from this process is a
consensus on, and a wide awareness of, the specific equipment which is
considered reliability critical. When systems or equipment performance
requirements create a wide and irreconcilable disparity between apportioned and
estimated values of required reliability, the procuring authority shall
challenge the performance requirements. Elimination of less essential
equipment functions can reduce equipment complexity and significantly enhance
reliability.

50.1.1.2.3 Once recommendations for task applications have been determined
and more detailed equipment requirements identified, tasks and requirements can
be prioritized, and a rough-order-of-magnitude estimate should be made of the
time and effort required to complete each task. This information will be of
considerable value in selecting the tasks which can be accomplished within
schedule and funding constraints.

50.1.1.3 Reliability program plan (task 101). The reliability program plan is
to be designed as a basic tool for the PA to assist in managing an effective
reliability program and to evaluate: the contractor's approach to,
understanding of, and execution of his reliability tasks; his depth of
planning, to ensure that his procedures for implementing and controlling
reliability tasks are adequate; and his organizational structure, to ensure
that appropriate attention will be focused on reliability activities/problems.


50.1.1.4 Monitor/control of subcontractors and suppliers (task 102). The RFP
contains system/subsystem/equipment requirements, and some of the equipment will
undoubtedly be designed and developed by subcontractors. Appropriate
reliability tasks, previously determined as necessary, will also be included in
the RFP, and are normally levied by the prime (or associate) or integrating
contractor on the subcontractors. The procuring activity must know that these
tasks and requirements are correctly understood by the subcontractors. This
understanding is fundamental to meeting program needs.

50.1.1.4.1 Continual visibility of subcontractors' activities is essential so
that timely and appropriate management action can be taken as the need arises.
To this end, it is prudent to include contractual provisions which permit the
procuring activity to participate, at its discretion, in appropriate formal
prime/subcontractor meetings. Information gained at these meetings can provide
a basis for follow-up action necessary to maintain adequate visibility of
subcontractors' progress.

50.1.1.4.2 Active participation in the closed-loop FRACAS (50.1.2.3) should be
required of all equipment subcontractors as well as the prime or integrating
contractor. The information about unplanned events which this system can
provide is a major factor in assessing and maintaining reliability program
effectiveness. It is reasonable to assume that equipment failures will occur
during service evaluation testing. During this testing it is very important to
determine as rapidly as possible the cause of such failures, the need for
corrective action, and the specific action to be taken. For this reason,
selected subcontractor support of these tests is advisable, and should be
considered by the procuring activity for inclusion in program requirements.

50.1.2

50.1.2.1 Program surveillance. Since system acquisition programs are
usually very dynamic, continual knowledge of program status is required to
assure that necessary action can be taken expeditiously. Such knowledge can be
obtained by integrating informal information with formal reporting.
50.1.2.1.1 One informal technique is to monitor program status through
telephone conversations and visits to the contractor's facility. Such
procedures should be established early in the program so that they can be
agreed to by all parties. The resultant informal information system should be
used early and exercised throughout development to expedite corrective action.

50.1.2.1.2 Useful information on program status can often be gleaned from
contractor data which is not submitted formally, but which can be made
available through an accession list. A data item for this list must be
included in the CDRL. The list is a compilation of documents and data which
the procuring activity can order, or which can be reviewed at the contractor's
facility. Typically, the details of design analyses, test planning, test
results, and technical decisions are included, and the data constitute a source
of information not otherwise available.

50.1.2.1.3 Active participation by the procuring activity or its designated
representatives in the hardware testing programs (prime and subcontractor) will
provide first-hand information on test progress and thus an assessment of
program status. The PA reliability monitor who is closely associated with
contractor testing will be better able to alert program management to problem
areas, and to identify corrective action contemplated or implemented to resolve
the problems.

50.1.2.1.4 The informal methods described above provide timely information
and should be used to the maximum extent consistent with PA resources. Such
information is normally supplemented with reports which document reliability
program activity and continuity, and which are submitted under the CDRL. These
formal reports for reliability and related areas can include design data and
analyses of test results, and are submitted at intervals as specified in the
CDRL. The reports provide interim status in addition to documenting the
completion of significant program milestones. Their utility depends on
adequate PA review, which is especially important if approval is linked to
approval to proceed to the next program phase.

50.1.2.2 Program and design reviews (task 103). An important management and
technical tool used by the procuring activity reliability organization is the
design review. These reviews should be specified in the statement of work to
ensure adequate staffing and funding. Typically, reviews are held: to evaluate
the progress, consistency, and technical adequacy, including reliability, of a
selected design and test approach (PDR); and to determine the acceptability of
the detail design approach, including reliability, before commitment to
production (CDR). Both the procuring activity and contractor reliability
personnel should consider design reviews as major milestones. The contractor's
reliability organization should be represented at all design reviews whether
conducted internally, with suppliers, or with the Government. The results of
the contractor's internal, and suppliers', design reviews should be documented
and made available to the PA on request. Design reviews should include review
of reliability items listed in task 103. (Reference MIL-STD-1521 for AF use.)

50.1.2.2.1 Reviews of the reliability program should also be conducted from
time to time. Early in the program the reviews should be held at least
quarterly, and as the program progresses, the time between reviews can be
extended. In addition to more detailed coverage of those items discussed at
PDRs and CDRs, the reviews should address progress on all reliability tasks
specified in the statement of work. Representative discussion items include
all reliability analyses, failure analysis details, test schedules and
progress, problems related to vendors' and subcontractors' reliability
programs, parts problems, and design problems. Reliability reviews should be
specified and scheduled in the statement of work or task 103.

50.1.2.2.2 All reliability program reviews provide an opportunity to review
and assign action items and to explore other areas of concern. A mutually
acceptable agenda should be generated to ensure that all reliability open items
are covered and that all participants are prepared for meaningful discussions.

50.1.2.3 Failure reporting, analysis, and corrective action system (FRACAS)
(task 104). Early elimination of failure causes is a major contributor to
reliability growth and attaining acceptable field reliability. The sooner
failure causes can be identified, the easier it is to implement effective
corrective action. As the design, documentation, and preliminary hardware
mature, corrective action can still be identified, but its implementation
becomes more difficult. It is, therefore, important to employ a closed-loop
FRACAS early in the development phase. Except for non-complex acquisitions,
FRACAS should be required by the procuring activity, and the contractor's
existing system should be used with the minimum changes necessary to accomplish
the fundamental purposes of eliminating failure causes and documenting the
action taken.

50.1.2.3.1 FRACAS effectiveness depends on accurate input data, i.e., reports
documenting failures/anomalies and failure cause isolation. Essential inputs
are made by the contractor's failure reporting activity, which should span
across all testing activities. These inputs should document all conditions
surrounding a failure to facilitate failure cause determination. (If time
permits, these observations should also be used to verify the FMECA (50.2.3)
for correctness and consistency.) Sometimes failure causes can be determined
through technical dialogue between design and reliability engineers.
Occasionally, however, it is decided that formal laboratory failure analyses
are required to reveal failure mechanisms and provide the basis for effective
corrective action. Laboratory failure analysis should always be done for
reliability test failures if the part failure mode is germane to the failure
cause determinations.

50.1.2.3.2 The disposition of failed hardware is critical and must be properly
controlled to preclude premature disposal and ensure that the actual failed
parts are subjected to required analyses. A disposition team (the Failure
Review Board) is normally comprised of representatives of Government and
contractor engineering, quality assurance, and manufacturing. Access to
hardware peripheral to the failed item may also be required to investigate
failure repeatability under identical test/usage conditions. During initial
development, demand for existing hardware often exceeds supply and can result
in routing that is not easily traceable. A FRACAS should be flexible enough to
accommodate normal operations, and yet be capable of tracking the equipment as
it proceeds through the failure analysis activities. During later phases, the
FRACAS should also be able to accommodate hardware returned from the customer
and hardware returned to subcontractors and vendors for analysis.

50.1.2.3.3 A useful output of the FRACAS is a failure summary report which
groups information about failures of like items or similar functional failures.
With this information, the need for, and the extent of, contemplated corrective
action and its impact can be formulated.
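A failure summary of the kind described can be produced by grouping failure reports on the failed item and its failure mode and counting occurrences, so that recurring failures surface first as candidates for corrective-action review. The sketch below assumes a minimal, hypothetical record format (item name plus failure mode); a real FRACAS record carries far more fields.

```python
from collections import Counter

def failure_summary(failure_reports):
    """Group failure reports by (item, failure mode) and count occurrences.

    failure_reports -- iterable of (item_name, failure_mode) pairs
    Returns a list of ((item, mode), count) pairs sorted by descending
    count, so repeated failures of like items appear first.
    """
    counts = Counter(failure_reports)
    return counts.most_common()
```

For example, two overheat reports against the same power supply would rank above a single radio failure in the summary.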

50.1.2.4 Failure review board (FRB) (task 105). For the acquisition of
expensive, complex, or critical systems or equipment, it may be necessary and
desirable to formalize FRACAS proceedings to the extent of having them
controlled by an FRB.

50.1.2.4.1 The addition of this task to a reliability program will provide the
procuring activity with further assurance of tight control of reporting,
analysis, and corrective actions taken on identified failures. It should be
noted, however, that in some instances the application of this task may
duplicate QA tasks under MIL-Q-9858 and may not be cost-effective or in the
best interests of the overall program. Therefore, a survey should be made by
the procuring activity to determine the need for application of this task.


50.1.2.5 Government plant representative (GPR). The GPR (AFPRO, NAVPRO, and
DCAS) serves as an extension of the procuring activity and provides on-site,
real-time feedback on the contractor's program activities. To assure maximum
coordination and utilization of the GPR, a memorandum of agreement (MOA) or
other documentation should be coordinated early in a program as to the kind of
reliability support the GPR will provide. For example, the GPR can perform
in-plant surveillance and test monitoring on a routine basis. In addition, ad
hoc tasks and inquiries can be performed as needs develop. The GPR can also be
a most valuable observer or participant in reliability program reviews
(50.1.2.2).

50.1.3

50.1.3.1 When the technical tasks required to achieve the reliability
requirements have been defined, the resources required must be identified and
committed to meet objectives efficiently. The task elements should be staffed
and time-phased to ensure that reliability objectives are not arbitrarily
compromised to meet other program cost or schedule constraints.

50.1.3.1.1 The procuring activity reliability monitor can properly influence
the contractor's reliability program by placing on contract: (1) numerical
reliability requirements and the testing requirements to ensure contractual
compliance during development and production; (2) the requirement to implement
specific reliability tasks during conduct of the program; and (3) the
requirement to provide visibility to the procuring activity of the
implementation of the contractual requirements.

50.1.3.1.2 Reliability program success requires that top management be continually informed of program status and unresolved problems that could impact the achievement of program milestones, so that direction and resources can be reoriented as required. In general, the contractor's reliability organization should have: (1) a shared responsibility with other disciplines in its engineering department to achieve reliability; (2) technical control of reliability disciplines; and (3) fiscal control of reliability resources.

50.1.3.1.3 Working arrangements between reliability and other activities (e.g., maintainability, safety, survivability, vulnerability, testing, quality assurance) should be established to identify mutual interests, maximize benefits of mutually supporting tasks, and minimize effort overlap. Such organizational working relationships can also promote more system-oriented decisions if the work is properly integrated at higher levels.

50.1.3.1.4 When all the necessary planning for a reliability program has been accomplished, it should be documented as a reliability program plan (task 101). A reliability program plan is normally submitted as part of the contractor's response to the procuring activity's request for proposals. After mutual agreement has been reached and procuring activity approval has been granted, the reliability program plan must be made a part of the contract. Since the plan is a contractual tool used to evaluate the contractor's reliability program, it should be kept current, subject to procuring activity approval.


50.1.3.2 Program phase transitions. From time to time during the acquisition program, transitions from one program phase to another have to be made. In addition, there may be other occasions within program phases when changes in the reliability program are required. As transition or other change points approach, those responsible for monitoring and achieving system reliability must evaluate the needs for the reliability program and determine its structure if it is needed.

50.1.3.2.1 While almost all tasks described in this standard can be performed at varying levels of detail during any acquisition phase, it is incumbent upon the procuring activity reliability monitor to ensure that only essential tasks are specified, to avoid wasting resources. The procuring activity and the contractor should critically appraise what has, or what will have been, achieved at given milestones. For example, as the transition between FSED and PROD approaches, judgments regarding reliability tasks during production must be made. In some instances, only some kind of minimal testing will be required, while in other instances, a substantial number of FSED tasks will need to be continued, along with some testing. Yet other cases may call for a reliability growth program or perhaps a phase-unique task such as PRAT. (The FSED-PROD transition point was chosen for illustrative purposes. Similar reasoning applies whenever program change points occur or are anticipated.) It is not the purpose of this paragraph to match a set of tasks with every conceivable set of program circumstances. Rather, its purpose is to emphasize that the reliability monitor must assess and project accomplishments, determine what still needs to be accomplished to achieve reliability requirements, and then tailor a program to meet those requirements.

NOTE: Tailoring should not be interpreted as license to specify a zero reliability program. Necessity and sufficiency are the key criteria to be met in determining whether tasks are tailored into, or excluded from, a reliability program.



50.2 Design and evaluation

50.2.1 General considerations

50.2.1.1 Program funding and schedule constraints demand that the limited reliability resources available be used where they are most cost effective. It is also axiomatic that major program level requirements and criteria have a more far-reaching impact than those developed for lower levels. It is appropriate, therefore, to examine as early as possible the numerical reliability requirements, both basic reliability (MTBF, MTBMA, or failure rate) and mission reliability (MCSP, MTBCF, or critical failure rate), mission rules, failure tolerance criteria, et cetera, so that analyses can be selected to show design compliance or to identify shortcomings. During this examination the numerical requirements and criteria should also be evaluated, and if slight changes to them can improve program cost effectiveness, such information should be presented to program management for appropriate action.
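The numerical relationship between these figures of merit can be illustrated. The sketch below assumes the constant failure rates implied by MTBF and MTBCF parameters; the values used are hypothetical, not taken from this standard:

```python
import math

def mcsp(mtbcf_hours: float, mission_hours: float) -> float:
    """Mission completion success probability under a constant
    critical-failure-rate assumption: MCSP = exp(-t / MTBCF)."""
    return math.exp(-mission_hours / mtbcf_hours)

def demand_rate(mtbf_hours: float, mission_hours: float) -> float:
    """Expected maintenance demands per mission from basic reliability
    (MTBF counts ALL failures, not just mission-critical ones)."""
    return mission_hours / mtbf_hours

# Illustrative values: MTBCF = 300 h, MTBF = 50 h, 5-hour mission.
print(round(mcsp(300.0, 5.0), 4))        # ≈ 0.9835
print(round(demand_rate(50.0, 5.0), 2))  # 0.1 maintenance demands per mission
```

The two measures answer different questions: MCSP addresses mission success, while the demand rate drives maintenance and logistics burden.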

50.2.1.1.1 Both quantitative and qualitative analyses are useful in determining where reliability resources should be applied. For example, modeling (50.2.2.1), apportionments (50.2.2.2), and estimates (50.2.2.3) of basic reliability and mission reliability, using as inputs available data modified to reflect changes in environments and usage, can scope the improvement which may be required. A FMECA (50.2.3.1) based on available mission rules and system definition not only can provide the framework for the estimate, but also can be used to determine compliance with failure tolerance criteria and to identify single failure points which are critical to either mission success or safety, or both.

50.2.1.1.2 These kinds of analyses identify improvements which must be made if requirements are to be met. Some techniques which have been used to assure efficient use of available resources in meeting the identified needs are presented in the following paragraphs.

50.2.1.2 Analyses as work direction tools. Reliability analyses are efficient work direction tools, because they can confirm system adequacy or identify the need for design change, providing they are accomplished in conjunction with or reviewed by other disciplines.

50.2.1.3 Analyses in early program phases. The use of reliability analyses is not limited to the phase traditionally thought of as the design phase. Some of the analyses mentioned above, and expanded upon in 50.2.2 and 50.2.4, are useful during the early acquisition phases when design criteria, mission requirements, and preliminary designs are being developed. Since the situation is generally fluid during these phases and firm commitments for full scale development have not yet been made, a comparison of the reliability benefits of competing configuration concepts may be more readily accepted for use in the decision making process. Basically, the ultimate reliability that can be attained by any system, subsystem, or item is limited by the configuration chosen. Therefore, the analyses should be selected to aid in configuration definition in light of the existing design criteria and mission requirements. Preliminary reliability estimates (50.2.2.3) and FMECA (50.2.3.1) are generally the most appropriate for this purpose. The depth of the estimates should be sufficient


for comparison of the configurations.

50.2.1.3.1 The cost of the selected analyses is obviously a function of both the level and breadth requested. For example, an FMECA at the part level for all equipment in a weapons system is time consuming, and action taken to reduce reliability risk as a result of such an analysis will probably not be cost effective (usually the failure of every part is critical to equipment operation). However, when the analysis is conducted at the black box or similar level, single point failures can be identified and the need for a more detailed analysis or design action can be determined. Similar considerations (which are largely subjective) should be used in tailoring other analyses to fit program needs. The cardinal principles are:

FOR BASIC RELIABILITY, DO NOT ANALYZE BELOW THE
LEVEL AT WHICH A FAILURE WILL CAUSE A DEMAND FOR
MAINTENANCE, REPAIR, OR LOGISTICS SUPPORT.

FOR MISSION RELIABILITY, DO NOT ANALYZE BELOW THE
LEVEL NECESSARY TO IDENTIFY MISSION CRITICAL FAILURES.

50.2.2.1 Reliability modeling (task 201). A reliability model of the system/subsystem/equipment is required for making numerical apportionments and estimates; it is mandatory for evaluating complex series-parallel equipment arrangements which usually exist in weapons systems. In every case the rationale behind the reliability model should be documented. A model should be developed whenever a failure tolerant design is being analyzed.

50.2.2.1.1 The basic information for the reliability model is derived from the functional (schematic) block diagrams. The diagrams depict functional relationships of subsystems and components available to provide the required performance. The reliability model reorients the diagrams into a series-parallel network showing reliability relationships among the various subsystems and components. (The authenticity of the functional relationships depicted by the diagrams should be checked by a failure modes, effects, and criticality analysis.)

50.2.2.1.2 The model should be developed as soon as program definition permits, even though usable numerical input data are not available. Careful review of even the early models can reveal conditions where management action may be required. For example, single point failure conditions which can cause premature mission termination or unacceptable hazards can be identified for further consideration.

50.2.2.1.3 The model should be continually expanded to the detail level for which planning, mission, and system definition are firm. The expansion need not be to the same level for all functions until design definition is complete. In the interim, more detail should be added to the model as it becomes available so that evaluations may proceed apace with program decisions.

50.2.2.1.4 Together with duty cycle and mission duration information, the model is used to develop mathematical expressions or computer programs which, with appropriate failure rate and probability of success data, can provide apportionments, estimates, and assessments of basic reliability and mission reliability.
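The reduction of such a series-parallel model to numbers can be sketched as follows. The subsystem arrangement, failure rates, and mission time are hypothetical, and constant failure rates are assumed:

```python
import math

def r_series(*blocks):
    """Series combination: every block must survive."""
    out = 1.0
    for r in blocks:
        out *= r
    return out

def r_parallel(*blocks):
    """Active-redundant (parallel) combination: at least one block survives."""
    fail = 1.0
    for r in blocks:
        fail *= (1.0 - r)
    return 1.0 - fail

def r_exp(failure_rate_per_hour, mission_hours):
    """Block reliability from a constant failure rate and operating time."""
    return math.exp(-failure_rate_per_hour * mission_hours)

# Hypothetical 4-hour mission: a receiver in series with two redundant
# transmitters.  Basic reliability uses the serial model (every failure
# is a maintenance demand); mission reliability credits the redundancy.
t = 4.0
rx = r_exp(200e-6, t)              # receiver: 200 failures per 10^6 h
tx = r_exp(500e-6, t)              # each transmitter: 500 per 10^6 h
mission_r = r_series(rx, r_parallel(tx, tx))
basic_r = r_series(rx, tx, tx)
print(mission_r > basic_r)         # → True: redundancy aids the mission only
```

The same network evaluated two ways makes the distinction between the serial (basic reliability) model and the mission reliability model concrete.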

50.2.2.2 Reliability allocations (task 202). System reliability requirements can evolve in a number of ways, from informed judgments to analyses based on empirical data. Ideally, the requirements are such that the total cost of developing, procuring, operating, and maintaining the system over its life is minimized.

50.2.2.2.1 Once quantitative system requirements have been determined, they are allocated or apportioned to its subsystems. A number of schemes can be used, but the objective is common--to transform the system requirement into manageable lower level requirements. Even though the initial allocations may be gross, they can nevertheless indicate to managers the scope of the resources required for the reliability program.

50.2.2.2.2 Before resources can be specifically allocated for achieving subsystem requirements, however, the specific subsystem requirements must be refined. A useful technique in the refinement process is to allot some failure probability to a reserve for each subsystem. This technique recognizes that new functions will be added and thereby precludes the need for continual reallocation to accommodate additional design definitions. Included in this process should be a comparison of the emerging requirements with empirical data for identical or similar hardware, to determine the realism of the allocation in terms of reasonable expectation. If some of the requirements appear to be unreasonably difficult to achieve, then the analysis becomes the basis for performing design tradeoffs among the subsystems to reallocate the system requirement. This total process--gross allocations, comparisons with empirical data, tradeoffs, reiterating as required--eventually results in subsystem requirements. Then, with this information, the amount of effort and personnel resources required can be estimated and programmed.
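One simple apportionment scheme of this kind, with a held-back reserve for functions added later, can be sketched as follows. The subsystem names, complexity weights, and reserve fraction are illustrative assumptions, not values from this standard:

```python
def allocate(system_failure_rate, complexity_weights, reserve_fraction=0.1):
    """Apportion a system failure-rate budget to subsystems in proportion
    to relative complexity, holding back a reserve so later design
    additions do not force continual reallocation."""
    budget = system_failure_rate * (1.0 - reserve_fraction)
    total = sum(complexity_weights.values())
    return {name: budget * w / total
            for name, w in complexity_weights.items()}

# Hypothetical system budget: 1000 failures per 10^6 hours to distribute.
alloc = allocate(1000.0, {"radar": 5, "computer": 3, "displays": 2})
print(alloc["radar"])   # 450.0 — half of the 900 allocable units
```

Each allocated failure rate would then be compared against empirical data for similar hardware and traded off among subsystems, as described above.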

50.2.2.2.3 The allocation process should be initiated as soon as possible in the early acquisition phases, for it is then that most flexibility in tradeoffs and redefinition exists. Another reason for starting early is to allow time to establish lower level reliability requirements (system requirement allocated to subsystems, subsystem requirement allocated to assemblies, et cetera). Also, the requirements must be frozen at some point to establish baseline requirements for designers.

50.2.2.2.4 After the lower level reliability requirements are defined, they should be levied on the responsible equipment engineers (contractor and subcontractor) for all hardware levels. Without specific reliability requirements which must be designed to or achieved, reliability becomes a vague and undefined general objective for which nobody is responsible. From another perspective, program progress can be measured by comparing defined reliability requirements at a given milestone/time period with what has actually been accomplished.

50.2.2.2.5 The reliability requirements produced from allocations should be the basis for essential inputs to other related activities. Maintainability, safety, quality engineering, logistics, and test planning are examples of activities whose work will be facilitated with established reliability requirements.

50.2.2.3 Reliability predictions (task 203)

50.2.2.3.1 General. Allocations are determined from the system reliability requirements to provide lower level requirements which are levied on the equipment designers (50.2.2.2). As design work progresses, predictions based on previously generated data, and assessments based on program test data, are the tools used to indicate whether the allocated requirement can or will be met. However, predictions should not be used as a basis for determining attainment of reliability requirements. Attainment of requirements will be based on representative test results.

50.2.2.3.1.1 Predictions combine lower level reliability data to indicate equipment reliability at successively higher levels, from subassemblies through subsystems to the system. Predictions falling short of requirements at any level signal the need for management and technical attention. A shortfall in basic reliability, for example, may be offset by simplifying the design, by use of higher quality parts, or by trading off detailed performance tolerances. In addition, a shortfall in mission reliability may be offset by the use of redundancy or alternative modes of operation (it should be noted that such design techniques increase system complexity, reduce basic reliability, and increase life cycle cost). Alternatively, shortfalls may dictate the need to reaccomplish the reliability allocation and to redefine requirements which can reasonably be achieved at the lower equipment levels.

50.2.2.3.1.2 The prediction task, iterative and interrelated with activities such as reliability allocation and configuration analyses (50.2.3), should be specified by the procuring activity during the early acquisition phases to determine reliability feasibility, and during the development and production phases to determine reliability attainability. Predictions provide engineers and management essential information for day-to-day activities; in addition, they are important supporting elements for program decision-makers.

50.2.2.3.2 Predictions should be made as early as possible and updated whenever changes occur. While early predictions based on parts counts are inherently unrefined because of insufficient design detail, they provide useful feedback to designers and management on the feasibility of meeting the basic reliability requirements. As the design progresses, and hardware relationships become better defined, the model of the system depicting relationships between basic reliability and mission reliability should be exercised to provide predictions up through the system level.

50.2.2.3.3 As a system progresses from paper design to hardware stages, predictions mature as actual program test data become available and are integrated into the calculations. The validity of predictions is a function of data quality and assumptions. Valid, timely analyses projecting or indicating deficient reliability attainment provide the basis for corrective action, and the sooner that corrective action is identified, the less its implementation is impacted by program constraints, and the higher are the payoffs over the life of the system. The reliability values produced from predictions should be the basis for essential inputs to other related activities. Maintainability, safety, quality engineering, logistics, and test planning are examples of activities whose work will be facilitated with established reliability requirements.

50.2.2.3.3.1 Equipment level predictions using part failure rates: (1) provide a basis for identifying design, part selection/application, and environmental problem areas; (2) provide early indication of capability to meet the reliability test requirement; and (3) are essential inputs to system/subsystem level predictions (50.2.2.3.3).

50.2.2.3.3.1.1 Reliability predictions should be accomplished at the lowest equipment level that the preliminary design and configuration analyses permit. For off-the-shelf hardware to be incorporated into a higher level assembly, a prediction can be an empirical data assessment adjusted for new or different mission requirements. For newly-designed equipment, analytical derivation of failure rates for constituent components may have to suffice until actual data can be acquired. Electronic part failure rate prediction techniques are available in the current edition of MIL-HDBK-217. Techniques for predicting failure rates for mechanical and electromechanical devices, however, are not so readily available, and therefore dialogue between reliability and design engineers is important to ensure that mission and environmental impacts on device performance are accounted for in the failure rates. In either the electronic or the nonelectronic case, the part or component failure rates are the basic building blocks for accomplishing higher level predictions.

50.2.2.3.3.1.2 The fundamental reason for early predictions based on parts failure rates is to precipitate appropriate action during development, when it is most tolerable from a program standpoint and most effective from the basic reliability and mission reliability viewpoints. Early review of reliability predictions at lowest equipment levels can identify parts or components which have inadequate margins between parts strength and expected applied stress. In addition, the earlier the review is performed, generally the greater is the range of acceptable options for improving the predictions and the equipment. Whenever predictions fall short of allocated reliability requirements, alternatives such as the following should be considered: identify suitable substitutes, reaccomplish reliability allocations, improve selected parts/components, redesign, or modify the mission to decrease severity of environments or other factors. Some alternatives are more feasible and acceptable than others at given points in development, but all are easier and less expensive to accomplish earlier than later.

50.2.2.3.3.1.3 Reliability predictions at any equipment level become inputs to higher level predictions. Prediction quality at all levels depends on how well the reliability model used represents the hardware and its functional relationships. The better the predictions, in terms of reduced uncertainty, the more justifiable are the reliability and design decisions resulting therefrom--whether the decision is to maintain the status quo or to take action to improve hardware reliability.

50.2.2.3.3.1.4 A serial model prediction of basic reliability must be made for every system, subsystem, and equipment, whether or not it includes any redundancy, to provide estimates as input for maintenance and logistics support plans, ownership cost estimates, and life cycle cost estimates.
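The serial-model arithmetic is a straight summation of constituent failure rates. A minimal parts-count sketch follows; the part quantities and failure rates are hypothetical, and in practice the base rates and quality/environment adjustment factors would come from MIL-HDBK-217:

```python
# Hypothetical parts list: (part type, quantity, failures per 10^6 hours).
parts = [
    ("microcircuit", 40, 0.05),
    ("transistor",   25, 0.02),
    ("resistor",    300, 0.002),
    ("capacitor",   150, 0.005),
    ("connector",    10, 0.10),
]

# Serial model: failure rates add, and every failure is a maintenance demand.
lambda_total = sum(qty * rate for _, qty, rate in parts)  # per 10^6 h
mtbf_hours = 1.0e6 / lambda_total
print(round(lambda_total, 2), round(mtbf_hours))  # 4.85 206186
```

Because the serial model ignores redundancy, its MTBF is the right input for maintenance, logistics, and ownership cost estimates even when the mission reliability model credits redundant paths.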


50.2.3 Configuration analyses
50.2.3.1 Failure mode, effects, and criticality analysis (FMECA) (task 204). A FMECA is a powerful tool to optimize the performance/life cycle cost tradeoff between mission reliability and basic reliability at the black box/component or major subsystem level, where these tradeoffs are most appropriately analyzed and evaluated. Potential design weaknesses are identified through the use of engineering schematics and mission rules to systematically identify the likely modes of failure, the possible effects of each failure (which may be different for each life/mission profile phase), and the criticality of each effect on safety, readiness, mission success, demand for maintenance/logistics support, or some other outcome of significance. A reliability criticality number may be assigned to each failure mode, usually based on failure effect, severity, and probability of occurrence. These numbers are sometimes used to establish corrective action priorities, but because of the subjective judgment required to establish them, they should be used only as indicators of relative priorities. FMECA can also be used to confirm that new failure modes have not been introduced in transforming schematics into production drawings.
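A relative-priority ranking of this kind can be sketched as follows. The failure modes, severity scale, and probabilities are illustrative assumptions (MIL-STD-1629A defines the formal criticality-number procedure), and, as cautioned above, the resulting numbers indicate relative priority only:

```python
# Hypothetical failure-mode records:
# (mode, severity 1-4, probability of occurrence, conditional probability
#  that the failure mode produces the end effect).
modes = [
    ("valve sticks open", 4, 1e-4, 0.8),
    ("seal leak",         2, 5e-4, 1.0),
    ("sensor drift",      3, 2e-4, 0.5),
]

# Rank by a simple severity-weighted expected-effect score.
ranked = sorted(modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for mode, sev, p_occ, beta in ranked:
    print(f"{mode:18s} relative criticality {sev * p_occ * beta:.1e}")
```

Note how a low-severity but frequent mode can outrank a severe but rare one; this is exactly why the text warns that such scores should only guide, not dictate, corrective action priorities.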

50.2.3.1.1 The initial FMECA can be done early in the CONCEPT phase, and because only limited design definition may be available, only the more obvious failure modes may be identified. It will, however, identify many of the single failure points, some of which can be eliminated by a schematic rearrangement. As greater mission and design definitions are developed in VALID and FSED phases, the analysis can be expanded to successively more detailed levels and ultimately, if required, to the part level. Additionally, for non-detectable failures, the analysis should be carried further to determine the effect of a second failure (e.g., double-point failure). Where non-detectable failures cannot be eliminated, scheduled maintenance procedures may have to be modified to minimize their probability of occurrence. Non-detectable failures (e.g., check valves, weight-on-wheels switches, et cetera) are often overlooked by analysis, and the FMECA should be carried out far enough to consider the overall effect on the total system. With regard to one-shot systems, it may be particularly desirable to analyze manufacturing documentation such as circuit board layouts, wire routings, and connector keying to determine if new failure modes have been introduced that are not in circuit schematics.

50.2.3.1.2 The usefulness of the FMECA is dependent on the skill of the analyst, the available data, and the information he provides as a result of the analysis. The FMECA format may require additional information such as failure indication, anticipated environment under which the failure may be expected to occur, time available for operational corrective action, and the corrective action required. The requirement to supply such additional information should in general be limited to those potential failures which imperil the crew or preclude mission completion. FMECA results may suggest areas where the judicious use of redundancy can significantly improve mission reliability without unacceptable impact on basic reliability, and where other analyses, e.g., electronic parts tolerance analyses, should be made or other provisions, such as environmental protections, should be considered. Additionally, FMECA results can be used to provide the rationale for the details of operating procedures used to ameliorate undesirable failure modes and to document the residual risk. FMECA is also an effective tool for evaluating the effectiveness of built-in-test.


50.2.3.1.3 Finally, FMECA results should be used to confirm the validity of the model (50.2.2.1) used in computing estimates for subsystems or functional equipment groupings, particularly where some form of redundancy is included. The identity of reliability critical items (50.2.4.3) which are a part of the selected configurations should be retained and included in the RFP for the development phase. These items are the prime candidates for detailed analysis, growth testing, reliability qualification testing, reliability stress analyses, and other techniques to reduce the reliability risk. It is advisable to request the respondents to examine the list of reliability critical items and make appropriate recommendations for additions and deletions with supporting rationale. FMECA results should be used in defining test and checkout procedures to assure all essential parameters, functions, and modes are verified.

50.2.3.1.4 Because of the many and varied skills required to determine failure modes, effects, corrective action, etc., the FMECA requires inputs from many disciplines. For this reason, it is relatively unimportant which engineering group is selected to make the analysis. What is important is the critical examination of the results by all disciplines which could have useful knowledge that can be brought to bear on the analysis. The analysis is most effective when made as the design progresses, i.e., it is a working tool. It is therefore more cost effective to review the analysis prior to formal publication and at scheduled Program Reviews.

50.2.3.2 Sneak circuit analysis (SCA) (task 205). A SCA is based on the use of engineering and manufacturing documentation. Its purpose is to identify latent paths which cause occurrence of unwanted functions or inhibit desired functions, assuming all components are functioning properly. SCA of electro-mechanical circuits is a useful technique that can also be used for discrete analog and digital circuitry. Finally, the analysis should be considered for critical systems and functions where other techniques are not effective, but should not be applied to off-the-shelf computer hardware such as memory or data processing equipment.

50.2.3.2.1 SCA is a useful engineering tool which can be used to identify sneak circuits, drawing errors, and design concerns. The effects of varying environments are not normally considered, and sneak circuits which result from hardware failure, malfunction, or environmentally sensitive characteristics are not usually identified. The identification of a sneak circuit does not always indicate an undesirable condition; in fact, some have been used to accomplish tasks when other circuitry has failed. The implications of a sneak circuit, therefore, must be explored and its impact on the circuit function determined before any action is taken.

50.2.3.2.2 SCA may be expensive. It is usually performed late in the design cycle after the design documentation is complete, which makes change difficult and costly to implement, and it is not defined as a technique by any MIL-STD. Therefore, SCA should be considered only for components and circuitry which are critical to mission success and safety. Cost may rule out the use of SCA for digital logic circuits because it is necessary to consider all combinations of switching positions, transients, and timing, which could require considerable computer time.


50.2.3.3 Electronic parts/circuits tolerance analysis (task 206). Because of within-specification part tolerance buildup, an output from a circuit, subassembly, or equipment may be outside spec values and therefore unacceptable. In such cases, fault isolation will not identify any part as failed or input as unacceptable. To preclude the existence of this condition, a parts/circuits tolerance analysis is conducted. This analysis examines, at component interconnections and input and output points, the effects of parts/circuits electrical tolerances and parasitic parameters over the range of specified operating temperatures. This analysis has proven cost effective in identifying equipment performance/reliability problem areas so they can be corrected prior to production.

50.2.3.3.1 The analysis considers expected component value variations due to manufacturing variances (purchased tolerances), drift with time (life-drift), and temperature. Some of the characteristics examined are relay opening and closing times, transistor gain, resistance, inductance, capacitance, and component parasitic parameters. Maximum within-spec input signal or power voltage, phase, frequency, and bandwidth, and impedance of both signal and load should also be considered. Circuit mode parameters, such as voltage, current, phase, and waveform, can be analyzed for their effect on circuit component performance. Finally, under worst case conditions, the timing of sequential events, circuit load impedance, power dissipation, and element rise time should be considered.
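The worst-case corner arithmetic behind such an analysis can be sketched for a simple resistive divider; the component values and tolerances below are illustrative assumptions, not requirements:

```python
from itertools import product

def divider_out(vin, r1, r2):
    """Output of a resistive divider: Vout = Vin * R2 / (R1 + R2)."""
    return vin * r2 / (r1 + r2)

# Hypothetical circuit: 10 V input +/-2%; 1k/2k divider with +/-5%
# purchased tolerance plus a +/-3% allowance for life-drift and
# temperature, combined additively for a worst-case bound.
tol_r = 0.05 + 0.03
corners = [
    divider_out(vin, r1, r2)
    for vin, r1, r2 in product(
        (9.8, 10.2),                                  # input extremes
        (1000 * (1 - tol_r), 1000 * (1 + tol_r)),     # R1 extremes
        (2000 * (1 - tol_r), 2000 * (1 + tol_r)))     # R2 extremes
]
print(f"worst-case Vout: {min(corners):.3f} V to {max(corners):.3f} V")
```

Exhaustive corner evaluation like this grows exponentially with the number of parameters, which is why the text notes that a computer is effectively mandatory for complex circuitry.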

50.2.3.3.2 In making this analysis, equivalent circuits and mode-matrix analysis techniques are used to prove that the circuit/equipment will meet specification requirements under all required conditions. The use of a computer is recommended to solve the matrix problem inherent in mode-matrix analysis of complex circuitry.

50.2.3.3.3 This analysis is considered to be relatively expensive because of the skill levels required and the time-consuming job of preparing the input information for the computer (use of a computer is mandatory in most cases). For this reason, its application should probably be limited to critical circuitry. For the purpose of this analysis, power circuitry, e.g., power supplies and servo drivers, is usually critical, as to a lesser extent are lower power circuits, such as intermediate frequency strips. Because of the difficulty in specifying precisely the variables to be considered and their ranges, it may be more efficient to specify a parts/circuits analysis of critical circuitry; to require the supplier to identify the circuitry, the variables to be considered, and the statistical limit criteria to be used in evaluating circuit/system performance; and to propose his effort on that basis. Subsequent negotiation prior to procuring the analysis should result in a tailored task that is mutually satisfactory.

50.2.4.1 Failure tolerance criteria. A system which can tolerate failures and still successfully complete a mission has a higher MCSP than one which must abort following a failure. System, subsystem, or equipment designs which have this attribute are sometimes called failure tolerant. Statements which establish the specifics of such tolerance are called failure tolerance criteria.


50.2.4.1.1 These criteria provide standards for design compliance, and shape
subsystem architecture. They are usually found in several places in weapon
system and subsystem specifications. Typical of such criteria is the following:
"A single failure in any subsystem shall not cause or require a mission abort,
and during an abort the single malfunction or failure of a subsystem or
component shall not cause loss of the crew." Criteria such as these influence
the design and operation of almost all subsystems, and therefore an organized
approach is required to meet them. The compliance attained by this approach
will use a minimum of added redundancy and complexity. It must be realized
that failure tolerant design techniques usually increase complexity and total
number of parts; reduce basic reliability (MTBF), maintenance-related
reliability (MTBMA), and logistics-related reliability (MTBR); and thus usually
increase both acquisition cost and cost of ownership.
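The tradeoff described above can be shown numerically. The following sketch is
an illustration only; the channel failure rate and mission time are assumed
values, not taken from this standard. A duplicated channel raises MCSP while
lowering basic reliability (MTBF), because every added part can still fail and
demand maintenance:

```python
import math

# Assumed values for illustration only.
LAMBDA = 1.0 / 2000.0   # failure rate of one channel (failures per hour)
T_MISSION = 10.0        # mission duration (hours)

def mcsp_single(lmbda, t):
    """Mission completion success probability, single-string design."""
    return math.exp(-lmbda * t)

def mcsp_dual(lmbda, t):
    """MCSP with two redundant channels; the mission is lost only if
    both channels fail during the mission."""
    p_fail = 1.0 - math.exp(-lmbda * t)
    return 1.0 - p_fail ** 2

def basic_mtbf(n_channels, lmbda):
    """Basic (series) MTBF: every failure counts against MTBF,
    whether or not the mission survives it."""
    return 1.0 / (n_channels * lmbda)

print(round(mcsp_single(LAMBDA, T_MISSION), 6))  # 0.995012
print(round(mcsp_dual(LAMBDA, T_MISSION), 6))    # 0.999975 -- MCSP up
print(round(basic_mtbf(1, LAMBDA)))              # 2000 hours
print(round(basic_mtbf(2, LAMBDA)))              # 1000 hours -- MTBF down
```

The mission reliability gain and the basic reliability loss occur together,
which is exactly why the paragraph above warns that failure tolerance raises
both acquisition cost and cost of ownership.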

50.2.4.1.2 Compliance with the mission success criteria can best be determined
by examining functional diagrams, system schematics, and software
specifications and documentation in the light of mission rules and
requirements. Particular attention should be paid to providing power from
different sources (where feasible) to redundant or alternative means of
accomplishing a function. Besides different power sources, consideration
should also be given to the use of different connectors/wiring pins, physical
separation/orientation, and different software for redundant equipments. More
generally, careful scrutiny is required to identify and avoid arrangements
which can invalidate the functional redundancy provided.

50.2.4.1.3 This process of confirming compliance with criteria should be
continued through FSED and iterated as dictated by proposed changes during
production.

50.2.4.2 Parts program (task 207). Conducting an
aggressive parts control and application program increases the probability of
achieving and maintaining inherent equipment reliability, minimizes parts
proliferation, logistics support costs, and system life cycle costs. The added
investment required for a vigorous program which controls parts selection and
application can be offset by reduced system life cycle costs for repairable
systems and by overall system effectiveness for nonrepairable systems. In some
cases the use of higher quality parts can even lower item acquisition cost
through reduction in the amount of assembly line rework as well as eliminate
additional costs for drawings and test data required when using nonstandard
parts.

50.2.4.2.1 Parts and components are the basic items comprising higher level
assemblies, which in turn ultimately constitute the system, where the system
may be a radio, a space satellite, or a nuclear submarine. Significant
contributions toward system optimization can be realized by applying attention
and resources to parts application, selection, and control starting early in
the VALID phase and continuing throughout the life of the system.

50.2.4.2.2 The decisions as to the depth and extent of the parts program
designed for a particular item acquisition should be made based on
considerations of factors such as: mission criticality, parts essentiality (to
successful mission completion and reduced frequency of maintenance),
maintenance concept, production quantity, parts availability, amount/degree of
new design, and parts standardization status. A comprehensive parts program
can be just as essential for a relatively simple device (e.g., UHF radio) with
a large projected inventory as for a single, complex device (e.g., special
purpose space system).

50.2.4.2.3 A comprehensive parts program will consist of the following
elements:

a. A parts control program (in accordance with MIL-STD-965)

b. Parts standardization

c. Parts application (derating) guidelines established by the contractor

d. Parts testing, screening, or validation

e. GIDEP participation as applicable (MIL-STD-1556)

If such a program is embarked upon, it is imperative that both the procuring
activity and the contractor assign qualified personnel, because the dynamic
nature of parts and component technology can quickly render existing knowledge
and experience obsolete. Guidelines for selecting procedure I or II of
MIL-STD-965 for parts control are contained in that document.

50.2.4.2.4 The procuring activity should provide general parts application
guidelines (e.g., MIL-P-11268(ARMY), MIL-E-5400) for the producer to use in
establishing the parts application criteria. These criteria are the standards
established and enforced by the contractor for equipment designers (contractor
and subcontractor), and should be adhered to because failure rates can increase
dramatically (i.e., reliability decreases) with exposure to increased stress
levels. Deviations to the parts application criteria should be granted through
the parts control program only after evaluating the actual part stress
conditions, design alternatives, and impact on circuit and overall system
reliability.
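A derating criterion of the kind described above reduces, in practice, to
comparing an applied stress ratio against a limit. The sketch below is
hypothetical; the limit values are invented for illustration and are NOT taken
from MIL-P-11268, MIL-E-5400, or any other guideline:

```python
# Hypothetical derating limits: maximum allowed (applied / rated) ratio.
DERATING_LIMITS = {
    "resistor_power": 0.50,
    "capacitor_voltage": 0.60,
}

def check_derating(stress_type, applied, rated):
    """Return (within_limit, stress_ratio) for one part application."""
    ratio = applied / rated
    return ratio <= DERATING_LIMITS[stress_type], ratio

print(check_derating("resistor_power", applied=0.20, rated=0.50))
# (True, 0.4)  -- within the assumed 50% power derating limit
print(check_derating("capacitor_voltage", applied=40.0, rated=50.0))
# (False, 0.8) -- exceeds the assumed 60% voltage derating limit
```

A part flagged by such a check would be a candidate for the deviation process
described above, evaluated against actual stress conditions and design
alternatives.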

50.2.4.2.5 The basic objective of a procuring activity's parts program is to
control the selection and use of standard and nonstandard parts.
Occasionally--sometimes often, depending on the system--it becomes necessary
for the contractor to propose the use of a nonstandard part. Proposals for use
of nonstandard parts should be made and accepted only after other options which
use standard parts have been investigated. (The parts control activity,
besides ruling on the immediate application, should normally make a judgment on
whether the part in question meets enough of the criteria to make it a standard
part.) Use of standard parts is important because it avoids introducing
additional parts into the DoD inventory, it tends to assure a supply of parts
throughout the system's life, and it can limit tendencies to overdesign or
"gold-plate." It is also important to recognize the potential for system
unreliability and increased maintenance contributed by interconnecting wiring
and common hardware (e.g., relays, switches, connectors) purchased under
separate contract.

50.2.4.2.6 Parts programs should be initiated early in design and continued
throughout the life of the system. But, before the beginning of FSED, the
procuring activity should clearly spell out an order of preference of part
quality levels for use in the system. In addition, the procuring activity
should identify prohibited part types. For certain applications, special tests
of standard parts may have to be used to obtain acceptable parts, which are then
unique and must be so identified. It is most important to emphasize, however,
that special testing, identification, and selection inhibits standardization,
since those processes produce a nonstandard part which may not be readily
available to support the system throughout its life.

50.2.4.2.7 Parts program activities are interrelated with all other analyses
described in this document, and with analyses performed by other disciplines
such as safety, quality engineering, maintainability, survivability and
vulnerability. Any of these analyses can indicate the need for different parts:
upgraded or unique, in some cases, to meet system requirements; standard or
readily available parts, in other cases, to minimize system life cycle costs
and ensure supportability.

50.2.4.2.8 An effective parts program requires that knowledgeable parts
engineers be used by both the procuring activity and the contractor. Government
agencies such as the Defense Industrial Supply Center, the Defense Electronics
Supply Center, and the Rome Air Development Center can provide excellent
support. Logisticians should always be consulted for their inputs, because they
will be required to support the system operationally. The investment in parts
programs generally pays handsome dividends in terms of reduced operational
costs and improved system operational effectiveness.

50.2.4.3 Reliability critical items (task 208). Reliability critical items
are those items whose failure can significantly affect system safety,
availability, mission success, or total maintenance/logistics support cost.
Critical items shall include, but not be limited to, those identified by
reliability analysis and FMECA. High-value items should always be considered
reliability critical for life cycle cost. A single point failure identified as
a mission-critical item, and any design redundancy or alternative modes of
operation proposed as a means of eliminating that single point failure, must
both be considered in light of their effects on both operational effectiveness
and life cycle cost. In other words, redundancy may be necessary, but it must
be justified in terms of what it will cost, and what it will buy, over the
entire life cycle of the system's inventory.

50.2.4.3.1 Reliability critical items, once identified as a part of the
selected configurations, should be retained and included in the RFP for
subsequent phases. These items are the prime candidates for detailed analysis,
growth testing, reliability qualification testing, reliability stress analyses,
and other techniques to reduce the reliability risk. It is advisable to
request the respondents to examine the list of reliability critical items and
make appropriate recommendations for additions and deletions with supporting
rationale.

50.2.4.4 Effects of functional testing, storage, handling, packaging,
transportation and maintenance (task 209). Planned storage and/or useful life are
important considerations for every system, subsystem or component. To gain
some assurance that these items can successfully tolerate foreseeable
operational and storage influences, it may be advisable to conduct analyses and
tests to determine the effects on them of packaging, transportation, handling,
storage, repeated exposure to functional testing, et cetera. The information
from these analyses and tests can support trade-offs to influence design
criteria. This task contains a suggested description of the effort. It is
recommended that this task be applied after consulting with
cognizant equipment engineers and quality assurance, test and logistics
experts.


50.3 Development and production testing

50.3.1 General

50.3.1.1 Objectives. The reliability test program must serve three
objectives in the following priority: (1) disclose deficiencies in item
design, materiel and workmanship; (2) provide measured reliability data as
input for estimates of operational readiness, mission success, maintenance
manpower cost, and logistics support cost; and (3) determine compliance with
quantitative reliability requirements. Cost and schedule investment in
reliability testing shall conform to these priorities to ensure that the
overall reliability program is both effective and efficient. Four types of
reliability testing are contained in task section 300: ESS, RDGT, RQT and PRAT.
Environmental stress screening (ESS, task 301) and reliability
development/growth testing (RDGT, task 302) are reliability engineering tests.
Program plans shall emphasize early investment in ESS and RDGT to avoid
subsequent costs and schedule delays. Reliability qualification tests (RQT,
task 303) and production reliability acceptance tests (PRAT, task 304) are
reliability accounting tests. They shall be tailored for effectiveness and
efficiency (maximum return on cost and schedule investment) in terms of the
management information they provide. A properly balanced reliability program
will emphasize ESS and RDGT, and limit, but not eliminate, RQT and PRAT.

50.3.1.2 Integrated testing. It is DOD policy that performance, reliability,
and environmental stress testing shall be combined, and that environmental
stress types shall be combined insofar as practical. It is the responsibility
of the PA to draw these tests together into an integrated, effective, and
efficient test program. For example, mechanical, hydraulic, pneumatic, and
electrical equipment are usually subjected to three qualification tests:
performance, environmental, and endurance (durability). The integration of
these separate tests into a more comprehensive reliability test program can
avoid costly duplication and ensure that deficiencies are not overlooked as
they often are in the fragmented approach.

50.3.1.2.1 Performance tests should be conducted as soon as items are
fabricated. They should be brief, and should provide the immediate basis for
correction of any deficiencies they disclose. However, an item that has passed
its performance test must not be considered compliant with Government
requirements until it has shown that it will perform reliably under realistic
conditions.

50.3.1.2.2 Environmental tests such as those described in MIL-STD-810 should
be considered an early portion of RDGT. They must be conducted early in
development, to ensure that time and resources are available to correct the
deficiencies they disclose, and the corrections must be verified under stress.
Such information must be included in the FRACAS (50.1.2.5) as an integral
aspect of the reliability program.

50.3.1.2.3 Endurance (durability) testing usually consists of a normal test,
an overload test, and a mission profile cycling test that duplicates or
approximates the conditions expected in service. Failures must be evaluated,
and corrective actions must be incorporated in the test items. The test must
then be rerun or, at the option of the PA, the test may be completed and an
additional run conducted to show the problems have been corrected. This
information must also be included in the FRACAS. An integrated test program
will combine reliability testing and durability testing.

50.3.1.3 Test realism. A test is realistic to the degree that test conditions
and procedures simulate the operational life/mission/environmental profile of a
production item. Realistic testing can disclose deficiencies and defects that
otherwise would be discovered only after an item is deployed, and it can reduce
the disparity between laboratory and operational reliability values.
Therefore, test realism must be a primary consideration in every reliability
test. A test that only discloses a small fraction of the operational failures
it is supposed to disclose is a waste of time and resources. Conversely, a
test that induces failures which will not occur in service forces unnecessary
expenditures of time and resources to correct those failures. And finally, the
degree to which any reliability test must simulate field service depends on the
purpose of the test.

50.3.1.3.1 Low test realism is often due to omission of a relevant stress, or
incomplete definition of a type of stress. For example, failures that are
caused by vibration are seldom found by tests that apply no vibration, or by
tests that ignore the relevant combinations of vibration frequency, amplitude
and duration of exposure. Establishment of realistic test conditions and
procedures requires a knowledge of the life profile from factory to final
expenditure, to include the micro-environments an item will experience during
each phase of its life profile, based on measurement of the actual stresses
experienced by similar items.

50.3.1.3.2 It is appropriate to apply stress levels greater than those
expected in service, if the purpose of the test is to disclose deficiencies,
and if test conditions do not induce failures that will not occur in service.
On the other hand, both overstress and understress make reliability estimates
inaccurate and distort test results used to determine compliance. Therefore,
overstress (and step-stress) testing may be applied during ESS and the early
portion of RDGT, but the final portion of RDGT, and both RQT and PRAT, should
simulate the operational life profile insofar as practical and cost-effective.

50.3.1.3.3 Precise simulation of the operational life profile would expose
each item and each part of each item to the exact stress types, levels and
durations they will experience in service. Such idealistic testing is seldom
practical or cost-effective. Some stress types cannot be combined in the same
test facility, and some may cost more to reproduce in the laboratory than they
are worth in terms of the failures they cause in service. Stress types may be
applied in series for ESS and the early portion of PRAT. Total test time may
be compressed by reducing the amount of time spent in simulating less stressful
phases of the life profile. (Note that overstress is a valid way to accelerate
the discovery of deficiencies and defects, but it is not a valid means of
compressing test time when reliability is to be measured.) MIL-STD-781
contains guidance for realistic combined-stress, life/mission profile
reliability testing.

50.3.1.4 Use of test data. Measured reliability data must
serve a variety of needs for management information, in addition to its use as
a basis for determining compliance with quantitative reliability requirements.
Point (and interval) estimates of the demonstrated reliability are essential
inputs for operations and support plans, manning and sparing decisions,
ownership cost and life cycle cost estimates. It is imperative that these
inputs be defined in the appropriate units of measurement and based on
realistic test results. MIL-STD-781 contains guidance for confidence interval
estimates of the demonstrated MTBF. The PA may translate demonstrated MTBF
into the proper units of measurement for system reliability parameters, or
require the contractor to perform these translations.
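The point and interval estimates discussed above are commonly computed with the
chi-square method for a time-terminated test under an assumed exponential
failure distribution. The sketch below illustrates that method using the
Wilson-Hilferty approximation to the chi-square quantile (so only the standard
library is needed); the test hours and failure count are invented, and the
exact bounds in MIL-STD-781 may differ slightly from this approximation:

```python
from statistics import NormalDist

def chi2_quantile(p, k):
    """Wilson-Hilferty approximation to the chi-square p-quantile, k df."""
    z = NormalDist().inv_cdf(p)
    c = 2.0 / (9.0 * k)
    return k * (1.0 - c + z * (c ** 0.5)) ** 3

def mtbf_interval(total_hours, failures, confidence=0.90):
    """(lower, point, upper) MTBF estimates for a time-terminated test."""
    alpha = 1.0 - confidence
    point = total_hours / failures
    lower = 2.0 * total_hours / chi2_quantile(1.0 - alpha / 2, 2 * failures + 2)
    upper = 2.0 * total_hours / chi2_quantile(alpha / 2, 2 * failures)
    return lower, point, upper

lo, pt, hi = mtbf_interval(total_hours=1000.0, failures=5)
print(round(lo), round(pt), round(hi))   # roughly 95 200 509
```

A point estimate of 200 hours thus carries a 90% interval spanning roughly
95 to 509 hours, which is why interval estimates, not just points, belong in
the management inputs listed above.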

50.3.1.4.1 In cases where it is impractical or inefficient to demonstrate all
applicable system reliability parameters at the system level of assembly,
system level estimates must be compiled from lower level test results. Audit
trails are required to relate basic reliability measurements (MTBF) with the
proper units of measurement for each system reliability parameter (such as MCSP
and MTBMA), and to account for elements of operational reliability values which
are not simulated during the test (such as the influence of operation and
support concepts, policies and planning factors). The PA may specify each
element of these audit trails, or require that audit trails be developed
subject to PA approval.

50.3.1.4.2 Reliability values measured during ESS and the early portion of
RDGT cannot be expected to correlate with reliability values in service.
Reliability values measured during the final portion of RDGT, and both RQT and
PRAT, must be correlated with reliability values in service, by optimum test
realism and clear traceability between test and field measurements. All
relevant test data must be used to project operational reliability for
estimates of operational effectiveness (readiness and mission success) and
ownership cost (maintenance manpower costs and logistics support cost). Only
chargeable test results shall be used to determine contractual compliance with
quantitative reliability requirements.

50.3.1.5 Failure categories. Failure and relevant
failure are defined by DOD policy. A failure is the event in which any part of
an item does not perform as required by its performance specification. A
relevant failure is one that can occur or recur during the operational life of
an item inventory. Therefore, there are only two types of nonrelevant failure:
(1) those verified as caused by a condition not present in the operational
environment, and (2) those verified as peculiar to an item design that will
not enter the operational inventory. A chargeable failure is a relevant failure
caused by any factor within the responsibility of a given organizational
entity, whether Government or commercial. Every relevant failure shall be
charged to somebody. For example, relevant failures due to software errors are
chargeable to the software supplier; those caused by human errors are
chargeable to the employer, or to the agency responsible for training.
Dependent failures are chargeable, as one independent failure, to the supplier
of the item that caused them, whether that item is GFE or CFE. In keeping with
DOD policy that responsibilities must be clearly defined, chargeability refers
to the responsibility for: (1) the cause of failure, and (2) corrective
action to prevent recurrence of failure.
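The classification rules above reduce to two nonrelevance tests followed by a
single charge assignment. The following minimal sketch encodes that logic; the
function and field names are invented for illustration and appear nowhere in
this standard:

```python
def classify_failure(nonoperational_condition_verified,
                     nonfleet_design_verified,
                     responsible_entity):
    """Apply the two nonrelevance tests; every relevant failure is
    charged to exactly one responsible entity (Government or commercial)."""
    if nonoperational_condition_verified or nonfleet_design_verified:
        return "nonrelevant", None
    return "relevant, chargeable", responsible_entity

print(classify_failure(False, False, "software supplier"))
# ('relevant, chargeable', 'software supplier')
print(classify_failure(True, False, None))
# ('nonrelevant', None)
```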

50.3.1.6 Statistical test plans. The statistical design of any reliability
test depends on the purpose of that test. ESS and RDGT must not include
accept/reject criteria that penalize the contractor in proportion to the
number of failures he finds, because that would be contrary to the purpose of
the test; so these tests must not use statistical test plans that establish
such criteria. RQT and PRAT must provide a clearly defined basis for
determining compliance, but they must also be tailored for effectiveness and
efficiency (maximum return on cost and schedule investment) in terms of the
management information they provide. Therefore, selection of any statistical
test plan for RQT or PRAT shall be based on the amount of confidence gained
(the degree that confidence intervals are reduced) by each additional increment
of testing. For example, a test that stops at the first failure leaves a wide
range of uncertainty; testing to the fifth or sixth failure dramatically
reduces that uncertainty; but testing beyond the eighth or ninth failure buys
very little in terms of increased confidence or reduced risk. Finally,
specified confidence levels, discrimination ratios, and decision risks shall be
subject to tradeoffs with total test time and cost, to include impact cost of
program schedule delay.
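The diminishing-returns claim above can be checked numerically: the ratio of
the upper to the lower 90% confidence bound on MTBF shrinks rapidly through the
first few failures and slowly thereafter. This sketch uses the Wilson-Hilferty
chi-square approximation (standard library only; it is rough at a single
failure, but the qualitative trend is what matters here):

```python
from statistics import NormalDist

def chi2_quantile(p, k):
    """Wilson-Hilferty approximation to the chi-square p-quantile, k df."""
    z = NormalDist().inv_cdf(p)
    c = 2.0 / (9.0 * k)
    return k * (1.0 - c + z * (c ** 0.5)) ** 3

def bound_ratio(failures, confidence=0.90):
    """Upper/lower MTBF bound ratio for a failure-terminated test;
    total test time cancels out of the ratio."""
    alpha = 1.0 - confidence
    return (chi2_quantile(1.0 - alpha / 2, 2 * failures)
            / chi2_quantile(alpha / 2, 2 * failures))

for r in (1, 2, 5, 9, 20):
    print(r, round(bound_ratio(r), 1))
```

The ratio collapses from tens at one failure to under five by the fifth
failure, then creeps downward, which is the basis for limiting RQT and PRAT
test length.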

50.3.1.6.1 Probability ratio sequential test (PRST) plans are only intended to
determine compliance (accept or reject) on the basis of predetermined decision
risks. They are not intended to provide estimates of demonstrated reliability,
and they leave no decisions to the PA once they have been specified. PRST
plans contain significant uncertainties in regard to actual test time.
Therefore, if program cost and schedule are based on the "expected decision
points," rather than the maximum allowable test times, specification of a PRST
will build in potential cost and schedule overruns that the PA cannot control.
In general, PRST plans may be used for PRAT, if only a simple "accept or
reject" decision is desired and if schedule uncertainty is not a major concern,
but they should not be used for RQT.
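The sequential logic underlying PRST-style plans can be sketched as a
probability ratio compared against two boundaries, with the test continuing
until one boundary is crossed (hence the uncertain test time noted above). The
decision risks, lower test MTBF, and discrimination ratio below are assumed
values for illustration, not taken from MIL-STD-781:

```python
import math

ALPHA, BETA = 0.10, 0.10   # assumed producer's / consumer's decision risks
THETA1 = 100.0             # assumed lower test MTBF (minimum acceptable)
D = 1.5                    # assumed discrimination ratio; theta0 = D * theta1

def decision(total_hours, failures):
    """Sequential decision for exponential failure times: continue,
    accept, or reject, based on the log probability ratio."""
    a = math.log((1 - BETA) / ALPHA)       # reject boundary
    b = math.log(BETA / (1 - ALPHA))       # accept boundary
    # Log likelihood ratio of theta1 (bad) vs theta0 (good).
    llr = failures * math.log(D) - (D - 1) * total_hours / (D * THETA1)
    if llr >= a:
        return "reject"
    if llr <= b:
        return "accept"
    return "continue"

print(decision(total_hours=50.0, failures=6))    # reject
print(decision(total_hours=700.0, failures=0))   # accept
print(decision(total_hours=50.0, failures=4))    # continue
```

Note that how long the test runs depends on the failures observed, which is
exactly the schedule uncertainty the paragraph above warns about.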

50.3.1.6.2 Fixed-length test plans should be specified when actual test time
must be subject to PA control, and when something more than a simple accept or
reject decision is desired. For example, the PA may wish to specify a
fixed-length test, assess the data as it becomes available, and make an early
accept decision on the basis of measured test results to date. (Reject
decisions based on real-time data assessment are not recommended, because they
may require changes in the contract.) Fixed-length test plans may also provide
a basis for structuring incentive fees. For example, the contract may state
that base price will be paid for those items having a demonstrated reliability
(point estimate) within a specified range; that an incentive fee will be paid
for reliability above that range; that a penalty or remedy will be required for
reliability below that range; and that items having demonstrated reliability
below a minimum acceptable (observed) value will not be purchased by the
government. These provisions may also apply to production lots.
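The incentive-fee structure described above maps a demonstrated point estimate
into contract bands. A minimal sketch follows; the band edges are invented for
illustration and would in practice come from the negotiated contract:

```python
def fee_category(point_estimate_mtbf,
                 minimum_acceptable=80.0, base_low=100.0, base_high=130.0):
    """Map a demonstrated (point-estimate) MTBF to a contract outcome.
    Band edges are hypothetical placeholders."""
    if point_estimate_mtbf < minimum_acceptable:
        return "not purchased"
    if point_estimate_mtbf < base_low:
        return "penalty/remedy"
    if point_estimate_mtbf <= base_high:
        return "base price"
    return "incentive fee"

total_hours, failures = 1200.0, 10
print(fee_category(total_hours / failures))   # 120 hours -> 'base price'
```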

50.3.1.7 Independence. It is DOD policy that, insofar as possible,
tests which determine compliance with reliability requirements shall be
conducted or controlled by someone other than the supplier whose compliance is
being determined. The PA is responsible for implementation of this policy. It
applies to RQT and PRAT, but it does not apply to ESS or RDGT. The PA may
elect to have RQT and PRAT conducted under separate contract to a government or
commercial test facility. A higher tier contractor may be required to conduct
or control these tests on behalf of the government. The supplier's test
facility may be used by sending in a team of independent test engineers for the
duration of the test. The supplier must be invited to witness all independent
RQT or PRAT of his product, and test results must be fed back to the supplier
(such feedback provides incentive for his quality control program). Exceptions
in which the supplier conducts RQT or PRAT of his own product may be granted
only in situations of technical or financial necessity.

50.3.1.8 Compliance. The contractor is compliant with specified ESS
(task 301) requirements when testing has been performed as specified in the
contract, failures have been corrected, and corrections have been verified.
The contractor shall be held responsible for the achievement of specified
reliability growth during RDGT (task 302). The contract should specify
accelerated effort in event of failure to meet reliability growth targeted on
the specified values (goals), and noncompliance provisions in accordance with
the Defense Acquisition Regulations for failure to meet reliability growth
targeted on the minimum acceptable values (thresholds). The contractor is
compliant with RQT (task 303) or PRAT (task 304) requirements if an accept
decision is reached in accordance with the statistical test plan, or when the
PA accepts items on the basis of real-time data assessment. It is imperative
that a reject decision denote contractual noncompliance, rather than a
potentially endless series of corrective actions and retests; if the PA fails
to ensure that contracts reflect this policy, total test time will remain
inherently open-ended and quantitative reliability requirements will remain
inherently unenforceable.

50.3.1.9 Test documentation. An integrated test and evaluation master plan (TEMP)
must be prepared by the PA, or prepared subject to PA approval. The TEMP must
include each phase of testing, the ground rules for each test, the impact on
CFE, and the criteria for successful completion of each test. In addition, a
test procedures document is required for each type of reliability test (tasks
301, 302, 303, and 304). These documents describe the item(s) to be tested,
the test facilities, the item performance and performance monitoring
requirements, data handling, and location of test instrumentation. The TEMP
and the test procedures documents should be delivered to the PA early enough to
allow PA review and approval (normally 60 days) before the start of testing.
Once testing is underway, the approved TEMP and the test procedures documents
must be used to monitor and control conduct of the tests. Information on
failures must be compared with existing information in the FRACAS. Accurate
record-keeping, with proper failure analysis and corrective action, will
provide the insight needed to manage specified reliability growth (ESS and
RDGT), and will point out the need for any corrective actions late in the
program (RQT and PRAT). Even though most failures should have been corrected
before the start of RQT, latent deficiencies and defects must be corrected as
soon as possible after they are discovered. Delay only compounds the problem
and the cost of correction.

50.3.2.1 Environmental stress screening (ESS) (task 301). ESS is a test, or a
series of tests, specifically designed to disclose weak parts and workmanship
defects for correction. It should be applied to parts, components,
subassemblies, assemblies, or equipment (as appropriate and cost-effective), to
remove defects which would otherwise cause failures during higher-level testing
or early field service. The test conditions and procedures for ESS should be
designed to stimulate failures typical of early field service, rather than to
provide precise simulation of the operational life profile. Environmental
stress types (such as random vibration and thermal cycling) may be applied in
series, rather than in combination, and should be tailored for the level of
assembly at which they are most cost-effective. ESS testing has significant
potential return on investment, for both the contractor and the government,
during both development and production. All-equipment ESS (100% sampling) is
recommended for PA consideration.

50.3.2.1.1 The PA may specify detailed ESS requirements, or have the
contractor develop an ESS test plan subject to PA approval. In either case,
the PA should specify a minimum test time per item, a failure-free interval,
and a maximum test time per item (after which that item will be considered too
worn out by the test to be a deliverable item). The contractor must not be
penalized for the number of failures discovered during ESS, but must be
required to correct every failure, and to prevent recurrence of failures
through use of more reliable parts and the reduction of workmanship errors
during the manufacturing process.
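The three PA-specified limits above (minimum test time, failure-free interval,
maximum test time) combine into a simple screening rule per unit. The hour
values in this sketch are invented for illustration:

```python
# Hypothetical ESS limits (hours); actual values would be PA-specified.
MIN_HOURS, FAILURE_FREE, MAX_HOURS = 40.0, 20.0, 200.0

def ess_status(hours_run, last_failure_at):
    """Decide whether a unit has completed screening, keeps screening,
    or has consumed too much life to remain deliverable."""
    if hours_run > MAX_HOURS:
        return "not deliverable (worn out by test)"
    failure_free = hours_run - (last_failure_at or 0.0)
    if hours_run >= MIN_HOURS and failure_free >= FAILURE_FREE:
        return "screening complete"
    return "continue screening"

print(ess_status(hours_run=55.0, last_failure_at=30.0))   # screening complete
print(ess_status(hours_run=35.0, last_failure_at=None))   # continue screening
print(ess_status(hours_run=210.0, last_failure_at=100.0)) # not deliverable
```

A failure during screening simply resets the failure-free clock; it carries no
penalty, consistent with the paragraph above.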

50.3.2.1.2 ESS must not be confused with PRAT. ESS employs less expensive
test facilities, and is recommended for 100% sampling. PRAT requires a more
realistic simulation of the life profile, and more expensive test facilities,
and therefore is not recommended for 100% sampling. ESS must be conducted by
the contractor, while PRAT must be independent of the contractor if at all
possible. Where the statistical test plans for RQT or PRAT are based on the
exponential distribution (constant failure rate), ESS is a prerequisite for RQT
and PRAT, because those test plans assume that early failures have been
eliminated.

50.3.2.2 Reliability development/growth testing (RDGT) (task 302). RDGT is a
planned, pre-qualification, test-analyze-and-fix process, in which equipments
are tested under actual, simulated, or accelerated environments to disclose
design deficiencies and defects. This testing is intended to provide a basis
for early incorporation of corrective actions, and verification of their
effectiveness, thereby promoting reliability growth. However:

TESTING DOES NOT IMPROVE RELIABILITY. ONLY CORRECTIVE ACTIONS THAT PREVENT THE
RECURRENCE OF FAILURES IN THE OPERATIONAL INVENTORY ACTUALLY IMPROVE
RELIABILITY.

50.3.2.2.1 It is DOD policy that reliability growth is required during
full-scale development, concurrent development and production (where
concurrency is approved), and during initial deployment. Predicted reliability
growth shall be stated as a series of intermediate milestones, with associated
goals and thresholds, for each of those phases. A period of testing shall be
scheduled in conjunction with each intermediate milestone. A block of time and
resources shall be scheduled for the correction of deficiencies and defects
found by each period of testing, to prevent their recurrence in the operational
inventory. Administrative delay of reliability engineering change proposals
shall be minimized. Approved reliability growth shall be assessed and
enforced.

50.3.2.2.2 Predicted reliability growth must differentiate between the
apparent growth achieved by screening weak parts and workmanship defects out of
the test items, and the step-function growth achieved by design corrections.
The apparent growth does not transfer from prototypes to production units;
instead, it repeats in every individual item of equipment. The step-function
growth does transfer to production units that incorporate effective design
corrections. Therefore, RDGT plans should include a series of test periods
(apparent growth), and each of the test periods should be followed by a fix
period (step-function growth). Where two or more items are being tested, their
test and fix periods should be out of phase, so one item is being tested
while the other is being fixed.
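Predicted growth across such test/fix periods is often planned with a
Duane-style model, in which cumulative MTBF rises as a power of accumulated
test time. The model is a common planning tool, not one mandated by this
standard; the starting point and growth rate alpha below are assumed values:

```python
def duane_cumulative_mtbf(t, t0=10.0, mtbf0=50.0, alpha=0.4):
    """Cumulative MTBF after t test hours under the Duane postulate:
    MTBF_c(t) = MTBF_c(t0) * (t / t0) ** alpha."""
    return mtbf0 * (t / t0) ** alpha

def duane_instantaneous_mtbf(t, t0=10.0, mtbf0=50.0, alpha=0.4):
    """Current (instantaneous) MTBF implied by the cumulative curve."""
    return duane_cumulative_mtbf(t, t0, mtbf0, alpha) / (1.0 - alpha)

for hours in (10, 100, 1000):
    print(hours, round(duane_instantaneous_mtbf(hours), 1))
```

Only the step-function portion of such a projected curve, the part produced by
verified design corrections, transfers to production units, per the paragraph
above.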

50.3.2.2.3 RDGT must correct failures that reduce operational effectiveness,
and failures that drive maintenance and logistics support cost. Therefore,
failures must be prioritized for correction in two separate categories: mission
criticality, and cumulative ownership cost criticality. The differences
between required values for the system reliability parameters shall be used to
concentrate reliability engineering effort where it is needed (for example:
enhance mission reliability by correcting mission-critical failures; reduce
maintenance manpower cost by correcting any failures that occur frequently).

50.3.2.2.4 It is imperative that RDGT be conducted using one or two of the
first full-scale engineering development items available. Delay forces
corrective action into the formal configuration control cycle, which then adds
even greater delays for administrative processing of reliability engineering
changes. The cumulative delays create monumental retrofit problems later in
the program, and may prevent the incorporation of necessary design corrections.
An appropriate sequence for RDGT would be: (1) ESS to remove defects in the
test items and reduce subsequent test time, (2) environmental testing such as
that described in MIL-STD-810, and (3) combined-stress, life profile,
test-analyze-and-fix. This final portion of RDGT differs from RQT in two ways:
RDGT is intended to disclose failures, while RQT is not; and RDGT is conducted
by the contractor, while RQT must be independent of the contractor if at all
possible.

50.3.3 Reliability qualification and production reliability acceptance tests

50.3.3.1 Reliability qualification test (RQT) (Task 303). RQT is intended to
provide the government reasonable assurance that minimum acceptable reliability
requirements have been met before items are committed to production. RQT must
be operationally realistic, and must provide estimates of demonstrated
reliability. The statistical test plan must predefine criteria of compliance
("accept") which limit the probability that true reliability of the item is
less than the minimum acceptable reliability requirement, and these criteria
must be tailored for cost and schedule efficiency. However:

TESTING TEN ITEMS FOR TEN HOURS EACH IS NOT EQUIVALENT TO TESTING ONE ITEM FOR
ONE HUNDRED HOURS, REGARDLESS OF ANY STATISTICAL ASSUMPTIONS TO THE CONTRARY.
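A numerical illustration of the statement above, under an assumed Weibull early-failure model (the shape and scale values are illustrative, not from this standard):

```python
# Hedged sketch: with a decreasing hazard rate (Weibull shape < 1, typical of
# early-life failures), ten items at ten hours each and one item at one hundred
# hours give very different chances of finishing with zero failures, even
# though both plans accumulate one hundred item-hours.
import math

def p_no_failure(hours, shape=0.5, scale=100.0):
    """Weibull probability that a single item survives `hours` without failure."""
    return math.exp(-((hours / scale) ** shape))

one_item_100h = p_no_failure(100.0)       # one item tested for 100 hours
ten_items_10h = p_no_failure(10.0) ** 10  # ten independent items, 10 hours each

print(round(one_item_100h, 3))
print(round(ten_items_10h, 3))

# Only under a constant failure rate (shape == 1.0) do the two plans agree,
# which is exactly the "statistical assumption" the statement warns against.
```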

50.3.3.1.1 It must be clearly understood that RQT is a preproduction test (that
is, it must be completed in time to provide management information as input for
the production decision). The previous concept that only required
"qualification" of the first production units meant that the government
committed itself to the production of unqualified equipment.

50.3.3.1.2 Requirements for RQT should be determined by the PA and specified
in the request for proposal. RQT is required for items that are newly
designed, for items that have undergone major modification, and for items that
have not met their allocated reliability requirements for the new system under
equal (or more severe) environmental stress. Off-the-shelf (government or
commercial) items which have met their allocated reliability requirements for
the new system under equal (or more severe) environmental stress may be
considered qualified by analogy, but the PA is responsible for ensuring there
is a valid basis for that decision.

50.3.3.1.3 Prior to the start of RQT, certain documents should be available
for proper conduct and control of the test. These documents include: the
approved TEMP and detailed RQT procedures document, a listing of the items to
be tested, the item specification, the statistical test plan (50.3.1.6), and a
statement of precisely who will conduct this test on behalf of the government
(50.3.1.7). The requirements and submittal schedule for these documents must
be in the CDRL.

50.3.3.2 Production reliability acceptance test (PRAT) (Task 304). PRAT is
intended to simulate in-service evaluation of the delivered item or production
lot. It must be operationally realistic, and may be required to provide
estimates of demonstrated reliability. The statistical test plan must
predefine criteria of compliance ("accept") which limit the probability that
the item tested, and the lot it represents, may have a true reliability less
than the minimum acceptable reliability, and these criteria must be tailored
for cost and schedule efficiency. PRAT may be required to provide a basis for
positive and negative financial feedback to the contractor, in lieu of an
in-service warranty (50.3.1.6). Because it must simulate the item life profile
and operational environment, PRAT may require rather expensive test facilities;
therefore, all-equipment PRAT (100% sampling) is not recommended. Because it
must provide a basis for determining contractual compliance, and because it
applies to the items actually delivered to operational forces, PRAT must be
independent of the supplier if at all possible (50.3.1.7). Finally, even
though sampling frequency should be reduced after a production run is well
established, the protection that PRAT provides for the government (and the
motivation it provides for the contractor's quality control program) should not
be discarded by complete waiver of the PRAT requirement.
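The predefined accept criteria can be sketched numerically. The plan below is a hypothetical fixed-duration plan under an exponential (constant failure rate) assumption; the test hours, accept number, and minimum acceptable MTBF are illustrative values, not taken from MIL-STD-781 or this standard.

```python
# Hedged sketch: consumer's risk of a fixed-duration accept/reject plan.
# Accept if no more than `c` failures occur in `total_hours` of test time;
# the risk is the probability of accepting product whose true MTBF is only
# the minimum acceptable value. Failures are modeled as Poisson with mean
# T / MTBF (exponential, constant-failure-rate assumption).
import math

def consumer_risk(total_hours, c, min_acceptable_mtbf):
    """P(failures <= c) when the true MTBF equals the minimum acceptable value."""
    mean = total_hours / min_acceptable_mtbf
    return sum(math.exp(-mean) * mean**k / math.factorial(k) for k in range(c + 1))

# Hypothetical plan: 1000 test hours, accept on 2 or fewer failures,
# minimum acceptable MTBF of 200 hours.
risk = consumer_risk(1000.0, 2, 200.0)
print(round(risk, 3))
```

Tightening the accept number or lengthening the test both reduce this risk, which is the cost/schedule trade the task text says must be tailored.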


60. DATA ITEM DESCRIPTIONS (DID)

60.1 The following is a list of data item descriptions associated with the
reliability tasks specified herein:

TASK APPLICABLE DID DATA REQUIREMENT

101 DI-R-7079 Reliability Program Plan

103 DI-R-7080 Reliability Status Report

104 DI-R-7041 Report, Failure Summary and Analysis

201 DI-R-7081 Reliability Mathematical Model(s)

202 DI-R-2114 Report, Reliability Allocation

203 DI-R-7082 Reliability Predictions Report

204 DI-R-1734 Report, Failure Modes, Effects and Criticality
Analysis

DI-R-2115A Report, Failure Mode and Effect Analysis (FMEA)
(DI-R-2115A is to be used only when MIL-STD-1629
has been designated as the basis for
MIL-STD-785B, Task 204)

205 DI-R-7083 Sneak Circuit Analysis Report

206 DI-R-7084 Electronic Parts/Circuits Tolerance
Analysis Report

208 DI-R-35011 Plan, Critical Item Control

60.2 The following tasks have DIDs associated with them related to imposition
of MIL-STD-781C:

301 DI-R-7040 Report, Burn-in Test

302,
303, DI-R-7033 Plan, Reliability Test
304

303,
304 DI-R-7035 Procedures, Reliability Test and Demonstration

303, DI-R-7034 Reports, Reliability Test and Demonstration
304 (Final report)

NOTES: (1) Only data items specified in the CDRL are deliverable. Therefore,
those data requirements identified in the Reliability Program Plan must also
appear in the CDRL.

(2) The PA should review all DIDs and assure, through tailoring, that
the preparation instructions in the DID are compatible with task requirements
as specified in the SOW.
