The European Organisation for Civil Aviation Equipment

L’Organisation Européenne pour l’Equipement de l’Aviation Civile

GUIDELINES FOR ANS SOFTWARE SAFETY ASSURANCE

This document is the intellectual and commercial property of EUROCAE.


It is presently commercialised by EUROCAE
This electronic copy is delivered to your company/organisation for internal use exclusively.
In no case, may it be re-sold, or hired, lent or exchanged outside your company/organisation.

ED-153
August 2009

102 rue Etienne Dolet, 92240 MALAKOFF, France
Tel: 33 1 40 92 79 30   Fax: 33 1 46 55 62 65
Web Site: www.eurocae.net   Email: eurocae@eurocae.net

© EUROCAE, 2009

FOREWORD
1. This document, prepared by EUROCAE Working Group 64 (SWAL CS), was
accepted by the Council of EUROCAE on 28th August 2009 taking note of the
concerns expressed by UK CAA and DFS, which are summarized in the
Attachment to this document (Appendix III).
2. EUROCAE is an international non-profit making organisation. Membership is
open to European manufacturers of equipment for aeronautics, trade
associations, national civil aviation administrations, users, and non-European
organisations. Its work programme is principally directed to the preparation of
performance specifications and guidance documents for civil aviation
equipment, for adoption and use at European and worldwide levels.
3. EUROCAE performance specifications are recommendations only. EUROCAE
is not an official body of the European Governments; its recommendations are
valid as statements of official policy only when adopted by a particular
government or conference of governments. The inclusion of references to
standards published by other standardisation bodies, or extracts from those
standards, does not imply any safety or regulatory liability.
4. Copies of this document may be obtained from:
EUROCAE
102, rue Etienne Dolet
92240 MALAKOFF
France

Telephone: 33 1 40 92 79 30
Fax: 33 1 46 55 62 65
E-mail: eurocae@eurocae.net
Website: www.eurocae.net


TABLE OF CONTENTS
1 INTRODUCTION.............................................................................................................. 1
1.0 Purpose ............................................................................................................ 1
1.1 Scope................................................................................................................ 2
1.2 Organisation of This Document ........................................................................ 3
1.2.0 Structure............................................................................................ 3
1.2.1 Mandating and recommending phrases............................................ 3
1.2.2 Table legend...................................................................................... 4
1.3 Target Audience ............................................................................................... 4
1.4 Readership ....................................................................................................... 5
1.5 References ....................................................................................................... 5
1.6 Glossary............................................................................................................ 6
1.7 Abbreviations .................................................................................................... 10
2 DOCUMENT STRATEGY ................................................................................................ 11
2.0 Introduction ....................................................................................................... 11
2.0.0 Relationship to EC Regulations ........................................................ 11
2.1 Strategy for SSAS:............................................................................................ 13
2.1.0 SSAS Regulatory Requirements hierarchy ....................................... 13
2.1.1 SSAS Regulatory Requirements and Objectives hierarchy .............. 14
2.1.2 SSAS Implementation ....................................................................... 16
2.1.3 Production of Argument..................................................................... 16
2.1.4 Requirement Correctness ................................................................. 16
2.1.5 Traceability ........................................................................................ 16
2.1.6 Unintended functions......................................................................... 17
2.1.7 Requirement satisfaction................................................................... 17
2.1.8 Configuration Management............................................................... 17
2.1.9 Assurance and Demonstration to NSA ............................................. 17
2.1.10 Changes to software and to specific software .................................. 17
2.2 Strategy for Software lifecycle: ......................................................................... 18
3 SOFTWARE SAFETY ASSURANCE SYSTEM .............................................................. 20
3.0 Software Safety Assurance System Overall Objectives................................... 20
3.1 Software Safety Assessment Initiation ............................................................. 24
3.2 Software Safety Assessment Planning............................................................. 26
3.3 Software Safety Requirements Specification ................................................... 27
3.4 Software Safety Assessment Validation, Verification and Process Assurance 28
3.5 Software Safety Assessment Completion ........................................................ 29
3.6 Software Assurance Level ................................................................................ 30
3.6.0 Introduction........................................................................................ 30
3.6.1 Supporting Information for SWAL Allocation Process....................... 30
3.6.2 SWAL Allocation Process ................................................................. 33
3.6.3 Examples of SWAL Allocation........................................................... 36
3.6.4 Grading Policy ................................................................................... 38
3.6.5 Similarity of Levels throughout Various Standards ........................... 40


4 PRIMARY LIFECYCLE PROCESSES............................................................................. 42


4.0 Introduction ....................................................................................................... 42
4.1 Acquisition Process .......................................................................................... 43
4.2 Supply Process................................................................................................. 46
4.3 Development Process....................................................................................... 49
4.4 Operation Process ............................................................................................ 57
4.5 Maintenance Process ....................................................................................... 59
5 SUPPORTING LIFECYCLE PROCESSES ..................................................................... 61
5.0 Introduction ....................................................................................................... 61
5.1 Documentation Process.................................................................................... 63
5.2 Configuration Management Process ................................................................ 64
5.3 Quality Assurance Process .............................................................................. 68
5.4 Verification Process .......................................................................................... 70
5.4.0 Software Verification ......................................................................... 71
5.4.1 Other Verifications............................................................................. 79
5.5 Validation Process ............................................................................................ 80
5.6 Joint Review Process ....................................................................................... 80
5.7 Audit Process.................................................................................................... 82
5.8 Problem/Change Resolution Process............................................................... 86
6 ORGANISATIONAL LIFECYCLE PROCESSES ............................................................. 88
6.0 Introduction ....................................................................................................... 88
6.1 Management Process....................................................................................... 88
6.2 Infrastructure Process....................................................................................... 91
6.3 Improvement Process....................................................................................... 92
6.4 Training Process............................................................................................... 93
7 ADDITIONAL ANS SOFTWARE LIFECYCLE OBJECTIVES ......................................... 94
7.0 Introduction ....................................................................................................... 94
7.1 Software Development Environment ................................................................ 95
7.2 COTS ................................................................................................................ 95
7.2.0 Scope of COTS Section .................................................................... 95
7.2.1 System Aspects Relating to COTS in ANS ....................................... 96
7.2.2 COTS Planning Process ................................................................... 96
7.2.3 COTS Acquisition Process ................................................................ 99
7.2.4 COTS Verification Process ............................................................... 102
7.2.5 COTS Configuration Management Process...................................... 104
7.2.6 COTS Quality Assurance .................................................................. 106
7.3 Tool Qualification .............................................................................................. 106
8 SOFTWARE SAFETY FOLDER ...................................................................................... 108
8.0 Introduction ....................................................................................................... 108
8.1 Software Safety Folder Structure – Option 1.................................................... 109
8.2 Software Safety Folder Structure – Option 2.................................................... 111
8.3 ESARR6 and EC Regulation 482/2008 mapping ............................................. 116


A REFERENCE TO EXISTING SOFTWARE STANDARDS .............................................. 117


A.1 Purpose ............................................................................................................ 117
A.2 Software Safety Assurance System ................................................................. 118
A.3 Primary Lifecycle Processes ............................................................................ 126
A.4 Supporting Lifecycle Processes ....................................................................... 154
A.5 Organisational Lifecycle Processes.................................................................. 174
A.6 Additional ANS Software Lifecycle Objectives ................................................. 179
A.7 ED109/DO-278 Omissions ............................................................................... 181
B ROLES AND RESPONSIBILITIES SCENARIOS............................................................ 182
B.1 First Scenario – Contracted Major System Project .......................................... 182
B.2 Second Scenario – Contracted SW.................................................................. 183
B.3 Third Scenario – ANSP Internal Development ................................................. 185
C TRACEABILITY WITH ESARR6...................................................................................... 187

APPENDIX I WG-64 MEMBERSHIP.............................................................................................. 193


APPENDIX II HISTORY AND TERMS OF REFERENCE OF WG-64......................................... 194
APPENDIX III ATTACHMENT TO ED-153 .................................................................................... 195
APPENDIX IV IMPROVEMENT SUGGESTION FORM ................................................................ 196


CHAPTER 1

INTRODUCTION

1.0 PURPOSE

An increasing proportion of safety-critical Air Navigation Service (ANS) functions are
being supported by software. This shift towards more automated ANS functions
assumes that the overall system provides at least equal, if not improved, levels of
safety and efficiency. It is therefore necessary to offer guidance on how to assure that
the risk associated with deploying the software is reduced to a tolerable level.
This document provides:
Recommendations and requirements on the major processes necessary to
provide safety assurance for software in ANS systems, including:
- A Software Assurance Level (SWAL) allocation process;
- A list of objectives to be satisfied per SWAL;
- A SWAL grading policy, ie the definition of a policy and its rationale to
justify and substantiate the stringency of the objectives to be met per
SWAL;
- The identification of some appropriate activities (techniques or methods)
to achieve these objectives, principally through referencing existing
standards that offer guidance on how to provide evidence and confidence
that these objectives are achieved and the SWAL is satisfied;
This document also provides:
A recommended ANS Software lifecycle and its associated activities in support
of achieving the objectives identified herein;
A reference to other standards (focusing on ED-109/DO-278, ISO/IEC 12207,
IEC 61508, CMMi and ED-12B/DO-178B) that relate to the identified objectives;
NOTE: ISO/IEC 12207, ED-12B/DO-178B, ED-109/DO-278 and IEC 61508 consider a
system as being hardware and software; consequently, the people and
procedure aspects of a system are not taken into account by these four
standards.
An assessment of the referenced standards’ coverage of the recommended
lifecycle and its associated activities for the development, operation and
maintenance of ANS software.
Guidance towards satisfying ESARR6 and EC Regulation 482/2008 (see Annex
C). This guidance does not address the generation of the required safety
arguments. Generation of the safety arguments that use the evidence produced
by the SWAL processes is outside the scope of this document. Formal
compliance with ESARR6 cannot be claimed, as stating such compliance is the
responsibility of the Eurocontrol Safety Regulation Commission (SRC).
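The bullet points above describe, among other things, a list of objectives to be satisfied per SWAL and a grading policy for their stringency. As a rough illustration of the kind of lookup this implies, the sketch below models a per-SWAL objective table in Python. The numbering convention (SWAL1 taken as the most stringent level), the objective identifiers and the thresholds are hypothetical placeholders, not values taken from this document's tables.

```python
from enum import IntEnum

class SWAL(IntEnum):
    """Software Assurance Levels; SWAL1 is assumed to be the most stringent."""
    SWAL1 = 1
    SWAL2 = 2
    SWAL3 = 3
    SWAL4 = 4

# Hypothetical table: objective id -> least stringent SWAL at which the
# objective still applies (illustrative entries, not taken from ED-153).
OBJECTIVE_APPLIES_UP_TO = {
    "OBJ-TRACEABILITY": SWAL.SWAL4,             # applies at every level
    "OBJ-UNIT-TEST-COVERAGE": SWAL.SWAL3,
    "OBJ-REVIEW-WITH-INDEPENDENCE": SWAL.SWAL2,
}

def objectives_for(swal: SWAL) -> list[str]:
    """List the hypothetical objectives to satisfy for a given SWAL.

    A lower SWAL number means higher stringency, so an objective that
    applies up to SWAL3 also applies for SWAL1 and SWAL2.
    """
    return sorted(obj for obj, limit in OBJECTIVE_APPLIES_UP_TO.items()
                  if swal <= limit)

most_stringent = objectives_for(SWAL.SWAL1)   # all three example objectives
least_stringent = objectives_for(SWAL.SWAL4)  # only the always-applicable one
```

In this sketch a grading policy is simply a monotone rule: every objective required at a less stringent level is also required at all more stringent levels.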
Figure 1 below encapsulates the main purposes of this document and
shows how those purposes and other relevant documents are related.


[Figure 1: a Software Assurance Level (SWAL) is satisfied by objectives; objectives
are achieved by activities, which produce evidence that gives confidence the
objectives are met. Objectives and activities are defined in Chapters 3-7 and
evidence in Chapter 8; Annex A maps activities to supporting standards (ED-109/
DO-278, ED-12B/DO-178B, IEC 61508, ISO/IEC 12207, CMMi), and the objectives
support compliance with EC Regulation 482/2008.]

FIGURE 1: PURPOSE AND DOCUMENT RELATIONSHIPS RELEVANT TO THIS STANDARD

NOTE: Whilst the objectives described in this document support the achievement
of many (but not all) of the articles within EC Regulation 482/2008, in no
way can compliance with any such articles be claimed as this is the
responsibility of regulatory and legislative authorities.

1.1 SCOPE

This document applies to software that forms part of an ANS system. Its scope
extends to the overall lifecycle of software within an ANS system; however, aircraft
software is considered out of scope, so the document is limited to the “ground”
segment of ANS.
This document assumes that a risk assessment and mitigation process has been
undertaken along with an a priori system (where system includes people, procedure
and equipment) safety assessment (eg a SAM-FHA and SAM-PSSA) with the results
forming an input to this document.
This document is limited to software safety assurance and any references to software
lifecycle data are made solely within the context of software safety assurance.
Documentation not related to software lifecycle data is therefore out of scope.
This document covers:
Guidance for an ANSP to establish a software safety assurance system;
Guidance for software suppliers on the necessary software safety assurance
regarding products and processes;
A reference against which stakeholders can assess their own practices for
software safety assurance of: specification, design, development, operation,
maintenance, and decommissioning;
A software assurance process that will promote interoperability through its
common application to ANS software development.


1.2 ORGANISATION OF THIS DOCUMENT

1.2.0 Structure

This document includes the following chapters:


Chapter 2: Document Strategy describes each step of the strategy for the
determination of software safety assurance.
Chapter 3: Software Safety Assurance System lists objectives to set up a
Software Safety Assurance System, including Software Assurance Level
(SWAL) aspects.
Chapter 4: Primary Lifecycle Processes lists the objectives per SWAL that
belong to primary lifecycle processes.
Chapter 5: Supporting Lifecycle Processes lists the objectives per SWAL that
belong to supporting lifecycle processes.
Chapter 6: Organisational Lifecycle Processes lists the objectives per SWAL
that belong to organisational lifecycle processes.
Chapter 7: Additional ANS Software Lifecycle Objectives lists additional
objectives per SWAL that have been added due to ANS specific needs, such as
the analysis of safety oriented standards, ANS particularities, and omissions by
existing standards.
Chapter 8: Software Safety Folder proposes two options for a Software Safety
Folder (SSF) structure and identifies which objective outputs contribute to filling
in this SSF.
The document is completed by the following annexes:
Annex A: Provides a mapping between objectives in this document and existing
standards (ED-109/DO-278, ISO/IEC 12207, IEC 61508, CMMi and ED-
12B/DO-178B)
Annex B: Provides examples of how roles and responsibilities may be allocated
between the ANSP and supplier per objective.
Annex C: Proposes a mapping between ESARR6 and ED-153 objectives.

1.2.1 Mandating and recommending phrases

Each objective contains at least one “shall” statement, where “shall” means that this
requirement must be satisfied to demonstrate compliance with this document.
Where more than one “shall” statement is included as part of an objective then, to
achieve that objective, all of the “shall” requirements must be satisfied.
This document is not a regulatory requirement. Consequently, where a “shall”
statement is not satisfied, alternative means supported by rationale and evidence
may be chosen and agreed with the NSA.


1.2.2 Table legend

The tables allocating objectives per SWAL are to be read as follows (cf. 3.6.4.0
criteria):

Legend (used throughout this document):

The objective is to be achieved with independence.
The objective is to be achieved.
Blank: Achievement of the objective is at the organisation's discretion.

1.3 TARGET AUDIENCE

This document is specifically targeted at:


Safety Practitioners: Responsible for establishing the link between the
programme/project and the risk assessment and mitigation process. They
provide the methodological support to the different steps of the risk assessment
and mitigation process and its integration within the organisation's Safety
Management System (SMS).
For example, safety practitioners have to ensure that the SWAL is allocated in
accordance with the material provided in Chapter 2, and that the allocated
SWAL is validated.
Software Teams: should apply this document to the relevant software.
For example, the software team is responsible for the implementation,
verification and compliance with many of the objectives defined in this
document.
Project/Programme Manager or Safety Manager: Responsible for all or part
of the Software Lifecycle (development, maintenance, operation…). Also
responsible for allocating resources to assurance activities.
System Designers: Responsible for system (people, procedures and
equipment) architectural design including software. They will also allocate the
SWAL.
Test and Integration Team: Responsible for the development and conduct of
the test and integration activities.


1.4 READERSHIP

The following table suggests a minimum level of reader attention per section of this
document. The audience columns are: Safety Practitioner; Software Team;
Programme/Project Manager, Safety Manager; System Designers; Test and
Integration Team; and Other roles (eg Manager).
Chapter 1 – Introduction
Chapter 2 – Document Strategy
Chapter 3 – Software Safety Assurance System
Chapter 4 – Primary Lifecycle Processes
Chapter 5 – Supporting Lifecycle Processes N/A
Chapter 6 – Organisational Lifecycle Processes N/A N/A
Chapter 7 – Additional ANS Software Lifecycle Objectives N/A N/A
Chapter 8 – Software Safety Folder
Annex A - Reference To Existing Software Standards N/A N/A
Annex B – Roles and Responsibilities Scenarios N/A N/A
Annex C – Traceability with ESARR6 N/A N/A N/A

: Detailed knowledge; : Aware; N/A: Not Applicable.

1.5 REFERENCES

[1] ANS Software Lifecycle, by SAM-Software Task Force, SAF.ET1.STO1.1000-


REP-01-00, edition 3.0 (21st December 2005).
[2] Air Navigation System Safety Assessment Methodology, by Safety Assessment
Methodology Task Force, SAF.ET1.STO1.1000-MAN-01-00, edition 2.1
(10/2006).
[3] Recommendations For ANS Software, SAF.ET1.ST03.1000.GUI-01-00, edition
1.0, 21st December 2005.
[4] ED-109/DO-278, Guidelines for the communication, navigation surveillance,
and air traffic management (CNS/ATM) systems software integrity assurance
(March, 2002), EUROCAE ED-109 & RTCA DO-278.
[5] ISO/IEC 12207: Information technology - Software lifecycle processes, 1995.
[6] IEC 61508: Functional safety of electrical/electronic/programmable electronic
safety-related systems, December 1998.
[7] RTCA/DO-178B/ED-12B, Software Considerations in Airborne Systems and
Equipment Certification, RTCA & EUROCAE, 1992.
[8] CMMI for Development (CMMI-DEV), Version 1.2, August 2006.
[9] ESARR6, Software in ATM Systems, Version 1.0, 06/11/2003
[10] ESARR4, Risk assessment and mitigation in ATM, Version 1.0, 05/04/2001


[11] EUROCAE/ED-94B and RTCA/DO-248B, Final Report for Clarification of
ED-12B/DO-178B “Software Considerations in Airborne Systems and Equipment
Certification”.
[12] EC Regulation 482/2008
[13] EC Regulation 552/2004

1.6 GLOSSARY

Acceptably Safe Acceptably safe defines the target risk for an ANSP and is more
demanding than tolerably safe.
Acquirer An entity in charge of acquiring SW and/or SW lifecycle data from
another entity, the so-called “Supplier”. There are multiple types of
relationship between Acquirer and Supplier, as they can be part of
the same organisation or from different organisations (contractual
relationships).
In some cases such formal relationships may not exist (e.g. in-house
development) and so some requirements relating to the Acquirer may
be justified as not applicable.
Adaptation Data Data used to customise elements of the Air Traffic Management
System for their designated purpose.
Note: Adaptation data is utilized to customize elements of the
CNS/ATM system for its designated purpose at a specific location.
These systems are often configured to accommodate site-specific
characteristics. These site dependencies are developed into sets of
adaptation data. Adaptation data includes:
- Data that configures the software for a given
geographical site, and
- Data that configures a workstation to the preferences
and/or functions of an operator.
Examples include, but are not limited to:
a) Geographical Data – latitude and longitude of a radar
site.
b) Environmental Data – operator selectable data to provide
their specific preferences.
c) Airspace Data – sector-specific data.
d) Procedures – operational customization to provide the
desired operational role.
Adaptation data may take the form of changes to database
parameters or of pre-programmed options. In some cases,
adaptation data involves re-linking the code to include
different libraries. Note that this should not be confused with
recompilation, in which a completely new version of the code
is generated.
Where appropriate, the generation of adaptation data should
seek compliance with the guidance given in ADI DAL
(Aeronautical Data Integrity Data Assurance Level). Except
for generation related processes, compliance with this
document should be shown to the same assurance level as
the ANS code that uses them.
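As a sketch of the distinction drawn above between adaptation data and recompiled code, the snippet below overlays hypothetical site adaptation data (grouped into the geographical, environmental, airspace and procedure categories listed in the definition) on a generic configuration. Every field name and value here is invented for illustration; none comes from ED-153.

```python
import json

# Hypothetical site adaptation data, following the categories listed in
# the definition above (geographical, environmental, airspace, procedures).
SITE_ADAPTATION = json.loads("""
{
  "geographical":  {"radar_site": {"lat_deg": 48.82, "lon_deg": 2.30}},
  "environmental": {"label_font_size": 12},
  "airspace":      {"sector": "UPPER-WEST", "upper_limit_fl": 660},
  "procedures":    {"operational_role": "approach"}
}
""")

def configure(generic_defaults: dict, adaptation: dict) -> dict:
    """Customise a generic system for one site by overlaying adaptation
    data on its defaults -- no recompilation of code takes place."""
    configured = dict(generic_defaults)
    configured.update(adaptation)
    return configured

# The same generic software, configured for one site and one workstation:
workstation = configure(
    {"environmental": {"label_font_size": 10}, "language": "en"},
    SITE_ADAPTATION,
)
```

The point of the sketch is that the code itself is unchanged between sites; only the data overlay differs.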


ANS Air Navigation Service


Approval A means by which an authorised body so-called "the Approval
Authority" gives formal recognition that a product, process, service, or
operation conforms to applicable requirements.
Note: For example, approval is a generic term to refer to
certification, commissioning, qualification and initial operational
capability, etc.
Approval Authority The relevant body responsible for the approval in accordance with
applicable approval requirements.
Capacity This term should be defined in the framework of the SW under
consideration (the definition itself may vary depending on the type of
SW eg real-time, database, HMI, …).
Configuration data Data that configures a generic software system to a particular
instance of its use (for example, data for flight data processing
system for a particular airspace, by setting the positions of airways,
reporting points, navigation aids, airports and other elements
important to air navigation)
Note: See adaptation data
Consequential failure A SW failure or malfunction caused by another SW failure or
malfunction.
Commercial Off The Shelf (COTS) COTS software encompasses a wide range of software,
including purchased software, Non-Developmental Items (NDI), and
software previously developed without consideration of ED-153. The
term “Previously Developed Software” is also used for such software.
This software may or may not have been approved through other
“approval processes”. Partial data or no data may be available as
evidence of the objectives of the ANS development process. For the
rest of this section, all such software is referred to as COTS for the
sake of brevity. This terminology was selected because of the usual
use of the term “COTS” within the “ground” ANS community.
Defined When used in this document in the context of an objective, “defined”
means “defined and documented”.
External Mitigation Means (EMM) Barriers or lines of defence, defined from the risk
mitigation strategy, taken to control, prevent or reduce either: the
likelihood of a hazard contributing to or causing an end effect; or the
severity of an end effect. They may take various forms, including
operational, procedural, functional, performance, and interoperability
requirements or environment characteristics.
HMI Human Machine Interface
Improvement A modification has to be performed because the software
performance, though compliant with requirements, has to be
improved.
Internal Mitigation Means (IMM) Barriers or lines of defence, defined from the risk
mitigation strategy, taken to control, prevent or reduce the likelihood
of an event contributing to or causing a hazard. They may take
various forms, including operational, procedural, functional,
performance, and interoperability requirements or environment
characteristics.
Isolation A design concept that ensures that the failure of one item does not
cause a failure of another item. (derived from JAR AMJ 25.1309)


Mitigation Means Barriers or lines of defence, defined from the risk mitigation strategy,
taken to control, prevent or reduce either: the likelihood of an event
developing further such that it can contribute to or cause an end
effect; or the severity of an end effect. They may take various forms,
including operational, procedural, functional, performance and
interoperability requirements or environment characteristics.
Non-developmental software Software not developed for the current demonstration.
Organisation An administrative structure in which people collectively manage one
or more projects as a whole, and whose projects share a senior
manager and operate under the same policies.
Overload Tolerance This term should be defined in the framework of the software under
consideration as the definition itself may vary depending on the type
of software (eg real-time, database, HMI). The definition may include,
if appropriate, the SW performance range for some levels of load
beyond some limits to be defined.
Perform Completely In accordance with the approved software safety plan, in
conformance with ANSP Safety Management System and in
compliance with applicable safety regulatory requirements.
Resource Usage The amount of resources within the computer system that can be
used by the application software.
Note: Resources may include main memory of various categories
(such as static data, stack and heap), disc space and
communications bandwidth and may include internal software
resources, such as the number of files which may be simultaneously
open.
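A minimal sketch of how an application's declared resource usage might be checked against platform limits, using the resource categories mentioned in the note above. The category names, budget figures and declared values are all hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical platform resource budgets for application software
# (categories follow the note above; all figures are illustrative).
PLATFORM_BUDGET = {
    "static_data_kib": 4096,
    "stack_kib": 512,
    "heap_kib": 8192,
    "disc_space_mib": 100,
    "open_files": 64,
}

def check_resource_usage(declared: dict) -> list[str]:
    """Return the resource categories whose declared usage exceeds the
    platform budget (an empty list means the declaration fits)."""
    return [name for name, used in declared.items()
            if used > PLATFORM_BUDGET.get(name, 0)]

# A declaration that overruns one budget:
violations = check_resource_usage({
    "heap_kib": 10000,   # exceeds the 8192 KiB budget above
    "open_files": 20,    # within budget
})
```

In practice such limits would be defined per system; the sketch only shows the shape of a usage-versus-budget comparison.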
Safety Requirement A risk-mitigation means, defined from the risk-mitigation strategy that
achieves a particular safety objective, including organisational,
operational, procedural, functional, performance, and interoperability
requirements or environment characteristics.
Software (SW) Computer programs and corresponding configuration data, including
non-developmental software (eg proprietary software, Commercial
Off The Shelf (COTS) software, re-used software, etc.), but excluding
electronic items such as application specific integrated circuits,
programmable gate arrays or solid-state logic controllers.
Software Assurance Level (SWAL) The software assurance level (SWAL) defines the
level of rigour of the software assurances throughout the software
lifecycle, supporting the demonstration that the EATMN software is
acceptably safe.
Software Component A component can be seen as a building block that can be fitted or
connected together with other reusable blocks of software to combine
and create a custom software application.
In the framework of this document, it was found necessary to further
developed the definition of “Software component” by providing the
following information:
A software component is the result of the first level of decomposition
of the software architecture, so that requirements, actions, objects,
input and output flows can be associated to that software component.
Therefore, it can be a process (if the application is based on a multi-
process architecture), a thread (if the architecture is mono- or multi-
process and multi-threaded), a set of actions, a set of objects with
their associated methods, or a state and its associated actions of a
finite state machine.

© EUROCAE, 2009
9

Software Failure The inability of a program to perform a required function.


Software Lifecycle Data that is produced during the software lifecycle to plan, direct,
data explain, define, record, or provide evidence of activities (including the
software product itself). This data enables the software lifecycle
processes, system or equipment approval and post-approval
modification of the software product.
Software Malfunction The inability of a program to perform a required function correctly.
Software Confirmation by examination and provision of objective evidence that
Requirements software requirements are complete and correct, and are also
Validity compliant with the system safety requirements. ([9] Para 4.2)
Software Robustness The behaviour of the software in the event of unexpected inputs,
hardware faults and power supply interruptions, either in the
computer system itself or in connected devices.
Software Timing The time allowed for the software to respond to given inputs or to
Performances periodic events, and/or the performance of the software in terms of
transactions or messages handled per unit time.
Software Unit (SW An element specified in the design of a Software Component that is
Unit) separately testable.
A software unit (SW Unit) is a low level of decomposition of the
software architecture (can be the lowest). Requirements, actions,
objects, input and output flows can be associated to that software unit
that can be verified (and more specifically be tested).
Therefore, it can be a file or a module or a single object with its
associated methods or an interrupt or device handler.
Supplier An organisation that enters into relationship with the acquirer for the
supply of a software product.
In some cases such formal relationships may not exist (e.g. in-house
development) and so some requirements relating to the Supplier may
be justified as not applicable.
System An Air Navigation System is composed of People, Procedures and
Equipment (Software, Hardware and HMI)
Tolerably Safe Tolerably safe defines the target risk for a National Supervisory
Authority (NSA) and is less demanding than acceptably safe.
Validation Confirmation by examination and provision of objective evidence that
the particular requirements for a specific intended use are fulfilled.
[10]
Verification Confirmation by examination of evidence that a product, process or
service fulfils specified requirements. [9]


1.7 ABBREVIATIONS

ADI Aeronautical Data Integrity


ANSP Air Navigation Service Provider
ATM Air Traffic Management
CM Configuration Management
CMM Capability Maturity Model
COTS Commercial Off the Shelf Software
DAL Data Assurance Level
EC European Commission
EMM External Mitigation Means
ER Essential Requirement
ESARR EUROCONTROL SAfety Regulatory Requirement
FFP Functional Failure Path
FHA Functional Hazard Assessment
HMI Human Machine Interface
IEC International Electrotechnical Commission
IMM Internal Mitigation Means
ISO International Organization for Standardization
MC/DC Modified Condition / Decision Coverage
NSA National Supervisory Authority
PSSA Preliminary System Safety Assessment
RBT Requirement-Based Testing
SAM Safety Assessment Methodology
SSA System Safety Assessment
SSAS Software Safety Assurance System
SSF Software Safety Folder
SW Software
SWAL SoftWare Assurance Level


CHAPTER 2

DOCUMENT STRATEGY

2.0 INTRODUCTION

This section explains the strategy that has been used, and that can be built upon by
users, to relate the document purpose to the list of objectives (Chapters 3 to 7) and to
the means proposed by various standards to satisfy these objectives, as listed in
Appendix A.

2.0.0 Relationship to EC Regulations

This document provides guidance aiming to partially satisfy EC regulations (552/2004,
482/2008). Therefore, both the safety regulatory requirements (482/2008) and the
interoperability (552/2004) aspects of software are addressed by this document.
This section also gives the rationale for the current structure of this document.
It helps the reader to understand the various levels of guidance of this document:
The Software Safety Assurance System (SSAS) implementation is described in
Section 2.1 (to show partial compliance with 482/2008 and with 552/2004
Essential Requirement (ER) 3).
The Software Lifecycle implementation is described in Section 3.6, Chapter 5
(SUPPORTING LIFECYCLE PROCESSES), Chapter 6 (ORGANISATIONAL
LIFECYCLE PROCESSES) and Chapter 7 (ADDITIONAL ANS SOFTWARE
LIFECYCLE OBJECTIVES), to show partial compliance with 552/2004 Essential
Requirement 7.


[Figure: (EC) 482/2008 and (EC) 552/2004 require the Safety Management System of
ANSPs and Suppliers. The Safety Management System requires the Software Safety
Assurance System (ED-153 Chapter 3), which in turn requires Software Safety
Assurance Activities (ED-153 Chapters 4, 5, 6, 7). These activities take Software
Development Products as inputs and produce Software Safety Assurance Products as
outputs. The standards listed in ED-153 Annex A (ED-109, ED-12B, IEC 61508,
IEC 12207, DEF STAN 00-55) can support these activities.]
FIGURE 2: VARIOUS LEVELS OF GUIDANCE PROVIDED BY ED-153


2.1 STRATEGY FOR SSAS:

2.1.0 SSAS Regulatory Requirements hierarchy

This document provides guidance aiming to partially satisfy EC regulations (552/2004, 482/2008). The approach considers 482/2008 and
refines the 552/2004 ER3 for software. It implies that demonstrating compliance with 482/2008 through the implementation of a SSAS
subsequently supports the demonstration of compliance with 552/2004 ER3 for the software part of a constituent.
High level articles and objectives address ‘what’ has to be done and low level articles and objectives address ‘how’ it has to be done.
The table below provides the hierarchy of the regulatory requirements applicable to the SSAS.

SSAS definition & implementation (482/2008 and 552/2004): Article 3.1

Name | SSAS Implementation | SWAL | Argument | Requirement Correctness | Traceability | Unintended functions | Requirement satisfaction | Configuration Management | Demo to NSA | Specific to SW
482/2008 articles “high-level” | Article 4.1 | Article 4.2 | Article 3.2 (1st sentence) | Article 3.2.a | Article 3.2.b | Article 3.2.c | Article 3.2.d | Article 3.2.e | - | Article 3.3
482/2008 articles “low-level” | Articles 4.4; 4.5; Annex I | - | - | Article 4.3.a; Annex II Part A | Article 4.3.d; Annex II Part D | - | Article 4.3.b; Annex II Part B | Article 4.3.c; Annex II Part C | - | Article 5

TABLE 1: 482/2008 REGULATORY REQUIREMENTS HIERARCHY

Example of Table interpretation: To implement the Configuration Management aspects of the SSAS, Article 3.2.e of EC Regulation 482/2008
has to be used as a reference. Article 3.2.e Requirements are refined into Article 4.3.c and Annex II Part C.


2.1.1 SSAS Regulatory Requirements and Objectives hierarchy

ED-153 includes various levels of objectives aiming to address some 482/2008 requirements. The table below provides the relationship
between the various levels of 482/2008 requirements and ED-153 objectives.

482/2008 and ED-153 objectives mapping

Name | SSAS Implementation | Argument | Requirement Correctness | Traceability | Unintended functions | Requirement satisfaction | Configuration Management | Demo to NSA | Changes to software and to specific software
482/2008 articles “high-level” | Article 4.1 | Article 3.2 (1st sentence) | Article 3.2.a | Article 3.2.b | Article 3.2.c | Article 3.2.d | Article 3.2.e | - | Article 3.3
ED-153 Objectives “high-level” | 3.0.1 | 3.0.15; 3.0.16; 3.0.17 | 3.0.2 | 3.0.3 | 3.0.4 | 3.0.6 | 3.0.7 | - | 3.0.8
482/2008 articles “low-level” | - | - | Article 4.3.a; Annex II Part A | Article 4.3.d; Annex II Part D | - | Article 4.3.b; Annex II Part B | Article 4.3.c; Annex II Part C | - | Article 5
ED-153 Objectives “low-level” | See Table 3 | 4.4.5 | 4.3.4 | 4.3.15 | - | Chap 5.4 | Chap 5.2 | - | 3.0.12; 3.0.13; Chap 7.2
ED-153 Objectives complementary “low-level” | 3.1.1; 3.1.5; 3.3.1; 3.3.2; 3.3.3; 3.3.4; 3.4.X; 4.3.1; 4.3.2; 4.3.4; 4.3.5; 4.3.6; 4.3.9; 4.3.10; 4.3.11; 4.3.12; 4.3.13; 4.3.14; 4.3.20; 5.2.5; 5.4.3; 5.4.5; 5.4.6; 5.4.8; 5.4.9; 5.4.11; 5.7.2 | 5.4.10 | 3.3.2; 3.3.3; 3.3.4; 4.4.2; 5.4.3; 5.8.3 | 3.4.4; 4.3.13; 4.3.14; 4.4.5; 4.5.3; 5.4.X | 5.2.X; 5.4.13 | - | 4.1.7 | Argument (note 2) + (SSF 2 informative) | 7.2.X
TABLE 2: 482/2008 REQUIREMENTS AND ED-153 OBJECTIVES MAPPING


NOTE 1: The objectives above are applicable as per the allocated SWAL.
NOTE 2: This document does not propose any guidance to develop the software argument or any example of it, except guidance to develop
the safety argument for the system change (See SAM- Part IV GM I –Safety Case Development Manual). The system safety
argument is the master argument embedding the software argument.

Example of table interpretation: ED-153 “high-level” objective 3.0.7 aims at addressing the “high-level” 482/2008 Article 3.2.e
requirement. ED-153 Chapter 5.2 aims to address 482/2008 requirements: it contains “low-level” objectives aiming to
address the 482/2008 “low-level” requirements stated in Article 4.3.c and in Annex II Part C.
The table below expands the ED-153 objective 3.0.1 hierarchy.

ED-153 Objective 3.0.1


482/2008 articles “low-level” | Article 4.1; Annex I | Article 4.2 | Article 4.4.a | Article 4.4.b | Article 4.5
ED-153 Objectives “low-level” | 3.0.1; 4.4.2 | 3.0.2; 3.0.5; 3.0.14; Chap 3.6 | 3.0.9; 3.0.10 | 3.0.8 | 3.0.11
ED-153 Objectives complementary “low-level” | 3.1.X; 3.2.X; 3.3.X; 3.4.X; 3.5.X; 5.1.X | 4.4.5; 4.5.2 | Those requirements associated with the “shall” of Chapter 3.6 (derived SWAL) | 4.4.5; 4.5.2; 4.5.3; 5.8.X | 4.4.X; 4.5.2; 4.5.3

TABLE 3: ED-153 OBJECTIVE 3.0.1 HIERARCHY (FROM TABLE 2)

NOTE: The list of low-level objectives is not exhaustive, as many low-level objectives contribute to many low-level articles. It may be
the role of the argument to identify the complete set of low-level objectives for each article.


2.1.2 SSAS Implementation

This section provides an explanation of how to set up an SSAS (independently of the
regulatory requirements). Therefore references are made to both the high level and
low level objectives identified in Tables 2 and 3 above.
In order to implement an SSAS (at organisation level as part of the Safety
Management System), the following features should be addressed:
Establishment of a SSAS: This is addressed in this document through Objective
3.0.1.
Documented Processes: This is addressed in this document through Objective
3.2.1.
Continuous improvement of processes: This is addressed in this document
through Objectives 6.3.X.
SWAL Allocation: This is addressed in this document through Objectives 3.0.5
and Chapter 3.6.
Assurance rigour: This is addressed in this document through Objectives 3.0.8
and 3.0.9
Software Safety Assurance Planning: This is addressed in this document
through Objectives 3.2.X.
Evidence of assurance activities having been undertaken: This is addressed in
this document through Objectives 3.5.X and Software Safety Folder
(informative).
Independence of assurance and evidence: This is addressed in this document
through Objectives 3.0.9 and SWAL tables
Risk Assessment and Mitigation : This is addressed in this document through
Objectives 3.3.X
Risk Mitigations implemented in Software: This is addressed in this document
through Objective 3.3.4
In Service Monitoring: This is addressed in this document through Objectives
3.0.11 and 4.4.5.
Software experience feedback: This is addressed in this document through
Objectives 5.8.X
Resolution of recorded Software malfunction or failures: This is addressed in
this document through Objectives 5.8.X
Modifications: This is addressed in this document through Objectives 3.0.12 and
5.8.2.
Use of Tools (Software Development Tools, Verification Tools or Support tools):
This is addressed in this document through Objectives 4.3.12, 4.3.17-19 and
7.3.X.

2.1.3 Production of Argument

No further refinement of the strategy relating to this high level objective in this version.
Chapter 2 as well as chapter 8 provide some basic strategy information that can be
used to produce the argument.

2.1.4 Requirement Correctness

No further refinement of the strategy relating to this high level objective.

2.1.5 Traceability

No further refinement of the strategy relating to this high level objective.
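
Although this document does not refine the traceability objective further, its intent can be illustrated with a small sketch. This is purely illustrative (the requirement and test identifiers below are hypothetical and are not taken from ED-153 or 482/2008): it checks mechanically that every software requirement is covered by some verification evidence, and that no evidence refers to an unknown requirement.

```python
# Illustrative sketch only: a minimal bidirectional traceability check.
# Requirement and test identifiers are hypothetical examples.

requirements = {"SRS-001", "SRS-002", "SRS-003"}
tests = {
    "TC-01": {"SRS-001"},           # each test lists the requirements it verifies
    "TC-02": {"SRS-001", "SRS-004"},
}

def trace_gaps(requirements, tests):
    """Return (requirements with no verification evidence,
    evidence references that point at no known requirement)."""
    covered = set().union(*tests.values()) if tests else set()
    untraced_reqs = requirements - covered
    dangling_refs = covered - requirements
    return untraced_reqs, dangling_refs

untraced, dangling = trace_gaps(requirements, tests)
```

In practice a check of this kind would run against the configuration-managed requirements and verification records rather than hard-coded data, to the level of decomposition required by the allocated SWAL.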


2.1.6 Unintended functions

Unintended functions are functions in the software that are either performed in
addition to those required or that are not performed on demand. They can result from,
but are not limited to, the following examples:
Execution of unspecified code including unintentional execution of Deactivated
code: code which is present in the executable software but which is not
intended to be executed in the given context. For example, different
configurations for different sites may enable/disable different functions of the
software;
Incorrect implementation of SW specification (‘Software fault / malfunction’)
leading to the execution of ‘dead code’;
Code execution not performed at the right time or not on demand
Incorrect SW specification capture (at any level, top level down to detailed
design level)
This high level objective does not mean that, ultimately, there are no unintended
functions in the software. It means that these functions are not activated or that the
consequences of them being activated are justified by the safety analysis or that an
effort has been made to produce evidence for this objective to be met with a level of
rigor commensurate with the SWAL.
Ideally unintended functions should be identified. If this proves impossible, an
alternative approach has to be defined.
In both cases, unintended functions have to be addressed by:
Ensuring software requirements validity with a level of confidence
commensurate with the SWAL. This is addressed in this document through
Objective 4.3.4.
Ensuring software requirements satisfaction with a level of confidence
commensurate with the SWAL. This is addressed in this document through
Objective 5.4.X.
Ensuring that both the above, are appropriately associated with the software
version being assessed (eg by configuration management). This is addressed in
this document through Objective 5.2.X.

2.1.7 Requirement satisfaction

No further refinement of the strategy relating to this high level objective.

2.1.8 Configuration Management

No further refinement of the strategy relating to this high level objective.

2.1.9 Assurance and Demonstration to NSA

The overall set of objectives aims to:
provide confidence or assurance to the organisation that the software can be
operated safely (this is addressed in this document in SSF-1 or in SSF-2), and
demonstrate to the NSA that the Software complies with the related regulatory
requirements (this is addressed in this document in SSF-2).

2.1.10 Changes to software and to specific software

No further refinement of the strategy relating to this low level objective.


2.2 STRATEGY FOR SOFTWARE LIFECYCLE:

The approach chosen to address the EC regulation related to the construction of
systems, namely 552/2004 ER7, considers that the software lifecycle has to be based
on sound engineering principles. It implies that the use of best practice, extracted
from published international standards (ISO/IEC 12207) augmented with domain
specific practices, supports the demonstration of compliance with 552/2004 ER7 for
the software part of a constituent.
This strategy is complemented by listing the objectives of a Software lifecycle in this
document in addition to the lifecycle related to the SSAS.
The table below provides the hierarchy of best practice that is applicable to software
lifecycle. This includes both the high level objectives required to set-up a process for
each lifecycle activity; and the low level objectives that refine the process further.


SW Lifecycle and ED-153 objectives mapping

4 PRIMARY LIFECYCLE PROCESSES
Processes: 4.1 Acquisition Process; 4.2 Supply Process; 4.3 Development Process; 4.4 Operation Process; 4.5 Maintenance Process
ED-153 Objectives “high-level”: 4.1.1; 4.2.1; 4.3.3; 4.4.1; 4.5.1
ED-153 Objectives “low-level”: 4.1.X; 4.2.X; 4.3.X; 4.4.X; 4.5.X (excluding the high-level objectives listed above)

5 SUPPORTING LIFECYCLE PROCESSES
Processes: 5.1 Documentation Process; 5.2 Configuration Management Process; 5.3 Quality Assurance Process; 5.4 Verification Process; 5.6 Joint Review Process; 5.7 Audit Process; 5.8 Problem/Change Resolution Process
ED-153 Objectives “high-level”: 5.1.1; 5.2.1; 5.3.1; 5.4.1; 5.6.1; 5.7.1; 5.8.1
ED-153 Objectives “low-level”: 5.1.X; 5.2.X; 5.3.X; 5.4.X; 5.6.X; 5.7.X; 5.8.X (excluding the high-level objectives listed above)

6 ORGANISATIONAL LIFECYCLE PROCESSES
Processes: 6.1 Management Process; 6.2 Infrastructure Process; 6.3 Improvement Process; 6.4 Training Process
ED-153 Objectives “high-level”: 6.1.1; 6.2.1; 6.3.1; 6.4.1
ED-153 Objectives “low-level”: 6.1.X; 6.2.X; 6.3.X; 6.4.X (excluding the high-level objectives listed above)

7 ADDITIONAL ANS SOFTWARE LIFECYCLE OBJECTIVES
Processes: 7.1 Software Development Environment; 7.2 COTS; 7.3 Tool Qualification
ED-153 Objectives “high-level”: 7.1.1; 7.2.1; 7.3.1
ED-153 Objectives “low-level”: 7.1.X; 7.2.X; 7.3.X (excluding the high-level objectives listed above)

8 SOFTWARE SAFETY FOLDER
8.1 Software Safety Folder Structure – Option 1; 8.2 Software Safety Folder Structure – Option 2
TABLE 4: SW LIFECYCLE AND ED-153 OBJECTIVES MAPPING

The objectives above are applicable as per the allocated SWAL.


CHAPTER 3

SOFTWARE SAFETY ASSURANCE SYSTEM

Introduction
The Software Safety Assurance System encompasses the following tasks:
Software Safety Assurance System overall Objectives specification
Software Assurance Level allocation
Software Safety Assessment
- Software Safety Assessment Initiation
- Software Safety Assessment Planning
- Software Safety Requirements Specification
- Software Safety Assessment Validation, Verification & Process
Assurance
- Software Safety Assessment Completion

3.0 SOFTWARE SAFETY ASSURANCE SYSTEM OVERALL OBJECTIVES

The aim of the Software Safety Assurance System (SSAS) is to ensure that
appropriate processes are in place such that the risk associated with deploying the
software is reduced to a tolerable level (ie the objectives of (EC) 482/2008 Article 3
are satisfied). Consequently, the Software Safety Assurance System is required to
ensure requirements validity, satisfaction, traceability, non-interference and the
configuration consistency of all data used in claiming compliance with (EC) 482/2008.
The Software Safety Assurance System is required to be a part of an organisation’s
Safety Management System and consequently applies to any software under the
responsibility of that organisation. The implementation of a SSAS is described in
section 2.1.2.
Commission Regulation (EC) No 482/2008 sets some high level criteria for the
Software Safety Assurance System and mandates that a SWAL allocation process be
implemented that complies with the given criteria. Consequently, the following
objectives have to be achieved by a Software Safety Assurance System.
NOTE: The Software Safety Folder is defined and described in Chapter 8 of this
document.


3.0.1 Implementation (Annex A Section A.2.1)
A Software Safety Assurance System shall 3.0.1.1 be defined and implemented (as part of the overall System Safety Assessment Documentation).
Output: Software Assurance Manual (Part of the Safety Management System)

3.0.2 Requirements Correctness and Completeness (Annex A Section A.2.1)
The software requirements shall 3.0.2.1 correctly and completely state what is required by the software, in order to meet the system safety objectives and system safety requirements as identified by the risk.
Output: SSF1 – Part VII; SSF2 – Part I

3.0.3 Requirements Traceability Assurance (Annex A Section A.2.1)
All software requirements shall 3.0.3.1 be traced to the level required by the SWAL.
Output: SSF1 – Part VII; SSF2 – Part I

3.0.4 Unintended Functions (Annex A Section A.2.1)
The software implementation shall 3.0.4.1 not contain functions which may adversely affect safety or whose effect is not consistent with the safety analysis.
Note: This objective does not mean that there are no unintended functions in the software, but that these functions are not activated or that the consequences of them being activated are justified by the safety analysis.
Output: SSF1 – Part VII; SSF2 – Part I

3.0.5 SWAL Allocation (Annex A Section A.2.1)
The ANSP shall 3.0.5.1 ensure, as a minimum, that the Software Safety Assurance System allocates a SWAL to all operational ground ANS software.
Note: Refer to Chapter 2 Section 2 of this document for SWAL definitions and the allocation process.
Output: SSF1 – Part VII; SSF2 – Part II

3.0.6 Requirements Satisfaction Assurance (Annex A Section A.2.1)
The ANS software shall 3.0.6.1 satisfy its requirements with a level of confidence which is consistent with the SWAL allocated during risk assessment and mitigation (eg PSSA).
Output: SSF1 – Part VII; SSF2 – Part I

3.0.7 Configuration Management Assurance (Annex A Section A.2.1)
Any assurance shall 3.0.7.1 be at all times derived from a known executable version of the software, a known range of configuration data, and a known set of software products and descriptions (including specifications) that have been used in the production of that version.
Output: SSF1 – Part VII; SSF2 – Part I

3.0.8 Assurance Rigour Objective (Annex A Section A.2.1)
The SWAL shall 3.0.8.1 give sufficient confidence that the ANS software can be operated, as a minimum, acceptably safely.
Output: SSF1 – Part VII; SSF2 – Part I

3.0.9 Assurance Rigour Criteria (Annex A Section A.2.1)
The variation in rigour of the assurances per SWAL shall 3.0.9.1 include the following criteria:
• required to be achieved with independence,
• required to be achieved,
• not required.
Output: Software Assurance Manual (Part of the Safety Management System); SSF2 – Part II

3.0.10 SWAL Assurance (Annex A Section A.2.1)
Assurance shall 3.0.10.1 provide confidence that the SWAL is achieved.
Note: Assurance may be based on direct or indirect arguments and evidence.
Output: SSF1 – Part VII; SSF2 – Part I

3.0.11 SWAL Monitoring (Annex A Section A.2.1)
Assurance shall 3.0.11.1 be provided that once in operation the software meets its requirements with a degree of confidence commensurate with the SWAL, through monitoring. Feedback of ANS software experience shall 3.0.11.2 be used to confirm that the Software Safety Assurance System and the assignment of SWALs are appropriate. For this purpose, the effect resulting from any reported software malfunction or failure from ANS operational experience (reported according to the relevant requirements on reporting and assessment of safety occurrences) shall 3.0.11.3 be assessed in respect of its mapping to the SWAL definition as per Chapter 2 of this document.
(Reported software malfunctions or failures are an output of the ANS occurrence reporting system as part of the ANSP Safety Management System.)
Output: SSF1 – Part VII; SSF2 – Part II

3.0.12 Software Modifications (Annex A Section A.2.1)
Any change to the software shall 3.0.12.1 lead first to the re-assessment of the safety impact of such a change on the system and then, depending on the impact, shall 3.0.12.2 lead to the re-assessment of the SWAL allocated to this software.
Output: SSF1 – Part VII; SSF2 – Part II

3.0.13
The same level of confidence, through any means chosen and agreed with the Approval Authority, shall 3.0.13.1 be provided for the same software assurance level for developmental and non-developmental ANS software (eg Commercial Off The Shelf (COTS) software, etc).
These means shall 3.0.13.2 give sufficient confidence that the software meets the safety objectives and requirements, as identified by the safety risk assessment and mitigation process.
Output: SSF1 – Part VI; SSF2 – Part II

3.0.14 Isolation
ANS software components that cannot be shown to be isolated from one another shall 3.0.14.1 be allocated the SWAL of the most critical (most demanding SWAL) of the components.
Output: SSF1 – Part II; SSF2 – Part III

3.0.15 All on-line aspects of SW operational changes
The Software Safety Assurance System shall 3.0.15.1 deal specifically with software related aspects, including all on-line software operational changes (eg cutover/hot swapping).
Output: SSF1 – Part II; SSF2 – Part I

3.0.16 Demonstration to NSA
The organisation shall 3.0.16.1 make available the required assurances to the National Supervisory Authority, demonstrating that the objectives as per the allocated SWAL have been satisfied.
Output: SSF1 – All; SSF2 – All

3.0.17 Argument Production
The organisation shall 3.0.17.1 produce an argument demonstrating that the objectives of the allocated SWAL, as per this document, have been satisfied.
Note: No explicit guidance. Chapter 2 as well as Chapter 8 provide some basic strategy information that can be used to produce the argument.
Output: No explicit guidance in this version
TABLE 5: SOFTWARE SAFETY ASSURANCE SYSTEM OVERALL OBJECTIVES


3.1 SOFTWARE SAFETY ASSESSMENT INITIATION

The assumptions and outputs of the risk assessment and mitigation process (eg
Functional Hazard Assessment (FHA) and Preliminary System Safety Assessment
(PSSA)) should be confirmed in so far as software can impact them.


3.1.1 System Description (Annex A Section A.2.3.1)
The Software purpose shall 3.1.1.1 be defined.
Operational scenarios shall 3.1.1.2 be defined (eg HMI: Operator Handbook which defines the mode of operation and the human-machine interface; nominal mode, degraded mode).
The Software and System functions and their relationships shall 3.1.1.3 be defined.
Software boundaries shall 3.1.1.4 be defined (eg operational, time).
Software external interfaces shall 3.1.1.5 be described.
Output: SSF1 – Part I; SSF2 – Part IV

3.1.2 Operational Environment (Annex A Section A.2.3.1)
The Software and its environment (physical, operational, control functions, legislative etc) shall 3.1.2.1 be described in sufficient detail to enable the safety lifecycle tasks to be satisfactorily carried out.
Output: SSF1 – Part I; SSF2 – Part XX

3.1.3 Regulatory Framework (Annex A Section A.2.3.1)
Applicable safety regulatory objectives and requirements shall 3.1.3.1 be identified.
Output: SSF1 – Part I; SSF2 – Part XX

3.1.4 Applicable Processes and guidance (Annex A Section A.2.3.1)
Processes and Guidance applicable to the Software Assurance shall 3.1.4.1 be agreed.
Output: SSF1 – Part I; SSF2 – Part XX

3.1.5 Risk Assessment and Mitigation process Output (Annex A Section A.2.3.1)
The system level risk assessment and mitigation identification shall 3.1.5.1 be reassessed at the software level to ensure it is consistent with the software architecture/design.
Output: SSF1 – Part I; SSF2 – Part XX

TABLE 6: SOFTWARE SAFETY ASSESSMENT INITIATION OBJECTIVES


NOTE: A system (people, procedures and equipment) safety assessment has to be performed first; its output is then re-assessed during the
equipment safety assessment and subsequently during the software safety assessment.

3.2 SOFTWARE SAFETY ASSESSMENT PLANNING

3.2.1 Software Safety Assessment Approach (Annex A Section A.2.3.2)
The overall approach for the Software Safety Assessment across the Software Lifecycle shall 3.2.1.1 be defined.
Output: SSF1 – Part III; SSF2 – Part II

3.2.2 Software Safety Assessment Plan (Annex A Section A.2.3.2)
A plan describing the software safety assessment steps shall 3.2.2.1 be produced (eg approach, relations between safety assessment and software lifecycle, deliverables (content and date of delivery), relations with software/system major milestones, project risk management due to safety issues, responsibilities, persons, organisations, risk classification scheme, safety objectives definition approach, hazard identification methods, safety assurance activities, schedule, resource).
Output: SSF1 – Part III; SSF2 – Part II

3.2.3 Software Safety Assessment Plan Review (Annex A Section A.2.3.2)
The Software Safety Assessment plan shall 3.2.3.1 be reviewed and commented for approval by the NSA.
Output: SSF1 – Part III; SSF2 – Part II

3.2.4 Software Safety Assessment Plan Dissemination (Annex A Section A.2.3.2)
The Software Safety Assessment plan shall 3.2.4.1 be disseminated to the impacted parties.
Output: SSF1 – Part III; SSF2 – Part II

TABLE 7: SOFTWARE SAFETY ASSESSMENT PLANNING OBJECTIVES

NOTE: These objectives recommend neither a specific packaging of a SW safety assessment plan, nor a separate plan (the plan can be
part of a (sub-)system plan).


3.3 SOFTWARE SAFETY REQUIREMENTS SPECIFICATION

The objectives in this section describe the Software Safety Assessment defined in objective 3.2.1.
3.3.1 Failure Identification (Annex A Section A.2.3.3)
Potential failures shall 3.3.1.1 be identified by considering various ways Software can fail and by considering the sequence of events that lead to the occurrence of the failure.
A list of single, consequential and common modes of failure shall 3.3.1.2 be drawn up.
Note: Guidance on Common Mode Analysis can be found in SAM and ED-79.
Output: SSF1 – Part II; SSF2 – Part I

3.3.2 Failure Effects (Annex A Section A.2.3.3)
The effects of failure occurrence shall 3.3.2.1 be evaluated.
The hazards associated with software failure occurrences shall 3.3.2.2 be identified in order to further complete the list of hazards initiated during the Risk Assessment and Mitigation process (eg FHA and further completed during PSSA).
Output: SSF1 – Part II; SSF2 – Part I

3.3.3 Assessment of Risk (Annex A Section A.2.3.3)
The initial Risk Assessment and Mitigation process (eg FHA and further completed during PSSA) shall 3.3.3.1 be revisited based upon the outcome of 3.3.1 and 3.3.2.
Output: SSF1 – Part II; SSF2 – Part I

3.3.4 Software Requirements Setting (Annex A Section A.2.3.3)
Software Requirements shall 3.3.4.1 be compliant with the Safety Objectives to which the Software contributes and with the System Safety Requirements.
Note: The definition of “compliant” has to be developed as part of the argument sustaining the demonstration of this objective. This definition should include the traceability with the above level of requirements, and the demonstration of the necessity, sufficiency, appropriateness and relevance of the requirements to satisfy the above level of requirements.
Output: SSF1 – Part II; SSF2 – Part I

TABLE 8: SOFTWARE SAFETY REQUIREMENTS SPECIFICATION OBJECTIVES


NOTE: Software “safety” requirements are not limited to the SWAL. See Chapter 4 Section 3 Objective 4.3.4. Consequently, in this
document, the term “software requirements” is used, to reflect the fact that the same requirement can be classified as, for example a
performance, security, interoperability or safety requirement. It is the role of the System Safety Assessment and of the Software
Safety Assurance System to ensure that software requirements, that are necessary and sufficient to ensure an acceptably safe
operation, have been identified and verified.

3.4 SOFTWARE SAFETY ASSESSMENT VALIDATION, VERIFICATION AND PROCESS ASSURANCE

3.4.1 Software Safety Assessment Validation (Annex A Section A.2.3.4)
The Software Safety Assurance System shall 3.4.1.1 provide an approach to justify that Software Requirements are complete and correct.
Output: SSF1 – Part III; SSF2 – Part I

3.4.2 Software Safety Assessment Verification (Annex A Section A.2.3.4)
The software Requirements shall 3.4.2.1 be consistent with the functions to mitigate the effects of the hazard and with the Safety Objective of the hazards.
Output: SSF1 – Part III; SSF2 – Part I

3.4.3 Software Safety Assessment Process Assurance (Annex A Section A.2.3.4)
The software Safety Assessment shall 3.4.3.1 be performed completely.
Note: In accordance with the approved SW safety plan, in conformance with the ANSP Safety Management System and in compliance with applicable safety regulatory requirements.
Output: SSF1 – Part III; SSF2 – Part I

3.4.4 Software Safety Assurance
Demonstration and Assurance that SW requirements are satisfied shall be provided.
Output: SSF1 – Part VII; SSF2 – Part I

TABLE 9: SOFTWARE SAFETY ASSESSMENT VALIDATION, VERIFICATION AND PROCESS ASSURANCE OBJECTIVES


3.5 SOFTWARE SAFETY ASSESSMENT COMPLETION

Obj N° 3.5.1 – Document Software Safety Assessment Process Results (Annex A Section A.2.3.5)
Objective: The Software Safety Assessment process results shall 3.5.1.1 be documented.
Output: SSF1 – Part XX; SSF2 – Part II

Obj N° 3.5.2 – Software Safety Assessment Documentation Configuration Management (Annex A Section A.2.3.5)
Objective: Software Safety Assessment documentation shall 3.5.2.1 be put under configuration management.
Output: SSF1 – Part XX; SSF2 – Part II

Obj N° 3.5.3 – Software Safety Assessment Documentation Dissemination (Annex A Section A.2.3.5)
Objective: Software Safety Assessment documentation shall 3.5.3.1 be disseminated to impacted parties.
Note: This document does not presume who the impacted parties are. They are defined in accordance with the approved SW safety plan, in conformance with the ANSP Safety Management System and in compliance with applicable safety regulatory requirements.
Output: SSF1 – Part XX; SSF2 – Part II

TABLE 10: SOFTWARE SAFETY ASSESSMENT COMPLETION OBJECTIVES


3.6 SOFTWARE ASSURANCE LEVEL

3.6.0 Introduction

A SWAL scheme is a strategic project management aid that is intended to ensure that
appropriate software safety assurance processes are used throughout the lifecycle of
safety related software. SWALs identify sets of objectives to be applied to the software
lifecycle processes and products.
A SWAL indicates the recommended rigour of the assurance processes to be used to
generate the assurance evidence that will show whether the Software Requirements
have been satisfied with a degree of rigour commensurate with the risk presented by
the software. A SWAL provides a uniform level of rigour to which the development,
operational transfer, maintenance, decommissioning and product functional
assurances for the safety related software are produced. It is allocated based upon the
potential consequences of anomalous behaviour of software (the severity of the end
effect of the software malfunction or failure and the likelihood of occurrence of that end
effect) as determined by the system safety assessment process.
A SWAL does not replace Safety Requirements assigned to the Software. It is a
Software Safety Requirement that has to be derived from the risk assessment and
mitigation process.
Compliance with a SWAL requires the software team to follow planned and systematic
actions to provide product, process or other evidence that can be argued to show that
an adequate level of confidence and assurance exists that the software satisfies its
requirements.
The allocation of a Software Assurance Level (SWAL) for software is part of the risk
assessment and mitigation process e.g. it is identified during the Preliminary System
Safety Assessment (SAM-PSSA) and confirmed during the System Safety
Assessment (SAM-SSA). However, if the risk assessment and mitigation process
does not define a SWAL allocation process it must be defined in the Software Safety
Assurance System.
The SWAL allocation process shall 3.6.0.1 be performed during the design of an
ANS system as part of a risk assessment and mitigation process.
The SWAL shall 3.6.0.2 be allocated by the “System team”.
NOTE: The term “system team” should be taken to include system designers and
safety practitioners (see section 1.2.1).
The software team can offer support and increased confidence that the allocated
SWAL is appropriate by providing further understanding of the Software role,
contribution, interference and interactions with the overall ANS system architecture.

3.6.1 Supporting Information for SWAL Allocation Process

This section includes information necessary to perform the SWAL allocation process
described in section 3.6.2.
NOTE: The reader is encouraged to note the definitions, provided in section 1.6, of
internal and external mitigation means which are of particular use in this
section.


[Figure: on the causes side, a fault tree (e.g. FTA) combines software malfunction/failure (SW) and other events through AND/OR gates up to the hazard, the “pivotal” event, with likelihood Ph; on the consequences side, an event tree (e.g. ETA) branches on the success (S) or failure (F) of mitigation means, leading with likelihood Pe to Effects A, B, C and D.]

FIGURE 3: RELATIONSHIP BETWEEN SW MALFUNCTION OR FAILURE, HAZARD AND EFFECTS

Figure 3 illustrates the likelihood (Ph x Pe) that, once software malfunctions or fails,
this malfunction/failure could result in a certain effect, where Ph and Pe are defined
as:
- Ph (identified during the safety assessment, e.g. PSSA) is the likelihood that, once
the software has malfunctioned or failed, this malfunction/failure generates a
hazard. Ph is commensurate with the ability of the remaining part of the
architecture to mitigate the software malfunction/failure;
- Pe (identified during the safety assessment, e.g. FHA) is the likelihood that the
hazard generates an effect having a certain severity.
For each hazard, depending on the method used to set Safety Objectives, there can
be:
- many likelihoods Pe (one Pe per effect of the hazard), to be assessed for each
individual effect, or;
- only one likelihood Pe (one for the Worst Credible effect).
The SWAL allocation process illustrated here is based on the second case, given
above, and considers the likelihood of a given software behaviour generating the
Worst Credible effect that it can cause.
The SWAL allocation process’s sensitivity to inaccurate quantification needs to be
reduced to an acceptable level due to:
- The lack of accuracy on Pe. Pe may include some un-quantifiable human,
procedural and equipment aspects as well as quantifiable mitigation means or
models such as Collision Risk Models (CRM). Thus Pe may not always be
quantified precisely and may be stated to a given confidence, e.g. +/-10%;
- The lack of accuracy on Ph. Ph may include some un-quantifiable human,
procedural and equipment aspects as well as quantifiable mitigation means.
Thus Ph may not always be quantified precisely and may be stated to a given
confidence, e.g. +/-10%;
- The difficulty in providing a failure rate for the Software. Whilst it is assumed
that Software can fail and that a target failure rate objective may be set for the
combination of hardware and software by the Risk Assessment and Mitigation
process, it may only be possible to claim the failure rate for the combination of
hardware and software to a given accuracy.
As it is difficult to quantify accurately and precisely these probabilities, common sense,
expert judgement and other means (knowledgebase, lessons learned, incidents
reports, equivalent field service experience) should be used to assess those
probabilities.
As part of the safety assessment process (eg SAM-SSA), appropriate monitoring has
to be put in place to ensure that these values are satisfied as Pe and Ph should be
transposed into:
Assumptions captured through Safety Requirements on the Operational
Environment (Pe) (eg during SAM-FHA) and;
Safety Requirements on the elements of the ANS System (Ph) (eg during SAM-
PSSA).
Consequently, it is important to document the rationale for the allocated SWAL (end
effect and the likelihood of generating such an effect, overall system design,
operational environment) so that the satisfaction of the safety requirements and any
assumptions may be verified during the software lifecycle.
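The sensitivity concern above can be illustrated with a small sketch. This is purely illustrative: the +/-10% figure is the confidence quoted above, while the numeric band thresholds are invented for the example, since this document defines likelihood qualitatively.

```python
def pe_ph_bounds(pe: float, ph: float, rel_err: float = 0.10):
    """Worst-case bounds on Pe x Ph when each factor is only known to
    within +/- rel_err (e.g. the +/-10% confidence quoted above)."""
    low = pe * (1.0 - rel_err) * ph * (1.0 - rel_err)
    high = pe * (1.0 + rel_err) * ph * (1.0 + rel_err)
    return low, high

def band_is_stable(pe, ph, thresholds, rel_err=0.10):
    """True if the (assumed, purely illustrative) quantitative likelihood
    band of Pe x Ph does not change across the uncertainty interval."""
    def band(x):
        # Count how many thresholds x meets or exceeds.
        return sum(1 for t in thresholds if x >= t)
    low, high = pe_ph_bounds(pe, ph, rel_err)
    return band(low) == band(high)
```

With assumed thresholds of 1e-1, 1e-3 and 1e-5, the bounds on Pe x Ph = 1/10 x 1 (0.081 to 0.121) straddle a band boundary: this is exactly why expert judgement, documentation of the rationale and monitoring are recommended rather than reliance on raw numbers.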

3.6.1.0 Example of a SWAL Allocation (Air-Ground Datalink)

Example: Air-Ground Datalink Safety Requirements on Software


Hazard: Undetected Corruption of a CPDLC message used for separation.
Scope of the hazard: CPDLC/ACL Service (Controller Pilot Datalink Communication / ATC Clearance)
Outcome of the System safety assessment process (e.g. FHA):
- Operational Environment: European Continental Domestic En-Route airspace (assuming airspace class definition, type and volume of traffic, …)
- External Mitigation Means (EMM) to mitigate the consequence of the hazard once it has occurred:
  - EMM1: VHF shall be the primary means of communication;
  - EMM2: Radar surveillance shall be available;
  - EMM3: The airspace to which the clearance pertains is protected until the response is received.
- Effects of the hazard: Various effects can be identified depending on the scenario, thus depending on the EMM effectiveness: Severity Class 1 (e.g. mid-air collision), Severity Class 2 (serious incident such as serious loss of separation), Severity Class 3 (major incident such as large reduction of separation with ATCO or aircrew controlling the situation), Severity Class 4 (significant incident such as significant increase of workload of aircrew and ATCO).
- Safety Objective (various methods can be applied, see SAM [2]). Either:
  - Worst Credible Effect: Severity Class 3, considering some External Mitigation Means


  - Safety Objective for this hazard: the likelihood of an undetected corruption of a CPDLC message used for separation shall be no greater than Occasional (Occasional was set as 1E-6/message due to a Safety Target for Severity Class 3 of ST3 = 1E-4/fh, a Pe = 1/10 and 100 ANS hazards having a Worst Credible Effect of Severity Class 3).
  Or:
  - Assess Pe for each couple (Hazard – Effect) by taking into consideration the effectiveness of the EMM involved in the Hazard – Effect scenario (not developed in this example).
Outcome of the System safety assessment process (e.g. PSSA)
Safety Requirements assigned to the SW:
o SR-ACL-4: response message shall indicate to which message it refers
(Ph = 1).
o SR-ACL-8: Any processing (data entry/encoding/ transmitting/ decoding/
displaying) shall not affect the intent of the CPDLC message (Ph = 1).
o SR-ACL-18: The aircraft/ATSU shall be capable of detecting a corrupted
CPDLC message (Ph = 1).
SWAL: The likelihood (Pe x Ph) that a SW failure (SW data corruption) could generate
an effect having a Severity Class 3 was considered as “Very Possible” as:
overall no effective and independent significant mitigation means (EMM) exist:
Ph was considered as equal to 1 (no internal system architectural means exist
to mitigate some SW data corruption) and;
the External Mitigation Means (captured via Pe) are not very efficient once the
pilot has started executing the clearance.
Pe = 1/10, Ph = 1, thus Pe x Ph = 1/10 was considered as equivalent to “Very
Possible”.
SWAL3 states the level of satisfaction of the Safety Requirements: SR-ACL-4, SR-ACL-8
and SR-ACL-18 will have to be satisfied in accordance with SWAL3.

3.6.2 SWAL Allocation Process

3.6.2.0 Process description

To allocate a SWAL to an ANS software function, the following five steps shall 3.6.2.0.1
be performed:
i. Using the definitions in 3.6.2.1, identify the likelihood (Pe x Ph) that, once
software malfunctions or fails, this software malfunction/failure can generate an
end effect which has a certain severity (for each effect of a hazard1);
ii. Justify the likelihood (see guidance in 3.6.2.2);
iii. Identify the SWAL for that combination (of severity and likelihood) using
the matrix below;
iv. Repeat the previous steps for all SW malfunctions/failures.

1
This has to be done for all effects of a SW malfunction. If only the worst credible effect is taken into consideration, then only
the SW malfunction leading to that worst credible effect of the hazard will have been accounted for.


Likelihood of generating          Effect Severity Class
such an effect (Pe x Ph)        1       2       3       4
Very Possible                 SWAL1   SWAL2   SWAL3   SWAL4
Possible                      SWAL2   SWAL3   SWAL3   SWAL4
Very Unlikely                 SWAL3   SWAL3   SWAL4   SWAL4
Extremely Unlikely            SWAL4   SWAL4   SWAL4   SWAL4

Allocate the most stringent SWAL out of all contributions as the final SWAL of the
software.
SWAL identification shall 3.6.2.0.2 assure that the number of SW malfunctions/failures
leading to such a SWAL has been considered.
NOTE: This requirement aims at ensuring that the cumulative aspect of risks due
to SW malfunction/failure has been considered while taking into account
the ability of a SWAL to provide adequate assurance for not only one, but a
number of malfunction/failures having similar severity/likelihood effects.
However, due to the intrinsic range of assurance provided by a given
SWAL, the SWAL allocation can generally be performed by considering
only the most severe software malfunction/failure.
NOTE: The qualitative definition of likelihood ("Very Possible", "Possible",”Very
Unlikely”, “Extremely Unlikely”) can be interpreted with quantitative values.
NOTE: More than 4 SWALs can be defined according to ESARR 6 and the EC
Regulation 482/2008, although 4 SWALs are used in ED-153.
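As an illustration only, the matrix above can be sketched as a simple lookup. The helper below is hypothetical (not part of ED-153) and merely restates the table:

```python
# Hypothetical sketch of the SWAL allocation matrix: severity class (1-4)
# versus qualitative likelihood (Pe x Ph) of generating the effect.
SWAL_MATRIX = {
    "Very Possible":      {1: 1, 2: 2, 3: 3, 4: 4},
    "Possible":           {1: 2, 2: 3, 3: 3, 4: 4},
    "Very Unlikely":      {1: 3, 2: 3, 3: 4, 4: 4},
    "Extremely Unlikely": {1: 4, 2: 4, 3: 4, 4: 4},
}

def swal_for(likelihood: str, severity: int) -> int:
    """SWAL number (1 = most stringent) for one combination of
    qualitative likelihood and effect severity class."""
    return SWAL_MATRIX[likelihood][severity]
```

For the air-ground datalink example of section 3.6.1.0 (Pe x Ph considered “Very Possible”, Worst Credible Effect of Severity Class 3), swal_for("Very Possible", 3) yields SWAL3, as allocated there.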
If the “Software” scope, as initially specified by the System Design Team and to which
a SWAL is allocated during the risk assessment and mitigation process eg PSSA,
leads to further detailed “Software” design that requires splitting the initial Software
into parts, then each of those parts must also achieve the initially assigned SWAL
level. An exception to this is where isolation can be demonstrated and a SWAL is
allocated to each isolated software by applying the SWAL allocation process again.
This may lead to different SWALs being allocated to the isolated software.
Reminder (Objective 3.0.14): ANS software components that cannot be shown
to be isolated from one another shall 3.6.2.0.3 be allocated the software
assurance level of the most critical of the isolated components.
Any introduction of software into operation shall 3.6.2.0.4 demonstrate that an
acceptable level of safety is satisfied. If the risk assessment and mitigation
process leads to a situation where the mitigation of the end-effect Severity
Class 1, due to software malfunction or failure, is solely based on SWAL1 then
the risk assessment and mitigation demonstration should justify that relying
solely on SWAL 1 meets the acceptable level of safety.
The following considerations should be taken into account in relation to SWAL1:
SWAL1 objectives and associated evidence are extremely stringent, leading to highly
elevated cost and development effort and to difficulty in generating evidence.
Wherever possible, alternative system designs should be proposed, including
other mitigation means, rather than relying on software being allocated a
SWAL1.


If the retained strategy to mitigate the end-effect Severity Class 1 is solely based on
SWAL1, the ANSP may be required to:
Justify why an alternative design including other additional independent
mitigation means is not used.

3.6.2.1 Likelihood Definition

The following qualitative definitions apply when assessing the likelihood (Pe x Ph)
during the SWAL allocation process for a system/SW having numerous years of
operations. For SW/system having a short lifetime, either quantitative definitions may
be defined (e.g. derived from the ANSP Risk Classification Scheme) or other
qualitative appropriate statements may be proposed.
Very Possible: This effect, due to software malfunction or failure, will almost certainly
occur throughout the system life.
Possible: This effect, due to software malfunction or failure, may occur throughout the
system life.
Very Unlikely: This effect, due to software malfunction or failure, is expected to occur
only in some exceptional circumstances throughout the system life.
Extremely Unlikely: This effect, due to software malfunction or failure, is not
expected to occur throughout the system life.
These definitions are complemented by the guidance in section 3.6.2.2.

3.6.2.2 Likelihood Guidance

The following, non-exhaustive list of factors should be considered when justifying the
likelihood (Pe x Ph) during the SWAL allocation process:
Number of mitigation means;
Efficiency of mitigation means (reliability, availability, suitability, and how
relevant and protective it is for the malfunction or failure being assessed);
Completeness of the mitigation means failure mode analysis;
Level of isolation of mitigation means (dependency between internal and
external mitigation means, common modes between internal mitigation means,
common modes between external mitigation means, physical location between
mitigation means, particular risks);
Nature of mitigation means (and diversity of nature with the element being
assessed);
Novelty of mitigation means;
Complexity of mitigation means;
Difference between the context in which the mitigation means’ efficiency was established
(operational environment, usage/role of the mitigation means) and the future context of the
element being assessed, in the case of reuse of an existing mitigation means. This criterion
can be used to contribute to assessing the efficiency of the mitigation means.
The following guidance should be considered when justifying the likelihood (Pe x Ph)
during the SWAL allocation process:
Very Possible: No effective and independent significant mitigation means exist.
For example:
The number and efficiency of the mitigation means cannot be relied upon to
downgrade the SWAL.
The number and efficiency of the mitigation means may, from an individual point of
view, seem significant enough; however, the isolation criterion is seriously
impaired (common mode of failure).


The mitigation means are not yet proven or not yet proven in the future context.
The complexity or novelty of the mitigation means prevents relying significantly
on them (assessment complexity and completeness challengeable).
Though mitigation means may exist, they are not accounted for.
Possible: Effective, isolated mitigation means exist.
For example:
At least one effective, isolated mitigation means is proven.
The number of mitigation means is reduced but the efficiency is significant.
A number of isolated mitigation means each having moderate efficiency.
Very Unlikely: Very effective, isolated mitigation means exist.
For example:
The number of mitigation means is limited but the efficiency is very significant.
A significant number of isolated mitigation means each having significant
efficiency.
Extremely Unlikely: Extremely effective, isolated mitigation means exist.
For example:
The number of mitigation means is limited but the efficiency is extremely
significant.
A very significant number of isolated mitigation means each having significant
efficiency.
A significant number of isolated mitigation means each having very significant
efficiency.
The mitigation means are proven very significant in an identical context.

3.6.3 Examples of SWAL Allocation

3.6.3.0 Allocation of SWAL: looking at all effects

1st CASE: Safety Objectives were allocated by assessing all effects (eg SAM-FHA
Chapter 3 Guidance Material G “Methods for Setting Safety Objectives” Methods 1 & 3
or SAM-FHA Guidance Material E “ED-125 Quantitative Model” [2]). So all effects, due
to software malfunction or failure, are taken into consideration.
This Software will be allocated a SWAL = SWAL3 as it is the most stringent SWAL (for
both hazards).
Likelihood of generating          Effect Severity Class
such an effect (Pe x Ph)        1       2       3       4
Very Possible                 SWAL1   SWAL2   SWAL3   SWAL4
Possible                      SWAL2   SWAL3   SWAL3   SWAL4
Very Unlikely                 SWAL3   SWAL3   SWAL4   SWAL4
Extremely Unlikely            SWAL4   SWAL4   SWAL4   SWAL4

SW malfunction or failure leading to: Hazard1 / Hazard2 (see the reading of the table below).


The way to read the table is the following:


For Hazard 1:
If it is “Extremely Unlikely” that a SW malfunction or failure generates Hazard1
and an effect having a severity 1, then this SW should be allocated a SWAL4;
If it is “Possible” that a SW malfunction or failure generates Hazard1 and an
effect having a severity 2, then this SW should be allocated a SWAL3;
If it is “Very Possible” that a SW malfunction or failure generates Hazard1 and
an effect having a severity 3, then this SW should be allocated a SWAL3;
If it is “Very Possible” that a SW malfunction or failure generates Hazard1 and
an effect having a severity 4, then this SW should be allocated a SWAL4;
For Hazard 2:
If it is “Extremely Unlikely” that a SW malfunction or failure generates Hazard2
and an effect having a severity 1, then this SW should be allocated a SWAL4;
If it is “Extremely Unlikely” that a SW malfunction or failure generates Hazard2
and an effect having a severity 2, then this SW should be allocated a SWAL4;
If it is “Very Unlikely” that a SW malfunction or failure generates Hazard2 and
an effect having a severity 3, then this SW should be allocated a SWAL4;
If it is “Possible” that a SW malfunction or failure generates Hazard2 and an effect
having a severity 4, then this SW should be allocated a SWAL4.
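As a cross-check of the reading above, a hypothetical sketch (re-stating the allocation matrix) can take the most stringent SWAL over all hazard/effect contributions of this 1st case:

```python
# Hypothetical sketch: most stringent SWAL over all contributions.
MATRIX = {
    "Very Possible":      {1: 1, 2: 2, 3: 3, 4: 4},
    "Possible":           {1: 2, 2: 3, 3: 3, 4: 4},
    "Very Unlikely":      {1: 3, 2: 3, 3: 4, 4: 4},
    "Extremely Unlikely": {1: 4, 2: 4, 3: 4, 4: 4},
}

# (severity, likelihood) pairs as read from the text for both hazards.
CONTRIBUTIONS = [
    (1, "Extremely Unlikely"), (2, "Possible"),            # Hazard1
    (3, "Very Possible"), (4, "Very Possible"),
    (1, "Extremely Unlikely"), (2, "Extremely Unlikely"),  # Hazard2
    (3, "Very Unlikely"), (4, "Possible"),
]

def most_stringent_swal(contributions):
    """SWAL1 is the most stringent, so take the minimum SWAL number."""
    return min(MATRIX[lik][sev] for sev, lik in contributions)
```

Applied to CONTRIBUTIONS, the function returns 3, matching the SWAL3 allocated above for both hazards taken together.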

3.6.3.1 Allocation of SWAL: looking at Worst Credible case

2nd CASE: Safety Objectives were allocated using SAM-FHA Chapter 3 Guidance
Material G “Methods for Setting Safety Objectives” Method 2 or 4 or SAM-FHA
Guidance Material E “ED-125 Semi-Quantitative, Semi-Prescriptive or Fixed-
Prescriptive Models”.
Only the worst credible scenario which has been used to set safety objectives is taken
into consideration.
This Software will be allocated a SWAL = SWAL3 as it is the most stringent SWAL (for
both hazards which have a worst credible hazard effect having a severity 3).
Likelihood of generating          Effect Severity Class
such an effect (Pe x Ph)        1       2       3       4
Very Possible                 SWAL1   SWAL2   SWAL3   SWAL4
Possible                      SWAL2   SWAL3   SWAL3   SWAL4
Very Unlikely                 SWAL3   SWAL3   SWAL4   SWAL4
Extremely Unlikely            SWAL4   SWAL4   SWAL4   SWAL4

SW malfunction or failure leading to: Hazard1 / Hazard2 (see the reading of the table below).


The way to read the table is the following:


For Hazard 1: As the Worst Credible effect of Hazard 1 has a Severity 3 (FHA result),
then:
If it is “Very Possible” that a SW malfunction or failure generates Hazard1 and
an effect having a severity 3, then this SW should be allocated a SWAL3;
For Hazard 2: As the Worst Credible effect of Hazard 2 has a Severity 3 (FHA result),
then:
If it is “Very Unlikely” that a SW malfunction or failure generates Hazard2 and an effect
having a severity 3, then this SW should be allocated a SWAL4.

3.6.3.2 Misuse of a SWAL

Whilst quantitative failure rate objectives for the combination of hardware and software
may be set by the Risk Assessment and Mitigation process, and achievement of these
rates may be claimed quantitatively or qualitatively, their achievement may not be
claimed through the application of a process.
Compliance to ED-153 at a given SWAL shall 3.6.3.2.1 not be used to claim a failure
rate/reliability figure for the software.

3.6.4 Grading Policy

3.6.4.0 Criteria

This document provides guidance on objectives to be achieved using defined criteria,
but does not restrict the evidence acceptable to demonstrate achievement of those
objectives to some specific evidence.
The following criteria have been considered in order to establish the objectives
applicable to each SWAL:
Level of achievement necessary, i.e. (legend used throughout this document):
- The objective is to be achieved with independence, i.e. the resulting outputs or
activities generated by the objective have been verified or reviewed by
peers, different departments of the organisation, a different organisation, etc.;
- The objective is to be achieved;
- Blank: achievement of the objective is at the organisation’s discretion.

Nature of evidence.
- Direct evidence: that which is produced by an activity taking place or
software behaviour occurring, which is directly related to the claim being
made;
- Indirect (backing) evidence: that which shows that the Direct Evidence is
both credible and soundly based.
Depth of investigation according to the table below.


Depth of investigation down to …
SWAL4 – Software Requirements: Software can be considered as a Black Box.
SWAL3 – Architectural Design: Access to the first level of Software Components. A first level of architecture (refining the software into software components) is specified and analysed, but these (first-level) software components can be considered as black boxes.
SWAL2 – Code (Software Component): Access to the source code.
SWAL1 – Executable: Compiler, linker translation and editing outputs are examined.
NOTE: The comments in this table are not necessarily applicable to COTS (see
section 7.2).
NOTE: This depth of investigation is not applicable to every objective.

3.6.4.1 Grading Policy Rationale

The following table provides rationale to help in understanding the aim of a SWAL
including what kind of overall objective is intended and the kind of errors which are
supposed to be prevented or tolerated.
Each SWAL adopts, in addition to its own criteria, the criteria of the less stringent
SWALs.
NOTE: This grading policy rationale is not applicable for every objective.


SWAL 4
  Errors to be prevented: Functional error.
  Methods for demonstrating error prevention: Validity of the software requirements; Software Functional Testing.
  Independence in performing the prevention: No.
  Rationale: Black box testing to ensure that software performs its intended functional specifications.

SWAL 3
  Errors to be prevented: Design/low-level functional error.
  Methods for demonstrating error prevention: Validity and testing of the architectural design requirements: interface and functionality tests of primary components.
  Independence in performing the prevention: No.
  Rationale: Confidence augmentation by “grey box testing” to provide better evidence of intended functioning.

SWAL 2
  Errors to be prevented: Credible corruption until implementation (source code level).
  Methods for demonstrating error prevention: Validity and testing of the detailed design requirements with structural coverage statement verification: state compatibility, hardware compatibility, coding standards (strong typing…)…
  Independence in performing the prevention: Yes, for some objectives.
  Rationale: Confidence augmentation by examination of each line of code.

SWAL 1
  Errors to be prevented: Credible corruption until implementation (executable code level).
  Methods for demonstrating error prevention: Testing of the implementation (executable code) with deeper structural coverage, conditional/decisional activation(s) and pre- and post-conditions.
  Independence in performing the prevention: Yes, for most of the objectives.
  Rationale: Confidence augmentation by examination of the conditional/decisional activation(s) of the lines of code.

TABLE 11: GRADING CRITERIA
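The statement-level (SWAL2) versus decision-level (SWAL1) distinction in Table 11 can be illustrated with a small, hypothetical line-coverage tracer. This is illustrative only; a real project would use a qualified coverage tool:

```python
import sys

def checked_div(a, b):
    result = 0.0
    if b != 0:
        result = a / b
    return result

def executed_lines(fn, *args):
    """Toy statement-coverage measurement: record which line offsets of
    fn execute for one call, using sys.settrace (not a qualified tool)."""
    first = fn.__code__.co_firstlineno
    hits = set()

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is fn.__code__:
            hits.add(frame.f_lineno - first)  # line offset within fn
        return tracer

    sys.settrace(tracer)
    try:
        fn(*args)
    finally:
        sys.settrace(None)
    return hits

# One test case (b != 0) executes every statement of checked_div,
# i.e. full statement coverage, the depth associated with SWAL2 ...
full = executed_lines(checked_div, 1, 2)

# ... yet the False outcome of the decision "b != 0" is only exercised
# by a second case, as the conditional/decisional activations demanded
# at SWAL1 would require.
false_branch = executed_lines(checked_div, 1, 0)
```

The first call alone covers all statements but never exercises the `b == 0` outcome; the gap only becomes visible when both decision outcomes are demanded.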

3.6.5 Similarity of Levels throughout Various Standards

Multiple standards exist today that propose activities whose evidence addresses some
of the objectives stated in this document. Where any such standard is used it shall
3.6.5.1 be complemented, where necessary, to satisfy the objectives/activities set for a
given SWAL.
This section provides a reference against which standards may be assessed for the
equivalence of assurance that they provide.
An analysis of the processes and their associated evidence (See Annex A) was used
to provide the comparison of assurances. Annex A provides the detailed traceability
between ED-153 objectives and the various standards (eg IEC 61508, ED-109)
therefore the user can deduce detailed equivalence and compliance from this annex. It
is the responsibility of the user to identify complementary activities for those objectives
for which partial compliance is provided.
The comparison did not address assurances of quantified reliability or quantified
integrity.
It is not intended that this document be used in isolation to demonstrate achievement
of a SWAL but instead that it be used in conjunction with other standards.
The following figure provides a comparison of assurance levels when considering the
process-oriented activities of ED153:
ED-109;
IEC61508.
NOTE: No existing standard fully covers all objectives of ED-153.


SW Standards (with ED-153 as the reference against which to assess practices):

ED-109 AL1 – partial compliance – SWAL1 – partial compliance – IEC 61508 SIL4
ED-109 AL2 – Not Used
ED-109 AL3 – partial compliance – SWAL2 – partial compliance – IEC 61508 SIL3
ED-109 AL4 – partial compliance – SWAL3 – partial compliance – IEC 61508 SIL2
ED-109 AL5 – partial compliance – SWAL4 – partial compliance – IEC 61508 SIL1

FIGURE 4: SOFTWARE STANDARDS

However, caution is recommended when claiming similar assurances, in particular
with IEC 61508: firstly due to the absence of a version of this generic standard
customised to ANS, and secondly due to the various interpretations that can be made
of the tables in Annexes A & B of IEC 61508 Part 3, where some activities are “R”
(recommended) or “HR” (highly recommended).
Example: Table B.2 “Structure-based testing”, which is “R” for SIL2.
This recommendation is similar to statement coverage, which is recommended for
SWAL2 and required for ED-109/DO-278 Level 3; consequently a SIL2 “similar” to
SWAL3 should not include this aspect. Therefore, if a SIL2 is required and includes
this aspect, then the level of assurance becomes a “very strong” SWAL3 and gets
close to SWAL2 (though SWAL2 includes additional objectives).
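For convenience, the indicative correspondence of Figure 4 can be expressed as a lookup. This is a hypothetical helper: “partial compliance” still means complementary activities must be identified from the detailed traceability in Annex A.

```python
# Hypothetical lookup restating Figure 4: an indicative, partial-compliance
# correspondence only - not a claim of equivalence between standards.
LEVEL_MAP = {
    "SWAL1": {"ED-109": "AL1", "IEC 61508": "SIL4"},
    "SWAL2": {"ED-109": "AL3", "IEC 61508": "SIL3"},  # ED-109 AL2 not used
    "SWAL3": {"ED-109": "AL4", "IEC 61508": "SIL2"},
    "SWAL4": {"ED-109": "AL5", "IEC 61508": "SIL1"},
}
```

For example, a practice baseline assessed as ED-109 AL3 corresponds (partially) to SWAL2 and IEC 61508 SIL3.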


CHAPTER 4

PRIMARY LIFECYCLE PROCESSES


4.0 INTRODUCTION

This chapter lists the objectives, per SWAL, that belong to primary lifecycle processes.
Primary lifecycle processes consist of:
1. Acquisition process:
a. Initiation;
b. System Safety Assessment;
c. Request-for-Tender [Proposal];
d. Contract preparation and update;
e. Supplier monitoring;
f. Acceptance and completion.
2. Supply process:
a. Initiation;
b. Preparation of response;
c. Contract;
d. Planning;
e. Execution and control;
f. Review and evaluation;
g. Delivery and completion.
3. Development process:
a. System requirements analysis;
b. System architectural design;
c. Process implementation;
d. Software requirements analysis;
e. Software architectural design;
f. Software detailed design;
g. Software integration;
h. Software installation;
i. Standards/Rules Definition;
j. Standards/Rules;
k. Requirement Development Management;
l. Use of Requirement Management Tool;
m. Resource Management;
n. Rationale for Design choices;
o. Traceability;
p. Transition criteria;
q. Design tool;


r. Use of a design tool;


s. Code generation tool;
t. Complexity constraints.
4. Operation process:
a. Process implementation;
b. Intended Operational Environment;
c. User support;
d. Software operation;
e. Performance monitoring.
5. Maintenance process:
a. Process implementation;
b. SWAL allocation confirmation;
c. SWAL achievement;
d. Software Migration;
e. Software Decommissioning.
The objectives in a primary process are the responsibility of the organisation initiating
and performing that process. This organisation ensures that the process is in
existence and functional.

4.1 ACQUISITION PROCESS

This chapter also provides guidance for SW supplier/acquirer relationships. However,
in some cases such formal relationships may not exist (e.g. in-house development)
and this chapter may either be considered as not applicable or can be used to clarify
the roles and responsibilities of the various units along the SW lifecycle.
The Acquisition Process contains the objectives and tasks of the acquirer. The
process begins with the definition of the need to acquire a system, software product or
software service. The process continues with the preparation and issue of a request
for proposal, selection of a supplier, and management of the acquisition process
through to the acceptance of the system, software product or software service.
The individual organisation having the need may be called the owner. The owner may
contract any or all of the acquisition activities to an agent who will in turn conduct
these activities according to the Acquisition Process. The acquirer in this sub-clause
may be the owner or the agent.
Objectives 4.1.1 to 4.1.3 (and 4.1.4 if the ‘project’ is at system or equipment level) are
not specific to software but related to the system (eg FHA & PSSA).
NOTE: Acquisition process does not relate to business aspects of acquisition, but
only to safety and quality aspects of it.


Obj N° 4.1.1 – Initiation (Annex A section A.3.1)
The acquirer begins the acquisition process by describing a concept or a need to acquire, develop, or enhance a system, software product or software service.
The acquirer shall 4.1.1.1 analyse and approve the system requirements, e.g. against the user requirements.
The system requirements shall 4.1.1.2 include business, organisational and user as well as safety, security, and other criticality requirements, along with related design, testing, and compliance standards/rules and procedures.
The acquirer shall 4.1.1.3 prepare, document and execute an acquisition plan.
Note: The scope of applicability of this objective is restricted to the contractual tasks to be performed by the acquirer in the framework of the contract with the supplier. Therefore this objective should be reconciled with objective 4.3.3.
Output: SSF1 – Part I; SSF2 – Part XX

Obj N° 4.1.2 – Risk Assessment and Mitigation Process – safety objectives (Annex A section A.3.1)
The acquirer shall 4.1.2.1 determine how safe the system needs to be. The acquirer shall 4.1.2.2 perform a hazard analysis (e.g. FHA) and identify Safety Objectives for hazards.
Output: SSF1 – Part II; SSF2 – Part I

Obj N° 4.1.3 – Risk Assessment and Mitigation Process – safety requirements (Annex A section A.3.1)
The acquirer shall 4.1.3.1 determine (during the System Design phase) whether or not the proposed architecture is expected to achieve the Safety Objectives. The acquirer shall 4.1.3.2 specify Safety Requirements (including allocation of a SWAL) for system elements.
Output: SSF1 – Part II; SSF2 – Part I

Obj N° 4.1.4 – Request For Tender (Annex A section A.3.1)
During the request for tender, the acquirer shall 4.1.4.1 determine which processes, activities, and tasks of the recommendations in this document are appropriate for the project and shall 4.1.4.2 tailor them according to the SWAL.

Obj N° 4.1.5 – Tender selection (Annex A section A.3.1)
The acquirer shall 4.1.5.1 establish a procedure for supplier selection including proposal evaluation criteria and requirements compliance weighting.

Obj N° 4.1.6 – Supplier monitoring (Annex A Section A.3.1)
The acquirer shall 4.1.6.1 monitor and discuss progress with the supplier regarding their contracted activities.
Annex A section A.3.1
The acquirer shall 4.1.7.1 prepare for software acceptance based on the defined acceptance
strategy and criteria. The preparation of test cases, test data, test procedures, and test
Acceptance environment shall 4.1.7.2 be included. The extent of supplier involvement shall 4.1.7.3 be
defined. SSF1 –
4.1.7 and
Part VII
completion The acquirer shall 4.1.7.4 conduct acceptance review and acceptance testing of the
deliverable software product or service.
Note: Acceptance can be composed of many milestones with varying roles and
responsibilities that have to be defined.

TABLE 12: ACQUISITION PROCESS OBJECTIVES
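Objective 4.1.7 asks the acquirer to prepare test cases, test data, test procedures and acceptance criteria before conducting acceptance. As a purely illustrative sketch (not part of ED-153; all identifiers and structures are hypothetical), such a preparation can be recorded in a form that ties each acceptance case to the requirement it covers, its pass/fail procedure, and the extent of supplier involvement:

```python
# Illustrative sketch only: a minimal acceptance-test record in the spirit
# of objective 4.1.7. All names and fields are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class AcceptanceCase:
    case_id: str                    # eg "AT-001"
    requirement_ref: str            # system/software requirement covered
    procedure: Callable[[], bool]   # executes the test; True means pass
    supplier_involved: bool = False # extent of supplier involvement (4.1.7.3)

@dataclass
class AcceptanceCampaign:
    cases: List[AcceptanceCase] = field(default_factory=list)

    def run(self) -> dict:
        """Execute every case and return a verdict per case id."""
        return {c.case_id: c.procedure() for c in self.cases}

    def accepted(self, results: dict) -> bool:
        """Acceptance only if every defined case passed."""
        return all(results.values())

campaign = AcceptanceCampaign([
    AcceptanceCase("AT-001", "SR-12", procedure=lambda: 1 + 1 == 2),
    AcceptanceCase("AT-002", "SR-13", procedure=lambda: "x".isalpha(),
                   supplier_involved=True),
])
results = campaign.run()
print(campaign.accepted(results))  # True only if all cases pass
```

Acceptance is then granted only when every defined case passes, which mirrors the pass/fail criteria the acquirer has to define up front.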


4.2 SUPPLY PROCESS

This chapter also provides guidance for SW supplier/acquirer relationships. However,
in some cases such formal relationships may not exist (e.g. in-house development)
and this chapter may either be considered not applicable or be used to clarify the
roles and responsibilities of the various units along the SW lifecycle.
The Supply Process contains the objectives and tasks of the supplier. The process
may be initiated either by a decision to prepare a proposal to answer an acquirer's
request for proposal or by signing and entering into a contract with the acquirer to
provide the system, software product or software service. The process continues with
the determination of procedures and resources needed to manage and assure the
project, including development of project plans and execution of the plans through
delivery of the system, software product or software service to the acquirer.


4.2.1 Initiation (Annex A section A.3.1)
The supplier shall 4.2.1.1 conduct a review of requirements in the request for proposal, taking into account organisational policies and other regulations, to prepare the response.

4.2.2 Preparation of response (Annex A section A.3.1)
The supplier shall 4.2.2.1 define and prepare a proposal in response to the request for proposal, including its recommended tailoring of any applied International Standard/rules.

4.2.3 Contract (Annex A section A.3.1)
The supplier shall 4.2.3.1 negotiate and enter into a contract with the acquirer organisation to provide the software product or service.

4.2.4 Planning (Annex A section A.3.1)
The supplier shall 4.2.4.1 define or select a software lifecycle model appropriate to the scope, magnitude, and complexity of the project. The processes, activities, and tasks of any applied International Standard/rules shall 4.2.4.2 be selected and mapped onto the lifecycle model.
The supplier shall 4.2.4.3 develop and document project management plan(s).
Note: The scope of applicability of this objective is restricted to the contractual tasks to be performed by the supplier in the framework of the contract with the acquirer. Therefore this objective should be reconciled with objective 4.3.3.

4.2.5 Execution & control (Annex A section A.3.1)
The supplier shall 4.2.5.1 implement and execute the project management plan(s).
The supplier shall 4.2.5.2 monitor and control the progress and the quality of the software products or services of the project throughout the contracted lifecycle.
Note: The scope of applicability of this objective is restricted to the contractual tasks to be performed by the supplier in the framework of the contract with the acquirer. Therefore this objective should be reconciled with objective 4.3.3.

4.2.6 Review & evaluation (Annex A section A.3.1)
The supplier shall 4.2.6.1 co-ordinate contract review activities, interfaces, and communication with the acquirer's organisation.
The supplier shall 4.2.6.2 perform quality assurance activities.
Output: SSF1 - Part VII

4.2.7 Delivery & completion (Annex A section A.3.1)
The supplier shall 4.2.7.1 deliver and provide assistance to the acquirer in support of the delivered software product or service, as specified in the contract.

TABLE 13: SUPPLY PROCESS OBJECTIVES


4.3 DEVELOPMENT PROCESS

The Development Process contains the objectives and tasks of the developer. The
process contains the objectives for requirements analysis, design, coding, integration,
testing, and installation and acceptance related to software products. It may contain
system-related objectives if stipulated in the contract. The developer performs or
supports the activities in this process in accordance with the contract.
NOTE: System related objectives are part of the Risk Assessment and Mitigation
Process (eg FHA & PSSA). However, as these processes interact in an
iterative way, system requirements, architecture, integration, … are to
be reassessed, confirmed and validated when software activities are
performed. That is why these system objectives are listed among the
software-related ones (see Annex A section A.3.3).
Satisfying system-related objectives is a pre-requisite to software objectives
satisfaction.
Objectives 4.3.1 and 4.3.2 are not specific to software but related to the system (eg
PSSA) and as a result do not intend to describe the system requirements definition
process.


4.3.1 System Requirements Analysis (Annex A section A.3.3)
The system requirements specification shall 4.3.1.1 describe, as a minimum:
• functions and capabilities of the system;
• business/performance, organisational and user requirements;
• safety, security, human-factors engineering (ergonomics), interface, operations, and maintenance requirements;
• design constraints and validation requirements.
Output: SSF2 - Part I

4.3.2 System Architectural Design (Annex A section A.3.3)
System requirements shall 4.3.2.1 be allocated among hardware, software, people and procedures.
Output: SSF1 - Part I; SSF2 - Part I

4.3.3 Process Implementation (Annex A Sections A.3.3 and A.3.3.1)
A software lifecycle model appropriate to the scope, magnitude, and complexity of the project shall 4.3.3.1 be defined and placed under configuration management. It shall 4.3.3.2 include, as a minimum:
• end of activity/phase criteria for each activity/phase;
• joint technical review for each activity/phase.
Standards/Rules, methods, tools, and computer programming languages shall 4.3.3.3 be selected, tailored and used according to the SWAL.
Note: Process implementation includes lifecycle definition, output documentation, output configuration management, SW products problems, environment definition, development plan, COTS.
Output: SSF1 - Part VII; SSF2 - Others
4.3.4 SW requirements analysis (Annex A Section A.3.3)
The developer shall 4.3.4.1 establish and document software requirements, using software requirements standards/rules as defined per Objectives 4.3.9 & 4.3.10.
The Software Requirements shall 4.3.4.2, as a minimum:
• specify the functional behaviour of the ANS software, capacity, accuracy, timing performances, software resource usage on the target hardware, robustness to abnormal operating conditions, and overload tolerance;
• be complete and correct;
• comply with the System Requirements;
• include an identification of the configuration/adaptation data range.
Algorithms shall 4.3.4.3 be specified.
Output: SSF1 - Part VII; SSF2 - Part I

4.3.5 SW architectural design (Annex A Section A.3.3)
The developer shall 4.3.5.1 transform the requirements for the software into an architecture that describes its top-level structure and identifies the software components.
Note: The scope of this objective is top level SW architecture definition, top level interfaces design, SW integration definition, SW architecture definition criteria.
Output: SSF1 - Part VII; SSF2 - Part I

4.3.6 SW detailed design (Annex A Section A.3.3)
The developer shall 4.3.6.1 develop a detailed design for each software component of the software using software design standards/rules.
Note: The scope of this objective is SW detailed design definition, interfaces design, SW Units tests definition.
Output: SSF1 - Part VII; SSF2 - Part IV
4.3.7 SW integration (Annex A Section A.3.3)
An integration plan shall 4.3.7.1 be developed to integrate the software units and software components into the software. The plan shall 4.3.7.2 include verification/test requirements, procedures, data, responsibilities, and schedule. The plan shall 4.3.7.3 be documented.
Note: The scope of this objective is SW integration plan, SW integration definition, user documentation, SW validation preparation, SW integration evaluation (partially).
Output: SSF1 - Part VII; SSF2 - Part XX

4.3.8 SW installation (Annex A Section A.3.3)
A plan shall 4.3.8.1 be developed to install the software product in the target environment as designated in the contract. The resources and information necessary to install the software product shall 4.3.8.2 be documented and made available before installation.
Output: SSF1 - Part VII; SSF2 - Part XX

4.3.9 Standards/rules definition (Annex A Section A.3.3.1)
Development Plan
The developer shall 4.3.9.1 develop plans for conducting the activities of the development process. The plans shall 4.3.9.2 include as a minimum: specific standards/rules, methods, tools, actions and responsibility associated with the development and validation of all requirements including safety. If necessary, separate plans may be developed. These plans shall 4.3.9.3 be documented and executed.
Output: SSF1 - Part VII; SSF2 - Part I
4.3.10 Standards/rules (Annex A Section A.3.3.1)
SW development plan (standards/rules):
The developer shall 4.3.10.1 identify SW Requirements standards/rules (note minimum content identified in objective 4.3.4).
The developer shall 4.3.10.2 identify SW Design Standards/Rules.
The developer shall 4.3.10.3 identify SW Coding Standards/Rules.
Also to be identified: references to the standards/rules for previously developed software, including COTS software, if those standards/rules are different.
Output: SSF1 - Part VII; SSF2 - Part I

4.3.11 Requirement development management (Annex A Section A.3.3.1)
Software Development Environment
The developer shall 4.3.11.1 identify the selected software development environment in terms of:
(1) the chosen requirements development method(s), procedure(s) and tools (if any) to be used;
(2) the hardware platforms for the tools (if any) to be used.
Example: Method(s) are for example SADT, SART, OOD…, whereas procedures are organisational ways of performing requirement management.
Output: SSF1 - Part VII; SSF2 - Part I

4.3.12 Use of a Requirement specification tool (Annex A Section A.3.3.1)
A Requirement specification tool shall 4.3.12.1 be used.
Output: SSF1 - Part I; SSF2 - Part I
4.3.13 Resource management (load, memory, …) (Annex A Sections A.3.3.5, A.3.3.6, and A.3.3.9)
A necessary margin with regard to usage of resources (eg memory, CPU load, drivers, …) for safety purposes shall 4.3.13.1 be specified.
The margin shall 4.3.13.2 be measured or verified to ensure satisfaction of the specification.
If several software items share the same resources, then the margin shall 4.3.13.3 be evaluated at system level.
Output: SSF1 - Part VII; SSF2 - Part I
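A minimal sketch of how the margin required by objective 4.3.13 might be verified is given below. It is illustrative only: the 20 % margin and the measured figures are hypothetical assumptions, and in a real project the worst-case measurements would be taken on the target hardware.

```python
# Illustrative sketch only: checking a specified safety margin on resource
# usage (objective 4.3.13). Thresholds and figures are hypothetical.
SPECIFIED_MARGIN = 0.20   # eg at least 20 % of each resource must stay free

def margin_ok(capacity: float, measured_usage: float,
              required_margin: float = SPECIFIED_MARGIN) -> bool:
    """True if the unused share of the resource meets the specified margin."""
    free_fraction = (capacity - measured_usage) / capacity
    return free_fraction >= required_margin

# Measured worst-case figures (hypothetical):
checks = {
    "memory_bytes": margin_ok(capacity=512e6, measured_usage=380e6),  # ~26 % free
    "cpu_load":     margin_ok(capacity=1.0,   measured_usage=0.85),   # 15 % free
}
print(checks)  # {'memory_bytes': True, 'cpu_load': False}
```

When several software items share a resource, the same check would be applied to their combined worst-case usage at system level, as the objective requires.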
4.3.14 Rationale for design choices, especially real-time ones in safety SW (Annex A Section A.3.3.1)
The developer shall 4.3.14.1 define real-time design features of software components at architectural design level. A set of properties, such as the following, shall 4.3.14.2 be identified:
• tasks and run-time aspects (priority, events, communications, …);
• interruptions (priorities, delay management, SW watchdog, …);
• treatment & propagation of errors (detection & recovering mechanisms, …);
• data management (protection & deadlock mechanisms, …);
• initialisation/stop (exchange of data during these phases).
Output: SSF1 - Part VII; SSF2 - Part I (included in SW design standards/rules)

4.3.15 Traceability (Annex A Section A.3.3.1)
a) The developer shall 4.3.15.1 ensure there is traceability between System and Software requirements (Annex A.3.4).
b) The developer shall 4.3.15.2 ensure there is traceability between Software requirements and Software design (Software component level, architectural design) (Annex A.3.6).
c) The developer shall 4.3.15.3 ensure there is traceability between Software Architectural Design and Code (Annex A.3.7).
d) The developer shall 4.3.15.4 ensure there is traceability between Code and Executable.
Output: SSF1 - Part VII; SSF2 - Part I (two matrices per link, both ways)
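The two-way traceability matrices of objective 4.3.15 can be sketched as follows (illustrative only; all requirement and component identifiers are hypothetical). The reverse matrix is derived mechanically from the declared forward links, and items with no link in either direction are flagged:

```python
# Illustrative sketch only: deriving both-way traceability matrices
# (objective 4.3.15) from declared trace links. Identifiers are hypothetical.
from collections import defaultdict

# Declared links: software requirement -> design components implementing it
req_to_design = {
    "SWR-1": ["CMP-A"],
    "SWR-2": ["CMP-A", "CMP-B"],
    "SWR-3": [],               # orphan requirement: no design coverage
}
all_components = {"CMP-A", "CMP-B", "CMP-C"}   # CMP-C traces to nothing

def reverse(matrix):
    """Build the opposite-direction matrix (design -> requirements)."""
    rev = defaultdict(list)
    for req, comps in matrix.items():
        for comp in comps:
            rev[comp].append(req)
    return dict(rev)

design_to_req = reverse(req_to_design)
uncovered_reqs = [r for r, c in req_to_design.items() if not c]
untraced_components = sorted(all_components - design_to_req.keys())

print(uncovered_reqs, untraced_components)  # ['SWR-3'] ['CMP-C']
```

The same pattern applies to each of the four links (system/software requirements, requirements/design, design/code, code/executable), yielding the two matrices per link that the objective calls for.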

4.3.16 Transition criteria between lifecycle phases (Annex A Section A.3.3.1)
Verification/transition criteria
The developer shall 4.3.16.1 describe the software lifecycle processes to be used to form the specific software lifecycle(s) to be used on the project, including the transition criteria for the software development processes.
All essential information from a phase of the software lifecycle needed for the correct execution of the next phase shall 4.3.16.2 be available and verified.
See also evaluation criteria for specification, design, code, test, integration.
Transition criteria for all phases shall 4.3.16.3 be defined.
Transition criteria for Requirements Analysis and Verification phases shall 4.3.16.4 be defined.
Output: SSF1 - Part VII; SSF2 - Part II

4.3.17 Design tool (Annex A Section A.3.3.1.1 and A.3.3.5)
Software Development Environment
If a design tool is used, then the developer shall 4.3.17.1 identify the selected software development environment in terms of:
(1) the chosen design method(s), procedure(s) and tools (if any) to be used;
(2) the hardware platforms for the tools (if any) to be used.
Output: SSF1 - Part VII; SSF2 - Other

4.3.18 Use of Design tool (Annex A Sections A.3.3.1.1, A.3.3.2, A.3.3.5)
A design tool shall 4.3.18.1 be used.
Output: SSF1 - Part I; SSF2 - Other
4.3.19 Code generation Environment (Annex A Sections A.3.3.1, A.3.3.1.1, and A.3.3.7)
Software Development Environment
The developer shall 4.3.19.1 identify the selected software development environment in terms of:
(1) the programming language(s), coding tools, compilers, linkage editors and loaders to be used;
(2) the hardware platforms for the tools to be used.
Programming Languages (Annex A.3.7): the selection of suitable programming languages shall 4.3.19.2 be justified for the required Assurance Level.
Compilers considerations (Annex A.3.7): the compilers' mode of use (optimisations, limitations, …) shall 4.3.19.3 be defined.
SW development tool validation (Annex A.3.): the context for such a validation shall 4.3.19.4 be defined (validation/certification of compilers/linkers/code generation tools).
Output: SSF1 - Part I; SSF2 - Other

4.3.20 Complexity constraints (Annex A Sections A.3.3.1.1, A.3.3.4, A.3.3.5, A.3.3.6, A.3.3.7, and A.3.3.8)
A level of complexity (as well as selected criteria defining this complexity) shall 4.3.20.1 be defined and measured.
Output: SSF1 - Part VII; SSF2 - Part IV

TABLE 14: DEVELOPMENT PROCESS
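One possible way to define and measure a complexity criterion in the sense of objective 4.3.20 is an approximate cyclomatic-complexity count per function, checked against a project-defined ceiling. The sketch below is illustrative only; the chosen decision-node set and the limit of 10 are assumptions that a project would have to define and justify for its SWAL.

```python
# Illustrative sketch only: an approximate cyclomatic-complexity measure
# (one candidate criterion for objective 4.3.20). Threshold is hypothetical.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic(func_source: str) -> int:
    """1 + number of decision points found in the source (approximation)."""
    tree = ast.parse(func_source)
    return 1 + sum(isinstance(node, DECISION_NODES)
                   for node in ast.walk(tree))

SAMPLE = """
def classify(x):
    if x < 0:
        return "neg"
    for _ in range(3):
        if x > 10:
            return "big"
    return "small"
"""
LIMIT = 10  # project-defined ceiling, to be justified per SWAL
score = cyclomatic(SAMPLE)
print(score, score <= LIMIT)  # 4 True
```

Whatever metric is selected, the point of the objective is that the criteria are defined up front and the measurement is actually performed and recorded.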


4.4 OPERATION PROCESS

The Operation Process contains the objectives and tasks of the operator. The process
covers the operation of the software product in the context of the ATM system
(including people procedures and equipment) and operational support to users.
Because operation of the software product is integrated into the operation of the
system, the objectives of this process refer to the system. Therefore these
objectives are not software-specific.
The operation process starts once transfer into operation begins, ie after a first
software release has been operationally approved. Operation includes transferring
into operation, commissioning and operating software. Decommissioning is covered by
the Maintenance Process.
The means (activities and evidence) to satisfy Operation process objectives 4.4.1 and
4.4.5 will vary per SWAL due to:
• the operation of the software and the consequences of that operation;
• the need for performance monitoring data.
The operational process addresses the usage of the software product by operational
staff in accordance with operational procedures.
As described in this document (chapter 5.5 VALIDATION), equipment validation is not
covered in this section.
Some ANS procedures exist which rely on the software. The purpose of this chapter is
not to define these ANS procedures, but to define how the software has to be
operated (eg HMI user’s manual, mode of operation) when using these ANS
procedures.
These ANS procedures need verification and validation, however this is out of scope
for this document.
The procedures for raising problem reports and modification requests are covered by
the Problem Resolution Process (Chapter 5 section 8 of this document).


4.4.1 Process implementation (Annex A Section A.3.4)
The operation process shall 4.4.1.1 be defined and executed.
Output: SSF1 - Part VII; SSF2 - Part VI

4.4.2 Intended Operational Environment (Annex A Section A.3.4)
The software shall 4.4.2.1 be operated in its system intended environment according to the user documentation.
Note: Demonstration should be provided that any use of SW settings (eg selection by users of parameters, database, configuration file, …) as per the SW specification will lead to safe operations. A strategy has to be developed to sustain this objective; it may be addressed by the verification process.
Output: SSF1 - Part VII; SSF2 - Part I

4.4.3 User support (Annex A Section A.3.4)
The operator shall 4.4.3.1 approve training to the users if relevant.
Output: SSF1 - Part VII; SSF2 - Other

4.4.4 Software Operation (Annex A Section A.3.4)
Procedures to operate the software shall 4.4.4.1 be defined and executed (including the configuration/adaptation sets that have been verified and validated).
Note: If another adaptation/configuration data set is used, then it must be considered as a change to the system.
Output: SSF1 - Part VII; SSF2 - Part XX

4.4.5 Performance Monitoring (Annex A Section A.3.4)
An approach commensurate with the allocated SWAL stringency shall 4.4.5.1 exist to monitor the Software performance.
Output: SSF1 - Part VII; SSF2 - Part I

TABLE 15: OPERATION PROCESS OBJECTIVES
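A minimal sketch of a performance-monitoring check in the spirit of objective 4.4.5 is shown below. It is illustrative only: the idea of tying alert thresholds to the allocated SWAL, the threshold values and the sample figures are all hypothetical assumptions.

```python
# Illustrative sketch only: a minimal performance monitor whose alerting
# stringency could follow the allocated SWAL (objective 4.4.5).
# All figures are hypothetical.
from statistics import mean

THRESHOLDS_MS = {1: 50, 2: 100, 3: 200, 4: 500}   # stricter for lower SWAL

def monitor(samples_ms, swal: int):
    """Return (average, alert?) for a window of measured response times."""
    avg = mean(samples_ms)
    return avg, avg > THRESHOLDS_MS[swal]

avg, alert = monitor([40, 60, 55, 80], swal=2)
print(avg, alert)  # 58.75 False
```

In practice the monitored quantities, the window, and the follow-up of alerts would be defined in the operation procedures and made commensurate with the SWAL stringency.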


4.5 MAINTENANCE PROCESS

The Maintenance Process contains the objectives and tasks of the maintainer. This
process is activated whenever the software product undergoes modifications to code
and associated documentation due to a problem or the need for improvement or
adaptation. This process includes the migration and decommissioning of the software
product. The process ends with the decommissioning of the software product.
The maintenance process as defined in this document covers modification of software
that has been commissioned ie software that is introduced into operation (not during
software development). However, the maintenance process as described in this
document does not cover the processes which collect reports or requests for
modification and agree on the acceptance of the modification (See Section 5.8
Problem Resolution Process).
The Problem Resolution Process triggers the Maintenance process.
NOTE: Maintenance does not cover modifications due to new requirements or
changes to existing requirements as this leads to repeating the complete
System Safety Assessment process.


4.5.1 Process implementation (Annex A Section A.3.5)
The maintenance process shall 4.5.1.1 be defined and executed.
Each software maintenance intervention shall 4.5.1.2 be subject to a risk assessment and mitigation process (eg SAM-SSA Chapter 3 GMC).
Output: SSF1 - Part VII; SSF2 - Part VI

4.5.2 SWAL allocation confirmation (Annex A Section A.3.5)
The impact on risk acceptability of the problem or modification, as provided by the “Problem Resolution Process”, shall 4.5.2.1 be confirmed throughout the maintenance process.
Output: SSF1 - Part V; SSF2 - Part VI

4.5.3 SWAL satisfaction (Annex A Section A.3.5)
The maintainer shall 4.5.3.1 ensure that any maintenance activity is performed in accordance with the allocated SWAL.
Note: This means that not only the maintenance process but also the other processes must be addressed.
Output: SSF1 - Part VII; SSF2 - Part VI

4.5.4 Software migration (Annex A Section A.3.5)
The maintainer shall 4.5.4.1 define a procedure to migrate the changed software and put it into operation.
A risk assessment and mitigation of the migration shall 4.5.4.2 be performed.
Output: SSF1 - Part V; SSF2 - Part VI

4.5.5 Software Decommissioning (Annex A Section A.3.5)
A decommissioning plan shall 4.5.5.1 be defined before decommissioning.
A risk assessment and mitigation of the decommissioning shall 4.5.5.2 be performed.
Output: SSF1 - Part V; SSF2 - Other

TABLE 16: MAINTENANCE PROCESS OBJECTIVES


CHAPTER 5

SUPPORTING LIFECYCLE PROCESSES


5.0 INTRODUCTION

This chapter lists the objectives, per SWAL, that belong to supporting lifecycle
processes.
Supporting lifecycle processes consist of:
1. Documentation process:
a. Process implementation;
b. Design and development;
c. Production;
d. Maintenance.
2. Configuration management process:
a. Process implementation;
b. Configuration identification;
c. Configuration control;
d. Configuration status accounting;
e. Configuration evaluation;
f. Retrieval & Release process;
g. Use of Configuration Management tool;
h. Acquirer agreement for the use of a Configuration Management tool;
i. Configuration Management at the level of Software Component;
j. Configuration Management Traceability.
3. Quality assurance process:
a. Process implementation;
b. Product assurance;
c. Process assurance;
d. Quality audits.
4. Verification process:
a. System verification;
b. Verification plan;
c. Software requirements;
d. Integration;
e. Software Design;
f. Code;
g. Independent verification;
h. Executable;
i. Data;
j. Traceability;


k. Complexity measures;
l. Verification process results;
m. Retrieval & Release process.
5. Validation process: (Not Applicable to SW as validation is system-related)
a. Process implementation;
b. Validation planning;
c. Boundaries validation;
d. Pass/Fail criteria;
e. Validation test;
f. Record of validation activities;
g. Independent validation team.
6. Joint review process:
a. Process implementation;
b. Project management reviews;
c. Technical reviews.
7. Audit process:
a. Process implementation;
b. Audit.
8. Problem resolution process:
a. Process implementation;
b. Problem resolution;
c. Safety impact;
d. Problem Report Configuration Management.
The objectives in a supporting process are the responsibility of the organisation
performing that process. Depending on the lifecycle phase, different organisations
may be responsible for performing a process.
Each organisation shall 5.0.1 ensure that the process is in existence and functional.


5.1 DOCUMENTATION PROCESS

The Documentation Process is a process for recording information produced by a lifecycle process or activity. The process contains the set of
objectives which plan, design, develop, produce, edit, distribute, and maintain those documents needed by all concerned, such as managers,
engineers, NSAs and users of the system or software product.

5.1.1 Process Implementation (Annex A Section A.4.1)
A plan, identifying the documents to be produced during the lifecycle of the software product, shall 5.1.1.1 be defined and implemented.
Output: SSF1 - Part III; SSF2 - Part II

5.1.2 Design & Development (Annex A Section A.4.1)
Each identified document shall 5.1.2.1 be designed in accordance with applicable documentation standards/rules.
Output: SSF1 - Part III; SSF2 - Part II

5.1.3 Production (Annex A Section A.4.1)
The documents shall 5.1.3.1 be produced and provided in accordance with the plan. Production and distribution of documents may use paper, electronic, or other media. Master materials shall 5.1.3.2 be stored in accordance with requirements for record retention, security, maintenance, and backup.
Output: SSF1 - Part III; SSF2 - Part II

5.1.4 Maintenance (Annex A Section A.4.1)
The tasks that are required to be performed when documentation is to be modified shall 5.1.4.1 be performed in accordance with the configuration management process (5.2).
Output: SSF1 - Part III; SSF2 - Part II

TABLE 17: DOCUMENTATION PROCESS OBJECTIVES


5.2 CONFIGURATION MANAGEMENT PROCESS

The Configuration Management Process is a process of applying administrative and
technical procedures throughout the software lifecycle to: identify, define, and baseline
software in a system; control modifications and releases of the items; record and
report the status of the items and modification requests; ensure the completeness,
consistency, and correctness of the items; and control storage, handling, and delivery
of the items.
Rationale: Configuration Management ensures that assurances, for the safety of the
software in the system context, are at all times derived from:
- a known executable version of the software,
- a known range of configuration data, and
- a known set of software products and descriptions that have been used in the
production of that version.
NOTE: At the equipment level, configuration management should trace software
and hardware versions to ensure that compatibility is achieved.


5.2.1 Configuration management process implementation (Annex A Section A.4.2)
A configuration management plan shall 5.2.1.1 be developed. The plan shall 5.2.1.2 include, as a minimum:
• the configuration management activities;
• procedures and schedule for performing these activities;
• the organisation(s) responsible for performing these activities, and their relationship with other organisations, such as software development or maintenance;
• software lifecycle environment control management (tools used to develop or verify SW);
• definition of SW lifecycle data (any output relevant to the safety assurance of the software) control management.
The plan shall 5.2.1.3 be documented, placed under configuration management and implemented.
Output: SSF1 - Part VII; SSF2 - Part II

5.2.2 Configuration identification (Annex A Sections A.3.3.1 and A.4.2)
A scheme shall 5.2.2.1 be established for identification of software and their versions to be controlled throughout the complete lifecycle of the software.
For each version of all software, the following shall 5.2.2.2 be identified, as a minimum:
• the documentation that establishes the baseline;
• the version references;
• the problem reports list (those already fixed, those fixed in that particular version and those still open, if any);
• other identification details.
The items to be configuration-identified shall 5.2.2.3 be identified, along with their associated configuration management level.
Output: SSF1 - Part VII; SSF2 - Part VI
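The identification scheme of objective 5.2.2 can be illustrated by a minimal per-version baseline record (illustrative only; the field names and the sample values are hypothetical):

```python
# Illustrative sketch only: a minimal identification record per software
# version, covering the items objective 5.2.2 asks for as a minimum.
from dataclasses import dataclass
from typing import List

@dataclass
class Baseline:
    software_id: str
    version: str                      # version reference
    baseline_docs: List[str]          # documentation establishing the baseline
    fixed_problem_reports: List[str]  # PRs fixed in this version
    open_problem_reports: List[str]   # PRs still open, if any

    def identifier(self) -> str:
        """A single controlled identifier for this configuration."""
        return f"{self.software_id}-{self.version}"

b = Baseline("FDP", "2.1.0",
             baseline_docs=["SDD-2.1", "SRS-2.1"],
             fixed_problem_reports=["PR-101"],
             open_problem_reports=["PR-204"])
print(b.identifier())  # FDP-2.1.0
```

In a real CM system such a record would itself be under configuration control and traceable across versions, as objectives 5.2.3 and 5.2.10 require.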

5.2.3 Configuration control (Annex A Section A.4.2)
The following shall 5.2.3.1 be performed: identification and recording of change requests; analysis and evaluation of the changes; approval or rejection of the request; and implementation, verification, and release of the modified software.
An audit trail shall 5.2.3.2 exist, whereby each modification, the reason for the modification, and authorisation of the modification can be traced.
Control and audit of all accesses to the controlled software that handle safety related functions shall 5.2.3.3 be performed.
Output: SSF1 - Part V; SSF2 - Part VI

5.2.4 Configuration status accounting (Annex A Section A.4.2)
Configuration Management records and status reports that show the status and history of controlled software, including baselines, shall 5.2.4.1 be prepared. Status reports shall 5.2.4.2 include the number of changes for a project, latest software versions, release identifiers, the number of releases, and comparisons of releases.
Output: SSF1 - Parts V & VII; SSF2 - Part VI

5.2.5 Configuration evaluation (Annex A Section A.4.2)
The following shall 5.2.5.1 be determined and ensured: the functional completeness of the software against their requirements and the physical completeness of the software (whether their design and code reflect an up-to-date technical description).
Output: SSF1 - Part VII; SSF2 - Part IV

5.2.6 Retrieval & Release Process (Annex A Section A.4.2)
A retrieval and release process shall 5.2.6.1 exist and shall 5.2.6.2 be documented.
The release and delivery of software products and documentation shall 5.2.6.3 be formally controlled. Master copies of code and documentation shall 5.2.6.4 be maintained for the life of the software product.
Output: SSF1 - Part VII; SSF2 - Part VI

5.2.7 Use of a CM tool
A tool shall 5.2.7.1 be used to perform Software configuration management.
Output: SSF1 - Part I; SSF2 - Part VI

5.2.8 Use of a CM tool (acquirer agreement)
The acquirer shall 5.2.8.1 approve the selected software configuration management tool.
Output: SSF1 - Part VII; SSF2 - Part VI

5.2.9 At level of SW component
The software configuration management shall 5.2.9.1 be performed at the Software Unit level.
Output: SSF1 - Part VII; SSF2 - Part VI

5.2.10 Configuration Management Traceability
Software lifecycle data (any output) shall 5.2.10.1 be traceable between versions.
All lifecycle data shall 5.2.10.2 be traceable to the version of software being deployed.
Output: SSF1 - Part VII; SSF2 - Part VI

5.2.11 At level of SW source code
The software configuration management shall 5.2.11.1 be performed at the Software source code level.
Output: SSF1 - Part VII; SSF2 - Other

TABLE 18: CONFIGURATION MANAGEMENT PROCESS OBJECTIVES
5.3 QUALITY ASSURANCE PROCESS

The Quality Assurance Process is a process for providing adequate assurance that
the software products and processes in the project lifecycle conform to their specified
requirements and adhere to their established plans. To be unbiased, quality
assurance shall 5.3.1 have organisational freedom and authority from persons directly
responsible for developing the software product or executing the process in the
project. Quality assurance may be internal or external depending on whether evidence
of product or process quality is demonstrated to the management of the supplier or the
acquirer. Quality assurance may make use of the results of other supporting
processes, such as Verification, Validation, Joint Reviews, Audits, and Problem
Resolution.
NOTE: The Joint Review and Audit Processes may be addressed (with
justification) within the Quality Assurance Process.
5.3.1 Process Implementation (Annex A Section A.4.3)
A quality assurance process tailored to the project shall 5.3.1.1 be established. The objectives of the quality assurance process shall 5.3.1.2 be to assure that the software products and the processes employed for providing those software products comply with their established requirements and adhere to their established plans.
A plan for conducting the quality assurance process activities and tasks shall 5.3.1.3 be defined, implemented, and maintained (including configuration management of evidence records) throughout the relevant parts of the software lifecycle.
Output: SSF1 - Part VII; SSF2 - Part I

5.3.2 Product Assurance (Annex A Section A.4.3)
It shall 5.3.2.1 be assured that all the plans required by ED-153 are defined, are mutually consistent, and are being executed as required.
A Software Conformity review shall 5.3.2.2 be performed.
Output: SSF1 - Part VII; SSF2 - Part I

5.3.3 Process Assurance (Annex A Section A.4.3)
It shall 5.3.3.1 be assured that those software lifecycle processes (supply, development, operation, maintenance, and supporting processes including quality assurance) employed for the project adhere to the plans.
It shall 5.3.3.2 be assured that the internal software engineering practices, development environment and test environment adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I, IV, V
TABLE 19: QUALITY ASSURANCE PROCESS OBJECTIVES
5.4 VERIFICATION PROCESS

The Verification Process is a process for determining whether the software products of
an activity fulfil the requirements or conditions imposed on them in the previous
activities. For cost and performance effectiveness, verification should be integrated,
as early as possible, with the process (such as supply, development, operation, or
maintenance) that employs it. This process may include analysis, review and test.
This process may be executed with varying degrees of independence. The degree of
independence may range from the same person or different person in the same
organisation to a person in a different organisation with varying degrees of separation.
In the case where the process is executed by an organisation independent of the
supplier, developer, operator, or maintainer, it is called Independent Verification
Process (confirmation by examination of evidence that a product, process or service,
fulfils specified requirements).
All verification activities assess a product not only against its requirements, but also for conformance with the process that created it and with any particular standards/rules used in its creation.
5.4.0 Software Verification

5.4.1 Verification process implementation (Annex A Section A.4.4)
A verification process tailored to the software shall 5.4.1.1 be established. The output of the verification process shall 5.4.1.2 be documented and distributed to the interested parties.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V

5.4.2 Verification plan (Annex A Section A.4.4)
A verification plan shall 5.4.2.1 be defined. The plan shall 5.4.2.2 address the lifecycle verification activities and phase outputs subject to verification and related resources, responsibilities, pass/fail criteria, methods and schedule. The plan shall 5.4.2.3 address procedures for forwarding verification reports to the interested parties, stating the action to be taken by each party.
Note: Ensure that the description of the various testing activities and the phase of the SW lifecycle (FAT, SAT, software testing) is included, eg as part of the verification plan.
Note: Objectives regarding the verification of the configuration/adaptation data may be expanded in the operation process (see 4.4). The strategy for verifying the appropriate combination of configuration/adaptation data is described in the verification plan.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; (set of) verification plan(s)
5.4.3 Verification of software requirements (Annex A Section A.4.4)
a) It shall 5.4.3.1 be verified that software requirements are correct and complete;
b) The software requirements shall 5.4.3.2 be verified considering that the functional behaviour of the implemented software complies with the Software Requirements;
c) The software requirements shall 5.4.3.3 be verified considering that the timing performances of the implemented software comply with the Software Requirements;
d) The software requirements shall 5.4.3.4 be verified considering that the software requirements are consistent, feasible, and verifiable;
e) The software requirements shall 5.4.3.5 be verified considering that the robustness of the implemented software to abnormal operating conditions complies with the Software Requirements;
f) The software requirements shall 5.4.3.6 be verified considering external consistency (boundaries) with the system requirements;
g) The software requirements shall 5.4.3.7 be verified considering internal consistency between software requirements;
h) The software requirements shall 5.4.3.8 be verified considering compatibility between the implemented software and the HW/SW features of the target computer (system response time, input/output HW, operation on the target computer);
i) The software requirements shall 5.4.3.9 be verified considering that the software requirements conform to software requirements standards/rules;
j) The software requirements shall 5.4.3.10 be verified considering that algorithms are accurate and correct;
k) The software requirements shall 5.4.3.11 be verified considering that the capacity of the implemented software complies with the Software Requirements;
l) The software requirements shall 5.4.3.12 be verified considering that the overload tolerance of the implemented software complies with the Software Requirements.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; SW verification results
5.4.4 Integration Verification (Annex A Section A.4.4)
As a minimum:
a) The integration verification shall 5.4.4.1 verify whether the software components have been completely and correctly integrated into the software;
b) The integration verification shall 5.4.4.2 verify whether the software units have been completely and correctly integrated into the software component;
c) The integration verification shall 5.4.4.3 verify whether the hardware items, software, and manual operations of the system have been completely and correctly integrated into the system;
d) The integration verification shall 5.4.4.4 verify whether the integration tasks have been performed in accordance with an integration plan.
Examples of verification criteria (especially as far as isolation between software is concerned):
• Linking and loading data and memory map
• Data control and coupling
• Incorrect HW addresses
• Memory overlaps
• Missing SW components.
Note: Global verification should be performed either through tests or other methods like reviews.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; Integration verification results (tests results, reviews records…)
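Two of the example integration-verification criteria above, memory overlaps and missing SW components, lend themselves to automated checking against a linker memory map. A minimal sketch, with an illustrative map format of (name, start, end) address ranges:

```python
# Sketch: detect overlapping memory segments and expected components
# missing from a linker memory map. The map format is hypothetical.
from itertools import combinations

def memory_overlaps(segments):
    """segments: list of (name, start, end) with end exclusive.
    Return pairs of segment names whose address ranges overlap."""
    return [(a[0], b[0]) for a, b in combinations(segments, 2)
            if a[1] < b[2] and b[1] < a[2]]

def missing_components(segments, expected):
    """Return expected component names absent from the map."""
    present = {name for name, _, _ in segments}
    return sorted(set(expected) - present)

seg = [("boot", 0x0000, 0x1000), ("app", 0x0F00, 0x8000)]
print(memory_overlaps(seg))                       # [('boot', 'app')]
print(missing_components(seg, ["boot", "app", "bsp"]))  # ['bsp']
```

A real check would parse the map file emitted by the project's linker; only the overlap and completeness logic is shown here.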
5.4.5 Verification of software architectural design (Annex A Section A.4.4)
When evaluating the tests, test results to verify the software architectural design, and user documentation:
a) External consistency with the software requirements (hardware-software compatibility) shall 5.4.5.1 be considered;
b) Internal consistency (data flow and control flow) shall 5.4.5.2 be considered;
c) Verification coverage of the software architectural design shall 5.4.5.3 be considered;
d) Design conformity to design standards/rules shall 5.4.5.4 be considered;
e) Appropriateness of test standards/rules and methods used shall 5.4.5.5 be considered;
f) Conformance to expected results shall 5.4.5.6 be considered;
g) Feasibility of software design testing shall 5.4.5.7 be considered;
h) Feasibility of maintenance shall 5.4.5.8 be considered;
i) Verification criteria on which verification completion will be judged shall 5.4.5.9 be considered.
The results of the evaluations shall 5.4.5.10 be documented.
Note: The compliance should be verified according to the definition of the transition criteria between lifecycle phases (cf SWAL allocation for Development process).
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; Design verification results + integration test description verification results
5.4.6 Verification of detailed design (Annex A Section A.4.4)
When evaluating the software code and verification results:
a) External consistency with the requirements and design of the software (hardware-software compatibility) shall 5.4.6.1 be considered;
b) Internal consistency between detailed design requirements shall 5.4.6.2 be considered;
c) Verification coverage of detailed design (units) shall 5.4.6.3 be considered;
j) Conformance of the code to code standards/rules shall 5.4.6.4 be considered;
k) Verification of the coverage of the software structure (statement coverage) shall 5.4.6.5 be considered (see note below this table);
d) Appropriateness of coding methods and standards/rules used shall 5.4.6.6 be considered;
e) Feasibility of software code verification shall 5.4.6.7 be considered;
f) Feasibility of maintenance shall 5.4.6.8 be considered.
The results of the evaluations shall 5.4.6.9 be documented.
Note: Global verification should be performed either through tests or other methods like reviews or other means.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; SW unit code & test description verification results (tests or reviews…)
5.4.7 Removed
5.4.8 Verification of executable code (Annex A Section A.4.4)
Executable code and verification results shall 5.4.8.1 be evaluated considering the criteria listed below:
a) External consistency with the code of the software (eg is the compiler generating an appropriate executable or object code?);
b) Internal consistency of the executable (eg: is the compiler always generating the same executable or object code for the same source?);
c) Verification of the translation of the software source code into object code (eg is the compiler generating additional and unnecessary executable or object code, such as dead executable code?);
d) Feasibility of executable verification;
e) Verification of software structure (MC/DC).
The results of the evaluations shall 5.4.8.2 be documented.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; SW source code verification results (inspection & test)
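The MC/DC criterion named in item e) requires, for each condition in a decision, a pair of tests that differ only in that condition and produce different decision outcomes. A sketch of that check on a hypothetical decision `A and (B or C)`; the decision and test vectors are illustrative:

```python
# Sketch: check whether a test set demonstrates MC/DC for a sample
# three-condition decision. Each condition must be shown to
# independently affect the decision outcome.
from itertools import product

def decision(a, b, c):
    # Hypothetical decision under test.
    return a and (b or c)

def mcdc_pairs(tests):
    """For each condition index 0..2, list the pairs of test vectors
    that differ only in that condition and flip the decision."""
    pairs = {i: [] for i in range(3)}
    for t1, t2 in product(tests, repeat=2):
        diff = [i for i in range(3) if t1[i] != t2[i]]
        if len(diff) == 1 and decision(*t1) != decision(*t2):
            pairs[diff[0]].append((t1, t2))
    return pairs

tests = [(True, True, False), (False, True, False),
         (True, False, False), (True, False, True)]
covered = sorted(i for i, p in mcdc_pairs(tests).items() if p)
print(covered)  # [0, 1, 2] -> all three conditions covered
```

Four vectors suffice here; in general MC/DC of an n-condition decision needs at least n + 1 tests.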
5.4.9 Data verification (Annex A Section 4.4.2)
The data structures specified during detailed design shall 5.4.9.1 be verified for:
• completeness
• self-consistency
• protection against alteration or corruption
Output: SSF1 - Part VII; SSF2 - Part II, IV, V; Data verification results (inspection & test)
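One common way to address "protection against alteration or corruption" is to store a checksum with each data record and verify it before use. A sketch using CRC32, which detects accidental corruption but not deliberate tampering; the record content is illustrative:

```python
# Sketch: seal a data record with a CRC32 and verify it before use.
# The payload shown is a hypothetical adaptation-data record.
import zlib

def seal(payload: bytes) -> bytes:
    """Append a 4-byte CRC32 to the payload."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def verify(record: bytes) -> bool:
    """True if the stored CRC matches the payload."""
    payload, crc = record[:-4], record[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == crc

rec = seal(b"RWY27;ILS;110.30")
print(verify(rec))                        # True
corrupted = b"RWY27;ILS;110.35" + rec[-4:]
print(verify(corrupted))                  # False
```

Where deliberate alteration must also be excluded, a cryptographic digest or signature would replace the CRC.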
5.4.10 Traceability
As a minimum:
a) Traceability shall 5.4.10.1 be verified between System and Software requirements;
b) Traceability shall 5.4.10.2 be verified between Software requirements and Software Architectural Design;
c) Traceability shall 5.4.10.3 be verified between Software Architectural Design and Detailed Design;
d) Traceability shall 5.4.10.4 be verified between Software Detailed Design and Executable Code;
e) Traceability shall 5.4.10.5 be verified between verification evidence and Software Requirements;
f) Traceability shall 5.4.10.6 be verified between safety assurance evidence and the version of the software being deployed.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V
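Each pairwise traceability check above reduces to confirming that every item on one side is covered by at least one trace link from the other. A minimal sketch with illustrative identifiers and link data:

```python
# Sketch: report items that no trace link covers, usable for any of the
# pairwise checks in objective 5.4.10. Identifiers are hypothetical.

def untraced(items, links):
    """items: identifiers that must be traced.
    links: iterable of (source, target) trace pairs.
    Return the items no link targets."""
    traced = {target for _, target in links}
    return sorted(set(items) - traced)

sw_reqs = ["SWR-1", "SWR-2", "SWR-3"]
tests_to_reqs = [("TC-10", "SWR-1"), ("TC-11", "SWR-3")]
print(untraced(sw_reqs, tests_to_reqs))  # ['SWR-2'] lacks verification evidence
```

Running the same function in both directions also exposes orphan links, ie evidence that traces to no requirement.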
5.4.11 Complexity measures
It shall 5.4.11.1 be demonstrated that the measured complexity is within the defined threshold by:
• analysing the measures, and
• applying corrective actions.
If a value exceeds the thresholds (to be defined), a justification shall 5.4.11.2 be provided.
Output: SSF1 - Part VII; SSF2 - Part II, IV, V
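As an illustration of objective 5.4.11, complexity can be measured per function and compared against a project-defined threshold. The sketch below approximates cyclomatic complexity as decision points plus one; the threshold value and the source under analysis are illustrative, since ED-153 leaves both to the project:

```python
# Sketch: approximate cyclomatic complexity per function (decision
# points + 1) and flag functions above a project-defined threshold.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic(func: ast.FunctionDef) -> int:
    return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(func))

def over_threshold(source: str, threshold: int):
    """Map each function whose complexity exceeds the threshold to its
    measured value; these are the cases needing justification."""
    tree = ast.parse(source)
    return {f.name: cyclomatic(f)
            for f in ast.walk(tree)
            if isinstance(f, ast.FunctionDef) and cyclomatic(f) > threshold}

src = """
def simple(x):
    return x + 1

def branchy(x):
    if x > 0:
        for i in range(x):
            if i % 2:
                x -= 1
    return x
"""
print(over_threshold(src, 3))  # {'branchy': 4}
```

A production measurement would use a qualified static-analysis tool; only the threshold-and-justification logic is sketched here.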
5.4.12 Verification of Verification process results (Annex A Section A.4.4)
Verification cases, procedures and results shall 5.4.12.1 be verified, so that:
• Verification procedures are correct and complete and discrepancies are justified;
• Verification results are correct and complete and discrepancies are justified;
• Verification of the software requirements verification cases, procedures and results is correct and complete and discrepancies are justified;
• Verification of the software design (architectural level) verification cases, procedures and results is correct and complete and discrepancies are justified;
• Verification of the software design (detailed design) verification cases, procedures and results is correct and complete and discrepancies are justified;
• Verification of the software integration verification cases, procedures and results is correct and complete and discrepancies are justified;
• Verification of the software data verification cases, procedures and results is correct and complete;
• Verification of the traceability verification procedures and results is correct and complete and discrepancies are justified.
TABLE 20: SOFTWARE VERIFICATION OBJECTIVES
NOTE: Verification may be performed through inspection, analysis, demonstration or a combination of them all throughout the lifecycle.
5.4.1 Other Verifications

5.4.13 Verification of Retrieval and Release process
The Software Retrieval and Release process shall 5.4.13.1 be verified.
Output: SSF1 - Part VII; SSF2 - Part I, II, V, VI
TABLE 21: OTHER VERIFICATION OBJECTIVES
5.5 VALIDATION PROCESS

The Validation Process is a process for determining whether the requirements and the final, as-built system or software product fulfil the specific intended use. Validation may be conducted in earlier stages. This process may be conducted as part of Software Acceptance Support.
This process may be executed with varying degrees of independence. The degree of
independence may range from the same person or different person in the same
organisation to a person in a different organisation with varying degrees of separation.
In the case where the process is executed by an organisation independent of the
supplier, developer, operator, or maintainer, it is called Independent Validation
Process (confirmation by examination and provision of objective evidence that the
particular requirements for a specific intended use are fulfilled; usually used for
internal validation of the design).
The ANSP is responsible for conducting validation, which is intended to show that the system meets its safety objectives in its operational environment. Validation concerns the equipment part of the system, and procedures are defined to perform this equipment validation.
Validation is out of scope of this document and therefore no objectives are defined
here.

5.6 JOINT REVIEW PROCESS

The Joint Review Process is a process for evaluating the status and products of an
activity of a project as appropriate. Joint reviews are at both project management and
technical levels and are held throughout the life of the contract. This process may be
employed by any two parties, where one party (reviewing party) reviews another party
(reviewed party).
To be unbiased, the Joint Review Process shall 5.6.1 have organisational freedom and authority from persons directly responsible for developing the software product or executing the process in the project.
5.6.1 Process implementation (Annex A Section A.4.6)
Periodic reviews shall 5.6.1.1 be held at predetermined milestones as specified in the project plan(s). The review results shall 5.6.1.2 be documented and distributed.
Output: SSF1 - Part VII; SSF2 - Part I

5.6.2 Project management reviews (Annex A Section A.4.6)
Project status shall 5.6.2.1 be evaluated relative to the applicable project plans, schedules, standards/rules, transition criteria and guidelines.
Output: SSF1 - Part VII; SSF2 - Part I

5.6.3 Technical reviews (Annex A Section A.4.6)
Technical reviews shall 5.6.3.1 be held to evaluate the software products or services under consideration.
Output: SSF1 - Part VII; SSF2 - Part I
TABLE 22: JOINT REVIEW PROCESS OBJECTIVES
5.7 AUDIT PROCESS

The Audit Process is a process for determining compliance with the requirements, plans, and contract as appropriate. This process may be
employed by any two parties, where one party (auditing party) audits the software products or activities of another party (audited party).
To be unbiased, the Audit Process shall 5.7.1 have organisational freedom and authority from persons directly responsible for developing the
software product or executing the process in the project.
5.7.1 Process implementation (Annex A Section A.4.7)
Audits shall 5.7.1.1 be held at predetermined milestones as specified in the project plan(s) or upon specific request.
After completing an audit, the audit results shall 5.7.1.2 be documented and provided to the audited party.
Output: SSF1 - Part VII; SSF2 - Part I

5.7.2 Audits at SW requirement level (Annex A Section A.4.7)
Audits at SW requirement level shall 5.7.2.1 be conducted at predetermined milestones. Audits shall 5.7.2.2 identify whether:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement) and user documentation complies with standards/rules as specified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.3 Audits down to SW design level (Annex A Section A.4.7)
Audits shall 5.7.3.1 be conducted at predetermined milestones or upon specific request to ensure that:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements and SW architecture requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement and SW architecture) and user documentation complies with standards/rules as specified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.4 Quality audits down to source code level (Annex A Section A.4.7)
Audits shall 5.7.4.1 be conducted at predetermined milestones and upon specific request to ensure that:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements, SW architectural requirements and SW detailed design requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement, SW architecture, SW detailed design and Source code) and user documentation complies with standards/rules as specified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.5 Quality audits down to executable level (Annex A Section A.4.7)
Audits shall 5.7.5.1 be conducted at predetermined milestones or upon specific request to ensure that:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements, SW architectural requirements and SW detailed design requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement, SW architecture, SW detailed design, Source code and executable) and user documentation complies with standards/rules as specified.
• SW development tools (eg compilers) are qualified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
TABLE 23: AUDIT PROCESS OBJECTIVES
5.8 PROBLEM/CHANGE RESOLUTION PROCESS

The Problem/Change Resolution Process is a process for analysing and resolving the
problems (including non-conformances), whatever their nature or source, that are
discovered during the execution of development, operation, maintenance, or other
processes. The objective is to provide a timely, responsible, and documented means
to ensure that all discovered problems are analysed and resolved and trends are
recognised.
The Problem/Change Resolution phase includes:
• Non-conformance reports (during all phases), which may or may not affect safety;
• Correction: a modification has to be performed in order to fix a reported problem;
• Prevention: a modification has to be performed because an analysis has concluded that the software behaviour could contribute to a safety-related event:
- either because the system safety assessment (at the system or at the software level) process review/update identified it where it had not been identified before;
- or because an operational report identified it though no safety occurrence happened;
• Evolution: a modification has to be performed because software has to be updated for technological reasons (eg change of hardware platform, software development tool version change, software development tool obsolescence).
5.8.1 Process implementation (Annex A Section A.4.8)
A problem resolution process (including the demonstration of the successful resolution of the problem) shall 5.8.1.1 be defined for handling all problems (including non-conformances) detected in the software products and processes/activities.
Note: The implementation of the process could, for example, include the establishment of a review board.
Output: SSF1 - Part V; SSF2 - Part I

5.8.2 Problem resolution (Annex A Section A.4.8)
When problems (including non-conformances) have been detected in a software product or an activity, a problem report shall 5.8.2.1 be prepared to describe each problem detected. The problem report shall 5.8.2.2 be used as part of a closed-loop process: from detection of the problem, through investigation, analysis and resolution of the problem and its cause, to trend detection across problems.
Output: SSF1 - Part V; SSF2 - Part I

5.8.3 Safety Impact
An analysis shall 5.8.3.1 be performed to:
• assess whether a problem report has a safety impact (risk assessment and mitigation, eg FHA/PSSA/SSA) or compromises the previous SWAL allocation;
• ensure that corrective actions exist such that safety-related problems can be shown to have been acceptably mitigated.
Output: SSF1 - Part V; SSF2 - Part I

5.8.4 Problem Report Configuration Management
Problem reports shall 5.8.4.1 be considered as software lifecycle data to be put under configuration management as defined in objective 5.2.1.
Output: SSF1 - Part V; SSF2 - Part I
TABLE 24: PROBLEM/CHANGE RESOLUTION PROCESS
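The closed-loop handling required by objective 5.8.2 (detection, then investigation, analysis and resolution, through to closure) can be enforced as a simple state machine over problem reports. The state names and allowed transitions below are illustrative, not mandated by ED-153:

```python
# Sketch: a closed-loop problem-report lifecycle as a state machine.
# Illegal jumps (eg closing a report straight from detection) are
# rejected, so every report passes through investigation and analysis.

TRANSITIONS = {
    "detected": {"under_investigation"},
    "under_investigation": {"analysed"},
    "analysed": {"resolved", "rejected"},
    "resolved": {"closed"},
    "rejected": {"closed"},
    "closed": set(),
}

def advance(state: str, new_state: str) -> str:
    """Move a problem report to new_state, enforcing the closed loop."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

s = "detected"
for step in ("under_investigation", "analysed", "resolved", "closed"):
    s = advance(s, step)
print(s)  # closed
```

Trend detection across problems, the other half of objective 5.8.2, would then operate on the archive of closed reports.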
CHAPTER 6

ORGANISATIONAL LIFECYCLE PROCESSES


6.0 INTRODUCTION

This chapter lists the objectives, per SWAL, that belong to organisational lifecycle
processes.
Organisational lifecycle processes consist of:
1. Management process:
a. Initiation and scope definition;
b. Planning;
c. Execution and control;
d. Review and evaluation;
e. Closure.
2. Infrastructure process:
a. Process implementation;
b. Establishment of the infrastructure;
c. Maintenance of the infrastructure.
3. Improvement process:
a. Process establishment;
b. Process assessment;
c. Process improvement.
4. Training process:
a. Process implementation;
b. Training material development;
c. Training plan implementation.
The objectives and tasks in an organisational process are the responsibility of the
organisation using that process. Depending on the lifecycle phase, different
organisations may be responsible for performing a process. Each organisation
ensures that the process is in existence and functional. Therefore the satisfaction of
these objectives can be achieved at the organisational level via documents (Safety
Policy, Quality Management System…) not specifically/exclusively addressing
software.

6.1 MANAGEMENT PROCESS

The Management Process contains the generic objectives and tasks, which may be
employed by any party that has to manage its respective processes. The manager is
responsible for product management, project management, and task management of
the applicable process(es), such as the acquisition, supply, development, operation,
maintenance, or supporting process.
6.1.1 Management Process implementation; Initiation & scope definition
A management process tailored to the project shall 6.1.1.1 be defined. The output of the management process shall 6.1.1.2 be documented and distributed.
The management process shall 6.1.1.3 be initiated by establishing the requirements of the process to be undertaken.
The manager shall 6.1.1.4 establish the feasibility of the process by checking that the resources (personnel, materials, technology, and environment) required to execute and manage the process are available, funded, adequate, and appropriate and that the time-scales to completion are achievable.
Output: SSF1 - Part III; SSF2 - Other

6.1.2 Planning
The manager shall 6.1.2.1 prepare the plans for execution of the process. The plans associated with the execution of the process shall 6.1.2.2 contain descriptions of the associated activities and tasks and identification of the software products that will be provided. These plans shall 6.1.2.3 include, as a minimum, the following:
• Schedules for the timely completion of tasks;
• Estimation of effort;
• Adequate resources needed to execute the tasks;
• Allocation of tasks (including who, what and when);
• Assignment of responsibilities;
• Quantification of project risks associated with the tasks or the process itself;
• Quality control measures to be employed throughout the process;
• Costs associated with the process execution;
• Provision of environment and infrastructure.
Output: SSF1 - Part III; SSF2 - Other
6.1.3 Execution & control
The manager shall 6.1.3.1 initiate the implementation of the plan to satisfy the objectives and criteria set, exercising control over the process.
The manager shall 6.1.3.2 monitor the execution of the process, providing both internal reporting of the process progress and external reporting to the acquirer as defined in the contract.
The manager shall 6.1.3.3 investigate, analyse, and resolve the problems discovered during the execution of the process.
Output: SSF1 - Part III; SSF2 - Other

6.1.4 Review & evaluation
The manager shall 6.1.4.1 ensure that the software lifecycle data is evaluated for satisfaction of requirements.
The manager shall 6.1.4.2 assess the evaluation results of the software products, activities, and tasks completed during the execution of the process vis-à-vis the achievement of the objectives and completion of the plans.
Output: SSF1 - Part III; SSF2 - Other

6.1.5 Closure
When all software products, activities, and tasks are completed, the manager shall 6.1.5.1 determine whether the process is complete, taking into account the criteria as specified in the contract or as part of the organisation's procedures.
The manager shall 6.1.5.2 check the results and records of the software products, activities, and tasks employed for completeness. These results and records shall 6.1.5.3 be archived in a suitable environment as specified in the contract.
Output: SSF - Part III
TABLE 25: MANAGEMENT PROCESS OBJECTIVES
6.2 INFRASTRUCTURE PROCESS

The infrastructure process is a process to establish and maintain the infrastructure needed for any other process. The infrastructure may
include hardware, software, tools, techniques, standards and facilities for development, operation or maintenance.

6.2.1 Process implementation
The infrastructure shall 6.2.1.1 be defined to meet the requirements of the processes (eg development or verification) employing this process, considering the applicable procedures, standards/rules, tools, and techniques.
The establishment of the infrastructure shall 6.2.1.2 be planned.
Output: SSF1 - Part I; SSF2 - Other

6.2.2 Establishment of the infrastructure
The configuration of the infrastructure shall 6.2.2.1 be planned. Functionality, performance, safety, security, availability, space requirements, equipment, costs, and time constraints shall 6.2.2.2 be considered.
Output: SSF1 - Part I; SSF2 - Other

6.2.3 Maintenance of the infrastructure
The infrastructure shall 6.2.3.1 be maintained, monitored, and modified as necessary to ensure that it continues to satisfy the requirements of the processes (eg development or verification) employing this process. As part of maintaining the infrastructure, the extent to which the infrastructure is under configuration management shall 6.2.3.2 be defined.
Output: SSF1 - Part VII; SSF2 - Other
TABLE 26: INFRASTRUCTURE PROCESS OBJECTIVES
6.3 IMPROVEMENT PROCESS

The Improvement Process is a process for establishing, assessing, measuring, controlling, and improving a software lifecycle process (as required by ISO 9001:2000).

6.3.1 Process implementation
The organisation shall 6.3.1.1 establish a suite of organisational processes for all software lifecycle processes as they apply to its business activities. The processes and their application to specific cases shall 6.3.1.2 be defined in the organisation's publications. As appropriate, a process control mechanism shall 6.3.1.3 be established to develop, monitor, control, and improve the process(es).
Output: SSF1 - Part III or VII; SSF2 - Other

6.3.2 Process assessment
A process assessment procedure shall 6.3.2.1 be defined and applied. Assessment records shall 6.3.2.2 be kept and maintained.
The organisation shall 6.3.2.3 plan and carry out reviews of the processes at appropriate intervals to ensure their continuing suitability and effectiveness in the light of assessment results.
Output: SSF1 - Part III or VII

6.3.3 Process improvement
The organisation shall 6.3.3.1 effect such improvements to its processes as it determines to be necessary as a result of process assessment and review. Process documentation shall 6.3.3.2 be updated to reflect improvement in the organisational processes.
Output: SSF1 - Part III or VII; SSF2 - Other
TABLE 27: IMPROVEMENT PROCESS OBJECTIVES


6.4 TRAINING PROCESS

The Training Process is a process for providing and maintaining trained personnel. The acquisition, supply, development, operation, and maintenance of software products are largely dependent upon knowledgeable and skilled personnel. For example, developer personnel should have essential training in software management and software engineering. It is, therefore, imperative that personnel training be planned and implemented early so that trained personnel are available as the software product is acquired, supplied, developed, operated, or maintained. Training includes the use and customisation of standards for a specific application.
In this section, training does not address the training of operational staff in charge of operating the software, but the training of staff in charge of developing or maintaining it.

Obj N° 6.4.1 – Process implementation
    A review of the project requirements shall 6.4.1.1 be conducted to establish and make timely provision for acquiring or developing the resources and skills required by the management and technical staff. The types and levels of training and categories of personnel needing training shall 6.4.1.2 be determined. A training plan, addressing implementation schedules, resource requirements, and training needs, shall 6.4.1.3 be defined.
    Output: SSF1 - Part VII; SSF2 - Other

Obj N° 6.4.2 – Training material development
    Training manuals, including presentation materials used in providing training, shall 6.4.2.1 be developed.
    Output: SSF1 - Part VII; SSF2 - Other

Obj N° 6.4.3 – Training plan implementation
    The training plan shall 6.4.3.1 be implemented to provide training to personnel. Training records shall 6.4.3.2 be maintained.
    Output: SSF1 - Part VII; SSF2 - Other

TABLE 28: TRAINING PROCESS OBJECTIVES


CHAPTER 7

ADDITIONAL ANS SOFTWARE LIFECYCLE OBJECTIVES


7.0 INTRODUCTION

This chapter lists the objectives, per SWAL, that do not belong to ISO/IEC 12207, but have been added due to:
- The analysis of other standards that are more safety oriented (ED-109/DO-278, ED-12B/DO-178B and IEC 61508),
- ANS particularities (some are included in ED-109/DO-278),
- Omissions by existing standards.
These additional lifecycle processes consist of:
1. Software development environment
a. Definition
b. Programming language
c. Compiler considerations
2. COTS
a. COTS plans;
b. COTS Transition criteria;
c. COTS Plan consistency;
d. COTS requirement coverage;
e. COTS Lifecycle data;
f. COTS Derived requirements;
g. COTS HW compatibility;
h. COTS Configuration Management;
i. COTS Problem Reporting;
j. COTS Incorporation;
k. COTS Configuration Management Archiving;
3. Tool qualification (Out of scope of these recommendations)
a. Qualification criteria for software development tools
b. Qualification criteria for software verification tools
4. Service experience


7.1 SOFTWARE DEVELOPMENT ENVIRONMENT

This section is addressed by objectives 4.3.11 and 4.3.19. It describes the Infrastructure Process (which is generic) for the Software Lifecycle environment concerning the Development Process.

7.2 COTS

COTS definition[1]:
COTS software encompasses a wide range of software, including purchased software,
Non-Developmental Items (NDI), and software previously developed without
consideration of ED-153. The term “Previously Developed Software” is also used for
such software. This software may or may not have been approved through other
“approval processes.” Partial data or no data may be available as evidence that the
objectives of the ANS development process have been satisfied. For the rest of this section, all such
software is referred to as COTS for the sake of brevity. This terminology was selected
because of the usual use of the term “COTS” within the “ground” ANS community.
Examples of COTS are operating systems, real-time kernels, graphical user
interfaces, communication and telecommunication protocols, language run-time
libraries, mathematical and low-level bit routines, and string manipulation routines.
COTS software can be purchased apart from or in conjunction with COTS hardware,
such as workstations, mainframes, communication and network equipment, or
hardware items (eg, memory, storage, I/O devices). There also may be some
instances where the use of COTS software is impractical to avoid (eg, library code
associated with certain compilers).
COTS deliverables vary by the contract with the COTS supplier. They may extend
from license rights, executable code, user documentation, and training to the full set of
COTS lifecycle data, including the source code resulting from the COTS development.
COTS information disclosure raises issues related to cost, protection of intellectual
property, and legal questions (eg ownership of the software, patents, liability, and
documentation responsibility). These aspects are beyond the scope of this standard,
which addresses only those aspects that are specific to software assurance.
Development processes used by COTS suppliers and procurement processes applied
by acquirers may not be equivalent to recommended processes, and may not be fully
consistent with the guidance of this document. The use of COTS may mean that
alternate methods are used to gain assurance that the appropriate objectives are
satisfied. These methods include, but are not limited to, product service experience,
prior assurance, process recognition, reverse engineering, restriction of functionality,
formal methods, and audits and inspections. Data may also be combined from more
than one method to gain assurance data that the objectives are satisfied.
In cases where insufficient data is available to satisfy the objectives, this section may
be used as guidance to adapt the SSAS (See Objective 3.0.13) with agreement from
the appropriate Approval Authority.

7.2.0 Scope of COTS Section

This section applies only to COTS used for ANS applications and is not intended to
alter or substitute any of the objectives applied to ANS software unless justified by a
safety assessment process and accepted by the appropriate Approval Authority. It
aims at supporting the customisation of the SSAS (See objective 3.0.13) with
agreement from the Approval Authority.
Where possible, each COTS section carries the same title as the process it addresses
with alternative objectives or activities (eg verification process). However, the title
does not always identify in a straightforward manner which process is alternatively
addressed; for example, the COTS acquisition process replaces the software requirements process.
NOTE: COTS (as defined here above) usage is not accepted for software having
to satisfy a SWAL1.


7.2.1 System Aspects Relating to COTS in ANS

COTS software may need to be integrated into high integrity ANS systems or
equipment. However, the higher the risk of the ANS function, the more demanding the
assurance requirements are for the system and the software. Alternate methods may
be used to augment design assurance data for COTS software components at a
desired assurance level. When COTS are used in an ANS system, considerations
such as planning, acquisition, and verification should be addressed.
Risk mitigation techniques may be considered to reduce the ANS system’s reliance on
the COTS. The goal of these mitigation techniques is to reduce the effect of
anomalous behaviour of COTS on the ANS system function. Risk mitigation
techniques may be achieved through a combination of people, procedure, equipment,
or architecture. For example, architectural means may involve partitioning,
redundancy, safety monitoring, COTS safe subsets by the use of encapsulation or
wrappers, and data integrity checking.
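As an illustrative sketch only (not part of this standard), the encapsulation and data integrity checking mentioned above might take the following shape. All names here (cots_altitude_filter, SAFE_RANGE_FT, the 100 ft rounding behaviour) are invented for the example.

```python
# Illustrative sketch: a "wrapper" that encapsulates a hypothetical COTS
# routine, restricting it to a safe subset of inputs and checking the
# integrity of its output before the ANS function relies on it.

SAFE_RANGE_FT = (-1000, 60000)  # assumed valid altitude range, in feet


def cots_altitude_filter(raw_ft):
    """Stand-in for an opaque COTS routine (invented behaviour)."""
    return round(raw_ft / 100) * 100


def wrapped_altitude_filter(raw_ft):
    lo, hi = SAFE_RANGE_FT
    # 1. Restrict the COTS input to the safe subset.
    if not lo <= raw_ft <= hi:
        raise ValueError(f"input {raw_ft} outside safe subset {SAFE_RANGE_FT}")
    out = cots_altitude_filter(raw_ft)
    # 2. Data integrity check on the COTS output.
    if not lo <= out <= hi:
        raise RuntimeError(f"COTS output {out} failed integrity check")
    return out
```

In a real system the wrapper itself would be ANS developmental software, subject to all of the objectives in this document.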

7.2.2 COTS Planning Process

The purpose of the COTS planning process is to co-ordinate lifecycle processes
specific to COTS and to define the methods and tools necessary for the incorporation
of COTS in ANS systems. The verification of the COTS planning process is to assure
that all issues regarding the use of COTS have been addressed. The ANS software
planning process should accommodate COTS software if its use is anticipated. The
COTS planning process includes planning for all aspects of COTS, including
acquisition, verification, configuration management, and software quality assurance.
As part of the approval process, early submittal of the results of the COTS
assessment and selection processes to the appropriate Approval Authority is
recommended.

7.2.2.0 COTS Planning Process Objectives

The objectives of the COTS planning process are:
- Activities for acquisition and integral processes, including additional considerations, integration, and maintenance, are defined;
- Transition criteria for these processes and transition criteria with respect to ANS lifecycle processes are defined; and
- Plans for COTS processes, including COTS transition criteria, are consistent with the ANS software plans.


Obj N° 7.2.1 – COTS Plans (SWAL1: N/A)
    A COTS acquisition, verification, configuration management, quality assurance plan (or plans) shall 7.2.1.1 be defined.
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.2 – COTS Transition Criteria (SWAL1: N/A)
    Transition criteria shall 7.2.2.1 be defined (according to the relationships between COTS processes and appropriate CNS/ATM lifecycle processes).
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.3 – COTS Plans Consistency (SWAL1: N/A)
    COTS plans shall 7.2.3.1 be consistent with ANS SW plans (plans for acquisition, evaluation, integration processes are consistent with ANS SW plans).
    Output: SSF1 - Part VII; SSF2 - Part II

TABLE 29: COTS PLANNING PROCESS OBJECTIVES


7.2.2.1 COTS Planning Process Activities

The activities associated with the COTS planning process are:
a. COTS planning activities should evaluate the level of applicability of the COTS
product to ANS requirements. The following considerations should be included
in the evaluation to determine the level of effort involved in the assurance of a
COTS product:
i. Product availability;
ii. Requirements (mapping of ANS requirements to COTS capabilities;
reference section 3.5 of this chapter);
iii. Availability of lifecycle data;
iv. Level of integration and extent of additional efforts, such as, glue code,
architecture mitigation techniques, etc. to allow incorporation of the
COTS into the ANS system;
v. Availability of applicable product service history or service experience;
vi. Supplier qualifications (eg the use of standards, service history and
length of service, technical support);
vii. Configuration control, including visibility of COTS supplier’s product
version control;
viii. Modification considerations. Modified COTS has additional
considerations of warranty, authority to modify, continued technical
support, etc., unless such modifications are allowed by the COTS
supplier. The modifications themselves should be considered a new
development. Change impact analysis should be performed to determine
the extent of the necessary re-verification;
ix. Maintenance issues (eg patches, retirement, obsolescence, and change
impact analysis);
x. Evidence of SQA activities;
xi. Verifiability of the COTS software (eg includes limitations, need for
special test facilities);
xii. Level of compliance with SWAL objectives; and
xiii. Information on COTS in-service problems and resolution of those
problems.
b. Relationships between the COTS planning process, the COTS acquisition
process, and the COTS integral processes should be defined. Additionally,
relationships between COTS processes and appropriate ANS lifecycle
processes should be defined. Every input to a process need not be complete
before that process can be initiated, if the transition criteria established for the
process are satisfied.
Reviews should be conducted to ensure:
i. The COTS planning process and the ANS planning process are
consistent;
ii. COTS transition criteria are compatible with the ANS transition criteria;
and
iii. Transition criteria are verified to assure that the outputs of each process
are sufficient to begin the next process.


NOTE: COTS usage may necessitate considerations of glue code, architectural
mitigation techniques, derived requirements, and COTS-specific
integration. Any supplemental software due to COTS integration in ANS
systems should be considered ANS developmental software for which all of
the objectives in this document apply.

7.2.3 COTS Acquisition Process

The focus of this section is on the assurance aspects of acquiring COTS. There are
additional acquisition considerations not described in this document. The COTS
acquisition process comprises requirements definition, assessment, and
selection.
a. Requirements Definition: The ANS software requirements definition process
identifies software requirements that COTS may satisfy. COTS may contain
more capabilities than the requirements needed by the ANS system. A
definition of these capabilities may be available from the supplier or derived
from the COTS user’s manuals, technical materials, product data, etc. In the
model depicted in Figure 5, the ANS requirements satisfied by COTS are the
intersection of COTS capabilities and ANS requirements.
Due to the use of COTS, there may be derived requirements (eg platform
dependent requirements, interrupt handling, interface handling, resource
requirements, usage constraints, error handling, partitioning) that should be
added to the ANS software requirements.
All ANS requirements satisfied by the COTS software and the resulting derived
requirements should be provided to the safety assessment process.

[Figure 5 depicts two overlapping sets, “COTS Capabilities” and “CNS/ATM Requirements”; their intersection is the “CNS/ATM Requirements satisfied by COTS”.]
FIGURE 5: REQUIREMENTS INTERSECTION
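The intersection model of Figure 5 can be sketched with simple set operations. This is an illustrative sketch only; the requirement and capability identifiers are invented for the example.

```python
# Illustrative sketch: the Figure 5 intersection expressed with sets.
ans_requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}
cots_capabilities = {"REQ-002", "REQ-003", "CAP-TIMER", "CAP-LOGGING"}

# ANS requirements satisfied by COTS = intersection of the two sets.
satisfied_by_cots = ans_requirements & cots_capabilities

# Requirements not covered by the COTS, to be satisfied by
# developmental software or another candidate product.
remaining = ans_requirements - cots_capabilities

# Unneeded COTS capabilities: these drive derived requirements to
# prevent them from adversely affecting the ANS system.
unneeded = cots_capabilities - ans_requirements
```

The same three sets also support the assessment step: comparing candidate COTS products amounts to comparing the size and content of each candidate's intersection with the ANS requirements.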

b. Assessment: Candidate COTS products should be assessed for their ability to
implement the ANS requirements, for the effect of their respective derived
requirements, and for their support of the needed assurance level. During the
COTS assessment process, more than one COTS candidate product may be
examined to determine the extent of intersection of COTS capabilities with the
ANS requirements as depicted in Figure 5. Availability and relevance of COTS
lifecycle data to support the appropriate assurance level should also be
assessed. Additionally, the impact of any unneeded COTS capabilities should
be assessed.


c. Selection: The selection is an iterative process based on results from the
assessment process and comparison of COTS suppliers (eg COTS supplier’s
experience in ANS, the ability of the COTS supplier to support COTS software
version control and maintenance over the expected lifetime of the ANS system,
COTS supplier’s commitment to keep the ANS applicants informed of detected
errors, COTS supplier’s willingness to address the issue of Escrow). Analyses
may be conducted to compare advantages of using COTS versus developing
the software.

7.2.3.0 COTS Acquisition Process Objectives

The objectives of the COTS acquisition process are:
a. To determine the degree to which the ANS software requirements are
satisfied by the COTS capabilities.
b. To determine the adequacy of lifecycle data available for assurance purposes.
c. To identify the derived requirements. Such requirements comprise:
i. Requirements imposed on the ANS system due to the usage of COTS.
ii. Requirements to prevent the unneeded capabilities of the COTS from
adversely affecting the ANS system.
d. To assure the compatibility of COTS with target hardware and other ANS
software.


Obj N° 7.2.4 – COTS Requirements Coverage (SWAL1: N/A)
    ANS SW requirements coverage achieved by the COTS shall 7.2.4.1 be demonstrated.
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.5 – COTS Lifecycle Data (SWAL1: N/A)
    COTS lifecycle data availability shall 7.2.5.1 be determined in accordance with SWAL (extent of lifecycle data that are available for assurance purposes).
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.6 – COTS Derived Requirements (SWAL1: N/A)
    Derived requirements shall 7.2.6.1 be defined (requirements imposed on the ANS system due to the usage of COTS or requirements to prevent the unneeded functions of the COTS from affecting the ANS system).
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.7 – COTS HW Compatibility (SWAL1: N/A)
    Compatibility of COTS with target computers shall 7.2.7.1 be demonstrated.
    Output: SSF1 - Part VII; SSF2 - Part II

TABLE 30: COTS ACQUISITION PROCESS OBJECTIVES


7.2.3.1 COTS Acquisition Process Activities

The activities of the COTS acquisition process are:
a. The COTS capabilities should be examined, and an analysis should be
conducted against the ANS requirements. The purpose of this analysis is to
determine the ANS requirements satisfied by COTS and to aid in the
comparison of candidate COTS products.
b. Available COTS software lifecycle data should be assessed. A gap analysis
should be performed against the objectives of this document for the proposed
software assurance level. This analysis aids in comparison of candidate COTS
products. This analysis is used to identify the objectives that are partially or fully
satisfied, and those that need to be addressed through alternate methods.
c. Analysis should be conducted to identify derived requirements. This analysis
should include all COTS software capabilities, both necessary and
unnecessary. Derived requirements may be classified as follows:
i. Requirements to prevent adverse effects of any unneeded functions of
any COTS software. This may result in requirements for isolation,
partitioning, wrapper code, coding directives, customization, etc.
ii. Requirements that the selected COTS may impose on the ANS system
including those for preventing adverse effects of needed COTS functions
(eg input formatting, call order, initialization, data conversion, resources,
range checking). This may result in requirements for interface code,
coding directives, architecture considerations, resource sizing, glue-code,
etc.
d. All ANS requirements satisfied by COTS software, the resulting derived
requirements, and any pertinent supplier-provided data should be provided to
the safety assessment process.
The selected COTS should be shown to be compatible with the target computer(s)
and interfacing systems.
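The gap analysis of activity b can be sketched as a simple classification of objectives by the COTS lifecycle data available for each. This is an illustrative sketch only; the objective identifiers reuse numbers from Table 30, but the evidence values assigned to them are invented.

```python
# Illustrative sketch: classifying assurance objectives by available
# COTS lifecycle data, as in the gap analysis of activity b.

evidence = {
    "7.2.4.1": "full",     # eg requirements coverage shown by supplier data
    "7.2.5.1": "partial",  # eg only user documentation delivered
    "7.2.7.1": "none",     # eg no target-compatibility data available
}


def gap_analysis(evidence):
    """Split objectives into those satisfied by available data, those
    partially satisfied, and those needing alternate methods (eg service
    experience, reverse engineering, audits and inspections)."""
    satisfied = [o for o, e in evidence.items() if e == "full"]
    partial = [o for o, e in evidence.items() if e == "partial"]
    alternate = [o for o, e in evidence.items() if e == "none"]
    return satisfied, partial, alternate
```

Such a classification also aids the comparison of candidate COTS products: the candidate with the shortest "alternate" list generally demands the least additional assurance effort.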

7.2.4 COTS Verification Process

The COTS verification process is an extension of the verification process discussed in
this document (section 5.4). In particular, the COTS acquisition process frequently
identifies verification objectives that cannot be satisfied using traditional means. For
those verification objectives where compliance cannot be demonstrated by the
available COTS data (eg design or requirements), additional activities, including
alternate methods such as reverse engineering, may be used after acceptance by the
Approval Authority.

7.2.4.0 COTS Verification Process Objectives

There are no additional verification objectives imposed upon the ANS system because
of use of COTS.

7.2.4.1 COTS Verification Process Activities

Typical verification activities for COTS software include:
a. Software reviews and analyses of ANS requirements satisfied by COTS,
b. Requirements-Based Testing (RBT) of ANS requirements satisfied by COTS,
c. Verification of development of any supplemental software due to COTS (eg glue
code, partitioning, wrappers), and
d. Verification of integration of COTS into the ANS system.


7.2.4.2 Alternative Methods for COTS

The use of alternate methods should satisfy both of the following conditions:
a. The safety assessment process supports the justification.
b. Acceptance is granted by the appropriate Approval Authority.
Activities used for specific alternate methods or for combination of alternate methods
are considered on a case-by-case basis. An example of activities associated with the
usage of service experience for assurance credit is provided below.

7.2.4.3 Use of Service Experience for Assurance Credit of COTS Software

Use of service experience data for assurance credit is predicated upon two factors:
sufficiency and relevance. Sufficient service experience data may be available
through the typical practice of running new ANS systems in parallel with operational
systems in the operational environment, long duration of simulation of new ANS
systems, and multiple shadow operations executing in parallel at many locations.
Relevant service experience data may be available for ANS systems from reuse of
COTS software from in-service ANS Systems, or ANS system verification and pre-
operational activities. For COTS software with no precedence in ANS applications,
many processes may be used to collect service experience; examples include the
validation process, the operator training process, the system qualification testing, the
system operational evaluation, and field demonstrations.
The following applies for accumulation of service experience:
a. The use, conditions of use, and results of COTS service experience should be
defined, assessed by the safety assessment process, and submitted to the
appropriate Approval Authority.
b. The COTS operating environment during service experience time should be
assessed to show relevance to the intended use in ANS. If the COTS operating
environment of the existing and intended applications differ, additional
verification should be performed to ensure that the COTS application and the
ANS applications will operate as intended in the target environment. It should
be assured that COTS capabilities to be used are exercised in all operational
modes. Analysis should also be performed to assure that relevant permutations
of input data are executed.
c. Any changes made to COTS during service experience time should be
analysed. An analysis should be conducted to determine whether the changes
made to COTS alter the applicability of the service experience data for the
period preceding the changes.
d. All in-service problems should be evaluated for their potential adverse effect on
ANS system operation. Any problem during service experience time, where
COTS implication is established and whose resulting effect on ANS operations
is not consistent with the safety assessment, should be recorded. Any such
problem should be considered a failure. A failure invalidates the use of related
service experience data for the period of service experience time preceding the
correction of that problem.
e. COTS capabilities which are not necessary to meet ANS requirements should
be shown to provide no adverse effect on ANS operations.
f. Service experience time should be the accumulated in-service hours. The
number of copies in service should be taken into account to calculate service
experience time, provided each copy and associated operating environment are
shown to be relevant, and that a single copy accounts for a certain pre-
negotiated percentage of the total.
NOTE: Software field data cannot be treated as if it were random hardware
failures.


Available COTS data may not be able to demonstrate satisfaction of all of the
verification objectives described in this document. For example, high-level
requirements testing for both robustness and normal operation may be demonstrated
for COTS but the same tests for low-level requirements may not be accomplished.
The use of service experience may be proposed to demonstrate satisfaction of these
verification objectives for COTS. The amount of service experience to be used is
selected based on engineering judgement and experience with the operation of ANS
systems. The results of software reliability models cannot be used to justify service
experience time. A possible approach for different assurance levels is provided
below:
1. Cannot be applied for SWAL1.
2. A minimum of one year (8,760 hours) of service experience with no failure for
SWAL2.
3. A minimum of six months (4,380 hours) of service experience with no failure for
SWAL3.
4. SWAL4 objectives are typically satisfied without a need for alternate methods.
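The accumulation rule of item f and the indicative thresholds above can be sketched as follows. This is an illustrative sketch only: the 50% single-copy cap stands in for the "pre-negotiated percentage", which the standard leaves to agreement with the Approval Authority, and the hour thresholds are the indicative values quoted in the text.

```python
# Illustrative sketch: accumulating failure-free service experience time
# across multiple relevant in-service copies and comparing it with the
# indicative per-SWAL thresholds given in the text.

SWAL_MIN_HOURS = {2: 8760, 3: 4380}  # SWAL2: one year; SWAL3: six months
MAX_SINGLE_COPY_SHARE = 0.5          # assumed pre-negotiated percentage


def service_experience_ok(copy_hours, swal):
    """copy_hours: failure-free in-service hours per relevant copy."""
    if swal == 1:
        return False  # service experience credit cannot be applied for SWAL1
    if swal == 4:
        return True   # SWAL4 objectives typically need no alternate methods
    total = sum(copy_hours)
    if total == 0 or max(copy_hours) / total > MAX_SINGLE_COPY_SHARE:
        return False  # one copy dominates beyond the negotiated share
    return total >= SWAL_MIN_HOURS[swal]
```

Note that, per item d, a failure during the period invalidates the hours preceding its correction, so `copy_hours` must already exclude any such invalidated time.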

7.2.5 COTS Configuration Management Process

This section describes the configuration management process for a system using
COTS. The configuration management system of the COTS supplier is not under the
control of the ANS configuration management system. The ANS configuration
management system should include control of the COTS versions.

7.2.5.0 COTS Configuration Management Process Objectives

The objectives of the COTS configuration management process are:
a. The COTS specific configuration and data items (for example, software,
documentation, adaptation data) are uniquely identified in the ANS software
configuration management system.
b. The ANS problem reporting includes the management of problems found in
COTS.
c. The ANS change control process ensures that the incorporation of COTS
releases is controlled.
d. COTS-specific configuration and data items are included in the ANS archive, retrieval,
and release.


Obj N° 7.2.8 – COTS Configuration Management: Identification (SWAL1: N/A)
    COTS configuration and data items shall 7.2.8.1 be identified.
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.9 – COTS Problem Reporting (SWAL1: N/A)
    COTS problem reporting shall 7.2.9.1 be established.
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.10 – COTS Incorporation (SWAL1: N/A)
    Incorporation of COTS releases shall 7.2.10.1 be controlled.
    Output: SSF1 - Part VII; SSF2 - Part II

Obj N° 7.2.11 – COTS Configuration Management: Archiving (SWAL1: N/A)
    COTS configuration and data items shall 7.2.11.1 be archived.
    Output: SSF1 - Part VII; SSF2 - Part II

TABLE 31: COTS CONFIGURATION MANAGEMENT PROCESS OBJECTIVES


7.2.5.1 COTS Configuration Management Process Activities

The activities associated with configuration management of COTS are:
a. An identification method should be established to ensure that the COTS
configuration and data items are uniquely identified.
NOTE: The identification method may be based on identification from the COTS
supplier and any additional data such as release or delivery date.
b. The ANS problem reporting should include management of problems found in
COTS, and a bi-directional problem reporting mechanism with the COTS
supplier should be established.
c. The ANS change control process for the incorporation of updated COTS
versions should be established.
d. An impact analysis of changes to the COTS baseline should be performed prior
to incorporation of new releases of COTS.
NOTE: The list of changes (problem fixes and new, changed, or deleted functions)
implemented in each new release may be available from the COTS
supplier.
e. The ANS archival, retrieval, and release should include COTS-specific
configuration and data items.
NOTE: Consideration may be given to technology obsolescence issues for
accessing archived data and escrow issues.
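The identification method of activity a can be sketched as follows. This is an illustrative sketch only: the field names, supplier, and product values are invented, and a real scheme would follow the ANS provider's own configuration management conventions.

```python
# Illustrative sketch: forming a unique configuration item identifier for
# a COTS product from the supplier's own identification plus delivery
# data, as the NOTE on activity a suggests.
from dataclasses import dataclass


@dataclass(frozen=True)
class CotsConfigItem:
    supplier: str
    product: str
    supplier_version: str   # identification from the COTS supplier
    delivery_date: str      # additional data, eg ISO date of delivery

    def ci_id(self):
        """Unique identifier within the ANS configuration management system."""
        return (f"{self.supplier}/{self.product}/"
                f"{self.supplier_version}/{self.delivery_date}")


item = CotsConfigItem("AcmeRT", "rt-kernel", "3.2.1", "2009-06-15")
```

Keeping the supplier's version string inside the identifier preserves traceability to the supplier's own baseline when new releases are assessed under activities c and d.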

7.2.6 COTS Quality Assurance

The ANS quality assurance process should also assess the COTS processes and
data outputs to obtain assurance that the objectives associated with COTS are
satisfied.
NOTE: It is recommended that the COTS supplier's quality assurance be co-
ordinated with the ANS quality assurance process where feasible.

7.3 TOOL QUALIFICATION

Qualification of a tool is needed when processes in this standard are eliminated,
reduced or automated by the use of a software tool without its output being verified as
specified in Verification Process. The use of software tools to automate activities of
the software lifecycle processes can help satisfy system safety objectives insofar as
they can enforce conformance with software development standards and use
automatic checks.
The objective of the tool qualification process is to ensure that the tool provides
confidence at least equivalent to that of the process(es) eliminated, reduced or
automated.
If partitioning of tool functions can be demonstrated, only those functions that are used
to eliminate, reduce, or automate software lifecycle process activities, and whose
outputs are not verified, need be qualified.
Only deterministic tools may be qualified, that is, tools which produce the same output
for the same input data when operating in the same environment. The tool
qualification process may be applied either to a single tool or to a collection of tools.


Software tools can be classified as one of two types:
Software development tools: Tools whose output is part of product software and
thus can introduce errors. For example, a tool, which generates Source Code
directly from requirements, would have to be qualified if the generated Source
Code is not verified as specified in Verification Process.
Whenever a software development tool is used and its output is not verified,
the software development tool shall 7.3.1 be developed to the level of
assurance of the most rigorous of its products.
If the output of a development tool is fully verified, then the tool does not have to
be qualified to be used.
Software verification tools: Tools that cannot introduce errors, but may fail to
detect them. For example, a static analyser, that automates a software
verification process activity, should be qualified if the function that it performs is
not verified by another activity. Type checkers, analysis tools and test tools are
other examples.
Whenever a software verification tool is used and its output is not verified, then
the software verification tool shall 7.3.2 be qualified as per ED-12B/DO-178B [7]
(chapter 12.2) and ED-94B/DO-248B [11].
If the output of a software verification tool is fully verified, then the tool does not
have to be qualified to be used.


CHAPTER 8

SOFTWARE SAFETY FOLDER


8.0 INTRODUCTION

This chapter proposes various structures for the documents and evidence that are
intended to provide assurance.
There can be various ways of organising a Software Safety Folder (SSF) depending
on its main usage or goal. This chapter proposes two different SSF structures satisfying two
different uses:
Option 1: Project-based structure
This option organises the SSF along the Software Project activities.
Option 2: Compliance-based structure
This option organises the SSF in order to support the demonstration of compliance
with the Safety Regulatory requirements.

[Figure 6 depicts the relationship between these concepts: a Software Assurance Level (SWAL) is satisfied by objectives, objectives are achieved through activities, activities produce evidence, and evidence gives confidence that the objectives are met. Chapter 8 addresses the evidence.]
FIGURE 6: SOFTWARE SAFETY FOLDER STRUCTURE

NOTE: The “Software Assurance Manual” (which is part of the Safety
Management System), or equivalent, is not part of the Software Safety
Folder as it is not software dependent.
A Software Safety Folder is dedicated to one and only one software product.
However, as several software products may be part of a system and share common items
(eg development tools, system description), the content of the Software Safety Folder
may be restricted to a reference to those shared items.
As the purpose of the Software Safety Folder is to structure the documents and
evidence that constitute it, objectives 3.5.X aim at recommending the structure
proposed hereafter.


8.1 SOFTWARE SAFETY FOLDER STRUCTURE – OPTION 1

Part Item title Reference



(Objective N°)
PART I: ENVIRONMENT
I System description 3.1.1; 4.1.1; 4.3.1;
I Operational environment 3.1.2;
I List of environment tools (CM, Development tools) 5.2.7; 6.2.1; 6.2.2; 7.1.1; 7.1.3;
PART II: SYSTEM SAFETY ASSESSMENT CONTEXT
II Regulatory framework 3.1.3;
II Applicable standards 3.1.4;
II Risk Assessment and Mitigation output 3.0.6; 4.1.2; 4.1.3;
II SW “FHA” & “PSSA” 3.1.5; 3.3.1; 3.3.2; 3.3.3; 3.3.4
PART III: SOFTWARE SAFETY ASSESSMENT PROCESS
III Plan for Software Safety Assessment 3.2.1; 3.2.2; 6.1.1; 6.1.2; 6.1.3; 6.1.5;
III Review of Plan for Software Safety Assessment 3.2.3; 6.1.4;
III List of Plan for Software Safety Assessment 3.2.4;
recipients
III SW safety assessment process V&V 3.3.1; 3.4.2; 3.4.3; 6.3.1; 6.3.2; 6.3.3;
III List of documents and the documentation process 5.1.1; 5.1.2; 5.1.3; 5.1.4;
PART IV: SAFETY REQUIREMENTS
IV (SW) Safety Requirements 3.0.6;
PART V: SW Modifications
V Change 3.0.12; 4.5.2; 4.5.4; 5.2.3; 5.2.4;
V Problem Resolution 4.5.2; 4.5.4; 5.2.3; 5.2.4; 5.8.1; 5.8.2;
5.8.3;
V Decommissioning 4.5.5;
PART VI: COTS
VI Lifecycle: Acquisition and integral process plans 7.2.1;
VI Lifecycle: Transition criteria 7.2.2;
VI Assurance: COTS plans consistency assurance 7.2.3;
VI Assurance of ANS requirements satisfaction by 7.2.4;
COTS
VI Assurance: Lifecycle data adequacy assurance 7.2.5;
VI Requirements: Derived requirements 7.2.6;
VI Assurance of COTS compatibility with target 7.2.7;
computers
VI Assurance of COTS configuration and data items 7.2.8;
identification
VI Modifications: problem reporting 7.2.9;
VI Assurance of COTS release incorporation control 7.2.10;

VI Assurance of COTS configuration and data items 7.2.11;
archive
PART VII: ASSURANCES
VII Tools assurance (development, CM, maintenance, 7.1.2, 7.1.4; 4.3.12; 4.3.17; 4.3.18;
….) 4.3.19; 5.2.8; 6.2.3;
VII Operation assurance 4.4.1; 4.4.2;
VII SWAL: SWAL Rigour variation, assurance, 3.0.8; 3.0.10; 3.0.11; 4.4.3;
monitoring
VII Audit, Joint review reports 5.6.1; 5.6.2; 5.6.3; 5.7.1; 5.7.2; 5.7.3;
5.7.4; 5.7.5
VII Training assurance 6.4.1; 6.4.2; 6.4.3;
VII Requirements completeness and correctness 3.0.2; 4.1.7; 4.2.6; 5.4.1; 5.4.2; 5.4.3;
assurance 5.4.4; 5.4.5; 5.4.6; 5.4.8; 5.4.9; 5.4.11;
VII Requirements Traceability assurance 3.0.3; 4.3.15; 5.4.10;
VII Unintended functions assurance 3.0.4;
VII Requirements satisfaction assurance 3.0.6; 3.4.4
VII Configuration Management Assurance 3.0.7; 5.2.1; 5.2.2; 5.2.4; 5.2.5; 5.2.6;
5.2.9; 5.2.10, 5.2.11, 5.4.13
VII Development assurance (includes SDP: SW 4.3.3; 4.3.4; 4.3.5; 4.3.6; 4.3.7; 4.3.8;
Development Plan) 4.3.9; 4.3.10; 4.3.11; 4.3.13; 4.3.14;
4.3.16; 4.3.20;
VII Maintenance assurance 4.5.1; 4.5.3;
VII Verification assurance (verification of the verification 5.4.1; 5.4.2; 5.4.7; 5.4.12;
process results)
VII Quality Assurance 5.3.1; 5.3.2; 5.3.3; 6.3.1; 6.3.2; 6.3.3;
VII Assurance that SW is acceptably safe 3.0.16; The argument and the SSF.
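For tooling around an Option 1 folder, the part/objective index above lends itself to a simple machine-readable form. The sketch below is hypothetical: the dictionary shows only an abbreviated extract of the table, and the names are this sketch's own.

```python
# Abbreviated, hypothetical extract of the Option 1 index: SSF part ->
# item title -> ED-153 objectives that the item's evidence addresses.
SSF_OPTION_1 = {
    "Part I: Environment": {
        "System description": ["3.1.1", "4.1.1", "4.3.1"],
        "Operational environment": ["3.1.2"],
    },
    "Part IV: Safety Requirements": {
        "(SW) Safety Requirements": ["3.0.6"],
    },
}

def objectives_covered(ssf):
    """Every objective number referenced somewhere in the folder index."""
    return {obj for items in ssf.values()
            for objs in items.values()
            for obj in objs}

def missing_objectives(ssf, required):
    """Objectives demanded (e.g. by the allocated SWAL) but not referenced."""
    return set(required) - objectives_covered(ssf)
```

With this index, `missing_objectives(SSF_OPTION_1, {"3.0.6", "3.0.7"})` would flag "3.0.7" as an objective for which the folder references no evidence item.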


8.2 SOFTWARE SAFETY FOLDER STRUCTURE – OPTION 2


Part   Item title (ESARR6 Requirement)   ESARR6 N°   EC Regulation 482/2008   Reference (Objective N°)
PART I: GENERAL SAFETY REQUIREMENTS 1 Article 3
I Within the framework of its Safety 1.1 Article 3.(1) 3.0.1; 3.0.15; 3.1.X; 3.2.X;
Management System, and as part of its 3.3.X; 3.4.X; 3.5.X
risk assessment and mitigation activities,
the ATM service-provider shall define and
implement a Software Safety Assurance
System to deal specifically with software
related aspects, including all on-line
software operational changes (such as
cutover/hot swapping).
I The ATM service-provider shall ensure, 1.2.a Article 3.0.2; 3.1.5; 3.3.1; 3.3.4;
as a minimum, within its Software Safety 3.(2).a 3.4.2; 4.1.2; 4.1.3; 4.3.1;
Assurance System, that: 4.3.2; 4.3.5; 4.3.9; 4.3.10;
a) The software requirements correctly 4.3.11; 4.3.12; 4.3.20;
state what is required by the software, in 3.3.2; 3.4.1; 4.3.4; 4.3.13;
order to meet safety objectives and 7.2.4; 3.3.3, 3.4.3; 4.3.14
requirements, as identified by the risk Chap. 3.6.3;
assessment and mitigation process;
I The ATM service-provider shall ensure, 1.2.b Article 3.0.3; 3.3.4; 3.4.1; 3.4.2;
as a minimum, within its Software Safety 3.(2).b 4.3.15; 5.4.10;
Assurance System, that:
b) Traceability is addressed in respect of
all software requirements;
I The ATM service-provider shall ensure, 1.2.c Article 3.0.4; 3.3.2; 3.3.3; 3.3.4;
as a minimum, within its Software Safety 3.(2).c 4.4.2; 5.4.3; 5.8.3;
Assurance System, that:
c) The software implementation contains
no functions which adversely affect safety;
I The ATM service-provider shall ensure, 1.2.d Article 3.0.6; 3.4.4; 4.1.7; 4.2.7;
as a minimum, within its Software Safety 3.(2).d 4.3.13; 4.3.14; 4.4.5; 4.5.3;
Assurance System, that: 5.4.X;
d) The ATM software satisfies its
requirements with a level of confidence
which is consistent with the criticality of
the software;
I The ATM service-provider shall ensure, 1.2.e Article 3.0.7; 5.2.X; 5.4.9; 5.4.13;
as a minimum, within its Software Safety 3.(2).e
Assurance System, that:
e) Assurances that the above General
Safety Requirements are satisfied, and
the arguments to demonstrate the
required assurance are at all times
derived from:
i. a known executable version of the
software,
ii. a known range of configuration data,
and
iii. a known set of software products and
descriptions (including specifications) that
have been used in the production of that
version.

I The ATM service-provider shall provide 1.3 Article 3.(3) 3.0.8; 3.0.3; 3.0.6; 3.0.7;
the required assurances, to the 3.4.3; 3.0.10; 3.0.11; 4.2.6;
Designated Authority, that the 4.5; 5.3.2; 5.3.1; 5.3.3;
requirements in section 1.2 above have 5.6.1; 5.6.2; 5.6.3; 5.7.1;
been satisfied. 5.7.X; 6.1.3;
PART II: REQUIREMENTS APPLYING TO THE 2 Article 4
SSAS
II The ATM service-provider shall ensure, 2.1 Article 4.(1) 3.2.1; 3.0.1; 3.2.2; 3.2.3;
as a minimum, that the Software Safety 3.2.4; 3.5.1; 3.5.2; 3.5.3;
Assurance System is documented, 4.1.4; 4.2.4; 5.1.1; 5.1.2;
specifically as part of the overall Risk 5.1.3; 5.1.4;
Assessment and Mitigation
Documentation.
II The ATM service-provider shall ensure, 2.2 Article 4.(2) 3.0.5; 3.0.11; 4.1.3; 4.4.5;
as a minimum, that the Software Safety 4.5.2;
Assurance System allocates software
assurance levels to all operational ATM
software.
II The ATM service-provider shall ensure, 2.3.a Article 3.0.12; 3.0.2; 4.3.4; 4.3.9;
as a minimum, that the Software Safety 4.(3).a 4.3.10; 4.3.11; 4.3.12;
Assurance System includes assurances 4.3.13; 4.3.14; 5.4.3; 5.4.5;
of software requirements validity, 5.4.6; 5.4.9; 5.7.2; 5.4.11;
II The ATM service-provider shall ensure, 2.3.b Article 4.1.6; 4.1.7; 4.2.6; 4.2.7;
as a minimum, that the Software Safety 4.(3).b 4.3.16; 3.0.6; 5.3.X; 5.4.2;
Assurance System includes assurances 5.4.X; 5.4.4; 5.4.5; 5.4.6;
of software verification, 5.4.7; 5.4.11; 5.6.X; 5.7.3;
5.7.X; 5.7.4;
II The ATM service-provider shall ensure, 2.3.c Article 5.2.1; 3.0.7; 5.2.X; 5.4.13
as a minimum, that the Software Safety 4.(3).c
Assurance System includes assurances
of software configuration management,
II The ATM service-provider shall ensure, 2.3.d Article 5.4.10; 3.0.3; 4.3.15;
as a minimum, that the Software Safety 4.(3).d 5.2.10;
Assurance System includes assurances
of software requirements traceability.
II The ATM service-provider shall ensure, 2.4.a Article 4.4.a 3.0.9; 4.1.4; + All the
as a minimum, that the Software Safety “shall” of this document.
Assurance System determines the rigour
to which the assurances are established.
The rigour shall be
defined for each software assurance level,
and shall increase as the software
increases in criticality. For this purpose:
a) the variation in rigour of the assurances
per software assurance level shall include
the following criteria:
i. required to be achieved with
independence,
ii. required to be achieved,
iii. not required,
II The ATM service-provider shall ensure, 2.4.b Article 4.4.b 3.0.10; 3.0.8; 3.0.11; 4.2.5;
as a minimum, that the Software Safety 4.2.6; 4.2.7; 4.1.4; 4.4.5;
Assurance System determines the rigour 4.5.2; 4.5.3;
to which the assurances are established.

The rigour shall be defined for each
software assurance level, and shall
increase as the software increases in
criticality. For this purpose:
b) the assurances corresponding to each
software assurance level shall give
sufficient confidence that the ATM
software can be operated tolerably safely.
II The ATM service-provider shall ensure, 2.5 Article 4.5 3.0.11; 4.4.X; 4.5.2; 4.5.3;
as a minimum, that the Software Safety 5.8.X;
Assurance System uses feedback of ATM
software experience to confirm that the
Software Safety
Assurance System and the assignment of
assurance levels are appropriate. For this
purpose, the effects resulting from any
software malfunction or failure from ATM
operational experience reported according
to ESARR 2, shall be assessed in respect
of their mapping to ESARR4.
II The ATM service-provider shall ensure, 2.6 Article 5.(1) 3.0.13; 7.2.1; 7.2.2; 7.2.3;
as a minimum, that the Software Safety Article 5.(2) 7.2.4; 7.2.5; 7.2.6; 7.2.7;
Assurance System provides the same 7.2.8; 7.2.9; 7.2.10; 7.2.11;
level of confidence, through any means Chap. 7.3 (COTS)
chosen and agreed with
the Designated Authority, for
developmental and non-developmental
ATM software (eg COTS - commercial off-
the-shelf software, etc) with the same
software assurance level.
PART III: REQUIREMENTS APPLYING TO THE 3 Annex I
SWAL
III The ATM service-provider shall ensure, 3.1 Annex I.(1) Chap. 2
as a minimum, within the Software Safety
Assurance System that the software
assurance level relates the rigour of the
software assurances to the criticality of
ATM software by using the ESARR 4
severity classification scheme combined
with the likelihood of a certain adverse
effect to occur. A minimum of four
software assurance levels shall be
identified, with software assurance level 1
indicating the most critical level.
III The ATM service-provider shall ensure, 3.2 Annex I.(2) Chap. 2
as a minimum, within the Software Safety
Assurance System, that an allocated
software assurance level shall be
commensurate with the most adverse
effect that software malfunctions or
failures may cause, as per ESARR 4. This
shall also take into account the risks
associated with software malfunctions or
failures and the architectural and/or
procedural defences identified.

III The ATM service-provider shall ensure, 3.3 Annex I.(3) Chap. 2.3.1; 3.0.14;
as a minimum, within the Software Safety
Assurance System, that ATM software
components that cannot be shown to be
independent of one another shall be
allocated the software assurance level of
the most critical of the dependent
components.
PART IV: REQUIREMENTS APPLYING TO THE 4 Annex II
SOFTWARE REQUIREMENTS VALIDITY Part A
ASSURANCES
IV The ATM service-provider shall ensure, 4.1 Annex II 4.3.4; 3.0.2; 3.1.1; 3.3.4;
as a minimum within the Software Safety Part A.(1) 4.3.9; 4.3.5; 4.3.6; 4.3.10;
Assurance System, that software 4.3.11; 5.4.3; 5.4.5; 5.4.6;
requirements specify the functional 4.3.12; 5.4.9; 4.3.13;
behaviour (nominal and downgraded 4.3.14; 4.3.20;
modes) of the ATM software, timing
performances, capacity, accuracy,
software resource usage on the target
hardware, robustness to abnormal
operating conditions and overload
tolerance, as appropriate.
IV The ATM service-provider shall ensure, 4.2 Annex II 4.3.4; 4.3.5; 4.3.6; 5.2.5;
as a minimum within the Software Safety Part A.(2) 3.0.2; 3.3.4; 3.4.X; 4.3.9;
Assurance System, that software 4.3.10; 4.3.11; 4.3.12;
requirements are complete and correct, 4.3.13; 4.3.14; 4.3.20;
and are also compliant with the system 5.4.3; 5.4.5; 5.4.6; 5.4.8;
safety requirements. 5.4.9; 6.1.4; 6.1.5;
PART V: REQUIREMENTS APPLYING TO THE 5 Annex II
SOFTWARE REQUIREMENTS VERIFICATION Part B
ASSURANCES
V The ATM service-provider shall ensure, 5.1 Annex II 4.4.5; 3.0.6; 5.4.1; 5.4.2;
as a minimum, within the Software Safety Part B.(1) 5.4.3; 5.3.X; 5.6.X;
Assurance System, that the functional
behaviour of the ATM software, timing
performances, capacity, accuracy,
software resource usage on the target
hardware, robustness to abnormal
operating conditions and overload
tolerance, comply with the software
requirements.
V The ATM service-provider shall ensure, 5.2 Annex II 4.1.7; 4.2.7; 3.0.6; 3.4.X;
as a minimum, within the Software Safety Part B.(2) 5.2.3; 5.2.5; 5.4.1; 5.4.2;
Assurance System, that the ATM software 5.4.3; 5.4.4; 5.4.5; 5.4.6;
is adequately verified by analysis and/or 5.4.7; 5.4.8; 5.4.9; 5.4.10;
testing and/or equivalent means, as 5.4.11; 5.6.X; 5.7.X;
agreed with Designated Authority.
V The ATM service-provider shall ensure, 5.3 Annex II 4.2.6; 4.2.7; 5.4.1; 5.4.2;
as a minimum, within the Software Safety Part B.(3) 3.0.6; 5.4.3; 3.4.X; 5.4.4;
Assurance System, that the verification of 5.4.5;
the ATM software is correct and complete. 5.4.6; 5.4.7; 5.4.8; 5.4.9;
5.4.10; 5.4.11; 5.4.12;
5.4.13; 5.5.7; 6.1.4; 6.1.5;

PART VI: REQUIREMENTS APPLYING TO THE 6 Annex II
SOFTWARE CONFIGURATION MANAGEMENT Part C
ASSURANCES
VI The ATM service-provider shall ensure, 6.1 Annex II 3.5.2; 3.0.7; 5.2.1; 5.2.2;
as a minimum, within the Software Safety Part C.(1) 5.2.3; 5.2.4; 5.2.7; 5.2.8;
Assurance System, that Configuration 5.2.9; 5.2.10; 5.8.4;
identification, traceability and status
accounting exist such that the software
lifecycle data can be shown to be under
configuration control throughout the ATM
software lifecycle.
VI The ATM service-provider shall ensure, 6.2 Annex II 4.2.5; 4.1.6; 4.4.5; 4.5.1;
as a minimum, within the Software Safety Part C.(2) 3.0.11; 3.0.12; 4.5.2; 4.5.3;
Assurance System, that Problem 4.5.4; 5.8.1; 5.8.2; 5.8.3;
reporting, tracking and corrective actions 5.8.4;
exist such that safety related problems
associated with the software can be
shown to have been mitigated.
VI The ATM service-provider shall ensure, 6.3 Annex II 5.2.6; 5.4.13
as a minimum, within the Software Safety Part C.(3)
Assurance System, that Retrieval and
release procedures exist such that the
software lifecycle data can be
regenerated and delivered throughout the
ATM software lifecycle.
PART VII: REQUIREMENTS APPLYING TO THE 7 Annex II
SOFTWARE REQUIREMENTS TRACEABILITY Part D
ASSURANCES
VII The ATM service-provider shall ensure, 7.1 Annex II 4.3.15; 3.0.3; 5.4.10;
as a minimum, within the Software Safety Part D.(1)
Assurance System, that Each software
requirement is traced to the same level of
design at which its satisfaction is
demonstrated.
VII The ATM service-provider shall ensure, 7.2 Annex II 4.3.15; 3.0.3; 5.4.10;
as a minimum, within the Software Safety Part D.(2)
Assurance System, that Each software
requirement, at each level in the design at
which its satisfaction is demonstrated, is
traced to a system requirement.
PART VIII: Other ESARR6 Requirements
8.1 Chap. 1.2
11.1 Chap 1.7
OTHER OBJECTIVES ADDITIONAL TO 4.1.5; 4.2.1; 4.2.2; 4.2.3;
ESARR6 4.2.7; 4.3.3; 4.3.17; 4.3.18;
4.3.19; 4.4.1; 4.4.3; 4.5.5;
6.1.1; 6.1.2; 6.2.1; 6.2.2;
6.2.3; 6.3.1; 6.4.1; 6.4.2;
6.4.3; 7.1.3;


8.3 ESARR6 AND EC REGULATION 482/2008 MAPPING

This table provides a mapping between ESARR6 and EC Regulation 482/2008
requirements.

Title                                              ESARR6 Chapter   EC Regulation 482/2008
General Safety Requirements                        1                Article 3
Requirements Applying to the Software Safety       2                Article 4
Assurance System
Requirements applying to changes to software       N/A              Article 5
and to specific software
Requirements Applying to the Software              3                Annex I
Assurance Level
Requirements Applying to the Software              4                Annex II Part A
Requirements Validity Assurances
Requirements Applying to the Software              5                Annex II Part B
Verification Assurances
Requirements Applying to the Software              6                Annex II Part C
Configuration Management Assurances
Requirements Applying to the Software              7                Annex II Part D
Requirements Traceability Assurances

TABLE 32: ESARR6 AND EC REGULATION 482/2008 MAPPING


ANNEX A

REFERENCE TO EXISTING SOFTWARE STANDARDS


A.1 PURPOSE

This annex re-uses the ISO/IEC 12207 process structure, because it has the widest
coverage of ANS needs (from definition to decommissioning). This annex does not
intend to promote any standard, nor to state that any standard best fits ANS needs
(even though ISO/IEC 12207 has been used as the basis for the process structure).
The purpose of this annex is as follows:
To provide a traceability matrix. For each listed ED-153 objective, a reference is
given to the paragraph of the standard that covers this objective.
To provide a compatibility matrix between standards, in order to identify
commonalities and differences between those standards. This enables
suppliers, ATS providers, regulators and any other organisation or group to
evaluate the characteristics of a system or equipment integrating software
without requiring the use of the standard recommended by their own
organisation. This compatibility matrix will allow all actors to “speak the same
language” when talking about software standards.
To provide a summary overview of the coverage of ED-153 objectives and
activities (‘title’ column in this annex) by each standard. Tables are used to give
a general view of whether or not ED-153 objectives are implemented, using the
following symbols:
- • (fully covered)
- P (partially covered)
- blank (not covered)
NOTE: ED-109/DO-278 traceability includes specific considerations due to the fact
that ED-109/DO-278 is not a stand-alone document2 as it is based on
ED-12B/RTCA/DO-178B.
To identify areas of improvement of existing standards, especially in view of
ANS particularities.
To identify objectives which have to be modified for ANS purposes.
NOTE: This annex does not provide a one-to-one mapping between ED-153
objectives and standard activities. Some ED-153 objectives are split into
several activities as denoted in the ‘title’ column.
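The three-symbol convention (•, P, blank) also supports simple tabulation of how well each standard covers the objective set. The sketch below is hypothetical: the example matrix values are illustrative, not taken from the tables, and the names are this sketch's own.

```python
# Symbols as defined above: • fully covered, P partially covered,
# blank (empty string) not covered.
FULL, PARTIAL, NONE = "\u2022", "P", ""

def coverage_summary(matrix):
    """Per standard, count objectives fully, partially and not covered.

    `matrix` maps an ED-153 objective number to a dict of
    {standard name: coverage symbol}.
    """
    summary = {}
    for cells in matrix.values():
        for standard, symbol in cells.items():
            counts = summary.setdefault(standard, {FULL: 0, PARTIAL: 0, NONE: 0})
            counts[symbol if symbol in (FULL, PARTIAL) else NONE] += 1
    return summary

# Illustrative (not actual) coverage values:
matrix = {
    "obj-1": {"ISO/IEC 12207": FULL, "ED-109/DO-278": PARTIAL, "CMMI": FULL},
    "obj-2": {"ISO/IEC 12207": FULL, "ED-109/DO-278": PARTIAL, "CMMI": NONE},
}
```

Such a tally gives, per standard, the number of fully covered, partially covered and uncovered objectives at a glance.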
Software Lifecycle Processes
The set of ANS software lifecycle processes is divided into:
A software safety assurance system,
Five primary processes,
Eight supporting processes,
Four organisational processes,
Additional ANS software lifecycle objectives.

2
See ED-109 chapter 1.3.


Specific interpretation and notation regarding the mapping to the CMMI model:
The CMMI is designed for any type of development or service, and there is no
specific safety “amplification” for safety-constrained development or services. So,
rather than pure traceability, the part of the tables related to the CMMI identifies a
mapping or relationship (full, partial or none).
“Mapping” stands for “same activity, but not systematically the same point of view nor
the same level of detail”, whereas “traceability” stands for “equivalent level of
requirement (same coverage, same level of detail)”.
The CMMI level should be considered when providing traceability between ED-153
objectives and CMMI Process Areas.
The references used are detailed as follows (where XXX is the acronym of a CMMI
Process Area):
XXX: mapping to the global Process Area XXX
XXX 1 (respectively XXX 1, 2): mapping to the set of practices related to
goal 1 (respectively to the set of goals 1 and 2) of the Process Area XXX
XXX 2.1 (respectively XXX 1.1, 2.1, 3.2): mapping to the Specific Practice 2.1
(respectively to the set of Specific Practices 1.1, 2.1 and 3.2) of the Process
Area XXX
GP 2.4 (respectively GP 2.4, 2.7): mapping to the Generic Practice 2.4
(respectively to the set of Generic Practices 2.4 and 2.7) for the whole set of
Process Areas
XXX GP 2.1 (respectively XXX GP 2.1, 2.7): mapping to the Generic Practice
2.1 (respectively to the set of Generic Practices 2.1 and 2.7) of the Process
Area XXX
NOTE: The safety extensions of CMMI have not been considered in this Annex.
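The reference notation above is regular enough to be read mechanically. The following sketch handles the simple forms listed; the parser, its "area"/"specific"/"generic" labels and the function name are this sketch's own, and the "&"-joined forms are deliberately not handled.

```python
import re

# Matches the notation forms above: "XXX", "XXX 1", "XXX 2.1",
# "XXX 1.1, 2.1, 3.2", "GP 2.4" and "XXX GP 2.1".
_REF = re.compile(
    r"^(?P<pa>(?!GP\b)[A-Za-z]+)?"  # Process Area acronym (never the GP keyword)
    r"\s*(?P<gp>GP\b)?"             # optional Generic Practice marker
    r"\s*(?P<nums>\d[\d., ]*)?$"    # goal/practice numbers, if any
)

def parse_cmmi_ref(text):
    """Split a CMMI reference into (process_area, kind, numbers).

    process_area is None for the "GP n.m" form that applies to the whole
    set of Process Areas.
    """
    m = _REF.match(text.strip())
    if not m or (m.group("pa") is None and m.group("gp") is None):
        raise ValueError(f"unrecognised reference: {text!r}")
    kind = ("generic" if m.group("gp")
            else "specific" if m.group("nums")
            else "area")
    nums = (m.group("nums") or "").replace(",", " ").split()
    return m.group("pa"), kind, nums
```

For instance, `parse_cmmi_ref("RD 1.1, 1.2")` yields `("RD", "specific", ["1.1", "1.2"])`, while `parse_cmmi_ref("GP 2.4")` yields `(None, "generic", ["2.4"])`.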

A.2 SOFTWARE SAFETY ASSURANCE SYSTEM

A.2.1 Software Safety Assurance System Objectives

The following table lists, per objective, references to related standards in support of
implementing a Software Safety Assurance System.


Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
3.0.1 Implementation P P P

Requirements P (Ref: 3.2 – • • •
3.0.2 Correctness and Table A-2 (lines (Ref: RD 1.1, 1.2,
Completeness (Ref: 5.3.4) (Ref: 5.1, 6.3.1) (Ref: 7.2.2)
1,2) Table A-3 ( 2.1)
lines 1, 2)
P
Requirements • • P
3.0.3 Traceability (Ref: 5.3.4.2; P
5.3.5.6; 5.3.6.7; (Ref: A3.6, A4.6, (Ref: ReqM 1.4)
Assurance (Ref: 5.5)
5.3.7.5) A5.6)

Unintended
• P P
3.0.4 (Ref: 3.6 Table A-5
Functions (Ref: 6.3.4.a) (Ref: 7.4.7.2)
line 1)
P P
3.0.5 SWAL Allocation
(Ref: 2.2.2, 2.2.3) (Ref: 7.5.2, 7.6.2)
Requirements
3.0.6 Satisfaction
• • •
Assurance (Ref: 2.1) (Ref: 5.1) (Ref: 7.2)
Configuration
• • • • •
3.0.7 Management (Ref: 3.8 Table A-
Assurance (Ref : 6.2) (Ref: 7) (Ref: 6.2.3) (Ref: CM)
8)

Assurance Rigour • • •
3.0.8 (Ref: 2.1, 9 &
Objective (Ref: 2.1) (Ref: Part 1 –7.4.2)
11.20)

3.0.9
Assurance Rigour • • •
Criteria (Ref: Chap 3) (Ref: Appendix A) (Ref: Appendix A)
• • •
3.0.10 SWAL Assurance (Ref: 3.10 Table
(Ref: 9 & 11.20) (Ref: 6.2.2)
A-10 ; 5.1)

SWAL Monitoring: P
3.0.11
Assurance (Ref: 4.1.6.3)
SWAL Monitoring: P
3.0.11
Feedback (Ref: 4.1.6.3)
SWAL Monitoring: P
3.0.11
Event assessment (Ref: 4.1.6.3)

3.0.12
Software P •
Modifications (Ref: 4.1.4.2) (Ref: 7. 8)

3.0.13
Equivalent •
Confidence (Ref: 4.1, 4.2)

3.0.13
Equivalent •
Confidence means (Ref: 4.1, 4.2)

• • •
3.0.14 Isolation (Ref: Chap 2.2.3,
(Ref: Chap 2.2.1) (Ref: Chap
2.3.1)
SW-related P
3.0.15
aspects (Ref: Chap 2.2.5)

3.0.16
Demonstration to • •
NSA (Ref: Chap 3.10) (Ref: Chap 9)
P
Argument P (Ref: Chap 6.3.1, P
3.0.17
Production (Ref: Chap 3) 6.3.3, 6.3.4, 8.3, (Ref: Part 3-7, 3-8)
10.2, 11.20, 12.1)


A.2.2 Software Assurance Level

See chapters 1-8 of ED-153.

A.2.3 Software Safety Assessment

See chapter 2.1 of ED-153.

A.2.3.1 Software Safety Assessment Initiation

See chapter 3.1 of ED-153.

Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
P
[Ref:
System • • • a) RD 1.1
3.1.1 b) RD 3.1, TS 1.2
Description (Ref: 2.2) (Ref: 2.1) (Ref: I-7.2.1)
c) RD 3.2

e) RD 2.3, TS 2.3]
Operational P P • P
3.1.2
Environment (Ref: 2.2) (Ref: 2.1.1) (Ref: I-7.2.1) (Ref: RD 1.1)


Regulatory (Ref: 3.10 Table A- •
3.1.3 (Ref: 2.1.1, 9, 10)
Framework 10 line 2 (Ref: I-7.2.2.4)
- 5.1)
Applicable P • • • P
3.1.4
Standards Standard itself Standard itself Standard itself Standard itself Standard itself
Risk Assessment P P P
3.1.5 and Mitigation
Process Output (Ref: 2.2) (Ref: 2.1.1) (Ref: I-7)


A.2.3.2 Software Safety Assessment Planning

See chapter 3.2 of ED-153.

Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
Software Safety
• • •
3.2.1 Assessment
Approach (Ref: Section 5.1) (Ref: 11.1) (Ref: 8)


Software Safety P P
3.2.2 (Ref: 5.1 - 3.10
Assessment Plan (Ref: 11) (Ref: I-7.8)
Table A-10)
Software Safety •

3.2.3 Assessment Plan (Ref: 5.1 - 3.10
Review (Ref: 9, 10)
Table A-10)
Software Safety P •
3.2.4 Assessment Plan
Dissemination (Ref: 5.1) (Ref: 9, 10)


A.2.3.3 Software Requirements Specification

See chapter 3.3 of ED-153.

Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
Failure P •
3.3.1 Identification:
analysis (Ref: 2.2) (Ref: I-7.4)
Failure
• •
3.3.1 Identification:
modes (Ref: 2.2.2) (Ref: I-7.4)

Failure Effects P •
3.3.2
evaluation (Ref: 2.2.1) (Ref: I-7.4)
Failure Hazard • •
3.3.2
Identification (Ref: 2.2) (Ref: I-7.4)
Re-Assessment of P •
3.3.3
Risk (Ref: 2.2.1) (Ref: I-7.5)
Software P
Requirements: [Ref:
P •
3.3.4 Compliance with a.1) RD 2.1, 2.2
System Safety (Ref: 2.2.1) (Ref: I-7.6)
a.2) TS 2.1, Ver
Objectives 1.1, 2.2, 2.3 ]

NOTE: Column ED-12B/DO-178B – These tasks are identified as partially met by ED-12B/DO-178B because section 2 of that document
compensates for the lack of a system safety standard (namely ARP4754/4761, which was elaborated after ED-12B/DO-178B).


A.2.3.4 Software Safety Assessment Validation, Verification And Process Assurance

See chapter 3.4 of ED-153.

Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
• P
Software Safety (Ref: 3.3 Table A- [Ref:
P •
3.4.1 Assessment 3 lines 1, 2, 3, 4, 5; a) RD 3.3, 3.4, 3.5
Validation (Ref: 6.4; 6.5) (Ref: 6.3)
3.4 Table A-4 lines Ver 2.1, 2.2, 2.3
1, 2, 6) b) -]
Software Safety
• •
3.4.2 Assessment
Verification (Ref: 2.1) (Ref: 2.2.2)

Software Safety

Assessment P •
3.4.3 (Ref: 3.9 Table A-
Process (Ref: 6.4) (Ref: 8)
Assurance 9)


A.2.3.5 Software Safety Assessment Completion

See chapter 3.5 of ED-153.

Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
Document P
Software Safety P P P P
3.5.1 (Ref: I-7.2.2.6,I-
Assessment (Ref: 6.1) (Ref: 5) (Ref: 11, Annex A)
Process Results 7.3.2.5, I-7.4.2.11 )
Software Safety
Assessment P P P P P
3.5.2 Documentation
Configuration (Ref: 6.2) (Ref: 3.8 - 4.1.7) (Ref: 7.3, Annex A) (Ref: I-7.4.2.12) (Ref: CM 1.1)
Management
Software Safety P P
Assessment P
3.5.3 (Ref: 5.1 - 3.10 (Ref:
Documentation (Ref: 9, 10)
Dissemination Table A-10) GP2.7)


A.3 PRIMARY LIFECYCLE PROCESSES

See chapter 4 of ED-153.

A.3.1 Acquisition Process

See chapter 4.1 of ED-153.


Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI

[Ref:
all) SAM 2.1;
Initiation: System • a) TS 2.4;
4.1.1 b,c )RD1.2,2.1,
Requirements (Ref: 5.1.1)
ReqM1.4;
d) SAM 1.1, GP
2.2, 3.1;
ISM GP 2.2, 3.1]

[Ref:
all) SAM 2.1;
Initiation: System a) TS 2.4;

4.1.1 Requirements b,c )RD1.2,2.1,
contents (Ref: 5.1.1)
ReqM1.4;
d) SAM 1.1, GP
2.2, 3.1;
ISM GP 2.2, 3.1]

[Ref:
all) SAM 2.1;
Initiation: • a) TS 2.4;
4.1.1 b,c )RD1.2,2.1,
Acquisition Plan (Ref: 5.1.1)
ReqM1.4;
d) SAM 1.1, GP
2.2, 3.1;
ISM GP 2.2, 3.1]

Risk Assessment P
and Mitigation
4.1.2 (Ref: Part 1-7.2, 1-
Process – Safety
determination 7.3)
Risk Assessment P
and Mitigation
4.1.2 (Ref: Part 1-7.4, 1-
Process – Hazard
Analysis 7.5)
Risk Assessment
and Mitigation P P P
4.1.3
Process – Safety (Ref: 2) (Ref: 2) (Ref: Part 1-7.6)
Objectives
Risk Assessment
and Mitigation P P P
4.1.3
Process – Safety (Ref: 2) (Ref: 2) (Ref: Part 1-7.6)
Requirements
Determination of
• P
4.1.4 Applicability of ED-
153 (Ref: 5.1.2) (Ref: SAM 1.2)

• •
4.1.5 Tender Selection
(Ref: 5.1.3) (Ref: SAM 1.2)
Supplier • •
4.1.6
monitoring (Ref: 5.1.4) (Ref: SAM 2.2)
Acceptance and
completion: • •
4.1.7
Software (Ref: 5.1.5) (Ref: SAM 2.3)
Acceptance
Acceptance and
• •
4.1.7 completion: Test
Preparation (Ref: 5.1.5) (Ref: SAM 2.3)

Acceptance and
completion: • •
4.1.7
Supplier (Ref: 5.1.5) (Ref: SAM 2.3)
involvement
Acceptance and
• •
4.1.7 completion:
Acceptance review (Ref: 5.1.5) (Ref: SAM 2.3)

NOTE: To simplify, and as the purpose of this document is to describe the objectives related to the software lifecycle, it has been considered
that the acquirer performs the PSSA (Preliminary System Safety Assessment). Even if, in a real project, this step may be performed
in cooperation with the system supplier, it remains the acquirer's responsibility to validate and accept it. As this document focuses
on software-related objectives, the main purpose of the PSSA is to allocate an Assurance Level to the software, which has to
remain under the acquirer's ultimate responsibility (at least by validating it, when not allocating it).
NOTE: This document intends to address the software aspects of the SSA (System Safety Assessment: the third step of the Safety
Assessment Methodology).


A.3.2 Supply Process

See chapter 4.2 of ED-153.


Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI

Requirements •
4.2.1 (Ref: RD 1.1, 1.2,
Review (Ref: 5.2.1)
3.3, 3.4)
Preparation of • •
4.2.2
response (Ref: 5.2.2) (Ref: ReqM 1.1)
• •
4.2.3 Contract
(Ref: 5.2.3 (Ref: ReqM 1.2)

Planning: Software • P P P (Ref:
4.2.4
Lifecycle Model (Ref: 5.2.4) (Ref: 3.1) (Ref: 4) (Ref: Part 1-6) PP 1.3,
2.7)

Planning: Relevant • P P P (Ref:
4.2.4
Standards (Ref: 5.2.4) (Ref: 3.1) (Ref: 4) (Ref: Part 1-6) PP 1.3,
2.7)

Planning: Project • P P P (Ref:
4.2.4
Management Plan (Ref: 5.2.4) (Ref: 3.1) (Ref: 4) (Ref: Part 1-6) PP 1.3,
2.7)
Execution & • •
• P P
4.2.5 control: Project (Ref: 3.9, Table A- (Ref: PMC 1
Management Plan (Ref: 5.2.5) (Ref: 4.6) (Ref: Part 1-6.2.2)
9) SAM 2.2)
• •
Execution & • P P
4.2.5 (Ref: 3.9, Table A- (Ref: PMC 1
control: Monitoring (Ref: 5.2.5.3) (Ref: 4.6) (Ref: Part 1-6.2.2)
9) SAM 2.2)
Review &
• P •
4.2.6 evaluation:
Co-ordination (Ref: 5.2.6, 5.3.13) (Ref: Part 1-6.2) [Ref:

a) PMC 1.5, 1.6,
1.7
b) PPQA]
(Ref: PI 3.4
Ver
Val)

[Ref:
Review & a) PMC 1.5, 1.6,
• P 1.7
4.2.6 evaluation: Quality
Assurance (Ref: 5.2.6, 6.3) (Ref: Part 1-6.2) b) PPQA]
(Ref: PI 3.4
Ver
Val)
• •
Delivery & P
4.2.7 (Ref: 5.2.7, (Ref: PI 3.4, SAM
completion (Ref: Part 1-6.2)
5.3.13.2, 5.3.13.3) 2.4))

NOTE: Since ANS systems may operate continuously and may have been in operation for many years, the software lifecycle plans for
these systems should include processes for software changes, technology upgrades, etc., specifically with respect to safety issues.


A.3.3 Development Process

See chapter 4.3 of ED-153.


Obj    Title    ISO/IEC 12207    ED-109/DO-278    ED-12B/DO-178B    IEC 61508    CMMI
System • •
• P •
4.3.1 Requirements (Ref: Part 1-7.6, (Ref: RD 2.1, 2.2,
Analysis (Ref: 5.3.2) (Ref: 2.2) (Ref: 2.1)
Part 2-7.2, 2-7.9) 2.3, 3.2)
System P •
• P P
4.3.2 Architectural (Ref: Part 2-7.4) (Ref: RD 2.2,
Design (Ref: 5.3.3) (Ref: 2.1) (Ref: 2.3)
REQM 1.4)
• • P •
(Ref: 3.1 Table A- (Ref: 3, 4, 11.2) (Ref: Part [Ref:
Process • 1 lines 1 to 7 for 3-7.1.2.1, a) PP 1.3
4.3.3
Implementation (Ref: 5.3.1, 6.6) COTS; 3-7.4.1.3, b) GP 2.2, 3.1, PP
4.1.9 Table A-10 Part 1-7) 2
lines 1, 2, 3) c) PP 2.4]
• • •
Software • (Ref: 3.1 Table A- (Ref: 11.1, 11.2) (Ref: PP 1.1, 1.3)
4.3.3
Development Plan (Ref: 5.3.1.4) 1, Lines 1, 5, 7;
4.1.4; 4.1.9 line 3)
P • •
• (Ref: 5.4) (Ref: Part 3-7.5, (Ref: PI 1.3, 3.2,
4.3.3 System Integration
(Ref: 5.3.10) Part 2-7.5) 3.3
VER)
• •
SW Requirements: • • •
4.3.4 (Ref: 3.2 Table A-2 (Ref: 5.1, 11.6,
Establishment (Ref: 5.3.4) (Ref: Part 3-7.2) (Ref: RD 2.1, 2.3)
line 1) 11.9)
• •
SW Requirements: • • •
4.3.4 (Ref: 3.2 Table A-2 (Ref: 5.1, 11.6,
Specification (Ref: 5.3.4) (Ref: Part 3-7.2) (Ref: RD 2.1, 2.3)
line 1) 11.9)
SW Requirements: P
4.3.4
Algorithms (Ref: 5.1, 11.10)

• •
SW Architectural • • •
4.3.5 (Ref: 3.2 Table A-2 (Ref: 5.2, 11.7,
Design (Ref: 5.3.5) (Ref: Part 3-7.4.3) (Ref: TS 2.1, 2.2)
line 3) 11.10)
SW Detailed • •
• • •
4.3.6 Design: Design (Ref: 3.2 Table A-2 (Ref: 5.2, 11.7,
and Interfaces (Ref: 5.3.6) (Ref: Part 3-7.4.5) (Ref: TS 3.1)
lines 1, 2) 11.10)

SW Detailed (Ref: 3.5 Table A-5
• • • •
4.3.6 Design: Test lines 1, 2;
Definition (Ref: 5.3.6) (Ref: 5.2, 11.10) (Ref: Part 3-7.4.7) (Ref: TS 3.1, VER)
3.6 Table A-6 lines
1, 2, 3, 4)

SW Integration: • • • •
4.3.7 (Ref: 3.1 Table A-1
Development (Ref: 5.3.8) (Ref: 5.4) (Ref: Part 3-7.4.8) (Ref: PI 1.1, 1.3)
line 1)

SW Integration: • • • •
4.3.7 (Ref: 3.1 Table A-1
Plan Contents (Ref: 5.3.8) (Ref: 5.4) (Ref: Part 3-7.4.8) (Ref: PI 1.1, 1.3)
line 1)
P
SW Integration: • • •
4.3.7 (Ref: 3.1 Table A-1
Documentation (Ref: 5.3.8) (Ref: 5.4) (Ref: Part 3-7.4.8)
line 1)

(Ref:
Part 1-7.9.1.1, 1-
SW Installation: • 7.9.2.1, •
4.3.8
Plan (Ref: 5.3.12) 1-7.9.2.3, (Ref: PP 2, PI 1)
1-7.13.1.1
1-7.13.2.1,
1-7.13.2.2)
SW Installation:
• •
4.3.8 Resources
Documentation (Ref: 5.3.12) (Ref: PP 2, PI 1)


A.3.3.1 Process Implementation


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B

• (Ref: Part 3-
• • •
4.3.3 Lifecycle Definition (Ref 3.1 Table A-1 7.1.2.1, 3-7.1.2.3,
(Ref: 5.3.1.1, 6.6) (Ref: 3) 3-7.1.2.6 (Ref: PP 1.3, 2.1)
line 3)
=> 3-7.1.2.5)
• •
Outputs • • •
4.3.3 (Ref 3.1 Table A-1 (Ref: Part
Documentation (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
line 1) 3-7.1.2.7)

(Ref 3.8 Table A-8 • •
Outputs
• lines 1 to 6; •
4.3.3 Configuration (Ref: Part (Ref: CM,
Management (Ref: 5.3.1.2.b) For COTS: 4.1.7, (Ref: 4.3)
3-7.1.2.8) GP 2.6)
4.1.9 Table 4-3
lines 1 to 4)


Software Products • P (Ref: PMC 2.1, 2.2,
4.3.3 (Ref 3.8 Table A-8
Problems (Ref: 5.3.1.2.c, 6.8) (Ref: 6.3) 2.3
line 3)
CM 2.1, 2.2)

• (Ref: REQM 1.1
Support Process • P • PP 2.3, PMC 1.4,
4.3.3 (Ref 3.9 Table A-9
Compliance (Ref: 5.3.1.2.d) (Ref: 5.1.2, 5.2.2) (Ref: Part 3-7.1.1) PPQA, PMC 1.6,
line 1)
1.7, 2 VER,
VAL GP 2.9)

• • P (Ref: PP 2, PMC

4.3.3 Development Plan (Ref 3.1 Table A-1 (Ref: 2.2, 4.1, 4.2, (Ref: Part 1.1
(Ref: 5.3.1.4)
line 1) 4.3, 11.2) 3-7.1.2.2) IPM 1.1, 1.3, 1.4
GP 2.2, 2.3, 3.1)


Non-Deliverable •
4.3.3 (Ref 3.1 Table A-1
Items (Ref: 5.3.1.5)
line 4)
Environment •
• •
Definition: • • (Ref: PP 2.4
4.3.9 (Ref 3.1. Table A-1 (Ref: Part
Development of (Ref: 5.3.1.3) (Ref: 4.4, 4.5) IPM 1.1
plans line 3) 3-7.1.2.6)
GP 2.2, 3.1)
• •
Environment •
• • (Ref: Part 3 (Ref: PP 2.4
4.3.9 Definition: Content (Ref 3.1 Table A-1
of plans (Ref: 5.3.1.3) (Ref: 4.4, 4.5) Annexes A&B, IPM 1.1
line 3)
3-7.4.4.2) GP 2.2, 2.3, 3.1)
Environment •

Definition: • • (Ref: PP 2.4
4.3.9 (Ref 3.1 Table A-1
Documentation of (Ref: 5.3.1.3) (Ref: 4.4, 4.5) IPM 1.1
plans line 3)
GP 2.2)
Development

Standards: • • P P
4.3.9 (Ref 3.1 Table A-1
Development of (Ref: 5.3.1.4) (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
plans line 5)

Development •
• P P
4.3.9 Standards: (Ref 3.1 Table A-1
Content of plans (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
line 5)
Development

Standards: • P P
4.3.9 (Ref 3.1 Table A-1
Documentation of (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
plans line 5)

• •

Environment • • (Ref: Part (Ref: PP 2.4
4.3.10 (Ref 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)


Development • • P P
4.3.10 (Ref 3.1 Table A-1
Standards (Ref: 5.3.1.4) (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
line 5)
• •

Environment • • (Ref: Part (Ref: PP 2.4
4.3.11 (Ref 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)

Development • • P P
4.3.11 (Ref 3.1 Table A-1
Standards (Ref: 5.3.1.4) (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
line 5)
• •

Environment • • (Ref: Part (Ref: PP 2.4
4.3.12 (Ref 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
• •
P
Environment • • (Ref: Part (Ref: PP 2.4
4.3.14 (Ref: 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
Outputs
Documentation: • •
• • •
4.3.15 System and (Ref 3.1 Table A-1 (Ref: Part
Software (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
line 1) 3-7.1.2.7)
Traceability
Outputs
• •
Documentation: • • •
4.3.15 (Ref 3.1 Table A-1 (Ref: Part
Requirements and (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
Design Traceability line 1) 3-7.1.2.7)
Outputs
• •
Documentation: • • •
4.3.15 (Ref 3.1 Table A-1 (Ref: Part
Architecture and (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
Code Traceability line 1) 3-7.1.2.7)

Outputs
Documentation: • •
• • •
4.3.15 Code and (Ref 3.1 Table A-1 (Ref: Part
Executable (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
line 1) 3-7.1.2.7)
Traceability
Lifecycle • •
Definition: • • •
4.3.16 (Ref 3.1 (Ref: Part 3-7.1.2.1
Lifecycle (Ref: 5.3.1.1) (Ref: 3) (Ref: PP 1.3, 2.1)
Processes Table A-1 line 3) => 3-7.1.2.5)

Lifecycle •

Definition: • • (Ref: Part •
4.3.16 (Ref 3.1
Lifecycle (Ref: 5.3.1.1) (Ref: 3) 3-7.1.2.1, 3-7.1.2.3 (Ref: PP 1.3, 2.1)
Information Table A-1 line 3)
=> 3-7.1.2.5)
Lifecycle • •
• • •
4.3.16 Definition: (Ref 3.1 (Ref: Part 3-7.1.2.3
Transition Criteria (Ref: 5.3.1.1) (Ref: 3) (Ref: PP 1.3, 2.1)
Table A-1 line 3) => 3-7.1.2.5)

• • P (Ref: PP 2, PMC

4.3.16 Development Plan (Ref 3.1 (Ref: 2.2, 4.1, 4.2 (Ref: Part 1.1
(Ref: 5.3.1.4)
Table A-1 line 1) ,4.3 11.2) 3-7.1.2.2) IPM 1.1, 1.3, 1.4
GP 2.2, 2.3, 3.1)
• •

Environment • • (Ref: Part (Ref: PP 2.4
4.3.19 (Ref 3.1. Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)

• (Ref: ReqM1.1
Support Process • • PP2.3, PMC1.4,
5.X (Ref 3.9 Table A-9
Compliance (Ref: 5.3.1.2.d) (Ref: Part 3-7.1.1) PPQA, PMC 1.6,
line 1)
1.7, 2 Ver, Val
GP2.9)


(Ref 3.8 Table A-8 • •
Outputs
• line 1 to 6; •
5.2 Configuration (Ref: Part (Ref: CM,
Management (Ref: 5.3.1.2.b) For COTS: 4.1.7 (Ref: 4.3)
3-7.1.2.8) GP 2.6)
Table 4-3 lines 1 to
4)


Software Products • (Ref: PMC 2.1, 2.2,
5.2.2 (Ref 3.8 Table A-8
Problems: Scheme (Ref: 5.3.1.2.c) 2.3
line 3)
CM 2.1, 2.2)

Software Products •
• (Ref: PMC 2.1, 2.2,
5.2.2 Problems: (Ref 3.8 Table A-8
(Ref: 5.3.1.2.c) 2.3
Procedure line 3)
CM 2.1, 2.2)


Software Products • (Ref: PMC 2.1, 2.2,
5.2.2 (Ref 3.8 Table A-8
Problems: ? (Ref: 5.3.1.2.b) 2.3
line 3)
CM 2.1, 2.2)


Software Products • (Ref: PMC 2.1, 2.2,
5.8 (Ref 3.8 Table A-8
Problems (Ref: 5.3.1.2.c) 2.3
line 3)
CM 2.1, 2.2)


A.3.3.1.1 Software Development Plan


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
P • • P
4.3.3 System Overview
(Ref: 2; 4.1.3; 5.1) (Ref: 11.1) (Ref: Part 1-7.2.1) (PP 2.7)
• • P
4.3.3 Software Overview
(Ref: 5.1) (Ref: 11.1) (PP 2.7)
• • •

4.3.3 Software Lifecycle (Ref: 3.1 Table A-1 (Ref: Part (Ref: PP 1.3, IPM
(Ref: 11.1, 11.2)
lines 2, 3) 3-7.1.2.1) 1.3)

Software Lifecycle • • •
4.3.3 (Ref: Part
Data (Ref: 5) (Ref: 11.1) (Ref: GP 2.2, 3.1)
3-7.1.2.7, Table 1)
• •

4.3.3 Schedule (Ref: 3.1.2; 4.1.4.2; (Ref: PP 2.1
(Ref: 11.1)
5.1) PMC GP 2.2, 3.1)
Software • •
• • P
4.3.3 Development (Ref: 3.1. Table A- (Ref: PP 2.4
Environment (Ref: 5.3.1.3) (Ref: 11.2) (Ref: Part 3-7.4.4)
1 line 3) GP 2.2, 2.3, 3.1)
Additional • •
4.3.3
considerations (Ref: 2, 4) (Ref: 11.1)
• •
Software • (Ref: Section 3.1 • (Ref: PI 1.1, 1.2,
4.3.3
Integration Plan (Ref: 5.3.8.1) Table A-1 lines 1, (Ref: 5.4.2) 1.3,
2, 3, 4) VER 1.3)


Software • (Ref: Section 3.1 •
4.3.7 (Ref: PI 1.1, 1.3,
Integration Plan (Ref: 5.3.8.1) Table A-1 lines 1, (Ref: 5.4.2)
VER 1.3)
2, 3, 4)



P (Ref: 3.1 • P
4.3.9 Standards (Ref: GP 2.2, 3.1,
(Ref: 5.3.1.4) Table A-1 line 5; (Ref: 11.2) (Ref: Part 3-7.4.4)
PP 2.4)
For COTS: 4.1.4.2)

P •
(Ref: 3.1 • P
4.3.10 Standards (Ref: 5.3.1.3, (Ref: GP 2.2, 3.1,
Table A-1 line 5; (Ref: 11.2) (Ref: Part 3-7.4.4)
5.3.1.4) PP 2.4)
For COTS: 4.1.4.2)
Software • •
• • P
4.3.12 Development (Ref: 3.1. Table A- (Ref: PP 2.4,
Environment (Ref: 5.3.1.3) (Ref: 11.2) (Ref: Part 3-7.4.4)
1 line 3) GP 2.2, 2.3, 3.1)
Additional • •
4.3.12
considerations (Ref: 4) (Ref: 11.1)
Software Lifecycle: • • •
• •
4.3.16 Lifecycle (Ref: 3.1Table A-1 (Ref: Part 3-7.1.2.1 (Ref: PP 1.3, IPM
Processes (Ref: 5.3.1.1) (Ref: 11.1, 11.2)
lines 2, 3) => 3-7.1.2.5) 1.3)

Software Lifecycle: • •
• • (Ref: Part
4.3.16 Lifecycle (Ref: 3.1Table A-1 (Ref: PP 1.3, IPM
(Ref: 5.3.1.1) (Ref: 11.1, 11.2) 3-7.1.2.1, 3-7.1.2.3
Information lines 2, 3) 1.3)
=> 3-7.1.2.5)
• • •
Software Lifecycle: • •
4.3.16 (Ref: 3.1Table A-1 (Ref: Part (Ref: PP 1.3, IPM
Transition Criteria (Ref: 5.3.1.1) (Ref: 11.1, 11.2)
lines 2, 3) 3-7.1.2.3) 1.3)
Schedule: • •
• •
4.3.16 Lifecycle (Ref: 3.1; 4.1.4.2 (Ref: PP 2.1,
Processes (Ref: 5.3.1.1) (Ref: 11.1)
;5.1) PMC, GP 2.2, 3.1)
Schedule: • •
• •
4.3.16 Lifecycle (Ref: 3.1; 4.1.4.2 (Ref: PP 2.1,
Information (Ref: 5.3.1.1) (Ref: 11.1)
;5.1) PMC, GP 2.2, 3.1)

• •
Schedule: • •
4.3.16 (Ref: 3.1; 4.1.4.2 (Ref: PP 2.1,
Transition Criteria (Ref: 5.3.1.1) (Ref: 11.1)
;5.1) PMC, GP 2.2, 3.1)
Software • •
• • P
4.3.17 Development (Ref: 3.1. Table A- (Ref: PP 2.4
Environment (Ref: 5.3.1.3) (Ref: 11.2) (Ref: Part 3-7.4.4)
1 line 3) GP 2.2, 2.3, 3.1)
Additional • •
4.3.17
considerations (Ref: 4) (Ref: 11.1)
Software • •
• • P
4.3.18 Development (Ref: 3.1. Table A- (Ref: PP 2.4
Environment (Ref: 5.3.1.3) (Ref: 11.2) (Ref: Part 3-7.4.4)
1 line 3) GP 2.2, 2.3, 3.1)
Additional • •
4.3.18
considerations (Ref: 4) (Ref: 11.1)
Software • •
• • P
4.3.19 Development (Ref: 3.1. Table A- (Ref: PP 2.4
Environment (Ref: 5.3.1.3) (Ref: 11.2) (Ref: Part 3-7.4.4)
1 line 3) GP 2.2, 2.3, 3.1)
Additional •
4.3.20
considerations (Ref: 11.7)

Software Lifecycle • • •
5.X (Ref: Part
Data (Ref: 5) (Ref: 11.1) (Ref: GP 2.2, 3.1)
3-7.1.2.7, Table 1)
Software • •
• • P
7.1.X Development (Ref: 3.1. Table A- (Ref: PP 2.4
Environment (Ref: 5.3.1.3) (Ref: 11.2) (Ref: Part 3-7.4.4)
1 line 3) GP 2.2, 2.3, 3.1)
Additional • •
7.2.X
considerations (Ref: 4) (Ref: 11.1)


A.3.3.2 System Requirements Analysis

NOTE: Neither ED-12B/DO-178B nor ED-109/DO-278 addresses system-level issues; these are assumed to be covered by ARP 4754.
ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 RTCA/DO-178B
System • •
• P P
4.3.1 Requirements (Ref: Part 1-7.6, (Ref: RD 1.1, 2,
Analysis (Ref: 5.3.2.1) (Ref: 2) (Ref: 2.1.1, 2.2)
Part 2-7.2.3) 3.1, 3.2)

System
• P P P (Ref:
4.3.1 Requirements
Definition Criteria (Ref: 5.3.2.2) (Ref: 2) (Ref: 2.1.1) (Ref: Part 2-7.2.2) REQM 1.4, 1.5
RD 3)

System
• P P P (Ref:
4.3.15 Requirements
Definition Criteria (Ref: 5.3.2.2) (Ref: 2) (Ref: 2.1.1) (Ref: Part 2-7.2.2) REQM 1.4, 1.5
RD 3)

A.3.3.3 System Architectural Design

NOTE: ED-12B/DO-178B does not address system-level issues; these are assumed to be covered by ARP 4754.
ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B

System P •
• P P
4.3.2 Architecture (Ref: (Ref: TS 2.1, 2.2
Definition (Ref: 5.3.3.1) (Ref: 2.2) (Ref: 2.3)
Part 2-7.4.2) RD 2.2)


System
• P (Ref:
4.3.2 Architecture
Definition Criteria (Ref: 5.3.3.2) (Ref: Part 2-7.4) REQM 1.4,
TS 2.1, 2.2)

System
• P (Ref:
4.3.15 Architecture
Definition Criteria (Ref: 5.3.3.2) (Ref: Part 2-7.4) REQM 1.4,
TS 2.1, 2.2)


A.3.3.4 Software Requirements Analysis


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 RTCA/DO-178B

Software • (Ref: Part

Requirements • (Ref: 3.2 3-7.2.2.3, •
4.3.4 (Ref: 5.1, 11.6,
Definition: (Ref: 5.3.4.1) Table A-2; Table 3-7.2.2.4, (Ref: RD 2.1, 2.3)
Establishment 11.9)
A-3) 3-7.2.2.7=>
3-7.2.2.11)


Software • (Ref: Part

Requirements • (Ref: 3.2 3-7.2.2.3, •
4.3.4 (Ref: 5.1, 11.6,
Definition: (Ref: 5.3.4.1) Table A-2; Table 3-7.2.2.4, (Ref: RD 2.1, 2.3)
Specification 11.9)
A-3) 3-7.2.2.7=>
3-7.2.2.11)
Software •
Requirements P •
4.3.4 (Ref: 3.2
Definition: (Ref: 5.1, 11.10) (Ref: TS 2.1)
Algorithms Table A-3 line 7)

Assurance Level
(Ref: 3.2 • • P
4.3.4 Related
Requirements Table A-2; Table (Ref: 5.1.2, 11.9) (Ref: Part 3-7.2.2) (Ref: RD 3.3)
A-3)
P

Software • (Ref: Part •
• (Ref: 3.2
4.3.4 Requirements (Ref: 5.5, 11.6, 3-7.2.2.1, (Ref: RD 3.3
Definition Criteria (Ref: 5.3.4.2) Table A-2; Table 3-7.2.2.2,
11.9) REQM 1.4)
A-3 line 6)
3-7.2.2.6)

Software •
• (Ref: PI 1,
4.3.7 Integration (Ref: Part
Definition update (Ref: 5.3.5.5) VER 1.3, GP 2.2,
3-7.4.3.2.f)
2.3, 3.1)

• •
Software •
(Ref: 3.2 Table A- • (Ref: Part
4.3.9 Requirements (Ref: RD, GP 2.2,
Standards 2; Table A-3; Table (Ref: 11.6) 3-7.2.2.4,
2.3, 3.1)
A-4) 3-7.2.2.6)
• •
Software •
(Ref: 3.2 Table A- • (Ref: Part
4.3.10 Requirements (Ref: RD, GP 2.2,
Standards 2; Table A-3; Table (Ref: 11.6) 3-7.2.2.4,
2.3, 3.1)
A-4) 3-7.2.2.6)
• •
Software •
(Ref: 3.2 Table A- • (Ref: Part
4.3.11 Requirements (Ref: RD, GP 2.2,
Standards 2; Table A-3; Table (Ref: 11.6) 3-7.2.2.4,
2.3, 3.1)
A-4) 3-7.2.2.6)
• •
Software •
(Ref: 3.2 Table A- • (Ref: Part
4.3.12 Requirements (Ref: RD, GP 2.2,
Standards 2; Table A-3; Table (Ref: 11.6) 3-7.2.2.4,
2.3, 3.1)
A-4) 3-7.2.2.6)
P

Software • (Ref: Part •
• (Ref: 3.2
4.3.15 Requirements (Ref: 5.5, 11.6, 3-7.2.2.1, (Ref: RD 3.3,
Definition Criteria (Ref: 5.3.4.2) Table A-2; Table 11.9) 3-7.2.2.2, REQM 1.4)
A-3 line 6) 3-7.2.2.6)
P
Software (Ref: Part P

4.3.20 Requirements 3-7.2.2.1, (Ref: RD 3.3
Definition Criteria (Ref: 11.7)
3-7.2.2.2, REQM 1.4)
3-7.2.2.6)


A.3.3.5 Software Architectural Design


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B

Top-Level • (Ref: Part
3-7.4.1.1, •
Software • (Ref: 3.2. Table A- •
4.3.5 3-7.4.1.2, (Ref: TS 2.1, 2.2
Architecture (Ref: 5.3.5.1) 2 (Ref: 5.2.2, 11.10)
Definition 3-7.4.3.1, RD 2.2)
line 3) 3-7.4.3.2,
3-7.4.3.3)
• P
• • •
4.3.5 Interfaces Design (Ref: 3.2 Table A-2 (Ref: Part
(Ref: 5.3.5.2) (Ref: 11.10) (Ref: TS 2.3)
line3) 3-7.4.2.2.b)
• •
Assurance Level • •
4.3.5 (Ref: 3.2 (Ref: TS 1.1, 1.3,
Related Design (Ref: 11.10) (Ref: Part 3-7.4.2)
Table A-2.line 3) 2.1, 2.2)
Database Top- • •
4.3.5
Level design (Ref: 5.3.5.3) (Ref: TS 2.1, 2.2)
P •
Software • •
• (Ref: Part 3-7.4.2.2 (Ref: TS 2.1, 2.2
4.3.5 Architecture (Ref: 3.2 Table A-2 (Ref: 5.2, 5.5,
Definition Criteria (Ref: 5.3.5.6) => 3-7.4.2.11, ReqM 1.4
line 3) 11.7)
3-7.4.3.2) PI 2.1)
Software • •

4.3.7 Integration (Ref: Part (Ref: PI 1, PI GP
Definition update (Ref: 5.3.5.5)
3-7.4.3.2.f) 2.2, 2.3, 3.1)
Software • • •

4.3.9 Architectural (Ref: 3.2 Table A-2 (Ref: Part (Ref: TS GP 2.2,
Design Standards (Ref: 11.7)
line 3) 3-7.4.3.2) 2.3, 3.1)
Software • • •

4.3.10 Architectural (Ref: 3.2 Table A-2 (Ref: Part (Ref: TS GP 2.2,
Design Standards (Ref: 11.7)
line 3) 3-7.4.3.2) 2.3, 3.1)


Top-Level • (Ref: Part •
Software • (Ref: 3.2. Table A- • 3-7.4.1.1, (Ref: TS 2.1, 2.2
4.3.13
Architecture (Ref: 5.3.5.1) 2 (Ref: 5.2.2, 11.10) 3-7.4.1.2, RD 2.2
Definition line 3) 3-7.4.3.1,
3-7.4.3.3)
• •
Assurance Level • •
4.3.13 (Ref: 3.2 (Ref: TS 1.1, 1.3,
Related Design (Ref: 11.10) (Ref: Part 3-7.4.2)
Table A-2 line 3) 2.1, 2.2)
P •
Software • •
• (Ref: Part 3-7.4.2.2 (Ref: TS 2.1, 2.2
4.3.13 Architecture (Ref: 3.2 Table A-2 (Ref: 5.2, 5.5,
Definition Criteria (Ref: 5.3.5.6) => 3-7.4.2.11, ReqM 1.4
line 3) 11.7)
3-7.4.3.2) PI 2.1)
P •
Software • •
• (Ref: Part 3-7.4.2.2 (Ref: TS 2.1, 2.2
4.3.14 Architecture (Ref: 3.2 Table A-2 (Ref: 5.2, 5.5,
Definition Criteria (Ref: 5.3.5.6) => 3-7.4.2.11, ReqM 1.4
line 3) 11.7)
3-7.4.3.2) PI 2.1)
P •
Software • •
• (Ref: Part 3-7.4.2.2 (Ref: TS 2.1, 2.2
4.3.15 Architecture (Ref: 3.2 Table A-2 (Ref: 5.2, 5.5,
Definition Criteria (Ref: 5.3.5.6) => 3-7.4.2.11, ReqM 1.4
line 3) 11.7)
3-7.4.3.2) PI 2.1)
Software • • •

4.3.17 Architectural (Ref: 3.2 Table A-2 (Ref: Part (Ref: TS GP 2.2,
Design Standards (Ref: 11.7)
line 3) 3-7.4.3.2) 2.3, 3.1)
Software • •

4.3.18 Architectural (Ref: 3.2 Table A-2 (Ref: TS GP 2.2,
Design Standards (Ref: 11.7)
line 3) 2.3, 3.1)
P •
Software •
4.3.20 Architecture (Ref: Part 3-7.4.2.2 (Ref: TS 2.1, 2.2
(Ref: 5.2, 5.5,
Definition Criteria => 3-7.4.2.11, ReqM 1.4
11.7)
3-7.4.3.2) PI 2.1)


A.3.3.6 Software Detailed Design


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
• P •
Software Detailed •
• (Ref: 3.2 (Ref: Part (Ref: TS 2.1, 2.2
4.3.5 Design Definition (Ref: 5.2.2, 5.5,
Criteria (Ref: 5.3.6.7) Table A-2 lines 4, 3-7.4.5.2, ReqM 1.4
11.7)
5) 3-7.4.5.3) PI 2.1)

Database Detailed •
4.3.5 (Ref: TS 2.1, 2.2,
Design (Ref: 5.3.6.3)
3.1)

• •
(Ref: Part •
Software Detailed • (Ref: 3.2 •
4.3.6 3-7.4.1.4, (Ref: TS 2.1, 2.2,
Design Definition (Ref: 5.3.6.1) Table A-2 lines 4, (Ref: 11.10)
3-7.4.5.1, 3.1)
5) 3-7.4.5.4)
P
• (Ref: 3.2 • •
4.3.6 Interfaces Design
(Ref: 5.3.6.2) Table A-2 lines 4, (Ref: 5.2.2, 11.10) (Ref: TS 2.3)
.5)

Software P
• (Ref:
4.3.7 Integration (Ref: Part
Definition Update (Ref: 5.3.6.6) PI 1, PI GP 2.2,
3-7.4.5.5)
2.3, 3.1)
• •
Software Detailed •
4.3.9 (Ref: 3.1 Tables A- (Ref: TS GP 2.2,
Design Standards (Ref: 11.7, 11.10)
2 lines 4, 5) 2.3, 3.1)
• •
Software Detailed •
4.3.10 (Ref: 3.2 Table A2 (Ref: TS GP 2.2,
Design Standards (Ref: 11.7, 11.10)
lines 4, 5) 2.3, 3.1)
• P •
Software Detailed •
• (Ref:3.2 (Ref: Part (Ref: TS 2.1, 2.2
4.3.13 Design Definition (Ref: 5.2.2, 5.5,
Criteria (Ref: 5.3.6.7) Table A-2 lines 4, 3-7.4.5.2, ReqM 1.4
11.7)
5) 3-7.4.5.3) PI 2.1)

• P •
Software Detailed •
• (Ref:3.2 (Ref: Part (Ref: TS 2.1, 2.2
4.3.14 Design Definition (Ref: 5.2.2, 5.5,
Criteria (Ref: 5.3.6.7) Table A-2 lines 4, 3-7.4.5.2, ReqM 1.4
11.7)
5) 3-7.4.5.3) PI 2.1)
• P •
Software Detailed •
• (Ref:3.2 (Ref: Part (Ref: TS 2.1, 2.2
4.3.15 Design Definition (Ref: 5.2.2, 5.5,
Criteria (Ref: 5.3.6.7) Table A-2 lines 4, 3-7.4.5.2, ReqM 1.4
11.7)
5) 3-7.4.5.3) PI 2.1)
P •
Software Detailed •
4.3.20 Design Definition (Ref: Part (Ref: TS 2.1, 2.2
(Ref: 5.2.2, 5.5,
Criteria 3-7.4.5.2, ReqM 1.4
11.7)
3-7.4.5.3) PI 2.1)

A.3.3.7 Software Coding


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
P
Software Units P • (Ref: Part •

4.3.6 Code definition (Ref: 3.7 Table A- (Ref: 5.3, 5.5, 11.8, 3-7.4.6.1, (Ref: TS 3.1
Criteria (Ref: 5.3.7.5)
7) 11.11) 3-7.4.7.1, ReqM 1.4)
3-7.4.7.2)
P

Development & • (Ref: 3.5 Table A- • •
4.3.6 (Ref: Part 3-7.4.6,
Documentation (Ref: 5.3.7.1) 5; (Ref: 5.3) (Ref: TS 3.1)
3-7.4.7)
3.6 Table A-6)
• • •

4.3.9 Coding Standards (Ref: 3.2 Table A-2 (Ref: Part (Ref: TS GP 2.2,
(Ref: 11.8, 11.11)
line 6 3-7.4.4.6) 2.3, 3.1)

P
Software Units P • (Ref: Part •

4.3.9 Code definition (Ref: 3.7 Table A- (Ref: 5.3, 5.5, 11.8, 3-7.4.6.1, (Ref: TS 3.1
Criteria (Ref: 5.3.7.5)
7) 11.11) 3-7.4.7.1, ReqM 1.4)
3-7.4.7.2)
• • •

4.3.10 Coding Standards (Ref: 3.2 Table A-2 (Ref: Part (Ref: TS GP 2.2,
(Ref: 11.8, 11.11)
line 6 3-7.4.4.6) 2.3, 3.1)
P
Software Units P • (Ref: Part •

4.3.10 Code definition (Ref: 3.7 Table A- (Ref: 5.3, 5.5, 11.8, 3-7.4.6.1, (Ref: TS 3.1
Criteria (Ref: 5.3.7.5)
7) 11.11) 3-7.4.7.1, ReqM 1.4)
3-7.4.7.2)
P
Software Units P • •
• (Ref: 7.4.6.1,
4.3.15 Code definition (Ref: 3.7 Table A- (Ref: 5.3, 5.5, 11.8, (Ref: TS 3.1
(Ref: 5.3.7.5) 7.4.7.1,
Criteria 7) 11.11) ReqM 1.4)
7.4.7.2)
Software Units P • •

4.3.19 Code definition (Ref: 3.7 Table A- (Ref: 5.3, 5.5, 11.8, (Ref: TS 3.1
Criteria (Ref: 5.3.7.5)
7) 11.11) ReqM 1.4)
Software Units • •
4.3.20 Code definition (Ref: 5.3, 5.5, 11.8, (Ref: TS 3.1
Criteria 11.11) ReqM 1.4)


A.3.3.8 Software Integration


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B

P (Ref: TS 2.1, 3.1,
Software P 3.2
4.3.7 Integration (Ref: Part
PI 1, PI GP 2.2,
Definition Criteria (Ref: 5.3.8.5) 3-7.4.8.2,
3.1
3-7.4.8.5)
PP3.1
ReqM 1.4)
Software •

4.3.7 Integration (Ref:
Definition Update (Ref: 5.3.7.4)
PI 1.1)
P

Software • (Ref: Part
4.3.7 (Ref: PI 3.2, 3.3
Integration (Ref: 5.3.8.2) 3-7.4.8.3,
)
3-7.4.8.4)
Software • P •
4.3.9 Integration (Ref: 3.6 Table A-6 (Ref: 5.4.3, (Ref: PI 1.2
Standards line 2) 6.4.3.b) PI GP 2.2, 2.3, 3.1)
Software • P •
4.3.10 Integration (Ref: 3.6 Table A-6 (Ref: 5.4.3, (Ref: PI 1.2
Standards line 2) 6.4.3.b) PI GP 2.2, 2.3, 3.1)

(Ref: TS 2.1, 3.1,
Software 3.2
4.3.20 Integration PI 1, PI GP 2.2,
Definition Criteria 3.1
PP3.1
ReqM 1.4)


A.3.3.9 System Integration


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
P
(Ref: Part
3-7.5.2.1,
• P 3-7.5.2.2,
System Integration • •
4.3.3 (Ref: 5.3.10.1, (Ref: 3.2 Table A-2 3-7.5.2.3,
Definition (Ref: 5.4.1, 5.4.2) (Ref: PI 1.2, 1.3,
5.3.10.3) line 7) 3-7.5.2.4,
3.2, 3.3)
3-7.5.2.5,
3-7.5.2.7,
3-7.5.2.8)

System Integration P
4.3.9 (Ref: PI 1.2
Standards (Ref: 6.4.3.a)
PI GP 2.2, 2.3, 3.1)

System Integration P
4.3.10 (Ref: PI 1.2
Standards (Ref: 6.4.3.a)
PI GP 2.2, 2.3, 3.1)
Software P
• • •
4.3.13 Compatibility with (Ref: 3.6 Table A-6
target Hardware (Ref: 5.4) (Ref: Part 3-7.5.2) (Ref: PI 1.3)
line 5)
Software P
• • •
4.3.14 Compatibility with (Ref: 3.6 Table A-6
target Hardware (Ref: 5.4) (Ref: Part 3-7.5.2) (Ref: PI 1.3)
line 5)


A.3.3.10 Software Installation


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B

(Ref: Part •
Software P
4.3.8 1-7.9.1.1, (Ref: PI 1, PI GP
Installation Plan (Ref: 5.3.12.1)
1-7.9.2.1, 2.2,2.3, 3.1
1-7.9.2.3)

Software
• (Ref: Part •
4.3.8 Installation
Performance: Plan (Ref: 5.3.12.2) 1-7.13.1.1, (Ref: PI 3.4)
1-7.13.2.1)
Software •
Installation • (Ref: Part •
4.3.8
Performance: (Ref: 5.3.12.2) 1-7.13.1.1, (Ref: PI 3.4)
Documentation 1-7.13.2.2)


A.3.4 Operation Process

See chapter 4.4 of ED-153.


NOTE: ED-12B/DO-178B, ED-109/DO-278 and CMMI do not cover operation.
ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
Process • P
4.4.1
Implementation (Ref: 5.4.1) (Ref: Part 1-7.15)
Intended
• P
4.4.2 Operational
Environment (Ref: 5.4.3) (Ref: Part 1-7.15)


4.4.3 User support
(Ref: 5.4.4)
Software • P
4.4.4
Operation (Ref: 5.4.3) (Ref: Part 1-7.15)
Performance
4.4.5
Monitoring


A.3.5 Maintenance Process

See chapter 4.5 of ED-153.


NOTE: ED-12B/DO-178B and ED-109/DO-278 do not cover maintenance.
ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
Process P
Implementation: • (Ref: Part 1-6.2.1.j •
4.5.1
Maintenance (Ref: 5.5.1) , (Ref: PP 2)
Process I-7.7,I-7.15)
Process P
Implementation: • (Ref: Part 1-6.2.1.j •
4.5.1
Maintenance Risk (Ref: 5.5.1) , (Ref: PP 2)
Assessment I-7.7,I-7.15)
SWAL allocation
4.5.2
confirmation
P
P P
4.5.3 SWAL satisfaction (Ref: 7.8, Part 1-
(Ref: 5.5.4) (Ref: CM 1.3, 3.2)
7.16)
Software
• P
4.5.4 Migration:
Procedure (Ref: 5.5.5) (PI 3.4)
Software
• P
4.5.4 Migration: Risk
Assessment (Ref: 5.5.5) (PI 3.4)
SW P P
4.5.5 Decommissioning:
Plan (Ref: 5.5.6) (Ref: Part 1-7.17)
SW P P
4.5.5 Decommissioning:
Risk Assessment (Ref: 5.5.6) (Ref: Part 1-7.17)


A.4 SUPPORTING LIFECYCLE PROCESSES

See chapter 5 of ED-153.

A.4.1 Documentation Process

See chapter 5.1 of ED-153.


NOTE: ED-109/DO-278 and ED-12B/DO-178B do not prescribe or recommend the delivery of documents as such; instead, they require certain “Software Lifecycle Data” to be produced as evidence.
ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
• •
4.3.3 Baseline Update (Ref: 5.3.9.5, (Ref: PI 3.4
5.3.11.4) CM 1.3)

• • (Ref: Part
Documentation • •
4.3.4 (Ref: 3.2 Table A-2 (Ref: 5.1, 11.6, 3-7.2.2.3,
(SW Requirement) (Ref: 5.3.4.1) (Ref: RD 2.1, 2.3)
lines 1, 2) 11.9) 3-7.2.2.4, 3-7.2.2.7
=> 3-7.2.2.11)
Documentation • •
• •
4.3.5 (SW architectural (Ref: 3.2, Table (Ref: 5.2, 11.7,
design) (Ref: 5.3.5.4) (Ref: TS 2.1, 2.2)
A.2 line 3) 11.10)
Documentation •
• •
4.3.6 (SW architectural (Ref: 3.2, Table
design) (Ref: 5.3.5.4) (Ref: TS 2.1, 2.2)
A.2 line 3)
Documentation • •
4.3.6
(SW coding) (Ref: 5.3.7.3) (Ref: TS 3.1, 2.2)
Documentation • •
4.3.7
(SW integration) (Ref: 5.3.8.3) (Ref: PI1.1, 1.3)


• (Ref: Part 1-5.1, I-
Process • P (Ref: GP 2.2, 3.1
5.1.1 (Ref: 3.1 Table A-1 5.2.7,
Implementation (Ref: 6.1.1) (Ref: 4.3, 11) PP2.3, 2.7
lines 1, 2, 3, 4) I-5.2.9=>
CM 1.1)
I-5.2.11)

• •
Design & • P
5.1.2 (Ref: Part 1-5.2.8, (Ref: PP 2.3
Development (Ref: 6.1.2) (Ref: 11)
I-Annex A) PMC 1.4)
Production:
• P P •
5.1.3 Document
Provision (Ref: 6.1.3) (Ref: 4.3, 11) (Ref: Part 1-5.2.11) (Ref: PMC 1.4)

Production: • P P •
5.1.3
Document Storage (Ref: 6.1.3) (Ref: 4.3, 11) (Ref: Part 1-5.2.11) (Ref: PMC 1.4)
Documentation •

(SW Requirement): • • (Ref: 7.2.2.3, •
5.1.3 (Ref: 3.2 Tables
Document (Ref: 5.3.4.1) (Ref: 5.1, 11.9) 7.2.2.4, 7.2.2.7=> (Ref: RD 2.1, 2.3)
Provision A2.1, A2.2)
7.2.2.11)

Documentation •
• • (Ref: 7.2.2.3, •
5.1.3 (SW Requirement): (Ref: 3.2 Tables
Document Storage (Ref: 5.3.4.1) (Ref: 5.1, 11.9) 7.2.2.4, 7.2.2.7=> (Ref: RD 2.1, 2.3)
A2.1, A2.2)
7.2.2.11)
Documentation

(SW architectural • •
5.1.3 (Ref: 3.2, Table
design): Document (Ref: 5.3.5.4) (Ref: TS 2.1, 2.2)
Provision A.2 line 3)
Documentation

(SW architectural • •
5.1.3 (Ref: 3.2, Table
design) : (Ref: 5.3.5.4) (Ref: TS 2.1, 2.2)
Document Storage A.2 line 3)
Documentation P
(SW detailed • •
5.1.3 (Ref: For COTS
design): Document (Ref: 5.3.6.4) (Ref: TS 2.1, 2.2)
Provision 4.1.2)
Documentation P
(SW detailed • •
5.1.3 (Ref: For COTS
design): Document (Ref: 5.3.6.4) (Ref: TS 2.1, 2.2)
Storage 4.1.2)

Documentation
(SW coding): • •
5.1.3
Document (Ref: 5.3.7.3) (Ref: TS 3.1, 2.2)
Provision
Documentation
• •
5.1.3 (SW coding):
Document Storage (Ref: 5.3.7.3) (Ref: TS 3.1, 2.2)
Documentation
(SW integration): • •
5.1.3
Document (Ref: 5.3.8.3) (Ref: PI1.1, 1.3)
Provision
Documentation
• •
5.1.3 (SW integration):
Document Storage (Ref: 5.3.8.3) (Ref: PI1.1, 1.3)

Baseline Update: • •
5.1.3 Document (Ref: 5.3.9.5, (Ref: PI 3.4
Provision 5.3.11.4) CM 1.3)
• •
Baseline Update:
5.1.3 (Ref: 5.3.9.5, (Ref: PI 3.4
Document Storage
5.3.11.4) CM 1.3)

• P •
5.1.4 Maintenance (Ref: 3.8 Table A-8
(Ref: 6.1.4) (Ref: Annex A) (Ref: PMC 1.4)
lines 3, 4)
• •
5.2.2 Baseline Update (Ref: 5.3.9.5, (Ref: PI 3.4
5.3.11.4) CM 1.3)


A.4.2 Configuration Management Process

See chapter 5.2 of ED-153.
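Objective 5.2.2 in the table below calls for a configuration identification scheme and procedure, and objective 5.2.3 for baseline-to-configuration-item traceability. A minimal sketch of the kind of scheme meant, with an invented naming format (ED-153 does not prescribe one), is:

```python
# Hypothetical configuration identification sketch: each configuration item
# gets a unique identifier that also encodes its baseline, so baseline-to-item
# traceability can be reconstructed. The identifier format is invented for
# illustration, not taken from ED-153.
from dataclasses import dataclass

@dataclass(frozen=True)
class ConfigItem:
    name: str       # e.g. "fdp-tracker"
    version: str    # e.g. "2.1.0"
    baseline: str   # e.g. "BL-2009-08"

    @property
    def identifier(self) -> str:
        # Unique, baseline-qualified identifier for the item.
        return f"{self.baseline}/{self.name}@{self.version}"

item = ConfigItem("fdp-tracker", "2.1.0", "BL-2009-08")
print(item.identifier)  # "BL-2009-08/fdp-tracker@2.1.0"
```

Making the record immutable (`frozen=True`) mirrors the intent of configuration control: an identified item is changed only by creating a new identified item, never by editing one in place.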


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
Process
Implementation: • •
P • P
5.2.1 Configuration (Ref: 3.1 Table A-1 (Ref: CM 1.2, CM
Management Plan (Ref: 6.2.1) (Ref: 7.1, 11.4) (Ref: Part 1-6.2.1)
lines 1, 2, 3) GP 2.2, 2.4, 3.1)
Development
Process
Implementation: • •
P • P
5.2.1 Configuration (Ref: 3.1 Table A-1 (Ref: CM 1.2, CM
Management Plan (Ref: 6.2.1) (Ref: 7.1, 11.4) (Ref: Part 1-6.2.1)
lines 1, 2, 3) GP 2.2, 2.4, 3.1)
Content
Process
Implementation: • •
P • P
5.2.1 Configuration (Ref: 3.1 Table A-1 (Ref: CM 1.2, CM
Management Plan (Ref: 6.2.1) (Ref: 7.1, 11.4) (Ref: Part 1-6.2.1)
lines 1, 2, 3) GP 2.2, 2.4, 3.1)
Documentation
Configuration •
P • P •
5.2.2 Identification: (Ref: 3.8 Table A-8
Scheme (Ref: 6.2.2) (Ref: 7.2.1, 7.2.2) (Ref: 6.2.3.c) (Ref: CM 1.1, 1.3)
line 1)
Configuration •
P • P •
5.2.2 Identification: (Ref: 3.8 Table A-8
Procedure (Ref: 6.2.2) (Ref: 7.2.1, 7.2.2) (Ref: 6.2.3.c) (Ref: CM 1.1, 1.3)
line 1)

Configuration P • P •
5.2.2 (Ref: 3.8 Table A-8
Identification: ?? (Ref: 6.2.2) (Ref: 7.2.1, 7.2.2) (Ref: 6.2.3.c) (Ref: CM 1.1, 1.3)
line 1)
Baseline & • •

5.2.2 Configuration Item (Ref: 3.8 Table A-8 (Ref: 7.2.2.e,
Traceability (Ref: CM 1.3)
line 2) 7.2.2.f)
Software Patch •
5.2.2
Management (Ref: 5.4.3)

Baseline & • •

5.2.3 Configuration Item (Ref: 3.8 Table A-8 (Ref: 7.2.2.e,
Traceability (Ref: CM 1.3)
line 2) 7.2.2.f)
Configuration
• •
Control: Change • • •
5.2.3 (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
Request (Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
Procedures line 3) 6.2.3.e)

• •
Configuration • • •
5.2.3 (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
Control: Audit Trail (Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
line 3) 6.2.3.e)
Configuration
• •
Control: Access to • • •
5.2.3 (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
Safety-Related (Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
Functions line 3) 6.2.3.e)
Software Lifecycle
Environment • •
• P
5.2.3 Control: Change (Ref: 3.8 Table A-8 (Ref: CM
Request (Ref: 7.2.9) (Ref: 6.2.3.c)
line 6) GP 2.6)
Procedures
Software Lifecycle • •
• P
5.2.3 Environment (Ref: 3.8 Table A-8 (Ref: CM
Control: Audit Trail (Ref: 7.2.9) (Ref: 6.2.3.c)
line 6) GP 2.6)
Software Lifecycle
Environment • •
• P
5.2.3 Control: Access to (Ref: 3.8 Table A-8 (Ref: CM
Safety-Related (Ref: 7.2.9) (Ref: 6.2.3.c)
line 6) GP 2.6)
Functions
Software Patch •
5.2.3
Management (Ref: 5.4.3)

Configuration • • P •
5.2.4 (Ref: 3.8 Table A-8
Status Accounting (Ref: 6.2.4) (Ref: 7.2.6) (Ref: 6.2.3.e) (Ref: CM 3.1)
line 3)


Configuration • P • •
5.2.5 (Ref: 3.8 Table A-8
Evaluation (Ref: 6.2.5) (Ref: 7.2.4) (Ref: 6.2.3.d) (Ref: CM 3.2)
line 3)
Release • P
P • P
5.2.6 Management & (Ref: 3.8 Table A-8 (Ref: CM 2
Delivery (Ref: 6.2.6) (Ref: 7.2.7) (Ref: 6.2.3.f)
line 4) CM 1.2)

Software Load •
5.2.6 (Ref: 3.8 Table A-8
Control (Ref: 7.2.8)
line 5)
Software Patch •
5.2.6
Management (Ref: 5.4.3)
5.2.7 Use of a CM Tool
Use of a CM Tool
5.2.8 “Acquirer
Agreement”
At Level of
5.2.9 Software
Component
Configuration
5.2.10 Management
Traceability: Data
Configuration
Management
5.2.10
Traceability:
Compatibility
At level of SW
5.2.11
compilation unit
• •
Configuration • • •
5.8.4 (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
Control (Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
line 3) 6.2.3.e)


A.4.3 Quality Assurance Process

See chapter 5.3 of ED-153.


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
Process • • P •

5.3.1 implementation: (Ref: 3.1 Table A-1 (Ref: 8.1, 8.2, (Ref: 7.1.2.2, Part (Ref: PPQA GP
(Ref: 6.3.1)
QA Process lines 1, 2, 3, 4) 11.5) 1-6.2.5, I-8) 2.2, 3.1)
Process • • P •

5.3.1 implementation: (Ref: 3.1 Table A-1 (Ref: 8.1, 8.2, (Ref: 7.1.2.2, Part (Ref: PPQA GP
(Ref: 6.3.1)
QA Objectives lines 1, 2, 3, 4) 11.5) 1-6.2.5, I-8) 2.2, 3.1)
Process • • P •

5.3.1 implementation: (Ref: 3.1 Table A-1 (Ref: 8.1, 8.2, (Ref: 7.1.2.2, Part (Ref: PPQA GP
(Ref: 6.3.1)
QA Plan lines 1, 2, 3, 4) 11.5) 1-6.2.5, I-8) 2.2, 3.1)

(Ref: 3.1 Table A-1 •
Product P •
5.3.2 lines 6, 7; (Ref: GP 2.9
assurance: Plans (Ref: 6.3.2) (Ref: 8.3)
3.9 Table A-9 line PPQA 2)
3)

Product (Ref: 3.1 Table A-1 •
P •
5.3.2 assurance: lines 6, 7; (Ref: GP 2.9
(Ref: 6.3.2) (Ref: 8.3)
Adherence 3.9 Table A-9 line PPQA 2)
3)

Product
(Ref: 3.1 Table A-1 •
assurance: P •
5.3.2 lines 6, 7; (Ref: GP 2.9
Software (Ref: 6.3.2) (Ref: 8.3)
3.9 Table A-9 line PPQA 2)
Conformity
3)
Process
• •
assurance: • •
5.3.3 (Ref: 3.9 Table A-9 (Ref: GP 2.9
Lifecycle (Ref: 6.3.3) (Ref: 8.2)
line 1) PPQA 1)
Adherence
Process
• •
assurance: • •
5.3.3 (Ref: 3.9 Table A-9 (Ref: GP 2.9
Engineering (Ref: 6.3.3) (Ref: 8.2)
line 1) PPQA 1)
Adherence


A.4.4 Verification Process

See chapter 5.4 of ED-153.
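Objectives 5.4.3 to 5.4.6 below concern verification of software requirements, integration, architecture, and detailed design. One minimal bookkeeping aid, sketched here under invented identifiers, is a check that every requirement is traced by at least one verification case; requirements left uncovered are verification gaps.

```python
# Hedged sketch of a requirements-to-verification traceability check.
# All identifiers (SRS-*, VC-*) are invented for illustration.
requirements = {"SRS-001", "SRS-002", "SRS-003"}
verification_cases = {
    "VC-01": {"SRS-001"},
    "VC-02": {"SRS-001", "SRS-003"},
}

def unverified(reqs, cases):
    """Return, sorted, the requirements not traced by any verification case."""
    covered = set().union(*cases.values()) if cases else set()
    return sorted(reqs - covered)

print(unverified(requirements, verification_cases))  # ['SRS-002']
```

The same shape of check applies at each level (requirements, architecture, detailed design, integration) by substituting the appropriate artefact identifiers.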


ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
• •
Process •
• • (Ref: 7.4.1.5, 7.9.2, (Ref: Ver 1
5.4.1 implementation: (Ref: 3.1 Table A-1
(Ref: 6.4.1) (Ref: 6.1, 11.3) Part 1-7.4, I-7.6,I- Ver GP 2.2, 3.1
Establishment lines 1, 2, 3, 4)
7.18) )
• •
Process •
• • (Ref: 7.4.1.5, 7.9.2, (Ref: Ver 1
5.4.1 implementation: (Ref: 3.1 Table A-1
(Ref: 6.4.1) (Ref: 6.1, 11.3) Part 1-7.4, I-7.6,I- Ver GP 2.2, 3.1
Documentation lines 1, 2, 3, 4)
7.18) )
Process • •

implementation: • • (Ref: 7.4.1.5, 7.9.2, (Ref: Ver 1
5.4.2 (Ref: 3.1 Table A-1
Verification Plan (Ref: 6.4.1) (Ref: 6.1, 11.3) Part 1-7.4, I-7.6,I- Ver GP 2.2, 3.1
lines 1, 2, 3, 4)
Development 7.18) )
Process • •

implementation: • • (Ref: 7.4.1.5, 7.9.2, (Ref: Ver 1
5.4.2 (Ref: 3.1 Table A-1
Verification Plan (Ref: 6.4.1) (Ref: 6.1, 11.3) Part 1-7.4, I-7.6,I- Ver GP 2.2, 3.1
lines 1, 2, 3, 4)
Scope 7.18) )
P
Verification: P
P (Ref: AnnexA-3 P •
5.4.3 Software (Ref: 3.1 Table A-1
(Ref: 6.4.2) =>Annex A-7, 6.2, (Ref: 7.9.2) (Ref: Ver)
Requirements lines 1, 2, 3, 4)
6.3, 6.4)
P
P
Verification: P (Ref: AnnexA-3 P •
5.4.4 (Ref: 3.1 Table A-1
Integration (Ref: 6.4.2) =>Annex A-7, 6.2, (Ref: 7.9.2) (Ref: Ver)
lines 1, 2, 3, 4)
6.3, 6.4)
P
Verification: P
P (Ref: AnnexA-3 P •
5.4.5 Software (Ref: 3.1 Table A-1
(Ref: 6.4.2) =>Annex A-7, 6.2, (Ref: 7.9.2) (Ref: Ver)
Architecture lines 1, 2, 3, 4)
6.3, 6.4)
P
P
Verification: P (Ref: AnnexA-3 P •
5.4.6 (Ref: 3.1 Table A-1
Detailed Design (Ref: 6.4.2) =>Annex A-7, 6.2, (Ref: 7.9.2) (Ref: Ver)
lines 1, 2, 3, 4)
6.3, 6.4)


A.4.4.1 Process Implementation


Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Verification •

Process • • • (Ref:
5.4.1 (Ref: Ver GP 2.2,
Implementation: (Ref: 6.4.1.2) (Ref: 3.4, 3.5, 3.7) (Ref: 6) 7.9.2.1=>7.9.2.7,Part
3.1))
Establishment 1-7.18.1)
Verification •

Process • • • (Ref:
5.4.1 (Ref: Ver GP 2.2,
Implementation: (Ref: 6.4.1.2) (Ref: 3.4, 3.5, 3.7) (Ref: 6) 7.9.2.1=>7.9.2.7,Part
3.1))
Documentation 1-7.18.1)
Criticality • •
• • •
5.4.2 Evaluation criteria: (Ref: 2.2, 2.3.3, (Ref: Part 1-7.4, I-
(Ref: 6.4.1.1) (Ref: 2.1) (Ref: Ver 1.3)
Development Annex A) 7.6)
Criticality • •
• • •
5.4.2 Evaluation criteria: (Ref: 2.2, 2.3.3, (Ref: Part 1-7.4, I-
(Ref: 6.4.1.1) (Ref: 2.1) (Ref: Ver 1.3)
Scope Annex A) 7.6)
Verification
• • •
Environment • •
5.4.2 (Ref: 3.1 Table A-1 (Ref: Annex A, (Ref: Ver 1.2, GP
Definition: (Ref: 6.4.1.4) (Ref: 7.9.2.2)
line 4) 11.3.c/d) 2.2, 2.3, 3.1)
Development
Verification • • •
• •
5.4.2 Environment (Ref: 3.1 Table A-1 (Ref: Annex A, (Ref: Ver 1.2, GP
(Ref: 6.4.1.4) (Ref: 7.9.2.2)
Definition: Scope line 4) 11.3.c/d) 2.2, 2.3, 3.1)


Transition Criteria: • • (Ref: PP 1.3, PMC
5.4.2 (Ref: 3.1 Table A-1
Development (Ref: 11.3.e) (Ref: 7.9.2.6) 1
line 2)
IPM 1.3, 1.4)


Transition Criteria: • • (Ref: PP 1.3, PMC
5.4.2 (Ref: 3.1 Table A-1
Scope (Ref: 11.3.e) (Ref: 7.9.2.6) 1
line 2)
IPM 1.3, 1.4)
• • •
Verification Plan: • •
5.4.2 (Ref: 3.1 Table A-1 (Ref: 7.9.2.1, (Ref: Ver GP 2.2,
Development (Ref: 6.4.1.5) (Ref: 11.3)
lines 1, 2, 3, 4) Part 1-7.18.2.1) 3.1)

• • •
Verification Plan: • •
5.4.2 (Ref: 3.1 Table A-1 (Ref: 7.9.2.1, (Ref: Ver GP 2.2,
Scope (Ref: 6.4.1.5) (Ref: 11.3)
lines 1, 2, 3, 4) Part 1-7.18.2.1) 3.1)

• (Ref: Ver 2, 3, Ver
Verification • • •
5.4.2 (Ref: 3.9 Table 9 GP 2.7, 2.8
Results (Ref: 6.4.1.6) (Ref: 6.2.e) (Ref: 7.9.2)
Line 1) CM 2.1
PMC 2)


A.4.4.2 Verification
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI

(Ref: all) Ver 2
a, c) RD 3.3,
System b) Ver 2
• d)
5.4.1 Requirements
Verification (Ref: 6.4.2.3) Ver 1.3, 2, 3
e) ReqM 1.5, RD
3.3
f) RD 3.3, 3.5
g) None


P (Ref: Part 1-7.8, I-
System Verification • P (Ref: Ver 1.3, Val
5.4.1 7.14, Part 2-
Evaluation Criteria (Ref: 3.7 Table A-7 1.3
(Ref: 5.3.11.2) (Ref: 2.7) 7.7.2.3, II-
lines 2, 3 ) Ver 3
7.7.2.5=>
Val 2)
II-7.7.2.7)

(Ref: all) PPQA 1,
Ver 2
a) SAM 1.2,
Contract • b) ReqM 1.1,

5.4.2 Verification: (Ref: 3.10 Table A- RD 3.3, 3.4, 3.5
Development (Ref: 6.4.2.1)
10 lines 1, 2, 3) c) ReqM 1.3
PMC GP 2.2, 2.4,
3.1, 2.7
d) IPM 2
e) Ver 1.3)


(Ref: all) PPQA 1,
Ver 2
a) SAM 1.2,
• b) ReqM 1.1,
Contract •
5.4.2 (Ref: 3.10 Table A- RD 3.3, 3.4, 3.5
Verification: Scope (Ref: 6.4.2.1)
10 lines 1, 2, 3) c) ReqM 1.3
PMC GP 2.2, 2.4,
3.1, 2.7
d) IPM 2
e) Ver 1.3)

(Ref: all) Ver GP
2.2, 3.1
a) ReqM 1.5, PP
• 3.1, 3.2
Process
• (Ref: 3.3 Table A- b) PP 3.1, IPM 1.1,
5.4.2 Verification:
Development (Ref: 6.4.2.2) 3, 3.4 Table A-4, 1.3, PMC1.1, Ver
3.7 Table A-7) GP 2.8, PPQA 1,
Ver GP 2.9
c) PP 3.1
d) PP 2.4, 2.5
Ver GP 2.3, 2.5)

(Ref: all) Ver GP
2.2, 3.1
a) ReqM 1.5, PP
• 3.1, 3.2
Process • (Ref: 3.3 Table A- b) PP 3.1, IPM 1.1,
5.4.2
Verification: Scope (Ref: 6.4.2.2) 3, 3.4 Table A-4, 1.3, PMC1.1, Ver
3.7 Table A-7) GP 2.8, PPQA 1,
Ver GP 2.9
c) PP 3.1
d) PP 2.4, 2.5
Ver GP 2.3, 2.5)

Verification of
retrieval & release
5.4.2
process:
Development
Verification of
5.4.2 retrieval & release
process: Scope
P
(Ref: Annex A-3,
• 6.1.a, 6.3.1, 6.3.2)

Software (Ref: 3.3 Table A-3 (Because only • •
5.4.3 (Ref: 6.4.2.3,
Requirement lines 1, 2, 3, 4, 5, Software (Ref: 7.9.2.8) (Ref: Ver 3)
5.3.9)
6, 7) requirements not
System
requirements)

Operational • P
5.4.3 (Ref: Val 2
Testing (Ref: 5.4.2) (Ref: Part 1-7.15)
PI 3.4)

Adaptation data •
5.4.3 (Ref: 3.2 Table A-2
verification (Ref Ver, Val)
line 8)
P
P • (Ref: all) Ver 2, 3
Integration • • a,b) PI 3.1
5.4.4 (Ref: 3.5 Table A-5 (Ref: 7.9.2.10,
Verification (Ref: 6.4.2.6) (Ref: 6.3.5) c) PI GP 2.9
line 7) 7.9.2.11)
d, e,f) Ver 1.3
g) PI 3.2)


(Ref: 3.4 •
Table A-4 • (Ref: all) Ver 1.3, 2
Architectural • lines 1, 2, 3, 4, 5, • a) TS 1.1, 2.1
5.4.5 (Ref: Annex A-4,
Design Verification (Ref: 6.4.2.4) 6, 7, 8, 9, 10, 11, (Ref: 7.9.2.9) c) TS1.1,1.3,2.1
6.3.3)
12, 13 Over- ,ReqM1.4
compliant in line f) TS 2.1)
13) (partitioning)

(Ref: 3.4
Table A-4 •

Detailed design • lines 1, 2, 3, 4, 5, • (Ref: Ver 1.3, 2
5.4.5 (Ref: Annex A-4,
Verification (Ref: 6.4.2.4) 6, 7, 8, 9, 10, 11, (Ref: 7.9.2.9) TS 2.1
6.3.3)
12, 13 Over-
compliant in line
13) (partitioning)
P •
Module Testing P
5.4.5 (Ref: 3.6 Table A-6 (Ref: TS GP 2.2,
Standards (Ref: 6.4.3.c)
lines 3, 4 ) 2.3, 3.1, Ver 1.2)

Software Units • P
• P (Ref: TS 3.1
5.4.5 Tests definition (Ref: 5.3, 5.5, 11.8, (Ref: 7.4.6.1,
Criteria (Ref: 5.3.7.5) (Ref: 3. Table A-7) ReqM 1.4
11.11) 7.4.7.1, 7.4.7.2)
Ver 1.3, 2)
• • •
Software Units •
5.4.5 (Ref: 3.6 Table A-6 (Ref: 7.4.7.1, (Ref: TS 3.1, Ver
Testing (Ref: 5.3.7.2)
lines 3, 4, 5) 7.4.7.3) 3)
• •
Source Code • (Ref: 3.5 Table A-5 • • (Ref: Ver 1.3, 2
5.4.6
Verification (Ref: 6.4.2.5) lines 1, 2, 3, 4, 5, (Ref: Annex A-5) (Ref: 7.9.2.12) TS 3.1
6) ReqM 1.4)


Software Units • • •
5.4.6 (Ref: TS 3.1, Ver
Test Definition (Ref: 5.3.6.5) (Ref: 3.6.3) (Ref: 7.4.5.4)
1.3)
P •
Module Testing P
5.4.6 (Ref: 3.6 Table A-6 (Ref: TS GP 2.2,
Standards (Ref: 6.4.3.c)
lines 3, 4 ) 2.3, 3.1, Ver 1.2)

Software Units • P
• P (Ref: TS 3.1
5.4.6 Tests definition (Ref: 5.3, 5.5, 11.8, (Ref: 7.4.6.1,
Criteria (Ref: 5.3.7.5) (Ref: 3. Table A-7) ReqM 1.4
11.11) 7.4.7.1, 7.4.7.2)
Ver 1.3, 2)
• • •
Software Units •
5.4.6 (Ref: 3.6 Table A-6 (Ref: 7.4.7.1, (Ref: TS 3.1, Ver
Testing (Ref: 5.3.7.2)
lines 3, 4, 5) 7.4.7.3) 3)
• •
Executable Code
(Ref: 3.6 Table A-6 • (Ref: TS 3.1
5.4.8 Verification:
Evaluation lines 1, 2, 3, 4, 5 (Ref: Annex A-6) ReqM 1.4
Over-compliant) Ver 3)
• •
Executable Code
(Ref: 3.6 Table A-6 • (Ref: TS 3.1
5.4.8 Verification:
Documentation lines 1, 2, 3, 4, 5 (Ref: Annex A-6) ReqM 1.4
Over-compliant) Ver 3)

Adaptation data •
5.4.9 (Ref: 3.2 Table A-2
verification (Ref Ver, Val)
line 8)

5.4.9 Data Verification (Ref: 7.9.2.13) (Ref: Ver 1.3, 2, TS 3.1, PI 2.2)
P •
Development & • • •
5.4.12 (Ref: 3.5 Table A- (Ref: TS 3.1, Ver
Documentation (Ref: 5.3.7.1) (Ref: 5.3) (Ref: 7.4.6, 7.4.7)
5; 3.6 Table A-6) 1.3)

Software • •

5.4.12 Verification (Ref: 7.7.2.4, (Ref: TS 2.1, 3.1
Evaluation (Ref: 5.3.9.3)
7.7.2.6) Ver 1.3)


System Verification • (Ref: Ver 3, Val 2
5.4.12 (Ref: Part 1-7.8, I-
Evaluation (Ref: 5.3.11) ReqM 1.4
7.14, Part 2-7.7)
Ver GP 2.9)

• P (Ref: a) Ver 2
Documentation
5.4.12 b) PI GP 2.8, 2.9
Verification (Ref: 6.4.2.7) (Ref: Part 1-5.2)
c) CM 3,
GP2.6)

Verification (Ref: 3.7 Table A- •

5.4.12 Process Outputs 7 lines 1, 2, 3, 4, 5, (Ref: Annex A-7,
Verification (Ref: Ver 1.3, 2)
6, 7, 8 Over- 6.3.6, 6.4.4)
compliant)

A.4.5 Validation Process

N/A

A.4.6 Joint Review Process

See chapter 5.6 in ED-153.


NOTE: ED-109/DO-278, ED-12B/DO-178B and IEC 61508 do not define a specific process for Joint Review objectives. However, reviews
are part of their Verification process.


Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Process P P •
• P
5.6.1 implementation: (Ref: 3.9 Table A- (Ref: Part 1- (Ref: PP 2.1, 2.6
Review Milestones (Ref: 6.6.1) (Ref: 6, 8.3)
9 Line 3 partial) 6.2.1.b, I-7.18.1) PMC 1.6, 1.7)
Process P P •
implementation: • P
5.6.1 (Ref: 3.9 Table A- (Ref: Part 1- (Ref: PP 2.1, 2.6
Review (Ref: 6.6.1) (Ref: 6, 8.3)
Documentation 9 Line 3 partial) 6.2.1.b, I-7.18.1) PMC 1.6, 1.7)

Project P P
• P •
5.6.2 management (Ref: 3.9 Table A- (Ref: 7.3.2.4, Part
reviews (Ref: 6.6.2) (Ref: 4.6, 8.2.b/c) (Ref: PMC 1)
9 Line 1 partial) 1-6.2.3)

P (Ref: 7.2.2.4, •
• •
5.6.3 Technical reviews (Ref: 3.9 Table A- 7.4.1.2, 7.4.6.2, (Ref: PMC 1.6 ,
(Ref: 6.6.3) (Ref: 6, 8.3) 7.4.4.5,
9 Line 3 partial) 1.7)
Part 1-5.2.11)
Software •

5.6.3 Requirements (Ref: PMC 1.6, 1.7
Joint Review (Ref: 5.3.4.3)
CM 1.3)
Software •

5.6.3 Architecture Joint (Ref: PMC 1.6,
Review (Ref: 5.3.5.7)
1.7)
Software Detailed •

5.6.3 Design Joint (Ref: PMC 1.6,
Review (Ref: 5.3.6.8)
1.7)

5.6.3 Code Joint Review (Ref: PMC 1.6,
1.7)
Software •

5.6.3 Integration Joint (Ref: PMC 1.6,
Review (Ref: 5.3.8.6)
1.7)


A.4.7 Audit Process

See chapter 5.7 in ED-153.


NOTE: ED-109/DO-278 and ED-12B/DO-178B do not define a specific process for Audit. However, audits are part of the Software Quality Assurance process.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
P
Process (Ref: Part 1-
• P •
5.7.1 implementation: 6.2.1.k, 7.8.2.2,
(Ref: 6.8.1) (Ref: 8.2.d, 11.19) (Ref: GP 2.7, 2.9)
Audit Milestones Part 1-7.7.2.1.c, I-
7.15.2.3)
P
Process
(Ref: Part 1-
implementation: • P •
5.7.1 6.2.1.k, 7.8.2.2,
Audit (Ref: 6.8.1) (Ref: 8.2.d, 11.19) (Ref: .GP 2.7, 2.9)
Part 1-7.7.2.1.c, I-
Documentation
7.15.2.3)

(Ref: CM 3.2
a) TS 3.1, Ver 2
b) Ver 1.1, Ver 2
c) Ver 1.3, Ver 2
• P P d) Ver 2, 3, PMC 2,
5.7.2 Software Audit
(Ref: 6.8.2) (Ref: 8.2.d) (Ref: 6.2.3.e) CM 2.1, 3.2
e) Ver 3, PMC 2,
CM 2.1, 3.2
f) TS 3.2, Ver 2
g) GP 2.9
h) PMC 1.1)


(Ref: CM 3.2
a) TS 3.1, Ver 2
b) Ver 1.1, Ver 2
c) Ver 1.3, Ver 2
• P P d) Ver 2, 3, PMC 2,
5.7.3 Software Audit
(Ref: 6.8.2) (Ref: 8.2.d) (Ref: 6.2.3.e) CM 2.1, 3.2
e) Ver 3, PMC 2,
CM 2.1, 3.2
f) TS 3.2, Ver 2
g) GP 2.9
h) PMC 1.1)

(Ref: CM 3.2
a) TS 3.1, Ver 2
b) Ver 1.1, Ver 2
c) Ver 1.3, Ver 2
• P P d) Ver 2, 3, PMC 2,
5.7.4 Software Audit
(Ref: 6.8.2) (Ref: 8.2.d) (Ref: 6.2.3.e) CM 2.1, 3.2
e) Ver 3, PMC 2,
CM 2.1, 3.2
f) TS 3.2, Ver 2
g) GP 2.9
h) PMC 1.1)

(Ref: CM 3.2
a) TS 3.1, Ver 2
b) Ver 1.1, Ver 2
c) Ver 1.3, Ver 2
• P P d) Ver 2, 3, PMC 2,
5.7.5 Software Audit
(Ref: 6.8.2) (Ref: 8.2.d) (Ref: 6.2.3.e) CM 2.1, 3.2
e) Ver 3, PMC 2,
CM 2.1, 3.2
f) TS 3.2, Ver 2
g) GP 2.9
h) PMC 1.1)


A.4.8 Problem Resolution Process

See chapter 5.8 in ED-153.


Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Problem Report
Configuration • •
• • •
5.2.3 Management: (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
(Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
Change Request line 3) 6.2.3.e)
Procedures
Problem Report
• •
Configuration • • •
5.2.3 (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
Management: (Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
line 3) 6.2.3.e)
Audit Trail
Problem Report
Configuration • •
• • •
5.2.3 Management: (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
(Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
Access to Safety- line 3) 6.2.3.e)
Related Functions
• • •
Process • •
5.8.1 (Ref: 3.8 Table A-8 (Ref: 7.8, Part 1- (Ref: PMC GP
implementation (Ref: 6.9.1) (Ref: 7, 11.17)
line 3) 6.2.1, I-7.16) 2.2,3.1)
• •
• • •
5.8.2 Problem resolution (Ref: 3.8 Table A-8 (Ref: PMC 2
(Ref: 6.9.2) (Ref: 7.2, 11.17) (Ref: 7.8.2)
line 3) CAR)
Problem & P •

5.8.3 Modification (Ref: 7.8, Part 1- (Ref: ReqM 1.3
(Ref: 5.5.2 ; 6.9.1)
Analysis 7.8, I-7.15) CM 2)
Problem Report • •
• • •
5.8.4 Configuration (Ref: 3.8 Table A-8 (Ref: 6.2.3.d,
(Ref: 6.2.3) (Ref: 7.2.3=>7.2.5) (Ref: CM 2, 3)
Management line 3) 6.2.3.e)

NOTE: These process objectives are part of Software Configuration Management for ED-109/DO-278 and ED-12B/DO-178B.


A.5 ORGANISATIONAL LIFECYCLE PROCESSES

See chapter 6 in ED-153.

A.5.1 Management Process

See chapter 6.1 in ED-153.


NOTE: ED-109/DO-278, ED-12B/DO-178B and IEC 61508 do not provide generic requirements for management. That is why all requirements concerning management are referenced in the related process; for example, the planning objectives of the supplier are referenced in Supplier Process – Planning (cf. 3.4).
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI

Management
(Ref: a) GP 2.2,
Process •
6.1.1 3.1
Initiation & scope (Ref: 7.1.1)
PP2.4
definition
b) PP 3.2)

(Ref: all) PP1, 2
GP 2.2, 3.1
b) PP 2.1
c) PP 1.4
d) PP 2.4, 2.5
Management e) GP 2.3, 2.4
Process • f) GP 2.4
6.1.2
Planning: (Ref: 7.1.2) g) PP 2.2
Preparation RskM 2.2
h) M&A 1
PMC GP 2.2, 3.1
Ver GP 2.2, 3.1
i) PP 1.4
j) PP 2.4
GP 2.3)


(Ref: all) PP1, 2
GP 2.2, 3.1
b) PP 2.1
c) PP 1.4
d) PP 2.4, 2.5
e) GP 2.3, 2.4
Management
• f) GP 2.4
6.1.2 Process
(Ref: 7.1.2) g) PP 2.2
Planning: Scope
RskM 2.2
h) M&A 1
PMC GP 2.2, 3.1
Ver GP 2.2, 3.1
i) PP 1.4
j) PP 2.4
GP 2.3)

(Ref: all) PP1, 2
GP 2.2, 3.1
b) PP 2.1
c) PP 1.4
d) PP 2.4, 2.5
e) GP 2.3, 2.4
Management
• f) GP 2.4
6.1.2 Process
(Ref: 7.1.2) g) PP 2.2
Planning: Content
RskM 2.2
h) M&A 1
PMC GP 2.2, 3.1
Ver GP 2.2, 3.1
i) PP 1.4
j) PP 2.4
GP 2.3)


Management
(Ref: a) PMC
Process •
6.1.3 GP 2.8
Execution & (Ref: 7.1.3)
b) PMC 1
control: Initiation
c) PMC 2)

Management
(Ref: a) PMC
Process •
6.1.3 GP 2.8
Execution & (Ref: 7.1.3)
b) PMC 1
control: Monitoring
c) PMC 2)

Management
(Ref: a) PMC
Process •
6.1.3 GP 2.8
Execution & (Ref: 7.1.3)
b) PMC 1
control: Problems
c) PMC 2)
Management
Process •

6.1.4 Review & (Ref: a) ReqM 1.5
(Ref: 7.1.4)
evaluation: b) PMC)
Lifecycle data
Management
Process •

6.1.4 Review & (Ref: a) ReqM 1.5
(Ref: 7.1.4)
evaluation: b) PMC)
Assessment

Management (Ref: a) IPM 1.3
Process • b) PMC 1.1, 1.6,
6.1.5
Closure: (Ref: 7.1.5) 1.7,
Completeness GP 2.8
c) PMC 1.4)


(Ref: a) IPM 1.3
Management
• b) PMC 1.1, 1.6,
6.1.5 Process
(Ref: 7.1.5) 1.7,
Closure: Archive
GP 2.8
c) PMC 1.4)

A.5.2 Infrastructure Process

See chapter 6.2 in ED-153.


Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI

Infrastructure P
• • P (Ref: a) PP 2.4
6.2.1 Process (Ref: 3.1 Table A-1
(Ref: 7.2.1) (Ref: 4.4, 11.2) (Ref: 6.2.3.c, 7.4.4) GP 2.3
implementation line 3 partial)
b) PP 2)
P • •
Establishment of • P
6.2.2 (Ref: 3.8 Table A-8 (Ref: 4.4, 7.2.9, (Ref: CM 1.1, CM
the infrastructure (Ref: 7.2.2) (Ref: 8.3)
line 6 partial) 11.15) GP 2.2, 3.1)
Maintenance of the
• •
infrastructure: • • P
6.2.3 (Ref: 3.8 Table A-8 (Ref: PMC 1.1
Requirement (Ref: 7.2.3) (Ref: 7.2.9) (Ref: 6.2.3.c)
line 6 CM)
satisfaction
Maintenance of the
• •
infrastructure: • • P
6.2.3 (Ref: 3.8 Table A-8 (Ref: PMC 1.1
Configuration (Ref: 7.2.3) (Ref: 7.2.9) (Ref: 6.2.3.c)
line 6 CM)
Management


A.5.3 Improvement Process

See chapter 6.3 in ED-153.


NOTE: ED-109/DO-278, ED-12B/DO-178B and IEC 61508 do not cover this process.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI

Improvement
• (Ref: OPD 1.3, 2.1,
6.3.1 Process
(Ref: 7.3.1) OPD GP 2.6
implementation
OPF )
Improvement •

6.3.2 Process (Ref: OPF 1.2
(Ref: 7.3.2)
assessment )

Process •
6.3.3 (Ref: OPF 1.3, 2.1,
improvement (Ref: 7.3.3)
2.2)
A.5.4 Training Process

See chapter 6.4 in ED-153.


NOTE: ED-109/DO-278 and ED-12B/DO-178B do not cover this process.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI

• (Ref: PP 2.5
Training Process •
6.4.1 (Ref: Part 1-6.2.1, PMC 1.1
implementation (Ref: 7.4.1)
I-Annex B) GP 2.5
OT 1.1, 1.3)
Training material • •
6.4.2
development (Ref: 7.4.2) (Ref: OT 1.4)
• •
Training plan •
6.4.3 (Ref: Part 1-6.2.2, (Ref: PP 2.5
implementation (Ref: 7.4.3)
I-Annex B) OT 2.1, 2.2)


A.6 ADDITIONAL ANS SOFTWARE LIFECYCLE OBJECTIVES

See chapter 7 in ED-153.

A.6.1 Software Development Environment

This section is addressed by objectives 4.3.11 and 4.3.19. It describes the Infrastructure Process (which is generic) for the software lifecycle environment as it concerns the Development Process.

A.6.2 Commercial Off The Shelf (COTS) Considerations

See chapter 7.2 in ED-153.

A.6.2.1 COTS Specific Objectives

The following objectives should be satisfied in addition to the objectives contained in this document for non-COTS software.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
• •
7.2.1 COTS Plans (Ref: 4.1.9 Table (Ref: SAM GP 2.2,
4-1 line 1) 3.1)


COTS Transition (Ref: SAM 2.1
7.2.2 (Ref: 4.1.9 Table
Criteria TS 2.4
4-1 line 3)
IPM 1.3)

COTS Plans •
7.2.3 (Ref: 4.1.9 Table
Consistency (Ref: SAM 2.1, 2.4)
4-1 line 2)
COTS • •
7.2.4 Requirements (Ref: 4.1.9 Table (Ref: SAM 2.1, TS
Coverage 4-2 line 2) 2.4)
• •
COTS Lifecycle
7.2.5 (Ref: 4.1.9 Table (Ref: SAM 2.1, TS
Data
4-2 line 1) 2.4)
• •
COTS Derived
7.2.6 (Ref: 4.1.9 Table (Ref: SAM 2.1, TS
Requirements
4-2 line 4) 2.4)

• •
COTS HW
7.2.7 (Ref: 4.1.9 Table (Ref: TS 2.4, RD
Compatibility
4-2 line 3) 2.2, 3.4)
COTS

Configuration •
7.2.8 (Ref: 4.1.9 Table
Management: (Ref: CM 1.2)
4-3 line 1)
Identification
• •
COTS Problem
7.2.9 (Ref: 4.1.9 Table (Ref: CM 1.1, SAM
Reporting
4-3 line 4) GP 2.6)
• •
COTS
7.2.10 (Ref: 4.1.9 Table (Ref: SAM 2.3
Incorporation
4-3 line 2) CM 2.1)
COTS

Configuration •
7.2.11 (Ref: 4.1.9 Table
Management: (Ref: CM 2.2)
4-3 line 3)
Archiving
*: COTS derived requirements are defined in Chapter 5, Section 3.5.2.c.

A.6.3 Tool Qualification

See chapter 7.3 in ED-153.


Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Software P
P • •
7.3.1 Development Tool (Ref: Part
(Ref: 5.2) (Ref: 12.2, 12.2.1) (Ref: Ver 1.2)
Qualification 3-7.7.2.7)
Software P
P •
7.3.2 Verification Tool (Ref: Part
(Ref: 5.2) (Ref: 12.2)
Qualification 3-7.7.2.7)


A.7 ED-109/DO-278 OMISSIONS

The purpose of this paragraph is to highlight the aspects of ED-153 which are not
covered by ED-109/DO-278.
ED-109/DO-278 mainly addresses the safety aspects of software during its development.
The major ED-109/DO-278 missing items are as follows:
- There is no reference to a system safety assessment standard;
- This standard does not provide guidance on allocating Assurance Levels;
- HMI specifics are not covered;
- The Documentation process is only partially covered;
- The lifecycle is restricted to the SW development phase.
The major ED-12B/DO-178B missing items also apply to ED-109/DO-278:
- Only a part of the safety lifecycle is defined by ED-12B/DO-178B (the part concerned with the development of software). No requirements are set concerning acquisition, supply, installation, acceptance, maintenance, operation or decommissioning;
- Scheduling of lifecycle activities;
- Integration of the software product into the system on site;
- Software integration testing is not defined concurrently with the design/development phases;
- Requirements for choosing a programming language;
- Staff training and staff competence;
- Capacity for safe modifications (a margin for throughput (e.g. Input and Output (I/O) rate or Central Processing Unit (CPU) load) and memory usage);
- Software self-monitoring of control flow and data flow;
- Some techniques and methods to verify the outputs of the different development phases;
- Project risk management;
- Use of a Configuration Management tool;
- Tool selection criteria;
- Process improvement.


ANNEX B

ROLES AND RESPONSIBILITIES SCENARIOS


This annex provides example scenarios of how roles and responsibilities may be allocated between the ANSP and the supplier, per objective. It represents neither a compliance matrix with ED-153 nor a prescribed structure for any organisation.

B.1 FIRST SCENARIO – CONTRACTED MAJOR SYSTEM PROJECT

In this scenario, the ANSP awards a manufacturer a contract for the equipment part of a system, i.e. a set of equipment (HW and SW) that together fulfils the requirements allocated by the ANSP to this system. Some of the requirements may even address procedural aspects of operating and maintaining the system (e.g. reset, swap, switch to degraded mode, reversion to normal operation).
Therefore, the Manufacturer writes the system requirements derived from the ANSP operational requirements. Consequently, the Manufacturer also writes the SW requirements.
The System Manufacturer has to complete the risk assessment and mitigation process (i.e. contribute to completing the PSSA and perform the SSA for its part of the system).
For example, the ANSP allocates a SWAL globally to the overall system. The System Manufacturer proposes a refinement of the SWAL allocation to each SW item of the system: a SWAL may only be degraded to a less demanding SWAL by demonstrating SW isolation and its contribution to hazards and their effects.
L: Lead; C: Contribute; P: Perform; A: Accept

Objective N° | ANSP | SW Manufacturer | Comments
§2.1 L C
3.0.1 L C The System Manufacturer may also set up its own SASS for the part it is in charge of.
3.0.2 A L The SW requirements are derived by the manufacturer from
the ANSP system requirements.
3.0.3 A L
3.0.4 A L
3.0.5 L C The ANSP allocates a SWAL globally to the overall system. The System Manufacturer proposes a refinement of the SWAL allocation to each SW item of the system: a SWAL may only be degraded to a less demanding SWAL by demonstrating SW isolation and its contribution to hazards and their effects.
3.0.6 A L
3.0.7 A L During the SW development phase.
3.0.7 L During the operation & maintenance phase.
3.0.8 L C
3.0.9 N/A N/A As per ED-153.
3.0.10 C L
3.0.11 L
3.0.12 A L

3.0.13 A L If applicable.
3.0.14 L
3.0.15 L C
3.0.16 L
3.0.17 L C
3.X.Y (except 3.2.X, 3.1.5, 3.3.3) C/A L Objectives 3.1.5 and 3.3.3 are exceptions as they relate to an FHA, which is led by the ANSP.
3.2.X C/A L
4.1.X L
4.2.X L
4.3.X C/A L
4.4.X L C SW Operations procedure may be written by the SW
manufacturer.
4.5.X L C SW Maintenance procedure may be written by the SW
manufacturer.
5.X.Y A L Configuration Management (5.2): when the SW is handed over to the ANSP for operation, the equivalent Configuration Management shall be set up by the ANSP.
Audit (5.7): the ANSP may either participate in the SW manufacturer's "internal" audits or conduct additional "external" audits.
Problem/Change Resolution (5.8)
6.X.Y A L
7.1.X A L
7.2.X A L If applicable.

B.2 SECOND SCENARIO – CONTRACTED SW

In this scenario, the ANSP awards a manufacturer a contract for one piece of equipment (HW and SW) that fulfils the requirements allocated by the ANSP to this SW.
Therefore, the ANSP writes the system requirements. The Manufacturer writes the SW requirements derived from the system requirements.
The SW manufacturer has to complete the risk assessment and mitigation process (i.e. contribute to completing the SSA for this part of the SW).
For example, the ANSP allocates the SWAL to the SW. The only action bearing on the SW Manufacturer with regard to the SWAL consists of ensuring that the SSA does not identify failures that would lead to re-assessing the risk assessment and mitigation results.
L: Lead; C: Contribute; A: Accept


Objective N° | ANSP | SW Manufacturer | Comments
§2.1 L C The only action bearing on the SW Manufacturer with regard to the SWAL consists of ensuring that the SSA does not identify failures that would lead to re-assessing the risk assessment and mitigation results.
3.0.1 L C The SW Manufacturer may also set up its own SASS for the part it is in charge of.
3.0.2 L C The SW requirements are derived by the manufacturer from
the ANSP system requirements.
3.0.3 A L
3.0.4 A L
3.0.5 L
3.0.6 A L
3.0.7 A L During the SW development phase.
3.0.7 L During the operation & maintenance phase.
3.0.8 L C The SW Manufacturer contributes, e.g. through the SSA, by ensuring that no failure can lead to re-assessing the risk assessment and mitigation results.
3.0.9 N/A N/A As per ED-153.
3.0.10 C L SW manufacturer leads the SW development part.
3.0.11 L
3.0.12 A L SW manufacturer leads it during the SW development part.
3.0.13 A L If applicable.
3.0.14 L C
3.0.15 L C
3.0.16 L
3.0.17 L C
3.1.X L C
3.X.Y C/A L
4.1.X L
4.2.X L
4.3.1 L
4.3.2 L
4.3.X C/A L
4.4.X L C SW Operations procedure may be written by the SW
manufacturer.
4.5.X L C SW Maintenance procedure may be written by the SW
manufacturer.
5.X.Y A L Configuration Management (5.2): when the SW is handed over to the ANSP for operation, the equivalent Configuration Management shall be set up by the ANSP.

Audit (5.7): the ANSP may either participate in the SW manufacturer's "internal" audits or conduct additional "external" audits.
Problem/Change Resolution (5.8)
6.X.Y A L
7.1.X A L
7.2.X A L If applicable.

B.3 THIRD SCENARIO – ANSP INTERNAL DEVELOPMENT

In this scenario, the ANSP performs the development internally. Various ANSP units/divisions may be involved in specifying, developing, integrating, verifying, etc. the SW.
Therefore, the ANSP has to satisfy all SWAL objectives.
L: Lead
Objective N° ANSP Comments
§2.1 L The system team is in charge of allocating the SWAL.
3.0.1 L The ANSP has to set-up its SASS for the overall SW lifecycle.
3.0.2 L
3.0.3 L
3.0.4 L
3.0.5 L
3.0.6 L
3.0.7 L
3.0.8 L
3.0.9 N/A As per ED-153.
3.0.10 L SW development unit leads the SW development part.
3.0.11 L
3.0.12 L SW development unit leads it during the SW development part.
3.0.13 L If applicable.
3.0.14 L
3.0.15 L
3.0.16 L
3.0.17 L
3.1.X L The system team/unit.
3.X.Y L The SW development unit/team.
4.1.X L The unit/team in charge of specifying the needs.
4.2.X L The unit/team in charge of SW development.
4.3.1 L The System unit/team.


4.3.2 L The System unit/team.


4.3.X L The SW development unit
4.4.X L SW Operations procedure may be written by the Ops unit.
4.5.X L SW Maintenance procedure may be written by the maintenance
unit/team.
5.X.Y L An independent unit (e.g. the safety management team) may conduct the internal audits.
6.X.Y L
7.1.X L
7.2.X L If applicable.


ANNEX C

TRACEABILITY WITH ESARR6


This table proposes a traceability between ESARR6 and ED-153 objectives and sections. This traceability table has been set up by WG64 and has not been officially reviewed or approved by the Eurocontrol Safety Regulation Commission (SRC).
NOTE: “X” means that it applies to all numbers in that set, e.g. 5.4.X means all objectives within 5.4.
ESARR 6 | Requirement | ED-153 Ref. (Objectives N°)
1 General Safety Requirements
1.1 Within the framework of its SMS, and as part of its risk assessment and mitigation activities, the ATM service-provider shall define and implement a Software Safety Assurance System to deal specifically with software related aspects, including all on-line software operational changes (such as cutover / hot swapping). ED-153 Ref.: 3.0.1, 3.0.15
1.2 The ATM service-provider shall ensure, as a minimum, within its Software Safety Assurance System, that:
a) The software requirements correctly state what is required by the software, in order to meet safety objectives and requirements, as identified by the risk assessment and mitigation process. ED-153 Ref.: 3.0.2, 3.0.15, 3.1.5, 3.3.1, 3.3.2, 3.3.3, 3.3.4, 3.4.2, 3.4.1, 3.4.3, 4.1.2, 4.1.3, 4.3.1, 4.3.2, 4.3.4, 4.3.5, 4.3.9, 4.3.10, 4.3.11, 4.3.12, 4.3.13, 4.3.14, 4.3.20, 7.2.4; Section 3.6.3
b) Traceability is addressed in respect of all software requirements. ED-153 Ref.: 3.0.3, 3.0.15, 3.3.4, 3.4.1, 3.4.2, 4.3.15, 5.4.10
c) The software implementation contains no functions which adversely affect safety. ED-153 Ref.: 3.0.4, 3.0.15, 3.3.2, 3.3.3, 3.3.4, 4.4.2, 5.4.3, 5.8.3
d) The ATM software satisfies its requirements with a level of confidence which is consistent with the criticality of the software. ED-153 Ref.: 3.0.6, 3.0.15, 3.4.4, 4.1.7, 4.2.7, 4.3.13, 4.3.14, 4.4.5, 4.5.3, 5.4.X
e) Assurances that the above General Safety Requirements are satisfied, and the arguments to demonstrate the required assurance, are at all times derived from:
i. a known executable version of the software (ED-153 Ref.: 3.0.7, 3.0.15, 5.2.X, 5.4.13),
ii. a known range of configuration data (ED-153 Ref.: 5.4.9, 3.0.7, 3.0.15, 5.2.X, 5.4.13), and
iii. a known set of software products and descriptions (including specifications) that have been used in the production of that version (ED-153 Ref.: 3.0.7, 3.0.15, 5.2.X, 5.4.13).
1.3 The ATM service-provider shall provide the required assurances, to the Designated Authority, that the requirements in section 1.2 above have been satisfied. ED-153 Ref.: 3.0.8, 3.0.3, 3.0.6, 3.0.7, 3.0.15, 3.0.16, 3.0.17, 3.4.3, 3.0.10, 3.0.11, 4.2.6, 4.4.5, 5.3.2, 5.3.1, 5.3.3, 5.6.1, 5.6.2, 5.6.3, 5.7.X, 6.1.3
2 The ATM service provider shall ensure, as a minimum, that the Software Safety Assurance System:
2.1 Is documented, specifically as part of the overall Risk Assessment and Mitigation Documentation. ED-153 Ref.: 3.2.1, 3.0.1, 3.2.2, 3.2.3, 3.2.4, 3.5.1, 3.5.2, 3.5.3, 4.1.4, 4.2.4, 5.1.1, 5.1.2, 5.1.3, 5.1.4
2.2 Allocates software assurance levels to all operational ATM software. ED-153 Ref.: 3.0.5, 3.0.11, 4.1.3, 4.4.5, 4.5.2
2.3 Includes assurances of:
a) software requirements validity (ED-153 Ref.: 3.0.12, 3.0.2, 4.3.4, 4.3.9, 4.3.10, 4.3.11, 4.3.12, 4.3.13, 4.3.14, 5.4.3, 5.4.5, 5.4.6, 5.4.9, 5.7.2, 5.4.11)
b) software verification (ED-153 Ref.: 4.1.6, 4.1.7, 4.2.6, 4.2.7, 4.3.16, 3.0.6, 5.3.X, 5.4.X, 5.6.X, 5.7.3, 5.7.X, 5.7.4)
c) software configuration management (ED-153 Ref.: 5.2.1, 3.0.7, 5.2.X, 5.4.13), and
d) software requirements traceability (ED-153 Ref.: 5.4.10, 3.0.3, 4.3.15, 5.2.10)
2.4 Determines the rigour to which the assurances are established. The rigour shall be defined for each software assurance level, and shall increase as the software increases in criticality. For this purpose:
a) the variation in rigour of the assurances per software assurance level shall include the following criteria: i. required to be achieved with independence, ii. required to be achieved, iii. not required (ED-153 Ref.: 3.0.9, 4.1.4, 3.0.10; all tables of the document)
b) the assurances corresponding to each software assurance level shall give sufficient confidence that the ATM software can be operated tolerably safely (ED-153 Ref.: 3.0.8, 3.0.11, 4.2.5, 4.2.6, 4.2.7, 4.1.4, 4.4.5, 4.5.2, 4.5.3)

2.5 Uses feedback of ATM software experience to confirm that the Software Safety Assurance System and the assignment of assurance levels are appropriate. For this purpose, the effects resulting from any software malfunction or failure from ATM operational experience… ED-153 Ref.: 3.0.11, 4.4.X, 4.5.2, 4.5.3, 5.8.X

2.6 Provides the same level of confidence, through any means chosen and agreed with the Designated Authority, for the developmental and non-developmental ATM software (e.g. COTS - commercial off-the-shelf software, etc.) with the same software assurance level. ED-153 Ref.: 3.0.13, 7.2.1, 7.2.2, 7.2.3, 7.2.4, 7.2.5, 7.2.6, 7.2.7, 7.2.8, 7.2.9, 7.2.10, 7.2.11; Section 7.2

The software assurance level relates to the rigour of the


The ATM software assurances to the criticality of ATM software by
service- 3.1 using ESARR 4 severity classification scheme combined with Section 3.6
provider the likelihood of a certain adverse effect to occur. A minimum
shall of four software
ensure, as An allocated software assurance level shall be
a minimum, commensurate with the most adverse effect that software
3
within the 3.2 malfunctions or failures may cause, as per ESARR 4. This Section 3.6
Software shall also take into account the risks associated with
Safety software malfunctions or failures and th
Assurance ATM software components that cannot be shown to be
System, independent of one another shall be allocated the software 3.0.14
that: 3.3
assurance level of the most critical of the dependant Section 3.6
components.

© EUROCAE, 2009
190

ED-153 Ref.
ESARR 6 Requirement
Objectives N°
The ATM
service- Specify the functional behaviour (nominal and downgraded
4.3.4, 3.0.2, 3.1.1, 3.3.4, 4.3.9,
provider modes) of the ATM software, timing performances, capacity,
4.3.5, 4.3.6, 4.3.10, 4.3.11,
shall 4.1 accuracy, software resource usage on the target hardware,
5.4.3, 5.4.5, 5.4.6, 4.3.12, 5.4.9,
ensure, as robustness to abnormal operating conditions and overload
4.3.13, 4.3.14, 4.3.20
a minimum tolerance, as appropriat
within the
4 Software
Safety 4.3.4, 4.3.5, 4.3.6, 5.2.5, 3.0.2,
Assurance 3.3.4, 3.4.X, 4.3.9, 4.3.10,
System, Are complete and correct, and are also compliant with the
4.2 4.3.11, 4.3.12, 4.3.13, 4.3.14,
that system safety requirements
4.3.20, 5.4.3, 5.4.5, 5.4.6, 5.4.8,
software 5.4.9, 6.1.4, 6.1.5
requirement
s:
The ATM The functional behaviour of the ATM software, timing
service- performances, capacity, accuracy, software resource usage
4.4.5, 3.0.6, 5.4.2, 5.4.3, 5.3.X,
5 provider 5.1 on the target hardware, robustness to abnormal operating
5.6.X, 5.7.1, 5.7.2
shall conditions and overload, comply with the software
ensure, as requirements
a minimum,
within the
Software
Safety
Assurance
4.1.7, 4.2.7, 3.0.6, 3.4.X, 5.2.3,
System, The ATM software is adequately verified by analysis and / or
5.2.5, 5.4.1, 5.4.2, 5.4.3, 5.4.4,
that: 5.2 testing and / or equivalent means, as agreed with Designated
5.4.5, 5.4.6, 5.4.7, 5.4.8, 5.4.9,
Authority.
5.4.10, 5.4.11, 5.6.X, 5.7.X

© EUROCAE, 2009
191

ED-153 Ref.
ESARR 6 Requirement
Objectives N°

4.2.6, 5.4.2, 5.4.3, 5.4.4, 5.4.5,


5.4.6, 5.4.7, 5.4.8, 5.4.9, 5.4.10,
5.3 The verification of the ATM software is correct and complete.
5.4.11, 5.4.12, 5.4.13, 5.5.7,
6.1.4, 6.1.5, 4.2.7, 3.0.6, 3.4.X

Configuration identification, traceability and status accounting


3.5.2, 3.0.7, 5.2.1, 5.2.2, 5.2.3,
exist such that the software lifecycle data can be shown to be
The ATM 6.1 5.2.4, 5.2.7, 5.2.8, 5.2.9, 5.2.10,
under configuration control throughout the ATM software
service- 5.8.4
lifecycle.
provider
shall
ensure, as
a minimum,
6
within the
Software Problem reporting, tracking and corrective actions exist such 4.2.5, 4.1.6, 4.4.5, 4.5.1, 3.0.11,
Safety 6.2 that safety related problems associated with the software can 3.0.12, 4.5.2, 4.5.3, 4.5.4, 5.8.1,
Assurance be shown to have been mitigated. 5.8.2, 5.8.3, 5.8.4
System,
that:

Retrieval and release procedures exist such that the software


6.3 lifecycle data can be regenerated and delivered throughout 5.2.6, 5.4.13
the ATM software lifecycle.
The ATM Each software requirement is traced to the same level of
7 7.1 4.3.15, 3.0.3, 5.4.10
service- design at which its satisfaction is demonstrated.

© EUROCAE, 2009
192

ED-153 Ref.
ESARR 6 Requirement
Objectives N°
provider
shall
ensure, as
a minimum,
Each software requirement, at each level in the design at
within the
7.2 which its satisfaction is demonstrated, is traced to a system 4.3.15, 3.0.3, 5.4.10
Software
requirement.
Safety
Assurance
System,
that:
This safety regulatory requirement shall apply to civil and
military ATM service who have the responsibility for the
8.1 management of safety in ground-based ATM systems and Section 1.2
other ground-based supporting services (including CNS)
under their managerial control
8 The software safety assurance system already existing for
ATM systems under the direct managerial control of the
8.2 N/A
military ATM organisation can be accepted, provided it
accords with the obligatory provisions of ESARR 6.
The obligatory provisions of this ESARR shall be enacted as
8.3 N/A
minimum national safety regulatory requirements.
The provisions of ESARR 6 are to become effective within
9 9.1 three years from the date of its approval by the N/A
EUROCONTROL Commission
Use of the terms and definitions listed in Appendix A shall be
11 11.1 Sections 1.6 and 1.7
considered mandatory.
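Objectives 3.1 to 3.3 above describe how a software assurance level is allocated: the ESARR 4 severity class of the most adverse effect, combined with the likelihood of that effect, yields one of at least four levels, and components that cannot be shown to be independent all inherit the most critical level of their group. The sketch below illustrates that logic only; the mapping table, component names and function names are hypothetical assumptions made for the example, not values taken from ED-153 or ESARR 6.

```python
# Hypothetical sketch of SWAL allocation per ESARR 6 objectives 3.1-3.3.
# The ALLOCATION table below is an assumption made for illustration; real
# allocations follow the provider's approved risk assessment, not this code.

# ESARR 4 severity class 1 (most severe) .. 4, combined with the likelihood
# of the adverse effect, yields an assurance level 1..4 (SWAL 1 = most
# critical), i.e. the minimum of four levels required by objective 3.1.
ALLOCATION = {
    (1, "probable"): 1, (1, "remote"): 1, (1, "extremely remote"): 2,
    (2, "probable"): 1, (2, "remote"): 2, (2, "extremely remote"): 3,
    (3, "probable"): 2, (3, "remote"): 3, (3, "extremely remote"): 4,
    (4, "probable"): 3, (4, "remote"): 4, (4, "extremely remote"): 4,
}

def allocate_swal(severity: int, likelihood: str) -> int:
    """Objective 3.2: the level is commensurate with the most adverse
    effect that software malfunctions or failures may cause."""
    return ALLOCATION[(severity, likelihood)]

def propagate_dependencies(levels: dict, dependency_groups: list) -> dict:
    """Objective 3.3: components that cannot be shown to be independent of
    one another all receive the SWAL of the most critical member."""
    result = dict(levels)
    for group in dependency_groups:
        most_critical = min(result[c] for c in group)  # SWAL 1 = most critical
        for c in group:
            result[c] = most_critical
    return result

levels = {"FDP": allocate_swal(1, "remote"),    # -> SWAL 1
          "HMI": allocate_swal(3, "probable"),  # -> SWAL 2
          "REC": allocate_swal(4, "remote")}    # -> SWAL 4
# HMI and REC share a platform and cannot be shown to be independent:
print(propagate_dependencies(levels, [["HMI", "REC"]]))
# -> {'FDP': 1, 'HMI': 2, 'REC': 2}
```

Note that `min` is used because, in this sketch, level 1 denotes the most critical assurance level, matching the convention in the matrix above.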

© EUROCAE, 2009
193
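Objectives 7.1 and 7.2 in the matrix above amount to a two-way traceability check: every software requirement must trace down to the design element at which its satisfaction is demonstrated, and up to a system requirement. A minimal sketch of such a check follows; the dict-based data model and the SWR-*/DES-*/SYS-* identifiers are assumptions made for illustration, not artefacts of ED-153.

```python
# Hypothetical sketch of the bidirectional traceability checks behind
# ESARR 6 objectives 7.1 and 7.2. Real projects keep these trace links in
# a requirements-management tool; plain dicts stand in for them here.

def check_traceability(sw_to_design: dict, sw_to_system: dict, sw_reqs: set):
    """Return the software requirements that break either trace direction.

    7.1: each software requirement traces down to the design element at
         which its satisfaction is demonstrated.
    7.2: each software requirement traces up to a system requirement.
    """
    missing_design = {r for r in sw_reqs if not sw_to_design.get(r)}
    missing_system = {r for r in sw_reqs if not sw_to_system.get(r)}
    return missing_design, missing_system

sw_reqs = {"SWR-1", "SWR-2", "SWR-3"}
down = {"SWR-1": ["DES-10"], "SWR-2": ["DES-11", "DES-12"]}  # SWR-3 untraced
up = {"SWR-1": ["SYS-1"], "SWR-2": ["SYS-1"], "SWR-3": ["SYS-2"]}
no_design, no_system = check_traceability(down, up, sw_reqs)
print(sorted(no_design), sorted(no_system))  # -> ['SWR-3'] []
```

In practice the check would be run over the full requirements baseline at each lifecycle milestone, and any requirement appearing in either result set would raise a problem report under objective 6.2.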

APPENDIX I

WG-64 MEMBERSHIP

Membership of EUROCAE Working Group 64


ATM Risk Assessment

FIRST NAME SECOND NAME COMPANY


Chairman:
Patrick MANA EUROCONTROL
Secretary:
James HANSON HELIOS

Members:
FIRST NAME SECOND NAME COMPANY
Vincent BEHE SKYGUIDE
Amada BERNALDEZ DE ARANZABAL INDRA
Christophe BERTHELÉ DSNA (MSQS)
David BOWEN EUROCAE
Yann CARLIER DSNA
Mark CATTERSON NATS
Florin CIORAN EUROCONTROL
Nathalie CORBOVIANU DSNA (DTI)
Andrew EATON UK CAA (SRG)
Corinna GINGINS SKYGUIDE
Jules HERMENS DUTCH CAA (IVW)
Francisco LOPEZ AENA
Cécile MOURA DSNA (MSQS)
Gabrielle PARIZE DSNA (DTI)
Bernard PAULY THALES
John PENNY UK CAA (SRG)
Paula SANTOS NAV PORTUGAL
Vincent SCHIFFLERS EUROCONTROL
Rob WEAVER NATS
John SPRIGGS NATS


APPENDIX II

HISTORY AND TERMS OF REFERENCE OF WG-64

Introduction
The SWAL CS group was established as a separate group within WG64 to respond to
the European Commission Mandate to CEN/CENELEC/ETSI for the development of
European Standards (first set of Community Specifications) for interoperability of the
European Air Traffic Management Network (EATMN).
This mandate (M/390) in particular specified that “The elaboration of the standard
must be undertaken in cooperation with EUROCAE, particularly taking into account
the technical expertise of EUROCAE on equipments (systems and constituents) for air
traffic management.”
The terms of reference for the WG-64 SWAL CS group (see below) are therefore
taken directly from the “Description of the Mandate Work” in section 2 of M/390.
Terms of Reference
CEN/CENELEC/ETSI are asked to produce European standards that satisfy the
essential requirements and/or implementing rules of the interoperability Regulation for
systems, together with the relevant procedures, or constituents for the following
agreed priority 1 Community specifications:
Purpose and scope
The Community specification on Software Assurance Levels (SWAL) is intended to
apply to software components that are part of an Air Navigation System (ANS).
It focuses only on the “ground” segment of ANS and provides a reference against
which stakeholders can assess their own practices for software specification,
design, development, operation, maintenance, evolution and decommissioning.
Recommendations on the major processes required to provide assurance for software
in Air Navigation Systems may include:
- An allocation process for Software Assurance Levels (SWAL);
- A SWAL grading policy, i.e. the identification of a policy and its rationale
  to justify and substantiate increasing stringency of the objectives to be met
  per SWAL;
- A list of objectives to be satisfied per SWAL;
- The identification of appropriate techniques to achieve these objectives.
The scope of the Community specification will cover the overall lifecycle of software
within an Air Navigation System and provide an assessment of the activities for the
development, operation, maintenance and evolution of Air Navigation System software
components.


APPENDIX III

ATTACHMENT TO THE ED-153


UK CAA /SRG:

The content of this document has not been fully supported by UK CAA/SRG with
regard to the ability of ED-153 to satisfy some aspects of EC Regulation
482/2008.

DFS:

The content of this document has not been fully supported by DFS due to the reasons described
below:


APPENDIX IV

IMPROVEMENT SUGGESTION FORM

Name: Company:

Address:

City: State, Province:

Postal Code, Country: Date:

Phone: Fax:

Email:

Document: ED-153 Sec: Page: Line:


[] Documentation error (Format, punctuation, spelling)
[] Content error
[] Enhancement or refinement

Rationale (Describe the error or justification for enhancement):

Proposed change (Attach marked-up text or proposed rewrite):

Please provide any general comments for improvement of this document:

Return completed form to:


EUROCAE
Attention: Secretary-General ED-153
102 rue Etienne Dolet
92240 – MALAKOFF
FRANCE
Email: eurocae@eurocae.net
