Guidelines for ANS Software Safety Assurance
ED-153
August 2009
© EUROCAE, 2009
FOREWORD
1. This document, prepared by EUROCAE Working Group 64 (SWAL CS), was
accepted by the Council of EUROCAE on 28th August 2009 taking note of the
concerns expressed by UK CAA and DFS, which are summarized in the
Attachment to this document (Appendix III).
2. EUROCAE is an international non-profit making organisation. Membership is
open to European manufacturers of equipment for aeronautics, trade
associations, national civil aviation administrations, users, and non-European
organisations. Its work programme is principally directed to the preparation of
performance specifications and guidance documents for civil aviation
equipment, for adoption and use at European and worldwide levels.
3. EUROCAE performance specifications are recommendations only. EUROCAE
is not an official body of the European Governments; its recommendations are
valid as statements of official policy only when adopted by a particular
government or conference of governments. The inclusion of references to
standards published by other standardisation bodies, or extracts from those
standards, does not imply any safety or regulatory liability.
4. Copies of this document may be obtained from:
EUROCAE
102, rue Etienne Dolet
92240 MALAKOFF
France
Telephone: 33 1 40 92 79 30
Fax: 33 1 46 55 62 65
E-mail: eurocae@eurocae.net
Website: www.eurocae.net
TABLE OF CONTENTS
1 INTRODUCTION.............................................................................................................. 1
1.0 Purpose ............................................................................................................ 1
1.1 Scope................................................................................................................ 2
1.2 Organisation of This Document ........................................................................ 3
1.2.0 Structure............................................................................................ 3
1.2.1 Mandating and recommending phrases............................................ 3
1.2.2 Table legend...................................................................................... 4
1.3 Target Audience ............................................................................................... 4
1.4 Readership ....................................................................................................... 5
1.5 References ....................................................................................................... 5
1.6 Glossary............................................................................................................ 6
1.7 Abbreviations .................................................................................................... 10
2 DOCUMENT STRATEGY ................................................................................................ 11
2.0 Introduction ....................................................................................................... 11
2.0.0 Relationship to EC Regulations ........................................................ 11
2.1 Strategy for SSAS:............................................................................................ 13
2.1.0 SSAS Regulatory Requirements hierarchy ....................................... 13
2.1.1 SSAS Regulatory Requirements and Objectives hierarchy .............. 14
2.1.2 SSAS Implementation ....................................................................... 16
2.1.3 Production of Argument..................................................................... 16
2.1.4 Requirement Correctness ................................................................. 16
2.1.5 Traceability ........................................................................................ 16
2.1.6 Unintended functions......................................................................... 17
2.1.7 Requirement satisfaction................................................................... 17
2.1.8 Configuration Management............................................................... 17
2.1.9 Assurance and Demonstration to NSA ............................................. 17
2.1.10 Changes to software and to specific software .................................. 17
2.2 Strategy for Software lifecycle: ......................................................................... 18
3 SOFTWARE SAFETY ASSURANCE SYSTEM .............................................................. 20
3.0 Software Safety Assurance System Overall Objectives................................... 20
3.1 Software Safety Assessment Initiation ............................................................. 24
3.2 Software Safety Assessment Planning............................................................. 26
3.3 Software Safety Requirements Specification ................................................... 27
3.4 Software Safety Assessment Validation, Verification and Process Assurance 28
3.5 Software Safety Assessment Completion ........................................................ 29
3.6 Software Assurance Level ................................................................................ 30
3.6.0 Introduction........................................................................................ 30
3.6.1 Supporting Information for SWAL Allocation Process....................... 30
3.6.2 SWAL Allocation Process ................................................................. 33
3.6.3 Examples of SWAL Allocation........................................................... 36
3.6.4 Grading Policy ................................................................................... 38
3.6.5 Similarity of Levels throughout Various Standards ........................... 40
CHAPTER 1
INTRODUCTION
1.0 PURPOSE
[Figure: relationship between SWAL and assurance. A Software Assurance Level (SWAL) is satisfied by OBJECTIVES (Chapters 3-7), which are achieved through ACTIVITIES that produce EVIDENCE (Chapter 8) to give confidence. Activities may draw on ED-109 (DO-178B), IEC 61508, IEC 12207, (EC) 482/2008 and CMMI.]
NOTE: Whilst the objectives described in this document support the achievement
of many (but not all) of the articles within EC Regulation 482/2008,
compliance with any such article cannot be claimed on that basis, as
determining compliance is the responsibility of the regulatory and
legislative authorities.
1.1 SCOPE
This document applies to software that forms part of an ANS system. Its scope
extends to the overall lifecycle of software within an ANS system; however,
aircraft software is considered out of scope, so this document is limited to the
“ground” segment of ANS.
This document assumes that a risk assessment and mitigation process has been
undertaken along with an a priori system (where system includes people, procedure
and equipment) safety assessment (eg a SAM-FHA and SAM-PSSA) with the results
forming an input to this document.
This document is limited to software safety assurance and any references to software
lifecycle data are made solely within the context of software safety assurance.
Documentation not related to software lifecycle data is therefore out of scope.
This document covers:
- Guidance for an ANSP to establish a software safety assurance system;
- Guidance for software suppliers on the necessary software safety assurance
regarding products and processes;
- A reference against which stakeholders can assess their own practices for
software safety assurance of: specification, design, development, operation,
maintenance, and decommissioning;
- A software assurance process that will promote interoperability through its
common application to ANS software development.
1.2.0 Structure
Each objective contains at least one “shall” statement, where “shall” means that this
requirement must be satisfied to demonstrate compliance with this document.
Where more than one “shall” statement is included as part of an objective then, to
achieve that objective, all of the “shall” requirements must be satisfied.
This document is not a regulatory requirement. Consequently, where a “shall”
statement is not satisfied, alternative means, supported by rationale and
evidence, can be chosen and agreed with the NSA.
The tables showing objective allocation per SWAL are to be read as follows
(cf. the criteria in 3.6.4.0):
1.4 READERSHIP
The table below identifies the intended readership of each section; the roles
considered are Safety Practitioner, System Designer, Safety Manager, and
Software Team (Manager). N/A marks a section that is not applicable to a given
role.
Chapter 1 – Introduction
Chapter 2 – Document Strategy
Chapter 3 – Software Safety Assurance System
Chapter 4 – Primary Lifecycle Processes
Chapter 5 – Supporting Lifecycle Processes N/A
Chapter 6 – Organisational Lifecycle Processes N/A N/A
Chapter 7 – Additional ANS Software Lifecycle Objectives N/A N/A
Chapter 8 – Software Safety Folder
Annex A - Reference To Existing Software Standards N/A N/A
Annex B – Roles and Responsibilities Scenarios N/A N/A
Annex C – Traceability with ESARR6 N/A N/A N/A
1.5 REFERENCES
1.6 GLOSSARY
Acceptably Safe Acceptably safe defines the target risk for an ANSP and is more
demanding than tolerably safe.
Acquirer An entity in charge of acquiring SW and/or SW lifecycle data from
another entity, the “Supplier”. There are multiple types of
relationship between Acquirer and Supplier, as they can be part of
the same organisation or of different organisations (contractual
relationships).
In some cases such formal relationships may not exist (e.g. in-house
development) and so some requirements relating to the Acquirer may
be justified as not applicable.
Adaptation Data Data used to customise elements of the Air Traffic Management
System for their designated purpose.
Note: Adaptation data is utilized to customize elements of the
CNS/ATM system for its designated purpose at a specific location.
These systems are often configured to accommodate site-specific
characteristics. These site dependencies are developed into sets of
adaptation data. Adaptation data includes:
- Data that configures the software for a given
geographical site, and
- Data that configures a workstation to the preferences
and/or functions of an operator.
Examples include, but are not limited to:
a) Geographical Data – latitude and longitude of a radar
site.
b) Environmental Data – operator selectable data to provide
their specific preferences.
c) Airspace Data – sector-specific data.
d) Procedures – operational customization to provide the
desired operational role.
Adaptation data may take the form of changes to either
database parameters or take the form of pre-programmed
options. In some cases, adaptation data involves re-linking
the code to include different libraries. Note that this should
not be confused with recompilation in which a completely
new version of the code is generated.
Where appropriate, the generation of adaptation data should
seek compliance with the guidance given in ADI DAL
(Aeronautical Data Integrity Data Assurance Level). Except
for generation related processes, compliance with this
document should be shown to the same assurance level as
the ANS code that uses them.
Mitigation Means Barriers or lines of defence, defined from the risk mitigation strategy,
taken to control, prevent or reduce either: the likelihood of an event
developing further such that it can contribute to or cause an end
effect; or the severity of an end effect. They may take various forms,
including operational, procedural, functional, performance and
interoperability requirements or environment characteristics.
Non-developmental software Software not developed for the current demonstration.
Organisation An administrative structure in which people collectively manage one
or more projects as a whole, and whose projects share a senior
manager and operate under the same policies.
Overload Tolerance This term should be defined in the framework of the software under
consideration as the definition itself may vary depending on the type
of software (eg real-time, database, HMI). The definition may include,
if appropriate, the SW performance range for some levels of load
beyond some limits to be defined.
Perform Completely In accordance with the approved software safety plan, in
conformance with ANSP Safety Management System and in
compliance with applicable safety regulatory requirements.
Resource Usage The amount of resources within the computer system that can be
used by the application software.
Note: Resources may include main memory of various categories
(such as static data, stack and heap), disc space and
communications bandwidth and may include internal software
resources, such as the number of files which may be simultaneously
open.
Safety Requirement A risk-mitigation means, defined from the risk-mitigation strategy that
achieves a particular safety objective, including organisational,
operational, procedural, functional, performance, and interoperability
requirements or environment characteristics;
Software (SW) Computer programs and corresponding configuration data, including
non-developmental software (eg proprietary software, Commercial
Off The Shelf (COTS) software, re-used software, etc.), but excluding
electronic items such as application specific integrated circuits,
programmable gate arrays or solid-state logic controllers.
Software Assurance The software assurance level (SWAL) defines the level of rigour of
Level (SWAL) the software assurances throughout the software lifecycle, supporting
the demonstration that the EATMN software is acceptably safe.
Software Component A component can be seen as a building block that can be fitted or
connected together with other reusable blocks of software to combine
and create a custom software application.
In the framework of this document, it was found necessary to further
develop the definition of “software component” by providing the
following information:
A software component is the result of the first level of decomposition
of the software architecture, such that requirements, actions, objects,
and input and output flows can be associated with that software component.
Therefore, it can be: a process, if the application is based on a
multi-process architecture; a thread, if the architecture is mono- or
multi-process and multi-threaded; a set of actions; a set of objects with
their associated methods; or a state and its associated actions of a
finite state machine.
1.7 ABBREVIATIONS
CHAPTER 2
DOCUMENT STRATEGY
2.0 INTRODUCTION
This section provides the strategy that has been used and that can be built upon by
the users to relate the document purpose with the list of objectives (Chapter 3 to 7)
and the means proposed by various standards to satisfy these objectives as listed in
Appendix A.
[Figure: (EC) 552/2004 requires (EC) 482/2008, which in turn requires the objectives of ED-153 Chapters 4, 5, 6 and 7. Software development products (inputs) feed software safety assurance activities, which produce software safety assurance products (outputs). These activities can be supported by the standards referenced in ED-153 Annex A (ED-109, ED-12B, IEC 61508, IEC 12207, DEF STAN 00-55).]
This document provides guidance aiming to partially satisfy EC regulations (552/2004, 482/2008). The approach considers 482/2008 and
refines the 552/2004 ER3 for software. It implies that demonstrating compliance with 482/2008 through the implementation of a SSAS
subsequently supports the demonstration of compliance with 552/2004 ER3 for the software part of a constituent.
High level articles and objectives address ‘what’ has to be done and low level articles and objectives address ‘how’ it has to be done.
The table below provides the hierarchy of the regulatory requirements applicable to the SSAS.
Example of Table interpretation: To implement the Configuration Management aspects of the SSAS, Article 3.2.e of EC Regulation 482/2008
has to be used as a reference. Article 3.2.e Requirements are refined into Article 4.3.c and Annex II Part C.
ED-153 includes various levels of objectives aiming to address some 482/2008 requirements. The table below provides the relationship
between the various levels of 482/2008 requirements and ED-153 objectives.
NOTE 1: The objectives above are applicable as per the allocated SWAL.
NOTE 2: This document does not propose any guidance to develop the software argument, or any example of it, except guidance to develop
the safety argument for the system change (see SAM Part IV GM I, Safety Case Development Manual). The system safety
argument is the master argument embedding the software argument.
Example of table interpretation: ED-153 “high-level” objective 3.0.7 aims at addressing the “high-level” Article 3.2.e requirement of
482/2008. ED-153 Chapter 5.2 aims to address 482/2008 requirements: it contains “low-level” objectives aiming to address the
482/2008 “low-level” requirements stated in Article 4.3.c and in Annex II Part C.
The table below expands the ED-153 objective 3.0.1 hierarchy.
NOTE: The low-level objectives do not form an exhaustive list, as many low-level objectives contribute to many low-level articles. It may
be the role of the argument to identify the complete set of low-level objectives for each article.
No further refinement of the strategy relating to this high level objective in this version.
Chapter 2 as well as chapter 8 provide some basic strategy information that can be
used to produce the argument.
2.1.5 Traceability
Unintended functions are functions in the software that are either performed in
addition to those required or that are not performed on demand. They can result from,
but are not limited to, the following examples:
- Execution of unspecified code, including unintentional execution of deactivated
code: code which is present in the executable software but which is not
intended to be executed in the given context. For example, different
configurations for different sites may enable/disable different functions of the
software;
- Incorrect implementation of the SW specification (“software fault/malfunction”)
leading to the execution of “dead code”;
- Code execution not performed at the right time or not on demand;
- Incorrect SW specification capture (at any level, from top level down to detailed
design level).
This high level objective does not mean that, ultimately, there are no unintended
functions in the software. It means that these functions are not activated, or that the
consequences of them being activated are justified by the safety analysis, or that an
effort has been made to produce evidence for this objective to be met with a level of
rigour commensurate with the SWAL.
Ideally, unintended functions should be identified. If this proves impossible, an
alternative approach has to be defined.
In both cases, unintended functions have to be addressed by:
- Ensuring software requirements validity with a level of confidence
commensurate with the SWAL. This is addressed in this document through
Objective 4.3.4.
- Ensuring software requirements satisfaction with a level of confidence
commensurate with the SWAL. This is addressed in this document through
Objective 5.4.X.
- Ensuring that both of the above are appropriately associated with the software
version being assessed (eg by configuration management). This is addressed in
this document through Objective 5.2.X.
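The three safeguards above (requirements validity, requirements satisfaction, and configuration association) can be sketched as a simple evidence check. This is an illustrative sketch only, not part of ED-153: the Requirement record, its fields and the version tag are hypothetical stand-ins for whatever requirements-management and configuration-management data an organisation actually holds.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    validated: bool         # validity evidence exists (cf. Objective 4.3.4)
    verified_by: list[str]  # satisfaction evidence, eg test case identifiers
    sw_version: str         # software version the evidence was produced against

def unsatisfied(requirements: list[Requirement], assessed_version: str) -> list[str]:
    """Return IDs of requirements whose validity, satisfaction or
    configuration association is not demonstrated for the assessed version."""
    return [
        r.req_id for r in requirements
        if not r.validated
        or not r.verified_by
        or r.sw_version != assessed_version
    ]

reqs = [
    Requirement("SR-001", True, ["TC-12"], "v2.1"),
    Requirement("SR-002", True, [], "v2.1"),         # no satisfaction evidence
    Requirement("SR-003", True, ["TC-14"], "v2.0"),  # stale configuration
]
print(unsatisfied(reqs, "v2.1"))  # ['SR-002', 'SR-003']
```

In practice such a check would be one small part of the configuration management assurance; the point is that all three conditions must hold for the same known software version.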
ED-153 Objectives “high-level”:
Chapter 4: 4.1.1; 4.2.1; 4.3.3; 4.4.1; 4.5.1
Chapter 5: 5.1.1; 5.2.1; 5.3.1; 5.4.1; 5.6.1; 5.7.1; 5.8.1
Chapter 6: 6.1.1; 6.2.1; 6.3.1; 6.4.1
Chapter 7: 7.1.1; 7.2.1; 7.3.1
ED-153 Objectives “low-level” (excluding the high-level objectives listed above):
Chapter 4: 4.1.X; 4.2.X; 4.3.X; 4.4.X; 4.5.X
Chapter 5: 5.1.X; 5.2.X; 5.3.X; 5.4.X; 5.6.X; 5.7.X; 5.8.X
Chapter 6: 6.1.X; 6.2.X; 6.3.X; 6.4.X
Chapter 7: 7.1.X; 7.2.X; 7.3.X
CHAPTER 3
SOFTWARE SAFETY ASSURANCE SYSTEM
Introduction
The Software Safety Assurance System encompasses the following tasks:
- Software Safety Assurance System overall objectives specification
- Software Assurance Level allocation
- Software Safety Assessment:
  - Software Safety Assessment Initiation
  - Software Safety Assessment Planning
  - Software Safety Requirements Specification
  - Software Safety Assessment Validation, Verification & Process Assurance
  - Software Safety Assessment Completion
The aim of the Software Safety Assurance System (SSAS) is to ensure that
appropriate processes are in place such that the risk associated with deploying the
software is reduced to a tolerable level (ie the objectives of (EC) 482/2008 Article 3
are satisfied). Consequently, the Software Safety Assurance System is required to
ensure requirements validity, satisfaction, traceability, non-interference and the
configuration consistency of all data used in claiming compliance with (EC) 482/2008.
The Software Safety Assurance System is required to be a part of an organisation’s
Safety Management System and consequently applies to any software under the
responsibility of that organisation. The implementation of a SSAS is described in
section 2.1.2.
Commission Regulation (EC) No 482/2008 sets some high level criteria for the
Software Safety Assurance System and mandates that a SWAL allocation process be
implemented that complies with the given criteria. Consequently, the following
objectives have to be achieved by a Software Safety Assurance System.
NOTE: The Software Safety Folder is defined and described in Chapter 8 of this
document.
Each entry below gives the objective number and title/topic, the objective text (with numbered “shall” statements), the applicable Annex A reference, and the output.

3.0.1 Implementation
A Software Safety Assurance System shall 3.0.1.1 be defined and implemented (as part of the overall System Safety Assessment Documentation).
Annex A Section A.2.1. Output: Software Assurance Manual (part of the Safety Management System).

3.0.2 Requirements Correctness and Completeness
The software requirements shall 3.0.2.1 correctly and completely state what is required by the software, in order to meet the system safety objectives and system safety requirements as identified by the risk assessment and mitigation process.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.3 Requirements Traceability Assurance
All software requirements shall 3.0.3.1 be traced to the level required by the SWAL.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.4 Unintended Functions
The software implementation shall 3.0.4.1 not contain functions which may adversely affect safety or whose effect is not consistent with the safety analysis.
Note: This objective does not mean that there are no unintended functions in the software, but that these functions are not activated or that the consequences of them being activated are justified by the safety analysis.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.5 SWAL Allocation
The ANSP shall 3.0.5.1 ensure, as a minimum, that the Software Safety Assurance System allocates a SWAL to all operational ground ANS software.
Note: Refer to Chapter 2 Section 2 of this document for SWAL definitions and the allocation process.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part II.

3.0.6 Requirements Satisfaction Assurance
The ANS software shall 3.0.6.1 satisfy its requirements with a level of confidence which is consistent with the SWAL allocated during risk assessment and mitigation (eg PSSA).
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.7 Configuration Management Assurance
Any assurance shall 3.0.7.1 be at all times derived from a known executable version of the software, a known range of configuration data, and a known set of software products and descriptions (including specifications) that have been used in the production of that version.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.8 Assurance Rigour Objective
The SWAL shall 3.0.8.1 give sufficient confidence that the ANS software can be operated, as a minimum, acceptably safely.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.9 Assurance Rigour Criteria
The variation in rigour of the assurances per SWAL shall 3.0.9.1 include the following criteria:
- required to be achieved with independence,
- required to be achieved,
- not required.
Annex A Section A.2.1. Output: Software Assurance Manual (part of the Safety Management System); SSF2 Part II.

3.0.10 SWAL Assurance
Assurance shall 3.0.10.1 provide confidence that the SWAL is achieved.
Note: Assurance may be based on direct or indirect arguments and evidence.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part I.

3.0.11 SWAL Monitoring
Assurance shall 3.0.11.1 be provided that once in operation the software meets its requirements with a degree of confidence commensurate with the SWAL, through monitoring. Feedback of ANS software experience shall 3.0.11.2 be used to confirm that the Software Safety Assurance System and the assignment of SWALs are appropriate. For this purpose, the effect resulting from any reported software malfunction or failure from ANS operational experience (reported according to the relevant requirements on reporting and assessment of safety occurrences) shall 3.0.11.3 be assessed in respect of its mapping to the SWAL definition as per Chapter 2 of this document.
(Reported software malfunctions or failures are output of the ANS occurrence reporting system as part of the ANSP Safety Management System.)
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part II.

3.0.12 Software Modifications
Any change to the software shall 3.0.12.1 lead first to the re-assessment of the safety impact of such a change on the system and then, depending on the impact, shall 3.0.12.2 lead to the re-assessment of the SWAL allocated to this software.
Annex A Section A.2.1. Output: SSF1 Part VII; SSF2 Part II.

3.0.13
The same level of confidence, through any means chosen and agreed with the Approval Authority, shall 3.0.13.1 be provided for the same software assurance level for developmental and non-developmental ANS software (eg Commercial Off The Shelf (COTS) software, etc).
These means shall 3.0.13.2 give sufficient confidence that the software meets the safety objectives and requirements, as identified by the safety risk assessment and mitigation process.
Output: SSF1 Part VI; SSF2 Part II.

3.0.14 Isolation
ANS software components that cannot be shown to be isolated from one another shall 3.0.14.1 be allocated the SWAL of the most critical (most demanding SWAL) of the components.
Output: SSF1 Part II; SSF2 Part III.

3.0.15 All On-line Aspects of SW Operational Changes
The Software Safety Assurance System shall 3.0.15.1 deal specifically with software related aspects, including all on-line software operational changes (eg cutover/hot swapping).
Output: SSF1 Part II; SSF2 Part I.

3.0.16 Demonstration to NSA
The organisation shall 3.0.16.1 make available the required assurances, to the National Supervisory Authority, demonstrating that the objectives as per the allocated SWAL have been satisfied.
Output: SSF1 All; SSF2 All.

3.0.17 Argument Production
The organisation shall 3.0.17.1 produce an argument demonstrating that the objectives of the allocated SWAL, as per this document, have been satisfied.
Note: No explicit guidance in this version. Chapter 2 as well as Chapter 8 provide some basic strategy information that can be used to produce the argument.

TABLE 5: SOFTWARE SAFETY ASSESSMENT SYSTEM OVERALL OBJECTIVES
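The isolation rule of objective 3.0.14 amounts to taking, within each set of components that cannot be shown to be isolated from one another, the most demanding SWAL of the set (SWAL 1 being the most demanding). A minimal sketch of that rule, with hypothetical component names, not a prescribed allocation method:

```python
def allocate_group_swal(component_swals: dict[str, int],
                        non_isolated_groups: list[set[str]]) -> dict[str, int]:
    """Each component in a group that cannot be shown isolated inherits the
    most demanding SWAL of the group. SWAL 1 is the most demanding, so the
    most demanding level is the numerical minimum."""
    allocated = dict(component_swals)
    for group in non_isolated_groups:
        most_demanding = min(component_swals[c] for c in group)
        for c in group:
            allocated[c] = most_demanding
    return allocated

swals = {"tracker": 2, "display": 3, "recorder": 4}
# tracker and display share resources and cannot be shown isolated
print(allocate_group_swal(swals, [{"tracker", "display"}]))
# {'tracker': 2, 'display': 2, 'recorder': 4}
```

Components that can be shown to be isolated (here, "recorder") keep their own allocated SWAL.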
Risk assessment and mitigation process (eg Functional Hazard Assessment-FHA and
Preliminary System Safety Assessment-PSSA) assumptions and output should be
confirmed as far as software can impact them.
Each entry below gives the objective number and title/topic, the objective text, the applicable Annex A reference, and the output.

3.1.1 System Description
The software purpose shall 3.1.1.1 be defined.
Operational scenarios shall 3.1.1.2 be defined (eg HMI: Operator Handbook which defines the mode of operation and the human-machine interface; nominal mode, degraded mode).
The software and system functions and their relationships shall 3.1.1.3 be defined.
Software boundaries shall 3.1.1.4 be defined (eg operational, time).
Software external interfaces shall 3.1.1.5 be described.
Annex A Section A.2.3.1. Output: SSF1 Part I; SSF2 Part IV.

3.1.2 Operational Environment
The software and its environment (physical, operational, control functions, legislative, etc) shall 3.1.2.1 be described in sufficient detail to enable the safety lifecycle tasks to be satisfactorily carried out.
Annex A Section A.2.3.1. Output: SSF1 Part I; SSF2 Part XX.

3.1.3 Regulatory Framework
Applicable safety regulatory objectives and requirements shall 3.1.3.1 be identified.
Annex A Section A.2.3.1. Output: SSF1 Part I; SSF2 Part XX.

3.1.4 Applicable Processes and Guidance
Processes and guidance applicable to the software assurance shall 3.1.4.1 be agreed.
Annex A Section A.2.3.1. Output: SSF1 Part I; SSF2 Part XX.
NOTE: A system (people, procedures and equipment) safety assessment has to be performed first; its output is then re-assessed during the equipment safety assessment and then during the software safety assessment.

3.2.2 Software Safety Assessment Plan
A plan describing the software safety assessment steps shall 3.2.2.1 be produced (eg approach, relations between safety assessment and software lifecycle, deliverables (content and date of delivery), relations with software/system major milestones, project risk management due to safety issues, responsibilities, persons, organisations, risk classification scheme, safety objectives definition approach, hazard identification methods, safety assurance activities, schedule, resources).
Output: SSF1 Part III; SSF2 Part II.

NOTE: These objectives recommend neither a specific packaging of a SW safety assessment plan, nor a separate plan (the plan can be part of a (sub-)system plan).
The objectives in this section describe the Software Safety Assessment defined in objective 3.2.1.

3.3.1 Failure Identification
Potential failures shall 3.3.1.1 be identified by considering the various ways software can fail and by considering the sequence of events that lead to the occurrence of the failure.
A list of single, consequential and common modes of failure shall 3.3.1.2 be drawn up.
Note: Guidance on Common Mode Analysis can be found in SAM and ED-79.
Annex A Section A.2.3.3. Output: SSF1 Part II; SSF2 Part I.

3.3.2 Failure Effects
The effects of failure occurrence shall 3.3.2.1 be evaluated.
The hazards associated with software failure occurrences shall 3.3.2.2 be identified in order to further complete the list of hazards initiated during the Risk Assessment and Mitigation process (eg FHA, further completed during PSSA).
Annex A Section A.2.3.3. Output: SSF1 Part II; SSF2 Part I.

3.3.3 Assessment of Risk
The initial Risk Assessment and Mitigation process (eg FHA, further completed during PSSA) shall 3.3.3.1 be revisited based upon the outcome of 3.3.1 and 3.3.2.
Annex A Section A.2.3.3. Output: SSF1 Part II; SSF2 Part I.

3.3.4 Software Requirements Setting
Software requirements shall 3.3.4.1 be compliant with the safety objectives to which the software contributes and with the system safety requirements.
Note: The definition of “compliant” has to be developed as part of the argument sustaining the demonstration of this objective. This definition should include traceability with the above level of requirements, and demonstration of the necessity, sufficiency, appropriateness and relevance of the requirements to satisfy the above level of requirements.
Annex A Section A.2.3.3. Output: SSF1 Part II; SSF2 Part I.
NOTE: Software “safety” requirements are not limited to the SWAL. See Chapter 4 Section 3 Objective 4.3.4. Consequently, in this document, the term “software requirements” is used to reflect the fact that the same requirement can be classified as, for example, a performance, security, interoperability or safety requirement. It is the role of the System Safety Assessment and of the Software Safety Assurance System to ensure that software requirements that are necessary and sufficient to ensure an acceptably safe operation have been identified and verified.
3.4.4 Software Safety Assurance
Demonstration and Assurance that SW requirements are satisfied shall be provided.
Output: SSF1 - Part VII; SSF2 – Part I.
TABLE 9: SOFTWARE SAFETY ASSESSMENT VALIDATION, VERIFICATION AND PROCESS ASSURANCE OBJECTIVES
3.5.1 Document Software Safety Assessment Process Results
The Software Safety Assessment process results shall 3.5.1.1 be documented.
Output: Annex A Section A.2.3.5; SSF1 – Part XX; SSF2 – Part II.
3.6.0 Introduction
A SWAL scheme is a strategic project management aid that is intended to ensure that
appropriate software safety assurance processes are used throughout the lifecycle of
safety related software. SWALs identify sets of objectives to be applied to the software
lifecycle processes and products.
A SWAL indicates the recommended rigour of the assurance processes to be used to generate the assurance evidence that will show whether the Software Requirements have been satisfied, with a degree of rigour commensurate with the risk presented by the software. A SWAL provides a uniform level of rigour to which the development, operational transfer, maintenance, decommissioning and product functional assurances for the safety-related software are produced. It is allocated based upon the potential consequences of anomalous behaviour of the software (the severity of the end effect of the software malfunction or failure and the likelihood of occurrence of the end effect), as determined by the system safety assessment process.
A SWAL does not replace Safety Requirements assigned to the Software. It is a
Software Safety Requirement that has to be derived from the risk assessment and
mitigation process.
Compliance with a SWAL requires the software team to follow planned and systematic
actions to provide product, process or other evidence that can be argued to show that
an adequate level of confidence and assurance exists that the software satisfies its
requirements.
The allocation of a Software Assurance Level (SWAL) for software is part of the risk assessment and mitigation process, eg it is identified during the Preliminary System Safety Assessment (SAM-PSSA) and confirmed during the System Safety Assessment (SAM-SSA). However, if the risk assessment and mitigation process does not define a SWAL allocation process, it must be defined in the Software Safety Assurance System.
The SWAL allocation process shall 3.6.0.1 be performed during the design of an
ANS system as part of a risk assessment and mitigation process.
The SWAL shall 3.6.0.2 be allocated by the “System team”.
NOTE: The term “system team” should be taken to include system designers and
safety practitioners (see section 1.2.1).
The software team can offer support and increased confidence that the allocated
SWAL is appropriate by providing further understanding of the Software role,
contribution, interference and interactions with the overall ANS system architecture.
This section includes information necessary to perform the SWAL allocation process
described in section 3.6.2.
NOTE: The reader is encouraged to note the definitions, provided in section 1.6, of
internal and external mitigation means which are of particular use in this
section.
[Figure 3: Causes and consequences of a software malfunction/failure. On the causes side (eg FTA), software and other failures combine through AND/OR gates into a “pivotal” hazard event. On the consequences side (eg ETA), success/failure (S/F) of mitigation means branches the hazard into Effects A to D. Ph is the likelihood that a software failure generates the hazard; Pe is the likelihood that the hazard generates a given effect.]
Figure 3 illustrates the likelihood (Ph x Pe) that, once software malfunctions or fails,
this malfunction/failure could result in a certain effect, where Ph and Pe are defined
as:
Ph (identified during the safety assessment eg PSSA) is the likelihood that once
the software has malfunctioned or failed, this malfunction/failure generates a
hazard. Ph is commensurate with the ability of the remaining part of the
architecture to mitigate the software malfunction/failure;
Pe (identified during the safety assessment eg FHA) is the likelihood that the
hazard generates an effect having a certain severity.
For each hazard, depending on the method used to set Safety Objectives, there can be:
• many likelihoods Pe (one Pe per effect of the hazard), to be assessed for each individual effect, or;
• only one likelihood Pe (one for the Worst Credible effect).
The SWAL allocation process illustrated here is based on the second case, given
above, and considers the likelihood of a given software behaviour generating the
Worst Credible effect that it can cause.
The SWAL allocation process’s sensitivity to inaccurate quantification needs to be reduced to an acceptable level due to:
• The lack of accuracy on Pe. Pe may include some un-quantifiable human, procedural and equipment aspects as well as quantifiable mitigation means or models such as Collision Risk Models (CRM). Thus Pe may not always be quantified precisely and may be stated to a given confidence, eg +/-10%;
• The lack of accuracy on Ph. Ph may include some un-quantifiable human, procedural and equipment aspects as well as quantifiable mitigation means. Thus Ph may not always be quantified precisely and may be stated to a given confidence, eg +/-10%;
• The difficulty in providing a failure rate for the Software. Whilst it is assumed that Software can fail and that a target failure rate objective may be set for the combination of hardware and software by the Risk Assessment and Mitigation process, it may only be possible to claim the failure rate for the combination of hardware to a given accuracy.
As it is difficult to quantify accurately and precisely these probabilities, common sense,
expert judgement and other means (knowledgebase, lessons learned, incidents
reports, equivalent field service experience) should be used to assess those
probabilities.
As part of the safety assessment process (eg SAM-SSA), appropriate monitoring has to be put in place to ensure that these values are satisfied, as Pe and Ph should be transposed into:
• Assumptions captured through Safety Requirements on the Operational Environment (Pe) (eg during SAM-FHA), and;
• Safety Requirements on the elements of the ANS System (Ph) (eg during SAM-PSSA).
Consequently, it is important to document the rationale for the allocated SWAL (end
effect and the likelihood of generating such an effect, overall system design,
operational environment) so that the satisfaction of the safety requirements and any
assumptions may be verified during the software lifecycle.
To allocate a SWAL to an ANS software function, the following five steps shall 3.6.2.0.1 be performed:
i. Using the definitions in 3.6.2.1, identify the likelihood (Pe x Ph) that, once the software malfunctions or fails, this software malfunction/failure can generate an end effect which has a certain severity (for each effect of a hazard [1]);
ii. Justify the likelihood (see guidance in 3.6.2.2);
iii. Identify the SWAL for that combination (of severity and likelihood) using the matrix below;
iv. Repeat the previous steps for all SW malfunctions/failures.

[1] This has to be done for all effects of a SW malfunction. If only the worst credible effect is taken into consideration, then only the SW malfunction leading to that worst credible effect of the hazard will have been accounted for.
© EUROCAE, 2009
34
Likelihood of generating         Effect Severity Class
such an effect (Pe x Ph)       1        2        3        4
Very Possible                SWAL1    SWAL2    SWAL3    SWAL4
Possible                     SWAL2    SWAL3    SWAL3    SWAL4
Very Unlikely                SWAL3    SWAL3    SWAL4    SWAL4
Extremely Unlikely           SWAL4    SWAL4    SWAL4    SWAL4
Allocate the most stringent SWAL out of all contributions as the final SWAL of the
software.
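As an illustration only (not part of the standard, all names hypothetical), the matrix lookup and the most-stringent-SWAL rule above can be sketched as follows, with the matrix values transcribed from the table and SWAL1 being the most stringent level:

```python
# Illustrative sketch of the SWAL allocation matrix and the
# "most stringent SWAL" rule. Names are hypothetical; the matrix
# values are transcribed from the table above (lower = more stringent).

# Rows: qualitative likelihood of generating the effect (Pe x Ph);
# columns: effect severity class 1..4.
SWAL_MATRIX = {
    "Very Possible":      {1: 1, 2: 2, 3: 3, 4: 4},
    "Possible":           {1: 2, 2: 3, 3: 3, 4: 4},
    "Very Unlikely":      {1: 3, 2: 3, 3: 4, 4: 4},
    "Extremely Unlikely": {1: 4, 2: 4, 3: 4, 4: 4},
}

def allocate_swal(contributions):
    """contributions: list of (likelihood, severity_class) pairs,
    one per software malfunction/failure effect considered.
    Returns the most stringent SWAL (lowest number)."""
    swals = [SWAL_MATRIX[lik][sev] for lik, sev in contributions]
    return min(swals)

# Two hazards, both with a worst credible effect of severity class 3,
# under illustrative likelihoods: the most stringent result is SWAL3.
print(allocate_swal([("Possible", 3), ("Very Unlikely", 3)]))  # -> 3
```

The `min()` call encodes the rule stated above: the final SWAL of the software is the most stringent of all contributions.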
SWAL identification shall 3.6.2.0.2 assure that the number of SW malfunctions/failures leading to such a SWAL has been considered.
NOTE: This requirement aims at ensuring that the cumulative aspect of risks due to SW malfunctions/failures has been considered, while taking into account the ability of a SWAL to provide adequate assurance for not only one, but a number of malfunctions/failures having similar severity/likelihood effects. However, due to the intrinsic range of assurance provided by a given SWAL, the SWAL allocation can generally be performed by considering only the most severe software malfunction/failure.
NOTE: The qualitative definitions of likelihood (“Very Possible”, “Possible”, “Very Unlikely”, “Extremely Unlikely”) can be interpreted with quantitative values.
NOTE: More than 4 SWALs can be defined according to ESARR 6 and the EC Regulation 482/2008, although 4 SWALs are used in ED-153.
If the “Software” scope, as initially specified by the System Design Team and to which
a SWAL is allocated during the risk assessment and mitigation process eg PSSA,
leads to further detailed “Software” design that requires splitting the initial Software
into parts, then each of those parts must also achieve the initially assigned SWAL
level. An exception to this is where isolation can be demonstrated and a SWAL is
allocated to each isolated software by applying the SWAL allocation process again.
This may lead to different SWALs being allocated to the isolated software.
Reminder (Objective 3.0.14): ANS software components that cannot be shown
to be isolated from one another shall 3.6.2.0.3 be allocated the software
assurance level of the most critical of the isolated components.
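Objective 3.6.2.0.3 can be illustrated with a small sketch (hypothetical names, not part of the standard): within each group of components that cannot be shown to be isolated from one another, every component inherits the most stringent (numerically lowest) SWAL present in the group.

```python
# Illustrative sketch of Objective 3.6.2.0.3: components not shown to
# be isolated from one another are allocated the SWAL of the most
# critical component in their group. Names are hypothetical.

def propagate_swal(groups):
    """groups: list of lists of (component, swal) pairs, where each inner
    list contains components NOT shown to be isolated from each other.
    Returns a dict component -> allocated SWAL (lower = more stringent)."""
    allocation = {}
    for group in groups:
        most_critical = min(swal for _, swal in group)
        for component, _ in group:
            allocation[component] = most_critical
    return allocation

# A non-isolated pair: the SWAL2 component drags the SWAL4 one to SWAL2.
print(propagate_swal([[("tracker", 2), ("display", 4)]]))
# -> {'tracker': 2, 'display': 2}
```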
Any introduction of software into operation shall 3.6.2.0.4 demonstrate that an
acceptable level of safety is satisfied. If the risk assessment and mitigation
process leads to a situation where the mitigation of the end-effect Severity
Class 1, due to software malfunction or failure, is solely based on SWAL1 then
the risk assessment and mitigation demonstration should justify that relying
solely on SWAL 1 meets the acceptable level of safety.
The following considerations should be taken into account in relation to SWAL1:
• SWAL1 objectives and associated evidence are extremely stringent, leading to highly elevated cost, development effort and difficulty in generating evidence.
• Wherever possible, alternative system designs should be proposed, including other mitigation means, rather than relying on software being allocated a SWAL1.
If the retained strategy to mitigate the end-effect Severity Class 1 is solely based on SWAL1, the ANSP may be required to:
• Justify why an alternative design including other additional independent mitigation means is not used.
The following qualitative definitions apply when assessing the likelihood (Pe x Ph)
during the SWAL allocation process for a system/SW having numerous years of
operations. For SW/system having a short lifetime, either quantitative definitions may
be defined (e.g. derived from the ANSP Risk Classification Scheme) or other
qualitative appropriate statements may be proposed.
Very Possible: This effect, due to software malfunction or failure, will almost certainly
occur throughout the system life.
Possible: This effect, due to software malfunction or failure, may occur throughout the
system life.
Very Unlikely: This effect, due to software malfunction or failure, is expected to occur
only in some exceptional circumstances throughout the system life.
Extremely Unlikely: This effect, due to software malfunction or failure, is not
expected to occur throughout the system life.
These definitions are complemented by the guidance in section 3.6.2.2:
The following, non-exhaustive list of factors should be considered when justifying the likelihood (Pe x Ph) during the SWAL allocation process:
• Number of mitigation means;
• Efficiency of mitigation means (reliability, availability, suitability, and how relevant and protective they are for the malfunction or failure being assessed);
• Completeness of the mitigation means failure mode analysis;
• Level of isolation of mitigation means (dependency between internal and external mitigation means, common modes between internal mitigation means, common modes between external mitigation means, physical location between mitigation means, particular risks);
• Nature of mitigation means (and diversity of nature with the element being assessed);
• Novelty of mitigation means;
• Complexity of mitigation means;
• Difference between the context in which the mitigation means’ efficiency was established (operational environment, usage/role of the mitigation means) and the future context of the element being assessed, in the case of reuse of an existing mitigation means. This criterion can be used to contribute to assessing the efficiency of the mitigation means.
The following guidance should be considered when justifying the likelihood (Pe x Ph) during the SWAL allocation process:

Very Possible: No effective and independent significant mitigation means exist. For example:
• The number and efficiency of the mitigation means cannot be relied upon to downgrade the SWAL.
• The number and efficiency of the mitigation means may, from an individual point of view, seem significant enough; however, the isolation criterion is seriously impaired (common mode of failure).
• The mitigation means are not yet proven, or not yet proven in the future context.
• The complexity or novelty of the mitigation means prevents relying significantly on them (assessment complexity and completeness challengeable).
• Though mitigation means may exist, they are not accounted for.

Possible: Effective, isolated mitigation means exist. For example:
• At least one effective, isolated mitigation means is proven.
• The number of mitigation means is reduced but the efficiency is significant.
• A number of isolated mitigation means, each having moderate efficiency.

Very Unlikely: Very effective, isolated mitigation means exist. For example:
• The number of mitigation means is limited but the efficiency is very significant.
• A significant number of isolated mitigation means, each having significant efficiency.

Extremely Unlikely: Extremely effective, isolated mitigation means exist. For example:
• The number of mitigation means is limited but the efficiency is extremely significant.
• A very significant number of isolated mitigation means, each having significant efficiency.
• A significant number of isolated mitigation means, each having very significant efficiency.
• The mitigation means are proven very significant in an identical context.
1st CASE: Safety Objectives were allocated by assessing all effects (eg SAM-FHA
Chapter 3 Guidance Material G “Methods for Setting Safety Objectives” Methods 1 & 3
or SAM-FHA Guidance Material E “ED-125 Quantitative Model” [2]). So all effects, due
to software malfunction or failure, are taken into consideration.
This Software will be allocated a SWAL = SWAL3 as it is the most stringent SWAL (for
both hazards).
[Table: effects of Hazard 1 and Hazard 2 against Effect Severity classes 1-4.]
2nd CASE: Safety Objectives were allocated using SAM-FHA Chapter 3 Guidance
Material G “Methods for Setting Safety Objectives” Method 2 or 4 or SAM-FHA
Guidance Material E “ED-125 Semi-Quantitative, Semi-Prescriptive or Fixed-
Prescriptive Models”.
Only the worst credible scenario which has been used to set safety objectives is taken
into consideration.
This Software will be allocated a SWAL = SWAL3 as it is the most stringent SWAL (for
both hazards which have a worst credible hazard effect having a severity 3).
[Table: worst credible effects of Hazard 1 and Hazard 2 against Effect Severity classes 1-4.]
Whilst quantitative failure rate objectives for the combination of hardware and software
may be set by the Risk Assessment and Mitigation process, and achievement of these
rates may be claimed quantitatively or qualitatively, their achievement may not be
claimed through the application of a process.
Compliance to ED-153 at a given SWAL shall 3.6.3.2.1 not be used to claim a failure
rate/reliability figure for the software.
3.6.4.0 Criteria
• Nature of evidence:
  - Direct evidence: that which is produced by an activity taking place or software behaviour occurring, which is directly related to the claim being made;
  - Indirect (backing) evidence: that which shows that the Direct Evidence is both credible and soundly based.
• Depth of investigation, according to the table below.
The following table provides rationale to help in understanding the aim of a SWAL
including what kind of overall objective is intended and the kind of errors which are
supposed to be prevented or tolerated.
Each SWAL adopts, in addition to its own criteria, the criteria of the less stringent
SWALs.
NOTE: This grading policy rationale is not applicable for every objective.
Multiple standards exist today that propose activities whose evidence addresses some of the objectives stated in this document. Where any such standard is used, it shall 3.6.5.1 be complemented, where necessary, to satisfy the objectives/activities set for a given SWAL.
This section provides a reference against which standards may be assessed for the
equivalence of assurance that they provide.
An analysis of the processes and their associated evidence (See Annex A) was used
to provide the comparison of assurances. Annex A provides the detailed traceability
between ED-153 objectives and the various standards (eg IEC 61508, ED-109)
therefore the user can deduce detailed equivalence and compliance from this annex. It
is the responsibility of the user to identify complementary activities for those objectives
for which partial compliance is provided.
The comparison did not address assurances of quantified reliability or quantified
integrity.
It is not intended that this document be used in isolation to demonstrate achievement
of a SWAL but instead that it be used in conjunction with other standards.
The following figure provides a comparison of assurance levels when considering the process-oriented activities of ED-153 against:
• ED-109;
• IEC 61508.
NOTE: No existing standard fully covers all objectives of ED-153.
[Figure: comparison of assurance levels across the SW standards ED-109, ED-153 and IEC 61508.]
CHAPTER 4
This chapter lists the objectives, per SWAL, that belong to primary lifecycle processes.
Primary lifecycle processes consist of:
1. Acquisition process:
a. Initiation;
b. System Safety Assessment;
c. Request-for-Tender [Proposal];
d. Contract preparation and update;
e. Supplier monitoring;
f. Acceptance and completion.
2. Supply process:
a. Initiation;
b. Preparation of response;
c. Contract;
d. Planning;
e. Execution and control;
f. Review and evaluation;
g. Delivery and completion.
3. Development process:
a. System requirements analysis;
b. System architectural design;
c. Process implementation;
d. Software requirements analysis;
e. Software architectural design;
f. Software detailed design;
g. Software integration;
h. Software installation;
i. Standards/Rules Definition;
j. Standards/Rules;
k. Requirement Development Management;
l. Use of Requirement Management Tool;
m. Resource Management;
n. Rationale for Design choices;
o. Traceability;
p. Transition criteria;
q. Design tool;
4.1.1 Initiation
The acquirer begins the acquisition process by describing a concept or a need to acquire, develop, or enhance a system, software product or software service.
The acquirer shall 4.1.1.1 analyse and approve the system requirements, eg against the user requirements.
The system requirements shall 4.1.1.2 include business, organisational and user as well as safety, security, and other criticality requirements, along with related design, testing, and compliance standards/rules and procedures.
The acquirer shall 4.1.1.3 prepare, document and execute an acquisition plan.
Note: The scope of applicability of this objective is restricted to the contractual tasks to be performed by the acquirer in the framework of the contract with the supplier. Therefore this objective should be reconciled with objective 4.3.3.
Output: Annex A section A.3.1; SSF1 - Part I; SSF2 – Part XX.
4.1.2 Risk Assessment and Mitigation Process – safety objectives
The acquirer shall 4.1.2.1 determine how safe the system needs to be. The acquirer shall 4.1.2.2 perform a hazard analysis (eg FHA) and identify Safety Objectives for hazards.
Output: Annex A section A.3.1; SSF1 - Part II; SSF2 – Part I.
4.1.3 Risk Assessment and Mitigation Process – safety requirements
The acquirer shall 4.1.3.1 determine (during the System Design phase) whether or not the proposed architecture is expected to achieve the Safety Objectives. The acquirer shall 4.1.3.2 specify Safety Requirements (including allocation of a SWAL) for system elements.
Output: Annex A section A.3.1; SSF1 - Part II; SSF2 – Part I.
4.1.4 Request For Tender
During the request for tender, the acquirer shall 4.1.4.1 determine which processes, activities, and tasks of the recommendations in this document are appropriate for the project and shall 4.1.4.2 tailor them according to the SWAL.
Output: Annex A section A.3.1.
4.2.1 Initiation
The supplier shall 4.2.1.1 conduct a review of requirements in the request for proposal, taking into account organisational policies and other regulations, to prepare the response.
Output: Annex A section A.3.1.

4.2.2 Preparation of response
The supplier shall 4.2.2.1 define and prepare a proposal in response to the request for proposal, including its recommended tailoring of any applied International Standard/rules.
Output: Annex A section A.3.1.

4.2.3 Contract
The supplier shall 4.2.3.1 negotiate and enter into a contract with the acquirer organisation to provide the software product or service.
Output: Annex A section A.3.1.

4.2.4 Planning
The supplier shall 4.2.4.1 define or select a software lifecycle model appropriate to the scope, magnitude, and complexity of the project. The processes, activities, and tasks of any applied International Standard/rules shall 4.2.4.2 be selected and mapped onto the lifecycle model.
The supplier shall 4.2.4.3 develop and document project management plan(s).
Note: The scope of applicability of this objective is restricted to the contractual tasks to be performed by the supplier in the framework of the contract with the acquirer. Therefore this objective should be reconciled with objective 4.3.3.
Output: Annex A section A.3.1.

4.2.5 Execution & control
The supplier shall 4.2.5.1 implement and execute the project management plan(s).
The supplier shall 4.2.5.2 monitor and control the progress and the quality of the software products or services of the project throughout the contracted lifecycle.
Note: The scope of applicability of this objective is restricted to the contractual tasks to be performed by the supplier in the framework of the contract with the acquirer. Therefore this objective should be reconciled with objective 4.3.3.
Output: Annex A section A.3.1.
4.2.6 Review & evaluation
The supplier shall 4.2.6.1 co-ordinate contract review activities, interfaces, and communication with the acquirer's organisation.
The supplier shall 4.2.6.2 perform quality assurance activities.
Output: Annex A section A.3.1; SSF1 – Part VII.

4.2.7 Delivery & completion
The supplier shall 4.2.7.1 deliver and provide assistance to the acquirer in support of the delivered software product or service as specified in the contract.
Output: Annex A section A.3.1.
The Development Process contains the objectives and tasks of the developer. The
process contains the objectives for requirements analysis, design, coding, integration,
testing, and installation and acceptance related to software products. It may contain
system-related objectives if stipulated in the contract. The developer performs or
supports the activities in this process in accordance with the contract.
NOTE: System related objectives are part of the Risk Assessment and Mitigation
Process (eg FHA & PSSA). However as these processes are interacting in
an iterative way, system requirements, architecture, integration, … are to
be reassessed to be confirmed and validated when software activities are
performed. That is why these system objectives are listed in the software
related one (See Annex A section A.3.3).
Satisfying the system-related objectives is a pre-requisite to satisfying the software objectives.
Objectives 4.3.1 and 4.3.2 are not specific to software but related to the system (eg
PSSA) and as a result do not intend to describe the system requirements definition
process.
4.3.1 System Requirements Analysis
The system requirements specification shall 4.3.1.1 describe, as a minimum:
• functions and capabilities of the system;
• business/performance, organisational and user requirements;
• safety, security, human-factors engineering (ergonomics), interface, operations, and maintenance requirements; design constraints and validation requirements.
Output: Annex A section A.3.3; SSF2 – Part I.

4.3.2 System Architectural Design
System requirements shall 4.3.2.1 be allocated among hardware, software, people and procedures.
Output: Annex A section A.3.3; SSF1 - Part I; SSF2 – Part I.

4.3.3 Process Implementation
A software lifecycle model appropriate to the scope, magnitude, and complexity of the project shall 4.3.3.1 be defined and placed under configuration management. It shall 4.3.3.2 include, as a minimum:
• end of activity/phase criteria for each activity/phase;
• a joint technical review for each activity/phase.
Standards/Rules, methods, tools, and computer programming languages shall 4.3.3.3 be selected, tailored and used according to the SWAL.
Note: Process implementation includes lifecycle definition, output documentation, output configuration management, SW product problems, environment definition, development plan, COTS.
Output: Annex A Sections A.3.3 and A.3.3.1; SSF1 - Part VII; SSF2 – Others.
4.3.4 SW requirements analysis
The developer shall 4.3.4.1 establish and document software requirements, using software requirements standards/rules as defined per Objectives 4.3.9 & 4.3.10.
The Software Requirements shall 4.3.4.2, as a minimum:
• specify the functional behaviour of the ANS software, capacity, accuracy, timing performances, software resource usage on the target hardware, robustness to abnormal operating conditions and overload tolerance;
• be complete and correct;
• comply with the System Requirements;
• include an identification of the configuration/adaptation data range.
Algorithms shall 4.3.4.3 be specified.
Output: Annex A Section A.3.3; SSF1 - Part VII; SSF2 – Part I.

4.3.5 SW architectural design
The developer shall 4.3.5.1 transform the requirements for the software into an architecture that describes its top-level structure and identifies the software components.
Note: The scope of this objective is top-level SW architecture definition, top-level interfaces design, SW integration definition, and SW architecture definition criteria.
Output: Annex A Section A.3.3; SSF1 - Part VII; SSF2 – Part I.

4.3.6 SW detailed design
The developer shall 4.3.6.1 develop a detailed design for each software component of the software, using software design standards/rules.
Note: The scope of this objective is SW detailed design definition, interfaces design, and SW unit tests definition.
Output: Annex A Section A.3.3; SSF1 - Part VII; SSF2 – Part IV.
4.3.7 SW integration
An integration plan shall 4.3.7.1 be developed to integrate the software units and software components into the software. The plan shall 4.3.7.2 include verification/test requirements, procedures, data, responsibilities, and schedule. The plan shall 4.3.7.3 be documented.
Note: The scope of this objective is SW integration plan, SW integration definition, user documentation, SW validation preparation, and SW integration evaluation (partially).
Output: Annex A Section A.3.3; SSF1 - Part VII; SSF2 – Part XX.

4.3.8 SW installation
A plan shall 4.3.8.1 be developed to install the software product in the target environment as designated in the contract. The resources and information necessary to install the software product shall 4.3.8.2 be documented and made available before installation.
Output: Annex A Section A.3.3; SSF1 - Part VII; SSF2 – Part XX.

4.3.9 Standards/rules definition
Development Plan: The developer shall 4.3.9.1 develop plans for conducting the activities of the development process. The plans shall 4.3.9.2 include, as a minimum: specific standards/rules, methods, tools, actions and responsibility associated with the development and validation of all requirements, including safety. If necessary, separate plans may be developed. These plans shall 4.3.9.3 be documented and executed.
Output: Annex A Section A.3.3.1; SSF1 - Part VII; SSF2 – Part I.
4.3.10 Standards/rules
SW development plan (standards/rules):
The developer shall 4.3.10.1 identify SW Requirements standards/rules (note: minimum content identified in objective 4.3.4);
The developer shall 4.3.10.2 identify SW Design Standards/Rules;
The developer shall 4.3.10.3 identify SW Coding Standards/Rules;
also, references to the standards/rules for previously developed software, including COTS software, if those standards/rules are different.
Output: Annex A Section A.3.3.1; SSF1 - Part VII; SSF2 – Part I.

4.3.11 Requirement development management
Software Development Environment: The developer shall 4.3.11.1 identify the selected software development environment in terms of:
(1) the chosen requirements development method(s), procedure(s) and tools (if any) to be used;
(2) the hardware platforms for the tools (if any) to be used.
Example: Method(s) are, for example, SADT, SART, OOD…, whereas procedures are organisational ways of performing requirement management.
Output: Annex A Section A.3.3.1; SSF1 - Part VII; SSF2 – Part I.

4.3.12 Use of a Requirement specification tool
A Requirement specification tool shall 4.3.12.1 be used.
Output: Annex A Section A.3.3.1; SSF1 - Part I; SSF2 – Part I.
4.3.13 Resource management (load, memory, …)
A necessary margin with regard to usage of resources (eg memory, CPU load, drivers, …) for safety purposes shall 4.3.13.1 be specified.
The margin shall 4.3.13.2 be measured or verified to ensure satisfaction of the specification.
If several software items share the same resources, then the margin shall 4.3.13.3 be evaluated at system level.
Output: Annex A Sections A.3.3.5, A.3.3.6, and A.3.3.9; SSF1 - Part VII; SSF2 – Part I.
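A minimal sketch of the margin check in Objective 4.3.13, assuming the margin is expressed as the fraction of the resource that must remain unused; the function name, thresholds and measurements are hypothetical, not taken from the standard:

```python
# Illustrative sketch of Objective 4.3.13: specify a safety margin on
# resource usage and verify measurements against it. All values are
# hypothetical.

def margin_satisfied(specified_margin, capacity, measured_usage):
    """True if the unused fraction of the resource is at least the
    specified margin (e.g. 0.20 means 20% must remain free)."""
    unused_fraction = (capacity - measured_usage) / capacity
    return unused_fraction >= specified_margin

# CPU: 20% margin required, worst-case measured load 70% of capacity.
print(margin_satisfied(0.20, 100.0, 70.0))  # -> True
# Memory: 20% margin required, measured peak 90 MB of 100 MB.
print(margin_satisfied(0.20, 100.0, 90.0))  # -> False
```

When several software items share a resource, the same check would be applied to their combined measured usage, matching the system-level evaluation required by 4.3.13.3.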
Annex A Section A.3.3.1 SSF1 -
Part VII
The developer shall 4.3.14.1 define real-time design features of software components at
architectural design level. A set of properties, such as the following, shall 4.3.14.1 be SSF2 –
Rationale for identified: Part I
design choices • tasks and run- time aspects (priority, events, communications, ….)
4.3.14 especially real
time oriented • interruptions (priorities, delay management, SW watchdog…) Included
one in safety
• treatment & propagation of errors (detection & recovering mechanisms, ….) SW
• data management (protection & deadlock mechanisms, ….) design
standar
• initialisation/ stop (exchange of data during these phases) ds/rules
4.3.15 Traceability
Annex A Section A.3.3.1:
a) The developer shall 4.3.15.1 ensure there is traceability between System and Software requirements (Annex A.3.4).
b) The developer shall 4.3.15.2 ensure there is traceability between Software requirements and Software design (Software component level, architectural design) (Annex A.3.6).
c) The developer shall 4.3.15.3 ensure there is traceability between Software Architectural Design and Code (Annex A.3.7).
d) The developer shall 4.3.15.4 ensure there is traceability between Code and Executable.
Output: SSF1 - Part VII; SSF2 - Part I (two matrices per link, both ways)
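Traceability as required by objective 4.3.15 is bidirectional: every item on each side of a link must be covered in both directions. One possible mechanised check of a single link is sketched below; the identifiers are invented for illustration and do not come from ED-153.

```python
# Hypothetical bidirectional traceability check for one link of objective
# 4.3.15 (here: System requirements <-> Software requirements).

def untraced(items: set[str], links: set[tuple[str, str]], side: int) -> set[str]:
    """Items with no link entry on the given side (0 = upstream, 1 = downstream)."""
    covered = {link[side] for link in links}
    return items - covered

sys_reqs = {"SR-1", "SR-2"}
sw_reqs = {"SWR-1", "SWR-2", "SWR-3"}
links = {("SR-1", "SWR-1"), ("SR-2", "SWR-2"), ("SR-2", "SWR-3")}

print(untraced(sys_reqs, links, 0))  # set(): every system requirement traces down
print(untraced(sw_reqs, links, 1))   # set(): every software requirement traces up
```

The same function can be reused for the remaining links (software requirements to design, design to code, code to executable), giving the "two matrices per link" view from each direction.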
4.3.16 Transition criteria between lifecycle phases
Annex A Section A.3.3.1 (Verification/transition criteria)
The developer shall 4.3.16.1 describe the software lifecycle processes to be used to form the specific software lifecycle(s) to be used on the project, including the transition criteria for the software development processes.
All essential information from a phase of the software lifecycle needed for the correct execution of the next phase shall 4.3.16.2 be available and verified.
See also the evaluation criteria for specification, design, code, test and integration.
Transition criteria for the Requirements Analysis and Verification phases shall 4.3.16.4 be defined.
Output: SSF1 - Part VII; SSF2 - Part II
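Transition criteria of this kind can be pictured as a gate: the next phase may start only when every essential item from the previous phase is both available and verified. The sketch below is illustrative only; the item names are hypothetical and ED-153 does not prescribe any particular representation.

```python
# Hypothetical phase-transition gate for objective 4.3.16: each essential
# item from the previous phase must be available and verified before the
# next phase may begin.

def may_transition(criteria: dict[str, dict[str, bool]]) -> bool:
    """True only if every essential item is both available and verified."""
    return all(item["available"] and item["verified"] for item in criteria.values())

criteria = {
    "software requirements": {"available": True, "verified": True},
    "requirements traceability": {"available": True, "verified": False},
}
print(may_transition(criteria))  # False: traceability not yet verified
```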
4.3.17 Design tool
Annex A Sections A.3.3.1.1 and A.3.3.5 (Software Development Environment)
If a design tool is used, then the developer shall 4.3.17.1 identify the selected software development environment in terms of:
(1) The chosen design method(s), procedure(s) and tools (if any) to be used.
(2) The hardware platforms for the tools (if any) to be used.
Output: SSF1 - Part VII; SSF2 - Other
4.3.18 Use of Design tool
Annex A Sections A.3.3.1.1, A.3.3.2, A.3.3.5
A design tool shall 4.3.18.1 be used.
Output: SSF1 - Part I; SSF2 - Other
© EUROCAE, 2009
56
ALs SWAL
Obj
Objectives OBJECTIVES Output
N° 1 2 3 4
Title/Topic
4.3.19
Annex A Sections A.3.3.1, A.3.3.1.1, and A.3.3.7 (Software Development Environment)
The developer shall 4.3.19.1 identify the selected software development environment in terms of:
(1) The programming language(s), coding tools, compilers, linkage editors and loaders to be used,
(2) The hardware platforms for the tools to be used.
Output: SSF1 - Part I
4.3.20 Complexity constraints
Annex A Sections A.3.3.1.1, A.3.3.4, A.3.3.5, A.3.3.6, A.3.3.7, and A.3.3.8
A level of complexity (as well as selected criteria defining this complexity) shall 4.3.20.1 be defined and measured.
Output: SSF1 - Part VII; SSF2 - Part IV
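ED-153 leaves the choice of complexity criteria to the project. One commonly used candidate is McCabe's cyclomatic complexity, V(G) = E - N + 2P, computed over a control-flow graph; the sketch below measures it against a project-defined limit. The metric choice, the graph and the limit are all assumptions made for illustration, not requirements of the standard.

```python
# Hypothetical complexity measurement for objective 4.3.20, using McCabe's
# cyclomatic complexity V(G) = E - N + 2P as the selected criterion.

def cyclomatic_complexity(edges: list[tuple[str, str]], p: int = 1) -> int:
    """V(G) from a control-flow graph given as a list of directed edges."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2 * p

# A simple if/else diamond: entry branches to a or b, both rejoin at exit.
edges = [("entry", "a"), ("entry", "b"), ("a", "exit"), ("b", "exit")]
v = cyclomatic_complexity(edges)

LIMIT = 10  # project-defined threshold (hypothetical)
print(v, v <= LIMIT)  # 2 True
```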
The Operation Process contains the objectives and tasks of the operator. The process
covers the operation of the software product in the context of the ATM system
(including people procedures and equipment) and operational support to users.
Because operation of the software product is integrated into the operation of the system, the objectives and tasks of this process refer to the system. Therefore these objectives are not software-specific.
The operation process starts once transfer into operation starts, ie after a first software
release has been operationally approved. Operation includes transferring into
operation, commissioning and operating software. Decommissioning is covered by
Maintenance Processes.
The means (activities and evidence) to satisfy Operation process objectives 4.4.1 and 4.4.5 will vary per SWAL due to:
• Operation of the software and the consequences of the operation;
• The need for performance monitoring data.
The operational process addresses the usage of the software product by operational
staff in accordance with operational procedures.
As described in this document (chapter 5.5 VALIDATION), equipment validation is not
covered in this section.
Some ANS procedures exist which rely on the software. The purpose of this chapter is
not to define these ANS procedures, but to define how the software has to be
operated (eg HMI user’s manual, mode of operation) when using these ANS
procedures.
These ANS procedures need verification and validation, however this is out of scope
for this document.
The procedures for raising problem reports and modification requests are covered by
the Problem Resolution Process (Chapter 5 section 8 of this document).
The Maintenance Process contains the objectives and tasks of the maintainer. This
process is activated whenever the software product undergoes modifications to code
and associated documentation due to a problem or the need for improvement or
adaptation. This process includes the migration and decommissioning of the software
product. The process ends with the decommissioning of the software product.
The maintenance process as defined in this document covers modification of software
that has been commissioned ie software that is introduced into operation (not during
software development). However, the maintenance process as described in this
document does not cover the processes which collect reports or requests for
modification and agree on the acceptance of the modification (See Section 5.8
Problem Resolution Process).
The Problem Resolution Process triggers the Maintenance process.
NOTE: Maintenance does not cover modifications due to new requirements or
changes to existing requirements as this leads to repeating the complete
System Safety Assessment process.
CHAPTER 5
This chapter lists the objectives, per SWAL, that belong to supporting lifecycle
processes.
Supporting lifecycle processes consist of:
1. Documentation process:
a. Process implementation;
b. Design and development;
c. Production;
d. Maintenance.
2. Configuration management process:
a. Process implementation;
b. Configuration identification;
c. Configuration control;
d. Configuration status accounting;
e. Configuration evaluation;
f. Retrieval & Release process
g. Use of Configuration Management tool;
h. Acquirer agreement for the use of a Configuration Management tool;
i. Configuration Management at the level of Software Component;
j. Configuration Management Traceability.
3. Quality assurance process:
a. Process implementation;
b. Product assurance;
c. Process assurance;
d. Quality audits.
4. Verification process:
a. System verification;
b. Verification plan;
c. Software requirements;
d. Integration;
e. Software Design;
f. Code;
g. Independent verification;
h. Executable;
i. Data;
j. Traceability;
© EUROCAE, 2009
62
k. Complexity measures;
l. Verification process results;
m. Retrieval & Release process.
5. Validation process: (Not Applicable to SW as validation is system-related)
a. Process implementation;
b. Validation planning;
c. Boundaries validation;
d. Pass/Fail criteria;
e. Validation test;
f. Record of validation activities;
g. Independent validation team.
6. Joint review process:
a. Process implementation;
b. Project management reviews;
c. Technical reviews.
7. Audit process:
a. Process implementation;
b. Audit.
8. Problem resolution process:
a. Process implementation;
b. Problem resolution;
c. Safety impact;
d. Problem Report Configuration Management.
The objectives in a supporting process are the responsibility of the organisation
performing that process. Depending on the lifecycle phase, different organisations
may be responsible for performing a process.
Each organisation shall 5.0.1 ensure that the process is in existence and functional.
The Documentation Process is a process for recording information produced by a lifecycle process or activity. The process contains the set of
objectives, which plan, design, develop, produce, edit, distribute, and maintain those documents needed by all concerned such as managers,
engineers, NSA and users of the system or software product.
5.1.1 Process Implementation
Annex A Section A.4.1
A plan, identifying the documents to be produced during the lifecycle of the software product, shall 5.1.1.1 be defined and implemented.
Output: SSF1 - Part III; SSF2 - Part II

5.1.2 Design & Development
Annex A Section A.4.1
Each identified document shall 5.1.2.1 be designed in accordance with applicable documentation standards/rules.
Output: SSF1 - Part III; SSF2 - Part II
5.1.3 Production
Annex A Section A.4.1
The documents shall 5.1.3.1 be produced and provided in accordance with the plan. Production and distribution of documents may use paper, electronic, or other media. Master materials shall 5.1.3.2 be stored in accordance with requirements for record retention, security, maintenance, and backup.
Output: SSF1 - Part III; SSF2 - Part II
5.1.4 Maintenance
Annex A Section A.4.1
The tasks that are required to be performed when documentation is to be modified shall 5.1.4.1 be performed in accordance with the configuration management processes (5.2).
Output: SSF1 - Part III; SSF2 - Part II
5.2.1 Configuration management process implementation
Annex A Section A.4.2 (Process implementation)
A configuration management plan shall 5.2.1.1 be developed. The plan shall 5.2.1.2 include, as a minimum:
• the configuration management activities;
• procedures and schedule for performing these activities;
• the organisation(s) responsible for performing these activities, and their relationship with other organisations, such as software development or maintenance;
• software lifecycle environment control management (tools used to develop or verify SW);
• definition of SW lifecycle data (any output relevant to the safety assurance of the software) control management.
The plan shall 5.2.1.3 be documented, placed under configuration management and implemented.
Output: SSF1 - Part VII; SSF2 - Part II
5.2.2 Configuration identification
Annex A Sections A.3.3.1 and A.4.2
A scheme shall 5.2.2.1 be established for identification of software and their versions, to be controlled throughout the complete lifecycle of the software.
For each version of all software, the following shall 5.2.2.2 be identified, as a minimum:
• the documentation that establishes the baseline;
• the version references;
• the problem reports list (those already fixed, those fixed in that particular version and those still open, if any);
• and other identification details.
The items to be configuration-identified shall 5.2.2.3 be identified, along with their associated configuration management level.
Output: SSF1 - Part VII; SSF2 - Part VI
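The minimum identification data of objective 5.2.2 can be captured as a simple record per software version. The sketch below is purely illustrative; the field names and values are invented, and a real CM tool would hold this information, not ad-hoc code.

```python
# Hypothetical version identification record for objective 5.2.2: baseline
# documentation, version reference, and the three problem report lists.
from dataclasses import dataclass

@dataclass(frozen=True)
class VersionRecord:
    version: str                          # version reference, eg "2.1.0"
    baseline_docs: tuple                  # documentation establishing the baseline
    fixed_in_this_version: tuple = ()     # problem reports fixed in this version
    previously_fixed: tuple = ()          # problem reports already fixed earlier
    still_open: tuple = ()                # problem reports still open, if any

rec = VersionRecord(
    version="2.1.0",
    baseline_docs=("SRS-v4", "SDD-v3"),
    fixed_in_this_version=("PR-104",),
    still_open=("PR-110",),
)
print(rec.version, rec.still_open)
```

Making the record immutable (`frozen=True`) mirrors the intent that an identified version is a fixed baseline; a new version gets a new record.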
5.2.3 Configuration control
Annex A Section A.4.2
The following shall 5.2.3.1 be performed: identification and recording of change requests; analysis and evaluation of the changes; approval or rejection of the request; and implementation, verification, and release of the modified software.
An audit trail shall 5.2.3.2 exist, whereby each modification, the reason for the modification, and authorisation of the modification can be traced.
Control and audit of all accesses to the controlled software that handle safety related functions shall 5.2.3.3 be performed.
Output: SSF1 - Part V; SSF2 - Part VI
5.2.4 Configuration status accounting
Annex A Section A.4.2
Management records and status reports that show the status and history of controlled software, including baselines, shall 5.2.4.1 be prepared. Status reports shall 5.2.4.2 include the number of changes for a project, latest software versions, release identifiers, the number of releases, and comparisons of releases.
Output: SSF1 - Part V & VII; SSF2 - Part VI
5.2.5 Configuration evaluation
Annex A Section A.4.2
The following shall 5.2.5.1 be determined and ensured: the functional completeness of the software against its requirements, and the physical completeness of the software (whether its design and code reflect an up-to-date technical description).
Output: SSF1 - Part VII; SSF2 - Part IV
5.2.6 Retrieval & Release Process
Annex A Section A.4.2
A retrieval and release process shall 5.2.6.1 exist and shall 5.2.6.2 be documented.
The release and delivery of software products and documentation shall 5.2.6.3 be formally controlled. Master copies of code and documentation shall 5.2.6.4 be maintained for the life of the software product.
Output: SSF1 - Part VII; SSF2 - Part VI
5.2.7 Use of a CM tool
A tool shall 5.2.7.1 be used to perform software configuration management.
Output: SSF1 - Part I; SSF2 - Part VI
5.2.8 Use of a CM tool (acquirer agreement)
The acquirer shall 5.2.8.1 approve the selected software configuration management tool.
Output: SSF1 - Part VII; SSF2 - Part VI
5.2.9 At level of SW component
The software configuration management shall 5.2.9.1 be performed at the Software Unit level.
Output: SSF1 - Part VII; SSF2 - Part VI
5.2.10 Configuration Management Traceability
Software lifecycle data (any output) shall 5.2.10.1 be traceable between versions.
All lifecycle data shall 5.2.10.2 be traceable to the version of software being deployed.
Output: SSF1 - Part VII; SSF2 - Part VI
5.2.11 At level of SW source code
The software configuration management shall 5.2.11.1 be performed at the Software source code level.
Output: SSF1 - Part VII; SSF2 - Other
The Quality Assurance Process is a process for providing adequate assurance that
the software products and processes in the project lifecycle conform to their specified
requirements and adhere to their established plans. To be unbiased, quality
assurance shall 5.3.1 have organisational freedom and authority from persons directly
responsible for developing the software product or executing the process in the
project. Quality assurance may be internal or external depending on whether evidence
of product or process quality is demonstrated to the management of the supplier or the
acquirer. Quality assurance may make use of the results of other supporting
processes, such as Verification, Validation, Joint Reviews, Audits, and Problem
Resolution.
NOTE: The Joint Review and Audit Processes may be addressed (with
justification) within the Quality Assurance Process.
5.3.1 Process Implementation
Annex A Section A.4.3
A quality assurance process tailored to the project shall 5.3.1.1 be established. The objectives of the quality assurance process shall 5.3.1.2 be to assure that the software products and the processes employed for providing those software products comply with their established requirements and adhere to their established plans.
A plan for conducting the quality assurance process activities and tasks shall 5.3.1.3 be defined, implemented, and maintained (including configuration management of evidence records) throughout the relevant parts of the software lifecycle.
Output: SSF1 - Part VII; SSF2 - Part I
5.3.2 Product Assurance
Annex A Section A.4.3
It shall 5.3.2.1 be assured that all the plans required by ED-153 are defined, are mutually consistent, and are being executed as required.
A Software Conformity review shall 5.3.2.2 be performed.
Output: SSF1 - Part VII; SSF2 - Part I
5.3.3 Process Assurance
Annex A Section A.4.3
It shall 5.3.3.1 be assured that those software lifecycle processes (supply, development, operation, maintenance, and supporting processes including quality assurance) employed for the project adhere to the plans.
It shall 5.3.3.2 be assured that the internal software engineering practices, development environment and test environment adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I, IV, V
The Verification Process is a process for determining whether the software products of
an activity fulfil the requirements or conditions imposed on them in the previous
activities. For cost and performance effectiveness, verification should be integrated,
as early as possible, with the process (such as supply, development, operation, or
maintenance) that employs it. This process may include analysis, review and test.
This process may be executed with varying degrees of independence. The degree of
independence may range from the same person or different person in the same
organisation to a person in a different organisation with varying degrees of separation.
In the case where the process is executed by an organisation independent of the
supplier, developer, operator, or maintainer, it is called Independent Verification
Process (confirmation by examination of evidence that a product, process or service,
fulfils specified requirements).
All verification activities assess a product not only against its requirements, but also for conformance with the process that created it and with any particular standards/rules used in its creation.
Note: The compliance should be verified according to the definition of the transition
criteria between lifecycle phases (cf SWAL allocation for Development process)
• Verification results are correct and complete, and discrepancies are justified.
NOTE: Verification may be performed through inspection, analysis, demonstration, or a combination of these throughout the lifecycle.
5.4.13 Verification of Retrieval and Release process
The Software Retrieval and Release process shall 5.4.13.1 be verified.
Output: SSF1 - Part VII; SSF2 - Part I, II, V, VI
The Validation Process is a process for determining whether the requirements and the
final, as-built system or software product fulfils its specific intended use. Validation
may be conducted in earlier stages. This process may be conducted as a part of
Software Acceptance Support.
This process may be executed with varying degrees of independence. The degree of
independence may range from the same person or different person in the same
organisation to a person in a different organisation with varying degrees of separation.
In the case where the process is executed by an organisation independent of the
supplier, developer, operator, or maintainer, it is called Independent Validation
Process (confirmation by examination and provision of objective evidence that the
particular requirements for a specific intended use are fulfilled; usually used for
internal validation of the design).
The ANSP is responsible for conducting validation (which intends to show that the
system meets its safety objectives in its operational environment). Validation concerns
the equipment part of the system, as some procedures are defined to perform this
equipment validation.
Validation is out of scope of this document and therefore no objectives are defined
here.
The Joint Review Process is a process for evaluating the status and products of an
activity of a project as appropriate. Joint reviews are at both project management and
technical levels and are held throughout the life of the contract. This process may be
employed by any two parties, where one party (reviewing party) reviews another party
(reviewed party).
To be unbiased, the Joint Review Process shall 5.6.1 have organisational freedom and authority from persons directly responsible for developing the software product or executing the process in the project.
5.6.1 Process implementation
Annex A Section A.4.6
Periodic reviews shall 5.6.1.1 be held at predetermined milestones as specified in the project plan(s).
The review results shall 5.6.1.2 be documented and distributed.
Output: SSF1 - Part VII; SSF2 - Part I
The Audit Process is a process for determining compliance with the requirements, plans, and contract as appropriate. This process may be
employed by any two parties, where one party (auditing party) audits the software products or activities of another party (audited party).
To be unbiased, the Audit Process shall 5.7.1 have organisational freedom and authority from persons directly responsible for developing the
software product or executing the process in the project.
5.7.1 Process implementation
Annex A Section A.4.7
Audits shall 5.7.1.1 be held at predetermined milestones as specified in the project plan(s) or upon specific request.
After completing an audit, the audit results shall 5.7.1.2 be documented and provided to the audited party.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.2 Audits at SW requirement level
Annex A Section A.4.7
Audits at SW requirement level shall 5.7.2.1 be conducted at predetermined milestones.
Audits shall 5.7.2.2 identify whether:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement) and User documentation complies with standards/rules as specified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.3 Audits down to SW design level
Annex A Section A.4.7
Audits shall 5.7.3.1 be conducted at predetermined milestones or upon specific request to ensure that:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements and SW architecture requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement and SW architecture) and User documentation complies with standards/rules as specified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.4 Quality audits down to source code level
Annex A Section A.4.7
Audits shall 5.7.4.1 be conducted at predetermined milestones and upon specific request to ensure that:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements, SW architectural requirements and SW detailed design requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement, SW architecture, SW detailed design and Source code) and User documentation complies with standards/rules as specified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
5.7.5 Quality audits down to executable level
Annex A Section A.4.7
Audits shall 5.7.5.1 be conducted at predetermined milestones or upon specific request to ensure that:
• The acceptance review and verification requirements prescribed by the documentation are adequate for the acceptance of the software.
• Verification data comply with the specification.
• Software was successfully verified and meets its SW requirements, SW architectural requirements and SW detailed design requirements.
• Verification reports are correct and discrepancies between actual and expected results have been resolved.
• Product (SW requirement, SW architecture, SW detailed design, Source code and executable) and User documentation complies with standards/rules as specified.
• SW development tools (eg Compilers) are qualified.
• Activities have been conducted according to applicable requirements, plans, and contract.
• The costs and schedules adhere to the plans.
Output: SSF1 - Part VII; SSF2 - Part I
The Problem/Change Resolution Process is a process for analysing and resolving the
problems (including non-conformances), whatever their nature or source, that are
discovered during the execution of development, operation, maintenance, or other
processes. The objective is to provide a timely, responsible, and documented means
to ensure that all discovered problems are analysed and resolved and trends are
recognised.
Problem/Change Resolution phase includes:
• Non-conformance reports (during all phases), which may or may not affect safety;
• Correction: a modification has to be performed in order to fix a reported problem;
• Prevention: a modification has to be performed because an analysis has concluded that the software behaviour could contribute to a safety-related event:
  - either because a review/update of the system safety assessment process (at the system or at the software level) identified it, having not done so until then;
  - or because an operational report identified it, though no safety occurrence happened;
• Evolution: a modification has to be performed because the software has to be updated for technological reasons (eg change of hardware platform, software development tool version change, software development tool obsolescence).
5.8.1 Process implementation
Annex A Section A.4.8
A problem resolution process (including the demonstration of the successful resolution of the problem) shall 5.8.1.1 be defined for handling all problems (including non-conformances) detected in the software products and processes/activities.
Note: The implementation of the process could, for example, include the establishment of a review board.
Output: SSF1 - Part V; SSF2 - Part I
5.8.2 Problem resolution
Annex A Section A.4.8
When problems (including non-conformances) have been detected in a software product or an activity, a problem report shall 5.8.2.1 be prepared to describe each problem detected.
The problem report shall 5.8.2.2 be used as part of a closed-loop process: from detection of the problem, through investigation, analysis and resolution of the problem and its cause, to trend detection across problems.
Output: SSF1 - Part V; SSF2 - Part I
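The closed loop of objective 5.8.2 can be pictured as a small state machine: each report moves from detection through investigation, analysis and resolution to closure, and closed reports feed trend detection. The states and the trend rule below are hypothetical illustrations, not part of ED-153.

```python
# Hypothetical closed-loop problem handling for objective 5.8.2: a report
# advances through fixed states, and closed reports feed trend detection.
from collections import Counter

STATES = ("detected", "investigated", "analysed", "resolved", "closed")

def advance(report: dict) -> dict:
    """Move a problem report to the next state in the closed loop."""
    i = STATES.index(report["state"])
    if i + 1 < len(STATES):
        report["state"] = STATES[i + 1]
    return report

def trends(closed_reports: list[dict], threshold: int = 2) -> list[str]:
    """Causes seen at least `threshold` times across closed reports."""
    counts = Counter(r["cause"] for r in closed_reports)
    return [cause for cause, n in counts.items() if n >= threshold]

r = {"id": "PR-1", "state": "detected", "cause": "timer overflow"}
while r["state"] != "closed":
    advance(r)

closed = [r, {"id": "PR-2", "state": "closed", "cause": "timer overflow"}]
print(trends(closed))  # ['timer overflow']
```

A recurring cause surfacing in `trends` is exactly the kind of signal that should trigger the safety impact analysis of objective 5.8.3.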
5.8.3 Safety Impact
An analysis shall 5.8.3.1 be performed to:
• assess whether a problem report has a safety impact (risk assessment and mitigation, eg FHA/PSSA/SSA) or compromises the previous SWAL allocation;
• ensure that corrective actions exist such that safety-related problems can be shown to have been acceptably mitigated.
Output: SSF1 - Part V; SSF2 - Part I
5.8.4 Problem Report Configuration Management
Problem reports shall 5.8.4.1 be considered as software lifecycle data to be put under configuration management as defined in objective 5.2.1.
Output: SSF1 - Part V; SSF2 - Part I
CHAPTER 6
This chapter lists the objectives, per SWAL, that belong to organisational lifecycle
processes.
Organisational lifecycle processes consist of:
1. Management process:
a. Initiation and scope definition;
b. Planning;
c. Execution and control;
d. Review and evaluation;
e. Closure.
2. Infrastructure process:
a. Process implementation;
b. Establishment of the infrastructure;
c. Maintenance of the infrastructure.
3. Improvement process:
a. Process establishment;
b. Process assessment;
c. Process improvement.
4. Training process:
a. Process implementation;
b. Training material development;
c. Training plan implementation.
The objectives and tasks in an organisational process are the responsibility of the
organisation using that process. Depending on the lifecycle phase, different
organisations may be responsible for performing a process. Each organisation
ensures that the process is in existence and functional. Therefore the satisfaction of
these objectives can be achieved at the organisational level via documents (Safety
Policy, Quality Management System…) not specifically/exclusively addressing
software.
The Management Process contains the generic objectives and tasks, which may be
employed by any party that has to manage its respective processes. The manager is
responsible for product management, project management, and task management of
the applicable process(es), such as the acquisition, supply, development, operation,
maintenance, or supporting process.
6.1.1 Management Process implementation; Initiation & scope definition
Process implementation
A management process tailored to the project shall 6.1.1.1 be defined. The output of the management process shall 6.1.1.2 be documented and distributed.
The management process shall 6.1.1.3 be initiated by establishing the requirements of the process to be undertaken.
The manager shall 6.1.1.4 establish the feasibility of the process by checking that the resources (personnel, materials, technology, and environment) required to execute and manage the process are available, funded, adequate, and appropriate, and that the time-scales to completion are achievable.
Output: SSF1 - Part III; SSF2 - Other
6.1.2 Planning
The manager shall 6.1.2.1 prepare the plans for execution of the process. The plans associated with the execution of the process shall 6.1.2.2 contain descriptions of the associated activities and tasks and identification of the software products that will be provided. These plans shall 6.1.2.3 include, as a minimum, the following:
• Schedules for the timely completion of tasks;
• Estimation of effort;
• Adequate resources needed to execute the tasks;
• Allocation of tasks (including who, what and when);
• Assignment of responsibilities;
• Quantification of project risks associated with the tasks or the process itself;
• Quality control measures to be employed throughout the process;
• Costs associated with the process execution;
• Provision of environment and infrastructure.
Output: SSF1 - Part III; SSF2 - Other
6.1.3 Execution & control
The manager shall 6.1.3.1 initiate the implementation of the plan to satisfy the objectives and criteria set, exercising control over the process.
The manager shall 6.1.3.2 monitor the execution of the process, providing both internal reporting of the process progress and external reporting to the acquirer as defined in the contract.
The manager shall 6.1.3.3 investigate, analyse, and resolve the problems discovered during the execution of the process.
Output: SSF1 - Part III; SSF2 - Other
6.1.4 Review & evaluation
The manager shall 6.1.4.1 ensure that the software lifecycle data is evaluated for satisfaction of requirements.
The manager shall 6.1.4.2 assess the evaluation results of the software products, activities, and tasks completed during the execution of the process vis-à-vis the achievement of the objectives and completion of the plans.
Output: SSF1 - Part III; SSF2 - Other
6.1.5 Closure
When all software products, activities, and tasks are completed, the manager shall 6.1.5.1 determine whether the process is complete, taking into account the criteria as specified in the contract or as part of the organisation's procedures.
The manager shall 6.1.5.2 check the results and records of the software products, activities, and tasks employed for completeness. These results and records shall 6.1.5.3 be archived in a suitable environment as specified in the contract.
Output: SSF - Part III
The infrastructure process is a process to establish and maintain the infrastructure needed for any other process. The infrastructure may
include hardware, software, tools, techniques, standards and facilities for development, operation or maintenance.
6.2.1 Process implementation
• The infrastructure shall 6.2.1.1 be defined to meet the requirements of the processes (eg development or verification) employing this process, considering the applicable procedures, standards/rules, tools, and techniques.
• The establishment of the infrastructure shall 6.2.1.2 be planned.
Output: SSF1 - Part I; SSF2 - Other

6.2.2 Establishment of the infrastructure
• The configuration of the infrastructure shall 6.2.2.1 be planned. Functionality, performance, safety, security, availability, space requirements, equipment, costs, and time constraints shall 6.2.2.2 be considered.
Output: SSF1 - Part I; SSF2 - Other

6.2.3 Maintenance of the infrastructure
• The infrastructure shall 6.2.3.1 be maintained, monitored, and modified as necessary to ensure that it continues to satisfy the requirements of the processes (eg development or verification) employing this process. As part of maintaining the infrastructure, the extent to which the infrastructure is under configuration management shall 6.2.3.2 be defined.
Output: SSF1 - Part VII; SSF2 - Other
The Improvement Process is a process for establishing, assessing, measuring, controlling, and improving a software lifecycle process (as required by ISO 9001:2000).
The Training Process is a process for providing and maintaining trained personnel. The acquisition, supply, development, operation, or
maintenance of software products is largely dependent upon knowledgeable and skilled personnel. For example: developer personnel should
have essential training in software management and software engineering. It is, therefore, imperative that personnel training be planned and
implemented early so that trained personnel are available as the software product is acquired, supplied, developed, operated, or maintained.
Training includes the use and customisation of standards to a specific application.
In this section, training does not address training of operational staff in charge of operating software, but training of staff in charge of
developing or maintaining software.
6.4.1 Process implementation
• A review of the project requirements shall 6.4.1.1 be conducted to establish and make timely provision for acquiring or developing the resources and skills required by the management and technical staff. The types and levels of training and categories of personnel needing training shall 6.4.1.2 be determined. A training plan, addressing implementation schedules, resource requirements, and training needs, shall 6.4.1.3 be defined.
Output: SSF1 - Part VII; SSF2 - Other

6.4.2 Training material development
• Training manuals, including presentation materials used in providing training, shall 6.4.2.1 be developed.
Output: SSF1 - Part VII; SSF2 - Other

6.4.3 Training plan implementation
• The training plan shall 6.4.3.1 be implemented to provide training to personnel. Training records shall 6.4.3.2 be maintained.
Output: SSF1 - Part VII; SSF2 - Other
CHAPTER 7
This chapter lists the objectives, per SWAL, that do not belong to ISO/IEC 12207 but have been added due to:
• The analysis of other, more safety-oriented standards (ED-109/DO-278, ED-12B/DO-178B and IEC 61508),
• ANS particularities (some are included in ED-109/DO-278),
• Omissions by existing standards.
These additional lifecycle processes consist of:
1. Software development environment
a. Definition
b. Programming language
c. Compiler considerations
2. COTS
a. COTS plans;
b. COTS Transition criteria;
c. COTS Plan consistency;
d. COTS requirement coverage;
e. COTS Lifecycle data;
f. COTS Derived requirements;
g. COTS HW compatibility;
h. COTS Configuration Management;
i. COTS Problem Reporting;
j. COTS Incorporation;
k. COTS Configuration Management Archiving;
3. Tool qualification (Out of scope of these recommendations)
a. Qualification criteria for software development tools
b. Qualification criteria for software verification tools
4. Service experience
7.2 COTS
COTS definition[1]:
COTS software encompasses a wide range of software, including purchased software,
Non-Developmental Items (NDI), and software previously developed without
consideration of ED-153. The term “Previously Developed Software” is also used for
such software. This software may or may not have been approved through other
“approval processes.” Partial or no data may be available as evidence that the objectives of the ANS development process have been satisfied. For the rest of this section, all such
software is referred to as COTS for the sake of brevity. This terminology was selected
because of the usual use of the term “COTS” within the “ground” ANS community.
Examples of COTS are operating systems, real-time kernels, graphical user
interfaces, communication and telecommunication protocols, language run-time
libraries, mathematical and low-level bit routines, and string manipulation routines.
COTS software can be purchased apart from or in conjunction with COTS hardware,
such as workstations, mainframes, communication and network equipment, or
hardware items (eg, memory, storage, I/O devices). There also may be some
instances where the use of COTS software is impractical to avoid (eg, library code
associated with certain compilers).
COTS deliverables vary by the contract with the COTS supplier. They may extend
from license rights, executable code, user documentation, and training to the full set of
COTS lifecycle data, including the source code resulting from the COTS development.
COTS information disclosure raises issues related to cost, protection of intellectual property, and legal questions (eg ownership of the software, patents, liability, and documentation responsibility). These aspects are beyond the scope of this standard, which addresses
only those aspects that are specific to software assurance.
Development processes used by COTS suppliers and procurement processes applied
by acquirers may not be equivalent to recommended processes, and may not be fully
consistent with the guidance of this document. The use of COTS may mean that
alternate methods are used to gain assurance that the appropriate objectives are
satisfied. These methods include, but are not limited to, product service experience,
prior assurance, process recognition, reverse engineering, restriction of functionality,
formal methods, and audits and inspections. Data may also be combined from more
than one method to gain assurance data that the objectives are satisfied.
In cases where insufficient data is available to satisfy the objectives, this section may
be used as guidance to adapt the SSAS (See Objective 3.0.13) with agreement from
the appropriate Approval Authority.
This section applies only to COTS used for ANS applications and is not intended to
alter or substitute any of the objectives applied to ANS software unless justified by a
safety assessment process and accepted by the appropriate Approval Authority. It
aims at supporting the customisation of the SSAS (See objective 3.0.13) with
agreement from the Approval Authority.
Where possible, each COTS section bears the title of the process it addresses with alternative objectives or activities (eg the verification process). However, the title does not always identify in a straightforward manner which process is alternatively addressed (eg the acquisition process replaces software requirements).
NOTE: COTS (as defined above) usage is not accepted for software having to satisfy SWAL1.
COTS software may need to be integrated into high integrity ANS systems or
equipment. However, the higher the risk of the ANS function, the more demanding the
assurance requirements are for the system and the software. Alternate methods may
be used to augment design assurance data for COTS software components at a
desired assurance level. When COTS are used in an ANS system, considerations
such as planning, acquisition, and verification should be addressed.
Risk mitigation techniques may be considered to reduce the ANS system’s reliance on
the COTS. The goal of these mitigation techniques is to reduce the effect of
anomalous behaviour of COTS on the ANS system function. Risk mitigation
techniques may be achieved through a combination of people, procedure, equipment,
or architecture. For example, architectural means may involve partitioning,
redundancy, safety monitoring, COTS safe subsets by the use of encapsulation or
wrappers, and data integrity checking.
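As an illustration of the wrapper/encapsulation technique mentioned above, the following minimal Python sketch (all names are invented; nothing here is prescribed by ED-153) encapsulates an opaque COTS routine and applies data integrity checks so that anomalous COTS behaviour is contained before its output reaches the ANS function:

```python
# Hypothetical sketch: wrapping a COTS routine with data integrity checks.
# cots_track_altitude and AltitudeWrapper are illustrative names only.

def cots_track_altitude(raw: bytes) -> float:
    """Stand-in for an opaque COTS routine."""
    return float(raw.decode())

class AltitudeWrapper:
    """Encapsulates the COTS call and checks its output envelope."""

    def __init__(self, min_ft: float = -1000.0, max_ft: float = 60000.0):
        self.min_ft = min_ft
        self.max_ft = max_ft
        self.anomalies = []          # recorded for problem reporting

    def altitude_ft(self, raw: bytes, fallback: float) -> float:
        try:
            value = cots_track_altitude(raw)
        except Exception as exc:     # COTS failure is contained here
            self.anomalies.append(("exception", repr(exc)))
            return fallback
        if not (self.min_ft <= value <= self.max_ft):
            self.anomalies.append(("out_of_range", value))
            return fallback          # safe default instead of bad data
        return value

wrapper = AltitudeWrapper()
print(wrapper.altitude_ft(b"35000", fallback=0.0))  # valid COTS output: 35000.0
print(wrapper.altitude_ft(b"99999", fallback=0.0))  # out of range -> 0.0
```

In a real architecture the same idea would typically be combined with partitioning or redundancy rather than used alone.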
7.2.1 COTS Plans (SWAL1: N/A)
• A COTS acquisition, verification, configuration management, quality assurance plan (or plans) shall 7.2.1.1 be defined.
Output: SSF1 - Part VII; SSF2 - Part II

7.2.2 COTS Transition Criteria (SWAL1: N/A)
• Transition criteria shall 7.2.2.1 be defined (according to the relationships between COTS processes and appropriate CNS/ATM lifecycle processes).
Output: SSF1 - Part VII; SSF2 - Part II

7.2.3 COTS Plans Consistency (SWAL1: N/A)
• COTS plans shall 7.2.3.1 be consistent with ANS SW plans (plans for acquisition, evaluation, integration processes are consistent with ANS SW plans).
Output: SSF1 - Part VII; SSF2 - Part II
The focus of this section is on the assurance aspects of acquiring COTS. There are
additional acquisition considerations not described in this document. The COTS acquisition process comprises requirements definition, assessment, and selection.
a. Requirements Definition: The ANS software requirements definition process
identifies software requirements that COTS may satisfy. COTS may contain
more capabilities than the requirements needed by the ANS system. A
definition of these capabilities may be available from the supplier or derived
from the COTS user’s manuals, technical materials, product data, etc. In the
model depicted in Figure 5, the ANS requirements satisfied by COTS are the
intersection of COTS capabilities and ANS requirements.
Due to the use of COTS, there may be derived requirements (eg platform
dependent requirements, interrupt handling, interface handling, resource
requirements, usage constraints, error handling, partitioning) that should be
added to the ANS software requirements.
All ANS requirements satisfied by the COTS software and the resulting derived
requirements should be provided to the safety assessment process.
FIGURE 5: CNS/ATM Requirements satisfied by COTS (the intersection of COTS Capabilities and CNS/ATM Requirements)
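The intersection model described above can be sketched as plain set operations. The following Python fragment is illustrative only; the requirement identifiers are invented:

```python
# Hypothetical sketch of the requirements-definition model: the CNS/ATM
# requirements satisfied by COTS are the intersection of the COTS
# capabilities and the CNS/ATM requirements. Identifiers are invented.

cots_capabilities = {"REQ-POS-1", "REQ-POS-2", "REQ-LOG-9", "EXTRA-GUI-1"}
cnsatm_requirements = {"REQ-POS-1", "REQ-POS-2", "REQ-ALERT-3"}

satisfied_by_cots = cots_capabilities & cnsatm_requirements
unused_capabilities = cots_capabilities - cnsatm_requirements  # may drive derived reqs (7.2.6)
uncovered = cnsatm_requirements - cots_capabilities            # to be developed elsewhere

print(sorted(satisfied_by_cots))    # ['REQ-POS-1', 'REQ-POS-2']
print(sorted(unused_capabilities))  # capabilities outside the ANS requirements
print(sorted(uncovered))            # ['REQ-ALERT-3']
```

The unused-capability set corresponds to the COTS functions whose effect on the ANS system must be prevented or shown harmless; the uncovered set is what the COTS does not satisfy.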
7.2.4 COTS Requirements Coverage (SWAL1: N/A)
• ANS SW requirements coverage achieved by the COTS shall 7.2.4.1 be demonstrated.
Output: SSF1 - Part VII; SSF2 - Part II

7.2.5 COTS Lifecycle Data (SWAL1: N/A)
• COTS lifecycle data availability shall 7.2.5.1 be determined in accordance with SWAL (extent of lifecycle data that are available for assurance purposes).
Output: SSF1 - Part VII; SSF2 - Part II

7.2.6 COTS Derived Requirements (SWAL1: N/A)
• Derived requirements shall 7.2.6.1 be defined (requirements imposed on the ANS system due to the usage of COTS, or requirements to prevent the unneeded functions of the COTS from affecting the ANS system).
Output: SSF1 - Part VII; SSF2 - Part II

7.2.7 COTS HW Compatibility (SWAL1: N/A)
• Compatibility of COTS with target computers shall 7.2.7.1 be demonstrated.
Output: SSF1 - Part VII; SSF2 - Part II
There are no additional verification objectives imposed upon the ANS system because of the use of COTS.
The use of alternate methods should satisfy both of the following conditions:
a. The safety assessment process supports the justification.
b. Acceptance is granted by the appropriate Approval Authority.
Activities used for specific alternate methods or for combination of alternate methods
are considered on a case-by-case basis. An example of activities associated with the
usage of service experience for assurance credit is provided below.
Use of service experience data for assurance credit is predicated upon two factors:
sufficiency and relevance. Sufficient service experience data may be available
through the typical practice of running new ANS systems in parallel with operational
systems in the operational environment, long duration of simulation of new ANS
systems, and multiple shadow operations executing in parallel at many locations.
Relevant service experience data may be available for ANS systems from reuse of
COTS software from in-service ANS Systems, or ANS system verification and pre-
operational activities. For COTS software with no precedence in ANS applications,
many processes may be used to collect service experience; examples include the
validation process, the operator training process, the system qualification testing, the
system operational evaluation, and field demonstrations.
The following applies for accumulation of service experience:
a. The use, conditions of use, and results of COTS service experience should be
defined, assessed by the safety assessment process, and submitted to the
appropriate Approval Authority.
b. The COTS operating environment during service experience time should be
assessed to show relevance to the intended use in ANS. If the COTS operating
environment of the existing and intended applications differ, additional
verification should be performed to ensure that the COTS application and the
ANS applications will operate as intended in the target environment. It should
be assured that COTS capabilities to be used are exercised in all operational
modes. Analysis should also be performed to assure that relevant permutations
of input data are executed.
c. Any changes made to COTS during service experience time should be
analysed. An analysis should be conducted to determine whether the changes
made to COTS alter the applicability of the service experience data for the
period preceding the changes.
d. All in-service problems should be evaluated for their potential adverse effect on
ANS system operation. Any problem during service experience time, where
COTS implication is established and whose resulting effect on ANS operations
is not consistent with the safety assessment, should be recorded. Any such
problem should be considered a failure. A failure invalidates the use of related
service experience data for the period of service experience time preceding the
correction of that problem.
e. COTS capabilities which are not necessary to meet ANS requirements should
be shown to provide no adverse effect on ANS operations.
f. Service experience time should be the accumulated in-service hours. The
number of copies in service should be taken into account to calculate service
experience time, provided each copy and associated operating environment are
shown to be relevant, and that a single copy accounts for a certain pre-
negotiated percentage of the total.
NOTE: Software field data cannot be treated as if it were random hardware
failures.
Available COTS data may not be able to demonstrate satisfaction of all of the
verification objectives described in this document. For example, high-level
requirements testing for both robustness and normal operation may be demonstrated
for COTS but the same tests for low-level requirements may not be accomplished.
The use of service experience may be proposed to demonstrate satisfaction of these
verification objectives for COTS. The amount of service experience to be used is
selected based on engineering judgement and experience with the operation of ANS
systems. The results of software reliability models cannot be used to justify service
experience time. A possible approach for different assurance levels is provided
below:
1. Cannot be applied for SWAL1.
2. A minimum of one year (8,760 hours) of service experience with no failure for
SWAL2.
3. A minimum of six months (4,380 hours) of service experience with no failure for
SWAL3.
4. SWAL4 objectives are typically satisfied without a need for alternate methods.
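The accumulation rule of item f above (service experience time as accumulated in-service hours, with a pre-negotiated cap on any single copy's share) and the indicative per-SWAL minima can be sketched as follows. This is an illustrative, non-normative Python fragment; the 25% single-copy cap and all function names are assumptions:

```python
# Illustrative sketch (not normative): accumulating service experience
# hours across relevant in-service copies, capping the share any single
# copy may contribute, then checking the indicative per-SWAL minima
# quoted above. The 25% cap value and the names are invented.

THRESHOLD_HOURS = {2: 8760.0, 3: 4380.0}   # SWAL2: one year; SWAL3: six months

def accumulated_hours(copy_hours, single_copy_cap=0.25):
    """Sum hours over relevant copies; no single copy contributes more
    than the pre-negotiated fraction of the total (assumed value)."""
    total = sum(copy_hours)
    capped = [min(h, single_copy_cap * total) for h in copy_hours]
    return sum(capped)

def service_experience_acceptable(swal, copy_hours, failures=0):
    if swal == 1:
        return False            # alternate method not accepted for SWAL1
    if swal == 4:
        return True             # SWAL4 typically needs no alternate method
    if failures > 0:
        return False            # a failure invalidates the preceding data
    return accumulated_hours(copy_hours) >= THRESHOLD_HOURS[swal]

# Three relevant copies of 2000 h each: capped total 4500 h >= 4380 h
print(service_experience_acceptable(3, [2000.0, 2000.0, 2000.0]))  # True
```

Note that, per the NOTE above, such hour counts are evidence of exposure only; software field data cannot be treated like random hardware failure statistics.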
This section describes the configuration management process for a system using
COTS. The configuration management system of the COTS supplier is not under the control of the ANS configuration management system. The ANS configuration
management system should include control of the COTS versions.
7.2.8 COTS Configuration Management: Identification (SWAL1: N/A)
• COTS configuration and data items shall 7.2.8.1 be identified.
Output: SSF1 - Part VII; SSF2 - Part II

7.2.11 COTS Configuration Management: Archiving (SWAL1: N/A)
• COTS configuration and data items shall 7.2.11.1 be archived.
Output: SSF1 - Part VII; SSF2 - Part II
The ANS quality assurance process should also assess the COTS processes and
data outputs to obtain assurance that the objectives associated with COTS are
satisfied.
NOTE: It is recommended that the COTS supplier's quality assurance process be co-ordinated with the ANS quality assurance process where feasible.
CHAPTER 8
This chapter proposes various structures for the documents and evidence that are
intended to provide assurance.
There can be various ways of organising a Software Safety Folder (SSF) depending
on the main usage or goal. This chapter proposes two different SSF satisfying two
different uses:
Option 1: Project-based structure.
This option proposes to organise the SSF along the software project activities.
Option 2: Compliance-based structure.
This option proposes to organise the SSF in order to support the demonstration of compliance with safety regulatory requirements.
[Diagram: a SOFTWARE ASSURANCE LEVEL (SWAL) is satisfied by OBJECTIVES; OBJECTIVES are achieved through ACTIVITIES; ACTIVITIES produce EVIDENCE; the EVIDENCE gives confidence that the OBJECTIVES are met. Chapter 8 addresses the EVIDENCE.]
ANNEX A
This annex re-uses the ISO/IEC 12207 process structure, because it has the widest coverage (from definition to decommissioning) of ANS needs. This annex does not intend to promote any standard, nor to state that any standard best fits ANS needs (even if ISO/IEC 12207 has been used as the basis for the process structure).
The purpose of this annex is as follows:
To provide a traceability matrix. For each listed ED-153 objective, a reference is given to the paragraph of the standard which covers this objective.
To provide a compatibility matrix between standards, in order to identify
commonalities and differences between those standards. This enables
suppliers, ATS providers, regulators and any other organisation or group to be
able to evaluate characteristics of a system or equipment integrating software
without requiring the use of the standard recommended by its organisation. This
compatibility matrix will allow all actors to “speak the same language” when
talking about software standards.
To provide a synthetic overview of the coverage of ED-153 objectives and activities (the ‘title’ column in this annex) by each standard. Tables are used to give a general view of whether or not ED-153 objectives are implemented, using the following symbols:
- • (means fully covered)
- P (partially covered)
- blank (not covered)
NOTE: ED-109/DO-278 traceability includes specific considerations due to the fact that ED-109/DO-278 is not a stand-alone document (see ED-109 chapter 1.3): it is based on ED-12B/RTCA/DO-178B.
To identify areas for improvement in existing standards, especially because of ANS particularities.
To identify objectives which have to be modified for ANS purposes.
NOTE: This annex does not provide a one-to-one mapping between ED-153
objectives and standard activities. Some ED-153 objectives are split into
several activities as denoted in the ‘title’ column.
Software Lifecycle Processes
The set of ANS software lifecycle processes is divided into:
A software safety assurance system,
Five primary processes,
Eight supporting processes,
Four organisational processes,
Additional ANS software lifecycle objectives.
The following table lists, per objective, references to related standards in support of
implementing a Software Safety Assurance System.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
3.0.1 Implementation P P P
•
Requirements P (Ref: 3.2 – • • •
3.0.2 Correctness and Table A-2 (lines (Ref: RD 1.1, 1.2,
Completeness (Ref: 5.3.4) (Ref: 5.1, 6.3.1) (Ref: 7.2.2)
1,2) Table A-3 ( 2.1)
lines 1, 2)
P
Requirements • • P
3.0.3 Traceability (Ref: 5.3.4.2; P
5.3.5.6; 5.3.6.7; (Ref: A3.6, A4.6, (Ref: ReqM 1.4)
Assurance (Ref: 5.5)
5.3.7.5) A5.6)
Unintended
• P P
3.0.4 (Ref: 3.6 Table A-5
Functions (Ref: 6.3.4.a) (Ref: 7.4.7.2)
line 1)
P P
3.0.5 SWAL Allocation
(Ref: 2.2.2, 2.2.3) (Ref: 7.5.2, 7.6.2)
Requirements
3.0.6 Satisfaction
• • •
Assurance (Ref: 2.1) (Ref: 5.1) (Ref: 7.2)
Configuration
• • • • •
3.0.7 Management (Ref: 3.8 Table A-
Assurance (Ref : 6.2) (Ref: 7) (Ref: 6.2.3) (Ref: CM)
8)
Assurance Rigour • • •
3.0.8 (Ref: 2.1, 9 &
Objective (Ref: 2.1) (Ref: Part 1 –7.4.2)
11.20)
3.0.9
Assurance Rigour • • •
Criteria (Ref: Chap 3) (Ref: Appendix A) (Ref: Appendix A)
• • •
3.0.10 SWAL Assurance (Ref: 3.10 Table
(Ref: 9 & 11.20) (Ref: 6.2.2)
A-10 ; 5.1)
SWAL Monitoring: P
3.0.11
Assurance (Ref: 4.1.6.3)
SWAL Monitoring: P
3.0.11
Feedback (Ref: 4.1.6.3)
SWAL Monitoring: P
3.0.11
Event assessment (Ref: 4.1.6.3)
3.0.12
Software P •
Modifications (Ref: 4.1.4.2) (Ref: 7. 8)
3.0.13
Equivalent •
Confidence (Ref: 4.1, 4.2)
3.0.13
Equivalent •
Confidence means (Ref: 4.1, 4.2)
• • •
3.0.14 Isolation (Ref: Chap 2.2.3,
(Ref: Chap 2.2.1) (Ref: Chap
2.3.1)
SW-related P
3.0.15
aspects (Ref: Chap 2.2.5)
3.0.16
Demonstration to • •
NSA (Ref: Chap 3.10) (Ref: Chap 9)
P
Argument P (Ref: Chap 6.3.1, P
3.0.17
Production (Ref: Chap 3) 6.3.3, 6.3.4, 8.3, (Ref: Part 3-7, 3-8)
10.2, 11.20, 12.1)
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
P
[Ref:
System • • • a) RD 1.1
3.1.1 b) RD 3.1, TS 1.2
Description (Ref: 2.2) (Ref: 2.1) (Ref: I-7.2.1)
c) RD 3.2
e) RD 2.3, TS 2.3]
Operational P P • P
3.1.2
Environment (Ref: 2.2) (Ref: 2.1.1) (Ref: I-7.2.1) (Ref: RD 1.1)
•
•
Regulatory (Ref: 3.10 Table A- •
3.1.3 (Ref: 2.1.1, 9, 10)
Framework 10 line 2 (Ref: I-7.2.2.4)
- 5.1)
Applicable P • • • P
3.1.4
Standards Standard itself Standard itself Standard itself Standard itself Standard itself
Risk Assessment P P P
3.1.5 and Mitigation
Process Output (Ref: 2.2) (Ref: 2.1.1) (Ref: I-7)
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Software Safety
• • •
3.2.1 Assessment
Approach (Ref: Section 5.1) (Ref: 11.1) (Ref: 8)
•
Software Safety P P
3.2.2 (Ref: 5.1 - 3.10
Assessment Plan (Ref: 11) (Ref: I-7.8)
Table A-10)
Software Safety •
•
3.2.3 Assessment Plan (Ref: 5.1 - 3.10
Review (Ref: 9, 10)
Table A-10)
Software Safety P •
3.2.4 Assessment Plan
Dissemination (Ref: 5.1) (Ref: 9, 10)
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Failure P •
3.3.1 Identification:
analysis (Ref: 2.2) (Ref: I-7.4)
Failure
• •
3.3.1 Identification:
modes (Ref: 2.2.2) (Ref: I-7.4)
Failure Effects P •
3.3.2
evaluation (Ref: 2.2.1) (Ref: I-7.4)
Failure Hazard • •
3.3.2
Identification (Ref: 2.2) (Ref: I-7.4)
Re-Assessment of P •
3.3.3
Risk (Ref: 2.2.1) (Ref: I-7.5)
Software P
Requirements: [Ref:
P •
3.3.4 Compliance with a.1) RD 2.1, 2.2
System Safety (Ref: 2.2.1) (Ref: I-7.6)
a.2) TS 2.1, Ver
Objectives 1.1, 2.2, 2.3 ]
NOTE: Column ED-12B/DO-178B – These tasks are identified as partially met by ED-12B/DO-178B because section 2 of this document
compensates the lack of system safety standard namely ARP4754/4761, which was elaborated after ED-12B/DO-178B.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
• P
Software Safety (Ref: 3.3 Table A- [Ref:
P •
3.4.1 Assessment 3 lines 1, 2, 3, 4, 5; a) RD 3.3, 3.4, 3.5
Validation (Ref: 6.4; 6.5) (Ref: 6.3)
3.4 Table A-4 lines Ver 2.1, 2.2, 2.3
1, 2, 6) b) -]
Software Safety
• •
3.4.2 Assessment
Verification (Ref: 2.1) (Ref: 2.2.2)
Software Safety
•
Assessment P •
3.4.3 (Ref: 3.9 Table A-
Process (Ref: 6.4) (Ref: 8)
Assurance 9)
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Document P
Software Safety P P P P
3.5.1 (Ref: I-7.2.2.6,I-
Assessment (Ref: 6.1) (Ref: 5) (Ref: 11, Annex A)
Process Results 7.3.2.5, I-7.4.2.11 )
Software Safety
Assessment P P P P P
3.5.2 Documentation
Configuration (Ref: 6.2) (Ref: 3.8 - 4.1.7) (Ref: 7.3, Annex A) (Ref: I-7.4.2.12) (Ref: CM 1.1)
Management
Software Safety P P
Assessment P
3.5.3 (Ref: 5.1 - 3.10 (Ref:
Documentation (Ref: 9, 10)
Dissemination Table A-10) GP2.7)
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
Risk Assessment P
and Mitigation
4.1.2 (Ref: Part 1-7.2, 1-
Process – Safety
determination 7.3)
Risk Assessment P
and Mitigation
4.1.2 (Ref: Part 1-7.4, 1-
Process – Hazard
Analysis 7.5)
Risk Assessment
and Mitigation P P P
4.1.3
Process – Safety (Ref: 2) (Ref: 2) (Ref: Part 1-7.6)
Objectives
Risk Assessment
and Mitigation P P P
4.1.3
Process – Safety (Ref: 2) (Ref: 2) (Ref: Part 1-7.6)
Requirements
Determination of
• P
4.1.4 Applicability of ED-
153 (Ref: 5.1.2) (Ref: SAM 1.2)
• •
4.1.5 Tender Selection
(Ref: 5.1.3) (Ref: SAM 1.2)
Supplier • •
4.1.6
monitoring (Ref: 5.1.4) (Ref: SAM 2.2)
Acceptance and
completion: • •
4.1.7
Software (Ref: 5.1.5) (Ref: SAM 2.3)
Acceptance
Acceptance and
• •
4.1.7 completion: Test
Preparation (Ref: 5.1.5) (Ref: SAM 2.3)
Acceptance and
completion: • •
4.1.7
Supplier (Ref: 5.1.5) (Ref: SAM 2.3)
involvement
Acceptance and
• •
4.1.7 completion:
Acceptance review (Ref: 5.1.5) (Ref: SAM 2.3)
NOTE: To simplify, and as the purpose of this document is to describe the objectives related to the software lifecycle, it is considered that the acquirer performs the PSSA (Preliminary System Safety Assessment). Even if in a real project this step may be performed in cooperation with the system supplier, it remains the acquirer's responsibility to validate and accept it. As this document focuses on the software-related objectives, the main purpose of the PSSA is to allocate an Assurance Level to the software, which has to remain under the acquirer's ultimate responsibility (at least by validating it, when not allocating it).
NOTE: This document intends to address the software aspects of SSA (System Safety Assessment: the third step of the Safety
Assessment Methodology).
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
•
[Ref:
Review & a) PMC 1.5, 1.6,
• P 1.7
4.2.6 evaluation: Quality
Assurance (Ref: 5.2.6, 6.3) (Ref: Part 1-6.2) b) PPQA]
(Ref: PI 3.4
Ver
Val)
• •
Delivery & P
4.2.7 (Ref: 5.2.7, (Ref: PI 3.4, SAM
completion (Ref: Part 1-6.2)
5.3.13.2, 5.3.13.3) 2.4))
NOTE: Since ANS systems may operate continuously and may have been in operation for many years, the software lifecycle plans for
these systems should include processes for software changes, technology upgrades, etc., specifically with respect to safety issues.
Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI
• •
SW Architectural • • •
4.3.5 (Ref: 3.2 Table A-2 (Ref: 5.2, 11.7,
Design (Ref: 5.3.5) (Ref: Part 3-7.4.3) (Ref: TS 2.1, 2.2)
line 3) 11.10)
SW Detailed • •
• • •
4.3.6 Design: Design (Ref: 3.2 Table A-2 (Ref: 5.2, 11.7,
and Interfaces (Ref: 5.3.6) (Ref: Part 3-7.4.5) (Ref: TS 3.1)
lines 1, 2) 11.10)
•
SW Detailed (Ref: 3.5 Table A-5
• • • •
4.3.6 Design: Test lines 1, 2;
Definition (Ref: 5.3.6) (Ref: 5.2, 11.10) (Ref: Part 3-7.4.7) (Ref: TS 3.1, VER)
3.6 Table A-6 lines
1, 2, 3, 4)
•
SW Integration: • • • •
4.3.7 (Ref: 3.1 Table A-1
Development (Ref: 5.3.8) (Ref: 5.4) (Ref: Part 3-7.4.8) (Ref: PI 1.1, 1.3)
line 1)
•
SW Integration: • • • •
4.3.7 (Ref: 3.1 Table A-1
Plan Contents (Ref: 5.3.8) (Ref: 5.4) (Ref: Part 3-7.4.8) (Ref: PI 1.1, 1.3)
line 1)
P
SW Integration: • • •
4.3.7 (Ref: 3.1 Table A-1
Documentation (Ref: 5.3.8) (Ref: 5.4) (Ref: Part 3-7.4.8)
line 1)
•
(Ref:
Part 1-7.9.1.1, 1-
SW Installation: • 7.9.2.1, •
4.3.8
Plan (Ref: 5.3.12) 1-7.9.2.3, (Ref: PP 2, PI 1)
1-7.13.1.1
1-7.13.2.1,
1-7.13.2.2)
SW Installation:
• •
4.3.8 Resources
Documentation (Ref: 5.3.12) (Ref: PP 2, PI 1)
•
Non-Deliverable •
4.3.3 (Ref 3.1 Table A-1
Items (Ref: 5.3.1.5)
line 4)
Environment •
• •
Definition: • • (Ref: PP 2.4
4.3.9 (Ref 3.1. Table A-1 (Ref: Part
Development of (Ref: 5.3.1.3) (Ref: 4.4, 4.5) IPM 1.1
plans line 3) 3-7.1.2.6)
GP 2.2, 3.1)
• •
Environment •
• • (Ref: Part 3 (Ref: PP 2.4
4.3.9 Definition: Content (Ref 3.1 Table A-1
of plans (Ref: 5.3.1.3) (Ref: 4.4, 4.5) Annexes A&B, IPM 1.1
line 3)
3-7.4.4.2) GP 2.2, 2.3, 3.1)
Environment •
•
Definition: • • (Ref: PP 2.4
4.3.9 (Ref 3.1 Table A-1
Documentation of (Ref: 5.3.1.3) (Ref: 4.4, 4.5) IPM 1.1
plans line 3)
GP 2.2)
Development
•
Standards: • • P P
4.3.9 (Ref 3.1 Table A-1
Development of (Ref: 5.3.1.4) (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
plans line 5)
Development •
• P P
4.3.9 Standards: (Ref 3.1 Table A-1
Content of plans (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
line 5)
Development
•
Standards: • P P
4.3.9 (Ref 3.1 Table A-1
Documentation of (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
plans line 5)
• •
•
Environment • • (Ref: Part (Ref: PP 2.4
4.3.10 (Ref 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
•
Development • • P P
4.3.10 (Ref 3.1 Table A-1
Standards (Ref: 5.3.1.4) (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
line 5)
• •
•
Environment • • (Ref: Part (Ref: PP 2.4
4.3.11 (Ref 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
•
Development • • P P
4.3.11 (Ref 3.1 Table A-1
Standards (Ref: 5.3.1.4) (Ref 4.1, 4.2) (Ref: Part 3-7.4.4) (Ref: GP2.2)
line 5)
• •
•
Environment • • (Ref: Part (Ref: PP 2.4
4.3.12 (Ref 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
• •
P
Environment • • (Ref: Part (Ref: PP 2.4
4.3.14 (Ref: 3.1 Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
Outputs
Documentation: • •
• • •
4.3.15 System and (Ref 3.1 Table A-1 (Ref: Part
Software (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
line 1) 3-7.1.2.7)
Traceability
Outputs
• •
Documentation: • • •
4.3.15 (Ref 3.1 Table A-1 (Ref: Part
Requirements and (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
Design Traceability line 1) 3-7.1.2.7)
Outputs
• •
Documentation: • • •
4.3.15 (Ref 3.1 Table A-1 (Ref: Part
Architecture and (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
Code Traceability line 1) 3-7.1.2.7)
Outputs
Documentation: • •
• • •
4.3.15 Code and (Ref 3.1 Table A-1 (Ref: Part
Executable (Ref: 5.3.1.2.a) (Ref: 4.1, 4.3) (Ref: GP 2.2, 3.1)
line 1) 3-7.1.2.7)
Traceability
Lifecycle • •
Definition: • • •
4.3.16 (Ref 3.1 (Ref: Part 3-7.1.2.1
Lifecycle (Ref: 5.3.1.1) (Ref: 3) (Ref: PP 1.3, 2.1)
Processes Table A-1 line 3) => 3-7.1.2.5)
Lifecycle •
•
Definition: • • (Ref: Part •
4.3.16 (Ref 3.1
Lifecycle (Ref: 5.3.1.1) (Ref: 3) 3-7.1.2.1, 3-7.1.2.3 (Ref: PP 1.3, 2.1)
Information Table A-1 line 3)
=> 3-7.1.2.5)
Lifecycle • •
• • •
4.3.16 Definition: (Ref 3.1 (Ref: Part 3-7.1.2.3
Transition Criteria (Ref: 5.3.1.1) (Ref: 3) (Ref: PP 1.3, 2.1)
Table A-1 line 3) => 3-7.1.2.5)
•
• • P (Ref: PP 2, PMC
•
4.3.16 Development Plan (Ref 3.1 (Ref: 2.2, 4.1, 4.2 (Ref: Part 1.1
(Ref: 5.3.1.4)
Table A-1 line 1) ,4.3 11.2) 3-7.1.2.2) IPM 1.1, 1.3, 1.4
GP 2.2, 2.3, 3.1)
• •
•
Environment • • (Ref: Part (Ref: PP 2.4
4.3.19 (Ref 3.1. Table A-1
Definition (Ref: 5.3.1.3) (Ref: 4.4, 4.5) 3-7.1.2.6, Annexes IPM 1.1
line 3)
A&B, 3-7.4.4.2) GP 2.2, 2.3, 3.1)
•
• (Ref: ReqM1.1
Support Process • • PP2.3, PMC1.4,
5.X (Ref 3.9 Table A-9
Compliance (Ref: 5.3.1.2.d) (Ref: Part 3-7.1.1) PPQA, PMC 1.6,
line 1)
1.7, 2 Ver, Val
GP2.9)
© EUROCAE, 2009
137
ED-109/ ED-12B/
Obj Title ISO/IEC 12207 IEC 61508 CMMI
DO-278 DO-178B
•
(Ref 3.8 Table A-8 • •
Outputs
• line 1 to 6; •
5.2 Configuration (Ref: Part (Ref: CM,
Management (Ref: 5.3.1.2.b) For COTS: 4.1.7 (Ref: 4.3)
3-7.1.2.8) GP 2.6)
Table 4-3 lines 1 to
4)
•
•
Software Products • (Ref: PMC 2.1, 2.2,
5.2.2 (Ref 3.8 Table A-8
Problems: Scheme (Ref: 5.3.1.2.c) 2.3
line 3)
CM 2.1, 2.2)
•
Software Products •
• (Ref: PMC 2.1, 2.2,
5.2.2 Problems: (Ref 3.8 Table A-8
(Ref: 5.3.1.2.c) 2.3
Procedure line 3)
CM 2.1, 2.2)
•
•
Software Products • (Ref: PMC 2.1, 2.2,
5.2.2 (Ref 3.8 Table A-8
Problems: ? (Ref: 5.3.1.2.b) 2.3
line 3)
CM 2.1, 2.2)
•
•
Software Products • (Ref: PMC 2.1, 2.2,
5.8 (Ref 3.8 Table A-8
Problems (Ref: 5.3.1.2.c) 2.3
line 3)
CM 2.1, 2.2)
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.9 | Standards | • (Ref: 5.3.1.4) | P (Ref: 3.1 Table A-1 line 5; for COTS: 4.1.4.2) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: GP 2.2, 3.1, PP 2.4) |
| 4.3.10 | Standards | • (Ref: 5.3.1.3, 5.3.1.4) | P (Ref: 3.1 Table A-1 line 5; for COTS: 4.1.4.2) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: GP 2.2, 3.1, PP 2.4) |
| 4.3.12 | Software Development Environment | • (Ref: 5.3.1.3) | • (Ref: 3.1 Table A-1 line 3) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: PP 2.4, GP 2.2, 2.3, 3.1) |
| 4.3.12 | Additional considerations | | • (Ref: 4) | • (Ref: 11.1) | | |
| 4.3.16 | Software Lifecycle: Lifecycle Processes | • (Ref: 5.3.1.1) | • (Ref: 3.1 Table A-1 lines 2, 3) | • (Ref: 11.1, 11.2) | • (Ref: Part 3-7.1.2.1 => 3-7.1.2.5) | • (Ref: PP 1.3, IPM 1.3) |
| 4.3.16 | Software Lifecycle: Lifecycle Information | • (Ref: 5.3.1.1) | • (Ref: 3.1 Table A-1 lines 2, 3) | • (Ref: 11.1, 11.2) | • (Ref: Part 3-7.1.2.1, 3-7.1.2.3 => 3-7.1.2.5) | • (Ref: PP 1.3, IPM 1.3) |
| 4.3.16 | Software Lifecycle: Transition Criteria | • (Ref: 5.3.1.1) | • (Ref: 3.1 Table A-1 lines 2, 3) | • (Ref: 11.1, 11.2) | • (Ref: Part 3-7.1.2.3) | • (Ref: PP 1.3, IPM 1.3) |
| 4.3.16 | Schedule: Lifecycle Processes | • (Ref: 5.3.1.1) | • (Ref: 3.1; 4.1.4.2; 5.1) | • (Ref: 11.1) | | • (Ref: PP 2.1, PMC, GP 2.2, 3.1) |
| 4.3.16 | Schedule: Lifecycle Information | • (Ref: 5.3.1.1) | • (Ref: 3.1; 4.1.4.2; 5.1) | • (Ref: 11.1) | | • (Ref: PP 2.1, PMC, GP 2.2, 3.1) |
| 4.3.16 | Schedule: Transition Criteria | • (Ref: 5.3.1.1) | • (Ref: 3.1; 4.1.4.2; 5.1) | • (Ref: 11.1) | | • (Ref: PP 2.1, PMC, GP 2.2, 3.1) |
| 4.3.17 | Software Development Environment | • (Ref: 5.3.1.3) | • (Ref: 3.1 Table A-1 line 3) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: PP 2.4, GP 2.2, 2.3, 3.1) |
| 4.3.17 | Additional considerations | | • (Ref: 4) | • (Ref: 11.1) | | |
| 4.3.18 | Software Development Environment | • (Ref: 5.3.1.3) | • (Ref: 3.1 Table A-1 line 3) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: PP 2.4, GP 2.2, 2.3, 3.1) |
| 4.3.18 | Additional considerations | | • (Ref: 4) | • (Ref: 11.1) | | |
| 4.3.19 | Software Development Environment | • (Ref: 5.3.1.3) | • (Ref: 3.1 Table A-1 line 3) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: PP 2.4, GP 2.2, 2.3, 3.1) |
| 4.3.20 | Additional considerations | | | • (Ref: 11.7) | | |
| 5.X | Software Lifecycle Data | | • (Ref: 5) | • (Ref: 11.1) | • (Ref: Part 3-7.1.2.7, Table 1) | • (Ref: GP 2.2, 3.1) |
| 7.1.X | Software Development Environment | • (Ref: 5.3.1.3) | • (Ref: 3.1 Table A-1 line 3) | • (Ref: 11.2) | P (Ref: Part 3-7.4.4) | • (Ref: PP 2.4, GP 2.2, 2.3, 3.1) |
| 7.2.X | Additional considerations | | • (Ref: 4) | • (Ref: 11.1) | | |
NOTE: ED-12B/DO-178B does not address system-related issues (assumed to be covered by ARP 4754); nor does ED-109/DO-278.
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.1 | System Requirements Analysis | • (Ref: 5.3.2.1) | • (Ref: 2) | • (Ref: 2.1.1, 2.2) | P (Ref: Part 1-7.6, Part 2-7.2.3) | P (Ref: RD 1.1, 2, 3.1, 3.2) |
| 4.3.1 | System Requirements Definition Criteria | • (Ref: 5.3.2.2) | • (Ref: 2) | P (Ref: 2.1.1) | P (Ref: Part 2-7.2.2) | P (Ref: REQM 1.4, 1.5, RD 3) |
| 4.3.15 | System Requirements Definition Criteria | • (Ref: 5.3.2.2) | • (Ref: 2) | P (Ref: 2.1.1) | P (Ref: Part 2-7.2.2) | P (Ref: REQM 1.4, 1.5, RD 3) |
NOTE: ED-12B/DO-178B does not address system-related issues (assumed to be covered by ARP 4754).
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.2 | System Architecture Definition | • (Ref: 5.3.3.1) | P (Ref: 2.2) | P (Ref: 2.3) | P (Ref: Part 2-7.4.2) | • (Ref: TS 2.1, 2.2, RD 2.2) |
| 4.3.2 | System Architecture Definition Criteria | • (Ref: 5.3.3.2) | | | P (Ref: Part 2-7.4) | • (Ref: REQM 1.4, TS 2.1, 2.2) |
| 4.3.15 | System Architecture Definition Criteria | • (Ref: 5.3.3.2) | | | P (Ref: Part 2-7.4) | • (Ref: REQM 1.4, TS 2.1, 2.2) |
| 4.3.4 | Software Requirements Definition: Specification | • (Ref: 5.3.4.1) | • (Ref: 3.2 Table A-2; Table A-3) | • (Ref: 5.1, 11.6, 11.9) | • (Ref: Part 3-7.2.2.3, 3-7.2.2.4, 3-7.2.2.7 => 3-7.2.2.11) | • (Ref: RD 2.1, 2.3) |
| 4.3.4 | Software Requirements Definition: Algorithms | | • (Ref: 3.2 Table A-3 line 7) | P (Ref: 5.1, 11.10) | | • (Ref: TS 2.1) |
| 4.3.4 | Assurance Level Related Requirements | | • (Ref: 3.2 Table A-2; Table A-3) | • (Ref: 5.1.2, 11.9) | • (Ref: Part 3-7.2.2) | P (Ref: RD 3.3) |
| 4.3.4 | Software Requirements Definition Criteria | • (Ref: 5.3.4.2) | • (Ref: 3.2 Table A-2; Table A-3 line 6) | • (Ref: 5.5, 11.6, 11.9) | P (Ref: Part 3-7.2.2.1, 3-7.2.2.2, 3-7.2.2.6) | • (Ref: RD 3.3, REQM 1.4) |
| 4.3.7 | Software Integration Definition update | • (Ref: 5.3.5.5) | | | • (Ref: Part 3-7.4.3.2.f) | • (Ref: PI 1, VER 1.3, GP 2.2, 2.3, 3.1) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.9 | Software Requirements Standards | | • (Ref: 3.2 Table A-2; Table A-3; Table A-4) | • (Ref: 11.6) | • (Ref: Part 3-7.2.2.4, 3-7.2.2.6) | • (Ref: RD, GP 2.2, 2.3, 3.1) |
| 4.3.10 | Software Requirements Standards | | • (Ref: 3.2 Table A-2; Table A-3; Table A-4) | • (Ref: 11.6) | • (Ref: Part 3-7.2.2.4, 3-7.2.2.6) | • (Ref: RD, GP 2.2, 2.3, 3.1) |
| 4.3.11 | Software Requirements Standards | | • (Ref: 3.2 Table A-2; Table A-3; Table A-4) | • (Ref: 11.6) | • (Ref: Part 3-7.2.2.4, 3-7.2.2.6) | • (Ref: RD, GP 2.2, 2.3, 3.1) |
| 4.3.12 | Software Requirements Standards | | • (Ref: 3.2 Table A-2; Table A-3; Table A-4) | • (Ref: 11.6) | • (Ref: Part 3-7.2.2.4, 3-7.2.2.6) | • (Ref: RD, GP 2.2, 2.3, 3.1) |
| 4.3.15 | Software Requirements Definition Criteria | • (Ref: 5.3.4.2) | • (Ref: 3.2 Table A-2; Table A-3 line 6) | • (Ref: 5.5, 11.6, 11.9) | P (Ref: Part 3-7.2.2.1, 3-7.2.2.2, 3-7.2.2.6) | • (Ref: RD 3.3, REQM 1.4) |
| 4.3.20 | Software Requirements Definition Criteria | | | • (Ref: 11.7) | P (Ref: Part 3-7.2.2.1, 3-7.2.2.2, 3-7.2.2.6) | P (Ref: RD 3.3, REQM 1.4) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.13 | Top-Level Software Architecture Definition | • (Ref: 5.3.5.1) | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 5.2.2, 11.10) | • (Ref: Part 3-7.4.1.1, 3-7.4.1.2, 3-7.4.3.1, 3-7.4.3.3) | • (Ref: TS 2.1, 2.2, RD 2.2) |
| 4.3.13 | Assurance Level Related Design | | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 11.10) | • (Ref: Part 3-7.4.2) | • (Ref: TS 1.1, 1.3, 2.1, 2.2) |
| 4.3.13 | Software Architecture Definition Criteria | • (Ref: 5.3.5.6) | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 5.2, 5.5, 11.7) | P (Ref: Part 3-7.4.2.2 => 3-7.4.2.11, 3-7.4.3.2) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| 4.3.14 | Software Architecture Definition Criteria | • (Ref: 5.3.5.6) | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 5.2, 5.5, 11.7) | P (Ref: Part 3-7.4.2.2 => 3-7.4.2.11, 3-7.4.3.2) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| 4.3.15 | Software Architecture Definition Criteria | • (Ref: 5.3.5.6) | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 5.2, 5.5, 11.7) | P (Ref: Part 3-7.4.2.2 => 3-7.4.2.11, 3-7.4.3.2) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| 4.3.17 | Software Architectural Design Standards | | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 11.7) | • (Ref: Part 3-7.4.3.2) | • (Ref: TS GP 2.2, 2.3, 3.1) |
| 4.3.18 | Software Architectural Design Standards | | • (Ref: 3.2 Table A-2 line 3) | • (Ref: 11.7) | | • (Ref: TS GP 2.2, 2.3, 3.1) |
| 4.3.20 | Software Architecture Definition Criteria | | | • (Ref: 5.2, 5.5, 11.7) | P (Ref: Part 3-7.4.2.2 => 3-7.4.2.11, 3-7.4.3.2) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.6 | Software Detailed Design Definition | • (Ref: 5.3.6.1) | • (Ref: 3.2 Table A-2 lines 4, 5) | • (Ref: 11.10) | • (Ref: Part 3-7.4.1.4, 3-7.4.5.1, 3-7.4.5.4) | • (Ref: TS 2.1, 2.2, 3.1) |
| 4.3.6 | Interfaces Design | • (Ref: 5.3.6.2) | P (Ref: 3.2 Table A-2 lines 4, 5) | • (Ref: 5.2.2, 11.10) | | • (Ref: TS 2.3) |
| 4.3.7 | Software Integration Definition Update | • (Ref: 5.3.6.6) | | | P (Ref: Part 3-7.4.5.5) | • (Ref: PI 1, PI GP 2.2, 2.3, 3.1) |
| 4.3.9 | Software Detailed Design Standards | | • (Ref: 3.1 Tables A-2 lines 4, 5) | • (Ref: 11.7, 11.10) | | • (Ref: TS GP 2.2, 2.3, 3.1) |
| 4.3.10 | Software Detailed Design Standards | | • (Ref: 3.2 Table A-2 lines 4, 5) | • (Ref: 11.7, 11.10) | | • (Ref: TS GP 2.2, 2.3, 3.1) |
| 4.3.13 | Software Detailed Design Definition Criteria | • (Ref: 5.3.6.7) | • (Ref: 3.2 Table A-2 lines 4, 5) | • (Ref: 5.2.2, 5.5, 11.7) | P (Ref: Part 3-7.4.5.2, 3-7.4.5.3) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.14 | Software Detailed Design Definition Criteria | • (Ref: 5.3.6.7) | • (Ref: 3.2 Table A-2 lines 4, 5) | • (Ref: 5.2.2, 5.5, 11.7) | P (Ref: Part 3-7.4.5.2, 3-7.4.5.3) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| 4.3.15 | Software Detailed Design Definition Criteria | • (Ref: 5.3.6.7) | • (Ref: 3.2 Table A-2 lines 4, 5) | • (Ref: 5.2.2, 5.5, 11.7) | P (Ref: Part 3-7.4.5.2, 3-7.4.5.3) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| 4.3.20 | Software Detailed Design Definition Criteria | | | • (Ref: 5.2.2, 5.5, 11.7) | P (Ref: Part 3-7.4.5.2, 3-7.4.5.3) | • (Ref: TS 2.1, 2.2, ReqM 1.4, PI 2.1) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.3.9 | Software Units Code definition Criteria | • (Ref: 5.3.7.5) | P (Ref: 3.7 Table A-7) | • (Ref: 5.3, 5.5, 11.8, 11.11) | P (Ref: Part 3-7.4.6.1, 3-7.4.7.1, 3-7.4.7.2) | • (Ref: TS 3.1, ReqM 1.4) |
| 4.3.10 | Coding Standards | | • (Ref: 3.2 Table A-2 line 6) | • (Ref: 11.8, 11.11) | • (Ref: Part 3-7.4.4.6) | • (Ref: TS GP 2.2, 2.3, 3.1) |
| 4.3.10 | Software Units Code definition Criteria | • (Ref: 5.3.7.5) | P (Ref: 3.7 Table A-7) | • (Ref: 5.3, 5.5, 11.8, 11.11) | P (Ref: Part 3-7.4.6.1, 3-7.4.7.1, 3-7.4.7.2) | • (Ref: TS 3.1, ReqM 1.4) |
| 4.3.15 | Software Units Code definition Criteria | • (Ref: 5.3.7.5) | P (Ref: 3.7 Table A-7) | • (Ref: 5.3, 5.5, 11.8, 11.11) | P (Ref: 7.4.6.1, 7.4.7.1, 7.4.7.2) | • (Ref: TS 3.1, ReqM 1.4) |
| 4.3.19 | Software Units Code definition Criteria | • (Ref: 5.3.7.5) | P (Ref: 3.7 Table A-7) | • (Ref: 5.3, 5.5, 11.8, 11.11) | | • (Ref: TS 3.1, ReqM 1.4) |
| 4.3.20 | Software Units Code definition Criteria | | | • (Ref: 5.3, 5.5, 11.8, 11.11) | | • (Ref: TS 3.1, ReqM 1.4) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 4.4.3 | User support | • (Ref: 5.4.4) | | | | |
| 4.4.4 | Software Operation | • (Ref: 5.4.3) | | | P (Ref: Part 1-7.15) | |
| 4.4.5 | Performance Monitoring | | | | | |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.1.2 | Design & Development | • (Ref: 6.1.2) | | • (Ref: 11) | P (Ref: Part 1-5.2.8, I-Annex A) | • (Ref: PP 2.3, PMC 1.4) |
| 5.1.3 | Production: Document Provision | • (Ref: 6.1.3) | | P (Ref: 4.3, 11) | P (Ref: Part 1-5.2.11) | • (Ref: PMC 1.4) |
| 5.1.3 | Production: Document Storage | • (Ref: 6.1.3) | | P (Ref: 4.3, 11) | P (Ref: Part 1-5.2.11) | • (Ref: PMC 1.4) |
| 5.1.3 | Documentation (SW Requirement): Document Provision | • (Ref: 5.3.4.1) | • (Ref: 3.2 Tables A2.1, A2.2) | • (Ref: 5.1, 11.9) | • (Ref: 7.2.2.3, 7.2.2.4, 7.2.2.7 => 7.2.2.11) | • (Ref: RD 2.1, 2.3) |
| 5.1.3 | Documentation (SW Requirement): Document Storage | • (Ref: 5.3.4.1) | • (Ref: 3.2 Tables A2.1, A2.2) | • (Ref: 5.1, 11.9) | • (Ref: 7.2.2.3, 7.2.2.4, 7.2.2.7 => 7.2.2.11) | • (Ref: RD 2.1, 2.3) |
| 5.1.3 | Documentation (SW architectural design): Document Provision | • (Ref: 5.3.5.4) | • (Ref: 3.2, Table A.2 line 3) | | | • (Ref: TS 2.1, 2.2) |
| 5.1.3 | Documentation (SW architectural design): Document Storage | • (Ref: 5.3.5.4) | • (Ref: 3.2, Table A.2 line 3) | | | • (Ref: TS 2.1, 2.2) |
| 5.1.3 | Documentation (SW detailed design): Document Provision | • (Ref: 5.3.6.4) | P (Ref: for COTS 4.1.2) | | | • (Ref: TS 2.1, 2.2) |
| 5.1.3 | Documentation (SW detailed design): Document Storage | • (Ref: 5.3.6.4) | P (Ref: for COTS 4.1.2) | | | • (Ref: TS 2.1, 2.2) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.1.3 | Documentation (SW coding): Document Provision | • (Ref: 5.3.7.3) | | | | • (Ref: TS 3.1, 2.2) |
| 5.1.3 | Documentation (SW coding): Document Storage | • (Ref: 5.3.7.3) | | | | • (Ref: TS 3.1, 2.2) |
| 5.1.3 | Documentation (SW integration): Document Provision | • (Ref: 5.3.8.3) | | | | • (Ref: PI 1.1, 1.3) |
| 5.1.3 | Documentation (SW integration): Document Storage | • (Ref: 5.3.8.3) | | | | • (Ref: PI 1.1, 1.3) |
| 5.1.3 | Baseline Update: Document Provision | • (Ref: 5.3.9.5, 5.3.11.4) | | | | • (Ref: PI 3.4, CM 1.3) |
| 5.1.3 | Baseline Update: Document Storage | • (Ref: 5.3.9.5, 5.3.11.4) | | | | • (Ref: PI 3.4, CM 1.3) |
| 5.1.4 | Maintenance | • (Ref: 6.1.4) | • (Ref: 3.8 Table A-8 lines 3, 4) | P (Ref: Annex A) | | • (Ref: PMC 1.4) |
| 5.2.2 | Baseline Update | • (Ref: 5.3.9.5, 5.3.11.4) | | | | • (Ref: PI 3.4, CM 1.3) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.2.3 | Baseline & Configuration Item Traceability | | • (Ref: 3.8 Table A-8 line 2) | • (Ref: 7.2.2.e, 7.2.2.f) | | • (Ref: CM 1.3) |
| 5.2.3 | Configuration Control: Change Request Procedures | • (Ref: 6.2.3) | • (Ref: 3.8 Table A-8 line 3) | • (Ref: 7.2.3 => 7.2.5) | • (Ref: 6.2.3.d, 6.2.3.e) | • (Ref: CM 2, 3) |
| 5.2.3 | Configuration Control: Audit Trail | • (Ref: 6.2.3) | • (Ref: 3.8 Table A-8 line 3) | • (Ref: 7.2.3 => 7.2.5) | • (Ref: 6.2.3.d, 6.2.3.e) | • (Ref: CM 2, 3) |
| 5.2.3 | Configuration Control: Access to Safety-Related Functions | • (Ref: 6.2.3) | • (Ref: 3.8 Table A-8 line 3) | • (Ref: 7.2.3 => 7.2.5) | • (Ref: 6.2.3.d, 6.2.3.e) | • (Ref: CM 2, 3) |
| 5.2.3 | Software Lifecycle Environment Control: Change Request Procedures | | • (Ref: 3.8 Table A-8 line 6) | • (Ref: 7.2.9) | P (Ref: 6.2.3.c) | • (Ref: CM GP 2.6) |
| 5.2.3 | Software Lifecycle Environment Control: Audit Trail | | • (Ref: 3.8 Table A-8 line 6) | • (Ref: 7.2.9) | P (Ref: 6.2.3.c) | • (Ref: CM GP 2.6) |
| 5.2.3 | Software Lifecycle Environment Control: Access to Safety-Related Functions | | • (Ref: 3.8 Table A-8 line 6) | • (Ref: 7.2.9) | P (Ref: 6.2.3.c) | • (Ref: CM GP 2.6) |
| 5.2.3 | Software Patch Management | • (Ref: 5.4.3) | | | | |
| 5.2.4 | Configuration Status Accounting | • (Ref: 6.2.4) | • (Ref: 3.8 Table A-8 line 3) | • (Ref: 7.2.6) | P (Ref: 6.2.3.e) | • (Ref: CM 3.1) |
| 5.2.5 | Configuration Evaluation | • (Ref: 6.2.5) | • (Ref: 3.8 Table A-8 line 3) | P (Ref: 7.2.4) | • (Ref: 6.2.3.d) | • (Ref: CM 3.2) |
| 5.2.6 | Release Management & Delivery | P (Ref: 6.2.6) | • (Ref: 3.8 Table A-8 line 4) | • (Ref: 7.2.7) | P (Ref: 6.2.3.f) | P (Ref: CM 2, CM 1.2) |
| 5.2.6 | Software Load Control | | • (Ref: 3.8 Table A-8 line 5) | • (Ref: 7.2.8) | | |
| 5.2.6 | Software Patch Management | • (Ref: 5.4.3) | | | | |
| 5.2.7 | Use of a CM Tool | | | | | |
| 5.2.8 | "Acquirer Agreement" | | | | | |
| 5.2.9 | At Level of Software Component | | | | | |
| 5.2.10 | Configuration Management Traceability: Data | | | | | |
| 5.2.10 | Configuration Management Traceability: Compatibility | | | | | |
| 5.2.11 | At level of SW compilation unit | | | | | |
| 5.8.4 | Configuration Control | • (Ref: 6.2.3) | • (Ref: 3.8 Table A-8 line 3) | • (Ref: 7.2.3 => 7.2.5) | • (Ref: 6.2.3.d, 6.2.3.e) | • (Ref: CM 2, 3) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.4.2 | Verification Plan: Scope | • (Ref: 6.4.1.5) | • (Ref: 3.1 Table A-1 lines 1, 2, 3, 4) | • (Ref: 11.3) | • (Ref: 7.9.2.1, Part 1-7.18.2.1) | • (Ref: Ver GP 2.2, 3.1) |
| 5.4.2 | Verification Results | • (Ref: 6.4.1.6) | • (Ref: 3.9 Table 9 Line 1) | • (Ref: 6.2.e) | • (Ref: 7.9.2) | • (Ref: Ver 2, 3, Ver GP 2.7, 2.8, CM 2.1, PMC 2) |
A.4.4.2 Verification
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.4.1 | System Requirements Verification | • (Ref: 6.4.2.3) | | | | • (Ref: all) Ver 2; a, c) RD 3.3; b) Ver 2; d) Ver 1.3, 2, 3; e) ReqM 1.5, RD 3.3; f) RD 3.3, 3.5; g) None |
| 5.4.1 | System Verification Evaluation Criteria | • (Ref: 5.3.11.2) | P (Ref: 3.7 Table A-7 lines 2, 3) | P (Ref: 2.7) | • (Ref: Part 1-7.8, I-7.14, Part 2-7.7.2.3, II-7.7.2.5 => II-7.7.2.7) | • (Ref: Ver 1.3, Val 1.3, Ver 3, Val 2) |
| 5.4.2 | Contract Verification: Development | • (Ref: 6.4.2.1) | • (Ref: 3.10 Table A-10 lines 1, 2, 3) | | | • (Ref: all) PPQA 1, Ver 2; a) SAM 1.2; b) ReqM 1.1, RD 3.3, 3.4, 3.5; c) ReqM 1.3, PMC GP 2.2, 2.4, 3.1, 2.7; d) IPM 2; e) Ver 1.3 |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.4.2 | Contract Verification: Scope | • (Ref: 6.4.2.1) | • (Ref: 3.10 Table A-10 lines 1, 2, 3) | | | • (Ref: all) PPQA 1, Ver 2; a) SAM 1.2; b) ReqM 1.1, RD 3.3, 3.4, 3.5; c) ReqM 1.3, PMC GP 2.2, 2.4, 3.1, 2.7; d) IPM 2; e) Ver 1.3 |
| 5.4.2 | Process Verification: Development | • (Ref: 6.4.2.2) | • (Ref: 3.3 Table A-3, 3.4 Table A-4, 3.7 Table A-7) | | | • (Ref: all) Ver GP 2.2, 3.1; a) ReqM 1.5, PP 3.1, 3.2; b) PP 3.1, IPM 1.1, 1.3, PMC 1.1, Ver GP 2.8, PPQA 1, Ver GP 2.9; c) PP 3.1; d) PP 2.4, 2.5, Ver GP 2.3, 2.5 |
| 5.4.2 | Process Verification: Scope | • (Ref: 6.4.2.2) | • (Ref: 3.3 Table A-3, 3.4 Table A-4, 3.7 Table A-7) | | | • (Ref: all) Ver GP 2.2, 3.1; a) ReqM 1.5, PP 3.1, 3.2; b) PP 3.1, IPM 1.1, 1.3, PMC 1.1, Ver GP 2.8, PPQA 1, Ver GP 2.9; c) PP 3.1; d) PP 2.4, 2.5, Ver GP 2.3, 2.5 |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.4.2 | Verification of retrieval & release process: Development | | | | | |
| 5.4.2 | Verification of retrieval & release process: Scope | | | | | |
| 5.4.3 | Software Requirement | • (Ref: 6.4.2.3, 5.3.9) | • (Ref: 3.3 Table A-3 lines 1, 2, 3, 4, 5, 6, 7) | P (Ref: Annex A-3, 6.1.a, 6.3.1, 6.3.2; because only Software requirements, not System requirements) | • (Ref: 7.9.2.8) | • (Ref: Ver 3) |
| 5.4.3 | Operational Testing | • (Ref: 5.4.2) | | | P (Ref: Part 1-7.15) | • (Ref: Val 2, PI 3.4) |
| 5.4.3 | Adaptation data verification | | • (Ref: 3.2 Table A-2 line 8) | | | • (Ref: Ver, Val) |
| 5.4.4 | Integration Verification | • (Ref: 6.4.2.6) | P (Ref: 3.5 Table A-5 line 7) | • (Ref: 6.3.5) | P (Ref: 7.9.2.10, 7.9.2.11) | • (Ref: all) Ver 2, 3; a, b) PI 3.1; c) PI GP 2.9; d, e, f) Ver 1.3; g) PI 3.2 |
| 5.4.5 | Architectural Design Verification | • (Ref: 6.4.2.4) | • (Ref: 3.4 Table A-4 lines 1 to 13; over-compliant in line 13 (partitioning)) | • (Ref: Annex A-4, 6.3.3) | • (Ref: 7.9.2.9) | • (Ref: all) Ver 1.3, 2; a) TS 1.1, 2.1; c) TS 1.1, 1.3, 2.1, ReqM 1.4; f) TS 2.1 |
| 5.4.5 | Detailed design Verification | • (Ref: 6.4.2.4) | • (Ref: 3.4 Table A-4 lines 1 to 13; over-compliant in line 13 (partitioning)) | • (Ref: Annex A-4, 6.3.3) | • (Ref: 7.9.2.9) | • (Ref: Ver 1.3, 2, TS 2.1) |
| 5.4.5 | Module Testing Standards | | P (Ref: 3.6 Table A-6 lines 3, 4) | P (Ref: 6.4.3.c) | | • (Ref: TS GP 2.2, 2.3, 3.1, Ver 1.2) |
| 5.4.5 | Software Units Tests definition Criteria | • (Ref: 5.3.7.5) | P (Ref: 3. Table A-7) | • (Ref: 5.3, 5.5, 11.8, 11.11) | P (Ref: 7.4.6.1, 7.4.7.1, 7.4.7.2) | • (Ref: TS 3.1, ReqM 1.4, Ver 1.3, 2) |
| 5.4.5 | Software Units Testing | • (Ref: 5.3.7.2) | • (Ref: 3.6 Table A-6 lines 3, 4, 5) | | • (Ref: 7.4.7.1, 7.4.7.3) | • (Ref: TS 3.1, Ver 3) |
| 5.4.6 | Source Code Verification | • (Ref: 6.4.2.5) | • (Ref: 3.5 Table A-5 lines 1, 2, 3, 4, 5, 6) | • (Ref: Annex A-5) | • (Ref: 7.9.2.12) | • (Ref: Ver 1.3, 2, TS 3.1, ReqM 1.4) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.4.6 | Software Units Test Definition | • (Ref: 5.3.6.5) | • (Ref: 3.6.3) | | • (Ref: 7.4.5.4) | • (Ref: TS 3.1, Ver 1.3) |
| 5.4.6 | Module Testing Standards | | P (Ref: 3.6 Table A-6 lines 3, 4) | P (Ref: 6.4.3.c) | | • (Ref: TS GP 2.2, 2.3, 3.1, Ver 1.2) |
| 5.4.6 | Software Units Tests definition Criteria | • (Ref: 5.3.7.5) | P (Ref: 3. Table A-7) | • (Ref: 5.3, 5.5, 11.8, 11.11) | P (Ref: 7.4.6.1, 7.4.7.1, 7.4.7.2) | • (Ref: TS 3.1, ReqM 1.4, Ver 1.3, 2) |
| 5.4.6 | Software Units Testing | • (Ref: 5.3.7.2) | • (Ref: 3.6 Table A-6 lines 3, 4, 5) | | • (Ref: 7.4.7.1, 7.4.7.3) | • (Ref: TS 3.1, Ver 3) |
| 5.4.8 | Executable Code Verification: Evaluation | | • (Ref: 3.6 Table A-6 lines 1, 2, 3, 4, 5; over-compliant) | • (Ref: Annex A-6) | | • (Ref: TS 3.1, ReqM 1.4, Ver 3) |
| 5.4.8 | Executable Code Verification: Documentation | | • (Ref: 3.6 Table A-6 lines 1, 2, 3, 4, 5; over-compliant) | • (Ref: Annex A-6) | | • (Ref: TS 3.1, ReqM 1.4, Ver 3) |
| 5.4.9 | Adaptation data verification | | • (Ref: 3.2 Table A-2 line 8) | | | • (Ref: Ver, Val) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.4.12 | Software Verification Evaluation | • (Ref: 5.3.9.3) | | | • (Ref: 7.7.2.4, 7.7.2.6) | • (Ref: TS 2.1, 3.1, Ver 1.3) |
| 5.4.12 | System Verification Evaluation | • (Ref: 5.3.11) | | | • (Ref: Part 1-7.8, I-7.14, Part 2-7.7) | • (Ref: Ver 3, Val 2, ReqM 1.4, Ver GP 2.9) |
| 5.4.12 | Documentation Verification | • (Ref: 6.4.2.7) | | | P (Ref: Part 1-5.2) | • (Ref: a) Ver 2; b) PI GP 2.8, 2.9; c) CM 3, GP 2.6) |
| 5.4.12 | Verification Process Outputs Verification | | • (Ref: 3.7 Table A-7 lines 1 to 8; over-compliant) | • (Ref: Annex A-7, 6.3.6, 6.4.4) | | • (Ref: Ver 1.3, 2) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.6.1 | Process implementation: Review Milestones | • (Ref: 6.6.1) | P (Ref: 3.9 Table A-9 Line 3, partial) | P (Ref: 6, 8.3) | P (Ref: Part 1-6.2.1.b, I-7.18.1) | • (Ref: PP 2.1, 2.6, PMC 1.6, 1.7) |
| 5.6.1 | Process implementation: Review Documentation | • (Ref: 6.6.1) | P (Ref: 3.9 Table A-9 Line 3, partial) | P (Ref: 6, 8.3) | P (Ref: Part 1-6.2.1.b, I-7.18.1) | • (Ref: PP 2.1, 2.6, PMC 1.6, 1.7) |
| 5.6.2 | Project management reviews | • (Ref: 6.6.2) | P (Ref: 3.9 Table A-9 Line 1, partial) | P (Ref: 4.6, 8.2.b/c) | P (Ref: 7.3.2.4, Part 1-6.2.3) | • (Ref: PMC 1) |
| 5.6.3 | Technical reviews | • (Ref: 6.6.3) | P (Ref: 3.9 Table A-9 Line 3, partial) | • (Ref: 6, 8.3) | • (Ref: 7.2.2.4, 7.4.1.2, 7.4.6.2, 7.4.4.5, Part 1-5.2.11) | • (Ref: PMC 1.6, 1.7) |
| 5.6.3 | Software Requirements Joint Review | • (Ref: 5.3.4.3) | | | | • (Ref: PMC 1.6, 1.7, CM 1.3) |
| 5.6.3 | Software Architecture Joint Review | • (Ref: 5.3.5.7) | | | | • (Ref: PMC 1.6, 1.7) |
| 5.6.3 | Software Detailed Design Joint Review | • (Ref: 5.3.6.8) | | | | • (Ref: PMC 1.6, 1.7) |
| 5.6.3 | Code Joint Review | | | | | • (Ref: PMC 1.6, 1.7) |
| 5.6.3 | Software Integration Joint Review | • (Ref: 5.3.8.6) | | | | • (Ref: PMC 1.6, 1.7) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 5.7.3 | Software Audit | • (Ref: 6.8.2) | | P (Ref: 8.2.d) | P (Ref: 6.2.3.e) | • (Ref: CM 3.2; a) TS 3.1, Ver 2; b) Ver 1.1, Ver 2; c) Ver 1.3, Ver 2; d) Ver 2, 3, PMC 2, CM 2.1, 3.2; e) Ver 3, PMC 2, CM 2.1, 3.2; f) TS 3.2, Ver 2; g) GP 2.9; h) PMC 1.1) |
| 5.7.4 | Software Audit | • (Ref: 6.8.2) | | P (Ref: 8.2.d) | P (Ref: 6.2.3.e) | • (Ref: CM 3.2; a) TS 3.1, Ver 2; b) Ver 1.1, Ver 2; c) Ver 1.3, Ver 2; d) Ver 2, 3, PMC 2, CM 2.1, 3.2; e) Ver 3, PMC 2, CM 2.1, 3.2; f) TS 3.2, Ver 2; g) GP 2.9; h) PMC 1.1) |
| 5.7.5 | Software Audit | • (Ref: 6.8.2) | | P (Ref: 8.2.d) | P (Ref: 6.2.3.e) | • (Ref: CM 3.2; a) TS 3.1, Ver 2; b) Ver 1.1, Ver 2; c) Ver 1.3, Ver 2; d) Ver 2, 3, PMC 2, CM 2.1, 3.2; e) Ver 3, PMC 2, CM 2.1, 3.2; f) TS 3.2, Ver 2; g) GP 2.9; h) PMC 1.1) |
NOTE: These process objectives are part of Software Configuration Management for ED-109/DO-278 and ED-12B/DO-178B.
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 6.1.2 | Management Process Planning: Scope | • (Ref: 7.1.2) | | | | • (Ref: all) PP 1, 2, GP 2.2, 3.1; b) PP 2.1; c) PP 1.4; d) PP 2.4, 2.5; e) GP 2.3, 2.4; f) GP 2.4; g) PP 2.2, RskM 2.2; h) M&A 1, PMC GP 2.2, 3.1, Ver GP 2.2, 3.1; i) PP 1.4; j) PP 2.4, GP 2.3 |
| 6.1.2 | Management Process Planning: Content | • (Ref: 7.1.2) | | | | • (Ref: all) PP 1, 2, GP 2.2, 3.1; b) PP 2.1; c) PP 1.4; d) PP 2.4, 2.5; e) GP 2.3, 2.4; f) GP 2.4; g) PP 2.2, RskM 2.2; h) M&A 1, PMC GP 2.2, 3.1, Ver GP 2.2, 3.1; i) PP 1.4; j) PP 2.4, GP 2.3 |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 6.1.3 | Management Process Execution & control: Initiation | • (Ref: 7.1.3) | | | | • (Ref: a) PMC GP 2.8; b) PMC 1; c) PMC 2) |
| 6.1.3 | Management Process Execution & control: Monitoring | • (Ref: 7.1.3) | | | | • (Ref: a) PMC GP 2.8; b) PMC 1; c) PMC 2) |
| 6.1.3 | Management Process Execution & control: Problems | • (Ref: 7.1.3) | | | | • (Ref: a) PMC GP 2.8; b) PMC 1; c) PMC 2) |
| 6.1.4 | Management Process Review & evaluation: Lifecycle data | • (Ref: 7.1.4) | | | | • (Ref: a) ReqM 1.5; b) PMC) |
| 6.1.4 | Management Process Review & evaluation: Assessment | • (Ref: 7.1.4) | | | | • (Ref: a) ReqM 1.5; b) PMC) |
| 6.1.5 | Management Process Closure: Completeness | • (Ref: 7.1.5) | | | | • (Ref: a) IPM 1.3; b) PMC 1.1, 1.6, 1.7, GP 2.8; c) PMC 1.4) |
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 6.1.5 | Management Process Closure: Archive | • (Ref: 7.1.5) | | | | • (Ref: a) IPM 1.3; b) PMC 1.1, 1.6, 1.7, GP 2.8; c) PMC 1.4) |
This section is addressed by objectives 4.3.11 and 4.3.19. It describes the (generic) Infrastructure Process for the software lifecycle environment used by the Development Process.
The following objectives should be satisfied in addition to the objectives that this document contains for non-COTS software.
| Obj | Title | ISO/IEC 12207 | ED-109/DO-278 | ED-12B/DO-178B | IEC 61508 | CMMI |
| --- | --- | --- | --- | --- | --- | --- |
| 7.2.1 | COTS Plans | | • (Ref: 4.1.9 Table 4-1 line 1) | | | • (Ref: SAM GP 2.2, 3.1) |
| 7.2.2 | COTS Transition Criteria | | • (Ref: 4.1.9 Table 4-1 line 3) | | | • (Ref: SAM 2.1, TS 2.4, IPM 1.3) |
| 7.2.3 | COTS Plans Consistency | | • (Ref: 4.1.9 Table 4-1 line 2) | | | • (Ref: SAM 2.1, 2.4) |
| 7.2.4 | COTS Requirements Coverage | | • (Ref: 4.1.9 Table 4-2 line 2) | | | • (Ref: SAM 2.1, TS 2.4) |
| 7.2.5 | COTS Lifecycle Data | | • (Ref: 4.1.9 Table 4-2 line 1) | | | • (Ref: SAM 2.1, TS 2.4) |
| 7.2.6 | COTS Derived Requirements* | | • (Ref: 4.1.9 Table 4-2 line 4) | | | • (Ref: SAM 2.1, TS 2.4) |
| 7.2.7 | COTS HW Compatibility | | • (Ref: 4.1.9 Table 4-2 line 3) | | | • (Ref: TS 2.4, RD 2.2, 3.4) |
| 7.2.8 | COTS Configuration Management: Identification | | • (Ref: 4.1.9 Table 4-3 line 1) | | | • (Ref: CM 1.2) |
| 7.2.9 | COTS Problem Reporting | | • (Ref: 4.1.9 Table 4-3 line 4) | | | • (Ref: CM 1.1, SAM GP 2.6) |
| 7.2.10 | COTS Incorporation | | • (Ref: 4.1.9 Table 4-3 line 2) | | | • (Ref: SAM 2.3, CM 2.1) |
| 7.2.11 | COTS Configuration Management: Archiving | | • (Ref: 4.1.9 Table 4-3 line 3) | | | • (Ref: CM 2.2) |
*: COTS derived requirements are defined in Chapter 5, Section 3.5.2.c.
The purpose of this paragraph is to highlight the aspects of ED-153 which are not covered by ED-109/DO-278.
ED-109/DO-278 mainly addresses the safety aspects of software during its development. The major ED-109/DO-278 missing items are as follows:
- There is no reference to a system safety assessment standard;
- The standard provides no guidance for allocating Assurance Levels;
- HMI specifics are not covered;
- The documentation process is only partially covered;
- The lifecycle is restricted to the SW development phase.
The major ED-12B/DO-178B missing items also apply to ED-109/DO-278:
- Only a part of the safety lifecycle is defined by ED-12B/DO-178B (the part concerned with the development of software). No requirements are set concerning acquisition, supply, installation, acceptance, maintenance, operation and decommissioning;
- Scheduling of lifecycle activities;
- Integration of the software product into the system on site;
- Software integration testing is not defined concurrently with the design/development phases;
- Requirements for choosing a programming language;
- Staff training and staff competence;
- Capacity for safe modifications (a margin for throughput (e.g. Input and Output (I/O) rate or Central Processing Unit (CPU) load) and memory usage);
- Software self-monitoring of control flow and data flow;
- Techniques and methods to verify the outputs of the different development phases;
- Project risk management;
- Use of a Configuration Management tool;
- Tool selection criteria;
- Process improvement.
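One of the items above, software self-monitoring of control flow, can be illustrated with a minimal sketch. This is not taken from any of the cited standards; the checkpoint names and class are purely illustrative, and the idea is simply that each safety-related step reports a checkpoint and any deviation from the expected sequence is detected at run time:

```python
# Illustrative sketch only: control-flow self-monitoring via checkpoints.
# The expected sequence and names are hypothetical, not from ED-153.
EXPECTED_SEQUENCE = ["acquire_input", "validate_input", "compute", "output"]

class ControlFlowMonitor:
    def __init__(self, expected):
        self.expected = expected
        self.position = 0  # index of the next checkpoint we expect to see

    def checkpoint(self, name):
        """Called by the monitored software at each step; raises on deviation."""
        if (self.position >= len(self.expected)
                or self.expected[self.position] != name):
            raise RuntimeError(f"control-flow deviation at checkpoint '{name}'")
        self.position += 1

    def completed(self):
        """True only if every expected checkpoint was reached, in order."""
        return self.position == len(self.expected)
```

A run that skips or reorders a step (e.g. calling `compute` directly after `acquire_input`) raises immediately, which is the kind of detection the gap list refers to.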
ANNEX B
In this scenario, the ANSP awards a manufacturer a contract for the equipment part of a system, i.e. a set of equipment (HW and SW) that together fulfils the requirements allocated by the ANSP to this system. Some of the requirements may even address procedural aspects of operating and maintaining the system (e.g. reset, swap, switch to degraded mode, reversion to normal operation).
Therefore, the Manufacturer writes the system requirements derived from the ANSP operational requirements. Consequently, the Manufacturer also writes the SW requirements.
The System Manufacturer has to complete the risk assessment and mitigation process (namely, contribute to finishing the PSSA and perform the SSA for this part of the system).
For example, the ANSP allocates a SWAL globally to the overall system. The System Manufacturer proposes a refinement of the SWAL allocation to each SW component of the system: it may only degrade a SWAL to a less demanding SWAL, by demonstrating SW isolation and its contribution to hazards and their effects.
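The refinement rule described above can be sketched as a simple acceptance check. This is a hypothetical illustration, not part of ED-153: it assumes SWAL1 is the most demanding level and SWAL4 the least, so "degrading" a component means assigning it a higher level number, and the evidence flag stands in for the demonstration of SW isolation and hazard contribution:

```python
# Illustrative sketch of the SWAL refinement rule (names are assumptions).
def is_acceptable_refinement(system_swal: int, component_swal: int,
                             isolation_demonstrated: bool) -> bool:
    """Check a per-component SWAL proposed by the manufacturer against the
    SWAL the ANSP allocated globally to the system (SWAL1 = most demanding)."""
    if component_swal == system_swal:
        # Keeping the system-wide SWAL needs no further justification.
        return True
    if component_swal > system_swal:
        # Degrading to a less demanding SWAL is acceptable only when SW
        # isolation and the component's contribution to hazards and their
        # effects have been demonstrated.
        return isolation_demonstrated
    # The refinement described above only degrades SWALs; a more demanding
    # per-component SWAL than the global allocation is not covered here.
    return False
```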
L : Lead; C: Contribute; P: Perform; A: Accept
In this scenario, the ANSP awards a manufacturer a contract for one piece of equipment (HW and SW) that fulfils the requirements allocated by the ANSP to this SW.
Therefore, the ANSP writes the system requirements. The Manufacturer writes the SW requirements derived from the system requirements.
The SW Manufacturer has to complete the risk assessment and mitigation process (namely, contribute to finishing the SSA for this part of the SW).
For example, the ANSP allocates the SWAL to the SW. The only action bearing on the SW Manufacturer with regard to the SWAL consists of ensuring that the SSA does not identify failures that would lead to re-assessing the risk assessment and mitigation results.
L : Lead; C: Contribute; A: Accept
In this scenario, the ANSP performs the development internally. Various ANSP units/divisions may be involved to specify, develop, integrate, verify, etc., the SW.
Therefore, the ANSP has to satisfy all SWAL objectives.
L: Lead

| Objective N° | ANSP | Comments |
| --- | --- | --- |
| §2.1 | L | The system team is in charge of allocating the SWAL. |
| 3.0.1 | L | The ANSP has to set up its SASS for the overall SW lifecycle. |
| 3.0.2 | L | |
| 3.0.3 | L | |
| 3.0.4 | L | |
| 3.0.5 | L | |
| 3.0.6 | L | |
| 3.0.7 | L | |
| 3.0.8 | L | |
| 3.0.9 | N/A | As per ED-153. |
| 3.0.10 | L | The SW development unit leads the SW development part. |
| 3.0.11 | L | |
| 3.0.12 | L | The SW development unit leads it during the SW development part. |
| 3.0.13 | L | If applicable. |
| 3.0.14 | L | |
| 3.0.15 | L | |
| 3.0.16 | L | |
| 3.0.17 | L | |
| 3.1.X | L | The system team/unit. |
| 3.X.Y | L | The SW development unit/team. |
| 4.1.X | L | The unit/team in charge of specifying the needs. |
| 4.2.X | L | The unit/team in charge of SW development. |
| 4.3.1 | L | The System unit/team. |
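The allocation above can be checked mechanically: in the in-house scenario every applicable objective must have the ANSP as Lead ("L"). The sketch below is purely illustrative (the objective sample and function name are assumptions, not from ED-153); "N/A" entries are skipped:

```python
# Illustrative sketch: verify the responsibility allocation is complete.
# A small sample of the objective numbers from the table above.
ALLOCATION = {
    "3.0.1": "L", "3.0.2": "L", "3.0.9": "N/A",
    "3.0.10": "L", "4.3.1": "L",
}

def unled_objectives(allocation):
    """Return objectives that are applicable but have no Lead assigned."""
    return [obj for obj, role in allocation.items()
            if role != "N/A" and role != "L"]
```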
ANNEX C
| Req. N° | ESARR 6 Requirement | ED-153 Ref. Objectives N° |
| --- | --- | --- |
| 1.2 (cont.) | … arguments to demonstrate the required assurance are at all times derived from: … iii. a known set of software products and descriptions (including specifications) that have been used in the production of that version. | 3.0.7, 3.0.15, 5.2.X, 5.4.13 |
| 1.3 | The ATM service-provider shall provide the required assurances, to the Designated Authority, that the requirements in section 1.2 above have been satisfied. | 3.0.8, 3.0.3, 3.0.6, 3.0.7, 3.0.15, 3.0.16, 3.0.17, 3.4.3, 3.0.10, 3.0.11, 4.2.6, 4.4.5, 5.3.2, 5.3.1, 5.3.3, 5.6.1, 5.6.2, 5.6.3, 5.7.X, 6.1.3 |
| 2 | The ATM service provider shall ensure, as a minimum, that the Software Safety Assurance System: | |
| 2.1 | Is documented, specifically as part of the overall Risk Assessment and Mitigation Documentation. | 3.2.1, 3.0.1, 3.2.2, 3.2.3, 3.2.4, 3.5.1, 3.5.2, 3.5.3, 4.1.4, 4.2.4, 5.1.1, 5.1.2, 5.1.3, 5.1.4 |
| 2.2 | Allocates software assurance levels to all operational ATM software. | 3.0.5, 3.0.11, 4.1.3, 4.4.5, 4.5.2 |
| 2.3 | Includes assurances of: a) software requirements validity; | 3.0.12, 3.0.2, 4.3.4, 4.3.9, 4.3.10, 4.3.11, 4.3.12, 4.3.13, 4.3.14, 5.4.3, 5.4.5, 5.4.6, 5.4.9, 5.7.2, 5.4.11 |
| | b) software verification; | 4.1.6, 4.1.7, 4.2.6, 4.2.7, 4.3.16, 3.0.6, 5.3.X, 5.4.X, 5.6.X, 5.7.3, 5.7.X, 5.7.4 |
| | c) software configuration management; and | 5.2.1, 3.0.7, 5.2.X, 5.4.13 |
| | d) software requirements traceability. | 5.4.10, 3.0.3, 4.3.15, 5.2.10 |
| 2.4 | Determines the rigour to which the assurances are established. The rigour shall be defined for each software assurance level, and shall increase as the software increases in criticality. For this purpose: a) the variation in rigour of the assurances per software assurance level shall include the following criteria: i. required to be achieved with independence, ii. required to be achieved, iii. not required; | 3.0.9, 4.1.4, 3.0.10; all tables of the document |
| | b) the assurances corresponding to each software assurance level shall give sufficient confidence that the ATM software can be operated tolerably safely. | 3.0.8, 3.0.11, 4.2.5, 4.2.6, 4.2.7, 4.1.4, 4.4.5, 4.5.2, 4.5.3 |
© EUROCAE, 2009
189
ED-153 Ref.
ESARR 6 Requirement
Objectives N°
Uses feedback of ATM software experience to confirm that
the Software Safety Assurance System and the assignement
2.5 of assurance levels are appropriate. For this purpose, the 3.0.11, 4.4.X, 4.5.2, 4.5.3, 5.8.X
effects resulting from any software malfunction or failure from
ATM operational ex
Provides the same level of confidence, through any means 3.0.13, 7.2.1, 7.2.2, 7.2.3, 7.2.4,
chosen and agreed with the Designated Authority, for the 7.2.5, 7.2.6, 7.2.7, 7.2.8, 7.2.9,
2.6 developmental and non-developmental ATM software (eg 7.2.10, 7.2.11
COTS - commercial off-the-shelf software, etc) with the same
software assurance level. Section 7.2
© EUROCAE, 2009
190
ED-153 Ref.
ESARR 6 Requirement
Objectives N°
The ATM
service- Specify the functional behaviour (nominal and downgraded
4.3.4, 3.0.2, 3.1.1, 3.3.4, 4.3.9,
provider modes) of the ATM software, timing performances, capacity,
4.3.5, 4.3.6, 4.3.10, 4.3.11,
shall 4.1 accuracy, software resource usage on the target hardware,
5.4.3, 5.4.5, 5.4.6, 4.3.12, 5.4.9,
ensure, as robustness to abnormal operating conditions and overload
4.3.13, 4.3.14, 4.3.20
a minimum tolerance, as appropriat
within the
4 Software
Safety 4.3.4, 4.3.5, 4.3.6, 5.2.5, 3.0.2,
Assurance 3.3.4, 3.4.X, 4.3.9, 4.3.10,
System, Are complete and correct, and are also compliant with the
4.2 4.3.11, 4.3.12, 4.3.13, 4.3.14,
that system safety requirements
4.3.20, 5.4.3, 5.4.5, 5.4.6, 5.4.8,
software 5.4.9, 6.1.4, 6.1.5
requirement
s:
The ATM The functional behaviour of the ATM software, timing
service- performances, capacity, accuracy, software resource usage
4.4.5, 3.0.6, 5.4.2, 5.4.3, 5.3.X,
5 provider 5.1 on the target hardware, robustness to abnormal operating
5.6.X, 5.7.1, 5.7.2
shall conditions and overload, comply with the software
ensure, as requirements
a minimum,
within the
Software
Safety
Assurance
4.1.7, 4.2.7, 3.0.6, 3.4.X, 5.2.3,
System, The ATM software is adequately verified by analysis and / or
5.2.5, 5.4.1, 5.4.2, 5.4.3, 5.4.4,
that: 5.2 testing and / or equivalent means, as agreed with Designated
5.4.5, 5.4.6, 5.4.7, 5.4.8, 5.4.9,
Authority.
5.4.10, 5.4.11, 5.6.X, 5.7.X
© EUROCAE, 2009
191
ED-153 Ref.
ESARR 6 Requirement
Objectives N°
© EUROCAE, 2009
192
ED-153 Ref.
ESARR 6 Requirement
Objectives N°
provider
shall
ensure, as
a minimum,
Each software requirement, at each level in the design at
within the
7.2 which its satisfaction is demonstrated, is traced to a system 4.3.15, 3.0.3, 5.4.10
Software
requirement.
Safety
Assurance
System,
that:
This safety regulatory requirement shall apply to civil and
military ATM service who have the responsibility for the
8.1 management of safety in ground-based ATM systems and Section 1.2
other ground-based supporting services (including CNS)
under their managerial control
8 The software safety assurance system already existing for
ATM systems under the direct managerial control of the
8.2 N/A
military ATM organisation can be accepted, provided it
accords with the obligatory provisions of ESARR 6.
The obligatory provisions of this ESARR shall be enacted as
8.3 N/A
minimum national safety regulatory requirements.
The provisions of ESARR 6 are to become effective within
9 9.1 three years from the date of its approval by the N/A
EUROCONTROL Commission
Use of the terms and definitions listed in Appendix A shall be
11 11.1 Sections 1.6 and 1.7
considered mandatory.
APPENDIX I
WG-64 MEMBERSHIP
Members:
FIRST NAME LAST NAME COMPANY
Vincent BEHE SKYGUIDE
Amada BERNALDEZ DE ARANZABAL INDRA
Christophe BERTHELÉ DSNA (MSQS)
David BOWEN EUROCAE
Yann CARLIER DSNA
Mark CATTERSON NATS
Florin CIORAN EUROCONTROL
Nathalie CORBOVIANU DSNA (DTI)
Andrew EATON UK CAA (SRG)
Corinna GINGINS SKYGUIDE
Jules HERMENS DUTCH CAA (IVW)
Francisco LOPEZ AENA
Cécile MOURA DSNA (MSQS)
Gabrielle PARIZE DSNA (DTI)
Bernard PAULY THALES
John PENNY UK CAA (SRG)
Paula SANTOS NAV PORTUGAL
Vincent SCHIFFLERS EUROCONTROL
Rob WEAVER NATS
John SPRIGGS NATS
APPENDIX II
Introduction
The SWAL CS group was established as a separate group within WG64 to respond to
the European Commission Mandate to CEN/CENELEC/ETSI for the development of
European Standards (first set of Community Specifications) for interoperability of the
European Air Traffic Management Network (EATMN).
This mandate (M/390) in particular specified that “The elaboration of the standard
must be undertaken in cooperation with EUROCAE, particularly taking into account
the technical expertise of EUROCAE on equipments (systems and constituents) for air
traffic management.”
The terms of reference for the WG-64 SWAL CS group (see below) are therefore
taken directly from the “Description of the Mandate Work” in section 2 of M/390.
Terms of Reference
CEN/CENELEC/ETSI are asked to produce European standards that satisfy the
essential requirements and/or implementing rules of the interoperability Regulation for
systems, together with the relevant procedures, or constituents for the following
agreed priority 1 Community specifications:
Purpose and scope
The Community specification on Software Assurance Levels (SWAL) is intended to
apply to software components that are part of an Air Navigation System (ANS),
focusing only on the "ground" segment of ANS, and to provide a reference against
which stakeholders can assess their own practices for software specification, design,
development, operation, maintenance, evolution and decommissioning.
Recommendations on the major processes required to provide assurance for software
in Air Navigation Systems may include:
- An allocation process for Software Assurance Levels (SWAL);
- A SWAL grading policy, i.e. the identification of a policy and its rationale to justify and substantiate the increasing stringency of the objectives to be met per SWAL;
- A list of objectives to be satisfied per SWAL;
- The identification of appropriate techniques to achieve these objectives.
The scope of the Community specification will cover the overall lifecycle of software
within an Air Navigation System and provide an assessment of the activities for the
development, operation, maintenance and evolution of Air Navigation System software
components.
APPENDIX III
UK CAA/SRG:
The content of this document has not been fully supported by UK CAA/SRG with regard to the ability of ED-153 to satisfy some aspects of EC Regulation 482/2008.
DFS:
The content of this document has not been fully supported by DFS due to the reasons described
below:
APPENDIX IV
Name: Company:
Address:
Phone: Fax:
Email: