Chapter 5 Static Testing
Outline:
- Reviews
- Program Inspections
- Walkthroughs
- Principles of Validation
- Software Verification
Formal Design Reviews (Reviews)
• In a review , a work product is examined for defects by individuals other than the person who
produced it.
• A Work Product is any important deliverable created during the requirements, design, coding,
or testing phase of software development.
• Formal design reviews, commonly called “design reviews”, “DRs” or “formal technical reviews (FTRs)”, are the only reviews required for approval of the design product.
• Reviews are important in the SQA process because they provide early detection of defects and prevent analysis and design errors from passing to subsequent phases.
• Methods to conduct reviews include:
• Formal Design Review
• Peer Reviews (Walkthrough and Inspection)
• Expert Opinion
Review Objectives
Direct objectives:
1. To detect analysis and design errors, as well as subjects where corrections, changes and completions are required.
2. To identify new risks likely to affect the project.
3. To locate deviations from templates, style procedures and conventions.
4. To approve the analysis or design product. Approval allows the team to continue on to the next development phase.
Indirect objectives:
1. To provide an informal meeting place for the exchange of professional knowledge about methods, tools and techniques.
2. To record analysis and design errors that will serve as a basis for future corrective actions.
Some Common Formal Design Reviews
Characteristics of a DR leader
• Knowledge and experience in development of projects of the type reviewed.
• Preliminary acquaintance with the current project is not necessary.
• Seniority at a level similar to, if not higher than, that of the project leader.
• A good relationship with the project leader and their team.
• A position external to the project team.
Design Review Preparations
• Review Team
• Review documents
• List comments
• Use checklist
Sample Design Review Checklist
• Well-structured
• Simple
• Efficient
• Adequate
• Flexible
• Practical
• Implementable
• General:
1. Does the architecture convey a clear vision of the system that can be used for further development?
2. Is the architecture structured to support likely changes?
3. Does the architecture describe the system at a suitably high level of abstraction? (No interface or implementation details.)
4. Does the architecture cleanly decompose the system?
5. Is the architecture independent of the infrastructure used to develop the system?
6. Has maintainability been considered?
7. Is the architecture free of duplicated functionality?
• Complete:
1. Are software requirements reflected in the software architecture?
2. Is effective modularity achieved? Are modules functionally independent?
3. Does each module/class have an understandable name?
4. Is each association well named?
5. Is each association’s and aggregation’s cardinality correct?
• Correct:
1. Does each association reflect a relationship that exists over the lives of the related modules/classes?
2. Does the architecture have loose coupling and good cohesion? (Illustrated in the sketch below.)
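To make the coupling and cohesion item concrete, here is a minimal C sketch added for illustration (the module and function names are invented for this example, not taken from the checklist). The first version couples its functions through hidden shared state; the second owns the state explicitly and passes it through parameters, which is the loose coupling a reviewer looks for.

    /* Tightly coupled: both functions depend on a hidden global,
       so neither can be understood or reused in isolation. */
    int g_total;

    void add_sale(int amount) { g_total += amount; }
    int  total_sales(void)    { return g_total; }

    /* Loosely coupled, cohesive alternative: the state is owned by
       the caller and passed explicitly, so each function depends
       only on its parameters. */
    typedef struct { int total; } SalesLedger;

    void ledger_add(SalesLedger *ledger, int amount) { ledger->total += amount; }
    int  ledger_total(const SalesLedger *ledger)     { return ledger->total; }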
DR Session
• A typical DR session agenda includes:
• A short presentation of the design document
• Comments made by members of the review team
• Verification and validation of the comments are discussed to determine the required action items (corrections, changes and additions)
• Decisions about the design product (document), which determine the project's progress:
• Full approval
• Partial approval
• Denial of approval
Post-Design Review Session
1. Preparation of the DR report
• The report's major sections:
• A summary of the review discussions.
• The decision about continuation of the project.
• A full list of the required action items: corrections, changes and
additions. For each action item, completion date and project
team member responsible are listed.
• The name(s) of the review team member(s) assigned to follow up.
Guidelines for successful Design Review
Design Review Infrastructure
• Develop checklists for common types of design documents.
• Train senior professionals to serve as a reservoir of DR team members.
• Periodically analyze the effectiveness of past DRs.
• Schedule DRs as part of the project plan.
Guidelines for successful Design Review (cont’d)
The Design Review Session
• Discuss professional issues in a constructive way, refraining from personalizing the issues.
• Keep to the review agenda.
• Focus on detection of defects by verifying and validating the participants'
comments. Refrain from discussing possible solutions.
• In cases of disagreement about an error, end the debate by noting the issue and shifting its discussion to another forum.
• Properly document the discussed comments, and the results of their
verification and validation.
• The duration of a review session should not exceed two hours.
Post-Review Activities
• Prepare the review report, including the action items
• Establish follow-up to ensure satisfactory completion of all listed action items.
Peer Reviews
• Peer reviews are conducted to detect errors and deviations from standards
• The difference between design reviews and peer reviews lies in the participation and authority of the reviewers:
• Participants in a design review hold positions senior to the project leader, while participants in a peer review hold positions equal to the project leader.
• Formal design reviews are authorized to approve the design document so that work on the next stage of the project can begin.
• Peer reviews, however, are authorized only to detect errors and deviations from standards.
• Common methods of peer reviews: inspection and walkthrough
• Peer reviews are guided by: Checklists, Standards and Past problems
Peer Review: Software Inspections
• These involve people examining a work product with the aim of discovering
anomalies and defects
• Inspections do not require execution of a system
• They may be applied to any representation of the system (requirements, design,
configuration data, test data, etc.)
• They have been shown to be an effective technique for discovering program
errors
• Inspections can check conformance with a specification but not conformance with
the customer’s real requirements.
• Inspections cannot check non-functional characteristics such as performance,
usability, etc.
• Intended explicitly for defect detection (not correction).
Inspection Participants
• Author or owner: The programmer or designer responsible for producing the program or document, and for fixing defects discovered during the inspection process.
• Inspector: Finds errors, omissions and inconsistencies in programs and documents. May also identify broader issues that are outside the scope of the inspection team.
• Reader: Presents the code or document at an inspection meeting.
• Scribe: Records the results of the inspection meeting.
• Chairman or moderator: Manages the process and facilitates the inspection. Reports process results to the chief moderator.
• Chief moderator: Responsible for inspection process improvements, checklist updating, standards development, etc.
Inspection Participants (cont’d)
• A designer:
• The systems analyst responsible for analysis and design of the software system
reviewed.
• A coder or implementer
• This inspector is expected to contribute his or her expertise to the detection of defects
that could lead to coding errors and subsequent software implementation difficulties.
• It is preferable if this inspector is the leader of the coding team.
• A tester
• An experienced professional, preferably the leader of the assigned testing team, who
focuses on identification of design errors usually detected during the testing phase.
Inspection Preparation
• Review leader/Moderator
• The moderator, together with the author, determines which sections of the design document are to be reviewed.
• Select the team members
• Schedule the review session
• Distribute the document to the team members prior to the review session
• Team Members
• Read the document sections to be reviewed
• List out their comments
• An error checklist should be prepared; the checklist is an important tool supporting the inspector's review.
Inspection Checklist Samples
• Data faults:
  • Are all program variables initialised before their values are used?
  • Have all constants been named?
  • Should the upper bound of arrays be equal to the size of the array, or size - 1?
  • If character strings are used, is a delimiter explicitly assigned?
  • Is there any possibility of buffer overflow?
• Control faults:
  • For each conditional statement, is the condition correct?
  • Is each loop certain to terminate?
  • Are compound statements correctly bracketed?
  • In case statements, are all possible cases accounted for?
  • If a break is required after each case in case statements, has it been included?
• Input/output faults:
  • Are all input variables used?
  • Are all output variables assigned a value before they are output?
  • Can unexpected inputs cause corruption?
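The following short C fragment, written for this chapter as an illustration (it is not part of the original checklist), deliberately seeds several of the data and control faults listed above; each comment names the checklist question that would catch the line during an inspection.

    #include <stdio.h>
    #include <string.h>

    #define N 5

    int main(void) {
        int values[N];
        int sum;                          /* data fault: read below before it is initialised */
        char buf[8];

        for (int i = 0; i <= N; i++)      /* data fault: upper bound should be N - 1 */
            values[i] = i;

        strcpy(buf, "too long for buf");  /* data fault: possible buffer overflow */

        switch (values[1]) {
        case 0:
            printf("zero\n");             /* control fault: missing break, falls through */
        case 1:
            printf("one\n");
            break;
        }                                 /* control fault: no default, so not all cases are handled */

        for (int i = 0; i < N; i += 0)    /* control fault: this loop is not certain to terminate */
            sum += values[i];

        printf("%d\n", sum);
        return 0;
    }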
Inspection Checklist Samples (cont'd)
• Interface faults:
  • Do all function and method calls have the correct number of parameters?
  • Do formal and actual parameter types match?
  • Are the parameters in the right order?
  • If components access shared memory, do they have the same model of the shared memory structure?
• Storage management faults:
  • If a linked structure is modified, have all links been correctly reassigned?
  • If dynamic storage is used, has space been allocated correctly?
  • Is space explicitly de-allocated after it is no longer required?
• Exception management faults:
  • Have all possible error conditions been taken into account?
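In the same spirit, this sketch (again invented purely for illustration) seeds one fault from each class above: an interface fault, a storage-management fault and an exception-management fault.

    #include <stdlib.h>

    /* Interface under review: divide(numerator, denominator). */
    int divide(int numerator, int denominator) { return numerator / denominator; }

    int main(void) {
        /* interface fault: the actual arguments are in the wrong
           order; the intent was divide(10, 2) */
        int ratio = divide(2, 10);

        /* exception management fault: the malloc result is never
           checked, so an allocation failure dereferences NULL */
        int *counts = malloc(100 * sizeof *counts);
        counts[0] = ratio;

        /* storage management fault: counts is never freed, so the
           space is not explicitly de-allocated after use */
        return 0;
    }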
Inspection Process
Inspection Session
• Author presents system overview to the inspection team
• Review team asks questions and expresses opinions
• Inspection takes place and discovered errors are noted
Post-Inspection Session
• After Meeting
• Scribe prepares summary
• Team approves summary
• Modifications are made to repair discovered errors
• Re-inspection (follow up) may or may not be required
• At the end of the session, two documents will be prepared:
Inspection session findings report
• This report, produced by the scribe, should be completed and distributed immediately after the
session’s closing. Its main purpose is to assure full documentation of identified errors for correction
and follow up.
Inspection session summary report
• This report is to be compiled by the inspection leader shortly after the session or series of
sessions dealing with the same document. A typical report of this type summarizes the
inspection findings and the resources invested in the inspection.
• It serves mainly as input for analysis aimed at inspection process improvement and corrective
actions that go beyond the specific document or project.
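One way to picture the findings report is as one record per action item; the following C struct is a hypothetical layout sketched for illustration (the field names are assumptions, not a prescribed format), showing that each item carries its correction type, the responsible team member and a completion date, as the report sections above require.

    /* One row of an inspection-session findings report (illustrative only). */
    struct ActionItem {
        int  id;                /* sequential error number                  */
        char location[64];      /* document section or source file and line */
        char description[128];  /* the defect as agreed in the session      */
        char type[16];          /* "correction", "change" or "addition"     */
        char owner[32];         /* project team member responsible          */
        char due_date[11];      /* completion date, e.g. "2025-06-30"       */
    };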
Inspection Guidelines

Peer Review: Walkthrough
• A walkthrough can be pre-planned or conducted on an as-needed basis; generally, the people working on the work product take part in the walkthrough process.
Walkthrough Preparation
Participants
• Review leader (coordinator)
• The author
• Scribe/Recorder
• Specialized professionals:
• Standards enforcer
• Maintenance expert
• User representative
Walkthrough Participants (cont’d)
Walkthrough Session
• Walkthrough session opens with the author’s short presentation or overview of
the project and the design sections to be reviewed
• Reviewers present comments, possible defects and improvement suggestions
• The scribe/recorder takes notes of all the defects and suggestions raised during the walkthrough meeting
• Based on reviewer comments, the author reworks the product as necessary
• At the end of a session, copies of the error documentation – the “walkthrough
session findings report” – should be handed to the development team and the
session participants.
Sample Design Walkthrough
Inspection vs. Walkthroughs
• An inspection is the more formal of the two: it is managed by a trained moderator, driven by checklists, and its findings feed corrective actions and process improvement beyond the specific project.
• A walkthrough is less formal: it is centred on the author's presentation of the work product, and its main outputs are detected defects and improvement suggestions.
Expert Opinion
• Expert opinions, prepared by outside experts, support quality
evaluation by introducing additional capabilities to the internal
review staff.
Expert Opinion (cont'd)
• Situations in which an expert's participation in reviews may become necessary:
• Insufficient in-house professional capabilities in a specialized area.
• A temporary lack of in-house professionals for the review team.
• Indecisiveness caused by major disagreements among the organization's senior professionals.
• Small organizations, where the number of suitable candidates for a review team is insufficient.
Technical review
◦ A technical review is carried out by a trained moderator or technical expert.
◦ It is less formal as compared to an inspection.
◦ It is usually carried out as a peer review without any participation from the
management.
◦ A team of peers reviews the technical specification of the software product and checks whether it is suitable for the project.
◦ They try to find any inconsistencies in the specifications and standards followed.
◦ This review concentrates mainly on the technical documents related to the software
such as test strategy, test plan and requirement specification documents.
Management Review
◦ A management review is performed by management representatives to monitor project progress, determine the status of plans and schedules, and support decisions about the continuation of the project.
Verification and Validation
• Verification: Are we building the product right? The software should conform to its specification.
• Validation: Are we building the right product? The software should do what the user really requires.
Verification & Validation Process
• V & V is a whole-life-cycle process; it must be applied at each stage of the software process.
• Its two principal objectives are the discovery of defects in a system and the assessment of whether the system is useful and usable in an operational situation.
Goals of V & V
• Verification and validation should establish confidence that
the software is fit for purpose
• This does NOT mean completely free of defects
• Rather, it must be good enough for its intended use and the
type of use will determine the degree of confidence that is
needed
Verification and Validation Confidence
• Depends on system’s purpose, user expectations and
marketing environment
• Software function
• The level of confidence depends on how critical the
software is to an organisation.
• User expectations
• Users may have low expectations of certain kinds of
software.
• Marketing environment
• Getting a product to market early may be more
important than finding defects in the program.
Verification: Static and Dynamic Verification
• Static verification (e.g., software inspections): analysis of static system representations, such as requirements, design documents and source code, to discover problems without executing the software.
• Dynamic verification (software testing): exercising the product and observing its behaviour, which requires an executable version of the system.
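A small C sketch added here for illustration (the function names are invented, not from the slides) contrasts the two: the first fault is visible in the static representation of the code, so an inspection or a compiler warning finds it without running anything, while the second only surfaces when the program is executed with a particular input.

    /* Statically detectable: factor may be read before it is
       initialised (whenever x <= 0); an inspector or a compiler
       warning catches this without executing the code. */
    int scale(int x) {
        int factor;
        if (x > 0)
            factor = 2;
        return x * factor;
    }

    /* Dynamically detectable: the code looks plausible on paper,
       but executing it with b == 0 reveals a division-by-zero
       failure, which is what testing is for. */
    int ratio(int a, int b) {
        return a / b;
    }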
Summary of SQA Techniques
• Formal Design Reviews
• conducted by senior personnel or outside experts
• uncover potential problems
• Inspections and Walkthroughs
• done by peers
• detect errors, adherence to standards, etc.
• Expert Opinion
• an external expert's opinion on the design work
Comparison of Review Methods
SQA Plan – 1
• Management section
• describes the place of SQA in the structure of the
organization
• Documentation section
• describes each work product produced as part of
the software process
• Standards, practices, and conventions section
• lists all applicable standards/practices applied
during the software process and any metrics to be
collected as part of the software engineering work
SQA Plan - 2
• Reviews and audits section
• identifies the reviews and audits to be conducted and describes how each will be carried out
• Test section
• references the test plan and procedures and describes test record keeping
SQA Plan - 3
• Problem reporting and corrective action section
• defines procedures for reporting, tracking and resolving errors and defects, and identifies the organizational responsibilities involved
• Other sections
• cover tools, code control, media control, records, training and risk management