
Chapter 5

Static Testing Techniques

Outline:
- Reviews
- Program Inspections
- Walkthroughs
- Principles of Validation
- Software Verification

1.2 Formal Design Reviews (Reviews)
• In a review, a work product is examined for defects by individuals other than the person who
produced it.
• A work product is any important deliverable created during the requirements, design, coding,
or testing phase of software development.

• Formal design reviews, commonly called “design reviews”, “DRs” or “formal technical reviews
(FTRs)”, are the only reviews authorized to approve the design product.

• Reviews in the SQA process are important because they provide early detection of analysis and
design errors and prevent their passage to subsequent phases.
• Methods to conduct reviews include:
• Formal Design Review
• Peer Reviews (Walkthrough and Inspection)
• Expert Opinion

Review Objectives
Direct objectives:
1. To detect analysis and design errors, as well as areas where corrections, changes and
additions are required.
2. To identify new risks likely to affect the project.
3. To locate deviations from templates, style procedures and conventions.
4. To approve the analysis or design product. Approval allows the team to continue on to
the next development phase.

Indirect objectives:
1. To provide an informal meeting place for the exchange of professional knowledge about
methods, tools and techniques.
2. To record analysis and design errors that will serve as a basis for future corrective
actions.

Some Common Formal Design Reviews

• DPR – Development Plan Review
• SRSR – Software Requirement Specification Review
• PDR – Preliminary Design Review
• DDR – Detailed Design Review
• DBDR – Database Design Review
• TPR – Test Plan Review
• STPR – Software Test Procedure Review
• VDR – Version Description Review
• OMR – Operator Manual Review
• SMR – Support Manual Review
• TRR – Test Readiness Review
• PRR – Product Release Review
• IPR – Installation Plan Review
Participants in a Design Review

• All DRs are conducted by a review leader and a review team
• Reviewers should be senior personnel and/or outside experts
• The project leader should not be the review leader
• It is desirable for non-project staff to make up the majority of the review team

Characteristics of a DR leader
• Knowledge and experience in development of projects of the type reviewed.
• Preliminary acquaintance with the current project is not necessary.
• Seniority at a level similar to, if not higher than, that of the project leader.
• A good relationship with the project leader and his or her team.
• A position external to the project team.
Design Review Preparations

• Review Team Leader
• Appoint team members
• Schedule review session
• Distribute design (or other) documents to review team members

• Review Team
• Review documents
• List comments
• Use checklist
Sample Design Review Checklist
The design should be:
• Well-structured
• Simple
• Efficient
• Adequate
• Flexible
• Practical
• Implementable

• General:
1. Does the architecture convey a clear vision of the system that can be used for further development?
2. Is the architecture structured to support likely changes?
3. Does the architecture describe the system at a high level of abstraction? (No interface or implementation details.)
4. Does the architecture cleanly decompose the system?
5. Is the architecture independent of the infrastructure used to develop the system?
6. Has maintainability been considered?
7. Is the architecture free of duplicate functionality?
• Complete:
1. Are the software requirements reflected in the software architecture?
2. Is effective modularity achieved? Are modules functionally independent?
3. Does each module/class have an understandable name?
4. Is each association well named?
5. Is each association’s and aggregation’s cardinality correct?
• Correct:
1. Does each association reflect a relationship that exists over the lives of the related modules/classes?
2. Does the architecture have loose coupling and high cohesion?

DR Session
• A typical DR session agenda includes:
• A short presentation of the design document
• Comments made by members of the review team
• Verification and validation of the comments to determine the
required action items (corrections, changes and additions)
• A decision about the design product (document), which determines the
project's progress:
• Full approval
• Partial approval
• Denial of approval

Post-Design Review Session
1. Preparation of the DR report
• The report's major sections:
• A summary of the review discussions.
• The decision about continuation of the project.
• A full list of the required action items: corrections, changes and
additions. For each action item, the completion date and the responsible
project team member are listed.
• The name(s) of the review team member(s) assigned to follow
up.

2. Follow-up of the performance of the corrections and examination of
the corrected sections

Guidelines for a Successful Design Review
Design Review Infrastructure
• Develop checklists for common types of design documents.
• Train senior professionals to serve as a reservoir for DR teams.
• Periodically analyze the effectiveness of past DRs.
• Schedule the DRs as part of the project plan.

The Design Review Team
• Review team size should be limited, with 3–5 members
being the optimum.

Guidelines for a Successful Design Review (cont’d)
The Design Review Session
• Discuss professional issues in a constructive way, refraining from
personalizing the issues.
• Keep to the review agenda.
• Focus on detection of defects by verifying and validating the participants'
comments. Refrain from discussing possible solutions.
• In cases of disagreement about an error, end the debate by noting the
issue and shifting its discussion to another forum.
• Properly document the discussed comments and the results of their
verification and validation.
• The duration of a review session should not exceed two hours.
Post-Review Activities
• Prepare the review report, including the action items.
• Establish follow-up to ensure the satisfactory completion of all
action items.

Peer Reviews
• Peer reviews are conducted to detect errors and deviations from standards
• The difference between design reviews and peer reviews lies in the
participation and authority of the reviewers
• Participants in a design review hold positions superior to the project leader’s,
while participants in a peer review hold positions equal to the project leader’s
• Formal design reviews are authorized to approve the design document so that
work on the next stage of the project can begin
• Peer reviews, however, are authorized only to detect errors and deviations
from standards
• Common methods of peer review: inspection and walkthrough
• Peer reviews are guided by checklists, standards and past problems

Peer Review: Software Inspections
• Inspections involve people examining a work product with the aim of discovering
anomalies and defects
• Inspections do not require execution of the system
• They may be applied to any representation of the system (requirements, design,
configuration data, test data, etc.)
• They have been shown to be an effective technique for discovering program
errors
• Inspections can check conformance with a specification but not conformance with
the customer’s real requirements
• Inspections cannot check non-functional characteristics such as performance,
usability, etc.
• Inspections are intended explicitly for defect detection (not correction)

Inspection Participants

• Author or owner – The programmer or designer responsible for producing the
program or document. Responsible for fixing defects discovered during the
inspection process.
• Inspector – Finds errors, omissions and inconsistencies in programs and
documents. May also identify broader issues that are outside the scope of the
inspection team.
• Reader – Presents the code or document at an inspection meeting.
• Scribe – Records the results of the inspection meeting.
• Chairman or moderator – Manages the process and facilitates the inspection.
Reports process results to the chief moderator.
• Chief moderator – Responsible for inspection process improvements, checklist
updating, standards development, etc.

Inspection Participants (cont’d)

Specialized professionals (inspectors)

• A designer
• The systems analyst responsible for analysis and design of the software system
being reviewed.

• A coder or implementer
• This inspector is expected to contribute his or her expertise to the detection of defects
that could lead to coding errors and subsequent software implementation difficulties.
• It is preferable if this inspector is the leader of the coding team.

• A tester
• An experienced professional, preferably the leader of the assigned testing team, who
focuses on identification of design errors usually detected during the testing phase.

Inspection Preparation

• Review leader/Moderator
• The moderator, together with the author, determines which sections of the design document
are to be reviewed
• Selects team members
• Schedules the review session
• Distributes the document to the team members prior to the review session
• Team Members
• Read the document sections to be reviewed
• List their comments

• An error checklist should be prepared; the error checklist is an important tool supporting the
inspector’s review.

Inspection Checklists

• A checklist of common errors should be used to drive the inspection.
• Error checklists are programming-language dependent and reflect the
characteristic errors that are likely to arise in the language.
• In general, the ‘weaker’ the type checking, the larger the checklist.
• Examples: initialisation, constant naming, loop termination, array bounds, etc.

Inspection Checklist – Samples

Data faults
• Are all program variables initialised before their values are used?
• Have all constants been named?
• Should the upper bound of arrays be equal to the size of the array, or size − 1?
• If character strings are used, is a delimiter explicitly assigned?
• Is there any possibility of buffer overflow?

Control faults
• For each conditional statement, is the condition correct?
• Is each loop certain to terminate?
• Are compound statements correctly bracketed?
• In case statements, are all possible cases accounted for?
• If a break is required after each case in case statements, has it been included?

Input/output faults
• Are all input variables used?
• Are all output variables assigned a value before they are output?
• Can unexpected inputs cause corruption?
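To make the checklist concrete, here is a short, hypothetical C fragment (not part of the original checklist) deliberately seeded with faults of the kinds listed above; the comments mark what an inspector would flag.

    #include <stdio.h>
    #include <string.h>

    #define MAX_NAME 16               /* constant is named, as the checklist requires */

    void greet(const char *input)
    {
        char name[MAX_NAME];
        int i;                        /* data fault: 'i' is used below before
                                         it is initialised */

        strcpy(name, input);          /* data fault: possible buffer overflow if
                                         'input' exceeds MAX_NAME - 1 characters */

        while (i < MAX_NAME) {        /* control fault: 'i' starts undefined, so
                                         loop termination is not certain */
            if (name[i] == '\0')
                break;
            i++;
        }

        switch (name[0]) {
        case 'a':
            printf("Hello, %s\n", name);
            /* control fault: missing 'break' causes an unintended
               fall-through into the default case */
        default:
            printf("Hi\n");
            break;
        }
    }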

Inspection Checklist – Samples (cont’d)

Interface faults
• Do all function and method calls have the correct number of parameters?
• Do formal and actual parameter types match?
• Are the parameters in the right order?
• If components access shared memory, do they have the same model of the
shared memory structure?

Storage management faults
• If a linked structure is modified, have all links been correctly reassigned?
• If dynamic storage is used, has space been allocated correctly?
• Is space explicitly de-allocated after it is no longer required?

Exception management faults
• Have all possible error conditions been taken into account?
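As an illustration of the storage management items, the following hypothetical C sketch removes a node from a singly linked list; an inspector would verify that the predecessor’s link is reassigned before the node is freed and that the freed pointer is never used again.

    #include <stdlib.h>

    struct node {
        int value;
        struct node *next;
    };

    /* Remove the first node whose value matches; return the (possibly new) head. */
    struct node *remove_value(struct node *head, int value)
    {
        struct node *prev = NULL;
        struct node *cur = head;

        while (cur != NULL && cur->value != value) {
            prev = cur;
            cur = cur->next;
        }
        if (cur == NULL)
            return head;              /* value not found: list unchanged */

        if (prev == NULL)
            head = cur->next;         /* removing the head node */
        else
            prev->next = cur->next;   /* link reassigned around 'cur' */

        free(cur);                    /* space explicitly de-allocated */
        return head;
    }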

Inspection Process

Inspection Session
• The author presents a system overview to the inspection team
• The review team asks questions and expresses opinions
• The inspection takes place and discovered errors are noted

Post-Inspection Session

• After the meeting
• The scribe prepares a summary
• The team approves the summary
• Modifications are made to repair discovered errors
• Re-inspection (follow-up) may or may not be required
• At the end of the session, two documents are prepared:
Inspection session findings report
• This report, produced by the scribe, should be completed and distributed immediately after the
session’s closing. Its main purpose is to assure full documentation of identified errors for correction
and follow-up.
Inspection session summary report
• This report is compiled by the inspection leader shortly after the session, or series of
sessions, dealing with the same document. A typical report of this type summarizes the
inspection findings and the resources invested in the inspection.
• It serves mainly as input for analysis aimed at inspection process improvement and corrective
actions that go beyond the specific document or project.
Inspection Guidelines

• Review the product, not the person!
• Find errors; don’t try to solve them!
• Keep records
• Take written notes.
• Review your earlier reviews.
• Allocate resources and schedule time for FTRs.
• 3 to 5 people
• Conduct training for reviewers
• Keep it short
• Limit debate and rebuttal.
• Set an agenda and keep to it.
• No more than two hours of preparation.
• Small portions only – a narrow focus increases the likelihood of finding an error.
• Meeting duration of less than two hours.
Walkthrough
• A walkthrough is an informal method of conducting a group or individual review
• In a walkthrough, a designer or programmer leads the review team through a software work
product, and the participants ask questions and make comments about
• possible errors,
• violations of development standards, and
• other problems, or may suggest improvements to the work product under review

• A walkthrough can be pre-planned or conducted on an as-needed basis, and generally people
working on the work product are involved in the walkthrough process.

The purpose of a walkthrough is to:
• Find problems
• Discuss alternative solutions
• Demonstrate how the work product meets all requirements

Walkthrough – Preparation

• The preconditions for a walkthrough are similar to those for inspections
• Team members briefly read the document to be reviewed in order to
obtain a general overview of the sections to be reviewed, the project
and its environment

Participants
• Review leader (coordinator)
• The author
• Scribe/Recorder
• Specialized professionals:
• Standards enforcer
• Maintenance expert
• User representative
Walkthrough Participants (cont’d)

Specialized professionals of the walkthrough team

• A standards enforcer
• This team member, who specializes in development standards and procedures, is assigned the
task of locating deviations from those standards and procedures.
• A maintenance expert
• This team member focuses on maintainability, flexibility and testability issues, and detects
design defects that could make the correction of bugs or the performance of future changes
difficult.
• Another focus is documentation, whose completeness and correctness are vital for
any maintenance activity.
• A user representative
• Participation of a user representative in the walkthrough team contributes to the review’s
validity because he or she examines the software system from the point of view of the user–
consumer rather than the designer–supplier.

Walkthrough Session
• The walkthrough session opens with the author’s short presentation or overview of
the project and the design sections to be reviewed
• Reviewers present comments, possible defects and improvement suggestions
• The scribe/recorder takes notes of all the defects and suggestions raised during the
walkthrough meeting
• Based on reviewer comments, the author performs rework of the product
where necessary
• At the end of a session, copies of the error documentation – the “walkthrough
session findings report” – should be handed to the development team and the
session participants.

Sample Design Walkthrough

[Figure: sample design walkthrough session – not reproduced in this text version]
Inspection vs. Walkthroughs

[Table: comparison of inspection and walkthrough participants and processes – not reproduced in this text version]
Expert Opinion
• Expert opinions, prepared by outside experts, support quality
evaluation by introducing additional capabilities to the internal
review staff.

• Outside experts transmit their expertise by either:
• Preparing an expert’s judgment about a document or a
code section, or
• Participating as a member of an internal design review,
inspection or walkthrough team.
Expert Opinion (cont’d)
• Situations in which an expert’s participation in reviews may become
necessary:
• Insufficient in-house professional capabilities in a
specialized area.
• Temporary lack of in-house professionals for the review team.
• Indecisiveness caused by major disagreements among the
organization’s senior professionals.
• Small organizations, where the number of suitable
candidates for a review team is insufficient.
Technical Review
◦ A technical review is carried out by a trained moderator or technical expert.
◦ It is less formal than an inspection.
◦ It is usually carried out as a peer review, without any participation from
management.
◦ A team of peers reviews the technical specification of the software
product and checks whether it is suitable for the project.
◦ The team tries to find any inconsistencies in the specifications and the standards followed.
◦ This review concentrates mainly on the technical documents related to the software,
such as test strategy, test plan and requirement specification documents.

Management Review

◦ The management review is a cross-functional review
undertaken by an organization’s top management.
◦ It includes analyses of
◦ customer satisfaction,
◦ determination of the cost of poor quality,
◦ performance trends, and
◦ achievement of objectives defined in the business plan.
Audit

◦ An audit is an independent examination of all the documents related
to the software, carried out by an external agent.

◦ Audits provide increased assurance to the various stakeholders of the
project that the documents are up to standards and devoid of any defects.
Verification and Validation

• IEEE Std 610.12-1990 (IEEE, 1990) defines these aspects as follows:

• Verification – “The process of evaluating a system or
component to determine whether the products of a given
development phase satisfy the conditions imposed at the start
of that phase”

• Validation – “The process of evaluating a system or component
during or at the end of the development process to determine
whether it satisfies specified requirements”
Verification and Validation (cont’d)

• Verification examines the consistency of the products being developed with
products developed in previous phases.
• Validation represents the customer’s interest by examining the extent of
compliance with his or her original requirements.
• Comprehensive validation reviews tend to improve customer satisfaction with the
system.
• Verification – Are we building the product right?
• The software should conform to its specification.
• Validation – Are we building the right product?
• The software should do what the user really requires.
Verification & Validation Process

• Is a whole life-cycle process - V & V must be applied at each stage in the


software process.

• Has two principal objectives:


• The discovery of defects in a system
• The assessment of whether or not the system is useful and useable in an
operational situation

41
Goals of V & V
• Verification and validation should establish confidence that
the software is fit for purpose
• This does NOT mean completely free of defects
• Rather, it must be good enough for its intended use and the
type of use will determine the degree of confidence that is
needed

Verification and Validation Confidence
• Depends on system’s purpose, user expectations and
marketing environment
• Software function
• The level of confidence depends on how critical the
software is to an organisation.
• User expectations
• Users may have low expectations of certain kinds of
software.
• Marketing environment
• Getting a product to market early may be more
important than finding defects in the program.

Verification – Static and Dynamic Verification

• Software inspections – Concerned with analysis of the static system
representation to discover problems (static verification)
• May be supplemented by tool-based document and code
analysis

• Software testing – Concerned with exercising and
observing product behaviour (dynamic verification)
• The system is executed with test data and its operational
behaviour is observed
Summary of SQA Techniques
• Formal Design Reviews
• conducted by senior personnel or outside experts
• uncover potential problems
• Inspections and Walkthroughs
• done by peers
• detect errors, deviations from standards, etc.
• Expert Opinion
• an external expert’s opinion on the design work
Comparison of Review Methods

[Table: comparison of the review methods – not reproduced in this text version]
SQA Plan – 1
• Management section
• describes the place of SQA in the structure of the
organization
• Documentation section
• describes each work product produced as part of
the software process
• Standards, practices, and conventions section
• lists all applicable standards/practices applied
during the software process and any metrics to be
collected as part of the software engineering work

SQA Plan – 2

• Reviews and audits section
• provides an overview of the approach used in the
reviews and audits to be conducted during the
project
• Test section
• references the test plan and procedure document
and defines test record-keeping requirements
SQA Plan – 3

• Problem reporting and corrective action section
• defines procedures for reporting, tracking, and
resolving errors or defects, and identifies organizational
responsibilities for these activities
• Other
• tools, SQA methods, change control, record
keeping, training, and risk management
