Unit 2


SQA COMPONENTS AND PROJECT LIFE CYCLE

SOFTWARE DEVELOPMENT METHODOLOGIES


The software development process models can be classified into four types:

The Software Development Life Cycle (SDLC) model (Waterfall Model)


The prototyping model
The spiral model
The object-oriented model.

The Software Development Life Cycle model

A (software/system) lifecycle model is a description of the sequence of activities carried out in an SE project, and the relative order of these activities.

Figure 2.1: Software lifecycle model

Requirements definition. The customers must define their requirements for the functionality of the software system to be developed. This phase defines the needed information, functions, behavior, performance and interfaces.
Analysis. It analyzes the requirements’ implications to form the initial software system model.

Design. This stage involves the detailed definition of the outputs, inputs and processing procedures,
including data structures and databases, software structure, etc.

Coding. In this phase, the design is translated into code. Coding involves quality assurance activities
such as inspections, unit tests and integration tests.
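To make these activities concrete, the following minimal sketch shows a unit test and a simple integration-style test written with Python's unittest module; the pricing functions are hypothetical examples, not part of any particular project.

```python
import unittest

# Hypothetical unit under test: a small pricing function.
def apply_discount(price, percent):
    """Return the price after deducting the given percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def total_with_discount(prices, percent):
    """Combine apply_discount over a list of prices."""
    return round(sum(apply_discount(p, percent) for p in prices), 2)

class UnitTests(unittest.TestCase):
    # Unit test: exercises a single function in isolation.
    def test_apply_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

class IntegrationTests(unittest.TestCase):
    # Integration-style test: exercises the cooperation of two units.
    def test_total_with_discount(self):
        self.assertEqual(total_with_discount([100.0, 50.0], 10), 135.0)

if __name__ == "__main__":
    unittest.main()
```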

System tests. System tests are performed once the coding phase is completed. The main goal of
testing is to uncover as many software errors as possible so as to achieve an acceptable level of
software quality, once corrections have been completed. System tests are carried out by the software
developer before the software is supplied to the customer.

In many cases the customer performs independent software tests (“acceptance tests”) to assure him or herself that the developer has fulfilled all the commitments and that no unexpected or faulty software behavior will occur.
Installation and conversion. After the software system is approved, the system is installed to serve as
firmware. If the new information system is to replace an existing system, a software conversion
process has to be initiated to make sure that the organization’s activities continue uninterrupted during
the conversion phase.

Regular operation and maintenance. Regular software operation begins once installation and
conversion have been completed. Throughout the regular operation period, which usually lasts for
several years or until a new software generation appears on the scene, maintenance is needed.
Maintenance incorporates three types of services: Corrective – repairing software faults identified by
the user during operation; Adaptive – using the existing software features to fulfill new requirements;
and Perfective – adding new minor features to improve software performance.
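As a rough illustration only, the hypothetical Python snippet below shows how the three maintenance types might touch the same piece of code; the billing functions and their history are invented for the example.

```python
# Corrective maintenance: an earlier release summed item prices but omitted the
# tax, a fault reported by users during regular operation; the fix adds it back.
def invoice_total(items, tax_rate):
    subtotal = sum(price * quantity for price, quantity in items)
    return subtotal * (1 + tax_rate)

# Adaptive maintenance: the existing feature is used to fulfill a new
# requirement (invoices in a foreign currency) without changing its core logic.
def invoice_total_in_currency(items, tax_rate, exchange_rate):
    return invoice_total(items, tax_rate) * exchange_rate

# Perfective maintenance: a minor new feature (batch totals) added to improve
# the software's usefulness for large runs.
def batch_invoice_totals(batches, tax_rate):
    return [invoice_total(items, tax_rate) for items in batches]
```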

The number of phases can vary according to the characteristics of the project. In complex, large-scale projects, some phases are split, causing their number to grow to eight, nine or more. In smaller projects, some phases may be merged, reducing the number of phases to six, five or even four.

Waterfall Strengths

• Easy to understand, easy to use

• Provides structure to inexperienced staff

• Milestones are well understood

• Sets requirements stability

• Good for management control (plan, staff, track)

• Works well when quality is more important than cost or schedule


When to use the Waterfall Model

• Requirements are very well known

• Product definition is stable

• Technology is understood

• New version of an existing product

• Porting an existing product to a new platform.


Disadvantages

• Idealized, doesn’t match reality well.

• Doesn’t reflect iterative nature of exploratory development.

• Unrealistic to expect accurate requirements so early in project

• Software is delivered late in project, delays discovery of serious errors.


PROTOTYPE MODEL

Prototyping is the process of quickly putting together a working model (a prototype) in order to test
various aspects of a design, illustrate ideas or features and gather early user feedback. A typical
application of the prototyping methodology is shown in Figure 2.2.

Figure 2.2: Methodology for Prototype Model

• Developers build a prototype during the requirements phase

• Prototype is evaluated by end users

• Users give corrective feedback

• Developers further refine the prototype


• When the user is satisfied, the prototype code is brought up to the standards needed for a
final product.

Advantages

 Users can try the system and provide constructive feedback during development
 An operational prototype can be produced in weeks
 Prototyping enables early detection of errors

Disadvantages

 Uncertain design ideas


 Information can be lost through so many improvement changes
 Difficult to know how long project will last

Spiral model
According to the spiral model, shown in Figure, software development is perceived to be an iterative
process; at each iteration, the following activities are performed:

 Planning

 Risk analysis and resolution

 Engineering activities according to the stage of the project: design, coding, testing,
installation and release
 Customer evaluation, including comments, changes and additional requirements, etc.
Figure 2.3: Spiral Model

Planning Phase: Requirements are gathered during this planning phase. Requirements such as the BRS (Business Requirement Specification) and the SRS (System Requirement Specification) are collected in this phase.

Risk Analysis: In the risk analysis phase, a process is undertaken to identify risk and
alternate solutions. A prototype is produced at the end of the risk analysis phase. If any risk
is found during the risk analysis then alternate solutions are suggested and implemented.

Engineering Phase: In this phase the software is developed, with testing at the end of the phase. Hence, both development and testing are done in this phase.

Evaluation phase: This phase allows the customer to evaluate the output of the project to
date, before the project continues to the next spiral.

Advantages:

1. Realism: the model accurately reflects the iterative nature of software development on
projects with unclear requirements

2. Flexible: incorporates the advantages of the waterfall and rapid prototyping methods
3. Comprehensive model decreases risk

4. Good project visibility.

Disadvantages:

 Can be a costly model to use.

 Risk analysis requires highly specific expertise.

 Project’s success is highly dependent on the risk analysis phase.

 Doesn’t work well for smaller projects.

When to use Spiral model:

 When costs and risk evaluation is important

 For medium to high-risk projects

 Long-term project commitment unwise because of potential changes to economic priorities

 Users are unsure of their needs

 Requirements are complex

 New product line

 Significant changes are expected (research and exploration)

Example: Building a House

Incremental: Start with a modest house; keep adding rooms and upgrades to it.

Iterative: On each iteration, the house is re-designed and built anew.


Object oriented model

The object-oriented model shown in figure 2.4 differs from the other models by its
intensive reuse of software components. This methodology is characterized by its easy
integration of existing software modules (called objects or components) into newly developed
software systems. A software component library serves this purpose by supplying software
components for reuse.

The development process begins with a sequence of object-oriented analyses and designs.
The design phase is followed by acquisition of suitable components from the reusable software
library, when available; otherwise, “regular” development is carried out. Copies of newly
developed software components are then “stocked” in the software library for future reuse.
Figure 2.4: Object oriented Model

Advantages:

 Reuse
 Improved software-development productivity
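As an illustrative sketch of this reuse, the snippet below shows a hypothetical component taken from a reusable library being integrated, unchanged, into a newly developed system; the class and variable names are assumptions made for the example.

```python
from datetime import date

# A reusable component, assumed to have been "stocked" in the organization's
# software component library by an earlier project.
class DateRange:
    """Validated date range, reused across several information systems."""
    def __init__(self, start: date, end: date):
        if end < start:
            raise ValueError("end must not precede start")
        self.start, self.end = start, end

    def contains(self, day: date) -> bool:
        return self.start <= day <= self.end

# A newly developed system integrates the existing component instead of
# re-implementing date handling; only the new business logic is written here.
class AdmissionRecord:
    def __init__(self, patient_id: str, stay: DateRange):
        self.patient_id = patient_id
        self.stay = stay

record = AdmissionRecord("P-017", DateRange(date(2024, 3, 1), date(2024, 3, 5)))
print(record.stay.contains(date(2024, 3, 3)))  # True
```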

FACTORS AFFECTING INTENSITY OF QUALITY ASSURANCE ACTIVITIES IN THE DEVELOPMENT PROCESS

Project life cycle quality assurance activities are process oriented, in other words, linked to
completion of a project phase, accomplishment of a project milestone, and so forth. The quality
assurance activities will be integrated into the development plan that implements one or more software
development models – the waterfall, prototyping, spiral, object-oriented or other models.
The list of quality assurance activities needed for a project is then drawn up.

For each quality assurance activity, the plan specifies:

– Timing
– Type of quality assurance activity to be applied
– Who performs the activity and what resources are required. It should be noted that various bodies may participate in the performance of quality assurance activities: development team and department staff members, together with independent bodies such as external quality assurance team members or consultants

– Resources required for removal of defects and introduction of changes.
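Purely as a sketch, one entry of such a plan could be captured in a small record structure; every field name below is an assumption chosen for illustration, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class QAActivity:
    """One planned quality assurance activity in the development plan."""
    activity_type: str        # e.g. "design review", "inspection", "unit test"
    timing: str               # project phase or milestone that triggers the activity
    performed_by: list        # team members, external QA members, consultants
    activity_resources: str   # resources required to perform the activity itself
    defect_removal_resources: str  # resources reserved for removing defects and introducing changes

plan = [
    QAActivity("design review", "end of requirements definition",
               ["project leader", "external QA consultant"],
               "2 person-days", "3 person-days"),
]
```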

Factors affecting the intensity of quality assurance activities

Project factors:

 Magnitude of the project


 Technical complexity and difficulty
 Extent of reusable software components
 Severity of failure outcomes, if the project fails

Team factors

 Professional qualification of the team members


 Team acquaintance with the project and its experience in the area
 Availability of staff members who can professionally support the team
 Familiarity with the team members, in other words the percentage of new staff
members in the team

Example 1

The real-time software development unit of a hospital’s information systems department has
been assigned to develop an advanced patient monitoring system. The new monitoring unit is to
combine a patient’s room unit with a control unit. The patient’s room unit is meant to interface
with several types of medical equipment, supplied by different manufacturers, which measure
various indicators of the patient’s condition. A sophisticated control unit will be placed at the
nurses’ station, with data to be communicated to cellular units carried by doctors.

The project leader estimates that 14 months will be required to complete the system; a team
of five will be needed, with an investment of a total of 40 man-months. She estimates that only
15% of the components can be obtained from the reusable component library. The SDLC methodology was chosen, integrating two prototypes of the patient’s room unit and two prototypes of the control unit in order to improve communication with the users and enhance the feedback of comments at the analysis and design phases.

The main considerations affecting this plan are:

 High complexity and difficulty of the system

 Low percentage of reusable software available

 Large size of the project

 High severity of failure outcomes if the project fails.

The quality assurance activities defined by the project leader are listed in the table below.

Quality assurance activity

1 Design review of requirements definition


2 Design review of analysis of patient’s room unit
3 Design review of analysis of control unit
4 Design review of preliminary design
5 Inspection of design of patient’s room unit
6 Inspection of design of control unit
7 Design review of prototype of patient’s room unit
8 Design review of prototype of control unit
9 Inspection of detailed design for each software interface component
10 Design review of test plans for patient’s room unit and control unit
11 Unit tests of software code for each interface module of patient’s room unit
12 Integration test of software code of patient’s room unit
13 Integration test of software code of control unit
14 System test of completed software system
15 Design review of user’s manual

Verification, validation and qualification


Three aspects of quality assurance of the software product (a report, code, etc.) are
examined under the rubrics of verification, validation and qualification.

Verification – “The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.”

Validation – “The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.”

Qualification – “The process used to determine whether a system or component is suitable for operational use.”
 Verification examines the consistency of the products being developed with products
developed in previous phases. When doing so, the examiner follows the development
process and assumes that all the former development phases have been completed
correctly, whether as originally planned or after removal of all the discovered
defects. This assumption forces the examiner to disregard deviations from the customer’s
original requirements that might have been introduced during the development process.
 Validation represents the customer’s interest by examining the extent of compliance to
his or her original requirements. Comprehensive validation reviews tend to improve
customer satisfaction from the system.

 Planners are required to determine which of these aspects should be examined in each
quality assurance activity. The distinction between the two terms is largely to do with the
role of specifications. Validation is the process of checking whether the specification
captures the customer's needs, while verification is the process of checking that the
software meets the specification.

Verification versus Validation

1. Verification is a static practice of verifying documents, design, code and program, whereas validation is a dynamic mechanism of validating and testing the actual product.

2. Verification does not involve executing the code, whereas validation always involves executing the code.

3. Verification is human-based checking of documents and files, whereas validation is computer-based execution of the program.

4. Verification uses methods like inspections, reviews, walkthroughs and desk-checking, whereas validation uses methods like black box (functional) testing, gray box testing and white box (structural) testing.

5. Verification checks whether the software conforms to specifications, whereas validation checks whether the software meets the customer’s expectations and requirements.

6. Verification can catch errors that validation cannot catch (it is a low-level exercise), whereas validation can catch errors that verification cannot catch (it is a high-level exercise).

7. The target of verification is the requirements specification, application and software architecture, high-level and complete design, and database design, whereas the target of validation is the actual product – a unit, a module, a set of integrated modules, or the final product.

8. Verification is done by the QA team to ensure that the software is as per the specifications in the SRS document, whereas validation is carried out with the involvement of the testing team.

9. Verification generally comes first and is done before validation, whereas validation generally follows after verification.
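The contrast can also be sketched in code. In the hypothetical snippet below, the specification limits, the password function and the checks are all invented for illustration: the verification step only compares artifacts against the documented specification, while the validation step actually executes the product against the customer's requirement.

```python
# Hypothetical specification item: "the password must be 8-20 characters long".
SPEC_MIN_LEN, SPEC_MAX_LEN = 8, 20

def is_valid_password(password: str) -> bool:
    return SPEC_MIN_LEN <= len(password) <= SPEC_MAX_LEN

# Verification (static, no execution of the product): check that the code
# artifact is consistent with the product of the previous phase, e.g. by
# reviewing that the constants match the documented limits.
def verify_against_spec(documented_min: int, documented_max: int) -> bool:
    return (SPEC_MIN_LEN, SPEC_MAX_LEN) == (documented_min, documented_max)

# Validation (dynamic, executes the product): run the actual code against the
# customer's requirement ("reject too-short and too-long passwords").
def validate_requirement() -> bool:
    return (not is_valid_password("short")
            and is_valid_password("adequate-length")
            and not is_valid_password("x" * 30))

print(verify_against_spec(8, 20))  # True
print(validate_requirement())      # True
```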

REVIEW

“A process or meeting during which a work product, or set of work products, is presented to project
personnel, managers, users, customers, or other interested parties for comment or approval.”

Formal Technical Review Objectives

Direct objectives

 To detect analysis and design errors as well as subjects where corrections, changes and
completions are required with respect to the original specifications.

 To identify new risks likely to affect completion of the project.

 To locate deviations from templates, style procedures and conventions.


 To approve the analysis or design product.

Indirect objectives
 To provide an informal meeting place for exchange of professional knowledge about
development methods, tools and techniques.
 To record analysis and design errors that will serve as a basis for future corrective actions. The
corrective actions are expected to improve development methods by increasing effectiveness and
quality, among other product features.

Formal Technical Review:

A method involving a structured encounter in which a group of technical personnel analyzes or improves the quality of the original work product as well as the quality of the method. All design reviews (DRs) are conducted by a review leader and a review team.

Review leader

A review leader is expected to have the following characteristics:

 Knowledge and experience in development of projects of the type reviewed.


 A good relationship with the project leader and his team.
 A position external to the project team.

Review Team

The entire review team should be selected from among the senior members of the project team
together with appropriate senior professionals assigned to other projects and departments, customer–user
representatives, and in some cases, software development consultants. A review team of three to five
members is expected to be an efficient team, given the proper diversity of experience.

An excessively large team tends to create coordination problems, waste review session time and decrease the level of preparation, owing to a natural tendency to assume that others have read the design document.

Preparation for DR

Review leader preparations


The main tasks of the review leader in the preparation stage are:
 To appoint the team members
 To schedule the review sessions
 To distribute the design document among the team members (hard copy, electronic file, etc.).

Review team preparations

Team members are expected to review the design document and list their comments prior
to the review session. In cases where the documents are sizable, the review leader may ease the
load by assigning to each team member review of only part of the document. An important tool
for ensuring the review’s completeness is the checklist.
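As a rough sketch, such a checklist can be kept as simple structured data and scanned for open items before the session; the items below are generic examples, not a standard checklist.

```python
# A minimal design review checklist: item text -> whether the reviewer has
# confirmed it while reading the design document.
checklist = {
    "All interfaces in the requirements document appear in the design": True,
    "Every input has a defined validation rule": False,
    "Database tables cover all persistent data items": True,
    "Error handling is specified for each external interface": False,
}

def open_items(items: dict) -> list:
    """Return checklist items that are still unconfirmed and need discussion."""
    return [text for text, confirmed in items.items() if not confirmed]

for item in open_items(checklist):
    print("To raise in the review session:", item)
```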
Development team preparations
The team’s main obligation as the review session approaches is to prepare a short
presentation of the design document. Assuming that the review team members have read the
design document thoroughly and are now familiar with the project’s outlines, the presentation should
focus on the main professional issues awaiting approval rather than wasting time on description of the
project in general.

Agenda for DR session

 A short presentation of the design document.


 Comments made by members of the review team.
 Verification and validation of comments is discussed to determine the required action
items (corrections, changes and additions).
 Decisions about the design product (document), which determines the project's progress:
· Full approval.
· Partial approval.
· Denial of approval.
Preparation of the DR report

The report's major sections:

 A summary of the review discussions.


 The decision about continuation of the project.
 A full list of the required action items — corrections,
changes and additions. For each action item, completion
date and project team member responsible are listed.
 The name(s) of the review team member(s) assigned to
follow up.

Pressman's 13 golden guidelines for successful FDR

Design Review Infrastructure

 Develop checklists for the common types of design documents.
 Train senior professionals to serve as a reservoir for DR teams.
 Periodically analyze past DR effectiveness.
 Schedule the DRs as part of the project plan.
The Design Review Team

 Review team size should be limited, with 3–5 members being the optimum.
The Design Review Session

 Discuss professional issues in a constructive way, refraining from personalizing the issues.
 Keep to the review agenda.
 Focus on detection of defects by verifying and validating the
participants' comments. Refrain from discussing possible solutions.
 In cases of disagreement about an error - end the debate by
noting the issue and shifting its discussion to another forum.
 Properly document the discussed comments, and the results of
their verification and validation.
 The duration of a review session should not exceed two hours.
Post-Review Activities

 Prepare the review report, including the action items


 Establish a follow-up procedure to ensure the satisfactory performance of all the listed action items
Difference between FDR and Peer DR
Formal design reviews are authorized to approve the design document so that work on the next
stage of the project can begin. This authority is not granted to the peer reviews, whose main objectives lie
in detecting errors and deviations from standards.
Difference between Inspection and walkthrough- peer review
Software Inspection: A formal evaluation technique in which software requirements, design, or
code are examined in detail by a person or group other than the author to detect faults, violations
of development standards, and other problems. Software inspection is the most formal,
commonly used form of peer review. The key features of an inspection are the use of checklists
to facilitate error detection and the defined roles for participants.
Software Walkthrough: In its most usual form, a walkthrough is a step-by-step simulation of the
execution of a procedure, as when walking through code line by line with an imagined set of
inputs. The term has been extended to the review of material that is not procedural, such as data
descriptions, reference manuals, specifications, etc.

PEER REVIEW
Peer review is the evaluation of work by one or more people of similar competence to the
producers of the work. It constitutes a form of self-regulation by qualified members of a
profession within the relevant field.
Participants of peer reviews
The optimal peer review team is composed of three to five participants. In certain cases, the
addition of one to three further participants is acceptable.
A recommended peer review team includes:
 A review leader
 The author
 Specialized professionals.

The review leader

The characteristics expected of the leader:

 Be well versed in development of projects of the current type and familiar with its technologies.
Preliminary acquaintance with the current project is not necessary.
 Maintain good relationships with the author and the development team.
 Come from outside the project team.
 Display proven experience in coordination and leadership of professional meetings.
 For inspections, training as a moderator is also required.
The author
The author is invariably a participant in each type of peer review.
Specialized professionals

The specialized professionals participating in the two peer review methods differ by
review. For inspections, the recommended professionals are:

A designer: the systems analyst responsible for analysis and design of the software system
reviewed.
A coder or implementer: a professional who is thoroughly familiar with coding tasks, preferably the
leader of the designated coding team. This inspector is expected to contribute his or her expertise to the
detection of defects that could lead to coding errors and subsequent software implementation difficulties.

A tester: an experienced professional, preferably the leader of the assigned testing team, who focuses on identification of design errors usually detected during the testing phase.

For walkthroughs, the recommended professionals are:

A standards enforcer: This team member, who specializes in development standards and procedures, is
assigned the task of locating deviations from those standards and procedures.

A maintenance expert who is called upon to focus on maintainability, flexibility and testability issues
and to detect design defects capable of impeding correction of bugs or performance of future changes.
Another area requiring his or her expertise is documentation, whose completeness and correctness are
vital for any maintenance activity.
A user representative. Participation of an internal (when the customer is a unit in the same firm) or an external user’s representative in the walkthrough team contributes to the review’s validity, because he or she examines the software system from the point of view of the user–consumer rather than the designer–supplier. In cases where a “real” user is not available, a team member may take on that role and focus on validity issues by comparing the original requirements with the actual design.

Team assignments
Conducting a review session requires, naturally, assignment of specific tasks to the team
members. Two of these members are the presenter of the document and the scribe, who documents the
discussions.

The presenter: During inspection sessions, the presenter of the document is chosen by the moderator;
usually, the presenter is not the document’s author. In many cases the software coder serves as the
presenter because he or she is the team member who is most likely to best understand the design logic and
its implications for coding.
In contrast, for most walk-through sessions, it is the author, the professional most intimately
acquainted with the document, who is chosen to present it to the group. Some experts claim that an
author’s assignment as presenter may affect the group members’ judgement; therefore, they argue that the
choice of a “neutral” presenter is to be preferred.
The scribe: The team leader will often but not always serve as the scribe for the session, and record
the noted defects that are to be corrected by the development team. This task is more than procedural; it
requires thorough professional understanding of the issues discussed.

Peer review leader’s preparations for the review session


The main tasks of the review leader in the preparation stage are:
 To determine, together with the author, which sections of the design document are to be
reviewed.
Such sections can be:
o The most difficult and complex sections
o The most critical sections, where any defect can cause severe damage to the program
application and thus to the user
o The sections prone to defects.
 To select the team members.
 To schedule the peer review sessions.
 To distribute the document to the team members prior to the review session.

Peer review team’s preparations for the review session


Inspection team members are expected to read the document sections to be reviewed and list their
comments before the inspection session begins. This advance preparation is meant to guarantee the
session’s effectiveness. They will also be asked to participate in an overview meeting. At this meeting,
the author provides the inspection team members with the necessary relevant background for reviewing
the chosen document sections: the project in general, the logic, processes, outputs, inputs, and interfaces.
In cases where the participants are already well acquainted with the material, an overview meeting may
be waived. An important tool supporting the inspector’s review is a checklist.

Prior to the walkthrough session, team members briefly read the material in order to obtain a general
overview of the sections to be reviewed, the project and its environment. Participants lacking preliminary
knowledge of the project and its substantive area will need far more preparation time. In most
organizations employing walkthroughs, team participants are not required to prepare their comments in
advance.

The peer review session


A typical peer review session takes the following form.

 The presenter reads a section of the document and adds, if needed, a brief explanation of
the issues involved in his or her own words.
 As the session progresses, the participants either deliver their comments on the document
or address their reactions to the comments of others.
 The discussion should be confined to identification of errors, which means that it should
not deal with tentative solutions. During the session, the scribe should document each
recognized error by its location, description, type and character (incorrect, missing parts or
extra parts).

Session Documentation

Two documents are to be produced following an inspection session


Inspection session findings report: This report, produced by the scribe, could be completed and
distributed immediately after the session’s closing. Its main purpose is to assure full documentation of
identified errors for correction and follow up.
Inspection session summary report: This report is to be compiled by the inspection leader shortly
after the session or series of sessions dealing with the same document. A typical report of this type
summarizes the inspection findings and the resources invested in the inspection.
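A minimal sketch of the findings data collected by the scribe might look as follows; the field names and the sample record are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class InspectionFinding:
    """One error recorded by the scribe during an inspection session."""
    location: str     # e.g. document section, page or line of the reviewed design
    description: str  # what is wrong
    error_type: str   # e.g. "interface", "data definition", "logic"
    character: str    # "incorrect", "missing part" or "extra part"

# The inspection session findings report is essentially the list of recorded
# findings, distributed right after the session for correction and follow-up.
findings_report = [
    InspectionFinding("Section 3.2, p. 14",
                      "Input range for the heart-rate alarm is not defined",
                      "data definition", "missing part"),
]
```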
Participants and Process of Inspection and Walkthrough
The properties compared below are those of formal design reviews, inspections and walkthroughs.

Main direct objectives
– Formal design reviews: (1) detect errors, (2) identify new risks, (3) approve the design document
– Inspections: (1) detect errors, (2) identify deviations from standards
– Walkthroughs: detect errors

Main indirect objectives
– Formal design reviews: knowledge exchange
– Inspections: (1) knowledge exchange, (2) support corrective actions
– Walkthroughs: knowledge exchange

Review leader
– Formal design reviews: chief software engineer or senior staff member
– Inspections: trained moderator (peer)
– Walkthroughs: coordinator (peer, the project leader on occasion)

Participants
– Formal design reviews: top-level staff and customer representatives
– Inspections: peers
– Walkthroughs: peers

Project leader participation
– Formal design reviews: yes
– Inspections: yes
– Walkthroughs: yes, usually as the review’s initiator

Specialized professionals in the team
– Formal design reviews: yes
– Inspections: (1) designer, (2) coder or implementer, (3) tester
– Walkthroughs: (1) standards enforcer, (2) maintenance expert, (3) user representative

Overview meeting
– Formal design reviews: no
– Inspections: yes
– Walkthroughs: no

Participants’ preparations
– Formal design reviews: yes – thorough
– Inspections: yes – thorough
– Walkthroughs: yes – brief

Review session
– Formal design reviews: yes
– Inspections: yes
– Walkthroughs: yes

Follow-up of corrections
– Formal design reviews: yes
– Inspections: yes
– Walkthroughs: no

Infrastructure: formal training of participants
– Formal design reviews: no
– Inspections: yes
– Walkthroughs: no

Use of checklist
– Formal design reviews: no
– Inspections: yes
– Walkthroughs: no

Error-related data collection
– Formal design reviews: not formally required
– Inspections: formally required
– Walkthroughs: not formally required

Review documentation
– Formal design reviews: formal design review report
– Inspections: (1) inspection session findings report, (2) inspection session summary report
– Walkthroughs: walkthrough session findings report
