
Software Quality Assurance

Basic concepts
• Software quality assurance (SQA) consists of means of monitoring the software engineering processes and methods used to ensure quality.
It does this through audits of the quality management system under which the software system is created.
These audits are backed by one or more standards, usually ISO 9000 or CMMI.
• It is practically impossible to iron out every single bug before releasing software, both because of the difficulty involved and because of time constraints.

QA, QC and Testing
• Quality Assurance measures the quality of the processes used to create a quality product
 A set of activities designed to ensure that the development and/or maintenance process is adequate to ensure a system will meet its objectives.
 QA activities ensure that the process is defined and appropriate.
 Methodology and standards development are examples of QA activities.
 A QA review would focus on the process elements of a project.
• Quality Control measures the quality of a product
 A set of activities designed to evaluate a developed work product.
 QC activities focus on finding defects in specific deliverables.
• Testing is part of quality control
 The process of executing a system with the intent of finding defects.
 Includes test planning prior to the execution of the test cases.
 Testing is one example of a QC activity; others include inspections.
• Both QA and QC activities are generally required for successful software development.

SQA jobs
• SQA includes:
 Reviewing requirements documents
 Software testing
• SQA encompasses the entire software development process:
 Software design
 Coding
 Source code control
 Code reviews
 Change management
 Configuration management
 Release management

Quality control vs. QA
• Software quality control is a control of products
• Software quality assurance is a control of processes
 Related to the practice of quality assurance in product manufacturing
• Software vs. manufactured product
 A manufactured product is physical and can be seen.
 A software product is not visible.
 A manufactured product rolls off the assembly line as an essentially complete, finished product.
 Software is never finished.
• The processes and methods used to manage, monitor, and measure the ongoing quality of software are as fluid, and sometimes as elusive, as the defects they are meant to keep in check.

SQA Methodology
• PPQA audits: Process and Product Quality Assurance
The activity of ensuring that the process and the work products conform to the agreed-upon process.

Quality control activities
• Peer reviews
 Peer reviews of a project's work products are the most efficient defect removal (quality control) activity.
• Validation testing
 The act of entering data that the tester knows to be erroneous into an application.
 For instance, typing "Hello" into an edit box that is expecting a numeric entry (see the sketch after this list).
• Data comparison
 Comparing the output of an application run with specific parameters to a previously created set of data, produced with the same parameters, that is known to be accurate.
• Stress testing
 A stress test uses the software as heavily as possible for a period of time to see whether it copes with high levels of load.
 Often used for server software that will have multiple users connected simultaneously.
 Also known as destruction testing.
• Usability testing
 Getting users who are unfamiliar with the software to try it for a while and report what they found difficult to do is sometimes the best way of improving a user interface.
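A minimal sketch (not from the slides) of what a validation test and a data comparison might look like in code; the parse_quantity function and the expected values are hypothetical.

```python
import pytest

def parse_quantity(text: str) -> int:
    """Hypothetical function under test: parses a numeric entry field."""
    if not text.isdigit():
        raise ValueError(f"not a number: {text!r}")
    return int(text)

def test_validation_rejects_non_numeric_input():
    # Validation testing: deliberately enter data known to be erroneous,
    # e.g. "Hello" into a field that expects a numeric entry.
    with pytest.raises(ValueError):
        parse_quantity("Hello")

def test_data_comparison_against_known_good_output():
    # Data comparison: compare the output for fixed parameters against a
    # previously created set of results that is known to be accurate.
    known_good = {"10": 10, "042": 42, "7": 7}
    actual = {text: parse_quantity(text) for text in known_good}
    assert actual == known_good
```
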
Advantages of SQA
• Improved customer satisfaction
• Reduced cost of development
• Reduced cost of maintenance

Testing lifecycle and phases (1)
• Test Requirements
• Test Planning
• Test Environment Setup
• Test Design
• Test Automation
• Test Execution and Defect Tracking
• Test Reports and Acceptance

Testing lifecycle and phases (2), (3)

Positive vs. Negative testing
• Positive testing:
Checking that the application does something it is supposed to do
• Negative testing:
Checking how the application behaves when asked to do something it is not supposed to do
Simply put, testing the application beyond and below its limits
• Example (sketched below)
A password field that requires a minimum of 8 characters: testing it with 6 characters is negative testing
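A minimal sketch of the password example (the is_valid_password helper is hypothetical): the first test is positive, the second negative.

```python
def is_valid_password(password: str) -> bool:
    # Hypothetical rule from the example: at least 8 characters.
    return len(password) >= 8

def test_positive_accepts_8_character_password():
    # Positive testing: the application does what it is supposed to do.
    assert is_valid_password("abcdefgh")

def test_negative_rejects_6_character_password():
    # Negative testing: push the input below the documented limit.
    assert not is_valid_password("abcdef")
```
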
Capability Maturity Model (CMM)
• A model of the maturity of the capability of certain business processes.
 A maturity model can be described as a structured collection of elements that describe certain aspects of maturity in an organization.
 It aids in the definition and understanding of an organization's processes.
• Level 1 - Ad hoc (Chaotic)
 Processes are (typically) undocumented and in a state of dynamic change, tending to be driven in an ad hoc, uncontrolled and reactive manner by users or events.
• Level 2 - Repeatable
 Some processes are repeatable, possibly with consistent results.
• Level 3 - Defined
 Sets of defined and documented standard processes are established and subject to some degree of improvement over time.
• Level 4 - Managed
 Using process metrics, management can effectively control the as-is process (e.g., for software development).
• Level 5 - Optimized
 The focus is on continually improving process performance through both incremental and innovative technological changes/improvements.

Test case methodologies
• Testing methodologies are different from test case methodologies
• Decision Tables
 Like if-then-else and switch-case statements, decision tables associate conditions with actions to perform.
• Cause-Effect Graphs
 A directed graph that maps a set of causes to a set of effects. The causes may be thought of as the input to the program, and the effects as the output.
• Equivalence Class Partitioning (ECP)
 Divides the input domain of the software to be tested into a finite number of partitions, or equivalence classes.
 The method can be used to partition the output domain as well, but this is not commonly done.
• Boundary Value Analysis (BVA)
 Finds whether the application accepts the expected range of values and rejects values that fall outside the range (see the sketch after this list).
• Error Guessing
 A design technique based on the tester's ability to draw on past experience, knowledge and intuition to predict where bugs will be found in the software under test.
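As a concrete illustration of ECP and BVA, here is a small sketch that assumes a hypothetical accept_age function specified to accept ages from 18 to 60 inclusive.

```python
def accept_age(age: int) -> bool:
    """Hypothetical function under test: valid ages are 18..60 inclusive."""
    return 18 <= age <= 60

def test_equivalence_class_partitioning():
    # ECP: one representative value per partition of the input domain.
    partitions = {"below range": (10, False),
                  "in range": (35, True),
                  "above range": (70, False)}
    for name, (value, expected) in partitions.items():
        assert accept_age(value) == expected, name

def test_boundary_value_analysis():
    # BVA: values at and just around each boundary of the accepted range.
    boundaries = {17: False, 18: True, 19: True, 59: True, 60: True, 61: False}
    for value, expected in boundaries.items():
        assert accept_age(value) == expected, value
```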

Verification vs. Validation
• The process of checking that a product, service, or system meets specifications and that it fulfils its intended purpose
• Validation → Are you building the right thing?
 The process of establishing documented evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended requirements.
 This often involves acceptance and suitability with external customers.
 High-level activity: correctness of the final software product.
• Verification → Are you building the thing right?
 A quality process used to evaluate whether or not a product, service, or system complies with a regulation, specification, or conditions imposed at the start of a development phase.
 Verification can be in development, scale-up, or production.
 This is often an internal process.
 Low-level activity: consistency, completeness, and correctness of the software at each stage.

Black / white / gray box testing
• Gray-box testing
 Tests designed based on knowledge of algorithms, internal states, architectures, or other high-level descriptions of the program behavior.
 Involves having access to internal data structures and algorithms for the purpose of designing the test cases, but testing at the user, or black-box, level.
 Particularly important when conducting integration testing between two modules of code written by two different developers.
• White-box testing
 Uses an internal perspective of the system to design test cases based on internal structure (a small sketch follows this list).
 Requires programming skills to identify all paths through the software.
 Applicable at the unit, integration and system levels of the software testing process.
• Black-box testing
 Uses an external perspective of the test object to derive test cases.
 These tests can be functional or non-functional.
 The test designer selects valid and invalid input and determines the correct output.
 There is no knowledge of the test object's internal structure.
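A small sketch contrasting the two perspectives (the classify_triangle function is hypothetical): the white-box test is chosen to execute every branch of the implementation, while the black-box test is derived only from the stated requirement.

```python
def classify_triangle(a: int, b: int, c: int) -> str:
    """Hypothetical unit under test."""
    if a <= 0 or b <= 0 or c <= 0:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

def test_white_box_every_branch():
    # White box: cases derived from the internal structure, one per path.
    assert classify_triangle(0, 1, 1) == "invalid"
    assert classify_triangle(3, 3, 3) == "equilateral"
    assert classify_triangle(3, 3, 2) == "isosceles"
    assert classify_triangle(3, 4, 5) == "scalene"

def test_black_box_from_specification():
    # Black box: valid and invalid inputs chosen from the specification only,
    # with no knowledge of the internal structure.
    assert classify_triangle(5, 5, 5) == "equilateral"
    assert classify_triangle(-1, 2, 2) == "invalid"
```
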
Non-functional Software Testing
• Performance testing
Checks whether the software can handle large quantities of data or users.
This is generally referred to as software scalability.
• Usability testing
Needed to check whether the user interface is easy to use and understand.
• Security testing
Essential for software that processes confidential data, to prevent system intrusion by hackers.
• Internationalization and localization
Needed to test these aspects of software; a pseudolocalization method can be used (a toy sketch follows this list).
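A toy sketch of the pseudolocalization idea: UI strings are turned into accented, padded look-alikes so that truncation, clipped layouts and hard-coded text show up without waiting for a real translation.

```python
# Map ASCII vowels to accented look-alikes so untranslated strings stand out.
ACCENTED = str.maketrans("aeiouAEIOU", "àéîöûÀÉÎÖÛ")

def pseudolocalize(text: str) -> str:
    # Pad by roughly a third to mimic longer translations, and bracket the
    # string so clipped or concatenated text is visible at a glance.
    padded = text.translate(ACCENTED) + "~" * max(1, len(text) // 3)
    return f"[{padded}]"

print(pseudolocalize("Save changes"))  # [Sàvé chàngés~~~~]
```
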
Regression testing
• After modifying software, either for a change in functionality or to fix defects, a regression test re-runs previously passing tests on the modified software to ensure that the modifications haven't unintentionally caused a regression of previous functionality (see the sketch below).
• Regression testing can be performed at any or all of the above test levels.
• These regression tests are often automated.
 More specific forms of regression testing are known as sanity testing, when quickly checking for bizarre behavior, and smoke testing, when testing for basic functionality.
 Benchmarks may be employed during regression testing to ensure that the performance of the newly modified software is at least as acceptable as the earlier version or, in the case of code optimization, that some real improvement has been achieved.
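A minimal sketch of a regression test; the apply_discount function and the defect it once had are hypothetical. The previously passing cases stay in the suite and are re-run after every modification.

```python
def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function that was just modified to fix a defect."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (1 - percent / 100), 2)

def test_regression_previous_behaviour_still_holds():
    # These cases passed before the change; re-running them guards against
    # an unintended regression of existing functionality.
    assert apply_discount(100.0, 0) == 100.0
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 10) == 17.99

def test_defect_fix():
    # The new case that reproduces the defect the modification was meant to fix.
    assert apply_discount(100.0, 100) == 0.0
```
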
System testing
• Testing conducted on a complete, integrated system (software or hardware) to evaluate the system's compliance with its specified requirements
• A form of black-box testing
• Aims to detect any inconsistencies
between the software units that are integrated together (called assemblages)
between any of the assemblages and the hardware

Test strategy vs. Test plan
• Test strategy
A company-level document that describes the overall approach to testing.
It also covers the scope, business issues, test deliverables, tools used, risk analysis, etc.
• Test plan
A document that describes what to test, when to test, how to test and who will test.
The test plan document is typically prepared by the test lead.

Automated testing
• The use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes, the setting up of test preconditions, and other test control and test reporting functions (a minimal sketch follows)
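
A minimal sketch using Python's built-in unittest framework: it sets up the test preconditions, controls execution, compares actual outcomes to predicted outcomes, and reports the results. The TemperatureConverter class is hypothetical.

```python
import unittest

class TemperatureConverter:
    """Hypothetical unit under test."""
    def to_fahrenheit(self, celsius: float) -> float:
        return celsius * 9 / 5 + 32

class TemperatureConverterTests(unittest.TestCase):
    def setUp(self):
        # Test precondition, set up automatically before every test.
        self.converter = TemperatureConverter()

    def test_freezing_point(self):
        # Comparison of the actual outcome to the predicted outcome.
        self.assertEqual(self.converter.to_fahrenheit(0), 32)

    def test_boiling_point(self):
        self.assertEqual(self.converter.to_fahrenheit(100), 212)

if __name__ == "__main__":
    unittest.main()  # Controls execution and reports the results.
```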

IEEE 829
• IEEE 829: Standard for Software Test Documentation
A set of documents for use in eight defined stages of software testing, each stage potentially producing its own separate type of document:
Test Plan: a management planning document
Test Design Specification
Test Case Specification
Test Procedure Specification
Test Item Transmittal Report
Test Log
Test Incident Report
Test Summary Report

ISO 9003
• A quality assurance model for final inspections and testing

Software walkthrough
• A form of software peer review
A designer or programmer leads members of the development team and other interested parties through a software product,
and the participants ask questions and make comments about possible errors, violations of development standards, and other problems.

Change Management
• The process of requesting, determining the attainability of, planning, implementing and evaluating changes to a system
• Changes in the IT infrastructure may arise reactively in response to problems or to externally imposed requirements

Configuration Management
• A field of management that focuses on establishing and maintaining consistency of a product's performance and its functional and physical attributes with its requirements, design, and operational information throughout its life
 Configuration identification
 Configuration control
 Configuration status accounting
 Configuration authentication

Version Control
• The management of multiple revisions of the same unit of information, e.g., application source code

Defect Tracking
• The process of finding defects in a product (by inspection, testing, or recording feedback from customers), and making new versions of the product that fix the defects

