SOFT3406 Week14
● Verification:
"Are we building the product right?"
▪ The software should conform to its
specification.
● Validation:
"Are we building the right product?"
▪ The software should do what the user really
requires.
3
The V & V process
● Is a whole life-cycle process - V & V must be
applied at each stage in the software process.
6
Software testing
● Can reveal the presence of errors NOT their
absence
● A successful test is a test which discovers one
or more errors
● The only validation technique for non-functional
requirements
● Should be used in conjunction with static
verification to provide full V&V coverage
Testing and
debugging
● Defect testing and debugging are distinct
processes
● Verification and validation are concerned with
establishing the existence of defects in a program
● Debugging is concerned with locating and
repairing these defects
● Debugging involves formulating a hypothesis
about program behaviour, then testing this
hypothesis to find the system error
The V-model of development
Software Inspections
● Software inspection involves examining the
source representation with the aim of discovering
anomalies and defects without executing the
system.
● They may be applied to any representation of the
system (requirements, design, configuration data,
test data, etc.).
● Corresponds to static analysis of the software
source code.
10
Inspections and Testing
● Inspections and testing are complementary and not
opposing verification techniques.
● Inspections can check conformance with a specification
but not conformance with the customer’s real
requirements.
● Inspections cannot check non-functional characteristics
such as performance, usability, etc.
● Management should not use inspections for staff
appraisal i.e. finding out who makes mistakes.
11
Inspection Procedure
● System overview presented to inspection team.
● Code and associated documents are
distributed to inspection team in advance.
● Inspection takes place and discovered errors
are noted.
● Modifications are made to repair errors.
● Re-inspection may or may not be required.
● Checklist of common errors should be used to
drive the inspection. Examples: Initialization, Constant
naming, loop termination, array bounds…
12
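A hypothetical Java fragment (names ours, not from the deck) showing two of the checklist items above — loop termination and array bounds — combined in a single off-by-one defect, together with the repair an inspection would propose:

```java
public class ChecklistDemo {
    // Buggy version: the loop condition uses <= so the final iteration
    // reads one element past the end of the array.
    // Checklist items triggered: loop termination, array bounds.
    static int sumBuggy(int[] a) {
        int sum = 0;
        for (int i = 0; i <= a.length; i++) {
            sum += a[i]; // throws ArrayIndexOutOfBoundsException when i == a.length
        }
        return sum;
    }

    // Repaired version found by inspection: strict < bound.
    static int sum(int[] a) {
        int sum = 0;
        for (int i = 0; i < a.length; i++) {
            sum += a[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[]{1, 2, 3})); // 6
    }
}
```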
Inspection Roles
13
Inspection Checks
14
Inspection Checks (cont’d)
15
Stages of Static Analysis
● Control flow analysis. Checks for loops with
multiple exit or entry points, finds unreachable
code, etc.
● Data flow analysis. Detects uninitialised
variables, variables written twice without an
intervening assignment, variables which are
declared but never used, etc.
● Interface analysis. Checks the consistency of
routine and procedure declarations and their
use
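A minimal Java sketch (hypothetical names) containing the kinds of anomalies a data-flow analyser reports; the code compiles and runs, which is exactly why a compiler alone does not catch these:

```java
public class DataFlowDemo {
    static int compute(int x) {
        int unused = 42;   // declared but never used: data-flow anomaly
        int y = x * 2;
        y = x + 1;         // y written twice with no intervening read: anomaly
        return y;
    }

    public static void main(String[] args) {
        System.out.println(compute(3)); // 4
    }
}
```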
Stages of Static Analysis (cont’d)
18
LINT Static Analysis Example
Lint is an automated source-code checker for programmatic and stylistic errors. A tool of this
kind, known as a linter, analyzes C code without executing it.
19
Use of static analysis
● Particularly valuable when a language such as C
is used which has weak typing and hence many
errors are undetected by the compiler
● Less cost-effective for languages like Java that
have strong type checking and can therefore
detect many errors during compilation
Black-Box Testing
● Also known as Behavioral Testing.
● Involves testing a system with no
prior knowledge of its internal workings.
● The testing process is carried out by
supplying the software with appropriate
inputs and observing the outputs.
● The objective is to identify how the
system responds to expected and
unexpected user actions, and to expose
its response time, usability issues and reliability issues.
21
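A small illustration (our own example, not from the deck): the tester knows only the specification "returns true iff the year is a leap year" and derives inputs from it, never looking at the implementation:

```java
public class BlackBoxDemo {
    // System under test: the tester sees only the spec, not this body.
    static boolean isLeapYear(int year) {
        return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
    }

    public static void main(String[] args) {
        // Expected, everyday inputs
        System.out.println(isLeapYear(2024));   // true
        System.out.println(!isLeapYear(2023));  // true
        // Tricky inputs derived from the spec, not from the code
        System.out.println(!isLeapYear(1900));  // century year: not leap
        System.out.println(isLeapYear(2000));   // divisible by 400: leap
    }
}
```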
White-Box Testing
23
Object Class Testing
● Complete test coverage of a class involves
• Testing all operations associated with an object;
• Setting and interrogating all object attributes;
• Exercising the object in all possible states.
● Inheritance makes it more difficult to design
object class tests as the information to be tested
is not localised.
24
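The three coverage points above can be sketched on a deliberately tiny class (a hypothetical example of ours): every operation is called, the attribute is set and interrogated, and the object is driven through both of its states:

```java
// Minimal stateful class with two states: off and on.
class ToggleSwitch {
    private boolean on = false;          // attribute

    void turnOn()  { on = true; }        // operations
    void turnOff() { on = false; }
    boolean isOn() { return on; }        // interrogate the attribute
}

public class ObjectClassTestDemo {
    public static void main(String[] args) {
        ToggleSwitch s = new ToggleSwitch();
        // Exercise the object in all possible states: off -> on -> off
        System.out.println(!s.isOn());
        s.turnOn();
        System.out.println(s.isOn());
        s.turnOff();
        System.out.println(!s.isOn());
    }
}
```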
Interface Testing
● Objectives are to detect faults due to interface
errors or invalid assumptions about interfaces.
● Particularly important for object-oriented
development as objects are defined by their
interfaces.
25
Interface Types
● Parameter interfaces
• Data passed from one procedure to another.
● Shared memory interfaces
• Block of memory is shared between procedures or functions.
● Procedural interfaces
• Sub-system encapsulates a set of procedures to be called by
other sub-systems.
● Message passing interfaces
• Sub-systems request services from other sub-system.
26
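A sketch of a classic parameter-interface fault (hypothetical names): both parameters are ints, so a caller who makes an invalid assumption about their order still compiles cleanly, and only interface testing exposes the error:

```java
public class InterfaceErrorDemo {
    // Parameter interface: callers must pass (amountCents, people) in this order.
    static int shareCents(int amountCents, int people) {
        return amountCents / people;
    }

    public static void main(String[] args) {
        int amountCents = 1000, people = 4;
        // Intended call
        System.out.println(shareCents(amountCents, people)); // 250
        // Swapped arguments: compiles without complaint, result is silently wrong
        System.out.println(shareCents(people, amountCents)); // 0
    }
}
```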
System Testing
● Involves integrating components to create a
system or sub-system.
● May involve testing an increment to be delivered
to the customer.
● Two phases:
• Integration testing - the test team have access to the
system source code. The system is tested as
components are integrated.
• Release testing - the test team test the complete
system to be delivered as a black-box.
27
Incremental Integration Testing
28
Release Testing
● The process of testing a release of a system that
will be distributed to customers.
● Primary goal is to increase the supplier’s
confidence that the system meets its
requirements.
● Release testing is usually black-box or functional
testing
• Based on the system specification only;
• Testers do not have knowledge of the system
implementation.
29
Stress Testing
● Exercises the system beyond its maximum design load.
Stressing the system often causes defects to
come to light.
● Stressing the system tests its failure behaviour. Systems
should not fail catastrophically; stress testing checks for
unacceptable loss of service or data.
● Stress testing is particularly relevant to distributed
systems that can exhibit severe degradation as a
network becomes overloaded.
● There is a slight difference between stress testing and
performance testing
○ Stress testing pushes the system beyond its dimensioned limits,
while performance testing assesses system behaviour when it is
working within those limits, but very close to them.
30
Test Case Design
● Involves designing the test cases (inputs and
outputs) used to test the system.
● The goal of test case design is to create a set of
tests that are effective in validation and defect
testing.
● Design approaches:
• Requirements-based testing (i.e. trace test cases to
the requirements)
• Partition based testing,
• Boundary value testing,
• Path based testing.
31
Partition Testing
● Input data and output results often fall into
different classes where all members of a class
are related.
● Each of these classes is an equivalence partition
or domain where the program behaves in an
equivalent way for each class member.
● Test cases should be chosen from each partition.
32
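A sketch under an assumed specification (exam marks 0..100 are valid — our example, not the deck's): the input space splits into three equivalence partitions, and one representative test case is chosen from each:

```java
public class PartitionDemo {
    // Assumed spec: a mark is valid iff it lies in 0..100.
    // Partitions: mark < 0 (invalid), 0..100 (valid), mark > 100 (invalid).
    static boolean isValidMark(int mark) {
        return mark >= 0 && mark <= 100;
    }

    public static void main(String[] args) {
        // One representative per partition
        System.out.println(!isValidMark(-5));  // invalid low partition
        System.out.println(isValidMark(50));   // valid partition
        System.out.println(!isValidMark(150)); // invalid high partition
    }
}
```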
Boundary Value Testing
● Based on testing the boundary values of valid
and invalid partitions.
● The behavior at the edge of the equivalence
partition is more likely to be incorrect than the
behavior within the partition, so boundaries are
an area where testing is likely to yield defects.
33
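Using the same kind of assumed specification (marks 0..100 valid — a hypothetical example), boundary value testing places test cases at, and just either side of, each partition boundary:

```java
public class BoundaryDemo {
    // Assumed spec: a mark is valid iff it lies in 0..100.
    static boolean isValidMark(int mark) {
        return mark >= 0 && mark <= 100;
    }

    public static void main(String[] args) {
        // Test on and just either side of each boundary
        System.out.println(!isValidMark(-1));  // just below lower boundary
        System.out.println(isValidMark(0));    // lower boundary itself
        System.out.println(isValidMark(100));  // upper boundary itself
        System.out.println(!isValidMark(101)); // just above upper boundary
    }
}
```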
Path Testing
● The objective of path testing is to ensure that the
set of test cases is such that each path through
the program is executed at least once.
● The starting point for path testing is a program
flow graph that shows nodes representing
program decisions and arcs representing the flow
of control.
● Statements with conditions are therefore nodes in
the flow graph.
34
Binary Search Flow Graph
35
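A Java binary search of the kind such a flow graph models (a sketch; variable names are ours, and node numbering may differ from the deck's figure). Each loop and if condition becomes a decision node, and path testing aims to execute every path through them at least once:

```java
public class BinarySearch {
    // Returns the index of key in sorted array a, or -1 if absent.
    static int search(int[] a, int key) {
        int low = 0, high = a.length - 1;
        while (low <= high) {           // decision node: loop condition
            int mid = (low + high) / 2;
            if (a[mid] == key) {        // decision node: found?
                return mid;
            } else if (a[mid] < key) {  // decision node: search upper half?
                low = mid + 1;
            } else {
                high = mid - 1;
            }
        }
        return -1; // key not found
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5, 7, 9};
        System.out.println(search(a, 7)); // 3  (key present)
        System.out.println(search(a, 4)); // -1 (key absent)
    }
}
```

Test cases for the present-key, absent-key, and empty-array situations between them drive execution down the distinct paths of the graph.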
Independent Paths
● 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 14
● 1, 2, 3, 4, 5, 14
● 1, 2, 3, 4, 5, 6, 7, 11, 12, 5, …
● 1, 2, 3, 4, 6, 7, 2, 11, 13, 5, …
● Test cases should be derived so that all of these
distinct paths are executed
● A dynamic program analyser may be used to
check that paths have been executed
36
Test Automation
● Testing is an expensive process phase. Testing
workbenches provide a range of tools to reduce the time
required and total testing costs.
● Systems such as Junit support the automatic execution
of tests.
● Most testing workbenches are open systems because
testing needs are organisation-specific.
37
Junit as a Test Automation Framework
38
JUnit - Basic Usage
https://www.tutorialspoint.com/junit/index.htm

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class TestJunit {
   @Test
   public void testAdd() {
      assertEquals(4, 2 + 2);
   }
}

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;

public class TestRunner {
   public static void main(String[] args) {
      Result result = JUnitCore.runClasses(TestJunit.class);
      System.out.println(result.wasSuccessful());
   }
}
39