
STRATEGIC APPROACH TO SOFTWARE TESTING

The software engineering process can be viewed as a spiral. Initially system engineering
defines the role of software and leads to software requirement analysis where the
information domain, functions, behavior, performance, constraints and validation criteria
for software are established. Moving inward along the spiral, we come to design and
finally to coding. To develop computer software we spiral in along streamlines that
decrease the level of abstraction on each turn.
A strategy for software testing may also be viewed in the context of the spiral. Unit
testing begins at the vertex of the spiral and concentrates on each unit of the software as
implemented in source code. Testing progresses by moving outward along the spiral to
integration testing, where the focus is on the design and construction of the software
architecture. Taking another turn outward on the spiral we encounter validation
testing where requirements established as part of software requirements analysis are
validated against the software that has been constructed. Finally we arrive at system
testing, where the software and other system elements are tested as a whole.

[Figure: Testing phases. Unit testing and module testing constitute component testing;
sub-system testing and system testing constitute integration testing; acceptance
testing is user testing.]
UNIT TESTING

Unit testing focuses verification effort on the smallest unit of software design, the
module. The unit testing performed here is white-box oriented, and for some modules the
steps were conducted in parallel.

1. WHITE BOX TESTING

This type of testing ensures that


• All independent paths have been exercised at least once
• All logical decisions have been exercised on their true and false sides
• All loops are executed at their boundaries and within their operational bounds
• All internal data structures have been exercised to assure their validity.
Following the concept of white box testing, each form we created was tested
independently to verify that data flow is correct, all conditions are exercised to check
their validity, and all loops are executed at their boundaries.
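As an illustrative sketch of exercising every independent path and both sides of each decision (the `classify` function below is hypothetical, not part of the project code):

```python
def classify(score):
    """Toy grading function used to illustrate white-box path coverage."""
    if score < 0 or score > 100:   # decision 1: input validity
        return "invalid"
    if score >= 50:                # decision 2: pass/fail threshold
        return "pass"
    return "fail"

# Exercise every independent path and both sides of each decision.
assert classify(-1) == "invalid"   # decision 1 true (below lower bound)
assert classify(101) == "invalid"  # decision 1 true (above upper bound)
assert classify(50) == "pass"      # decision 2 true (at boundary)
assert classify(49) == "fail"      # decision 2 false (at boundary)
```

Each assertion drives a distinct path through the function, so together they cover all independent paths and both outcomes of every logical decision.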

2. BASIS PATH TESTING

The established flow-graph technique with cyclomatic complexity was used to derive test
cases for all the functions. The main steps in deriving test cases were:
Use the design of the code to draw the corresponding flow graph.
Determine the cyclomatic complexity of the resulting flow graph, using any of the
formulas:
V(G) = E - N + 2, or
V(G) = P + 1, or
V(G) = number of regions,
where V(G) is the cyclomatic complexity,
E is the number of flow graph edges,
N is the number of flow graph nodes,
P is the number of predicate nodes.
Determine the basis set of linearly independent paths.
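The first formula above can be sketched as a small helper that computes V(G) from a flow graph's edge list (a minimal illustration, not project code):

```python
def cyclomatic_complexity(edges):
    """V(G) = E - N + 2 for a single connected flow graph,
    where E is the number of edges and N the number of nodes."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# Flow graph of a simple if/else: entry -> (then | else) -> exit
edges = [(1, 2), (1, 3), (2, 4), (3, 4)]
print(cyclomatic_complexity(edges))  # E - N + 2 = 4 - 4 + 2 = 2
```

The result agrees with the other two formulas: the graph has one predicate node (P + 1 = 2) and two regions, and a basis set for it therefore contains two independent paths.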

3. CONDITIONAL TESTING

In this part of the testing, each condition was tested for both its true and false
outcomes, and all the resulting paths were exercised, so that every path that may be
generated by a particular condition is traced to uncover any possible errors.
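For a compound condition this means driving each sub-condition, not just the overall decision, to true and false. A minimal sketch (the `can_register` function is hypothetical):

```python
def can_register(age, has_id):
    # Compound condition: both sub-conditions must hold.
    return age >= 18 and has_id

# Condition testing: vary each sub-condition independently,
# not just the overall true/false outcome.
assert can_register(18, True) is True    # both sub-conditions true
assert can_register(17, True) is False   # first sub-condition false
assert can_register(18, False) is False  # second sub-condition false
assert can_register(17, False) is False  # both sub-conditions false
```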

4. DATA FLOW TESTING

This type of testing selects paths through the program according to the locations of
definitions and uses of variables. It was used only where local variables were declared,
applying the definition-use chain method, which proved particularly useful in nested
statements.
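The definition-use chains can be illustrated with a small hypothetical function: each test below covers one definition-use pair of the local variable `rate`.

```python
def discount(price, member):
    rate = 0.0                  # definition 1 of `rate`
    if member:
        rate = 0.25             # definition 2 of `rate`
    return price * (1 - rate)   # use of `rate`

# One test per definition-use pair of `rate`:
assert discount(100, False) == 100.0  # du-pair: definition 1 -> use
assert discount(100, True) == 75.0    # du-pair: definition 2 -> use
```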

5. LOOP TESTING

In this type of testing, all loops are tested at every possible limit. The following
procedure was adopted for all loops:

• All loops were tested at their limits, just above them and just below them.
• All loops were skipped at least once.
• For nested loops, the innermost loop was tested first, working outwards.
• For concatenated loops, the values of dependent loops were set with the help of the
  connected loop.
• Unstructured loops were restructured into nested or concatenated loops and tested as
  above.

Each unit has been tested separately by the development team itself, and all the inputs
have been validated.
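The boundary cases in the first two bullets can be sketched as follows (the `total` function is an illustrative example, not project code):

```python
def total(values, limit):
    """Sum at most `limit` leading values from the list."""
    s = 0
    for i in range(min(limit, len(values))):  # loop under test
        s += values[i]
    return s

data = [1, 2, 3, 4, 5]
# Loop testing: skip the loop, run it once, and run it at and around its limit.
assert total(data, 0) == 0    # loop skipped entirely
assert total(data, 1) == 1    # exactly one iteration
assert total(data, 4) == 10   # just below the maximum
assert total(data, 5) == 15   # at the maximum
assert total(data, 6) == 15   # just above the maximum (clamped)
```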

TEST CASES

S.NO | TEST CASE ID | DESCRIPTION                                      | EXPECTED VALUE                              | OBSERVED VALUE         | RESULT
1    | TC1          | Password not given in administration login form  | "Invalid userid and password" displayed     | Same as expected value | Success
2    | TC2          | Password not given in college login form         | "Invalid userid and password" displayed     | Same as expected value | Success
3    | TC3          | Password not given in staff login form           | "Invalid userid/password" message displayed | Same as expected value | Success
4    | TC4          | Password not given in student login form         | "Invalid userid/password" message displayed | Same as expected value | Success
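The behaviour checked by TC1-TC4 can be sketched as a minimal login check (the `login` function and its message are hypothetical stand-ins for the project's form logic):

```python
def login(userid, password):
    """Hypothetical login check mirroring test cases TC1-TC4:
    a missing password must produce the invalid-credentials message."""
    if not userid or not password:
        return "Invalid userid/password"
    return "OK"

# Submitting a login form with no password shows the error message.
assert login("admin", "") == "Invalid userid/password"
assert login("student1", "") == "Invalid userid/password"
assert login("admin", "secret") == "OK"
```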

Test case screen 1

Test case screen 2
