Testing
1. Monkey Testing: Covering only the basic functionalities of the build during testing is called "Monkey Testing". It is also known as "Chimpanzee Testing". The testing team applies this type of testing when there is a lack of time.
2. Exploratory Testing: Covering the activities of the build level by level during testing is called "Exploratory Testing".
3. Sanity Testing: Checking whether the build released by the development team is stable enough for complete testing is called "Sanity Testing", "Tester Acceptance Testing", or "Build Verification Testing".
4. Smoke Testing: A slight extension of the "Sanity Testing" process. During smoke testing, the tester tries to analyze the reason why the build is not working before starting the testing.
5. Big Bang Testing: A single stage of testing conducted after completion of the entire coding is called "Big Bang Testing". It is also known as "Informal Testing".
6. Incremental Testing: - A step-by-step testing process from unit level to system level is called “Incremental
Testing”. This testing is also known as “Formal Testing”.
7. Mutation Testing: Mutation means a small change in the coding. White box testers apply this technique to already tested programs to estimate the completeness and correctness of the testing (code-level testing under White Box Testing).
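As a minimal sketch of the idea (the function names are illustrative, not from any particular tool): a "mutant" copy of the program is created with one small change, and the test suite is considered strong enough only if at least one test fails against the mutant.

```python
# Mutation testing sketch: an original function and a "mutant" with one
# small change (>= changed to >). A test suite is complete enough only if
# some test fails (i.e. "kills" the mutant) when run against the mutant.

def is_adult(age):           # original program under test
    return age >= 18

def is_adult_mutant(age):    # mutant: relational operator changed >= -> >
    return age > 18

def run_suite(fn):
    """Run the same test suite against either version; True if all tests pass."""
    return fn(20) is True and fn(10) is False and fn(18) is True

assert run_suite(is_adult) is True          # suite passes on the original
assert run_suite(is_adult_mutant) is False  # the boundary test kills the mutant
```

Because the suite includes the boundary value 18, the mutant is detected; without that test, the mutant would survive, revealing a gap in the test cases.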
8. Manual Vs Automation: -
A tester conducting a test on the build without the help of any third-party tool is called "Manual Testing".
A tester conducting a test on the build with the help of a software testing tool is called "Test Automation".
9. Retesting: - Re-execution of a test on the application build with multiple test data is called
"Retesting".
10. Regression Testing: Re-execution of the tests on a modified build, to ensure that the bug fix works and to check for possible side effects, is called "Regression Testing".
11. Adhoc Testing: Testing without any plan, in order to understand the functionality of the application build, is called "Adhoc Testing".
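A minimal sketch of retesting, assuming a hypothetical `discount()` function under test: the same test is re-executed against the build with multiple test data sets.

```python
# Retesting sketch: the same check re-executed with multiple test data.
# discount() is a hypothetical function under test, not from the source.

def discount(price, percent):
    return price - price * percent / 100

# (price, percent, expected) triples -- multiple test data for one test
test_data = [(100, 10, 90.0), (200, 50, 100.0), (80, 0, 80.0)]
for price, percent, expected in test_data:
    assert discount(price, percent) == expected

# Regression, by contrast, re-runs these same previously passing tests on a
# MODIFIED build, to confirm a bug fix did not introduce side effects.
```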
Verification: - Checking whether we are building the system right is called Verification.
Software Development Life Cycle: - The process used to create a software product from its initial conception to
its public release is known as the “Software Development Lifecycle”.
Its phases are: Information Gathering → Design → Coding → Testing → Maintenance.
[V-Model diagram: development phases on one side map to test levels on the other, with reviews (including a prototype review) at each documentation stage:
Information Gathering (BRS) → System Testing (Black Box Testing)
S/wRS → Functional & System Testing
LLD → Unit Testing / Micro Testing
Coding → White Box Testing]
Integration Testing:
After completion of unit-level testing, developers combine the tested modules into a system with respect to the
HLDs. During this composition of modules, they concentrate on integration testing to verify the coupling of those modules.
There are 3 approaches to conduct integration testing:
1. Top-Down Approach: - Testing the main module without waiting for some of the sub modules to be completed is called the "Top-Down Approach". In this scenario the white box tester uses a temporary program, called a "Stub", in place of the under-construction sub modules.
2. Bottom-Up Approach: - A white box tester conducts a test on the sub modules without going through the main module. In this scenario the white box tester uses a temporary program, called a "Driver", in place of the under-construction main module.
3. Sandwich Approach: - A combination of the Top-Down and Bottom-Up approaches. In this scenario the white box tester uses both a Driver and a Stub in place of the under-construction modules.
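The two temporary programs can be sketched as follows (the module names are hypothetical, chosen only to illustrate the roles):

```python
# Integration-testing sketch: a stub stands in for an under-construction
# sub module (Top-Down); a driver stands in for an under-construction
# main module (Bottom-Up). Module names are illustrative.

def tax_stub(amount):
    """Stub: temporary stand-in for the unfinished tax sub module.
    Returns a fixed, predictable value so the main module can be tested."""
    return 5.0

def main_module(amount, tax_fn):
    """Main module under test; tax_fn is normally the real sub module."""
    return amount + tax_fn(amount)

# Top-Down: test the main module, plugging the stub in for the sub module.
assert main_module(100, tax_stub) == 105.0

def real_tax(amount):
    """Finished sub module, to be tested Bottom-Up before main_module exists."""
    return amount * 0.05

def driver():
    """Driver: temporary caller that exercises the sub module directly."""
    return real_tax(100)

# Bottom-Up: the driver invokes the sub module without the main module.
assert driver() == 5.0
```

The Sandwich approach would simply use both: a driver below the layer under test and a stub above it.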
Test Strategy :
Before starting any testing activities, the test lead has to think through the work and arrive at a strategy. This describes
the approach to be adopted for carrying out test activities, including the planning activities. It is a formal
document, the very first document regarding the testing area, and is prepared at a very early stage in the SDLC. This
document must provide a generic test approach as well as specific details regarding the project. The following areas
are addressed in the test strategy document.
Test Plan
The test strategy identifies multiple test levels, which are going to be performed for the project. Activities at each
level must be planned well in advance and it has to be formally documented. Based on the individual plans only,
the individual test levels are carried out.
The plans are to be prepared by experienced people only. In all test plans, the ETVX {Entry-Task-Validation-Exit}
criteria are to be mentioned. Entry means the entry point to that phase. For example, for unit testing, the coding
must be complete; only then can unit testing start. Task is the activity that is performed. Validation is the
way in which the progress, correctness, and compliance are verified for that phase. Exit states the completion
criteria of that phase, after the validation is done. For example, the exit criterion for unit testing is that all unit test cases
must pass.
• Test case ID - The test case id must be unique across the application
• Test case description - The test case description must be very brief.
• Test prerequisite - The test prerequisite clearly describes what should be present in the system
before the test can be executed.
• Test Inputs - The test input is nothing but the test data that is prepared to be fed to the system.
• Test steps - The test steps are the step-by-step instructions on how to carry out the test.
• Expected Results - The expected results are the ones that say what the system must give as output or
how the system must react based on the test steps.
• Actual Results – The actual results are the ones that say outputs of the action for the given inputs or
how the system reacts for the given inputs.
• Pass/Fail - If the Expected and Actual results are same then test is Pass otherwise Fail.
The test cases are classified into positive and negative test cases. Positive test cases are designed to prove that the
system accepts valid inputs and then processes them correctly. Negative test cases are designed to prove that the
system rejects invalid inputs and handles them gracefully.
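The fields above can be sketched as a simple record; the field names and values here are illustrative, not from any specific test-management tool.

```python
# Sketch of a test case record using the fields listed above.
test_case = {
    "id": "TC_LOGIN_001",                  # unique across the application
    "description": "Valid login",          # very brief
    "prerequisite": "User account exists in the system",
    "inputs": {"username": "alice", "password": "secret"},
    "steps": ["Open login page", "Enter credentials", "Click Login"],
    "expected_result": "Home page is displayed",
}

# After execution, record the actual result and derive Pass/Fail by
# comparing expected and actual results.
test_case["actual_result"] = "Home page is displayed"
test_case["status"] = ("Pass" if test_case["actual_result"]
                       == test_case["expected_result"] else "Fail")
assert test_case["status"] == "Pass"
```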
What is a Defect?
• Any deviation from specification
• Anything that causes user dissatisfaction
• Incorrect output
• Software does not do what it is intended to do.
[Diagram: defect resolution flow — a reported defect may be marked Duplicate, Rejected, Closed, or sent back for More Info, after which the defect record is updated.]
How do companies expect defect reporting to be communicated by the tester to the development team?
Can an Excel sheet template be used for defect reporting? If so, what are the common fields to be
included? Who assigns the priority and severity of the defect?
To report bugs in Excel, use a sheet with the columns:
S.No | Module | Screen/Section | Issue Detail | Severity | Priority | Issue Status
This is how to report bugs in an Excel sheet; also set filters on the column attributes.
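A minimal sketch of producing such a sheet, assuming a hypothetical file name and sample row; the CSV can be opened and filtered in Excel.

```python
# Sketch: write the bug-report columns above to a CSV file that Excel can
# open and filter. File name and row data are illustrative.
import csv

columns = ["S.No", "Module", "Screen/Section", "Issue Detail",
           "Severity", "Priority", "Issue Status"]
rows = [
    [1, "Login", "Login page",
     "No error message shown for blank password", "Medium", "High", "Active"],
]

with open("bug_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)   # header row to filter on
    writer.writerows(rows)
```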
But most companies use a SharePoint-based process for reporting bugs. In this process, when the project comes for testing, the
module-wise details of the project are inserted into the defect management system being used. It contains the following fields:
1. Date
2. Issue brief
3. Issue description (used by the developer to reproduce the issue)
4. Issue status (active, resolved, on hold, suspended, and not able to reproduce)
5. Assigned to (names of members allocated to the project)
6. Priority (High, Medium, and Low)
7. Severity (Major, Medium, and Low)
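These fields can be sketched as a simple record; the class and field names are illustrative, and real defect-management systems differ in detail.

```python
# Sketch of a defect record using the fields listed above.
from dataclasses import dataclass, field
from datetime import date

VALID_STATUSES = {"active", "resolved", "on hold", "suspended",
                  "not able to reproduce"}

@dataclass
class Defect:
    issue_brief: str
    issue_description: str   # detail the developer needs to reproduce it
    assigned_to: str         # member allocated to the project
    priority: str            # High / Medium / Low
    severity: str            # Major / Medium / Low
    status: str = "active"
    reported_on: date = field(default_factory=date.today)

d = Defect("Login fails", "Blank password gives a server error",
           "dev-team", "High", "Major")
assert d.status in VALID_STATUSES
```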