QA Framework Proposal

Prepared by: Thatavarti Technologies Pvt. Ltd.

Proposal Number: 2005.010

Version: 3.0

20-JAN-2007

This document is the property of and is proprietary to Thatavarti Technologies Pvt. Ltd., and is not to be disclosed in
whole or in part without the express written consent of TT, and shall not be duplicated or used, in whole or in part, for
any purpose other than to evaluate TT's proposal.

Table of Contents

1. EXECUTIVE SUMMARY

2. ACRONYMS

3. THATAVARTI SOLUTION

4. PROJECT PLAN

5. RISK, CONTINGENCIES, ASSUMPTIONS, DEPENDENCIES AND CONSTRAINTS

6. VERSION MANAGEMENT PROCEDURE

7. TRAINING

8. TEAM STRUCTURE

9. PROCESSES AND PROCEDURES

10. COMMUNICATION CHANNEL

11. STATUS REPORTING

12. KNOWLEDGE MANAGEMENT

13. CASE STUDIES

14. RETENTION POLICY

15. WHY THATAVARTI?

16. GLOSSARY

17. TEST METRICS

18. TEMPLATES AND FORMATS


Appendix 01: Test Life Cycle


Appendix 02: Installation testing checklist
Appendix 03: Review document
Appendix 04: System study procedure
Appendix 05: Doubts & Issues procedure
Appendix 06: Test data procedure
Appendix 07: Test case template
Appendix 08: Test scenario template
Appendix 09: Test plan template
Appendix 10: Test metrics
Appendix 11: Glossary
Appendix 12: Thatavarti Case studies
Appendix 13: Traceability matrix
Appendix 14: Defect report
Appendix 15: Test Summary report


1. Executive Summary

Thatavarti Technologies is an independent testing vendor that helps development companies deliver
bug-free software. At inception, TT's strength was 10; we have since grown to 165 people serving
18 different clients. This engagement with CLIENT is an opportunity for TT to prove its testing
capabilities.

2. Acronyms
Abbreviation Description
SME Subject Matter Expert
KT Knowledge Transfer
TT Thatavarti Technologies
T Testing
TC Test Case
TS Test Scenario
TP Test Plan
w.r.t. With respect to
SPOC Single Point of Contact
4L 4 Layered
TM Test Management
MPP Microsoft Project Plan


3. Thatavarti Solution

TT will implement an 8-step approach for execution.

Appendix 01: Test Life Cycle


3.1 Test Strategy:

Test Management (TM) is part of the Test Plan. TT implements its 4-Layered (4L) Product Testing
Methodology to test CLIENT products.

4-Layer Product Testing Architecture


3.1.1 Installation Testing


Installation testing is testing that occurs outside the development environment: it is carried out
on the computer system on which the software product will eventually be installed. Installation
tests check the installation and configuration procedure as well as any missing dependencies.

Entry Criteria:
- Installation source of the application should be available.
Activities:
- Execute the heuristic checklist (the checklist covers all possible installation defects; see Appendix 02).
Deliverables:
- Status of the filled checklist.
Proof of Execution:
- Proof of execution recorded in the checklist.
Exit Criteria:
- Successful installation of the software; successful installation message.

Appendix 02: Installation Testing Checklist

3.1.2 Build Acceptance Test:

The build acceptance test is a basic check of a product's functionality to determine whether the
product is capable of being tested to a greater extent. Every new build should undergo a build
acceptance test to determine whether further testing can proceed.

Entry Criteria:
- New build received with release notes.
Activities:
- Build acceptance test case execution.
Deliverables:
- Execution results.
Proof of Execution:
- Review of execution result documents.
Exit Criteria:
- Stable build; pass status for all test cases.
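To make the gate concrete, here is a minimal sketch of how such a build acceptance (smoke) suite might look in pytest; the `create_order` function and the `smoke` marker are illustrative assumptions, not part of this proposal's deliverables.

```python
# build_acceptance_test.py -- a minimal sketch of a build acceptance
# (smoke) suite; the application under test is hypothetical.
import pytest

def create_order(item: str, qty: int) -> dict:
    # Stand-in for the newly delivered build's public API.
    if qty <= 0:
        raise ValueError("quantity must be positive")
    return {"item": item, "qty": qty, "status": "created"}

@pytest.mark.smoke
def test_core_happy_path_works():
    # A single happy-path check: if this fails, the build is
    # rejected and no deeper testing is attempted.
    order = create_order("widget", 2)
    assert order["status"] == "created"

@pytest.mark.smoke
def test_obvious_error_handling_present():
    with pytest.raises(ValueError):
        create_order("widget", 0)
```

Running `pytest -m smoke` executes only the gate; registering the `smoke` marker in pytest.ini keeps the run warning-free.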


3.1.3 Unit / Module Testing


Unit testing is a verification effort on the smallest unit of the software design: the software
component or module. Unit testing is white-box oriented and can be conducted in parallel for
multiple components.
Unit testing is governed by the following criteria:

Entry Criteria:
- Availability of modules.
- Stable build.
- Test data for unit test cases (prepared by TT / provided by CLIENT).
Activities:
- Preparation of the Unit Test Plan.
- Preparation of unit test cases.
- Review of the test data document for unit test cases.
- Execution of test cases.
Deliverables:
- Unit Test Plan document.
- Unit Test Case document.
- Test data document for unit test cases.
- Test status report.
Proof of Execution:
- Review of the Unit Test Plan document.
- Traceability matrix for all unit test cases.
- Frozen test data document.
- Success messages for passed test cases and screen shots for failed test cases.
Exit Criteria:
- Sign-off of the Unit Test Plan document.
- Sign-off of the Unit Test Case document.
- Two rounds of test execution.
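As an illustration of the white-box, path-oriented focus described above, a minimal pytest sketch follows; the `discount` function is a hypothetical component standing in for a real module under test.

```python
# unit_test_example.py -- sketch of a white-box unit test for the
# smallest testable component; the function under test is hypothetical.
import pytest

def discount(price: float, customer_years: int) -> float:
    """Apply a loyalty discount: 5% per full year, capped at 25%."""
    rate = min(customer_years * 0.05, 0.25)
    return round(price * (1 - rate), 2)

# Branch coverage: no discount, a partial discount, and the cap --
# the path-level knowledge that makes unit testing white-box oriented.
@pytest.mark.parametrize("price,years,expected", [
    (100.0, 0, 100.0),   # no loyalty yet
    (100.0, 3, 85.0),    # 15% for three years
    (100.0, 10, 75.0),   # capped at 25%
])
def test_discount_paths(price, years, expected):
    assert discount(price, years) == expected
```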

3.1.4 Integration Testing


The primary objective of integration testing is to discover errors in the interfaces between Modules / Sub-
Systems.
Techniques/Approaches for Integration testing:
Top-down approach: an incremental approach to testing the program structure.
Modules are integrated by moving downward through the control hierarchy, beginning
with the main control module; integration can proceed in a depth-first or breadth-first manner.


Bottom-up approach: as the name implies, begins construction and testing with atomic
modules, i.e., the components at the lowest levels of the program structure.

Entry Criteria:
- Unit testing of all modules is signed off.
Activities:
- Preparation of the Integration Test Plan.
- Preparation of integration test scenarios.
- Execution of test scenarios.
Deliverables:
- Integration Test Plan document.
- Integration test scenarios document.
- Test status report.
Proof of Execution:
- Review of the Integration Test Plan.
- Traceability matrix for all integration test scenarios.
- Success messages for passed test scenarios and screen shots for failed test scenarios.
Exit Criteria:
- Sign-off of the Integration Test Plan document.
- Sign-off of the integration test scenarios document.
- Sign-off of integration testing.
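A minimal sketch of the top-down style described above, assuming a hypothetical higher-level `InvoiceModule` that is integrated against a stubbed lower-level `TaxService`; the names are illustrative only.

```python
# integration_test_example.py -- sketch of a top-down integration test:
# the control module is exercised while a subordinate module is stubbed.
from unittest.mock import Mock

class TaxService:
    """Lower-level module; the real one might call a remote system."""
    def rate_for(self, region: str) -> float:
        raise NotImplementedError

class InvoiceModule:
    """Higher-level control module that integrates with TaxService."""
    def __init__(self, tax_service: TaxService):
        self.tax_service = tax_service

    def total(self, net: float, region: str) -> float:
        return round(net * (1 + self.tax_service.rate_for(region)), 2)

def test_invoice_integrates_with_tax_service():
    # Stub the subordinate module (top-down, depth-first style).
    stub = Mock(spec=TaxService)
    stub.rate_for.return_value = 0.19
    invoice = InvoiceModule(tax_service=stub)
    assert invoice.total(100.0, "EU") == 119.0
    # The interface between the modules is what is being verified:
    stub.rate_for.assert_called_once_with("EU")
```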

3.1.5 System Testing


The primary objective of system testing is to discover errors when the system is tested as
a whole. System testing is also called end-to-end testing.
System testing is mainly focused on the following areas:
Identifying the end-to-end / business life cycles.
Designing the tests and data.
Optimizing the end-to-end / business life cycles.

Entry Criteria:
- Successful completion of integration testing.
- Test data for system test cases (prepared by TT / provided by CLIENT).
Activities:
- Preparation of the System Test Plan.
- Preparation of system test cases.
- Review of the test data document for system testing.
- Execution of test cases.
Deliverables:
- System Test Plan document.
- System Test Case document.
- Test data document for system test cases.
- Test status report.
Proof of Execution:
- Review of the system test plan document.
- Traceability matrix for all system test cases.
- Success messages for passed test cases and screen shots for failed test cases.
Exit Criteria:
- Sign-off of the system test plan document.
- Sign-off of the system test scenarios document.
- Frozen test data document.
- Sign-off of system testing.


3.1.6 Regression Testing:


Testing a new version of the product / a new build to ensure that enhancements are implemented
correctly and that existing functionality remains stable and uncorrupted.

Regression Testing Methods


Regression testing can be done either manually or with automated testing tools.
Manual testing is appropriate for small systems, where investing in automated tools might not be
cost-effective.
Automated testing: one class of these tools is the capture-playback tool, which is very helpful
where the system undergoes many version changes.
Regression tests are basically repetitive tests, for which automation may be a good approach.

Entry Criteria:
- Testing of change requirements is signed off.
Activities:
- Executing the regression test cases.
Deliverables:
- Test and defect report.
Proof of Execution:
- Regression test case review document.
- Pass messages for passed test cases and screen shots for failed scenarios.
Exit Criteria:
- All regression test cases are executed.
- Status report document.
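One lightweight way to automate such repetitive checks is to compare each new build's outputs against results baselined in an earlier, signed-off cycle. The sketch below assumes a hypothetical `price_quote` function and an inline baseline; in practice the baseline would live in a versioned data file.

```python
# regression_check.py -- sketch of a data-driven regression check:
# each new build's outputs are compared against baselined results.
import pytest

def price_quote(qty: int) -> float:
    """Function under regression; its behavior must not drift."""
    return round(qty * 9.99 * (0.9 if qty >= 10 else 1.0), 2)

# Baseline captured when the behavior was last signed off.
BASELINE = {1: 9.99, 5: 49.95, 10: 89.91}

@pytest.mark.parametrize("qty,expected", sorted(BASELINE.items()))
def test_quotes_match_baseline(qty, expected):
    # Any deviation from the baseline signals a regression.
    assert price_quote(qty) == expected
```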

3.1.7 Reliability Testing


Reliability is a property that defines how well the software meets its requirements: the
probability of failure-free operation for a specified time, in a specified environment, for a
given purpose. The objective is to find the mean time between failures (MTBF) under a specific
load pattern, and the mean time to recovery (MTTR).
Reliability testing helps confirm that:
Business logic performs as expected
Active buttons are really active
Correct menu options are available
Hyperlinks are reliable

Entry Criteria:
- Functionality testing is signed off.
Activities:
- Preparation of the reliability test plan.
- Preparation of reliability test cases.
- Execution of test cases.
Deliverables:
- Test plan document.
- Test case document.
- Test execution report.
Proof of Execution:
- Test plan review document.
- Traceability matrix and test case review document.
- Success messages for passed test cases and screen shots for failed test cases.
Exit Criteria:
- Testing baselines and schedules are planned and the test plan document is frozen.
- Business-critical test cases are signed off.
- 85% stability of the application.
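To make the MTBF/MTTR objective concrete, here is a small sketch that computes both from a failure log gathered during a soak run; the timestamps are illustrative only.

```python
# reliability_metrics.py -- sketch: computing mean time between failures
# (MTBF) and mean time to recovery (MTTR) from an observed failure log.
from datetime import datetime

# (failure time, recovery time) pairs observed during the run.
incidents = [
    (datetime(2007, 1, 10, 9, 0),  datetime(2007, 1, 10, 9, 20)),
    (datetime(2007, 1, 12, 14, 0), datetime(2007, 1, 12, 14, 5)),
    (datetime(2007, 1, 15, 11, 0), datetime(2007, 1, 15, 11, 30)),
]
test_start = datetime(2007, 1, 9, 9, 0)
test_end = datetime(2007, 1, 16, 9, 0)

# Uptime is the observation window minus the total repair time.
downtime = sum((rec - fail).total_seconds() for fail, rec in incidents)
uptime = (test_end - test_start).total_seconds() - downtime
mtbf_hours = uptime / len(incidents) / 3600
mttr_minutes = downtime / len(incidents) / 60

print(f"MTBF: {mtbf_hours:.1f} h, MTTR: {mttr_minutes:.1f} min")
```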

Version 3.0 20-JAN-2007 Page 10 of 23


QA Frame Work

3.1.8 Performance Testing

Definition: demonstrating that system functions meet specifications, with acceptable response
times, while processing the required transaction volume on a production-sized database.

Performance Test Plan

Entry Criteria:
- Stable architecture.
- System testing is completed.
Activities:
- Preparation of the test plan document.
Deliverables:
- Test plan document.
Proof of Execution:
- Test plan review document.
Exit Criteria:
- Performance baselines identified.
- Architecture-critical scenarios identified.

Smoke testing: initial testing performed on the application to check performance before moving
to the exploratory test.
Performance Smoke Test


Entry Criteria:
- Performance test plan is ready.
- Availability of scripts for execution of the test.
Activities:
- Testing the application with 2 users.
Deliverables:
- Transactions per second; number of transactions passed/failed.
Proof of Execution:
- Summary report containing the date, start and stop time of the test, and its duration.
Exit Criteria:
- Completion of a single iteration of the test.
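A minimal sketch of the 2-user smoke run described above, using only the Python standard library; the endpoint URL and test duration are assumptions.

```python
# perf_smoke.py -- sketch of a 2-user performance smoke test: two
# concurrent users issue requests for a fixed interval and
# transactions/second is reported. The URL is a placeholder.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/health"  # hypothetical endpoint
DURATION_S = 30

def user_session(_: int) -> tuple:
    passed = failed = 0
    deadline = time.monotonic() + DURATION_S
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(URL, timeout=5) as resp:
                passed += 1 if resp.status == 200 else 0
        except Exception:
            failed += 1
    return passed, failed

start = time.strftime("%Y-%m-%d %H:%M:%S")
with ThreadPoolExecutor(max_workers=2) as pool:   # the "2 users"
    results = list(pool.map(user_session, range(2)))
stop = time.strftime("%Y-%m-%d %H:%M:%S")

total_pass = sum(p for p, _ in results)
total_fail = sum(f for _, f in results)
print(f"Start {start}  Stop {stop}  Duration {DURATION_S}s")
print(f"TPS: {total_pass / DURATION_S:.1f}  passed={total_pass} failed={total_fail}")
```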

Load testing / exploratory: testing the application with the load the customer expects it to
carry.
Performance Load/Exploratory Test
Entry Criteria:
- Successful execution of the smoke test.
Activities:
- Testing the application with the number of users required by the client.
Deliverables:
- Analysis, summary report, graphs.
Proof of Execution:
- Summary report containing the date, start and stop time of the test, and its duration.
Exit Criteria:
- Completion of script execution.


Stress testing: testing the application with the intention of finding its break point by
applying a heavy load.
Performance Stress Test
Entry Criteria:
- Successful execution of the load / exploratory test.
Activities:
- Testing the application with a heavy user load to find the application's break point.
Deliverables:
- Analysis, summary report, graphs.
Proof of Execution:
- Summary report containing the date, start and stop time of the test, and its duration.
Exit Criteria:
- Completion of script execution.

3.1.9 Scalability Testing:


The objective is to find the maximum number of users the system can handle.
A scalability test applies increasing workloads to determine a system's ability to scale. It
answers the question, "Given an increase from load x to load y, how will the system behave at
load y versus at load x?"
Classification:
Network scalability
Server scalability
Application scalability

Entry Criteria:
- Successful completion of functionality testing and all required non-functional testing.
Activities:
- Preparation of the test plan.
- Identification of scalability test scenarios.
- Execution of the application on different networks and servers, increasing the number of users
  with adequate and inadequate resources.
Deliverables:
- Scalability test plan document.
- Test scenarios document.
- Test status report.
Proof of Execution:
- Test plan review document.
- System behavior under different loads.
Exit Criteria:
- Test plan document sign-off.
- All test scenarios are executed.
- Status report document.
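A sketch of the step-load idea ("how does the system behave at load y versus load x?"): user counts are increased step by step and throughput at each level is recorded. The transaction function is a stand-in for a real request.

```python
# scalability_probe.py -- sketch of a step-load scalability run.
import time
from concurrent.futures import ThreadPoolExecutor

def one_transaction(_: int) -> None:
    time.sleep(0.01)  # stand-in for a real request to the system

def throughput_at(users: int, transactions: int = 200) -> float:
    # Run a fixed batch of transactions at a given concurrency level
    # and report the achieved transactions per second.
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=users) as pool:
        list(pool.map(one_transaction, range(transactions)))
    return transactions / (time.monotonic() - start)

for users in (1, 2, 4, 8, 16):   # increasing workload steps
    print(f"{users:>2} users -> {throughput_at(users):7.1f} tx/s")
```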


3.1.10 Compatibility Testing

Compatibility testing provides a basic understanding of how a product will perform over a wide
range of hardware, software, and network configurations, and helps isolate specific problems.
Compatibility testing verifies that a product looks and functions the same across all supported
environments.

Entry Criteria:
- Successful completion of functionality testing and all required non-functional testing.
Activities:
- Preparation of the compatibility test plan.
- Preparation of compatibility test cases.
- Execution of the application on different browsers and operating systems.
- Execution of the application with test bed creation, i.e. 1) partition of the hard disk,
  2) creation of a base image.
Deliverables:
- Test plan document.
- Test case document.
- Test status report.
Proof of Execution:
- Test plan review document.
- Test case review document.
- Success messages for passed cases and screenshots for failed cases.
Exit Criteria:
- Test plan document sign-off.
- All test cases are executed.
- Status report document.
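One way to drive identical checks over the whole supported matrix is test parametrization. In this sketch the browser/platform lists and the `render_page` launcher are hypothetical placeholders for a real multi-environment test bed (e.g. prepared base images on a partitioned disk).

```python
# compatibility_matrix.py -- sketch of running the same checks across
# a browser/OS support matrix with pytest parametrization.
import pytest

BROWSERS = ["IE 6", "Firefox 1.5", "Opera 9"]        # illustrative
PLATFORMS = ["Windows XP", "Red Hat Linux 9"]        # illustrative

def render_page(browser: str, platform: str) -> str:
    # Stand-in: a real harness would launch the environment from its
    # base image and load the application in the named browser.
    return "OK"

@pytest.mark.parametrize("browser", BROWSERS)
@pytest.mark.parametrize("platform", PLATFORMS)
def test_renders_same_everywhere(browser, platform):
    # One test body, executed once per cell of the support matrix.
    assert render_page(browser, platform) == "OK"
```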

3.1.11 Usability Testing:


Usability is defined as the effectiveness, efficiency, and satisfaction with which specified
users achieve specified goals in particular environments.
Usability testing evaluates the ease of using and learning the system, the system's user
documents, and the effectiveness of the system in supporting user tasks for end users.

Entry Criteria:
- CLIENT shall make the application available.
Activities:
- Testing the user-friendliness of the product, including test case execution.
Deliverables:
- Usability test case document; test status report.
Proof of Execution:
- Test case review document (otherwise not applicable).
Exit Criteria:
- Test case document sign-off.
- Product is user friendly.


3.1.12 Security Testing:

To test the correctness of application-level and system-level access control.

Entry Criteria:
- Product is stable in terms of functionality and performance.
Activities:
- Verify that an actor can access only those functions or data for which their user type has
  been granted permissions.
- System-level security: verify that only those actors with access to the system and its
  applications are permitted to access them.
Deliverables:
- Authorization / authentication test case execution reports.
Proof of Execution:
- Review documents.
- Screenshots of pass and fail results.
Exit Criteria:
- All test cases pass.
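A sketch of the actor/permission verification described above, with a hypothetical role-to-function permission map standing in for the real application's access model.

```python
# security_access_test.py -- sketch: each user type may reach only
# the functions its role grants. Roles and functions are hypothetical.
import pytest

PERMISSIONS = {
    "viewer":  {"read_report"},
    "analyst": {"read_report", "edit_report"},
    "admin":   {"read_report", "edit_report", "delete_report"},
}

def access(role: str, function: str) -> bool:
    # Unknown actors fall through to an empty permission set.
    return function in PERMISSIONS.get(role, set())

@pytest.mark.parametrize("role,function,allowed", [
    ("viewer", "read_report", True),
    ("viewer", "delete_report", False),    # must be denied
    ("analyst", "edit_report", True),
    ("intruder", "read_report", False),    # unknown actor: no access
])
def test_actor_sees_only_permitted_functions(role, function, allowed):
    assert access(role, function) is allowed
```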

3.1.13 Alpha Testing:


• Alpha Testing is carried out on the developer’s premises in a controlled environment.
• Generally, the Quality Assurance cell is the body that is responsible for conducting the Alpha test.
• On successful completion of this phase, the software is ready to migrate outside the developer’s
premises for more rigorous and unplanned testing that takes place at the next level of testing, i.e.,
Beta testing.
Entry Criteria:
- System testing is signed off and all required non-functional testing of the product is
  complete.
Activities:
- Test plan creation.
- Test scenario identification for major business functionalities.
- Preparation of test cases for major business requirements.
- Execution of test cases.
Deliverables:
- Test plan document.
- Business test scenarios document.
- Test case document.
- Test execution report.
Proof of Execution:
- Test plan review document.
- Test scenario review document.
- Traceability matrix and test case review document.
- Success messages for passed test cases and screen shots for failed test cases.
Exit Criteria:
- Testing baselines and schedules are planned and the test plan document is frozen.
- Business-critical test scenarios are signed off.
- Business-critical test cases are signed off.
- Alpha testing signed off for beta testing.


3.1.14 Ad-hoc Testing: Ad-hoc testing will be carried out without any specific objective; test
resources will exercise the application using domain knowledge and testing experience.

Test Scenarios:
1. Test scenarios will be identified for each module and product.
2. Test scenarios will be associated with both positive and negative data.
3. Optimization of test scenarios will be taken care of as test cycles increase.

Test Cycle:

1. Test cases and test scenarios will be baselined for each test cycle.
2. Test execution reports and metrics will be maintained independently for each cycle.

3.1.15 Test Automation:


Test automation is code or script that executes tests without human intervention. It uses a
variety of tools to automate the testing process, reducing the need for a person to test each
build manually. Automated testing still requires a skilled quality assurance professional, with
knowledge of both the automation tool and the software under test, to set up the tests.

Why Automation:
• Avoid the errors that humans make when they get tired after multiple repetitions. The test
program won’t skip any test by mistake
• Each future test cycle will take less time and require less human intervention
• Required for Regression Testing

Life Cycle of Automation:


• Analyze the Application
• Select the Tool
• Identify the Scenarios
• Design/Record the Script
• Modify the Script
• Run the Test Script
• Report the Defects
Benefits of Test Automation:
• Allows more testing to happen
• Tightens and strengthens the test cycle
• Testing is consistent, repeatable
• Useful when new patches released
• Makes configuration testing easier
• Test battery can be continuously improved.


Entry Criteria:
- Application should be 80% stable.
Activities:
- Analyze the application.
- Select the tool.
- Identify the scenarios.
- Develop the test framework.
- Design/record the script.
- Modify the script.
- Run the test script.
Deliverables:
- Automation framework.
- Test scripts.
- Test logs.
Proof of Execution:
- Review of the automation framework.
- Review of the automation test scripts.
- Test logs.
Exit Criteria:
- Developed framework.
- Executable scripts.
- Test logs.
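To illustrate the "run the test script" and "test logs" steps, here is a minimal sketch of an unattended runner that executes scripted checks and writes a log; the checks themselves are placeholders for designed/recorded scripts.

```python
# automation_runner.py -- sketch of the run/report step of the
# automation life cycle: execute checks without human intervention
# and emit a test log.
import logging

logging.basicConfig(filename="test_log.txt", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def check_login() -> bool:
    return True   # stand-in for a recorded/designed script step

def check_search() -> bool:
    return True   # stand-in for another scripted check

SCRIPTS = {"login": check_login, "search": check_search}

def run_suite() -> int:
    failures = 0
    for name, script in SCRIPTS.items():
        try:
            ok = script()
        except Exception as exc:   # a crashing script is logged, not skipped
            logging.error("%s raised %r", name, exc)
            failures += 1
            continue
        logging.info("%s %s", name, "PASS" if ok else "FAIL")
        failures += 0 if ok else 1
    return failures

if __name__ == "__main__":
    raise SystemExit(run_suite())   # non-zero exit on any failure
```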

4. Project Plan
The project plan will be designed in MPP using the following:
Defined activities.
Effort estimates for each activity.
Resources.

Appendix 09: Detail Project Plan (Test Plan)

5. Risk, Contingencies, Assumptions, Dependencies and Constraints

1. Risk: Acceptance plan may not be available on time from the client.
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Delay in providing this might result in stakeholder conflicts.

2. Risk: Required information may not be provided by the client to set up the functional
   environment (hardware, software, installation licenses, etc.).
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Delay in starting up the project, resulting in schedule and cost overruns.

3. Risk: Changes to original scope / agreed design / test cases.
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Delay in the overall project schedule and cost overruns.

4. Risk: Changes to the application and test environment may be frequent.
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Delay in the overall project schedule and cost overruns.

5. Risk: Knowledge transfer may not happen as planned.
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Delay in the overall project schedule and cost overruns.

6. Risk: Delay in formal sign-off from the client at agreed major milestones.
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Might result in major re-work.

7. Risk: Client may not provide committed resources on time.
   Mitigation/Contingency: Project managers to plan and follow up with the client project coordinator.
   Impact on Schedules and Cost: Onsite project manager to plan and follow up with the client project coordinator.

8. Risk: Installation of the application may cause issues.
   Mitigation/Contingency: Onsite project manager to manage this through a formal change request procedure.
   Impact on Schedules and Cost: Might result in re-work, which can lead to schedule and cost overruns.

5.1 Assumptions

CLIENT will provide an independent environment for testing.
Application developer(s) and business analysts will review the System Test Plan, provide
feedback, and help identify scenarios covering new functional and design requirements.
A stable QA environment is available, mirroring the production environment on which the
application will run.
Build verification testing is performed on all iterations of the code, prior to full-scale
regression testing.
Subsequent iterations of code will be regression tested against the previous build.
The developer(s)/business analysts will review the Test Scenario document and suggest the
inclusion of any additional test scenarios to achieve full coverage of the application
functionality.
All new builds will be deployed on the testing environment.
CLIENT will share the existing test suite.


5.2 Dependencies
This table lists the identified system dependencies and the respective owners for testing of the
product. The logical owner for investigation, tracking, and reporting of these is the assigned
QA resource.
<List the system dependencies. For example, is data loaded from another application, or fed into
another application, that will need consideration in the test planning? The following table
should be completed to show what the dependencies are and who owns them. For example, for a feed
to an external system, what is the system name and who is the point of contact for configuration
issues? These dependencies should be inserted in the Test Director test labs, in the details
section, as pre-conditions of execution.>

Dependency: Network connection. Owner: Technical team, CLIENT.
Dependency: Database server. Owner: Database team, CLIENT.
Dependency: Development completion. Owner: Development team, CLIENT.
Dependency: Unit testing should be complete to start integration testing. Owner: Testing team, TT.

5.3 Constraints
a) Complete test data should be made available before testing

6. Version Management Procedure


Version Management:
Version management is the storage of the complete history of the versions of each component. It
supports a concept of workspaces as directories from which files are extracted for modification
or viewing purposes.

Ensures old versions are not lost.
Ensures developers do not overwrite each other's changes.
Allows you to examine the differences between versions.
Provides the essential information of who, what, when, and why.
Grants particular users permissions to their allocated projects.
Supports all kinds of document tracking.

Quality point of view:

• The purpose of Version management is to establish and maintain the integrity of the products of
the software project throughout the software life cycle.

Procedure:

1. Identify the software that needs version control.
2. Identification of the configuration items (individual programs or any documents that need
   version control).
3. Identifying the users and granting the privilege to access the version management folders.
4. Defining the tree structure of folders for the project.
5. Defining the naming convention for the child items in the tree hierarchy.
6. Naming conventions for the configuration items (files).
7. Maintaining all documents in their respective folders, with increasing version numbers as
   part of the name (a sketch of this convention follows this list).
8. Baseline documents should be maintained.
9. Shared access to be granted to users if a folder and its items are shared among different
   projects.
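As a small illustration of step 7's naming convention, the sketch below computes the next versioned file name within a folder; the `_v<N>` pattern is an assumption, not a mandated convention.

```python
# next_version.py -- sketch: documents kept in their folders with an
# increasing version number embedded in the file name.
import re
from pathlib import Path

def next_versioned_name(folder: Path, stem: str, ext: str) -> str:
    # Match files like "<stem>_v<N>.<ext>" and pick the next N.
    pattern = re.compile(rf"{re.escape(stem)}_v(\d+)\.{re.escape(ext)}$")
    versions = [int(m.group(1))
                for f in folder.glob(f"{stem}_v*.{ext}")
                if (m := pattern.match(f.name))]
    return f"{stem}_v{max(versions, default=0) + 1}.{ext}"

# e.g. with TestPlan_v1.doc and TestPlan_v2.doc already present:
# next_versioned_name(Path("docs"), "TestPlan", "doc") -> "TestPlan_v3.doc"
```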


7. Training

TT will train the CLIENT resources in the inception stage of the project.
TT will share best practices, case studies, and learnings with the CLIENT resources.
TT will conduct training for CLIENT resources monthly and on a need basis.

8. Team Structure

This section presents the recommended number of resources for the CLIENT Assignment, their
main responsibilities, and their knowledge or skill set.

QA/Test Manager (Mail ID: )
- Framework implementation
- Maintaining multiple projects
- Test planning
- Update the status to the CLIENT manager
- Mentor test leads and test members
- Design and maintain metrics
- Raise issues and variances
- Understand and create service level agreements
- Project reviews and execution reviews
- Propose new techniques
- Define the automation strategy
- Fill time sheets
- Update status to Thatavarti

QA/Test Analyst (Mail ID: )
- Implement the test plan
- System study
- Design test cases
- Participate in test reviews
- Test case execution
- Identify test scenarios
- Defect management
- Update project metrics
- Prepare test data
- Guide the test members
- Define the scope
- Raise issues
- Fill time sheets

QA/Test Engineer (Mail ID: )
- System study
- Test case design
- Test case execution
- Submit defects
- Track defects
- Raise issues
- Prepare test data
- Peer reviews
- Update status
- Fill time sheets

9. Processes and Procedures


9.1 Appendix 04: System Study Procedure
9.2 Appendix 05: Issues & Doubts Procedure.


9.3 Escalation Procedure

Escalation path (Thatavarti / Client):

Level 1: TT team - any issue is raised and handled within the team first.
Level 2: Test Engineers (TT) / Application SME (Client) - issues open for 48-72 hrs are
escalated to this level.
Level 3: QA Manager (TT) / QA Manager (Client) - issues open for a further 48-72 hrs are
escalated to this level.
Level 4: RadhaKrishna, Senior Management (TT) / Project Manager (Client) - relationship issues.

9.4 Appendix 01: Test Case Design Procedure


9.5 Appendix 01: Review Procedure
9.6 Appendix 01: Defect Tracking Procedure
9.7 Appendix 06: Test Data procedure

10. Communication Channel

A. Offshore (TT, Hyderabad) / Onsite (CLIENT) Team Collaboration

Thatavarti Technologies lays heavy emphasis on communication among global team members. It
has been Thatavarti’s experience that effective communication is essential for the success of the
project.

B. Communication Practices
To ensure effective communication between offshore and onsite teams, Thatavarti
Technologies adopts multiple tools and processes, such as regular account calls and status
review calls through voice conferencing. Knowledge transfer activities are given the most
importance during the inception phase, as well as during subsequent phases, to ensure that the
offshore and onsite teams are completely synchronized. This is supported by activities such as
onsite visits, joint workshops with the client teams, etc.


Typical Project Communication modes are:

• Monthly account meetings attended by personnel from sales and delivery, the business unit
head, and a senior management representative. These meetings review progress and issues
related to the project.
• Daily project review meetings attended by the project manager, leads, and key engineers to
discuss day-to-day progress and issues.

11. Status Reporting

Thatavarti Technologies will provide status reports to CLIENT at regular intervals. The status
reports will contain, but are not limited to, the following items:
• Daily status reports with the progress of tasks.
• Weekly status reports with planned/actual/pending tasks and a forecast for the coming week.
• Monthly status reports covering:
  Planned task progress
  Issues related to the tasks
  Risks involved in executing the tasks
  Need for allocation/de-allocation of resources

12. Knowledge Management

TT manages knowledge of the domain, the technology, and the execution of the project on a
weekly, monthly, and quarterly basis using the following techniques:

The team will deliver presentations on its understanding of the domain knowledge of the application.
The team will share all functional problem areas and team issues.
The team will develop and share best practices.
TT will deploy a new back-up resource every 20 working days to retain domain knowledge.

13. Case Studies

Appendix 12

14. Retention Policy

TT shares 20% equity with managers and 18% equity with employees.
TT treats all employees as partners.
TT evaluates resources on a quarterly basis and conducts appraisals.
TT encourages all employees to take ownership of the project.


15. Why Thatavarti?

Better quality, delivered with domain/technology/tools expertise.

Faster turnaround through parallel work on verification & validation.

Experienced test teams and round-the-clock testing.

Lower cost through onsite/offshore combinations and a complete offshore facility.

16. Glossary
Appendix 11

17. Test Metrics


Appendix 10

18. Templates and Formats


Appendix 07: Test case template
Appendix 08: Test scenario template
Appendix 09: Test plan template
Appendix 13: Traceability matrix
Appendix 14: Defect report
Appendix 15: Test summary report
