
ETL Testing


Data Warehouse Testing: A Practical Approach

Creator: Manoj Tiwari

Agenda
- Data Warehouse Definition
- Challenges of Data Warehouse Testing
- Testing Goal
- Testing Methodology
- Testing Types
- Test Stop Criteria

Data Warehouse Definition: "A data warehouse is a subject-oriented, integrated, time-variant, non-volatile collection of data in support of management's decision-making process."

Challenges of Data Warehouse Testing:

- Data selection from multiple source systems, and the analysis that follows, poses a great challenge.
- Volume and complexity of the data.
- Inconsistent and redundant data in the data warehouse.
- Loss of data during the ETL process.
- Non-availability of a comprehensive test bed.
- Criticality of the data for the business.

Testing Goal:
Our main aim is to check the quality of the data loaded into the warehouse; sample SQL checks are sketched after the list below.

- Data completeness: ensures that all expected data is loaded.
- Data transformation: ensures that all data is transformed correctly according to business rules and/or design specifications.
- Data quality: ensures that the ETL application correctly rejects, substitutes default values for, corrects or ignores, and reports invalid data.
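As an illustration, these checks often translate into simple SQL comparisons between source and target. This is a minimal sketch assuming hypothetical table and column names (src_customer, dw_customer, status_desc, etl_reject_log) that are not part of this project.

-- Data completeness: the target row count should match the qualifying source rows for the load date.
SELECT (SELECT COUNT(*) FROM src_customer WHERE load_date = DATE '2024-01-01') AS source_rows,
       (SELECT COUNT(*) FROM dw_customer  WHERE load_date = DATE '2024-01-01') AS target_rows
FROM   dual;

-- Data transformation: assumed business rule that status_desc may only hold one of three decoded values.
SELECT customer_id, status_desc
FROM   dw_customer
WHERE  status_desc IS NULL
   OR  status_desc NOT IN ('ACTIVE', 'INACTIVE', 'SUSPENDED');

-- Data quality: invalid rows should be reported in the reject log rather than loaded into the target.
SELECT COUNT(*) AS rejected_rows
FROM   etl_reject_log
WHERE  load_date = DATE '2024-01-01';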

Testing Methodology:
To perform testing effectively:

- Use of a Traceability Matrix to enable full test coverage of business requirements.
- In-depth review of test cases.
- Manipulation of test data to ensure full test coverage.
- Provision of appropriate tools to speed up test execution and evaluation.
- Regression testing.

Testing Types:
The following types of testing are performed for data warehousing projects:

- Unit Testing
- Integration Testing
- Technical Shakedown Testing
- System Testing
- Operational Readiness Testing
- User Acceptance Testing
- Regression Testing

Integration Testing
Its major objective is to verify the data produced and to validate the design.

Prerequisite:
- Implementation checklist for the move from development to test.
- All unit testing completed and summarized.
- Migration from the development environment to the test environment.
- Data available in the test environment.

Objectives:
- Validate the business and functional requirements.
- Validate the data against the business rules, confirm that the correct number of rows is transferred, and verify load volumes.
- Ensure the mapping order is correct and dependencies among workflows are in place.
- Validate that target tables are populated with the correct number of records.
- Check for error log messages in the appropriate file.
- Check for restarting of jobs in case of failures.
- Validate the execution of workflows and data at the following stages: Source to Staging, Staging to ODS, and ODS to Data Mart (sample row-count checks are sketched after this list).
- Verify integration of new mappings with existing mappings.
- Validate proper functionality of mapping variables and parameter files.
- Test the individual mappings to verify the transformations, and also test at the workflow level.
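As a sketch of how the stage-by-stage validation can be automated, the following queries compare row counts across layers for one load; stg_orders, ods_orders, dm_orders, batch_id, and order_id are hypothetical placeholder names.

-- Row counts at each layer for a single batch; the three counts should line up with the expected load volume.
SELECT (SELECT COUNT(*) FROM stg_orders WHERE batch_id = 1001) AS staging_rows,
       (SELECT COUNT(*) FROM ods_orders WHERE batch_id = 1001) AS ods_rows,
       (SELECT COUNT(*) FROM dm_orders  WHERE batch_id = 1001) AS mart_rows
FROM   dual;

-- Rows present in the ODS but missing from the data mart for the same batch (normally expected to return no rows).
SELECT order_id FROM ods_orders WHERE batch_id = 1001
MINUS
SELECT order_id FROM dm_orders  WHERE batch_id = 1001;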

Inputs:
- Project plan
- Business requirements document
- Test cases and steps
- Access to personal files on the network
- Executed and approved unit test cases or peer review reports
- Source to Target Matrices (STM)
- Extract and Load Order document

Note: The project manager is responsible for ensuring that all the input criteria are completed by the appropriate project team member, as defined in the project Deliverables Matrix, prior to each phase of testing.

Environment:
Integration testing is performed in the test environment.

Tools:
- Data access tools (e.g., TOAD, PL/SQL) to analyze the content of tables and the results of loads.
- ETL tools (e.g., Informatica, DataStage).
- A test management tool (e.g., Test Director, QC) that maintains and tracks the requirements, test cases, defects, and traceability matrix.

Deliverables:
- Executed Integration Test Case documents, i.e., documented actual results against each test, signed and dated by the tester(s).
- Signed and approved Test Case Index & Results document containing the results of the executed integration test scripts.
- Updated Requirements Traceability Matrix.

Test Case Index and Results:


The DW&BI team should use the Test Case Index and Results document to report the results of testing (an illustrative example entry follows the list). The document tracks the following:

- Test Case #: Enter a test case number in sequential outline format (e.g., 1, 1.1, 2, 2.1, 3).
- Description: Provide a brief description that covers each test case instance as fully as possible.
- Requirement # and Description: List each requirement number that corresponds to the listed test case number and briefly describe it.
- Criticality: Provide a relative criticality ranking for each test case instance (Low, Medium, High).
- Result: Indicate each test case result (Pass [test case meets the acceptance criteria], Fail [test case does not meet the acceptance criteria], Hold [test case requires additional data for the result to be determined]).
- Fail Description / Reference SPR #: For each failed test case, list the assigned Software Process Report (SPR) number and briefly describe what caused the failure.
- Robot / SQL Script Name: Indicate the assigned SQL script name, as applicable.
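For illustration only, a completed entry might look like the following; the requirement number and script name are invented.

Test Case #: 2.1
Description: Verify that row counts match between staging and the ODS for the customer load.
Requirement # and Description: REQ-014, customer data must be loaded completely into the ODS.
Criticality: High
Result: Pass
Fail Description / Reference SPR #: N/A
Robot / SQL Script Name: chk_customer_counts.sql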

Technical Shakedown Test:


A Technical Shakedown Test is conducted prior to System Testing. Objective: verify that

- the software has been configured correctly (including the Informatica architecture, source system connectivity, and Business Objects);
- all the code has been migrated to the QA environment correctly;
- all required connectivity between systems is in place (a minimal connectivity check is sketched below).
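As a minimal sketch, assuming an Oracle database link named src_crm_link to one of the source systems and a staging table named STG_ORDERS (both placeholders), the shakedown could include checks such as:

-- Confirms the QA environment can reach the source system over the link; returns one row when connectivity is in place.
SELECT 1 AS connectivity_ok
FROM   dual@src_crm_link;

-- Confirms the migrated schema contains an expected staging object.
SELECT COUNT(*) AS table_present
FROM   user_tables
WHERE  table_name = 'STG_ORDERS';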

System Testing
System Testing is performed to prove that the system meets the functional specifications from an end-to-end perspective. The testing team verifies that the data in the source system databases and the data in the target is consistent throughout the process.

Prerequisite:
- Finalized implementation checklist.
- All integration testing completed.
- Migration from the Test environment to the QA environment, as applicable.
- Production configuration and data available.

Inputs:
- Project plan
- Business requirements document
- System test cases and steps
- Updated Operations Manual
- Signed and approved integration Test Case Index, test case documents, and scripts

Objectives:
- Verify the QA environment is an exact replica of Production prior to running the system test.
- Run an end-to-end system test starting from the source databases through to the target and verify the data output.
- Record initialization and incremental load statistics.
- Verify that the functionality of the system meets the business specifications.
- Verify that error handling and reconciliation processes are functioning properly (a sample reconciliation query follows this list).
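The following is a sketch of a reconciliation query, assuming hypothetical source and target tables src_sales and fact_sales; any rows returned indicate a day on which the source and target totals disagree.

-- Compare the total sales amount per day in the source against the target fact table.
SELECT s.sale_date, s.src_amount, t.tgt_amount
FROM  (SELECT TRUNC(sale_ts) AS sale_date, SUM(amount) AS src_amount
       FROM   src_sales
       GROUP BY TRUNC(sale_ts)) s
FULL OUTER JOIN
      (SELECT sale_date, SUM(amount) AS tgt_amount
       FROM   fact_sales
       GROUP BY sale_date) t
  ON  s.sale_date = t.sale_date
WHERE s.src_amount IS NULL
   OR t.tgt_amount IS NULL
   OR s.src_amount <> t.tgt_amount;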

Environment:
System testing is performed in the QA environment.

Tools:
- Data access tools (e.g., TOAD, PL/SQL) to analyze the content of tables and the results of loads.
- ETL tools (e.g., Informatica, DataStage).
- A test management tool (e.g., Test Director, QC) that maintains and tracks the requirements, test cases, defects, and traceability matrix.

Data:
Production replicated data

Deliverables:
- Executed System Test Cases, i.e., documented actual results against each test, signed and dated by the tester(s).
- Signed and approved Test Case Index & Results document containing the results of the executed system test scripts.
- Requirements Traceability Matrix.
- A summary report.

User Acceptance Testing:


The objective of this testing is to ensure that the system meets the expectations of the business users. It aims to prove that the entire system operates effectively in a production environment and that the system successfully supports the business processes from a user's perspective. The tests also cover functions that involve source system connectivity, job scheduling, and business report functionality.

Operational Readiness Testing (ORT):


This is the final phase of testing, which focuses on verifying the deployment of the software and the operational readiness of the application.

Deployment Test
- Tests the deployment of the solution.
- Tests the overall technical deployment checklist and timeframes.
- Tests the security aspects of the system, including user authentication and authorization and user-access levels (a sample access check follows this list).
- Tests the operability of the system, including job control and scheduling.
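A minimal sketch of a user-access check, assuming an Oracle environment and a placeholder reporting role named BI_READONLY; the query returns rows only if the role holds object privileges beyond plain SELECT.

-- List object grants held by the reporting role that exceed read-only access.
SELECT grantee, owner, table_name, privilege
FROM   dba_tab_privs
WHERE  grantee = 'BI_READONLY'
AND    privilege <> 'SELECT';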

Regression Testing:
- Performed after a reported defect is fixed by the developer.
- Performed when a Change Request (CR) is implemented on an existing production system.

Inputs:
- Impact analysis workbook prepared by the developer.
- STM developed by the Business Analyst, used for designing the test cases for the CR.
- Test Result Report of SIT, if the Change Request is implemented on an existing production system (a sample baseline comparison is sketched below).
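One common regression check is to compare a baseline snapshot of a target table, taken before the change, with the same table after the regression run; dw_customer and dw_customer_baseline below are hypothetical names used only for illustration.

-- Rows that changed or disappeared relative to the pre-change baseline (expected to be empty for data the CR should not touch).
SELECT * FROM dw_customer_baseline
MINUS
SELECT * FROM dw_customer;

-- Rows newly introduced by the change (review these against the expected impact of the CR).
SELECT * FROM dw_customer
MINUS
SELECT * FROM dw_customer_baseline;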

Test Stop Criteria:


- Deadlines are reached, e.g. release deadlines or testing deadlines.
- Test cases are completed with a certain percentage passed.
- The test budget has been depleted.
- Coverage of code or requirements reaches a specified point.
- The defect rate falls below a certain level.
- Testing becomes unproductive (the number of errors found per person per day drops off).

Thank You

Prepared by: Manoj Tiwari
