
TRANSFORMATIONS


INFORMATICA
************************
Informatica is a GUI (Graphical User Interface) based tool used to perform ETL (Extract, Transform, Load) work.

The widely used versions of Informatica are 5, 6, 7.1, 8.6, and 9.

Architecture of Informatica :

Different Components of Informatica :
1) PowerCenter Administration Console
2) PowerCenter Repository Manager
3) PowerCenter Designer
4) PowerCenter Workflow Manager
5) PowerCenter Workflow Monitor

Different Roles of Informatica : In any Informatica project there are mainly two roles: Developer and Administrator. The Developer creates Mappings and Workflows using the Designer and the Workflow Manager respectively. The Administrator is responsible for all admin activities, such as creating users and groups, granting permissions, and running the workflows.

* List of Transformations * The following transformations are available in the PowerCenter Designer tool :
1) Aggregator Transformation
2) Source Qualifier Transformation
3) Expression Transformation
4) Filter Transformation
5) Joiner Transformation
6) Lookup Transformation
7) Normalizer Transformation
8) Rank Transformation
9) Router Transformation
10) Sequence Generator Transformation
11) Stored Procedure Transformation
12) Sorter Transformation
13) Update Strategy Transformation
14) XML Source Qualifier Transformation
15) Union Transformation
16) Transaction Control Transformation

* POWERCENTER DESIGNER *

It is a Windows-based client and is used to create mappings. To work with the PowerCenter Designer window, we go through the following steps :
1) Creating an ODBC connection to the Source Database
2) Importing Source Metadata
3) Creating an ODBC connection to the Target Database
4) Importing Target Metadata
5) Creating the Mapping
6) Creating Transformations

Step 1 : Creating ODBC connection to the Source Database : Go to Start → click on Control Panel → click on Administrative Tools → click on Data Sources (ODBC). A window will open; in it, click on the System DSN tab as shown below :

Click on the Add button; another window will open. In it, select the data source Oracle in OraDb10g_home1(2) as below :

Click on the Finish button; a window will open. In it provide :
Data Source Name : RRITEC_SOURCE
TNS Service Name : ORCL
User ID : SOURCE_USER

Click on the Test Connection button → click on Ok. A Connection Successful message window then appears as :

Click on Ok → click on Ok. Now the ODBC connection for the Source Database has been created successfully.

Step 2 : Importing Source Metadata : Go to Start → click on RUN → type services.msc in the Run window as below :

Click on Ok; a window will open showing all the services. In it, select Informatica Services 8.6.0 and click on Start; if it is already started, click on the Restart link as below :

Go to Start → All Programs → Informatica PowerCenter 8.6.0 → Client → PowerCenter Designer; the Designer window will open. Right click on our Repository, i.e., rep_05 → click on Connect; a window will open asking for the username and password. Provide the Username and Password as Administrator as shown in the window below :

Click on Connect. Right click on our folder, i.e., the RRITEC folder → click on Open. Click on the Source Analyzer icon on the toolbar, marked by the red circle.

Go to the Sources menu → click on the Import from Database option; a window will open. In it, provide the ODBC data source as RRITEC_SOURCE, and the Username and Password as SOURCE_USER, as shown below :

Click on Connect. Under the Select Tables sub-window, which is marked in brown, select the EMP and DEPT tables → click on Ok → go to the Repository menu → click on Save.

Step 3 : Creating ODBC connection to the Target Database : Open the SQL*Plus tool and log in as / as sysdba. After the SQL*Plus window opens, type :

> CREATE USER TARGET_USER IDENTIFIED BY TARGET_USER;
> GRANT DBA TO TARGET_USER;

Connect or log in to SQL*Plus using the above username and password :

CONN TARGET_USER@ORCL  ----> press enter
Password : TARGET_USER
SELECT * FROM Tab;
CREATE TABLE T_EMP AS SELECT * FROM Scott.Emp WHERE 1=2;

Go to Start → click on Control Panel → click on Administrative Tools → click on Data Sources (ODBC); a window opens as below :

Click on the Add button; a window will open. Select Oracle in OraDb10g_home2 as shown below :

Click on Finish; a new window will open. In it provide :
Data Source Name : RRITEC_TARGET
TNS Service Name : ORCL
User ID : TARGET_USER

Click on Test Connection; a small window will open. Provide :
Service Name : ORCL
User Name : TARGET_USER
Password : TARGET_USER
Click on Ok.

Step 4 : Importing Target Metadata : Go to the Informatica Designer tool → click on the Target Designer icon as :

Go to the Targets menu → click on Import From Database; a new window will open. In it provide :
ODBC data source : RRITEC_TARGET
Username : TARGET_USER
Password : TARGET_USER

Click on Connect; a new window will open. In it, expand the TARGET_USER schema, which is marked in brown below :

Expand TABLES → select the target table and click on Ok → go to the Repository menu and click on Save.

Step 5 : Creating Mapping : In the Designer tool, click on the Mapping Designer icon → go to the Mappings menu → click on Create → give the name as m_First_Mapping as shown below :

Click on Ok. From the Repository navigator, expand the RRITEC folder → expand the Sources tab → drag and drop the EMP table into the workspace of the mapping window → expand the Targets tab → drag and drop the T_EMP table beside the Source Qualifier table (SQ_EMP) in the workspace → map the corresponding ports from SQ_EMP to the target table T_EMP → go to the Repository menu → click on Save.

POWERCENTER WORKFLOW MANAGER

It is useful to create Tasks and Workflows. Creating a workflow involves the following steps :
1) Creating Workflow
2) Creating Connection
3) Creating Task
4) Linking Task and Workflow
5) Running Workflow

1) Creating Workflow : Start → All Programs → Informatica PowerCenter 8.6.0 → Client → click on PowerCenter Workflow Manager; the window will look as

Right click on our Repository, i.e., rep_05 → click on Connect; a new window will open. Provide the Username and Password as Administrator.

Click on Connect → right click on the RRITEC folder → click on Open → click on the Workflow Designer icon as shown below :

Go to the Workflows menu → click on Create → name it First_Workflow :

Click on Ok; the workflow workspace will look as below :

Go to the Repository menu → click on Save.

2) Creating Connection : For the SOURCE : In the Connections menu, click on the Relational option; a new window will open as :

Select Type as ALL → click on the New button; a new window will open. Select Oracle from the list of available options as shown below :

Click on Ok; a new window will open. In it provide :
Name : SOURCE
Username : SOURCE_USER
Password : SOURCE_USER
Connection String : ORCL

as shown below :

Click on the Ok button. Similarly, create one more connection for the Target by the same procedure, but here we need to provide the details as below :
Name : TARGET
Username : TARGET_USER
Password : TARGET_USER
Connection String : ORCL
as shown below :

Click on Ok.

3) Creating Task : In the Workflow Manager window, go to the Tasks menu → click on Create; a new window will open. In it, select the type as Session and the name as First_Session as shown below :

Click on Create → click on Done; a new window will open as :

From the above window, select our mapping, i.e., First_Mapping, from the list of all available mappings in our Repository (it is not visible here because the mapping was not saved while preparing this document) → click on Ok; our workflow window will look as :

Double click on First_Session; a new window will open. In it, click on the Mapping tab; the window will now look as :

From the Navigator of the above window, select SQ_EMP under the Sources tab; the window will change as :

Click on the down arrow mark, which is marked in brown; another window will open as :

In it, we need to select our newly created SOURCE connection, because what we selected in the Navigator above is also a source → click on Ok.

Similarly, select our target table, i.e., T_EMP, click on the down arrow mark, select TARGET, and click on Ok.

4) Link Task and Workflow : Go to the Tasks menu → click on the Link Task option → click on the workflow Start (represented by the green arrow) and drag the cursor to First_Session; the flow will look as below :

Go to the Repository menu → click on Save.

5) Running the Workflow : Right click on the workflow, i.e., the green colored arrow → click on Start Workflow from Task.

TRANSFORMATIONS
********************************
In Informatica, transformations transform the source data according to the requirements of the target system and ensure the quality of the data being loaded into the target.

Transformations are of two types : Active and Passive.
Active Transformation : An active transformation can change the number of rows that pass through it from source to target, i.e., it eliminates rows that do not meet the condition in the transformation.
Passive Transformation : A passive transformation does not change the number of rows that pass through it, i.e., it passes all rows through the transformation.
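In rough SQL terms (a sketch only, not Informatica syntax), an active transformation is like a query that can change the row count, while a passive one returns exactly one output row per input row :

-- Active (row count can change, like a Filter):
SELECT * FROM EMP WHERE Deptno = 10;

-- Passive (row count preserved, like an Expression):
SELECT Empno, Sal, Sal * 0.1 AS Tax FROM EMP;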

Transformations can be Connected or Unconnected.
Connected Transformation : A connected transformation is connected to other transformations or directly to the target table in the mapping.
Unconnected Transformation : An unconnected transformation is not connected to other transformations in the mapping. It is called from within another transformation and returns a value to that transformation.

FILTER TRANSFORMATION
*************************************
It is an Active and Connected type of transformation. A filter condition returns TRUE or FALSE for each row that the Integration Service evaluates, depending on whether the row meets the specified condition. For each row that returns TRUE, the Integration Service passes the row through the transformation. For each row that returns FALSE, the Integration Service drops the row and writes a message to the session log.

For any task we are going to do, it is good practice to first draw a diagram to get a clear idea of the task: a DFD (Data Flow Diagram) or MFD (Mapping Flow Diagram). So here too we draw a sample Data Flow Diagram.

Steps 1, 2 and 3 are the same here also, so they are not repeated; you can do them as before.

Step 4 : Creating and Importing Target Table Metadata :

Keep in mind that if we want to import any table structure into the Designer window to hold the target data, that table structure must first exist in the database. If it is not there, create it first using the procedure below. Go to SQL*Plus, give the username as TARGET_USER, the password as TARGET_USER, and the Host String as ORCL, then click on Ok. After the SQL*Plus window opens, type the following syntax to create the table definition :

CREATE TABLE F_EMP
( Empno Number(10),
  Ename Varchar2(20),
  Job Varchar2(50),
  Sal Number(10,2),
  Comm Number(10,2),
  Deptno Number(10) );

Now open the Informatica Designer window, right click on the Repository (here rep_05) in the Navigator window → click on Connect; a small window will open asking for the Username and Password :

Provide the Username and Password as Administrator → click on Connect. Our folder, where we need to store all our work, is then opened; expand that folder to see the list of options available in it as below :

Now we need to import the target table into the Designer as per the steps below : Click on the Warehouse Designer (Target Designer) icon at the top of the workspace window, circled in red in the diagram below :

Now go to the Targets menu → click on the Import From Database option; a new window will open as below :

In that window, select the ODBC data source as RRITEC_TARGET, the Username as TARGET_USER and the Password as TARGET_USER. Click on Connect and observe the Select Tables sub-window, circled in red; expand that user and you will find the list of tables available under it. Select the required table, here the F_EMP table, and click on OK; the table metadata is then imported into the Designer window under the Target Designer workspace as shown below :

Step 5 : Creating Mapping : Go to the Designer tool and click on the Mapping Designer icon, indicated by the red circle as shown below :

Go to the Mappings menu and click on the Create option; a small window will open. Give the mapping any name; here it is m_FILTER.

Click on Ok. In the Repository Navigator window, expand the RRITEC folder → expand the Sources folder → expand the Source_User tab; there we find our source table. Drag and drop the source table EMP into the workspace window; after the drag is done, the Designer window looks as below :

Whenever we drag a source table into the workspace, another table also appears by default, named SQ_<source_table_name>, here SQ_EMP, where SQ stands for Source Qualifier. Now, to create the transformation, go to the Transformation menu → click on Create; a window will open asking for two values. Select the transformation type as Filter and the transformation name as t_SAL_GREATERTHAN_3000, as shown below :

Click on Create → click on Done. Now the Designer workspace will look as :

Now we need to give the filter condition on this newly created Filter transformation. Select the required fields in the Source Qualifier and drag and drop them from the Source Qualifier table into the Filter transformation table as below :

Now we need to give the condition for the Filter transformation. For that, click on the header of the Filter transformation table; it opens a new window, Edit Transformations. In it, select the Properties tab; the window will look as :

Click on the down arrow mark of the Filter Condition attribute, circled in red in the diagram above; the Expression Editor window will open as below :

In the Expression Editor window, select the Ports tab → double click on SAL → type >= 3000 from the keyboard → click on the Validate button; it shows a validation success message if the syntax and everything is proper → click on Ok → click on Ok → again Ok.

Now expand the Target_User tab; there we find our target table. Drag and drop the target table F_EMP into the workspace window; after the drag is done, the Designer window looks as below :

Now map the required fields from the Filter transformation table to the target table as below :

Go to the Repository tab → click on Save, and observe the saved message, i.e., whether the mapping is Valid or Invalid, in the Output window of the same Designer tool. To check whether the records are loaded into the target table after executing the workflow, go to SQL*Plus and query the table as :

> SELECT * FROM F_EMP;

We get only those records that meet the condition specified in the Filter transformation.

Filtering Rows with NULL values : To filter rows containing null values or spaces, use the ISNULL and IS_SPACES functions to test the value of the port. For example, if you want to filter out rows that contain a NULL value in the FIRST_NAME port, use the following condition :

IIF ( ISNULL ( FIRST_NAME ) , FALSE , TRUE )

This condition states that if the FIRST_NAME port is NULL, the return value is FALSE and the row should be discarded. Otherwise, the row passes through to the next transformation.

Limitations of Filter Transformation :
i) Using a Filter transformation, we can load only one target table at a time.
ii) If we have five different conditions, then we need to create five different Filter transformations.

iii) We cannot capture the false (unsatisfied) records of the condition in a table.

NOTE :
1) Use the Filter transformation early in the mapping : To maximize session performance, keep the Filter transformation as close as possible to the sources in the mapping. Rather than passing rows that you plan to discard through the mapping, you can filter out unwanted data early in the flow of data from sources to targets.
2) Use the Source Qualifier transformation to filter : The Source Qualifier transformation provides an alternate way to filter rows. Rather than filtering rows from within a mapping, the Source Qualifier transformation filters rows when read from a source. The main difference is that the Source Qualifier limits the row set extracted from a source, while the Filter transformation limits the row set sent to a target. Since a Source Qualifier reduces the number of rows used throughout the mapping, it provides better performance. However, the Source Qualifier transformation only lets you filter rows from relational sources, while the Filter transformation filters rows from any type of source. Also note that since it runs in the database, you must make sure that the filter condition in the Source Qualifier transformation uses only standard SQL. The Filter transformation can define a condition using any statement or transformation function that returns either a TRUE or FALSE value. A short SQL sketch of this difference follows below.
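For illustration — a minimal sketch in Oracle SQL, using the EMP source and the SAL >= 3000 condition from this example. Filtering in the Source Qualifier amounts to a WHERE clause that runs in the database, so unwanted rows never enter the mapping :

SELECT Empno, Ename, Job, Sal, Comm, Deptno
FROM   EMP
WHERE  Sal >= 3000;

A Filter transformation applies the same test inside the mapping instead, after all EMP rows have already been read from the source.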

EXPRESSION TRANSFORMATION
*************************************
It is a Passive and Connected type of transformation. Use the Expression transformation to perform non-aggregate, row-level calculations (aggregate calculations belong in the Aggregator transformation). It is best practice to place one Expression transformation after the Source Qualifier transformation and one Expression transformation before the target table: if in future we add a new column to the target that is independent of the source table, we would otherwise need to disturb the whole mapping, whereas with this practice it is enough to change the Expression transformation in front of the target table. The Expression transformation is useful to derive new columns from existing columns.
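For intuition, the two columns derived in the mapping that follows (Tax and Total_Sal) correspond to this SQL sketch, with Oracle's NVL standing in for IIF(IsNull(Comm), 0, Comm) :

SELECT Empno, Ename, Job, Sal, Comm,
       Sal * 0.1          AS Tax,
       Sal + NVL(Comm, 0) AS Total_Sal
FROM   EMP;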

Mapping Flow Diagram :

EMP (source) → SQ_EMP → EXPRESSION T/F → EXP_EMP (target)
  Tax = Sal * 0.1
  Total = Sal + IIF(IsNull(Comm), 0, Comm)

( here the target table has the fields EmpNo, Ename, Job, Sal, Comm, Tax, Total_Sal )

Steps 1, 2 and 3 are the same here also, so you can do them as before.

Step 4 : Creating the target table and importing its metadata into the Designer tool : Open SQL*Plus and type the following to create the table (Number(p,s) stands for a precision and scale of your choice) :

CREATE TABLE EXP_EMP
( EmpNo Number,
  Ename Varchar2(30),
  Job Varchar2(20),
  Sal Number(p,s),
  Comm Number(p,s),
  Tax Number(p,s),
  Total_Sal Number(p,s) );

Go to the Targets menu → click on the Import from Database option; a new window will open. In it, provide the target ODBC connection as RRITEC_TARGET, the Username as TARGET_USER and the Password as TARGET_USER, as below :

Click on Connect; the same window will now look slightly different. Expand the Target_User tab and the Tables tab, which are marked in brown, select our target table, i.e., the EXP_EMP table, and click on Ok, as below :

Step 5 : Creating Mapping : Click on the Mapping Designer icon on the toolbar → go to the Mappings menu → click on the Create option and give the mapping name as m_EXP_EMP as shown below :

Click on Ok. Now drag and drop the source and target tables into the mapping workspace as below :

Go to the Transformations menu → click on Create; a new window appears. In it, select the type as Expression and give the name as T_Tax_And_TotSal as below :

Click on Create → click on Done → drag and drop the required columns (EmpNo, Ename, Job, Sal, Comm) from SQ_EMP to the T_Tax_And_TotSal transformation table as shown below :

Double click on the header of the Expression transformation; a new window appears. In it, click on the Ports tab; the window will look as :

Select the last port, i.e., Comm → click twice on the Add a New Port icon, which is marked in brown, to add two new ports as below :

Rename the two new ports Tax and TotSal, set both datatypes to Decimal, and disable the Input port for both fields as :

Click on the Expression icon, marked in brown above; a new window appears where you need to write the formula for the Tax column as :

Click on the Validate button; it will give a validation status message indicating whether there are any syntax errors :

Click on Ok → click on Ok. Similarly, click on the expression down arrow mark corresponding to TotSal and type the expression below :

Click on Validate to check whether there are any syntax errors :

Click on Ok → click on Ok → click on Ok. Now connect the corresponding ports from the Expression transformation table to the target table as shown below :

The remaining process — creating the session, creating the workflow and running the workflow from the task — is the same here also, so you can do that.

Note : In the Session Properties window, make sure to change the Load Type from the default option, i.e., Bulk, to Normal. After the workflow has run successfully, go to the target table and observe whether the data is properly populated or not!

Performance Tuning : As we know, there are three types of ports: Input, Output, and Variable.
Variable ---> this port is used to store any temporary calculation.
1) Use operators instead of functions.
2) Minimize the usage of string functions.
3) If we use a complex expression multiple times in the Expression transformation, make that expression a variable; then we need to use only this variable for all computations.

RANK TRANSFORMATION
*******************************
It is an Active and Connected type of transformation. The Rank transformation differs from the transformation functions MAX and MIN in that it lets you select a group of top or bottom values, not just one value. You can designate only one Rank port in a Rank transformation. The Designer creates a RANKINDEX port for each Rank transformation automatically. The Integration Service uses the Rank Index port to store the ranking position of each row in a group. For example, if you create a Rank transformation that ranks the top five salespersons for each quarter, the rank index numbers the salespeople from 1 to 5 :

RANKINDEX   SALES_PERSON   SALES
    1       Sam            10,000
    2       Mary            9,000
    3       Alice           8,000
    4       Ron             7,000
    5       Alex            6,000

The RANKINDEX is an output port only. You can pass the rank index to another transformation in the mapping or directly to a target. We cannot edit or delete the RANKINDEX port.

Rank Caches : there are two types of caches here: the Rank Index Cache and the Rank Data Cache.
Rank Index Cache : The index cache holds group information from the group-by ports. If we are using Group By on DEPTNO, then this cache stores values 10, 20, 30, etc. All group-by columns are in the rank index cache, e.g., DEPTNO.
Rank Data Cache : It holds row data until the PowerCenter Server completes the ranking and is generally larger than the index cache. To reduce the data cache size, connect only the necessary input/output ports to subsequent transformations.
Note : All variable ports (if any), the rank port, and all ports going out of the Rank transformation are stored in the rank data cache.

Mapping Flow Diagram :

EMP (source) → SQ_EMP → RANK T/F (top 3 ranks) → RANK_EMP (target)

( the target table Rank_Emp has the fields EmpNo, Ename, Sal, DeptNo, Rank )
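As a rough SQL analogy (a sketch only, not what the mapping generates), the top-3 ranking built below corresponds to Oracle's analytic RANK() function :

SELECT EmpNo, Sal, DeptNo, RankIndex
FROM ( SELECT EmpNo, Sal, DeptNo,
              RANK() OVER (ORDER BY Sal DESC) AS RankIndex
       FROM   EMP )
WHERE  RankIndex <= 3;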

Steps 1, 2 and 3 are the same here also, so you can reuse them.
Step 4 : Create and import the target table as per the above mapping flow diagram using SQL*Plus.
Step 5 : Creating Mapping : Go to the Designer tool → click on the Mapping Designer icon → go to the Mappings menu → click on Create; a new window will open. Give the mapping name there as :

Click on Ok → drag and drop the source EMP table into the mapping workspace as shown below :

Go to the Transformations menu → click on Create → select the type as Rank and name it T_RANK as shown below :

Click on Create → click on Done → from the Source Qualifier, drag and drop EmpNo, Sal, DeptNo into the Rank transformation table, i.e., T_RANK, as shown below :

Double click on the header of the Rank transformation table; a new window will open. In it, click on the Ports tab → enable the Rank checkbox, i.e., R[], of the SAL port (because we are ranking based on salary), which is marked in brown as shown below :

Click on the Properties tab in the same window → set the Top/Bottom attribute to Top → provide the Number of Ranks option as 3; the window is shown below :

Click on Ok → drag and drop the target table, i.e., RANK_EMP, into the workspace → connect RANKINDEX of the Rank transformation table to the Rank field in the target table, and connect all the remaining ports from T_RANK to the corresponding ports of the target table as shown below :

The remaining process of creating the session, creating the workflow and running the workflow from the task is the same here also. After the workflow has run successfully, go to the target table and observe whether the Rank is correctly populated or not.

ROUTER TRANSFORMATION
************************************
It is an Active and Connected transformation. The Router transformation is useful to load multiple targets and handle multiple conditions. It is similar to the Filter transformation; the only difference is that the Filter transformation drops the data that does not meet the condition, whereas the Router has an option to capture the data that does not meet the condition. It is useful to test multiple conditions. It has input, output, and default groups. If you need to test the same input data against multiple conditions, use a Router transformation in a mapping instead of creating multiple Filter transformations to perform the same task. When you use a Router transformation, the Integration Service processes the incoming data only once; when you use multiple Filter transformations, the Integration Service processes the incoming data once for each transformation, which affects performance.

Configuring Router Transformation : The Router transformation has input and output groups, which you need to configure.
1) Input group : The Designer copies the input port properties to create a set of output ports for each output group.
2) Output groups : The Router transformation has two kinds of output groups: user-defined groups and the default group.
2-i) User-defined groups : Create a user-defined group to test a condition based on the incoming data. Each user-defined group consists of output ports and a group filter condition. You can create or modify the user-defined groups on the Groups tab. Create one user-defined group for each condition you want to specify.
2-ii) Default group : The Designer creates one default group when you create a new user-defined group. You cannot edit or delete the default group, and it does not have a group filter condition. If all the conditions evaluate to FALSE, the Integration Service passes the row to the default group.

Specifying the Group Filter Condition : Specify the group filter condition on the Groups tab using the Expression Editor. You can enter any expression that returns a single value. The group filter condition returns TRUE or FALSE for each row that passes through the transformation.

Mapping Flow Diagram :

                                            → R_SAL_UPTO_2500 ( Target-1 )
EMP (Source) → SQ_EMP (Source Qualifier) → ROUTER T/F
                                            → R_SAL_AFTER_2500 ( Target-2 )

( Both target tables have the fields EmpNo, Ename, Job, Sal, DeptNo )

Create the two tables R_SAL_UPTO_2500 and R_SAL_AFTER_2500 from SQL*Plus as :

CREATE TABLE R_SAL_UPTO_2500
( Eno Number(5),
  Ename Varchar2(10),
  Job Varchar2(10),
  Sal Number(6,2),
  DeptNo Number(10,2) );

CREATE TABLE R_SAL_AFTER_2500
( Eno Number(5),
  Ename Varchar2(10),
  Job Varchar2(10),
  Sal Number(6,2),
  DeptNo Number(10,2) );

Creating Mapping : Import the source and target tables into the workspace window. Go to the Mapping Designer workspace → go to the Mappings menu → click on Create; the window below will open. Give the mapping name as m_ROUTER as below :

Click on Ok → drag and drop the source and the two target tables into the workspace; the mapping window will look as below :

Now go to the Transformation menu → click on Create; a new window will open as below :

Click on Create → click on Done → copy and drag all the columns from the Source Qualifier table to the Router transformation table as per below :

Observe that in the Router transformation we get one INPUT, a NEWGROUP1 and a DEFAULT label, marked by the red rectangle above. Double click on the header of the Router transformation; a new window, Edit Transformations, will open. In it, select the Groups tab, marked by the red rectangle → click on the Add a New Group to this Router icon, also marked in red, as shown below :

After clicking on that Add a New Group to this Router icon, the window will look as below :

Rename NEWGROUP1 to SAL_UPTO_2500 → under the Group Filter Condition field, click on the down arrow mark, circled in red; a new window, the Expression Editor, opens as below :

Delete the default TRUE from the workspace → go to the Ports tab → double click on the SAL field → type <= 2500 from the keyboard → click on Ok → click on Ok → click on Ok. Now connect the fields of the SAL_UPTO_2500 group to one target table and the fields of the DEFAULT1 group to the other target table as shown below :

The remaining process of creating the task and creating the workflow is the same here also.

After the workflow has run and succeeded, go to the database and query the two target tables to test the data: one table should store the details of all employees whose salary is up to 2500, and the other the details of all employees whose salary is above 2500.

Advantages of Using Router over Filter Transformation : Use a Router transformation to test multiple conditions on the same input data. If you use more than one Filter transformation, the Integration Service needs to process the input once for each Filter transformation. With a Router transformation, the Integration Service processes the input data only once, thereby improving performance.
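As a rough SQL sketch of what the two router groups load (note that in the mapping, rows whose salary is NULL also fall to the default group, which a plain Sal > 2500 predicate would exclude) :

INSERT INTO R_SAL_UPTO_2500
  SELECT Empno, Ename, Job, Sal, Deptno FROM EMP WHERE Sal <= 2500;

INSERT INTO R_SAL_AFTER_2500
  SELECT Empno, Ename, Job, Sal, Deptno FROM EMP WHERE Sal > 2500;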

SORTER TRANSFORMATION
**************************************
It is an Active and Connected type of transformation. It is useful to sort the data according to the specified sort key, in ascending or descending order based on the requirement. You can specify multiple columns as sort keys and the order (ascending or descending) for each column.

Properties :
Distinct Output Rows --- Outputs distinct records if you check this option. If we enable this option, it will obviously not return all the rows present in the input, so in this case the Sorter transformation acts as an active transformation.
Null Treated Low --- You can configure the way the Sorter transformation treats null values. Enable this property if you want the Integration Service to treat null values as lower than any other value when it performs the sort operation. Disable this option if you want the Integration Service to treat null values as higher than any other value.
Case Sensitive --- The Case Sensitive property determines whether the Integration Service considers case when sorting data. When you enable the Case Sensitive property, the Integration Service sorts uppercase characters higher than lowercase characters.
Work Directory --- You must specify a work directory that the Integration Service uses to create temporary files while it sorts data. After the Integration Service sorts the data, it deletes the temporary files. You can specify any directory on the Integration Service machine to use as a work directory. By default, the Integration Service uses the value specified for the $PMTempDir process variable.
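In SQL terms, a Sorter keyed on DEPTNO corresponds roughly to the sketches below; with Distinct Output Rows enabled it behaves like SELECT DISTINCT and can drop rows :

SELECT * FROM EMP ORDER BY Deptno;            -- plain sort (row count preserved)

SELECT DISTINCT * FROM EMP ORDER BY Deptno;   -- Distinct Output Rows (active behaviour)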

Sorter Cache Size : It is the maximum amount of memory used for the sort. In version 8 it is set to Auto; in version 7 it is 8 MB. If the Integration Service cannot allocate enough memory, the session fails. If it needs more memory, it pages the data to the Work Directory and writes a warning in the log file.

Better Performance : Sort the data using a Sorter transformation before passing it into an Aggregator or Joiner transformation. As the data is sorted, the Integration Service uses memory to do the aggregate and join operations and does not use cache files to process the data.

Mapping Flow Diagram :

EMP (source) → SQ_EMP → SORTER T/F (based on DeptNo) → S_EMP (target)

( Here the target table has the same fields as the source table )

Create the target table based on the mapping flow diagram with the name S_EMP as below :

CREATE TABLE S_EMP AS SELECT * FROM EMP WHERE 1=2 ;

This is nothing but the same structure as the source table, i.e., EMP.

Creating Mapping : In the Mapping Designer, go to the Mappings menu → click on Create; a small window will open as below :

Click on Ok → drag and drop the source and target tables into the workspace window → go to the Transformation menu → click on Create; a small window will open as :

Click on Create → click on Done; the Mapping Designer window will look as :

Now select and drop all the columns from the Source Qualifier transformation table to the Sorter transformation table; the mapping looks as below :

Now double click on the header of the Sorter transformation → click on the Ports tab → enable the checkbox under Key corresponding to DEPTNO, which is marked by the red rectangle as shown below :

Click on Ok → click on Save from the Repository tab. Next, creating the session and creating the workflow is the same for this also. After running the workflow, go to the backend and check the data.

Performance Tuning :
1) While using the Sorter transformation, configure the sorter cache size to be larger than the input data size.
2) At the Sorter transformation, use hash auto-keys partitioning or hash user-keys partitioning.

AGGREGATOR TRANSFORMATION
******************************************
It is an Active and Connected type of transformation. It is used to perform calculations such as sums, averages, counts, etc., on groups of data. The Integration Service stores the group data and row data in the aggregate cache. The Aggregator transformation provides more advantages than plain SQL: you can use conditional clauses to filter rows.

Components and Options of the Aggregator Transformation :
Aggregate Cache : The Integration Service stores data in the aggregate cache until it completes the aggregate calculations. It stores group values in an index cache and row data in the data cache.
Aggregate Expression : Enter an expression in an output port or variable port. The expression can include non-aggregate expressions and conditional clauses.
Group By Port : This tells the Integration Service how to create groups. You can configure input, input/output or variable ports for the group. The Integration Service performs the aggregate calculations and produces one row for each group. If you do not specify any group-by ports, the Integration Service returns one row for all input rows. By default, the Integration Service returns the last row received for each group along with the result of the aggregation. By using the FIRST function, you can make the Integration Service return the first row of the group.
Sorted Input : This option can be used to improve session performance. You can use it only when the input to the Aggregator transformation is sorted on the group-by ports.
Note : By default, the Integration Service treats null values as NULL in aggregate functions. You can change this by configuring the Integration Service.
When you run a session that uses an Aggregator transformation, the Integration Service creates an index cache and a data cache in memory to process the transformation. If the Integration Service requires more space, it stores the overflow values in cache files.
Aggregator Index Cache : The value of the index cache can be increased up to 24 MB. The index cache holds group information from the group-by ports. If we are using Group By on DEPTNO, then this cache stores values 10, 20, 30, etc.

Data Cache : The data cache is generally larger than the aggregator index cache. Columns in the data cache :
i) Variable ports, if any
ii) Non-group-by input/output ports
iii) Non-group-by input ports used in non-aggregate output expressions
iv) Ports containing aggregate functions
It can also include one aggregate function nested within another aggregate function, such as : MAX ( COUNT ( ITEM ) )

Nested Aggregate Functions : You can include multiple single-level or multiple nested functions in different output ports in an Aggregator transformation. However, you cannot include both single-level and nested functions in the same Aggregator transformation. Therefore, if an Aggregator transformation contains a single-level function in any output port, you cannot use a nested function in any other port of that transformation; the Designer marks the mapping or mapplet invalid. If you need both single-level and nested functions, create separate Aggregator transformations.

Null Values in Aggregate Functions : When you configure the Integration Service, you can choose how you want it to handle null values in aggregate functions: as NULL or as zero. By default, the Integration Service treats null values as NULL in aggregate functions.

Mapping Flow Diagram :

EMP (source) → SQ_EMP → SORTER T/F → AGGREGATOR T/F → AGG_EMP (target)

Best Practice : The group-by ports in the Aggregator transformation should be kept in the same order as the sort-key positions in the Sorter transformation.

The Aggregator transformation supports a conditional sum, i.e., SUM( Sal, Sal < 3000 ), which means: sum all the salaries below the 3000 margin. Only two-level nested aggregate functions are supported, e.g., MAX ( COUNT ( ITEM ) ) is valid, whereas MIN ( MAX ( COUNT ( ITEM ) ) ) is not.

Process : Create the target table as per the mapping flow diagram using SQL*Plus as below :

CREATE TABLE AGG_EMP
( DEPTNO NUMBER,
  DEPT_WISE_SAL NUMBER );

and import this table into the Informatica Designer tool as a target. Go to the Mappings menu → click on Create; a small window opens as :

Click on Ok → drag and drop the source and target tables into the workspace as shown below :

Go to the Transformation menu → click on Create; a small window will open as :

Click on Create → click on Done. Drag and drop DeptNo and Sal from the Source Qualifier transformation table into the Sorter transformation as shown below :

Now double click on the header of the Sorter transformation; the Edit Transformations window will open. Click on the Ports tab as shown below :

Enable the checkbox under K corresponding to the DeptNo port name, which is marked in red → click on Ok. Go to the Transformation menu → click on Create; a small window will open as :

Click on Create → click on Done → copy and drag DeptNo and Sal from the Sorter transformation to the Aggregator transformation as shown below :

Double click on the Aggregator transformation → click on the Ports tab; the opened window looks as below. Enable the checkbox under the Group By column corresponding to the DeptNo port name as shown above → click on Ok.

Select the Sal column in the above figure → click on the Add New Port icon, which is marked by the green circle; a new column is added under the Port Name field. Rename it DEPT_WISE_SAL → disable the Input checkbox for the newly created field → under the Expression field, type SUM(Sal) for the same field, as below :

Now go to the Properties tab in the same window → enable the checkbox corresponding to the Sorted Input field, which is marked in red.

Click on Ok. Now connect DeptNo to DeptNo and Dept_Wise_Sal to Dept_Wise_Sal from the Aggregator transformation to the target table as shown below :

Scenario 1 : Load the target table using the conditional SUM feature for the same mapping above. Solution : All the steps remain the same in this scenario; the only change needed is to go to Edit Transformations of the above Aggregator transformation, and under the

Expression field for the DEPT_WISE_SAL port, type SUM( Sal, Sal < 2500 ), which is marked by the red rectangle as shown below :

The remaining process of creating the session and creating the workflow is the same. Now go to the backend and check the target table data by querying :

SELECT * FROM AGG_EMP;

We get a result such as :

Deptno   Dept_Wise_Sal
------   -------------
10       1250
20       3480

That means it sums, department-wise, only those salaries that are less than 2500.

Performance Tuning Tips :
1) Use sorted input to decrease the use of aggregate caches : Sorted input reduces the amount of data cached during the session and improves session performance. Use this option with the Sorter transformation to pass sorted data to the Aggregator transformation.
2) Limit connected input/output or output ports : Limit the number of connected input/output or output ports to reduce the amount of data the Aggregator transformation stores in the data cache.

3) Filter the data before aggregating it : If you use a Filter transformation in the mapping, place the transformation before the Aggregator transformation to reduce unnecessary aggregation. A SQL sketch of the conditional sum used above follows below.
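For reference, the conditional sum SUM( Sal, Sal < 2500 ) grouped by DeptNo corresponds roughly to this Oracle SQL (a sketch only) :

SELECT   Deptno,
         SUM(CASE WHEN Sal < 2500 THEN Sal ELSE 0 END) AS Dept_Wise_Sal
FROM     EMP
GROUP BY Deptno;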

LOOK-UP TRANSFORMATION
**************************************
It is a Passive transformation and may be Connected or Unconnected. A connected Lookup transformation receives source data, performs a lookup, and returns data to the pipeline. Cache the lookup source to improve performance. If you cache the lookup source, you can use a dynamic or static cache. By default, the lookup cache remains static and does not change during the session. With a dynamic cache, the Integration Service inserts or updates rows in the cache. When you cache the target table as the lookup source, you can look up values in the cache to determine if the values exist in the target. The Lookup transformation marks rows to insert or update the target.

Connected Look-Up Transformation : The following steps describe how the Integration Service processes a connected Lookup transformation :
1) A connected Lookup transformation receives input values directly from another transformation in the pipeline.
2) For each input row, the Integration Service queries the lookup source or cache based on the lookup ports and the condition in the transformation.
3) If the transformation is uncached or uses a static cache, the Integration Service returns values from the lookup query.
4) If the transformation uses a dynamic cache, the Integration Service inserts the row into the cache when it does not find the row in the cache. When the Integration Service finds the row in the cache, it updates the row in the cache or leaves it unchanged. It flags the row as insert, update, or no change.
5) The Integration Service passes return values from the query to the next transformation. If the transformation uses a dynamic cache, you can pass rows to a Filter or Router transformation to filter new rows to the target.
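Conceptually, the connected lookup built below enriches each EMP row with DNAME from DEPT, much like an outer join in SQL (a sketch only; the lookup itself runs row by row against the lookup cache) :

SELECT e.Deptno, d.Dname, e.Empno, e.Sal
FROM   EMP e
LEFT OUTER JOIN DEPT d ON e.Deptno = d.Deptno;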

Mapping Flow Diagram :

EMP (source table) → SQ_EMP → Look-Up Transformation → LKP_EMP (target table)

( the target table has the fields DeptNo, Dname, EmpNo, Sal )

Create the target table as per the mapping flow diagram. Until now we have created target tables using SQL*Plus, but now we will see how to create the table from the Designer itself, as below : Go to the Designer tool → click on the Target Designer icon on the toolbar as

Go to the Targets menu → click on Create; a small window opens as below :

Click on Create → click on Done. An empty table then appears in the workspace as below :

Double click on the header of that empty table; another window will open, as below :

In that window, go to the Columns tab → click on the Add New Column icon, which is marked by the red symbol. Here our target table has 4 ports, so click 4 times on that red-marked icon; the window will then look as below :

Rename the column names as per our target table in the mapping flow diagram and select the corresponding datatypes; the final window will look as :

Click on Ok; the target table appears in the workspace as below :

Now the source and target tables are ready and imported into the workspace of the Designer tool. Go to the Mappings menu → click on Create and give the name as in the window below :

Click on Ok → drag and drop the source and target tables into the workspace of the mapping window as below :

Go to the Transformation menu → click on Create; a window will open. In it, select the type as Lookup and the name as T_LKP as :

Click on Create; another new window will open as below :

In the above window, click on the Import button; two options will pop up as shown below :

Of those two options, click on the From Relational Table option; it opens the Import Tables window as shown below. In it, provide the ODBC connection details for the Source as below and click on Connect → under the Select Tables sub-block, select the DEPT table and click on Ok.

Click on Done in the above Create Transformation window. Now the mapping window will look as :

Connect DeptNo, EmpNo, Sal from SQ_EMP to the target table, i.e., LKP_EMP → drag and drop DeptNo from SQ_EMP to the T_LKP table, and connect Dname from T_LKP to the LKP_EMP table as shown below :

Double click on the header of the T_LKP table; a new window will open → click on the Condition tab → click on the Add a New Condition icon, marked by the red circle → make sure that the Lookup Table Column is DEPTNO and the Transformation Port is DEPTNO1, as :

click on Ok.

Now connect DNAME from the lookup (T_LKP) to the target (LKP_EMP) as below :

The remaining process of creating the session and creating the workflow is the same, but on the session's properties tab we need to map the Lookup transformation to the SOURCE connection, similar to how the source table is mapped to the SOURCE connection and the target table to the TARGET connection.

Note : This is the only transformation for which we need to map the transformation table to a database connection in the session properties window. That means only the Look-Up transformation interacts and talks to the database directly; no other transformation does this.

UNCONNECTED LOOK-UP TRANSFORMATION
******************************************************
An unconnected Lookup transformation is not connected to a source or target. A transformation in the pipeline calls the Lookup transformation with a :LKP expression. The unconnected Lookup transformation returns one column to the calling transformation. The following steps describe the way the Integration Service processes an unconnected Lookup transformation :
1) An unconnected Lookup transformation receives input values from the result of a :LKP expression in another transformation, such as an Update Strategy transformation.
2) The Integration Service queries the lookup source or cache based on the lookup ports and condition in the transformation.

3) The Integration Service returns one value into the return port of the Lookup transformation.
4) The Lookup transformation passes the return value into the :LKP expression.

Mapping Flow Diagram :

EMP (source table) → SQ_EMP → Expression Transformation → LKP_EMP (target table)
                                      |
                        Unconnected Look-Up Transformation (called via :LKP)

( the target table LKP_EMP has the fields DeptNo, Dname, EmpNo, Sal )
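Conceptually, calling an unconnected lookup from an expression, once per row, is like a scalar subquery in SQL — it returns a single value for each calling row (a sketch only) :

SELECT e.Deptno,
       ( SELECT d.Dname FROM DEPT d
         WHERE  d.Deptno = e.Deptno ) AS Dname,
       e.Empno, e.Sal
FROM   EMP e;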

The unconnected lookup is not connected in the mapping pipeline; it can be called from any transformation and it supports expressions. Using an unconnected lookup, we get only one output port. Create and import the target table into the Target Designer window as per the mapping flow diagram (exactly the same procedure as above) → go to the Mappings menu → click on Create → name it m_UnConnected_Lookup → click on Ok.

Drag and drop the source EMP table into the workspace of the mapping window. Go to the Transformation menu → click on Create → select Expression as the transformation type and name it T_EXP → click on Create → click on Done, as below :

From the SQ_EMP table, drag and drop DeptNo, EmpNo, Sal into the Expression transformation (T_EXP) as below :

Double click on the Expression transformation → click on the Ports tab → select the DeptNo column → click on the Add New Port icon → name it DNAME → disable the Input (I) port → click on Ok, as shown below :

Connect all the columns from the Expression transformation table to the target table as below :

Go to the Transformation menu → click on Create → select Look-Up as the transformation type and name it T_LookUp as below :

Click on Create; a new window will open as below :

Click on the Source button → select the DEPT table from the sub-window → click on Ok (if the DEPT table is not listed here, we need to import it from the database first) → click on Ok → click on Done.

Double click on the Look-Up transformation → click on the Ports tab → click on Add New Port → name it IN_DEPTNO with datatype Decimal → disable the Output (O) port as below :

In the same window, click on the Condition tab → click on the Add a New Condition icon → make sure that the Lookup Table Column is DEPTNO and the Transformation Port is IN_DEPTNO → also designate DNAME as the Return port (R) on the Ports tab, so the lookup knows which value to return → click on Ok.

Double click on the Expression transformation → click on the Dname expression icon as below and provide the expression as :

:LKP.T_LOOKUP( DeptNo )

Click on Ok → again click on Ok → save it.
Note : If we have more than one mapping flow, we can call the same unconnected look-up again.

Performance Tuning Tips :
1) Add an index to the columns used in a lookup condition : If you have privileges to modify the database containing a lookup table, you can improve performance for both cached and uncached lookups. This is important for very large lookup tables. Since the Integration Service needs to query, sort, and compare values in these columns, the index needs to include every column used in a lookup condition.
2) Place conditions with an equality operator (=) first : If you include more than one lookup condition, place the conditions in the following order to optimize lookup performance :
Equal to (=)
Less than (<), greater than (>), less than or equal to (<=), greater than or equal to (>=)
Not equal to (!=)
3) Cache small lookup tables : Improve session performance by caching small lookup tables. The result of the lookup query and processing is the same, whether or not you cache the lookup table.
4) Join tables in the database : If the lookup table is on the same database as the source table in the mapping and caching is not feasible, join the tables in the source database rather than using a Lookup transformation.
5) Use a persistent lookup cache for static lookups : If the lookup source does not change between sessions, configure the Lookup transformation to use a persistent lookup cache. The Integration Service then saves and reuses cache files from session to session, eliminating the time required to read the lookup source.

6) Call unconnected Lookup transformations with the :LKP reference qualifier : When you write an expression using the :LKP reference qualifier, you call unconnected Lookup transformations only. If you try to call a connected Lookup transformation, the Designer displays an error and marks the mapping invalid.
7) Configure a pipeline Lookup transformation to improve performance when processing a relational or flat-file lookup source : You can create partitions to process a relational or flat-file lookup source when you define the lookup source as a source qualifier. Configure a non-reusable pipeline Lookup transformation and create partitions in the partial pipeline that processes the lookup source.

SEQUENCE GENERATOR TRANSFORMATION
*******************************************************

OLTP Data :
Enum   Location   Year
101    Hyd        2002
101    Chennai    2006
101    Banglore   2011

OLAP Data :
W_ID   Enum   Location   Year
1      101    Hyd        2002
2      101    Chennai    2006
3      101    Banglore   2011

In the first table, Enum acts as the primary key, whereas in the second table the primary key is W_ID, the warehouse id, which is also called a surrogate key.
Surrogate Key : the key that acts as the primary key in the data warehouse. The Sequence Generator is useful to generate surrogate keys; it produces a running sequence of numbers to use as surrogate key values.

It is a Passive and Connected type of transformation. It contains two output ports that you can connect to one or more transformations. The Integration Service generates a block of sequence numbers each time a block of rows enters a connected transformation. If you connect CURRVAL, the Integration Service processes one row in each block. When NEXTVAL is connected to the input port of another transformation, the Integration Service generates a sequence of numbers. When CURRVAL is connected to

the input port of another transformation, the Integration Service generates the NEXTVAL value plus the Increment By value. You can make a Sequence Generator reusable and use it in multiple mappings. You might reuse a Sequence Generator when you perform multiple loads to a single target. For example, if you have a large input file that you separate into three sessions running in parallel, use a Sequence Generator to generate the primary key values; if you use different Sequence Generators, the Integration Service might generate duplicate key values. Instead, use a reusable Sequence Generator for all three sessions to provide a unique value for each target row.

You can complete the following tasks with a Sequence Generator transformation :
1) Create keys
2) Replace missing values
3) Cycle through a sequential range of numbers

Creating keys is the easier task; we will see it in the mapping below.
Replace Missing Values : Use the Sequence Generator transformation to replace missing keys by using NEXTVAL with the IIF and ISNULL functions. For example, to replace null values in the ORDER_NO column, you create a Sequence Generator transformation with the appropriate properties and drag the NEXTVAL port to an Expression transformation. In the Expression transformation, drag the ORDER_NO port into the transformation along with any other necessary ports. Then create an output port, ALL_ORDERS. Under ALL_ORDERS, you can then enter the following expression to replace null orders :

IIF ( ISNULL ( ORDER_NO ), NEXTVAL, ORDER_NO )

The Sequence Generator transformation has two output ports: NEXTVAL and CURRVAL. You cannot edit or delete these ports. Likewise, you cannot add ports to the transformation.

Mapping Flow Diagram :

EMP (Source Table) → SQ_EMP → Sequence Generator T/F → SEQ_EMP (Target Table)

( Fields of the target table : Row_Id, EmpNo, Ename, Sal )

Create the target table SEQ_EMP using the known technique: go to SQL*Plus and create the target table as below :

> CREATE TABLE SEQ_EMP
  ( Row_Id NUMBER,
    EmpNo NUMBER,
    Ename VARCHAR2(20),
    Sal NUMBER );

Import that target table into the Designer tool → go to the Mappings menu → click on Create and name the mapping as below :

Click on Ok → drag and drop the source and target tables into the workspace of the mapping window as below :

Connect EmpNo and Sal from SQ_EMP to the target SEQ_EMP table as below :

Click on the Transformation menu → click on Create → select the type as Sequence Generator and name it t_Seq as below :

Click on Create → click on Done; the mapping workspace will look as :

Connect NextVal of the transformation table to Row_Id of the target table as below :

Save the mapping. The remaining process of creating the session and creating the workflow is the same here also; execute the workflow and observe the target table.
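For comparison, the same surrogate-key population could be done directly in the database with an Oracle sequence (a sketch; w_id_seq is a hypothetical name — the mapping above does this inside Informatica instead) :

CREATE SEQUENCE w_id_seq START WITH 1 INCREMENT BY 1;

INSERT INTO SEQ_EMP (Row_Id, EmpNo, Ename, Sal)
SELECT w_id_seq.NEXTVAL, EmpNo, Ename, Sal FROM EMP;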

JOINER TRANSFORMATION
*************************************
It is an Active and Connected type of transformation. Use the Joiner transformation to join source data from two related heterogeneous sources residing in different locations or file systems. You can also join data from the same source. The Joiner transformation joins sources with at least one matching column.
Note : The Joiner transformation does not match null values. For example, if both EMP_ID1 and EMP_ID2 contain a row with a null value, the Integration Service does not consider them a match and does not join the two rows. To join rows with null values, replace the null input with default values, and then join on the default values.

The Joiner transformation supports the following types of joins :
1) Normal
2) Master Outer
3) Detail Outer
4) Full Outer

Note: A normal or master outer join performs faster than a full outer or detail outer join.

Normal Join : With a normal join, the Integration Service discards all rows of data from the master and detail source that do not match, based on the condition.

Master Outer : A master outer join keeps all rows of data from the detail source and the matching rows from the master source. It discards the unmatched rows from the master source.

Detail Outer : A detail outer join keeps all rows of data from the master source and the matching rows from the detail source. It discards the unmatched rows from the detail source.

Full Outer Join : A full outer join keeps all rows of data from both the master and detail sources.

Generally, if we join two tables from the source, we get one Source Qualifier for each individual source, combine the two through the Joiner, and point the result to the target table as below :

EMP (Oracle Database)  --> SQ_EMP  -->
                                       JOINER_EMP (Target Table)
DEPT (Oracle Database) --> SQ_DEPT -->
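It may help to see how these join types map onto SQL. A sketch in Oracle syntax, assuming DEPT is the master and EMP the detail (in the mappings below DEPT is the second source, which the Designer treats as the master); the column lists are illustrative :

-- Normal join : equi-join; unmatched rows from both sides are discarded
> SELECT e.empno, d.dname FROM emp e, dept d WHERE e.deptno = d.deptno;

-- Master outer : all detail (EMP) rows plus matching master (DEPT) rows
> SELECT e.empno, d.dname FROM emp e, dept d WHERE e.deptno = d.deptno (+);

-- Detail outer : all master (DEPT) rows plus matching detail (EMP) rows
> SELECT e.empno, d.dname FROM emp e, dept d WHERE e.deptno (+) = d.deptno;

-- Full outer : all rows from both sides
> SELECT e.empno, d.dname FROM emp e FULL OUTER JOIN dept d ON e.deptno = d.deptno;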

In the mapping diagram above, both the sources come from the same database, i.e., Oracle, so we do not need a separate Source Qualifier for each source table. Instead, we delete one Source Qualifier, map both sources to the remaining one, and connect it to the target as in the diagram below.

Joining Two Tables Using a Single Source Qualifier :

Mapping Flow Diagram :

EMP (Oracle Database)  -->
                           SQ_EMPDEPT --> JOINER_EMP (Target Table)
DEPT (Oracle Database) -->

( Here the Target table has the fields EmpNo, Dname, Job, Sal, DeptNo )

A Normal Join in Informatica is equivalent to an Equi-Join at the Oracle database level. A Master Outer join gives all the records of the Detail table (in this case, EMP), not all the records of the Master table (in this case, DEPT).

Silly note :
e.deptno = d.deptno (+)  --> equivalent to EMP LEFT OUTER JOIN DEPT : gives all the data of the employee table (e)
e.deptno (+) = d.deptno  --> equivalent to EMP RIGHT OUTER JOIN DEPT : gives all the data of the department table (d)

Create the Target table as per the mapping flow diagram. Go to SqlPlus and create the table as below :

> CREATE TABLE JOINER_EMP
  ( EmpNo  Number,
    Dname  Varchar2(20),
    Job    Varchar2(20),
    Sal    Number,
    DeptNo Number );

Go to the Mappings menu, click on Create, and name it m_JOINER as below :

Drag and drop the source tables EMP and DEPT and the target table JOINER_EMP into the mapping window workspace; the window will look like below :

In our case, both source tables come from the same database, so we do not need two separate Source Qualifiers, as discussed previously. Just delete any one Source Qualifier; here I am deleting SQ_DEPT, and then the window will look as below :

Double click on the header of the SQ_EMP table; it will open one window. In that, click on the Rename button; again a small window will open. In that, type the name as SQ_EMPDEPT for better clarity, as shown below :

Click on Ok. Drag and drop DeptNo, Dname, and Loc from the DEPT table to the SQ_EMPDEPT table; the window will then look like :

Map the corresponding ports from the SQ_EMPDEPT table to the JOINER_EMP table as shown below :

Now, double click on the SQ_EMPDEPT transformation; a new window will open. In that, click on the Properties tab; in the Sql Query option, click on the corresponding down arrow mark, which is rectangled in brown color as shown below :

After clicking on that down arrow mark, another window will open as below :

In that, provide the ODBC datasource as RRITEC_SOURCE, the Username as SOURCE_USER, and the Password as SOURCE_USER, then click on the Generate SQL button. Automatically one query will pop up in the workspace of the SQL sub-window, which is marked in green color in the window shown below :
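The screenshot is not preserved here; the generated default query should look roughly like the sketch below (the exact column list depends on the connected ports) :

> SELECT EMP.EMPNO, EMP.ENAME, EMP.JOB, EMP.SAL, EMP.DEPTNO,
         DEPT.DEPTNO, DEPT.DNAME, DEPT.LOC
  FROM EMP, DEPT
  WHERE EMP.DEPTNO = DEPT.DEPTNO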

Click on Ok, again click on Ok. The remaining process to create the Session, create the Workflow, and run the Workflow is the same as before. After running the workflow, go to the target table and preview the data to check whether it is populated correctly or not.

Joining one Database table and one Flat File :

Silly Note : whatever table we drag and drop into the workspace as the second source, the Informatica Designer automatically treats that table as the MASTER.

Open Notepad and type :

Deptno,Dname,Loc
10,Hr,Hyderabad
20,Sales,Chennai
30,Stock,Pune

Save it with the name DEPT.txt and place this notepad file at the below location :
F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\SrcFiles

We have already created the target table JOINER_EMP, so we can use the same table in this scenario also. Import the source table and the source Notepad file into the Informatica Designer tool. Go to the Mapping Designer window, click on the Mappings menu, click on Create, name it m_JOINER_EMP, and drag and drop the source EMP, the source flat file, and the target JOINER_EMP into the workspace as shown below :

goto Transformations menu click on Create select transformation type as Joiner and give name as T_JOIN as below :

click on Create click on Done, then mapping window looks as below :

Drag and drop Ename, Sal, and DeptNo from the SQ_EMP table, and drag and drop Deptno and Dname from SQ_DEPT, to the T_JOIN transformation table as below :

Double click on the Joiner transformation; the Edit Transformations window will open. In that, click on the Ports tab and make sure that the table coming from the source side with the smaller number of records is treated as Master, with the corresponding Master port enabled via its checkbox, as below :

click on Condition tab on the same above window as :

Click on the Add a new condition icon, which is marked in red color in the above window, and make sure the condition is as marked in red below :

Click on Ok, then connect all the required ports from the Joiner transformation to the target table ports as below :

Click on Save. Go to the Powercenter Workflow Manager window, click on the Workflows menu, click on Create, and name it w_Hetro_Join; the window will then look as :

Click on Ok, go to the Tasks menu, click on Create, and give the name S_Hetro_Join as below :

Click on Create , then window will open , in that need to select our mapping as below :

click on Ok click on Done then connect the Session and Workflow using Link Task and the workspace will look as :

now double click on Session Edit Tasks window will open in that click on Mapping window as shown below :

From the Navigator window at the left side of the above diagram, click on SQ_EMP and make sure the corresponding Relational DB connection is mapped to SOURCE as shown below :

Similarly, select SQ_DEPT from the left-side Navigator window; the corresponding properties will appear at the right side. In that, select the Import Type as File, the Source File type as Direct, the Source File Directory as $PMSourceFileDir\, and the Source File Name as DEPT.txt. Similarly, select the target table from the left-side navigator and map the corresponding Relational DB connection to TARGET. Click on Ok, right click on the Workflow, click on the Start Workflow from Task option, then go to the target table and observe the target table data.

Joining Two Flat Files Using the Joiner Transformation :

Open Notepad and type the below data :

Deptno,Ename,Sal
10,Kishor,1000
20,Divya,2000
30,Venkat,3000

Save the above notepad file with the name EMP.txt. Similarly, open another notepad and type the below data :

Deptno,Dname,Loc
10,Hr,Hyderabad
20,Sales,Chennai
30,Stock,Pune

Save the above notepad file with the name DEPT.txt. Now place the above two notepad files in the below location :
F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\SrcFiles

Go to the Mapping Designer window, go to the Mappings menu, click on Create, and give the mapping name m_Join_FlatFiles as below :

Click on Ok now drag and drop the above two Source FlatFiles and the Target Table, then the mapping window looks as :

goto Transformations menu click on Create select the Transformation type as Joiner and give the name as : T_Join_FlatFiles as below :

Click on Create, click on Done, then drag and drop Deptno, Ename, and Sal from SQ_EMP and Deptno, Dname from SQ_DEPT to the T_Join_FlatFiles transformation table as shown below :

Double click on the header of the Joiner transformation; the Edit Transformations window will open. In that, click on the Condition tab and make sure the condition is as in the red-marked rectangle below :

Click on Ok, then connect the corresponding ports from the Joiner transformation table to the target table as below :

The remaining process to create the Workflow, create the Task, and set the properties of the Task is the same as above. After running the workflow, go to the target table and view the data to check whether it is properly populated or not.

Loading one Source Flat File data into another Target Flat File : in the Designer tool, click on Target Designer icon goto Targets menu click on Create name it as FlatFiles_Target and select a Database type as : Flat File as shown below :

Click on Create, click on Done. Double click on the header of the above target flat file table; the Edit Tables window will open. Click on the Columns tab.

click on Add a New column to this table three times (because we need three fields at target table) then the above window will look as :

Name those fields as Deptno, Ename, Sal and set the appropriate datatypes as shown in below :

click on Ok click on Mapping Designer icon goto Mappings menu click on Create give the mapping name as m_FlatFiles_Target as shown below :

Click on Ok. Now drag and drop the source EMP.txt and the target FlatFiles_Target.txt file into the mapping window workspace as shown below. Go to the Workflow Manager tool, go to the Workflows menu, click on Create, give the name w_FlatFiles_Target, and click on Ok. Create the session with the name s_FlatFiles_Target and click on Ok. Now connect the Workflow and Session using the Link Task and run the Workflow. After running the Workflow, go to the backend location
F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\SrcFiles
and observe whether the target flat file is created or not.

Loading a List of Flat Files into a Table Using the Indirect Method :

Open a Notepad and type the below data :

Deptno,Ename,Sal
10,Kishor,1000
20,Divya,2000
30,Shreya,3000

Save this file with the name EMP.txt. Similarly, open one more Notepad and type :

Deptno,Ename,Sal
40,Swetha,4000
50,Sweety,5000
60,Alekhya,6000

Save this file with the name EMP1.txt. Now place the above two Notepads in the below location :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles

Similarly, open one more notepad and type the above two notepads' path locations as below :

C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles\EMP.txt
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles\EMP1.txt

Save this notepad with the name EMPLIST.txt and place it also in the same SrcFiles location. Open the Designer tool, import EMP as a source, and create and import a target table with the name T_ListofFiles having the fields Deptno, Ename, Sal using SqlPlus. Create a mapping with the name m_Load_ListofFiles. Drag and drop the source (EMP is enough) and the target into the workspace and connect the corresponding ports as shown below :

Go to the Workflow Manager, create a workflow with the name w_T_ListofFiles, create a task with the name s_T_ListofFiles, and connect the Workflow and session. Double click on the Session, click on the Mappings tab, and select SQ_EMP from the left-side navigator window; the corresponding properties will appear at the right side. In that, select the Source File Type as Indirect, the Source File Directory as $PMSourceFileDir\, and the Source File Name as EMPLIST.txt. Click on Ok, save it, and run the workflow; then go to the table T_ListofFiles and observe the data.

XML SOURCE QUALIFIER TRANSFORMATION
*****************************************************
It is an Active and Connected type of transformation.

Open notepad and type the below data :

<EMP>
  <ROWS>
    <DEPTNO>10</DEPTNO>
    <ENAME>RamReddy</ENAME>
    <SAL>1000</SAL>
  </ROWS>
  <ROWS>
    <DEPTNO>20</DEPTNO>
    <ENAME>Venkat</ENAME>
    <SAL>2000</SAL>
  </ROWS>
  <ROWS>
    <DEPTNO>30</DEPTNO>
    <ENAME>Divya</ENAME>
    <SAL>3000</SAL>
  </ROWS>
</EMP>

Save the above file with the name EMP.xml and place it in the below location :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles

Open the Informatica Designer tool, click on the Source Analyzer icon from the tool bar, click on the Sources menu, and click on Import From XML Definition; a new window will open. From that, navigate to our SrcFiles folder and select the newly created XML file as shown below :

Click on Open; then one warning window will come, as below :

Click on Yes; again another window will come, as below :

If you have time, read all the above options; otherwise simply click on Ok. A new window will open as :

click on Next new window will open , in that select Hierarchy Relationship radio button and select De-normalized XML views options as below :

Click on Finish. If we go and observe the Source Analyzer window, we will get a table structure somewhat different from a normal database table structure, as :

Create a target table with the name XML_EMP with the columns DEPTNO, ENAME, SAL and import it into the Informatica Designer tool :

> CREATE TABLE XML_EMP
  ( Deptno Number(5),
    Ename  Varchar2(30),
    Sal    Number(10,2) );

Click on the Mapping Designer icon, go to the Mappings menu, click on Create, and name it m_XML as :

click on Ok from the leftside Navigator window, expand Sources expand XML_File tab drag and drop EMP into workarea , also drag and drop Target into workarea as :

connect corresponding ports from XML source qualifier to Target table as :

Go to the Workflow Manager tool, create a workflow with the name w_XML and a session with the name s_XML, and connect the two using the Link Task. Double click on the Session, click on the Mappings tab, select XMLDSQ_EMP from the left-side navigator window, and provide the Source File Directory as $PMSourceFileDir\, the Source File Name as EMP.xml, and the Source File Type as Direct, as shown below :

select target XML_EMP table and map the proper Target connection click on Ok and run the Workflow.

NORMALIZER TRANSFORMATION
********************************************
It is an Active and Connected transformation. It is useful to read data from COBOL files and is associated with COBOL files. It is useful to convert a single input record into multiple output records.

The below table is in De-Normalized form :

Year   Account_Type   Q1     Q2     Q3     Q4
----   ------------   ----   ----   ----   ----
2012   Savings        1000   2000   3000   4000

The below table is in Normalized form :

Year   Account_Type   Quarter   Amount
----   ------------   -------   ------
2012   Savings        1         1000
2012   Savings        2         2000
2012   Savings        3         3000
2012   Savings        4         4000
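In SQL terms, the Normalizer pivots each input row into several output rows. A rough sketch of the same transposition, assuming a hypothetical staging table ACCOUNT_STG holding the de-normalized layout :

> SELECT Year, Account_Type, 1 AS Quarter, Q1 AS Amount FROM ACCOUNT_STG
  UNION ALL
  SELECT Year, Account_Type, 2, Q2 FROM ACCOUNT_STG
  UNION ALL
  SELECT Year, Account_Type, 3, Q3 FROM ACCOUNT_STG
  UNION ALL
  SELECT Year, Account_Type, 4, Q4 FROM ACCOUNT_STG;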

In the normalized table, the field Quarter is nothing but the GCID (generated column id). Open one notepad and type as below :

Year,Account_Type,Q1,Q2,Q3,Q4
2012,Savings,1000,2000,3000,4000

Save the notepad and place this file in the below location :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles

Create a target table with the columns Year, Account_Type, Quarter, Amount and the table name T_EMP_NORMALIZER using SqlPlus as :

> CREATE TABLE T_EMP_NORMALIZER
  ( Year         Number(5),
    Account_Type Varchar2(30),
    Quarter      Number(3),
    Amount       Number(10,2) );

Import that table into the Designer tool. Go to the Designer tool, click on the Mapping Designer window icon, go to the Mappings menu, and create a mapping with the name m_Normalizer as shown below :

click on Ok Drag and Drop Source flat file and target table into the mapping window workspace. goto Transformations menu click on Create select transformation type as Normalizer and give name as T_Normalizer as below :

click on Create click on Done now the mapping window will look as :

Double click on Normalizer transformation Edit transformations window will open as :

In the above window, click on the Normalizer tab and click on the Add a new column to this table icon three times to create three ports (the icon is marked in red color in the above window); the resultant window will look as :

change the Column Names and Datatypes and Occurs field as per below :

Click on Ok; then the Normalizer transformation table automatically has its fields arranged in a different manner than what we regularly see in the mapping window workspace, as :

Connect the ports from SQ_EMP to T_NORMALIZER, and connect the ports from the T_NORMALIZER table to the target T_EMP_NORMALIZER as shown below :

UPDATE STRATEGY TRANSFORMATION
************************************************
It is an Active and Connected transformation. The target table should contain a Primary Key.

Implementing SCD1 : ( SCD1 is useful to maintain only the current data )
It keeps the Source data and the Target data the same.

Mapping Flow Diagram :

EMP --> SQ_EMP --> LookUp T/F --> Expression T/F --> Router T/F

Router (NEW_RECORD group)    --> Expression T/F --> Update Strategy T/F (Insert flow) --> SCD1 (target)
                                 Sequence Generator T/F --> EmpKey of SCD1 (target)
Router (UPDATE_RECORD group) --> Expression T/F --> Update Strategy T/F (Update flow) --> SCD1 (target)

The target table SCD1 has the fields EmpKey, EmpNo, Ename, Job, Sal. Go to SqlPlus and create the target table as below :

> CREATE TABLE SCD1
  ( EmpKey Number(5),
    EmpNo  Number(5),
    Ename  Varchar2(30),
    Job    Varchar2(30),
    Sal    Number(5) );

Import this table into the Informatica Designer tool. Go to the Mappings menu, click on Create, and give the name m_SCD1_EMP as :

Click on Ok, then drag and drop the source EMP table and the target table into the mapping window workspace as :

Goto Transformations menu click on Create select the transformation type as LookUp and name it as T_LKP_SCD_UPDATE as shown below :

Click on Create; one window will open. In that, click on the Target button and select our target table, i.e., SCD1, from the available list of tables as shown below :

Click on Ok, click on Done; the mapping window will look as :

Drag and drop the EmpNo port from SQ_EMP to the LookUp transformation table as :

Double click on LookUp transformation table click on Conditions tab click on Add a New Condition icon make sure the Condition as below :

Click on Ok. Note : if we get any errors after clicking on the Ok button in the above window, just change the datatype of EMPNO to Decimal in the LookUp transformation table. Go to the Transformations menu, click on Create, select the type as Expression, give the name EXP_B4_ROUTER_UPDATE, and drag and drop EmpNo, Ename, Job, Sal from SQ_EMP to the Expression transformation table as below :

from the Look-Up transformation , drag and drop EmpKey to Expression transformation as :

Double click on the Expression Transformation, click on the Ports tab, and click on the Add a new Port icon twice to create two output ports :

NEW_RECORD    (Output)  expression : IIF ( ISNULL ( EMPKEY ), TRUE, FALSE )
UPDATE_RECORD (Output)  expression : IIF ( NOT ISNULL ( EMPKEY ), TRUE, FALSE )

as shown below :

click on Ok drop Router Transformation drag and drop all the ports from Expression Transformation to Router Transformation as :

Double click on the Router Transformation, click on the Groups tab, click on the Add a New Group icon twice to create two groups, and give the conditions as shown below :

NEW_RECORD     NEW_RECORD = TRUE
UPDATE_RECORD  UPDATE_RECORD = TRUE

Click on Ok; now the mapping will look as :

New Record Flow : drop three transformations of type Expression, Update Strategy, and Sequence Generator into the mapping window workspace, then drag and drop the ports EMPNO1, ENAME1, JOB1, SAL1 from the NEW_RECORD group into the Expression transformation as :

Drag and drop all the above four ports from the Expression transformation to the Update Strategy transformation as :

Double click on the Update Strategy transformation, click on the Properties tab, and provide the Update Strategy Expression as 0 (0 is the constant for DD_INSERT) as :

Click on Ok, connect all the ports from the Update Strategy to the target table, and connect the NEXTVAL port from the Sequence Generator transformation to EMPKEY of the target table as :

Update Record Flow : drop Update Strategy and Expression transformations into the above mapping workspace. In the above mapping, from the Router transformation, drag and drop the ports EMPKEY3, EMPNO3, ENAME3, JOB3, SAL3 of the UPDATE_RECORD group into the Expression transformation as :

drag and drop all the ports from Expression Transformation to Update Strategy Transformation as :

Double click on the Update Strategy transformation, click on the Properties tab, and set the Update Strategy Expression value as 1 (1 is the constant for DD_UPDATE) as shown below :

Click on Ok, copy and paste the target table into the mapping workspace (for the updated-record flow), and connect all the ports from the Update Strategy transformation to this second target instance as :

Go to the Workflow Manager tool, create a workflow with the name w_SCD1, create a session with the name s_SCD1, and map this task to our above mapping m_SCD1_EMP. Click on Ok and connect the Task and Workflow using the Link Task. Double click on the Session, click on the Mapping tab, select SQ_EMP and set it to the SOURCE connection, select the Target and set it to the TARGET connection, and select the Look-Up transformation and set it to the TARGET connection. Click on the Properties tab, set Treat Source Rows As to Data Driven, click on Ok, save the task, run the Workflow, and observe the target table data.

Testing : go to SqlPlus, login as TARGET_USER, and run :

> INSERT INTO SOURCE_USER.EMP ( Empno, Ename, Job, Sal )
  VALUES ( 102, 'Kishor', 'Manager', 90000 );
> COMMIT;
> UPDATE SOURCE_USER.EMP SET Sal = 5001 WHERE Empno = 7839;
> COMMIT;

Now run the Workflow again and observe the result.

Exercise 1 : Try to implement the above scenario using the below mapping flow diagram :

EMP --> SQ_EMP --> Update Strategy T/F --> Target
                   LookUp T/F (supplies EMPKEY)

The condition to be used in the Update Strategy is :
IIF ( ISNULL ( EMPKEY ), 0, 1 )

Exercise 2 : Implement the above SCD1 using the in-built Slowly Changing Dimension wizard.

SLOWLY CHANGING DIMENSION - 2
************************************************
Create a target table with the name EMP_SCD2 with the columns EMPKEY (pk), EMPNO, ENAME, JOB, DEPTNO, SAL, VERSION and import it into the Informatica Designer tool :

> CREATE TABLE EMP_SCD2
  ( Empkey  number(5) Primary Key,
    Empno   number(5),
    Ename   varchar2(20),
    Job     varchar2(20),
    Deptno  number(5),
    Sal     number(10,2),
    Version number(5) );

Create a mapping with the name m_SCD2 as :

click on Ok drag and drop Source and Target tables into the Mapping workspace as

Drop a Look-Up transformation and use the target table as the Look-Up table :

Click on Create; a new window will open. In that, click on the Target button and select EMP_SCD2 as below :

click on Ok from Source Qualifier transformation , drag and drop EMPNO into Look-Up transformation as :

Double click on the Look-Up transformation, click on the Ports tab, and change the datatypes from Double to Decimal (if they are initially in the Double datatype). Click on the Condition tab, click on Add a new condition, and make sure the condition is EmpNo = EmpNo1 as :

click on Ok drop Expression Transformation into mapping window drag and drop EMPNO, ENAME, JOB, SAL, DEPTNO from Source Qualifier Transformation into Expression Transformation as :

drag and drop EMPKEY, SAL, VERSION from Look-Up transformation into Expression transformation as :

Double click on the Expression transformation, click on the Ports tab, select the last port, and click on the Add a new port to this transformation icon twice to create two ports; name those two ports as :
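The screenshot with the port definitions is not preserved here. Following the SCD1 pattern, the two output ports would look something like the sketch below; the port names and the SAL1 comparison (SAL1 being assumed as the salary port returned by the Look-Up) are illustrative assumptions :

NEWRECORD    (Output)  expression : IIF ( ISNULL ( EMPKEY ), TRUE, FALSE )
UPDATERECORD (Output)  expression : IIF ( NOT ISNULL ( EMPKEY ) AND SAL != SAL1, TRUE, FALSE )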

click on Ok now the mapping window will look as :

drop a Router Transformation drag and drop all the ports from Expression Transformation to Router Transformation as :

Double click on the Router transformation, click on the Groups tab, click on the Add a new group to this transformation icon twice to create two groups, and name those two groups NEWRECORD and UPDATEGROUP as below :
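The group filter conditions are not preserved in the screenshot; by analogy with SCD1, and assuming the Expression ports sketched above, they would be :

NEWRECORD    NEWRECORD = TRUE
UPDATEGROUP  UPDATERECORD = TRUE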

Click on Ok. New Record Flow : drop Expression, Sequence Generator, and Update Strategy transformations into the mapping workspace, then drag and drop EMPNO, ENAME, JOB, SAL, DEPTNO of the New Record group from the Router transformation to the Expression transformation as :

Double click on the Expression transformation, click on the Ports tab, select the last port, click on the Add a new port icon, rename that new port as VERSION with datatype Decimal, and give the value 0 as its expression :

Click on Ok. From the Expression transformation, drag and drop all the ports to the Update Strategy transformation, connect the corresponding ports from the Update Strategy to the target table, and connect the NEXTVAL port from the Sequence Generator transformation to EMPKEY of the target table as :

Update Record Flow : drop an Expression transformation and an Update Strategy transformation into the workspace. From the Update Record group of the Router transformation, drag and drop EMPNO3, ENAME3, JOB3, SAL3, DEPTNO3, VERSION3 into the Expression transformation as :

Double click on the Expression transformation, click on the Ports tab, select the last port, click on the Add a new port icon, name that port VERSION with datatype Decimal, and under its expression type VERSION3 + 1, then click on Ok.

Click on Ok. From the Expression transformation, drag and drop EMPNO, ENAME, JOB, SAL, DEPTNO, VERSION to the Update Strategy transformation. Copy and paste the target table in the mapping workspace (to store the update-record flow), connect the corresponding ports from the Update Strategy transformation to this target, and connect the NEXTVAL port from the Sequence Generator transformation to EMPKEY of this target table as :

Create a workflow with the name w_SCD2 and a task with the name s_SCD2. Double click on the Session, click on the Mappings tab, make sure the Source and Target connections are appropriate and that the Look-Up transformation connection points to the TARGET, then run the Workflow.

Testing : in the source EMP table, add one more record using an INSERT statement and update any existing record in the same source table; now run the above workflow and observe the output.

Exercise : Use the in-built SCD2 wizard and build the mapping.

SLOWLY CHANGING DIMENSION - 3
************************************************
It is useful to maintain current data plus one level of previous data.

Process Using the Wizard only : go to the Mappings menu, select the Wizards option, and click on the Slowly Changing Dimension option; one window will open. In that, provide the name SCD3 and select the last radio button (Type-3 dimension) as :

Click on Next, select the source table as SOURCE_USER.EMP, and give the new target table name as EMP_SCD3 as :

Click on Next, select the EMPNO column from the Target Table Fields section, and click on the Add>> button; that column will then be placed in the Logical Key Fields section as :

Select the SAL column from the Target Table Fields section and click on the Add>> button; that column will then be placed in the Fields to compare for changes section as :

Click on Next, click on Finish; then automatically one mapping will be formed and placed in the Mapping window workspace as :

Save the mapping, click on the Target Designer workspace icon, and from the left-side Navigator window expand the Targets tab. Drag and drop the EMP_SCD3 table into the Target Designer workspace, go to the Targets menu, and click on the Generate and Execute SQL option; a new window will open as :

In the above, click on the Generate and Execute option; a new small window will open. In that, provide the ODBC connection to the target database, along with the username and password for the target database, as :

Click on Connect, click on Close, and save it. Create a workflow with the name w_SCD3 and a task with the name s_SCD3, and connect the Session and Workflow using the Link Task. Double click on the Task, provide the database connections appropriately for both source and target, click on Ok, save the workflow, and run the Workflow.

Testing : login to the SOURCE_USER schema using SqlPlus and give the statements :

> UPDATE EMP SET SAL = 10000 WHERE EMPNO = 7839;
> COMMIT;

Exercise : Use a Router transformation in place of the Filter transformation and develop SCD3.

SOURCE QUALIFIER TRANSFORMATION
***************************************************
Create a target table with the name EMP_DEPT with the columns DNAME, DEPT_WISE_SALARY using SqlPlus and import it into Informatica :

> CREATE TABLE EMP_DEPT
  ( Dname            Varchar2(30),
    Dept_Wise_Salary Number(10,2) );

Create a mapping with the name M_SQ as :

Click on Ok. Drag and drop the sources EMP and DEPT and the target EMP_DEPT tables into the mapping window workspace as :

Delete the above two default Source Qualifier tables; the mapping window is then as :

Go to the Transformations menu, click on Create, select the transformation type as Source Qualifier, and give the name Emp_Dept as below :

Click on Create; one small window will open and it asks us to select the tables for which we are going to create the Source Qualifier transformation. Here we create it for both tables, so I am selecting EMP and DEPT as below :

Click on Ok, click on Done; automatically the newly created Source Qualifier transformation's ports are mapped with the corresponding ports from both the EMP and DEPT tables as :

now connect DNAME and SAL from source qualifier transformation to target table as :

double click on Source Qualifier click on Properties tab as :

Click on the SQL Query attribute's down arrow mark, which is marked in red color above; a new SQL editor window will open. In that, select the ODBC data source as RRITEC_SOURCE, the Username as SOURCE_USER, and the Password as SOURCE_USER, as below :

Click on Generate SQL; the query will then be generated automatically in the workspace of the SQL window, as shown below :
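The generated text is not preserved here; the default query should look roughly like the first sketch below. Since the target column is DEPT_WISE_SALARY, you would typically then edit the query into an aggregate (the second query is an assumption about the intended override, not part of the wizard output) :

-- default generated query (sketch)
> SELECT DEPT.DNAME, EMP.SAL
  FROM EMP, DEPT
  WHERE DEPT.DEPTNO = EMP.DEPTNO

-- possible override to compute department-wise salary
> SELECT DEPT.DNAME, SUM(EMP.SAL)
  FROM EMP, DEPT
  WHERE DEPT.DEPTNO = EMP.DEPTNO
  GROUP BY DEPT.DNAME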

Click on Ok, click on Ok; the remaining procedure to create the Session and Workflow is the same here also.

UNION TRANSFORMATION
*********************************
Connect to the source database ( SOURCE_USER ) using SqlPlus and create two tables as below :

> CREATE TABLE emp1020 AS
  SELECT deptno, sal FROM scott.EMP WHERE deptno IN ( 10, 20 );

> CREATE TABLE emp2030 AS
  SELECT deptno, sal FROM scott.EMP WHERE deptno IN ( 20, 30 );

Connect to the target database ( TARGET_USER ) using SqlPlus and create a table as :

> CREATE TABLE emp_Union102030 AS
  SELECT deptno, sal FROM scott.EMP WHERE 1 = 2;

Now import the above two source tables and the one target table into the Informatica Designer. Create a mapping with the name m_UNION as :

Click on Ok, then drag and drop the two sources and the target table into the mapping window as :

goto transformations menu click on create select type as UNION and name it as t_UNION as :

Click on Create, click on Done; under this transformation we get two new tabs (Groups and Group Ports), as shown below :

double click on Union Transformation click on Groups tab as below :

click on Add a new group icon twice which is marked by Red color give names as below :

click on Group Ports tab

click on Add a new port icon twice and name those two ports as below :

click on Ok now the mapping window will look as :

connect all ports from SQ_EMP1020 with the emp1020 group of Union Transformation as :

similarly, connect all the ports from SQ_EMP2030 with the emp2030 group of Union Transformation as below :

now, connect all the output ports from Union Transformation to target table as below :
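Note that the Union transformation behaves like UNION ALL in SQL: it does not remove duplicate rows (here, the deptno 20 rows appear in both sources and will be loaded twice). A quick way to cross-check the target after the run :

> SELECT deptno, sal FROM emp1020
  UNION ALL
  SELECT deptno, sal FROM emp2030;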

STORED PROCEDURE TRANSFORMATION
*************************************************
Using SqlPlus, create a target table with the columns EMPNO, SAL, TAX and import that table into the Informatica Designer tool :

> CREATE TABLE emp_sproc
  ( EMPNO number(5),
    SAL   number(10,2),
    TAX   number(10,2) );

Using SqlPlus, connect to the source database ( SOURCE_USER ) and create a stored procedure as :

> CREATE OR REPLACE PROCEDURE emp_tax
  ( Sal IN  number,
    Tax OUT number )
  IS
  BEGIN
    Tax := Sal * 0.1;
  END;
  /

Create a mapping with the name m_SPROC as :

click on Ok drag and drop source EMP table and target emp_sproc into the mapping window as :

goto Transformations menu click on Import Stored Procedure option new window will open , in that provide required details to import the stored procedure as :

click on Ok from the Source Qualifier transformation, connect SAL to stored procedure SAL port and connect TAX port of Stored Procedure to TAX port of target table as below :

connect EMPNO and SAL from source qualifier transformation to corresponding ports in Target table as below :

Creating the Task, creating the Workflow, and providing proper connections to Source and Target at the session are the same here also.

Create a Mapping using an Un-Connected Stored Procedure Transformation
***********************************************************************
Create a mapping with the name m_SP_UnConnected as :

click on Ok drag and drop source and target tables into the mapping window workspace as below :

goto Transformations menu click on create select type as Expression and name it as t_SP_UnConnected as :

Click on Create, click on Done, then drag and drop EMPNO and SAL from the Source Qualifier transformation to the Expression transformation as :

goto Transformations click on Import Stored Procedure option provide required details select our Stored Procedure click on Ok. double click on Expression Transformation click on Ports tab as :

Click on the Add a new port to this transformation icon, which is marked in red color in the above window, name it TAX, and disable its Input port checkbox as :

Click on the TAX expression's down arrow mark, which is circled in red color; the Expression Editor window will open. Type the formula as below :
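The formula from the screenshot is not preserved; for an unconnected Stored Procedure transformation the call takes the form :SP.<procedure_name>(arguments), so with the procedure imported above it would be :

:SP.EMP_TAX ( SAL, PROC_RESULT )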

Note : in the above, PROC_RESULT is a keyword; the value the procedure returns through its OUT parameter is placed into the port via PROC_RESULT. Click on Validate, click on Ok, click on Ok, and connect all the ports from the Expression transformation to the target table as :

creating Session and creating workflow is same here also.

TRANSACTION CONTROL TRANSFORMATION
*****************************************************
It is an Active and Connected type of transformation. A transaction is the set of rows bound by commit or roll back rows. You can define a transaction based on a varying number of input rows. In PowerCenter, you define transaction control at the following levels :

Within a Mapping : Within a mapping, you use the Transaction Control transformation to define a transaction. You define transactions using an expression in a Transaction Control transformation. Based on the return value of the expression, you can choose to commit, roll back, or continue without any transaction changes.

Within a Session : When you configure a session, you configure it for user-defined commit. You can choose to commit or roll back a transaction if the Integration Service fails to transform or write any row to the target.

When you run the session, the Integration Service evaluates the expression for each row that enters the transformation. When it evaluates a commit row, it commits all rows in the transaction to the target or targets. When the Integration Service evaluates a roll back row, it rolls back all rows in the transaction from the target or targets.

In the Transaction Control transformation, we have 5 built-in variables :

1) TC_CONTINUE_TRANSACTION : The Integration Service does not perform any transaction change for this row. This is the default value of the expression.
2) TC_COMMIT_BEFORE : The Integration Service commits the transaction, begins a new transaction, and writes the current row to the target. The current row is in the new transaction.
3) TC_COMMIT_AFTER : The Integration Service writes the current row to the target, commits the transaction, and begins a new transaction. The current row is in the committed transaction.
4) TC_ROLLBACK_BEFORE : The Integration Service rolls back the current transaction, begins a new transaction, and writes the current row to the target. The current row is in the new transaction.
5) TC_ROLLBACK_AFTER : The Integration Service writes the current row to the target, rolls back the transaction, and begins a new transaction. The current row is in the rolled back transaction.

Note : If the transaction control expression evaluates to a value other than commit, roll back, or continue, the Integration Service fails the session.

Create a target table with the name EMP_TCL with the columns EMPNO, ENAME, JOB, SAL and import it into the Informatica tool :

> CREATE TABLE EMP_TCL
  ( EMPNO number(5),
    ENAME varchar2(20),
    JOB   varchar2(20),
    SAL   number(10,2) );

Go to the Mappings menu, click on Create, and give the name m_TCL as :

click on Ok drag and drop source EMP table and target EMP_TCL table into mapping window workspace as :

goto transformations menu click on Create select type as Transaction Control and give name as t_TC as :

Click on Create, click on Done, then drag and drop EMPNO, ENAME, JOB, and SAL from SQ_EMP to the Transaction Control transformation table as :

double click on Transaction Control Transformation click on Properties tab as :

Click on the Transaction Control Condition value dropdown button, which is circled in red color; the Expression Editor will open. In that, give the condition as below :
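The exact condition from the screenshot is not preserved; a typical example, committing before each row whose salary exceeds 3000 (the SAL > 3000 test is an illustrative assumption), would be :

IIF ( SAL > 3000, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION )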

Click on Validate, click on Ok, click on Ok, and connect all the ports from the Transaction Control transformation to the target table as :

Save the mapping; creating the Session and the Workflow is the same here also.

Advanced Training of Informatica

Mapping Parameters and Variables
***************************************

PARAMETERS : A mapping parameter represents a constant value that we can define before running a session. A mapping parameter retains the same value throughout the entire session.

Initial and Default value : When we declare a mapping parameter or variable in a mapping or a mapplet, we can enter an initial value. When the Integration Service needs an initial value, and we did not declare an initial value for the parameter or variable, the Integration Service uses a default value based on the data type of the parameter or variable :

Data        Default Value
---------   -------------
Numeric     0
String      Empty String
Date Time   1/1/1

Create a target table with the columns EMPNO, ENAME, JOB, SAL, COMM and import it into Informatica as :

> CREATE TABLE emp_par_var
  ( EMPNO number(5),
    ENAME varchar2(20),
    JOB   varchar2(20),
    SAL   number(10,2),
    COMM  number(10,2) );

Create a mapping with the name m_MP as :

Click on Ok drag and drop Source EMP table and target EMP_PAR_VAR table into mapping window workspace as :

Go to the Mappings menu and click on the Parameters and Variables option; a new window will open as :

Click on the Add a new variable to this table icon, name the new field $$DEPTNO, set the Type as Parameter from the dropdown, and set IsExprVar to FALSE as below :

Click on Ok, drop a Filter transformation into the mapping workspace, and drag and drop EMPNO, ENAME, JOB, SAL, COMM, and DEPTNO (DEPTNO is needed for the filter condition) from the Source Qualifier transformation to the Filter transformation as :

double click on Filter transformation click on Properties tab

click on Filter Condition value drop down arrow mark, which is marked by red color then Expression Editor window will open as :

Type DEPTNO = in the formula space, click on the Variables tab, and expand the Mapping Parameters tab; under that we will get our created parameter, i.e., $$DEPTNO.

Double click on $$DEPTNO from the left-side navigator window; the formula window will now contain :
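(The screenshot is not preserved; the resulting filter condition is)

DEPTNO = $$DEPTNO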

click on Validate click on Ok click on Ok click on Ok drag and drop corresponding columns from Filter transformation to Target table as :

Save the mapping; creating the Session and Workflow is the same here also.

Creating the Parameter File : navigate to the below path :
F:\Informatica\PowerCenter 8.6.0\Server\Infa_Shared\BWparam
Right click on free space, select New, click on Text Document, and name it Deptno.prm. Double click on that file and type :

[RRITEC.S_MP]
$$DEPTNO=20

Save the file and close it. ( RRITEC --> folder name ; S_MP --> session name )

Double click on the session, i.e., S_MP, click on the Properties tab, and provide the parameter file name as :
F:\Informatica\PowerCenter 8.6.0\Server\Infa_Shared\BWparam\Deptno.prm
Set the Source and Target table connections accordingly, save the workflow, and run the workflow.

VARIABLES : Create a target table with the name V_EMP with the columns EMPNO, ENAME, JOB, SAL, COMM, DEPTNO and import it into Informatica :

> CREATE TABLE V_EMP
  ( EMPNO  number(5),
    ENAME  varchar2(20),
    JOB    varchar2(20),
    SAL    number(10,2),
    COMM   number(10,2),
    DEPTNO number(5) );

Create a mapping with the name m_Variable_Incrimental as :

click on Ok drag and drop Source EMP and Target V_EMP table into workarea as :

Go to the Mappings menu, click on the Parameters and Variables option, click on the Add a new variable to this table icon, name the field $$DEPTNO, and set the Type as Variable, the Datatype as Decimal, the Aggregation as Max, and the Initial Value as 0, as shown below :

click on Ok drop Filter Transformation into mapping workspace drag and drop required ports(required for Target) from Source Qualifier Transformation to Filter Transformation as :

double click on Filter Transformation click on Properties tab as :

Click on the Filter Condition down arrow mark, which is marked in red color in the above window; the Expression Editor window will open. In that, type :

DEPTNO = SETVARIABLE ( $$DEPTNO, $$DEPTNO + 10 )

click on Validate click on Ok click on Ok click on Ok connect corresponding ports from Filter Transformation to Target table as :

Save the mapping; the remaining process to create the Session and the Workflow is the same here also.

MAPPLETS
**********************
A mapplet is a reusable object that we create in the Mapplet Designer. It contains a set of transformations and lets us reuse that transformation logic in multiple mappings.

Mapplet Input : Mapplet input can originate from a source definition and/or from an Input transformation in the mapplet. We can create multiple pipelines in a mapplet. We use the Mapplet Input transformation to give input to the mapplet. Use of the Mapplet Input transformation is optional.

Mapplet Output : The output of a mapplet is not connected to any target table. We must use the Mapplet Output transformation to store mapplet output. A mapplet must contain at least one Output transformation with at least one connected port in the mapplet.

The Mapplet Input transformation is a Passive and Connected type.

Mapplet Input T/F --> (same port) --> T/F-1
                  --> (same port) --> T/F-2

The above type of mapping link is invalid. That means connecting the same port to more than one transformation directly from the Mapplet Input transformation is disallowed. If we really want to connect one port to more than one transformation, a small workaround is needed: take one Expression transformation, and first

connect the same port to the Expression transformation; from that Expression T/F, connect to more than one transformation.

Create a target table with the name EMP_MAPPLET with the columns EMPNO, SAL, COMM, TAX, TOTALSAL and import it into Informatica :

> CREATE TABLE EMP_MAPPLET
  ( EMPNO    number(5),
    SAL      number(10,2),
    COMM     number(10,2),
    TAX      number(10,2),
    TOTALSAL number(10,2) );

Click on the Mapplet Designer, go to the Mapplets menu, click on Create, and give the name MLT_TOTALSAL_TAX as :

Click on Ok, go to the Transformations menu, click on Create, and drop four transformations of the types :
1) Mapplet Input
2) Mapplet Output
3) Filter Transformation
4) Expression Transformation
The Mapplet window will then look as :

Double click on the Mapplet Input transformation, go to the Ports tab, click on the Add a new port icon three times to create three fields, and name them EMPNO, SAL, COMM as :

click on Ok drag and drop all the columns from Mapplet Input Transformation to Filter Transformation as :

double click on Filter Transformation click on Ports tab change all the Datatypes as Decimal as :

click on Properties tab as :

click on Filter Condition down arrow mark which is marked by red color, then Expression Editor window will open in that type the formula as
IIF ( ( ISNULL ( EMPNO ) OR ISNULL ( SAL ) OR ISNULL ( COMM ) ) , FALSE , TRUE )

as shown below :

Click on Validate, click on Ok, click on Ok, then drag and drop all the columns from the Filter transformation to the Expression transformation as below :

Double click on the Expression transformation, click on the Ports tab, click on the Add a new port to this transformation icon twice to create two ports, and define them as below :

TOTALSAL  (decimal)  expression : SAL + COMM
TAX       (decimal)  expression : IIF ( SAL > 3000, SAL * 0.2, SAL * 0.1 )

as shown below :

click on Ok drag and drop all the columns from Expression Transformation to Mapplet Output Transformation as below :

Using the Mapplet in a Mapping : click on the Mapping Designer icon, go to the Mappings menu, click on Create, and name it m_MAPLET as :

click on Ok drag and drop source EMP table and target EMP_MAPPLET into the mapping window workspace as :

From the left-side navigator window, drag our mapplet (MLT_TOTALSAL_TAX), which is marked in red color in the above figure, into the mapping window workspace. Connect the required ports from the Source Qualifier transformation to the mapplet, and connect the required ports from the mapplet to the target table as :

Save the mapping; creating the Session and the Workflow is the same here also.

RE-USABLE TRANSFORMATION
*****************************************
It is useful if we want to utilize one transformation in many mappings. Except for the Source Qualifier transformation, all the other transformations can be re-usable.

We can develop re-usable transformations in two ways :
1) By using the Transformation Developer tool (not the recommended method, because we have not tested whether the transformation functions properly or not)
2) By promoting an existing transformation to re-usable (the recommended method)

Go to any mapping and double click on any transformation; we will get one Make Reusable checkbox, which we have to enable. Observe the Transformation Type (here, Filter only), which is marked in red color. Whenever we enable that checkbox, we will get one confirmation window as below :

Click on Yes to confirm; now observe that the Transformation Type marked in the above figure has changed to Reusable, as shown below :

Click on Ok and observe that our re-usable transformation is placed under the Transformations tab of the Navigator window as :

TARGET LOAD PLAN
****************************
If we have more than one Source Qualifier in the mapping, we control which Source Qualifier loads first by setting the Target Load Plan in the Mapping Designer. If we have only one Source Qualifier in the mapping and multiple targets are being loaded, and we need to load based on the constraints between the targets, then it is called Constraint Based Loading.

Process for Constraint Based Loading : create a source table as :

> CREATE TABLE emp_dept AS
  SELECT emp.*, dept.deptno dept_deptno, Dname, loc
  FROM emp, dept
  WHERE emp.deptno (+) = dept.deptno;

Import the above table into Informatica as a source and create two target tables as below :

> CREATE TABLE dept1
  ( DEPTNO number(5) CONSTRAINT pk_dept1 PRIMARY KEY,
    DNAME  varchar2(20),
    LOC    varchar2(20) );

> CREATE TABLE emp1
  ( EMPNO    number(5) CONSTRAINT pk_emp1 PRIMARY KEY,
    ENAME    varchar2(20),
    JOB      varchar2(20),
    MGR      number(5),
    HIREDATE date,
    SAL      number(7,2),
    COMM     number(7,2),
    DEPTNO   number(2) CONSTRAINT fk_deptno1 REFERENCES dept1 );

Go to the Designer tool and create a mapping with the name m_CBL as :

click on Ok drag and drop above one source and two targets into the mapping window workspace and connect the corresponding ports from Source qualifier transformation to target as :

Create a workflow and a session and link the two using the Link Task. Double click on the Session, click on the Config Object tab, and enable the checkbox Constraint Based Load Ordering as below :

Click on Ok, run the workflow, and observe the result: because emp1 references dept1 through a foreign key, the Integration Service loads dept1 first and then emp1.
***************************************************
Need to add the missing content of two classes.

***************************************************

TASKS
***********
Tasks are of two types :
1) Re-Usable Tasks
2) Non Re-Usable Tasks

Re-Usable Tasks : there are three types of Re-Usable Tasks :
i) Session
ii) Command
iii) Email

All Re-Usable Tasks are created using the Task Developer.

SESSION : A Session is useful to execute a mapping. Creating a Re-Usable Session : go to the PowerCenter Workflow Manager, click on the Task Developer icon, go to the Tasks menu, click on Create, select the type as Session, and give any name, suppose say s_reusable, as :

Click on Create, select the required mapping, click on Ok, click on Done, and save the session. From the Navigator window of the current Workflow Manager, expand the Sessions folder and observe whether our Session task has been created or not.

Email : go to the PowerCenter Workflow Manager, click on the Task Developer icon, go to the Tasks menu, click on Create, select the type as Email, and give a name, suppose say, Send Mail.

Click on Create, click on Done. Double click on the Email task, click on the Properties tab, and provide the email username as rritec@gmail.com, the Email Subject as Test, and the Email Text as Text Mail.

Click on Ok and save the task. From the Navigator window of the current Workflow Manager, expand the Tasks folder and observe whether our Email task has been created or not, as :

Command : It is useful to execute operating-system commands (UNIX commands, or DOS commands when the server runs on Windows). Go to the PowerCenter Workflow Manager, click on the Task Developer icon, go to the Tasks menu, click on Create, select the type as Command, and give a name, suppose say, Copyfile.

click on Create click on Done. Double click on Copyfile command click on Commands tab

click on Add a New Command icon , which is marked by Red color as :

name it as Copyfile

Click on the down arrow mark, which is circled in red color; a new window will open. Type the command as below :
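The command itself is not preserved in the screenshot; since the server in this setup runs on Windows, it would be a copy command along these lines (both paths are illustrative assumptions) :

copy F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\SrcFiles\EMP.txt F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\TgtFiles\EMP.txt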

click on Ok click on Ok Save the session.

From the Navigator window of the current Workflow Manager, expand the Tasks folder and observe whether our Command task has been created or not, as below :

Non Re-Usable Tasks : If we create a task within the Workflow Designer window of the PowerCenter Workflow Manager tool, it is called a Non Re-Usable Task. There are six types :

1) Event Wait
2) Event Raise
3) Decision
4) Timer
5) Assignment
6) Control

1) Event Wait :

Start --> Session --> Event Wait --> Session
Go to the PowerCenter Workflow Manager, click on the Workflow Designer, go to the Tasks menu, click on Create, select the task type as Event Wait, and give the name s_event_wait as below :

Click on Create click on Done.

Click on Ok and save the task.

2) Event Raise :

( Diagram : one branch of the workflow runs Sessions followed by an Event Raise task, while a parallel branch holds an Event Wait task followed by a Session. )
Creating a Workflow Event : right click on the Workflow Designer free space, click on Edit Workflow/Worklet, click on the Events tab, add an event named EXECUTE2SESSIONS, and click on Ok. Double click on the Event Raise task, click on the Properties tab, click on the dropdown button under Value, select EXECUTE2SESSIONS, click on Ok, and click on Ok.

Double click on the Event Wait task, click on the Events tab, select the User-Defined radio button, click on the Browse Events icon, select EXECUTE2SESSIONS, click on Ok, click on Ok, and save the task.

3) Decision :

( Diagram : Start runs two Sessions in parallel; both link to a Decision task, which in turn links to a Command task and to a further Session. )
Double click on the Decision task and write the below condition :

$session1.Status = SUCCEEDED AND $session2.Status = SUCCEEDED

Click on Ok, click on Ok. Double click on the link between the Command and the Decision; a new window will open. In that, type :

$Decision_name.Condition = TRUE

Similarly, double click on the link between the Session and the Decision; a new window will open. In that, type :

$Decision_name.Condition = FALSE

Click on Ok and save the task.

4) Timer :

Start --> Timer --> Session

Go to the PowerCenter Workflow Manager, click on the Workflow Designer, go to the Tasks menu, click on Create, select the task type as Timer, and give the name t_timer.

Click on Create, click on Done. Now double click on the Timer, select the Timer tab, select the Absolute Time radio button, and set a time two minutes ahead, as below :

click on Ok save the Task and Run the workflow.

5) Assignment :

Start --> Session --> Assignment

Go to the PowerCenter Workflow Manager, click on the Workflow Designer, go to the Tasks menu, click on Create, select the task type as Assignment, and give the name t_Assignment as shown below :

click on Create click on Done. right click on Workflow Designer free space click on Edit Workflow/Worklet click on Variables tab as :

Click on the Add a new variable to this table icon, which is marked in red color above.

Change the name to $$no_of_times_run, enable the Persistent checkbox, and set the default value to 0.

Click on Ok, double click on the Assignment task, click on the Add a new expression icon, and type the expression $$no_of_times_run + 1 as shown below :

Click on Ok.

WORKLET
******************
A reusable workflow is called a Worklet.

Creating a Worklet : click on the Worklet Designer icon, go to the Worklets menu, click on Create, and give the name Worklet as below :

Click on Ok, take any Session into the workspace of the Worklet Designer window, link the Worklet and the Session, and save it. From the Navigator window of the current Workflow Manager window, expand the Worklets folder and observe whether our newly created Worklet exists or not, as below :

Utilizing the Worklet : click on the Workflow Designer icon, go to the Workflow menu, click on Create, name it Using_Worklet, drag and drop the Worklet from the Worklets folder in the left-side Navigator window, and connect the Workflow and the Worklet as :

***********************************************************************
