Informatica Tutorial
Informatica Exercises
MOORTHY E S
MATHIVANAN S
TCS Internal
316852746.doc
5/4/2016
1. Terminologies
2. Source Analyzer
Ex_1 - Import RELATIONAL SOURCE definition
Ex_2 - Import FLAT FILE SOURCE definition
Ex_3 - Import XML Source definition
3. Warehouse Designer
Ex_1 Import RELATIONAL target definition
Ex_2 Create a SQL target definition in Warehouse Designer & generate the table structure in the database
Ex_3 Create a FLAT FILE target definition manually.
4. MAPPING DESIGNER
CREATING MAPPING
5. WORKFLOW MANAGER:
BEFORE CREATING SESSION:
STEPS FOR REGISTERING A POWERCENTER SERVER
Starting the informatica services
SETTING UP A RELATIONAL DATABASE CONNECTION
CREATING SESSION
CONFIGURING CONNECTION STRINGS OF SESSION
CREATING WORKFLOW
RUNNING A WORKFLOW
EVALUATING THE RESULTS
TRANSFORMATIONS
6. Source Qualifier Transformation
Ex_1 Pass-through mapping (Direct mapping: Relational source to Relational target)
Ex_2 Extracting Data from FLAT FILE (Flat File Source)
Ex_3 Loading Data to Flat File (Indirect Sources- Optional Practices)
Ex_4 Using Source Qualifier Transformation Properties (Filter, Sorter)
Ex_5 Using SQL Override in Source Qualifier Transformation.
7. Expression Transformation
Ex_1
Ex_2
Ex_3
Ex_4
Ex_5
8. Filter Transformation
Ex_1
Ex_2
Ex_3
Ex_4
Ex_5
9 Router Transformation
Ex_1 Using Router Transformation
Ex_2 Practice (Optional Practices)
Ex_3 Practice (Null value Handling)
10 Aggregator Transformation
Ex_1
Ex_2
Ex_3
Ex_4
Ex_5
11 Joiner Transformation
Ex_1
Ex_2
Ex_3
Ex_4
Ex_5
Ex_6
12 LookUp Transformation
Ex_1 Using LookUp Transformation
Ex_2 Using LookUp Override & Unconnected LookUp
Ex_3 Using Self Join
13 Sorter Transformation
Ex_1 Using Sorter Transformation
Ex_2 (Optional Practices)
Ex_3 (Optional Practices)
14 Sequence Generator Transformation
Ex_1 Using Sequence Generator Transformation
15 Rank Transformation
Ex_1 Using Rank Transformation
16 Update Strategy Transformation
Ex_1 Using Update Strategy Transformation (Slowly Growing Targets)
Ex_2 Supporting only Updates
Ex_3 Slowly Changing Dimensions
17 Stored Procedure Transformation
Ex_1 Using Stored Procedure Transformation (Procedures).
Ex_2 Multiple Input/Output parameters.
Ex_3 (Optional Practice).
Ex_4 Using Functions, Using Unconnected Stored Procedure Transformation.
18 Union Transformation
Ex_1 Using Union Transformation.
Ex_2 More than two input data sets (Optional Practices)
19 Normalizer Transformation
Ex_1 Using Normalizer Transformation
Ex_2 Optional Practices
WORKFLOW MANAGER TASKS
4. Assignment task
5. Command task
6. Timer task
7. Control task
8. Scheduling feature task
9. Email task
Create repository
Create ODBC Connectivity for Source and Target data bases.
Register the Powercenter Server in the Workflow Manager.
Create Connection strings for source and target databases.
TERMINOLOGIES:
SOURCE:
To extract data from a table or file, you must first define sources in the repository. You can import or create the following types of source definitions in the Source Analyzer:
NorthWind (Source) Schema Diagram (Take a print out for quick reference)
Source Analyzer
Figure: 1
3. Choose Sources from the menu bar:
Sources -> Import from Database. You will see a new window as in Figure 2:
1. Select the ODBC data source used to connect to the source database.
2. Enter an appropriate database user name and password with valid permission to connect to the database and view the objects.
3. You may need to specify the owner name for the database objects you want to use as sources. Click Connect. If no table names appear, click All to show all owners. You can see the list of tables in the Select Tables window.
Figure: 2
C:\Program Files\
Informatica PowerCenter 7.1\Server\SrcFiles\src_ft_sales_office.txt
Double click on the icon above and Save the flat file attached in the location
C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles on your machine and
import the source definition of the same file to Source Analyzer.
STEPS:
Choose Source Analyzer:
1. Sources (menu bar) -> Import from Files; you will get the window as in Figure 3.
Figure: 3
2. Browse to the folder (where you have saved the file) and select the file
3. Click OK. You will see the window for editing flat file structure as in figure 4.
Figure: 4
4. Choose the file type (Delimited or Fixed Width). For the exercise file given, choose Delimited.
5. If the file contains the column names, then enable Import field names from first line, else ignore it. For the exercise file given, field names are present, so enable it.
6. Click Next; you can see the imported source structure in the Source Analyzer.
3. Import the source definition of the XML file attached, following the same steps as in the previous exercise.
Aim: Learn to import XML Structure using Source Analyzer.
cd_catalog.xml
Warehouse Designer
2. Choose Targets (menu bar) -> Import from Database.
3. Choose the target ODBC data source and import the objects required. As you import, you can see the imported structure as below.
Click on the icon and click in the Warehouse Designer panel. You can see the empty structure in the panel, as below, which is of Microsoft SQL Server database type.
Step 2: Create the required fields. Double click on the structure and choose the Columns tab. You can see empty fields. Create columns using the icon shown below.
You can see a new column has been created. Create two more columns. Rename the columns and the table name, and set the required datatypes, precision and scale for the columns. Click Apply and then OK.
You can see the column names renamed and the datatypes and precision set.
STEP 3: Generate the structure created in the designer in the database, using the Generate/Execute SQL option of the Targets menu.
Select the required structure in the Warehouse Designer panel, and click the Generate/Execute SQL option of Targets in the menu bar.
Choose the database DSN we have created, give the user name and password of the database, and click Connect.
Enable the Create table option under Generate options and click the Generate and execute button. If the table already exists in the database, we can drop it before creating by enabling the Drop table option.
You can see the notification below in the generate tab of the output window. You can
check the table in the database.
5. Manually create a flat file target definition similar to flat file source
src_ft_sales_office, and name it as trg_ft_sales_office.
Aim: To create flat file structures, or structures of another database type.
Steps:
Create the required structure.
1. Change the Database type to Flat File. If you need to create structures
for some other database you can choose here. Set the file properties as
fixed width or Delimited. Rename the structure.
2. Click Apply and then OK; we can see the target structure created in the Warehouse Designer as shown below.
Mapping Designer
Note: 1. The mapping name you create should be of the form m_<Abbreviated_transformation_name>_ex_<no>_<description for understanding>, e.g. m_SQ_ex_1_direct_mapping for the first exercise.
2. The session name should be s_<mapping_name>.
Source Qualifier Transformation:
ex_1: Load the table Dim_Region with data of Region table.
Aim: To load data from One relational table to another relational table.
Step 1: Create a simple pass-through mapping to load data from the Region table of the source database to the Dim_Region table of the target database.
Step 2: Create a session in Workflow manager.
Step 3: Create a Workflow in Workflow manager. Add the session and run the
workflow. View the status in Workflow monitor.
Creating Mapping:
1. Click the Mapping Designer icon in the menu bar and choose the Create option of the Mappings menu.
2. You can see the prompt asking for the mapping name. Type the mapping name and click OK.
3. The mapping will be created with its name displayed as below.
4. Drag the required source from the navigator to the workspace. When you
drag the source you can see source qualifier connected to source by
default.
5. Drag the target to the workspace. You can see the view as below.
Note:
When the mapping is valid you can continue with session creation. The mapping we have created is a simple pass-through mapping (direct mapping); we do not perform any operations or functions on the data flow. We are just loading the target with source data.
But still, when we bring the source into the mapping we get a Source Qualifier transformation by default, because whatever the source type may be (Oracle, SQL Server, DB2, flat file, ...), Informatica converts the source datatypes to (Informatica) transformation datatypes.
CREATING SESSIONS:
Figure 5
Start the Informatica services
Run -> services.msc -> Informatica
Right click on the Informatica service above Informatica Repository Services, and click Start. Refresh to check.
Note: After you start the PowerCenter Server, the Services dialog box displays the status as Started. Due to some errors the status may automatically change to Stopped.
If so, check for the following:
i) Check that the Repository Server has started. If your repository database is on your local system, check that your MS SQL Server services have started.
ii) Go to the Informatica Server configuration in the Workflow Manager and click Resolve Server. As we don't have a static IP, we need to resolve the IP each time we restart our machine.
iii) You can set the services to Automatic mode to start the services automatically.
Step 2: Click New; you can see the Subtype window. Choose Microsoft SQL Server and click OK. The Connection Object Definition dialog box appears.
After filling in the box with the required details, click OK; you can see the newly created connection string in the objects of the Relational Connection Browser.
Create one more connection string for the target database BasicTraining, as above.
You have to create connection strings for each database that we use as a source or target. To create connection strings for other types of databases, refer to the help index.
CREATING SESSION:
STEP 2: You can see the Create Task dialog box prompting for the task type and a name for the task. Select the task type as Session, give a name for the task, and click Create.
STEP 3: It will show the list of mappings as shown below. Currently it will show only
one mapping. Choose one and click OK. You can see the session in the task developer
panel. Click Done.
You can also create session as follows. Click the session icon in the toolbar and click
in the task developer panel. It will prompt for selecting mapping. Select the mapping
and click OK.
Double click on the SESSION and choose Mapping tab. In the navigator choose the
Sources -> SQ_Region and select Northwind Connections. Similarly choose Targets
Dim_Region and select BasicTraining as Connections. Enable Truncate option to
truncate target table before loading. Click apply. Save the Repository.
Note:
The sessions created in the Task Developer are reusable tasks; we can use the same session in multiple workflows. We can even create a session in the Workflow Designer or Worklet Designer directly, which is non-reusable, i.e. it can be used only by that workflow or worklet.
CREATING THE WORKFLOW:
AIM: To create a workflow.
STEP 1: (Creating Workflow). Click Workflows (menu) -> Create.
WORKFLOW General tab: Give the name of the workflow, choose the Server, and enable the Tasks must run on server option. Click OK.
You can see the workflow has been created with a Start task. The figure below shows the workflow with its name.
Click the required sessions in the Sessions folder of Repository Navigator and drag
the object to the Workflow Designer panel and release the mouse button.
STEP 3: Create a link between the Start task and the Session. Use Link Task in the Tasks menu, or:
ii) Click the first task (the Start task) you want to connect, drag to the second task, and release the mouse button. You can see the link between the two tasks as shown below.
3. Open the Workflow monitor and see the status of the running workflow with
Task view, as shown below.
5. You can double click on the session that succeeded and see the detailed status of the task. You can see source success rows and target success rows. You can switch to Transformation Statistics to view more details.
Note: You can look into the session log for more details. To see the session log, click the session in the panel and click the Session Log icon shown below. You can see the session log opening in WordPad.
Open the SQL Query analyzer and query the target table. Check for the availability of
data in target table.
ex_2: Load the data from flat file src_ft_sales_office to the table Dim_sales_office in the target database.
Note: Create the table Dim_sales_office in the target database using
Generate/Execute SQL option of target menu in Warehouse Designer.
STEP 1: Import the source file structure in the Source Analyzer (we have done this already).
STEP 2: Create the target structure in Warehouse Designer and generate the
structure in the target database using Generate/Execute SQL option of target
menu (We have done already).
STEP 3: Create a direct mapping with source as flat file and target as relational
table.
STEP 4: Create a non-reusable session in the workflow wf_Source_Qualifier and configure the target string with the required relational string (BasicTraining).
STEP 5: Configure the flat file source with the source file location and its name.
Properties:
Source file directory: $PMSourceFileDir\ (the server variable directory we set while registering the server in the Workflow Manager. Note: we can instead specify a source file folder path, e.g. C:\Program_files\Source, if the source file was stored in that folder.)
Source filename: src_ft_sales_office (name of the source file with extension .csv or .txt and so on)
Source filetype: Direct.
If you open the file you can see the data below which has column Delimited
(separated ) by comma.
SalesOffice,Region,Country
0001,NW,USA
0002,NE,USA
0031,SW,Finland
Choose Fixed Width if the file is in fixed-width format. An example of the same file is shown below:
SalesOffice Region Country
0001        NW     USA
0002        NE     USA
0031        SW     Finland
Click Advanced:
You can see the Delimited File Properties dialog box appearing as shown below. If the delimiter is a comma, choose ","; otherwise choose the appropriate delimiter.
If the column values are enclosed in double quotes or single quotes, choose Single or Double under Optional Quotes. If not, choose None. Click OK and Apply. Save the repository.
Example of file values enclosed in double quotes:
"SalesOffice","Region","Country"
"0001","NW","USA"
"0002","NE","USA"
"0031","SW","Finland"
Set Number of initial rows to skip to 0. You can skip the first n rows using this option.
If an escape character is specified here, the PowerCenter Server reads the delimiter character as a regular character when it follows the escape character.
STEP 6: Create a Link task between start task and the newly created session.
STEP 7: Right click on the session and click Start Task, to start the current
session alone.
STEP 8: Monitor the task status in workflow monitor and validate the results.
Evaluation Process:
When you enable the option it prompts as below. Click Yes to make it reusable.
Give the source file name in session as Sales_Office.txt and its location in Source file
directory and Source filetype as Indirect.
The PowerCenter Server reads data from the source files src_ft_sales_office.txt & src_ft_sales_office_2.txt at the locations specified in the Sales_Office.txt file. Similarly, we can have a list of files and their locations in the Sales_Office.txt file. Set the file properties. Save.
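For reference, an indirect source file is simply a list of data-file paths, one per line. A hypothetical Sales_Office.txt (paths shown for the default SrcFiles folder used earlier) might contain:

```
C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles\src_ft_sales_office.txt
C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles\src_ft_sales_office_2.txt
```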
STEP 5: Create the link between start task and newly created task. Save. Start
Running the new session alone. Check the status and check for the file in the target.
Validate the results.
ex_4: Load the data from Employees to Dim_Sales_Rep (Table structure should
contain EmployeeID, Last_name, First_name and Title with data types similar to
employees table). Extract only employees with Title Sales Representative
and sort the output by First name.
STEP 1: Create the mapping, using source as Employees table and target as
Dim_Sales_Rep (Create it manually and generate in database using
Generate/Execute).
STEP 2: Double click on the Source Qualifier Transformation and select Properties
tab.
We need to set two properties.
i) where condition Employees.Title='Sales Representative'
ii) Sort by first name.
STEP 3: Setting the Where Clause. Click Source Filter, you can see a SQL Editor.
Give the required where condition. You can use columns in Ports tab (by double
clicking on it). After giving required condition, Click OK.
We can do the reverse, defined as SQL Override: link the corresponding ports, paste the custom query, and validate it against the database using the Validate option. If the query is valid, click OK.
ex_5: Load the data from Employees to Dim_Emp_Loc (Table structure should
contain EmployeeID, Last_name, First_name, Title, city and Address with data types
similar to employees table). Extract only employees residing in London. Use
SQL Override.
STEP 1: Create a target structure in Warehouse designer with required fields &
generate in target database.
STEP 2: Create a Mapping with source as Employees and Target as Dim_Emp_Loc.
STEP 3: Create a Custom Query to retrieve data from source table.
STEP 4: Go to the Sql Query option of the Properties tab, paste the query created, and validate it against the source database using the corresponding source string, username and password.
If the query is valid against the source database, it will raise a prompt saying No errors detected, as below. Click OK and Save.
STEP 5: Create Session for the mapping in the workflow w_Source_Qualifier. Run the
session and monitor the status. Validate the results.
Note: The order of fields in select list of query and output links in SQ must match.
We have seen enough about creating mappings, working with sources & targets, and sessions and workflows. The following exercises will explain only the work with transformations.
Expression Transformation:
ex_1: Extract records from Employees table as shown in sample below.
Create the table with required data types.
Table name: Dim_Emp_Detail
(
Full_name    sample record: Mr. Robert King (TitleOfCourtesy + First_name + Last_name)
Designation  sample record: Sales Representative (Title)
Location     sample record: London, UK (city, country)
)
Note: Use Expression Transformation to get the results.
STEP 1: Create a target structure in the warehouse designer with fields as below,
and generate the target in target database.
The datatypes should match the requirement. If you look at Full_name, the column is formed from three columns plus a space, as mentioned in the sample record. Its precision is derived by adding the precision of the three fields plus one for the space. This is to be followed for the following exercises.
STEP 2: Create a mapping with source and target.
STEP 3: Create an Expression transformation.
Click on the icon shown above and click in the Mapping Designer; you can see an Expression transformation as shown below. We can create ports, or we can get them from the prior transformation (just drag the field and drop it in the Expression transformation).
STEP 5: Create the output ports to get the output after performing the required operations. You can see the O (output) port is enabled; we cannot link input ports to these ports.
STEP 8: Save the repository, create a non-reusable session in a new workflow named wf_Expression_Transformation, load the target table and validate the results.
Follow the same methodology for the following exercises.
ex_2: Extract records from the Order_Details table to Dim_Sales with the following structure. Create the table with required data types.
Table name: dim_sales
(
Order id
Sales    (calculate using the formula (UnitPrice*Quantity) - (UnitPrice*Quantity*Discount))
Discount (populate 'Y' if Discount != 0 and 'N' if Discount = 0)
)
Note: Use an Expression transformation to get the results. While configuring the Discount port, use the built-in functions in the Functions tab of the transformation as shown in the figure.
Search Informatica help index for syntax of all the functions listed in the tab.
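As a quick check of the two derived ports, here is a small Python sketch of the same formulas (a stand-alone illustration, not Informatica expression syntax):

```python
# Sales port: (UnitPrice*Quantity) - (UnitPrice*Quantity*Discount)
def sales(unit_price: float, quantity: int, discount: float) -> float:
    return (unit_price * quantity) - (unit_price * quantity * discount)

# Discount port: 'Y' if Discount != 0, else 'N'
def discount_flag(discount: float) -> str:
    return 'Y' if discount != 0 else 'N'

print(sales(10.0, 5, 0.2))   # 40.0
print(discount_flag(0.2))    # Y
print(discount_flag(0.0))    # N
```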
In the Default value option we can provide values for handling null values. Validate the expression using the validate mark next to it.
We can also handle null values by creating another port and using the built-in functions IIF and ISNULL, e.g. IIF(ISNULL(Fax), 'NO FAX', Fax).
Try both ways.
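To see what the IIF/ISNULL idiom does, it can be mimicked in Python; ISNULL and IIF below are stand-ins for the Informatica functions, not real API calls, and None stands in for a database NULL:

```python
def ISNULL(value):
    """Stand-in for Informatica ISNULL: true when the value is null."""
    return value is None

def IIF(condition, if_true, if_false):
    """Stand-in for Informatica IIF(condition, then, else)."""
    return if_true if condition else if_false

def fax_or_default(fax):
    # IIF(ISNULL(Fax), 'NO FAX', Fax)
    return IIF(ISNULL(fax), 'NO FAX', fax)

print(fax_or_default(None))           # NO FAX
print(fax_or_default('030-0076545'))  # 030-0076545
```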
ex_5: Extract the following record format from the Customers table into a flat file named customer_contact, with the target structure named tgt_ft_cust_contacts. Extract only customers in Germany.
The record should look like the format below:
Maria Anders
Alfreds Futterkiste
Obere Str. 57
Berlin
Germany- 12209
Column details:
ContactName
CompanyName
Address
city
country-postalCode
Get all the required ports into the Expression transformation and create a new output port. Create an expression using the CHR() function to get the above formatted output.
Note: Create the flat file target with a single port and use the expression logic to get the above format.
Filter Transformation:
ex_1: Load the data from Employees to Dim_Sales_Rep (use the structure and table created before). Extract only employees with Title 'Sales Representative' and sort the output by First name.
Note: Use Filter transformation to extract the data. The table is already
loaded with the data of ex_4 of source qualifier transformation. Use
truncate option to truncate the table, before loading the data.
STEP 1: Open the mapping m_SQ_ex_4 and, using the Copy As... option of the Mappings menu, create a mapping named m_FLT_ex_1. Click OK.
STEP 2: Alter the mapping as below.
The newly created transformation will appear as below. Drag the required ports from the Source Qualifier transformation to the Filter transformation.
Open the Properties tab of the Filter transformation and give the filter condition as Title='Sales Representative' in the Filter Condition Editor, and click OK.
STEP 3: Give the link between Filter transformation and target. Validate the
mapping, create session in the new workflow named wf_FLT_ex_1. Run and validate
the results.
ex_2: Extract records from Order Details table to Dim_Sales with the
table structure and logic used in ex_2 of expression transformation.
Use filter to load sales data with discount, neglect the rest.
Copy the mapping m_EXP_ex_2 as m_FLT_ex_2 and make necessary changes and
proceed.
ex_3: Load customer data from the Customers table to dim_cust_clean without any null values in any column, and data with null values in any column into dim_cust_bad in the target database.
Create dim_cust_clean and dim_cust_bad with structures similar to
customers table.
Note: Use Filter transformation to extract the data.
STEP 1: Create a mapping; drag the source and targets into the mapping.
Add a Filter transformation, drag all the columns from the Source Qualifier to the filter, and name the filter FLT_Cust_Clean. Add the required condition to pass only records with no null columns.
STEP 2: Add another filter and rename it as FLT_Cust_bad and get all the columns
from source qualifier to filter, and add condition to filter records with null values.
STEP 3: Link FLT_Cust_Clean with target dim_cust_clean and FLT_Cust_bad with
target dim_cust_bad. Validate the mapping save, create session, run and validate the
results.
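The clean/bad split performed by the two filters can be sketched in Python over toy rows (column names are from Northwind, the row values are made up, and None stands in for a database NULL):

```python
# A row is "clean" only if no column is null; otherwise it goes to "bad".
customers = [
    {'CustomerID': 'ALFKI', 'City': 'Berlin',      'Fax': '030-0076545'},
    {'CustomerID': 'ANATR', 'City': 'Mexico D.F.', 'Fax': None},
]

# FLT_Cust_Clean: every column NOT NULL
clean = [row for row in customers if all(v is not None for v in row.values())]
# FLT_Cust_bad: at least one column NULL
bad = [row for row in customers if any(v is None for v in row.values())]

print([r['CustomerID'] for r in clean])  # ['ALFKI']
print([r['CustomerID'] for r in bad])    # ['ANATR']
```

Note that the two conditions are complementary, so every source row lands in exactly one target.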
ex_4: Extract the stock required to fulfill the orders from the Products table to the Prod_delivery file in the target folder mentioned earlier.
Note: Check the column UnitsOnOrder to find the order status.
Target file structure:
ProductName
QuantityPerUnit
SupplierID
CategoryID
RequiredUnits  calculate (UnitsInStock-UnitsOnOrder); neglect records if orders can be satisfied
Amount         RequiredUnits*UnitPrice
ex_5: Achieve the query result.
SELECT
OrderID, CustomerID, EmployeeID, OrderDate, RequiredDate, ShipVia
FROM
Orders
WHERE
(ShippedDate IS NULL)
AND (ShipVia = 3)
Load the data in a file named Convey_Via3.txt
Router Transformation:
ex_1: Load customer data from the Customers table to dim_cust_clean without any null values, and data with any null values into dim_cust_bad in the target database.
Create dim_cust_clean and dim_cust_bad with structures similar to
customers table.
Note: Use Router Transformation to route the data. Use truncate option to
truncate tables.
Whenever a situation would call for more than one Filter transformation, we can use a single Router transformation instead.
STEP 1: Create the mapping with the source, the two targets, and a Router transformation. Get all fields from the Source Qualifier into the Router transformation.
STEP 2: Click Groups tab of router transformation and create a group and name it as
CleanRecords, using Add group option shown below. In the Group Filter Condition add
the condition to filter records with no null values.
STEP 3:
Link the fields of group CleanRecords of the Router with the target structure
Dim_Cust_Clean, and the group Default with the target structure Dim_Cust_bad as
shown in the figure below.
Validate the mapping, create the session in the workflow wf_Router_Transformation,
run and evaluate the results.
For null values, substitute 'X' (strings), -1 (numbers) and 01-01-2500 (dates).
Hint: Use an Expression transformation to handle the null values, and then route the values based on the conditions specified.
Aggregator Transformation:
Give the required expression in the Expression Editor.
Link the ports of the Aggregator and the target structure. Validate the mapping, and create the session in a new workflow wf_Aggregator_Transformation. Run and validate the results.
ex_2: Find the number of units ordered and their worth, based on the category of the product, from the Products table. Populate the results in dim_Order_cat.
dim_price_analyse
MAX(UnitPrice)
MIN(UnitPrice)
AVG(UnitPrice)
Note: Use the built-in functions in the Aggregator folder of the Functions tab for MAX, MIN, etc.
ex_4: Extract records from the Order Details table to dim_prod_sales with the following structure. Create the table with required data types.
Table name: Dim_Prod_Sales
(
ProductID
Sales
(Calculate using formula SUM ((Unit Price*Quantity)-(Unit Price*Quantity*Discount))
)
ex_5: Load the freight expense given for each ship from orders table.
Table_name: Dim_Freight
(
ShipName
Freight_Expense
SUM (Freight)
)
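The ex_5 aggregation (SUM(Freight) grouped by ShipName) amounts to a group-by accumulation, which the Aggregator performs per group-by port. A Python sketch over made-up order rows:

```python
from collections import defaultdict

# Illustrative rows in the shape of the Orders table (values invented).
orders = [
    {'ShipName': 'Vins et alcools Chevalier', 'Freight': 32.38},
    {'ShipName': 'Toms Spezialitaeten',       'Freight': 11.61},
    {'ShipName': 'Vins et alcools Chevalier', 'Freight': 8.12},
]

# Group by ShipName and accumulate SUM(Freight), like the Aggregator does.
freight_expense = defaultdict(float)
for row in orders:
    freight_expense[row['ShipName']] += row['Freight']

print(round(freight_expense['Vins et alcools Chevalier'], 2))  # 40.5
print(round(freight_expense['Toms Spezialitaeten'], 2))        # 11.61
```

The same pattern covers ex_2 through ex_4: only the group-by key and the accumulated expression change.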
Joiner Transformation:
Note: Use fixed width format for all flat file targets. Write column names to target
files.
STEP 2: Go to the Conditions tab of the Joiner transformation and create the join condition using the Add a new condition icon. Select the ports for the condition as shown below.
STEP 3: Go to the Properties tab. Check the Join Type option and set it based on our requirement.
You can look at the Ports tab to check the master and detail groups.
A normal join returns rows only where the join condition is met. A full outer join retrieves matched as well as unmatched rows of the master and detail data sets. A detail outer join gets all records from the master source and the matching rows from the detail source. A master outer join gets all records from the detail source and the matching rows from the master source.
STEP 4: Continue with the process, and evaluate the results.
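The four join types can be sketched over toy master/detail sets (illustrative Python keyed on a hypothetical 'id' column; the counterintuitive master/detail naming follows the description above):

```python
master = {1: 'M1', 2: 'M2'}  # master source: id -> row
detail = {2: 'D2', 3: 'D3'}  # detail source: id -> row

def join(kind):
    """Return (id, master_row, detail_row) tuples; None marks a non-match."""
    if kind == 'normal':              # only keys present on both sides
        keys = master.keys() & detail.keys()
    elif kind == 'master outer':      # all detail rows + matching master rows
        keys = detail.keys()
    elif kind == 'detail outer':      # all master rows + matching detail rows
        keys = master.keys()
    else:                             # full outer: everything from both sides
        keys = master.keys() | detail.keys()
    return sorted((k, master.get(k), detail.get(k)) for k in keys)

print(join('normal'))        # [(2, 'M2', 'D2')]
print(join('master outer'))  # [(2, 'M2', 'D2'), (3, None, 'D3')]
print(join('detail outer'))  # [(1, 'M1', None), (2, 'M2', 'D2')]
```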
Testing Query:
SELECT C.CategoryName, P.ProductName, P.QuantityPerUnit, P.UnitsInStock,
P.Discontinued
FROM
Categories C JOIN
Products P
ON C.CategoryID = P.CategoryID
WHERE
(P.Discontinued <> 1)
ex_2: Find the sales of each product in the year 1997 (shipped-date year should be 1997). Extract the data to Dim_Sales_97.
Table name: Dim_Sales_97.
(
CategoryName
ProductName
ProductSales
SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount))
)
The target table's sample data:

CategoryName   ProductName    ProductSales
Meat/Poultry   Alice Mutton   16580.850044250488
Condiments     Aniseed Syrup  1724.0
Testing Query:
SELECT C.CategoryName, P.ProductName,
SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount)) AS ProductSales
FROM Categories C JOIN Products P
ON C.CategoryID = P.CategoryID
JOIN Orders O JOIN [Order_Details] D
ON O.OrderID = D.OrderID
ON P.ProductID = D.ProductID
WHERE (O.ShippedDate BETWEEN '19970101' AND '19971231')
GROUP BY C.CategoryName, P.ProductName
ex_3: Prepare order list to suppliers to fulfill the stock requirement from
products table based on the condition given below.
Note: Check column UnitsOnOrder, to find order status. Neglect records if orders can
be satisfied.
Route Supplier with Fax to Dim_Order_with_Fax & supplier without fax to
Dim_Order_with_Phone
Target File_structure: Dim_Order_with_Fax
(
ProductName
QuantityPerUnit
CompanyName
ContactName
Fax
CategoryID
RequiredUnits  calculate (UnitsInStock-UnitsOnOrder); neglect records if orders can be satisfied
Amount         RequiredUnits*UnitPrice
)
Target File_structure: Dim_Order_with_Phone
(
ProductName
QuantityPerUnit
CompanyName
ContactName
Phone
CategoryID
RequiredUnits  calculate (UnitsInStock-UnitsOnOrder); neglect records if orders can be satisfied
Amount         RequiredUnits*UnitPrice
)
STEP 1: Create Mapping with required sources and targets. Add the required filter in
the source qualifier transformation if possible.
STEP 2: Create a Joiner transformation to join the two tables. Pass the output of the Joiner transformation to an Expression transformation.
To create the variable, create a new port and enable the V (variable port) check box. Open the Expression Editor and give the required expression. This variable can be used only within this Expression transformation.
Using the transformation variable: in the Ports tab you can see the newly created variable as shown above. You can use this variable like a port in expressions.
STEP 3: Use Router transformation to split up the records for targets Fax and Phone.
Create the required groups & group conditions in the transformation.
(
FULL_NAME    FIRST_NAME + ' ' + LAST_NAME
TOTAL_SALES  SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount))
)
From Employees, Orders, Order Details.
The target table's sample data:

FULL_NAME         TOTAL_SALES
Margaret Peacock  225763.69595336914
Janet Leverling   202812.84279346466
Testing Query:
SELECT E.FirstName+' '+E.LastName FULL_NAME,
SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount)) TOTAL_SALES
FROM
Employees E JOIN Orders O
ON
E.EmployeeID=O.EmployeeID
JOIN [Order_Details] D
ON
O.OrderID=D.OrderID
WHERE O.ShippedDate IS NOT NULL
AND
E.Title='Sales Representative'
GROUP BY E.FirstName+' '+E.LastName
ORDER BY TOTAL_SALES DESC
ex_5:

CategoryName     Sales_96             Sales_97             Sales_98
Confections      27257.499817848206   80894.151594161987   56520.400411605835
Dairy Products   36711.369998931885   114749.77032089233   79490.02490234375
Testing Query:
SELECT C.CategoryName,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19960101' AND '19961231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_96,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19970101' AND '19971231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_97,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19980101' AND '19981231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_98
FROM
Categories C JOIN Products P
ON
C.CategoryID=P.CategoryID
JOIN [Order_Details] D
ON
D.ProductID=P.ProductID
JOIN Orders O
ON
O.OrderID=D.OrderID
GROUP BY C.CategoryName
ORDER BY C.CategoryName
Total sales query: Just for comparison (to validate test script).
SELECT C.CategoryName, SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount))
FROM
Categories C JOIN Products P
ON
C.CategoryID=P.CategoryID
JOIN [Order_Details] D
ON
D.ProductID=P.ProductID
JOIN Orders O
ON
O.OrderID=D.OrderID
WHERE O.ShippedDate IS NOT NULL
GROUP BY C.CategoryName
ORDER BY C.CategoryName
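The per-year CASE/SUM pivot used in the testing query above can be sketched in plain Python (the row dicts are hypothetical joined records, not actual Northwind data):

```python
from datetime import date

# Sketch of the per-year CASE/SUM pivot (plain Python, not Informatica).
# Each row stands for a joined Categories/Products/Order_Details/Orders record.
def sales_by_year(rows):
    out = {}  # CategoryName -> {year: sales}
    for r in rows:
        sales = r["UnitPrice"] * r["Quantity"] * (1 - r["Discount"])
        bucket = out.setdefault(r["CategoryName"], {})
        year = r["ShippedDate"].year
        bucket[year] = bucket.get(year, 0.0) + sales
    return out

result = sales_by_year([
    {"CategoryName": "Confections", "UnitPrice": 10.0, "Quantity": 5,
     "Discount": 0.0, "ShippedDate": date(1996, 7, 4)},
    {"CategoryName": "Confections", "UnitPrice": 10.0, "Quantity": 2,
     "Discount": 0.5, "ShippedDate": date(1997, 1, 2)},
])
# result["Confections"] == {1996: 50.0, 1997: 10.0}
```

The SUM(CASE WHEN ... THEN ... ELSE 0 END) per year in SQL is the same accumulation as the per-year bucket above.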
ex_6: Build logic to extract sales for each category by year and the percentage
growth between adjacent years.
Note: Copy the mapping from the previous exercise and modify it to get the following
results.
File name: Category_Sales_Growth; the target file should be in fixed-width format.
(
CategoryName
Sales_96
Sales_97
96-97 Growth%: (SALES_97-SALES_96)*100/SALES_96
Sales_98
97-98 Growth%: (SALES_98-SALES_97)*100/SALES_97
)
The target table's sample data:

CategoryName     Sales_96             Sales_97             96-97 Growth%   Sales_98             97-98 Growth%
Confections      27257.499817848206   80894.151594161987   196 %           56520.400411605835   -30 %
Dairy Products   36711.369998931885   114749.77032089233   212 %           79490.02490234375    -30 %

Testing Query:
SELECT CategoryName,
SALES_96, SALES_97, CAST (CAST ((SALES_97-SALES_96)*100/SALES_96 AS INT) AS
VARCHAR (5)) +' %' AS "96-97 Growth%",
SALES_98, CAST (CAST ((SALES_98-SALES_97)*100/SALES_97 AS INT) AS VARCHAR
(5)) +' %' AS "97-98 Growth%"
FROM
(SELECT C.CategoryName AS CategoryName,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19960101' AND '19961231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_96,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19970101' AND '19971231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_97,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19980101' AND '19981231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_98
FROM
Categories C JOIN Products P
ON
C.CategoryID=P.CategoryID
JOIN [Order_Details] D
ON
D.ProductID=P.ProductID
JOIN Orders O
ON
O.OrderID=D.OrderID
GROUP BY C.CategoryName
) TEMP
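The Growth% expression can be checked in plain Python, a sketch of the truncate-to-INT-and-append-'%' logic in the query, using the Confections figures from the sample data:

```python
# Sketch of the Growth% expression above: truncate to an integer and append
# ' %', mirroring CAST(CAST((curr-prev)*100/prev AS INT) AS VARCHAR(5)) + ' %'.
# Python's int() truncates toward zero, as the T-SQL CAST does here.
def growth_pct(prev, curr):
    return str(int((curr - prev) * 100 / prev)) + ' %'

# Confections, from the sample data:
# growth_pct(27257.499817848206, 80894.151594161987) -> '196 %'
# growth_pct(80894.151594161987, 56520.400411605835) -> '-30 %'
```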
Lookup Transformation:
ex_1: Prepare a list with CategoryName, CompanyName & ProductName
from Products, Categories, and Suppliers.
Note: Use Lookup transformation to get CompanyName & CategoryName.
The target table's sample data:

CategoryName   CompanyName                  ProductName
Beverages      Aux joyeux ecclésiastiques   Côte de Blaye
Beverages      Aux joyeux ecclésiastiques   Chartreuse verte
Orders:

OrderID   CustomerID    EmployeeID
10248     VINET         5
10249     TOMSP         6
10250     HANAR         4
10295     VINET         2
10541     HANAR         2
10542     KOENE         1
10737     VINET         2
10738     NewCustomer   1

Customers:

CustomerKey   CustomerID   CustomerName
1             VINET        Paul Henriot
2             TOMSP        Karin Josephs
3             VINET        Peter Franklin
4             HANAR        Mario Pontes
5             KOENE        Philip Cramer
6             VINET        Ruther Martyn
7             CACTU        Patricio Simpson
8             CENTC        Francisco Chang
With the above Orders & Customers tables we will see the difference between the
Lookup transformation and the Joiner transformation. When we use a Joiner with the
join condition Orders.CustomerID=Customers.CustomerID(+) to retrieve all rows from
the Orders table, the resultant data set is as below.
OrderID   CustomerID    EmployeeID   CustomerKey   CustomerID   CustomerName
10248     VINET         5            1             VINET        Paul Henriot
10248     VINET         5            3             VINET        Peter Franklin
10248     VINET         5            6             VINET        Ruther Martyn
10249     TOMSP         6            2             TOMSP        Karin Josephs
10250     HANAR         4            4             HANAR        Mario Pontes
10295     VINET         2            1             VINET        Paul Henriot
10295     VINET         2            3             VINET        Peter Franklin
10295     VINET         2            6             VINET        Ruther Martyn
10541     HANAR         2            4             HANAR        Mario Pontes
10542     KOENE         1            5             KOENE        Philip Cramer
10737     VINET         2            1             VINET        Paul Henriot
10737     VINET         2            3             VINET        Peter Franklin
10737     VINET         2            6             VINET        Ruther Martyn
10738     NewCustomer   1            (null)        (null)       (null)
In the above case, CustomerID VINET in the Orders table has multiple matches in the
Customers table, and the Joiner retrieves all of the matching rows, as shown above.
A Lookup, in contrast, will not retrieve all the matching rows. For handling
multiple matches we are provided with the option "Lookup policy on multiple match"
in the Properties tab.
CustomerID    EmployeeID   CustomerKey   CustomerID   CustomerName
VINET         5            1             VINET        Paul Henriot
TOMSP         6            2             TOMSP        Karin Josephs
HANAR         4            4             HANAR        Mario Pontes
VINET         2            1             VINET        Paul Henriot
HANAR         2            4             HANAR        Mario Pontes
KOENE         1            5             KOENE        Philip Cramer
VINET         2            1             VINET        Paul Henriot
NewCustomer   1            (null)        (null)       (null)
The PowerCenter Server determines which row is first and which is last by generating
an ORDER BY clause for each column in the lookup source. The PowerCenter Server
then sorts each lookup source column in the lookup condition in ascending order.
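The contrast can be sketched in plain Python (illustrative only; sorting by CustomerKey stands in for the ORDER BY that the PowerCenter Server generates, which is an assumption of the sketch):

```python
# Plain-Python sketch: a Joiner (detail outer join) returns every matching
# Customers row; a Lookup with "Use First Value" returns at most one per row.
def joiner_outer(orders, customers):
    result = []
    for o in orders:
        matches = [c for c in customers if c["CustomerID"] == o["CustomerID"]]
        if matches:
            result.extend((o, c) for c in matches)
        else:
            result.append((o, None))      # keep unmatched Orders rows (nulls)
    return result

def lookup_first(orders, customers):
    ordered = sorted(customers, key=lambda c: c["CustomerKey"])
    result = []
    for o in orders:
        match = next((c for c in ordered
                      if c["CustomerID"] == o["CustomerID"]), None)
        result.append((o, match))         # at most one match per input row
    return result
```

With the sample Orders/Customers data above, the Joiner yields three rows for order 10248 while the Lookup yields only the first (Paul Henriot).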
In the Properties tab we can see another option, Connection Information; set the
required connection string there.
Create another lookup transformation for Supplier and link to target. Continue with
the process.
FullName          RequiredDate
Steven Buchanan   1996-08-01 00:00:00.000
Michael Suyama    1996-08-16 00:00:00.000
LookUp Override:
STEP 6: Enable the Return port option for the required field in the Lookup
transformation.
STEP 7: Create an Expression transformation and get the required fields from the
source. Create another port to receive the return value from the unconnected Lookup
transformation. Open the Expression Editor and use the :LKP() function as shown below.
Using the :LKP() expression to pass input to the unconnected Lookup transformation and get the output.
STEP 8: Connect the expression to the target and continue with session creation
and validations of results.
Sample Data:
EmployeeID   FullName        Title                   Manager
1            Davolio Nancy   Sales Representative    Fuller Andrew
2            Fuller Andrew   Vice President, Sales   No Manager
Take Employees as the source, create a Lookup transformation on the same Employees
table, and continue with the process.
Testing Query:
SELECT E.EmployeeID, E.LastName+' '+E.FirstName AS FullName, E.Title,
ISNULL (M.LastName+' '+M.FirstName, 'No Manager') AS Manager
FROM Employees E LEFT OUTER JOIN Employees M
ON
E.ReportsTo=M.EmployeeID
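The self-lookup can be sketched in plain Python (illustrative; field names follow the Employees table, and the 'No Manager' default mirrors the ISNULL in the query above):

```python
# Sketch of the self-lookup: for each employee, look up ReportsTo within the
# same Employees set; default to 'No Manager' when there is no match.
def manager_names(employees):
    by_id = {e["EmployeeID"]: e for e in employees}
    out = []
    for e in employees:
        m = by_id.get(e["ReportsTo"])
        name = (m["LastName"] + " " + m["FirstName"]) if m else "No Manager"
        out.append((e["LastName"] + " " + e["FirstName"], name))
    return out
```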
SORTER Transformation:
ex_1: Populate ContactName, ContactTitle, CompanyName & Phone into a
target file named customer_contact, ordered by ContactName. The target file
should be a text file in fixed-width format.
STEP 1: Create a mapping with the source as Customers and the target as the customer_contact flat file.
STEP 2: Create a Sorter transformation and get the required columns from source
Qualifier to sorter transformation.
STEP 3: Open the properties tab and check the columns to be sorted in the Key
column and specify the type of ordering (Ascending or descending) in Direction
column.
Note: If you want distinct data out of the Sorter transformation, enable the
Distinct option in the Properties tab.
STEP 4: Link the output of the Sorter transformation to the target. Continue with the
session & validation of results.
Testing Query:
SELECT ContactName, ContactTitle, CompanyName, Phone
FROM Customers
ORDER BY ContactName ASC
ex_2: Extract Orders taken by employees and sort the result by EmployeeID
From Orders & Employees.
Target_file: Orders_by_Employees
(
EmployeeID
FullName
OrderID
OrderDate
)
Testing Query:
SELECT DISTINCT E.EmployeeID, E.LastName+' '+E.FirstName AS FullName,
O.OrderID
FROM Employees E JOIN Orders O
ON
O.EmployeeID=E.EmployeeID
ORDER BY E.EmployeeID
ex_3: Find the age of employees on 5/3/1999(mm/dd/yyyy) and Sort the
result by age in ascending order.
Dim_emp_age
(
FullName
BirthDate
Age
)
Testing Query:
SELECT E.LastName+' '+E.FirstName AS FullName,
DATEDIFF (day, E.BirthDate, CAST ('19990503' AS DATETIME))/365 AS Age,
E.BirthDate
FROM Employees E
ORDER BY BirthDate desc
Properties Tab
Start Value: Starting Value of the Sequence Number generated.
Increment By: Difference between two consecutive numbers generated.
End Value: The maximum value of the sequence number generated. If it is reached
while the session is running, the session fails.
Cycle: When sequence number generation reaches the maximum value (End Value) and
the requirement is to restart the sequence from the Start Value, enable the
Cycle option.
Reset: The generated sequence starts from the Start Value on each session run.
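These properties can be sketched in plain Python (an illustrative model of the described behaviour, not PowerCenter internals):

```python
# Sketch of Sequence Generator behaviour: Start Value, Increment By,
# End Value, and the Cycle option described above.
def sequence(start, increment, end, cycle, count):
    values, current = [], start
    for _ in range(count):
        if current > end:
            if not cycle:
                # matches the documented behaviour: session fails at End Value
                raise RuntimeError("session fails: End Value exceeded")
            current = start          # Cycle: restart from Start Value
        values.append(current)
        current += increment
    return values

# sequence(1, 1, 3, cycle=True, count=5) -> [1, 2, 3, 1, 2]
```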
TotalSales           Rank
110277.30497741699   1
104874.97871398926   2
STEP 1: Create a mapping with the required sources & target. Create two Joiner
transformations to join the three source tables as in the testing query.
STEP 2: Add an Aggregator transformation to aggregate the result of the second
Joiner to customer level. Build the calculation logic within the Aggregator.
STEP 3: Add a Rank transformation; you can see a column named RANKINDEX
already existing in the transformation. It generates the sequence number for the
Rank column. Get the required columns from the Aggregator into the Rank
transformation.
Customer   Sales
100A       1000
101A       1050
102A       1200
100B       1600
100C       1810
101B       1100
102B       1675
102C       1675
Rank By Sales
If we rank the source using Sales, we arrive at the Rank By Sales data set. If we
rank the source using Sales while grouping by Employee, the ranking is computed
within each Employee group.
Properties:
1) Number of Ranks: Fill in a number n; these n rows will be retrieved.
2) Top/Bottom: Choose whether to fetch the Top n or Bottom n rows (n from
Number of Ranks).
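The ranking behaviour can be sketched in plain Python (illustrative; the grouping key, the first three characters of the Customer value, is a hypothetical stand-in for the Employee group):

```python
# Sketch of the Rank transformation: top-n rows by a rank port, optionally
# computed within groups (the group-by ports).
def rank_top(rows, key, n, group=None):
    buckets = {}
    for r in rows:
        buckets.setdefault(group(r) if group else None, []).append(r)
    out = []
    for rows_in_group in buckets.values():
        # Top n per group; use reverse=False for Bottom n
        out.extend(sorted(rows_in_group, key=key, reverse=True)[:n])
    return out
```

With the Customer/Sales sample above, Top 2 overall returns the 1810 and 1675 rows, while Top 1 per group returns the best row of each group.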
STEP 2: The DTS Import/Export Wizard will appear. Click Next. You can see the wizard
asking for source details. Fill it with the required data (Server, DatabaseName,
UN/PW) and click Next. It will prompt for target details (fill it with the required
data) and click Next.
STEP 3: It will prompt for the following options. Choose the Copy table(s) option
and continue.
STEP 4: Select the required source tables as shown below and in the Destination
column give the required target names. Give the target name as UPD_Employees.
Click Next and continue the process. Using this Export/Import utility we can export
or import tables not only in SQL Server; we can use it with any database. We can see
the list in the drop-down box of the data source selection in the wizard's first
screen. We can create packages similar to Informatica mappings; it is a free ETL
tool bundled with SQL Server.
Create a mapping for a slowly changing dimension. Check the mapping with the
scripts above. Use the source & target tables from the previous exercise. This
mapping should support both new insertions and updates.
Run the script below in Northwind database.
INSERT INTO UPD_SRC_Employees
VALUES(11,'Crowther','Simon','Sales Associate','Mr.',
'1962-04-24 00:00:00.000','1995-11-15 00:00:00.000',
'345 Queensbridge',
'London',null,'SW7 1RZ','UK','(171) 555-7733','428',null,'Simon Crowther served in the
North/South and, he joined the company for the sake of Sales Association commitee.
After completing his project he will be transferred North/South to continue',2,null);
UPDATE UPD_SRC_Employees
SET ReportsTo =-1
WHERE ReportsTo is null ;
Test for the changes.
ex_1: CREATE the procedure given below in the source database. Create a
STORED PROCEDURE transformation & use it to retrieve the FullName of an
employee.
Complete Ex_2 of the Lookup transformation using a Stored Procedure transformation
to retrieve the employee's FullName.
Create a procedure given below in the source database using Query Analyzer.
CREATE PROCEDURE SP_FullName @@EmployeeID int, @@FullName varchar(30)
OUTPUT
AS
SELECT @@FullName = FirstName+' '+LastName
FROM Employees
WHERE EmployeeID=@@EmployeeID
GO
Executing the procedure for testing (these test commands handle only
single-record parameters; for multiple records use cursors):
DECLARE @@FullName varchar(30)
EXECUTE SP_FullName 1, @@FullName OUTPUT
BEGIN
PRINT @@FullName
END
STEP 1: Copy the mapping Ex_2 of Lookup transformation. Change the logic to get
the
STEP 2: Create a Stored Procedure transformation using the icon in the toolbar. It
will prompt for selecting procedures in the database, as shown above. Select the
appropriate database, UN/PW and the procedure/function you need to import and click
OK.
STEP 3: Note that the new Stored Procedure transformation has been created in
the mapping. You can see the input and output ports of the Stored Procedure
transformation. Give the required input links and take the required output port.
Imported SP TRT
O.OrderID=D.OrderID
WHERE O.ShippedDate IS NOT NULL AND
E.EmployeeID=@@EmployeeID
GROUP BY E.FirstName+' '+E.LastName
GO
Executing the Procedure: For testing
DECLARE
@@FullName varchar(30),
@@TotalBonus decimal(23,4)
EXECUTE SP_Emp_Bonus 1, 'UK', @@FullName OUTPUT, @@TotalBonus OUTPUT
BEGIN
PRINT @@FullName + ' ,' +CAST(@@TotalBonus AS CHAR)
END
ex_4: Use the function below to extract Customer & Sales-by-Customer
information. Load the data to a text file.
-- DROP FUNCTION SF_Cust_Sales
CREATE FUNCTION SF_Cust_Sales
(@CustomerID varchar(5) )
RETURNS decimal(12,3) -- Customer Total Sales.
AS
BEGIN
RETURN ( SELECT SUM (D.UnitPrice*D.Quantity*(1-D.Discount))
FROM Orders O JOIN [Order_Details] D
ON
O.OrderID=D.OrderID
JOIN
Customers C
ON
C.CustomerID=O.CustomerID
WHERE C.CustomerID=@CustomerID )
END
Testing Query:
SELECT CustomerID, dbo.SF_Cust_Sales(CustomerID) AS Sales
FROM Customers;
UNION Transformation
ex_1: Create a contact list of Suppliers and Customers with the fields
listed below.
CompanyName                  ContactName     Relationship
Drachenblut Delikatessen     Sven Ottlieb    Customers
Rattlesnake Canyon Grocery   Paula Wilson    Customers
Old World Delicatessen       Rene Phillips   Customers
Grandma Kelly's Homestead    Regina Murphy   Suppliers
Gai pâturage                 Eliane Noz      Suppliers
STEP 1: Create a new mapping. Add the Suppliers & Customers source definitions.
STEP 2: Create a new target definition with the required structure and use the
Generate & Execute options to create the table definition in the target database.
STEP 3: Create two Expression transformations for hardcoding the Relationship
field. If the contact is for a supplier, hardcode it as Suppliers; if for a
customer, hardcode it as Customers.
STEP 4: Add a Union transformation to do a UNION operation on the two data sets
from Suppliers and Customers. Link the output of the Union transformation to the
target structure. Creating the Union transformation is explained below.
STEP 5: Create a session, validate it, add it to a new workflow, run the workflow
and monitor the status. Use the testing query to validate the results.
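The Expression-plus-Union logic of these steps can be sketched in plain Python (illustrative only; the Union transformation behaves like UNION ALL, keeping duplicates):

```python
# Sketch of the mapping: hardcode a Relationship value for each input group
# (the Expression step), then concatenate the two sets (the Union step).
def contact_list(suppliers, customers):
    rows = [(s["CompanyName"], s["ContactName"], "Suppliers")
            for s in suppliers]
    rows += [(c["CompanyName"], c["ContactName"], "Customers")
             for c in customers]
    return rows   # UNION ALL semantics: duplicates are kept
```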
Creating the UNION Transformation:
STEP 1: Click on the icon in the menu bar and then click in the mapping designer
workspace.
STEP 2: You can see the new transformation in the designer palette as below.
STEP 3: Select the required fields from the source transformation (in this exercise,
select the fields from the Expression transformation), drag the cursor onto the
Union transformation, and release the mouse button.
You can see two different sets of fields appearing: one for the OUTPUT ports and
the other for the input group named NEWGROUP.
STEP 4: Double-click the Union transformation and open the Groups tab. You can see
the properties as below.
You can rename the existing groups and can create any number of groups. For this
exercise, rename the existing group for convenience.
As you create a new group, a corresponding new set of fields is created in the
Ports tab. After creating the two groups you can see the transformation as in the
figure below.
Link the different data source sets to UNION transformation and link the output to the
target. The Iconic view of the mapping is shown below.
Testing Query :
SELECT
FROM
UNION
SELECT
FROM
Ex_2: Create a new target table Dim_Locations with the fields City, Country,
PostalCode, Region from Customers, Suppliers and Employees, just like
conformed dimensions.
STEP 1: Create a mapping with the source and target structures.
STEP 2: Add a Union transformation, create three input groups and connect the
sources to the input groups.
STEP 3: Link the output of the Union transformation to a Sorter, extract the
distinct values and load to the target.
STEP 4: Create a session and workflow. Run the workflow and monitor the status.
STEP 5: Check the result against the query below.
Testing Query :
SELECT
FROM
UNION
SELECT
FROM
UNION
SELECT
FROM
Normalizer Transformation
Ex_1: Use the files embedded below as the source and load the data to the
target table with the structure specified in DDL statement below.
D16220w1.cbl
cust_acct_own.out
Double-click the embedded files and save them to the source file location. The
.cbl file represents the structure of the file and the .out file is the data
file.
Execute the DDL statement in the target database using SQL Query Analyzer.
Use the Import from Cobol File option of the Sources menu to import the COBOL
structure. Point to the .cbl file which contains the structure of the file and
click OK.
STEP 2: Create a new mapping and add the COBOL source to the mapping. You can see
a Normalizer transformation appearing next to the source structure instead of a
Source Qualifier transformation.
STEP 3: Add the target structure to the mapping and link the output ports of the
Normalizer transformation to target.
STEP 4: While configuring the session, set the source filename to the .out file,
not the .cbl file; the .out file is the data file. Set the target filename as
cust_acct_own.txt.
Open Set File Properties to set the source file properties. Set the file
properties to Fixed Width.
Click Advanced and set "Number of initial rows to skip" to 0.
Set Code Page to IBM EBCDIC US English as below.
Run the session and monitor the status and check the results.
TARGET STRUCTURE:
CREATE TABLE Dim_Employee_Skills
(
EmployeeKey int,
EmployeeID int,
SkillNumber int,
Skill_Set nvarchar(25)
);
Run the DDL in the target database. Import the source and target structures from
the source and target databases into the Informatica Designer.
STEP 1: Create a new mapping, add the source and target structures.
STEP 2: Create the Normalizer transformation. Open the Normalizer tab and create
the ports similar to the target structure. Set the Occurs field of the pivoting
column to the number of source columns to be pivoted.
Note: In this exercise, the target field Skill_Set has three occurrences (skill_1, skill_2,
skill_3) in source table. Hence set the Occurs field of Skill_Set to 3.
As you set the occurrence to 3, view the Ports tab. You can see that three input
ports have been created for the single output port Skill_Set.
Two new fields, GK_Skill_Set and GCID_Skill_Set, are also created.
Port structures
STEP 3: Link the output of the Normalizer transformation to the target. Link the
GK_ field to the target primary key and GCID_ to the occurrence field. Map the
other ports directly. Validate the mapping, create the session, run it and test
the results.
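The pivot can be sketched in plain Python (illustrative; generating GK_ from a simple counter is an assumption standing in for the Normalizer's generated key sequence):

```python
# Sketch of the Normalizer pivot: skill_1..skill_3 become three output rows.
# GK_ is modelled as a running surrogate key; GCID_ is the occurrence
# number (1-3) identifying which source column the value came from.
def normalize_skills(rows):
    out, gk = [], 0
    for r in rows:
        for occurrence, col in enumerate(("skill_1", "skill_2", "skill_3"), 1):
            gk += 1
            out.append({"EmployeeKey": gk,            # GK_Skill_Set
                        "EmployeeID": r["EmployeeID"],
                        "SkillNumber": occurrence,     # GCID_Skill_Set
                        "Skill_Set": r[col]})
    return out
```

Each source row with three skill columns becomes three rows in Dim_Employee_Skills.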
Mapping view