
Professional Summary:

• Over five years of IT experience, including more than four years in a data
warehousing environment designing, developing and programming Informatica
Mappings, Sessions and Workflows.
• Strong experience in the Extraction, Transformation and Loading (ETL) process using
Informatica Power Center 8.x/7.x/6.x (Designer, Workflow Manager and Repository
Manager) in the Insurance, Banking, Healthcare and Manufacturing domains.
• Worked extensively with all Informatica Designer tools, including Source Analyzer,
Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet
Designer.
• Extensively worked on Informatica Workflow Manager and Workflow Monitor to build
sessions and run workflows.
• Knowledge of Star Schema and Snowflake Schema methodology, using the data modeling
tool Erwin.
• Involved in designing Data Marts and Data Warehouse using Star Schema and
Snowflake Schema in implementing Decision Support Systems.
• Worked on Dimensional modeling, Data cleansing and Data Staging of operational
sources using ETL processes.
• Developed several mappings with complex transformation logic, using connected and
unconnected Lookup and Stored Procedure transformations, Filters and Expressions to
extract data from diverse sources including flat files, RDBMS tables and XML files.
• Good at Performance Tuning of sources, targets, mappings and sessions.
• Monitored jobs scheduled to run on a daily and weekly basis, fixing issues in case of
any failures.
• Knowledge of Data Stage and Business Objects.
• Excellent at UNIX Shell Scripting for File validation and scheduling Jobs.
• Sound knowledge of Oracle 9i/8i, PL/SQL (Procedures, Functions). Good knowledge of
Relational Database Concepts, and Normalization Concepts.
• Extensive Database experience using Oracle, DB2, MS SQL Server, MS Access and
Sybase.
• Expertise in utilizing Toad for developing Oracle applications.
• Effective communicator with good analytical and interpersonal skills.
• Excellent Team player and can work with minimum supervision.

Professional Qualification:

• M.C.A from Osmania University, 2004

Technical Skills:
ETL Tools: Informatica Power Mart/Power Center 8.x/7.x/6.x, Data Stage, Ab Initio,
OWB 9.0.2, 9.0.3 & 10gR2
OLAP: Business Objects 6.5/Web Intelligence
RDBMS: Oracle 9i/8i (SQL, PL/SQL, Stored Procedures, Triggers), Teradata, MS Access 2000
Oracle Tools: SQL*Loader
Data Modeling: Erwin, MS Visio
Languages: SQL*Plus, SQL, PL/SQL, C, C++, Java
Internet/Web: JavaScript, HTML, XML
Operating Systems: UNIX, Windows XP/NT, HP-UX

Projects Summary:

Client: JPMC (JPMorgan Chase)


Implementer: CTS
Project: T&SS Data mart (Treasury Services Share Class). Apr 2008 – Till date.

Project Description:

A share class is a group of investment funds with similar sales charges and
distribution fees. The share class will be based on the average aggregate balance maintained for a
client over an investment period. Balances will be aligned with funds belonging to a share class
that provides the client with more attractive charge and fee options.

The average balance will be calculated by aggregating the average balance for all sweep
accounts under a specific tax id. For a given tax id, if this aggregate average balance is above
the minimum requirement for a specific share class, then all of the accounts belonging to
that tax id will be eligible for the share class.
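
As an illustration only, the eligibility rule described above can be expressed as a single SQL
aggregation; the table and column names (sweep_acct_balance, share_class_req, min_balance) are
hypothetical assumptions, not the actual T&SS data mart schema.

    -- Aggregate the average balance of all sweep accounts under each tax id,
    -- then keep the share classes whose minimum requirement is met.
    SELECT agg.tax_id,
           agg.aggregate_avg_balance,
           sc.share_class_cd
    FROM  (SELECT tax_id,
                  SUM(avg_balance) AS aggregate_avg_balance
           FROM   sweep_acct_balance               -- hypothetical table
           GROUP  BY tax_id) agg
           JOIN share_class_req sc                 -- hypothetical table
             ON agg.aggregate_avg_balance >= sc.min_balance;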

The SCR system will provide analysis for a client at the account level, and provide
recommendations for new fund alignment based on the share class to which the client is
entitled. The Financial analyst will have the ability to override the alignment manually at
the tax id level. This manual exception process will allow the financial analyst to bypass the
accounts under a tax id so that no re-alignment takes place, or the financial analyst may
manually specify a share class for the client, which will override the recommended changes
generated by the system.

Finally, after each individual sweep investment system has completed the realignment, a
status for each account re-aligned will be returned to the SCR system which will be
maintained in an SCR repository. This repository will then be available indefinitely for ad-
hoc reporting.

Responsibilities:

• Understanding the Business Processes.


• Analyzing the source data from different databases.
• Used Source Analyzer and Warehouse Designer to import the source and target database
schemas, and Mapping Designer to map sources to targets.
• Worked on flat files as sources, targets and lookups.
• Created Informatica mappings with different transformations, including Source Qualifier,
Lookup, Filter, Expression, Rank, Aggregator, Stored Procedure and more.
• Designed mappings and scheduled workflows to load data into Fact tables.
• Developed mappings to implement Type 2 slowly changing dimensions (a sketch of the
equivalent logic follows this list).
• Developed Informatica parameter files to filter the daily data from the source system.
• Used debugging techniques to troubleshoot the mappings.
• Created reusable transformations and Mapplets to use in multiple mappings.
• Extracted the data from Relational Database, Flat Files and Loaded into the target Data
warehouse.
• Responsible for scheduling the workflows.
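
The Type 2 logic mentioned above was built with Informatica transformations (Lookup and Update
Strategy); purely as a hedged sketch, the equivalent set-based SQL might look like the following,
with hypothetical table and column names (dim_customer, stg_customer, and address as the tracked
attribute).

    -- Expire the current version of a dimension row when a tracked attribute changes.
    UPDATE dim_customer d
    SET    d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);   -- tracked attribute

    -- Insert the new version (and brand-new customers) as the current row.
    INSERT INTO dim_customer
           (customer_key, customer_id, address,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
           SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y');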

Environment:
Informatica 8.1, Oracle 8i, SQL Developer, SFTP, MicroStrategy, Autosys

Client: NOKIA Helsinki, Finland


Implementer: TCS, Helsinki, Finland
Project: NDW Release 2.0
Role: Tech Lead Aug 2007 – Dec 2007

Project Description:

Nokia has a number of applications to support reporting. These data marts are integrated with some,
but not all, of the necessary systems, depending on the immediate need for which they were developed
as well as on schedule. NDW and E2E Product Quality will provide visibility into actual supplier-related
material issues in the field and give fact-based information for warranty charge-back negotiations with
suppliers. Integrating supplier management, manufacturing and quality management data also enables
easy ad hoc queries across previously separate data.

The scope of the project covers a requirement for certain material-data-related reports used by the
Environmental Affairs entity in Nokia. Environmental Affairs is interested in how much material
included in a Sales Package Nokia has delivered to countries and sales areas during a specific time frame.
The information needed is 1) quantity (shipped) and 2) weight in KG (shipped) per fraction, and
further per fraction type. Environmental Affairs delivers and utilizes material reporting for two purposes:

A) Operative use: Monthly/Quarterly material reporting for 1.EU Compliance Organizations and
2.external distributors

B) Analytical use: decision making – total sums of weights/delivered amounts (combining sales
pack weight information with country ship-to data)
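
Purely for illustration, the report described above could be served by an aggregate query like the
one below; all table and column names (delivery_fact, sales_pack_fraction, weight_kg, etc.) are
hypothetical and not the actual NDW model.

    -- Shipped quantity and shipped weight (KG) per fraction and fraction type,
    -- by country, for a given reporting period.
    SELECT d.country,
           f.fraction_type,
           f.fraction,
           SUM(s.shipped_qty)               AS shipped_qty,
           SUM(s.shipped_qty * f.weight_kg) AS shipped_weight_kg
    FROM   delivery_fact s
           JOIN sales_pack_fraction f ON f.sales_pack_id = s.sales_pack_id
           JOIN ship_to_dim d         ON d.ship_to_id    = s.ship_to_id
    WHERE  s.delivery_month BETWEEN :period_start AND :period_end
    GROUP  BY d.country, f.fraction_type, f.fraction;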

Responsibilities:
• Designed the Conceptual Integration Model (CIM-0) diagram for NDW Release 2
Environmental Reporting.
• Defined non-functional requirements for NDW Release 2 Environmental Reporting
by communicating with end users.
• Responsible for Data Model including tables, Synonyms, Sequences and other DB objects to
support proposed POC.
• Responsible for ETL design to implement the POC using MS VISIO 5.0.
• Created project modules, DB links, runtime users including configuration of source, target
and project modules.
• Performed Change Requests on Informatica mappings according to application
enhancements.
• Performed and documented Unit, System and Regression testing of the project in
compliance with the defect tracking process using Mercury Test Director 5.0.
• Responsible for identifying and fixing performance bottlenecks in Informatica mappings.
• Responsible for documenting the user manual for the end users.

Environment:

Informatica 8.1, Teradata, Erwin 4.0, Test Director 5.0, MVS, Cognos, System Architect tool, MS
VISIO

Client: GE Asset Management, USA


Implementer: TCS
Project: Finance Total Returns - Visions
Role: ETL Developer Dec 2006 – Jul 2007.

Environment:

Informatica 7.1, Oracle 9i, Rapid SQL, WSFTP, MVS, Business Objects 6.5, APPWORX TOOL,
Autosys and Exceed.

Description:

GE-AM is a wholly owned subsidiary of GE that is in the funds management and
portfolio management business. GE Asset Management is in the business of managing assets. At
first, it managed only GE's employee pension and other benefit assets. Today GEAM is a
registered investment advisor with offices in the U.S., Canada, Europe and Japan. It typically
invests in U.S. equity, international equity, fixed income, real estate, private equity,
etc.
ETL process:

The ETL layer first extracts the data from the source systems, then cleanses and stages the data in the
staging area. The data is subsequently transformed, consolidated and loaded into the
Data Marts. From there, the reporting layer and application layer access the data marts to
serve their operational purposes.
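
As a hedged sketch of this staging-to-mart flow (not the actual GEAM objects), the two loads might
look like the SQL below; all table and column names are illustrative assumptions.

    -- Land cleansed source rows in staging (ext_total_returns is assumed to be
    -- an external table or SQL*Loader target over the source file).
    INSERT INTO stg_total_returns (portfolio_id, as_of_dt, return_pct)
    SELECT TRIM(portfolio_id),
           TO_DATE(as_of_dt, 'YYYYMMDD'),
           TO_NUMBER(return_pct)
    FROM   ext_total_returns;

    -- Consolidate staged rows into the finance data mart, resolving the
    -- portfolio dimension key.
    INSERT INTO dm_finance_returns (portfolio_key, as_of_dt, return_pct)
    SELECT p.portfolio_key, s.as_of_dt, s.return_pct
    FROM   stg_total_returns s
           JOIN dim_portfolio p ON p.portfolio_id = s.portfolio_id;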

Responsibilities:

• Involved in requirements gathering and translation of business processes into
Informatica mappings.
• Extraction and loading of data from various sources like Flat files, Oracle, Mainframe to
Oracle database using Informatica mappings.
• Developed various simple to complex mappings in Informatica using transformations such as
Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator and Update
Strategy.
• Enhanced the performance of SQL statements and mappings by identifying bottlenecks and
used the Informatica Debugger to eliminate them.
• Involved in writing PL/SQL packages, stored procedures and functions to implement
the business rules, and in tuning SQL queries using Explain Plan in Toad (a sketch
follows this list).
• Involved in writing the shell scripts to automate the execution of sessions and scheduled the
jobs using Informatica Scheduler.
• Used Toad extensively to debug, test SQL and PL/SQL scripts, procedures, functions,
triggers
• Created automated scripts to create views for objects in an external database and validated
them.
• Involved in maintaining Reporting database by extracting data for report developers
• Worked on moving mappings from Development to Test and from Test to Production
environments
• Worked closely with the DBA in creating tables, indexes and views, and in index rebuilds.
• Involved in testing at various levels: unit, performance and integration.
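
The PL/SQL and Explain Plan work referenced in this list might, as a rough sketch under assumed
names (apply_return_rules, dm_finance_returns), look like the following; it is illustrative only,
not the actual GEAM packages.

    -- A small stored procedure applying a business rule for a given date.
    CREATE OR REPLACE PROCEDURE apply_return_rules (p_as_of_dt IN DATE) AS
    BEGIN
       UPDATE dm_finance_returns r
       SET    r.return_status = 'FINAL'
       WHERE  r.as_of_dt   = p_as_of_dt
       AND    r.return_pct IS NOT NULL;
       COMMIT;
    END apply_return_rules;
    /

    -- Explain Plan based tuning of a query, as run from Toad or SQL*Plus.
    EXPLAIN PLAN FOR
       SELECT * FROM dm_finance_returns WHERE as_of_dt = TRUNC(SYSDATE);
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);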

Client: CompU Credit


Implementer: GOLDSTONE INFO SYSTEMS PVT LTD.
Project: FNBO Data Load. Aug 2004 - Nov 2006

Project Description:

CompUCredit is one of the leading information providers in the banking domain. Their
main business is to acquire portfolios of under-served consumers from other banks, assign
potential rankings and serve those consumers. They also market the information to other banks on a
fee basis. CompUCredit has now acquired new portfolios from First National Bank of
Omaha (FNBO). They want the portfolios acquired from the FNBO database to be
extracted into an intermediate staging area within the CompUCredit Data Store
(CCDS), with the required business transformations then performed on the data so that it can be
loaded into the CCDS. The current project deals with acquiring FNBO's portfolios into the
CCDS.

Responsibilities:

• Designed ETL processes using Informatica to load data from flat files and Excel files into
the target Oracle data warehouse database.
• Performed data manipulations using various transformations like Joiner, Expression,
Lookup, Aggregator, Filter, Update Strategy, Sequence Generator, Stored Procedure, etc.
• Used SQL overrides in Source Qualifier to meet business requirements (a sketch follows
this list).
• Wrote pre-session and post-session scripts in mappings. Created sessions and workflows
for the designed mappings. Redesigned some of the existing mappings in the system to meet
new functionality.
• Created and used different tasks like command and email tasks for session status
• Used Workflow Manager to create Sessions and scheduled them to run at specified time with
required frequency
• Monitored and configured the sessions that are running, scheduled, completed and failed
• Involved in writing UNIX shell scripts for Informatica ETL tool to fire off services and
sessions
• Migrated mappings from Development to Test and from Test to Production environments
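
As an illustration of the Source Qualifier override mentioned above (not the actual FNBO queries),
a typical override pre-joins and filters the source rows before they enter the mapping; the table
names, columns and mapping parameters below are assumptions.

    -- Restrict the FNBO source to one portfolio and an incremental load window;
    -- $$PORTFOLIO_ID and $$LOAD_DATE would come from an Informatica parameter file.
    SELECT a.account_no,
           a.portfolio_id,
           c.customer_id,
           a.current_balance
    FROM   fnbo_account a
           JOIN fnbo_customer c ON c.customer_id = a.customer_id
    WHERE  a.portfolio_id = '$$PORTFOLIO_ID'
    AND    a.load_dt >= TO_DATE('$$LOAD_DATE', 'YYYY-MM-DD');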
