Informatica TDM Resume

- The document describes a candidate's 15+ years of experience in IT with expertise in ETL using Informatica PowerCenter for data warehousing projects in various industries. Their skills include data mapping, modeling, masking, loading data from multiple sources into databases like Oracle, Teradata and SQL Server. They have experience architecting, developing, testing and maintaining large data warehouse implementations.

SUMMARY:

• 15+ years’ professional experience in the IT industry, with special emphasis on the design, development and testing of Database/Data Warehousing applications using Informatica PowerCenter across domains such as Pharmaceutical, Financial, Healthcare, and Media & Entertainment.
• Areas of expertise include Masking, Column & Data Mapping, ETL, TDM, Data Archiving/Growth, and Data Profiling solutions using IBM Optim 11.3 for z/OS and Distributed (LUW).
• Strong experience in Informatica PowerCenter and PowerConnect for ETL (Extraction, Transformation and Loading) of data from multiple source database systems in both UNIX and Windows environments.
• Diversified domain expertise implementing solutions for Data Warehousing and Big Data projects.
• Practical understanding and experience with Data Mapping, Data Validation and Data
Modeling (Dimensional & Relational) concepts such as Star-Schema Modeling, Snowflake
Schema Modeling, Fact and Dimension tables.
• Solid experience in Informatica PowerCenter (versions 10/9.6/8.6/7.1), including PowerCenter Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Admin Console, and Teradata.
• Expertise in Oracle PL/SQL: Functions, Procedures, Packages, Triggers and Materialized Views.
• Excellent knowledge in Creating and Maintaining Database objects like Tables, Indexes,
Views, Synonyms, Stored Procedures and Packages.
• Experienced in RDBMS design, data modeling, data normalization and SQL tuning using
Indexes and working with UTL Files.
• Extensively worked on UNIX shell scripting for creating and maintaining the parameter files,
executing database procedures, calling Informatica workflows, and job scheduling.
• Effectively communicate with business users, Business Analysts, project manager and team
members.
• Strong ability to work within demanding and aggressive project schedules and environments.
• Excellent analytical and problem-solving skills; a motivated team player with excellent interpersonal skills.
• Extensive experience in architecting, implementing, maintaining and migrating Data
Warehouse projects.
• Excellent hands-on experience in monitoring bug status using Jira and maintaining project documentation using SharePoint.
• Good knowledge of Azure Data Factory, Salesforce, and SQL Server 2014 Management Studio.
• Extensively developed complex stored procedures and functions to maintain and populate data in the data marts, EDW system and OLAP database.
• Effective independent contributor and team player; a self-starter with high adaptability to new technologies and a penchant for learning new things.
• Ability to meet deadlines and handle pressure while coordinating multiple tasks in a project environment.
• Excellent written and verbal communication skills blended with good analytical reasoning.
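For illustration, the UNIX shell scripting pattern described above (generating an Informatica parameter file, then launching a workflow with pmcmd) can be sketched as follows. This is a minimal sketch only; all folder, workflow, service and path names are hypothetical, and the command is only assembled, not executed, since running it would require a PowerCenter installation:

```python
# Sketch: build an Informatica parameter file and the pmcmd command
# that would start the workflow. All names here are hypothetical
# examples, not taken from any actual project.

def build_param_file(folder, workflow, params):
    """Render a [Folder.WF:workflow] section in parameter-file syntax."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

def build_pmcmd_command(service, domain, folder, workflow, param_file):
    """Assemble a pmcmd startworkflow invocation as an argument list."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-f", folder, "-paramfile", param_file,
        "-wait", workflow,
    ]

param_text = build_param_file(
    "SALES_DM", "wf_load_sales",
    {"$$LOAD_DATE": "2019-06-01", "$$SRC_SCHEMA": "STG"},
)
cmd = build_pmcmd_command(
    "IS_PROD", "Domain_Prod", "SALES_DM",
    "wf_load_sales", "/opt/infa/params/wf_load_sales.par",
)
print(param_text)
print(" ".join(cmd))
```

In practice a wrapper shell script would write the parameter file and invoke pmcmd directly; the Python form above just makes the two artifacts easy to see.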

TECHNICAL SKILLS:

Operating Systems           Windows 98/NT/2000/XP/2003/2008 Server, Windows 7, UNIX
Databases                   Oracle 12c/11g/10g/9i/8i, MS SQL Server 7.0/2000/2005/2008/2012, MS Access 97/2000/2003, Teradata 13.0/14.0/15.0, SQL Server 2014 Management Studio
Dimensional Data Modeling   Dimensional Data Modeling, Star and Snowflake Schema Modeling, Physical and Logical Data Modeling, Fact and Dimension tables
ETL Tools and BI Tool       Azure Data Factory, Informatica PowerCenter, Informatica Cloud and IDQ
Languages                   Java 7/8, PL/SQL, HTML, XML, Shell Scripting, SQL*Plus, SQL*Loader
Big Data                    HDFS, MapReduce, Hive, Pig, Sqoop, Spark and Oozie

PROFESSIONAL EXPERIENCE:
• Working as Senior Developer at Allergan from Jun 18 till Jun 19.
• Worked as Senior Consultant at CGI-Federal from Jun 17 till May 18.
• Worked as Senior Consultant at Capgemini from Feb 15 till May 17.
• Worked as Senior Developer at HCL Technologies from May 10 till Jan 15.
• Worked as Senior Developer at Cegedim Software India Pvt. Ltd from Mar 07 till May 10.
• Worked as Software Developer at Marques Consulting from Dec 03 till Feb 07.

Project #1    Jun 2018 till date

Title : FOCUS
Client : Allergan.
Location : Irvine, CA.
Company : Irvine
Domain : Healthcare
Team Size : 4
Role : Informatica Developer.

Responsibilities:
• Worked with stakeholders on requirements gathering, analysis, design, development and testing for end-to-end solutions, and successfully implemented the project.
• Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to
design the business process, dimensions and measured facts.
• Executed test cases and supported the workflows in Dev and QA
• Provisioned and masked test databases using Informatica ILM – TDM and PowerCenter 9.6 HotFix.
• Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager,
and Workflow Monitor.
• Worked on Azure Data Factory for extracting the data from Sales Force.
• Developed standard and reusable mappings and mapplets using various transformations like
Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.
• Worked on various tasks like Session, E-Mail task and Command task.
• Worked on different tasks in Workflows like sessions, event wait, decision, e-mail, command,
worklets, Assignment, Timer and scheduling of the workflow.
• Involved in building the ETL architecture and Source to Target mapping to load data into Data
warehouse.
• Troubleshooting of long running sessions and fixing the issues related to it.
• Worked with variables and parameters in the mappings to pass values between sessions.
• Worked with Services and Portal teams on various occasion for data issues in OLTP system.
• Worked with the testing team to resolve bugs in ETL mappings before migrating to
production.
• Involved in meetings with production team for issues related to Deployment, maintenance,
future enhancements, backup and crisis management of DW.
• Worked with production team to resolve data issues in Production database of OLAP and
OLTP systems.
• Designed tables required for the execution of the ETL processes using ERwin.
• Extracted data stored in a multi-level hierarchy using Oracle Stored Procedures.
• Loaded Dimension, Fact and Exception tables and automated email generation when
exceptions occurred.
• Provided excellent support during QA/UAT testing by working with multiple groups.
• Got involved in unit testing, regression testing, function-based testing and system integration testing.
• Improved performance using Oracle Partitions, Indexes and other Performance Tuning
techniques.
• Worked on data profiling, data validation and data analysis.
• Performed sanity checks and smoke tests after the installation of new ETL scripts.
• Prepared Data Mapping documents, process documents, BRD, and Technical and Functional specification documents.
• Efficiently handled multiple projects during resource crunch.
• Provided Training to the team with the functional and technical knowledge.

Environment: Informatica PowerCenter 9.6, Workflow Manager, Workflow Monitor, Toad, SQL Developer, SOQL, Oracle 11g, PL/SQL, Azure Data Factory, Salesforce and SQL Server 2014 Management Studio.

Project #2    Jun 2017 till May 2018

Title : PECOS
Client : CGI-Federal.
Location : VA, US.
Company : CGI-Federal
Domain : Healthcare
Team Size : 12
Role : Team Lead.

Responsibilities:
• Involved in leading a team of 12 members.
• Responsible for Business Analysis and Requirements gathering.
• Involved in the TDM project: subsetting data and masking sensitive columns using Informatica PowerCenter 9.1.
• Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager,
and Workflow Monitor.
• Worked with heterogeneous sources to extract data from Oracle databases, XML and flat files, loading into a relational Oracle warehouse.
• Developed standard and reusable mappings and mapplets using various transformations like
Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.
• Provisioned and masked test databases using Informatica ILM – TDM and PowerCenter 9.6 HotFix.
• Performed tuning of SQL queries and Stored Procedures for speedy extraction of data, to
resolve and troubleshoot issues in OLTP environment.
• Involved in building the ETL architecture and Source to Target mapping to load data into Data
warehouse.
• Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to
design the business process, dimensions and measured facts.
• Troubleshooting of long running sessions and fixing the issues related to it.
• Worked with variables and parameters in the mappings to pass values between sessions.
• Involved in the development of PL/SQL stored procedures, functions and packages to process business data in the OLTP system.
• Developed mapping parameters and variables to support SQL override.
• Carried out changes into Architecture and Design of Oracle Schemas for both OLAP and
OLTP systems.
• Worked with Services and Portal teams on various occasion for data issues in OLTP system.
• Worked with the testing team to resolve bugs in ETL mappings before migrating to
production.
• Creating the weekly project status reports, tracking the progress of tasks according to
schedule and reporting any risks and contingency plan to management and business users.
• Involved in meetings with production team for issues related to Deployment, maintenance,
future enhancements, backup and crisis management of DW.
• Worked with production team to resolve data issues in Production database of OLAP and
OLTP systems
• Designed tables required for the execution of the ETL processes using ERwin.
• Extracted data stored in a multi-level hierarchy using Oracle Stored Procedures.
• Loaded Customer data in multiple levels (rows) using Oracle Stored Procedures and Cursors.
• Got involved in unit testing, regression testing, function-based testing and system integration testing.
• Provided excellent support during QA/UAT/Beta testing by working with multiple groups.
• Optimized the SQLs and Informatica mappings which handled millions of records.
• Improved performance using Oracle Partitions, Indexes and other Performance Tuning
techniques.
• Developed re-usable components in Informatica, Oracle and UNIX
• On-Call/Production Support provided during day-time and off-hours.
• Actively participated in Install/Deployment plan meetings.
• Performed sanity checks and smoke tests after the installation of new ETL scripts.
• Efficiently handled multiple projects during resource crunch.
• Provided Training to the team with the functional and technical knowledge.
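The masking of sensitive columns mentioned above is normally done with deterministic tokens, so the same input always masks to the same output and joins between tables still work after masking. A minimal sketch of that idea (column names, sample rows and token format are hypothetical, not from the actual TDM configuration):

```python
# Sketch: deterministic masking of sensitive columns. The same raw
# value always yields the same token, preserving referential
# integrity across masked tables. All data here is made up.
import hashlib

def mask_value(value, salt="demo-salt"):
    """Replace a sensitive value with a stable surrogate token."""
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return "TKN_" + digest[:10].upper()

def mask_rows(rows, sensitive_columns):
    """Return copies of the rows with sensitive columns tokenized."""
    masked = []
    for row in rows:
        out = dict(row)
        for col in sensitive_columns:
            if col in out:
                out[col] = mask_value(out[col])
        masked.append(out)
    return masked

customers = [
    {"cust_id": 1, "ssn": "123-45-6789", "name": "Alice"},
    {"cust_id": 2, "ssn": "987-65-4321", "name": "Bob"},
]
masked = mask_rows(customers, {"ssn", "name"})
```

In Informatica TDM this behavior corresponds to applying a repeatable masking rule to the sensitive columns rather than hand-written code; the sketch only shows the property being relied on.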
Environment: Informatica PowerCenter 9.6, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer 8.1, Toad, SQL Developer, Oracle 11g, SQL*Loader, PL/SQL, Erwin, Linux, Teradata, MicroStrategy and Tableau.

Project #3    Feb 2015 to May 2017

Title : HEDW
Client : NBC Universal.
Location : LA, US.
Company : Capgemini, India
Domain : Media & Entertainment
Team Size : 12
Role : Team Lead.

Responsibilities:
• Involved in gathering requirements from business users. Participated in the detailed
requirement analysis for the design of data marts.
• Provisioned and masked test databases using Informatica ILM – TDM and PowerCenter 9.6 HotFix.
• Actively interface with other teams to gather requirements, design, code, debug, document,
implement and maintain various DB projects.
• Worked according to the specifications and restrictions of the company by strictly maintaining
data privacy.
• Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow
Manager, and Workflow Monitor.
• Parsed high-level design specification to simple ETL coding and mapping standards.
• Designed and customized data models for Data warehouse supporting data from multiple
sources.
• Worked with Informatica Data Quality (IDQ 8.6.1) for data quality measurement.
• Involved in building the ETL Source to Target mapping to load data into Data warehouse.
• Created mapping documents to outline data flow from sources to targets.
• Extracted the data from the flat files and other RDBMS databases into staging area and
populated onto Data warehouse.
• Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
• Created mapplets to use them in different mappings.
• Developed mappings to load into staging tables and then to Dimensions and Facts using
existing ETL standards.
• Worked on different tasks in Workflows like sessions, event raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.
• Created sessions, configured workflows to extract data from various sources, transformed
data, and loading into data warehouse.
• Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.
• Extensively used SQL* loader to load data from flat files to the database tables in Oracle.
• Modified existing mappings for enhancements of new business requirements.
• Involved in Performance tuning at source, target, mappings, sessions, and system levels.
• Prepared migration document to move the mappings from development to testing and then to
production repositories
• Migrated the codes from Dev to Test and Test to Prod. Wrote the migration documentation in
details for system compatibility, object and parameter files for smooth transfer of code into
different environments.
• Designed the automation process of Sessions, Workflows, scheduled the Workflows, created
Worklets (command, email, assignment, control, event wait/raise, conditional flow etc.) and
configured them according to business logic & requirements to load data from different
Sources to Targets.
• Created Pre & Post-Sessions UNIX Scripts, Functions, Triggers and Stored Procedures to
drop & re-create the indexes and to solve the complex calculations on data.
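The Type 2 SCD mappings mentioned above follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new version is inserted. A minimal sketch of that logic on in-memory rows (table layout, column names and dates are hypothetical; in PowerCenter this is implemented with Lookup, Expression and Update Strategy transformations rather than code):

```python
# Sketch: Slowly Changing Dimension Type 2 — expire the current row
# and insert a new version when a tracked attribute changes. The
# dimension layout and sample values are hypothetical.
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked, today):
    """dim_rows: dimension rows (dicts with is_current/eff_date/
    end_date); incoming: the new source record for one key."""
    current = next(
        (r for r in dim_rows
         if r[key] == incoming[key] and r["is_current"]),
        None,
    )
    if current and all(current[c] == incoming[c] for c in tracked):
        return dim_rows  # no change: keep the current version
    if current:  # change detected: close out the old version
        current["is_current"] = False
        current["end_date"] = today
    new_row = dict(incoming)
    new_row.update(is_current=True, eff_date=today, end_date=None)
    dim_rows.append(new_row)
    return dim_rows

dim = [{"cust_id": 1, "city": "Irvine", "is_current": True,
        "eff_date": date(2015, 1, 1), "end_date": None}]
dim = apply_scd2(dim, {"cust_id": 1, "city": "Los Angeles"},
                 key="cust_id", tracked=["city"],
                 today=date(2016, 6, 1))
```

A Type 1 mapping, by contrast, would simply overwrite the attribute in place without keeping the old version.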

Environment: Informatica PowerCenter, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Erwin, Toad, SQL Developer, Oracle 11g, SQL*Loader, Linux, PL/SQL, Teradata, MicroStrategy and Tableau.

Project #4    Dec 2013 to Jan 2015

Title : MTNA
Client : Deutsche Bank.
Location : New York, US.
Company : HCL Technologies, India
Domain : Banking
Team Size : 16
Role : Sr.Developer.

Responsibilities:
• Extensively involved in requirements gathering, writing ETL Specs and preparing design
document
• Worked on parameters, variables, procedures, scheduling and pre/post session shell scripts
• Developed code to extract, transform, and load (ETL) data from inbound flat files and various
databases into outbound flat files and XML files using complex business logic
• Expertise in performance tuning of the mappings, ETL Environments, procedures, queries
and processes
• Expertise in handling and loading high volumes of data into data warehouse in a given load
window
• Expertise in preparing test strategy, test plan, test summary reports, test cases and test
scripts for automated and manual testing based on user requirement documents and system
requirement documents.
• Extensive experience in Data warehouses with strong understanding of Logical, Physical, and
Dimensional Data Modeling to Design Star and Snow flake Schemas.
• Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removing
duplicate data.
• Used SCD (type1, type2 and type3) to Load present and Historical Data to ODS, EDW.
• Involved in creating, monitoring, modifying, & communicating the project plan with other team
members.
• Experience in the ETL process using ETL tools: creating and maintaining the Repository, source systems and target databases, and developing strategies for ETL mechanisms; gathering and analysis, design, development, testing, performance tuning, and production support.
• Experience in Unix shell scripting, scheduling cron jobs, and job scheduling on multiple platforms like Windows NT/2000/2003 and Unix.
• Strong technical skills in Oracle, SQL, PL/SQL, MySQL.
• Worked with architects and DBA in designing the rules engine and implementing the logical,
physical and Dimensional model based on the rules and metadata.
• Responsible for preparing the technical specifications from the business requirements

Environment: Informatica PowerCenter 9.1, Workflow Manager, Workflow Monitor, Oracle 11g/12c, Toad, Sun Solaris 11.1, Visual SourceSafe, SQL, PL/SQL, BO, Unix, AutoSys and Tableau.

Project #5    Jun 2010 to Nov 2013

Title : ABFO
Client : Deutsche Bank.
Location : Frankfurt, Germany.
Company : HCL Technologies, India
Domain : Banking
Team Size : 16
Role : Sr.Developer.

Responsibilities:
• Extensively involved in requirements gathering, writing ETL Specs and preparing design
document
• Expertise in Masking, Column & Data Mapping, ETL, TDM, Data Archiving, Retiring and Data Growth for 20 applications using IBM Optim z/OS and Distributed (LUW).
• Designed and developed ETL mappings for data sharing between interfaces utilizing SCD
type 2 and CDC methodologies
• Worked on parameters, variables, procedures, scheduling and pre/post session shell scripts
• Developed code to extract, transform, and load (ETL) data from inbound flat files and various
databases into outbound flat files and XML files using complex business logic
• Expertise in performance tuning of the mappings, ETL Environments, procedures, queries
and processes
• Expertise in handling and loading high volumes of data into data warehouse in a given load
window
• Expertise in preparing test strategy, test plan, test summary reports, test cases and test
scripts for automated and manual testing based on user requirement documents and system
requirement documents.
• Extensive experience in Data warehouses with strong understanding of Logical, Physical, and
Dimensional Data Modeling to design Star and Snow flake Schemas.
• Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removing duplicate data.
• Used SCD (type1, type2 and type3) to Load present and Historical Data to ODS, EDW.
• Planned, created and executed SSIS packages to integrate data from varied sources like
Oracle, DB2, flat files and SQL databases and loaded into landing tables.
• Involved in creating, monitoring, modifying, & communicating the project plan with other team
members.
• Experience in ETL process using ETL Tools, creating and maintaining Repository, Source
systems, Target Databases and developing strategies for ETL mechanisms using ETL tools
• Gathering and analysis, design, development, testing, performance tuning, and production
support.
• Experience in Unix shell scripting, scheduling cron jobs, and job scheduling on multiple platforms like Windows NT/2000/2003 and Unix.
• Worked with architects and DBA in designing the rules engine and implementing the logical,
physical and Dimensional model based on the rules and metadata.
Environment: Informatica PowerCenter 9.1, Workflow Manager, Workflow Monitor, Oracle 11g/12c, Toad, Sun Solaris 11.1, Visual SourceSafe, SQL, PL/SQL, BO, Unix, AutoSys and Tableau.

Project #6    Mar 2009 to May 2010

Title : Sample Shipment.
Client : Sanofi Aventis.
Location : NJ, US.
Company : Cegedim Software India Pvt. Ltd.
Domain : Health Care
Team Size : 18
Role : Sr.Developer

Responsibilities:
• Responsible for developing, support and maintenance for the ETL (Extract, Transform and
Load) processes using Informatica Power Center
• Develop Mappings and Workflows to generate staging files.
• Developed various transformations like Source Qualifier, Sorter transformation, Joiner
transformation, Update Strategy, Lookup transformation, Expressions and Sequence
Generator for loading the data into target table.
• Created multiple Mapplets, Workflows, Tasks, database connections using Designer,
Workflow Manager
• Created sessions and batches to move data at specific intervals & on demand using Server
Manager
• Used debugger to validate the mappings and gain troubleshooting information about data and
error conditions.
• Extensively used Unix Scripting, Scheduled PMCMD and PMREP to interact with Informatica
Server from command mode.
• Responsibilities include creating the sessions and scheduling the sessions
• Extracted the data from Oracle, DB2, CSV and Flat files
• Implemented performance tuning techniques by identifying and resolving the bottlenecks in
source, target, transformations, mappings and sessions to improve performance.
• Understanding the Functional Requirements.
• Designed the dimension model of the OLAP data marts in Erwin.
• Wrote documentation to describe program development, logic, coding, testing, changes and
corrections.
• Performed Unit testing and System integration testing. Preparing the documents for test data
loading
• Troubleshot production failures and provided root cause analysis; worked on emergency code fixes to production.
• Created Mappings to extract the unique account numbers from the data warehouse tables,
and updated them with tokens in PCCD/SKOUT/KOUT files and data warehouse tables.
Prepared the Test Strategy and Test Case documents for Unit testing, developed and
executed unit tests for the D & B Business requirement team
• Migrated the code to different environments like QA and PROD, and explored Serena while
importing Informatica objects, Unix scripts and procedures.
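The account-number tokenization described above relies on a consistent token map: each distinct account number gets exactly one surrogate, which is then applied across every file and table that carries it. A minimal sketch of that mechanism (field names, sample records and token format are hypothetical, not the actual PCCD/SKOUT/KOUT layouts):

```python
# Sketch: replace account numbers with surrogate tokens consistently
# across extracts. The token format and sample data are made up.

def build_token_map(account_numbers, prefix="ACCT"):
    """Assign one sequential token per distinct account number."""
    tokens = {}
    for acct in account_numbers:
        if acct not in tokens:
            tokens[acct] = f"{prefix}{len(tokens) + 1:06d}"
    return tokens

def tokenize_records(records, token_map, field="account_no"):
    """Swap the real account number for its token in each record."""
    return [dict(r, **{field: token_map[r[field]]}) for r in records]

extract = [
    {"account_no": "4111-0001", "amount": 250.0},
    {"account_no": "4111-0002", "amount": 75.5},
    {"account_no": "4111-0001", "amount": 10.0},
]
tokens = build_token_map(r["account_no"] for r in extract)
tokenized = tokenize_records(extract, tokens)
```

Because the same map is reused for every file, records that shared an account number before tokenization still share one afterwards, which keeps downstream reconciliation intact.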
Environment: Informatica PowerCenter 8.6.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Windows XP, TOAD, SQL.

Project #7    Jul 2008 to Feb 2009

Title : Quantum.
Client : Sanofi Aventis.
Location : NJ, US.
Company : Cegedim Software India Pvt. Ltd.
Domain : Health Care
Team Size : 18
Role : Sr.Developer

Responsibilities:
• Responsible for Business Analysis and Requirements gathering.
• Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow
Manager, and Workflow Monitor.
• Parsed high-level design specification to simple ETL coding and mapping standards.
• Designed and customized data models for Data warehouse supporting data from multiple
sources.
• Involved in building the ETL architecture and Source to Target mapping to load data into Data
warehouse.
• Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to
design the business process, dimensions and measured facts.
• Extracted the data from the flat files and other RDBMS databases into staging area and
populated into Data warehouse.
• Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
• Developed mapping parameters and variables to support SQL override.
• Created mapplets to use them in different mappings.
• Invoked Unix Scripts through Command task, Under Session and Pre/Post session
commands
• Developed mappings to load into staging tables and then to Dimensions and Facts.
• Used existing ETL standards to develop these mappings.
• Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-
mail, command, worklets, Assignment, Timer and scheduling of the workflow.
• Created sessions, configured workflows to extract data from various sources, transformed
data, and loading into data warehouse.
• Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.
• Extensively used SQL* loader to load data from flat files to the database tables in Oracle.
• Modified existing mappings for enhancements of new business requirements.
• Involved in Performance tuning at source, target, mappings, sessions, and system levels.
• Prepared migration document to move the mappings from development to testing and then to
production repositories.

Environment: Informatica PowerCenter 8.6.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Windows XP, TOAD, SQL.

Project #8    Mar 2007 to Jun 2008

Title : EDS.
Client : Sanofi Aventis.
Location : NJ, US.
Company : Cegedim Software India Pvt. Ltd.
Domain : Health Care
Team Size : 18
Role : Sr.Developer

Responsibilities:
• Involved in gathering requirements from business users. Participated in the detailed
requirement analysis for the design of data marts and star schemas.
• Created Technical design specification documents based on the functional design documents
and the physical data model.
• Extracted data from several source systems like Oracle, DB2, Flat files, XML files, etc. and
loaded data into Enterprise Data warehouse. Designed many Multi Source Single Target
mappings and vice versa.
• Worked on Power Center Designer client tools like Source Analyzer, Target Analyzer,
Mapping Designer and Mapplet Designer.
• Created reusable transformations by using Lookup, Aggregator, Normalizer, Update strategy,
Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the
Transformation Developer
• Moved the data from source systems to different schemas based on the dimensions and fact
tables by using the slowly changing dimensions type 2 and type 1.
• Raised change requests, performed incident management, and analyzed and coordinated resolution of program flaws in the Development environment, hot-fixing them in QA during the runs.
• Informatica workflow manager was used to create, schedule, execute Sessions, Worklets,
Command, E-Mail Tasks and Workflows. Performed validation and loading of the Flat files
received from business users.
• Wrote several oracle stored procedures, functions and packages for build and deployment
process.
• Wrote UNIX Shell scripts to schedule the workflows and also to extract the data from the
various databases and file systems
• Managed various data marts by collaborating with different organizations including IT teams
and DBAs to build a quality system and adhering to standard operating procedures resulted in
a highly available 24x7 system to business and product development consumers worldwide
• Used Parameter files to reuse the mapping with different criteria to decrease the
maintenance.
• Used Autosys to schedule Informatica, SQL script and shell script jobs.
• Identifying the issues and resolutions by coordinating with the QA team and acted as a point
of contact between the Dev and the QA team.
• Trained and provided knowledge transfer to colleagues and technical consultants who
developed applications and generated more than 300 reports and metrics utilizing the
Engineering Data Warehouse environment.

Environment: Informatica PowerCenter 8.6.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Windows XP, TOAD, SQL.

Project #9    Sep 2005 to Feb 2007

Title : DNB.
Client : Global Hygiene Match.
Location : Texas, US.
Company : Marques Software India Pvt. Ltd.
Domain : Health Care
Team Size : 26
Role : Developer

Responsibilities:
• Involved in Full life cycle of Business Objects reporting Application from requirements
gathering to deployment of the reporting system to the production environment.
• Designed and developed Reports and Universes based on the specifications and assigned
them to respective domains.
• Developed Transformation logic and designed various complex Mappings in the Designer for
data load and data cleansing.
• Responsible for using the capabilities of PowerCenter namely list files, pmcmd, pmrep,
Target override, Persistent lookup, Dynamic & Static lookup.
• Develop Logical and Physical data models that capture current state/future state data
elements and data flows using Erwin.
• Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
• Created several Procedures, Functions, Triggers and Packages to implement the functionality
in PL/SQL.
• Developed Informatica Mappings, Re-usable Transformations, and Mapplets for data load to
data warehouse and database (oracle)
• Support between developers and end users, functional managers and Oracle regarding
issues with database software, SQL programming, tuning, releases and development with
Oracle tools like Oracle Warehouse Builder.
• Involved in creating ETL model (snowflake schema), normalizing and documenting.
• Walked through the Logical and Physical Data Models of all source systems for data quality
analysis.
• Involved in XML parsing and loading XML files into the database.
• Optimizing/Tuning mappings for better performance and efficiency.

Environment: Informatica PowerCenter 8.6.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Windows XP, TOAD, SQL.

Project #10    Dec 2003 to Aug 2005
Title : Analyzer Warehouse.
Client : Apex Finance.
Location : CA, US.
Company : Marques Software India Pvt. Ltd.
Domain : Finance
Team Size : 26
Role : Developer

Responsibilities:
• Involved in gathering, analyzing, and documenting business requirements, functional
requirements and data specifications for Business Objects universes and reports.
• Created parameterized Crystal reports using Charts, Cross-Tab reports, sub-reports, Running
Totals.
• Created mappings for dimensions and facts.
• Designing and creation of complex mappings involving transformations such as expression,
joiner, aggregator, lookup, update strategy, and filter.
• Extracted data from various sources like Oracle, flat files and XML.
• Worked extensively on Source Analyzer, Mapping Designer, Mapplet designer, Warehouse
Designer and Transformation Developer.
• Worked on Informatica PowerCenter tools- Designer, Repository Manager, Workflow
Manager, and Workflow Monitor.
• Analyzed, designed, developed, implemented and maintained moderate to complex initial
load and incremental Informatica mappings and workflows to provide data for reporting
purposes
• Created complex mappings using different transformations such as Filter, Router, Connected
& Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Union, Expression and
Aggregator transformations to pipeline data to DataMart. Also, made use of variables and
parameters.
• Designed and created scripts for objects such as tables, indexes, views, sequences, object
types and collection types.
• Used changed data capture type mappings to load slowly changing Type 1, 2 and 3
dimensions.
• Extensively used Power Center command line utilities such as pmcmd, Pmrep embedded in
shell scripts to perform various administrative tasks such as migrations, repository backup
scripts, start/stop scripts, statistics collections etc.
• Tuned inefficient queries using combinations of index creation, alternate indexing strategies,
join strategies and query level optimization hints
• Developed Oracle PL/SQL Stored Procedures.
• Responsible for testing and validating the Informatica mappings against the pre-defined ETL
design standards.
• Involved in trouble shooting the issues faced in the production environment and serving as a
point of contact in between the QA team and Production team, minimizing the loss by
providing swift and immaculate solutions in a timely manner.

Environment: Informatica PowerCenter 8.6.1 (PowerCenter Designer, workflow manager, workflow


monitor), Oracle 10g, Windows XP, TOAD, SQL.

EDUCATION:

• Bachelor’s in Electronics and Communications Engineering from JNTU, Hyderabad, in 2003.