Elizabeth: ETL Informatica Developer
[email protected]
734-367-4333
PROFESSIONAL SUMMARY:
Almost 7 years of IT experience working with the Informatica Power Center tool, implementing ETL methodology for data extraction, transformation and loading.
Experience in the SDLC (requirement analysis, design, development, testing and implementation) and good knowledge of data warehousing concepts.
Working knowledge of installing, configuring and administering Informatica Power Center Server.
Good experience handling data from various sources and building ETL jobs for data warehouses and data marts using Informatica Power Center, including its client tools (Repository Manager, Designer, Workflow Manager and Workflow Monitor).
Experienced in integrating relational and non-relational sources such as Oracle, SQL Server, flat files and XML files.
Extensive experience building mappings with transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Joiner, Router, Aggregator and Update Strategy to handle and load data into different target types.
Experience in creating Workflows, Worklets, Mappings, Mapplets and Sessions.
Good experience in using workflow tasks like Session Tasks, Command Tasks, Decision Tasks, and Email Tasks.
Hands on experience in Data warehousing techniques like Surrogate Key assignment, Slowly Changing Dimensions (SCD
Type 1, SCD Type 2).
Experience in creating ETL mappings using XML Source Qualifier, XML Parser and XML Generator to load data from XML
Applications.
Knowledge of creating data quality solutions using Informatica Data Quality (IDQ).
Experience with stored procedures, functions, triggers, views and complex SQL queries using SQL Server and Oracle PL/SQL.
Experience in designing ER models and Dimensional Data Models.
Hands-on experience with DB schemas such as Star Schema and Snowflake Schema used in relational, dimensional and multidimensional modeling.
Experience with normalization and de-normalization processes.
Experience working with session Logs for Error Handling and Trouble shooting in all environments.
Worked with parameter files, usage of mapping parameters and variables to pass values between sessions.
Monitored and optimized query and session performance and fine-tuned mappings.
Working knowledge of Data Analysis and Reporting tools like Tableau, R Studio, MS Power BI and Looker.
Team player with good communication, analytical and problem-solving skills; Ability to manage time and resources and
prioritize tasks to be able to complete projects on time consistently.
TECHNICAL SKILLS
PROFESSIONAL EXPERIENCE
Client: Marsh & McLennan Companies, Hoboken, NJ Feb 2019 to Present
Role: Informatica/ETL Developer
Marsh, a subsidiary of Marsh & McLennan Companies, is a global leader in insurance broking and risk management with a presence in more than 130 countries. As part of this project, I undertook assignments that required migrating legacy code to version 10.2 Power Center objects, developing ETL jobs that migrate policy information from legacy Oracle to MongoDB on the cloud, supporting an application upgrade, and developing ETL jobs that transform and carry data through different stages of the data lifecycle into and out of a financial planning tool called ANAPLAN.
Responsibilities:
Interacted with the Business Analysts to understand the business & gather technical requirements.
Developed ETL programs using Informatica to implement the business requirements.
Developed mapping designs to convert policy information from the Oracle RDBMS into JSON objects, which are then loaded into MongoDB on the cloud.
Developed Python scripts that convert comma-separated files into JSON documents with hierarchies built into array objects.
Developed Python scripts to post the JSON documents to the respective endpoints via REST API web services.
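The CSV-to-nested-JSON conversion described above can be sketched roughly as follows; the column names (`policy_id`, `coverage_type`, etc.) are hypothetical stand-ins, not the project's actual schema:

```python
import csv
import io
import json
from collections import defaultdict

def csv_to_policy_docs(csv_text):
    """Group flat CSV rows into JSON documents, nesting repeated
    child rows into an array. All field names are illustrative."""
    docs = defaultdict(lambda: {"coverages": []})
    for row in csv.DictReader(io.StringIO(csv_text)):
        doc = docs[row["policy_id"]]
        doc["policy_id"] = row["policy_id"]
        doc["holder"] = row["holder"]
        # Each repeated CSV row becomes one element of the nested array.
        doc["coverages"].append(
            {"type": row["coverage_type"], "limit": int(row["limit"])}
        )
    return [json.dumps(d) for d in docs.values()]

sample = (
    "policy_id,holder,coverage_type,limit\n"
    "P1,Acme,fire,100000\n"
    "P1,Acme,flood,50000\n"
    "P2,Beta,fire,75000\n"
)
docs = csv_to_policy_docs(sample)
```

Each resulting document could then be posted to its endpoint with an HTTP client.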
Wrote shell scripts to run workflows in the UNIX environment.
Actively participated in the Ongoing Business communication calls to collect the changing requirements and tune the SQL
queries and mapping design accordingly.
Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
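Scripted workflow runs like these typically shell out to `pmcmd`. A minimal sketch that assembles such a call is below; the service, domain and workflow names are made up, and the flag set should be verified against the local Power Center installation:

```python
def build_pmcmd_start(service, domain, user, password, folder, workflow,
                      wait=True):
    """Assemble a `pmcmd startworkflow` command line. Flags follow
    common pmcmd usage; verify against your installation."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-p", password,
           "-f", folder]
    if wait:
        cmd.append("-wait")  # block until the workflow completes
    cmd.append(workflow)
    return cmd

# Hypothetical names; a wrapper script would hand this list to
# subprocess.run(cmd, check=True).
cmd = build_pmcmd_start("IS_DEV", "Domain_Dev", "etl_user", "secret",
                        "FIN_FOLDER", "wf_policy_load")
```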
Worked with data coming from various sources such as SQL Server, Oracle, and flat files in xlsx, xlsm and xlsb formats.
Coordinated within a team using an onsite/offshore model.
Performed unit testing at various levels of the ETL and actively involved in team code reviews.
Identified problems in existing 6.2 ETL code and suggested corrections and performance tuning measures.
Fixed invalid mappings and troubleshot technical problems in the database.
Created new mapping designs using tools in Informatica Designer such as Source Analyzer and Mapping Designer.
Developed mappings using the needed transformations in the Informatica tool according to technical specifications.
Created complex mappings that involved implementation of Business Logic to load data in to staging area.
Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and
Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and
monitored the results using workflow monitor
Performed performance tuning at the source, target, mapping and session levels.
Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
Environment: Informatica Power Center 10.2, Microsoft SQL server 2012, Oracle 11g, SQL, UNIX, SQL Developer, Python, Rest
API, MongoDB 3.6.12 Enterprise
SiriusXM is one of the world's largest audio entertainment companies and is among the largest subscription media companies in
the United States, offering an array of exclusive content to satisfy many interests. SiriusXM is a provider of connected vehicle
services that give customers access to a suite of safety, security and convenience services including automatic crash notification,
stolen vehicle recovery assistance, enhanced roadside assistance and turn-by-turn navigation.
Responsibilities:
Involved in analysis of end user requirements and business rules based on given documentation and worked closely with
Solution Analysts and the Design teams.
Developed mappings using Informatica 10.2 to load data from multiple sources and Flat files (fixed width and delimited),
XML files, Excel files into SQL Server tables.
Extensively Used Informatica client tools like Source Analyzer, Target designer, Mapping Designer, Mapplet Designer,
and Transformation Developer for defining Source & Target definitions and coded the process of data flow from source
system to data warehouse.
Extensively worked with transformations like Lookup, Update Strategy, Expression, Filter, Router, Joiner, and
Aggregator.
Worked with static and dynamic lookup caches in the Lookup transformation as required for better performance.
Developed SCD Type I and SCD Type II mappings in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router and SQL, and used Update Strategy and target load plans.
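The SCD Type II pattern mentioned above boils down to expiring the current version of a row and inserting a new one. A minimal sketch follows, with SQLite standing in for the warehouse; the table and column names are illustrative, not the project's actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,   -- surrogate key
    customer_id TEXT, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")

def scd2_upsert(con, customer_id, city, load_date):
    """Expire the current row on change and insert a new version."""
    row = con.execute(
        "SELECT sk, city FROM dim_customer "
        "WHERE customer_id=? AND is_current=1", (customer_id,)).fetchone()
    if row and row[1] == city:
        return  # attribute unchanged: nothing to do
    if row:
        # Close out the current version before inserting the new one.
        con.execute("UPDATE dim_customer SET end_date=?, is_current=0 "
                    "WHERE sk=?", (load_date, row[0]))
    con.execute(
        "INSERT INTO dim_customer "
        "(customer_id, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_date))

scd2_upsert(con, "C1", "Hoboken", "2019-02-01")
scd2_upsert(con, "C1", "Newark", "2019-06-01")   # change -> new version
rows = con.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY sk").fetchall()
```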
Involved in identifying bottlenecks at different levels (mappings, sessions and workflows) and provided solutions to improve performance.
Developed the relational connections, migrated the mappings from one environment to another and also to the
Production environment. Created and used Mapplets and reusable transformations using Informatica Power Center.
Used mapping parameters, variables and session parameters to pass values dynamically between sessions and to parameterize relational and application connections, filenames and other objects.
Implemented performance-tuning measures in source, target, mapping and session design.
Worked on different tasks in Workflows like sessions, event raise, event wait, decision, e-mail, command, Assignment,
Timer etc.
Developed PL/SQL procedures, for creating/dropping of indexes on tables using target pre-load and post-load strategies
to improve session performance in bulk loading, for gathering statistics and archiving table Data.
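The pre-load/post-load index strategy above (drop indexes before a bulk insert, recreate them and gather statistics afterwards) can be sketched as follows; SQLite stands in for Oracle here, and the table is made up:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL)")
con.execute("CREATE INDEX ix_fact_sales_id ON fact_sales (id)")

def bulk_load(con, rows):
    """Drop the index, bulk-insert, then rebuild index and stats."""
    con.execute("DROP INDEX IF EXISTS ix_fact_sales_id")            # pre-load
    con.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    con.execute("CREATE INDEX ix_fact_sales_id ON fact_sales (id)") # post-load
    con.execute("ANALYZE")  # gather optimizer statistics

bulk_load(con, [(i, i * 1.5) for i in range(1000)])
count = con.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
```

In the project this logic lived in PL/SQL procedures invoked from session pre- and post-load; the sketch only shows the ordering of the steps.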
Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
Environment: Informatica Power Center 10.2, Microsoft SQL server 2012, Oracle 11g, PL/SQL, SQL, UNIX, SQL Developer.
Involved in analysis of source systems, business requirements and identification of business rules; responsible for developing, supporting and maintaining the ETL process using Informatica.
Created / updated ETL design documents for all the Informatica components changed.
Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using Power Center to load the data
from source to stage, stage to output.
Made use of various Power Center Designer transformations such as Source Qualifier, Connected and Unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Rank, Sequence Generator, Union and Update Strategy while creating mapplets/mappings. Wrote complex PL/SQL functions/procedures/packages.
Created different parameter files and changed Session parameters, mapping parameters, and variables at run time.
Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating
transformations and tuned accordingly for better performance.
Participated in User Acceptance Testing (UAT): wrote and executed UAT test cases, documented and resolved defects, and signed off on the application.
During the project, participated in multiple meetings with the client and data architect / ETL architect to propose better
strategies for performance improvement and gather new requirements.
Environment: Informatica 10.1, Oracle 10g, UNIX, SQL, PL/SQL, XML, reporting.
Responsibilities:
Worked with the ETL team to understand the business requirements and to design and load data into the data mart.
Used Informatica to load data from source to ODS.
Created complex mappings using unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables.
Implemented SCD Type 1 and Type 2 for change data Capture.
Worked in Workflow Manager to create worklets, sessions, e-mail notifications and decisions, and to schedule jobs.
Involved in performance tuning of the Informatica mappings.
Prepared detailed ETL design and unit-testing documents to outline the flow of data, source/target counts and field-to-field mappings for testing.
Optimized/Tuned mappings for better performance and efficiency. Performance tuning of SQL Queries, Targets and
sessions.
Maintained Development, Test and Production mapping migration using Repository Manager.
Worked on System Analysis & Design of process/data flow.
Implemented Slowly Changing Dimensions- Type I &II in different mappings as per the requirements.
Translated SQL logic into Informatica mappings. Worked on database stored procedures and views.
Used Workflow Manager to create and configure workflows and session tasks to load data. Used Informatica Workflow Monitor to monitor workflows in case of process failures.
Implemented optimization techniques for performance tuning and wrote Pre and post session shell scripts.
Configured the sessions using workflow manager to have multiple partitions on source data and to improve performance.
Involved in end-to-end system performance and regression testing and data validations.
Worked on Data Extraction, Transformation, Loading, Data Conversions and Data Analysis.
Environment: Informatica Power Center 9.x, SQL Server 2012/2008, Oracle 10g, SQL Server Management Studio, UNIX Scripting,
Teradata.
Responsibilities:
Developed Source to Target Mappings using Informatica PowerCenter Designer to load data from various source systems
like ORACLE, MySQL, SQL Server and Flat files to the target.
Communicated with business customers to discuss the issues and requirements.
Reviewed and analyzed functional requirements, Mapping documents, problem solving and trouble shooting.
Involved in fixing invalid mappings, testing of stored procedures and functions, testing of informatica sessions, workflow
and the target data.
Tuned the performance of Informatica sessions for large data files by partitioning and adjusting properties such as data cache size.
Developed shell scripts, PL/SQL stored procedures, table and Index creation scripts.
Provided production support to resolve ongoing issues and troubleshoot problems.
Responsible for creating data mapping, designing documents and unit test documents.
Expertise in configuration, performance tuning, and integration of various data sources/targets like Oracle, MS SQL
Server, Flat files.
Designed and developed ODS-to-Data-Mart mappings/sessions/workflows.
Involved in Debugging and Troubleshooting Informatica mappings.
Populated error tables as part of the ETL process to capture the records that failed the migration.
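The error-table pattern above — capture records that fail target constraints with a reason instead of aborting the load — can be sketched like this; the schema and SQLite backend are illustrative stand-ins:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
con.execute("CREATE TABLE etl_errors (id INTEGER, reason TEXT)")

def load_with_error_capture(con, rows):
    """Insert rows into the target; divert constraint failures
    into the error table with the failure reason."""
    for rid, name in rows:
        try:
            con.execute("INSERT INTO target VALUES (?, ?)", (rid, name))
        except sqlite3.IntegrityError as exc:
            con.execute("INSERT INTO etl_errors VALUES (?, ?)",
                        (rid, str(exc)))

# Row 2 violates the primary key, row 3 the NOT NULL constraint.
load_with_error_capture(con, [(1, "a"), (1, "b"), (2, None)])
errors = con.execute("SELECT id FROM etl_errors ORDER BY id").fetchall()
loaded = con.execute("SELECT COUNT(*) FROM target").fetchone()[0]
```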
Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
Developed UNIX Shell Scripts to start the Informatica jobs using command line.
Used SQL Server management studio to run SQL queries and validate data in ODS.
Created Informatica mappings to read change data from SQL server 2008 and load into oracle.
Performance tuning on sources, targets mappings and SQL (Optimization) tuning.
Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
Executed projects in both waterfall and agile methodologies.
Worked effectively in an onsite/offshore model.
Participated in daily status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
ENVIRONMENT: Informatica Power Center 9.5, Oracle 11G, PL/SQL Developer, SQL.
Client: Ochsner Health Systems, New Orleans, LA May 2013 to May 2014
Role: SQL Server Developer
An accountable care organization (ACO) is a coordinated group of healthcare providers who have agreed to share responsibility
for the care of a defined population of individuals. The Medicare Payment Advisory Commission defines an ACO as: a group of
primary care providers, specialists and/or hospitals and other health professionals who coordinate the full continuum of care
and are accountable for the overall quality of care and costs for a defined population. (Medicare Payment Advisory
Commission). ACO providers coordinate amongst themselves, and with each individual, to improve the individual’s quality of
care, the efficacy of the care and to reduce the rate of increasing cost of care over time.
Responsibilities:
Worked on multiple applications for Data extractions, Analytics and reporting needs.
Point of contact for all the Data processes, Data Mappings, Data dictionaries, Data pulls and Reporting solutions.
Designed and Developed SSIS Packages to import and export data from MS Excel, SQL Server 2008, Flat files and used SSIS
Package Configuration, Expressions, passing variables dynamically, logging, Event Handler.
Involved in various Transformation and data cleansing activities using various Control flow and data flow tasks in SSIS
packages during data migration.
Applied various data transformations like Lookup, Aggregate, Sort, Multicasting, Conditional Split, Derived column etc.
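Two of the SSIS-style transformations listed above, Derived Column (compute a new field) and Conditional Split (route rows by predicate), reduce to a simple pattern; a minimal sketch with made-up field names and a hypothetical 8% rate:

```python
rows = [
    {"state": "LA", "charge": 120.0},
    {"state": "TX", "charge": 80.0},
    {"state": "LA", "charge": 45.0},
]

# Derived Column: add a tax-inclusive total to every row.
for r in rows:
    r["total"] = round(r["charge"] * 1.08, 2)

# Conditional Split: route high-value rows to one output, the rest
# to another, as two downstream paths in the data flow.
high = [r for r in rows if r["total"] >= 100]
low = [r for r in rows if r["total"] < 100]
```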
Created SQL Server Agent jobs to maintain and run various SSIS packages periodically or as scheduled.
Created Pie charts, Bar charts, Dashboards with drill down, drill through reports based on business requirements in SSRS.
Environment: MS SQL Server 2008/2005, SSIS, SSRS, Visual Studio, MS Access 2003/2007.