
Sujeet Patel

Informatica MDM Technical Architect (Informatica certified)

[email protected] M: +91-90153-76513

Summary
• A senior MDM Technical Architect with a very strong professional background in Master Data Management, data warehousing, and the banking, finance, and services domains, with over 12 years of industry experience (6 years on-site in Singapore) in software engineering, processes and methodologies, development, and management. Technologies used include Informatica MDM, Informatica MDM SaaS, Informatica Business360, Informatica Customer360, Informatica Product360, Informatica Supplier360, Informatica IDQ, SQL/PLSQL, SAP BO, ETL, Hadoop (HDFS and Hive), Core Java, XML SOAP APIs, web services, and ETL job automation.

• Ability to work in a fast-paced environment and willingness to stretch and achieve aggressive goals.

• Expertise in installing and managing Informatica MDM, Metadata Manager, and Informatica Data Quality.

• Experience in code deployment of Informatica MDM, and coordination with developers and domain experts to build the business glossary, traceability, and data lineage in the metadata repository.

• Experience in Informatica ETL installation, upgrades, bug fixes, and data migration.

• Involved in the full development lifecycle, from requirements gathering through development and support, using Informatica MDM, Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor. Experience working in complex, cross-functional, and geographically dispersed team environments.

• Strong experience in development with Tableau and ETL on UNIX/Windows, migration of repositories, and upgrades of repositories. Professional experience in ETL architecture using Tableau and ETL versions 8.6.1/9.0.1/9.1.0/9.5.1/9.6.0/9.6.1/10.1.

• Experience in designing and developing mappings using various transformations such as Source Qualifier, Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Union, Joiner, Transaction Control, and Update Strategy.
• Experience in overall Big Data (Hadoop) processes and architecture; performed build, test, and deployment activities related to Big Data implementations.
• Knowledge of Big Data systems including Hadoop, HDP components, HDFS, Hive, Pig, Amazon Web Services, Ambari clustering, Ranger security policies, YARN, Oozie, ZooKeeper, and MapReduce2.
• Hands-on experience in identifying and resolving performance bottlenecks at various levels, including sources, targets, mappings, transformations, and sessions.
• Over 9 years of strong data warehousing experience using Tableau and ETL (Mapplets, Source Analyzer, Mapping Designer, Transformations).
• Extensive experience in data warehousing, ETL (extraction, transformation, and loading), BI (Business Intelligence), and related areas.
• Designed, developed and tested complex ETL workflows and related processes.

Experience

Informatica MDM, Informatica MDM SaaS, Informatica Business360, Informatica Customer360, Informatica Product360, and ETL Developer at XXXX
Project: XX, Delhi, India

Dec 2019 - Present

• Working as L3 support and enhancing/fixing bugs identified in the ODS system. Applied complex logic for data integration, cleansing, and transformation, and wrote complex ETL, SQL, and PL/SQL programs to achieve the desired reporting results, using tools such as Informatica MDM, UNIX, Oracle, and MySQL databases. Provide comprehensive administration of assigned systems, including product evaluation, selection, installation, integration, user support, problem resolution, and ongoing maintenance of hardware, operating systems, networking, and application software.
• Deploy Informatica code from Dev to Test, Test to Stage, and Stage to Prod; perform unit testing and validation after code migration.
• Created ETL jobs and mappings with Tableau and ETL to integrate data from heterogeneous sources such as flat files, CSVs, MS SQL Server, and Oracle databases.
• Development includes two crucial jobs, a Pre-landing job and a Landing job in ETL, which perform pass-through mappings to move data from staging to landing tables (a sketch of this load pattern follows this section).
• Understanding the application/business requirements.
• Involved in design analysis at offshore.
• Led the project at various stages, from requirement gathering and business rules analysis to creating the logical and physical data models.
• Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
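
Illustration: the pass-through staging-to-landing pattern described above, as a minimal SQL sketch. The table and column names (STG_CUSTOMER, LND_CUSTOMER, etc.) are hypothetical placeholders rather than the project's actual schema.

-- Minimal sketch of a pass-through landing load (Oracle; hypothetical names).
-- The pre-landing step clears the previous batch; the landing step copies
-- staging rows through unchanged, adding only audit columns.
TRUNCATE TABLE LND_CUSTOMER;                 -- pre-landing job

INSERT INTO LND_CUSTOMER (CUST_ID, CUST_NAME, CUST_EMAIL, SRC_SYSTEM, LOAD_TS)
SELECT CUST_ID,
       CUST_NAME,
       CUST_EMAIL,
       'ODS'   AS SRC_SYSTEM,                -- tag the source system for lineage
       SYSDATE AS LOAD_TS                    -- audit timestamp
FROM   STG_CUSTOMER;                         -- landing job: straight pass-through

COMMIT;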

Informatica MDM, Informatica MDM SaaS, Informatica Business360, Informatica Customer360, Informatica Product360, and ETL Developer at Nityo InfoTech Ltd, Singapore
Project: Lombard Risk regulatory integration, INTESA and HL Bank, Singapore

May 2019 – Dec 2019

• Lombard Risk is a regulatory reporting tool used by many banks globally. We extracted data from different sources and pushed it to Lombard Risk for reporting, applying complex logic for data integration, cleansing, and transformation, and writing complex ETL, SQL, and PL/SQL programs to achieve the desired reporting results.
• Provide comprehensive administration of assigned systems, including product evaluation, selection, installation, integration, user support, problem resolution, and ongoing maintenance of hardware, operating systems, networking, and application software.
• Deploy Informatica code from Dev to Test, Test to Stage, and Stage to Prod; perform unit testing and validation after code migration.
• Created ETL jobs and mappings with Tableau and ETL to integrate data from heterogeneous sources such as flat files, CSVs, MS SQL Server, and Oracle databases.
• Development includes two crucial jobs, a Pre-landing job and a Landing job in ETL, which perform pass-through mappings to move data from staging to landing tables.
• Understanding the application/business requirements.
• Involved in design analysis at offshore.
• Led the project at various stages, from requirement gathering and business rules analysis to creating the logical and physical data models.
• Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
• Experience assisting with a data center migration from Pentaho ETL to Tableau and ETL.
• Experience working with Pentaho Report Designer for BI implementation.
• Provide IT operational support by resolving incidents, requests, and problems associated with various IT systems.
• Wrote scripts for repository backups and configured email notifications.
• Involved in BI projects (Pentaho Analytics and Pentaho Reporting); played a crucial role in migrating objects across OBIEE, Noetix Analytics, Informatica, and Tableau and ETL.
• On-call and weekend support for Stage and Production failures and cutovers from lower environments.
• Coordinate with BAs, developers, and administrators on day-to-day activities, including data issues, technical aspects, report analytics, and technical specifications.
• Configure relational and application connections in Informatica and validate performance for all workflows.

Informatica MDM, Informatica MDM SaaS, Informatica Business360, Informatica Customer360, Informatica Product360, and ETL Developer at Smart Soft Pte Ltd (through Tech Mahindra)
Project: MDM, JTC Summit, Singapore

June 2018 – May 2019

• JTC is one of the biggest Singapore government industrial construction companies. We implemented MDM, using ETL to extract data from different sources and a BI solution to visualize the data.
• Provide comprehensive administration of assigned systems, including product evaluation, selection, installation, integration, user support, problem resolution, and ongoing maintenance of hardware, operating systems, networking, and application software.
• Deploy Informatica code from Dev to Test, Test to Stage, and Stage to Prod; perform unit testing and validation after code migration.
• Wrote scripts for repository backups and configured email notifications.
• Involved in BI projects (Pentaho Analytics and Pentaho Reporting); played a crucial role in migrating objects across OBIEE, Noetix Analytics, Informatica, and Tableau and ETL.
• Developed code based on low-level designs, including detailed information on tables, databases, and data flow.
• Created ETL jobs and mappings with Tableau and ETL to integrate data from heterogeneous sources such as flat files, CSVs, MS SQL Server, and Oracle databases.
• Development includes two crucial jobs, a Pre-landing job and a Landing job in ETL, which perform pass-through mappings to move data from staging to landing tables.
• Understanding the application/business requirements.
• Involved in design analysis at offshore.

Informatica MDM, ETL, Informatica MDM SaaS, Informatica Business360, Informatica Customer360, Informatica Product360, and IDQ Developer at Schellden Global Services
Project: CAG (Changi Airport Group), Changi Airport, Singapore

July 2017 – June 2018

• CAG is the Changi Airport Group, which runs one of the biggest airports in Singapore. We received data from different airport source systems such as Changi Insider (CI), iShop Changi (ISC), Changi Millinery (MIL), points of sale of Changi (POS), OCID, etc. We developed many mappings, sessions, and workflows for each source system. All workflows were automated and scheduled as per business requirements; when source vendors placed files late on the share path, we communicated with them to request correct data and reloaded it into the CDI systems once provided.
• The main aim of this project was to migrate all the data from an IBM server to an HCL server. We developed many mappings, sessions, workflows, and master jobs to implement all business requirements, and automated all jobs and scripts using Tableau and ETL, Autosys, a third-party scheduler, Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Majorly involved in analyzing requirements and preparing low-level design documents from high-level design documents/business requirement documents.
• Supporting role towards the Business Analyst and Team Lead in the requirements phase, peer reviews, and preparing high-level designs.
• Developed code based on low-level designs, including detailed information on tables, databases, and data flow.
• Created ETL jobs and mappings with Tableau and ETL to integrate data from heterogeneous sources such as flat files, CSVs, MS SQL Server, and Oracle databases.
• Development includes two crucial jobs, a Pre-landing job and a Landing job in ETL, which perform pass-through mappings to move data from staging to landing tables.
• Understanding the application/business requirements.
• Involved in design analysis at offshore.
• Led the project at various stages, from requirement gathering and business rules analysis to creating the logical and physical data models.
• Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
• Experience assisting with a data center migration from Pentaho ETL to Tableau and ETL.
• Experience working with Pentaho Report Designer for BI implementation.
• Provide IT operational support by resolving incidents, requests, and problems associated with various IT systems.
• Provide comprehensive administration of assigned systems, including product evaluation, selection, installation, integration, user support, problem resolution, and ongoing maintenance of hardware, operating systems, networking, and application software.
• Deploy Informatica code from Dev to Test, Test to Stage, and Stage to Prod; perform unit testing and validation after code migration.
• Wrote scripts for repository backups and configured email notifications.
• Involved in BI projects (Pentaho Analytics and Pentaho Reporting); played a crucial role in migrating objects across OBIEE, Noetix Analytics, Informatica, and Tableau and ETL.
• On-call and weekend support for Stage and Production failures and cutovers from lower environments.
• Coordinate with BAs, developers, and administrators on day-to-day activities, including data issues, technical aspects, report analytics, and technical specifications.
• Configure relational and application connections in Informatica and validate performance for all workflows.
• WAN optimization based on throughput (rows/sec) of data transactions using different sources and targets.
• Designed and implemented the error handling strategy for the ETL team (a sketch of the reject-capture pattern follows this section).
• Developed ETL Informatica mappings to load data into the staging area; extracted from flat files and databases and loaded into an Oracle 11g target database, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Monitored data warehouse weekend and month-end loads to ensure successful completion.
• Responsible for designing and developing mappings, mapplets, sessions, and workflows to load data from source to target databases using Informatica MDM.
• Extracted data from various sources such as flat files and databases and loaded it into target systems using Informatica 9.x.
• Developed mappings using various transformations such as Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator, and Expression.
• Tuned mappings and SQL queries for better performance and efficiency.
• Responsible for the ETL that manages the frequency of loads to the data warehouse Oracle tables.
• Upgraded Informatica MDM from 9.1 to 9.6.1.
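
Illustration: the error handling strategy mentioned above can be sketched with Oracle DML error logging, where bad rows are diverted to a reject table instead of failing the load. All object names below are assumptions for illustration, not the project's actual tables.

-- One-time setup: create an error table shadowing the target fact table.
EXEC DBMS_ERRLOG.CREATE_ERROR_LOG('DW_SALES_FACT');   -- creates ERR$_DW_SALES_FACT

-- Nightly load: good rows go to the fact table, bad rows to the error table
-- (with the ORA error code/message), and the load keeps running past failures.
INSERT INTO DW_SALES_FACT (SALE_ID, STORE_ID, SALE_AMT, SALE_DT)
SELECT SALE_ID, STORE_ID, SALE_AMT, SALE_DT
FROM   STG_SALES
LOG ERRORS INTO ERR$_DW_SALES_FACT ('DAILY_LOAD')
REJECT LIMIT UNLIMITED;

COMMIT;

-- Review rejected rows for the run before reprocessing.
SELECT ORA_ERR_NUMBER$, ORA_ERR_MESG$, SALE_ID
FROM   ERR$_DW_SALES_FACT
WHERE  ORA_ERR_TAG$ = 'DAILY_LOAD';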

Informatica MDM, Informatica MDM SaaS, Informatica Business360, Informatica Customer360, Informatica Product360, and ETL Developer at BCT
Project: American Eagle Outfitters (AEO), USA; Chennai, India

April 2017 – July 2017

• American Eagle Outfitters (AEO) is one of the leading companies in the retail sector; they have many physical outlets and also receive orders from their own portal. We received data from different AEO sources, integrated it, and stored it in the data warehouse, with reporting built on top of it. To obtain more trusted records from the different sources, we implemented Informatica MDM and defined many match and merge rules (a simplified match/merge sketch follows this section).
• Led the project at various stages, from requirement gathering and business rules analysis to creating the logical and physical data models, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
• Experience assisting with a data center migration from Pentaho ETL to Tableau and ETL.
• Experience working with Pentaho Report Designer for BI implementation.
• Provide IT operational support by resolving incidents, requests, and problems associated with various IT systems.
• Provide comprehensive administration of assigned systems, including product evaluation, selection, installation, integration, user support, problem resolution, and ongoing maintenance of hardware, operating systems, networking, and application software.
• Deploy Informatica code from Dev to Test, Test to Stage, and Stage to Prod; perform unit testing and validation after code migration.
• Wrote scripts for repository backups and configured email notifications.
• Involved in BI projects (Pentaho Analytics and Pentaho Reporting); played a crucial role in migrating objects across OBIEE, Noetix Analytics, Informatica, and Tableau and ETL.
• On-call and weekend support for Stage and Production failures and cutovers from lower environments.
• Coordinate with BAs, developers, and administrators on day-to-day activities, including data issues, technical aspects, report analytics, and technical specifications.
• Configure relational and application connections in Informatica and validate performance for all workflows.
• WAN optimization based on throughput (rows/sec) of data transactions using different sources and targets.
• Designed and implemented the error handling strategy for the ETL team.
• Developed ETL Informatica mappings to load data into the staging area; extracted from flat files and databases and loaded into an Oracle 11g target database.
• Monitored data warehouse weekend and month-end loads to ensure successful completion.
• Responsible for designing and developing mappings, mapplets, sessions, and workflows to load data from source to target databases using Informatica MDM.
• Extracted data from various sources such as flat files and databases and loaded it into target systems using Informatica 9.x.
• Developed mappings using various transformations such as Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator, and Expression.
• Tuned mappings and SQL queries for better performance and efficiency.
• Responsible for the ETL that manages the frequency of loads to the data warehouse Oracle tables.
• Upgraded Informatica MDM from 9.1 to 9.6.1.
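
Illustration: the match-and-merge behaviour referred to in the first bullet is configured in the Informatica MDM Hub rather than hand-coded, but the underlying idea can be sketched in SQL. The match key (normalized email), trust ranking (SRC_PRIORITY), and table names below are hypothetical assumptions.

-- Simplified, hypothetical match/merge: group candidate records that share a
-- normalized email, then keep the record from the most trusted source.
WITH matched AS (
    SELECT LOWER(TRIM(CUST_EMAIL))                       AS MATCH_KEY,   -- match column
           CUST_ID,
           CUST_NAME,
           SRC_SYSTEM,
           SRC_PRIORITY,                                  -- lower value = more trusted source
           ROW_NUMBER() OVER (
               PARTITION BY LOWER(TRIM(CUST_EMAIL))
               ORDER BY SRC_PRIORITY, CUST_ID
           ) AS RN
    FROM   CUST_STAGE
    WHERE  CUST_EMAIL IS NOT NULL
)
SELECT MATCH_KEY, CUST_ID AS GOLDEN_CUST_ID, CUST_NAME, SRC_SYSTEM
FROM   matched
WHERE  RN = 1;                                            -- one "golden" record per match group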

Informatica MDM, ETL, and Hadoop Developer at HCL


Project: UIDAI SSUP Update, Government of India; Noida Sector 16, Delhi, India, HCL

June 2015 – Jan 2017

• This is one of the biggest biometric projects in the world. The main technologies used from a BI perspective were Tableau and ETL, Pentaho PDI, Big Data Hive, and MySQL. The aim of the project is to provide a unique identification number to every person in India. The data volume was very high, so to manage volume and performance we used Hadoop Hive (a sketch of the partitioned Hive table approach follows this section).
• Led the project at various stages, from requirement gathering and business rules analysis to creating the logical and physical data models.
• Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Experience assisting with a data center migration from Pentaho ETL to Tableau and ETL.
• Experience working with Pentaho Report Designer for BI implementation.
• Provide IT operational support by resolving incidents, requests, and problems associated with various IT systems.
• Provide comprehensive administration of assigned systems, including product evaluation, selection, installation, integration, user support, problem resolution, and ongoing maintenance of hardware, operating systems, networking, and application software.
• Deploy Informatica code from Dev to Test, Test to Stage, and Stage to Prod; perform unit testing and validation after code migration.
• Wrote scripts for repository backups and configured email notifications.
• Involved in BI projects (Pentaho Analytics and Pentaho Reporting); played a crucial role in migrating objects across OBIEE, Noetix Analytics, Informatica, and Tableau and ETL.
• On-call and weekend support for Stage and Production failures and cutovers from lower environments.
• Coordinate with BAs, developers, and administrators on day-to-day activities, including data issues, technical aspects, report analytics, and technical specifications.
• Configure relational and application connections in Informatica and validate performance for all workflows.
• WAN optimization based on throughput (rows/sec) of data transactions using different sources and targets.
• Designed and implemented the error handling strategy for the ETL team, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Developed ETL Informatica mappings to load data into the staging area; extracted from flat files and databases and loaded into an Oracle 11g target database.
• Monitored data warehouse weekend and month-end loads to ensure successful completion.
• Responsible for designing and developing mappings, mapplets, sessions, and workflows to load data from source to target databases using Informatica MDM.
• Extracted data from various sources such as flat files and databases and loaded it into target systems using Informatica 9.x.
• Developed mappings using various transformations such as Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator, and Expression.
• Tuned mappings and SQL queries for better performance and efficiency.
• Responsible for the ETL that manages the frequency of loads to the data warehouse Oracle tables.
• Upgraded Informatica MDM from 9.1 to 9.6.1.
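
Illustration: a minimal HiveQL sketch of the partitioned-table approach used to keep very large volumes queryable. Table names, columns, and HDFS paths are illustrative assumptions only.

-- Hypothetical HiveQL sketch: an external, partitioned ORC table so large
-- volumes can be queried by day without full scans.
CREATE EXTERNAL TABLE IF NOT EXISTS ssup_update_requests (
    request_id     STRING,
    field_updated  STRING,
    request_status STRING,
    processed_ts   TIMESTAMP
)
PARTITIONED BY (load_dt STRING)            -- one partition per load date
STORED AS ORC
LOCATION '/data/ssup/update_requests';

-- Register a new day's partition after the ETL drops files into HDFS.
ALTER TABLE ssup_update_requests
ADD IF NOT EXISTS PARTITION (load_dt = '2016-08-01')
LOCATION '/data/ssup/update_requests/load_dt=2016-08-01';

-- Partition pruning: only the requested day's files are read.
SELECT request_status, COUNT(*) AS cnt
FROM   ssup_update_requests
WHERE  load_dt = '2016-08-01'
GROUP BY request_status;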

Informatica ETL, EDW and SQL/PLSQL Developer at Cognizant


Project: TRYG, Denmark; Gurgaon, Delhi, India, Cognizant

Dec 2014 - June 2015

• The aim of this project was to migrate the existing interfaces from SAP BODS to Tableau and ETL and to point the target location at a Hadoop Hive database. Some new interfaces were developed from scratch using Tableau and ETL.
• Created the data model for the end-to-end project and wrote SQL and Hive scripts for table creation in the different layers, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Developed many mappings to implement business requirements such as SCD Type 2 dimension implementation and population of facts at different grains (a simplified SCD Type 2 sketch follows this section).
• Applied performance tuning methodology and strategy for better performance; optimized sources, targets, and the transformations between them.
• Interacted with the Business Analyst to assist in understanding the source and target systems.
• Responsible for pulling data from XML files, flat files (fixed-width and delimited), and COBOL files using complex transformations such as Normalizer and XML Source Qualifier.
• Extracted and transformed data from various sources, including relational databases (Oracle, SQL Server).
• Created mappings using Mapping Designer to load data from various sources, using transformations such as Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter.
• Extensively worked with mapping variables, mapping parameters, workflow variables, and session parameters.
• Worked on different workflow tasks such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer, and workflow scheduling.
• Worked with PowerCenter Designer tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
• Used the Debugger within Mapping Designer to test data flow between source and target and to troubleshoot invalid mappings.
• Efficiently implemented Change Data Capture (CDC) to extract information from numerous Oracle tables, and used Pushdown Optimization in Informatica MDM.
• Designed and developed UNIX shell scripts to report job failure alerts.
• Used Workflow Manager for creating, validating, testing, and running sequential and parallel, initial and incremental loads.
• Used version control to check in and check out versions of objects.
• Worked with SQL tools such as TOAD and SQL Developer to run SQL queries and validate data.
• Scheduled Informatica jobs through the Autosys scheduling tool.
• Involved in unit testing, system testing, and UAT to check data consistency.
• Assisted the QA team to find and fix solutions for production issues.
• Prepared all documents necessary for knowledge transfer, such as ETL strategy, ETL development standards, and ETL processes.
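
Illustration: the SCD Type 2 dimension logic mentioned above was built as Informatica mappings in the project; the equivalent logic can be sketched in SQL as an expire-then-insert pattern. Table, column, and sequence names are hypothetical.

-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE DIM_CUSTOMER d
SET    d.EFF_END_DT  = TRUNC(SYSDATE) - 1,
       d.CURRENT_FLG = 'N'
WHERE  d.CURRENT_FLG = 'Y'
AND    EXISTS (
         SELECT 1
         FROM   STG_CUSTOMER s
         WHERE  s.CUST_ID = d.CUST_ID
         AND    (s.CUST_SEGMENT <> d.CUST_SEGMENT OR s.CUST_CITY <> d.CUST_CITY)
       );

-- Step 2: insert a new current version for new and changed customers
-- (changed customers no longer have a current row after step 1).
INSERT INTO DIM_CUSTOMER (CUST_SK, CUST_ID, CUST_SEGMENT, CUST_CITY,
                          EFF_START_DT, EFF_END_DT, CURRENT_FLG)
SELECT DIM_CUSTOMER_SEQ.NEXTVAL, s.CUST_ID, s.CUST_SEGMENT, s.CUST_CITY,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   STG_CUSTOMER s
WHERE  NOT EXISTS (
         SELECT 1
         FROM   DIM_CUSTOMER d
         WHERE  d.CUST_ID = s.CUST_ID
         AND    d.CURRENT_FLG = 'Y'
       );

COMMIT;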

Informatica MDM and ETL Developer at Cognizant


Project: EDW Maxlife Phase 2, Gurgaon, Delhi, India, Maxlife (Onsite)

May 2014 – Jan 2015

• We provided a BI solution for Maxlife marketing agents, head marketing agents, and their respective reporting hierarchy so that they could evaluate their business performance on an MTD (month-to-date), QTD (quarter-to-date), and YTD (year-to-date) basis. We developed many measures using different mathematical formulas and implemented them using Tableau and ETL with a data warehouse approach (a simplified measure query follows this section). Communicated and participated in all phases of the project, from requirement gathering through development, testing, and production environments.
• Other responsibilities include troubleshooting, documentation, researching and resolving business-submitted data issues, developing and modifying code to improve the data loading experience, researching and resolving defects, and executing other development and analysis tasks as assigned, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Led all design decisions on technical implementation and concepts (e.g., templates, standards) and decided on ETL implementation issues.
• Worked 24/7 on operational support, monitoring thousands of ETL jobs and other batch processing systems.
• Excellent knowledge of Java, web services, object-oriented programming, SQL, and RDBMS concepts.
• Interacting with the technical team to help them understand the client requirements.
• Interacting with the quality assurance team to test the right test case scenarios.
• Monthly maintenance and patching activity; served as the primary on-call support for the daily and nightly batch.
• Managed the offshore team and guided it on support issue fixes, PM, and enhancements.
• Thousands of jobs are related to Informatica (Parse, Splitter, and Loader), executables (file copy and identity resolution), stored procedures (data load and transformation), and FTP (Secure FTP and Tumbleweed).
• Acquired knowledge of the existing in-built production applications for support and prepared standard documents accordingly.
• Experience with multiple source systems, sources, and targets (flat files, hierarchical files, DB tables, XSDs).
• Created ETL standards, processes, best practices, and project governance.
• Learned the existing Informatica and other subject areas, source systems, target systems, operational data, jobs, deployment processes, and production support activities; created design and technical specification documents for SSIS according to requirements.
• Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Created mapplets and reusable transformations and used them in different mappings.
• Performed data profiling and analysis using Informatica Data Quality (IDQ).
• Worked on creating mappings using the Standardizer, Parser, Address Validator, Merge, Case Converter, etc. in Informatica Data Quality.
• Imported the IDQ address standardization mappings into Informatica Designer as mapplets.
• Involved in performance tuning and optimization of Informatica mappings and sessions, using features such as session partitioning and setting the DTM buffer size and cache sizes (data/index cache) to manage very large volumes of data.
• Developed Informatica SCD Type I, II, and III mappings. Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
• Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.
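
Illustration: a simplified SQL sketch of MTD/QTD/YTD measures of the kind described in the first bullet. The fact table and measure names (AGENT_SALES_FACT, PREMIUM_AMT) are assumptions for illustration; the actual measures were implemented in the ETL/BI layer.

-- Hypothetical MTD/QTD/YTD measures per agent (Oracle date truncation).
SELECT AGENT_ID,
       SUM(CASE WHEN SALE_DT >= TRUNC(SYSDATE, 'MM')   THEN PREMIUM_AMT ELSE 0 END) AS MTD_PREMIUM,
       SUM(CASE WHEN SALE_DT >= TRUNC(SYSDATE, 'Q')    THEN PREMIUM_AMT ELSE 0 END) AS QTD_PREMIUM,
       SUM(CASE WHEN SALE_DT >= TRUNC(SYSDATE, 'YYYY') THEN PREMIUM_AMT ELSE 0 END) AS YTD_PREMIUM
FROM   AGENT_SALES_FACT
WHERE  SALE_DT >= TRUNC(SYSDATE, 'YYYY')      -- limit the scan to the current year
GROUP BY AGENT_ID;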

Informatica MDM and ETL Developer at Cognizant


Project: Publeo Integration, AstraZeneca, Kolkata, India

June 2013 - May 2014

• We had to provide a proper business solution for medical representatives: when they visit a particular doctor, which samples they should present, and to visit only those doctors for whom the respective sample medicine was applicable.
• Defined ETL standards, processes, best practices, and project governance.
• Understood the existing subject areas, source systems, target systems, operational data, jobs, deployment processes, and production support activities, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Worked as Informatica Lead on ETL projects to design and develop Informatica mappings.
• Worked with Informatica IDQ (Data Analyst, Developer) using various data profiling techniques to cleanse and match/remove duplicate data.
• Actively involved in designing, developing, and creating mappings and mapplets using Informatica, from source to warehouse, and loading the data.
• Worked with Tableau and ETL to move data from multiple sources into common target areas such as data marts and the data warehouse.
• Involved in planning and building servers in a server migration project.
• Installed and configured PowerCenter 9.1.0 and 9.5.1 on the Windows platform.
• Creation and maintenance of Informatica users and privileges; migration of Informatica mappings/sessions.
• Created groups, roles, and privileges and assigned them to each user group.
• Developed UNIX shell automation scripts to stop/start the application.
• Worked on production support, ETL issues, and performance tuning.
• Created source and target definitions, reusable transformations, mapplets, and worklets.
• Created mappings and extensively used transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Web Services Consumer.
• Installed and configured the Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor, and Tableau and ETL applications.
• All facets of MDM implementations, including data profiling, metadata acquisition, data migration, validation, reject processing, and pre-landing processing.
• Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both MDM and data integration projects.
• Master Data Management (MDM) and data integration concepts in large-scale implementation environments.
• Created landing tables, staging tables, and base tables as per the data model and data sources.
• Worked on Informatica client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
• Involved in developing multidimensional data models using star schema and snowflake schema (a minimal star-schema sketch follows this section).
• Used Informatica to extract and load data from Oracle databases and flat files to Oracle tables.
• Created various mappings using Designer, including Expression, Router, Aggregator, Stored Procedure, Update Strategy, and Lookup transformations.
• Created mapplets using Mapplet Designer for reuse in different mappings.
• Handled performance tuning by checking bottlenecks; also created partitions and SQL overrides in the Source Qualifier.
• Designed and developed a reporting system using Business Objects.
• Performed unit testing and documented the complete mappings.
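
Illustration: a minimal star-schema sketch of the kind referenced in the dimensional modelling bullet, with one fact table surrounded by conformed dimensions. All table and column names are hypothetical.

-- Hypothetical star schema for the doctor-visit / sample scenario (Oracle DDL):
-- the fact table references each dimension through a surrogate key.
CREATE TABLE DIM_DOCTOR (
    DOCTOR_SK    NUMBER       PRIMARY KEY,
    DOCTOR_ID    VARCHAR2(20),
    SPECIALTY    VARCHAR2(50),
    CITY         VARCHAR2(50)
);

CREATE TABLE DIM_PRODUCT (
    PRODUCT_SK   NUMBER       PRIMARY KEY,
    PRODUCT_CODE VARCHAR2(20),
    PRODUCT_NAME VARCHAR2(100)
);

CREATE TABLE DIM_DATE (
    DATE_SK      NUMBER       PRIMARY KEY,
    CAL_DATE     DATE,
    CAL_MONTH    NUMBER(2),
    CAL_YEAR     NUMBER(4)
);

CREATE TABLE FACT_DOCTOR_VISIT (
    VISIT_SK      NUMBER PRIMARY KEY,
    DOCTOR_SK     NUMBER REFERENCES DIM_DOCTOR (DOCTOR_SK),
    PRODUCT_SK    NUMBER REFERENCES DIM_PRODUCT (PRODUCT_SK),
    DATE_SK       NUMBER REFERENCES DIM_DATE (DATE_SK),
    SAMPLES_GIVEN NUMBER(5)                  -- additive measure
);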

Informatica MDM, ETL, and IDQ Developer at Cognizant


Project: Channel Integration (CI), AstraZeneca, Kolkata, India
October 2011 – May 2013

• The main aim of this project was to migrate all the data from an IBM server to an HCL server. We developed many mappings, sessions, workflows, and master jobs to implement all business requirements, and automated all the jobs and scripts using Tableau and ETL, Autosys, and a third-party scheduler.
• Majorly involved in analyzing requirements and preparing low-level design documents from high-level design documents/business requirement documents.
• Supporting role towards the Business Analyst and Team Lead in the requirements phase, peer reviews, and preparing high-level designs.
• Developed code based on low-level designs, including detailed information on tables, databases, and data flow, alongside Informatica MDM, Core Java, XML SOAP APIs, web services, and the ETL job automation process.
• Created ETL jobs and mappings with Tableau and ETL to integrate data from heterogeneous sources such as flat files, CSVs, MS SQL Server, and Oracle databases.
• Development includes two crucial jobs, a Pre-landing job and a Landing job in ETL, which perform pass-through mappings to move data from staging to landing tables.
• Understanding the application/business requirements.
• Involved in design analysis at offshore.
• Interacting with clients to provide better technical solutions in accordance with the business.
• Impact analysis of new business and determining work estimates in coordination with the onsite team.
• Driving other associates on my team in the desired direction for proper accomplishment of work, guiding them, and keeping them motivated.
• Analyzing requirements and preparing low-level design documents from high-level design documents/business requirement documents.
• Preparing functional specifications from design documents.
• Design and development of modules (graphs, wrapper scripts) based on detailed designs.
• Analyzing the system components and extracting the business logic, thereby enhancing the application as per the client's requirements.
• Design and development of test scenarios for unit and integration testing, as well as positive and negative testing, and generating test data as per test cases.
• Development of Informatica mappings, sessions, and workflows to meet the business requirements.
• Tuning of the Informatica mapping code to satisfy performance benchmarks.
• Business analysis and systems design.
• Developed logical and physical database designs.
• Extensively used XML transformations in Informatica for processing data.
• Created transformations in Informatica such as XML Parser and XML Generator transformations for processing XML-related data (a simplified SQL view of XML shredding follows this section).
• Maintained shared objects at the Informatica level to avoid redundancy.
• Extensively used packages, stored procedures, and database triggers.
• Analysis based on business rules in existing systems and user requirements.
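
Illustration: the project used Informatica's XML Parser transformation; an analogous shredding of an XML payload into relational columns can be sketched in Oracle SQL with XMLTABLE. The staging table, XPath, and element names below are illustrative assumptions.

-- Hypothetical SQL analogue of XML shredding: flatten repeating <Contact>
-- elements from a staged XML message into rows and columns.
SELECT x.CONTACT_ID,
       x.CHANNEL,
       x.CONTACT_DT
FROM   STG_CHANNEL_XML s,
       XMLTABLE(
           '/Contacts/Contact'
           PASSING XMLTYPE(s.XML_PAYLOAD)        -- CLOB column holding the XML message
           COLUMNS
               CONTACT_ID VARCHAR2(20) PATH 'Id',
               CHANNEL    VARCHAR2(30) PATH 'Channel',
               CONTACT_DT DATE         PATH 'ContactDate'
       ) x;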

Education

Bachelor's in ECE, 2007 to 2011
GPA: A (82%)
West Bengal University of Technology, Kolkata, India

Personal Details
Date of Birth: 12-Jan-1988
Nationality: Indian
Marital Status: Married
Designation: Sr. Associate
Email ID:
Mobile No.:
