
MURSHITHULLAH H

B.Tech (IT) graduate with 9+ years of experience in IT. Highly accomplished Data Analyst seeking
assignments as a Sr. Business/Data Analyst in the Marketing/Sales domain, with proven success in
data analysis.
Mobile: 09060963311
Email: [email protected]
------------------------------------------------------------------------------------------------------------------
PROFESSIONAL SYNOPSIS
▪ 9+ years of IT experience as a Business/Data Intelligence Analyst, spanning database
development, reporting and analysis of sales and marketing data, ETL development, data
warehousing and data modeling.
▪ Extensive experience in ETL design and development (using Informatica and Xplenty),
Oracle SQL & PL/SQL, Airflow, AWS services (S3, Redshift, Athena, Glue and EC2),
Unix/Linux scripting for file handling and job automation, the Big Data Hadoop framework
(HDFS, Spark SQL, Sqoop, Hive and Pig), and Python scripting for data analysis using pandas
and NumPy.
▪ Exposure to marketing and sales analytics, customer profiling and customer segmentation.
▪ Experience in data quality projects, data acquisition processes and handling ad hoc requests.
▪ Strong experience in business and data analysis, data cleansing, data profiling, data migration
and data integration.
▪ Proficient in source system analysis, with strong debugging skills; worked extensively on data
cleansing and data staging of operational sources using ETL processes for data warehouses.
▪ Experienced in developing accurate business process requirements for the hospitality, sales and
marketing, transportation and telecommunications domains.
▪ Good knowledge of and exposure to RDBMSs, dimensional modeling, and star and snowflake schemas.
▪ Working knowledge of R scripting and Spark.

TECHNICAL SKILLS

Database       : Oracle 9i/10g/11g, AWS Redshift and PostgreSQL
ETL Tools      : Informatica 8.6/9.5, Airflow and Xplenty
Languages      : SQL, PL/SQL, PostgreSQL, Hive, Python, UNIX scripting
Big Data       : Greenplum and Hadoop
Cloud Services : AWS (Redshift, S3, EC2 and Athena)

WORK EXPERIENCE

SiteMinder (November 2017 to July 2020)

Brief description of project

Title           : Project Runway
Client          : SiteMinder
Duration        : Nov 2017 to Jul 2020
Environment     : AWS (Redshift, S3, EC2 and Athena)
Tools/Languages : Python, R scripts, Xplenty ETL, SQL, AWS Glue, Airflow (scheduler)
Role            : Data Operation Manager

Description:
SiteMinder is the global hotel industry’s leading guest acquisition platform, ranked among
technology pioneers for its smart and simple solutions that put hotels everywhere their guests are,
at every stage of their journey. Its products include Channel Manager, Little Hotelier, TBB and
Canvas, and the platform can integrate multiple OTAs (online travel agencies) into a single
interface. As Data Operation Manager, I managed all data operations, including data integration
from source systems such as Salesforce, Zuora and SiteMinder product application data into a
single data warehouse platform on Redshift. After integration, the data from the multiple source
systems was analyzed and shaped into a foundation layer for visualization and data analytics,
which business users rely on to derive key metrics.
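
A minimal sketch of the extract-and-load pattern this role revolved around: pull records from a
source API with Python, stage them in S3 as Parquet, then COPY them into Redshift. The endpoint,
bucket, table and IAM role names here are illustrative placeholders, not details from the project.

    # Sketch only: all names below are hypothetical.
    import io
    import boto3
    import pandas as pd
    import requests

    def extract_to_s3(api_url, token, bucket, key):
        """Pull records from a REST API and stage them in S3 as Parquet."""
        resp = requests.get(api_url, headers={"Authorization": f"Bearer {token}"})
        resp.raise_for_status()
        df = pd.DataFrame(resp.json()["records"])        # assumes a 'records' payload
        df["extracted_at"] = pd.Timestamp.now(tz="UTC")  # simple audit column
        buf = io.BytesIO()
        df.to_parquet(buf, index=False)                  # columnar format for Redshift/Athena
        boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())
        return f"s3://{bucket}/{key}"

    def copy_into_redshift(cursor, s3_path, table, iam_role):
        """Load the staged file into Redshift with a COPY command."""
        cursor.execute(
            f"COPY {table} FROM '{s3_path}' IAM_ROLE '{iam_role}' FORMAT AS PARQUET;"
        )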

Responsibilities:
▪ Responsible for the integration, analysis and automation of multiple source systems such as
Salesforce, Zuora and SiteMinder product application data (CM, LH, TBB etc.).
▪ Responsible for data extraction from the Salesforce and Zuora APIs using Python scripts
(sketched above); used various Python libraries to extract and transform data before moving it
to S3/Redshift.
▪ Responsible for extensively creating packages/workflows in the Xplenty ETL tool for data
integration and file-processing activities; also managed Xplenty natively for all job automation
activities.
▪ Used Python data analysis libraries to analyze and transform large volumes of data into
business metrics.
▪ Responsible for leading the team in migrating source files from traditional CSV/text formats to
columnar file formats such as ORC/Parquet.
▪ Responsible for creating DAGs/jobs in Airflow for data processing activities (see the DAG
sketch after this list).
▪ Involved in gathering and analyzing requirements from business users and preparing
data/business documents.
▪ Responsible for the end-to-end lifecycle of SiteMinder App DB key metrics, which were used
extensively by the business team as needle movers.
▪ Responsible for creating and maintaining multiple functional layers in the Redshift DW, which
served as the source for Tableau reporting and data analytics; staging the data through
foundational/functional layers reduced the load on Tableau reporting, resulting in
high-performance dashboards.
▪ Created an SQL workbook by analyzing all source systems, which the analytics team used to
solve business metrics.
▪ Responsible for generating the complete customer universe to identify customer growth, net
MRR, transactional MRR and other key business metrics.
▪ Responsible for analyzing sales & marketing data and SiteMinder App DB data to identify key
metrics and create the foundation layer for the Sales & Marketing and App DB dashboards.
▪ Responsible for maintaining the Redshift DB for performance, and for upgrading Redshift to
cope with business/data growth.
▪ Responsible for migrating SQL scripts from Redshift to Athena SQL.
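
A hypothetical Airflow wiring for the daily extract/load jobs described above; the DAG id, schedule
and task bodies are placeholders, not the project's actual jobs.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_sources():        # placeholder: API extraction as sketched earlier
        ...

    def load_foundation_layer():  # placeholder: build Redshift foundation tables
        ...

    with DAG(
        dag_id="daily_dw_load",             # illustrative name
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_sources",
                                 python_callable=extract_sources)
        load = PythonOperator(task_id="load_foundation_layer",
                              python_callable=load_foundation_layer)
        extract >> load                     # load runs only after extraction succeeds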

Introlligent Solutions Pvt. Ltd. (June 2017 to November 2017)

Brief description of project

Title       : MQM (Manufacturing Quality Management)
Client      : Apple
Duration    : Jun 2017 to Nov 2017
Environment : Oracle, Hadoop, Unix scripting, Tableau and Python
Role        : Senior Data Quality Analyst
Description:
Apple has many manufacturing sites. The Manufacturing Quality Management (MQM) team provides
support to Channel Managers (CMs) at the different manufacturing sites. MQM is responsible for
full product lifecycle analysis, including requirements, activities and design, such as setting up
sites, family hierarchies, modules, test stations, test tasks etc. for new products. Every day,
the manufacturing sites send millions of machine test results into the MQM system. MQM is
responsible for fixing data quality issues, and for monitoring performance and quality control
plans to identify improvements.

Responsibilities:
▪ Create and monitor reports for completeness, accuracy and timeliness for all products at all
sites (see the sketch after this list).
▪ Research and communicate errors to the CMs.
▪ Fix and reload data.
▪ Validate missing/errored data re-sent by the CMs.
▪ Code and validate scripts for fixing production data.
▪ Support MQM setup for new sites, buildings, lines, model tables, product tasks, families etc.
▪ Involved in various data analysis tasks.
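
A small pandas sketch of the completeness-style checks listed above; the file and column names are
assumptions for illustration, not the actual MQM feeds.

    import pandas as pd

    def quality_report(df, required_cols):
        """Per-column completeness plus a simple duplicate count."""
        completeness = 1.0 - df[required_cols].isna().mean()  # share of non-null rows
        report = completeness.to_frame(name="completeness")
        report["duplicate_rows"] = df.duplicated().sum()      # scalar repeated per column
        return report

    results = pd.read_csv("site_test_results.csv")  # hypothetical CM test-result feed
    print(quality_report(results, ["serial_no", "station", "result"]))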

Infogain India Pvt. Ltd. (August 2016 to June 2017)

Brief description of project

Title       : Channel Data Lake
Client      : Hewlett Packard Enterprise (HPE)
Duration    : Aug 2016 to Jun 2017
Environment : Hadoop (HDFS, Sqoop, Pig and Hive) and Unix scripting
Role        : Consultant

Description:
CDL is a consolidated repository that provides a global reporting capability for channel data
consolidation, storage, reporting and distribution to the systems supporting sales/partner
compensation and sales/claims management. It serves as the central repository for all channel data
worldwide. This project involved upgrading the existing regional business model to a global model
in order to improve the sales and compensation of retailers.

Responsibilities:
▪ Involved in creating the data model to consolidate the regional business into a global one.
▪ Designed the project flow from source to reporting, including audit information.
▪ Documented the complete process flow to describe program development, logic and
implementation.
▪ Implemented the ETL process using Pig, Hive and Sqoop to load data into the destination DW
system (see the sketch after this list).
▪ Involved in identifying the data marts for consolidating subject areas across different sources.
▪ Responsible for defining the structure of inbound/incoming files and preparing scripts for
extracting data from source systems.
▪ Analyzed source data to extend the existing regional DW model to the global DW model per
business requirements.
▪ Involved in data enrichment in the staging area using ETL processes such as data merging,
data cleansing and data aggregation.
▪ Involved in creating the reporting layer by creating views in Vertica.
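
The project itself used Pig, Hive and Sqoop; the sketch below shows analogous staging-to-global
consolidation logic in Spark SQL (PySpark), with illustrative table and column names.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder.appName("cdl_global_load")
             .enableHiveSupport().getOrCreate())

    staged = spark.table("staging.channel_sales_raw")     # landed by Sqoop/file ingestion
    cleansed = (staged.dropDuplicates(["transaction_id"])  # basic cleansing
                      .withColumn("region", F.upper(F.col("region"))))
    global_agg = cleansed.groupBy("region", "partner_id").agg(
        F.sum("sale_amount").alias("total_sales"))        # regional rows rolled up globally
    global_agg.write.mode("overwrite").saveAsTable("dw.channel_sales_global")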

TEK System (March 2016 to August 2016)

Brief description of project

Title       : Ekart Analytics
Client      : Flipkart
Duration    : Mar 2016 to Aug 2016
Environment : Hadoop, FDP (Flipkart Data Platform), Hive scripting and QlikView
Role        : Senior Data Analyst

Description:

Flipkart is India’s leading e-commerce marketplace, and Ekart Logistics is Flipkart’s own
logistics service. Worked with the Ekart Analytics team on various activities: creating facts
using Hive scripting and making them available for reporting through FDP (Flipkart Data Platform),
creating reports to support all stakeholders, and maintaining the details of all forward and
reverse shipments.

Responsibilities:
▪ Responsible for creating and maintaining facts for analysis (see the sketch after this list).
▪ Responsible for creating reports and dashboards using FDP.
▪ Responsible for analyzing data using Hive.
▪ Responsible for extracting data for QlikView visualizations.
▪ Maintained the details of all forward and reverse shipments.
▪ Used data analysis techniques to identify the structure and quality of data, surface data
issues and suggest solutions.
▪ Communicated with stakeholders to gather requirements for all return shipments.
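
A sketch of a Hive fact-building step of the kind described above, driven from Python via PyHive;
the host, database, table and column names are assumptions.

    from pyhive import hive

    conn = hive.connect(host="hive-gateway.example.com", port=10000, database="ekart")
    cur = conn.cursor()
    cur.execute("""
        INSERT OVERWRITE TABLE fact_shipments
        SELECT shipment_id,
               order_id,
               is_reverse,            -- forward vs. reverse shipment flag
               delivered_ts
        FROM   staging_shipments
        WHERE  delivered_ts IS NOT NULL
    """)
    cur.close()
    conn.close()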

VMware Software India Pvt Ltd. (April 2013 to March 2016)

Brief description of project

Title       : Customer Data Mart & APJ Marketing Analysis
Client      : Internal
Duration    : Apr 2013 to Mar 2016
Environment : Greenplum (Big Data), PostgreSQL, Oracle 10g (SQL, PL/SQL), batch scripts,
              Tableau and Hadoop
Role        : Business Intelligence Analyst
Description:
CDM stands for Customer Data Mart, built to provide the complete account universe for APAC in
support of the Sales and Marketing Operations teams. The main objectives of the CDM are account
profiling, coverage mapping, RAD9, sales rep analysis and product heat map analysis. Different
sources are integrated to build a single data repository for CDM accounts and to generate the
complete hierarchy, bookings and RAL mapping, which end users use to perform various analyses and
to identify high-potential accounts to target.
Responsible for creating the foundational data by integrating different data marts (Bookings,
Contacts, Opportunity, Lead, Campaigns etc.) and developing the batch scripts to automate the
various jobs, handling both files and databases as sources. My role was to analyze the various
data marts and prepare the foundational data for them.
Responsibilities:
▪ Worked with the global team to understand the functional requirements for the project.
▪ Collected profile data from a number of sources; imported, segregated and filtered the data by
segment and region.
▪ Used data analysis techniques to identify the structure and quality of data, surface data
issues and suggest solutions.
▪ Documented the complete process flow to describe program development, logic and
implementation.
▪ Worked with internal architects, assisting in the development of current and target CDM data
architectures.
▪ Responsible for generating the complete account universe, identifying bookings, and mapping
and validating RAL accounts.
▪ Responsible for analyzing the different data marts for sales & marketing support.
▪ Involved in writing foundational database scripts in Greenplum using PostgreSQL and PL/pgSQL
functions (see the sketch after this list).
▪ Responsible for extracting data for Tableau visualizations and generating statistics reports.
▪ Responsible for data export and import in Hadoop using Sqoop.
▪ Involved in various list-pull activities per the operations team's needs.
▪ Responsible for migrating all the Oracle scripts to Greenplum.
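
A sketch of the kind of foundational Greenplum scripting listed above: a PL/pgSQL refresh function
created and invoked from Python with psycopg2. The DSN, schema and table names are illustrative.

    import psycopg2

    DDL = """
    CREATE OR REPLACE FUNCTION cdm.refresh_account_universe() RETURNS void AS $$
    BEGIN
        TRUNCATE cdm.account_universe;
        INSERT INTO cdm.account_universe
        SELECT a.account_id, a.account_name, b.total_bookings
        FROM   cdm.accounts a
        LEFT   JOIN cdm.bookings_agg b USING (account_id);
    END;
    $$ LANGUAGE plpgsql;
    """

    with psycopg2.connect("dbname=cdm host=gp-master.example.com") as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)
            cur.execute("SELECT cdm.refresh_account_universe();")
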
Tecnotree Convergence (May 2012 to April 2013)

Brief description of project


Title       : PPMS (Prepaid Pin Management System)
Client      : MTN-IranCell
Duration    : May 2012 to Apr 2013
Environment : Informatica 8.6, Oracle 10g (SQL, PL/SQL), Unix scripts
Role        : ETL & PL/SQL Developer
Description:
The Prepaid Pin Management System (PPMS) provides end-to-end process management for recharge card
provisioning, sales, transfer movement (stock movement/stock transfer), activation etc. It manages
and tracks recharge voucher activity throughout the vouchers’ entire lifecycle. PPMS handles
various voucher management processes, including voucher file generation for manufacturers, packing
file processing, voucher provisioning and activation, and status change file updates, and it also
provides blacklisting and damaged-pin recovery to the business. Informatica was used to populate
the staging database as well as the data mart for EDW analysis and reporting.
Responsibilities:
▪ Extracted data from different sources such as Oracle and flat files.
▪ Created mappings using transformations such as Source Qualifier, Filter, Expression, Router,
Aggregator, Sorter, Lookup and Update Strategy.
▪ Designed and developed various mappings and tuned them for better performance.
▪ Used Informatica Workflow Manager to create and run sessions and workflows.
▪ Responsible for implementing database objects (packages, procedures, functions, triggers).
▪ Designed the process for data cleansing and cleaning.
▪ Developed an understanding of business data relationships.
▪ Used BULK COLLECT and PL/SQL collections to increase performance (see the sketch after this
list).
▪ Responsible for maintaining and fixing bugs in procedures, triggers and functions.
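
A sketch of the BULK COLLECT/FORALL pattern mentioned above, run as an anonymous PL/SQL block from
Python with the oracledb driver; the connection details and the vouchers table are assumptions.

    import oracledb

    PLSQL = """
    DECLARE
        TYPE t_ids IS TABLE OF vouchers.voucher_id%TYPE;
        l_ids t_ids;
    BEGIN
        SELECT voucher_id BULK COLLECT INTO l_ids
        FROM   vouchers
        WHERE  status = 'GENERATED';

        FORALL i IN 1 .. l_ids.COUNT   -- one round trip instead of row-by-row updates
            UPDATE vouchers SET status = 'PROVISIONED'
            WHERE  voucher_id = l_ids(i);
        COMMIT;
    END;
    """

    conn = oracledb.connect(user="ppms", password="...", dsn="dbhost/PPMSPDB")
    conn.cursor().execute(PLSQL)
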
Worked as a Software Engineer at Aroha Technologies, Bangalore (July 2011 to May 2012).

ACADEMIC QUALIFICATION

Degree        Institute                         University        Year of Passing   Percentage
B.Tech (IT)   MAM Engineering College           Anna University   2009              70
H.S.C         National Matric. Hr. Sec. School  State Board       2005              74
S.S.C         National Matric. Hr. Sec. School  State Board       2003              76

PERSONAL DETAILS
Date of Birth : 13 June 1988
Marital Status : Married
Languages Known : English, Tamil
Passport : N7892526
