
SRIKANTH THOTA

Designation : Associate
Location : Hyderabad

Experience Summary

A result-oriented and passionate professional working on Azure Data Services, with 2.9 years of experience in the IT industry, and a Microsoft certified Azure Data Engineer.


KEY SKILLS:
• Functional: Analysis and development, excellent communication and interpretation skills

• Technical: Azure Data Factory, Azure Data Lake, Azure Synapse Analytics, Azure Stream Analytics, Azure Storage, Informatica PowerCenter, Tableau and OpenText Content Server.

Certifications:
• Microsoft certified Azure Data Engineer Associate (DP-200 & DP-201)
• Microsoft certified Azure Data Fundamentals (DP-900)
• Microsoft certified Azure Fundamentals (AZ-900)

Published & Presented / Awards


• Received EXTRA MILE award (Q4 2020).

Experience
01 Client:

Project Description: A global engineering, technical and business services organization wholly owned by the Lloyd's Register Foundation, a UK charity dedicated to research and education in science and engineering. Founded in 1760 as a marine classification society, it operates across many industry sectors such as Energy, Marine, LRQA, Rail, and Consulting.

Role/Title: Associate Consultant


Responsibilities: Working as a Data Engineer; responsibilities include working on SQL Server, Azure Data Factory and Azure Storage.

• Gathering and analyzing requirements from the business users to provide a compatible solution.

• Creating ingestion pipelines using ADF (Azure Data Factory) based on the source requirements.

• Scheduling ADF pipelines with event-based and schedule triggers (see the trigger sketch after the Tools line below).

• Ingesting source data from various relational and non-relational data sources into Azure Data Lake through ADF pipelines.

• Collecting business requirements from the client and converting large volumes of data into ready-to-use information for the end users.

• Identifying and fixing bugs in the code using SQL Server Management Studio.

Tools: Azure Data Factory, Azure Storage Services, Azure Synapse, SQL Server
Management Studio and MS SQL Server
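
The following is a minimal, illustrative sketch (not code from the project) of how such a schedule trigger can be attached to an ingestion pipeline with the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, pipeline and trigger names are hypothetical placeholders.

# Minimal sketch: attach a schedule trigger to an existing ADF ingestion pipeline.
# All names below are illustrative placeholders, not values from the project.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, TriggerResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Run the ingestion pipeline once a day; an event-based variant (BlobEventsTrigger)
# would fire on blob creation instead of on a recurrence.
recurrence = ScheduleTriggerRecurrence(frequency="Day", interval=1,
                                       start_time="2021-01-01T00:00:00Z", time_zone="UTC")
trigger = TriggerResource(properties=ScheduleTrigger(
    description="Daily ingestion run",
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(reference_name="IngestSourceToDataLake"))],
    recurrence=recurrence,
))
adf_client.triggers.create_or_update("rg-data-platform", "adf-ingestion",
                                     "DailyIngestTrigger", trigger)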

02 Project Type : POC on Azure Data Services


Role/Title : Senior Analyst
Responsibilities: • Building a solution architecture for a data engineering solution using Azure Data Engineering technologies such as Azure Data Factory (ADF), Azure Data Lake Gen2, Azure Blob Storage and Azure SQL Database.

• Integrating data from HTTP clients, Azure Blob Storage and Azure Data Lake Gen2 using Azure Data Factory.

• Branching and chaining activities in Azure Data Factory (ADF) pipelines using control flow activities such as Get Metadata, If Condition, ForEach, Delete and Validation.

• Using parameters in pipelines, datasets and linked services to create metadata-driven pipelines in Azure Data Factory (ADF) (see the sketch after this list).

• Scheduling pipelines using triggers such as the Event trigger, Schedule trigger and Tumbling Window trigger in Azure Data Factory (ADF).

• Creating Mapping Data Flows to build transformation logic using Source, Filter, Select, Pivot, Derived Column, Aggregate and Sink transformations.
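
Purely as an illustration (not the original POC code), the sketch below shows the metadata-driven pattern with the azure-mgmt-datafactory Python SDK: pipeline parameters are passed down to parameterized datasets so one copy pipeline can serve many sources. The dataset, factory and parameter names are hypothetical.

# Minimal sketch: a parameterized (metadata-driven) copy pipeline, Blob -> Azure SQL.
# Dataset, factory and pipeline names are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlSink, BlobSource, CopyActivity, DatasetReference,
    ParameterSpecification, PipelineResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The folder to read and the table to load are supplied at run time, so the same
# pipeline definition can be reused for every source described in a metadata list.
copy = CopyActivity(
    name="CopyBlobToSql",
    inputs=[DatasetReference(reference_name="SourceBlobDataset",
                             parameters={"folderPath": "@pipeline().parameters.folderPath"})],
    outputs=[DatasetReference(reference_name="TargetSqlDataset",
                              parameters={"tableName": "@pipeline().parameters.tableName"})],
    source=BlobSource(),
    sink=AzureSqlSink(),
)
pipeline = PipelineResource(
    activities=[copy],
    parameters={"folderPath": ParameterSpecification(type="String"),
                "tableName": ParameterSpecification(type="String")},
)
adf_client.pipelines.create_or_update("rg-poc", "adf-poc", "ParameterizedCopy", pipeline)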

03 Client:
Project Description: A global engineering, technical and business services organization wholly owned by the Lloyd's Register Foundation, a UK charity dedicated to research and education in science and engineering. Founded in 1760 as a marine classification society, it operates across many industry sectors such as Energy, Marine, LRQA, Rail, and Consulting.
Role/Title: Senior Analyst

Responsibilities: Worked as a Livelink server administrator to support and maintain the environment, working with OpenText Livelink ECM CS 10, SQL Server, Workflows and Live Report creation.

• Downloading bulk data from Azure Blob Storage and storing it in Content Server (see the sketch after the Tools line below).

• Development and maintenance of a 24x7 running application and its enhancements.

• Handled application operations work such as daily health checks, the search index and checking servers.

• Installing and configuring patches and modules.

• Handled issues related to Content Server functionalities such as LiveReports, WebReports, Workflows, Importer/Exporter, Categories, Enterprise Connect, WebDAV, etc.

• Performing patching-related admin activities and housekeeping activities.

• User administration and managing custom groups.

• Live Reports / database queries.

• Working on continuous service improvement plans.

• Worked on resource planning, provided training on Content Server, and actively participated in management work.

Tools: OpenText Content Server 10.5, Azure Blob Storage.
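
As an illustration only (assumed, not taken from the project), the sketch below shows a bulk download from Azure Blob Storage to a local staging folder using the azure-storage-blob Python SDK, from which documents would then be imported into Content Server; the connection string, container name and staging path are placeholders.

# Minimal sketch: bulk download from Azure Blob Storage to a local staging folder
# before loading into Content Server. Connection string and names are placeholders.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])            # assumed environment variable
container = service.get_container_client("source-documents")  # hypothetical container

os.makedirs("staging", exist_ok=True)
for blob in container.list_blobs():
    # Keep only the file name; foldering is handled during the Content Server import.
    target = os.path.join("staging", os.path.basename(blob.name))
    with open(target, "wb") as fh:
        fh.write(container.download_blob(blob.name).readall())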

04 Project Name: Restaurant Analysis (ETL + Reporting)

Project Description: In this project, data was extracted from flat files into a staging table, then loaded into different dimension tables in an Oracle database, and a fact table was created from those dimension tables using the Informatica PowerCenter 10.2.0 ETL tool. Tableau was used to generate different reports that visualize the data using various charts and tables (an illustrative sketch of this flow follows below).

Software / Languages: Oracle 11g, Informatica PowerCenter 10.2.0 & Tableau Desktop
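
The ETL itself was built with Informatica PowerCenter mappings; purely to illustrate the same staging-to-star-schema flow outside that tool, here is a minimal pandas sketch in which the flat-file name and column names are hypothetical.

# Illustrative sketch only: the original flow used Informatica PowerCenter mappings.
# The flat-file name and column names below are hypothetical.
import pandas as pd

# Stage the raw flat file.
staging = pd.read_csv("restaurant_orders.csv")

# Build dimension tables with surrogate keys.
dim_restaurant = (staging[["restaurant_name", "city"]]
                  .drop_duplicates().reset_index(drop=True))
dim_restaurant["restaurant_key"] = dim_restaurant.index + 1

dim_date = staging[["order_date"]].drop_duplicates().reset_index(drop=True)
dim_date["date_key"] = dim_date.index + 1

# Build the fact table by looking up the surrogate keys from the dimensions.
fact_orders = (staging
               .merge(dim_restaurant, on=["restaurant_name", "city"])
               .merge(dim_date, on="order_date")
               [["restaurant_key", "date_key", "order_amount"]])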

Educational Background:

Degree / Examination    Board / University    From Year    To Year    % of Marks
Graduation
XII th
X th
