CANDIDATE COVER LETTER
OVERALL RATING: 4/5
Candidate Information
Date: 18-06-2024 Location: Bangalore
Candidate Name: Bhargav Valmeeki Assessor Name: Devendra
Position Applied For: Integration Developer Reason for Job Change: Seeking better opportunities
Total Years of Experience: 4.3 Years Relevant Years of Experience: 4.3 Years
1. Role Analysis and Understanding
* Strong knowledge of AWS
* Strong knowledge of SQL and Python
* Matillion tool experience is required
2. Skills Evaluation
a. Core Competencies: List the primary technical skills required for the position (e.g., programming
languages, frameworks, tools, etc.).
Skill Assessor Insight Rating (1-5)
Skill 1 SQL, Python 4
Skill 2 AWS 4
Skill 3 Matillion 4
3. Behavioural and Soft Skills Evaluation
a. Communication Skills: Assess the candidate’s ability to communicate effectively.
Criteria Assessor Insight Rating (1-5)
Quality in Communication Good 4
Effective Listening Good Listener 4
Articulation of Thought
4. Overall Qualitative Assessment and Recommendation
He has 4.3 years of experience with ETL tools and Snowflake integration using Matillion and
Redshift. In addition, he is very strong in SQL and Python scripting and has good exposure to
AWS Cloud.
BHARGAVA RAMUDU V
PROFESSIONAL SUMMARY:
Overall 4 years and 3 months of experience in the IT
industry.
Good hands-on experience with Snowflake and database
objects.
Hands-on experience with AWS S3, Snowpipe, Time Travel,
cloning, clustering, data distribution, and continuous data
loading using Snowpipe and other Snowflake features.
Knowledge of DBT.
Hands-on experience with MySQL and SQL Server.
Hands-on experience with Netezza and Matillion.
Hands-on experience architecting data pipeline
solutions using AWS, SQL, and Python.
Experience with Snowflake Multi-Cluster Warehouses and
Snowflake Virtual Warehouses; good exposure to SQL and
Snowflake.
Excellent communication, interpersonal, analytical skills,
and strong ability to perform in a team.
Ability to accept and learn new technologies.
SKILLS SUMMARY:
• Extensive experience on requirement gathering, analysis
and designing mapping documents.
• Created EDD (External Design Document) and IDD
(Internal Design Document)
Cloud Data Warehouse : Snowflake, SnowSQL, Snowpipe, AWS
Databases : MySQL
Programming Languages : SQL, Python
Operating Systems : Windows, Linux
Cloud Solutions Architect : Snowflake, AWS S3, AWS IAM, AWS EC2, MySQL, DBT
WORKING SUMMARY:
Currently working as a Snowflake Developer at
NTT DATA, Pune, from Feb 2022 to date.
Worked as a Data Engineer at PROFESIONAL H.R.
SERVICE [Link] Bangalore from Oct 2020 to Oct 2021.
Worked as an ETL Developer at S R S MULTI AMC
SERVICES, Bangalore, from Dec 2019 to Sep 2020.
EDUCATIONAL QUALIFICATIONS:
[Link] from SK UNIVERSITY ANANTAPUR
PROJECTS HANDLED
Project 3:
Role : Snowflake Developer.
Client : Cummins
Environment : Snowflake, AWS S3,
SQL.
Duration : Feb 2022 to date.
Roles and Responsibilities:
Responsible for all activities related to the
development, implementation, administration, and
support of ETL processes for a large-scale Snowflake
cloud data warehouse.
Bulk loading from the external stage (AWS S3) to the
internal stage (Snowflake) using the COPY command.
Involved in data cleaning to maintain quality and
improve performance.
Created database objects such as Snowflake tables, file
formats, sequences, and internal and external stages.
Loaded data from the internal stage and from the
local machine.
Used import and export between the internal stage
(Snowflake) and the external stage (S3 bucket).
Wrote complex SnowSQL scripts in the Snowflake
cloud data warehouse for business analysis and
reporting.
Performed data quality checks to improve
loading and unloading performance.
Used COPY, LIST, PUT, and GET commands for
validating internal and external stage files.
Created Snowpipe for continuous data loading from S3.
Performed troubleshooting, analysis, and resolution of
critical issues.
Involved in data analysis and handling ad-hoc
requests by interacting with business analysts and
clients, and resolved issues as part of production
support.
Worked on streams and tasks for change data
capture (CDC).
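The loading and CDC pattern described above can be sketched in SnowSQL. All object names (my_s3_stage, raw_orders, orders_pipe, etl_wh) are illustrative, not taken from the project:

```sql
-- Hypothetical SnowSQL sketch: stage, bulk COPY, Snowpipe, and stream/task CDC.

-- External stage over an S3 bucket
CREATE STAGE my_s3_stage
  URL = 's3://my-bucket/landing/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- One-off bulk load with the COPY command
COPY INTO raw_orders
  FROM @my_s3_stage
  PATTERN = '.*orders.*[.]csv';

-- Continuous loading: Snowpipe auto-ingests new files from the stage
CREATE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders FROM @my_s3_stage;

-- CDC: a stream records row changes; a scheduled task consumes them
CREATE STREAM raw_orders_stream ON TABLE raw_orders;

CREATE TASK merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
AS
  INSERT INTO orders_history
    SELECT * FROM raw_orders_stream;
```

This requires a live Snowflake account to run; with AUTO_INGEST, the S3 bucket must also publish event notifications to the pipe's SQS queue.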
Project 2:
Role : Digital Engineering Engineer.
Client : Waste Management
Environment : Snowflake, Netezza, Matillion,
and SQL
Duration : Oct 2020 to Oct 2021.
Primary Responsibilities:
Worked as an ETL Developer on the SNOW5
Netezza-to-Snowflake migration project.
Analyzed the existing Unix + Netezza based
transformation framework and converted it to
Matillion + Snowflake.
Built various orchestration and transformation jobs
in Matillion to implement the existing transformation
logic.
Created tables and views in Snowflake per the
client-provided logic (XFR).
Created a reusable job to capture load statistics.
Validated the tables to meet client requirements.
Project 1:
Professional Experience (Tech Mahindra)
Project Name : ANA-Bluelake Data Platform
Project Description : The next passenger system (Altea), a future
passenger service integrating domestic and
international services, replaces the current
domestic passenger system (Able-D). With the
changes to the upstream system, the data platform
must also be adapted to correctly process the
analytic information from the new system.
Business Domain : Airline Domain – PSS System
Duration : Dec 2019 to Sep 2020.
Technology : Snowflake, SnowSQL, MySQL
Role in Project : ETL Developer
- Bulk loaded data from the external stage (AWS
S3) and the internal stage into the Snowflake cloud
using the COPY command.
- Loaded data into Snowflake tables from the
internal stage using SnowSQL.
- Used COPY, LIST, PUT, and GET commands for
validating the internal stage files.
- Used import and export between the internal
stage (Snowflake) and the external stage
(AWS S3).
- Developed Snowflake stored procedures for
executing branching and looping.
- Created Snowpipe for continuous data
ingestion from the S3 bucket.
- Performed data quality issue analysis using
SnowSQL by building an analytical warehouse on
Snowflake.
- Cloned production data for code modification
and testing.
- Coordinated with the team on bug tracking and
report updates.
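A Snowflake stored procedure with branching and looping, as mentioned above, might look like the following Snowflake Scripting sketch. The procedure, table, and column names (reload_partitions, daily_snapshot, raw_orders, order_date) are hypothetical:

```sql
-- Hypothetical Snowflake Scripting procedure showing a WHILE loop and an IF branch.
CREATE OR REPLACE PROCEDURE reload_partitions(max_days INTEGER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  day_offset INTEGER DEFAULT 0;
  loaded     INTEGER DEFAULT 0;
BEGIN
  -- Loop over the last max_days days
  WHILE (day_offset < max_days) DO
    -- Branch: reload weekdays only (DAYOFWEEK: 0 = Sunday ... 6 = Saturday
    -- under the default WEEK_START setting)
    IF (DAYOFWEEK(DATEADD(day, -:day_offset, CURRENT_DATE())) BETWEEN 1 AND 5) THEN
      INSERT INTO daily_snapshot
        SELECT * FROM raw_orders
        WHERE order_date = DATEADD(day, -:day_offset, CURRENT_DATE());
      loaded := loaded + 1;
    END IF;
    day_offset := day_offset + 1;
  END WHILE;
  RETURN 'Reloaded ' || loaded || ' day(s)';
END;
$$;

-- Invocation:
-- CALL reload_partitions(7);
```

This is only a sketch of the branching/looping constructs; the actual procedures in the project are not described in the resume.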