
K. Suhas Chandra

+91-8367561912    E-Mail: [email protected]

Professional Summary:

· 4 years of overall IT experience in application development in Java and Big Data (Hadoop).
· 2 years of exclusive experience in Hadoop and its components like HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
· Extensive experience in setting up Hadoop clusters.
· Good working knowledge of MapReduce and Apache Pig.
· Involved in writing Pig scripts to reduce job execution time.
· Executed projects using Java/J2EE technologies such as Core Java, Servlets, JSP and Struts.
· Well experienced in designing and developing both server-side and client-side applications.
· Experienced with IDEs such as Eclipse for project development; excellent communication, interpersonal and analytical skills, with a strong ability to perform as part of a team.
· Exceptional ability to learn new concepts.
· Hardworking and enthusiastic.
· Knowledge of Flume and NoSQL.

Professional Experience:
· Worked as a Hadoop Developer at Avantha Technologies Pvt Limited, Bangalore.

Qualifications:
· Bachelor's degree from JNTUH in Electronics and Communication Engineering.

Technical Skills:

Languages : Java, JavaScript, HTML, XML, MapReduce, Pig, Sqoop, Hive, HBase
J2EE Technologies : JSP, Servlets, JDBC and EJB
Servers : IBM WebSphere Application Server 7.0, WebLogic and Tomcat
Frameworks : IBM EAD4J Framework, Struts, Spring, Hibernate, Hadoop
Java IDEs : Eclipse
Version Control / Tracking Tools : RTC, Rational ClearCase, Rational ClearQuest, Rational Portfolio Manager, Rational ReqPro, Rational Quality Management, SVN, CVS, Visual SourceSafe (VSS)
Databases : DB2 9.x, Oracle, SQL (DDL, DML, DCL) and PL/SQL
Design Skills : J2EE design patterns, Object-Oriented Analysis and Design (OOAD), UML
Operating Systems : Windows 7, Windows XP/2000/2003, Unix and Linux

Project Details:

PROJECT #1:
Project Name : Target – Web Intelligence
Client : Target, Minneapolis, Minnesota, USA
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, UNIX, MySQL
Duration : Nov 2011 to date
Role : Hadoop Developer

Description:

This project is about rehosting Target's existing system on the Hadoop platform. Previously, Target used a MySQL database to store information about its competitor retailers (the crawled web data). Initially, Target tracked only four competitor retailers, such as Amazon.com and Walmart.com.

As the number of competitor retailers grew, the data generated by their web crawls also increased massively, beyond what a MySQL database could accommodate. For this reason, Target moved to Hadoop, which can handle massive amounts of data across its cluster nodes and satisfies the scaling needs of Target's business operations.

Roles and Responsibilities:

· Moved all crawl data flat files generated from various retailers to HDFS for further processing.
· Wrote Apache Pig scripts to process the HDFS data (see the Pig sketch after this list).
· Created Hive tables to store the processed results in a tabular format.
· Developed Sqoop scripts to move data between the Pig output on HDFS and the MySQL database (a sample command follows this list).
· Involved in requirements gathering, design, development and testing.
· Wrote script files for processing data and loading it to HDFS.
· Wrote CLI commands for working with HDFS.
· Developed UNIX shell scripts for creating reports from Hive data.
· Fully involved in the requirement analysis phase.
· Analyzed the requirements to set up a cluster.
· Created two different users (hduser for performing HDFS operations and mapred user for performing MapReduce operations only).
· Ensured NFS was configured for the NameNode.
· Set up passwordless SSH for Hadoop.
· Set up a cron job to delete Hadoop logs, old local job files and cluster temp files.
· Set up Hive with MySQL as a remote metastore.
· Moved all log/text files generated by various products into an HDFS location.
· Wrote MapReduce code that takes log files as input, parses them and structures them in a tabular format to facilitate effective querying of the log data (a minimal mapper sketch follows this list).
· Created an external Hive table on top of the parsed data (DDL sketch below).
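For illustration, a minimal Pig script along the lines described above; the HDFS paths, schema and field names here are hypothetical, not taken from the project:

    -- Load raw crawl data from HDFS (path and schema are assumptions)
    raw = LOAD '/data/crawl/retailers' USING PigStorage('\t')
          AS (retailer:chararray, product_id:chararray, price:double, crawl_date:chararray);

    -- Drop malformed records and compute the average price per retailer
    valid    = FILTER raw BY price > 0.0;
    grouped  = GROUP valid BY retailer;
    averages = FOREACH grouped GENERATE group AS retailer, AVG(valid.price) AS avg_price;

    -- Write processed results back to HDFS in a tab-delimited, Hive-friendly layout
    STORE averages INTO '/data/processed/retailer_prices' USING PigStorage('\t');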
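A sample Sqoop invocation of the kind mentioned above for moving Pig output into MySQL; the connection string, user and table names are placeholders:

    # Export tab-delimited Pig output from HDFS into a MySQL table (all names hypothetical)
    sqoop export \
      --connect jdbc:mysql://dbhost/webintel \
      --username etl_user -P \
      --table retailer_prices \
      --export-dir /data/processed/retailer_prices \
      --input-fields-terminated-by '\t'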
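A minimal sketch of the log-parsing mapper described above, assuming a simple space-delimited log layout; the log format and class name are assumptions:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class LogParseMapper extends Mapper<LongWritable, Text, Text, Text> {
        // Parses lines like "2011-11-02 10:15:32 ERROR ProductFeed connection timed out"
        // (this log layout is an assumption, not taken from the project).
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] parts = value.toString().split("\\s+", 5);
            if (parts.length == 5) {
                // Key: log date; value: tab-separated columns for easy querying in Hive
                context.write(new Text(parts[0]),
                        new Text(parts[1] + "\t" + parts[2] + "\t" + parts[3] + "\t" + parts[4]));
            }
        }
    }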
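Finally, a sketch of the external Hive table over the parsed output; the table name, columns and location are assumptions:

    -- External table: Hive reads the files in place, and dropping the table keeps the data
    CREATE EXTERNAL TABLE parsed_logs (
        log_date  STRING,
        log_time  STRING,
        level     STRING,
        source    STRING,
        message   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/processed/logs';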
