Interview Prep Kit

Attaching everything that is provided as part of this complete interview preparation kit for Data Engineers.

The main focus is on Apache Spark/PySpark, as roughly 90% of interview questions come from Spark alone.

It includes Q&A from:


1. Hadoop - 200+ Q&A
2. Hive - 300+ Q&A
3. Spark - 1000+ questions with answers, including scenario-based questions
4. Airflow - 50+ Q&A
5. HBase - 100+ Q&A
6. Kafka - Cheat sheet - 200+ Q&A
7. Sqoop - All commands - 50+ Q&A
8. MapReduce - 50+ Q&A
9. Hadoop and Linux commands

These questions are designed to help you prepare for the following roles:
1. Big Data Engineer
2. Big Data Developer
3. On-Prem
4. Spark Developer

You can check it out here - https://topmate.io/shubham_wadekar/1038815

For the latest coupon code, check out the link above.

Interview prep kit by Shubham Wadekar


1. Spark
1000+ Questions with answers
100+ Scenario-based coding questions

2. Hadoop - Hadoop and Linux commands
200+ Questions with answers

3. Hive
300+ Questions with answers


4. Airflow
50+ Questions with answers

5. HBase
100+ Questions with answers

6. Kafka
200+ Questions with answers

7. Sqoop
Commands and 50+ Questions with answers


8. MapReduce
50+ Questions with answers
