Insights from 100+ Interviews

In the last 6 to 8 months, I’ve given between 120 and 150 interviews for senior data engineering roles, ranging from top product-based companies to service-based companies. Through this extensive experience, I’ve gathered key insights that I’d like to share:

Motivational Insights:

• Start Early: Even if you don’t feel fully prepared, start giving interviews as soon as possible. This practice will help you identify key areas to focus on and improve your confidence.

• Leverage Existing Knowledge: Often, you already have enough knowledge to crack offers; interviews will help you recognize your strengths and the areas that need improvement.

Focus Areas:

1. SQL:

• Master SQL: Be highly proficient in SQL, capable of writing complex and optimized queries.

• Practice Regularly: Aim to solve around 40 to 50 SQL questions on practice platforms like LeetCode, DataLemur, and StrataScratch.

• Understand Patterns: Focus on common topics such as window functions, finding the second or third highest value, using lead and lag functions, and optimizing queries to reduce the number of subqueries (see the sketch below).
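To make these patterns concrete, below is a minimal sketch of the classic "second highest salary per department" question solved with a window function. The employees table and its columns are made up for illustration, and the SQL is run through PySpark's spark.sql so the same query style carries straight into the PySpark section; on a plain database engine you would run the query as-is.

```python
from pyspark.sql import SparkSession

# Minimal sketch: "second highest salary per department" using DENSE_RANK().
# The employees table and its columns (dept_id, salary) are hypothetical.
spark = SparkSession.builder.appName("sql-pattern-sketch").getOrCreate()

spark.createDataFrame(
    [(1, "eng", 90000), (2, "eng", 120000), (3, "eng", 120000),
     (4, "hr", 70000), (5, "hr", 85000)],
    ["emp_id", "dept_id", "salary"],
).createOrReplaceTempView("employees")

spark.sql("""
    SELECT dept_id, salary AS second_highest_salary
    FROM (
        SELECT dept_id,
               salary,
               DENSE_RANK() OVER (PARTITION BY dept_id ORDER BY salary DESC) AS rnk
        FROM employees
    ) ranked
    WHERE rnk = 2
""").show()
```

LEAD and LAG use the same OVER (PARTITION BY ... ORDER BY ...) clause, so once this pattern feels natural, most window-function questions come down to choosing the right function and rank filter.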

2. Python:

• Strong Coding Skills: Develop strong coding skills in Python and practice solving easy to medium-level problems.

• Practice Platforms: Use LeetCode to get comfortable with common coding patterns like the two-pointer approach, sliding window, arrays, and strings (see the sketch after this list).

• Daily Practice: Aim to solve around 40 to 50 Python coding problems so that the common patterns become second nature.
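As a concrete example of these patterns, here is a small, self-contained sliding-window sketch for a classic question: the length of the longest substring without repeating characters. The function name and test string are my own choices for illustration.

```python
def longest_unique_substring(s: str) -> int:
    """Sliding-window sketch: longest substring without repeating characters."""
    last_seen = {}  # character -> index of its most recent occurrence
    start = 0       # left edge of the current window
    best = 0
    for i, ch in enumerate(s):
        # If ch already appears inside the current window, move the left edge past it.
        if ch in last_seen and last_seen[ch] >= start:
            start = last_seen[ch] + 1
        last_seen[ch] = i
        best = max(best, i - start + 1)
    return best

print(longest_unique_substring("abcabcbb"))  # 3, for the window "abc"
```

The two-pointer approach is the same idea in disguise: start and i are the two pointers, and because each pointer only moves forward, the solution stays linear time.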

3. PySpark:

• Write PySpark Code: Be proficient in writing PySpark code, and understand Spark internals and optimization techniques.

• Translating SQL to PySpark: Practice translating your SQL skills into the PySpark DataFrame API, focusing on writing optimized code (see the sketch after this list).

• Deep Dive into Spark:

• Understand Spark Internals: Gain a deep understanding of Spark internals, including the architecture and how Spark executes a job.

• Optimization Techniques: Focus on optimization techniques such as broadcast joins and partitioning, and understand out-of-memory errors and driver errors.

• Common Questions: Be prepared to answer common questions about Spark architecture, performance tuning, and error handling.

• Learning Resources: Learn Spark internals and Big Data fundamentals from resources like TrendyTech; I found their comprehensive content on these topics extremely helpful.
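To connect the SQL-to-PySpark and optimization points above, here is a minimal sketch that translates a SQL join-and-aggregate query into the DataFrame API and applies a broadcast join hint for the small dimension table. The orders and customers data are invented for illustration; be ready to explain that broadcasting is only safe when the broadcast side fits comfortably in memory, otherwise it becomes a common source of out-of-memory and driver errors.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pyspark-translation-sketch").getOrCreate()

# Hypothetical data: a large fact table (orders) and a small dimension table (customers).
orders = spark.createDataFrame(
    [(1, 101, 250.0), (2, 102, 80.0), (3, 101, 120.0)],
    ["order_id", "customer_id", "amount"],
)
customers = spark.createDataFrame(
    [(101, "IN"), (102, "US")],
    ["customer_id", "country"],
)

# SQL version of the question:
#   SELECT c.country, SUM(o.amount) AS total_amount
#   FROM orders o JOIN customers c ON o.customer_id = c.customer_id
#   GROUP BY c.country;
#
# DataFrame translation. broadcast() asks Spark to ship the small table to every
# executor, avoiding a shuffle of the large side.
result = (
    orders
    .join(F.broadcast(customers), on="customer_id", how="inner")
    .groupBy("country")
    .agg(F.sum("amount").alias("total_amount"))
)
result.show()

# Partitioning note: repartitioning the large table by the join or group key before
# a heavy aggregation can reduce skew, e.g. orders.repartition(200, "customer_id");
# the right partition count depends on data volume and cluster size.
```

Running result.explain() and pointing out the broadcast exchange in the physical plan is a good way to show you understand what the hint actually changes.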

4. Cloud Platforms:

• Hands-on Experience: Gain hands-on experience with at least one cloud platform. AWS is the most in demand, followed by Azure and GCP.

• Company Requirements: Some companies may not emphasize cloud skills, but having this knowledge is still beneficial.

Mock Interviews:

• Build Confidence: Start giving mock interviews as early as possible. This will help you build confidence and expose you to common interview questions.

• Repeated Practice: Continuous practice will prepare you for real interviews and help you perform better.

Key Takeaways:

• Practice Regularly: Consistent practice in SQL, Python, and PySpark is crucial.

• Understand Patterns: Focus on understanding patterns in coding problems.

• Mock Interviews: Participate in mock interviews to build confidence and identify areas for improvement.

• Cloud Skills: While not always required, having cloud skills can give you an edge.

These insights have been invaluable in my own journey and have helped me secure multiple offers with significant salary hikes. By focusing on these areas and practicing continuously, you’ll be well-prepared to crack data engineering interviews and land your dream job.


