Deploying Databricks on AWS

Databricks is a cloud-agnostic Platform as a Service (PaaS) offering available on all three major public clouds. In this ultra-short article we will see how to deploy a Databricks workspace on AWS.

You should have access to your AWS account with admin privileges (billing administrator) or at least a viewer role.

In my case, I am using a free AWS account and deploying Databricks on the free 14-day trial. (Do remember to cancel it before the 14 days are up.)

Let's start the step-by-step process.

Step 1: To sign up for a free trial of Databricks, fill in the relevant details on the signup page.



Step 2: Open the welcome email you received from Databricks after you signed up for an account and click the link to verify your email address and create your password.

When you click Submit, you are taken directly to the Databricks account console.


Step 3: Select a subscription plan. (I am choosing Premium.)


Step 4: Click Continue to open the Workspaces page.

Step 5: Click Create workspace to set up a Databricks workspace.


Step 6: Click Start Quickstart. You will be redirected to your AWS account. Log in and complete the CloudFormation template.


Step 7: Click Create stack and wait a few minutes (have a coffee, preferably). Yes, everything necessary is created for us by the template. Isn't that amazing?
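While the stack is building, you can poll its status from code instead of refreshing the console. A minimal sketch, assuming boto3 is configured with your AWS credentials; the stack name `databricks-workspace-stack` is a placeholder, yours is assigned by the Quickstart template:

```python
def stack_ready(stack: dict) -> bool:
    """Return True once CloudFormation reports the stack fully created."""
    return stack.get("StackStatus") == "CREATE_COMPLETE"

# Usage with boto3 (sketch, not run here):
# import boto3
# cfn = boto3.client("cloudformation")
# stack = cfn.describe_stacks(StackName="databricks-workspace-stack")["Stacks"][0]
# print(stack_ready(stack))

# Quick check of the helper itself with sample responses:
print(stack_ready({"StackStatus": "CREATE_COMPLETE"}))     # True
print(stack_ready({"StackStatus": "CREATE_IN_PROGRESS"}))  # False
```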


Step 8: Return to your Databricks account console. You will see your workspace listed there.


Step 9: Click on your workspace name.


Step 10: Click the workspace URL (yes, that's the one we were waiting for) and provide your ID and password. We are in our brand-new workspace, ready to do whatever we want: Data Engineering, Data Science, ML, Data Analysis, or SQL Analytics.


Note the 14-day trial counter at the top.

By default, a single-node compute cluster is already created. Its auto-termination is set to 60 minutes; I will lower it to 15 minutes to avoid unnecessarily eating up my EC2 machines.
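You can also lower auto-termination through the Databricks Clusters REST API instead of the UI. The `clusters/edit` endpoint expects the full cluster spec back, so the usual pattern is get, modify, post. A minimal sketch of the modify step (the cluster id, node type, and Spark version below are illustrative placeholders):

```python
def with_autotermination(spec: dict, minutes: int = 15) -> dict:
    """Return a copy of a cluster spec with autotermination_minutes changed.

    The Clusters API `clusters/edit` call requires the full spec,
    so we copy every field and override only the one we care about.
    """
    new_spec = dict(spec)
    new_spec["autotermination_minutes"] = minutes
    return new_spec

# REST usage (sketch, not run here):
#   GET  {host}/api/2.0/clusters/get?cluster_id=<id>   -> current spec
#   POST {host}/api/2.0/clusters/edit                  <- modified spec as JSON

current = {"cluster_id": "0123-abcdef", "spark_version": "11.3.x-scala2.12",
           "node_type_id": "i3.xlarge", "num_workers": 0,
           "autotermination_minutes": 60}
print(with_autotermination(current)["autotermination_minutes"])  # 15
```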


Step 11: Let's create a notebook and execute a few commands to confirm that everything is working fine.
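A couple of throwaway cells are enough for a smoke test. A minimal sketch; on Databricks the `spark` session object is predefined in every notebook, so the Spark lines are left commented here for readers following along without an attached cluster:

```python
# Cell 1: confirm the Python runtime responds
message = "Databricks workspace is up"
print(message)

# Cell 2: confirm Spark is attached.
# `spark` (a SparkSession) is predefined in Databricks notebooks:
# spark.range(10).count()             # expect 10
# spark.sql("SELECT 1 AS ok").show()  # expect a one-row table
```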


Wow!! See how quickly we managed to create a fresh Databricks workspace and start using it.

That's it, we are done. From here onwards, everything works the same way as in Azure, GCP, or Community Edition Databricks.

This marks the end of the article. I hope I was able to show you something new. Thanks for reading; please share your feedback in the comments, and like and share if you enjoyed the content.

Thanks!! Happy Sunday, happy learning!!

