Deploying Databricks on AWS
Deepak Rajak
Data Engineering / Advanced Analytics Technical Delivery Lead at Exusia, Inc.
Databricks is a cloud-agnostic Platform as a Service (PaaS) offering available on all three major public clouds. In this ultra-short article, we will see how to deploy a Databricks workspace on AWS.
You should have access to your AWS cloud account with admin privileges (billing administrator) or at least a viewer role.
In my case, I am using a free AWS account and deploying Databricks on the free 14-day trial. (Do remember to cancel it before the 14 days are up.)
Let's start the step-by-step process.
Step 1: To sign up for a free trial of Databricks, fill in the relevant details on the signup page.
Step 2: Open the welcome email you received from Databricks after you signed up for an account and click the link to verify your email address and create your password.
When you click Submit, you are taken directly to the Databricks account console.
Step 3: Select a subscription plan. (I am choosing Premium.)
Step 4: Click Continue to open the Workspaces page.
Step 5: Click Create workspace to set up a Databricks workspace.
Step 6: Click Start Quickstart. You will be navigated to your AWS account. Log in to your account and complete the CloudFormation template.
Step 7: Click Create stack and wait a few minutes [have a coffee, preferably :)]. Yes, the template creates all the necessary resources for us (isn't that amazing?). If you would rather watch the provisioning from code than from the console, see the sketch below.
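Here is a minimal sketch that polls the stack status with boto3. It assumes your AWS credentials are configured locally; the region and stack name are placeholders you must replace with the values shown in your CloudFormation console.

```python
# Hypothetical example: poll a CloudFormation stack until it finishes.
# Replace the region and stack name with your own values.
import time

import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")
STACK_NAME = "databricks-workspace-stack"  # placeholder; copy yours from the console

while True:
    stack = cfn.describe_stacks(StackName=STACK_NAME)["Stacks"][0]
    status = stack["StackStatus"]
    print(status)
    # CREATE_COMPLETE means the workspace resources are ready;
    # any FAILED or ROLLBACK status means something went wrong.
    if status.endswith("_COMPLETE") or "FAILED" in status or "ROLLBACK" in status:
        break
    time.sleep(30)  # check every 30 seconds
```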
Step 8: Return to your Databricks account. You will see your workspace appear there.
Step 9: Click on your workspace name.
Step 10: Click on the workspace URL (yes, that's the one we were waiting for) and provide your ID/password. We are in our brand-new workspace and ready to do whatever we want, i.e. data engineering, data science, ML, data analysis, or SQL analytics.
Note the 14-day trial counter at the top.
By default, a single-node compute cluster is already created. (Its auto-termination is set to 60 minutes; I will reduce it to 15 minutes to avoid unnecessarily eating up my EC2 machines. You can do this from the cluster's edit page in the UI, or from code as sketched below.)
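If you prefer to change the auto-termination from code, here is a hedged sketch against the Databricks Clusters REST API (2.0). The workspace URL, personal access token, and cluster ID are placeholders; note that the edit endpoint expects a full cluster spec, which is why the sketch copies the key fields from the current spec first.

```python
# Hypothetical example: lower a cluster's auto-termination to 15 minutes
# via the Databricks Clusters REST API. All <...> values are placeholders.
import requests

HOST = "https://<your-workspace-url>"  # e.g. https://dbc-xxxx.cloud.databricks.com
HEADERS = {"Authorization": "Bearer <your-personal-access-token>"}
CLUSTER_ID = "<your-cluster-id>"

# Fetch the current cluster spec; clusters/edit expects a full spec back.
spec = requests.get(
    f"{HOST}/api/2.0/clusters/get",
    headers=HEADERS,
    params={"cluster_id": CLUSTER_ID},
).json()

payload = {
    "cluster_id": CLUSTER_ID,
    "cluster_name": spec["cluster_name"],
    "spark_version": spec["spark_version"],
    "node_type_id": spec["node_type_id"],
    "num_workers": spec.get("num_workers", 0),  # 0 for a single-node cluster
    "spark_conf": spec.get("spark_conf", {}),   # carries the single-node profile
    "custom_tags": spec.get("custom_tags", {}),
    "autotermination_minutes": 15,              # down from the default 60
}

resp = requests.post(f"{HOST}/api/2.0/clusters/edit", headers=HEADERS, json=payload)
resp.raise_for_status()
```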
Step 11: Let's create a notebook and execute a few commands, such as the ones below, to confirm that everything is working fine.
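A few throwaway commands like these are enough to verify that compute and storage are wired up. The `spark` session and `dbutils` helper are predefined in every Databricks notebook, so no imports are needed.

```python
# Confirm the Spark runtime is up and note its version.
print(spark.version)

# Build a tiny DataFrame to exercise the cluster.
df = spark.range(10).toDF("n")
df.show()

# List the root of the Databricks File System to confirm storage access.
display(dbutils.fs.ls("/"))
```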
Wow!! See how quickly we managed to create a fresh Databricks workspace and start using it.
That's it. We are done. From here onwards, everything works the same way as in Databricks on Azure, GCP, or the Community Edition.
This marks the end of this article. I hope I was able to give you something new to learn. Thanks for reading; please share your feedback in the comments section, and like & share if you enjoyed the content.
Thanks!! Happy Sunday, happy learning!!