Install Spark on Windows (Local machine) with PySpark – Step by Step - SQLRelease

In this post, we will learn how we can install Apache Spark on a local Windows machine in a pseudo-distributed mode (managed by Spark’s standalone cluster manager) and run it using PySpark (Spark’s Python API). This is useful if we want to learn Spark but don't have access to a cloud-based Spark cluster. We can configure our local machine in such a way that we can start using it in a pseudo-distributed mode for Spark development.
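Once the installation described in the article is complete, a quick way to verify the local setup is to start a SparkSession from PySpark and run a small job. Below is a minimal sketch, assuming Spark and PySpark are already installed and the relevant environment variables (e.g. SPARK_HOME) are configured; the app name and sample data are placeholders. It uses local[*] for a simple single-process check — if you have started the standalone master, you could point .master() at its URL (spark://localhost:7077 by default) instead.

```python
# Minimal smoke test for a local Spark + PySpark setup (sketch, not the article's exact steps).
from pyspark.sql import SparkSession

# Create a session; "local[*]" runs Spark in local mode using all available CPU cores.
spark = (
    SparkSession.builder
    .appName("local-spark-test")   # placeholder application name
    .master("local[*]")
    .getOrCreate()
)

# Build a tiny DataFrame and display it to confirm the session works.
df = spark.createDataFrame(
    [(1, "Alice"), (2, "Bob")],
    ["id", "name"],
)
df.show()

spark.stop()
```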

#Spark #PySpark #SQLRelease #Python

https://bit.ly/2Wi1Ocn

https://www.dhirubhai.net/feed/update/urn:li:activity:6645575245459746816

