Install Spark on Windows (Local machine) with PySpark – Step by Step
Gopal Krishna Ranjan
Computer Scientist @Adobe | Blogger at SQLRelease.com | Data Engineering, Data Science, and Machine Learning enthusiast
In this post, we will learn how to install Apache Spark on a local Windows machine in pseudo-distributed mode (managed by Spark’s standalone cluster manager) and run it using PySpark (Spark’s Python API). This is useful if we want to learn Spark but don’t have access to a cloud-based Spark cluster. With a little configuration, a local machine can serve as a pseudo-distributed environment for Spark development.
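As a quick illustration of what the finished setup lets us do, here is a minimal sanity-check script. It assumes Spark and PySpark are already installed and that the usual Windows prerequisites (such as SPARK_HOME and a Hadoop winutils.exe, which the full post walks through) are in place; it simply starts a local Spark session, creates a tiny DataFrame, and prints the Spark version.

# Minimal PySpark sanity check, assuming the local installation is complete.
from pyspark.sql import SparkSession

# "local[*]" runs Spark locally using all available CPU cores.
spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("LocalSparkCheck")
    .getOrCreate()
)

# Create a tiny DataFrame and display it to confirm Spark is working.
df = spark.createDataFrame([(1, "spark"), (2, "pyspark")], ["id", "name"])
df.show()

print(spark.version)
spark.stop()

If the DataFrame and the Spark version print without errors, the local installation is working and ready for development.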
#Spark #PySpark #SQLRelease #Python