Pinpoint Spark Jobs to Optimize Using the Spark UI
Sai Prasad Padhy
Senior Big Data Engineer | Azure Data Engineer | Hadoop | PySpark | ADF | SQL
When working with Spark, knowing which jobs to optimize can save a lot of time and resources. The Spark UI is a powerful tool to help you identify the jobs that need attention.
We can do this by adding logging to our Spark jobs, so that each job shows up in the Spark UI under a meaningful label instead of an anonymous job ID.
Here is how you can do this:
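A minimal PySpark sketch of the idea, using SparkContext.setJobDescription to label jobs (the file paths and column names here are illustrative, not from the original post):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("job-labeling-demo").getOrCreate()
sc = spark.sparkContext

# Label the next job(s); this text appears on the Jobs page of the Spark UI.
sc.setJobDescription("Load and filter orders")
orders = spark.read.parquet("/data/orders")  # hypothetical input path
high_value = orders.filter("amount > 1000")
high_value.count()  # action triggers a job, recorded under the label above

# Change the label before the next logical step.
sc.setJobDescription("Aggregate high-value orders by customer")
(high_value.groupBy("customer_id")
           .count()
           .write.mode("overwrite")
           .parquet("/data/high_value_by_customer"))  # hypothetical output path

# Clear the description so later jobs are not mislabeled.
sc.setJobDescription(None)

Once jobs are labeled this way, the Jobs page of the Spark UI shows the description next to each job's duration, stages, and tasks, so the slow jobs are easy to spot and drill into.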
With the labels in place, the Spark UI looks like this: