Course: Cloud Hadoop: Scaling Apache Spark

Scaling Apache Hadoop and Spark

- [Lynn] Using best practices for Spark-based distributed processing is key to ensuring your applications run efficiently. I'm Lynn Langit. I'll show you how to use Spark patterns for scaling all types of workloads, so let's jump in and build effective Spark solutions.

Contents