课程: Apache Spark Deep Learning Essential Training
Using deep learning in Spark
- [Instructor] There are three major ways to do deep learning in Spark: inference, transfer learning, and model training. Let's look at each of them in turn in a little more detail. We could take a model such as ResNet-50, Inception V3, or VGG16, which are deep learning models that have been trained on the ImageNet dataset, and apply it to a large dataset in parallel using Spark. This allows you to apply image classification to a large collection of images very quickly and put each image into one of the 1,000 ImageNet classes. You could take the YOLOv3 model, for example, which is used for object detection, and apply it in parallel using a Spark function. You could also use PySpark's map function to perform distributed inference by calling TensorFlow, Keras, or PyTorch. The second way to use deep learning in Spark is via transfer learning. Transfer learning is using a trained neural network that would have been trained on a dataset that is similar to the one you're…
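As a rough illustration of the first approach, distributed inference, the sketch below applies a pretrained ImageNet model (ResNet-50 from Keras) to a handful of image paths using PySpark's mapPartitions. The image paths and the partition-level model-loading pattern are assumptions for illustration, not code from the course.

```python
# A minimal sketch of distributed inference with a pretrained Keras model,
# assuming image files are readable at the paths below (hypothetical data).
from pyspark.sql import SparkSession
import numpy as np

spark = SparkSession.builder.appName("distributed-inference").getOrCreate()

# Hypothetical list of image file paths to classify in parallel.
image_paths = ["images/cat.jpg", "images/dog.jpg", "images/plane.jpg"]
rdd = spark.sparkContext.parallelize(image_paths, numSlices=4)

def classify_partition(paths):
    # Load the pretrained ImageNet model once per partition rather than
    # once per image, so each executor pays the setup cost only a few times.
    from tensorflow.keras.applications.resnet50 import (
        ResNet50, preprocess_input, decode_predictions)
    from tensorflow.keras.preprocessing import image
    model = ResNet50(weights="imagenet")
    for p in paths:
        img = image.load_img(p, target_size=(224, 224))
        x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
        top = decode_predictions(model.predict(x), top=1)[0][0]
        # Yield (path, predicted ImageNet class label, confidence score).
        yield (p, top[1], float(top[2]))

results = rdd.mapPartitions(classify_partition).collect()
for path, label, score in results:
    print(path, label, round(score, 3))
```

The same pattern extends to other pretrained networks; the key design choice is loading the model inside the partition function so the weights are instantiated on the executors rather than serialized from the driver.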