Course: Google Cloud Professional Machine Learning Engineer Cert Prep


TensorFlow serving with a GPU-enabled Docker

- [Narrator] Here we have TensorFlow in production, and TensorFlow Serving with Docker is one of the easiest ways to serve a TensorFlow-based model. In fact, let's take a look at a CPU-based workflow first, and then we'll move on to GPU. First up, we need to download the TensorFlow Serving Docker image and repo. So I'm going to go ahead and grab this, go into an environment I already have running here, and get this going. There we go. We're able to do that pull; I already have it loaded. Next up, I'm going to make sure that I have this repo cloned, which is the serving code. In fact, I do, so that's great. Next step, let's go over and run the TensorFlow Serving container and open up the REST API port. Because I'm doing this in GitHub Codespaces, it should work out quite well. Let's go ahead and do this. Awesome. So we can see the…
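For reference, the CPU workflow the narrator walks through corresponds roughly to the commands below, following the standard TensorFlow Serving Docker documentation. The half_plus_two test model and its path inside the cloned serving repo are used only as an illustration, and the GPU variant at the end assumes an NVIDIA GPU with the NVIDIA container toolkit installed; the exact commands typed in the video may differ.

```bash
# Pull the TensorFlow Serving image (CPU build).
docker pull tensorflow/serving

# Clone the serving repo, which ships with small test models.
git clone https://github.com/tensorflow/serving

# Path to a sample SavedModel included in the repo (illustrative).
TESTDATA="$(pwd)/serving/tensorflow_serving/servables/tensorflow/testdata"

# Run the serving container and expose the REST API on port 8501.
docker run -t --rm -p 8501:8501 \
    -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
    -e MODEL_NAME=half_plus_two \
    tensorflow/serving &

# Query the model over REST to confirm it is serving.
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
    -X POST http://localhost:8501/v1/models/half_plus_two:predict

# GPU variant: pull the GPU image and pass the GPUs through to the container
# (assumes the NVIDIA container toolkit is installed on the host).
docker pull tensorflow/serving:latest-gpu
docker run --gpus all -t --rm -p 8501:8501 \
    -v "$TESTDATA/saved_model_half_plus_two_gpu:/models/half_plus_two" \
    -e MODEL_NAME=half_plus_two \
    tensorflow/serving:latest-gpu &
```

The half_plus_two test model computes 0.5 * x + 2, so if the container starts cleanly the curl call should return predictions of 2.5, 3.0, and 4.5 for the inputs above.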
