Course: AI Engineering Use Cases and Projects on AWS: Production-Grade LLM Systems


Rust LLM project extension

Welcome to the Capstone challenge. In this Capstone challenge, I already have a working project that does model proxy routing, so you can send a request to one model or another: you can use local Ollama with DeepSeek, or send it to Bedrock. But let's take it to the next level and look at how to build your own context-aware processing extension. If we look at the diagram, first up we have a context-aware LLM processing system. In this case, we're using Rust to build an input layer, a context manager, a template engine, and an async runtime. So we get extremely high performance, about as fast as it gets with Rust, plus incredible memory safety, and great project tooling: I can just put this binary on my path and, boom, we're ready to go. Let's walk through what the responsibilities will be here. So first, in phase one, you're going to build a…
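To make the diagram concrete, here is a minimal sketch of two of the pieces named above, a context manager and a template engine, in plain Rust. The struct and function names (`ContextManager`, `render`) are hypothetical and not from the course project, and the real project would layer an async runtime such as Tokio on top; this std-only version just shows the data flow from conversation history into a rendered prompt.

```rust
use std::collections::HashMap;

/// Hypothetical context manager: keeps a rolling window of recent turns
/// so the prompt stays within the model's context budget.
struct ContextManager {
    history: Vec<String>,
    max_turns: usize,
}

impl ContextManager {
    fn new(max_turns: usize) -> Self {
        Self { history: Vec::new(), max_turns }
    }

    /// Add a turn; evict the oldest one once the window is full.
    fn push(&mut self, turn: String) {
        self.history.push(turn);
        if self.history.len() > self.max_turns {
            self.history.remove(0);
        }
    }

    /// Join the retained turns into a single context block.
    fn window(&self) -> String {
        self.history.join("\n")
    }
}

/// Hypothetical template engine: substitutes `{key}` placeholders
/// with values from a variable map.
fn render(template: &str, vars: &HashMap<&str, String>) -> String {
    let mut out = template.to_string();
    for (k, v) in vars {
        out = out.replace(&format!("{{{}}}", k), v);
    }
    out
}

fn main() {
    let mut ctx = ContextManager::new(2);
    ctx.push("user: hello".to_string());
    ctx.push("assistant: hi".to_string());
    ctx.push("user: route this to Bedrock".to_string()); // oldest turn is evicted

    let mut vars = HashMap::new();
    vars.insert("context", ctx.window());
    vars.insert("question", "Which model should handle this?".to_string());

    let prompt = render("Context:\n{context}\n\nQ: {question}", &vars);
    println!("{prompt}");
}
```

In the full system, the input layer would feed turns into the context manager, and the rendered prompt would be handed to the proxy router that picks Ollama or Bedrock.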
