Semantic Kernel: A new way to integrate AI into your Apps
Semantic Kernel with Saket Kumar

Have you ever wanted to use the power of large language models (LLMs) from providers like OpenAI, Azure OpenAI, or Hugging Face in your applications, but found it too hard or time-consuming to do so? If so, you might be interested in Semantic Kernel, a new open-source SDK that lets you easily combine AI services with conventional programming languages like C#, Python, and Java.

What is Semantic Kernel?

Semantic Kernel is an SDK that allows you to define plugins that can be chained together with AI models. Plugins are composed of prompts and native functions that can respond to triggers and perform actions. For example, you can create a plugin that generates a poem, a story, a code snippet, or graphic art based on a user's input.
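To make the plugin idea concrete, here is a minimal Python sketch of the two ingredients: a "prompt function" (a templated prompt sent to an LLM) and a "native function" (ordinary code). The class and function names are hypothetical, and the LLM call is stubbed out; in the real SDK these would be registered with the kernel and backed by a model service.

```python
def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM completion call (no API key needed)."""
    return f"[model output for: {prompt}]"

class PoetryPlugin:
    """Hypothetical plugin grouping a prompt function and a native function."""

    def write_poem(self, topic: str) -> str:
        # Prompt function: fill a template, then send it to the model.
        prompt = f"Write a short poem about {topic}."
        return fake_llm(prompt)

    def count_words(self, text: str) -> int:
        # Native function: plain code, no model involved.
        return len(text.split())

plugin = PoetryPlugin()
poem = plugin.write_poem("autumn")
print(plugin.count_words(poem))
```

The point of the split is that the kernel can invoke both kinds of function through one interface, so model calls and regular code compose freely.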

Semantic Kernel also allows you to orchestrate plugins with AI. With Semantic Kernel planners, you can ask an LLM to generate a plan that achieves a user’s unique goal. Afterwards, Semantic Kernel will execute the plan for the user. For example, you can create a planner that can help a user write a blog post, a resume, a song, or a speech.
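The planner pattern described above can be sketched in a few lines: ask a model to choose an ordered list of available functions for a goal, then execute that plan step by step. The "model" here is a stub returning a fixed plan, and all names are illustrative, not the real Semantic Kernel planner API.

```python
def fake_plan_llm(goal: str, available: list[str]) -> list[str]:
    """Stand-in for an LLM that drafts a plan from the available functions."""
    return [name for name in available if name in ("outline", "draft")]

# Hypothetical plugin functions the planner may choose from.
FUNCTIONS = {
    "outline": lambda text: f"Outline for: {text}",
    "draft": lambda text: f"Draft based on ({text})",
}

def run_plan(goal: str) -> str:
    plan = fake_plan_llm(goal, list(FUNCTIONS))
    result = goal
    for step in plan:  # execute each planned step in order
        result = FUNCTIONS[step](result)
    return result

print(run_plan("a blog post about Semantic Kernel"))
```

In the real SDK the plan comes from an LLM at runtime, so the same code can pursue goals the developer never anticipated.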

Why use Semantic Kernel?

Semantic Kernel makes it easy to integrate AI services into your existing apps and services. You can use Semantic Kernel to leverage the same AI orchestration patterns that power Microsoft 365 Copilot and Bing in your own apps, while still using your existing development skills and investments.

Semantic Kernel also makes AI development extensible. You can use Semantic Kernel to orchestrate plugins from different sources, such as OpenAI, Azure OpenAI, or Hugging Face, on top of nearly any model. You can also use Semantic Kernel to add memories and skills to your applications with AI plugins that allow you to interact with the real world.
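The "memories" idea can be illustrated with a toy store-and-recall class. Real Semantic Kernel memory uses embeddings and vector stores; the keyword-overlap scoring below is a deliberately crude stand-in, and the class name is hypothetical.

```python
class SimpleMemory:
    """Toy memory: save text snippets, recall the most relevant one."""

    def __init__(self):
        self.items: list[str] = []

    def save(self, text: str) -> None:
        self.items.append(text)

    def recall(self, query: str) -> str:
        # Score each stored item by how many query words it shares.
        words = set(query.lower().split())
        return max(self.items,
                   key=lambda t: len(words & set(t.lower().split())))

memory = SimpleMemory()
memory.save("The user prefers C# examples.")
memory.save("The user's favorite color is blue.")
print(memory.recall("what is the user's favorite color"))
```

Swapping the overlap score for embedding similarity is what turns this toy into the semantic recall that lets plugins ground their prompts in prior context.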

Benefits of using Semantic Kernel

  • Create AI apps that combine the best of both worlds: the natural language understanding and generation of LLMs and the logic and functionality of conventional programming languages.
  • Save time and effort by using pre-defined prompts and plugins that can handle common tasks and scenarios.
  • Customize and extend your AI apps by creating your own prompts and plugins that suit your specific needs and preferences.
  • Deploy and scale your AI apps easily by using Semantic Kernel’s cloud-based services and connectors.

Prerequisites for using Semantic Kernel

To get started with Semantic Kernel, you need:

  • An API key from either OpenAI or Azure OpenAI.
  • A preferred programming language: C#, Python, or Java.
  • A preferred development environment: Visual Studio or VS Code.
  • The Semantic Kernel SDK for your language: you can install it from NuGet (C#), PyPI (Python), or Maven (Java).

You can find more details and tutorials on how to use Semantic Kernel on Microsoft Learn.

