Semantic Kernel - Plugins - Semantic Functions

Continuing from our previous article on Semantic Kernel Basics, let's now progress into a deeper understanding of its functionalities and practical applications.

In the last article, we set up the AI services and used prompt templates. In this article, let's explore one of the core components of SK: the Plugin.

The Challenge

Last time, the sample we executed didn't scale well toward building an AI orchestrator. This post focuses on improving it by moving the semantic function into a Plugin, which makes it reusable across applications and scenarios. It also allows us to refine and alter the prompt without recompiling the application.

What is a Plugin?

Plugins are composed of prompts and native functions. In traditional programming, we write a function that performs actions based on the logic written inside it; in prompt engineering, we craft prompts to guide an AI model's behavior. Plugins let you organize the actions your application can perform. Your application likely has a variety of prompts, each serving a different purpose when interacting with AI models. Rather than dispersing these prompts throughout your code, Semantic Kernel allows you to organize them neatly into Plugins and Functions within a folder structure. In this context, a prompt is treated as a Function that can be invoked through Semantic Kernel, while a Plugin is a group of related Functions (prompts) that collectively accomplish a shared task.

Within a plugin, you can create two types of functions: prompt functions and native functions. In this post, we will see how to create a prompt function.

Prompt Function

A prompt function expects natural language input and uses an LLM to interpret what is being asked, then acts accordingly to return an appropriate response.


In Semantic Kernel, a prompt function (also known as a semantic function) is composed of two components:

  • Prompt Template: the natural language query or command that will be sent to the LLM.
  • Configuration object: contains the settings and options for the semantic function, such as the service that it should use, the parameters it should expect, and the description of what the function does.
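Both components can also be defined inline in code rather than as files on disk. Here is a minimal sketch using the .NET CreateFunctionFromPrompt API; the prompt text and settings are illustrative, and it assumes a kernel with a chat completion service is already registered:

// Inline prompt function: the prompt template plus its execution settings and description.
// Assumes "kernel" already has a chat completion service registered (OpenAI connector here).
var titleFunction = kernel.CreateFunctionFromPrompt(
    promptTemplate: "GENERATE EXACTLY ONE COURSE TITLE ABOUT: {{$topic}}",
    executionSettings: new OpenAIPromptExecutionSettings { MaxTokens = 500, Temperature = 0.2 },
    functionName: "Title",
    description: "A function that generates a course title");

In this post, however, we will keep the two components in separate files so that the prompt can be refined without recompiling the application.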

Creating a Prompt Function

In this post, we will create a plugin called CoursePlan, which contains three functions: Title, Chapters, and Plan. Our goal with this plugin is to take a topic as input and generate a course plan consisting of a title, chapters, and a study plan. Each function stores its prompt in an skprompt.txt file, along with a dedicated configuration detailed in config.json.

For our CoursePlan example, we can organize our prompts into the following folder structure:

Plugins/
└── CoursePlan/
    ├── Title/
    │   ├── config.json
    │   └── skprompt.txt
    ├── Chapters/
    │   ├── config.json
    │   └── skprompt.txt
    └── Plan/
        ├── config.json
        └── skprompt.txt

Inside the CoursePlan folder, each function is defined by a folder of its own containing two files. The first is the file containing the prompt itself, which is simply a text file called skprompt.txt.

Similar to passing parameters into a function in traditional programming languages, you can structure prompts with parameters in Semantic Kernel. For example, the Title function could be formulated in this manner:

GENERATE EXACTLY ONE COURSE TITLE FOR THE COURSE TOPIC BELOW

TITLE MUST BE:
- CATCHY AND INFORMATIVE
- LENGTH IS 60-70 CHARACTERS
- STRAIGHTFORWARD AND EASY TO UNDERSTAND

GENERATE THE TITLE ABOUT:
+++++
{{$topic}}
+++++

skprompt.txt


The second file, config.json, is used to set up the LLM parameters and describe what the prompt does. In the input section, we specify the input parameters that the prompt accepts; the parameter name here (topic) must match the {{$topic}} variable in skprompt.txt and the key we will later set in KernelArguments.

{
  "schema": 1,
  "type": "completion",
  "description": "a function that generates course title",
  "completion": {
    "max_tokens": 500,
    "temperature": 0.2
  },
  "input": {
    "parameters": [
      {
        "name": "topic",
        "description": "The topic to generate a course module for",
        "defaultValue": ""
      }
    ]
  }
}        

config.json

Executing the Prompt Function Plugin

First, we have to set up the kernel as we did in the last post, and then import the plugin into it.
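If you don't have the kernel from the previous post handy, a minimal setup might look like the sketch below (the connector, model id, and environment variable name are assumptions; use whatever service you configured last time):

// Minimal kernel setup sketch; swap in your own connector, model, and key.
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        modelId: "gpt-3.5-turbo",
        apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!)
    .Build();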

var pluginsDirectory = Path.Combine(Directory.GetCurrentDirectory(), "Plugins/CoursePlan");
var courseModulePlugin = kernel.ImportPluginFromPromptDirectory(pluginsDirectory, "CoursePlan");

Here we get the full path of the plugin directory within our application and then import all the functions included in the plugin.

Now we can invoke the function in the same way we previously did when passing a hardcoded prompt directly.

KernelArguments variables = new KernelArguments
{
    { "topic", "OpenAI." }
};

var title = await kernel.InvokeAsync(courseModulePlugin["Title"], variables);
Console.WriteLine(title);
variables.Add("title", title.GetValue<string>());

Here the kernel executes the prompt function Title and returns a title for the topic we provided. We then add the title to the KernelArguments collection so that it can be passed as an input parameter to the next function, Chapters.

var chapters = await kernel.InvokeAsync(courseModulePlugin["Chapters"], variables);
Console.WriteLine(chapters);
variables.Add("chapters", chapters.GetValue<string>());

We have now executed the Chapters function to generate chapters for the course title produced by the previous execution.

Let's now execute our third function, Plan. It uses the chapters generated by the previous function to produce a study plan for them.

var studyPlan = await kernel.InvokeAsync(courseModulePlugin["Plan"], variables);
Console.WriteLine(studyPlan);        
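If you plan to reuse this flow elsewhere, the three calls can be wrapped in a small helper. The sketch below simply chains the calls we just made; the method name is ours, not part of Semantic Kernel:

// Hypothetical helper that chains the three CoursePlan functions for a given topic.
static async Task<string?> GenerateCoursePlanAsync(Kernel kernel, KernelPlugin coursePlan, string topic)
{
    var args = new KernelArguments { { "topic", topic } };

    var title = await kernel.InvokeAsync(coursePlan["Title"], args);
    args.Add("title", title.GetValue<string>());

    var chapters = await kernel.InvokeAsync(coursePlan["Chapters"], args);
    args.Add("chapters", chapters.GetValue<string>());

    // Plan sees the accumulated topic, title, and chapters through the same arguments.
    var plan = await kernel.InvokeAsync(coursePlan["Plan"], args);
    return plan.GetValue<string>();
}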

The result is a title, a table of chapters, and a study plan for the topic we provided.

Prompt Template Syntax

The Semantic Kernel prompt template language is a simple and powerful way to define and compose AI functions using plain text. It supports three basic features that allow you to:

1. Include variables,
2. Call external functions, and
3. Pass parameters to functions.

Double curly braces have a special use case in the template; they are used to inject variables, values, and functions into templates.
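For example, all three forms use the same double-brace syntax (the values below are purely illustrative):

{{$topic}}                       injects the value of the topic variable
{{CoursePlan.Title $topic}}      calls the Title function, passing a variable
{{CoursePlan.Title "OpenAI"}}    calls the Title function, passing a literal value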

The Title function has a variable called {{$topic}}, which accepts an input parameter used to generate the title.

To call an external function and pass a parameter to it, use the following syntax: {{namespace.functionName $varName}} or {{namespace.functionName "value"}}. For example, we can call the Title function inside the Chapters function as shown below:

GENERATE A TABLE OF CHAPTERS FOR THE COURSE

THE TABLE OF CHAPTERS MUST BE:
- PROVIDE A CLEAR, CONCISE OVERVIEW OF THE COURSE CONTENT
- STRUCTURED IN A LOGICAL WAY THAT REFLECTS THE FLOW OF THE COURSE CONTENT
- INCLUDE ONLY MAIN CHAPTERS IN THE COURSE, DO NOT INCLUDE SUB CHAPTERS
- USE CLEAR, UNDERSTANDABLE LANGUAGE 
- IN MARKDOWN FORMAT
- SEPARATE EACH CHAPTER BY AN EMPTY LINE


GENERATE THE TABLE OF CONTENT FOR COURSE WITH MAIN CHAPTERS:
+++++
{{CoursePlan.Title $topic}}
+++++        

Now execute the following code.

var chapters = await kernel.InvokeAsync(courseModulePlugin["Chapters"], variables);
Console.WriteLine(chapters);

When we run the Chapters function, it executes the Title function first. It then places the response, the course title, into the Chapters prompt and runs Chapters to generate a list of chapters for that title. The nested call {{CoursePlan.Title $topic}} resolves because we imported the plugin under the name CoursePlan.

Please find the GitHub repository link here for the complete code.


