Building an LLM Agent Using Semantic Kernel SDK

So, one of my recent projects involved building a fitness plugin to provide personalized fitness advice and a meal planner, all through an LLM agent. While searching for the best available tools, I came across Semantic Kernel, a Microsoft SDK that makes it easier to incorporate AI into existing applications.

Semantic Kernel allows developers to seamlessly integrate advanced AI technologies with native code, unlocking a host of new opportunities for AI-driven applications.

Semantic Kernel ships with a built-in Planner. Intrigued by the potential of enhancing an LLM's capabilities, I decided to give it a shot. Let's explore Semantic Kernel and its built-in Planner feature.

Benefits of Semantic Kernel

Using the Semantic Kernel offers several distinct benefits that can significantly enhance the development and operation of applications utilizing LLMs.

  1. Fast Integration: The Semantic Kernel is designed for quick and easy embedding into any type of application. This feature enables developers to rapidly prototype, test, and deploy AI functionalities within their apps, reducing development time and accelerating time to market.
  2. Extensibility: With Semantic Kernel, your applications can easily connect to external data sources and services. This integration capability allows your apps to utilize natural language processing alongside real-time information, expanding the scope and effectiveness of AI interactions.
  3. Better Prompting: The Semantic Kernel includes templated prompts that streamline the creation of semantic functions. These templates enable dynamic content and behavior customization, making it simpler to design complex interactions tailored to the needs of your users (a small prompt-template sketch follows this list).
  4. Reusable Code: One of the most powerful features of the Semantic Kernel is the reusability of its components. Skills developed for one application can be easily adapted and reused in new projects. This not only saves time but also allows for the consistent application of proven solutions across different applications, enhancing both efficiency and reliability.
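
To make the "Better Prompting" point concrete, here is a minimal sketch of a templated prompt, assuming the KernelFunctionFromPrompt class and the {{$variable}} template syntax used later in this article; the plugin and function names are hypothetical placeholders.

from semantic_kernel.functions import KernelFunctionFromPrompt

# A templated prompt: {{$city}} is filled in with a value at invocation time
recommend_func = KernelFunctionFromPrompt(
    function_name="RecommendWorkout",   # hypothetical name for illustration
    plugin_name="FitnessPlugin",        # hypothetical plugin name
    prompt="""
    Suggest a 30-minute workout for someone living in {{$city}},
    taking the typical local weather into account.
    """,
)

# Once registered with a kernel that has a chat service, it could be invoked like:
# result = await kernel.invoke(recommend_func, city="Seattle")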

Components of Semantic Kernel

When creating an application with Semantic Kernel, there are several components at our disposal that can enhance the user experience.


Semantic Kernel Components Architecture

Below is a list of key components that make up the Semantic Kernel framework:

  • Kernel
  • Memories
  • Planner
  • Connectors
  • Plugins

Here, the focus will be on the Planner.

Planner

The Planner is a central feature of the Semantic Kernel, critical to its operation and effectiveness. It utilizes a combination of native and semantic functions that are registered within the kernel. The planner is designed to intelligently formulate an execution plan based on the user's request. This functionality allows the Planner to analyze the input, determine the necessary steps, and sequence them effectively to achieve the desired outcome.


Source: Automatically orchestrate AI with planners
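
To make this more concrete, here is a minimal sketch of how a native function is registered with the kernel so a planner can discover it. The FitnessPlugin class and its function are hypothetical, and the kernel_function decorator import assumes a recent version of the Python SDK.

from semantic_kernel import Kernel
from semantic_kernel.functions import kernel_function

# A native plugin: plain Python methods exposed to planners via the decorator
class FitnessPlugin:
    @kernel_function(name="calculate_bmi", description="Calculate BMI from weight (kg) and height (m)")
    def calculate_bmi(self, weight_kg: float, height_m: float) -> str:
        return f"{weight_kg / (height_m ** 2):.1f}"

kernel = Kernel()
kernel.add_plugin(FitnessPlugin(), plugin_name="FitnessPlugin")
# A planner created over this kernel can now include FitnessPlugin.calculate_bmi in its plans,
# alongside any semantic (prompt-based) functions registered the same way.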

When integrating planning functionality using the Semantic Kernel SDK, developers can choose from a variety of planners tailored to different use cases.

  • SequentialPlanner
  • BasicPlanner
  • ActionPlanner
  • StepwisePlanner

Each of these planners offers unique advantages, depending on the complexity of the tasks and the specific requirements of the application being developed. Let's dive deeper into these planners, and I will provide code snippets for you to test and evaluate their capabilities.

Basic Planner

The BasicPlanner within the Semantic Kernel is designed to produce a JSON-based execution plan that addresses user requests in a sequential manner. This planner takes the provided input and systematically creates a step-by-step plan where each step is evaluated in order. This approach is particularly useful for straightforward tasks that require a clear and orderly progression of operations.

from semantic_kernel import Kernel
from semantic_kernel.functions import KernelFunctionFromPrompt
# Import paths below match the 0.9.x releases of the Python SDK; newer versions may differ
from semantic_kernel.connectors.ai.open_ai import (
    AzureChatCompletion,
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.core_plugins import TextPlugin
from semantic_kernel.planners import BasicPlanner
from semantic_kernel.utils.settings import azure_openai_settings_from_dot_env, openai_settings_from_dot_env

# Initialize the Kernel
kernel = Kernel()
service_id = "default"

# Select which AI service to use (the SDK sample notebooks define a similar `Service` enum)
class Service:
    OpenAI = "OpenAI"
    AzureOpenAI = "AzureOpenAI"

selectedService = Service.OpenAI

# Add services based on the selected service
if selectedService == Service.OpenAI:
    api_key, org_id = openai_settings_from_dot_env()
    kernel.add_service(
        OpenAIChatCompletion(service_id=service_id, ai_model_id="gpt-3.5-turbo-1106", api_key=api_key, org_id=org_id)
    )
elif selectedService == Service.AzureOpenAI:
    deployment, api_key, endpoint = azure_openai_settings_from_dot_env()
    kernel.add_service(
        AzureChatCompletion(service_id=service_id, deployment_name=deployment, endpoint=endpoint, api_key=api_key)
    )

# Add plugins
plugins_directory = "../../samples/plugins/"
kernel.add_plugin(parent_directory=plugins_directory, plugin_name="SummarizePlugin")
kernel.add_plugin(parent_directory=plugins_directory, plugin_name="WriterPlugin")
kernel.add_plugin(TextPlugin(), "TextPlugin")

# Define and add a Kernel function
inspiration_quote_func = KernelFunctionFromPrompt(
    function_name="InspirationQuote",
    plugin_name="WriterPlugin",
    prompt="""
    {{$input}}

    Generate an inspirational quote based on the above input.
    """,
    prompt_execution_settings=OpenAIChatPromptExecutionSettings(
        service_id=service_id,
        max_tokens=150,
        temperature=0.5
    )
)
kernel.add_function(plugin_name="WriterPlugin", function=inspiration_quote_func)

# Display all plugins and their functions
for plugin in kernel.plugins.values():
    for function in plugin:
        print(f"Plugin: {plugin.name}, Function: {function.name}")

# Create and execute a plan
planner = BasicPlanner(service_id)

ask = """
It's my best friend's birthday next week, and I want to surprise them with a unique poem.
They admire the works of Maya Angelou and love the simplicity of haiku. Please craft a poem combining Angelou's thematic depth with the haiku structure.
"""

# Create the plan from the goal, then execute it (both calls are asynchronous)
new_plan = await planner.create_plan(goal=ask, kernel=kernel)
results = await planner.execute_plan(new_plan, kernel)
print(results)
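
Because the BasicPlanner generates its plan as JSON, it can be useful to inspect what was produced. A minimal sketch, assuming the returned Plan object exposes goal and generated_plan attributes as in the 0.9.x Python SDK (attribute names may differ between releases):

# Inspect the JSON plan generated by the BasicPlanner
# (assumes the Plan object exposes `goal` and `generated_plan`; names may vary by SDK version)
print("Goal:", new_plan.goal)
print("Generated plan (JSON):", new_plan.generated_plan)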

Sequential Planner

The SequentialPlanner handles more complex tasks by producing a plan of interconnected steps, where each step has its own inputs and outputs and the output of one step can feed into the next. This explicit chaining lets multi-step goals be carried out in a precise, predictable order.

from semantic_kernel.planners import SequentialPlanner

# Reuses the kernel, service_id, and `ask` defined in the BasicPlanner example above
planner = SequentialPlanner(kernel, service_id)

# Create the plan asynchronously
sequential_plan = await planner.create_plan(goal=ask)

# Print each step's description and state
for step in sequential_plan._steps:
    print(step.description, ":", step._state.__dict__)

# Invoke the plan and get the result
result = await sequential_plan.invoke(kernel)
print(result)        

Action Planner

The ActionPlanner is tailored for simple, one-step tasks. It creates a plan that uses just one function, making it ideal for tasks that need a quick and straightforward response to what the user asks.

from semantic_kernel import Kernel
from semantic_kernel.planners import ActionPlanner
from semantic_kernel.core_plugins import MathPlugin, TextPlugin, TimePlugin

# Initialize the Kernel (register a chat completion service here, as in the
# BasicPlanner example, so the planner has an LLM to reason with)
kernel = Kernel()
service_id = "default"

# Initialize and add the ActionPlanner
planner = ActionPlanner(kernel, service_id)

# Add core plugins to the kernel
kernel.add_plugin(MathPlugin(), "math")
kernel.add_plugin(TimePlugin(), "time")
kernel.add_plugin(TextPlugin(), "text")

# Define the task for the planner
ask = "What is the multiplication of -60 and 90?"

# Create the plan asynchronously
plan = await planner.create_plan(goal=ask)

# Invoke the plan and get the result
result = await plan.invoke(kernel)
print(result)        

Stepwise Planner

The StepwisePlanner works iteratively, handling a task one step at a time. It executes a step, observes the result, and uses that observation to decide what to do next, repeating until the goal is reached. This methodical approach ensures that every step contributes correctly towards the final goal.

from semantic_kernel import Kernel
from semantic_kernel.connectors.search_engine import BingConnector
from semantic_kernel.core_plugins import WebSearchEnginePlugin, TimePlugin, MathPlugin
from semantic_kernel.planners import StepwisePlanner, StepwisePlannerConfig
from semantic_kernel.utils.settings import bing_search_settings_from_dot_env

# Initialize the Kernel (a chat completion service must also be registered,
# as in the BasicPlanner example, for the planner to work)
kernel = Kernel()

# Set up Bing Search Engine Connector and Plugin
BING_API_KEY = bing_search_settings_from_dot_env()
connector = BingConnector(BING_API_KEY)
kernel.add_plugin(WebSearchEnginePlugin(connector), plugin_name="WebSearch")

# Add Time and Math Plugins
kernel.add_plugin(TimePlugin(), "time")
kernel.add_plugin(MathPlugin(), "math")

# Set up the Stepwise Planner with specific configuration
planner_config = StepwisePlannerConfig(max_iterations=10, min_iteration_time_ms=1000)
planner = StepwisePlanner(kernel, planner_config)

# Define the task for the planner
ask = """How many total ODI world cups have Australia men's cricket team has won during the past 20 years? And which teams are they against?"""

# Create the plan synchronously
plan = planner.create_plan(goal=ask)

# Invoke the plan asynchronously and print the result
result = await plan.invoke(kernel)
print(result)

# Print details of each step in the plan
for index, step in enumerate(plan._steps):
    print("Step:", index)
    print("Description:", step.description)
    print("Function:", step.plugin_name + "." + step._function.name)
    if 'results' in result.metadata:
        print(f"  Output: {','.join(str(res) for res in result.metadata['results'])}")        

As shown in these examples, planners are incredibly useful tools. They can automatically mix and match the functions you've set up to handle different tasks.

As LLMs get better and more advanced planners are developed, you can count on these tools to handle even more complex situations for you. Looking to the future, planners will be key to achieving smarter and more sophisticated AI solutions.


