Developing XApps: Introducing Microsoft's Semantic Kernel
MKonda

LangChain got a serious competitor!

Traditional software application development is expected to get a facelift! With the advent of AI, it is only a matter of time before most applications are redesigned, refactored or reengineered to embrace AI models. Green-field applications will be drawn up and architected from the start to harness the power of AI models.

With the onset of Generative AI, software developers are expected to become overnight AI coding champions, developing and implementing applications that integrate with a multitude of models to tap into the power of AI. Just as there are a thousand varieties of mushroom, there are countless varieties of Generative AI models. Most applications may need more than one model to carry out their business functions. This multi-model integration, development, testing and deployment will bring varied challenges - learning each model's individual API being one of them.

Just as we carry a travel adapter to plug into different power sockets across the world, the invocation of models and their APIs needs adapters that abstract away the individual APIs and call signatures. That's exactly what Microsoft's Semantic Kernel project is aimed at!

Semantic Kernel is an alternative to LangChain in the "accelerated app" (XApps) development space. XApps are what I call traditional apps with AI capability. If you are already familiar with LangChain, then understanding Semantic Kernel should be pretty straightforward. Semantic Kernel is Microsoft's version of LangChain, supporting C#, Python and Java.

Introducing Semantic Kernel

Microsoft developed Semantic Kernel to facilitate the development of AI-powered software applications - it is a software development kit (SDK) for integrating programs written in C#, Python and Java with AI models - such as OpenAI's GPT-3.5 or GPT-4, Meta's Llama 2, and the thousands of models hosted on platforms like Hugging Face, Azure and elsewhere.

You may have heard of the LangChain framework - a framework that helps integrate large language models with our traditional applications. Go over my earlier articles where I used the LangChain framework to draw on the power of OpenAI's GPT-3.5 model in our Summariser App and LegalBot.

Invoking OpenAI’s GPT using Semantic Kernel

Let's create a sample Python app that will connect to OpenAI's GPT-3.5 model. Our aim is simply to prompt the GPT model with a question and print the response. This app doesn't use any fancy UI yet - command line for now.

The SK library is available to download, so let's get it installed:

pip3 install semantic-kernel        

Once the installation is complete, let's create an environment file (a .env file) in our Python project with two keys:

OPENAI_API_KEY="YOUR_OPEN_AI_KEY"
OPENAI_ORG_ID="YOUR_OPEN_AI_ORG_KEY"

The org ID must be your organisation's ID code from your OpenAI account - not the organisation's English name.

These keys allow our client to open a secure line with OpenAI via the Kernel’s connector.

We will then code our main logic in a main.py. Let’s first add two imports:

import semantic_kernel as kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion        

The first line imports the library, while the second imports the connector to OpenAI - the `OpenAIChatCompletion` class.

The next step is to get the kernel instantiated:

semantickernel = kernel.Kernel()        

We then fetch the above keys as they are expected to be passed on to the OpenAI connector:

api_key, org_id = kernel.openai_settings_from_dot_env()        

These keys will be needed for authenticating our program against OpenAI’s GPT model.

Once the kernel is instantiated and the keys are fetched, the next step is to create a connection to OpenAI:

chatCompletion = OpenAIChatCompletion("gpt-3.5-turbo", api_key, org_id)        

We now have an instance of a wrapper pointing to the GPT-3.5 model. The next step is to register this connection with the kernel, which is done using the add_chat_service method:

semantickernel.add_chat_service("chat-gpt", chatCompletion)        

The add_chat_service call registers the previously instantiated chatCompletion object with the kernel under the name "chat-gpt".

We then wrap the prompt (our question) in a function by calling create_semantic_function:

gptFunction = semantickernel.create_semantic_function("What is BTC?")        

We now have a new function (gptFunction) created for us. All we need to do is invoke it:

response = gptFunction()        

Finally, we print the response to the console - which in this case will be the chat completion (the reply from the GPT-3.5 model).
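A plain print call does the job - with the pre-1.0 version of the library used above, the returned result object renders as the completion text when printed (a minimal sketch):

print(response)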


As you can see, similar to LangChain, SK abstracts the model APIs away behind the scenes.
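
For reference, here is the whole walkthrough stitched together into a single main.py. This is a minimal sketch assuming the same pre-1.0 semantic-kernel Python API used throughout this article; newer releases have reorganised these calls, so check the docs for the version you install:

import semantic_kernel as kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Instantiate the kernel
semantickernel = kernel.Kernel()

# Read OPENAI_API_KEY and OPENAI_ORG_ID from the .env file
api_key, org_id = kernel.openai_settings_from_dot_env()

# Wrap the GPT-3.5 model behind SK's OpenAI connector and register it with the kernel
chatCompletion = OpenAIChatCompletion("gpt-3.5-turbo", api_key, org_id)
semantickernel.add_chat_service("chat-gpt", chatCompletion)

# Turn the prompt into a semantic function, invoke it and print the completion
gptFunction = semantickernel.create_semantic_function("What is BTC?")
response = gptFunction()
print(response)

The same create_semantic_function call also accepts prompt templates with placeholders such as {{$input}}, so the question doesn't have to be hard-coded - worth checking against the version of the library you install.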

Semantic Kernel is Open Source

One of the cool things is that Microsoft made Semantic Kernel an open-source project. This can only attract more brains to mature the SK project as it takes off. Currently it supports C#, Python and Java, but there's no reason why the community wouldn't add support for other languages.

You can find the Semantic Kernel project on GitHub at github.com/microsoft/semantic-kernel.

Semantic Kernel for Java

As a Java developer, I struggled to find a suitable framework for developing XApps in Java. I was this close to starting to develop a framework of my own :) Fortunately, Spring has started to provide support with its Spring AI project, and Microsoft's SK for Java is coming too. Spring's AI project, which is available on GitHub, is under development; similarly, Microsoft's SK for Java is still in development.

I will post articles explaining these two projects for Java engineers - stay tuned!


Wrap up

The race for accelerated app (XApp) development has begun, and so has the race among the frameworks and libraries that support it. Semantic Kernel - powered by the plethora of Copilot plugins in Microsoft's world (Copilot for M365, Copilot for Excel, Copilot for Word, etc.) - is also available for developers to embrace for integrating traditional applications with AI capabilities.


If you like my content, don't forget to follow me!

Me @ Medium || LinkedIn || Twitter || GitHub

