How to jump-start o3-mini on Azure!
Peter Smulovics
Distinguished Engineer at Morgan Stanley, Microsoft MVP, Vice Chair of Technical Oversight Committee, Chair of Open Source Readiness, and Emerging Technologies in The Linux Foundation, FSI Autism Hackathon organizer
Azure OpenAI Service now includes the new o3-mini reasoning model, a lighter, cost-efficient successor to earlier reasoning models (such as o1-mini) that brings several new capabilities to the table. These enhancements include configurable reasoning effort and support for structured outputs, both of which appear as placeholders in the code sample below.
In addition to these advances, Microsoft's new o3-mini is now complemented by Semantic Kernel, a powerful, open-source SDK that lets developers combine AI services (like Azure OpenAI) with custom code. Semantic Kernel provides an orchestration layer for plugins, planners, and services, allowing you to build robust and modular AI applications in C#.
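To make the orchestration idea concrete, here is a minimal sketch of a kernel backed by an Azure OpenAI chat deployment and a single prompt invocation; the endpoint, key, and deployment name are placeholders, and the full walkthrough below expands on this pattern:
using System;
using Microsoft.SemanticKernel;
// Build a kernel with one Azure OpenAI chat completion service (placeholder values).
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o3-mini",
        endpoint: "https://<your-resource-name>.openai.azure.com/",
        apiKey: "<your-api-key>")
    .Build();
// One prompt in, one completion out; plugins and planners attach to the same kernel later.
var result = await kernel.InvokePromptAsync("Say hello from o3-mini.");
Console.WriteLine(result);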
Prerequisites
Before getting started, ensure you have an Azure subscription with access to Azure OpenAI Service, an o3-mini model deployment (plus its endpoint and API key), a recent .NET SDK, and the Azure.AI.OpenAI and Microsoft.SemanticKernel NuGet packages.
Setting Up Your Project
Create a new .NET console project and add the Azure.AI.OpenAI and Microsoft.SemanticKernel NuGet packages; the code below assumes these references are already in place.
Code Sample: Using o3-mini with Semantic Kernel in C#
Below is a complete C# code sample demonstrating how to use the o3-mini model from Azure OpenAI Service and how to integrate Semantic Kernel as an orchestration layer. This lets you later add custom functions (plugins) that your agent can invoke automatically.
Note: The code includes placeholders for new properties (like ReasoningEffort) and is structured to work with Semantic Kernel’s abstractions. Please consult the latest Semantic Kernel documentation for the precise API details.
using System;
using System.Threading.Tasks;
using Azure;
using Azure.AI.OpenAI;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

namespace AzureO3MiniDemo
{
    // (Optional) Illustrative enum for reasoning-effort levels; newer SDK versions
    // expose their own type for this setting.
    public enum ReasoningEffort
    {
        Low,
        Medium,
        High
    }

    class Program
    {
        static async Task Main(string[] args)
        {
            // Replace with your Azure OpenAI endpoint and API key.
            string endpoint = "https://<your-resource-name>.openai.azure.com/";
            string apiKey = "<your-api-key>";

            // The deployment name for your o3-mini model.
            string deploymentName = "o3-mini";

            // (Optional) For direct API calls outside Semantic Kernel you can also create a client
            // from the Azure OpenAI SDK; the exact client type depends on the SDK version you use.
            // var client = new OpenAIClient(new Uri(endpoint), new AzureKeyCredential(apiKey));

            // Set up Semantic Kernel and add the Azure OpenAI chat completion service.
            var kernelBuilder = Kernel.CreateBuilder();
            kernelBuilder.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);

            // Optionally, add custom plugins here.
            // For example: kernelBuilder.Plugins.AddFromType<YourCustomPlugin>();
            Kernel kernel = kernelBuilder.Build();

            // Create a prompt and configure execution settings.
            string prompt = "Write a short poem about the beauty of nature.";
            var settings = new OpenAIPromptExecutionSettings
            {
                // Limit the length of the generated completion.
                MaxTokens = 100,
                // Note: reasoning models such as o3-mini may reject sampling parameters like
                // Temperature; re-enable it only if your deployment accepts it.
                // Temperature = 0.7,

                // NEW: Set the reasoning effort level (if supported by your SDK version).
                // ReasoningEffort = "medium",

                // (Optional) Request structured outputs via a response format / JSON schema,
                // if supported by your SDK version.
                // ResponseFormat = "json_object",
            };

            try
            {
                // Query the o3-mini model through the Semantic Kernel chat completion abstraction.
                var chatService = kernel.GetRequiredService<IChatCompletionService>();
                var history = new ChatHistory();
                history.AddUserMessage(prompt);

                ChatMessageContent reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);

                Console.WriteLine("Response from o3-mini:");
                Console.WriteLine(reply.Content?.Trim());
                Console.WriteLine(new string('-', 40));
            }
            catch (Exception ex)
            {
                Console.WriteLine($"An error occurred: {ex.Message}");
            }
        }
    }
}
Integrating Semantic Kernel Plugins
Semantic Kernel allows you to extend your application with custom plugins. For example, you can create functions that use Azure Search or other services and have them automatically invoked based on user input. This makes it easier to build AI agents that are both flexible and tailored to your business logic.
Example: Adding a Custom Plugin
Below is a simplified example of a custom plugin function that could be added to your Semantic Kernel setup. This plugin might, for instance, fetch additional context or data needed by your application:
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

public class CustomDataPlugin
{
    [KernelFunction, Description("Fetches additional context data for the prompt")]
    [return: Description("A string containing supplemental data.")]
    public async Task<string> GetSupplementalDataAsync(
        [Description("Parameter for the data query")] string query)
    {
        // Your logic here, e.g., make an HTTP call to fetch data.
        await Task.Delay(100); // Simulate an async operation.
        return $"Supplemental data for query: {query}";
    }
}
Once defined, you can register your plugin with the kernel builder:
kernelBuilder.Plugins.AddFromType<CustomDataPlugin>();
Semantic Kernel can now call this plugin function automatically whenever the model decides a user request needs it, provided automatic function calling is enabled in your execution settings, as in the sketch below.
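Here is a minimal sketch of enabling automatic function calling, assuming a recent Semantic Kernel version that exposes FunctionChoiceBehavior (older releases use ToolCallBehavior instead); the prompt text is only an illustration:
// Allow the model to choose and invoke registered kernel functions automatically.
var autoSettings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
// The kernel decides whether to call CustomDataPlugin.GetSupplementalDataAsync
// while answering this prompt.
var answer = await kernel.InvokePromptAsync(
    "Summarize the supplemental data for 'quarterly sales'.",
    new KernelArguments(autoSettings));
Console.WriteLine(answer);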
Running the Application
Build and run the project (for example, with dotnet run). If your endpoint, API key, and deployment name are correct, the console prints the poem generated by o3-mini followed by a divider line.
Conclusion
This article demonstrates how to use the new o3-mini model on Azure OpenAI Service with C# and how to enhance your application by integrating Semantic Kernel. With Semantic Kernel, you can orchestrate AI functions, add custom plugins, and switch between providers (OpenAI vs. Azure OpenAI) with minimal changes to your codebase, as the snippet below illustrates. This makes it an excellent tool for building sophisticated AI agents and applications.
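To show the provider switch, here is a sketch of the only lines that change when moving between Azure OpenAI and OpenAI, assuming the standard Semantic Kernel builder extensions; the model and deployment names are placeholders:
// Azure OpenAI: identified by a deployment name, endpoint, and API key.
var azureKernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o3-mini",
        endpoint: "https://<your-resource-name>.openai.azure.com/",
        apiKey: "<your-azure-api-key>")
    .Build();
// OpenAI: identified by a model id and API key; the rest of your code stays the same.
var openAiKernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        modelId: "o3-mini",
        apiKey: "<your-openai-api-key>")
    .Build();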
For more details on Semantic Kernel, check out the official Microsoft Learn documentation and the GitHub repository at https://github.com/microsoft/semantic-kernel.
Happy coding!