Azure AI Studio - Prompt Flow & RAG Copilot
Kim Weiland
3x Top Voice (Technological Innovation, Data Architecture, Artificial Intelligence) | Specializing in Cloud Native, Azure, MLOps, DevOps, CI/CD, Terraform, Infrastructure as Code, and Cloud Adoption Framework
This article focuses on Azure AI Studio - Prompt Flow & RAG Copilot.
In the rapidly advancing field of artificial intelligence (AI), developers are often tasked with creating complex AI solutions. These solutions integrate machine learning models, AI services, prompt engineering techniques, and custom code. Microsoft Azure has been instrumental in this area, providing a variety of services for building AI solutions. However, the challenge has been managing multiple tools and web portals for a single project.
Azure AI Studio is a breakthrough in this domain. It consolidates the features of Azure Machine Learning, Azure OpenAI, and other Azure AI services into a single workspace. This collaborative platform enables developers to work in tandem with data scientists and other professionals to construct AI solutions.
This article provides an overview of Azure AI Studio and its application in creating and managing AI development projects.
Azure AI Studio
You need an Azure AI Hub in your Azure subscription to host projects.
Azure AI Studio is a one-stop-shop for AI development, merging various Azure AI services into a single platform. It combines the model catalog and Prompt Flow from Azure Machine Learning Service, generative AI capabilities of Azure OpenAI service, and integrates with Azure AI Services for various AI functionalities.
It provides a collaborative workspace, Azure AI Hubs, for data scientists and developers to work together. It also allows for project creation, scalable computing, integration with data sources, and other cloud services. It offers web-based code development environments and automation libraries.
With Azure AI Studio, teams can efficiently work on AI projects, deploy models, test generative AI models, integrate data for prompt engineering, define workflows, and integrate content security filters. It’s a powerful tool for expanding AI solutions with multiple functions.
How It Works
An AI Hub serves as a collaborative workspace for the development and management of AI solutions. To utilize the features and capabilities of AI Studio for solution development, at least one Azure AI Hub is required.
An Azure AI Hub can host one or more projects. Each project encapsulates the tools and resources used to create a specific AI solution. For instance, you can create a project to facilitate collaboration between data scientists and developers in building a custom Copilot business application or process.
An Azure AI Hub is the cornerstone for AI development projects on Azure, allowing you to define shared resources that can be used across multiple projects. With AI Studio, you can manage members, compute instances, connections with resources, and define policies for behavior management.
All AI development in Azure AI Studio takes place within a project. You can create a new project and use it to deploy large language models (LLMs), test models, add your own data to expand prompts, define flows that combine models, prompts, and custom code, evaluate model responses to prompts, manage indices and datasets for custom data, define content filters to avoid potentially harmful responses, use Visual Studio Code in the browser to create custom code, and deploy solutions as web apps and containerized services.
You can use Azure AI Studio to create an Azure AI Hub, or you can create a hub while creating a new project. This creates an AI Hub resource in your Azure subscription in the resource group you specify, providing a workspace for collaborative AI development.
In addition to the central AI Hub resource, additional Azure resources are created to provide supporting services. These include a storage account, a key vault, a container registry, an Application Insights resource, and an Azure OpenAI Service resource.
Azure AI Studio serves as an integration point for other AI services, such as speech, language, and vision, allowing you to extend your solution with even more capabilities.
When to use Azure AI Studio
Azure AI Studio is a comprehensive platform designed to empower developers and data scientists to create bespoke copilots and advanced, market-ready, responsible generative AI applications.
Prompt Flow
Prompt Flow in Azure AI Studio is a powerful tool for harnessing the capabilities of Large Language Models (LLMs). It's a one-stop solution for managing, developing, and deploying LLM applications.
Development lifecycle of a Large Language Model (LLM) application
The development lifecycle of a Large Language Model (LLM) application is a comprehensive process that includes several key stages: initialization (defining the use case and designing the solution), experimentation (developing a flow and testing it with a small dataset), evaluation and refinement (assessing the flow with a larger dataset), and production (deploying and monitoring the flow and application).
Creating a Large Language Model (LLM) application with Prompt Flow involves understanding its core components: inputs, nodes (tools that process data, run code, or call an LLM), and outputs.
Once you understand how a flow is structured and what you can use it for, you can start creating a flow. This involves adding new nodes (or tools) to your flow, defining the expected inputs and outputs, and linking nodes together. By defining the inputs, connecting nodes, and defining the desired outputs, you can create a flow that helps you create LLM applications for various purposes.
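The idea of linking nodes can be sketched in plain Python. This is an illustrative sketch only, not the Prompt Flow SDK: each function stands in for a node, and the output of one node feeds the input of the next.

```python
# Minimal sketch of a flow: input -> node -> node -> output.
# The node names and the tiny runner are invented for illustration.

def rewrite_question(question: str) -> str:
    """Node 1: normalize the user's input."""
    return question.strip().rstrip("?") + "?"

def build_prompt(question: str) -> str:
    """Node 2: wrap the question in a prompt template."""
    return f"Answer concisely: {question}"

def run_flow(question: str) -> str:
    """Link the nodes: each node's output is the next node's input."""
    return build_prompt(rewrite_question(question))

print(run_flow("What is RAG  "))
```

In Prompt Flow itself, this wiring is done visually or in a flow definition rather than in code, but the principle is the same: defined inputs, connected nodes, defined outputs.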
Creating a Large Language Model (LLM) application with Prompt Flow involves two crucial steps: configuring connections and setting up runtimes.
Get started with prompt flow in the Azure AI Studio
It's best to work through the exercise yourself and test it hands-on.
RAG-based copilot solution
Language models, particularly when used in chat interfaces, offer an intuitive way to deliver coherent and impressive responses to user queries. However, a key challenge in implementing these models is ensuring “groundedness” - that is, making sure the model’s responses are rooted in factual information or a specific context.
Ungrounded Prompts and Responses: When a language model generates a response to a prompt, it bases its answer on the data it was trained on, which often consists of large amounts of uncontextualized text from the internet or other sources. While the resulting response may be grammatically coherent and logical, it may not be grounded in relevant, factual data. This can lead to uncontextualized or even inaccurate responses that may include invented information.
Grounded Prompts and Responses: To address this, you can ground the prompt with relevant, factual context from a data source. The prompt, along with this grounding data, can then be submitted to the language model to generate a contextualized, relevant, and accurate response. The data source can be any repository of relevant data. For instance, data from a product catalog database could be used to ground a prompt about product recommendations, ensuring the response includes details of actual products in the catalog.
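The grounding step can be sketched as simple prompt construction. The template wording and the product-catalog facts below are assumptions made for illustration; the point is that retrieved data is placed into the prompt before it reaches the model.

```python
# Illustrative only: grounding a user prompt with facts from a data source
# before sending it to a language model. Template and facts are invented.

def ground_prompt(question: str, grounding_docs: list[str]) -> str:
    context = "\n".join(f"- {doc}" for doc in grounding_docs)
    return (
        "Answer the question using ONLY the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

# Hypothetical product-catalog facts used as grounding data.
catalog_facts = [
    "TrailRunner X2: lightweight trail shoe, $129.",
    "RoadMaster Pro: cushioned road shoe, $149.",
]
print(ground_prompt("Which shoe suits trail running?", catalog_facts))
```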
Understanding Language Models and Copilots
Language models are great at generating engaging text, making them perfect for building copilots - chat-based applications that assist users. However, to ensure the language model provides factual and relevant information, a technique called Retrieval Augmented Generation (RAG) is used.
Retrieval Augmented Generation (RAG)
RAG is a process that retrieves relevant information for a user's initial prompt. It involves three steps: retrieving grounding data relevant to the user's prompt, augmenting the prompt with that data, and using a language model to generate a grounded response.
This ensures the language model uses relevant information when responding, rather than just its training data.
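The three RAG steps can be sketched end-to-end with stand-in components. The toy keyword retriever, the document store, and the stubbed model below are all assumptions for illustration; in a real copilot these would be an Azure AI Search index and an Azure OpenAI deployment.

```python
# RAG sketch: retrieve -> augment -> generate, with invented stand-ins.

DOCS = {
    "returns": "Items can be returned within 30 days with a receipt.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

def retrieve(query: str) -> list[str]:
    """Step 1: retrieve documents relevant to the user's prompt."""
    return [text for key, text in DOCS.items() if key in query.lower()]

def augment(query: str, context: list[str]) -> str:
    """Step 2: augment the prompt with the grounding data."""
    return "Context:\n" + "\n".join(context) + f"\nQuestion: {query}"

def generate(prompt: str) -> str:
    """Step 3: generate a response (stubbed; a real flow calls an LLM)."""
    return "Grounded answer based on: " + prompt.splitlines()[1]

query = "What is your returns policy?"
print(generate(augment(query, retrieve(query))))
```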
Grounding Data in Azure AI Project
Azure AI Studio allows you to build a custom copilot using your own data to ground prompts. It supports various data connections like Azure Blob Storage, Azure Data Lake Storage Gen2, and Microsoft OneLake, ensuring your copilot’s responses are grounded in reality and specific context.
You can also upload files or folders to the storage used by your AI Studio project.
Building a Grounded Copilot with Azure AI Studio
When creating a copilot that uses your own data to generate accurate responses, efficient data search is crucial. Azure AI Studio, integrated with Azure AI Search, allows you to retrieve relevant context in your chat flow.
Azure AI Search
Azure AI Search is a retriever you can include when building a language model application with Prompt Flow. It allows you to bring your own data, index it, and query the index to retrieve any needed information.
Using a Vector Index
While a text-based index improves search efficiency, a better data retrieval solution can often be achieved using a vector-based index. This index contains embeddings representing the text tokens in your data source.
Embeddings are special data representations that a search engine can use to easily find relevant information. Specifically, an embedding is a vector of floating-point numbers.
For instance, consider two documents that contain semantically related text, even though different words are used. By creating vector embeddings for the text in the documents, the relation between the words in the text can be calculated mathematically.
The distance between vectors can be calculated by measuring the cosine of the angle between two vectors, also known as the cosine similarity. This computes the semantic similarity between documents and a query.
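Cosine similarity is straightforward to compute: the dot product of two vectors divided by the product of their magnitudes. The sketch below uses tiny 3-dimensional "embeddings" for clarity; real embeddings have hundreds or thousands of dimensions, and production systems use optimized vector indexes rather than pure Python.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings (invented values): two semantically close documents
# and one unrelated document.
doc_children = [0.9, 0.1, 0.4]
doc_kids     = [0.8, 0.2, 0.5]
doc_invoice  = [0.1, 0.9, 0.0]

print(cosine_similarity(doc_children, doc_kids))     # close to 1: similar meaning
print(cosine_similarity(doc_children, doc_invoice))  # much lower: unrelated
```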
By representing words and their meanings with vectors, you can extract relevant context from your data source, even when your data is stored in different formats (text or image) and languages.
To use vector search to search your data, you need to create embeddings when creating your search index. To create embeddings for your search index, you can use an Azure OpenAI embedding model available in the Azure AI Studio.
Creating a Search Index with Azure AI Search
In Azure AI Search, a search index is a way to organize your content to make it searchable. Think of it as a catalog in a library that contains relevant data about books, making any book easy to find.
The integration of Azure AI Search in Azure AI Studio simplifies the process of creating an index suitable for language models. You can add your data to Azure AI Studio and then use Azure AI Search to create an index using an embedding model. This index asset is stored in Azure AI Search and queried by Azure AI Studio when used in a chat flow.
Configuring Your Search Index
The configuration of your search index depends on your data and the context you want your language model to use. For instance, keyword search enables you to retrieve information that exactly matches the search query. Semantic search goes a step further by retrieving information that matches the meaning of the query instead of the exact keyword, using semantic models. The most advanced technique currently is vector search, which creates embeddings to represent your data. This technique allows for more nuanced and contextually relevant search results.
Searching
There are several methods to query information in an index: keyword search, semantic search, vector search, and hybrid search.
When you create a search index in Azure AI Studio, you’re guided to configure an index that is most suitable to use in combination with a language model. When your search results are used in a generative AI application, hybrid search provides the most accurate results.
Hybrid search is a combination of keyword (and full text), and vector search, with the optional addition of semantic ranking. When you create an index compatible with hybrid search, the retrieved information is precise when exact matches are available (using keywords), and still relevant when only conceptually similar information can be found (using vector search).
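One common way to merge a keyword ranking and a vector ranking into a single result list is Reciprocal Rank Fusion (RRF). The sketch below is illustrative: the document IDs and ranked lists are invented, and k=60 is the constant conventionally used with RRF.

```python
# Reciprocal Rank Fusion: each document scores 1/(k + rank) in every
# ranking it appears in; scores are summed and results sorted.

def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["doc3", "doc1", "doc7"]   # exact-match (keyword) ranking
vector_hits  = ["doc1", "doc5", "doc3"]   # semantic (vector) ranking

print(rrf_fuse([keyword_hits, vector_hits]))
```

A document that ranks well in both lists (like doc1 here) rises to the top, which is exactly the behavior that makes hybrid search precise on exact matches and still relevant on conceptual ones.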
Prompt Flow and Large Language Models (LLMs)
Prompt Flow is a development framework for defining flows that orchestrate interactions with an LLM. A flow begins with one or more inputs, usually a question or prompt entered by a user. The flow is then defined as a series of connected tools, each performing a specific operation on the inputs and other environmental variables. Finally, the flow has one or more outputs, typically to return the generated results from an LLM.
Using RAG in a Prompt Flow
The key to using the RAG pattern in a prompt flow is to use an Index Lookup tool to retrieve data from an index. This allows subsequent tools in the flow to use the results to augment the prompt used to generate output from an LLM.
Creating a Chat Flow
Prompt Flow provides various samples you can use as a starting point to create an application. When you want to combine RAG and a language model in your application, you can clone the Multi-round Q&A on your data sample. This sample contains the necessary elements to include RAG and a language model.
Modifying Query with History
The first step in the flow is a Large Language Model (LLM) node that takes the chat history and the user’s last question and generates a new question that includes all necessary information. This generates more succinct input that is processed by the rest of the flow.
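This query-rewriting step amounts to building one prompt from the chat history plus the latest question and asking an LLM for a standalone question. The template wording and the sample history below are assumptions for illustration.

```python
# Illustrative sketch of the "modify query with history" step.
# History is a list of (user, assistant) turns; names are invented.

def build_rewrite_prompt(history: list[tuple[str, str]], question: str) -> str:
    turns = "\n".join(f"user: {u}\nassistant: {a}" for u, a in history)
    return (
        "Given the chat history, rewrite the last user question so it is "
        "self-contained.\n"
        f"History:\n{turns}\n"
        f"Last question: {question}\n"
        "Standalone question:"
    )

history = [("Tell me about the TrailRunner X2.",
            "It is a lightweight trail shoe.")]
print(build_rewrite_prompt(history, "How much does it cost?"))
```

Sending this prompt to an LLM would yield something like "How much does the TrailRunner X2 cost?", which the rest of the flow can then process without needing the history.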
Looking Up Relevant Information: You use the Index Lookup tool to query the search index you created with Azure AI Search, finding the relevant information from your data source (see the "Index lookup tool for flows in Azure Machine Learning" page on Microsoft Learn).
Generating Prompt Context: The output of the Index Lookup tool is the retrieved context you want to use when generating a response to the user. You parse this output into a suitable format to be used in a prompt sent to a language model.
Defining Prompt Variants: When constructing the prompt to send to your language model, you can use variants to represent different prompt contents. This helps ground the chatbot’s responses and explore which content provides the most groundedness.
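Variants can be thought of as alternative templates for the same context and question. The variant names and template texts below are invented; the point is that rendering the same inputs through each variant lets you compare which wording grounds responses best.

```python
# Two hypothetical prompt variants for the same (context, question) pair.

VARIANTS = {
    "variant_0": "Use ONLY this context to answer.\n{context}\nQ: {question}",
    "variant_1": ("You are a helpful shop assistant.\nContext: {context}\n"
                  "Question: {question}\nCite the context in your answer."),
}

def render(variant: str, context: str, question: str) -> str:
    """Fill the chosen variant template with the shared inputs."""
    return VARIANTS[variant].format(context=context, question=question)

for name in VARIANTS:
    print(f"--- {name} ---")
    print(render(name, "TrailRunner X2 costs $129.", "How much is the X2?"))
```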
Chatting with Context: Finally, you use an LLM node to send the prompt to a language model to generate a response using the relevant context retrieved from your data source. The response from this node is also the output of the entire flow.
After configuring the sample chat flow to use your indexed data and the language model of your choosing, you can deploy the flow and integrate it with an application to offer users a copilot experience.
Create a custom copilot that uses your own data
Launch the exercise and follow the instructions.
Thank you for taking the time to read this article on Azure AI Studio - Prompt Flow & RAG Copilot. I hope you found it informative and helpful. As we continue to explore the exciting world of AI, remember that the journey of learning is ongoing. Stay curious, keep asking questions, and never stop learning. Your engagement and feedback are greatly appreciated. Until next time, happy reading!