Improving A.I. Chats by Enhancing AzureAI ChatGPT Prompts and Responses with Prompt Flow

Users and organizations that have been working with ChatGPT (public or private instances) and with Microsoft's Copilot have found that the tools do a great job at what they do, but fall short in areas where native A.I. chat technologies have limitations. Native ChatGPT and Microsoft Copilot merely respond with what they're programmed to respond with, usually good, but sometimes not, and in the past you have had NO control over the output. I've written before that if you better frame your question, you can get a better response, but that becomes an art.

As I've commented, "if A.I. doesn't do what you want, give it a couple of months and it will get better." The exciting piece is that the core technology to improve A.I. in the Microsoft world of things is NOW available, and it's the "next level" in making A.I. way better for enterprises.

This technology is still a bit science project-y, in the sense that it is foundational technology that will eventually be more tightly integrated with out-of-the-box connectors and the like to make it more user friendly. But for those interested in seeing and fiddling with where Microsoft is taking A.I. to "make it better," this article provides a snapshot of what's now available.

Prompt Flow that Expands Beyond ChatGPT

Microsoft can't necessarily "fix" ChatGPT's limitations, as the core technology of ChatGPT is developed by OpenAI, a company Microsoft does not own. So Microsoft has added on a tool they call "Prompt Flow" that helps fill in gaps where ChatGPT falters. Effectively, Prompt Flow allows you to ADD to your chat query by integrating multiple chat/prompt sequences, LangChain, SQL queries, and Python code, where the aggregated input is then fed into the chat model. That way, rather than depending solely on a basic ChatGPT response (with its limitations), the prompt query and response will be inclusive of results from multiple sources that can be "trained" for even better response accuracy.
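As a rough sketch of that aggregation idea (hypothetical plain Python, not the actual Prompt Flow SDK): a flow node can gather results from several sources and merge them into the prompt before the chat model ever sees it. The doc-search and SQL "tools" below are stubs so the wiring is visible without any Azure connection.

```python
# Hypothetical sketch of the Prompt Flow idea: aggregate several tool
# outputs into one enriched prompt before calling the chat model.
# The search and SQL "tools" here are stubs, not real Azure connections.

def search_docs(question: str) -> str:
    """Stub standing in for an Azure AI Search (or similar) lookup."""
    return "Doc excerpt: Prompt Flow chains tools around an LLM call."

def query_sql(question: str) -> str:
    """Stub standing in for a SQL lookup that adds structured data."""
    return "SQL result: 3 related records found."

def build_prompt(question: str) -> str:
    """Merge the user's question with the aggregated tool results."""
    context = "\n".join([search_docs(question), query_sql(question)])
    return f"Use this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What is Prompt Flow?"))
```

In a real flow, each of those functions would be its own node (Python tool, search connection, SQL connection), and the final string would feed the LLM node rather than `print`.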

Fiddling with Prompt Flow

Prompt Flow is now embedded in Azure AI Studio for you to play with, but in typical Microsoft fashion the documentation is poorly written, and it took a bit of reasoning to even get going on this. Once you get past a few of the quirks, though, it's extremely functional.

Here's the step-by-step documentation on the thing, with examples/labs to fiddle with: https://learn.microsoft.com/en-us/azure/ai-studio/tutorials/deploy-copilot-ai-studio

What the docs and examples provide is the ability to take a Git repo of sample Prompt Flow content (Markdown content, pre-built Python code, etc.) and build the thing right in Azure AI Studio to see how it works! It takes about 30 minutes to build once you get past a few of the poorly documented steps that I'll provide insight on below.

How Prompt Flow Helps with A.I. Chats

As noted above, Prompt Flow enables you to INTEGRATE multiple tools together to go beyond just a basic GenAI chat response. By adding Python parsers, prompt modifications, and LLM model tweaks, you make the process and responses "better" and gain the ability to "train" ChatGPT to improve your chats.

  • An example: rather than having to teach users how to "chat better", Prompt Flow can automatically supplement a user's simple query like "what's this doc about?" by programmatically adding information about which doc, who the user is, and what else the chat query should include, SO that the response will be better formulated for the user. The user isn't simply at the mercy of GenAI ChatGPT blurting out whatever it has on its mind as a response; you can streamline the chat and analysis.
  • And THEN, Prompt Flow can also assess the output and format it in a manner better suited to the user: automatically providing output in a bulleted format, including references and quotes, or putting citations in a specific format and style.
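To make those two bullets concrete, here's a minimal hypothetical sketch (plain Python, not Prompt Flow's actual node API): one function enriches the user's bare question with context before it reaches the model, and another reshapes the raw answer into bulleted output with a citation. All names and formats are illustrative.

```python
# Hypothetical sketch: Prompt Flow-style pre-processing of the question
# and post-processing of the answer. Names and formats are illustrative.

def augment_query(user_query: str, doc_name: str, user_role: str) -> str:
    """Supplement a bare question like "what's this doc about?" with
    the context a Prompt Flow node could inject automatically."""
    return (
        f"Document: {doc_name}\n"
        f"Audience: {user_role}\n"
        f"Task: {user_query} Summarize the key points for this audience."
    )

def format_answer(points: list[str], source: str) -> str:
    """Reshape a raw answer into bullets with a citation line."""
    bullets = "\n".join(f"- {p}" for p in points)
    return f"{bullets}\n[Source: {source}]"

print(augment_query("what's this doc about?", "Q3-plan.docx", "finance analyst"))
print(format_answer(["Budget up 4%", "Hiring freeze lifted"], "Q3-plan.docx"))
```

The first function would sit on the input side of the flow (before the LLM node), the second on the output side, which mirrors the frontend/backend split described above.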

Prompt Flow can assist both on the frontend (a better question) and on the output (a better formatted response), which many times is half the game in providing users a "better experience" with A.I. chats.

So for ALL the times you've struggled with ChatGPT to date not answering questions as you would expect, THIS Prompt Flow is the tool that improves the experience.

This is the game changer we've been needing to move A.I. to the next level. Prompt Flow has been available for a while now; the problem is that technical people who work through it don't understand user/business needs, and business people don't understand the technicalities of Prompt Flow. To get it working right, it takes someone who understands BOTH the users/business side AND the technical/Prompt Flow side to make this valuable.

What has kept my company busy is that we have consultants with skills on both the business and the tech side of this A.I. thing, listening to a customer's need and then working through Prompt Flow to frame out a solution.

Getting Hands-on with Prompt Flow

The documentation/guide I linked above has you working in Microsoft Azure AI Studio, which, for those already working with Azure OpenAI, is basically a click box off the Azure OpenAI Studio interface to get started.

After you follow the steps to create the Project and Hub and upload data, you can enable Compute and click to Chat with your Prompt Flow content.

Additional Tips

  • Working with Prompt Flow assumes you've already mastered working with the basics of creating an Azure AI environment - https://learn.microsoft.com/en-us/azure/ai-services/openai/use-your-data-quickstart
  • Once you have the basics down and understand the concepts and where Azure Storage, Azure AI Search, and Azure OpenAI fit in, as you start to fiddle with Prompt Flow, reason through what the Microsoft documentation is asking you to do, as many of the screenshots and steps differ these days from what's documented
  • Do note that by default the Prompt Flow creation process creates 3 VMs for the compute model. Size that down to just 1 VM to save $$ (3 is for global redundancy, which is great, but way overkill for a test)
  • When your "compute model" is enabled and running, it's burning $$, so after you fiddle with it, make sure to STOP your compute session (you can go back into Prompt Flow and click to turn your Compute session back on for a future chat session)

  • If you try to create the thing and it fails, you can go in and delete the deployment, but you then have to work backwards: you can't just delete the project, as there are dependencies that were created in the automated build. And don't just go whacking the resources in your Azure resource group, as you can accidentally delete something like the Key Vault that "owns" all of the Prompt Flow resources. If you delete the keys, you then can't delete the resources. As noted, this is a "bit" quirky…
  • For a guide on integrating Prompt Flow with LangChain as a supplemental provider to a basic ChatGPT setup, the process is documented here: https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain
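As a rough idea of what that integration looks like (a hypothetical wrapper, not the documented API): a Prompt Flow Python node is ultimately just a function, so a LangChain-style chain can be wrapped inside one. The LLM call is stubbed here so the sketch runs without any credentials.

```python
# Hypothetical sketch of wrapping a LangChain-style chain as a Prompt Flow
# Python node. In a real flow this function would be registered as a tool,
# and the stubbed llm replaced with an actual LangChain chain/model object.

from typing import Callable, Optional

def langchain_node(question: str, llm: Optional[Callable[[str], str]] = None) -> str:
    """A flow node: build a prompt, run it through the chain, return text."""
    if llm is None:
        # Stub standing in for a LangChain chain invocation.
        llm = lambda prompt: f"[stubbed chain answer to: {prompt}]"
    prompt = f"Answer concisely: {question}"
    return llm(prompt)

print(langchain_node("How does Prompt Flow call LangChain?"))
```

The design point: because the node boundary is just string-in/string-out, you can swap the stub for a real chain without touching the rest of the flow.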

Wrap-up

If you're using Microsoft Copilot or Azure OpenAI/ChatGPT and you want to "improve" the responses you get where ChatGPT falls short, dive in and fiddle with Microsoft's Prompt Flow technology.

This is the next-level evolution of A.I. chat tools that supplements what we have today, and it gives digital innovation leaders the ability to provide users a better experience and get more out of A.I. chats than was available before!

