The art of Prompt Crafting - Part Three
Jeroen Egelmeers
Master Prompt Engineering and prompt your business forward | Prompt Engineering Advocate | GenAI Whisperer | Public Speaker & (Co-)host | Author
In this part of the "Crafting AI Prompts Framework" series and the "Art of Prompt Crafting," we dive into the sustainable and maintainable implementation of prompts for companies. Of course, this is entirely based on the Crafting AI Prompts Framework. If you haven't read the framework yet, I'll refer to it multiple times throughout this blog. I highly recommend reading the first blog about the framework before proceeding.
Over the past few months, I have had the privilege of discussing "Prompting" with many individuals. The main problem we encountered was that prompts were becoming excessively long. Some prompts even exceeded 1,000 words! This poses a sustainability and manageability issue over time. So, what's next?
Prompt Libraries
"But Jeroen, we already have prompt libraries. What's wrong with them?"
First, let's dive into how most "Prompt Libraries" function. To put it succinctly: they're essentially databases of your prompts, presented in a user-friendly interface. This setup allows you to easily search through and select prompts, making it a breeze to copy, paste, and utilize the prompt you need.
While I believe Prompt Libraries are an excellent starting point, most of them lack sustainability and maintainability. Here's why: initially, you may draft prompts for just a few use cases. However, each modification (when not using parameters) requires a new prompt. Conversely, if you are using parameters, you might get by with one prompt per use case. For those unfamiliar with parameters: they are akin to variables embedded in the prompt, which the prompt library auto-fills based on the information you input. Even so, this method isn't sustainable. Furthermore, I can envision the need for various filters. What if a prompt works exceptionally well for ChatGPT but falls short for Bing? Do you create a separate prompt for each model or client? What then becomes the solution?
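To make the parameter idea concrete, here is a minimal sketch of how a parameter-based prompt library typically works: the stored prompt contains placeholders that the library fills in from user input before the prompt is sent to the model. The template text, placeholder names, and helper function are illustrative assumptions, not a specific library's API.

```python
# A stored prompt with placeholders ("parameters") that the library
# fills in from user input. Names here are purely illustrative.
PROMPT_TEMPLATE = (
    "Summarize the following {document_type} in {max_sentences} sentences, "
    "using a {tone} tone:\n\n{text}"
)

def fill_prompt(template: str, **parameters: str) -> str:
    """Substitute the user-supplied parameters into the stored template."""
    return template.format(**parameters)

prompt = fill_prompt(
    PROMPT_TEMPLATE,
    document_type="meeting transcript",
    max_sentences="3",
    tone="formal",
    text="...",
)
print(prompt)
```

One template now covers many variations of the same use case, but note that this still scales per use case, not per reusable building block.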
Why isn't it maintainable?
My foundation lies in Software Engineering, and I genuinely believe we have much to glean from its practices. In its early days, we encountered files bursting with lines of code. Nowadays, we embrace brevity, clarity, and sound architecture. Why not take a page out of that book and adopt those principles from the outset?
Consider this: when you craft a prompt, there's a high likelihood that components of that prompt can be repurposed. If we examine the Crafting AI Prompts Framework, nearly every building block can potentially be repurposed for other prompts, with the possible exception of the Context section (given its specificity).
Now, imagine formatting these as a table, equipped with unique IDs. This format could seamlessly align with numerous other prompts you draft in the future. As a result, you might find yourself repeatedly constructing the same frameworks for your prompts, which, candidly, is not the ideal approach!
Prompt Component Library to the rescue!
I believe that storing complete prompts in a single library might not be the most effective approach to prompting (or at least, while I understand the appeal, incorporating building blocks could enhance the process). So, a word of advice for Prompt Library developers:
Revisiting the Crafting AI Prompts Framework, I posit that the most efficient approach would be to treat the building blocks from the Crafting AI Prompts Framework as "components." These can then be cataloged in a Prompt Component Library.
For instance, consider the "table format with unique ID's" scenario. This could be a prompt component housed in the "Format" section of the Prompt Component Library.
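A Prompt Component Library along these lines could be as simple as components grouped by framework section, each with a unique ID so that recipes can reference them. This is a sketch under my own assumptions; the section names, IDs, and component texts are examples, not a definitive schema.

```python
# Illustrative Prompt Component Library: reusable building blocks,
# grouped by section, each addressable by a unique ID.
COMPONENT_LIBRARY = {
    "format": {
        "FMT-001": "Present the results as a table with a unique ID per row.",
        "FMT-002": "Use emoticons instead of bullet points.",
    },
    "role": {
        "ROL-001": "Act as an experienced social media copywriter.",
    },
    "task": {
        "TSK-001": "Write a LinkedIn post about the topic below.",
    },
}

def get_component(section: str, component_id: str) -> str:
    """Look up a single reusable component by section and ID."""
    return COMPONENT_LIBRARY[section][component_id]

print(get_component("format", "FMT-001"))
```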
By adopting this method, you're cultivating a library of "components" which can be seamlessly integrated into subsequent Prompt Recipes.
Prompt Recipes?
Absolutely! In Prompt Recipes, you combine the various parts from the Prompt Components to create the desired prompt. For instance, if you're crafting a LinkedIn post and you want to incorporate specific hashtags, use emoticons in place of bullet points, and so on, you'd fetch those "Prompt Components" from the "Prompt Component Library" and incorporate them into your Prompt Recipe.
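The LinkedIn-post example could be assembled like this: a recipe is just an ordered list of component IDs, resolved against the library and combined with the use-case-specific context. Again a sketch with assumed names and component texts, not a prescribed implementation.

```python
# A flat component library keyed by ID; contents are illustrative.
COMPONENT_LIBRARY = {
    "ROL-001": "Act as an experienced social media copywriter.",
    "TSK-001": "Write a LinkedIn post about the topic below.",
    "FMT-002": "Use emoticons instead of bullet points.",
    "FMT-003": "End the post with the hashtags #GenAI and #PromptEngineering.",
}

def build_prompt(recipe: list[str], context: str) -> str:
    """Resolve each component ID in order and append the specific context."""
    parts = [COMPONENT_LIBRARY[component_id] for component_id in recipe]
    parts.append(f"Topic: {context}")
    return "\n".join(parts)

# The recipe itself is just the ordered list of component IDs.
linkedin_recipe = ["ROL-001", "TSK-001", "FMT-002", "FMT-003"]
prompt_text = build_prompt(linkedin_recipe, "sustainable prompt libraries")
print(prompt_text)
```

Changing the recipe (swapping "FMT-002" for a bullet-point component, say) changes the prompt without touching any other recipe that reuses the same components.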
But that's just the beginning! As you construct a prompt, you'd likely want to append more details to it. This might include information such as which model it was tested on (and its version), its creation date, the creator's name, and any recipe parameters that need to be supplied (more on this later). Additionally, a descriptive title and summary would be handy for easy retrieval later. You might also want to note requirements such as the availability of specific plugins, especially if you are using platforms like ChatGPT, or any integration parameters that were set via the API (for example, at OpenAI).
Additionally, you might want to incorporate "variables" or, more aptly termed, "parameters." In such cases, it's imperative to include these parameters in the Recipe. This ensures users are aware they must input these parameters before utilizing the prompt recipe.
Lastly, if you're employing any API parameters (like adjusting the temperature, max_tokens, etc.), it's essential to include those details in the Prompt Recipe. This ensures that another user can replicate the exact output you achieved, maintaining the validity of your tests.
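Putting the metadata points above together, a recipe record could carry everything another user needs to reproduce your result: tested model and version, author, creation date, required recipe parameters, and the API settings used. The field names, the example values (including the author name), and the dataclass layout are illustrative assumptions, not a fixed schema.

```python
from dataclasses import dataclass, field

@dataclass
class PromptRecipe:
    """Metadata a Prompt Recipe could carry for reproducibility (sketch)."""
    title: str
    summary: str
    author: str
    created: str                  # e.g. an ISO date such as "2023-10-01"
    tested_model: str             # model (and version) the recipe was tested on
    component_ids: list[str]      # references into the Prompt Component Library
    recipe_parameters: list[str]  # parameters the user must fill in before use
    api_parameters: dict = field(default_factory=dict)  # temperature, max_tokens, ...

# Hypothetical example record.
recipe = PromptRecipe(
    title="LinkedIn post writer",
    summary="Writes a LinkedIn post with emoticons and fixed hashtags.",
    author="Jane Doe",
    created="2023-10-01",
    tested_model="gpt-4",
    component_ids=["ROL-001", "TSK-001", "FMT-002", "FMT-003"],
    recipe_parameters=["topic", "audience"],
    api_parameters={"temperature": 0.7, "max_tokens": 600},
)
print(recipe.title)
```

Storing the API parameters alongside the recipe is what makes a test result replayable: another user runs the same components, with the same settings, against the same model.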
With the recipe and the component library established, how does the final product appear?
This illustrates how I've integrated various components into recipes. It would be even more advantageous if we had tools allowing for the simple drag-and-drop of these components into our recipes. In essence, crafting recipes would resemble a "no-code" or "no-prompt" platform for users, assisted by Prompt Engineers who add new components to the Component Library for the users to use. Everything is readily available; the task is to assemble it cohesively within your Prompt Recipe.
This method truly adds value. The value isn't solely in the completed prompt but in its individual components. So, why not deconstruct them into reusable components, mirroring practices in Software Engineering?
Also, consider the numerous new opportunities that arise, such as labeling the components and identifying the strategies used: Chain-of-Thought Prompting, Set-of-Mark Prompting, and so on. I'll delve deeper into this topic in the next article of this blog series.
Workflows
This introduces a fascinating perspective. In the inaugural blog of this series, I briefly discussed Zapier. I believe its flow aligns perfectly with prompting! Essentially, you initiate a prompt, execute a specific action, and upon completion, potentially trigger subsequent prompts. This concept can be termed "workflow prompting." Instead of automating just one specific aspect of your workflow using prompts, you aim to automate the entire workflow!
To delve deeper into this concept:
You initiate a task (a workflow task). Upon its completion, it instigates the succeeding task. Each task is aligned with a specific prompt Recipe that it activates.
For instance, a workflow task might be: "Create user stories and upload to Jira." Within the prompt Recipe, user stories are generated utilizing a tool like ChatGPT. Subsequently, the workflow task (leveraging Zapier, for instance, if not already embedded in another tool) automates the transfer to Jira. Once this is accomplished, the next workflow task is set in motion.
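The user-story example can be sketched as a chain of tasks, where completing one task triggers the next. The model call and the Jira upload are stubbed out here; in practice they would be an LLM API call and a Zapier or Jira integration. All function names are hypothetical.

```python
def run_recipe(recipe_name: str, context: str) -> str:
    """Stub for executing a prompt recipe against a model (e.g. ChatGPT)."""
    return f"[output of '{recipe_name}' for: {context}]"

def upload_to_jira(stories: str) -> bool:
    """Stub for the follow-up action (e.g. automated via Zapier)."""
    print(f"Uploaded to Jira: {stories}")
    return True

def workflow(context: str) -> None:
    # Task 1: generate user stories with a prompt recipe.
    stories = run_recipe("create-user-stories", context)
    # Task 2: on completion, trigger the upload action.
    if upload_to_jira(stories):
        # Task 3: a successful upload sets the next workflow task in motion.
        run_recipe("plan-sprint", stories)

workflow("checkout feature")
```

Each task knows only its own recipe and its successor, so individual steps can be swapped or extended without rewriting the whole chain.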
The beauty of it is, you can design a default workflow that's accessible to everyone in your organization. Additionally, users have the flexibility to customize and add their own elements to the workflow, ensuring it aligns with their specific needs.
Crafting AI Prompts framework integration
The entirety of this framework aligns seamlessly with the aforementioned strategy. Every component from the Crafting phase can be found in the Prompt Component Library. The Validation Phase components are vital and should be addressed individually:
Lastly, prompt recipes come with versions (history), model information, and everything else a user needs to know about how the desired goal (G) was reached, which facilitates the Adapt and Improve (AI) process.
Part one: The art of Prompt Crafting - https://www.dhirubhai.net/pulse/art-prompt-crafting-jeroen-egelmeers
Part two: The art of Prompt Crafting - Part Two - https://www.dhirubhai.net/pulse/art-prompt-crafting-part-two-jeroen-egelmeers