Role, Context, and Action Awareness: The Simplest Yet Effective Prompt Engineering Tactic
Dr. Ahmed S. ELSHEIKH - EDBAs, MBA/MSc
R&D Manager, ITIDA-SECC ★ AI/Data Business & Platform Economy Strategist | Strategy Advisor | Enterprise Architect | Executive Coach | Executive DBAs (SKEMA/BSI & iaelyon, France) | MBA (Warwick, UK)
In the 54th edition of this newsletter, entitled “GenAI Attention Tuning: Prompt Engineering Tactics to Maximize the ROI from the General Purpose LLMs,” it was concluded that in some situations, enterprises are forced to follow the shortest and easiest path to leverage the power of generative AI by subscribing to the already existing “General-Purpose Large Language Models” services, rather than building upon the “Best-Fit Foundation Models” that can turn the “Enterprise Data Assets” into “Actionable Knowledge” through “Fine-Tuning” and “Retrieval-Augmented Generation (RAG)” strategies, due to the associated challenges in terms of “Technical Complexity, Costs, and Significant Efforts/Time.” The suitable strategy in these situations is to use the well-known “Prompt Engineering Techniques” to help these “General-Purpose Large Language Models” set aside most of their learned knowledge and focus on just the few knowledge domains relevant to composing the response. Furthermore, these prompt engineering techniques help the general-purpose large language models sharpen their memory and increase the logical flow across a sequence of responses. This strategy can be called “Attention Tuning.”
However, it was also discussed that this powerful attention-tuning strategy doesn’t come without cost. Using a “Sequence of Attention-Tuning Prompts” instead of straightforward, one-shot prompts may dramatically increase the cost of using these general-purpose models. Hence, it is logically intuitive that reducing the overall cost of reaching the optimum response from these “General-Purpose Large Language Models” means designing effective “Prompt Engineering Tactics” that make the overall process as efficient as possible. Considering that there are many well-known prompt engineering techniques that can be examined, such as “Least-to-Most, Iterative Prompting, and Chain-of-Thought,” a simple yet effective prompt engineering tactic is still needed as a straightforward rule of thumb to remember while interacting with these general-purpose large language models.
Here comes the focus of this edition of the newsletter. Following the simplest definition of intelligence may guide us to the simplest prompt engineering tactic. Since the simplest definition of intelligence is that an agent is aware of its surrounding environment and is able to plan and take the proper actions accordingly, it is logically intuitive to give the general-purpose large language model three pieces of information, in sequence and in a single prompt. The prompt should tell the agent its expected “Role to Perform,” then as much information as possible about its “Surrounding Environment,” and finally the required “Actions to Accomplish.”
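To make this structure concrete, the minimal sketch below composes the three parts into a single prompt string, in the stated order. The helper name build_prompt and its parameters are hypothetical illustrations, not part of any specific library.

```python
def build_prompt(role: str, environment: str, actions: str) -> str:
    """Compose a single Role / Context / Action prompt.

    The three pieces of information are stated in sequence, in one prompt,
    following the tactic described in this article. Hypothetical helper
    for illustration only; adapt the wording to your own use case.
    """
    return (
        f"As {role} "        # 1) Role to Perform
        f"{environment}, "   # 2) Surrounding Environment (context)
        f"please {actions}"  # 3) Actions to Accomplish
    )
```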
As an example, suppose you need to use a general-purpose large language model to carry out market research and strategy analysis for an international organization, in order to understand its competitive advantages and its long-term strategy. The simplest yet effective prompt following the above-mentioned structure may read as follows:
“As an experienced market researcher and strategy advisor who has a great understanding of the market dynamics of a certain industry, please analyze the strategic situation and position of a certain organization and explain in detail the competitive advantages. Then compare these strategic advantages to its working market key success factors to determine the level of strategy to market fit.”
In this simple yet effective prompt, the LLM’s expected role to perform is clearly stated at the beginning of the prompt (i.e., experienced market researcher and strategy advisor). Then, the second part of the prompt gives information about the context or the surrounding environment (i.e., the market dynamics of a certain industry). The third part requests the detailed actions to be performed, including analyzing the competitive advantages of the organization and comparing these strategic advantages to its working market key success factors to determine the level of strategy-to-market fit.
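Building on the hypothetical build_prompt helper sketched above, the example below reconstructs this exact prompt and sends it to the model in a single call. The use of the OpenAI Python client and the model name are assumptions for illustration only; substitute whichever general-purpose LLM service your enterprise subscribes to.

```python
from openai import OpenAI  # assumed client library; any LLM service would do

# Compose the Role / Context / Action prompt from the article's example.
prompt = build_prompt(
    role="an experienced market researcher and strategy advisor",
    environment=(
        "who has a great understanding of the market dynamics "
        "of a certain industry"
    ),
    actions=(
        "analyze the strategic situation and position of a certain organization "
        "and explain in detail the competitive advantages. Then compare these "
        "strategic advantages to its working market key success factors to "
        "determine the level of strategy to market fit."
    ),
)

client = OpenAI()  # reads the API key from the environment
response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; use the model you subscribe to
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

Because the role, context, and actions are packed into one well-structured prompt, a single call may replace what would otherwise require a sequence of attention-tuning prompts.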
Hence, to conclude, it is logically intuitive that reducing the overall cost of reaching the optimum response from these “General-Purpose Large Language Models” means designing effective “Prompt Engineering Tactics” that make the overall process as efficient as possible and avoid the cost associated with a long “Sequence of Attention-Tuning Prompts.” Following the simplest definition of intelligence guides us to the simplest yet effective prompt engineering tactic: telling the general-purpose large language model its expected “Role to Perform,” then as much information as possible about its “Surrounding Environment,” and finally the required “Actions to Accomplish.” By giving the general-purpose large language model sufficient awareness of these three pieces of information, in sequence and in a single prompt, the generated response may be to the point, reducing the overall prompting cost.