LangChain Prompt Templates, Memory, and Chains
Sarthak Pattnaik
Generative AI | Data Analytics | Data Engineering | Ex-Business Technology Analyst @Deloitte | MS Applied Data Analytics - Boston University
LangChain is an open-source framework for developing LLM applications. It provides multiple modular components that can be used in conjunction with one another, and it works efficiently with prompt templates. The tools and APIs that are part of LangChain simplify the process of building LLM applications. To call a hosted language model from LangChain, one requires the provider's API key.
An Introduction to Prompt Templates
Prompt engineering is used to instruct a large language model to curtail the open-endedness of its responses and be specific about the details it includes. In LangChain, ChatOpenAI from the chat_models module can be used to apply prompt engineering. The PromptTemplate class formalizes the composition of prompts without the need to hard-code contexts and queries. Output parsers are used to instruct the model to generate output in a specified format: the output_parsers module lets the user pass response schemas to a structured output parser, which produces a response that matches the schema structure.
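To make the template idea concrete, here is a dependency-free sketch of what PromptTemplate does: fill named placeholders in a template string with runtime values. The class and method names below are illustrative, not LangChain's actual implementation.

```python
class SimplePromptTemplate:
    """Fills named placeholders in a template string, in the spirit of
    LangChain's PromptTemplate."""

    def __init__(self, template, input_variables):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs):
        # Fail loudly if a declared variable was not supplied.
        missing = set(self.input_variables) - kwargs.keys()
        if missing:
            raise KeyError(f"missing variables: {missing}")
        return self.template.format(**kwargs)


template = SimplePromptTemplate(
    template=(
        "Answer the question about {topic} in JSON with keys "
        '"answer" and "confidence".\nQuestion: {question}'
    ),
    input_variables=["topic", "question"],
)
prompt = template.format(topic="astronomy", question="What is a pulsar?")
print(prompt)
```

The embedded formatting instruction plays the role of an output parser's format instructions: it pins the model to a fixed response structure.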
Customizing Conversational Memory
Using LangChain we can orchestrate a long, free-flowing conversation. Conversation memory in LangChain saves the context of each exchange in a buffer so that details of previous interactions are remembered. To make clear to the language model that the conversation is between an AI and a human, and to curb hallucinations, we initialize a ConversationChain. The prompt template incorporated once the ConversationChain is initialized is the following:
{history}
Human: {input}
AI:
There are myriad conversation memory classes that can be used in conjunction with the ConversationChain; each modifies what is injected into the {history} parameter. Some prominent examples include:
1. ConversationBufferMemory
This is the most straightforward conversation memory in LangChain. The past conversation is passed in its raw form to the history parameter. The pitfall of using ConversationBufferMemory is that as the conversation grows, the raw history eventually exceeds the model's token limit, and token usage climbs with every turn. To avoid excessive token usage, we use ConversationSummaryMemory.
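A minimal plain-Python sketch of the buffer-memory idea (the class and method names are illustrative, not LangChain's actual API): every turn is stored verbatim and the full transcript is what would fill the {history} slot.

```python
class BufferMemory:
    """Keeps every conversation turn verbatim, in the spirit of
    ConversationBufferMemory."""

    def __init__(self):
        self.turns = []

    def save_context(self, human, ai):
        self.turns.append(f"Human: {human}")
        self.turns.append(f"AI: {ai}")

    def load_history(self):
        # The full raw transcript is injected into {history} each turn,
        # so token usage grows linearly with conversation length.
        return "\n".join(self.turns)


memory = BufferMemory()
memory.save_context("Hi, I'm Ada.", "Hello Ada! How can I help?")
memory.save_context("What's my name?", "Your name is Ada.")
print(memory.load_history())
```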
2. ConversationSummaryMemory
This memory summarizes the conversation before passing it to the history parameter. In stark contrast to ConversationBufferMemory, summary memory has a significant advantage for longer conversations, since the history stays roughly constant in size rather than growing with every turn.
The pitfall of summary memory is that for short conversations it can use more tokens than the raw buffer would, because every turn triggers an extra summarization call. In such scenarios it is pragmatic to use buffer memory.
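The mechanism can be sketched in plain Python with a stub in place of the summarization LLM call (all names here are illustrative; ConversationSummaryMemory itself asks the model to produce the summary):

```python
def stub_summarize(summary, human, ai):
    # Stand-in for the LLM call ConversationSummaryMemory makes;
    # a real implementation would ask the model for a condensed summary.
    addition = f"The human said '{human}'; the AI replied '{ai}'."
    return (summary + " " + addition).strip()


class SummaryMemory:
    """Keeps a rolling summary instead of the raw transcript."""

    def __init__(self, summarizer):
        self.summarizer = summarizer
        self.summary = ""

    def save_context(self, human, ai):
        # Each turn is folded into the running summary, so the history
        # stays compact no matter how long the conversation runs.
        self.summary = self.summarizer(self.summary, human, ai)

    def load_history(self):
        return self.summary


memory = SummaryMemory(stub_summarize)
memory.save_context("I live in Boston.", "Noted!")
memory.save_context("Where do I live?", "You live in Boston.")
print(memory.load_history())
```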
3. ConversationBufferWindowMemory
Similar to buffer memory, with the addition of a window parameter that indicates how many of the most recent exchanges the memory retains before forgetting older ones. It helps limit the number of tokens used, which can be tuned to our requirements, and it is one of the best options for keeping track of recent conversation.
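The windowing behavior amounts to a bounded queue of exchanges; a plain-Python sketch (illustrative names, not the library's API) using a deque with `maxlen` as the window:

```python
from collections import deque


class BufferWindowMemory:
    """Keeps only the last k exchanges, in the spirit of
    ConversationBufferWindowMemory(k=...)."""

    def __init__(self, k):
        # Each entry is one complete human/AI exchange; older entries
        # are silently dropped once the window is full.
        self.turns = deque(maxlen=k)

    def save_context(self, human, ai):
        self.turns.append(f"Human: {human}\nAI: {ai}")

    def load_history(self):
        return "\n".join(self.turns)


memory = BufferWindowMemory(k=2)
for i in range(4):
    memory.save_context(f"question {i}", f"answer {i}")
# Only the two most recent exchanges survive; earlier ones are forgotten.
print(memory.load_history())
```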
4. ConversationSummaryBufferMemory
The ConversationSummaryBufferMemory is a mix of ConversationSummaryMemory and ConversationBufferMemory. It summarizes the oldest interactions in a conversation while keeping the most recent ones verbatim, staying within a maximum token limit.
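The hybrid can be sketched as follows, with a word count standing in for the token limit and a stub for the summarization call (all names illustrative):

```python
def fold(summary, turn):
    # Stand-in for the LLM summarization call.
    return (summary + " [summarized: " + turn.replace("\n", " / ") + "]").strip()


class SummaryBufferMemory:
    """Keeps recent turns verbatim and folds overflow into a summary,
    mimicking ConversationSummaryBufferMemory's max_token_limit."""

    def __init__(self, max_words, summarizer):
        self.max_words = max_words  # crude stand-in for a token limit
        self.summarizer = summarizer
        self.summary = ""
        self.turns = []

    def save_context(self, human, ai):
        self.turns.append(f"Human: {human}\nAI: {ai}")
        # Fold the oldest turns into the summary while over the limit.
        while (sum(len(t.split()) for t in self.turns) > self.max_words
               and len(self.turns) > 1):
            oldest = self.turns.pop(0)
            self.summary = self.summarizer(self.summary, oldest)

    def load_history(self):
        return (self.summary + "\n" + "\n".join(self.turns)).strip()


memory = SummaryBufferMemory(max_words=12, summarizer=fold)
for i in ["one", "two", "three"]:
    memory.save_context(f"question {i}", f"answer {i}")
print(memory.load_history())
```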
Chains in LangChain
LLM chains add functionality around language models and mitigate the complexity involved in chaining multiple LLM calls. LLMChain is the most basic form: it takes a large language model and a prompt as parameter inputs and produces a response to a user input based on the instructions embedded in the prompt.
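That pairing of template and model is the whole pattern; a dependency-free sketch with a fake model standing in for a real client (class names are illustrative, not LangChain's API):

```python
class FakeLLM:
    """Stand-in for a real model client; echoes the prompt back so the
    chain's plumbing can be shown without network calls."""

    def __call__(self, prompt):
        return f"[model response to: {prompt!r}]"


class SimpleLLMChain:
    """Pairs a prompt template with a model, in the spirit of LLMChain."""

    def __init__(self, llm, template):
        self.llm = llm
        self.template = template

    def run(self, **kwargs):
        # Fill the template, then hand the finished prompt to the model.
        prompt = self.template.format(**kwargs)
        return self.llm(prompt)


chain = SimpleLLMChain(
    llm=FakeLLM(),
    template="Translate to French: {text}",
)
print(chain.run(text="good morning"))
```

With a real model client substituted for FakeLLM, `run` would return the model's actual completion.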
There are different types of chains used for application development in LangChain: