Is LangChain going to die?
Avinash Dubey
CTO & Top Thought Leadership Voice | AI & ML Book Author | Web3 & Blockchain Enthusiast | Startup Transformer | Leading the Next Digital Revolution
Last week, OpenAI rolled out a series of updates, the most significant being the function calling feature in the Chat Completions API. The feature lets models interact with external APIs and tools directly through OpenAI's API, and it closely mirrors a core capability of LangChain, sparking a discussion about what the change means for building autonomous agents.
Santiago, a renowned expert in machine learning, shed light on the implications of function calling. He explained, "With this capability, we can create bespoke functions that the model utilizes to resolve queries that call for real-time data such as weather forecasts, stock market values, or sports scores." Prior to this, such functionality was exclusively available through LangChain, but it has now been integrated directly into the API.
This update gives users the option to bypass third-party interfaces like LangChain and build directly on OpenAI's API, for both the GPT-3.5-Turbo and GPT-4 models.
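For a concrete sense of what that looks like, here is a minimal sketch of a function-calling request, assuming the pre-1.0 `openai` Python client; the `get_current_weather` function and its schema are illustrative, not part of OpenAI's API:

```python
import json
import openai  # pre-1.0 client, e.g. openai==0.27.x

# Describe the tool as a JSON Schema so the model knows when and how to call it.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a given city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. 'Paris'"}
            },
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call the function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model returns the arguments as a JSON string rather than freeform text.
    args = json.loads(message["function_call"]["arguments"])
    print(args["city"])  # pass this to a real weather API, then send the result back
```

Note that the model never executes anything itself; it only returns the name and JSON arguments of the function it wants called, and the application runs the real code.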
Derek Haynes, an expert in large language models (LLMs), expressed his excitement about the new feature, saying it had significantly reduced the complexity of his previous ReAct-based agent prompts. "The introduction of this function-calling feature has led me to question whether LangChain is necessary or whether it adds unwanted complexity," he pondered.
In a blog post, Haynes elaborated on the role of LangChain and the implications of OpenAI's function calling update. The approach, as he explained, allows LLMs like GPT-3.5-Turbo and GPT-4 to use external tools to complete tasks, evaluate the results, and use those results to decide what to do next. This might involve using a tool like Zapier to scan unread emails and then OpenAI's models to condense their content.
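Before function calling, that loop might be sketched roughly as below. This is an illustration rather than code from Haynes's post; the `scan_unread_emails` tool is a stand-in for a real integration such as a Zapier action, and it assumes the pre-1.0 `openai` Python client:

```python
import openai  # pre-1.0 client; everything else here is an illustrative stand-in

# Placeholder tool standing in for a real integration such as a Zapier email action.
def scan_unread_emails():
    return ["From Alice: Q3 numbers attached.", "From Bob: standup moved to 10am."]

TOOLS = {"scan_unread_emails": scan_unread_emails}

SYSTEM_PROMPT = (
    "You can use these tools: scan_unread_emails. "
    "Reply with 'TOOL: <name>' to use a tool, or 'FINAL: <answer>' when you are done."
)

def run_agent(task, max_steps=5):
    """Ask the model what to do, run the tool it picks, feed the result back, repeat."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = openai.ChatCompletion.create(
            model="gpt-3.5-turbo", messages=messages
        )["choices"][0]["message"]["content"]
        messages.append({"role": "assistant", "content": reply})
        if reply.startswith("FINAL:"):
            return reply[len("FINAL:"):].strip()        # the model decided it is done
        if "TOOL:" not in reply:
            return reply                                # model ignored the text protocol
        tool_name = reply.split("TOOL:", 1)[1].strip()  # brittle string parsing
        result = TOOLS.get(tool_name, lambda: f"Unknown tool {tool_name!r}")()
        messages.append({"role": "user", "content": f"Tool result: {result}"})
    return "Stopped after reaching the step limit."

print(run_agent("Summarize my unread emails."))
```

Everything around the API call, from the tool list in the system prompt to the 'TOOL:'/'FINAL:' convention, is exactly the hand-rolled glue the next paragraphs are about.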
Despite its utility, this approach necessitates substantial effort from the user, including presenting available tools in a digestible format for the LLM, implementing custom code to parse responses and retrieve valuable data, and accurately instructing the LLM on task execution.
Haynes pointed out that this process can generate complex, fragile string-parsing code. LangChain can provide structure here, but it may also reduce code readability. A Hacker News user, 'fbrncci,' argued that as an application's complexity grows, maintaining it with LangChain becomes more challenging.
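The string handling he has in mind looks roughly like the ReAct-style parsing below; this is an illustrative sketch, not code taken from LangChain or from Haynes's post:

```python
import re

# A ReAct-style prompt asks the model to emit plain-text fields such as
# "Action: <tool>" and "Action Input: <arguments>", which the caller then
# has to fish out of the response with regexes.
llm_output = """Thought: I should check the weather first.
Action: get_current_weather
Action Input: Paris"""

match = re.search(r"Action:\s*(.*?)\nAction Input:\s*(.*)", llm_output, re.DOTALL)
if match is None:
    raise ValueError(f"Could not parse LLM output: {llm_output!r}")

tool_name = match.group(1).strip()
tool_input = match.group(2).strip()
# Any drift in the model's wording (a lowercase "action input:", stray markdown,
# an extra sentence after the input) breaks this step, which is the fragility
# function calling is designed to remove.
```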
OpenAI, through its website, offered examples and applications of the update. "Language models are increasingly becoming more capable beyond just normal text generation, with function calling within applications representing an exciting development," said AI expert Greg Kamradt in a video titled "Function Calling via ChatGPT API – First Look With LangChain."
He emphasized the need for decision-making systems and for using language models as reasoning engines. Freeform text, he argued, is not the most efficient way for a model to communicate with other computers; JSON is better suited to the job, and function calling makes that possible.
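Concretely, when the model decides to call a function it now emits a machine-readable message instead of prose; the field values below are illustrative:

```python
# Shape of the assistant message when the model chooses to call a function.
assistant_message = {
    "role": "assistant",
    "content": None,                       # no freeform answer yet
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"city": "Paris"}',  # JSON string, parseable with json.loads
    },
}
```

Downstream code can act on that with a single json.loads call instead of the regex gymnastics shown earlier.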
LangChain added support for function calling within an hour of OpenAI's announcement, adding to the choices available to users. Despite competition from contenders like Hugging Face, LangChain remains a viable option thanks to its range of features and plugins.
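As a rough sketch of what that support looks like on the LangChain side, assuming a mid-2023 langchain release that exposes the OpenAI-functions agent type (exact import paths may differ on newer versions):

```python
from langchain.chat_models import ChatOpenAI
from langchain.agents import AgentType, initialize_agent, load_tools

# The OPENAI_FUNCTIONS agent delegates tool selection to OpenAI's native
# function calling instead of parsing a ReAct-style text prompt.
llm = ChatOpenAI(model="gpt-3.5-turbo-0613", temperature=0)
tools = load_tools(["llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent=AgentType.OPENAI_FUNCTIONS, verbose=True)

print(agent.run("What is 3.5 raised to the power of 2.1?"))
```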
The update, however, has led some to ask whether OpenAI has stolen LangChain's thunder. Santiago agreed, stating, "OpenAI did take this from LangChain." He was quick to add that this would not spell the end for LangChain. "LangChain has a lot more to offer," he asserted, though the move raises questions about how much of LangChain's functionality OpenAI intends to absorb into its own API.