Prompts for APIs...

In my previous article, I spoke about "Prompts for APIs" as one of the use cases for ChatGPT. In this article, I want to elaborate a little more on that use case. I will first talk about how we traditionally implement it, and then about how ChatGPT may change this implementation. The example I will use is a real business example that I implemented earlier; I have masked it to keep it anonymous.

The example is from a large organization which has multiple third-party and in-house applications that do not talk to each other. Manual hand-offs are required to orchestrate a business function that spans these stand-alone applications. In this use case, a customer is recorded in the CRM application. Once the customer is recorded, the customer information is used to retrieve the SSN details from another application, "Customer Information". With the retrieved SSN, a manual hand-off is done to query the "Credit Information" application, which returns the credit details of the customer. With the credit details, the "Credit Eligibility" application is queried to determine whether the customer is credit eligible or not. Due to the manual hand-offs, the eligibility determination takes between 2 and 3 weeks. We wanted to automate these interactions to reduce the duration to an hour, so we architected the solution as shown below. This is what I am referring to as "the traditional" way to choreograph between disparate applications.

[Architecture diagram: the traditional solution with wrappers and Kafka topics]

As part of the solution, the wrappers and the Kafka topics were developed as new components. The wrappers were developed as microservices using Java, Spring Boot and Apache Camel. When the process starts, it calls wrapper 1, which queries the CRM for new customers and writes an event to a Kafka topic (Topic 1). The next wrapper, or service, subscribes to that topic and then executes the next step of the flow. This continues until the eligibility is determined and the eligibility results are published.
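To make this concrete, here is a minimal sketch of what the first two wrappers might look like as Apache Camel routes. The endpoint URLs, topic names and JSON paths are hypothetical placeholders, not the actual system's values.

```java
// Hypothetical sketch of the wrapper pattern described above (Java + Apache Camel).
// All URLs, topic names and JSON structures are illustrative placeholders.
import org.apache.camel.Exchange;
import org.apache.camel.builder.RouteBuilder;

public class WrapperRoutes extends RouteBuilder {

    @Override
    public void configure() {

        // Wrapper 1: periodically poll the CRM for new customers and
        // publish each one as an event on Topic 1.
        from("timer:crmPoll?period=300000")                        // every 5 minutes
            .setHeader(Exchange.HTTP_METHOD, constant("GET"))
            .to("https://crm.example.internal/api/customers/new")  // placeholder CRM endpoint
            .split().jsonpath("$.customers[*]")                    // one message per new customer
            .marshal().json()
            .to("kafka:topic-1?brokers=kafka.example.internal:9092");

        // Wrapper 2: subscribe to Topic 1, query the "Customer Information"
        // application for the SSN details, and publish the result to Topic 2
        // for the next step of the flow.
        from("kafka:topic-1?brokers=kafka.example.internal:9092&groupId=wrapper-2")
            .setHeader(Exchange.HTTP_METHOD, constant("POST"))
            .to("https://customer-info.example.internal/api/ssn-lookup") // placeholder endpoint
            .to("kafka:topic-2?brokers=kafka.example.internal:9092");
    }
}
```

Each wrapper does one hop of the flow (query an application, publish an event), and the Kafka topics carry the hand-off from one wrapper to the next, which is what makes the choreography work.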

Now, as ChatGPT matures, I think this use case can be implemented by leveraging prompts. This would reduce the complexity of using Kubernetes for hosting the microservices (the wrappers in this case), as well as the complex coding required in Java, Spring Boot and Apache Camel. The solution in that case may look as below.

[Architecture diagram: the prompt-based solution]

The services, or wrappers, are replaced by prompts (written in "optimized" English) hosted on a prompt server. The Kafka layer which earlier enabled the choreography of the services may be replaced by a "Prompt Workflow" solution, which again can be implemented using "optimized", English-like prompts.
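As an illustration only, a step prompt for the first hop of the flow might read something like the sketch below. The wording, application names and error-handling rules are hypothetical; this is a sketch of the idea, not a prompt tested against any specific product.

```
Every 5 minutes, query the CRM application for customers created since the
last run. For each new customer, call the "Customer Information" application
to retrieve the SSN details, and hand the result to the "Credit Information"
step of this workflow. If any application call fails, retry twice and then
flag the customer for manual review.
```

The "Prompt Workflow" layer would then sequence these step prompts in much the same way the Kafka topics sequence the wrappers today.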

I know the current technology is probably not yet ready with respect to security, scalability and redundancy, but I think this is a possibility that has the potential to simplify and accelerate the implementation of this type of use case. Keeping all other aspects of the architecture, such as security, scalability and redundancy, constant, a lot will depend on whether good prompts can be written faster than good code.
