Building a G2C AI Chatbot using OpenAI Assistant API and Botpress
Jonathan Aw
Bridging Business Goals & Technological Innovation for Transformative and Sustainable Growth | Senior Account Technology Strategist at Microsoft
Introduction
The use of state-of-the-art Generative AI technology has the potential to revolutionize the way public services are delivered. One of the most useful applications of this technology is in building G2C AI Chatbots that can help explain complex government policies to citizens and residents in simple, easy-to-understand language. This use case is relevant to most government agencies, as it can help improve the accessibility and transparency of government services.
However, there is a common misconception that AI is only for data scientists or machine learning experts. This is simply not true. With the right tools and guidance, anyone can build an AI chatbot that can help improve public services.
In this article, I’m thrilled to share my personal experience in building a working prototype of a G2C AI chatbot using state-of-the-art Software-as-a-Service (SaaS) solutions in just 2 days! From research and learning to the completion of the first working prototype, I was able to create a chatbot that can help people with their queries.
Let’s learn together through a step-by-step tutorial and explore some considerations in potentially bringing such a prototype AI Chatbot from development into production. So, let’s dive in.
Target readers:
My primary target readers are people who are looking to improve the delivery of public services. They can use the insights from this article to understand how to build AI chatbots that explain complex government policies and improve the accessibility of government services for citizens. However, the solution is generic and could be applied to non-government settings too. Readers may fall into either of the following categories:
· AI Enthusiasts and Tech-Savvy Individuals: People with an interest in AI, even without a background in data science or machine learning, can use this guide to understand the process of building an AI chatbot. This group also includes tech-savvy individuals who are always on the lookout for the latest trends and applications in the tech world.
· Non-Tech Individuals: Contrary to popular belief, one doesn’t need to be a tech expert to leverage AI. With the right tools and guidance, anyone interested in improving public services or curious about AI can benefit from this article. This includes professionals in non-tech roles within government agencies or enterprises.
Remember, the goal is to make AI accessible and useful to everyone, not just the tech experts. Happy reading!
Disclaimer
This article contains the author’s personal opinions for knowledge sharing purposes.
This is not an official guide on creating G2C Chatbots for the Government. The readers should apply their own discretion when using this information for implementing their respective Agencies' use cases.
This article describes the use of OpenAI and Botpress Cloud for developing a prototype AI Chatbot that is unclassified. It is important to note that OpenAI and Botpress Cloud are commercial Software-as-a-Service (SaaS) offerings based overseas. The use of such overseas SaaS for use cases that process or store any classified government data is strictly prohibited without obtaining official clearance beforehand, as per the IM8 SaaS Usage Policy.
Showcasing the featured AI Chatbot
Before we dive into the specifics of building a prototype AI Chatbot, let me show you the result. I strongly encourage you to give the HealthierSG Pal AI Chatbot a try. You can either click on the following link or scan the QR Code on your mobile device to access it.
The prototype chatbot was built with public domain information web-scraped from the official Healthier SG website. The chatbot explicitly informs users not to share their personal data. Therefore, it does not handle or process classified government information and is unlikely to receive sensitive personal data from its users.
Caveats:
This prototype is not an official MOH endorsed Chatbot.
I'm using a lower-tier OpenAI subscription for development, which has stringent rate limits imposed on its API. I seek your patience if you encounter connectivity issues.
The bot also has a maximum limit on invocations because its UI is hosted on the free tier of Botpress Cloud for development.
Let me know, through a LinkedIn message, if you do encounter any issues accessing the Bot UI as the limit might have been reached. I will need to upgrade my Botpress subscription if that happens.
Step-by-Step Instructions:
Step 1: Define the Chatbot Use Case and Goals:
Before building a chatbot, it’s crucial to outline its use case and objectives.
The goal of the HealthierSG Pal AI Chatbot is to guide Singapore residents and caretakers on the Healthier SG initiative and provide information that is easy to understand and apply.
The chatbot shall perform the following tasks:
· Explain the Healthier SG initiative and its benefits to users.
· Guide users on how to enrol and participate in the Healthier SG program.
· Provide recommendations for Healthier SG clinics and medical care facilities based on the user’s postal code.
· Suggest community and lifestyle activities related to Healthier SG based on the user’s postal code.
· Answer general queries about the Healthier SG initiative.
The chatbot should support the following interaction:
A Webchat Virtual Assistant: The AI chatbot should be able to interact with users via a Webchat user interface that can be embedded into the official Healthier SG website if eventually approved for production.
By addressing these points meticulously, we’ll have a clear roadmap for developing a chatbot that is both efficient and effective in meeting its intended purpose.
Step 2: Choosing the AI platform to build the AI Chatbot Backend Service.
First, we need to choose an AI platform for building the AI Chatbot backend. The backend fully handles the AI’s conversation with the users and is the most important part of the AI Chatbot.
I’ve chosen OpenAI Assistant API for the following reasons:
· Powerful Models: The OpenAI Assistant API leverages powerful models such as GPT-4 and GPT-4o, which can generate human-like responses to text input.
· Ease of Use: The OpenAI Assistant API is easy to use and requires minimal coding knowledge. It provides a simple and intuitive interface for building AI chatbots.
· Playground: OpenAI provides a Playground that allows you to develop against the Assistant API without coding. This makes it easy to experiment with different large language models and configurations with zero coding effort!
· Retrieval Function: The OpenAI Assistant API includes a retrieval function that simplifies the implementation of Retrieval-Augmented Generation (RAG). This function allows developers to retrieve relevant information from a large corpus of text and use it to ground the generated responses.
· Support: OpenAI has a large and active community of developers who provide support and guidance. This makes it easy to get help when you need it.
· Security Compliance & Accreditation: OpenAI invests in security as they believe it is foundational to their mission. They safeguard computing efforts that advance artificial general intelligence and continuously prepare for emerging security threats. OpenAI encrypts all data at rest (AES-256) and in transit (TLS 1.2+) and uses strict access controls to limit who can access data. Their security team has an on-call rotation with 24/7/365 coverage and is paged in case of any potential security incident. OpenAI complies with GDPR and CCPA and can execute a Data Processing Agreement if your organization or use case requires it. The OpenAI API has been evaluated by a third-party security auditor and is SOC 2 Type 2 compliant. The OpenAI API also undergoes annual third-party penetration testing, which identifies security weaknesses before they can be exploited by malicious actors.
Step 3: Building the AI Chatbot Backend on OpenAI Assistant API
Now that we have chosen the AI platform for building the AI Chatbot backend, let me point you to some relevant quick learning materials before we dive in and begin building our Chatbot backend.
The following are some recommended reading materials. They provide foundational knowledge for building an AI chatbot with a knowledgebase containing our custom data. You may skip them for now and return to them when you are ready to build your own chatbot.
3.1 Set Up the OpenAI Development Environment
Creating an Account: First, go to the OpenAI website and click the “Sign Up” button. Complete the sign-up process and log in to the OpenAI platform portal.
Before we can develop against any OpenAI API, we must ensure we have OpenAI credits in our account; credits are consumed whenever OpenAI's API is invoked. Top up sufficient credits with a credit card and review the pricing information on that page.
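If you prefer to verify the setup from code as well, here is a minimal connectivity check. This is a sketch, assuming the official openai Python SDK (v1.x) is installed and the OPENAI_API_KEY environment variable holds the API key created in the platform portal.

```python
# pip install openai
import os

from openai import OpenAI

# The SDK can also read OPENAI_API_KEY automatically; passing it explicitly
# makes the dependency on the environment variable obvious.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# A cheap sanity check: list a few models visible to this account.
# An invalid key (or a billing problem) will raise an API error here.
for model in client.models.list().data[:5]:
    print(model.id)
```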
3.2 Begin building the Assistant API in OpenAI’s Playground
Next, head over to OpenAI’s Playground and begin experimenting!
OpenAI’s Playground is a great tool to explore the capabilities of the Assistants API and learn how to build our own Assistant without writing any code.
Here are the steps to use OpenAI’s Playground to build Assistants.
Navigate to the OpenAI Playground: Once in the Playground, select Assistants from the top dropdown.
Create an Assistant: Then select the down arrow below that and choose + Create assistant. Now fill out the details for the assistant.
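Everything the Playground does can also be done programmatically, which becomes useful once you automate deployments. Below is a minimal sketch, assuming the openai Python SDK (v1.x) and the Assistants v2 API, where retrieval is exposed as the file_search tool; the name and instructions are illustrative, not the exact prompt used by HealthierSG Pal.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative instructions; in practice, paste the system prompt you
# iterated on in the Playground.
assistant = client.beta.assistants.create(
    name="HealthierSG Pal (prototype)",
    instructions=(
        "You explain the Healthier SG initiative in simple, easy-to-understand "
        "language. Answer only from the attached knowledgebase and remind users "
        "not to share personal data."
    ),
    model="gpt-4o",
    tools=[{"type": "file_search"}],  # enables retrieval over uploaded files
)

print(assistant.id)  # keep this ID; the Botpress frontend will need it later
```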
3.3 Getting the information for the Chatbot’s Knowledgebase
A custom knowledgebase is an essential component of an AI chatbot. It is a repository of information that the chatbot can use to answer user queries. The knowledgebase is created by collecting and organizing data from various sources, such as websites, FAQs, manuals, and other documents. With a custom knowledgebase, the chatbot is given up-to-date domain knowledge so that it can provide more accurate and relevant responses to user queries. A well-designed knowledgebase can help mitigate the risk of AI hallucinations and provide a better user experience.
Getting the Knowledge Base: The HealthierSG Pal prototype AI Chatbot was built with public domain data that is web-scraped from the official Healthier SG website.
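Once the relevant pages have been saved locally (the notes on responsible web scraping below cover how to collect them), they can be attached to the assistant as its knowledgebase. The following is a hedged sketch assuming the Assistants v2 API, where files are grouped into a vector store; exact method names can differ between SDK versions, and the file names and assistant ID are placeholders.

```python
from openai import OpenAI

client = OpenAI()

# Group the scraped pages (exported as Markdown, PDF, etc.) into a vector store.
vector_store = client.beta.vector_stores.create(name="healthier-sg-knowledgebase")

# Placeholder file names; replace with your curated exports.
paths = ["healthier_sg_overview.md", "healthier_sg_faq.md"]
client.beta.vector_stores.file_batches.upload_and_poll(
    vector_store_id=vector_store.id,
    files=[open(path, "rb") for path in paths],
)

# Attach the vector store to the assistant so the file_search tool can use it.
client.beta.assistants.update(
    "asst_XXXX",  # placeholder: the assistant ID from the earlier step
    tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
)
```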
Regarding Web scraping: Web scraping is a powerful tool for data collection, but it’s important to approach it responsibly and ethically.
Here are some key points to consider:
· Respect the Website’s Terms of Use: Many websites have terms of use that explicitly prohibit web scraping. Always review these terms before you start scraping.
· Avoid Overloading Servers: Web scraping can put a significant load on a website’s server. Be mindful of the frequency and volume of your requests to avoid causing disruptions.
· Respect Robots.txt: The robots.txt file on a website provides guidelines about what parts of the site should not be accessed by bots. It is good practice to respect these guidelines.
· Don’t Scrape Personal or Copyrighted Data: Scraping personal data can infringe on privacy rights, while scraping copyrighted data can infringe on intellectual property rights. Always ensure the data you’re scraping is publicly available and not protected by copyright.
· Be Transparent: If you’re scraping data for research or business purposes, be transparent about your intentions.
· Use Specialized Web Scraping Tools: There are many tools available that can help you scrape data in a responsible and efficient manner. A minimal, rate-limited fetch sketch follows this list.
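To make the points above concrete, here is a minimal, hedged sketch of a polite fetch: it checks robots.txt, identifies itself with a User-Agent, and pauses between requests. It assumes the requests library; the user-agent string and URL are illustrative.

```python
# pip install requests
import time
import urllib.robotparser
from urllib.parse import urlparse

import requests

def polite_get(url, user_agent="healthier-sg-prototype-bot", delay_seconds=2.0):
    """Fetch a page only if robots.txt allows it, pausing between requests."""
    root = "{0.scheme}://{0.netloc}".format(urlparse(url))
    robots = urllib.robotparser.RobotFileParser(root + "/robots.txt")
    robots.read()
    if not robots.can_fetch(user_agent, url):
        raise PermissionError(f"robots.txt disallows fetching {url}")
    time.sleep(delay_seconds)  # avoid overloading the server
    response = requests.get(url, headers={"User-Agent": user_agent}, timeout=30)
    response.raise_for_status()
    return response.text

# Illustrative only; always review the site's terms of use first.
# html = polite_get("https://www.healthiersg.gov.sg/")
```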
Disclaimer: This advice is intended as a general guide and does not constitute legal advice. Laws regarding web scraping vary by country and website and can be complex. Always consult a legal professional if you have any doubts about the legality of your web scraping activities.
Considerations for transitioning to production: Before using curated data sources as the knowledgebase for production purposes, it is important to seek clearance from the appropriate authority.
3.4 Testing our AI Chatbot Backend API in OpenAI’s Playground
Test in the Playground: We can test our Assistant API in the Playground by clicking the three dots to the right of the assistant’s name.
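Under the hood, a Playground test is just a thread, a message, and a run. The sketch below reproduces that flow in code so the backend can also be tested outside the Playground; it assumes the openai Python SDK (v1.x), and the assistant ID is a placeholder.

```python
import time

from openai import OpenAI

client = OpenAI()
ASSISTANT_ID = "asst_XXXX"  # placeholder; use your assistant's ID

# Each conversation lives in a thread; add the user's question as a message.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What is Healthier SG and how do I enrol?",
)

# Start a run and poll until it finishes.
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=ASSISTANT_ID)
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

# The most recent message in the thread is the assistant's reply.
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```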
Step 4: Choosing the Chatbot UI Platform (Frontend User Interface)
While the OpenAI Assistant API provides the backend service for the AI Chatbot, we will need a solution for the Chatbot frontend user interface (UI), which is the user-facing component of the Chatbot.
Once we have a functional backend for the AI Chatbot, we will need to build the Chatbot’s UI that can be accessed by the target users. This can be done using a Chatbot UI platform such as Botpress or building a custom interface using a programming language such as Python.
I’ve chosen Botpress Cloud (SaaS) for the following reasons:
Fast and simple deployment of a fully managed Chatbot UI for my prototype OpenAI Assistant API bot:
· Quick Deployment: Botpress Cloud allows me to test and deploy my chatbots in the cloud in a matter of minutes.
· OpenAI Assistant API Integration: Integrating a custom OpenAI Assistant API chatbot into our website or messaging channel is simple with Botpress Cloud using its pre-built OpenAI Chatbot template.
· Easy Channel Integration: With just a few clicks, I can easily connect my chatbot to a Webchat UI and other leading channels, including Slack, Telegram, Teams, Twilio, SunCo, etc.
Scalable: Botpress Cloud is scalable and can handle large volumes of traffic. It is designed to be deployed in a production environment and can be scaled horizontally to handle increased traffic.
Security: I think the Botpress Cloud SaaS is sufficiently secure for hosting a non-sensitive, unclassified prototype Chatbot.
Caveats on Security: Botpress Cloud does not seem to publish any specific security accreditation information on its website. While it may be acceptable to use a non-accredited SaaS for hosting non-sensitive and unclassified prototype Chatbots for experimental purposes, readers should use their own discretion when evaluating the security standards of this Chatbot UI platform for their production purposes. Fortunately, Botpress, as an open-source platform, also offers several other options for deployment, including Govt Data Centers, GCC2.0 and GCC+. Production concerns are addressed in detail in a later section of this article.
Analytics: Botpress provides a dashboard that captures users’ chat history for review and analysis. This can be used to refine the AI Chatbot backend service for continuous improvement.
Step 5: Building the Chatbot’s User Interface on Botpress Cloud
Now that we have chosen the frontend user interface (UI) platform for our Chatbot, let’s dive in and begin building the Chatbot’s UI.
Step 5.1: Setting up the Botpress Cloud Environment
Creating an Account: First, sign up for a free Botpress Cloud Subscription Account at https://Botpress.com/
Step 5.2: Deploying our OpenAI Assistant API as a Botpress Chatbot
Next, follow the step-by-step guide on how to integrate an OpenAI Assistant API with a Botpress Chatbot:
Step 5.2.1: Create our OpenAI Assistant API backend (We already completed this in Step 3)
Step 5.2.2: Download the Botpress template for deploying an OpenAI Assistant API as a Botpress Chatbot
Step 5.2.3: Import the Template into Botpress Studio
Step 5.2.4: Configure our new Botpress Bot to integrate with our OpenAI Assistant API backend. This is a one-time setup process.
Step 5.2.5: Share or Deploy our Botpress Bot Assistant as a Webchat User Interface
Step 6: Testing our Chatbot via its Web Chat User Interface
With the chatbot interface built, it’s time to test the chatbot. Start by testing it yourself, then with a small group of users, and gather feedback to improve the chatbot’s performance. Gradually invite more users to join the testing.
Testing Tips: When testing chatbots and conversational AI, it is essential to focus on the following key components:
· Intent Recognition: Ensure the chatbot correctly identifies the user’s intent behind the input.
· Entity Extraction: Evaluate the chatbot’s ability to extract relevant information (entities) from the user’s input; in my case, the Singapore postal code used for location-based recommendations.
· Dialog Management: Verify the chatbot’s ability to manage and maintain context throughout the conversation.
· Response Generation: Check the chatbot’s ability to generate accurate and relevant responses based on the user’s input and context.
· Integration: In my case, my Chatbot can compose contextual web URLs that lead users to https://ask.gov.sg for location-based recommendations. Test that this works properly; a small sketch of such URL composition follows this list.
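For the integration point in the last bullet, the sketch below shows the kind of helper worth unit-testing: it validates a six-digit Singapore postal code and composes a lookup link. The query parameter here is a placeholder for illustration, not the actual ask.gov.sg query format.

```python
import re
from urllib.parse import urlencode

def clinic_lookup_url(postal_code: str) -> str:
    """Compose a location-based lookup link from a Singapore postal code."""
    if not re.fullmatch(r"\d{6}", postal_code):
        raise ValueError("A Singapore postal code is exactly 6 digits.")
    # Placeholder query format; verify the real URL scheme of the target service.
    return "https://ask.gov.sg/?" + urlencode({"q": f"Healthier SG clinics near {postal_code}"})

# A simple test case for the integration check.
assert clinic_lookup_url("238823").startswith("https://ask.gov.sg/?q=")
print(clinic_lookup_url("238823"))
```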
Monitor and Improve the Chatbot: After deploying the chatbot, it’s important to monitor its performance and gather feedback from users. Use this feedback to improve the chatbot’s performance and add new features as needed.
Controlling Cost: Quality assurance of the Chatbot can be made more rigorous by inviting more users to test. However, be aware that this will consume more credits on the OpenAI and Botpress Cloud SaaS. Make sure to set appropriate usage limits on OpenAI to avoid a shocking bill.
Understanding the Throttling Limits of OpenAI
The OpenAI platform enforces rate limits on API usage. These rate limits determine the number of requests that can be sent per unit of time and can be increased by upgrading the usage tier. Strategies to avoid rate-limit errors include pacing requests to stay within the limits, implementing exponential back-off logic, and optimizing prompts to reduce token usage. Additionally, API usage is subject to approved usage limits, which are set by OpenAI and can be increased by upgrading the usage tier. For more detailed information, refer to OpenAI's updated guidance on rate limits; a minimal back-off sketch follows the links below.
· OpenAI’s Rate Limit
· OpenAI’s Rate Limit Advice
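The exponential back-off strategy mentioned above is straightforward to implement. Here is a minimal sketch using the openai Python SDK (v1.x), which exposes a RateLimitError exception; the retry counts and delays are illustrative.

```python
import random
import time

from openai import OpenAI, RateLimitError

client = OpenAI()

def run_with_backoff(api_call, max_retries=6, base_delay=1.0):
    """Retry an API call with exponential back-off and jitter on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return api_call()
        except RateLimitError:
            delay = base_delay * (2 ** attempt) + random.uniform(0, 1)
            time.sleep(delay)
    raise RuntimeError("Still rate limited after retries; consider upgrading the usage tier.")

# Example: wrap any Assistant API call in a lambda, e.g. starting a run.
# run = run_with_backoff(
#     lambda: client.beta.threads.runs.create(thread_id=thread_id, assistant_id=assistant_id)
# )
```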
Advanced Topic: Transitioning from prototype to production
This section is about planning for production of an AI chatbot. It is aimed at the more technically savvy readers. Non-technical readers may want to seek assistance from their agency's IT department on this matter.
Disclaimer: Please note that the information presented in this section is for reference only and should not be considered a guarantee that it is complete or fully applicable to every government agency’s unique requirements for their AI Chatbot use cases. It is important to use your own discretion and consult your agency’s IT department when using this information to plan for the production of your AI chatbots.
Chatbot Backend API
Although the OpenAI Assistant API can adequately meet the demands of a prototype AI Chatbot, the readers should carefully consider the factors listed below when deciding whether it is also suitable for a production AI Chatbot.
Segregate Development and Production Instances
It is recommended to segregate the Development and Production instance into separate OpenAI subscription accounts. Each OpenAI account may maintain multiple Assistant API Instances for A/B testing and Blue/Green Deployment purposes if necessary.
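One simple way to enforce this segregation in application code is to select the API key and assistant ID from environment-specific variables, so a production deployment can never accidentally point at the development assistant. A minimal sketch follows; the environment variable names are hypothetical.

```python
import os

from openai import OpenAI

# Hypothetical environment variables; keep separate API keys (separate OpenAI
# accounts) and assistant IDs for development and production.
ENV = os.environ.get("CHATBOT_ENV", "dev")  # "dev" or "prod"

CONFIG = {
    "dev":  {"api_key": "OPENAI_API_KEY_DEV",  "assistant": "ASSISTANT_ID_DEV"},
    "prod": {"api_key": "OPENAI_API_KEY_PROD", "assistant": "ASSISTANT_ID_PROD"},
}[ENV]

client = OpenAI(api_key=os.environ[CONFIG["api_key"]])
assistant_id = os.environ[CONFIG["assistant"]]
print(f"Using the {ENV} assistant: {assistant_id}")
```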
Manage Scalability Limitations
API Scalability: The OpenAI Assistant API currently does not impose a limit on the number of assistants a single organization can have. However, rate limits may become a determining factor when interacting with a large number of assistants, so it is advisable to reach out to OpenAI about the specifics of your use case.
Knowledgebase Scalability: The retrieval function is used to implement the Chatbot's knowledgebase, and it has the following scalability limitations:
File Size Limit: The largest file that can be uploaded is 512 MB, which may restrict the amount of data that can be processed in a single retrieval operation. Also note that the overall storage limit for an organization is 100 GB.
File Quantity Limit: There is a limit of 10,000 files per assistant, which should be more than sufficient for most use cases requiring access to many documents or datasets.
Cost: Retrieval is priced at $0.20 per GB per assistant per day, which can become costly, especially if an application stores and retrieves large amounts of data across multiple assistants.
Token Limit: There is a maximum of 2 million tokens per file that can be used as data for the assistant, which may limit the depth and breadth of information the assistant can retrieve and process.
Lack of Control: Users have limited control over the retrieval function, as the assistant's backend will fill the context with all available data that can fit, potentially leading to less predictable behavior.
These limitations suggest that while the Retrieval function is a powerful feature of the Assistant API and might be adequate for most G2C Chatbots with a small knowledgebase, it may require careful consideration of its constraints for other types of AI applications with extensive data retrieval needs.
Limitations of OpenAI’s Assistant API (v2)
Token Usage and Cost Management:
The OpenAI Assistant API (v2) now includes mechanisms for better managing token usage. It provides detailed feedback on token consumption, allowing developers to monitor how many tokens are used in a session, which helps in managing and estimating costs more effectively. Developers can now access token usage information, facilitating a clearer understanding and optimization of costs. However, there is still some complexity in estimating token usage due to the variable nature of token consumption based on the length of conversations and context management.
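As a concrete illustration, a completed run in the v2 API carries a usage object that can be logged per conversation turn. The sketch below assumes the openai Python SDK (v1.x); the thread and run IDs are placeholders.

```python
from openai import OpenAI

client = OpenAI()

# After a run completes (see the polling loop in Step 3.4), the run object
# reports how many tokens that turn consumed.
run = client.beta.threads.runs.retrieve(thread_id="thread_XXXX", run_id="run_XXXX")

if run.usage is not None:  # usage is only populated once the run has finished
    print("prompt tokens:    ", run.usage.prompt_tokens)
    print("completion tokens:", run.usage.completion_tokens)
    print("total tokens:     ", run.usage.total_tokens)
```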
Model Costs and Context Management:
The Assistant API supports several models, including GPT-4, GPT-4 Turbo, and GPT-4o, each with unique considerations:
GPT-4: the most expensive option per token, with a smaller context window; strong output quality, but generally the slowest of the three.
GPT-4 Turbo: a larger (128k-token) context window at a lower per-token price than GPT-4, making longer conversations and larger retrieved contexts more affordable.
GPT-4o: optimized for speed and cost, generally cheaper and faster than GPT-4 Turbo while retaining the large context window, making it a natural default for high-volume chatbot workloads.
Future Improvements:
OpenAI is actively working on enhancing the API based on user feedback. They have been responsive to concerns about cost transparency and management, and there are ongoing efforts to refine the pricing structure and provide better tools for cost estimation and control. As the platform evolves, it is reasonable to expect further improvements in cost management features and pricing models.
Ballpark cost estimates for OpenAI Chatbot Backend
Meanwhile, we can still use the usage statistics on OpenAI and Botpress dashboards to get a sense of ballpark costs of running OpenAI’s Assistant API.
For my case, I have two OpenAI Assistant APIs, both using the GPT-4-1106-preview model, enabled with the Retrieval function, and maintaining a total knowledgebase size of 8 MB. I ran them for 7 days and served approximately 100 user prompts in total, as recorded in the Botpress dashboard. Based on the usage statistics on the OpenAI and Botpress dashboards, my OpenAI expenditure was USD 7.65. Therefore, it costs an average of USD 0.0765 per user prompt on OpenAI. If we assume that on average each user session consists of 10 user prompts, then each session would cost USD 0.765.
If we have 10,000 user sessions per month, then each month would incur USD 7,650.
*Note that the expenditure was based on GPT-4 pricing in January 2024. The latest GPT-4o model is optimized for speed and cost. GPT-4o is generally cheaper and faster, balancing performance and cost effectively, which makes it suitable for high-volume use cases.
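The back-of-envelope arithmetic above can be captured in a few lines so the assumptions (prompts per session, sessions per month) are easy to change when re-estimating with GPT-4o pricing. The figures below are the ones measured for my prototype.

```python
# Back-of-envelope cost model based on the measured figures above.
spend_usd = 7.65            # OpenAI spend over the 7-day test period
prompts_served = 100        # user prompts recorded in the Botpress dashboard
prompts_per_session = 10    # assumed average session length
sessions_per_month = 10_000

cost_per_prompt = spend_usd / prompts_served               # ~0.0765 USD
cost_per_session = cost_per_prompt * prompts_per_session   # ~0.765 USD
monthly_cost = cost_per_session * sessions_per_month       # ~7,650 USD

print(f"per prompt:  USD {cost_per_prompt:.4f}")
print(f"per session: USD {cost_per_session:.3f}")
print(f"per month:   USD {monthly_cost:,.0f}")
```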
Control over Hyperparameters: Developers have expressed the need for more control over hyperparameters such as temperature and seed to manage the unpredictability of results and to ensure consistent performance. From my personal experience, the default Assistant API performs satisfactorily for my AI Chatbot, but such a feature might be useful for more complex use cases. It is likely that OpenAI will provide these controls in a future release, as they are already supported in their legacy APIs.
Finetuning the LLM Model: The OpenAI Assistant API currently does not directly support the fine-tuning of large language models (LLMs). While OpenAI does provide fine-tuning capabilities for models like GPT-3.5 Turbo, these fine-tuned models are typically accessed via the legacy OpenAI Completions API, not their new Assistant API.
There is some discussion in the OpenAI community about integrating fine-tuned models into the Assistant API, and OpenAI has indicated that this feature is coming. Therefore, you should soon be able to fine-tune OpenAI's LLMs, such as GPT-3.5 Turbo, potentially achieve improved performance for specific tasks, and integrate these fine-tuned models with the Assistant API for use in chatbot applications. However, it's important to note that fine-tuning is a complex process that requires careful consideration. While it can improve a model's performance on specific tasks and domains, it may not reliably assimilate new knowledge into the model, which is a big impediment if you need to frequently update your AI Chatbot's knowledgebase with new information.
Chatbot Frontend User Interface
Although the Botpress Cloud SaaS can adequately meet the demands of a prototype AI Chatbot, the readers should carefully consider the factors listed below when deciding whether it is also suitable for a production AI Chatbot.
Compliance and Security – Deploy Botpress into GDC/GCC/GCC+
Given the lack of security accreditations, it is not recommended to use Botpress Cloud SaaS as the Chatbot UI platform for production for important use cases. This concern can be addressed by deploying Botpress into either Govt Data Centers, GCC2.0 or GCC+. When using these hosting options for Botpress, the hosting infrastructure architecture should be designed to comply with the IM8 standards.
For GDC / GCC Azure / GCC GCP / GCC+
You can deploy Botpress locally using npm, yarn, or pnpm.
For GCC AWS:
There’s even a GitHub repository that provides an example of deploying Botpress to AWS using the AWS CDK.
The AWS CDK uses infrastructure-as-code (IaC) to provision Botpress on highly available AWS cloud infrastructure with the following characteristics:
· Uses Fargate as its compute engine.
· Deploys a publicly accessible Botpress instance. All backing services (PostgreSQL database, Redis cache cluster) are private.
· Installs the Botpress server, Duckling, and the Botpress Language Server in a separate container.
· Uses Aurora PostgreSQL as its database.
· Uses Redis (through ElastiCache) as its cache cluster.
· Contains 2 Botpress nodes out of the box. Can scale horizontally to more nodes.
· Uses an Application Load Balancer to balance ingress traffic to all Botpress server nodes.
· Uses AWS WAF to add additional security in front of the Application Load Balancer.
Security Measures when self-hosting Botpress in GDC/GCC/GCC+
Make use of Botpress’s Security Features:
· Standard login (configurable)
  o Password complexity & expiration policies
  o Specified login duration
  o Maximum login attempts & automatic locking of accounts
  o Session lengths & expiry
· Audit trails: track via logs which files users are accessing and editing, and when.
· Encryption
  o At rest (via PostgreSQL): stored data and records are encrypted with industry-standard AES-256.
  o In flight: transmitted information is encrypted with TLS.
· Rate limits & various HTTP-server limits
Botpress Software Licensing
· Open Source AGPLv3 (build from source code)
· Botpress Proprietary License (binaries)
Evaluating AI Chatbot (Frontend + Backend) for production readiness.
Performance:
Validate the chatbot’s ability to manage a large number of simultaneous conversations and respond within an acceptable response time; a minimal load-test sketch follows below.
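A full load test is best run against the deployed webchat endpoint, but even a small concurrency check against the OpenAI backend gives a feel for response times and rate limits. Below is a minimal sketch, assuming the openai Python SDK (v1.x); the assistant ID and the number of simulated users are placeholders.

```python
import concurrent.futures
import time

from openai import OpenAI

client = OpenAI()
ASSISTANT_ID = "asst_XXXX"  # placeholder; use your assistant's ID

def one_conversation(question: str) -> float:
    """Run a single-question conversation and return its response time in seconds."""
    start = time.time()
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(thread_id=thread.id, role="user", content=question)
    run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=ASSISTANT_ID)
    while run.status in ("queued", "in_progress"):
        time.sleep(1)
        run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    return time.time() - start

# Simulate 20 concurrent users; expect rate-limit errors on lower OpenAI tiers.
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    timings = list(pool.map(one_conversation, ["What is Healthier SG?"] * 20))

print(f"avg {sum(timings) / len(timings):.1f}s, max {max(timings):.1f}s")
```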
Security Penetration Testing
The well-known OWASP Top 10 is a list of the top security risks for a web application. Our chatbots are available over a public web frontend, and as such, all the OWASP security risks apply to those chatbots as well. Hence, it is necessary to perform security penetration testing.
Permissions to use curated information for the Chatbot’s Knowledgebase
Before using curated data sources as the knowledgebase for production purposes, it is important to seek clearance from the appropriate authority.
Establish an agreement on the process of getting updates and refreshing the chatbot’s knowledgebase for each curated data source.
Conclusion
In this article, we’ve explored the potential of Generative AI technology in revolutionizing public services, particularly using G2C AI Chatbots. These chatbots can simplify complex government policies, making them more accessible and transparent to citizens and residents.
We’ve debunked the common misconception that AI is only for data scientists or machine learning experts. With the right tools and guidance, anyone can build an AI chatbot. This democratization of AI technology is a significant step towards improving public services.
Through a step-by-step guide, we’ve demonstrated how to build a fully functional prototype G2C AI Chatbot from scratch using the OpenAI Assistant API and Botpress.
We’ve also discussed an advanced topic on transitioning from prototype into production and shared best practices for testing and production.
The target audience for this article is broad, encompassing public officers who may be AI enthusiasts, tech-savvy or non-tech individuals. The goal is to make AI accessible, fun and useful to everyone, not just the tech experts.
In conclusion, the future of public services lies in leveraging AI technology. By building AI chatbots, we can improve the delivery of public services, making them more efficient, accessible, and user-friendly. There’s no better time than now to dive into the world of AI chatbots. Happy building!
About the Author
Jonathan AW is a Lead Solution Architect from Govtech, working as Principal IT Consultant (Architecture & Solutions) at MOH. He has a strong background in technology, software engineering, and solution architecture, with a focus on AWS Cloud Architecture. In his free time, he enjoys cycling, music, photography, and videography.